Uncertainty Budget Definitions

How to read and understand uncertainty budgets


In 2013, business was great! I increased my client base, my bottom line, and the awareness of my company, ISOBUDGETS LLC. As a result, I received a lot of questions this year, from both clients and assessors, regarding the definitions of the terms used in my reports. Therefore, I have decided to create this post in order to provide clarification on how my uncertainty budgets should be interpreted. Furthermore, new uncertainty budgets and reports created in 2014 will include a new section that provides the definitions of these terms.

The following words and phrases are defined by my organization to describe how each term is used in uncertainty analysis. Each definition is derived from JCGM 200:2012, ‘The International Vocabulary of Metrology,’ but has been adapted to describe how each value is quantified and what each observer should deduce from the reported information. Should you have any questions or comments, please post them below or email me at [email protected].


Repeatability

Def – The variability (i.e., standard deviation) of measurement results under replicate measurement conditions over a short period of time.
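As a minimal sketch of how this quantity is computed, the standard deviation of a set of replicate readings gives the repeatability estimate. The readings below are hypothetical values, not from any actual calibration.

```python
from statistics import stdev

# Hypothetical replicate indications taken by one operator over a
# short period of time (units: volts); values are illustrative only.
readings = [10.001, 10.003, 9.999, 10.002, 10.000]

# Repeatability is estimated as the sample standard deviation
# of the replicate readings.
repeatability = stdev(readings)
print(f"Repeatability: {repeatability:.6f} V")
```

This value would then enter the uncertainty budget as a Type A standard uncertainty contribution.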


Reproducibility

Def – The variability (i.e., standard deviation) of measurement results under reproducible conditions of measurement, which may include, but are not limited to, one or more of the following: day-to-day variability, operator-to-operator variability, system-to-system variability, and/or location-to-location variability.


Stability

Def – The run-to-run variability (i.e., standard deviation) estimated from a set of data collected from replicate trials or from replicate calibration events.
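A minimal sketch of the run-to-run calculation described above: take the mean result of each run (or calibration event) and compute the standard deviation across those means. The run means below are hypothetical, illustrative values.

```python
from statistics import stdev

# Hypothetical mean indications from three separate calibration
# events of the same artifact (units: volts); values are illustrative.
run_means = [10.0010, 10.0014, 10.0006]

# Run-to-run variability is estimated as the sample standard
# deviation of the run means.
stability = stdev(run_means)
print(f"Stability: {stability:.6f} V")
```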


Bias

Def – Estimate of systematic error, calculated as the difference between the mean of replicate indications, reported in calibration reports, and a reference quantity value.
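The calculation this definition describes is a simple difference, sketched below with hypothetical numbers: the mean of the replicate indications minus the reference quantity value.

```python
from statistics import mean

# Hypothetical reference quantity value (e.g., from a reference
# standard's certificate) and replicate indications; units: volts.
reference_value = 10.0000
indications = [10.0012, 10.0008, 10.0010]

# Bias is the mean of the replicate indications minus the
# reference quantity value.
bias = mean(indications) - reference_value
print(f"Bias: {bias:+.6f} V")
```

A positive result indicates the instrument reads high relative to the reference; a negative result indicates it reads low.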


Drift

Def – Estimate of the incremental change, or difference, in metrological properties over time or between calibration events.
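One common way to quantify a change over time like this is an ordinary least-squares slope fitted to results from successive calibrations. The sketch below uses hypothetical bias values observed at annual calibration events; the approach is illustrative, not a prescribed method.

```python
# Hypothetical bias values (volts) observed at four successive
# annual calibrations; t is years since the first calibration.
t = [0, 1, 2, 3]
bias = [0.0010, 0.0013, 0.0017, 0.0019]

n = len(t)
t_bar = sum(t) / n
b_bar = sum(bias) / n

# Ordinary least-squares slope: estimated drift per year.
drift = (
    sum((ti - t_bar) * (bi - b_bar) for ti, bi in zip(t, bias))
    / sum((ti - t_bar) ** 2 for ti in t)
)
print(f"Drift: {drift:.6f} V/year")
```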


Resolution

Def – Smallest change in the quantity being measured that causes a perceptible change in the corresponding indication.


Tolerance

Def – Published accuracy specifications endorsed by a manufacturer, or a reported tolerance specification endorsed by a calibration service provider.

Reference Standard Uncertainty

Def – The estimated mean or reported value of measurement uncertainty associated with the performance of calibration.

Reference Standard Stability

Def – The estimated variability (i.e., standard deviation) of reference standard uncertainty across multiple calibration events.


About the Author

Richard Hogan

Richard Hogan is the CEO of ISO Budgets, L.L.C., a U.S.-based consulting and data analysis firm. Services include measurement consulting, data analysis, uncertainty budgets, and control charts. Richard is a systems engineer with laboratory management and quality control experience in the metrology industry. He specializes in uncertainty analysis, industrial statistics, and process optimization. Richard holds a Master's degree in Engineering from Old Dominion University in Norfolk, VA. Connect with Richard on LinkedIn.

