NREL Buildings Research, National Renewable Energy Laboratory (NREL)
Field Test Best Practices: A Resource for Practical Residential Building Science

Uncertainty Analysis

Every measurement includes a margin of error, or uncertainty.  A reported measurement is incomplete without an indication of its degree of uncertainty.  When decisions are based on measurements, uncertainty analysis is needed to manage the measurement decision risk, which includes false accept risk and false reject risk [1].  For example, we may want to test a prototype home to verify 30% energy savings before building out a large number of homes with the same design.  If the energy savings are measured as 32% with an uncertainty of 5 percentage points, there is an identifiable risk in proceeding with the build-out.  If the measurement is 32±1%, we can proceed with more confidence.  If the measurement is 32% with an unspecified uncertainty, the risk is unknown.  Conversely, if the measurement is 27% with an unspecified uncertainty, there is a risk of abandoning a good design based on an uncertain measurement.  To avoid these kinds of problems, the following procedure is recommended:

  1. Specify the desired measurements, including uncertainty.
  2. Design the measurement system.
  3. Analyze the uncertainties in the measurement system design.
  4. Reconcile any gap between the estimated and desired uncertainty specifications.
  5. Assemble the measurement system and proceed with the field test.
  6. Report the results with uncertainties.
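The decision risk in the prototype-home example above can be made concrete with a normal model, treating the quoted uncertainty as one standard deviation (an assumption; a quoted ± value is often an expanded, 95% interval):

```python
from statistics import NormalDist

def risk_below_target(measured, std_uncertainty, target=30.0):
    """Probability that the true value lies below `target`, assuming the
    measurement error is normally distributed about the measured value."""
    return NormalDist(mu=measured, sigma=std_uncertainty).cdf(target)

# 32% measured savings with a 5-percentage-point standard uncertainty:
# a sizable chance the true savings fall short of the 30% goal.
print(f"{risk_below_target(32, 5):.1%}")  # ~34.5%

# 32% with a 1-percentage-point standard uncertainty: much lower risk.
print(f"{risk_below_target(32, 1):.1%}")  # ~2.3%
```

The function name and the normality assumption are illustrative; the point is that an unspecified uncertainty makes this risk incalculable.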

Reference [2] presents a thorough guide to analyzing the uncertainties in a building energy measurement system.  Some highlights of that procedure are as follows, including the overview shown in Table 1.

 

Table 1. Uncertainty Analysis Procedure

1. Define Measurement Problem: List measurements, instruments, accuracies, and equations used in analysis

2. Identify Error Sources: List potential sources of error and estimated uncertainties

3. List Uncertainties: List uncertainties in a table format by type (random or systematic)

4. Determine Sensitivity Coefficients: Determine the sensitivity coefficients from functions used in the analysis and enter in the table

5. Determine the Degrees of Freedom and Coverage Factor: List degrees of freedom in the table, determine the effective degrees of freedom, and determine the desired coverage factor

6. Combine the Uncertainties: Combine random and systematic uncertainties separately then combine and apply the appropriate coverage factor

7. Report the Uncertainties: Report the final result with the uncertainty and confidence interval
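Steps 4 through 6 amount to a root-sum-of-squares combination of sensitivity-weighted component uncertainties.  A minimal Python sketch for a derived quantity P = V·I (the sensor values and component uncertainties below are illustrative, not from the reference):

```python
import math

def combine(uncertainties):
    """Root-sum-of-squares of (sensitivity coefficient, uncertainty) pairs."""
    return math.sqrt(sum((c * u) ** 2 for c, u in uncertainties))

# Example: electric power P = V * I, so dP/dV = I and dP/dI = V (Step 4).
V, I = 240.0, 12.0                     # measured voltage (V) and current (A)
u_random = [(I, 0.5), (V, 0.02)]       # (sensitivity, random u) per input
u_systematic = [(I, 1.0), (V, 0.05)]   # (sensitivity, systematic u) per input

s_rand = combine(u_random)             # combined random uncertainty
s_sys = combine(u_systematic)          # combined systematic uncertainty
u_c = math.sqrt(s_rand**2 + s_sys**2)  # combined standard uncertainty (Step 6)
k = 2.0                                # coverage factor for ~95%, large DOF assumed
print(f"P = {V * I:.0f} W +/- {k * u_c:.1f} W (95% confidence)")
```

A rigorous treatment would compute the effective degrees of freedom (Step 5) before choosing k; the fixed k = 2 here assumes they are large.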

 

In completing Steps 2 and 3 in Table 1, the sources of uncertainty listed in Table 2 should be considered.  The comments and equations cited in Table 2 refer to content in Reference [2].

 

Table 2. Summary of Uncertainty Sources

Source: Repeated Observations
Type: Random
Uncertainty: uA = s
Degrees of freedom: ν = n – 1
Comments: Equations C.3 and C.4

Source: Regression Analysis
Type: Random
Uncertainty: uA = sY/X
Degrees of freedom: ν = n – p
Comments: Equation C.5

Source: Sensor Calibration Accuracy, s
Type: Systematic
Uncertainty: uB = s/2
Degrees of freedom: ν > 30
Comments: If there is confidence in the sensor and the calibration, assume (1) a symmetric, normal distribution; (2) 95% coverage; and (3) a large number of degrees of freedom (>30).

Source: Sensor Calibration Accuracy, s
Type: Systematic
Uncertainty: uB = s
Degrees of freedom: ν > 30
Comments: If there is limited confidence in and information about the sensor calibration, assume (1) a symmetric, normal distribution; (2) 68% coverage; and (3) a large number of degrees of freedom (>30).

Source: Sensor Calibration Accuracy, s
Type: Systematic and Random
Uncertainty: Estimate uA and uB from s
Degrees of freedom: estimated
Comments: If detailed information is known about the sensor accuracy, it can be used to estimate uA, uB, and ν.

Source: Resolution and Round-Off Error
Type: Systematic
Uncertainty: uB = a/√3
Degrees of freedom: ν → ∞
Comments: Assume (1) a rectangular distribution with equal probability and (2) a half width of the distribution a = (a– + a+)/2.

Source: Measurement and Analysis Methods
Type: Systematic
Uncertainty: Estimated (Equation C.10)
Degrees of freedom: estimated
Comments: Uncertainty is based on best engineering judgment, and the degrees of freedom are based on an assumed reliability of the estimated uncertainty.

Source: Other
Comments: Use best engineering judgment along with other references where appropriate.
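Two of the rows in Table 2 can be sketched directly in code: a Type A (random) uncertainty from repeated observations and a Type B (systematic) uncertainty from instrument resolution (the readings and the 0.1 resolution below are illustrative):

```python
import math
import statistics

# Type A (random): repeated observations, uA = s with nu = n - 1.
readings = [21.2, 21.4, 21.1, 21.5, 21.3, 21.2]  # illustrative sensor readings
u_A = statistics.stdev(readings)   # sample standard deviation (Equation C.4)
dof_A = len(readings) - 1          # degrees of freedom

# Type B (systematic): resolution/round-off error, rectangular distribution.
resolution = 0.1                   # display resolution of the instrument
a = resolution / 2                 # half width of the rectangular distribution
u_B = a / math.sqrt(3)             # uB = a/sqrt(3), nu -> infinity

print(f"u_A = {u_A:.4f} (nu = {dof_A}), u_B = {u_B:.4f}")
```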

(C.3)  x̄ = (1/n) Σ xi  (mean of n repeated observations)

(C.4)  s = √[ Σ (xi – x̄)² / (n – 1) ]  (sample standard deviation; uA = s)

(C.5)  sY/X = √[ Σ (yi – ŷi)² / (n – p) ]  (residual standard deviation of a regression with p fitted parameters)

(C.10)  ν ≈ (1/2)(Δu/u)⁻²  (degrees of freedom implied by the assumed relative reliability Δu/u of an estimated uncertainty u)

 

Further explanation of the types of error to be considered is summarized in Reference [2], based on References [3] and [4]:

  • Calibration uncertainties arise from the limited precision of instruments.  Instruments are calibrated to reduce the uncertainty to a small combination of the systematic uncertainty of the standard instrument and the random uncertainty of the comparison.  The magnitude of this uncertainty can be obtained from the manufacturer's specifications or from field calibrations.  If there is not enough information to estimate the division between random and systematic uncertainty, this procedure assumes that all of the uncertainty is systematic.  Most calibration uncertainties are assumed to have symmetric, normal distributions and 95% coverage (uB = s/2).  If the calibration history is not well known, a more conservative approach is to assume 68% coverage (uB = s).
  • Data acquisition uncertainties include limitations in sensing and recording signals, signal conditioning, and the sensors themselves.  These uncertainties can be reduced by field calibrations of the overall measurement system.  The data logger error may be stated as a percentage of the measurement at the data logger (usually a current or voltage).  There may also be a resolution error when the analog signal is rounded off because only a limited number of digits can be stored and transmitted.  These uncertainties also include those introduced by manually reading and recording data; manually read meters sometimes have low resolution and can be misread.
  • Data reduction uncertainties come from processing raw data.  Computational round-off errors are usually very small and are neglected.  However, errors from curve fits to measured data can be significant.  Regression models are often used to relate a dependent variable to independent variables, such as energy consumption to outdoor temperature, and can be used to fill in missing data or extrapolate beyond the measurement period.  The simplest estimate of the modeling uncertainty is given by the residual standard deviation, as shown in Equation C.5.  Uncertainty should also be estimated for all data from sources that are not directly measured.
  • Uncertainties due to methods arise from the techniques used in the measurement process.  Examples include uncertainties embedded in calculations, such as constants or material properties; obtrusive disturbance of a medium by the sensors; spatial averaging of discrete points; environmental effects such as convection, conduction, and radiation; and instability and hysteresis in the sensor.  Installation effects should be minimized with careful planning and field calibration of the measurement systems.  Even with careful placement, they represent the largest potential uncertainties for measurements of physical phenomena such as temperatures and fluid flows.  For example, consider measuring the temperature of a fluid in a pipe by inserting a thermocouple (TC) probe in a thermal well in the pipe.  Probe errors will result from thermal resistance between the fluid and the TC and from conduction along the thermal well and thermocouple to the surrounding environment.  Spatial errors will result from measuring the fluid temperature at one spot in the pipe and assigning this value as the average for all the fluid in the cross section of the pipe.  Another potentially significant spatial error is that the sensor may be at a different location along the length of the pipe than the desired point of measurement.
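The coverage assumptions in the first bullet reduce to a simple conversion from a stated accuracy to a standard uncertainty (a sketch; the 0.5 °C accuracy value is illustrative):

```python
def standard_uncertainty(stated_accuracy, well_characterized):
    """Convert a manufacturer's stated accuracy s to a standard uncertainty.
    Well-characterized calibration: treat the statement as 95% coverage (uB = s/2).
    Poorly documented calibration: conservatively treat it as 68% coverage (uB = s)."""
    return stated_accuracy / 2 if well_characterized else stated_accuracy

s = 0.5  # e.g., a temperature sensor specified as +/- 0.5 C (illustrative)
print(standard_uncertainty(s, well_characterized=True))   # 0.25
print(standard_uncertainty(s, well_characterized=False))  # 0.5
```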

Formulas for analyzing and combining uncertainties from the various sources are provided in Reference [2], along with several examples of uncertainty analysis applied to measurements in buildings.

References

  1. Castrup, H. (1995). Uncertainty Analysis for Risk Management.  Presented at the 1995 Measurement Science Conference, Anaheim, California, January 27.  http://www.isgmax.com/articles_papers/msc95.pdf
  2. Barley, D.; Deru, M.; Pless, S.; Torcellini, P. (2005).  Procedure for Measuring and Reporting Commercial Building Energy Performance; Appendix C: Uncertainty Analysis.  NREL Report No. TP-550-38601.  http://www.nrel.gov/docs/fy06osti/38601.pdf.
  3. ASME (1998).  Test Uncertainty: Instruments and Apparatus.  ASME Standard PTC 19.1-98.  New York, NY:  American Society of Mechanical Engineers.
  4. Dieck, R.H. (1997).  Measurement Uncertainty, Methods and Applications, Second ed.  Research Triangle Park, NC:  Instrument Society of America.