Understanding Measurement Terms and Concepts
Accuracy and Precision
Accuracy
The closeness with which the reading of a measuring instrument approaches the true value of the measured variable.
Precision
A measure of the degree to which successive measurements differ from each other, without the need for the true value.
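The distinction can be illustrated numerically: accuracy relates the mean of repeated readings to the true value, while precision relates the readings to each other. A minimal sketch in Python (the true value and readings are illustrative):

```python
import statistics

true_value = 100.0                       # known reference value
readings = [101.2, 101.0, 101.3, 101.1]  # repeated instrument readings

mean = statistics.mean(readings)
accuracy_error = mean - true_value      # closeness to the true value (bias)
precision = statistics.stdev(readings)  # spread of the readings about their mean

print(f"bias = {accuracy_error:+.2f}, spread = {precision:.2f}")
# These readings are precise (small spread) but not accurate (bias of about +1.15).
```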
Repeatability
The ability of a measuring instrument to indicate the same output for repeated applications of the same value of the measured variable, under the same conditions and approached from the same direction of variation, over the full measurement range.
Conditions of Repetition
- The same measurement procedure
- The same operator
- The same measuring instrument under the same conditions
- The same place
- Measurements taken within a short period of time
Reproducibility
Agreement between the results of a set of measurements of the same measurand carried out under changed measurement conditions, e.g. a different measuring principle, measuring method, observer, or reference standard.
Calibration
The process of determining, by comparison against a standard, the correction factor by which the reading is multiplied so that the measurement matches the standard value.
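As a sketch, a single-point calibration against a standard can be expressed as a correction factor applied to subsequent readings (the standard value and reading below are illustrative):

```python
standard_value = 50.00      # value of the reference standard
instrument_reading = 49.60  # what the instrument indicates for the standard

correction_factor = standard_value / instrument_reading

def corrected(reading: float) -> float:
    """Apply the calibration correction factor to a raw reading."""
    return reading * correction_factor

print(corrected(49.60))  # the calibration point itself maps back to 50.0
```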
Error
The difference between the instrument reading and the true value of the measured quantity.
Tolerance of a Variable
The difference between the maximum and minimum values that a measured variable can take to be consistent with the specifications.
Sensitivity
Expresses how much the output changes for a given change in input, quantified as the ratio of output change to input change. Ideally, sensitivity is high and constant over the measurement range.
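That ratio is straightforward to compute. A sketch for a hypothetical temperature sensor (the input and output values are illustrative):

```python
# Hypothetical sensor: output voltage in mV at two input temperatures in degC.
input_change = 80.0 - 20.0   # degC
output_change = 4.8 - 1.2    # mV

sensitivity = output_change / input_change  # mV per degC
print(f"sensitivity = {sensitivity} mV/degC")
```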
Resolution
Expresses the smallest change in the variable that the instrument can discriminate, determined by its graduation. It is usually stated as a number of digits for digital numerical indicators and as a percentage of full scale for needle instruments.
Random Error
The result of a measurement minus the mean that would result from an infinite number of measurements of the same quantity carried out under repeatability conditions.
Systematic Error
The error that always has the same value when measurements are performed under the same conditions.
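The two components can be separated from repeated readings: the systematic part is estimated by the mean deviation from the true value, and the scatter that remains about the mean is random. A sketch with illustrative numbers:

```python
import statistics

true_value = 10.0
readings = [10.30, 10.28, 10.33, 10.29]

mean = statistics.mean(readings)
systematic_error = mean - true_value          # same value on every reading
random_errors = [r - mean for r in readings]  # remaining scatter about the mean

print(f"systematic: {systematic_error:+.2f}, random: {random_errors}")
```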
Uncertainty
The maximum positive or negative value that the error between the true value and the measured value can take.
Measurement Class
Used to compare different sensors. All sensors belonging to the same class have a measurement error not exceeding a particular value called the class index.
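The class index translates into a maximum permitted error. A sketch, assuming (as is common) that the index is expressed as a percentage of the instrument's span:

```python
def max_class_error(class_index: float, span: float) -> float:
    """Maximum permitted error for an instrument of the given class,
    assuming the class index is a percentage of the span."""
    return class_index / 100.0 * span

# A class-0.5 pressure gauge spanning 0-10 bar may err by at most 0.05 bar:
print(max_class_error(0.5, 10.0))
```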
Range
Expresses the upper and lower limits of the measurement range.
Working Range of Operation
The range in which the instrument operates with optimal resolution, but not necessarily the best sensitivity.
Dead Band
The minimum change in the sensed quantity required for the instrument to respond.
Response Time
The delay involved in the measurement of any variable in any process.
Hysteresis
The maximum difference between the readings indicated by the index or pen for the same value of the measured variable, when that value is approached first with increasing and then with decreasing values.
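Given readings taken at the same set of variable values, first on the upscale pass and then on the downscale pass, hysteresis is the largest gap between the two passes. A sketch with illustrative readings:

```python
# Readings at the same true values, approached from below and from above.
upscale   = [0.0, 24.8, 49.7, 74.9, 100.0]
downscale = [0.2, 25.3, 50.4, 75.3, 100.0]

hysteresis = max(abs(u - d) for u, d in zip(upscale, downscale))
print(hysteresis)  # largest up/down disagreement
```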
Transfer Function
A mathematical model that represents the input/output relationship of an instrument.
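A common instrument model is the first-order transfer function K/(tau*s + 1); its response to a unit step can be sketched directly from the time-domain solution (the gain and time constant below are illustrative):

```python
import math

K, tau = 2.0, 5.0  # illustrative static gain and time constant (s)

def step_response(t: float) -> float:
    """Output of a first-order instrument, y(t) = K*(1 - exp(-t/tau)),
    for a unit step applied at t = 0."""
    return K * (1.0 - math.exp(-t / tau))

# After one time constant the output has reached about 63.2 % of K:
print(step_response(tau))
```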
Measuring Range
The spectrum or set of variable values within the upper and lower limits of the instrument’s measurement capability.
Rangeability
The ratio between the upper and lower values of an instrument’s measurement range.
Span or Reach
The algebraic difference between the upper and lower limits of the measurement range.
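The range-related terms above reduce to simple arithmetic on the range limits. For a hypothetical instrument measuring 100 to 500 degC:

```python
lower, upper = 100.0, 500.0  # measurement range limits, degC

span = upper - lower          # algebraic difference of the limits
rangeability = upper / lower  # ratio of upper to lower value

print(f"range: {lower}-{upper} degC, span: {span} degC, rangeability: {rangeability}")
```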
Accuracy
The measurement or transmission tolerance of an instrument, defining the limits of error committed when the instrument is used under normal operating conditions.
Measurement Range with Elevation of 0
The measurement range in which the value 0 of the measured variable or signal is greater than the lower value of the range.
Measurement Range with Suppression of 0
The measurement range where the value 0 of the measured variable or signal is less than the lower value of the range.
Elevation of 0
The amount by which the value 0 of the variable exceeds the lower value of the range.
Suppression of 0
The amount by which the lower value of the range exceeds the value 0 of the variable.
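Elevation and suppression of zero follow directly from the definitions above. Two illustrative ranges:

```python
# Elevated zero: range -20 to 100, so zero lies above the lower limit.
lower = -20.0
elevation_of_zero = 0.0 - lower  # amount by which 0 exceeds the lower value

# Suppressed zero: range 20 to 100, so the lower limit lies above zero.
lower = 20.0
suppression_of_zero = lower - 0.0  # amount by which the lower value exceeds 0

print(elevation_of_zero, suppression_of_zero)
```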
Drift
A variation in the output signal that occurs over a given time period while the measured variable and all environmental conditions are held constant. Ideally, drift should be zero.
Reliability
A measure of the probability that an instrument will behave within the specified limits of error over a given time and under specific conditions.
Noise
Any unwanted electrical or signal perturbation that accidentally modifies the transmission.
Stability
The ability of an instrument to maintain its performance over its useful life and specified storage conditions.
Temperature of Use
The temperature range in which the instrument is expected to operate within specified limits of error.
Service Life
The specified minimum time during which the characteristics of a continuous or intermittent instrument apply without changes beyond specified tolerances.
Inaccuracy
The difference between the average of the measurements and the actual value of the measured quantity. It indicates how far the measurements lie from the actual value.
Measurement Error
The algebraic difference between the value read or transmitted by the instrument and the actual value of the measured variable.
Zero Adjustment
A linear displacement of the measurement range to obtain the correct zero reading.
Span Adjustment
Corrects the angular (gain) error, in which the reading is accurate at one arbitrary point of the scale and the error grows in proportion to the distance from that point.
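The two adjustments correspond to the offset and slope of a linear correction applied to the raw indication. A sketch, assuming a simple linear instrument (the readings below are illustrative):

```python
def adjusted(raw: float, zero_offset: float, span_gain: float) -> float:
    """Zero adjustment shifts every reading equally (offset); span
    adjustment rescales readings (gain) so the error no longer grows
    along the scale."""
    return span_gain * raw + zero_offset

# Instrument reads 1.0 at true 0 and 99.0 at true 100; solve for gain/offset:
gain = (100.0 - 0.0) / (99.0 - 1.0)
offset = 0.0 - gain * 1.0
print(adjusted(1.0, offset, gain), adjusted(99.0, offset, gain))
```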
Instruments without Indication (Blind Instruments)
Instruments that have no visible indication of the variable.
Instruments with a Display
Instruments that have an index and a scale to read the value of the variable.
On-Off Control
The most elementary type of control: the command is activated when the variable is below the setpoint and deactivated when the variable is above it. An advantage is low wear on the electromechanical contacts.
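The on-off law can be written in a few lines. The optional dead band below is an assumption beyond the bare definition, added as is common in practice to avoid rapid switching around the setpoint:

```python
def on_off(variable: float, setpoint: float, output_on: bool,
           dead_band: float = 0.0) -> bool:
    """Return the new command state: on below the setpoint, off above it.
    The optional dead band keeps the current state near the setpoint."""
    if variable < setpoint - dead_band / 2:
        return True
    if variable > setpoint + dead_band / 2:
        return False
    return output_on  # inside the dead band: keep the previous state

print(on_off(18.0, 20.0, False))  # below setpoint: turn on
print(on_off(22.0, 20.0, True))   # above setpoint: turn off
```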
Setpoint
The desired value of the variable that the control system should maintain.
Error
The difference between the value of the process variable and the setpoint.
Proportional Band
The band of the variable, below the setpoint, within which the controller output varies in proportion to the error, decreasing as the variable approaches the setpoint.
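In proportional control, the output sweeps its full range as the variable traverses the proportional band. A sketch, assuming an output expressed in percent and, for simplicity, a band centered on the setpoint:

```python
def proportional_output(variable: float, setpoint: float, band: float) -> float:
    """Controller output in %, 0-100, varying linearly with the error
    inside a proportional band centered on the setpoint."""
    error = setpoint - variable
    output = 50.0 + 100.0 * error / band  # 50 % output at zero error
    return max(0.0, min(100.0, output))   # saturate outside the band

# Band of 10 units around a setpoint of 50:
print(proportional_output(50.0, 50.0, 10.0))  # at the setpoint: 50.0 %
print(proportional_output(45.0, 50.0, 10.0))  # at the band's lower edge: 100.0 %
```

A narrower band gives a steeper response (higher gain); outside the band the output simply saturates at 0 % or 100 %.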