In every process plant, whether it is oil & gas, power, chemical, water, or pharma, instrument calibration plays a very important role.
No matter how expensive or advanced an instrument is, if it is not calibrated correctly, the measurement cannot be trusted.
Many students and junior technicians feel confused when they first hear calibration terms like accuracy, repeatability, uncertainty, traceability, and drift. These words sound technical, but their meanings are actually simple when explained in the right way.
This article explains the most common terms used in instrument calibration. If you understand these basic terms clearly, your foundation in instrument calibration will become very strong.
What Is Instrument Calibration?
In very simple words:
Instrument calibration means comparing an instrument with a known correct reference and checking how accurate it is.
During instrument calibration, we compare:
- A working instrument (like a pressure gauge or temperature sensor)
- With a standard instrument (whose value is already known)
If there is any error, we either record it or adjust the instrument.
Instrument calibration helps to:
- Improve measurement accuracy
- Maintain product quality
- Ensure plant safety
- Meet audit and ISO requirements
- Avoid costly process failures
1. Accuracy
Accuracy means how close the instrument reading is to the true value.
Example:
If the true pressure is 10 bar and your gauge shows 9.95 bar, the gauge is very accurate.
- High accuracy = reading is close to the actual value
- Low accuracy = reading is far from the actual value
Accuracy is one of the most important goals of instrument calibration.
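A small Python sketch, reusing the 10 bar example above, shows how the error is calculated:

```python
# Minimal sketch: absolute and percent error for the pressure gauge example.
true_value = 10.0      # bar, value applied by the reference standard
measured_value = 9.95  # bar, reading shown by the gauge under test

error = measured_value - true_value          # -0.05 bar
percent_error = (error / true_value) * 100   # -0.5 %

print(f"Error: {error:+.2f} bar ({percent_error:+.2f} % of reading)")
```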
2. Precision
Precision means how consistently an instrument shows the same reading under the same conditions.
An instrument can be:
- Precise but not accurate
- Accurate but not precise
- Both accurate and precise
Example:
If a thermometer always shows 98.0°C even when the actual value is 100°C, it is precise but not accurate.
Instrument calibration improves accuracy, not precision.
You can also read our article: Difference between Accuracy and Precision
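To see the difference in numbers, here is a small Python sketch; the repeated thermometer readings are assumed for illustration, stuck near 98°C while the true value is 100°C:

```python
import statistics

# Assumed repeated readings of a thermometer at a true temperature of 100 °C.
true_value = 100.0
readings = [98.0, 98.1, 97.9, 98.0, 98.0]  # °C

mean_reading = statistics.mean(readings)  # closeness to 100 °C -> accuracy
spread = statistics.stdev(readings)       # consistency of readings -> precision

print(f"Mean reading : {mean_reading:.2f} °C (offset {mean_reading - true_value:+.2f} °C)")
print(f"Std deviation: {spread:.2f} °C")
# Small spread but large offset: the thermometer is precise, not accurate.
```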
3. Repeatability
Repeatability means the ability of an instrument to give the same output when the same input is applied repeatedly.
Example:
If you apply 5 bar pressure three times and the gauge shows:
- 5.01 bar
- 5.00 bar
- 5.02 bar
The repeatability is very good.
Good repeatability is essential for reliable instrument calibration.
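Using the three readings above, a quick Python sketch shows one simple way to express repeatability as the spread between the highest and lowest reading:

```python
# Repeatability check: the same 5 bar input applied three times.
readings = [5.01, 5.00, 5.02]  # bar

repeatability_spread = max(readings) - min(readings)  # 0.02 bar
print(f"Repeatability (max - min): {repeatability_spread:.2f} bar")
```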
4. Reproducibility
Reproducibility means getting the same result:
- At different times
- By different people
- Using different instruments of the same type
Repeatability is consistency under the same conditions, while reproducibility checks consistency under changed conditions (different time, operator, or instrument) in instrument calibration.
5. Resolution
Resolution is the smallest change in input that an instrument can detect.
Example:
- A digital thermometer with resolution 0.1°C can detect temperature change of 0.1°C
- A pressure gauge with 100 scale divisions has higher resolution than one with 50 divisions over the same range
Higher resolution helps during fine instrument calibration adjustments.
6. Range
Range is the minimum and maximum value that an instrument can measure.
Example:
A pressure transmitter may have a range such as:
- 0 to 10 bar
- 5 to 50 bar
Always perform instrument calibration within the specified range.
7. Span
Span = Maximum value – Minimum value
Example:
If the range is 0 to 100 °C:
Span = 100 – 0 = 100 °C
Span is used during zero and span adjustment in instrument calibration.
You can read our article: Zero and Span adjustment
8. Zero
Zero is the lowest value of the instrument range.
- 0 bar for pressure
- 0°C for temperature
- 4 mA signal for transmitter zero
Zero setting is the first step in most instrument calibration procedures.
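To connect zero, span, and the 4 mA transmitter signal mentioned above, here is a small Python sketch of the usual linear 4-20 mA scaling; the 0 to 10 bar range is an assumed example:

```python
def to_milliamps(pv, lrv=0.0, urv=10.0):
    """Convert a process value to a 4-20 mA signal.

    lrv: lower range value (zero), urv: upper range value.
    span = urv - lrv; 4 mA corresponds to zero, 20 mA to full scale.
    """
    span = urv - lrv
    return 4.0 + 16.0 * (pv - lrv) / span

# Assumed example range: 0 to 10 bar
print(to_milliamps(0.0))   # 4.0  mA at zero
print(to_milliamps(5.0))   # 12.0 mA at mid-range
print(to_milliamps(10.0))  # 20.0 mA at full scale
```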
9. Zero Shift
When the zero point changes over time due to:
- Vibration
- Temperature change
- Mechanical stress
This is called zero shift.
Zero shift must be corrected through instrument calibration.
10. Span Shift
When the full-scale output changes but zero stays correct, it is called span shift.
Example:
- 0 bar reads correctly
- 10 bar reads as 9.5 bar
This requires span adjustment during instrument calibration.
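A short Python sketch, using the assumed 0 to 10 bar readings above, shows how comparing the zero error with the full-scale error separates a span shift from a zero shift:

```python
# Assumed as-found data for a 0-10 bar gauge (matching the example above).
zero_reading = 0.0        # bar indicated when 0 bar is applied
full_scale_reading = 9.5  # bar indicated when 10 bar is applied

zero_error = zero_reading - 0.0          # 0.0 bar  -> zero point is correct
span_error = full_scale_reading - 10.0   # -0.5 bar -> full scale reads low

print(f"Zero error: {zero_error:+.2f} bar, span error: {span_error:+.2f} bar")
# Zero error ~0 with a significant span error indicates a span shift,
# so only the span needs adjustment.
```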
11. Drift
Drift is the slow change in instrument reading over a long period, even if input is constant.
Drift can occur due to:
- Aging of components
- Temperature variation
- Electronics degradation
This is why instrument calibration must be done periodically.
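As a rough illustration, the Python sketch below estimates a drift rate from two successive as-found errors; the dates and error values are assumed for the example:

```python
from datetime import date

# Assumed as-found errors at the same test point from two successive calibrations.
previous_error, previous_date = 0.02, date(2023, 6, 1)  # bar
current_error, current_date = 0.07, date(2024, 6, 1)    # bar

months = (current_date - previous_date).days / 30.44
drift_rate = (current_error - previous_error) / months

print(f"Drift: {drift_rate:+.4f} bar per month")
```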
12. Linearity
Linearity means how closely the instrument output follows a straight-line relationship with the input.
A perfectly linear instrument gives proportional output across the entire range.
Non-linearity adds error and must be checked during instrument calibration.
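One simple check is to compare each reading against the straight line drawn between the end points. The Python sketch below does this with assumed 0 to 100°C data:

```python
# End-point linearity check: compare each reading against the straight line
# drawn between the first and last calibration points.
# Assumed data for a 0-100 °C sensor; values are illustrative only.
applied   = [0.0, 25.0, 50.0, 75.0, 100.0]   # °C applied by the standard
indicated = [0.0, 25.4, 50.6, 75.3, 100.0]   # °C shown by the instrument

span = applied[-1] - applied[0]
slope = (indicated[-1] - indicated[0]) / span

deviations = [ind - (indicated[0] + slope * (app - applied[0]))
              for app, ind in zip(applied, indicated)]
max_dev = max(abs(d) for d in deviations)

print(f"Maximum deviation from straight line: {max_dev:.2f} °C "
      f"({100 * max_dev / span:.2f} % of span)")
```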
13. Sensitivity
Sensitivity means how much the output changes for a given input change.
High sensitivity = small input change gives noticeable output change.
It affects how responsive an instrument is during instrument calibration.
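A tiny Python sketch with assumed numbers shows the basic calculation:

```python
# Sensitivity = change in output / change in input.
# Assumed example: a sensor output rises 0.41 mV for a 10 °C temperature rise.
delta_output = 0.41  # mV
delta_input = 10.0   # °C

sensitivity = delta_output / delta_input
print(f"Sensitivity: {sensitivity:.3f} mV/°C")
```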
14. Hysteresis
Hysteresis is the difference in output when the same input is approached from:
- Increasing side
- Decreasing side
Example:
- 5 bar while increasing → gauge shows 5.02 bar
- 5 bar while decreasing → gauge shows 4.98 bar
This difference is hysteresis, checked during instrument calibration.
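Using the readings from the example above (and an assumed 10 bar full scale), a short Python sketch expresses hysteresis as a percentage of full scale:

```python
# Hysteresis at the 5 bar test point, from the example above.
reading_increasing = 5.02  # bar, approached from below
reading_decreasing = 4.98  # bar, approached from above
full_scale = 10.0          # bar, assumed gauge full scale

hysteresis = abs(reading_increasing - reading_decreasing)  # 0.04 bar
print(f"Hysteresis: {hysteresis:.2f} bar "
      f"({100 * hysteresis / full_scale:.1f} % of full scale)")
```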
15. Tolerance
Tolerance is the acceptable limit of error allowed by the manufacturer or process standard.
Example:
- ±0.5% of full scale
- ±1°C
If an error is within tolerance, the instrument calibration is usually accepted.
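A short Python sketch shows a simple pass/fail check against a ±0.5% of full scale tolerance; the readings are assumed for illustration:

```python
# Pass/fail check against a tolerance of ±0.5 % of full scale (assumed figures).
full_scale = 10.0               # bar
tolerance = 0.005 * full_scale  # ±0.05 bar
applied, indicated = 10.0, 9.96 # bar

error = indicated - applied
within_tolerance = abs(error) <= tolerance
print(f"Error {error:+.2f} bar, limit ±{tolerance:.2f} bar -> "
      f"{'PASS' if within_tolerance else 'FAIL'}")
```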
16. Uncertainty
Uncertainty tells us how confident we are that the measured value is correct.
It includes:
- Instrument error
- Environmental effect
- Human handling error
- Reference standard accuracy
Every professional instrument calibration report mentions uncertainty.
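As a simplified sketch (assuming the components are independent and already expressed as standard uncertainties), the Python example below combines them by root-sum-of-squares and expands the result with a coverage factor of k = 2:

```python
import math

# Simplified sketch: combine independent standard uncertainties by
# root-sum-of-squares and expand with a coverage factor k = 2 (about 95 %).
# Component values are assumed for illustration only.
components = {
    "reference standard": 0.010,  # bar
    "resolution":         0.003,  # bar
    "repeatability":      0.008,  # bar
    "environment":        0.005,  # bar
}

combined = math.sqrt(sum(u**2 for u in components.values()))
expanded = 2 * combined  # coverage factor k = 2

print(f"Combined standard uncertainty: {combined:.4f} bar")
print(f"Expanded uncertainty (k=2):    {expanded:.4f} bar")
```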
17. Traceability
Traceability means the calibration standard used is linked to:
- National measurement standards (for example, NIST in the USA)
- International standards (the SI units)
Traceability ensures that instrument calibration results are globally accepted.
18. As-Found Condition
This is the condition of the instrument before calibration.
It shows:
- How much the instrument has drifted
- Whether it was within tolerance or not
As-found data is very important in instrument calibration records.
19. As-Left Condition
This is the condition of the instrument after calibration and adjustment.
It confirms the instrument is now:
- Correct
- Within tolerance
- Ready for service
20. Calibration Interval
Calibration interval is the time between two calibration cycles.
It depends on:
- Instrument type
- Process criticality
- Past drift history
- Regulatory requirements
Common intervals:
- 3 months
- 6 months
- 1 year
Choosing the right instrument calibration interval saves money and avoids risk.
21. Working Standard
A working standard is the instrument used daily to calibrate plant instruments.
It is also calibrated periodically using a higher-level standard.
22. Master Standard
A master standard is a highly accurate reference used to calibrate working standards.
It forms the backbone of the instrument calibration chain.
23. Calibration Certificate
A calibration certificate is the official document that shows:
- Instrument details
- Calibration date
- As-found readings
- As-left readings
- Error values
- Tolerance
- Uncertainty
- Traceability
- Technician signature
Always check certificates after calibration.
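As a rough illustration only, the Python sketch below stores these fields in a simple record; the field names and values are hypothetical, not a standard certificate format:

```python
# Hypothetical structure for the certificate fields listed above;
# names and values are illustrative only.
certificate = {
    "instrument_tag": "PT-101",
    "calibration_date": "2024-05-12",
    "as_found": {"0 bar": 0.05, "10 bar": 9.94},   # indicated readings, bar
    "as_left": {"0 bar": 0.00, "10 bar": 10.00},
    "tolerance": "±0.5 % of full scale",
    "expanded_uncertainty": "0.03 bar (k=2)",
    "traceability": "National standard via accredited lab",
    "technician": "A. Example",
}

for field, value in certificate.items():
    print(f"{field}: {value}")
```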
24. Adjustment vs Calibration (Very Important)
Many people confuse these two terms.
- Calibration = Checking and comparing
- Adjustment = Physically correcting the error
You can calibrate without adjusting, but you cannot adjust without calibration.
Common Confusions
Accuracy vs Precision
- Accuracy = closeness to true value
- Precision = consistency of reading
Repeatability vs Reproducibility
- Repeatability = same conditions
- Reproducibility = different conditions
Calibration vs Verification
- Calibration = comparing & possibly adjusting
- Verification = only checking
Understanding these differences improves your instrument calibration knowledge greatly.
What Did We Learn Today?
Understanding the common terms used in instrument calibration is the first step toward becoming a skilled instrumentation professional. These terms are not just textbook words; they are used daily in real plants, audits, and maintenance activities.
When you clearly understand concepts like accuracy, repeatability, uncertainty, traceability, drift, and tolerance, you automatically improve your practical skills in calibration.
