Calibration of instruments
Instrument calibration is one of the primary processes used to maintain instrument accuracy. Calibration is the process of configuring an instrument to provide a result for a sample within an acceptable range. Eliminating factors that cause inaccurate measurements is a fundamental aspect of instrumentation design.
Although the exact procedure varies from product to product, the calibration process generally involves using the instrument to test samples of one or more known values, called “calibrators.” The results are used to establish a relationship between the measurement technique used by the instrument and the known values. The process, in essence, teaches the instrument to produce results that are more accurate than they would otherwise be. The instrument can then provide more accurate results when samples of unknown value are tested in normal use of the product.
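The relationship described above is often a simple linear "curve" fitted through the calibrator points. A minimal sketch, assuming a linear instrument response and using invented calibrator values and readings (a real procedure would use certified reference values):

```python
# Sketch of establishing a calibration relationship from known calibrators.
# All numbers are illustrative, not from any real instrument.

def fit_linear_calibration(readings, true_values):
    """Least-squares fit of: true_value = slope * reading + offset."""
    n = len(readings)
    mean_r = sum(readings) / n
    mean_t = sum(true_values) / n
    cov = sum((r - mean_r) * (t - mean_t) for r, t in zip(readings, true_values))
    var = sum((r - mean_r) ** 2 for r in readings)
    slope = cov / var
    offset = mean_t - slope * mean_r
    return slope, offset

# Three calibrators of known value, and the instrument's raw readings:
calibrators = [10.0, 50.0, 90.0]
raw_readings = [11.2, 52.0, 92.8]

slope, offset = fit_linear_calibration(raw_readings, calibrators)

def corrected(reading):
    """Convert a raw instrument reading into a calibrated result."""
    return slope * reading + offset
```

Once `slope` and `offset` are established, samples of unknown value are measured and passed through `corrected()` to obtain results tied back to the known calibrators.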
Calibrations are performed using only a few calibrators to establish the correlation at specific points within the instrument’s operating range. While it might be desirable to use a large number of calibrators to establish the calibration relationship, or “curve,” the time and labour associated with preparing and testing a large number of calibrators might outweigh the resulting level of performance. A trade-off must be made between the desired level of product performance and the effort associated with accomplishing the calibration. The instrument will provide the best performance when the intermediate points given in the manufacturer’s performance specifications are used for calibration; the specified process essentially eliminates, or zeroes out, the inherent instrument error at these points.
Figure 1. (Image reference: https://www.surecontrols.com/)
Why is calibration important?
The accuracy of all measuring devices degrades over time. This is typically caused by normal wear and tear. However, changes in accuracy can also be caused by electrical or mechanical shock or by a hazardous manufacturing environment (oils, metal chips, etc.). Depending on the type of instrument and the environment in which it is used, accuracy may degrade very quickly or over a long period of time. The bottom line is that calibration improves the accuracy of the measuring device, and accurate measuring devices improve product quality.
When should you calibrate your measuring device?
A measuring device should be calibrated:
- According to the manufacturer's recommendation.
- After any mechanical or electrical shock.
- Periodically (annually, quarterly, monthly).
The hidden costs and risks associated with uncalibrated measuring devices can be much higher than the cost of calibration. Therefore, it is recommended that measuring instruments be calibrated regularly by a reputable company to ensure that the errors associated with the measurements are within an acceptable range.
Physical Features to Be Calibrated
The physical features to be calibrated depend on the characteristics of the measuring instrument. Calibration is applied to a diverse range of measurement instruments and processes. Some examples follow.
Flow calibration:
There are many devices and facilities available for the measurement of liquid, air, or solid flow. Once the method of measurement is determined by an appropriate flowmeter setup, static or dynamic calibrations can be carried out. In the case of static-gravimetric liquid flow, a calibration facility may include a reservoir, pumping system, pipeline, the flowmeter under test located on the pipeline, a collection system, computers and interfaces, supporting software, and so on. The flow of fluid through the meter can be determined by collecting the mass of fluid delivered at a steady flow rate over a measured time interval.
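The static-gravimetric method above reduces to a simple computation: divide the collected mass by the collection time to get a reference flow rate, then compare the flowmeter's own reading against it. A minimal sketch with illustrative numbers:

```python
# Static-gravimetric flow check (illustrative values): a known mass of
# fluid is collected over a timed interval and the computed mass flow
# rate is compared with the reading of the flowmeter under test.

collected_mass_kg = 25.0       # mass accumulated in the collection tank
collection_time_s = 50.0       # timed collection interval
meter_reading_kg_per_s = 0.49  # what the flowmeter under test indicated

reference_flow = collected_mass_kg / collection_time_s  # gravimetric reference
error = meter_reading_kg_per_s - reference_flow
error_percent = 100.0 * error / reference_flow

print(f"reference flow: {reference_flow:.3f} kg/s")
print(f"meter error: {error_percent:+.1f} % of reading")
```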
Sensor calibration:
Sensors may have to be calibrated after their data are integrated with a signal conditioning system. Calibrating the processing part of the system requires injecting a known input signal into the sensor. By observing the output, a correct output scale can be configured for that particular application. If the sensor is used for time-varying inputs, dynamic calibration becomes necessary. In most cases, the transient behaviour of the sensor's step response may be sufficient to assess its dynamic response.
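For a static calibration of a linear sensor, injecting two known inputs is enough to configure the output scale, since two points determine the gain and offset. A minimal sketch, assuming a linear sensor and using made-up raw/reference values:

```python
# Two-point (gain and offset) static sensor calibration sketch.
# The raw and reference values below are illustrative.

def two_point_calibration(raw_lo, raw_hi, ref_lo, ref_hi):
    """Return (gain, offset) such that: reference = gain * raw + offset."""
    gain = (ref_hi - ref_lo) / (raw_hi - raw_lo)
    offset = ref_lo - gain * raw_lo
    return gain, offset

# Known inputs injected into the signal chain, and the raw outputs observed:
gain, offset = two_point_calibration(raw_lo=0.12, raw_hi=4.87,
                                     ref_lo=0.0, ref_hi=100.0)

def scaled(raw):
    """Convert a raw sensor output to the calibrated engineering scale."""
    return gain * raw + offset
```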
Calibration of food products:
As food contains many chemical substances, the calibration of devices becomes complex. In the case of honey, for example, the following parameters need to be identified by calibrated instruments and processes: fructose, glucose, turanose, maltose, moisture level, acidity, and so on.
Calibration of images:
Calibration is one of the first steps in image processing. For example, astronomical images are calibrated to eliminate the effect of cameras, light pollution, and distortions. Various methods are used to ensure calibrated images by eliminating thermal, readout, and other effects. For thermal effects, the cameras are cooled below certain temperatures and dark frames are used to compensate for the noise generated by camera electronics.
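The dark-frame technique mentioned above amounts to a pixel-by-pixel subtraction: a frame exposed with the shutter closed captures only the thermal and electronic noise, which is then removed from the light frame. A minimal sketch using tiny made-up 2x2 "images":

```python
# Dark-frame calibration sketch for an astronomical image.
# The pixel values are invented for illustration.

light_frame = [[120, 135], [128, 140]]  # exposure of the sky
dark_frame  = [[ 20,  15], [ 18,  10]]  # same exposure, shutter closed

# Subtract the dark frame pixel by pixel, clamping at zero so that
# noise fluctuations cannot produce negative pixel values.
calibrated = [
    [max(l - d, 0) for l, d in zip(light_row, dark_row)]
    for light_row, dark_row in zip(light_frame, dark_frame)
]
```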
Basic calibration process
Purpose and scope
The calibration process begins with the design of the measuring instrument that needs to be calibrated. The design has to be able to hold a calibration through its calibration interval. In other words, the design has to be capable of measurements that are "within engineering tolerance" when used within the stated environmental conditions over some reasonable period of time. Having a design with all these characteristics increases the likelihood of the actual measuring instruments performing as expected. Basically, the purpose of calibration is to maintain the quality of measurement and to ensure that a particular instrument works properly.
Frequency
The exact mechanism for assigning tolerance values varies by country and by type of industry. The manufacturer of the measuring equipment generally assigns the measurement tolerance, suggests a calibration interval (CI), and specifies the environmental range of use and storage. The using organization generally assigns the actual calibration interval, which depends on the specific equipment's likely level of usage. The assignment of calibration intervals can be a formal process based on the results of previous calibrations.
Standards Required and Accuracy
The next step is to define the calibration process. The selection of a standard or standards is the most visible part of the calibration process. Ideally, the standard has less than 1/4 of the measurement uncertainty of the device being calibrated. When this goal is met, the accumulated measurement uncertainty of all the standards involved is considered insignificant when the final measurement is also made with a 4:1 ratio.
Maintaining a 4:1 accuracy ratio with modern equipment is difficult, as the test equipment being calibrated can be just as accurate as the working standard. If the accuracy ratio is less than 4:1, the calibration tolerance can be reduced to compensate. When 1:1 is reached, only an exact match between the standard and the device being calibrated constitutes a completely correct calibration. Another common method for dealing with this capability mismatch is to reduce the stated accuracy of the device being calibrated.
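The 4:1 rule above is easy to express as a check: the ratio of the device's tolerance to the standard's uncertainty should be at least 4. A minimal sketch with illustrative numbers:

```python
# Sketch of the 4:1 accuracy-ratio check described above.
# Tolerance and uncertainty values are illustrative.

def accuracy_ratio(device_tolerance, standard_uncertainty):
    """Ratio of the calibrated device's tolerance to the standard's uncertainty."""
    return device_tolerance / standard_uncertainty

ratio = accuracy_ratio(device_tolerance=1.0, standard_uncertainty=0.25)
meets_4_to_1 = ratio >= 4.0
```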
Manual and Automatic Calibrations
Calibration methods for modern devices can be manual or automatic. As an example, a manual process may be used for the calibration of a pressure gauge. The procedure requires multiple steps: connecting the gauge under test to a reference master gauge and an adjustable pressure source, applying fluid pressure to both the reference and test gauges at definite points over the span of the gauge, and comparing the readings of the two. The gauge under test may be adjusted to ensure that its zero point and response to pressure comply as closely as possible with the intended accuracy. Each step of this process requires manual record keeping.
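The record-keeping step of the manual procedure can be sketched as a comparison table: at each test point, the gauge under test is compared with the master gauge and the deviation checked against a tolerance. All numbers below are illustrative:

```python
# Sketch of comparing a pressure gauge under test with a reference master
# gauge at several points over the span. Readings and tolerance are invented.

test_points = [  # (master gauge reading, gauge-under-test reading), in bar
    (0.0, 0.1), (2.5, 2.6), (5.0, 5.0), (7.5, 7.4), (10.0, 9.9),
]
tolerance = 0.15  # allowed deviation, bar

for master, under_test in test_points:
    error = under_test - master
    status = "PASS" if abs(error) <= tolerance else "FAIL"
    print(f"{master:5.1f} bar: error {error:+.2f} bar -> {status}")
```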
Figure 2. Manual calibration. (Image reference: https://en.wikipedia.org/wiki/Calibration)
An automatic pressure calibrator is a device that combines an electronic control unit, a pressure intensifier used to compress a gas such as nitrogen, a pressure transducer used to detect desired levels in a hydraulic accumulator, and accessories such as liquid traps and gauge fittings. An automatic system may also include data collection facilities to automate the gathering of data for record keeping.
Figure 3. Automatic calibration. (Image reference: https://en.wikipedia.org/wiki/Calibration)
Benefits of Calibration
Calibration is a process of testing measurement instruments and processes and comparing their errors against accepted standards in order to detect and correct variations in performance. Calibration therefore assures that devices and processes meet expected performance specifications within universally acceptable levels of accuracy. Hence, calibration has the following benefits:
• It determines whether measurements made before calibration were valid.
• It gives confidence that future measurements will be accurate.
• It assures consistency and compatibility with those made elsewhere.
• It leads to repeatability and reproducibility assessments of the instruments and processes.
• Without calibration, the product quality may be poor, thus opening up legal challenges and high failure rates of the products, thus increasing costs.
• It increases efficiency by ensuring that the measurements are correct.
• In the process industry, calibration of devices assures that the processes are well controlled and that the products meet expected specifications.
• It leads to the documentation of the performance of instruments and processes to meet quality standards such as ISO 9000, ISO 14000, and QS-9000.
• Frequent calibrations can provide a graphical view of equipment uncertainty over time, thus leading to the reliability of performance. This gives in-service life analysis; hence, depreciation and replacements can be predicted in an informed manner.
• Measurements made within international standards promote global acceptance, thus increasing competitiveness.
• It helps the convenient implementation of related regulations and legislation that govern the use of equipment in a particular application.
• As technology changes, the regulations and legislation governing test and measuring instruments change continually, and calibration helps keep measurements and processes valid and compliant under changing conditions.
Costs of Calibration
A successful calibration process requires hardware and software, special equipment, and manpower; the costs therefore vary depending on the intensity of use of all these resources. The cost of calibration depends on what is calibrated and who performs the calibration. In simple cases where a one-off instrument is involved, the cost can be lower than one hundred dollars, but complex cases can cost thousands of dollars. Calibration cost also depends on whether the calibration is carried out on the premises of a calibration laboratory, on the factory floor, or outsourced to third parties. Certification to ISO 10012-1, ISO 9001, MIL-STD 45662A, and MIL-HDBK-52B requires calibration of measuring equipment. In many situations, such as the calibration of weighing systems, it is a statutory requirement.

One of the major cost factors is the frequency of calibration of an instrument. Most calibration systems issue a validity period during which the instrument can be used without concern for major errors and uncertainties. Some organizations use finely worked-out methods for determining calibration intervals, while others use conservative calibration intervals barely able to meet the legal demands. The perception exists that calibration costs can be reduced if the interval can be stretched legitimately. However, the use of uncalibrated instruments in an organization can be costly, as it may affect product quality and the quality of downstream operations.

Standards such as MIL-STD 45662A suggest good calibration intervals. As a rule of thumb, 85 to 95% of all instruments returned for calibration meet calibration limits. The calibration limits are determined from probability charts of the age of instruments and their failure data. Usually, an instrument must be calibrated if its failure rate increases or its functionality deteriorates compared with other standard instruments.
Trends in Calibration
With the availability of advancing technology, the classical calibration process is changing on at least three fronts:
1. Electronic calibration
2. E-calibration using Internet and communication techniques
3. Intelligent and self-calibrating instruments and sensors.
Electronic calibration: Many instruments offer features for closed-case calibration, so electronic calibration can be employed. Electronic calibration is a single-connection, one- or two-port calibration technique that does not disturb the components inside the case. Once the calibrating equipment, for example a computer, is linked with the device under calibration, appropriate software generates the necessary calibration information. Errors due to gains and offsets of the instrument are corrected mathematically within the instrument processor to obtain the correct measured values. Analog corrections can also be made by adjusting the parameters of the digital-to-analog converters. Corrected calibration constants are kept in non-volatile memory for permanent use.
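The in-processor correction step described above can be sketched in a few lines: stored gain and offset constants are applied to every raw reading. The constant names and values below are hypothetical, standing in for values an instrument would read back from its non-volatile memory:

```python
# Sketch of the mathematical gain/offset correction performed inside the
# instrument processor. Constants are illustrative; in a real instrument
# they would be written to non-volatile memory during electronic calibration.

CAL_GAIN = 1.002
CAL_OFFSET = -0.035

def correct(raw_value):
    """Apply the stored calibration constants to a raw measurement."""
    return CAL_GAIN * raw_value + CAL_OFFSET
```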
1) Upendra Dani-55; Mechanical Engineering Department, VIT Pune.
2) Vishwajeet Desai-64; Mechanical Engineering Department, VIT Pune.
3) Nivedita Deshmukh-66; Mechanical Engineering Department, VIT Pune.
4) Swanand Deshmukh-69; Mechanical Engineering Department, VIT Pune.
5) Saurabh Dhirde-76; Mechanical Engineering Department, VIT Pune.


