In measurement technology and metrology, calibration is the comparison of measurement values delivered by a device under test with those of a calibration standard of known accuracy. Such a standard could be another measurement device of known accuracy, a device generating the quantity to be measured, such as a voltage or a sound tone, or a physical artifact, such as a metre ruler. The outcome of the comparison can result in one of the following:
* no significant error being noted on the device under test
* a significant error being noted but no adjustment made
* an adjustment made to correct the error to an acceptable level
Strictly speaking, the term "calibration" means just the act of comparison and does not include any subsequent adjustment. The calibration standard is normally traceable to a national or international standard held by a metrology body.
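The three possible outcomes of the comparison can be sketched in code. This is a minimal illustration with invented readings and a hypothetical tolerance, not a prescribed procedure:

```python
def compare(dut_reading, standard_value, tolerance):
    """Classify the outcome of comparing a device under test (DUT)
    against a calibration standard of known accuracy."""
    error = dut_reading - standard_value
    if abs(error) <= tolerance:
        return "no significant error"
    # a significant error may be recorded as-is, or corrected by adjustment
    return "significant error of %+g units" % error

print(compare(100.2, 100.0, 0.5))  # within tolerance
print(compare(101.2, 100.0, 0.5))  # out of tolerance
```

Whether an out-of-tolerance result leads to an adjustment is a separate decision; as the text notes, the comparison itself is the calibration.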


BIPM Definition

The formal definition of calibration by the International Bureau of Weights and Measures (BIPM) is the following: "Operation that, under specified conditions, in a first step, establishes a relation between the quantity values with measurement uncertainties provided by measurement standards and corresponding indications with associated measurement uncertainties (of the calibrated instrument or secondary standard) and, in a second step, uses this information to establish a relation for obtaining a measurement result from an indication." (JCGM 200:2008, ''International Vocabulary of Metrology — Basic and General Concepts and Associated Terms'' (VIM).)

This definition states that the calibration process is purely a comparison, but introduces the concept of measurement uncertainty in relating the accuracies of the device under test and the standard.
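The BIPM's two steps can be sketched numerically. Assuming a linear instrument response, step one fits a relation between the instrument's indications and the standard's quantity values, and step two uses that relation to turn a later indication into a measurement result. The data points here are invented for illustration:

```python
def fit_calibration(indications, standard_values):
    """Step 1: establish a linear relation y = a + b*x between the
    instrument's indications x and the standard's quantity values y
    (ordinary least squares)."""
    n = len(indications)
    mx = sum(indications) / n
    my = sum(standard_values) / n
    b = sum((x - mx) * (y - my) for x, y in zip(indications, standard_values)) \
        / sum((x - mx) ** 2 for x in indications)
    a = my - b * mx
    return a, b

def measurement_result(indication, a, b):
    """Step 2: use the established relation to obtain a measurement
    result from an indication."""
    return a + b * indication

# invented example: the instrument reads slightly high against the standard
a, b = fit_calibration([0.0, 5.1, 10.2], [0.0, 5.0, 10.0])
print(round(measurement_result(7.0, a, b), 3))
```

A full treatment would also propagate the measurement uncertainties the definition mentions; this sketch shows only the relation itself.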


Modern calibration processes

The increasing need for known accuracy and uncertainty and the need for consistent and comparable standards internationally has led to the establishment of national laboratories. In many countries a National Metrology Institute (NMI) will exist which will maintain primary standards of measurement (the main SI units plus a number of derived units) which will be used to provide traceability to customers' instruments by calibration. The NMI supports the metrological infrastructure in that country (and often others) by establishing an unbroken chain from the top level of standards to an instrument used for measurement. Examples of National Metrology Institutes are NPL in the UK, NIST in the United States, PTB in Germany and many others. Since the Mutual Recognition Agreement was signed, it is now straightforward to take traceability from any participating NMI; it is no longer necessary for a company to obtain traceability for measurements from the NMI of the country in which it is situated, such as the National Physical Laboratory in the UK.


Quality

To improve the quality of the calibration and have the results accepted by outside organizations, it is desirable for the calibration and subsequent measurements to be "traceable" to the internationally defined measurement units. Establishing traceability is accomplished by a formal comparison to a standard which is directly or indirectly related to national standards (such as NIST in the USA), international standards, or certified reference materials. This may be done by national standards laboratories operated by the government or by private firms offering metrology services.

Quality management systems call for an effective metrology system which includes formal, periodic, and documented calibration of all measuring instruments. The ISO 9000 and ISO 17025 standards require that these traceable actions are to a high level and set out how they can be quantified. (ISO 9001: "Quality management systems — Requirements" (2008), section 7.6; ISO 17025: "General requirements for the competence of testing and calibration laboratories" (2005), section 5.)

To communicate the quality of a calibration, the calibration value is often accompanied by a traceable uncertainty statement to a stated confidence level. This is evaluated through careful uncertainty analysis. Sometimes a DFS (Departure From Spec) is required to operate machinery in a degraded state. Whenever this does happen, it must be in writing and authorized by a manager with the technical assistance of a calibration technician.

Measuring devices and instruments are categorized according to the physical quantities they are designed to measure. These vary internationally, e.g., NIST 150-2G in the U.S. and NABL-141 in India. Together, these standards cover instruments that measure various physical quantities such as electromagnetic radiation (RF probes), sound (sound level meter or noise dosimeter), time and frequency (intervalometer), ionizing radiation (Geiger counter), light (light meter), mechanical quantities (limit switch, pressure gauge, pressure switch), and thermodynamic or thermal properties (thermometer, temperature controller). The standard instrument for each test device varies accordingly, e.g., a dead weight tester for pressure gauge calibration and a dry block temperature tester for temperature gauge calibration.


Instrument calibration prompts

Calibration may be required for the following reasons:
* a new instrument
* after an instrument has been repaired or modified
* moving from one location to another location
* when a specified time period has elapsed
* when a specified usage (operating hours) has elapsed
* before and/or after a critical measurement
* after an event, for example:
** after an instrument has been exposed to a shock, vibration, or physical damage, which might potentially have compromised the integrity of its calibration
** sudden changes in weather
* whenever observations appear questionable or instrument indications do not match the output of surrogate instruments
* as specified by a requirement, e.g., customer specification or instrument manufacturer recommendation

In general use, calibration is often regarded as including the process of adjusting the output or indication on a measurement instrument to agree with the value of the applied standard, within a specified accuracy. For example, a thermometer could be calibrated so the error of indication or the correction is determined, and adjusted (e.g. via calibration constants) so that it shows the true temperature in Celsius at specific points on the scale. This is the perception of the instrument's end-user. However, very few instruments can be adjusted to exactly match the standards they are compared to. For the vast majority of calibrations, the calibration process is actually the comparison of an unknown to a known and recording the results.


Basic calibration process


Purpose and scope

The calibration process begins with the design of the measuring instrument that needs to be calibrated. The design has to be able to "hold a calibration" through its calibration interval. In other words, the design has to be capable of measurements that are "within engineering tolerance" when used within the stated environmental conditions over some reasonable period of time. Having a design with these characteristics increases the likelihood of the actual measuring instruments performing as expected. Basically, the purpose of calibration is to maintain the quality of measurement as well as to ensure the proper working of a particular instrument.


Frequency

The exact mechanism for assigning tolerance values varies by country and by industry type. The manufacturer of the measuring equipment generally assigns the measurement tolerance, suggests a calibration interval (CI) and specifies the environmental range of use and storage. The using organization generally assigns the actual calibration interval, which depends on this specific measuring equipment's likely usage level. The assignment of calibration intervals can be a formal process based on the results of previous calibrations. The standards themselves are not clear on recommended CI values:

:''ISO 17025''
::"A calibration certificate (or calibration label) shall not contain any recommendation on the calibration interval except where this has been agreed with the customer. This requirement may be superseded by legal regulations."
:''ANSI/NCSL Z540''
::"...shall be calibrated or verified at periodic intervals established and maintained to assure acceptable reliability..."
:''ISO-9001''
::"Where necessary to ensure valid results, measuring equipment shall...be calibrated or verified at specified intervals, or prior to use..."
:''MIL-STD-45662A''
::"... shall be calibrated at periodic intervals established and maintained to assure acceptable accuracy and reliability... Intervals shall be shortened or may be lengthened, by the contractor, when the results of previous calibrations indicate that such action is appropriate to maintain acceptable reliability."
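A reliability-based interval policy of the kind MIL-STD-45662A alludes to can be sketched as follows. The 25% step sizes and the use of a short in-tolerance history are invented for illustration and are not taken from any standard:

```python
def next_interval(current_days, history_in_tolerance):
    """Lengthen the calibration interval when recent calibrations were all
    in tolerance; shorten it when any were not."""
    if all(history_in_tolerance):
        return int(current_days * 1.25)   # reward a clean history
    return int(current_days * 0.75)       # react to an out-of-tolerance result

print(next_interval(365, [True, True, True]))   # lengthened
print(next_interval(365, [True, False, True]))  # shortened
```

Real interval-analysis methods are more elaborate (e.g. targeting a stated end-of-period reliability), but the direction of adjustment follows the same logic.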


Standards required and accuracy

The next step is defining the calibration process. The selection of a standard or standards is the most visible part of the calibration process. Ideally, the standard has less than 1/4 of the measurement uncertainty of the device being calibrated. When this goal is met, the accumulated measurement uncertainty of all of the standards involved is considered to be insignificant when the final measurement is also made with the 4:1 ratio. This ratio was probably first formalized in Handbook 52, which accompanied MIL-STD-45662A, an early US Department of Defense metrology program specification. It was 10:1 from its inception in the 1950s until the 1970s, when advancing technology made 10:1 impossible for most electronic measurements.

Maintaining a 4:1 accuracy ratio with modern equipment is difficult. The test equipment being calibrated can be just as accurate as the working standard. If the accuracy ratio is less than 4:1, then the calibration tolerance can be reduced to compensate. When 1:1 is reached, only an exact match between the standard and the device being calibrated is a completely correct calibration. Another common method for dealing with this capability mismatch is to reduce the accuracy of the device being calibrated. For example, a gauge with 3% manufacturer-stated accuracy can be changed to 4% so that a 1% accuracy standard can be used at 4:1. If the gauge is used in an application requiring 16% accuracy, having the gauge accuracy reduced to 4% will not affect the accuracy of the final measurements. This is called a limited calibration. But if the final measurement requires 10% accuracy, then the 3% gauge never can be better than 3.3:1. Then perhaps adjusting the calibration tolerance for the gauge would be a better solution.

If the calibration is performed at 100 units, the 1% standard would actually be anywhere between 99 and 101 units. The acceptable values of calibrations where the test equipment is at the 4:1 ratio would be 96 to 104 units, inclusive. Changing the acceptable range to 97 to 103 units would remove the potential contribution of all of the standards and preserve a 3.3:1 ratio. Continuing, a further change of the acceptable range to 98 to 102 restores more than a 4:1 final ratio.

This is a simplified example, and its mathematics can be challenged. It is important that whatever thinking guided this process in an actual calibration be recorded and accessible. Informality contributes to tolerance stacks and other difficult-to-diagnose post-calibration problems.

Also in the example above, ideally the calibration value of 100 units would be the best point in the gauge's range to perform a single-point calibration. It may be the manufacturer's recommendation or it may be the way similar devices are already being calibrated. Multiple-point calibrations are also used. Depending on the device, a zero-unit state (the absence of the phenomenon being measured) may also be a calibration point, or zero may be resettable by the user; there are several variations possible. Again, the points to use during calibration should be recorded.

There may be specific connection techniques between the standard and the device being calibrated that may influence the calibration. For example, in electronic calibrations involving analog phenomena, the impedance of the cable connections can directly influence the result.


Manual and automatic calibrations

Calibration methods for modern devices can be manual or automatic.

As an example, a manual process may be used for calibration of a pressure gauge. The procedure requires multiple steps: connecting the gauge under test to a reference master gauge and an adjustable pressure source, applying fluid pressure to both reference and test gauges at definite points over the span of the gauge, and comparing the readings of the two. The gauge under test may be adjusted to ensure its zero point and response to pressure comply as closely as possible to the intended accuracy. Each step of the process requires manual record keeping.

An automatic pressure calibrator is a device that combines an electronic control unit, a pressure intensifier used to compress a gas such as nitrogen, a pressure transducer used to detect desired levels in a hydraulic accumulator, and accessories such as liquid traps and gauge fittings. An automatic system may also include data collection facilities to automate the gathering of data for record keeping.
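The record-keeping side of the multi-point comparison above can be sketched as follows. The readings are invented; a real record would also capture date, technician, standard identification and environmental conditions:

```python
# Compare a gauge under test to a reference master gauge at definite
# points over the span, and record the error at each point.
test_points = [0, 25, 50, 75, 100]            # percent of span
reference   = [0.0, 25.0, 50.0, 75.0, 100.0]  # master gauge readings (psi)
under_test  = [0.2, 25.3, 50.1, 74.8, 99.5]   # gauge-under-test readings (psi)

record = []
for point, ref, dut in zip(test_points, reference, under_test):
    record.append({"point_pct": point, "error": round(dut - ref, 2)})

for row in record:
    print(row)
```

An automatic calibrator performs the same comparison but drives the pressure source and logs each row without manual transcription.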


Process description and documentation

All of the information above is collected in a calibration procedure, which is a specific test method. These procedures capture all of the steps needed to perform a successful calibration. The manufacturer may provide one, or the organization may prepare one that also captures all of the organization's other requirements. There are clearinghouses for calibration procedures such as the Government-Industry Data Exchange Program (GIDEP) in the United States.

This exact process is repeated for each of the standards used until transfer standards, certified reference materials and/or natural physical constants, the measurement standards with the least uncertainty in the laboratory, are reached. This establishes the traceability of the calibration. See Metrology for other factors that are considered during calibration process development.

After all of this, individual instruments of the specific type discussed above can finally be calibrated. The process generally begins with a basic damage check. Some organizations, such as nuclear power plants, collect "as-found" calibration data before any routine maintenance is performed. After routine maintenance and deficiencies detected during calibration are addressed, an "as-left" calibration is performed. More commonly, a calibration technician is entrusted with the entire process and signs the calibration certificate, which documents the completion of a successful calibration.

The basic process outlined above is a difficult and expensive challenge. As a commonly accepted rule-of-thumb, the cost for ordinary equipment support is generally about 10% of the original purchase price on a yearly basis. Exotic devices such as scanning electron microscopes, gas chromatograph systems and laser interferometer devices can be even more costly to maintain.

The 'single measurement' device used in the basic calibration process description above does exist. But, depending on the organization, the majority of the devices that need calibration can have several ranges and many functionalities in a single instrument. A good example is a common modern oscilloscope. There easily could be 200,000 combinations of settings to completely calibrate, and there are limitations on how much of an all-inclusive calibration can be automated.

To prevent unauthorized access to an instrument, tamper-proof seals are usually applied after calibration. The picture of the oscilloscope rack shows these seals; they prove that the instrument has not been adjusted since it was last calibrated, as they would reveal any unauthorized access to the adjusting elements of the instrument. There are also labels showing the date of the last calibration and, as the calibration interval dictates, when the next one is needed. Some organizations also assign a unique identification to each instrument to standardize the record keeping and keep track of accessories that are integral to a specific calibration condition. When the instruments being calibrated are integrated with computers, the integrated computer programs and any calibration corrections are also under control.


Historical development


Origins

The words "calibrate" and "calibration" entered the English language as recently as the American Civil War, in descriptions of artillery, thought to be derived from a measurement of the calibre of a gun.

Some of the earliest known systems of measurement and calibration seem to have been created between the ancient civilizations of Egypt, Mesopotamia and the Indus Valley, with excavations revealing the use of angular gradations for construction. The term "calibration" was likely first associated with the precise division of linear distance and angles using a dividing engine and the measurement of gravitational mass using a weighing scale. These two forms of measurement alone and their direct derivatives supported nearly all commerce and technology development from the earliest civilizations until about AD 1800.


Calibration of weights and distances

Early measurement devices were ''direct'', i.e. they had the same units as the quantity being measured. Examples include length using a yardstick and mass using a weighing scale. At the beginning of the twelfth century, during the reign of Henry I (1100-1135), it was decreed that a yard be "the distance from the tip of the King's nose to the end of his outstretched thumb." However, it was not until the reign of Richard I (1197) that we find documented evidence:

:''Assize of Measures''
:"Throughout the realm there shall be the same yard of the same size and it should be of iron."

Other standardization attempts followed, such as the Magna Carta (1225) for liquid measures, until the Mètre des Archives from France and the establishment of the metric system.


The early calibration of pressure instruments

One of the earliest pressure measurement devices was the mercury barometer, credited to Evangelista Torricelli (1643), which read atmospheric pressure from the height of a column of mercury. Soon after, water-filled
manometer Pressure measurement is the measurement of an applied force by a fluid (liquid or gas) on a surface. Pressure is typically measured in units of force per unit of surface area. Many techniques have been developed for the measurement of pressur ...
s were designed. All of these had linear calibrations based on gravimetric principles, with the difference in levels proportional to pressure. The normal units of measure were the convenient inches of mercury or water. In a direct-reading hydrostatic manometer, applied pressure pushes the liquid down one side of the U-tube, while a length scale next to the tube measures the difference of levels. The resulting height difference ''H'' is a direct measurement of the pressure or vacuum with respect to
atmospheric pressure Atmospheric pressure, also known as barometric pressure (after the barometer), is the pressure within the atmosphere of Earth. The standard atmosphere (symbol: atm) is a unit of pressure defined as , which is equivalent to 1013.25 millibars, ...
. In the absence of differential pressure both levels would be equal, and this would be used as the zero point. The
Industrial Revolution The Industrial Revolution was the transition to new manufacturing processes in Great Britain, continental Europe, and the United States, that occurred during the period from around 1760 to about 1820–1840. This transition included going f ...
saw the adoption of "indirect" pressure measuring devices, which were more practical than the manometer. An example is found in high-pressure (up to 50 psi) steam engines, where mercury was used to reduce the scale length to about 60 inches, but such a manometer was expensive and prone to damage. This stimulated the development of indirect-reading instruments, of which the Bourdon tube invented by Eugène Bourdon is a notable example. In a Bourdon gauge, pressure applied at the bottom fitting straightens the curl of the flattened tube in proportion to the pressure; this moves the free end of the tube, which is linked to the pointer. The instrument would be calibrated against a manometer, which served as the calibration standard. For such indirect measurements of pressure per unit area, the calibration uncertainty depends on the density of the manometer fluid and the means of measuring the height difference. From this, other units such as pounds per square inch could be inferred and marked on the scale.
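The gravimetric principle above amounts to the hydrostatic relation ''P'' = ''ρgh'': a column height read in inches of mercury or water can be converted to pressure per unit area, which is how units such as pounds per square inch were inferred. A minimal sketch of that conversion, using nominal room-temperature fluid densities (illustrative values, not a metrological reference):

```python
# Hydrostatic pressure indicated by a manometer column: P = rho * g * h.
# Densities are nominal figures for illustration only.

G = 9.80665            # standard gravity, m/s^2
DENSITY = {            # kg/m^3 (nominal, room temperature)
    "mercury": 13_595.1,
    "water": 998.2,
}

def column_pressure_pa(fluid: str, height_m: float) -> float:
    """Pressure difference (Pa) indicated by a liquid column of the given height."""
    return DENSITY[fluid] * G * height_m

def pa_to_psi(pressure_pa: float) -> float:
    """Convert pascals to pounds per square inch."""
    return pressure_pa / 6894.757

# A 60-inch mercury column, as on the steam-engine scale mentioned above:
height = 60 * 0.0254                      # inches -> metres
p = column_pressure_pa("mercury", height)
print(round(pa_to_psi(p), 1))             # about 29.5 psi
```

The uncertainty of such a calibration follows directly from the two factors named above: any error in the assumed fluid density or in the measured height difference propagates proportionally into the inferred pressure.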


See also

* Calibration curve
* Calibrated geometry
*
Calibration (statistics) There are two main uses of the term calibration in statistics that denote special types of statistical inference problems. "Calibration" can mean :*a reverse process to regression, where instead of a future dependent variable being predicted from ...
*
Color calibration The aim of color calibration is to measure and/or adjust the color response of a device (input or output) to a known state. In International Color Consortium (ICC) terms, this is the basis for an additional color characterization of the device ...
– used to calibrate a
computer monitor A computer monitor is an output device that displays information in pictorial or textual form. A discrete monitor comprises a visual display, support electronics, power supply, housing, electrical connectors, and external user controls. The d ...
or display.
* Deadweight tester
*
EURAMET EURAMET (European Association of National Metrology Institutes, previously known as EUROMET, the European Collaboration in Measurement Standards) is a collaborative alliance of national metrological organizations from member states of the European U ...
Association of European NMIs
* Measurement Microphone Calibration
*
Measurement uncertainty In metrology, measurement uncertainty is the expression of the statistical dispersion of the values attributed to a measured quantity. All measurements are subject to uncertainty and a measurement result is complete only when it is accompanied by ...
*
Musical tuning In music, there are two common meanings for tuning: * Tuning practice, the act of tuning an instrument or voice. * Tuning systems, the various systems of pitches used to tune an instrument, and their theoretical bases. Tuning practice Tun ...
– tuning, in music, means adjusting musical instruments to play the correct pitch.
* Precision measurement equipment laboratory
* Scale test car – a device used to calibrate
weighing scales A scale or balance is a device used to measure weight or mass. These are also known as mass scales, weight scales, mass balances, and weight balances. The traditional scale consists of two plates or bowls suspended at equal distances from a ...
that weigh
railroad car A railroad car, railcar ( American and Canadian English), railway wagon, railway carriage, railway truck, railwagon, railcarriage or railtruck (British English and UIC), also called a train car, train wagon, train carriage or train truck, is ...
s.
*
Systems of measurement A system of measurement is a collection of units of measurement and rules relating them to each other. Systems of measurement have historically been important, regulated and defined for the purposes of science and commerce. Systems of measuremen ...


References


Sources

* Crouch, Stanley & Skoog, Douglas A. (2007). ''Principles of Instrumental Analysis''. Pacific Grove: Brooks Cole. ISBN 0-495-01201-7.