The Celsius scale, previously known as the centigrade scale, is a temperature scale used by the International System of Units (SI). The degree Celsius is an SI derived unit, and the scale is used by every country in the world except the United States, Myanmar, and Liberia. It is named after the Swedish astronomer Anders Celsius (1701–1744), who developed a similar temperature scale. The degree Celsius (symbol: °C) can refer either to a specific temperature on the Celsius scale or to a unit indicating a temperature interval, that is, a difference between two temperatures or an uncertainty. Before being renamed in 1948 to honour Anders Celsius, the unit was called centigrade, from the Latin centum ("hundred") and gradus ("steps").
Before 1954, the Celsius scale was based on 0 °C for the freezing point of water and 100 °C for the boiling point of water at 1 atm pressure. This convention dates from 1743, when Jean-Pierre Christin reversed Celsius's original thermometer scale (on which water boiled at 0 degrees and ice melted at 100 degrees). The two-point definition is still widely taught in schools today.
By international agreement, since 1954 the unit "degree Celsius" and the Celsius scale are defined by absolute zero and the triple point of Vienna Standard Mean Ocean Water (VSMOW), a specially purified water. This definition also precisely relates the Celsius scale to the Kelvin scale, which defines the SI base unit of thermodynamic temperature with symbol K. Absolute zero, the lowest temperature possible, is defined as being exactly 0 K and −273.15 °C. The temperature of the triple point of water is defined as exactly 273.16 K (0.01 °C; 32.02 °F). Thus, a temperature difference of one degree Celsius and that of one kelvin is exactly the same, with the null point of the Kelvin scale (0 K) at exactly −273.15 °C, and the null point of the Celsius scale (0 °C) at exactly 273.15 K.
In 1742, Swedish astronomer Anders Celsius (1701–1744) created a temperature scale which was the reverse of the scale now known by the name "Celsius": 0 represented the boiling point of water, while 100 represented the freezing point of water. In his paper Observations of two persistent degrees on a thermometer, he recounted his experiments showing that the melting point of ice is essentially unaffected by pressure. He also determined with remarkable precision how the boiling point of water varied as a function of atmospheric pressure. He proposed that the zero point of his temperature scale, being the boiling point, would be calibrated at the mean barometric pressure at mean sea level. This pressure is known as one standard atmosphere. The BIPM's 10th General Conference on Weights and Measures (CGPM) later defined one standard atmosphere to equal precisely 1,013,250 dynes per square centimetre (101.325 kPa).
In 1743, the Lyonnais physicist Jean-Pierre Christin, permanent secretary of the Académie des sciences, belles-lettres et arts de Lyon, working independently of Celsius, developed a scale where zero represented the freezing point of water and 100 represented the boiling point of water. On 19 May 1743 he published the design of a mercury thermometer, the "Thermometer of Lyon", built by the craftsman Pierre Casati, that used this scale.
In 1744, coincident with the death of Anders Celsius, the Swedish botanist Carl Linnaeus (1707–1778) reversed Celsius's scale. His custom-made "Linnaeus thermometer", for use in his greenhouses, was made by Daniel Ekström, Sweden's leading maker of scientific instruments at the time, whose workshop was located in the basement of the Stockholm observatory. As often happened in this age before modern communications, numerous physicists, scientists, and instrument makers are credited with having independently developed this same scale; among them were Pehr Elvius, the secretary of the Royal Swedish Academy of Sciences (which had an instrument workshop) and with whom Linnaeus had been corresponding; Daniel Ekström, the instrument maker; and Mårten Strömer (1707–1770), who had studied astronomy under Anders Celsius.
The first known Swedish document reporting temperatures in this modern "forward" Celsius scale is the paper Hortus Upsaliensis dated 16 December 1745 that Linnaeus wrote to a student of his, Samuel Nauclér. In it, Linnaeus recounted the temperatures inside the orangery at the University of Uppsala Botanical Garden:
...since the caldarium (the hot part of the greenhouse) by the angle of the windows, merely from the rays of the sun, obtains such heat that the thermometer often reaches 30 degrees, although the keen gardener usually takes care not to let it rise to more than 20 to 25 degrees, and in winter not under 15 degrees...
Since the 19th century, the scientific and thermometry communities worldwide referred to this scale as the centigrade scale. Temperatures on the centigrade scale were often reported simply as degrees or, when greater specificity was desired, as degrees centigrade (symbol: °C). Because the term centigrade was also the Spanish and French name for a unit of angular measurement (1/10,000 of a right angle) and had a similar connotation in other languages, the term centesimal degree (known as the gradian, "grad" or "gon": 1ᵍ = 0.9°, 100ᵍ = 90°) was used when very precise, unambiguous language was required by international standards bodies such as the BIPM. Strictly speaking, what was then called "centigrade" would now be "hectograde". Furthermore, in this context centigrade/hectograde refers to the whole 0–100 range, not to any given part of it: "20° centigrade" means "20ᵍ per 100 gradians" (or 20% of a hectograde), not its literal reading, "0.2 gradians".
To eliminate such confusion, the 9th CGPM and the CIPM (Comité international des poids et mesures) formally adopted "degree Celsius" in 1948,[a] formally keeping the recognized degree symbol, rather than adopting the gradian/centesimal degree symbol.
For scientific use, "Celsius" is the term usually used, with "centigrade" otherwise continuing to be in common but decreasing use, especially in informal contexts in English-speaking countries. It was not until February 1985 that the forecasts issued by the BBC switched from "centigrade" to "Celsius".
Some key temperatures relating the Celsius scale to other temperature scales are shown in the table below.
| | Kelvin | Celsius | Fahrenheit |
|---|---|---|---|
| Absolute zero (precisely, by definition) | 0 K | −273.15 °C | −459.67 °F |
| Boiling point of liquid nitrogen | 77.4 K | −195.8 °C | −320.4 °F |
| Sublimation point of dry ice | 195.1 K | −78 °C | −108.4 °F |
| Intersection of Celsius and Fahrenheit scales | 233.15 K | −40 °C | −40 °F |
| Melting point of H2O (purified ice) | 273.1499 K | −0.0001 °C | 31.9998 °F |
| Triple point of water (precisely, by definition) | 273.16 K | 0.01 °C | 32.018 °F |
| Normal human body temperature (approximate average) | 310.15 K | 37.0 °C | 98.6 °F |
| Boiling point of water at 1 atm (101.325 kPa) (approximate: see Boiling point)[b] | 373.1339 K | 99.9839 °C | 211.971 °F |
The "degree Celsius" has been the only SI unit whose full name contains an uppercase letter since 1967, when the SI base unit for temperature was renamed from the "degree Kelvin" to the kelvin. The plural form is degrees Celsius.
The general rule of the International Bureau of Weights and Measures (BIPM) is that the numerical value always precedes the unit, and a space is always used to separate the unit from the number, e.g. "30.2 °C" (not "30.2°C" or "30.2° C"). Thus the value of the quantity is the product of the number and the unit, the space being regarded as a multiplication sign (just as a space between units implies multiplication). The only exceptions to this rule are for the unit symbols for degree, minute, and second for plane angle (°, ′, and ″, respectively), for which no space is left between the numerical value and the unit symbol. Other languages, and various publishing houses, may follow different typographical rules.
Unicode provides the Celsius symbol at code point U+2103 ℃ DEGREE CELSIUS. However, this is a compatibility character, provided for round-trip compatibility with legacy encodings; its main purpose is to allow correct rendering in vertically written East Asian scripts, such as Chinese. The Unicode Standard explicitly discourages its use otherwise: "In normal use, it is better to represent degrees Celsius '°C' with a sequence of U+00B0 DEGREE SIGN + U+0043 LATIN CAPITAL LETTER C, rather than U+2103 DEGREE CELSIUS. For searching, treat these two sequences as identical."
Shown below is the degree Celsius character followed immediately by the two-component version:

℃ °C
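As a sketch of how the Standard's "treat these two sequences as identical" advice can be implemented, Python's standard `unicodedata` module applies the compatibility (NFKC) mapping that folds U+2103 into the preferred two-codepoint sequence (the `same_text` helper is illustrative, not a standard API):

```python
import unicodedata

legacy = "\u2103"      # '℃' DEGREE CELSIUS, a compatibility character
preferred = "\u00b0C"  # '°' DEGREE SIGN followed by LATIN CAPITAL LETTER C

# NFKC normalization applies compatibility decompositions, folding the
# single-codepoint form into the preferred two-codepoint sequence.
assert unicodedata.normalize("NFKC", legacy) == preferred

# Normalizing both sides of a comparison treats the two spellings as
# identical, as the Unicode Standard recommends for searching.
def same_text(a: str, b: str) -> bool:
    return unicodedata.normalize("NFKC", a) == unicodedata.normalize("NFKC", b)

print(same_text("25\u2103", "25\u00b0C"))  # True
```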
The degree Celsius is a special name for the kelvin for use in expressing Celsius temperatures. The degree Celsius is also subject to the same rules as the kelvin with regard to the use of its unit name and symbol. Thus, besides expressing specific temperatures along its scale (e.g. "Gallium melts at 29.7646 °C" and "The temperature outside is 23 degrees Celsius"), the degree Celsius is also suitable for expressing temperature intervals: differences between temperatures or their uncertainties (e.g. "The output of the heat exchanger is hotter by 40 degrees Celsius", and "Our standard uncertainty is ±3 °C"). Because of this dual usage, one must not rely upon the unit name or its symbol to denote that a quantity is a temperature interval; it must be unambiguous through context or explicit statement that the quantity is an interval.[c] This is sometimes solved by using the symbol °C (pronounced "degrees Celsius") for a temperature, and C° (pronounced "Celsius degrees") for a temperature interval, although this usage is non-standard.
What is often confusing about the Celsius measurement is that it is an interval scale, not a ratio scale: it is relative, not absolute. While the interval between 10 °C and 20 °C is the same as that between 20 °C and 30 °C, air at 20 °C does not have twice the thermal energy of air at 10 °C. As this example shows, degrees Celsius are a useful interval measure but lack the ratio properties of measures such as weight or distance.
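To put a number on this, a short Python sketch (not part of the original text) converts to the absolute kelvin scale, where ratios are meaningful:

```python
def celsius_to_kelvin(t_c: float) -> float:
    """Convert a Celsius temperature to the absolute kelvin scale."""
    return t_c + 273.15

# Ratios are only meaningful on an absolute scale: 20 °C is only about
# 3.5% thermodynamically "hotter" than 10 °C, not 100% hotter.
ratio = celsius_to_kelvin(20.0) / celsius_to_kelvin(10.0)
print(f"{ratio:.4f}")  # ≈ 1.0353
```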
In science and in engineering, the Celsius scale and the Kelvin scale are often used in combination in close contexts, e.g. "...a measured value was 0.01023 °C with an uncertainty of 70 µK...". This practice is permissible because the magnitude of the degree Celsius is equal to that of the kelvin, but referring to a "microdegree Celsius" would be awkward.
Notwithstanding the official endorsement provided by decision #3 of Resolution 3 of the 13th CGPM, which stated "a temperature interval may also be expressed in degrees Celsius", the practice of simultaneously using both °C and K remains widespread throughout the scientific world as the use of SI-prefixed forms of the degree Celsius (such as "µ°C" or "microdegrees Celsius") to express a temperature interval has not been well-adopted.
One effect of defining the Celsius scale at the triple point of Vienna Standard Mean Ocean Water (VSMOW, 273.16 K and 0.01 °C), and at absolute zero (0 K and −273.15 °C), is that neither the melting nor boiling point of water under one standard atmosphere (101.325 kPa) remains a defining point for the Celsius scale. In 1948, when the 9th General Conference on Weights and Measures (CGPM) in Resolution 3 first considered using the triple point of water as a defining point, the triple point was so close to being 0.01 °C greater than water's known melting point that it was simply defined as precisely 0.01 °C. However, current measurements show that the difference between the triple and melting points of VSMOW is actually very slightly (<0.001 °C) greater than 0.01 °C. Thus, the actual melting point of ice is very slightly (less than a thousandth of a degree) below 0 °C. Also, defining water's triple point at 273.16 K precisely defined the magnitude of each 1 °C increment in terms of the absolute thermodynamic temperature scale (referencing absolute zero). Now decoupled from the actual boiling point of water, the value "100 °C" is hotter than 0 °C – in absolute terms – by a factor of precisely 373.15/273.15 (approximately 36.61% thermodynamically hotter). When adhering strictly to the two-point definition for calibration, the boiling point of VSMOW under one standard atmosphere of pressure is actually 373.1339 K (99.9839 °C). When calibrated to ITS-90 (a calibration standard comprising many definition points and commonly used for high-precision instrumentation), the boiling point of VSMOW is slightly less, about 99.974 °C.
This boiling-point difference of 16.1 millikelvin between the Celsius scale's original definition and the current one (based on absolute zero and the triple point) has little practical meaning in common daily applications because water's boiling point is very sensitive to variations in barometric pressure. For example, an altitude change of only 28 cm (11 in) causes the boiling point to change by one millikelvin.
| Scale | from Celsius | to Celsius |
|---|---|---|
| Fahrenheit | [°F] = [°C] × 9⁄5 + 32 | [°C] = ([°F] − 32) × 5⁄9 |
| Kelvin | [K] = [°C] + 273.15 | [°C] = [K] − 273.15 |
| Rankine | [°R] = ([°C] + 273.15) × 9⁄5 | [°C] = ([°R] − 491.67) × 5⁄9 |

For temperature intervals rather than specific temperatures, 1 °C = 1 K = 9⁄5 °F = 9⁄5 °R.
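The conversion formulas translate directly into code. A minimal sketch in Python (function names are illustrative), with spot checks against fixed points stated earlier in the article:

```python
def celsius_to_fahrenheit(t_c: float) -> float:
    return t_c * 9 / 5 + 32

def fahrenheit_to_celsius(t_f: float) -> float:
    return (t_f - 32) * 5 / 9

def celsius_to_kelvin(t_c: float) -> float:
    return t_c + 273.15

def celsius_to_rankine(t_c: float) -> float:
    return (t_c + 273.15) * 9 / 5

# Spot checks against fixed points:
assert celsius_to_fahrenheit(-40.0) == -40.0           # scales intersect at −40°
assert abs(celsius_to_kelvin(0.01) - 273.16) < 1e-9    # triple point of water
assert abs(celsius_to_rankine(0.0) - 491.67) < 1e-9    # 0 °C on the Rankine scale
assert abs(fahrenheit_to_celsius(98.6) - 37.0) < 1e-9  # average body temperature
```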
Comparisons among various temperature scales