The natural unit of information (symbol: nat), sometimes also nit or nepit, is a unit of information or information entropy, based on natural logarithms and powers of ''e'', rather than the powers of 2 and base-2 logarithms, which define the shannon. This unit is also known by its unit symbol, the nat. One nat is the information content of an event when the probability of that event occurring is 1/''e''.
One nat is equal to 1/ln 2 shannons ≈ 1.44 Sh or, equivalently, 1/ln 10 hartleys ≈ 0.434 Hart.
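As a worked illustration, here is a minimal Python sketch (the function name information_content_nats is ours, chosen for clarity, not a standard library call). The self-information −ln ''p'' of an event is measured in nats when the natural logarithm is used, and the conversion factors above follow from dividing by ln 2 or ln 10:

```python
import math

def information_content_nats(p: float) -> float:
    """Self-information -ln(p) of an event with probability p, in nats."""
    return -math.log(p)  # math.log uses the natural logarithm by default

# An event with probability 1/e carries exactly one nat of information.
p = 1 / math.e
nats = information_content_nats(p)   # 1.0 nat
shannons = nats / math.log(2)        # ≈ 1.4427 Sh
hartleys = nats / math.log(10)       # ≈ 0.4343 Hart
print(f"{nats:.4f} nat = {shannons:.4f} Sh = {hartleys:.4f} Hart")
```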
History
Boulton and Wallace used the term ''nit'' in conjunction with minimum message length, which was subsequently changed by the minimum description length community to ''nat'' to avoid confusion with the nit used as a unit of luminance. Alan Turing used the ''natural ban''.
Entropy
Shannon entropy (information entropy), being the expected value of the information of an event, is inherently a quantity of the same type and with a unit of information. The International System of Units, by assigning the same unit (joule per kelvin) both to heat capacity and to thermodynamic entropy, implicitly treats information entropy as a quantity of dimension one, with 1 nat = 1. Systems of natural units that normalize the Boltzmann constant to 1 are effectively measuring thermodynamic entropy with the nat as unit.
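One way to make this connection explicit is the standard Gibbs entropy formula of statistical mechanics,

$$S = -k_\mathrm{B} \sum_i p_i \ln p_i,$$

so setting the Boltzmann constant $k_\mathrm{B} = 1$ leaves the bare sum $-\sum_i p_i \ln p_i$, that is, the Shannon entropy expressed in nats.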
When the Shannon entropy is written using a natural logarithm,

$$H = -\sum_i p_i \ln p_i,$$

it is implicitly giving a number measured in nats.
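A brief sketch of this convention (the helper shannon_entropy below is illustrative, not a named library function): computing the entropy with a natural logarithm gives nats directly, and changing the logarithm base converts the unit:

```python
import math

def shannon_entropy(probs, base=math.e):
    """Shannon entropy of a discrete distribution.

    The default natural-log base gives nats; base=2 gives
    shannons (bits) and base=10 gives hartleys.
    """
    return -sum(p * math.log(p, base) for p in probs if p > 0)

coin = [0.5, 0.5]                      # a fair coin flip
print(shannon_entropy(coin))           # ln 2 ≈ 0.6931 nat
print(shannon_entropy(coin, base=2))   # 1.0 Sh
```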
See also
* Perplexity
Further reading
* {{cite book |first=Fazlollah M. |last=Reza |title=An Introduction to Information Theory |location=New York |publisher=Dover |year=1994 |isbn=0-486-68210-2}}