Natural unit of information

The natural unit of information (symbol: nat), sometimes also nit or nepit, is a unit of information, based on natural logarithms and powers of ''e'', rather than the powers of 2 and base-2 logarithms, which define the shannon. This unit is also known by its unit symbol, the nat. One nat is the information content of an event when the probability of that event occurring is 1/''e''. One nat is equal to 1/ln 2 shannons ≈ 1.44 Sh or, equivalently, log₁₀ ''e'' hartleys ≈ 0.434 Hart.
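These definitions and conversions can be illustrated with a short Python sketch (not part of the original article; function names are chosen for illustration):

```python
import math

def information_content_nats(p: float) -> float:
    """Self-information -ln(p) of an event with probability p, in nats."""
    return -math.log(p)

# An event with probability 1/e carries exactly one nat of information.
one_nat = information_content_nats(1 / math.e)  # -> 1.0

# Unit conversions: 1 nat = 1/ln 2 shannons = log10(e) hartleys.
nat_to_shannons = 1 / math.log(2)      # ~ 1.4427 Sh per nat
nat_to_hartleys = math.log10(math.e)   # ~ 0.4343 Hart per nat
```

The conversions follow directly from the change-of-base rule for logarithms: a value measured with ln is rescaled by 1/ln 2 to express it in base 2, and by log₁₀ ''e'' to express it in base 10.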


History

Boulton and Wallace used the term ''nit'' in conjunction with minimum message length, which was subsequently changed by the minimum description length community to ''nat'' to avoid confusion with the nit used as a unit of luminance. Alan Turing used the ''natural ban''.


Entropy

Shannon entropy (information entropy), being the expected value of the information of an event, is a quantity of the same type and with the same units as information. The International System of Units, by assigning the same units (joule per kelvin) both to heat capacity and to thermodynamic entropy, implicitly treats information entropy as a quantity of dimension one, with 1 nat = 1.380649×10⁻²³ J/K (the value of the Boltzmann constant). Physical systems of natural units that normalize the Boltzmann constant to 1 are effectively measuring thermodynamic entropy in nats. When the Shannon entropy is written using a natural logarithm, \Eta = - \sum_i p_i \ln p_i, it is implicitly giving a number measured in nats.
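The entropy formula above can be sketched in Python as follows (an illustrative snippet, not from the original article):

```python
import math

def shannon_entropy_nats(probs):
    """Shannon entropy H = -sum_i p_i ln p_i, in nats (natural logarithm).

    Terms with p_i = 0 contribute nothing, by the convention 0 ln 0 = 0.
    """
    return -sum(p * math.log(p) for p in probs if p > 0)

# A uniform distribution over 4 outcomes has entropy ln 4 nats (~1.386).
h_uniform = shannon_entropy_nats([0.25, 0.25, 0.25, 0.25])
```

Replacing `math.log` with `math.log2` would give the same entropy in shannons instead; the choice of logarithm base is exactly the choice of unit.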



Further reading

*{{Cite book |first=Fazlollah M. |last=Reza |title=An Introduction to Information Theory |location=New York |publisher=Dover |year=1994 |isbn=0-486-68210-2}}