
In statistical mechanics, Boltzmann's entropy formula (also known as the Boltzmann–Planck equation, not to be confused with the more general Boltzmann equation, which is a partial differential equation) is a probability equation relating the entropy S, also written as S_B, of an ideal gas to the multiplicity (commonly denoted as W or Ω), the number of real microstates corresponding to the gas's macrostate:

    S = k_B ln W

where k_B is the Boltzmann constant (also written as simply k), equal to 1.380649 × 10⁻²³ J/K, and ln is the natural logarithm function (log base e).
In short, the Boltzmann formula shows the relationship between entropy and the number of ways the atoms or molecules of a certain kind of thermodynamic system can be arranged.
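As a small numerical illustration (the system and numbers here are invented for the example, not taken from the text), the formula can be evaluated directly. For a toy system of 100 two-state particles, the macrostate "50 up" has multiplicity W = C(100, 50), and its entropy follows from S = k_B ln W:

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K

def boltzmann_entropy(multiplicity: int) -> float:
    """Entropy S = k_B ln W of a macrostate with W microstates."""
    return K_B * math.log(multiplicity)

# Toy example: 100 two-state particles, macrostate with 50 "up".
W = math.comb(100, 50)     # number of microstates in this macrostate
S = boltzmann_entropy(W)
print(f"W = {W}")
print(f"S = {S:.3e} J/K")  # ≈ 9.2e-22 J/K
```

A macrostate with only one microstate (W = 1) has zero entropy, since ln 1 = 0.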
History
The equation was originally formulated by Ludwig Boltzmann between 1872 and 1875, but later put into its current form by Max Planck in about 1900. To quote Planck, "the logarithmic connection between entropy and probability was first stated by L. Boltzmann in his kinetic theory of gases".
[Max Planck (1914). The Theory of Heat Radiation, equation 164, p. 119]
A 'microstate' is a state specified in terms of the constituent particles of a body of matter or radiation that has been specified as a macrostate in terms of such variables as internal energy and pressure. A macrostate is experimentally observable, with at least a finite extent in spacetime. A microstate can be instantaneous, or can be a trajectory composed of a temporal progression of instantaneous microstates. In experimental practice, such trajectories are scarcely observable. The present account concerns instantaneous microstates.
The value of W was originally intended to be proportional to the ''Wahrscheinlichkeit'' (the German word for probability) of a macroscopic state for some probability distribution of possible microstates—the collection of (unobservable microscopic single particle) "ways" in which the (observable macroscopic) thermodynamic state of a system can be realized by assigning different positions and momenta to the respective molecules.
There are many instantaneous microstates that apply to a given macrostate. Boltzmann considered collections of such microstates. For a given macrostate, he called the collection of all possible instantaneous microstates of a certain kind by the name ''monode'', for which Gibbs' term ''ensemble'' is used nowadays. For single particle instantaneous microstates, Boltzmann called the collection an ''ergode''. Subsequently, Gibbs called it a ''microcanonical ensemble'', and this name is widely used today, perhaps partly because Bohr was more interested in the writings of Gibbs than of Boltzmann.
Interpreted in this way, Boltzmann's formula is the most basic formula for the thermodynamic entropy. Boltzmann's paradigm was an ideal gas of N ''identical'' particles, of which N_i are in the i-th microscopic condition (range) of position and momentum. For this case, the probability of each microstate of the system is equal, so it was equivalent for Boltzmann to calculate the number of microstates associated with a macrostate. W was historically misinterpreted as literally meaning the number of microstates, and that is what it usually means today. W can be counted using the formula for permutations

    W = N! / (∏_i N_i!)

where i ranges over all possible molecular conditions and "!" denotes factorial.
The "correction" in the denominator is due to the fact that identical particles in the same condition are indistinguishable. W is sometimes called the "thermodynamic probability" since it is an integer greater than one, while mathematical probabilities are always numbers between zero and one.
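The counting formula can be checked numerically. In this sketch (the occupation numbers are chosen arbitrarily for illustration), W = N!/(∏ N_i!) is computed for a small system and compared against a brute-force enumeration of the distinct assignments of conditions to labelled particles:

```python
import math
from itertools import permutations

def multiplicity(occupations):
    """W = N! / (N_1! N_2! ...) for occupation numbers N_i."""
    n = sum(occupations)
    w = math.factorial(n)
    for n_i in occupations:
        w //= math.factorial(n_i)
    return w

# Arbitrary example: 2 particles in condition 0, 1 in condition 1, 1 in condition 2.
occ = [2, 1, 1]
print(multiplicity(occ))  # 4!/(2! 1! 1!) = 12

# Brute-force check: distinct ways to assign conditions to 4 labelled particles.
labels = [c for c, n_i in enumerate(occ) for _ in range(n_i)]  # [0, 0, 1, 2]
print(len(set(permutations(labels))))                          # also 12
```

The set of distinct permutations collapses rearrangements of the two indistinguishable particles in condition 0, which is exactly the role of the factorials in the denominator.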
Introduction of the natural logarithm
In his 1877 paper, Boltzmann clarifies how molecular states are counted to determine the state distribution number, introducing the logarithm to simplify the equation.
Boltzmann writes:
“The first task is to determine the permutation number, previously designated by 𝒫, for any state distribution. Denoting by J the sum of the permutations 𝒫 for all possible state distributions, the quotient 𝒫/J is the state distribution’s probability, henceforth denoted by W. We would first like to calculate the permutations 𝒫 for the state distribution characterized by w_0 molecules with kinetic energy 0, w_1 molecules with kinetic energy ϵ, etc. …
“The most likely state distribution will be for those w_0, w_1 … values for which 𝒫 is a maximum or, since the numerator is a constant, for which the denominator is a minimum. The values w_0, w_1 must simultaneously satisfy the two constraints (1) and (2). Since the denominator of 𝒫 is a product, it is easiest to determine the minimum of its logarithm, …”
Therefore, by minimizing the denominator, he maximizes the number of states. To simplify the product of the factorials, he takes its natural logarithm, which turns the product into a sum. This is the reason for the natural logarithm in Boltzmann’s entropy formula.
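This move is also what makes the computation tractable in practice. A small sketch (using `math.lgamma`, where lgamma(w + 1) = ln w!, as a stand-in for the log-factorial): the logarithm of the product of factorials equals the sum of the log-factorials, so it can be evaluated even when the product itself would be astronomically large.

```python
import math

def log_denominator(ws):
    """ln(w_0! * w_1! * ...) computed as a sum of log-factorials.

    math.lgamma(w + 1) == ln(w!), so no huge intermediate product is formed.
    """
    return sum(math.lgamma(w + 1) for w in ws)

# For small occupation numbers, the sum of logs matches the log of the product:
ws = [3, 5, 2]
direct = math.log(math.prod(math.factorial(w) for w in ws))
assert math.isclose(log_denominator(ws), direct)

# For large occupation numbers, the product is far beyond any machine integer
# you would want to form, but its logarithm is still cheap to evaluate:
big = [10**6, 2 * 10**6]
print(f"{log_denominator(big):.4e}")  # a manageable number of order 1e7
```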
Generalization
Boltzmann's formula applies to microstates of a system, each possible microstate of which is presumed to be equally probable.
But in thermodynamics, the universe is divided into a system of interest, plus its surroundings; then the entropy of Boltzmann's microscopically specified system can be identified with the system entropy in classical thermodynamics. The microstates of such a thermodynamic system are ''not'' equally probable—for example, high energy microstates are less probable than low energy microstates for a thermodynamic system kept at a fixed temperature by allowing contact with a heat bath.
For thermodynamic systems where microstates of the system may not have equal probabilities, the appropriate generalization, called the Gibbs entropy formula, is:

    S = −k_B ∑_i p_i ln p_i

This reduces to Boltzmann's formula, S = k_B ln W, if the probabilities p_i are all equal.
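The reduction can be verified numerically. In this sketch, the Gibbs entropy is evaluated for a uniform distribution over W microstates (p_i = 1/W for every i) and compared with k_B ln W:

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K

def gibbs_entropy(probs):
    """S = -k_B * sum(p_i ln p_i); terms with p_i = 0 contribute nothing."""
    return -K_B * sum(p * math.log(p) for p in probs if p > 0)

W = 1000
uniform = [1 / W] * W
assert math.isclose(gibbs_entropy(uniform), K_B * math.log(W))  # Boltzmann's formula

# Any non-uniform distribution over the same states has lower entropy:
skewed = [0.5] + [0.5 / (W - 1)] * (W - 1)
print(gibbs_entropy(skewed) < gibbs_entropy(uniform))  # True
```

The uniform distribution maximizes −∑ p ln p over a fixed set of states, which is why the equal-probability (microcanonical) case recovers the Boltzmann form.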
Boltzmann used a formula of this form as early as 1866. He interpreted the quantity in it as a density in phase space—without mentioning probability—but since this satisfies the axiomatic definition of a probability measure we can retrospectively interpret it as a probability anyway. Gibbs gave an explicitly probabilistic interpretation in 1878.
Boltzmann himself used an expression equivalent to the Gibbs formula in his later work and recognized it as more general than his own equation. That is, Boltzmann's formula is a corollary of the Gibbs formula—and not vice versa. In every situation where Boltzmann's formula is valid, the Gibbs formula is valid also—and not vice versa.
Boltzmann entropy excludes statistical dependencies
The term Boltzmann entropy is also sometimes used to indicate entropies calculated based on the approximation that the overall probability can be factored into an identical separate term for each particle—i.e., assuming each particle has an identical independent probability distribution, and ignoring interactions and correlations between the particles. This is exact for an ideal gas of identical particles that move independently apart from instantaneous collisions, and is an approximation, possibly a poor one, for other systems.
[Jaynes, E. T. (1965). "Gibbs vs Boltzmann entropies", ''American Journal of Physics'', 33, 391–398.]
The Boltzmann entropy is obtained if one assumes one can treat all the component particles of a thermodynamic system as statistically independent. The probability distribution of the system as a whole then factorises into the product of ''N'' separate identical terms, one term for each particle; and when the summation is taken over each possible state in the 6-dimensional phase space of a ''single'' particle (rather than the 6''N''-dimensional phase space of the system as a whole), the Gibbs entropy formula simplifies to the Boltzmann entropy

    S_B = −N k_B ∑_i p_i ln p_i.
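The factorisation can be demonstrated directly on a small system (the distribution and system size here are chosen arbitrarily for illustration). For N independent, identically distributed particles, the Gibbs entropy computed over the full N-particle product distribution equals N times the single-particle entropy:

```python
import math
from itertools import product

def entropy(probs):
    """Dimensionless entropy -sum(p ln p); the factor k_B is omitted."""
    return -sum(p * math.log(p) for p in probs if p > 0)

# Single-particle distribution over 3 states (illustrative numbers):
p1 = [0.5, 0.3, 0.2]
N = 4

# Full product distribution over the 3**N joint states of N independent particles:
joint = [math.prod(ps) for ps in product(p1, repeat=N)]
assert math.isclose(sum(joint), 1.0)

# Gibbs entropy of the whole system equals N * (single-particle entropy):
print(entropy(joint), N * entropy(p1))
```

Entropy is additive over statistically independent subsystems, so the whole-system sum collapses to N copies of the single-particle sum—exactly the simplification described above. With correlations present, the joint distribution no longer factorises and the equality fails.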
This reflects the original statistical entropy function introduced by Ludwig Boltzmann in 1872. For the special case of an ideal gas it exactly corresponds to the proper thermodynamic entropy.
For anything but the most dilute of real gases, the Boltzmann entropy leads to increasingly wrong predictions of entropies and physical behaviours, by ignoring the interactions and correlations between different molecules. Instead one must consider the ensemble of states of the system as a whole, called by Boltzmann a ''holode'', rather than single particle states.
[Cercignani, C. (1998). ''Ludwig Boltzmann: The Man Who Trusted Atoms'', Oxford University Press, Oxford UK, p. 134.] Gibbs considered several such kinds of ensembles; relevant here is the ''canonical'' one.
See also
* History of entropy
* H theorem
* Gibbs entropy formula
* nat (unit)
* Shannon entropy
* von Neumann entropy
References
External links
Introduction to Boltzmann's Equation
Vorlesungen über Gastheorie, Ludwig Boltzmann (1896) vol. I, J.A. Barth, Leipzig
Vorlesungen über Gastheorie, Ludwig Boltzmann (1898) vol. II. J.A. Barth, Leipzig.