
In statistical mechanics, Boltzmann's equation (also known as the Boltzmann–Planck equation) is a probability equation relating the entropy S, also written as S_\mathrm B, of an ideal gas to the multiplicity (commonly denoted as \Omega or W), the number of real microstates corresponding to the gas's macrostate:

S = k_\mathrm B \log W

where k_\mathrm B is the Boltzmann constant (also written as simply k), equal to 1.380649 × 10⁻²³ J/K, and \log is the natural logarithm function. In short, the Boltzmann formula shows the relationship between entropy and the number of ways the atoms or molecules of a certain kind of thermodynamic system can be arranged.
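Numerically, the formula is a one-liner; the Python sketch below (the function name is illustrative) evaluates S = k_\mathrm B \log W for a given multiplicity:

```python
import math

K_B = 1.380649e-23  # Boltzmann constant in J/K (exact, by SI definition)

def boltzmann_entropy(multiplicity):
    """Entropy S = k_B * log(W) of a macrostate with multiplicity W."""
    return K_B * math.log(multiplicity)

# A macrostate realized by a single microstate has zero entropy.
print(boltzmann_entropy(1))                             # 0.0
# More microstates means higher entropy.
print(boltzmann_entropy(100) > boltzmann_entropy(10))   # True
```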


History

The equation was originally formulated by Ludwig Boltzmann between 1872 and 1875, but later put into its current form by Max Planck in about 1900. To quote Planck, "the logarithmic connection between entropy and probability was first stated by L. Boltzmann in his kinetic theory of gases".

A 'microstate' is a state specified in terms of the constituent particles of a body of matter or radiation that has been specified as a macrostate in terms of such variables as internal energy and pressure. A macrostate is experimentally observable, with at least a finite extent in spacetime. A microstate can be instantaneous, or can be a trajectory composed of a temporal progression of instantaneous microstates. In experimental practice, such trajectories are scarcely observable. The present account concerns instantaneous microstates.

The value of W was originally intended to be proportional to the ''Wahrscheinlichkeit'' (the German word for probability) of a macroscopic state for some probability distribution of possible microstates: the collection of (unobservable microscopic single-particle) "ways" in which the (observable macroscopic) thermodynamic state of a system can be realized by assigning different positions and momenta to the respective molecules.

There are many instantaneous microstates that apply to a given macrostate. Boltzmann considered collections of such microstates. For a given macrostate, he called the collection of all possible instantaneous microstates of a certain kind by the name ''monode'', for which Gibbs' term ''ensemble'' is used nowadays. For single-particle instantaneous microstates, Boltzmann called the collection an ''ergode''. Subsequently, Gibbs called it a ''microcanonical ensemble'', and this name is widely used today, perhaps partly because Bohr was more interested in the writings of Gibbs than of Boltzmann. Interpreted in this way, Boltzmann's formula is the most basic formula for the thermodynamic entropy.

Boltzmann's paradigm was an ideal gas of N ''identical'' particles, of which N_i are in the i-th microscopic condition (range) of position and momentum. For this case, the probability of each microstate of the system is equal, so it was equivalent for Boltzmann to calculate the number of microstates associated with a macrostate. W was historically misinterpreted as literally meaning the number of microstates, and that is what it usually means today. W can be counted using the formula for permutations

W = N! / \prod_i N_i!

where i ranges over all possible molecular conditions and "!" denotes the factorial. The "correction" in the denominator arises because identical particles in the same condition are indistinguishable. W is sometimes called the "thermodynamic probability" since it is an integer greater than one, while mathematical probabilities are always numbers between zero and one.
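The permutation count is straightforward to check numerically; a minimal Python sketch (names are illustrative) computes W from a list of occupation numbers N_i:

```python
import math

def multiplicity(occupations):
    """W = N! / (N_1! * N_2! * ...) for occupation numbers N_i."""
    n = sum(occupations)          # total particle number N
    w = math.factorial(n)
    for n_i in occupations:
        w //= math.factorial(n_i)  # exact integer division: each factor divides N!
    return w

# 4 particles split evenly over two conditions: 4! / (2! * 2!) = 6 arrangements.
print(multiplicity([2, 2]))   # 6
# All particles in the same condition: only one arrangement.
print(multiplicity([4, 0]))   # 1
```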


Generalization

Boltzmann's formula applies to microstates of a system, each possible microstate of which is presumed to be equally probable. But in thermodynamics, the universe is divided into a system of interest, plus its surroundings; then the entropy of Boltzmann's microscopically specified system can be identified with the system entropy in classical thermodynamics. The microstates of such a thermodynamic system are ''not'' equally probable; for example, high-energy microstates are less probable than low-energy microstates for a thermodynamic system kept at a fixed temperature by allowing contact with a heat bath. For thermodynamic systems where microstates of the system may not have equal probabilities, the appropriate generalization, called the Gibbs entropy, is:

S = -k_\mathrm B \sum_i p_i \log p_i

This reduces to Boltzmann's equation if the probabilities p_i are all equal (p_i = 1/W). Boltzmann used a \rho\ln\rho formula as early as 1866. He interpreted \rho as a density in phase space, without mentioning probability, but since this satisfies the axiomatic definition of a probability measure we can retrospectively interpret it as a probability anyway. Gibbs gave an explicitly probabilistic interpretation in 1878. Boltzmann himself used an expression equivalent to the Gibbs entropy in his later work, and recognized it as more general than his own equation. That is, Boltzmann's equation is a corollary of the Gibbs entropy formula, not vice versa: in every situation where Boltzmann's equation is valid, the Gibbs formula is valid also, but not conversely.
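The reduction to Boltzmann's formula for equal probabilities is easy to verify numerically; a short Python sketch (illustrative names, entropy in J/K):

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K

def gibbs_entropy(probs):
    """S = -k_B * sum(p_i * log(p_i)); zero-probability states contribute nothing."""
    return -K_B * sum(p * math.log(p) for p in probs if p > 0)

# W equally probable microstates (p_i = 1/W) recover S = k_B * log(W).
W = 8
print(math.isclose(gibbs_entropy([1.0 / W] * W), K_B * math.log(W)))  # True
```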


Boltzmann entropy excludes statistical dependencies

The term ''Boltzmann entropy'' is also sometimes used to indicate entropies calculated based on the approximation that the overall probability can be factored into an identical separate term for each particle, i.e., assuming each particle has an identical independent probability distribution, and ignoring interactions and correlations between the particles. This is exact for an ideal gas of identical particles that move independently apart from instantaneous collisions, and is an approximation, possibly a poor one, for other systems. Jaynes, E. T. (1965). Gibbs vs Boltzmann entropies. ''American Journal of Physics'', 33, 391–398.

The Boltzmann entropy is obtained if one assumes one can treat all the component particles of a thermodynamic system as statistically independent. The probability distribution of the system as a whole then factorises into the product of ''N'' separate identical terms, one term for each particle; and when the summation is taken over each possible state in the 6-dimensional phase space of a ''single'' particle (rather than the 6''N''-dimensional phase space of the system as a whole), the Gibbs entropy simplifies to the Boltzmann entropy S_\mathrm B. This reflects the original statistical entropy function introduced by Ludwig Boltzmann in 1872. For the special case of an ideal gas it exactly corresponds to the proper thermodynamic entropy.
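The factorisation argument can be illustrated numerically, working in units of k_\mathrm B: under the independence assumption, the Gibbs entropy of the N-particle joint distribution is exactly N times the single-particle entropy. The distribution below is illustrative, not a physical model:

```python
import math
from itertools import product

def entropy(probs):
    """Entropy -sum(p * log(p)) in units of k_B."""
    return -sum(p * math.log(p) for p in probs if p > 0)

# Hypothetical single-particle distribution over three conditions.
p_single = [0.5, 0.3, 0.2]
N = 4

# Joint distribution of N statistically independent, identical particles:
# every joint probability is a product of N single-particle factors.
p_joint = [math.prod(combo) for combo in product(p_single, repeat=N)]

print(math.isclose(sum(p_joint), 1.0))                        # True
print(math.isclose(entropy(p_joint), N * entropy(p_single)))  # True
```

When the particles are correlated, the joint distribution no longer factorises and this equality fails, which is exactly why S_\mathrm B misestimates the entropy of non-dilute gases.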


See also

* History of entropy
* Gibbs entropy
* nat (unit)
* Shannon entropy
* von Neumann entropy


References


External links


Introduction to Boltzmann's Equation

Vorlesungen über Gastheorie, Ludwig Boltzmann (1896) vol. I, J.A. Barth, Leipzig

Vorlesungen über Gastheorie, Ludwig Boltzmann (1898) vol. II. J.A. Barth, Leipzig.