In classical statistical mechanics, the ''H''-theorem, introduced by Ludwig Boltzmann in 1872, describes the tendency of the quantity ''H'' (defined below) to decrease in a nearly-ideal gas of molecules. (L. Boltzmann, "Weitere Studien über das Wärmegleichgewicht unter Gasmolekülen", Sitzungsberichte Akademie der Wissenschaften 66 (1872): 275–370.) As the quantity ''H'' was meant to represent the entropy of thermodynamics, the ''H''-theorem was an early demonstration of the power of statistical mechanics, since it claimed to derive the second law of thermodynamics—a statement about fundamentally irreversible processes—from reversible microscopic mechanics. It is thought to prove the second law of thermodynamics, albeit under the assumption of low-entropy initial conditions.

The ''H''-theorem is a natural consequence of the kinetic equation derived by Boltzmann, which has come to be known as Boltzmann's equation. The theorem has led to considerable discussion about its actual implications, with major themes being:

* What is entropy? In what sense does Boltzmann's quantity ''H'' correspond to the thermodynamic entropy?
* Are the assumptions (especially the assumption of molecular chaos) behind Boltzmann's equation too strong? When are these assumptions violated?


Name and pronunciation

Boltzmann in his original publication writes the symbol ''E'' (as in entropy) for its statistical function. Years later, Samuel Hawksley Burbury, one of the critics of the theorem, wrote the function with the symbol ''H'', a notation that was subsequently adopted by Boltzmann when referring to his "''H''-theorem". The notation has led to some confusion regarding the name of the theorem. Even though the statement is usually referred to as the "aitch theorem", it is sometimes instead called the "eta theorem", as the capital Greek letter eta (''Η'') is indistinguishable from the capital version of the Latin letter h (''H''). Discussions have been raised on how the symbol should be understood, but it remains unclear due to the lack of written sources from the time of the theorem. Studies of the typography and the work of J. W. Gibbs seem to favour the interpretation of ''H'' as eta.


Definition and meaning of Boltzmann's ''H''

The ''H'' value is determined from the function ''f''(''E'', ''t'') ''dE'', which is the energy distribution function of molecules at time ''t''. The value ''f''(''E'', ''t'') ''dE'' is the number of molecules that have kinetic energy between ''E'' and ''E'' + ''dE''. ''H'' itself is defined as

: H(t) = \int_0^\infty f(E,t) \left( \ln\frac{f(E,t)}{\sqrt{E}} - 1 \right) \, dE.

For an isolated ideal gas (with fixed total energy and fixed total number of particles), the function ''H'' is at a minimum when the particles have a Maxwell–Boltzmann distribution; if the molecules of the ideal gas are distributed in some other way (say, all having the same kinetic energy), then the value of ''H'' will be higher. Boltzmann's ''H''-theorem, described in the next section, shows that when collisions between molecules are allowed, such distributions are unstable and tend irreversibly towards the minimum value of ''H'' (towards the Maxwell–Boltzmann distribution).

(Note on notation: Boltzmann originally used the letter ''E'' for the quantity ''H''; most of the literature after Boltzmann uses the letter ''H'', as here. Boltzmann also used the symbol ''x'' to refer to the kinetic energy of a particle.)
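The claim that the Maxwell–Boltzmann distribution minimises ''H'' can be checked numerically. The sketch below (an illustration, with units chosen so that ''kT'' = 1 and normalisation ''N'' = 1) compares ''H'' for the Maxwell–Boltzmann energy distribution against a flat distribution on [0, 3], which has the same normalisation and the same mean energy 3/2:

```python
import numpy as np

# Energy grid; the integrand is integrable at E -> 0, so start just above it.
E = np.linspace(1e-9, 40.0, 400_001)
dE = E[1] - E[0]

def boltzmann_H(f):
    """H = integral of f * (ln(f / sqrt(E)) - 1) dE, via a Riemann sum."""
    log_f = np.log(np.where(f > 0, f, 1.0))  # placeholder 1.0 where f == 0
    integrand = np.where(f > 0, f * (log_f - 0.5 * np.log(E) - 1.0), 0.0)
    return integrand.sum() * dE

# Maxwell-Boltzmann energy distribution with kT = 1: f(E) ~ sqrt(E) exp(-E)
f_mb = (2.0 / np.sqrt(np.pi)) * np.sqrt(E) * np.exp(-E)
# Flat distribution on [0, 3]: same normalisation (1) and mean energy (3/2)
f_flat = np.where(E < 3.0, 1.0 / 3.0, 0.0)

H_mb, H_flat = boltzmann_H(f_mb), boltzmann_H(f_flat)
print(H_mb, H_flat)   # H_mb is lower: roughly -2.38 versus -2.15
```

Any other distribution with the same particle number and total energy gives a larger ''H'', consistent with the Maxwell–Boltzmann distribution being the minimiser.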


Boltzmann's ''H'' theorem

Boltzmann considered what happens during the collision between two particles. It is a basic fact of mechanics that in an elastic collision between two particles (such as hard spheres), the energy transferred between the particles varies depending on initial conditions (angle of collision, etc.). Boltzmann made a key assumption known as the ''Stosszahlansatz'' (molecular chaos assumption): during any collision event in the gas, the two particles participating in the collision have (1) kinetic energies independently chosen from the distribution, (2) independent velocity directions, and (3) independent starting points. Under these assumptions, and given the mechanics of energy transfer, the energies of the particles after the collision will obey a certain new random distribution that can be computed. Considering repeated uncorrelated collisions between any and all of the molecules in the gas, Boltzmann constructed his kinetic equation (Boltzmann's equation). From this kinetic equation, a natural outcome is that the continual process of collision causes the quantity ''H'' to decrease until it has reached a minimum.
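The decrease of ''H'' under repeated uncorrelated collisions can be illustrated with a toy model (this is not Boltzmann's full collision kernel, and the discrete ''H'' below is the plain histogram quantity Σ ''f'' ln ''f'' ''dE'', omitting the √''E'' density-of-states factor): all particles start with identical energy, and random pairs repeatedly pool and re-split their combined energy in a random proportion.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 100_000
e = np.ones(N)   # all particles start with the same kinetic energy

def H_discrete(e, bins=np.linspace(0.0, 10.0, 51)):
    """Discrete analogue of Boltzmann's H: sum of f ln f * dE over energy bins."""
    f, edges = np.histogram(e, bins=bins, density=True)
    dE = edges[1] - edges[0]
    f = f[f > 0]
    return np.sum(f * np.log(f)) * dE

H_before = H_discrete(e)
for _ in range(200):                  # rounds of random pairwise "collisions"
    rng.shuffle(e)
    tot = e[:N // 2] + e[N // 2:]     # pair particles and pool their energies
    u = rng.random(N // 2)            # random share kept by the first partner
    e = np.concatenate([u * tot, (1.0 - u) * tot])
H_after = H_discrete(e)
print(H_before, H_after)   # H decreases as the energy spike relaxes
```

Energy is conserved in every exchange, yet the sharply peaked initial distribution relaxes towards a broad equilibrium one, and the discrete ''H'' drops accordingly.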


Impact

Although Boltzmann's ''H''-theorem turned out not to be the absolute proof of the second law of thermodynamics as originally claimed (see Criticisms below), the ''H''-theorem led Boltzmann in the last years of the 19th century to more and more probabilistic arguments about the nature of thermodynamics. The probabilistic view of thermodynamics culminated in 1902 with Josiah Willard Gibbs's statistical mechanics for fully general systems (not just gases), and the introduction of generalized statistical ensembles. The kinetic equation, and in particular Boltzmann's molecular chaos assumption, inspired a whole family of Boltzmann equations that are still used today to model the motions of particles, such as the electrons in a semiconductor. In many cases the molecular chaos assumption is highly accurate, and the ability to discard complex correlations between particles makes calculations much simpler. The process of thermalisation can be described using the ''H''-theorem or the relaxation theorem.


Criticism and exceptions

There are several notable reasons, described below, why the ''H''-theorem, at least in its original 1872 form, is not completely rigorous. As Boltzmann would eventually go on to admit, the arrow of time in the ''H''-theorem is not in fact purely mechanical, but really a consequence of assumptions about initial conditions.


Loschmidt's paradox

Soon after Boltzmann published his ''H''-theorem, Johann Josef Loschmidt objected that it should not be possible to deduce an irreversible process from time-symmetric dynamics and a time-symmetric formalism. If ''H'' decreases over time in one state, then there must be a matching reversed state where ''H'' increases over time (Loschmidt's paradox). The explanation is that Boltzmann's equation is based on the assumption of "molecular chaos", i.e., that it follows from, or at least is consistent with, the underlying kinetic model that the particles be considered independent and uncorrelated. It turns out that this assumption breaks time-reversal symmetry in a subtle sense, and therefore begs the question. Once the particles are allowed to collide, their velocity directions and positions in fact ''do'' become correlated (however, these correlations are encoded in an extremely complex manner). This shows that an (ongoing) assumption of independence is not consistent with the underlying particle model.

Boltzmann's reply to Loschmidt was to concede the possibility of these states, while noting that such states were so rare and unusual as to be impossible in practice. Boltzmann would go on to sharpen this notion of the "rarity" of states, resulting in his entropy formula of 1877 (see Boltzmann's entropy formula).


Spin echo

As a demonstration of Loschmidt's paradox, a famous modern counterexample (not to Boltzmann's original gas-related ''H''-theorem, but to a closely related analogue) is the phenomenon of spin echo. In the spin echo effect, it is physically possible to induce time reversal in an interacting system of spins. An analogue to Boltzmann's ''H'' for the spin system can be defined in terms of the distribution of spin states in the system. In the experiment, the spin system is initially perturbed into a non-equilibrium state (high ''H''), and, as predicted by the ''H''-theorem, the quantity ''H'' soon decreases to the equilibrium value. At some point, a carefully constructed electromagnetic pulse is applied that reverses the motions of all the spins. The spins then undo the time evolution from before the pulse, and after some time ''H'' actually ''increases'' away from equilibrium (once the evolution has completely unwound, ''H'' decreases once again to the minimum value). In some sense, the time-reversed states noted by Loschmidt turned out to be not completely impractical.


Poincaré recurrence

In 1896, Ernst Zermelo noted a further problem with the ''H''-theorem: if the system's ''H'' is at any time not a minimum, then by Poincaré recurrence, the non-minimal ''H'' must recur (though after some extremely long time). Boltzmann admitted that these recurring rises in ''H'' technically would occur, but pointed out that, over long times, the system spends only a tiny fraction of its time in one of these recurring states.

The second law of thermodynamics states that the entropy of an isolated system always increases to a maximum equilibrium value. This is strictly true only in the thermodynamic limit of an infinite number of particles. For a finite number of particles, there will always be entropy fluctuations. For example, in the fixed volume of the isolated system, the maximum entropy is obtained when half the particles are in one half of the volume and half in the other, but sometimes there will temporarily be a few more particles on one side than the other, and this will constitute a very small reduction in entropy. These entropy fluctuations are such that the longer one waits, the larger an entropy fluctuation one will probably see during that time, and the time one must wait for a given entropy fluctuation is always finite, even for a fluctuation to its minimum possible value. For example, one might have an extremely low-entropy condition of all particles being in one half of the container. The gas will quickly attain its equilibrium value of entropy, but given enough time, this same situation will happen again. For practical systems, e.g. a gas in a 1-litre container at room temperature and atmospheric pressure, this time is truly enormous, many multiples of the age of the universe, and, practically speaking, one can ignore the possibility.
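The exponential growth of such waiting times with particle number can be illustrated with a toy model (an assumption for illustration only, not Boltzmann's gas): each of ''N'' particles independently picks a random half of the container at every time step, and we measure how long we wait until all ''N'' happen to sit in the left half at once. The expected waiting time is 2^''N'' steps.

```python
import numpy as np

rng = np.random.default_rng(1)

def mean_wait_until_all_left(n_particles, trials=300, max_steps=100_000):
    """Average number of steps until all particles are simultaneously in the
    left half; each particle resamples its side independently every step."""
    waits = []
    for _ in range(trials):
        for step in range(1, max_steps):
            left = rng.random(n_particles) < 0.5
            if left.all():          # rare "all on one side" fluctuation
                waits.append(step)
                break
    return float(np.mean(waits))

t2 = mean_wait_until_all_left(2)
t8 = mean_wait_until_all_left(8)
print(t2, t8)   # roughly 4 and 256: waiting time grows like 2**N
```

Scaled up to a mole of particles, the same 2^''N'' factor is what makes the recurrence time astronomically large for any macroscopic gas.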


Fluctuations of ''H'' in small systems

Since ''H'' is a mechanically defined variable that is not conserved, then, like any other such variable (pressure, etc.), it will show thermal fluctuations. This means that ''H'' regularly shows spontaneous increases from the minimum value. Technically this is not an exception to the ''H''-theorem, since the ''H''-theorem was only intended to apply to a gas with a very large number of particles. The fluctuations are only perceptible when the system is small and the time interval over which it is observed is not enormously large. If ''H'' is interpreted as entropy, as Boltzmann intended, then this can be seen as a manifestation of the fluctuation theorem.


Connection to information theory

''H'' is a forerunner of Shannon's information entropy. Claude Shannon denoted his measure of information entropy ''H'' after the ''H''-theorem. The article on Shannon's information entropy contains an explanation of the discrete counterpart of the quantity ''H'', known as the information entropy or information uncertainty (with a minus sign). By extending the discrete information entropy to the continuous information entropy, also called differential entropy, one obtains the expression in the equation from the section above, Definition and Meaning of Boltzmann's ''H'', and thus a better feel for the meaning of ''H''.

The ''H''-theorem's connection between information and entropy plays a central role in a recent controversy called the black hole information paradox.
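The passage from discrete information entropy to differential entropy can be seen numerically: binning a continuous density with bin width Δ gives a discrete entropy of approximately ''h'' − ln Δ, where ''h'' is the differential entropy. A minimal sketch for a standard normal density, whose differential entropy is ½ ln 2πe ≈ 1.4189 nats:

```python
import numpy as np

dx = 0.01                                     # bin width (delta)
x = np.arange(-10.0, 10.0, dx)
p = np.exp(-x**2 / 2.0) / np.sqrt(2.0 * np.pi)  # standard normal density
p_bin = p * dx                                # probability mass per bin

# Discrete Shannon entropy of the binned distribution (in nats)
H_discrete = -np.sum(p_bin * np.log(p_bin))

# Recover the differential entropy: h = H_discrete + ln(dx) as dx -> 0
h = H_discrete + np.log(dx)
print(h)   # close to 0.5 * ln(2*pi*e), about 1.4189
```

The divergent − ln Δ term is why differential entropy, unlike discrete entropy, can be negative and is only defined up to the choice of measure, an issue that also shadows Boltzmann's continuous ''H''.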


Tolman's ''H''-theorem

Richard C. Tolman's 1938 book ''The Principles of Statistical Mechanics'' dedicates a whole chapter to the study of Boltzmann's ''H''-theorem, and its extension in the generalized classical statistical mechanics of Gibbs. A further chapter is devoted to the quantum mechanical version of the ''H''-theorem.


Classical mechanical

We let q_i and p_i be our generalized coordinates for a set of r particles. Then we consider a function f that returns the probability density of particles over the states in phase space. Multiplying f by a small region in phase space, denoted by \delta q_1 \cdots \delta p_r, yields the (average) expected number of particles in that region:

: \delta n = f(q_1 \ldots p_r, t) \, \delta q_1 \delta p_1 \cdots \delta q_r \delta p_r.

Tolman offers the following equations for the definition of the quantity ''H'' in Boltzmann's original ''H''-theorem:

: H = \sum_i f_i \ln f_i \, \delta q_1 \cdots \delta p_r

Here we sum over the regions into which phase space is divided, indexed by i. In the limit of an infinitesimal phase space volume \delta q_i \rightarrow 0, \delta p_i \rightarrow 0 \; \forall \, i, we can write the sum as an integral:

: H = \int \cdots \int f \ln f \, dq_1 \cdots dp_r

''H'' can also be written in terms of the number of molecules present in each of the cells:

: H = \sum_i \left( n_i \ln n_i - n_i \ln \delta v_\gamma \right) = \sum_i n_i \ln n_i + \text{constant}

(Tolman 1938, p. 135, formula 47.7.) An additional way to calculate the quantity ''H'' is

: H = -\ln P + \text{constant}

where ''P'' is the probability of finding a system chosen at random from the specified microcanonical ensemble. It can finally be written as

: H = -\ln G + \text{constant}

where ''G'' is the number of classical states. The quantity ''H'' can also be defined as the integral over velocity space:

: H \ \stackrel{\text{def}}{=}\ \int P \ln P \, d^3 v = \left\langle \ln P \right\rangle

where ''P''(''v'') is the probability distribution. Using the Boltzmann equation one can prove that ''H'' can only decrease. For a system of ''N'' statistically independent particles, ''H'' is related to the thermodynamic entropy ''S'' through

: S \ \stackrel{\text{def}}{=}\ -NkH + \text{constant}

So, according to the ''H''-theorem, ''S'' can only increase.
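The velocity-space form H = ⟨ln P⟩ lends itself to a Monte Carlo estimate: sample velocities from a Maxwellian and average ln P. A sketch, assuming units with ''kT''/''m'' = 1 so that each velocity component is standard normal; in that case the analytic value is ⟨ln P⟩ = −(3/2) ln 2π − 3/2 ≈ −4.257:

```python
import numpy as np

rng = np.random.default_rng(0)

# Maxwellian in units with kT/m = 1: each velocity component ~ N(0, 1),
# so P(v) = (2*pi)**(-3/2) * exp(-|v|**2 / 2).
v = rng.standard_normal((1_000_000, 3))
log_P = -1.5 * np.log(2.0 * np.pi) - 0.5 * np.sum(v**2, axis=1)

H_estimate = log_P.mean()                      # H = <ln P> over the samples
H_exact = -1.5 * np.log(2.0 * np.pi) - 1.5     # analytic value for this P
print(H_estimate, H_exact)                     # both close to -4.257
```

Since H here is an average of ln P per particle, S = −NkH + constant scales extensively with the particle number, as thermodynamic entropy must.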


Quantum mechanical

In quantum statistical mechanics (which is the quantum version of classical statistical mechanics), the H-function is the function

: H = \sum_i p_i \ln p_i,

where the summation runs over all possible distinct states of the system, and p_i is the probability that the system could be found in the ''i''-th state. This is closely related to the entropy formula of Gibbs,

: S = -k \sum_i p_i \ln p_i,

and we shall (following e.g., Waldram (1985), p. 39) proceed using ''S'' rather than ''H''. First, differentiating with respect to time gives

: \frac{dS}{dt} = -k \sum_i \left( \frac{dp_i}{dt} \ln p_i + \frac{dp_i}{dt} \right) = -k \sum_i \frac{dp_i}{dt} \ln p_i

(using the fact that \sum_i dp_i/dt = 0, since \sum_i p_i = 1, so the second term vanishes; it will prove useful later to break this sum in two).

Now Fermi's golden rule gives a master equation for the average rate of quantum jumps from state α to β, and from state β to α. (Of course, Fermi's golden rule itself makes certain approximations, and the introduction of this rule is what introduces irreversibility. It is essentially the quantum version of Boltzmann's ''Stosszahlansatz''.) For an isolated system the jumps will make contributions

: \frac{dp_\alpha}{dt} = \sum_\beta \nu_{\alpha\beta} (p_\beta - p_\alpha), \qquad \frac{dp_\beta}{dt} = \sum_\alpha \nu_{\alpha\beta} (p_\alpha - p_\beta),

where the reversibility of the dynamics ensures that the same transition constant \nu_{\alpha\beta} appears in both expressions. So

: \frac{dS}{dt} = \frac{1}{2} k \sum_{\alpha\beta} \nu_{\alpha\beta} (\ln p_\beta - \ln p_\alpha)(p_\beta - p_\alpha).

The two difference terms in each summand always have the same sign: if p_\alpha < p_\beta then \ln p_\alpha < \ln p_\beta, so every summand is non-negative. Therefore,

: \Delta S \geq 0

for an isolated system. The same mathematics is sometimes used to show that relative entropy is a Lyapunov function of a Markov process in detailed balance, and other chemistry contexts.
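The monotonic growth of ''S'' under a symmetric master equation can be checked numerically. A minimal sketch (random symmetric rates ν and a forward-Euler integration are illustrative assumptions, with ''k'' = 1):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5

# Symmetric transition rates: nu[a, b] == nu[b, a], zero on the diagonal
nu = rng.random((n, n))
nu = 0.5 * (nu + nu.T)
np.fill_diagonal(nu, 0.0)

p = rng.random(n)
p /= p.sum()                       # random initial probability distribution

def entropy(p):
    return -np.sum(p * np.log(p))  # Gibbs entropy with k = 1

dt = 1e-3
S_trace = [entropy(p)]
for _ in range(5000):
    # Master equation: dp_a/dt = sum_b nu[a, b] * (p_b - p_a)
    dp = nu @ p - nu.sum(axis=1) * p
    p = p + dt * dp
    S_trace.append(entropy(p))

print(S_trace[0], S_trace[-1])     # S rises towards ln(n), the uniform maximum
```

With symmetric rates, every step of the evolution pushes probability from more-occupied to less-occupied states, so the entropy trace is non-decreasing and saturates at the uniform distribution.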


Gibbs' ''H''-theorem

Josiah Willard Gibbs described another way in which the entropy of a microscopic system would tend to increase over time (Chapter XII of his ''Elementary Principles in Statistical Mechanics''). Later writers have called this "Gibbs' ''H''-theorem", as its conclusion resembles that of Boltzmann's. Gibbs himself never called it an ''H''-theorem, and in fact his definition of entropy—and his mechanism of increase—are very different from Boltzmann's. This section is included for historical completeness.

The setting of Gibbs' entropy production theorem is in ensemble statistical mechanics, and the entropy quantity is the Gibbs entropy (information entropy) defined in terms of the probability distribution for the entire state of the system. This is in contrast to Boltzmann's ''H'', defined in terms of the distribution of states of individual molecules within a specific state of the system.

Gibbs considered the motion of an ensemble which initially starts out confined to a small region of phase space, meaning that the state of the system is known with fair precision though not quite exactly (low Gibbs entropy). The evolution of this ensemble over time proceeds according to Liouville's equation. For almost any kind of realistic system, the Liouville evolution tends to "stir" the ensemble over phase space, a process analogous to the mixing of a dye in an incompressible fluid. After some time, the ensemble appears to be spread out over phase space, although it is actually a finely striped pattern, with the total volume of the ensemble (and its Gibbs entropy) conserved. Liouville's equation is guaranteed to conserve Gibbs entropy since there is no random process acting on the system; in principle, the original ensemble can be recovered at any time by reversing the motion.

The critical point of the theorem is thus: if the fine structure in the stirred-up ensemble is very slightly blurred, for any reason, then the Gibbs entropy increases, and the ensemble becomes an equilibrium ensemble. As to why this blurring should occur in reality, there are a variety of suggested mechanisms. For example, one suggested mechanism is that the phase space is coarse-grained for some reason (analogous to the pixelization in the simulation of phase space shown in the figure). For any required finite degree of fineness, the ensemble becomes "sensibly uniform" after a finite time. Or, if the system experiences a tiny uncontrolled interaction with its environment, the sharp coherence of the ensemble will be lost.
Edwin Thompson Jaynes argued that the blurring is subjective in nature, simply corresponding to a loss of knowledge about the state of the system (E. T. Jaynes, "Gibbs vs Boltzmann Entropies", American Journal of Physics 33, 391 (1965)). In any case, however it occurs, the Gibbs entropy increase is irreversible provided the blurring cannot be reversed. The exactly evolving entropy, which does not increase, is known as ''fine-grained entropy''. The blurred entropy is known as ''coarse-grained entropy''. Leonard Susskind analogizes this distinction to the notion of the volume of a fibrous ball of cotton (Leonard Susskind, Statistical Mechanics Lecture 7 (2013), video on YouTube).
On one hand the volume of the fibers themselves is constant, but in another sense there is a larger coarse-grained volume, corresponding to the outline of the ball.

Gibbs' entropy increase mechanism solves some of the technical difficulties found in Boltzmann's ''H''-theorem: the Gibbs entropy does not fluctuate, nor does it exhibit Poincaré recurrence, and so the increase in Gibbs entropy, when it occurs, is irreversible, as expected from thermodynamics. The Gibbs mechanism also applies equally well to systems with very few degrees of freedom, such as the single-particle system shown in the figure. To the extent that one accepts that the ensemble becomes blurred, then, Gibbs' approach is a cleaner proof of the second law of thermodynamics.

Unfortunately, as pointed out early in the development of quantum statistical mechanics by John von Neumann and others, this kind of argument does not carry over to quantum mechanics. In quantum mechanics, the ensemble cannot support an ever-finer mixing process, because of the finite dimensionality of the relevant portion of Hilbert space. Instead of converging closer and closer to the equilibrium ensemble (time-averaged ensemble) as in the classical case, the density matrix of the quantum system will constantly show evolution, even showing recurrences. Developing a quantum version of the ''H''-theorem without appeal to the ''Stosszahlansatz'' is thus significantly more complicated.
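The stirring-and-blurring picture can be illustrated with a simple area-preserving map (an illustrative stand-in for Liouville evolution, not a physical system): Arnold's cat map stretches and folds an initially compact cloud of points over the unit torus. The fine-grained phase-space volume of the cloud is conserved, but a coarse-grained entropy computed on a fixed grid of cells rises towards its maximum.

```python
import numpy as np

rng = np.random.default_rng(0)

# Ensemble initially confined to a small square of the unit torus (low entropy)
pts = 0.05 * rng.random((100_000, 2))

def coarse_entropy(pts, n_cells=20):
    """Coarse-grained Gibbs entropy: -sum p ln p over an n_cells x n_cells grid."""
    idx = np.floor(pts * n_cells).astype(int)
    counts = np.bincount(idx[:, 0] * n_cells + idx[:, 1], minlength=n_cells**2)
    p = counts[counts > 0] / len(pts)
    return -np.sum(p * np.log(p))

S_before = coarse_entropy(pts)
for _ in range(12):
    # Arnold cat map: an area-preserving (volume-conserving) transformation
    x, y = pts[:, 0], pts[:, 1]
    pts = np.column_stack(((2.0 * x + y) % 1.0, (x + y) % 1.0))
S_after = coarse_entropy(pts)
print(S_before, S_after)   # S_after approaches ln(400), the 20x20-grid maximum
```

Reversing the map would recover the original square exactly (the fine-grained entropy never changed); only the coarse-grained description, blind to the filamentary fine structure, registers an entropy increase.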


See also

* Loschmidt's paradox
* Arrow of time
* Second law of thermodynamics
* Fluctuation theorem
* Ehrenfest diffusion model

