Gibbs paradox
In statistical mechanics, a semi-classical derivation of entropy that does not take into account the indistinguishability of particles yields an expression for entropy which is not extensive (is not proportional to the amount of substance in question). This leads to a paradox known as the Gibbs paradox, after Josiah Willard Gibbs, who proposed this thought experiment in 1874‒1875. The paradox allows for the entropy of closed systems to decrease, violating the second law of thermodynamics. A related paradox is the "mixing paradox". If one takes the perspective that the definition of entropy must be changed so as to ignore particle permutation, in the thermodynamic limit, the paradox is averted.


Illustration of the problem

Gibbs considered the following difficulty that arises if the ideal gas entropy is not extensive. Two containers of an ideal gas sit side-by-side. The gas in container #1 is identical in every respect to the gas in container #2 (i.e. in volume, mass, temperature, pressure, etc.). Accordingly, they have the same entropy ''S''. Now a door in the container wall is opened to allow the gas particles to mix between the containers. No macroscopic changes occur, as the system is in equilibrium. But if the formula for entropy is not extensive, the entropy of the combined system will not be 2''S''. In fact, the particular non-extensive entropy quantity considered by Gibbs predicts additional entropy (more than 2''S''). Closing the door then reduces the entropy again to ''S'' per box, in apparent violation of the second law of thermodynamics.

As understood by Gibbs, and reemphasized more recently, this is a misuse of Gibbs' non-extensive entropy quantity. If the gas particles are distinguishable, closing the door will not return the system to its original state: many of the particles will have switched containers. There is a freedom in defining what is "ordered", and it would be a mistake to conclude that the entropy has not increased. In particular, Gibbs' non-extensive entropy quantity for an ideal gas is not intended for situations where the number of particles changes. The paradox is averted by assuming the indistinguishability (at least effective indistinguishability) of the particles in the volume. This results in the extensive Sackur–Tetrode equation for entropy, as derived next.


Calculating the entropy of an ideal gas, and making it extensive

In classical mechanics, the state of an ideal gas of energy ''U'', volume ''V'' and with ''N'' particles, each particle having mass ''m'', is represented by specifying the momentum vector ''p'' and the position vector ''x'' for each particle. This can be thought of as specifying a point in a 6''N''-dimensional phase space, where each of the axes corresponds to one of the momentum or position coordinates of one of the particles. The set of points in phase space that the gas could occupy is specified by the constraint that the gas will have a particular energy:

U = \frac{1}{2m} \sum_{i=1}^N \left(p_{i1}^2 + p_{i2}^2 + p_{i3}^2\right)

and be contained inside of the volume ''V'' (let's say ''V'' is a cube of side ''X'' so that ''V'' = ''X''^3):

0 \le x_{ij} \le X \quad \text{for } i = 1, \dots, N \text{ and } j = 1, 2, 3

The first constraint defines the surface of a 3''N''-dimensional hypersphere of radius (2''mU'')^{1/2} and the second is a 3''N''-dimensional hypercube of volume ''V''^''N''. These combine to form a 6''N''-dimensional hypercylinder. Just as the area of the wall of a cylinder is the circumference of the base times the height, so the area ''φ'' of the wall of this hypercylinder is:

\phi(U, V, N) = V^N \left( \frac{2\pi^{\frac{3N}{2}} (2mU)^{\frac{3N-1}{2}}}{\Gamma(3N/2)} \right) \qquad (1)

The entropy is proportional to the logarithm of the number of states that the gas could have while satisfying these constraints. In classical physics, the number of states is infinitely large, but according to quantum mechanics it is finite. Before the advent of quantum mechanics, this infinity was regularized by making phase space discrete. Phase space was divided up in blocks of volume ''h''^{3''N''}. The constant ''h'' thus appeared as a result of a mathematical trick, and was thought to have no physical significance. However, using quantum mechanics one recovers the same formalism in the semi-classical limit, but now with ''h'' being the Planck constant. One can qualitatively see this from Heisenberg's uncertainty principle; a volume in the 6''N''-dimensional phase space smaller than ''h''^{3''N''} cannot be specified.

To compute the number of states we must compute the volume in phase space in which the system can be found and divide that by ''h''^{3''N''}. This leads us to another problem: the volume seems to approach zero, as the region in phase space in which the system can be is an area of zero thickness. This problem is an artifact of having specified the energy ''U'' with infinite accuracy. In a generic system without symmetries, a full quantum treatment would yield a discrete non-degenerate set of energy eigenstates. An exact specification of the energy would then fix the precise state the system is in, so the number of states available to the system would be one; the entropy would thus be zero. When we specify the internal energy to be ''U'', what we really mean is that the total energy of the gas lies somewhere in an interval of length \delta U around ''U''. Here \delta U is taken to be very small; it turns out that the entropy doesn't depend strongly on the choice of \delta U for large ''N''. This means that the above "area" ''φ'' must be extended to a shell of a thickness equal to an uncertainty in momentum

\delta p = \delta\left(\sqrt{2mU}\right) = \sqrt{\frac{m}{2U}}\, \delta U,

so the entropy is given by:

S = k_\text{B} \ln\left(\phi\, \delta p / h^{3N}\right)

where the constant of proportionality is ''k''B, the Boltzmann constant.
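The claim that the entropy depends only weakly on the choice of \delta U can be checked numerically. The following Python sketch (illustrative only; the argon-like mass and the values of ''U'', ''V'', ''N'' are assumptions made for the example) evaluates S/k_\text{B} = \ln(\phi\,\delta p/h^{3N}) in logarithms to avoid overflow:

```python
import math

def entropy_over_k(U, V, N, m=6.6e-26, h=6.626e-34, dU_frac=1e-6):
    """S/k_B = ln(phi * delta_p / h^(3N)) for a classical monatomic ideal gas.

    phi is the hypersurface 'area' V^N * 2 pi^(3N/2) (2mU)^((3N-1)/2) / Gamma(3N/2),
    and delta_p = sqrt(m/(2U)) * delta_U is the shell thickness in momentum.
    Everything is computed in logs to avoid overflow at large N.
    """
    dU = dU_frac * U
    log_phi = (N * math.log(V)
               + math.log(2) + 1.5 * N * math.log(math.pi)
               + 0.5 * (3 * N - 1) * math.log(2 * m * U)
               - math.lgamma(1.5 * N))
    log_dp = 0.5 * math.log(m / (2 * U)) + math.log(dU)
    return log_phi + log_dp - 3 * N * math.log(h)

# ~1000 argon-like atoms at roughly room-temperature energies in a 1-litre box
U, V, N = 6e-18, 1e-3, 1000
for frac in (1e-3, 1e-6, 1e-9):
    print(frac, entropy_over_k(U, V, N, dU_frac=frac))
```

Varying \delta U over six orders of magnitude changes S/k_\text{B} only by terms of order \ln N, which are negligible next to the terms of order ''N''.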
Using Stirling's approximation for the Gamma function, which omits terms of less than order ''N'', the entropy for large ''N'' becomes:

S = k_\text{B} N \ln \left[ V \left( \frac{U}{N} \right)^{3/2} \right] + \frac{3}{2} k_\text{B} N \left( 1 + \ln\frac{4\pi m}{3h^2}\right)

This quantity is not extensive, as can be seen by considering two identical volumes with the same particle number and the same energy. Suppose the two volumes are separated by a barrier in the beginning. Removing or reinserting the wall is reversible, but the entropy increases when the barrier is removed by the amount

\delta S = k_\text{B} \left[ 2N \ln(2V) - N\ln V - N \ln V \right] = 2 k_\text{B} N \ln 2 > 0,

which contradicts thermodynamics if the barrier is re-inserted. This is the Gibbs paradox. The paradox is resolved by postulating that the gas particles are in fact indistinguishable. This means that all states that differ only by a permutation of particles should be considered as the same state. For example, if we have a 2-particle gas and we specify ''AB'' as a state of the gas where the first particle (''A'') has momentum p1 and the second particle (''B'') has momentum p2, then this state as well as the ''BA'' state where the ''B'' particle has momentum p1 and the ''A'' particle has momentum p2 should be counted as the same state. For an ''N''-particle gas, there are ''N''! states which are identical in this sense, if one assumes that each particle is in a different single-particle state. One can safely make this assumption provided the gas isn't at an extremely high density. Under normal conditions, one can thus calculate the volume of phase space occupied by the gas, by dividing Equation 1 by ''N''!. Using the Stirling approximation again for large ''N'', ln(''N''!) ≈ ''N'' ln(''N'') − ''N'', the entropy for large ''N'' is:

S = k_\text{B} N \ln \left[ \frac{V}{N} \left( \frac{U}{N} \right)^{3/2} \right] + k_\text{B} N \left( \frac{5}{2} + \frac{3}{2} \ln\frac{4\pi m}{3h^2}\right)

which can be easily shown to be extensive. This is the Sackur–Tetrode equation.
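A short numerical check makes the paradox and its resolution concrete. The sketch below (assuming argon-like parameters purely for illustration) evaluates both large-''N'' formulas for one box and for the doubled system; the non-extensive formula overshoots 2''S'' by exactly 2k_\text{B}N\ln 2, while the Sackur–Tetrode formula is extensive:

```python
import math

KB = 1.380649e-23          # Boltzmann constant, J/K
H = 6.62607015e-34         # Planck constant, J s
M = 6.6335e-26             # particle mass, kg (argon; illustrative choice)

def s_nonextensive(U, V, N):
    """k_B { N ln[V (U/N)^(3/2)] + (3/2) N (1 + ln(4 pi m / 3 h^2)) }"""
    return KB * (N * math.log(V * (U / N) ** 1.5)
                 + 1.5 * N * (1 + math.log(4 * math.pi * M / (3 * H ** 2))))

def s_sackur_tetrode(U, V, N):
    """k_B { N ln[(V/N)(U/N)^(3/2)] + N (5/2 + (3/2) ln(4 pi m / 3 h^2)) }"""
    return KB * (N * math.log((V / N) * (U / N) ** 1.5)
                 + N * (2.5 + 1.5 * math.log(4 * math.pi * M / (3 * H ** 2))))

N, V, U = 1e22, 1e-3, 60.0   # one box (illustrative values)
for s in (s_nonextensive, s_sackur_tetrode):
    one, two = s(U, V, N), s(2 * U, 2 * V, 2 * N)
    # Excess over twice the single-box entropy; compare with 2 k_B N ln 2.
    print(s.__name__, two - 2 * one, 2 * KB * N * math.log(2))
```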


Mixing paradox

A paradox closely related to the Gibbs paradox is the ''mixing paradox''. In fact, the Gibbs paradox is a special case of the mixing paradox, which contains all of its salient features. The difference is that the mixing paradox deals with ''arbitrary'' distinctions between the two gases, not just distinctions in particle ordering as Gibbs had considered. In this sense, it is a straightforward generalization of the argument laid out by Gibbs. Again take a box with a partition in it, with gas A on one side and gas B on the other side, both gases at the same temperature and pressure. If gases A and B are different, there is an entropy that arises once the gases are mixed, the entropy of mixing. If the gases are the same, no additional entropy is calculated. The additional entropy from mixing does not depend on the character of the gases; it only depends on the fact that the gases are different. The two gases may be arbitrarily similar, but the entropy from mixing does not disappear unless they are the same gas – a paradoxical discontinuity.

This "paradox" can be explained by carefully considering the definition of entropy. In particular, as concisely explained by Edwin Thompson Jaynes, definitions of entropy are arbitrary. As a central example in Jaynes' paper shows, one can develop a theory that treats two gases as similar even if those gases may in reality be distinguished through sufficiently detailed measurement. As long as we do not perform these detailed measurements, the theory will have no internal inconsistencies. (In other words, it does not matter that we call gases A and B by the same name if we have not yet discovered that they are distinct.) If our theory calls gases A and B the same, then entropy does not change when we mix them. If our theory calls gases A and B different, then entropy ''does'' increase when they are mixed. This insight suggests that the ideas of "thermodynamic state" and of "entropy" are somewhat subjective.

The differential increase in entropy (''dS'') as a result of mixing dissimilar gases, multiplied by the temperature (''T''), equals the minimum amount of work we must do to restore the gases to their original separated state. Suppose that two gases are different, but that we are unable to detect their differences. If these gases are in a box, segregated from one another by a partition, how much work does it take to restore the system's original state after we remove the partition and let the gases mix? None – simply reinsert the partition. Even though the gases have mixed, there was never a detectable change of state in the system, because by hypothesis the gases are experimentally indistinguishable. As soon as we can distinguish the difference between gases, the work necessary to recover the pre-mixing macroscopic configuration from the post-mixing state becomes nonzero. This amount of work does not depend on how different the gases are, but only on whether they are distinguishable. This line of reasoning is particularly informative when considering the concepts of
indistinguishable particles
and correct Boltzmann counting. Boltzmann's original expression for the number of states available to a gas assumed that a state could be expressed in terms of a number of energy "sublevels", each of which contains a particular number of particles. While the particles in a given sublevel were considered indistinguishable from each other, particles in different sublevels were considered distinguishable from particles in any other sublevel. This amounts to saying that the exchange of two particles in two different sublevels will result in a detectably different "exchange macrostate" of the gas. For example, if we consider a simple gas with ''N'' particles, at sufficiently low density that it is practically certain that each sublevel contains either one particle or none (i.e. a Maxwell–Boltzmann gas), this means that a simple container of gas will be in one of ''N''! detectably different "exchange macrostates", one for each possible particle exchange. Just as the mixing paradox begins with two detectably different containers, and the extra entropy that results upon mixing is proportional to the average amount of work needed to restore that initial state after mixing, so the extra entropy in Boltzmann's original derivation is proportional to the average amount of work required to restore the simple gas from some "exchange macrostate" to its original "exchange macrostate". If we assume that there is in fact no experimentally detectable difference in these "exchange macrostates" available, then using the entropy which results from assuming the particles are indistinguishable will yield a consistent theory. This is "correct Boltzmann counting". It is often said that the resolution to the Gibbs paradox derives from the fact that, according to the quantum theory, like particles are indistinguishable in principle. By Jaynes' reasoning, if the particles are experimentally indistinguishable for whatever reason, the Gibbs paradox is resolved, and quantum mechanics only provides an assurance that in the quantum realm, this indistinguishability will be true as a matter of principle, rather than being due to an insufficiently refined experimental capability.
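The quantitative content of the mixing paradox, namely that the entropy of mixing depends only on whether the gases are treated as distinguishable and not on how different they are, can be expressed in a few lines. This is a minimal sketch, not a method from Jaynes' paper; the particle numbers are arbitrary:

```python
import math

KB = 1.380649e-23  # J/K

def mixing_entropy(n_a, n_b, distinguishable):
    """Entropy increase when two equal-pressure, equal-temperature gases mix.

    For distinguishable gases: dS = -k_B * sum_i N_i ln(N_i / N).
    For gases the theory treats as identical: dS = 0.
    The size of dS is independent of *how* different the gases are.
    """
    if not distinguishable:
        return 0.0
    n = n_a + n_b
    return -KB * (n_a * math.log(n_a / n) + n_b * math.log(n_b / n))

n = 1e22
print(mixing_entropy(n, n, True), 2 * n * KB * math.log(2))  # equal amounts: 2 N k_B ln 2
print(mixing_entropy(n, n, False))                           # same gas: 0
```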


Non-extensive entropy of two ideal gases and how to fix it

In this section, we present in rough outline a purely classical derivation of the non-extensive entropy for an ideal gas considered by Gibbs before "correct counting" (indistinguishability of particles) is accounted for. This is followed by a brief discussion of two standard methods for making the entropy extensive. Finally, we present a third method, due to R. Swendsen, for an extensive (additive) result for the entropy of two systems if they are allowed to exchange particles with each other.


Setup

We will present a simplified version of the calculation. It differs from the full calculation in three ways:
# The ideal gas consists of particles confined to one spatial dimension.
# We keep only the terms of order n \log(n), dropping all terms of size ''n'' or less, where ''n'' is the number of particles. For our purposes, this is enough, because this is where the Gibbs paradox shows up and where it must be resolved. The neglected terms play a role when the number of particles is not very large, such as in computer simulation and nanotechnology. Also, they are needed in deriving the Sackur–Tetrode equation.
# The subdivision of phase space into units of the Planck constant (''h'') is omitted. Instead, the entropy is defined using an integral over the "accessible" portion of phase space. This serves to highlight the purely classical nature of the calculation.

We begin with a version of Boltzmann's entropy in which the integrand is all of accessible phase space:

S = k_\text{B} \ln\Omega = k_\text{B} \ln \int d^n x \, d^n v

The integral is restricted to a contour of available regions of phase space, subject to conservation of energy. In contrast to the one-dimensional line integrals encountered in elementary physics, the contour of constant energy possesses a vast number of dimensions. The justification for integrating over phase space using the canonical measure involves the assumption of equal probability. The assumption can be made by invoking the ergodic hypothesis as well as Liouville's theorem of Hamiltonian systems. (The ergodic hypothesis underlies the ability of a physical system to reach thermal equilibrium, but this may not always hold for computer simulations (see the Fermi–Pasta–Ulam–Tsingou problem) or in certain real-world systems such as non-thermal plasmas.)

Liouville's theorem assumes a fixed number of dimensions that the system 'explores'. In calculations of entropy, the number of dimensions is proportional to the number of particles in the system, which forces phase space to abruptly change dimensionality when particles are added or subtracted. This may explain the difficulties in constructing a clear and simple derivation for the dependence of entropy on the number of particles. For the ideal gas, the accessible phase space is an (''n'' − 1)-sphere (also called a hypersphere) in the ''n''-dimensional \mathbf{v} space:

E = \sum_{j=1}^n \frac{1}{2} m v_j^2.

To recover the paradoxical result that entropy is not extensive, we integrate over phase space for a gas of ''n'' monatomic particles confined to a single spatial dimension by 0 \le x \le \ell. Since our only purpose is to illuminate a paradox, we simplify notation by taking the particle's mass and the Boltzmann constant equal to unity: m = k_\text{B} = 1. We represent points in phase space, and its ''x'' and ''v'' parts, by 2''n''- and ''n''-dimensional vectors:

\boldsymbol\xi = [x_1, \dots, x_n, v_1, \dots, v_n] = [\mathbf{x}, \mathbf{v}]

where

\mathbf{x} = [x_1, \dots, x_n] \quad\text{and}\quad \mathbf{v} = [v_1, \dots, v_n].

To calculate entropy, we use the fact that the (''n'' − 1)-sphere, \sum v_j^2 = R^2, has an (''n'' − 1)-dimensional "hypersurface volume" of

\tilde A_n(R) = \frac{2\pi^{n/2}}{\Gamma(n/2)} R^{n-1}.

For example, if ''n'' = 2, the 1-sphere is the circle \tilde A_2(R) = 2\pi R, a "hypersurface" in the plane. When the sphere is even-dimensional (''n'' odd), it will be necessary to use the gamma function to give meaning to the factorial; see below.
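The hypersurface-volume formula is easy to sanity-check numerically, since \tilde A_2(R) must reduce to the circumference of a circle and \tilde A_3(R) to the area of an ordinary sphere. A small Python check (illustrative):

```python
import math

def hypersurface_area(n, R):
    """'Area' of the (n-1)-sphere sum v_j^2 = R^2 in n dimensions:
    A_n(R) = 2 pi^(n/2) R^(n-1) / Gamma(n/2).
    math.gamma handles the half-integer arguments needed when n is odd."""
    return 2 * math.pi ** (n / 2) * R ** (n - 1) / math.gamma(n / 2)

R = 1.7
print(hypersurface_area(2, R), 2 * math.pi * R)        # circle: 2 pi R
print(hypersurface_area(3, R), 4 * math.pi * R ** 2)   # ordinary sphere: 4 pi R^2
```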


Gibbs paradox in a one-dimensional gas

The Gibbs paradox arises when entropy is calculated using an ''n''-dimensional phase space, where ''n'' is also the number of particles in the gas. These particles are spatially confined to the one-dimensional interval 0 \le x \le \ell. The volume of the surface of fixed energy is

\Omega_{E,\ell,n} = \left( \int_0^\ell dx_1 \int_0^\ell dx_2 \cdots \int_0^\ell dx_n \right) \underbrace{\oint dv_1\, dv_2 \cdots dv_n}_{\sum v_j^2 = 2E}

The subscripts on \Omega are used to define the 'state variables' and will be discussed later, when it is argued that the number of particles, ''n'', lacks full status as a state variable in this calculation. The integral over configuration space is \ell^n. As indicated by the underbrace, the integral over velocity space is restricted to the "surface area" of the (''n'' − 1)-dimensional hypersphere of radius \sqrt{2E}, and is therefore equal to the "area" of that hypersurface. Thus

\Omega_{E,\ell,n} = \ell^n \, \frac{2\pi^{n/2}}{\Gamma(n/2)} \, (2E)^{\frac{n-1}{2}}

After approximating the factorial and dropping the small terms, we obtain

\begin{align} \ln\Omega_{E,\ell,n} &\approx n\ln\ell + n \ln\sqrt{\frac{E}{n}} + \text{const}\\ &= \underbrace{n\ln\frac{\ell}{n} + n \ln\sqrt{\frac{E}{n}}}_{\text{extensive}} + n\ln n + \text{const} \end{align}

In the second expression, the term n\ln n was subtracted and added, using the fact that \ln\ell - \ln n = \ln(\ell/n). This was done to highlight exactly how the "entropy" defined here fails to be an extensive property of matter. The first two terms are extensive: if the volume of the system doubles, but gets filled with the same density of particles with the same energy, then each of these terms doubles. But the third term is neither extensive nor intensive and is therefore wrong. The arbitrary constant has been added because entropy can usually be viewed as being defined up to an arbitrary additive constant. This is especially necessary when entropy is defined as the logarithm of a phase space volume measured in units of momentum-position. Any change in how these units are defined will add or subtract a constant from the value of the entropy.
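The failure of extensivity can be verified directly from the formula for \Omega_{E,\ell,n}. The following sketch (with illustrative values of ''E'', \ell, ''n'') doubles the system and shows that the entropy exceeds twice the original by approximately 2n\ln 2, the same kind of excess found for the three-dimensional gas above:

```python
import math

def log_omega(E, ell, n):
    """ln of the phase-space 'volume'
    Omega = ell^n * 2 pi^(n/2) (2E)^((n-1)/2) / Gamma(n/2)
    for n unit-mass particles on a line of length ell (m = k_B = 1)."""
    return (n * math.log(ell)
            + math.log(2) + (n / 2) * math.log(math.pi)
            + ((n - 1) / 2) * math.log(2 * E)
            - math.lgamma(n / 2))

E, ell, n = 50.0, 1.0, 100000
s1 = log_omega(E, ell, n)
s2 = log_omega(2 * E, 2 * ell, 2 * n)
print(s2 - 2 * s1, 2 * n * math.log(2))   # non-extensive excess ~ 2 n ln 2
```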


Two standard ways to make the classical entropy extensive

As discussed above, an extensive form of entropy is recovered if we divide the volume of phase space, \Omega_{E,\ell,n}, by ''n''!. An alternative approach is to argue that the dependence on particle number cannot be trusted on the grounds that changing ''n'' also changes the dimensionality of phase space. Such changes in dimensionality lie outside the scope of Hamiltonian mechanics and Liouville's theorem. For that reason it is plausible to allow the arbitrary constant to be a function of ''n''. Defining the function to be f(n) = -n\ln n, which cancels the offending n\ln n term, we have:

\begin{align} S = \ln\Omega_{E,\ell,n} + f(n) &\approx n\ln\ell + n \ln\sqrt{\frac{E}{n}} + \text{const} + f(n)\\ &= n\ln\frac{\ell}{n} + n \ln\sqrt{\frac{E}{n}} + \text{const}, \end{align}

which has extensive scaling:

S(\alpha E, \alpha\ell, \alpha n) = \alpha\, S(E, \ell, n)
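Both corrections can be checked numerically. In the sketch below (illustrative values again), either subtracting \ln n! or adding f(n) = -n\ln n restores extensive scaling up to the neglected terms of order \ln n:

```python
import math

def log_omega(E, ell, n):
    """Same 1-D gas phase-space volume as above (m = k_B = 1)."""
    return (n * math.log(ell) + math.log(2) + (n / 2) * math.log(math.pi)
            + ((n - 1) / 2) * math.log(2 * E) - math.lgamma(n / 2))

def s_corrected(E, ell, n, fix):
    if fix == "n_factorial":          # divide Omega by n!
        return log_omega(E, ell, n) - math.lgamma(n + 1)
    if fix == "additive_constant":    # arbitrary constant chosen as f(n) = -n ln n
        return log_omega(E, ell, n) - n * math.log(n)
    raise ValueError(fix)

E, ell, n = 50.0, 1.0, 100000
for fix in ("n_factorial", "additive_constant"):
    s1 = s_corrected(E, ell, n, fix)
    s2 = s_corrected(2 * E, 2 * ell, 2 * n, fix)
    print(fix, s2 - 2 * s1)   # ~0 up to terms of order ln n
```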


Swendsen's particle-exchange approach

Following Swendsen, we allow two systems to exchange particles. This essentially 'makes room' in phase space for particles to enter or leave without requiring a change in the number of dimensions of phase space. The total number of particles is ''N'':
* n_A particles have coordinates 0 \le x \le \ell_A. The total energy of these particles is E_A.
* n_B particles have coordinates 0 \le x \le \ell_B. The total energy of these particles is E_B.
* The system is subject to the constraints E_A + E_B = E and n_A + n_B = N.

Taking the integral over phase space, we have:

\begin{align} \Omega^{(?)}_{E,\ell_A,\ell_B,N} &= \underbrace{\ell_A^{n_A}}_{x\text{-space, }A}\; \underbrace{\ell_B^{n_B}}_{x\text{-space, }B}\; \underbrace{\tilde A_{n_A}\!\left(\sqrt{2E_A}\right) \tilde A_{n_B}\!\left(\sqrt{2E_B}\right)}_{v\text{-space, }A\text{ and }B} \\[1ex] \Omega_{E,\ell_A,\ell_B,N} &= \frac{N!}{n_A!\, n_B!}\; \Omega^{(?)}_{E,\ell_A,\ell_B,N} \end{align}

The question marks (?) serve as a reminder that we may not assume that the first ''n_A'' particles (i.e. 1 through ''n_A'') are in system ''A'' while the other particles (''n_A'' + 1 through ''N'') are in system ''B''; the factor N!/(n_A!\,n_B!) counts the ways of assigning particles to the two systems. (This is further discussed in the next section.) Taking the logarithm and keeping only the largest terms, we have:

S = \ln\Omega_{E,\ell_A,\ell_B,N} \approx n_A \ln\!\left(\frac{\ell_A}{n_A}\sqrt{\frac{E_A}{n_A}}\right) + n_B \ln\!\left(\frac{\ell_B}{n_B}\sqrt{\frac{E_B}{n_B}}\right) + N\ln N + \text{const}

This can be interpreted as the sum of the entropy of system ''A'' and system ''B'', both extensive. And there is a term, N\ln N, that is not extensive.
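A numerical check of Swendsen's result (with illustrative values; m = k_\text{B} = 1 as above) compares the exact \ln\Omega, including the N!/(n_A!\,n_B!) exchange factor, against the large-''N'' expression; they differ only by the dropped terms of order ''N'', i.e. by a constant per particle:

```python
import math

def log_sphere_area(n, E):
    """ln of tilde A_n(sqrt(2E)) = 2 pi^(n/2) (2E)^((n-1)/2) / Gamma(n/2)."""
    return (math.log(2) + (n / 2) * math.log(math.pi)
            + ((n - 1) / 2) * math.log(2 * E) - math.lgamma(n / 2))

def log_omega_swendsen(E_a, ell_a, n_a, E_b, ell_b, n_b):
    """ln Omega for two 1-D gases (m = k_B = 1) allowed to exchange particles:
    the N!/(n_a! n_b!) particle assignments multiply the two subsystem factors."""
    N = n_a + n_b
    exchange = math.lgamma(N + 1) - math.lgamma(n_a + 1) - math.lgamma(n_b + 1)
    return (exchange
            + n_a * math.log(ell_a) + log_sphere_area(n_a, E_a)
            + n_b * math.log(ell_b) + log_sphere_area(n_b, E_b))

n_a = n_b = 50000
E_a = E_b = 25.0
ell_a = ell_b = 1.0
N = n_a + n_b

exact = log_omega_swendsen(E_a, ell_a, n_a, E_b, ell_b, n_b)
approx = (n_a * math.log((ell_a / n_a) * math.sqrt(E_a / n_a))
          + n_b * math.log((ell_b / n_b) * math.sqrt(E_b / n_b))
          + N * math.log(N))
# The difference per particle settles to a constant as N grows,
# confirming that only order-N terms were dropped.
print(exact, approx, (exact - approx) / N)
```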


Visualizing the particle-exchange approach in three dimensions

The correct (extensive) formulas for systems ''A'' and ''B'' were obtained because we included all the possible ways that the two systems could exchange particles. The use of combinations (i.e. ''N'' particles choose ''n_A'') was used to ascertain the number of ways ''N'' particles can be divided into system ''A'' containing ''n_A'' particles and system ''B'' containing ''n_B'' particles. This counting is not justified on physical grounds, but on the need to integrate over phase space. As will be illustrated below, phase space contains not a single ''n_A''-sphere and a single ''n_B''-sphere, but instead

\binom{N}{n_A} = \frac{N!}{n_A!\, n_B!}

pairs of ''n''-spheres, all situated in the same ''N''-dimensional velocity space. The integral over accessible phase space must include all of these ''n''-spheres, as can be seen in the figure, which shows the ''actual'' velocity phase space associated with a gas that consists of three particles. Moreover, this gas has been divided into two systems, ''A'' and ''B''. If we ignore the spatial variables, the phase space of a gas with three particles is three-dimensional, which permits one to sketch the ''n''-spheres over which the integral over phase space must be taken. If all three particles are together, the split between the two gases is 3|0. Accessible phase space is delimited by an ordinary sphere (2-sphere) with a radius that is either \sqrt{2E_A} or \sqrt{2E_B} (depending on which system has the particles). If the split is 2|1, then phase space consists of circles and points. Each circle occupies two dimensions, and for each circle, two points lie on the third axis, equidistant from the center of the circle. In other words, if system ''A'' has 2 particles, accessible phase space consists of 3 pairs of ''n''-spheres, each pair being a 1-sphere and a 0-sphere:

\begin{align} v_1^2 + v_2^2 &= 2E_A, & v_3^2 &= 2E_B, \\ v_2^2 + v_3^2 &= 2E_A, & v_1^2 &= 2E_B, \\ v_3^2 + v_1^2 &= 2E_A, & v_2^2 &= 2E_B \end{align}

Note that

\binom{3}{2} = 3.
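The enumeration of sphere pairs is just the combinatorics of choosing which particles sit in system ''A''; a few lines of Python (illustrative) reproduce the three 1-sphere/0-sphere pairs listed above:

```python
from itertools import combinations

# The binom(3, 2) = 3 ways of placing two of three particles in system A.
# Each choice pairs a 1-sphere (circle, v_i^2 + v_j^2 = 2E_A) with a
# 0-sphere (two points, v_k^2 = 2E_B) in the same 3-D velocity space.
particles = (1, 2, 3)
for a_pair in combinations(particles, 2):
    (k,) = set(particles) - set(a_pair)
    i, j = a_pair
    print(f"v_{i}^2 + v_{j}^2 = 2E_A   and   v_{k}^2 = 2E_B")
```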






External links


Gibbs paradox and its resolutions – varied collected papers