The Fundamental Thermodynamic Relation

In thermodynamics, the fundamental thermodynamic relation comprises four fundamental equations which demonstrate how four important thermodynamic quantities depend on variables that can be controlled and measured experimentally. Thus, they are essentially equations of state, and using the fundamental equations, experimental data can be used to determine sought-after quantities like ''G'' or ''H''. The relation is generally expressed as a microscopic change in internal energy in terms of microscopic changes in entropy and volume for a closed system in thermal equilibrium in the following way:
:$\mathrm{d}U = T\,\mathrm{d}S - P\,\mathrm{d}V$
Here, ''U'' is internal energy, ''T'' is absolute temperature, ''S'' is entropy, ''P'' is pressure, and ''V'' is volume.
This is only one expression of the fundamental thermodynamic relation. It may be expressed in other ways, using different variables (e.g. using thermodynamic potentials). For example, the fundamental relation may be expressed in terms of the enthalpy as
:$\mathrm{d}H = T\,\mathrm{d}S + V\,\mathrm{d}P$
in terms of the Helmholtz free energy (''F'') as
:$\mathrm{d}F = -S\,\mathrm{d}T - P\,\mathrm{d}V$
and in terms of the Gibbs free energy (''G'') as
:$\mathrm{d}G = -S\,\mathrm{d}T + V\,\mathrm{d}P$.
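These alternative forms follow from the first relation by Legendre transforms of the internal energy. As a brief sketch for the enthalpy, using $H = U + PV$ and the product rule:

```latex
% dH from dU = T dS - P dV, using H = U + PV:
\begin{aligned}
\mathrm{d}H &= \mathrm{d}U + P\,\mathrm{d}V + V\,\mathrm{d}P \\
            &= (T\,\mathrm{d}S - P\,\mathrm{d}V) + P\,\mathrm{d}V + V\,\mathrm{d}P \\
            &= T\,\mathrm{d}S + V\,\mathrm{d}P
\end{aligned}
```

The Helmholtz form follows in the same way from $F = U - TS$, and the Gibbs form from $G = U + PV - TS$.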
The first and second laws of thermodynamics

The first law of thermodynamics states that:
:$\mathrm{d}U = \delta Q - \delta W$
where $\delta Q$ and $\delta W$ are infinitesimal amounts of heat supplied to the system by its surroundings and work done by the system on its surroundings, respectively.
According to the second law of thermodynamics, we have for a reversible process:
:$\mathrm{d}S = \frac{\delta Q}{T}$
Hence:
:$\delta Q = T\,\mathrm{d}S$
By substituting this into the first law, we have:
:$\mathrm{d}U = T\,\mathrm{d}S - \delta W$
Letting $\delta W$ be reversible pressure-volume work done by the system on its surroundings,
:$\delta W = P\,\mathrm{d}V$
we have:
:$\mathrm{d}U = T\,\mathrm{d}S - P\,\mathrm{d}V$
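This identity can be sanity-checked numerically. The sketch below (an illustration, not part of the original derivation) uses a monatomic ideal gas, for which $U = \tfrac{3}{2}nRT$, $P = nRT/V$, and $S = nR\left(\tfrac{3}{2}\ln T + \ln V\right)$ up to an additive constant, and compares both sides under a small perturbation:

```python
# Finite-difference check of dU = T dS - P dV for a monatomic ideal gas.
# The state point (T0, V0) and step sizes are arbitrary illustrative choices.
import math

n, R = 1.0, 8.314            # moles and gas constant, J/(mol K)

def U(T, V):
    return 1.5 * n * R * T   # internal energy, independent of V

def S(T, V):
    # entropy up to an additive constant (it drops out of differences)
    return n * R * (1.5 * math.log(T) + math.log(V))

def P(T, V):
    return n * R * T / V     # ideal-gas equation of state

T0, V0 = 300.0, 0.02         # base state: 300 K, 20 L
dT, dV = 1e-6, 1e-9          # small perturbations

dU = U(T0 + dT, V0 + dV) - U(T0, V0)
dS = S(T0 + dT, V0 + dV) - S(T0, V0)

# Both sides agree up to second-order terms in the perturbation.
print(abs(dU - (T0 * dS - P(T0, V0) * dV)) < 1e-9)
```

The additive constant in $S$ is irrelevant here because only entropy differences enter the relation.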
This equation has been derived in the case of reversible changes. However, since ''U'', ''S'', and ''V'' are thermodynamic state functions, the above relation holds also for non-reversible changes. If the composition, i.e. the amounts $n_i$ of the chemical components, in a system of uniform temperature and pressure can also change, e.g. due to a chemical reaction, the fundamental thermodynamic relation generalizes to:
:$\mathrm{d}U = T\,\mathrm{d}S - P\,\mathrm{d}V + \sum_i \mu_i\,\mathrm{d}n_i$
The $\mu_i$ are the chemical potentials corresponding to particles of type $i$.
If the system has more external parameters than just the volume that can change, the fundamental thermodynamic relation generalizes to
:$\mathrm{d}U = T\,\mathrm{d}S + \sum_i X_i\,\mathrm{d}x_i + \sum_j \mu_j\,\mathrm{d}n_j$
Here the $X_i$ are the generalized forces corresponding to the external parameters $x_i$. (The negative sign used with pressure is unusual and arises because pressure represents a compressive stress that tends to decrease volume. Other generalized forces tend to increase their conjugate displacements.)
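As a consistency check of this sign convention, taking the volume as the only external parameter, $x = V$ with conjugate force $X_V = -P$, recovers the earlier closed-system relation:

```latex
% With x = V and X_V = -P in dU = T dS + sum_i X_i dx_i:
\mathrm{d}U = T\,\mathrm{d}S + X_V\,\mathrm{d}V
            = T\,\mathrm{d}S - P\,\mathrm{d}V
```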
Relationship to statistical mechanics

The fundamental thermodynamic relation and statistical mechanical principles can be derived from one another.

Derivation from statistical mechanical principles

The above derivation uses the first and second laws of thermodynamics. The first law of thermodynamics is essentially a definition of heat, i.e. heat is the change in the internal energy of a system that is not caused by a change of the external parameters of the system.
However, the second law of thermodynamics is not a defining relation for the entropy. The fundamental definition of entropy of an isolated system containing an amount of energy $E$ is:
:$S = k \log\left[\Omega\left(E\right)\right]$
where $\Omega\left(E\right)$ is the number of quantum states in a small interval between $E$ and $E + \delta E$. Here $\delta E$ is a macroscopically small energy interval that is kept fixed. Strictly speaking this means that the entropy depends on the choice of $\delta E$. However, in the thermodynamic limit (i.e. in the limit of infinitely large system size), the specific entropy (entropy per unit volume or per unit mass) does not depend on $\delta E$. The entropy is thus a measure of the uncertainty about exactly which quantum state the system is in, given that we know its energy to be in some interval of size $\delta E$.
Deriving the fundamental thermodynamic relation from first principles thus amounts to proving that the above definition of entropy implies that for reversible processes we have:
:$dS = \frac{\delta Q}{T}$
The fundamental assumption of statistical mechanics is that all the $\Omega\left(E\right)$ states at a particular energy are equally likely. This allows us to extract all the thermodynamical quantities of interest. The temperature is defined as:
:$\frac{1}{kT} \equiv \beta \equiv \frac{\partial \log\left[\Omega\left(E\right)\right]}{\partial E}$
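As a toy illustration (not in the original text), consider an ideal-gas-like state count $\Omega(E) \propto E^{3N/2}$. The definition above then gives $\beta = \tfrac{3N}{2E}$, i.e. $E = \tfrac{3}{2}NkT$, which a finite difference confirms:

```python
# Check the temperature definition 1/(kT) = d log(Omega)/dE for a toy
# density of states Omega(E) ~ E^(3N/2); N and E0 are arbitrary choices.
import math

N = 100                        # particle number (illustrative)
kB = 1.380649e-23              # Boltzmann constant, J/K

def log_omega(E):
    # log Omega up to an E-independent constant, which drops out
    return 1.5 * N * math.log(E)

E0, dE = 1e-19, 1e-25          # energy and finite-difference step, J
beta = (log_omega(E0 + dE) - log_omega(E0)) / dE
T = 1.0 / (kB * beta)

# For this Omega, beta = (3N/2)/E0, so E0 should equal (3/2) N kB T.
print(abs(E0 - 1.5 * N * kB * T) / E0 < 1e-4)
```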
This definition can be derived from the microcanonical ensemble, which describes a system with a constant number of particles and a constant volume that does not exchange energy with its environment. Suppose that the system has some external parameter, ''x'', that can be changed. In general, the energy eigenstates of the system will depend on ''x''. According to the adiabatic theorem of quantum mechanics, in the limit of an infinitely slow change of the system's Hamiltonian, the system will stay in the same energy eigenstate and thus change its energy according to the change in energy of the energy eigenstate it is in.
The generalized force, ''X'', corresponding to the external parameter ''x'' is defined such that $X\,dx$ is the work performed by the system if ''x'' is increased by an amount ''dx''. E.g., if ''x'' is the volume, then ''X'' is the pressure. The generalized force for a system known to be in energy eigenstate $E_r$ is given by:
:$X = -\frac{dE_r}{dx}$
Since the system can be in any energy eigenstate within an interval of $\delta E$, we define the generalized force for the system as the expectation value of the above expression:
:$X = -\left\langle\frac{dE_r}{dx}\right\rangle$
To evaluate the average, we partition the $\Omega(E)$ energy eigenstates by counting how many of them have a value for $\frac{dE_r}{dx}$ within a range between $Y$ and $Y + \delta Y$. Calling this number $\Omega_Y\left(E\right)$, we have:
:$\Omega(E) = \sum_Y \Omega_Y(E)$
The average defining the generalized force can now be written:
:$X = -\frac{1}{\Omega(E)}\sum_Y Y\,\Omega_Y(E)$
We can relate this to the derivative of the entropy with respect to ''x'' at constant energy ''E'' as follows. Suppose we change ''x'' to ''x'' + ''dx''. Then $\Omega\left(E\right)$ will change because the energy eigenstates depend on ''x'', causing energy eigenstates to move into or out of the range between $E$ and $E+\delta E$. Let's focus again on the energy eigenstates for which $\frac{dE_r}{dx}$ lies within the range between $Y$ and $Y + \delta Y$. Since these energy eigenstates increase in energy by $Y\,dx$, all such energy eigenstates that are in the interval ranging from $E - Y\,dx$ to $E$ move from below $E$ to above $E$. There are
:$N_Y(E) = \frac{\Omega_Y(E)}{\delta E} Y\,dx$
such energy eigenstates. If $Y\,dx \leq \delta E$, all these energy eigenstates will move into the range between $E$ and $E+\delta E$ and contribute to an increase in $\Omega$. The number of energy eigenstates that move from below $E+\delta E$ to above $E+\delta E$ is, of course, given by $N_Y\left(E+\delta E\right)$. The difference
:$N_Y(E) - N_Y(E+\delta E)$
is thus the net contribution to the increase in $\Omega$. Note that if $Y\,dx$ is larger than $\delta E$ there will be energy eigenstates that move from below $E$ to above $E+\delta E$. They are counted in both $N_Y(E)$ and $N_Y(E+\delta E)$, therefore the above expression is also valid in that case.
Expressing the above expression as a derivative with respect to ''E'' and summing over ''Y'' yields the expression:
:$\left(\frac{\partial\Omega}{\partial x}\right)_E = -\sum_Y Y\left(\frac{\partial\Omega_Y}{\partial E}\right)_x = \left(\frac{\partial(\Omega X)}{\partial E}\right)_x$
The logarithmic derivative of $\Omega$ with respect to ''x'' is thus given by:
:$\left(\frac{\partial\log\left(\Omega\right)}{\partial x}\right)_E = \beta X + \left(\frac{\partial X}{\partial E}\right)_x$
The first term is intensive, i.e. it does not scale with system size. In contrast, the last term scales as the inverse system size and thus vanishes in the thermodynamic limit. We have thus found that:
:$\left(\frac{\partial S}{\partial x}\right)_E = \frac{X}{T}$
Combining this with
:$\left(\frac{\partial S}{\partial E}\right)_x = \frac{1}{T}$
gives:
:$dS = \left(\frac{\partial S}{\partial E}\right)_x\,dE + \left(\frac{\partial S}{\partial x}\right)_E\,dx = \frac{dE}{T} + \frac{X}{T}\,dx$
which we can write as:
:$dE = T\,dS - X\,dx$
Derivation of statistical mechanical principles from the fundamental thermodynamic relation

It has been shown that the fundamental thermodynamic relation together with the following three postulates is sufficient to build the theory of statistical mechanics without the equal a priori probability postulate. For example, in order to derive the Boltzmann distribution, we assume the probability density of microstate $i$ satisfies $\Pr(i) \propto f(E_i, T)$. The normalization factor (partition function) is therefore
:$Z = \sum_i f(E_i, T).$
The entropy is therefore given by
:$S = -k_B \sum_i \frac{f(E_i, T)}{Z} \log\left(\frac{f(E_i, T)}{Z}\right).$
If we change the temperature by $dT$ while keeping the volume of the system constant, the change of entropy satisfies
:$dS = \left(\frac{\partial S}{\partial T}\right)_V dT$
where
:$\left(\frac{\partial S}{\partial T}\right)_V = -k_B \sum_i \frac{\partial}{\partial T}\left(\frac{f(E_i, T)}{Z} \log\frac{f(E_i, T)}{Z}\right) = -k_B \sum_i \frac{\partial}{\partial T}\left(\frac{f(E_i, T)}{Z}\right) \cdot \log f(E_i, T)$
Considering that
:$\left\langle E\right\rangle = \sum_i \frac{f(E_i, T)}{Z} \cdot E_i$
we have
:$d\left\langle E\right\rangle = \sum_i \frac{\partial}{\partial T}\left(\frac{f(E_i, T)}{Z}\right) \cdot E_i \cdot dT$
From the fundamental thermodynamic relation, we have
:$-dS + \frac{d\left\langle E\right\rangle}{T} + \frac{P}{T}\,dV = 0$
Since we kept $V$ constant when perturbing $T$, we have $dV = 0$. Combining the equations above, we have
:$\sum_i \frac{\partial}{\partial T}\left(\frac{f(E_i, T)}{Z}\right) \cdot \left[\log f(E_i, T) + \frac{E_i}{k_B T}\right] = 0$
Physical laws should be universal, i.e., the above equation must hold for arbitrary systems, and the only way for this to happen is
:$\log f(E_i, T) + \frac{E_i}{k_B T} = 0$
That is
:$f(E_i, T) = \exp\left(-\frac{E_i}{k_B T}\right).$
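As a numerical sketch (a toy three-level spectrum with $k_B = 1$; all values here are arbitrary illustrative choices), one can verify that these Boltzmann weights make $dS = d\left\langle E\right\rangle / T$ hold at fixed spectrum, which is the constant-volume condition used in the derivation:

```python
# Check dS = d<E>/T for Boltzmann weights f(E_i,T) = exp(-E_i/(kB*T))
# on a toy three-level spectrum (kB = 1; levels chosen arbitrarily).
import math

kB = 1.0
levels = [0.0, 1.0, 2.5]

def ensemble_averages(T):
    w = [math.exp(-E / (kB * T)) for E in levels]
    Z = sum(w)                               # partition function
    p = [wi / Z for wi in w]                 # Boltzmann probabilities
    E_avg = sum(pi * Ei for pi, Ei in zip(p, levels))
    S = -kB * sum(pi * math.log(pi) for pi in p)   # Gibbs entropy
    return E_avg, S

T0, dT = 2.0, 1e-6
E1, S1 = ensemble_averages(T0 - dT / 2)
E2, S2 = ensemble_averages(T0 + dT / 2)

# Central differences: dS and d<E>/T agree to high order in dT.
print(abs((S2 - S1) - (E2 - E1) / T0) < 1e-10)
```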
It has been shown that the third postulate in the above formalism can be replaced by the following:
However, the mathematical derivation will be much more complicated.
References

{{Reflist}}

External links

The Fundamental Thermodynamic Relation

Thermodynamics Statistical mechanics Thermodynamic equations