Fundamental thermodynamic relation

In thermodynamics, the fundamental thermodynamic relation comprises four fundamental equations which demonstrate how four important thermodynamic quantities depend on variables that can be controlled and measured experimentally. Thus, they are essentially equations of state, and using the fundamental equations, experimental data can be used to determine sought-after quantities like ''G'' (Gibbs free energy) or ''H'' (enthalpy). The relation is generally expressed as an infinitesimal change in internal energy in terms of infinitesimal changes in entropy and volume for a closed system in thermal equilibrium in the following way:

\mathrm{d}U = T\,\mathrm{d}S - P\,\mathrm{d}V

Here, ''U'' is internal energy, ''T'' is absolute temperature, ''S'' is entropy, ''P'' is pressure, and ''V'' is volume. This is only one expression of the fundamental thermodynamic relation. It may be expressed in other ways, using different variables (e.g. using thermodynamic potentials). For example, the fundamental relation may be expressed in terms of the enthalpy ''H'' as

\mathrm{d}H = T\,\mathrm{d}S + V\,\mathrm{d}P,

in terms of the Helmholtz free energy ''F'' as

\mathrm{d}F = -S\,\mathrm{d}T - P\,\mathrm{d}V,

and in terms of the Gibbs free energy ''G'' as

\mathrm{d}G = -S\,\mathrm{d}T + V\,\mathrm{d}P.
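As a quick numerical illustration (not part of the original article), the relation dU = T dS - P dV can be checked by finite differences for one mole of a monatomic ideal gas, whose fundamental equation U(S, V) is proportional to V^(-2/3) exp(2S/(3nR)); the temperature and pressure follow by differentiation. The scale constant c below is an arbitrary assumption fixing the entropy zero:

```python
import math

# Finite-difference check of dU = T dS - P dV (a sketch, not from the
# article).  For one mole of a monatomic ideal gas the fundamental
# equation has the form U(S, V) = c * V**(-2/3) * exp(2 S / (3 n R)),
# where c is an arbitrary positive constant (assumption for illustration).
R = 8.314       # molar gas constant, J/(mol K)
n = 1.0         # amount of substance, mol
c = 1.0e-3      # arbitrary scale constant

def U(S, V):
    return c * V ** (-2.0 / 3.0) * math.exp(2.0 * S / (3.0 * n * R))

def T(S, V):    # T = (dU/dS)_V = 2 U / (3 n R)
    return 2.0 * U(S, V) / (3.0 * n * R)

def P(S, V):    # P = -(dU/dV)_S = 2 U / (3 V)
    return 2.0 * U(S, V) / (3.0 * V)

S0, V0 = 100.0, 0.02          # some state (J/K, m^3)
dS, dV = 1.0e-6, 1.0e-9       # small perturbations

dU_exact = U(S0 + dS, V0 + dV) - U(S0, V0)
dU_relation = T(S0, V0) * dS - P(S0, V0) * dV
print(dU_exact, dU_relation)  # the two agree to first order in dS, dV
```

The same pattern works for any substance whose fundamental equation U(S, V) is known: T and P are simply the partial derivatives of U.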


The first and second laws of thermodynamics

The first law of thermodynamics states that

\mathrm{d}U = \delta Q - \delta W,

where \delta Q and \delta W are infinitesimal amounts of heat supplied to the system by its surroundings and work done by the system on its surroundings, respectively. According to the second law of thermodynamics we have, for a reversible process,

\mathrm{d}S = \frac{\delta Q}{T}.

Hence

\delta Q = T\,\mathrm{d}S.

By substituting this into the first law, we have

\mathrm{d}U = T\,\mathrm{d}S - \delta W.

Letting \delta W be reversible pressure-volume work done by the system on its surroundings,

\delta W = P\,\mathrm{d}V,

we have

\mathrm{d}U = T\,\mathrm{d}S - P\,\mathrm{d}V.

This equation has been derived in the case of reversible changes. However, since ''U'', ''S'', and ''V'' are thermodynamic state functions that depend only on the initial and final states of a thermodynamic process, the above relation holds also for non-reversible changes. If the composition, i.e. the amounts n_i of the chemical components, in a system of uniform temperature and pressure can also change, e.g. due to a chemical reaction, the fundamental thermodynamic relation generalizes to

\mathrm{d}U = T\,\mathrm{d}S - P\,\mathrm{d}V + \sum_i \mu_i\,\mathrm{d}n_i.

The \mu_i are the chemical potentials corresponding to particles of type i. If the system has more external parameters than just the volume that can change, the fundamental thermodynamic relation generalizes to

\mathrm{d}U = T\,\mathrm{d}S + \sum_i X_i\,\mathrm{d}x_i + \sum_j \mu_j\,\mathrm{d}n_j.

Here the X_i are the generalized forces corresponding to the external parameters x_i. (The negative sign used with pressure is unusual and arises because pressure represents a compressive stress that tends to decrease volume. Other generalized forces tend to increase their conjugate displacements.)
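The potential forms listed in the introduction follow from dU = T dS - P dV by Legendre transformation, and each can be checked the same way. The sketch below (an illustration, not from the article) verifies dG = -S dT + V dP numerically for one mole of a monatomic ideal gas; the reference values S0, T0, P0 are arbitrary assumptions:

```python
import math

# Finite-difference check of dG = -S dT + V dP for one mole of a
# monatomic ideal gas (a sketch; S0, T0, P0 are arbitrary reference
# values, not data from the article).
R = 8.314                          # molar gas constant, J/(mol K)
S0, T0, P0 = 150.0, 298.15, 1.0e5  # arbitrary reference state

def S(T, P):                       # ideal-gas entropy, J/(mol K)
    return S0 + 2.5 * R * math.log(T / T0) - R * math.log(P / P0)

def G(T, P):                       # G = H - T S with H = (5/2) R T
    return 2.5 * R * T - T * S(T, P)

T, P = 300.0, 2.0e5
dT, dP = 1.0e-4, 1.0
dG_exact = G(T + dT, P + dP) - G(T, P)
dG_relation = -S(T, P) * dT + (R * T / P) * dP   # V = R T / P
print(dG_exact, dG_relation)       # agree to first order in dT, dP
```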


Relationship to statistical mechanics

The fundamental thermodynamic relation and statistical mechanical principles can be derived from one another.


Derivation from statistical mechanical principles

The above derivation uses the first and second laws of thermodynamics. The first law of thermodynamics is essentially a definition of heat, i.e. heat is the change in the internal energy of a system that is not caused by a change of the external parameters of the system. However, the second law of thermodynamics is not a defining relation for the entropy. The fundamental definition of entropy of an isolated system containing an amount of energy E is

S = k_\text{B} \log\left[\Omega\left(E\right)\right],

where \Omega\left(E\right) is the number of microstates in a small interval between E and E + \delta E. Here \delta E is a macroscopically small energy interval that is kept fixed. Strictly speaking this means that the entropy depends on the choice of \delta E. However, in the thermodynamic limit (i.e. in the limit of infinitely large system size), the specific entropy (entropy per unit volume or per unit mass) does not depend on \delta E. The entropy is thus a measure of the uncertainty about exactly which microstate the system is in, given that we know its energy to be in some interval of size \delta E. Deriving the fundamental thermodynamic relation from first principles thus amounts to proving that the above definition of entropy implies that for reversible processes we have

\mathrm{d}S = \frac{\delta Q}{T}.

The relevant assumption from statistical mechanics is that all the \Omega\left(E\right) states at a particular energy are equally likely. This allows us to extract all the thermodynamical quantities of interest. The temperature is defined as

\frac{1}{k_\text{B} T} \equiv \beta \equiv \frac{\mathrm{d}\log\left[\Omega\left(E\right)\right]}{\mathrm{d}E}.

This definition can be derived from the microcanonical ensemble, which describes a system with a constant number of particles and a constant volume that does not exchange energy with its environment. Suppose that the system has some external parameter, x, that can be changed. In general, the energy eigenstates of the system will depend on ''x''. According to the adiabatic theorem of quantum mechanics, in the limit of an infinitely slow change of the system's Hamiltonian, the system will stay in the same energy eigenstate and thus change its energy according to the change in energy of the energy eigenstate it is in.

The generalized force, ''X'', corresponding to the external parameter ''x'' is defined such that X\,dx is the work performed by the system if ''x'' is increased by an amount ''dx''. E.g., if ''x'' is the volume, then ''X'' is the pressure. The generalized force for a system known to be in energy eigenstate E_r is given by

X = -\frac{dE_r}{dx}.

Since the system can be in any energy eigenstate within an interval of \delta E, we define the generalized force for the system as the expectation value of the above expression:

X = -\left\langle\frac{dE_r}{dx}\right\rangle.

To evaluate the average, we partition the \Omega(E) energy eigenstates by counting how many of them have a value for \frac{dE_r}{dx} within a range between Y and Y + \delta Y. Calling this number \Omega_Y\left(E\right), we have

\Omega(E) = \sum_Y \Omega_Y(E).

The average defining the generalized force can now be written

X = -\frac{1}{\Omega(E)}\sum_Y Y\,\Omega_Y(E).

We can relate this to the derivative of the entropy with respect to x at constant energy E as follows. Suppose we change ''x'' to ''x'' + ''dx''. Then \Omega\left(E\right) will change because the energy eigenstates depend on x, causing energy eigenstates to move into or out of the range between E and E + \delta E. Let's focus again on the energy eigenstates for which \frac{dE_r}{dx} lies within the range between Y and Y + \delta Y.
Since these energy eigenstates increase in energy by ''Y'' ''dx'', all such energy eigenstates that are in the interval ranging from ''E'' − ''Y'' ''dx'' to ''E'' move from below ''E'' to above ''E''. There are

N_Y(E) = \frac{\Omega_Y(E)}{\delta E}\,Y\,dx

such energy eigenstates. If Y\,dx \leq \delta E, all these energy eigenstates will move into the range between E and E + \delta E and contribute to an increase in \Omega. The number of energy eigenstates that move from below E + \delta E to above E + \delta E is, of course, given by N_Y\left(E + \delta E\right). The difference

N_Y(E) - N_Y(E + \delta E)

is thus the net contribution to the increase in \Omega. Note that if Y\,dx is larger than \delta E there will be energy eigenstates that move from below E to above E + \delta E. They are counted in both N_Y(E) and N_Y(E + \delta E), therefore the above expression is also valid in that case. Expressing the above expression as a derivative with respect to E and summing over Y yields the expression

\left(\frac{\partial\Omega}{\partial x}\right)_E = -\sum_Y Y\left(\frac{\partial\Omega_Y}{\partial E}\right)_x = \left(\frac{\partial(\Omega X)}{\partial E}\right)_x.

The logarithmic derivative of \Omega with respect to ''x'' is thus given by

\left(\frac{\partial\log\Omega}{\partial x}\right)_E = \beta X + \left(\frac{\partial X}{\partial E}\right)_x.

The first term is intensive, i.e. it does not scale with system size. In contrast, the last term scales as the inverse system size and thus vanishes in the thermodynamic limit. We have thus found that

\left(\frac{\partial S}{\partial x}\right)_E = \frac{X}{T}.

Combining this with

\left(\frac{\partial S}{\partial E}\right)_x = \frac{1}{T}

gives

dS = \left(\frac{\partial S}{\partial E}\right)_x dE + \left(\frac{\partial S}{\partial x}\right)_E dx = \frac{dE}{T} + \frac{X}{T}\,dx,

which we can write as

dE = T\,dS - X\,dx.
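The definition of temperature via the density of states can be made concrete with a toy model (an illustration, not from the article): an Einstein solid of N oscillators sharing q energy quanta, in units where the quantum of energy and k_B are both 1, has Omega(q) = C(q + N - 1, N - 1). The beta obtained from the microstate count should match the canonical (Boltzmann) result beta = log(1 + N/q):

```python
import math

# Toy check of beta = d(log Omega)/dE for an Einstein solid
# (units: hbar*omega = k_B = 1, so E = q).
def log_omega(q, N):
    # log of the binomial coefficient C(q + N - 1, N - 1) via lgamma
    return math.lgamma(q + N) - math.lgamma(q + 1) - math.lgamma(N)

N = 10**6                 # number of oscillators
q = 3 * 10**6             # total quanta (3 per oscillator on average)

beta_micro = log_omega(q + 1, N) - log_omega(q, N)  # d log(Omega)/dE
beta_canon = math.log(1.0 + N / q)                  # canonical result
print(beta_micro, beta_canon)   # nearly identical for large N and q
```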


Derivation of statistical mechanical principles from the fundamental thermodynamic relation

It has been shown that the fundamental thermodynamic relation together with three additional postulates is sufficient to build the theory of statistical mechanics without the equal a priori probability postulate. For example, in order to derive the Boltzmann distribution, we assume that the probability density of microstate i satisfies \Pr(i) \propto f(E_i, T). The normalization factor (partition function) is therefore

Z = \sum_i f(E_i, T).

The entropy is therefore given by

S = -k_B \sum_i \frac{f(E_i, T)}{Z} \log\left(\frac{f(E_i, T)}{Z}\right).

If we change the temperature T by dT while keeping the volume of the system constant, the change of entropy satisfies

dS = \left(\frac{\partial S}{\partial T}\right)_V dT,

where

\begin{align} \left(\frac{\partial S}{\partial T}\right)_V &= -k_B \sum_i \frac{\partial}{\partial T}\left[\frac{f(E_i, T)}{Z}\log\left(\frac{f(E_i, T)}{Z}\right)\right] \\ &= -k_B \sum_i \frac{\partial}{\partial T}\left(\frac{f(E_i, T)}{Z}\right)\cdot\log f(E_i, T) \end{align}

(the terms involving \log Z and the derivative of \log drop out because the probabilities sum to one, so the sum of their temperature derivatives vanishes). Considering that

\left\langle E\right\rangle = \sum_i \frac{f(E_i, T)}{Z}\cdot E_i,

we have

d\left\langle E\right\rangle = \sum_i \frac{\partial}{\partial T}\left(\frac{f(E_i, T)}{Z}\right)\cdot E_i \cdot dT.

From the fundamental thermodynamic relation, we have

-dS + \frac{d\left\langle E\right\rangle}{T} + \frac{P}{T}\,dV = 0.

Since we kept V constant when perturbing T, we have dV = 0. Combining the equations above, we have

\sum_i \frac{\partial}{\partial T}\left(\frac{f(E_i, T)}{Z}\right)\cdot\left[\log f(E_i, T) + \frac{E_i}{k_B T}\right]\cdot dT = 0.

Physical laws should be universal, i.e., the above equation must hold for arbitrary systems, and the only way for this to happen is

\log f(E_i, T) + \frac{E_i}{k_B T} = 0,

that is,

f(E_i, T) = \exp\left(-\frac{E_i}{k_B T}\right).

It has been shown that the third postulate in this formalism can be replaced by an alternative one; however, the mathematical derivation then becomes much more complicated.
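The endpoint of this derivation can be sanity-checked numerically (an illustration, not from the article): with f(E_i, T) = exp(-E_i / T) on a fixed spectrum (so dV = 0; units k_B = 1), the entropy S = -sum p_i log p_i and the mean energy should satisfy dS = d<E>/T:

```python
import math

# Numeric check that the Boltzmann weights make dS = d<E>/T hold
# at fixed energy levels (k_B = 1).
def boltzmann_stats(levels, T):
    w = [math.exp(-e / T) for e in levels]
    Z = sum(w)                          # partition function
    p = [x / Z for x in w]              # Boltzmann probabilities
    E = sum(pi * e for pi, e in zip(p, levels))
    S = -sum(pi * math.log(pi) for pi in p)
    return E, S

levels = [0.0, 1.0, 2.0, 5.0]   # arbitrary fixed energy levels
T, dT = 2.0, 1.0e-6
E1, S1 = boltzmann_stats(levels, T)
E2, S2 = boltzmann_stats(levels, T + dT)
print(S2 - S1, (E2 - E1) / T)   # dS and dE/T agree to first order
```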



