Entropy production (or generation) is the amount of entropy produced in any irreversible process, such as heat and mass transfer processes including motion of bodies, heat exchange, fluid flow, substances expanding or mixing, anelastic deformation of solids, and any irreversible thermodynamic cycle, including thermal machines such as power plants, heat engines, refrigerators, heat pumps, and air conditioners. In the dual entropy–exergy representation for accounting the second law of thermodynamics, it can be expressed in equivalent terms of exergy destruction.


Short history

Entropy is produced in irreversible processes. The importance of avoiding irreversible processes (hence reducing the entropy production) was recognized as early as 1824 by Carnot. In 1865 Rudolf Clausius expanded his previous work from 1854 on the concept of "unkompensierte Verwandlungen" (uncompensated transformations), which, in our modern nomenclature, would be called the entropy production. In the same article, in which he introduced the name entropy, Clausius gives the expression for the entropy production for a cyclical process in a closed system, which he denotes by ''N'', in equation (71) which reads

:N=S-S_0-\int\frac{\mathrm{d}Q}{T}.

Here ''S'' is the entropy in the final state and ''S''0 the entropy in the initial state; ''S''0 − ''S'' is the entropy difference for the backwards part of the process. The integral is to be taken from the initial state to the final state, giving the entropy difference for the forwards part of the process. From the context, it is clear that ''N'' = 0 if the process is reversible and ''N'' > 0 in case of an irreversible process.


First and second law

The laws of thermodynamics apply to well-defined systems. Fig. 1 is a general representation of a thermodynamic system. We consider systems which, in general, are inhomogeneous. Heat and mass are transferred across the boundaries (nonadiabatic, open systems), and the boundaries are moving (usually through pistons). In our formulation we assume that heat and mass transfer and volume changes take place only separately at well-defined regions of the system boundary. The expressions given here are not the most general formulations of the first and second law. For example, kinetic energy and potential energy terms are missing and exchange of matter by diffusion is excluded.

The rate of entropy production, denoted by \dot S_{\text{i}}, is a key element of the second law of thermodynamics for open inhomogeneous systems, which reads

: \frac{\mathrm{d}S}{\mathrm{d}t} = \sum_k \frac{\dot Q_k}{T_k} + \sum_k \dot S_k + \sum_k \dot S_{\text{i},k} \quad\text{with}\quad \dot S_{\text{i},k} \geq 0.

Here ''S'' is the entropy of the system; ''T''''k'' is the temperature at which the heat enters the system at heat flow rate \dot Q_k; \dot S_k = \dot n_k S_{\text{m},k} = \dot m_k s_k represents the entropy flow into the system at position ''k'', due to matter flowing into the system (\dot n_k and \dot m_k are the molar flow rate and mass flow rate, and ''S''m,''k'' and ''s''''k'' are the molar entropy (i.e. entropy per unit amount of substance) and specific entropy (i.e. entropy per unit mass) of the matter flowing into the system, respectively); \dot S_{\text{i},k} represents the entropy production rates due to internal processes. The subscript 'i' in \dot S_{\text{i}} refers to the fact that the entropy is produced due to irreversible processes. The entropy-production rate of every process in nature is always positive or zero. This is an essential aspect of the second law. The Σ's indicate the algebraic sum of the respective contributions if there are more heat flows, matter flows, and internal processes.

In order to demonstrate the impact of the second law, and the role of entropy production, it has to be combined with the first law, which reads

: \frac{\mathrm{d}U}{\mathrm{d}t} = \sum_k \dot Q_k + \sum_k \dot H_k - \sum_k p_k\frac{\mathrm{d}V_k}{\mathrm{d}t} + P,

with ''U'' the internal energy of the system; \dot H_k = \dot n_k H_{\text{m},k} = \dot m_k h_k the enthalpy flows into the system due to the matter that flows into the system (''H''m,''k'' its molar enthalpy, ''h''''k'' the specific enthalpy, i.e. enthalpy per unit mass); d''V''''k''/d''t'' are the rates of change of the volume of the system due to a moving boundary at position ''k'', while ''p''''k'' is the pressure behind that boundary; ''P'' represents all other forms of power applied to the system (such as electrical).

The first and second law have been formulated in terms of time derivatives of ''U'' and ''S'' rather than in terms of total differentials d''U'' and d''S'', where it is tacitly assumed that d''t'' > 0, so the formulation in terms of time derivatives is more elegant. An even bigger advantage of this formulation is, however, that it emphasizes that ''heat flow rate'' and ''power'' are the basic thermodynamic properties and that heat and work are derived quantities, being the time integrals of the heat flow rate and the power respectively.
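As a minimal illustration, the following Python sketch evaluates the two balance equations above for a system with two heat flows, one matter stream, and one moving boundary; all numerical values and variable names are assumed for the example.

<syntaxhighlight lang="python">
# Illustrative sketch (hypothetical numbers): entropy and energy balance
# rates for an open system, following the two equations above.

# Heat flows Q_k [W] entering at boundary temperatures T_k [K]
Q_dot = [500.0, -300.0]      # positive = into the system
T_bound = [600.0, 300.0]

# One matter stream: molar flow rate [mol/s], molar entropy [J/(mol K)],
# molar enthalpy [J/mol]
n_dot, S_m, H_m = [0.1], [190.0], [15000.0]

# One moving boundary: pressure behind it [Pa] and rate of volume change [m^3/s]
p_k, dVdt_k = [1.0e5], [1.0e-4]

P_other = 50.0               # other forms of power, e.g. electrical [W]
S_dot_i = 0.2                # entropy production rate of internal processes [W/K]

# Second law: dS/dt = sum(Q_k/T_k) + sum(n_k*S_mk) + S_i
dS_dt = (sum(q / T for q, T in zip(Q_dot, T_bound))
         + sum(n * s for n, s in zip(n_dot, S_m))
         + S_dot_i)

# First law: dU/dt = sum(Q_k) + sum(n_k*H_mk) - sum(p_k*dV_k/dt) + P
dU_dt = (sum(Q_dot)
         + sum(n * h for n, h in zip(n_dot, H_m))
         - sum(p * dv for p, dv in zip(p_k, dVdt_k))
         + P_other)

assert S_dot_i >= 0, "second law: internal entropy production cannot be negative"
print(f"dS/dt = {dS_dt:.3f} W/K, dU/dt = {dU_dt:.1f} W")
</syntaxhighlight>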


Examples of irreversible processes

Entropy is produced in irreversible processes. Some important irreversible processes are:
*heat flow through a thermal resistance
*fluid flow through a flow resistance such as in the Joule expansion or the Joule–Thomson effect
*diffusion
*chemical reactions
*Joule heating
*friction between solid surfaces
*fluid viscosity within a system.

The expression for the rate of entropy production in the first two cases will be derived in separate sections.


Performance of heat engines and refrigerators

Most heat engines and refrigerators are closed cyclic machines. In the steady state the internal energy and the entropy of the machines after one cycle are the same as at the start of the cycle. Hence, on average, d''U''/d''t'' = 0 and d''S''/d''t'' = 0 since ''U'' and ''S'' are functions of state. Furthermore, they are closed systems (\dot n = 0) and the volume is fixed (d''V''/d''t'' = 0). This leads to a significant simplification of the first and second law:

: 0 = \sum_k \dot Q_k + P

and

: 0 = \sum_k \frac{\dot Q_k}{T_k} + \dot S_{\text{i}}.

The summation is over the (two) places where heat is added or removed.


Engines

For a heat engine (Fig. 2a) the first and second law obtain the form

: 0 = \dot Q_{\text{H}} - \dot Q_{\text{a}} - P

and

: 0 = \frac{\dot Q_{\text{H}}}{T_{\text{H}}} - \frac{\dot Q_{\text{a}}}{T_{\text{a}}} + \dot S_{\text{i}}.

Here \dot Q_{\text{H}} is the heat supplied at the high temperature ''T''H, \dot Q_{\text{a}} is the heat removed at ambient temperature ''T''a, and ''P'' is the power delivered by the engine. Eliminating \dot Q_{\text{a}} gives

: P = \frac{T_{\text{H}}-T_{\text{a}}}{T_{\text{H}}}\dot Q_{\text{H}} - T_{\text{a}}\dot S_{\text{i}}.

The efficiency is defined by

: \eta = \frac{P}{\dot Q_{\text{H}}}.

If \dot S_{\text{i}}=0 the performance of the engine is at its maximum and the efficiency is equal to the Carnot efficiency

: \eta_{\text{C}} = \frac{T_{\text{H}}-T_{\text{a}}}{T_{\text{H}}}.
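A short numerical sketch of these relations, with assumed values for ''T''H, ''T''a, \dot Q_{\text{H}} and \dot S_{\text{i}}, shows how entropy production lowers the delivered power and the efficiency below the Carnot limit.

<syntaxhighlight lang="python">
# Illustrative engine example (hypothetical numbers): delivered power and
# efficiency compared with the Carnot limit.
T_H, T_a = 600.0, 300.0      # hot-source and ambient temperatures [K]
Q_H = 1000.0                 # heat flow supplied at T_H [W]
S_i = 0.5                    # entropy production rate [W/K]

eta_carnot = (T_H - T_a) / T_H            # Carnot efficiency (S_i = 0)
P = eta_carnot * Q_H - T_a * S_i          # delivered power [W]
eta = P / Q_H                             # actual efficiency

print(f"Carnot efficiency: {eta_carnot:.3f}")
print(f"Delivered power:   {P:.1f} W, efficiency: {eta:.3f}")
</syntaxhighlight>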


Refrigerators

For refrigerators (Fig. 2b) holds

: 0 = \dot Q_{\text{L}} - \dot Q_{\text{a}} + P

and

: 0 = \frac{\dot Q_{\text{L}}}{T_{\text{L}}} - \frac{\dot Q_{\text{a}}}{T_{\text{a}}} + \dot S_{\text{i}}.

Here ''P'' is the power supplied to produce the cooling power \dot Q_{\text{L}} at the low temperature ''T''L. Eliminating \dot Q_{\text{a}} now gives

: \dot Q_{\text{L}} = \frac{T_{\text{L}}}{T_{\text{a}}-T_{\text{L}}}\left(P - T_{\text{a}}\dot S_{\text{i}}\right).

The coefficient of performance (COP) of refrigerators is defined by

: \xi = \frac{\dot Q_{\text{L}}}{P}.

If \dot S_{\text{i}}=0 the performance of the cooler is at its maximum. The COP is then given by the Carnot coefficient of performance

: \xi_{\text{C}} = \frac{T_{\text{L}}}{T_{\text{a}}-T_{\text{L}}}.
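An analogous sketch for the refrigerator, again with assumed values, shows how entropy production reduces the cooling power and the COP below the Carnot value.

<syntaxhighlight lang="python">
# Illustrative refrigerator example (hypothetical numbers): cooling power
# and COP compared with the Carnot coefficient of performance.
T_L, T_a = 275.0, 300.0      # cold-space and ambient temperatures [K]
P = 200.0                    # input power [W]
S_i = 0.1                    # entropy production rate [W/K]

xi_carnot = T_L / (T_a - T_L)                 # Carnot COP (S_i = 0)
Q_L = T_L / (T_a - T_L) * (P - T_a * S_i)     # cooling power [W]
xi = Q_L / P                                  # actual COP

print(f"Carnot COP: {xi_carnot:.2f}")
print(f"Cooling power: {Q_L:.1f} W, COP: {xi:.2f}")
</syntaxhighlight>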


Power dissipation

In both cases we find a contribution T_{\text{a}}\dot S_{\text{i}} which reduces the system performance. This product of ambient temperature and the (average) entropy production rate, P_{\text{diss}} = T_{\text{a}}\dot S_{\text{i}}, is called the dissipated power.


Equivalence with other formulations

It is interesting to investigate how the above mathematical formulation of the second law relates to other well-known formulations of the second law. We first look at a heat engine, assuming that \dot Q_{\text{a}}=0. In other words: the heat flow rate \dot Q_{\text{H}} is completely converted into power. In this case the second law would reduce to

: 0 = \frac{\dot Q_{\text{H}}}{T_{\text{H}}} + \dot S_{\text{i}}.

Since \dot Q_{\text{H}} \geq 0 and T_{\text{H}} > 0 this would result in \dot S_{\text{i}} \leq 0, which violates the condition that the entropy production is always positive. Hence: ''No process is possible in which the sole result is the absorption of heat from a reservoir and its complete conversion into work.'' This is the Kelvin statement of the second law.

Now look at the case of the refrigerator and assume that the input power is zero. In other words: heat is transported from a low temperature to a high temperature without doing work on the system. The first law with ''P'' = 0 would give

: \dot Q_{\text{L}} = \dot Q_{\text{a}}

and the second law then yields

: 0 = \frac{\dot Q_{\text{L}}}{T_{\text{L}}} - \frac{\dot Q_{\text{L}}}{T_{\text{a}}} + \dot S_{\text{i}}

or

: \dot S_{\text{i}} = \dot Q_{\text{L}}\left(\frac{1}{T_{\text{a}}} - \frac{1}{T_{\text{L}}}\right).

Since \dot Q_{\text{L}} \geq 0 and T_{\text{a}} > T_{\text{L}} this would result in \dot S_{\text{i}} \leq 0, which again violates the condition that the entropy production is always positive. Hence: ''No process is possible whose sole result is the transfer of heat from a body of lower temperature to a body of higher temperature.'' This is the Clausius statement of the second law.


Expressions for the entropy production


Heat flow

In case of a heat flow rate \dot Q from ''T''1 to ''T''2 (with T_1 \geq T_2) the rate of entropy production is given by

: \dot S_{\text{i}} = \dot Q\left(\frac{1}{T_2} - \frac{1}{T_1}\right).

If the heat flow is in a bar with length ''L'', cross-sectional area ''A'', and thermal conductivity ''κ'', and the temperature difference is small,

: \dot Q = \kappa\frac{A}{L}(T_1 - T_2),

the entropy production rate is

: \dot S_{\text{i}} = \kappa\frac{A}{L}\frac{(T_1-T_2)^2}{T_1 T_2}.
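For a concrete feel of the orders of magnitude, the following sketch evaluates these expressions for an assumed copper-like bar between two assumed reservoir temperatures.

<syntaxhighlight lang="python">
# Illustrative heat-conduction example (hypothetical numbers): entropy
# production rate of a bar conducting heat between two reservoirs.
kappa = 400.0        # thermal conductivity [W/(m K)], roughly copper
A, L = 1.0e-4, 0.5   # cross-section [m^2] and length [m]
T1, T2 = 320.0, 300.0  # hot and cold end temperatures [K]

Q = kappa * A / L * (T1 - T2)                        # heat flow rate [W]
S_i = Q * (1.0 / T2 - 1.0 / T1)                      # entropy production rate [W/K]
S_i_alt = kappa * A / L * (T1 - T2)**2 / (T1 * T2)   # same, closed form

print(f"Q = {Q:.2f} W, S_i = {S_i:.6f} W/K (check: {S_i_alt:.6f} W/K)")
</syntaxhighlight>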


Flow of mass

In case of a volume flow rate \dot V from a pressure ''p''1 to ''p''2,

: \dot S_{\text{i}} = -\int_{p_1}^{p_2}\frac{\dot V}{T}\,\mathrm{d}p.

For small pressure drops, and defining the flow conductance ''C'' by \dot V = C(p_1 - p_2), we get

: \dot S_{\text{i}} = C\frac{(p_1-p_2)^2}{T}.

The dependences of \dot S_{\text{i}} on (T_1 - T_2) and on (p_1 - p_2) are quadratic. This is typical for expressions of the entropy production rates in general. They guarantee that the entropy production is positive.
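A similar sketch, with an assumed flow conductance and pressure drop, evaluates the approximate expression.

<syntaxhighlight lang="python">
# Illustrative flow example (hypothetical numbers): entropy production of a
# small pressure drop across a flow restriction.
T = 300.0                 # temperature [K]
C = 1.0e-8                # flow conductance [m^3/(s Pa)], assumed value
p1, p2 = 1.10e5, 1.00e5   # upstream and downstream pressures [Pa]

V_dot = C * (p1 - p2)            # volume flow rate [m^3/s]
S_i = C * (p1 - p2)**2 / T       # entropy production rate [W/K]

print(f"V_dot = {V_dot:.2e} m^3/s, S_i = {S_i:.4e} W/K")
</syntaxhighlight>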


Entropy of mixing

In this section we will calculate the entropy of mixing when two ideal gases diffuse into each other. Consider a volume ''V''t divided in two volumes ''V''a and ''V''b so that ''V''t = ''V''a + ''V''b. The volume ''V''a contains amount of substance ''n''a of an ideal gas a, and ''V''b contains amount of substance ''n''b of gas b. The total amount of substance is ''n''t = ''n''a + ''n''b. The temperature and pressure in the two volumes are the same. The entropy at the start is given by

:S_{\text{t1}} = S_{\text{a1}} + S_{\text{b1}}.

When the division between the two gases is removed the two gases expand, comparable to a Joule–Thomson expansion. In the final state the temperature is the same as initially but the two gases now both take the volume ''V''t. The relation for the entropy of an amount of substance ''n'' of an ideal gas is

:S = nC_{\text{V}}\ln\frac{T}{T_0} + nR\ln\frac{V}{V_0},

where ''C''V is the molar heat capacity at constant volume and ''R'' is the molar gas constant. The system is an adiabatic closed system, so the entropy increase during the mixing of the two gases is equal to the entropy production. It is given by

:S_\Delta = S_{\text{t2}} - S_{\text{t1}}.

As the initial and final temperature are the same, the temperature terms cancel, leaving only the volume terms. The result is

:S_\Delta = n_{\text{a}}R\ln\frac{V_{\text{t}}}{V_{\text{a}}} + n_{\text{b}}R\ln\frac{V_{\text{t}}}{V_{\text{b}}}.

Introducing the mole fraction ''x'' = ''n''a/''n''t = ''V''a/''V''t we arrive at the well-known expression

:S_\Delta = -n_{\text{t}}R\left[x\ln x + (1-x)\ln(1-x)\right].
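The final expression can be evaluated directly; the sketch below uses assumed amounts of substance and recovers 2''R'' ln 2 for an equimolar mixture.

<syntaxhighlight lang="python">
# Illustrative mixing-entropy example (hypothetical amounts): entropy
# produced when two ideal gases at equal T and p diffuse into each other.
import math

R = 8.314            # molar gas constant [J/(mol K)]
n_a, n_b = 1.0, 1.0  # amounts of substance [mol]
n_t = n_a + n_b
x = n_a / n_t        # mole fraction of gas a (equals V_a/V_t at equal T, p)

S_mix = -n_t * R * (x * math.log(x) + (1 - x) * math.log(1 - x))
print(f"Entropy of mixing: {S_mix:.2f} J/K")   # 2*R*ln 2 ≈ 11.53 J/K for x = 0.5
</syntaxhighlight>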


Joule expansion

The Joule expansion is similar to the mixing described above. It takes place in an adiabatic system consisting of a gas and two rigid vessels a and b of equal volume, connected by a valve. Initially, the valve is closed. Vessel a contains the gas while the other vessel b is empty. When the valve is opened, the gas flows from vessel a into b until the pressures in the two vessels are equal. The volume taken by the gas is doubled while the internal energy of the system is constant (adiabatic and no work done). Assuming that the gas is ideal, the molar internal energy is given by U_{\text{m}} = C_{\text{V}}T. As ''C''V is constant, constant ''U'' means constant ''T''. The molar entropy of an ideal gas, as function of the molar volume ''V''m and ''T'', is given by

: S_{\text{m}} = C_{\text{V}}\ln\frac{T}{T_0} + R\ln\frac{V_{\text{m}}}{V_{\text{m}0}}.

The system consisting of the two vessels and the gas is closed and adiabatic, so the entropy production during the process is equal to the increase of the entropy of the gas. So, doubling the volume with ''T'' constant gives that the molar entropy produced is

: S_{\text{mi}} = R\ln 2.


Microscopic interpretation

The Joule expansion provides an opportunity to explain the entropy production in statistical mechanical (i.e., microscopic) terms. At the expansion, the volume that the gas can occupy is doubled. This means that, for every molecule, there are now two possibilities: it can be placed in container a or b. If the gas has amount of substance ''n'', the number of molecules is equal to ''n''⋅''N''A, where ''N''A is the Avogadro constant. The number of microscopic possibilities increases by a factor of 2 per molecule due to the doubling of volume, so in total the factor is 2^{nN_{\text{A}}}. Using the well-known Boltzmann expression for the entropy

: S = k\ln\Omega,

where ''k'' is the Boltzmann constant and Ω is the number of microscopic possibilities to realize the macroscopic state, the change in molar entropy is

:S_{\text{mi}} = S_\Delta / n = k\ln\left(2^{nN_{\text{A}}}\right)/n = kN_{\text{A}}\ln 2 = R\ln 2.

So, in an irreversible process, the number of microscopic possibilities to realize the macroscopic state is increased by a certain factor.
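The counting argument can be checked numerically; the sketch below uses an assumed amount of gas and evaluates k\,nN_{\text{A}}\ln 2 per mole (the logarithm is taken analytically, since 2^{nN_{\text{A}}} itself is far too large to represent).

<syntaxhighlight lang="python">
# Illustrative check (hypothetical amount of gas): Boltzmann counting for the
# Joule expansion reproduces the thermodynamic result R*ln(2) per mole.
import math

k_B = 1.380649e-23      # Boltzmann constant [J/K]
N_A = 6.02214076e23     # Avogadro constant [1/mol]
R = k_B * N_A           # molar gas constant [J/(mol K)]

n = 0.5                 # amount of substance [mol]
# ln(Omega_final/Omega_initial) = n*N_A*ln(2), since each molecule gains a factor 2
delta_S = k_B * (n * N_A) * math.log(2)   # total entropy produced [J/K]
delta_S_molar = delta_S / n               # per mole

print(f"Molar entropy produced: {delta_S_molar:.3f} J/(mol K)")
print(f"R*ln(2)               : {R * math.log(2):.3f} J/(mol K)")
</syntaxhighlight>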


Basic inequalities and stability conditions

In this section we derive the basic inequalities and stability conditions for closed systems. For closed systems the first law reduces to

:\frac{\mathrm{d}U}{\mathrm{d}t} = \dot Q - p\frac{\mathrm{d}V}{\mathrm{d}t} + P.

The second law we write as

:\frac{\mathrm{d}S}{\mathrm{d}t} - \frac{\dot Q}{T} \geq 0.

For ''adiabatic systems'' \dot Q = 0, so d''S''/d''t'' ≥ 0. In other words: the entropy of adiabatic systems cannot decrease. In equilibrium the entropy is at its maximum. Isolated systems are a special case of adiabatic systems, so this statement is also valid for isolated systems.

Now consider systems with ''constant temperature and volume''. In most cases ''T'' is the temperature of the surroundings with which the system is in good thermal contact. Since ''V'' is constant the first law gives \dot Q = \mathrm{d}U/\mathrm{d}t - P. Substitution in the second law, and using that ''T'' is constant, gives

:\frac{\mathrm{d}(TS)}{\mathrm{d}t} - \frac{\mathrm{d}U}{\mathrm{d}t} + P \geq 0.

With the Helmholtz free energy, defined as

:F = U - TS,

we get

:\frac{\mathrm{d}F}{\mathrm{d}t} - P \leq 0.

If ''P'' = 0 this is the mathematical formulation of the general property that the free energy of systems with fixed temperature and volume tends to a minimum. The expression can be integrated from the initial state i to the final state f, resulting in

:W_{\text{S}} \leq F_{\text{i}} - F_{\text{f}},

where ''W''S is the work done ''by'' the system. If the process inside the system is completely reversible, the equality sign holds. Hence the maximum work that can be extracted from the system is equal to the free energy of the initial state minus the free energy of the final state.

Finally we consider systems with ''constant temperature and pressure'' and take ''P'' = 0. As ''p'' is constant the first law gives

:\frac{\mathrm{d}U}{\mathrm{d}t} = \dot Q - \frac{\mathrm{d}(pV)}{\mathrm{d}t}.

Combining with the second law, and using that ''T'' is constant, gives

:\frac{\mathrm{d}(TS)}{\mathrm{d}t} - \frac{\mathrm{d}U}{\mathrm{d}t} - \frac{\mathrm{d}(pV)}{\mathrm{d}t} \geq 0.

With the Gibbs free energy, defined as

:G = U + pV - TS,

we get

:\frac{\mathrm{d}G}{\mathrm{d}t} \leq 0.
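As an illustration of the bound W_{\text{S}} \leq F_{\text{i}} - F_{\text{f}}, the sketch below compares, for an assumed isothermal ideal-gas expansion, the reversible work (which saturates the bound) with the work obtained by expanding against a constant external pressure.

<syntaxhighlight lang="python">
# Illustrative check (assumed ideal-gas example): for an isothermal expansion
# the free-energy decrease F_i - F_f bounds the extractable work W_S.
import math

R = 8.314                   # molar gas constant [J/(mol K)]
n, T = 1.0, 300.0           # amount [mol] and constant temperature [K]
V_i, V_f = 1.0e-3, 2.0e-3   # initial and final volumes [m^3]

# For an ideal gas at constant T, U is unchanged, so
# F_i - F_f = T*(S_f - S_i) = n*R*T*ln(V_f/V_i)
delta_F = n * R * T * math.log(V_f / V_i)

# Reversible isothermal expansion extracts exactly this work;
# expanding against a constant external pressure p_ext = nRT/V_f extracts less.
W_reversible = n * R * T * math.log(V_f / V_i)
W_irreversible = (n * R * T / V_f) * (V_f - V_i)

print(f"F_i - F_f          = {delta_F:.1f} J")
print(f"W_S (reversible)   = {W_reversible:.1f} J")
print(f"W_S (irreversible) = {W_irreversible:.1f} J  <=  F_i - F_f")
</syntaxhighlight>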


Homogeneous systems

In homogeneous systems the temperature and pressure are well-defined and all internal processes are reversible. Hence \dot S_{\text{i}} = 0. As a result, the second law, multiplied by ''T'', reduces to

:T\frac{\mathrm{d}S}{\mathrm{d}t} = \dot Q + \dot n TS_{\text{m}}.

With ''P'' = 0 the first law becomes

:\frac{\mathrm{d}U}{\mathrm{d}t} = \dot Q + \dot n H_{\text{m}} - p\frac{\mathrm{d}V}{\mathrm{d}t}.

Eliminating \dot Q and multiplying by d''t'' gives

: \mathrm{d}U = T\mathrm{d}S - p\mathrm{d}V + (H_{\text{m}} - TS_{\text{m}})\,\mathrm{d}n.

Since

:H_{\text{m}} - TS_{\text{m}} = G_{\text{m}} = \mu,

with ''G''m the molar Gibbs free energy and ''μ'' the molar chemical potential, we obtain the well-known result

: \mathrm{d}U = T\mathrm{d}S - p\mathrm{d}V + \mu\,\mathrm{d}n.


Entropy production in stochastic processes

Since physical processes can be described by stochastic processes, such as Markov chains and diffusion processes, entropy production can be defined mathematically in such processes. For a continuous-time Markov chain with instantaneous probability distribution p_i(t) and transition rate q_{ij}, the instantaneous entropy production rate is

:e_p(t) = \frac{1}{2}\sum_{i,j}\left[p_i(t)q_{ij} - p_j(t)q_{ji}\right]\log\frac{p_i(t)q_{ij}}{p_j(t)q_{ji}}.

The long-time behavior of entropy production is kept after a proper lifting of the process. This approach provides a dynamic explanation for the Kelvin statement and the Clausius statement of the second law of thermodynamics.
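The formula can be evaluated directly for a small example; the sketch below uses an assumed three-state chain with cyclically biased rates, for which detailed balance is broken and e_p is strictly positive even in the stationary state.

<syntaxhighlight lang="python">
# Illustrative sketch (hypothetical 3-state chain): instantaneous entropy
# production rate e_p(t) of a continuous-time Markov chain, following the
# formula above.
import numpy as np

# Transition rate matrix q[i][j] (i != j); the cyclic bias breaks detailed
# balance, so entropy is produced even in the stationary state.
q = np.array([[0.0, 2.0, 1.0],
              [1.0, 0.0, 2.0],
              [2.0, 1.0, 0.0]])

def entropy_production_rate(p, q):
    """e_p = 1/2 * sum_{i,j} [p_i q_ij - p_j q_ji] * log(p_i q_ij / (p_j q_ji))."""
    ep = 0.0
    n = len(p)
    for i in range(n):
        for j in range(n):
            if i != j and q[i, j] > 0 and q[j, i] > 0:
                flux_ij = p[i] * q[i, j]
                flux_ji = p[j] * q[j, i]
                ep += 0.5 * (flux_ij - flux_ji) * np.log(flux_ij / flux_ji)
    return ep

# The uniform distribution happens to be stationary for this rate matrix
p = np.array([1/3, 1/3, 1/3])
print(f"e_p = {entropy_production_rate(p, q):.4f}")  # > 0: irreversible steady state
</syntaxhighlight>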


See also

*Thermodynamics
*First law of thermodynamics
*Second law of thermodynamics
*Irreversible process
*Non-equilibrium thermodynamics
*High entropy alloys


References


Further reading

* Seifert, Udo (2005). "Entropy Production along a Stochastic Trajectory and an Integral Fluctuation Theorem". ''Physical Review Letters''. 95 (4): 040602. arXiv:cond-mat/0503686. Bibcode:2005PhRvL..95d0602S. doi:10.1103/PhysRevLett.95.040602. PMID 16090792.