In mathematics, the Gibbs measure, named after Josiah Willard Gibbs, is a probability measure frequently seen in many problems of probability theory and statistical mechanics. It is a generalization of the canonical ensemble to infinite systems.
The canonical ensemble gives the probability of the system ''X'' being in state ''x'' (equivalently, of the random variable ''X'' having value ''x'') as
: $P(X = x) = \frac{1}{Z(\beta)} \exp(-\beta E(x)).$
Here, $E(x)$ is a function from the space of states to the real numbers; in physics applications, $E(x)$ is interpreted as the energy of the configuration ''x''. The parameter $\beta$ is a free parameter; in physics, it is the inverse temperature. The normalizing constant $Z(\beta)$ is the partition function. However, in infinite systems, the total energy is no longer a finite number and cannot be used in the traditional construction of the probability distribution of a canonical ensemble. Traditional approaches in statistical physics studied the limit of intensive properties as the size of a finite system approaches infinity (the thermodynamic limit). When the energy function can be written as a sum of terms that each involve only variables from a finite subsystem, the notion of a Gibbs measure provides an alternative approach. Gibbs measures were proposed by probability theorists such as Dobrushin, Lanford, and Ruelle and provided a framework to study infinite systems directly, instead of taking the limit of finite systems.
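For a finite state space, the canonical ensemble above can be evaluated directly by normalizing Boltzmann weights, since $Z(\beta)$ is then a finite sum. The sketch below (in Python) is purely illustrative: the two-spin toy system, the energy function, and the function names are assumptions of the example, not anything fixed by the definition.
<syntaxhighlight lang="python">
import math
from itertools import product

def gibbs_distribution(states, energy, beta):
    """Canonical ensemble on a finite state space:
    P(X = x) = exp(-beta * E(x)) / Z(beta)."""
    weights = {x: math.exp(-beta * energy(x)) for x in states}
    z = sum(weights.values())  # partition function Z(beta), a finite sum here
    return {x: w / z for x, w in weights.items()}

# Hypothetical toy system: two coupled +/-1 spins with E(s) = -s1*s2.
states = list(product([-1, +1], repeat=2))
energy = lambda s: -s[0] * s[1]
print(gibbs_distribution(states, energy, beta=1.0))
</syntaxhighlight>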
A measure is a Gibbs measure if the conditional probabilities it induces on each finite subsystem satisfy a consistency condition: if all degrees of freedom outside the finite subsystem are frozen, the canonical ensemble for the subsystem subject to these boundary conditions matches the probabilities in the Gibbs measure conditional on the frozen degrees of freedom.
The Hammersley–Clifford theorem implies that any probability measure that satisfies a Markov property is a Gibbs measure for an appropriate choice of (locally defined) energy function. Therefore, Gibbs measures arise in many problems outside of physics, such as Hopfield networks, Markov networks, Markov logic networks, and boundedly rational potential games in game theory and economics.
A Gibbs measure in a system with local (finite-range) interactions maximizes the entropy density for a given expected energy density; or, equivalently, it minimizes the free energy density.
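Schematically, for a translation-invariant (and absolutely summable) potential $\Phi$, this variational principle can be stated as follows, with the caveat that the notation is illustrative rather than taken from the definitions below: writing $s(\nu)$ for the entropy density and $e_\Phi(\nu)$ for the expected energy density of a translation-invariant probability measure $\nu$, a translation-invariant measure $\mu$ is a Gibbs measure exactly when
:: $e_\Phi(\mu) - \tfrac{1}{\beta}\, s(\mu) \;=\; \inf_{\nu} \left( e_\Phi(\nu) - \tfrac{1}{\beta}\, s(\nu) \right),$
: the infimum being taken over all translation-invariant probability measures $\nu$; the quantity being minimized is the free energy density.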
The Gibbs measure of an infinite system is not necessarily unique, in contrast to the canonical ensemble of a finite system, which is unique. The existence of more than one Gibbs measure is associated with statistical phenomena such as
symmetry breaking and
phase coexistence.
Statistical physics
The set of Gibbs measures on a system is always convex, so there is either a unique Gibbs measure (in which case the system is said to be "ergodic"), or there are infinitely many (and the system is called "nonergodic"). In the nonergodic case, the Gibbs measures can be expressed as the set of convex combinations of a much smaller number of special Gibbs measures known as "pure states" (not to be confused with the related but distinct notion of pure states in quantum mechanics). In physical applications, the Hamiltonian (the energy function) usually has some sense of ''locality'', and the pure states have the cluster decomposition property that "far-separated subsystems" are independent. In practice, physically realistic systems are found in one of these pure states.
If the Hamiltonian possesses a symmetry, then a unique (i.e. ergodic) Gibbs measure will necessarily be invariant under the symmetry. But in the case of multiple (i.e. nonergodic) Gibbs measures, the pure states are typically ''not'' invariant under the Hamiltonian's symmetry. For example, in the infinite ferromagnetic Ising model below the critical temperature, there are two pure states, the "mostly-up" and "mostly-down" states, which are interchanged under the model's spin-flip symmetry $\sigma_i \to -\sigma_i$.
Markov property
An example of the Markov property can be seen in the Gibbs measure of the Ising model. The probability for a given spin $\sigma_k$ to be in state ''s'' could, in principle, depend on the states of all other spins in the system. Thus, we may write the probability as
: $P(\sigma_k = s \mid \sigma_j,\, j \neq k).$
However, in an Ising model with only finite-range interactions (for example, nearest-neighbor interactions), we actually have
: $P(\sigma_k = s \mid \sigma_j,\, j \neq k) = P(\sigma_k = s \mid \sigma_j,\, j \in N_k),$
where $N_k$ is a neighborhood of the site $k$. That is, the probability at site $k$ depends ''only'' on the spins in a finite neighborhood. This last equation is in the form of a local Markov property. Measures with this property are sometimes called Markov random fields. More strongly, the converse is also true: ''any'' positive probability distribution (nonzero density everywhere) having the Markov property can be represented as a Gibbs measure for an appropriate energy function (Ross Kindermann and J. Laurie Snell, ''Markov Random Fields and Their Applications'', American Mathematical Society, 1980). This is the Hammersley–Clifford theorem.
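As a concrete illustration of this local Markov property, the conditional distribution of a single spin given its nearest neighbors can be written in closed form from the local energy alone. The sketch below assumes a nearest-neighbor Ising model with coupling $J$ and external field $h$; the function name and the heat-bath form are our own illustrative choices, not taken from the text above.
<syntaxhighlight lang="python">
import math

def conditional_spin_prob(neighbors, J=1.0, h=0.0, beta=1.0):
    """P(sigma_k = +1 | states of the neighboring spins) in a nearest-neighbor
    Ising model.  Only the neighbors of site k enter: this is exactly the local
    Markov property, since the rest of the configuration is irrelevant once
    the neighbors are fixed."""
    local_field = J * sum(neighbors) + h   # field acting on site k
    # Energy of spin s at site k is -s * local_field, so
    # P(+1) = e^{beta*f} / (e^{beta*f} + e^{-beta*f}) = 1 / (1 + e^{-2*beta*f}).
    return 1.0 / (1.0 + math.exp(-2.0 * beta * local_field))

# Example: a site on the square lattice whose four neighbors are up, up, down, up.
print(conditional_spin_prob([+1, +1, -1, +1], J=1.0, h=0.0, beta=0.5))
</syntaxhighlight>
A single step of the Gibbs sampler (heat bath) for the Ising model draws the new value of a spin from exactly this conditional distribution.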
Formal definition on lattices
What follows is a formal definition for the special case of a random field on a lattice. The idea of a Gibbs measure is, however, much more general than this.
The definition of a Gibbs random field on a lattice requires some terminology:
* The lattice: A countable set $\mathbb{L}$.
* The single-spin space: A probability space $(S, \mathcal{S}, \lambda)$.
* The configuration space: $(\Omega, \mathcal{F})$, where $\Omega = S^{\mathbb{L}}$ and $\mathcal{F} = \mathcal{S}^{\mathbb{L}}$.
* Given a configuration $\omega \in \Omega$ and a subset $\Lambda \subset \mathbb{L}$, the restriction of $\omega$ to $\Lambda$ is $\omega_\Lambda = (\omega(t))_{t \in \Lambda}$. If $\Lambda_1 \cap \Lambda_2 = \emptyset$ and $\Lambda_1 \cup \Lambda_2 = \mathbb{L}$, then the configuration $\omega_{\Lambda_1} \omega_{\Lambda_2}$ is the configuration whose restrictions to $\Lambda_1$ and $\Lambda_2$ are $\omega_{\Lambda_1}$ and $\omega_{\Lambda_2}$, respectively.
* The set $\mathcal{L}$ of all finite subsets of $\mathbb{L}$.
* For each subset $\Lambda \subset \mathbb{L}$, $\mathcal{F}_\Lambda$ is the $\sigma$-algebra generated by the family of functions $(\sigma(t))_{t \in \Lambda}$, where $\sigma(t)(\omega) = \omega(t)$. The union of these $\sigma$-algebras as $\Lambda$ varies over $\mathcal{L}$ is the algebra of cylinder sets on the lattice.
* The potential: A family $\Phi = (\Phi_A)_{A \in \mathcal{L}}$ of functions $\Phi_A : \Omega \to \mathbb{R}$ such that
*# For each $A \in \mathcal{L}$, $\Phi_A$ is $\mathcal{F}_A$-measurable, meaning it depends only on the restriction $\omega_A$ (and does so measurably).
*# For all $\Lambda \in \mathcal{L}$ and $\omega \in \Omega$, the following series exists:
::: $H_\Lambda^\Phi(\omega) = \sum_{A \in \mathcal{L},\, A \cap \Lambda \neq \emptyset} \Phi_A(\omega).$
We interpret $\Phi_A$ as the contribution to the total energy (the Hamiltonian) associated to the interaction among all the points of the finite set ''A''. Then $H_\Lambda^\Phi(\omega)$ is the contribution to the total energy of all the finite sets ''A'' that meet $\Lambda$. Note that the total energy is typically infinite, but when we "localize" to each $\Lambda \in \mathcal{L}$ it may be finite, we hope.
* The Hamiltonian in $\Lambda \in \mathcal{L}$ with boundary conditions $\bar\omega$, for the potential $\Phi$, is defined by
:: $H_\Lambda^\Phi(\omega \mid \bar\omega) = H_\Lambda^\Phi\left(\omega_\Lambda \bar\omega_{\Lambda^c}\right),$
: where $\Lambda^c = \mathbb{L} \setminus \Lambda$.
* The partition function in $\Lambda \in \mathcal{L}$ with boundary conditions $\bar\omega$ and inverse temperature $\beta > 0$ (for the potential $\Phi$ and $\lambda$) is defined by
:: $Z_\Lambda^\Phi(\bar\omega) = \int \lambda^\Lambda(\mathrm{d}\omega)\, \exp\left(-\beta H_\Lambda^\Phi(\omega \mid \bar\omega)\right),$
: where
:: $\lambda^\Lambda(\mathrm{d}\omega) = \prod_{t \in \Lambda} \lambda(\mathrm{d}\omega(t))$
: is the product measure.
: A potential $\Phi$ is $\lambda$-admissible if $Z_\Lambda^\Phi(\bar\omega)$ is finite for all $\Lambda \in \mathcal{L}$ and $\bar\omega \in \Omega$.
: A probability measure $\mu$ on $(\Omega, \mathcal{F})$ is a Gibbs measure for a $\lambda$-admissible potential $\Phi$ if it satisfies the Dobrushin–Lanford–Ruelle (DLR) equation
:: $\int \mu(\mathrm{d}\bar\omega)\, Z_\Lambda^\Phi(\bar\omega)^{-1} \int \lambda^\Lambda(\mathrm{d}\omega)\, \exp\left(-\beta H_\Lambda^\Phi(\omega \mid \bar\omega)\right) 1_A\left(\omega_\Lambda \bar\omega_{\Lambda^c}\right) = \mu(A)$
: for all $A \in \mathcal{F}$ and $\Lambda \in \mathcal{L}$.
An example
To help understand the above definitions, here are the corresponding quantities in the important example of the Ising model with nearest-neighbor interactions (coupling constant $J$) and a magnetic field ($h$), on $\mathbb{Z}^d$:
* The lattice is simply $\mathbb{L} = \mathbb{Z}^d$.
* The single-spin space is $S = \{-1, 1\}$.
* The potential is given by
:: $\Phi_A(\omega) = \begin{cases} -J\,\omega(t_1)\omega(t_2) & \text{if } A = \{t_1, t_2\} \text{ with } \lVert t_2 - t_1 \rVert_1 = 1, \\ -h\,\omega(t) & \text{if } A = \{t\}, \\ 0 & \text{otherwise.} \end{cases}$
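To connect this example back to the formal definitions, the following sketch spells out $\Phi_A$ and the finite-volume Hamiltonian $H_\Lambda^\Phi(\omega \mid \bar\omega)$ for a one-dimensional nearest-neighbor chain. The restriction to $d = 1$, the assumption that $\Lambda$ is a finite interval, the truncation of the boundary condition to the two sites adjacent to $\Lambda$ (which suffices because the potential has range 1), and all function names are illustrative assumptions of the example.
<syntaxhighlight lang="python">
def phi_A(A, config, J=1.0, h=0.0):
    """Ising potential on the 1D lattice Z:
    Phi_{t}(w) = -h * w(t), Phi_{t, t+1}(w) = -J * w(t) * w(t+1), else 0."""
    A = sorted(A)
    if len(A) == 1:
        return -h * config[A[0]]
    if len(A) == 2 and A[1] - A[0] == 1:
        return -J * config[A[0]] * config[A[1]]
    return 0.0

def hamiltonian(Lambda, omega, omega_bar, J=1.0, h=0.0):
    """H_Lambda(omega | omega_bar): sum of Phi_A over the finite sets A that
    meet Lambda, evaluated on the glued configuration that equals omega on
    Lambda and omega_bar outside.  For range-1 interactions in 1D, only the
    singletons in Lambda and the bonds touching Lambda contribute."""
    lo, hi = min(Lambda), max(Lambda)
    glued = {t: (omega[t] if t in Lambda else omega_bar[t])
             for t in range(lo - 1, hi + 2)}
    H = sum(phi_A({t}, glued, J, h) for t in Lambda)                         # field terms
    H += sum(phi_A({t, t + 1}, glued, J, h) for t in range(lo - 1, hi + 1))  # bond terms
    return H

# Example: Lambda = {0, 1, 2} with an all-up boundary condition outside.
print(hamiltonian({0, 1, 2}, {0: +1, 1: -1, 2: +1}, {-1: +1, 3: +1}, J=1.0, h=0.5))
</syntaxhighlight>
Dividing $\exp\left(-\beta H_\Lambda^\Phi(\omega \mid \bar\omega)\right)$ by the corresponding partition function then gives the finite-volume kernel that appears in the DLR equation above.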
See also
* Boltzmann distribution
* Exponential family
* Gibbs algorithm
* Gibbs sampling
* Interacting particle system
* Potential game
* Softmax
* Stochastic cellular automata