In probability theory and mathematical physics, a random matrix is a matrix-valued random variable—that is, a matrix in which some or all elements are random variables. Many important properties of physical systems can be represented mathematically as matrix problems. For example, the thermal conductivity of a lattice can be computed from the dynamical matrix of the particle-particle interactions within the lattice.
Applications
Physics
In nuclear physics, random matrices were introduced by Eugene Wigner to model the nuclei of heavy atoms. Wigner postulated that the spacings between the lines in the spectrum of a heavy atom nucleus should resemble the spacings between the eigenvalues of a random matrix, and should depend only on the symmetry class of the underlying evolution.
In solid-state physics, random matrices model the behaviour of large disordered Hamiltonians in the mean-field approximation.
In quantum chaos, the Bohigas–Giannoni–Schmit (BGS) conjecture asserts that the spectral statistics of quantum systems whose classical counterparts exhibit chaotic behaviour are described by random matrix theory.
In quantum optics, transformations described by random unitary matrices are crucial for demonstrating the advantage of quantum over classical computation (see, e.g., the boson sampling model). Moreover, such random unitary transformations can be directly implemented in an optical circuit, by mapping their parameters to optical circuit components (that is, beam splitters and phase shifters).
Random matrix theory has also found applications to the chiral Dirac operator in quantum chromodynamics, quantum gravity in two dimensions, mesoscopic physics, spin-transfer torque, the fractional quantum Hall effect, Anderson localization, quantum dots, and superconductors.
Mathematical statistics and numerical analysis
In multivariate statistics, random matrices were introduced by John Wishart, who sought to estimate covariance matrices of large samples. Chernoff-, Bernstein-, and Hoeffding-type inequalities can typically be strengthened when applied to the maximal eigenvalue of a finite sum of random Hermitian matrices.
In numerical analysis, random matrices have been used since the work of John von Neumann and Herman Goldstine to describe computation errors in operations such as matrix multiplication. Although random entries are traditional "generic" inputs to an algorithm, the concentration of measure associated with random matrix distributions implies that random matrices will not test large portions of an algorithm's input space.
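This concentration effect is easy to observe numerically. The following sketch (an illustration added here, not part of the original text; NumPy assumed, dimensions arbitrary) samples square matrices with i.i.d. N(0, 1/''n'') entries and shows that their largest singular values cluster tightly near 2, so random inputs all "look alike" in this respect:

```python
import numpy as np

rng = np.random.default_rng(0)
n, trials = 200, 50

# Largest singular value of an n-by-n matrix with i.i.d. N(0, 1/n) entries.
# Concentration of measure predicts values tightly clustered near 2.
sigmas = [
    np.linalg.svd(rng.normal(0, 1 / np.sqrt(n), (n, n)), compute_uv=False)[0]
    for _ in range(trials)
]
print(np.mean(sigmas), np.std(sigmas))
```

Across trials the sample standard deviation is tiny compared to the mean, which is the concentration phenomenon described above.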
Number theory
In number theory, the distribution of zeros of the Riemann zeta function (and other L-functions) is modeled by the distribution of eigenvalues of certain random matrices. The connection was first discovered by Hugh Montgomery and Freeman J. Dyson. It is connected to the Hilbert–Pólya conjecture.
Theoretical neuroscience
In the field of theoretical neuroscience, random matrices are increasingly used to model the network of synaptic connections between neurons in the brain. Dynamical models of neuronal networks with random connectivity matrix were shown to exhibit a phase transition to chaos when the variance of the synaptic weights crosses a critical value, at the limit of infinite system size. Results on random matrices have also shown that the dynamics of random-matrix models are insensitive to mean connection strength. Instead, the stability of fluctuations depends on connection strength variation and time to synchrony depends on network topology.
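The transition to chaos mentioned above can be sketched with the circular law: for an ''n'' × ''n'' connectivity matrix with i.i.d. N(0, ''g''²/''n'') synaptic weights, the spectral radius is approximately the gain ''g'', so the linearized dynamics destabilize as ''g'' crosses 1. A minimal illustration (added here as an assumption-laden sketch; NumPy assumed, gain values arbitrary):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 300

def spectral_radius(g):
    # Random connectivity matrix with i.i.d. N(0, g^2/n) synaptic weights.
    J = rng.normal(0.0, g / np.sqrt(n), (n, n))
    return np.max(np.abs(np.linalg.eigvals(J)))

# By the circular law the spectral radius is close to g, so the
# linearized dynamics x' = -x + J x lose stability near g = 1.
r_low, r_high = spectral_radius(0.5), spectral_radius(1.5)
print(r_low, r_high)
```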
Optimal control
In optimal control theory, the evolution of ''n'' state variables through time depends at any time on their own values and on the values of ''k'' control variables. With linear evolution, matrices of coefficients appear in the state equation (equation of evolution). In some problems the values of the parameters in these matrices are not known with certainty, in which case there are random matrices in the state equation and the problem is known as one of stochastic control. A key result in the case of linear-quadratic control with stochastic matrices is that the certainty equivalence principle does not apply: while in the absence of multiplier uncertainty (that is, with only additive uncertainty) the optimal policy with a quadratic loss function coincides with what would be decided if the uncertainty were ignored, the optimal policy may differ if the state equation contains random coefficients.
Gaussian ensembles
The most commonly studied random matrix distributions are the Gaussian ensembles.
The Gaussian unitary ensemble GUE(''n'') is described by the Gaussian measure with density
:<math>\frac{1}{Z_{\text{GUE}(n)}} e^{-\frac{n}{2} \operatorname{tr} H^2}</math>
on the space of ''n'' × ''n'' Hermitian matrices ''H'' = (''H''<sub>''ij''</sub>). Here ''Z''<sub>GUE(''n'')</sub> is a normalization constant, chosen so that the integral of the density is equal to one. The term ''unitary'' refers to the fact that the distribution is invariant under unitary conjugation. The Gaussian unitary ensemble models Hamiltonians lacking time-reversal symmetry.
The Gaussian orthogonal ensemble GOE(''n'') is described by the Gaussian measure with density
:<math>\frac{1}{Z_{\text{GOE}(n)}} e^{-\frac{n}{4} \operatorname{tr} H^2}</math>
on the space of ''n'' × ''n'' real symmetric matrices ''H'' = (''H''<sub>''ij''</sub>). Its distribution is invariant under orthogonal conjugation, and it models Hamiltonians with time-reversal symmetry.
The Gaussian symplectic ensemble GSE(''n'') is described by the Gaussian measure with density
:<math>\frac{1}{Z_{\text{GSE}(n)}} e^{-n \operatorname{tr} H^2}</math>
on the space of ''n'' × ''n'' Hermitian quaternionic matrices, e.g. symmetric square matrices composed of quaternions, ''H'' = (''H''<sub>''ij''</sub>). Its distribution is invariant under conjugation by the symplectic group, and it models Hamiltonians with time-reversal symmetry but no rotational symmetry.
The Gaussian ensembles GOE, GUE and GSE are often denoted by their Dyson index, ''β'' = 1 for GOE, ''β'' = 2 for GUE, and ''β'' = 4 for GSE. This index counts the number of real components per matrix element. The ensembles as defined here have Gaussian distributed matrix elements with mean ⟨''H''<sub>''ij''</sub>⟩ = 0, and two-point correlations given by
:<math>\langle H_{ij} H^*_{mn} \rangle = \langle H_{ij} H_{nm} \rangle = \frac{1}{n}\delta_{im}\delta_{jn} + \frac{2-\beta}{n\beta}\delta_{in}\delta_{jm},</math>
from which all higher correlations follow by Isserlis' theorem.
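A GUE matrix with this normalization (off-diagonal variance ⟨|''H''<sub>''ij''</sub>|²⟩ = 1/''n'' for ''β'' = 2) can be sampled by Hermitizing a complex Gaussian matrix. A minimal sketch, added here for illustration (NumPy assumed, dimension arbitrary):

```python
import numpy as np

rng = np.random.default_rng(2)

def sample_gue(n, rng):
    # Complex Gaussian matrix, then Hermitize: H = (A + A^H)/2, scaled
    # so that E|H_ij|^2 = 1/n (the beta = 2 normalization above).
    A = rng.normal(size=(n, n)) + 1j * rng.normal(size=(n, n))
    return (A + A.conj().T) / (2 * np.sqrt(n))

n = 400
H = sample_gue(n, rng)
# Hermitian by construction, so eigenvalues are real.
offdiag_var = np.mean(np.abs(H[~np.eye(n, dtype=bool)]) ** 2)
print(offdiag_var * n)  # should be close to 1
```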
The joint probability density for the eigenvalues ''λ''<sub>1</sub>, ''λ''<sub>2</sub>, ..., ''λ''<sub>''n''</sub> of GUE/GOE/GSE is given by
:<math>\frac{1}{Z_{\beta,n}} \prod_{k=1}^n e^{-\frac{\beta n}{4}\lambda_k^2} \prod_{i<j} \left| \lambda_i - \lambda_j \right|^\beta, \quad (1)</math>
where ''Z''<sub>''β'',''n''</sub> is a normalization constant which can be explicitly computed, see Selberg integral. In the case of GUE (''β'' = 2), the formula (1) describes a determinantal point process. Eigenvalues repel, as the joint probability density has a zero (of ''β''th order) for coinciding eigenvalues ''λ''<sub>''i''</sub> = ''λ''<sub>''j''</sub>.
The distribution of the largest eigenvalue for GOE, GUE and Wishart matrices of finite dimensions has also been determined explicitly.
Distribution of level spacings
From the ordered sequence of eigenvalues ''λ''<sub>1</sub> < ''λ''<sub>2</sub> < ... < ''λ''<sub>''n''</sub>, one defines the normalized spacings ''s'' = (''λ''<sub>''i''+1</sub> − ''λ''<sub>''i''</sub>)/⟨''s''⟩, where ⟨''s''⟩ is the mean spacing. The probability distribution of spacings is approximately given by
:<math>p_1(s) = \frac{\pi}{2} s\, e^{-\frac{\pi}{4} s^2}</math>
for the orthogonal ensemble GOE (''β'' = 1),
:<math>p_2(s) = \frac{32}{\pi^2} s^2 e^{-\frac{4}{\pi} s^2}</math>
for the unitary ensemble GUE (''β'' = 2), and
:<math>p_4(s) = \frac{2^{18}}{3^6 \pi^3} s^4 e^{-\frac{64}{9\pi} s^2}</math>
for the symplectic ensemble GSE (''β'' = 4).
The numerical constants are such that ''p''<sub>''β''</sub>(''s'') is normalized:
:<math>\int_0^\infty p_\beta(s)\, ds = 1</math>
and the mean spacing is
:<math>\int_0^\infty s\, p_\beta(s)\, ds = 1</math>
for ''β'' = 1, 2, 4.
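The level repulsion encoded in these distributions (''p''<sub>''β''</sub>(''s'') ~ ''s''<sup>''β''</sup> for small ''s'') can be checked empirically. The sketch below, added for illustration (NumPy assumed; sampling construction and dimension are choices made here), normalizes bulk GUE spacings to unit mean and shows that very small spacings are rare, unlike for independent (Poisson) levels:

```python
import numpy as np

rng = np.random.default_rng(3)

# Eigenvalues of one large GUE matrix (Hermitized complex Gaussian).
n = 1000
A = rng.normal(size=(n, n)) + 1j * rng.normal(size=(n, n))
H = (A + A.conj().T) / (2 * np.sqrt(n))
lam = np.sort(np.linalg.eigvalsh(H))

# Use spacings from the middle of the spectrum, where the local mean
# spacing is roughly constant, and normalize them to unit mean.
bulk = lam[n // 4 : 3 * n // 4]
s = np.diff(bulk)
s /= s.mean()

# Level repulsion: for GUE, p_2(s) ~ s^2 near 0, so P(s < 0.1) is tiny;
# i.i.d. (Poisson) levels would put about 10% of spacings there.
print(s.mean(), np.mean(s < 0.1))
```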
Generalizations
''Wigner matrices'' are random Hermitian matrices ''H''<sub>''n''</sub> = (''H''<sub>''n''</sub>(''i'',''j''))<sub>''i'',''j''=1..''n''</sub> such that the entries
:<math>\{ H_n(i,j), \quad 1 \le i \le j \le n \}</math>
above the main diagonal are independent random variables with zero mean and have identical second moments.
''Invariant matrix ensembles'' are random Hermitian matrices with density on the space of real symmetric/ Hermitian/ quaternionic Hermitian matrices, which is of the form
:<math>\frac{1}{Z_n} e^{-n \operatorname{tr} V(H)},</math>
where the function ''V'' is called the potential.
The Gaussian ensembles are the only common special cases of these two classes of random matrices.
Spectral theory of random matrices
The spectral theory of random matrices studies the distribution of the eigenvalues as the size of the matrix goes to infinity.
Global regime
In the ''global regime'', one is interested in the distribution of linear statistics of the form <math>N_{f,H} = n^{-1} \operatorname{tr} f(H)</math>.
Empirical spectral measure
The ''empirical spectral measure'' ''μ''<sub>''H''</sub> of ''H'' is defined by
:<math>\mu_H(A) = \frac{1}{n}\, \# \left\{ \text{eigenvalues of } H \text{ in } A \right\}, \quad A \subset \mathbb{R}.</math>
Usually, the limit of ''μ''<sub>''H''</sub> is a deterministic measure; this is a particular case of self-averaging. The cumulative distribution function of the limiting measure is called the integrated density of states and is denoted ''N''(''λ''). If the integrated density of states is differentiable, its derivative is called the density of states and is denoted ''ρ''(''λ'').
The limit of the empirical spectral measure for Wigner matrices was described by Eugene Wigner; see Wigner semicircle distribution and Wigner surmise. As far as sample covariance matrices are concerned, a theory was developed by Marčenko and Pastur.
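The semicircle law is easy to observe at moderate size. A sketch added for illustration (NumPy assumed; normalization chosen so the limiting support is [−2, 2]):

```python
import numpy as np

rng = np.random.default_rng(4)

# Real symmetric Wigner matrix with off-diagonal entry variance 1/n;
# the limiting semicircle density is sqrt(4 - x^2) / (2*pi) on [-2, 2].
n = 1000
A = rng.normal(size=(n, n))
H = (A + A.T) / np.sqrt(2 * n)
lam = np.linalg.eigvalsh(H)

# By symmetry, half the mass should lie below 0, and essentially all
# eigenvalues should fall inside (a small enlargement of) [-2, 2].
print(np.mean(lam <= 0), lam.min(), lam.max())
```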
The limit of the empirical spectral measure of invariant matrix ensembles is described by a certain integral equation which arises from potential theory.
Fluctuations
For the linear statistics ''N''<sub>''f'',''H''</sub> = ''n''<sup>−1</sup> Σ ''f''(''λ''<sub>''j''</sub>), one is also interested in the fluctuations about ∫ ''f''(''λ'') ''dN''(''λ''). For many classes of random matrices, a central limit theorem of the form
:<math>\frac{N_{f,H} - \int f(\lambda)\, dN(\lambda)}{\sigma_{f,n}} \overset{D}{\to} N(0, 1)</math>
is known.
Local regime
In the ''local regime'', one is interested in the spacings between eigenvalues, and, more generally, in the joint distribution of eigenvalues in an interval of length of order 1/''n''. One distinguishes between ''bulk statistics'', pertaining to intervals inside the support of the limiting spectral measure, and ''edge statistics'', pertaining to intervals near the boundary of the support.
Bulk statistics
Formally, fix ''λ''<sub>0</sub> in the interior of the support of ''N''(''λ''). Then consider the point process
:<math>\xi(\lambda_0) = \sum_j \delta\Big( \cdot - n \rho(\lambda_0) (\lambda_j - \lambda_0) \Big),</math>
where ''λ''<sub>''j''</sub> are the eigenvalues of the random matrix.
The point process ''ξ''(''λ''<sub>0</sub>) captures the statistical properties of eigenvalues in the vicinity of ''λ''<sub>0</sub>. For the Gaussian ensembles, the limit of ''ξ''(''λ''<sub>0</sub>) is known; thus, for GUE it is a determinantal point process with the kernel
:<math>K(x, y) = \frac{\sin \pi(x - y)}{\pi(x - y)}</math>
(the ''sine kernel'').
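For a determinantal point process, the two-point correlation function is the 2 × 2 determinant of kernel evaluations; with the sine kernel this equals 1 − (sin ''π''(''x''−''y'')/''π''(''x''−''y''))², which vanishes as the points coincide (level repulsion). A small numeric check, added for illustration (NumPy assumed):

```python
import numpy as np

def K(x, y):
    # Sine kernel; np.sinc(t) = sin(pi*t)/(pi*t), with sinc(0) = 1.
    return np.sinc(x - y)

def rho2(x, y):
    # Two-point correlation of a determinantal process: the 2x2
    # determinant of the kernel evaluated at the pair of points.
    return np.linalg.det(np.array([[K(x, x), K(x, y)],
                                   [K(y, x), K(y, y)]]))

print(rho2(0.0, 0.5))    # 1 - (2/pi)^2, since sinc(0.5) = 2/pi
print(rho2(0.0, 0.001))  # close to 0: eigenvalue repulsion
```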
The ''universality'' principle postulates that the limit of ''ξ''(''λ''<sub>0</sub>) as ''n'' → ∞ should depend only on the symmetry class of the random matrix (and neither on the specific model of random matrices nor on ''λ''<sub>0</sub>). Rigorous proofs of universality are known for invariant matrix ensembles and Wigner matrices.
Edge statistics
:''See Tracy–Widom distribution.''
Correlation functions
The joint probability density of the eigenvalues of ''n'' × ''n'' random Hermitian matrices ''M'', with partition functions of the form
:<math>Z_n = \int_{\mathbf{H}^{n \times n}} d\mu_0(M)\, e^{-\operatorname{tr} V(M)},</math>
where
:<math>V(x) := \sum_{j=1}^{\infty} v_j x^j</math>
and ''dμ''<sub>0</sub>(''M'') is the standard Lebesgue measure on the space '''H'''<sup>''n'' × ''n''</sup> of Hermitian ''n'' × ''n'' matrices, is given by
:<math>p_n(x_1, \dots, x_n) = \frac{1}{Z_n} \prod_{i<j} (x_i - x_j)^2\, e^{-\sum_i V(x_i)}.</math>
The ''k''-point correlation functions (or ''marginal distributions'') are defined as
:<math>R^{(k)}_n(x_1, \dots, x_k) = \frac{n!}{(n-k)!} \int_{\mathbb{R}} dx_{k+1} \cdots \int_{\mathbb{R}} dx_n\, p_n(x_1, x_2, \dots, x_n),</math>
which are symmetric functions of their variables.
In particular, the one-point correlation function, or ''density of states'', is
:<math>R^{(1)}_n(x_1) = n \int_{\mathbb{R}} dx_2 \cdots \int_{\mathbb{R}} dx_n\, p_n(x_1, x_2, \dots, x_n).</math>
Its integral over a Borel set ''B'' ⊂ '''R''' gives the expected number of eigenvalues contained in ''B'':
:<math>\int_B R^{(1)}_n(x)\, dx = \mathbf{E}\left( \#\{ \text{eigenvalues in } B \} \right).</math>
The following result expresses these correlation functions as determinants of the matrices formed from evaluating the appropriate integral kernel at the pairs (''x''<sub>''i''</sub>, ''x''<sub>''j''</sub>) of points appearing within the correlator.
Theorem (Dyson–Mehta). For any ''k'', 1 ≤ ''k'' ≤ ''n'', the ''k''-point correlation function ''R''<sup>(''k'')</sup><sub>''n''</sub> can be written as a determinant
:<math>R^{(k)}_n(x_1, \dots, x_k) = \det_{1 \le i, j \le k} \left( K_n(x_i, x_j) \right),</math>
where ''K''<sub>''n''</sub> is the ''n''th Christoffel–Darboux kernel
:<math>K_n(x, y) := \sum_{j=0}^{n-1} \psi_j(x) \psi_j(y)</math>
associated to ''V'', written in terms of the quasipolynomials
:<math>\psi_j(x) = \frac{1}{\sqrt{h_j}}\, p_j(x)\, e^{-V(x)/2},</math>
where (''p''<sub>''j''</sub>(''x''))<sub>''j'' ≥ 0</sub> is a complete sequence of monic polynomials, of the degrees indicated, satisfying the orthogonality conditions
:<math>\int_{\mathbb{R}} p_j(x)\, p_k(x)\, e^{-V(x)}\, dx = h_j\, \delta_{jk}.</math>
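For the Gaussian potential ''V''(''x'') = ''x''², the monic orthogonal polynomials are rescaled (physicists') Hermite polynomials, ''p''<sub>''j''</sub>(''x'') = 2<sup>−''j''</sup>''H''<sub>''j''</sub>(''x'') with ''h''<sub>''j''</sub> = √''π'' ''j''!/2<sup>''j''</sup>, and the orthonormality of the quasipolynomials ''ψ''<sub>''j''</sub> can be verified numerically. A sketch added for illustration (NumPy assumed; the uniform-grid quadrature is a crude stand-in for exact integration):

```python
import numpy as np
from numpy.polynomial.hermite import hermval
from math import factorial, pi, sqrt

def psi(j, x):
    # Quasipolynomial psi_j = p_j(x) e^{-x^2/2} / sqrt(h_j) for V(x) = x^2,
    # with p_j(x) = 2^{-j} H_j(x) (monic) and h_j = sqrt(pi) j! / 2^j.
    c = np.zeros(j + 1)
    c[j] = 1.0                      # coefficient vector selecting H_j
    h_j = sqrt(pi) * factorial(j) / 2 ** j
    return 2.0 ** (-j) * hermval(x, c) * np.exp(-x ** 2 / 2) / sqrt(h_j)

# Check orthonormality: the Gram matrix of psi_0..psi_3 should be the
# identity. The integrand decays like e^{-x^2}, so a wide uniform grid
# gives essentially exact trapezoid-type quadrature.
x = np.linspace(-10, 10, 20001)
dx = x[1] - x[0]
gram = np.array([[np.sum(psi(j, x) * psi(k, x)) * dx for k in range(4)]
                 for j in range(4)])
print(np.round(gram, 6))
```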
Other classes of random matrices
Wishart matrices
''Wishart matrices'' are ''n × n'' random matrices of the form ''H'' = ''X'' ''X''
*, where ''X'' is an ''n × m'' random matrix (''m'' ≥ ''n'') with independent entries, and ''X''
* is its
conjugate transpose. In the important special case considered by Wishart, the entries of ''X'' are identically distributed Gaussian random variables (either real or complex).
The limit of the empirical spectral measure of Wishart matrices was found by Vladimir Marchenko and Leonid Pastur.
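A quick numeric illustration, added here (NumPy assumed; dimensions and the 1/''m'' normalization are choices made for the sketch): the eigenvalues of a normalized complex Wishart matrix ''H'' = ''XX''*/''m'' fall, for large dimensions, inside the Marchenko–Pastur support [(1 − √''c'')², (1 + √''c'')²], where ''c'' = ''n''/''m'':

```python
import numpy as np

rng = np.random.default_rng(5)

# Complex Wishart matrix H = X X* / m with X of shape (n, m), m >= n,
# and unit-variance complex Gaussian entries.
n, m = 300, 1200
X = (rng.normal(size=(n, m)) + 1j * rng.normal(size=(n, m))) / np.sqrt(2)
H = X @ X.conj().T / m
lam = np.linalg.eigvalsh(H)

# Marchenko-Pastur support for aspect ratio c = n/m.
c = n / m
lo, hi = (1 - np.sqrt(c)) ** 2, (1 + np.sqrt(c)) ** 2
print(lam.min(), lam.max(), (lo, hi))
```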
Random unitary matrices
:''See circular ensembles.''
Non-Hermitian random matrices
:''See circular law.''