In mathematics – specifically, in stochastic analysis – an Itô diffusion is a solution to a specific type of stochastic differential equation. That equation is similar to the Langevin equation used in physics to describe the Brownian motion of a particle subjected to a potential in a viscous fluid. Itô diffusions are named after the Japanese mathematician Kiyosi Itô.


Overview

A (time-homogeneous) Itô diffusion in ''n''-dimensional Euclidean space R''n'' is a process ''X'' : [0, +∞) × Ω → R''n'' defined on a probability space (Ω, Σ, P) and satisfying a stochastic differential equation of the form

:\mathrm{d} X_{t} = b(X_{t}) \, \mathrm{d} t + \sigma(X_{t}) \, \mathrm{d} B_{t},

where ''B'' is an ''m''-dimensional Brownian motion and ''b'' : R''n'' → R''n'' and σ : R''n'' → R''n''×''m'' satisfy the usual Lipschitz continuity condition

:| b(x) - b(y) | + | \sigma(x) - \sigma(y) | \leq C | x - y |

for some constant ''C'' and all ''x'', ''y'' ∈ R''n''; this condition ensures the existence of a unique strong solution ''X'' to the stochastic differential equation given above. The vector field ''b'' is known as the drift coefficient of ''X''; the matrix field σ is known as the diffusion coefficient of ''X''. Note that ''b'' and σ do not depend upon time; if they were to depend upon time, ''X'' would be referred to only as an ''Itô process'', not a diffusion. Itô diffusions have a number of nice properties, which include
* sample and Feller continuity;
* the Markov property;
* the strong Markov property;
* the existence of an infinitesimal generator;
* the existence of a characteristic operator;
* Dynkin's formula.
In particular, an Itô diffusion is a continuous, strongly Markovian process such that the domain of its characteristic operator includes all twice-continuously differentiable functions, so it is a ''diffusion'' in the sense defined by Dynkin (1965).
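As a concrete illustration (an addition, not part of the original exposition), the defining equation can be discretised with the Euler–Maruyama scheme. The sketch below is a minimal Python implementation; the particular drift ''b'', diffusion σ and starting point are hypothetical placeholders chosen only to satisfy the Lipschitz condition.

```python
import numpy as np

def euler_maruyama(b, sigma, x0, T=1.0, n_steps=1000, rng=None):
    """Approximate one sample path of dX_t = b(X_t) dt + sigma(X_t) dB_t.

    b     : R^n -> R^n        (drift coefficient)
    sigma : R^n -> R^{n x m}  (diffusion coefficient)
    x0    : starting point in R^n
    """
    rng = np.random.default_rng() if rng is None else rng
    dt = T / n_steps
    x = np.asarray(x0, dtype=float)
    m = sigma(x).shape[1]
    path = np.empty((n_steps + 1, len(x)))
    path[0] = x
    for k in range(n_steps):
        dB = rng.normal(scale=np.sqrt(dt), size=m)   # Brownian increment over [t, t + dt]
        x = x + b(x) * dt + sigma(x) @ dB
        path[k + 1] = x
    return path

# Illustrative two-dimensional example: linear drift towards the origin, constant noise.
b = lambda x: -x
sigma = lambda x: 0.5 * np.eye(2)
path = euler_maruyama(b, sigma, x0=[1.0, 0.0])
print(path[-1])
```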


Continuity


Sample continuity

An Itô diffusion ''X'' is a sample continuous process, i.e., for almost all realisations ''Bt''(ω) of the noise, ''Xt''(ω) is a continuous function of the time parameter, ''t''. More accurately, there is a "continuous version" of ''X'', a continuous process ''Y'' so that

:\mathbf{P} [ X_{t} = Y_{t} ] = 1 \mbox{ for all } t.

This follows from the standard existence and uniqueness theory for strong solutions of stochastic differential equations.


Feller continuity

In addition to being (sample) continuous, an Itô diffusion ''X'' satisfies the stronger requirement to be a Feller-continuous process. For a point ''x'' ∈ R''n'', let P''x'' denote the law of ''X'' given initial datum ''X''0 = ''x'', and let E''x'' denote expectation with respect to P''x''. Let ''f'' : R''n'' → R be a Borel-measurable function that is bounded below and define, for fixed ''t'' ≥ 0, ''u'' : R''n'' → R by

:u(x) = \mathbf{E}^{x} [ f(X_{t}) ].

* Lower semi-continuity: if ''f'' is lower semi-continuous, then ''u'' is lower semi-continuous.
* Feller continuity: if ''f'' is bounded and continuous, then ''u'' is continuous.
The behaviour of the function ''u'' above when the time ''t'' is varied is addressed by the Kolmogorov backward equation, the Fokker–Planck equation, etc. (See below.)
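The quantity ''u''(''x'') = E''x''[''f''(''Xt'')] is easy to estimate by Monte Carlo, and the estimates vary continuously with the starting point, consistent with Feller continuity. The following sketch (an addition to the text) uses an illustrative one-dimensional diffusion dX = −X dt + dB and the bounded continuous test function f(y) = max(0, 1 − |y|); the same random seed is reused across starting points so the comparison is not swamped by sampling noise.

```python
import numpy as np

def u_hat(x, t=1.0, n_paths=200_000, n_steps=200, seed=0):
    """Monte Carlo estimate of u(x) = E^x[f(X_t)] for the illustrative diffusion
    dX_t = -X_t dt + dB_t and f(y) = max(0, 1 - |y|)."""
    rng = np.random.default_rng(seed)
    dt = t / n_steps
    X = np.full(n_paths, float(x))
    for _ in range(n_steps):                       # Euler-Maruyama step
        X += -X * dt + rng.normal(scale=np.sqrt(dt), size=n_paths)
    return np.maximum(0.0, 1.0 - np.abs(X)).mean()

# Nearby starting points give nearby values of u, as Feller continuity suggests.
print([round(u_hat(x), 4) for x in (0.0, 0.05, 0.10)])
```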


The Markov property


The Markov property

An Itô diffusion ''X'' has the important property of being ''Markovian'': the future behaviour of ''X'', given what has happened up to some time ''t'', is the same as if the process had been started at the position ''Xt'' at time 0. The precise mathematical formulation of this statement requires some additional notation: let Σ denote the natural filtration of (Ω, Σ) generated by the Brownian motion ''B'': for ''t'' ≥ 0,

:\Sigma_{t} = \Sigma_{t}^{B} = \sigma \left\{ B_{s}^{-1}(A) \subseteq \Omega \mid 0 \leq s \leq t, A \subseteq \mathbf{R}^{n} \mbox{ Borel} \right\}.

It is easy to show that ''X'' is adapted to Σ (i.e. each ''Xt'' is Σ''t''-measurable), so the natural filtration ''F'' = ''F''''X'' of (Ω, Σ) generated by ''X'' has ''Ft'' ⊆ Σ''t'' for each ''t'' ≥ 0.

Let ''f'' : R''n'' → R be a bounded, Borel-measurable function. Then, for all ''t'' and ''h'' ≥ 0, the conditional expectation conditioned on the σ-algebra Σ''t'' and the expectation of the process "restarted" from ''Xt'' satisfy the Markov property:

:\mathbf{E}^{x} \big[ f(X_{t + h}) \big| \Sigma_{t} \big] (\omega) = \mathbf{E}^{X_{t}(\omega)} \big[ f(X_{h}) \big].

In fact, ''X'' is also a Markov process with respect to the filtration ''F'', as the following shows:

:\begin{align} \mathbf{E}^{x} \left[ f(X_{t + h}) \big| F_{t} \right] &= \mathbf{E}^{x} \left[ \mathbf{E}^{x} \left[ f(X_{t + h}) \big| \Sigma_{t} \right] \Big| F_{t} \right] \\ &= \mathbf{E}^{x} \left[ \mathbf{E}^{X_{t}} \left[ f(X_{h}) \right] \Big| F_{t} \right] \\ &= \mathbf{E}^{X_{t}} \left[ f(X_{h}) \right]. \end{align}


The strong Markov property

The strong Markov property is a generalization of the Markov property above in which ''t'' is replaced by a suitable random time τ : Ω → [0, +∞] known as a stopping time. So, for example, rather than "restarting" the process ''X'' at time ''t'' = 1, one could "restart" whenever ''X'' first reaches some specified point ''p'' of R''n''.

As before, let ''f'' : R''n'' → R be a bounded, Borel-measurable function. Let τ be a stopping time with respect to the filtration Σ with τ < +∞ almost surely. Then, for all ''h'' ≥ 0,

:\mathbf{E}^{x} \big[ f(X_{\tau + h}) \big| \Sigma_{\tau} \big] = \mathbf{E}^{X_{\tau}} \big[ f(X_{h}) \big].
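A rough numerical illustration of this identity (added here, not in the original text): take ''X'' to be one-dimensional Brownian motion started at 0, let τ be the first exit time of (−1, 1) and take f(y) = y². Continuous paths leave (−1, 1) exactly at ±1, and E^{±1}[B_h²] = 1 + h, so restarting at X_τ predicts E^0[f(X_{τ+h})] = 1 + h. The step size and path count below are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(2)
dt, h = 1e-3, 0.5
f = lambda y: y ** 2

def exit_then_run_once():
    """1-d Brownian motion from 0: run to tau = first exit time of (-1, 1), then h longer."""
    x = 0.0
    while -1.0 < x < 1.0:                     # crude detection of the exit time
        x += rng.normal(scale=np.sqrt(dt))
    x = np.sign(x)                            # continuous paths leave (-1, 1) at -1 or +1 exactly
    return x + rng.normal(scale=np.sqrt(h))   # exact Brownian increment over [tau, tau + h]

# Strong Markov property: E[f(X_{tau + h})] = E^{+-1}[f(B_h)] = 1 + h for f(y) = y^2.
estimate = np.mean([f(exit_then_run_once()) for _ in range(2_000)])
print(estimate, 1.0 + h)
```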


The generator


Definition

Associated to each Itô diffusion, there is a second-order partial differential operator known as the ''generator'' of the diffusion. The generator is very useful in many applications and encodes a great deal of information about the process ''X''. Formally, the infinitesimal generator of an Itô diffusion ''X'' is the operator ''A'', which is defined to act on suitable functions ''f'' : R''n'' → R by

:A f (x) = \lim_{t \downarrow 0} \frac{\mathbf{E}^{x} [ f(X_{t}) ] - f(x)}{t}.

The set of all functions ''f'' for which this limit exists at a point ''x'' is denoted ''DA''(''x''), while ''DA'' denotes the set of all ''f'' for which the limit exists for all ''x'' ∈ R''n''. One can show that any compactly-supported ''C''2 (twice differentiable with continuous second derivative) function ''f'' lies in ''DA'' and that

:A f (x) = \sum_{i} b_{i}(x) \frac{\partial f}{\partial x_{i}}(x) + \tfrac{1}{2} \sum_{i, j} \left( \sigma(x) \sigma(x)^{\top} \right)_{i, j} \frac{\partial^{2} f}{\partial x_{i} \, \partial x_{j}}(x),

or, in terms of the gradient and scalar and Frobenius inner products,

:A f (x) = b(x) \cdot \nabla_{x} f(x) + \tfrac{1}{2} \left( \sigma(x) \sigma(x)^{\top} \right) : \nabla_{x} \nabla_{x} f(x).
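The two expressions above can be compared numerically: the closed-form sum can be evaluated with finite-difference derivatives, and the limit definition can be approximated by a short-time Monte Carlo average. The drift, diffusion and test function in the sketch below are illustrative choices, not anything prescribed by the text.

```python
import numpy as np

b = lambda x: -x                                        # illustrative drift, R^2 -> R^2
sigma = lambda x: np.array([[0.3, 0.0], [0.1, 0.2]])    # illustrative diffusion, R^2 -> R^{2x2}
f = lambda x: np.exp(-x @ x)                            # smooth test function

def generator(f, b, sigma, x, eps=1e-4):
    """A f(x) = sum_i b_i df/dx_i + (1/2) sum_ij (sigma sigma^T)_ij d^2 f / dx_i dx_j,
    with the derivatives approximated by central finite differences."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    e = np.eye(n) * eps
    grad = np.array([(f(x + e[i]) - f(x - e[i])) / (2 * eps) for i in range(n)])
    hess = np.array([[(f(x + e[i] + e[j]) - f(x + e[i] - e[j])
                       - f(x - e[i] + e[j]) + f(x - e[i] - e[j])) / (4 * eps ** 2)
                      for j in range(n)] for i in range(n)])
    a = sigma(x) @ sigma(x).T
    return b(x) @ grad + 0.5 * np.sum(a * hess)

def generator_mc(f, b, sigma, x, t=1e-3, n_paths=200_000, seed=0):
    """Monte Carlo estimate of (E^x[f(X_t)] - f(x)) / t using a single Euler-Maruyama step."""
    rng = np.random.default_rng(seed)
    x = np.asarray(x, dtype=float)
    dB = rng.normal(scale=np.sqrt(t), size=(n_paths, len(x)))
    X_t = x + b(x) * t + dB @ sigma(x).T
    fX = np.array([f(row) for row in X_t])
    return (fX.mean() - f(x)) / t

x0 = np.array([0.5, -0.2])
print(generator(f, b, sigma, x0), generator_mc(f, b, sigma, x0))   # both approximate A f(x0)
```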


An example

The generator ''A'' for standard ''n''-dimensional Brownian motion ''B'', which satisfies the stochastic differential equation d''Xt'' = d''Bt'', is given by

:A f (x) = \tfrac{1}{2} \sum_{i, j} \delta_{i j} \frac{\partial^{2} f}{\partial x_{i} \, \partial x_{j}}(x) = \tfrac{1}{2} \sum_{i} \frac{\partial^{2} f}{\partial x_{i}^{2}}(x),

i.e., ''A'' = Δ/2, where Δ denotes the Laplace operator.
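A quick numerical check of this example (an addition to the text): in one dimension, for f(x) = cos x one has E^x[f(B_t)] = exp(−t/2) cos x exactly, so the difference quotient in the definition of the generator can be seen to converge to ½f″(x) = −½ cos x.

```python
import numpy as np

# For 1-d Brownian motion and f(x) = cos(x): E^x[f(B_t)] = exp(-t/2) cos(x), since
# B_t ~ N(x, t).  The generator quotient therefore tends to -cos(x)/2 = (1/2) f''(x).
x = 0.7
for t in (1e-1, 1e-2, 1e-3):
    quotient = (np.exp(-t / 2) * np.cos(x) - np.cos(x)) / t
    print(t, quotient, -0.5 * np.cos(x))
```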


The Kolmogorov and Fokker–Planck equations

The generator is used in the formulation of Kolmogorov's backward equation. Intuitively, this equation tells us how the expected value of any suitably smooth statistic of ''X'' evolves in time: it must solve a certain partial differential equation in which time ''t'' and the initial position ''x'' are the independent variables. More precisely, if ''f'' ∈ ''C''2(R''n''; R) has compact support and ''u'' : [0, +∞) × R''n'' → R is defined by

:u(t, x) = \mathbf{E}^{x} [ f(X_{t}) ],

then ''u''(''t'', ''x'') is differentiable with respect to ''t'', ''u''(''t'', ·) ∈ ''DA'' for all ''t'', and ''u'' satisfies the following partial differential equation, known as Kolmogorov's backward equation:

:\begin{cases} \dfrac{\partial u}{\partial t}(t, x) = A u(t, x), & t > 0, x \in \mathbf{R}^{n}; \\ u(0, x) = f(x), & x \in \mathbf{R}^{n}. \end{cases}

The Fokker–Planck equation (also known as ''Kolmogorov's forward equation'') is in some sense the "adjoint" to the backward equation, and tells us how the probability density functions of ''Xt'' evolve with time ''t''. Let ρ(''t'', ·) be the density of ''Xt'' with respect to Lebesgue measure on R''n'', i.e., for any Borel-measurable set ''S'' ⊆ R''n'',

:\mathbf{P} \left[ X_{t} \in S \right] = \int_{S} \rho(t, x) \, \mathrm{d} x.

Let ''A''∗ denote the Hermitian adjoint of ''A'' (with respect to the ''L''2 inner product). Then, given that the initial position ''X''0 has a prescribed density ρ0, ρ(''t'', ''x'') is differentiable with respect to ''t'', ρ(''t'', ·) ∈ ''DA''∗ for all ''t'', and ρ satisfies the following partial differential equation, known as the Fokker–Planck equation:

:\begin{cases} \dfrac{\partial \rho}{\partial t}(t, x) = A^{*} \rho(t, x), & t > 0, x \in \mathbf{R}^{n}; \\ \rho(0, x) = \rho_{0}(x), & x \in \mathbf{R}^{n}. \end{cases}
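To make the backward equation concrete (this example is an addition): for one-dimensional Brownian motion the generator is ½ d²/dx², so the backward equation is the heat equation. The sketch below solves it by explicit finite differences and compares the result at one point with a direct sampling estimate of E^x[f(B_T)]; the rapidly decaying initial datum f(x) = exp(−x²) stands in for a compactly supported one.

```python
import numpy as np

# Backward equation for 1-d Brownian motion (A = (1/2) d^2/dx^2) is the heat equation.
# Solve it by explicit finite differences and compare with E^x[f(B_T)] by direct sampling.
f = lambda x: np.exp(-x ** 2)            # stands in for a compactly supported C^2 function

L_box, nx, T = 6.0, 601, 0.5
xs = np.linspace(-L_box, L_box, nx)
dx = xs[1] - xs[0]
dt = 0.2 * dx ** 2                       # stable time step for the explicit scheme
u = f(xs)
t = 0.0
while t < T:
    u[1:-1] += dt * 0.5 * (u[2:] - 2 * u[1:-1] + u[:-2]) / dx ** 2   # du/dt = A u
    t += dt

x0 = 0.8
mc = f(x0 + np.random.default_rng(0).normal(scale=np.sqrt(T), size=1_000_000)).mean()
print(u[np.argmin(np.abs(xs - x0))], mc)   # two estimates of u(T, x0); they should agree closely
```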


The Feynman–Kac formula

The Feynman–Kac formula is a useful generalization of Kolmogorov's backward equation. Again, ''f'' is in ''C''2(R''n''; R) and has compact support, and ''q'' : R''n'' → R is taken to be a continuous function that is bounded below. Define a function ''v'' : [0, +∞) × R''n'' → R by

:v(t, x) = \mathbf{E}^{x} \left[ \exp \left( - \int_{0}^{t} q(X_{s}) \, \mathrm{d} s \right) f(X_{t}) \right].

The Feynman–Kac formula states that ''v'' satisfies the partial differential equation

:\begin{cases} \dfrac{\partial v}{\partial t}(t, x) = A v(t, x) - q(x) v(t, x), & t > 0, x \in \mathbf{R}^{n}; \\ v(0, x) = f(x), & x \in \mathbf{R}^{n}. \end{cases}

Moreover, if ''w'' : [0, +∞) × R''n'' → R is ''C''1 in time, ''C''2 in space, bounded on ''K'' × R''n'' for all compact ''K'', and satisfies the above partial differential equation, then ''w'' must be ''v'' as defined above.

Kolmogorov's backward equation is the special case of the Feynman–Kac formula in which ''q''(''x'') = 0 for all ''x'' ∈ R''n''.
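The expectation defining ''v'' can be estimated by simulating paths and accumulating the exponential weight along each one. The sketch below (an addition; the choices f(x) = exp(−x²), q(x) = x² and one-dimensional Brownian motion are illustrative) checks the Monte Carlo estimate against an explicit finite-difference solve of ∂v/∂t = (1/2) ∂²v/∂x² − q v.

```python
import numpy as np

f = lambda x: np.exp(-x ** 2)      # illustrative terminal statistic
q = lambda x: x ** 2               # illustrative killing / discount rate
T, x0 = 0.5, 0.3

# Monte Carlo: v(T, x0) = E^x0[ exp(-int_0^T q(X_s) ds) f(X_T) ] for 1-d Brownian motion.
rng = np.random.default_rng(0)
n_paths, n_steps = 200_000, 400
dt = T / n_steps
X = np.full(n_paths, x0)
integral = np.zeros(n_paths)
for _ in range(n_steps):
    integral += q(X) * dt                    # left-endpoint rule for the time integral of q
    X += rng.normal(scale=np.sqrt(dt), size=n_paths)
v_mc = (np.exp(-integral) * f(X)).mean()

# Finite differences on dv/dt = (1/2) d^2v/dx^2 - q(x) v with v(0, .) = f.
xs = np.linspace(-6, 6, 601)
dx = xs[1] - xs[0]
dtau = 0.2 * dx ** 2
v, t = f(xs), 0.0
while t < T:
    v[1:-1] += dtau * (0.5 * (v[2:] - 2 * v[1:-1] + v[:-2]) / dx ** 2 - q(xs[1:-1]) * v[1:-1])
    t += dtau

print(v_mc, v[np.argmin(np.abs(xs - x0))])   # the two estimates should agree approximately
```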


The characteristic operator


Definition

The characteristic operator of an Itô diffusion ''X'' is a partial differential operator closely related to the generator, but somewhat more general. It is more suited to certain problems, for example in the solution of the Dirichlet problem.

The characteristic operator \mathcal{A} of an Itô diffusion ''X'' is defined by

:\mathcal{A} f (x) = \lim_{U \downarrow x} \frac{\mathbf{E}^{x} \left[ f(X_{\tau_{U}}) \right] - f(x)}{\mathbf{E}^{x} [ \tau_{U} ]},

where the sets ''U'' form a sequence of open sets ''Uk'' that decrease to the point ''x'' in the sense that

:U_{k + 1} \subseteq U_{k} \mbox{ and } \bigcap_{k = 1}^{\infty} U_{k} = \{ x \},

and

:\tau_{U} = \inf \{ t \geq 0 \mid X_{t} \not\in U \}

is the first exit time from ''U'' for ''X''. D_{\mathcal{A}} denotes the set of all ''f'' for which this limit exists for all ''x'' ∈ R''n'' and all sequences {''Uk''}. If E''x''[τ''U''] = +∞ for all open sets ''U'' containing ''x'', define

:\mathcal{A} f (x) = 0.
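For one-dimensional Brownian motion and the shrinking intervals ''U'' = (''x'' − ε, ''x'' + ε), the two ingredients of this quotient are known explicitly: E''x''[τ''U''] = ε² (see the Dynkin's formula section below) and the exit position is ''x'' ± ε with probability ½ each (see the harmonic measure section below). The quotient then reduces to a second difference, and the small check below (an addition, with an arbitrary test function) shows it converging to ½f″(x), in agreement with the generator.

```python
import numpy as np

# For 1-d Brownian motion and U = (x - eps, x + eps):
#   E^x[tau_U] = eps^2   and   X_{tau_U} = x +- eps with probability 1/2 each,
# so the characteristic-operator quotient is ((f(x+eps) + f(x-eps))/2 - f(x)) / eps^2.
f = np.sin
x = 0.9
for eps in (0.5, 0.1, 0.01):
    quotient = (0.5 * (f(x + eps) + f(x - eps)) - f(x)) / eps ** 2
    print(eps, quotient, -0.5 * np.sin(x))   # converges to (1/2) f''(x) = -sin(x)/2
```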


Relationship with the generator

The characteristic operator and infinitesimal generator are very closely related, and even agree for a large class of functions. One can show that

:D_{A} \subseteq D_{\mathcal{A}}

and that

:A f = \mathcal{A} f \mbox{ for all } f \in D_{A}.

In particular, the generator and characteristic operator agree for all ''C''2 functions ''f'', in which case

:\mathcal{A} f (x) = \sum_{i} b_{i}(x) \frac{\partial f}{\partial x_{i}}(x) + \tfrac{1}{2} \sum_{i, j} \left( \sigma(x) \sigma(x)^{\top} \right)_{i, j} \frac{\partial^{2} f}{\partial x_{i} \, \partial x_{j}}(x).


Application: Brownian motion on a Riemannian manifold

Above, the generator (and hence characteristic operator) of Brownian motion on R''n'' was calculated to be ½Δ, where Δ denotes the Laplace operator. The characteristic operator is useful in defining Brownian motion on an ''m''-dimensional Riemannian manifold (''M'', ''g''): a Brownian motion on ''M'' is defined to be a diffusion on ''M'' whose characteristic operator \mathcal{A} in local coordinates ''xi'', 1 ≤ ''i'' ≤ ''m'', is given by ½Δ''LB'', where Δ''LB'' is the Laplace–Beltrami operator, given in local coordinates by

:\Delta_{\mathrm{LB}} = \frac{1}{\sqrt{\det(g)}} \sum_{i = 1}^{m} \frac{\partial}{\partial x_{i}} \left( \sqrt{\det(g)} \sum_{j = 1}^{m} g^{i j} \frac{\partial}{\partial x_{j}} \right),

where the coefficients ''g''''ij'' appearing in the sum are the entries of the inverse of the matrix of metric coefficients, in the sense of the inverse of a square matrix.


The resolvent operator

In general, the generator ''A'' of an Itô diffusion ''X'' is not a bounded operator. However, if a positive multiple of the identity operator I is subtracted from ''A'', then the resulting operator is invertible. The inverse of this operator can be expressed in terms of ''X'' itself using the resolvent operator.

For α > 0, the resolvent operator ''R''α, acting on bounded, continuous functions ''g'' : R''n'' → R, is defined by

:R_{\alpha} g (x) = \mathbf{E}^{x} \left[ \int_{0}^{\infty} e^{- \alpha t} g(X_{t}) \, \mathrm{d} t \right].

It can be shown, using the Feller continuity of the diffusion ''X'', that ''R''α''g'' is itself a bounded, continuous function. Also, ''R''α and αI − ''A'' are mutually inverse operators:
* if ''f'' : R''n'' → R is ''C''2 with compact support, then, for all α > 0,
::R_{\alpha} (\alpha \mathbf{I} - A) f = f;
* if ''g'' : R''n'' → R is bounded and continuous, then ''R''α''g'' lies in ''DA'' and, for all α > 0,
::(\alpha \mathbf{I} - A) R_{\alpha} g = g.
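The defining expectation can be estimated by simulating paths and discretising the time integral, as in the sketch below (an addition to the text). For one-dimensional Brownian motion with g = cos, E^x[cos B_t] = exp(−t/2) cos x, so R_α g(x) = cos(x)/(α + ½) exactly, and applying αI − A = α − ½ d²/dx² to that function recovers g; the parameters and the truncation of the time integral are illustrative.

```python
import numpy as np

# Monte Carlo estimate of R_alpha g(x) for 1-d Brownian motion with g = cos.  Here
# E^x[cos(B_t)] = exp(-t/2) cos(x), so the exact value is cos(x) / (alpha + 1/2).
rng = np.random.default_rng(3)
alpha, x0 = 1.0, 0.4
n_paths, dt, T_cut = 50_000, 1e-2, 12.0    # truncate the integral where e^{-alpha t} is negligible

X = np.full(n_paths, x0)
acc = np.zeros(n_paths)
for k in range(int(T_cut / dt)):
    acc += np.exp(-alpha * k * dt) * np.cos(X) * dt    # int_0^inf e^{-alpha t} g(X_t) dt
    X += rng.normal(scale=np.sqrt(dt), size=n_paths)

print(acc.mean(), np.cos(x0) / (alpha + 0.5))
```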


Invariant measures

Sometimes it is necessary to find an invariant measure for an Itô diffusion ''X'', i.e. a measure on R''n'' that does not change under the "flow" of ''X'': i.e., if ''X''0 is distributed according to such an invariant measure μ∞, then ''Xt'' is also distributed according to μ∞ for any ''t'' ≥ 0. The Fokker–Planck equation offers a way to find such a measure, at least if it has a probability density function ρ∞: if ''X''0 is indeed distributed according to an invariant measure μ∞ with density ρ∞, then the density ρ(''t'', ·) of ''Xt'' does not change with ''t'', so ρ(''t'', ·) = ρ∞, and so ρ∞ must solve the (time-independent) partial differential equation

:A^{*} \rho_{\infty}(x) = 0, \quad x \in \mathbf{R}^{n}.

This illustrates one of the connections between stochastic analysis and the study of partial differential equations. Conversely, a given second-order linear partial differential equation of the form Λ''f'' = 0 may be hard to solve directly, but if Λ = ''A''∗ for some Itô diffusion ''X'', and an invariant measure for ''X'' is easy to compute, then that measure's density provides a solution to the partial differential equation.


Invariant measures for gradient flows

An invariant measure is comparatively easy to compute when the process ''X'' is a stochastic gradient flow of the form

:\mathrm{d} X_{t} = - \nabla \Psi(X_{t}) \, \mathrm{d} t + \sqrt{2 \beta^{-1}} \, \mathrm{d} B_{t},

where β > 0 plays the role of an inverse temperature and Ψ : R''n'' → R is a scalar potential satisfying suitable smoothness and growth conditions. In this case, the Fokker–Planck equation has a unique stationary solution ρ∞ (i.e. ''X'' has a unique invariant measure μ∞ with density ρ∞) and it is given by the Gibbs distribution:

:\rho_{\infty}(x) = Z^{-1} \exp(- \beta \Psi(x)),

where the partition function ''Z'' is given by

:Z = \int_{\mathbf{R}^{n}} \exp(- \beta \Psi(x)) \, \mathrm{d} x.

Moreover, the density ρ∞ satisfies a variational principle: it minimizes over all probability densities ρ on R''n'' the free energy functional ''F'' given by

:F[\rho] = E[\rho] + \frac{1}{\beta} S[\rho],

where

:E[\rho] = \int_{\mathbf{R}^{n}} \Psi(x) \rho(x) \, \mathrm{d} x

plays the role of an energy functional, and

:S[\rho] = \int_{\mathbf{R}^{n}} \rho(x) \log \rho(x) \, \mathrm{d} x

is the negative of the Gibbs–Boltzmann entropy functional. Even when the potential Ψ is not well-behaved enough for the partition function ''Z'' and the Gibbs measure μ∞ to be defined, the free energy ''F''[ρ(''t'', ·)] still makes sense for each time ''t'' ≥ 0, provided that the initial condition has ''F''[ρ(0, ·)] < +∞. The free energy functional ''F'' is, in fact, a Lyapunov function for the Fokker–Planck equation: ''F''[ρ(''t'', ·)] must decrease as ''t'' increases. Thus, ''F'' is an ''H''-function for the ''X''-dynamics.
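The Gibbs form of the invariant density is easy to test by simulation: run the gradient flow for a long time and compare the empirical distribution with Z^(-1) exp(−βΨ). The double-well potential Ψ(x) = (x² − 1)², the value of β and the run length in the sketch below are illustrative choices, and the run is only approximately stationary.

```python
import numpy as np

# Long-run simulation of dX_t = -Psi'(X_t) dt + sqrt(2/beta) dB_t for the illustrative
# double-well potential Psi(x) = (x^2 - 1)^2, compared against the Gibbs density.
beta = 2.0
Psi = lambda x: (x ** 2 - 1) ** 2
grad_Psi = lambda x: 4 * x * (x ** 2 - 1)

rng = np.random.default_rng(4)
dt, n_steps, n_paths = 1e-3, 20_000, 2_000
X = rng.normal(size=n_paths)                       # arbitrary initial condition
for _ in range(n_steps):
    X += -grad_Psi(X) * dt + rng.normal(scale=np.sqrt(2 / beta * dt), size=n_paths)

grid = np.linspace(-2.5, 2.5, 2001)
Z = np.exp(-beta * Psi(grid)).sum() * (grid[1] - grid[0])    # partition function on a grid
edges = np.linspace(-2.5, 2.5, 51)
hist, _ = np.histogram(X, bins=edges, density=True)
centres = 0.5 * (edges[1:] + edges[:-1])
for c, h in list(zip(centres, hist))[::10]:
    print(f"x = {c:+.2f}   empirical {h:.3f}   Gibbs {np.exp(-beta * Psi(c)) / Z:.3f}")
```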


Example

Consider the Ornstein–Uhlenbeck process ''X'' on R''n'' satisfying the stochastic differential equation

:\mathrm{d} X_{t} = - \kappa (X_{t} - m) \, \mathrm{d} t + \sqrt{2 \beta^{-1}} \, \mathrm{d} B_{t},

where ''m'' ∈ R''n'' and β, κ > 0 are given constants. In this case, the potential Ψ is given by

:\Psi(x) = \tfrac{1}{2} \kappa | x - m |^{2},

and so the invariant measure for ''X'' is a Gaussian measure with density ρ∞ given by

:\rho_{\infty}(x) = \left( \frac{\beta \kappa}{2 \pi} \right)^{n / 2} \exp \left( - \frac{\beta \kappa | x - m |^{2}}{2} \right).

Heuristically, for large ''t'', ''Xt'' is approximately normally distributed with mean ''m'' and variance (βκ)−1. The expression for the variance may be interpreted as follows: large values of κ mean that the potential well Ψ has "very steep sides", so ''Xt'' is unlikely to move far from the minimum of Ψ at ''m''; similarly, large values of β mean that the system is quite "cold" with little noise, so, again, ''Xt'' is unlikely to move far away from ''m''.
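A short simulation makes the heuristic concrete: after a time much longer than 1/κ, the sample mean and variance of an ensemble of one-dimensional Ornstein–Uhlenbeck paths should be close to ''m'' and (βκ)−1. The parameter values below are arbitrary.

```python
import numpy as np

# Ensemble check: for large t, the 1-d Ornstein-Uhlenbeck process is approximately
# N(m, 1 / (beta * kappa)).  Parameter values are illustrative.
m, beta, kappa = 1.5, 4.0, 2.0
rng = np.random.default_rng(5)
dt, n_steps, n_paths = 1e-3, 10_000, 20_000
X = np.zeros(n_paths)                              # start at 0, away from the mean m
for _ in range(n_steps):
    X += -kappa * (X - m) * dt + rng.normal(scale=np.sqrt(2 / beta * dt), size=n_paths)

print("sample mean    ", X.mean(), "  expected", m)
print("sample variance", X.var(), "  expected", 1 / (beta * kappa))
```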


The martingale property

In general, an Itô diffusion ''X'' is not a martingale. However, for any ''f'' ∈ ''C''2(R''n''; R) with compact support, the process ''M'' : [0, +∞) × Ω → R defined by

:M_{t} = f(X_{t}) - \int_{0}^{t} A f(X_{s}) \, \mathrm{d} s,

where ''A'' is the generator of ''X'', is a martingale with respect to the natural filtration ''F'' of (Ω, Σ) by ''X''. The proof is quite simple: it follows from the usual expression of the action of the generator on smooth enough functions ''f'' and Itô's lemma (the stochastic chain rule) that

:f(X_{t}) = f(x) + \int_{0}^{t} A f(X_{s}) \, \mathrm{d} s + \int_{0}^{t} \nabla f(X_{s})^{\top} \sigma(X_{s}) \, \mathrm{d} B_{s}.

Since Itô integrals are martingales with respect to the natural filtration Σ of (Ω, Σ) by ''B'', for ''t'' > ''s'',

:\mathbf{E}^{x} \big[ M_{t} \big| \Sigma_{s} \big] = M_{s}.

Hence, as required,

:\mathbf{E}^{x} \big[ M_{t} \big| F_{s} \big] = \mathbf{E}^{x} \left[ \mathbf{E}^{x} \big[ M_{t} \big| \Sigma_{s} \big] \Big| F_{s} \right] = \mathbf{E}^{x} \big[ M_{s} \big| F_{s} \big] = M_{s},

since ''Ms'' is ''Fs''-measurable.
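One easily testable consequence of the martingale property is that E^x[M_t] = f(x) for every ''t''. The sketch below (an addition) checks this for one-dimensional Brownian motion with f = sin, for which Af = −½ sin; a bounded smooth function is used here in place of a compactly supported one.

```python
import numpy as np

# Check that E^x[M_t] = f(x) for all t, where M_t = f(X_t) - int_0^t A f(X_s) ds.
# Here X is 1-d Brownian motion, f = sin (bounded and smooth), and A f = -(1/2) sin.
rng = np.random.default_rng(6)
x0, dt, n_steps, n_paths = 0.7, 1e-3, 2_000, 100_000
f, Af = np.sin, lambda x: -0.5 * np.sin(x)

X = np.full(n_paths, x0)
I = np.zeros(n_paths)                              # running integral of A f(X_s) ds
for k in range(1, n_steps + 1):
    I += Af(X) * dt                                # left-endpoint rule
    X += rng.normal(scale=np.sqrt(dt), size=n_paths)
    if k % 500 == 0:
        print(f"t = {k * dt:.1f}   E[M_t] ~ {(f(X) - I).mean():.4f}   f(x0) = {np.sin(x0):.4f}")
```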


Dynkin's formula

Dynkin's formula, named after Eugene Dynkin, gives the expected value of any suitably smooth statistic of an Itô diffusion ''X'' (with generator ''A'') at a stopping time. Precisely, if τ is a stopping time with E''x''[τ] < +∞, and ''f'' : R''n'' → R is ''C''2 with compact support, then

:\mathbf{E}^{x} [ f(X_{\tau}) ] = f(x) + \mathbf{E}^{x} \left[ \int_{0}^{\tau} A f(X_{s}) \, \mathrm{d} s \right].

Dynkin's formula can be used to calculate many useful statistics of stopping times. For example, canonical Brownian motion on the real line starting at 0 exits the interval (−''R'', +''R'') at a random time τ''R'' with expected value

:\mathbf{E}^{0} [ \tau_{R} ] = R^{2}.

Dynkin's formula provides information about the behaviour of ''X'' at a fairly general stopping time. For more information on the distribution of ''X'' at a hitting time, one can study the ''harmonic measure'' of the process.
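The exit-time identity E^0[τR] = R² above is easy to verify by simulation, as in the sketch below (not part of the original text); the step size, path count and value of ''R'' are arbitrary, and the estimate carries both Monte Carlo and time-discretisation error.

```python
import numpy as np

# Simulation check of E^0[tau_R] = R^2 for Brownian motion on the real line leaving (-R, +R).
rng = np.random.default_rng(7)
R, dt, n_paths = 1.5, 1e-3, 2_000

x = np.zeros(n_paths)
t = np.zeros(n_paths)
alive = np.ones(n_paths, dtype=bool)               # paths that have not yet left (-R, +R)
while alive.any():
    x[alive] += rng.normal(scale=np.sqrt(dt), size=alive.sum())
    t[alive] += dt
    alive &= (-R < x) & (x < R)

print(t.mean(), R ** 2)    # should agree up to Monte Carlo and discretisation error
```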


Associated measures


The harmonic measure

In many situations, it is sufficient to know when an Itô diffusion ''X'' will first leave a measurable set ''H'' ⊆ R''n''. That is, one wishes to study the first exit time

:\tau_{H}(\omega) = \inf \{ t \geq 0 \mid X_{t}(\omega) \not\in H \}.

Sometimes, however, one also wishes to know the distribution of the points at which ''X'' exits the set. For example, canonical Brownian motion ''B'' on the real line starting at 0 exits the interval (−1, 1) at −1 with probability ½ and at 1 with probability ½, so ''B''τ(−1, 1) is uniformly distributed on the set {−1, 1}.

In general, if ''G'' is compactly embedded within R''n'', then the harmonic measure (or hitting distribution) of ''X'' on the boundary ∂''G'' of ''G'' is the measure μ''G''''x'' defined by

:\mu_{G}^{x}(F) = \mathbf{P}^{x} \left[ X_{\tau_{G}} \in F \right]

for ''x'' ∈ ''G'' and ''F'' ⊆ ∂''G''.

Returning to the earlier example of Brownian motion, one can show that if ''B'' is a Brownian motion in R''n'' starting at ''x'' ∈ R''n'' and ''D'' ⊂ R''n'' is an open ball centred on ''x'', then the harmonic measure of ''B'' on ∂''D'' is invariant under all rotations of ''D'' about ''x'' and coincides with the normalized surface measure on ∂''D''.

The harmonic measure satisfies an interesting mean value property: if ''f'' : R''n'' → R is any bounded, Borel-measurable function and φ is given by

:\varphi(x) = \mathbf{E}^{x} \left[ f(X_{\tau_{H}}) \right],

then, for all Borel sets ''G'' ⊂⊂ ''H'' and all ''x'' ∈ ''G'',

:\varphi(x) = \int_{\partial G} \varphi(y) \, \mathrm{d} \mu_{G}^{x}(y).

The mean value property is very useful in the solution of partial differential equations using stochastic processes.
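The rotation-invariance of the harmonic measure on a ball can be seen in a simulation: start planar Brownian motion at the centre of the unit disc and record the angle at which each path first crosses the boundary; the angles should be roughly uniformly distributed. The sketch below (an addition) uses an arbitrary step size and path count.

```python
import numpy as np

# 2-d Brownian motion from the centre of the unit disc: record where each path first leaves
# the disc.  The exit angles should be roughly uniform, reflecting the rotation-invariance
# of the harmonic measure (here, normalized arc length on the boundary circle).
rng = np.random.default_rng(8)
dt, n_paths = 1e-4, 4_000

X = np.zeros((n_paths, 2))
angles = np.empty(n_paths)
alive = np.ones(n_paths, dtype=bool)
while alive.any():
    X[alive] += rng.normal(scale=np.sqrt(dt), size=(alive.sum(), 2))
    crossed = alive & (np.sum(X ** 2, axis=1) >= 1.0)
    angles[crossed] = np.arctan2(X[crossed, 1], X[crossed, 0])
    alive &= ~crossed

counts, _ = np.histogram(angles, bins=8, range=(-np.pi, np.pi))
print(counts)              # roughly equal counts in each angular sector
```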


The Green measure and Green formula

Let ''A'' be a partial differential operator on a domain ''D'' ⊆ R''n'' and let ''X'' be an Itô diffusion with ''A'' as its generator. Intuitively, the Green measure of a Borel set ''H'' is the expected length of time that ''X'' stays in ''H'' before it leaves the domain ''D''. That is, the Green measure of ''X'' with respect to ''D'' at ''x'', denoted ''G''(''x'', ·), is defined for Borel sets ''H'' ⊆ R''n'' by

:G(x, H) = \mathbf{E}^{x} \left[ \int_{0}^{\tau_{D}} \chi_{H}(X_{s}) \, \mathrm{d} s \right],

and for bounded, continuous functions ''f'' : ''D'' → R by

:\int_{D} f(y) \, G(x, \mathrm{d} y) = \mathbf{E}^{x} \left[ \int_{0}^{\tau_{D}} f(X_{s}) \, \mathrm{d} s \right].

The name "Green measure" comes from the fact that if ''X'' is Brownian motion, then

:G(x, H) = \int_{H} G(x, y) \, \mathrm{d} y,

where ''G''(''x'', ''y'') is Green's function for the operator ½Δ on the domain ''D''.

Suppose that E''x''[τ''D''] < +∞ for all ''x'' ∈ ''D''. Then the Green formula holds for all ''f'' ∈ ''C''2(R''n''; R) with compact support:

:f(x) = \mathbf{E}^{x} \left[ f \left( X_{\tau_{D}} \right) \right] - \int_{D} A f(y) \, G(x, \mathrm{d} y).

In particular, if the support of ''f'' is compactly embedded in ''D'',

:f(x) = - \int_{D} A f(y) \, G(x, \mathrm{d} y).
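The occupation-time description of the Green measure lends itself to simulation. For one-dimensional Brownian motion on ''D'' = (0, 1), the Green's function of ½Δ with Dirichlet boundary conditions has the standard closed form G(x, y) = 2 min(x, y)(1 − max(x, y)), so the average time spent in a subinterval before exiting (0, 1) can be checked against the corresponding integral. The sketch below is an added illustration with arbitrary parameters.

```python
import numpy as np

# Occupation-time estimate of the Green measure for 1-d Brownian motion on D = (0, 1):
# G(x0, (a, b)) is estimated as the mean time spent in (a, b) before leaving D, and compared
# with the integral of G(x0, y) = 2 min(x0, y) (1 - max(x0, y)) over (a, b).
rng = np.random.default_rng(9)
x0, a, b = 0.3, 0.4, 0.7
dt, n_paths = 1e-4, 4_000

occ = np.zeros(n_paths)
X = np.full(n_paths, x0)
alive = np.ones(n_paths, dtype=bool)
while alive.any():
    occ[alive] += dt * ((a < X[alive]) & (X[alive] < b))   # time spent in (a, b) this step
    X[alive] += rng.normal(scale=np.sqrt(dt), size=alive.sum())
    alive &= (0.0 < X) & (X < 1.0)

ys = np.linspace(a, b, 2001)
green = 2 * np.minimum(x0, ys) * (1 - np.maximum(x0, ys))
exact = green.sum() * (ys[1] - ys[0])                      # Riemann sum of the Green's function
print(occ.mean(), exact)   # the two numbers should agree approximately
```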


See also

* Diffusion process


References

* Dynkin, E. B. (1965). ''Markov Processes''. Vols. I, II. Berlin: Springer.
* Øksendal, B. K. (2003). ''Stochastic Differential Equations: An Introduction with Applications'' (Sixth ed.). Berlin: Springer. (See Sections 7, 8 and 9)