In probability theory, a martingale is a sequence of random variables (i.e., a stochastic process) for which, at a particular time, the conditional expectation of the next value in the sequence is equal to the present value, regardless of all prior values.


History

Originally, ''martingale'' referred to a class of betting strategies that was popular in 18th-century France. The simplest of these strategies was designed for a game in which the gambler wins their stake if a coin comes up heads and loses it if the coin comes up tails. The strategy had the gambler double their bet after every loss so that the first win would recover all previous losses plus win a profit equal to the original stake. As the gambler's wealth and available time jointly approach infinity, their probability of eventually flipping heads approaches 1, which makes the martingale betting strategy seem like a sure thing. However, the exponential growth of the bets eventually bankrupts its users, whose bankrolls are finite. Stopped Brownian motion, which is a martingale process, can be used to model the trajectory of such games.

The concept of the martingale in probability theory was introduced by Paul Lévy in 1934, though he did not name it. The term "martingale" was introduced later by Ville (1939), who also extended the definition to continuous martingales. Much of the original development of the theory was done by Joseph Leo Doob, among others. Part of the motivation for that work was to show the impossibility of successful betting strategies in games of chance.
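The ruin mechanism can be made concrete with a small illustrative simulation (the bankroll, base stake, and function name below are arbitrary choices, not from any source). Each session doubles the bet after every loss and stops at the first win, at the first bet the bankroll cannot cover, or at a round limit:

import random

def martingale_session(bankroll=1000, base_stake=1, max_rounds=10_000, p_heads=0.5):
    """Play the doubling strategy until the first win, ruin, or the round limit."""
    stake = base_stake
    for _ in range(max_rounds):
        if stake > bankroll:             # cannot cover the next doubled bet
            return bankroll
        if random.random() < p_heads:    # heads: win the current stake, stop
            return bankroll + stake
        bankroll -= stake                # tails: lose the stake and double it
        stake *= 2
    return bankroll

results = [martingale_session() for _ in range(100_000)]
print(sum(results) / len(results))       # close to 1000: small frequent wins are offset by rare large losses

Averaged over many sessions, the final bankroll stays close to the starting bankroll: with a finite bankroll the doubling scheme creates no expected profit.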


Definitions

A basic definition of a discrete-time martingale is a discrete-time stochastic process (i.e., a sequence of random variables) ''X''1, ''X''2, ''X''3, ... that satisfies for any time ''n'',

:\mathbf{E} ( \vert X_n \vert ) < \infty
:\mathbf{E} ( X_{n+1} \mid X_1,\ldots,X_n ) = X_n.

That is, the conditional expected value of the next observation, given all the past observations, is equal to the most recent observation.
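As an illustrative check (the horizon below is an arbitrary choice), the defining property can be verified by brute force for a short fair ±1 random walk: grouping all equally likely step sequences by their history and averaging the next value of the walk reproduces the current value.

from itertools import product

n = 4  # horizon; every sequence of n fair +/-1 steps is equally likely

# X_k is the partial sum of the first k steps; check E(X_{k+1} | X_1, ..., X_k) = X_k
for k in range(1, n):
    groups = {}  # history of the first k steps  ->  list of X_{k+1} values
    for steps in product([+1, -1], repeat=n):
        groups.setdefault(steps[:k], []).append(sum(steps[:k + 1]))
    for history, next_values in groups.items():
        x_k = sum(history)
        assert abs(sum(next_values) / len(next_values) - x_k) < 1e-12

print("E(X_{k+1} | history) equals X_k for every history")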


Martingale sequences with respect to another sequence

More generally, a sequence ''Y''1, ''Y''2, ''Y''3, ... is said to be a martingale with respect to another sequence ''X''1, ''X''2, ''X''3, ... if for all ''n''

:\mathbf{E} ( \vert Y_n \vert ) < \infty
:\mathbf{E} ( Y_{n+1} \mid X_1,\ldots,X_n ) = Y_n.

Similarly, a continuous-time martingale with respect to the stochastic process ''Xt'' is a stochastic process ''Yt'' such that for all ''t''

:\mathbf{E} ( \vert Y_t \vert ) < \infty
:\mathbf{E} ( Y_t \mid \{ X_\tau : \tau \le s \} ) = Y_s \quad \forall s \le t.

This expresses the property that the conditional expectation of an observation at time ''t'', given all the observations up to time ''s'', is equal to the observation at time ''s'' (of course, provided that ''s'' ≤ ''t''). Note that the second property implies that Y_n is measurable with respect to X_1, \dots, X_n.
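The "with respect to" form can be checked numerically in the same brute-force way as above when the martingale is a function of the underlying tosses rather than the tosses themselves. This illustrative sketch uses Y_k = S_k^2 - k, where S_k is the fair-coin fortune, a sequence that reappears among the examples below:

from itertools import product

n = 5  # fair +/-1 tosses X_1, ..., X_n;  S_k = X_1 + ... + X_k,  Y_k = S_k**2 - k
for k in range(1, n):
    groups = {}  # tosses X_1..X_k  ->  Y_{k+1} values over all continuations
    for x in product([+1, -1], repeat=n):
        s_next = sum(x[:k + 1])
        groups.setdefault(x[:k], []).append(s_next ** 2 - (k + 1))
    for history, values in groups.items():
        y_k = sum(history) ** 2 - k
        assert abs(sum(values) / len(values) - y_k) < 1e-12  # E(Y_{k+1} | X_1..X_k) = Y_k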


General definition

In full generality, a stochastic process Y:T\times\Omega\to S taking values in a Banach space S with norm \lVert \cdot \rVert_S is a martingale with respect to a filtration \Sigma_* and probability measure \mathbb P if

* Σ∗ is a filtration of the underlying probability space (Ω, Σ, \mathbb P);
* ''Y'' is adapted to the filtration Σ∗, i.e., for each ''t'' in the index set ''T'', the random variable ''Yt'' is a Σ''t''-measurable function;
* for each ''t'', ''Yt'' lies in the ''Lp'' space ''L''1(Ω, Σ''t'', \mathbb P; ''S''), i.e.
::\mathbf{E}_{\mathbb P} ( \lVert Y_t \rVert_S ) < + \infty;
* for all ''s'' and ''t'' with ''s'' < ''t'' and all ''F'' ∈ Σ''s'',
::\mathbf{E}_{\mathbb P} \left( [ Y_t - Y_s ] \chi_F \right) = 0,
:where ''χF'' denotes the indicator function of the event ''F''. In Grimmett and Stirzaker's ''Probability and Random Processes'', this last condition is denoted as
::Y_s = \mathbf{E}_{\mathbb P} ( Y_t \mid \Sigma_s ),
:which is a general form of conditional expectation.

It is important to note that the property of being a martingale involves both the filtration ''and'' the probability measure (with respect to which the expectations are taken). It is possible that ''Y'' could be a martingale with respect to one measure but not another one; the Girsanov theorem offers a way to find a measure with respect to which an Itō process is a martingale.
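The indicator-function condition can be checked numerically in simple cases. The following Monte Carlo sketch is only an illustration, with arbitrarily chosen times ''s'' and ''t'' and an arbitrarily chosen event ''F'' that is decided by time ''s'', for a fair ±1 random walk:

import random

# Estimate E[(Y_t - Y_s) * chi_F] for a fair +/-1 random walk, with s = 2, t = 5
# and F = {Y_2 = 2}, an event in Sigma_2 (i.e. decided by time s).
s, t, trials = 2, 5, 200_000
total = 0.0
for _ in range(trials):
    y, path = 0, []
    for _ in range(t):
        y += random.choice((+1, -1))
        path.append(y)
    chi_f = 1.0 if path[s - 1] == 2 else 0.0
    total += (path[t - 1] - path[s - 1]) * chi_f
print(total / trials)  # close to 0, as the martingale condition requires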


Examples of martingales

* An unbiased random walk (in any number of dimensions) is an example of a martingale.
* A gambler's fortune (capital) is a martingale if all the betting games which the gambler plays are fair. To be more specific: suppose ''Xn'' is a gambler's fortune after ''n'' tosses of a fair coin, where the gambler wins $1 if the coin comes up heads and loses $1 if it comes up tails. The gambler's conditional expected fortune after the next trial, given the history, is equal to their present fortune. This sequence is thus a martingale.
* Let ''Yn'' = ''Xn''2 − ''n'' where ''Xn'' is the gambler's fortune from the preceding example. Then the sequence ''Y''1, ''Y''2, ''Y''3, ... is a martingale. This can be used to show that the gambler's total gain or loss varies roughly between plus or minus the square root of the number of steps.
* (de Moivre's martingale) Now suppose the coin is unfair, i.e., biased, with probability ''p'' of coming up heads and probability ''q'' = 1 − ''p'' of tails. Let
::X_{n+1} = X_n \pm 1
:with "+" in case of "heads" and "−" in case of "tails". Let
::Y_n = (q/p)^{X_n}.
:Then ''Y''1, ''Y''2, ''Y''3, ... is a martingale with respect to ''X''1, ''X''2, ''X''3, .... To show this,
::\begin{align} E[ Y_{n+1} \mid X_1,\dots,X_n ] & = p (q/p)^{X_n+1} + q (q/p)^{X_n-1} \\ & = p (q/p) (q/p)^{X_n} + q (p/q) (q/p)^{X_n} \\ & = q (q/p)^{X_n} + p (q/p)^{X_n} = (q/p)^{X_n} = Y_n. \end{align}
:(A numerical illustration appears after this list.)
* Pólya's urn contains a number of different-coloured marbles; at each iteration a marble is randomly selected from the urn and replaced with several more of that same colour. For any given colour, the fraction of marbles in the urn with that colour is a martingale. For example, if currently 95% of the marbles are red then, though the next iteration is more likely to add red marbles than another colour, this bias is exactly balanced out by the fact that adding more red marbles alters the fraction much less significantly than adding the same number of non-red marbles would.
* (Likelihood-ratio testing in statistics) A random variable ''X'' is thought to be distributed according either to probability density ''f'' or to a different probability density ''g''. A random sample ''X''1, ..., ''Xn'' is taken. Let ''Yn'' be the "likelihood ratio"
::Y_n = \prod_{i=1}^n \frac{g(X_i)}{f(X_i)}.
:If ''X'' is actually distributed according to the density ''f'' rather than according to ''g'', then ''Y''1, ''Y''2, ''Y''3, ... is a martingale with respect to ''X''1, ''X''2, ''X''3, ....
* In an ecological community (a group of species that are in a particular trophic level, competing for similar resources in a local area), the number of individuals of any particular species of fixed size is a function of (discrete) time, and may be viewed as a sequence of random variables. This sequence is a martingale under the unified neutral theory of biodiversity and biogeography.
* If ''Nt'' is a Poisson process with intensity ''λ'', then the compensated Poisson process ''Nt'' − ''λt'' is a continuous-time martingale with right-continuous/left-limit sample paths.
* Wald's martingale


Submartingales, supermartingales, and relationship to harmonic functions

There are two popular generalizations of a martingale that also include cases when the current observation ''Xn'' is not necessarily equal to the future conditional expectation ''E''[''X''''n''+1 | ''X''1, ..., ''Xn''] but instead an upper or lower bound on the conditional expectation. These definitions reflect a relationship between martingale theory and potential theory, which is the study of harmonic functions. Just as a continuous-time martingale satisfies E[''Xt'' | {''Xτ'' : ''τ'' ≤ ''s''}] − ''Xs'' = 0 ∀''s'' ≤ ''t'', a harmonic function ''f'' satisfies the partial differential equation Δ''f'' = 0 where Δ is the Laplacian operator. Given a Brownian motion process ''Wt'' and a harmonic function ''f'', the resulting process ''f''(''Wt'') is also a martingale.

* A discrete-time submartingale is a sequence X_1, X_2, X_3, \ldots of integrable random variables satisfying
::\operatorname E [ X_{n+1} \mid X_1,\ldots,X_n ] \ge X_n.
:Likewise, a continuous-time submartingale satisfies
::\operatorname E [ X_t \mid \{ X_\tau : \tau \le s \} ] \ge X_s \quad \forall s \le t.
:In potential theory, a subharmonic function ''f'' satisfies Δ''f'' ≥ 0. Any subharmonic function that is bounded above by a harmonic function for all points on the boundary of a ball is bounded above by the harmonic function for all points inside the ball. Similarly, if a submartingale and a martingale have equivalent expectations for a given time, the history of the submartingale tends to be bounded above by the history of the martingale. Roughly speaking, the prefix "sub-" is consistent because the current observation ''Xn'' is ''less than'' (or equal to) the conditional expectation ''E''[''X''''n''+1 | ''X''1, ..., ''Xn'']. Consequently, the current observation provides support ''from below'' for the future conditional expectation, and the process tends to increase in future time. (A short worked verification follows this list.)
* Analogously, a discrete-time supermartingale satisfies
::\operatorname E [ X_{n+1} \mid X_1,\ldots,X_n ] \le X_n.
:Likewise, a continuous-time supermartingale satisfies
::\operatorname E [ X_t \mid \{ X_\tau : \tau \le s \} ] \le X_s \quad \forall s \le t.
:In potential theory, a superharmonic function ''f'' satisfies Δ''f'' ≤ 0. Any superharmonic function that is bounded below by a harmonic function for all points on the boundary of a ball is bounded below by the harmonic function for all points inside the ball. Similarly, if a supermartingale and a martingale have equivalent expectations for a given time, the history of the supermartingale tends to be bounded below by the history of the martingale. Roughly speaking, the prefix "super-" is consistent because the current observation ''Xn'' is ''greater than'' (or equal to) the conditional expectation ''E''[''X''''n''+1 | ''X''1, ..., ''Xn'']. Consequently, the current observation provides support ''from above'' for the future conditional expectation, and the process tends to decrease in future time.
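As a concrete check of the discrete-time inequality, consider the fair-coin gambler's fortune ''Xn'' from the examples above (a worked example added for illustration). The squared fortune is a submartingale:

::\begin{align} \operatorname E [ X_{n+1}^2 \mid X_1,\ldots,X_n ] & = \tfrac{1}{2} (X_n+1)^2 + \tfrac{1}{2} (X_n-1)^2 \\ & = X_n^2 + 1 \;\ge\; X_n^2 , \end{align}

:and subtracting the compensating term ''n'' + 1 from ''X''''n''+12 recovers the martingale ''Xn''2 − ''n'' mentioned earlier.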


Examples of submartingales and supermartingales

* Every martingale is also a submartingale and a supermartingale. Conversely, any stochastic process that is ''both'' a submartingale and a supermartingale is a martingale.
* Consider again the gambler who wins $1 when a coin comes up heads and loses $1 when the coin comes up tails. Suppose now that the coin may be biased, so that it comes up heads with probability ''p'' (see the simulation sketch after this list).
** If ''p'' is equal to 1/2, the gambler on average neither wins nor loses money, and the gambler's fortune over time is a martingale.
** If ''p'' is less than 1/2, the gambler loses money on average, and the gambler's fortune over time is a supermartingale.
** If ''p'' is greater than 1/2, the gambler wins money on average, and the gambler's fortune over time is a submartingale.
* A convex function of a martingale is a submartingale, by Jensen's inequality. For example, the square of the gambler's fortune in the fair coin game is a submartingale (which also follows from the fact that ''Xn''2 − ''n'' is a martingale). Similarly, a concave function of a martingale is a supermartingale.
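The three biased-coin cases can be seen in a small simulation (an illustrative sketch; the biases, horizon, and trial count are arbitrary): the sample mean of the fortune drifts downward for ''p'' < 1/2, stays flat for ''p'' = 1/2, and drifts upward for ''p'' > 1/2.

import random

# Average fortune after `steps` +/-1 bets for three coin biases.
steps, trials = 50, 100_000
for p in (0.45, 0.50, 0.55):
    mean = sum(
        sum(1 if random.random() < p else -1 for _ in range(steps))
        for _ in range(trials)
    ) / trials
    print(p, round(mean, 2))  # roughly steps * (2p - 1): about -5, 0, +5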


Martingales and stopping times

A stopping time with respect to a sequence of random variables ''X''1, ''X''2, ''X''3, ... is a random variable ''τ'' with the property that for each ''t'', the occurrence or non-occurrence of the event ''τ'' = ''t'' depends only on the values of ''X''1, ''X''2, ''X''3, ..., ''Xt''. The intuition behind the definition is that at any particular time ''t'', you can look at the sequence so far and tell if it is time to stop. An example in real life might be the time at which a gambler leaves the gambling table, which might be a function of their previous winnings (for example, they might leave only when they go broke), but they can't choose to go or stay based on the outcome of games that haven't been played yet.

In some contexts the concept of ''stopping time'' is defined by requiring only that the occurrence or non-occurrence of the event ''τ'' = ''t'' is probabilistically independent of ''Xt''+1, ''Xt''+2, ..., but not that it is completely determined by the history of the process up to time ''t''. That is a weaker condition than the one appearing in the paragraph above, but is strong enough to serve in some of the proofs in which stopping times are used.

One of the basic properties of martingales is that, if (X_t)_{t>0} is a (sub-/super-) martingale and \tau is a stopping time, then the corresponding stopped process (X_t^\tau)_{t>0} defined by X_t^\tau := X_{\min\{\tau, t\}} is also a (sub-/super-) martingale. The concept of a stopped martingale leads to a series of important theorems, including, for example, the optional stopping theorem which states that, under certain conditions, the expected value of a martingale at a stopping time is equal to its initial value.
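The optional stopping theorem can be illustrated numerically (an added sketch; the barriers and trial count are arbitrary) with a fair ±1 random walk started at 0 and stopped when it first hits −''a'' or +''b'':

import random

# Empirical check: the mean value of the walk at the hitting time of {-a, +b}
# stays close to the starting value 0, as the optional stopping theorem predicts.
a, b, trials = 5, 15, 50_000
total = 0
for _ in range(trials):
    x = 0
    while -a < x < b:
        x += random.choice((+1, -1))
    total += x
print(total / trials)  # close to 0; equivalently, P(hit +b) is about a / (a + b)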


See also

* Azuma's inequality
* Brownian motion
* Doob martingale
* Doob's martingale convergence theorems
* Doob's martingale inequality
* Doob–Meyer decomposition theorem
* Local martingale
* Markov chain
* Markov property
* Martingale (betting system)
* Martingale central limit theorem
* Martingale difference sequence
* Martingale representation theorem
* Normal number
* Semimartingale

