Markov Property

In probability theory and statistics, the term Markov property refers to the memoryless property of a stochastic process, which means that its future evolution is independent of its history. It is named after the Russian mathematician Andrey Markov. The term strong Markov property is similar to the Markov property, except that the meaning of "present" is defined in terms of a random variable known as a stopping time. The term Markov assumption is used to describe a model where the Markov property is assumed to hold, such as a hidden Markov model. A Markov random field extends this property to two or more dimensions or to random variables defined for an interconnected network of items. An example of a model for such a field is the Ising model. A discrete-time stochastic process satisfying the Markov property is known as a Markov chain.


Introduction

A stochastic process has the Markov property if the conditional probability distribution of future states of the process (conditional on both past and present values) depends only upon the present state; that is, given the present, the future does not depend on the past. A process with this property is said to be Markov or Markovian and known as a Markov process. Two famous classes of Markov process are the Markov chain and Brownian motion. There is a subtle but important point that is often missed in the plain-English statement of the definition: the state space of the process is constant through time, so the conditional description involves a fixed "bandwidth". Without this restriction, any process could be augmented to carry its complete history from a given initial condition and would thereby be made Markovian; but its state space would grow in dimensionality over time, and such a construction does not meet the definition.
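For instance, a simple random walk is Markovian: the distribution of the next position depends only on the current position, never on the path taken to reach it. The following minimal Python sketch (an illustration of this point; the function name is my own, not from any library) also makes the fixed state space explicit, since the state is always a single integer:

```python
import random

def random_walk_step(current_state):
    """Advance a simple random walk by one step.

    The next state depends only on current_state, not on any
    earlier history -- this is exactly the Markov property.
    """
    return current_state + random.choice([-1, +1])

# Two walks that share the same present state but different pasts
# have identically distributed futures.
state = 0
history = [state]
for _ in range(10):
    state = random_walk_step(state)   # the history list is never consulted
    history.append(state)
print(history)
```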


Definition

Let (\Omega,\mathcal{F},P) be a probability space with a filtration (\mathcal{F}_s,\ s \in I), for some (totally ordered) index set I, and let (S,\mathcal{S}) be a measurable space. An (S,\mathcal{S})-valued stochastic process X=\{X_t\}_{t\in I} adapted to the filtration is said to possess the ''Markov property'' if, for each A \in \mathcal{S} and each s,t\in I with s<t,

:P(X_t \in A \mid \mathcal{F}_s) = P(X_t \in A \mid X_s).

In the case where S is a discrete set with the discrete sigma algebra and I = \mathbb{N}, this can be reformulated as follows:

:P(X_{n+1}=x_{n+1} \mid X_n=x_n, \dots, X_1=x_1) = P(X_{n+1}=x_{n+1} \mid X_n=x_n) \quad \text{for all } n \in \mathbb{N}.


Alternative formulations

Alternatively, the Markov property can be formulated as follows:

:\operatorname{E}[f(X_t) \mid \mathcal{F}_s] = \operatorname{E}[f(X_t) \mid \sigma(X_s)]

for all t \geq s \geq 0 and f:S\rightarrow\mathbb{R} bounded and measurable.


Strong Markov property

Suppose that X=(X_t : t\geq 0) is a stochastic process on a probability space (\Omega,\mathcal{F},P) with natural filtration \{\mathcal{F}_t\}_{t\geq 0}. Then for any stopping time \tau on \Omega, we can define

:\mathcal{F}_\tau = \{A \in \mathcal{F} : \{\tau \leq t\} \cap A \in \mathcal{F}_t \text{ for all } t \geq 0\}.

Then X is said to have the strong Markov property if, for each stopping time \tau, conditional on the event \{\tau < \infty\}, we have that for each t\geq 0, X_{\tau+t} is independent of \mathcal{F}_\tau given X_\tau. The strong Markov property implies the ordinary Markov property, since the ordinary property can be deduced by taking the deterministic stopping time \tau = t.
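As a numerical illustration (my own sketch, not from the article; the stopping time and parameters are assumptions chosen for demonstration), the following snippet stops a simple random walk at the first hitting time of level +3 and checks that the restarted walk has mean-zero increments regardless of how large \tau turned out to be:

```python
import random

# Sketch of the strong Markov property for a simple random walk:
# with tau = first hitting time of level +3, the walk restarted at
# tau is again a simple random walk, independent of F_tau given X_tau.
def post_tau_increment(horizon=5, level=3, max_steps=10_000):
    """Return (tau, X_{tau+horizon} - X_tau) for one walk, or None."""
    x, t = 0, 0
    while x != level:
        if t >= max_steps:
            return None          # tau effectively infinite; discard
        x += random.choice([-1, 1])
        t += 1
    tau, start = t, x
    for _ in range(horizon):
        x += random.choice([-1, 1])
    return tau, x - start

random.seed(0)
samples = [s for s in (post_tau_increment() for _ in range(20_000)) if s]

# The mean post-tau increment should be ~0 whether tau was small or large.
early = [inc for tau, inc in samples if tau <= 10]
late = [inc for tau, inc in samples if tau > 10]
print(sum(early) / len(early), sum(late) / len(late))
```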


In forecasting

In the fields of predictive modelling and probabilistic forecasting, the Markov property is considered desirable, since it can make tractable problems that would otherwise be impossible to resolve. Such a model is known as a Markov model.


Examples

Assume that an urn contains two red balls and one green ball. One ball was drawn yesterday, one ball was drawn today, and the final ball will be drawn tomorrow. All of the draws are "without replacement". Suppose you know that today's ball was red, but you have no information about yesterday's ball. The chance that tomorrow's ball will be red is 1/2. That's because the only two remaining outcomes for this random experiment are that either yesterday's ball was red and tomorrow's will be green, or yesterday's ball was green and tomorrow's will be red. On the other hand, if you know that both today's and yesterday's balls were red, then you are guaranteed to get a green ball tomorrow. This discrepancy shows that the probability distribution for tomorrow's color depends not only on the present value, but is also affected by information about the past. This stochastic process of observed colors therefore does not have the Markov property. Using the same experiment, if sampling "without replacement" is changed to sampling "with replacement", the process of observed colors does have the Markov property. An application of the Markov property in a generalized form is in Markov chain Monte Carlo computations in the context of Bayesian statistics.
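The urn example is easy to verify numerically. Below is a minimal Monte Carlo sketch (an illustration of the example above; the helper name draw_three is my own assumption):

```python
import random

# The urn holds two red balls and one green ball, drawn without
# replacement; a random shuffle gives one full three-day outcome.
def draw_three():
    balls = ["red", "red", "green"]
    random.shuffle(balls)
    return balls  # (yesterday, today, tomorrow)

random.seed(0)
trials = [draw_three() for _ in range(100_000)]

# P(tomorrow red | today red): should be about 1/2.
today_red = [t for t in trials if t[1] == "red"]
print(sum(t[2] == "red" for t in today_red) / len(today_red))

# P(tomorrow red | today red AND yesterday red): exactly 0, so the
# past adds information and the observed-color process is not Markov.
both_red = [t for t in trials if t[0] == "red" and t[1] == "red"]
print(sum(t[2] == "red" for t in both_red) / len(both_red))
```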


See also

* Causal Markov condition
* Chapman–Kolmogorov equation
* Hysteresis
* Markov blanket
* Markov chain
* Markov decision process
* Markov model

