In probability theory and statistics, the term Markov property refers to the memoryless property of a stochastic process. It is named after the Russian mathematician Andrey Markov. The term strong Markov property is similar to the Markov property, except that the meaning of "present" is defined in terms of a random variable known as a stopping time.
The term Markov assumption is used to describe a model where the Markov property is assumed to hold, such as a hidden Markov model.
A Markov random field extends this property to two or more dimensions or to random variables defined for an interconnected network of items. An example of a model for such a field is the Ising model.
A discrete-time stochastic process satisfying the Markov property is known as a Markov chain.
Introduction

A stochastic process has the Markov property if the conditional probability distribution of future states of the process (conditional on both past and present values) depends only upon the present state; that is, given the present, the future does not depend on the past. A process with this property is said to be Markov or Markovian and known as a Markov process. The most famous Markov process is a Markov chain. Brownian motion is another well-known Markov process.
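The memoryless condition can be checked empirically. The following sketch (an illustrative two-state chain with an arbitrary, invented transition matrix) simulates a long path and compares the conditional frequency of moving to state 1 given the present state, with and without extra conditioning on the previous state:

```python
import random

random.seed(0)

# Hypothetical two-state chain; P[i][j] = P(next = j | current = i).
P = [[0.7, 0.3],
     [0.4, 0.6]]

def simulate(n, state=0):
    """Generate a path of length n from the chain."""
    path = [state]
    for _ in range(n - 1):
        state = 0 if random.random() < P[state][0] else 1
        path.append(state)
    return path

path = simulate(200_000)

def cond_prob(prev):
    """Empirical P(X_{t+1} = 1 | X_t = 1, X_{t-1} = prev).
    The Markov property says the conditioning on X_{t-1} should not matter."""
    hits = [path[t + 1] for t in range(1, len(path) - 1)
            if path[t] == 1 and path[t - 1] == prev]
    return sum(hits) / len(hits)

print(cond_prob(0), cond_prob(1))  # both should be close to P[1][1] = 0.6
```

Both estimates approximate the single transition probability `P[1][1] = 0.6`, reflecting that knowledge of the past adds no information beyond the present state.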
Definition

Let $(\Omega, \mathcal{F}, P)$ be a probability space with a filtration $(\mathcal{F}_s,\ s \in I)$, for some (totally ordered) index set $I$; and let $(S, \mathcal{S})$ be a measurable space. An $(S, \mathcal{S})$-valued stochastic process $X = \{X_t\}_{t \in I}$ adapted to the filtration is said to possess the Markov property if, for each $A \in \mathcal{S}$ and each $s, t \in I$ with $s < t$,

$$P(X_t \in A \mid \mathcal{F}_s) = P(X_t \in A \mid X_s).$$

In the case where $S$ is a discrete set with the discrete sigma algebra and $I = \mathbb{N}$, this can be reformulated as follows:

$$P(X_n = x_n \mid X_{n-1} = x_{n-1}, \dots, X_0 = x_0) = P(X_n = x_n \mid X_{n-1} = x_{n-1}).$$

Alternative formulations

Alternatively, the Markov property can be formulated as follows:

$$\operatorname{E}[f(X_t) \mid \mathcal{F}_s] = \operatorname{E}[f(X_t) \mid \sigma(X_s)]$$

for all $t \geq s \geq 0$ and $f : S \to \mathbb{R}$ bounded and measurable.

Strong Markov property

Suppose that $X = (X_t : t \geq 0)$ is a stochastic process on a probability space $(\Omega, \mathcal{F}, P)$ with natural filtration $\{\mathcal{F}_t\}_{t \geq 0}$. Then for any stopping time $\tau$ on $\Omega$, we can define

$$\mathcal{F}_\tau = \{A \in \mathcal{F} : \{\tau \leq t\} \cap A \in \mathcal{F}_t \text{ for all } t \geq 0\}.$$
Then $X$ is said to have the strong Markov property if, for each stopping time $\tau$, conditional on the event $\{\tau < \infty\}$, we have that for each $t \geq 0$, $X_{\tau + t}$ is independent of $\mathcal{F}_\tau$ given $X_\tau$.
The strong Markov property implies the ordinary Markov property: taking the deterministic stopping time $\tau = t$ recovers the ordinary Markov property.
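The strong Markov property can be probed by simulation. The sketch below uses a symmetric simple random walk with the stopping time $\tau$ = first hitting time of level 3 (capped for practicality; the choice of level and cap is illustrative). It checks that the walk restarted at $\tau$ drifts like a fresh symmetric walk whether $\tau$ turned out small or large:

```python
import random

random.seed(1)

def sample(level=3, cap=2000, extra=50):
    """Run a symmetric random walk from 0 until it first hits `level`
    (a stopping time tau, capped at `cap` steps), then take `extra` more
    steps.  Returns (tau, displacement after tau), or None if capped."""
    x, t = 0, 0
    while x != level:
        if t >= cap:
            return None
        x += random.choice((-1, 1))
        t += 1
    tau = t
    for _ in range(extra):
        x += random.choice((-1, 1))
    return tau, x - level

samples = [s for s in (sample() for _ in range(4000)) if s is not None]

# Strong Markov property: conditional on {tau < infinity}, the restarted walk
# X_{tau+t} - X_tau is again a symmetric random walk, independent of F_tau,
# so its displacement should average ~0 whether tau was small or large.
small = [d for t, d in samples if t <= 5]
large = [d for t, d in samples if t > 5]
print(sum(small) / len(small), sum(large) / len(large))  # both near 0
```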
In forecasting

In the fields of predictive modelling and probabilistic forecasting, the Markov property is considered desirable, since it can make an otherwise intractable reasoning or inference problem solvable. Such a model is known as a Markov model.
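As a minimal forecasting sketch, consider a hypothetical two-state weather chain (the transition probabilities below are invented for illustration). The Markov property means tomorrow's forecast needs only today's state distribution, not the full history:

```python
# Hypothetical weather chain: P[i][j] = P(next state = j | current state = i).
states = ("sunny", "rainy")
P = {"sunny": {"sunny": 0.8, "rainy": 0.2},
     "rainy": {"sunny": 0.5, "rainy": 0.5}}

def forecast(dist, days):
    """Propagate a distribution over states `days` steps forward.
    By the Markov property, only the current distribution is needed."""
    for _ in range(days):
        dist = {j: sum(dist[i] * P[i][j] for i in states) for j in states}
    return dist

today = {"sunny": 1.0, "rainy": 0.0}   # it is sunny today
print(forecast(today, 3))              # sunny probability is approximately 0.722
```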
Examples

Assume that an urn contains two red balls and one green ball. One ball was drawn yesterday, one ball was drawn today, and the final ball will be drawn tomorrow. All of the draws are "without replacement".

Suppose you know that today's ball was red, but you have no information about yesterday's ball. The chance that tomorrow's ball will be red is 1/2. That's because the only two remaining, equally likely outcomes for this random experiment are that yesterday's ball was red and tomorrow's is green, or yesterday's ball was green and tomorrow's is red. On the other hand, if you know that both today's and yesterday's balls were red, then you are guaranteed to get a green ball tomorrow. This discrepancy shows that the probability distribution for tomorrow's color depends not only on the present value, but is also affected by information about the past. This stochastic process of observed colors doesn't have the Markov property.

Using the same experiment above, if sampling "without replacement" is changed to sampling "with replacement", the process of observed colors will have the Markov property.

An application of the Markov property in a generalized form is in Markov chain Monte Carlo computations in the context of Bayesian statistics.
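The urn probabilities above can be verified by enumerating the equally likely draw orders (a small sketch, not part of the original example):

```python
from itertools import permutations

# All equally likely draw orders (yesterday, today, tomorrow) of the urn
# {red, red, green} without replacement; permutations treats the two red
# balls as distinct, so each of the 6 orderings is equally likely.
orders = list(permutations(["red", "red", "green"]))

# P(tomorrow red | today red): should be 1/2.
given_today = [o for o in orders if o[1] == "red"]
p1 = sum(o[2] == "red" for o in given_today) / len(given_today)

# P(tomorrow red | today red AND yesterday red): should be 0.
given_both = [o for o in orders if o[0] == "red" and o[1] == "red"]
p2 = sum(o[2] == "red" for o in given_both) / len(given_both)

print(p1, p2)  # 0.5 0.0 — extra knowledge of the past changes the answer
```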
See also

* Causal Markov condition
* Chapman–Kolmogorov equation
* Hysteresis
* Markov blanket
* Markov chain
* Markov decision process
* Markov model