A Markov chain on a measurable state space is a discrete-time homogeneous Markov chain with a measurable space as its state space.
History
The definition of Markov chains has evolved during the 20th century. In 1953 the term Markov chain was used for stochastic processes with a discrete or continuous index set, living on a countable or finite state space; see Doob or Chung. Since the late 20th century it has become more common to regard a Markov chain as a stochastic process with a discrete index set, living on a measurable state space.
[Daniel Revuz: ''Markov Chains''. 2nd edition, 1984.] [Rick Durrett: ''Probability: Theory and Examples''. 4th edition, 2005.]
Definition
Denote with $(E, \Sigma)$ a measurable space and with $p$ a Markov kernel with source and target $(E, \Sigma)$. A stochastic process $(X_n)_{n \in \mathbb{N}}$ on $(\Omega, \mathcal{F}, \mathbb{P})$ is called a time homogeneous Markov chain with Markov kernel $p$ and start distribution $\mu$ if
: $\mathbb{P}[X_0 \in A_0, X_1 \in A_1, \dots, X_n \in A_n] = \int_{A_0} \cdots \int_{A_{n-1}} p(y_{n-1}, A_n) \, p(y_{n-2}, dy_{n-1}) \cdots p(y_0, dy_1) \, \mu(dy_0)$
is satisfied for every $n \in \mathbb{N}$ and all $A_0, \dots, A_n \in \Sigma$. For $n = 1$ this reduces to $\mathbb{P}[X_0 \in A_0, X_1 \in A_1] = \int_{A_0} p(y_0, A_1) \, \mu(dy_0)$. One can construct for any Markov kernel and any probability measure an associated Markov chain.
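The construction can be made concrete by sampling: draw $X_0$ from $\mu$, and given $X_k = x$ draw $X_{k+1}$ from $p(x, \cdot)$. Below is a minimal simulation sketch in Python, assuming for illustration the Gaussian random-walk kernel $p(x, \cdot) = \mathcal{N}(x, 1)$ on $E = \mathbb{R}$ and start distribution $\mu = \mathcal{N}(0, 1)$; the kernel, the start distribution, and all function names are illustrative choices, not part of the definition.

```python
import numpy as np

rng = np.random.default_rng(0)

def sample_start(rng):
    """Draw X_0 from the start distribution mu; here mu = N(0, 1) (an assumption)."""
    return rng.normal(loc=0.0, scale=1.0)

def sample_kernel(x, rng):
    """Draw one step from p(x, .); here p(x, .) = N(x, 1) (an assumption)."""
    return rng.normal(loc=x, scale=1.0)

def simulate_chain(n, rng):
    """Return one realisation X_0, ..., X_n of the Markov chain."""
    x = sample_start(rng)
    path = [x]
    for _ in range(n):
        x = sample_kernel(x, rng)  # X_{k+1} depends only on X_k: the Markov property
        path.append(x)
    return np.array(path)

print(simulate_chain(10, rng))
```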
Remark about Markov kernel integration
For any measure $\mu \colon \Sigma \to [0, \infty]$ we denote for a $\mu$-integrable function $f \colon E \to \mathbb{R} \cup \{-\infty, +\infty\}$ the Lebesgue integral as
: $\int_E f(x) \, \mu(dx)$.
For the measure $\nu_x \colon \Sigma \to [0, \infty]$ defined by $\nu_x(A) := p(x, A)$, $A \in \Sigma$, we use the notation
: $\int_E f(y) \, p(x, dy) := \int_E f(y) \, \nu_x(dy)$.
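In this notation, $\int_E f(y) \, p(x, dy)$ is an ordinary Lebesgue integral against the probability measure $\nu_x = p(x, \cdot)$, so it can be approximated by averaging $f$ over samples drawn from the kernel. A small Monte Carlo sketch, again assuming the illustrative Gaussian kernel $p(x, \cdot) = \mathcal{N}(x, 1)$ from above:

```python
import numpy as np

rng = np.random.default_rng(1)

def kernel_integral(f, x, n_samples, rng):
    """Monte Carlo estimate of int_E f(y) p(x, dy), with p(x, .) = N(x, 1) (an assumption)."""
    ys = rng.normal(loc=x, scale=1.0, size=n_samples)  # i.i.d. draws from nu_x = p(x, .)
    return f(ys).mean()

# Example: f(y) = y^2. For Y ~ N(x, 1) the exact value is x^2 + 1, here 5.
print(kernel_integral(lambda y: y ** 2, x=2.0, n_samples=100_000, rng=rng))
```

By the strong law of large numbers the estimate converges to the integral as the number of samples grows.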