In the mathematical study of stochastic processes, a Harris chain is a Markov chain where the chain returns to a particular part of the state space an unbounded number of times. Harris chains are regenerative processes and are named after Theodore Harris. The theory of Harris chains and Harris recurrence is useful for treating Markov chains on general (possibly uncountably infinite) state spaces.
Definition
Let $\{X_n\}$ be a Markov chain on a general state space $\Omega$ with stochastic kernel $K$. The kernel represents a generalized one-step transition probability law, so that $P(X_{n+1} \in C \mid X_n = x) = K(x, C)$ for all states $x$ in $\Omega$ and all measurable sets $C$. The chain $\{X_n\}$ is a ''Harris chain''[R. Durrett. ''Probability: Theory and Examples''. Thomson, 2005.] if there exists $A \subseteq \Omega$, $\epsilon > 0$, and a probability measure $\rho$ with $\rho(\Omega) = 1$ such that
# If $\tau_A := \inf\{n \ge 0 : X_n \in A\}$, then $P(\tau_A < \infty \mid X_0 = x) = 1$ for all $x \in \Omega$.
# If $x \in A$ and $C \subseteq \Omega$ (where $C$ is measurable), then $K(x, C) \ge \epsilon \rho(C)$.
The first part of the definition ensures that the chain returns to some state within $A$ with probability 1, regardless of where it starts. It follows that it visits $A$ infinitely often (with probability 1). The second part implies that once the Markov chain is in a state $x \in A$, its next state can be generated with the help of an independent Bernoulli coin flip. To see this, first note that the parameter $\epsilon$ must be between 0 and 1 (this can be shown by applying the second part of the definition to the set $C = \Omega$). Now let $x$ be a point in $A$ and suppose $X_n = x$. To choose the next state $X_{n+1}$, independently flip a biased coin with success probability $\epsilon$. If the coin flip is successful, choose the next state $X_{n+1} \in \Omega$ according to the probability measure $\rho$. Else (and if $\epsilon < 1$), choose the next state $X_{n+1}$ according to the measure $P(X_{n+1} \in C \mid X_n = x) = \frac{K(x, C) - \epsilon\rho(C)}{1 - \epsilon}$ (defined for all measurable subsets $C \subseteq \Omega$).
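This two-step procedure can be written directly as a sampling routine. The following is a minimal sketch in Python, not drawn from the source: it uses a toy three-state transition matrix, a set $A = \{0, 1\}$, and values of $\epsilon$ and $\rho$ chosen so that condition 2 above holds; all names are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy setup (illustrative): 3 states, A = {0, 1}; eps and rho satisfy
# condition 2, i.e. K[x, c] >= eps * rho[c] for every x in A and state c.
K = np.array([[0.5, 0.3, 0.2],
              [0.1, 0.6, 0.3],
              [0.2, 0.2, 0.6]])
A = {0, 1}
eps = 0.6
rho = np.array([1/6, 1/2, 1/3])   # eps * rho = componentwise min of rows 0, 1

def split_step(x):
    """One transition of the chain via the regeneration construction:
    from a state in A, flip an independent Bernoulli(eps) coin; on success
    sample from rho, otherwise sample from the residual kernel
    (K(x, .) - eps * rho) / (1 - eps)."""
    if x in A:
        if rng.random() < eps:
            return rng.choice(3, p=rho)                          # regeneration draw
        return rng.choice(3, p=(K[x] - eps * rho) / (1 - eps))   # residual draw
    return rng.choice(3, p=K[x])                                 # ordinary transition
```

Since the mixture $\epsilon\rho(\cdot) + (1-\epsilon)\,\frac{K(x,\cdot) - \epsilon\rho(\cdot)}{1-\epsilon}$ equals $K(x, \cdot)$ exactly, the split chain has the same law as the original chain.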
Two random processes $\{X_n\}$ and $\{Y_n\}$ that have the same probability law and are Harris chains according to the above definition can be coupled as follows: Suppose that $X_n = x$ and $Y_n = y$, where $x$ and $y$ are points in $A$. Using the same coin flip to decide the next state of both processes, it follows that the next states are the same with probability at least $\epsilon$.
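Continuing the sketch above (same toy `K`, `A`, `eps`, `rho`, and `split_step`; illustrative, not from the source), the coupling amounts to sharing the single coin flip whenever both copies are in $A$:

```python
def coupled_step(x, y):
    """Advance two copies of the chain, sharing one coin when both are
    in A: on success both take the same draw from rho and coalesce,
    which happens with probability at least eps."""
    if x in A and y in A:
        if rng.random() < eps:                 # one shared Bernoulli(eps) coin
            z = rng.choice(3, p=rho)           # shared regeneration draw
            return z, z                        # the two copies have met
        return (rng.choice(3, p=(K[x] - eps * rho) / (1 - eps)),
                rng.choice(3, p=(K[y] - eps * rho) / (1 - eps)))
    return split_step(x), split_step(y)        # otherwise move independently
```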
Examples
Example 1: Countable state space
Let $\Omega$ be a countable state space. The kernel $K$ is defined by the one-step conditional transition probabilities $P(X_{n+1} = y \mid X_n = x)$ for $x, y \in \Omega$. The measure $\rho$ is a probability mass function on the states, so that $\rho(x) \ge 0$ for all $x \in \Omega$ and the sum of the $\rho(x)$ probabilities is equal to one. Suppose the above definition is satisfied for a given set $A \subseteq \Omega$ and a given parameter $\epsilon > 0$. Then $P(X_{n+1} = c \mid X_n = x) \ge \epsilon\rho(c)$ for all $x \in A$ and all $c \in \Omega$.
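For a finite state space, the largest workable $\epsilon$ for a candidate set $A$ can be computed directly: take the componentwise minimum of the rows of the transition matrix indexed by $A$; its total mass is the best $\epsilon$, and normalizing it gives $\rho$. A small self-contained sketch (toy matrix and names are illustrative):

```python
import numpy as np

def minorization(K, A):
    """Return the largest eps and the measure rho such that
    K[x, c] >= eps * rho[c] for every x in A and every state c."""
    m = K[sorted(A)].min(axis=0)    # componentwise minimum over the rows in A
    eps = m.sum()                   # best achievable eps for this choice of A
    return (eps, m / eps) if eps > 0 else (0.0, m)

K = np.array([[0.5, 0.3, 0.2],
              [0.1, 0.6, 0.3],
              [0.2, 0.2, 0.6]])
eps, rho = minorization(K, {0, 1})
print(eps, rho)   # 0.6 and [1/6, 1/2, 1/3], as used in the earlier sketch
```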
Example 2: Chains with continuous densities
Let $\{X_n\}$, $X_n \in \mathbb{R}^d$, be a Markov chain with a kernel that is absolutely continuous with respect to Lebesgue measure:
: $K(x, dy) = K(x, y)\,dy$,
such that $K(x, y)$ is a continuous function.
Pick $(x_0, y_0)$ such that $K(x_0, y_0) > 0$, and let $A$ and $\Omega$ be open sets containing $x_0$ and $y_0$ respectively that are sufficiently small so that $K(x, y) \ge \epsilon > 0$ on $A \times \Omega$. Letting $\rho(C) = |\Omega \cap C|/|\Omega|$, where $|\Omega|$ is the Lebesgue measure of $\Omega$, we have that (2) in the above definition holds. If (1) holds, then $\{X_n\}$ is a Harris chain.
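As a concrete one-dimensional sketch (my own illustration, not from the source), consider the AR(1)-type chain $X_{n+1} = X_n/2 + \xi_n$ with standard Gaussian noise, whose kernel density $K(x, y)$ is continuous and everywhere positive. Taking $x_0 = y_0 = 0$ and $A = \Omega = (-1/2, 1/2)$, a valid (non-optimal) $\epsilon$ can be estimated numerically:

```python
import numpy as np

def k(x, y):
    """Transition density of X_{n+1} = X_n / 2 + N(0, 1): a Gaussian
    in y with mean x / 2 and unit variance."""
    return np.exp(-(y - x / 2.0) ** 2 / 2.0) / np.sqrt(2.0 * np.pi)

a, b = -0.5, 0.5                       # A = Omega = (a, b), so |Omega| = 1
grid = np.linspace(a, b, 201)
# Numerical stand-in for the infimum of K(x, y) over A x Omega; by
# continuity the grid minimum is a good approximation (the true inf
# is attained at a corner of the box).
eps_density = min(k(x, y) for x in grid for y in grid)
# With rho(C) = |Omega ∩ C| / |Omega| (normalized Lebesgue measure),
# K(x, C) >= eps_density * |Omega ∩ C| = eps * rho(C) for x in A, where:
eps = eps_density * (b - a)
print(eps)   # about 0.30: a valid minorization constant for condition (2)
```

Condition (1) must still be checked separately; for this particular chain it does hold, since the process drifts back toward the origin and has a Gaussian stationary law.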
Reducibility and periodicity
In the following, let $R := \inf\{n \ge 1 : X_n \in A\}$; i.e., $R$ is the first time after time 0 that the process enters region $A$. Let $\mu$ denote the initial distribution of the Markov chain, i.e., $X_0 \sim \mu$.
Definition: If for all $\mu$, $P(R < \infty \mid X_0 \sim \mu) = 1$, then the Harris chain is called ''recurrent''.
Definition: A recurrent Harris chain $\{X_n\}$ is ''aperiodic'' if $\exists N$ such that $\forall n \ge N$ and $\forall \mu$, $P(X_n \in A \mid X_0 \sim \mu) > 0$.
Theorem: Let $\{X_n\}$ be an aperiodic recurrent Harris chain with stationary distribution $\pi$. If $P(R < \infty \mid X_0 \sim \mu) = 1$, then, as $n \to \infty$, $\|\mathcal{L}(X_n \mid X_0 \sim \mu) - \pi\|_{TV} \to 0$, where $\|\cdot\|_{TV}$ denotes the total variation norm for signed measures defined on the same measurable space.
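For a finite chain, the conclusion is easy to observe numerically. A minimal sketch (reusing the toy kernel from the earlier examples; illustrative only): compute $\pi$ as the left eigenvector of the transition matrix for eigenvalue 1, then track the total variation distance between the law of $X_n$ and $\pi$.

```python
import numpy as np

K = np.array([[0.5, 0.3, 0.2],
              [0.1, 0.6, 0.3],
              [0.2, 0.2, 0.6]])

# Stationary distribution: left eigenvector of K for eigenvalue 1.
w, v = np.linalg.eig(K.T)
pi = np.real(v[:, np.argmin(np.abs(w - 1.0))])
pi /= pi.sum()

mu = np.array([1.0, 0.0, 0.0])            # X_0 = state 0 deterministically
for n in range(1, 9):
    mu = mu @ K                            # law of X_n under initial law mu
    print(n, 0.5 * np.abs(mu - pi).sum())  # TV distance, decaying to 0
```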
References
R. Durrett. ''Probability: Theory and Examples''. Thomson, 2005.