A continuous-time Markov chain (CTMC) is a continuous
stochastic process in which, for each state, the process will change state according to an
exponential random variable and then move to a different state as specified by the probabilities of a stochastic matrix. An equivalent formulation describes the process as changing state according to the least value of a set of exponential random variables, one for each possible state it can move to, with the parameters determined by the current state.
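To make the first formulation concrete, here is a minimal Python sketch; the function name simulate_path, the list-of-lists jump matrix, and the rates list are choices made here for illustration, not notation from the article. It simulates one trajectory by alternating exponential holding times with jumps drawn from the stochastic matrix:

```python
import random

def simulate_path(jump_matrix, rates, i0, t_max):
    """Simulate a CTMC path as (jump time, new state) pairs up to t_max.

    In state i the process holds for an Exp(rates[i]) amount of time,
    then jumps to state j with probability jump_matrix[i][j].
    """
    path, t, i = [(0.0, i0)], 0.0, i0
    while True:
        if rates[i] == 0:                     # absorbing state: no more jumps
            break
        t += random.expovariate(rates[i])     # exponential holding time
        if t >= t_max:
            break
        # Jump-chain step: jump_matrix[i][i] == 0, so the state always changes.
        i = random.choices(range(len(jump_matrix)), weights=jump_matrix[i])[0]
        path.append((t, i))
    return path
```

The equivalent competing-exponentials formulation would instead sample one exponential clock per destination state and move to whichever rings first; both produce the same law for the path.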
An example of a CTMC with three states $\{0, 1, 2\}$ is as follows: the process makes a transition after the amount of time specified by the holding time, an exponential random variable $E_i$, where ''i'' is its current state. Each random variable is independent and such that $E_0 \sim \operatorname{Exp}(6)$, $E_1 \sim \operatorname{Exp}(12)$, and $E_2 \sim \operatorname{Exp}(18)$. When a transition is to be made, the process moves according to the jump chain, a discrete-time Markov chain with stochastic matrix:
:$\begin{bmatrix} 0 & \tfrac{1}{2} & \tfrac{1}{2} \\ \tfrac{1}{3} & 0 & \tfrac{2}{3} \\ \tfrac{5}{6} & \tfrac{1}{6} & 0 \end{bmatrix}$
Equivalently, by the property of competing exponentials, this CTMC changes state from state ''i'' according to the minimum of two random variables $E_{i,j}$ ($j \neq i$), which are independent and such that $E_{i,j} \sim \operatorname{Exp}(q_{i,j})$ for $i \neq j$, where the parameters $q_{i,j}$ are given by the Q-matrix
:$Q = \begin{bmatrix} -6 & 3 & 3 \\ 4 & -12 & 8 \\ 15 & 3 & -18 \end{bmatrix}$
Each non-diagonal entry $q_{i,j}$
can be computed as the probability that the jump chain moves from state ''i'' to state ''j'', divided by the expected holding time of state ''i''. The diagonal entries are chosen so that each row sums to 0.
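This recipe is easy to check in exact arithmetic. The following sketch (the variable names P_jump, rates, and Q are local choices) rebuilds the Q-matrix from the jump chain and the holding-time rates; dividing by the expected holding time $1/q_i$ is the same as multiplying by the rate $q_i$:

```python
from fractions import Fraction as F

# Jump-chain matrix and holding-time rates from the example above:
# the holding time in state i is Exp(rates[i]).
P_jump = [[F(0), F(1, 2), F(1, 2)],
          [F(1, 3), F(0), F(2, 3)],
          [F(5, 6), F(1, 6), F(0)]]
rates = [6, 12, 18]

# Off-diagonal: q_ij = (jump probability i -> j) * rates[i];
# diagonal: chosen so each row sums to 0.
Q = [[P_jump[i][j] * rates[i] if i != j else -rates[i]
      for j in range(3)] for i in range(3)]

assert Q == [[-6, 3, 3], [4, -12, 8], [15, 3, -18]]
assert all(sum(row) == 0 for row in Q)   # each row of a Q-matrix sums to 0
```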
A CTMC satisfies the
Markov property, that its behavior depends only on its current state and not on its past behavior, due to the memorylessness of the exponential distribution and of discrete-time Markov chains.
Definition
Let $(\Omega, \mathcal{A}, \Pr)$ be a probability space, let $S$ be a countable nonempty set, and let $T = \mathbb{R}_{\geq 0}$ ($T$ for "time"). Equip $S$ with the discrete metric, so that we can make sense of right continuity of functions $\mathbb{R}_{\geq 0} \to S$. A continuous-time Markov chain is defined by:
* A probability vector $\lambda$ on $S$ (which below we will interpret as the ''initial distribution'' of the Markov chain), and
* A ''rate matrix'' $Q$ on $S$, that is, a function $Q \colon S \times S \to \mathbb{R}$, $(i, j) \mapsto q_{i,j}$, such that
# $q_{i,j} \geq 0$ for all distinct $i, j \in S$,
# $\sum_{j \in S \setminus \{i\}} q_{i,j} = -q_{i,i}$ for all $i \in S$.
(Even if $S$ is infinite, this sum is ''a priori'' well defined (possibly equalling $+\infty$) because each term appearing in the sum is nonnegative. ''A posteriori'', we know the sum must also be finite (not equal to $+\infty$), since we're assuming it's equal to $-q_{i,i}$ and we've assumed $Q$ is real valued. Some authors instead use a definition that's word-for-word the same except for a modified stipulation allowing $q_{i,i} = -\infty$ (i.e., $Q \colon S \times S \to [-\infty, \infty)$), and say $Q$ is ''stable'' or ''totally stable'' to mean $q_{i,i} > -\infty$ for all $i \in S$, i.e., every entry is real valued.)
Note that the row sums of $Q$ are 0:
:$\sum_{j \in S} q_{i,j} = 0 \quad \text{for all } i \in S,$
or more succinctly, $Q \cdot \mathbf{1} = 0$, where $\mathbf{1}$ denotes the all-ones vector. This situation contrasts with the situation for discrete-time Markov chains, where all row sums of the transition matrix equal unity.
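For a finite state space, the two defining conditions of a rate matrix translate directly into a mechanical check. A small sketch follows; the helper name is_rate_matrix is chosen here for illustration:

```python
def is_rate_matrix(Q, tol=1e-12):
    """Check that Q has nonnegative off-diagonal entries and zero row sums."""
    n = len(Q)
    off_diagonal_ok = all(Q[i][j] >= 0
                          for i in range(n) for j in range(n) if i != j)
    row_sums_ok = all(abs(sum(Q[i])) <= tol for i in range(n))
    return off_diagonal_ok and row_sums_ok

# The Q-matrix of the three-state example satisfies both conditions.
assert is_rate_matrix([[-6, 3, 3], [4, -12, 8], [15, 3, -18]])
```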
Now, let $X \colon T \times \Omega \to S$ be such that $X(t, \cdot) \colon \Omega \to S$ is $\mathcal{A}$-measurable for every $t \in T$. There are several equivalent ways to define $X$ being ''Markov with initial distribution $\lambda$ and rate matrix $Q$''; two standard ones are via transition probabilities and via the jump chain and holding times.
As a prelude to a transition-probability definition, we first motivate the definition of a ''regular'' rate matrix. We will use the transition rate matrix $Q$ to specify the dynamics of the Markov chain by means of generating a collection of ''transition matrices'' $P(t)$ on $S$ ($t \in T$), via the following theorem: there exists at least one solution $P$ of the system of ''Kolmogorov backward equations''
:$P'(t) = Q\,P(t), \qquad P(0) = I, \qquad (*)$
where $I$ denotes the identity matrix on $S$.
We say $Q$ is ''regular'' to mean that we do have uniqueness for the above system, i.e., that there exists exactly one solution. We say $Q$ is ''irregular'' to mean $Q$ is not regular. If $S$ is finite, then there is exactly one solution, namely $P(t) = e^{tQ}$, and hence $Q$ is regular. Otherwise, $S$ is infinite, and there exist irregular transition rate matrices on $S$. If $Q$ is regular, then for the unique solution $P$, for each $t \in T$, $P(t)$ will be a stochastic matrix. We will assume $Q$ is regular from the beginning of the following subsection up through the end of this section, even though it is conventional to not include this assumption. (Note for the expert: thus we are not defining continuous-time Markov chains in general but only ''non-explosive'' continuous-time Markov chains.)
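For the finite three-state example, $P(t) = e^{tQ}$ can be checked numerically. A sketch assuming NumPy and SciPy are available (scipy.linalg.expm computes the matrix exponential):

```python
import numpy as np
from scipy.linalg import expm

Q = np.array([[-6.0, 3.0, 3.0],
              [4.0, -12.0, 8.0],
              [15.0, 3.0, -18.0]])

for t in (0.0, 0.1, 1.0):
    P_t = expm(t * Q)
    assert np.all(P_t >= -1e-12)              # entries are nonnegative
    assert np.allclose(P_t.sum(axis=1), 1.0)  # each row sums to 1

# Semigroup property P(s + t) = P(s) P(t), as expected of e^{tQ}.
s, t = 0.3, 0.7
assert np.allclose(expm((s + t) * Q), expm(s * Q) @ expm(t * Q))
```

That each row sums to one follows from $Q \cdot \mathbf{1} = 0$: term by term, $e^{tQ}\mathbf{1} = \mathbf{1} + tQ\mathbf{1} + \tfrac{t^2}{2}Q^2\mathbf{1} + \cdots = \mathbf{1}$.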
Transition-probability definition
Let $P$ be the (unique) solution of the system $(*)$. (Uniqueness is guaranteed by our assumption that $Q$ is regular.) We say $X$ is ''Markov with initial distribution $\lambda$ and rate matrix $Q$'' to mean: for any nonnegative integer $n$, for all times $t_0, \dots, t_{n+1} \in T$ such that