Continuous-time Markov chain

In the discrete-time case, the state is observed at fixed, instantaneous moments. In continuous time, the observation is continuous, that is to say without any discretization of time.

Continuous time

Consider M + 1 mutually exclusive states. The analysis starts at time 0 and time runs continuously; we call X(t) the state of the system at time t. The changeover points ti are random points in time (they are not necessarily integers), and two state changes cannot occur at the same instant.

{X(t), t ≥ 0}, with X(t) ∈ {0, 1, …, M}

Consider three consecutive points in time at which the state may change: r in the past, s at the present moment and s + t in the future, with X(s) = i and X(r) = l. A continuous-time stochastic process has the Markov property if:

P(X(s + t) = j | X(s) = i, X(r) = l) = P(X(s + t) = j | X(s) = i) = P(X(t) = j | X(0) = i)

The transition probabilities are stationary since they are independent of s. We denote by pij(t) = P(X(t) = j | X(0) = i) the continuous-time transition probability function.

Let Ti be the random variable giving the time spent in state i before moving to another state, i ∈ {0, …, M}. Suppose the process enters state i at time t′ = s. Then for a time t > 0, Ti > t ⇔ X(t′) = i, ∀ t′ ∈ [s, s + t].

The stationarity property of the transition probabilities gives: P(Ti > s + t | Ti > s) = P(Ti > t). The distribution of the time remaining before the process next leaves state i is therefore the same regardless of the time already spent in state i: the variable Ti is memoryless. The only continuous random variable distribution with this property is the exponential distribution.
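To make this concrete, here is a minimal Python sketch (not from the original text; the rate λ = 2 and the times s and t are arbitrary values chosen for the illustration) that checks the memoryless property numerically on simulated exponential draws:

```python
import numpy as np

# Minimal check of the memoryless property P(T > s + t | T > s) = P(T > t)
# for an exponential variable T. The rate lam and the times s, t are
# arbitrary values chosen for the illustration.
rng = np.random.default_rng(0)
lam, s, t = 2.0, 0.4, 0.7

T = rng.exponential(scale=1.0 / lam, size=1_000_000)

p_conditional = np.mean(T > s + t) / np.mean(T > s)  # P(T > s + t | T > s)
p_unconditional = np.mean(T > t)                     # P(T > t)
print(p_conditional, p_unconditional, np.exp(-lam * t))  # all three are close
```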

The exponential variable Ti has a single parameter λ and its mean is E[Ti] = 1/λ. This result allows us to describe a continuous-time Markov chain in an equivalent way as follows:

  • The random variable Ti has an exponential distribution with parameter λ.
  • When the process leaves state i, it moves to state j with a probability pij such that (as for a discrete-time Markov chain) pii = 0 and Σj pij = 1.
  • The next state visited after i is independent of the time spent in state i.
  • A continuous-time Markov chain has the same class and irreducibility properties as a discrete-time chain.

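As a hedged illustration of this equivalent description, the following Python sketch simulates a continuous-time Markov chain from an exponential sojourn time in each state and a jump matrix pij; the three-state parameters are made up for the example and are not taken from the text:

```python
import numpy as np

# Sketch of the equivalent description above: the sojourn in state i is
# exponential, and on leaving i the process jumps to j with probability
# P[i, j], where P[i, i] = 0 and each row of P sums to 1. The three-state
# parameters below are made up for the illustration.
rng = np.random.default_rng(1)

lam = np.array([1.0, 2.0, 0.5])      # exponential parameter of the sojourn in each state
P = np.array([[0.0, 0.7, 0.3],       # jump probabilities p_ij (p_ii = 0)
              [0.4, 0.0, 0.6],
              [0.5, 0.5, 0.0]])

def simulate(state=0, horizon=10.0):
    """Return the (time, state) pairs visited up to the time horizon."""
    t, path = 0.0, [(0.0, state)]
    while True:
        t += rng.exponential(scale=1.0 / lam[state])  # time spent in `state`
        if t > horizon:
            return path
        state = rng.choice(3, p=P[state])             # next state, independent of the sojourn
        path.append((t, state))

print(simulate())
```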

Here are some properties of the exponential distribution (for T with parameter λ):

  • Density f(t) = λe^(−λt) and P(T ≤ t) = 1 − e^(−λt), for t ≥ 0.
  • Mean E[T] = 1/λ and variance Var(T) = 1/λ².
  • Memorylessness: P(T > s + t | T > s) = P(T > t).
  • If T1, …, Tn are independent exponentials with parameters λ1, …, λn, then min(T1, …, Tn) is exponential with parameter λ1 + … + λn, and P(Ti is the minimum) = λi / (λ1 + … + λn).
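The last property, on the minimum of independent exponentials, is the one that underlies the rate representation used below. Here is a small Python check of it, with arbitrary rates chosen only for illustration:

```python
import numpy as np

# Check of the "minimum of independent exponentials" property listed above:
# min(T1, ..., Tn) is exponential with parameter sum(lam), and Ti is the
# minimum with probability lam[i] / sum(lam). The rates are arbitrary.
rng = np.random.default_rng(2)
lam = np.array([0.5, 1.0, 1.5])

samples = rng.exponential(scale=1.0 / lam, size=(1_000_000, 3))
mins = samples.min(axis=1)
winners = samples.argmin(axis=1)

print(mins.mean(), 1.0 / lam.sum())                          # both close to 1/3
print(np.bincount(winners) / len(winners), lam / lam.sum())  # empirical vs theoretical
```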

Birth and death process

Thus, if μi denotes the parameter of the exponential random variable associated with state i, the continuous-time Markov chain can be represented as follows:

[Figure: state diagram of the continuous-time Markov chain, each state i carrying its exponential sojourn parameter μi]

The embedded discrete-time Markov chain can be seen in this representation, which makes it possible to study the discrete model as well. Note that there is no notion of periodicity in continuous time.

If we consider that the process moves from state i to state j after a time Tij, and treat this time as an exponential random variable with rate μij, then the continuous-time Markov chain can be written as a Birth-Death model:

[Figure: Birth-Death representation of the chain, with a rate μij on each arc from state i to state j]
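To sketch why this formulation is equivalent to the sojourn-time description, the following Python snippet (with a made-up rate matrix μij, not the author's example) simulates the "race" of exponential clocks from a given state and compares the resulting holding time and jump probabilities with 1/Σj μij and μij/Σj μij:

```python
import numpy as np

# Sketch of the T_ij formulation: from state i, one exponential clock of rate
# mu[i, j] runs for every reachable state j, and the first clock to ring
# decides the move. Empirically, the holding time in i is exponential with
# rate sum_j mu[i, j] and j wins with probability mu[i, j] / sum_j mu[i, j].
# The rate matrix mu is made up for the illustration.
rng = np.random.default_rng(3)
mu = np.array([[0.0, 2.0, 1.0],
               [0.5, 0.0, 1.5],
               [1.0, 3.0, 0.0]])   # mu[i, j] = rate of the i -> j clock

i = 0
others = np.where(mu[i] > 0)[0]                   # states reachable from i
clocks = rng.exponential(scale=1.0 / mu[i, others],
                         size=(1_000_000, len(others)))

holding = clocks.min(axis=1)                      # time spent in state i
winner = others[clocks.argmin(axis=1)]            # state reached when leaving i

print(holding.mean(), 1.0 / mu[i].sum())                    # both close to 1/3
print(np.bincount(winner, minlength=3) / len(winner),       # empirical jump probabilities
      mu[i] / mu[i].sum())                                  # theoretical mu_ij / sum
```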

Be careful: there is an important distinction between the stochastic aspect of the movement from one state to another and the continuous-time aspect. It is important to understand that the transition matrix of a continuous-time Markov chain is always that of a Birth-Death model, with the following properties:

  • The off-diagonal entries are the transition rates: qij = μij ≥ 0 for i ≠ j.
  • Each diagonal entry is minus the total rate of leaving the state: qii = −Σj≠i μij.
  • Consequently, every row of the matrix sums to 0.

This matrix is called the infinitesimal generator.
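Below is a minimal sketch (reusing the same hypothetical rates μij as above, not the text's own example) that builds the infinitesimal generator and checks the properties just listed:

```python
import numpy as np

# Sketch of the infinitesimal generator built from hypothetical rates mu[i, j]
# (the same made-up values as above): the off-diagonal entries are the
# non-negative rates mu[i, j], each diagonal entry is minus the sum of the
# rates of its row, and therefore every row of Q sums to zero.
mu = np.array([[0.0, 2.0, 1.0],
               [0.5, 0.0, 1.5],
               [1.0, 3.0, 0.0]])

Q = mu - np.diag(mu.sum(axis=1))

print(Q)
print(Q.sum(axis=1))                       # every row sums to 0
print(-np.diag(Q))                         # rate of leaving each state (sojourn parameter)
print(mu / mu.sum(axis=1, keepdims=True))  # embedded jump probabilities p_ij
```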

Thus, from the following discrete Markov graph (the exponential distribution has the same parameter in all three states):

[Figure: discrete-time Markov graph with three states]

It is possible to obtain the following Birth-Death model:

[Figure: corresponding Birth-Death model obtained from the discrete graph]
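Since the original figures are not available, here is a hedged Python sketch of the conversion they illustrate: starting from a hypothetical three-state discrete transition matrix P and a common exponential parameter λ, the rate on each arc of the Birth-Death model is λ · pij, and the corresponding infinitesimal generator is λ(P − I):

```python
import numpy as np

# Sketch of the conversion illustrated by the two figures: starting from a
# discrete Markov graph with jump probabilities P, in which the exponential
# sojourn law has the same parameter lam in every state, the rate on the
# i -> j arc of the Birth-Death model is lam * P[i, j]. The original figure
# is not available, so P and lam below are made-up values.
lam = 2.0
P = np.array([[0.0, 0.5, 0.5],
              [0.3, 0.0, 0.7],
              [1.0, 0.0, 0.0]])

rates = lam * P                           # mu_ij of the Birth-Death model
Q = rates - np.diag(rates.sum(axis=1))    # its infinitesimal generator

print(rates)
print(Q)
print(np.allclose(Q, lam * (P - np.eye(3))))  # True: Q = lam * (P - I)
```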