In the discrete-time case, we observe states at instantaneous, fixed moments. In the continuous-time case, observation is continuous, that is to say without temporal interruption.

## Continuous time

Consider M + 1 mutually exclusive states. The analysis starts at time 0 and time runs continuously; we denote by X(t) the state of the system at time t. The transition times t_{i} are random points in time (they are not necessarily integers), and two state changes cannot occur at the same instant.

Consider three consecutive points in time at which the state may change: r in the past, s at the present moment, and s + t in the future, with X(s) = i and X(r) = l. A continuous-time stochastic process has the Markov property if:

P(X(s + t) = j | X(s) = i, X(r) = l) = P(X(s + t) = j | X(s) = i)

The transition probabilities are stationary since they are independent of s: P(X(s + t) = j | X(s) = i) = P(X(t) = j | X(0) = i). We denote by p_{ij}(t) = P(X(t) = j | X(0) = i) the transition probability function in continuous time.
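A standard consequence of this definition is the Chapman–Kolmogorov equation p_{ij}(s + t) = Σ_k p_{ik}(s) p_{kj}(t). As a sketch, it can be checked numerically on a hypothetical two-state chain with rate a for 0 → 1 and rate b for 1 → 0, whose transition function has a well-known closed form (the rates a and b below are illustrative):

```python
import math

# Hypothetical two-state chain: rate a for 0 -> 1, rate b for 1 -> 0.
a, b = 1.5, 0.5

def P(t):
    """Closed-form transition matrix p_ij(t) of the two-state chain."""
    e = math.exp(-(a + b) * t)
    return [[b / (a + b) + a / (a + b) * e, a / (a + b) * (1 - e)],
            [b / (a + b) * (1 - e), a / (a + b) + b / (a + b) * e]]

def mat_mul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

# Chapman-Kolmogorov: p_ij(s + t) = sum_k p_ik(s) p_kj(t).
s, t = 0.3, 0.7
lhs = P(s + t)
rhs = mat_mul(P(s), P(t))
```

The two matrices agree to floating-point precision, and each row of P(t) sums to 1, as a transition function must.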

Let T_{i} denote the random variable giving the time spent in state i before moving to another state, i ∈ {0, …, M}. Suppose the process enters state i at time t' = s. Then, for a time t > 0, T_{i} > t ⇔ X(t') = i, ∀ t' ∈ [s, s + t].

The stationarity of the transition probabilities implies: P(T_{i} > s + t | T_{i} > s) = P(T_{i} > t). The distribution of the time remaining before the process next leaves state i is therefore the same regardless of the time already spent in state i: the variable T_{i} is memoryless. The only continuous random variable distribution with this property is the exponential distribution.

The exponential distribution of T_{i} has a single parameter λ and its mean is E[T_{i}] = 1/λ. This result allows us to describe a continuous-time Markov chain in an equivalent way as follows:
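Both the mean 1/λ and the memorylessness P(T_{i} > s + t | T_{i} > s) = P(T_{i} > t) can be checked by simulation; a minimal sketch, with an arbitrary illustrative rate λ = 2:

```python
import math
import random

random.seed(42)
lam = 2.0        # illustrative rate of the holding time T_i
n = 200_000

# Inverse-transform sampling: if U ~ Uniform(0, 1), then -ln(U)/lam ~ Exp(lam).
samples = [-math.log(random.random()) / lam for _ in range(n)]

# The empirical mean should be close to E[T_i] = 1/lam = 0.5.
mean = sum(samples) / n

# Memorylessness: P(T > s + t | T > s) should be close to P(T > t).
s, t = 0.3, 0.4
p_cond = sum(x > s + t for x in samples) / sum(x > s for x in samples)
p_marg = sum(x > t for x in samples) / n
```

Up to sampling noise, `p_cond` and `p_marg` coincide: knowing the process has already waited a time s tells us nothing about the remaining wait.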

- The random variable T_{i} has an exponential distribution with parameter λ.
- When the process leaves state i, it moves to state j with a probability p_{ij} such that (as for a discrete-time Markov chain) p_{ij} ≥ 0, p_{ii} = 0 and Σ_{j} p_{ij} = 1.
- The next state visited after i is independent of the time spent in state i.
- The continuous-time Markov chain has the same class and reducibility properties as discrete-time chains.
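This equivalent description translates directly into a simulation: draw an exponential holding time at the current state's rate, then pick the next state from the jump probabilities p_{ij}. A sketch with hypothetical rates λ_i and a symmetric jump matrix (all numbers are illustrative):

```python
import math
import random

random.seed(7)

# Hypothetical 3-state chain: holding-time rates lam[i] and jump
# probabilities p[i][j] of the embedded discrete-time chain (p[i][i] = 0).
lam = [1.0, 2.0, 3.0]
p = [[0.0, 0.5, 0.5],
     [0.5, 0.0, 0.5],
     [0.5, 0.5, 0.0]]

def simulate(x0, horizon):
    """Simulate X(t) up to `horizon`; return the fraction of time in each state."""
    t, x = 0.0, x0
    occupation = [0.0, 0.0, 0.0]
    while t < horizon:
        hold = -math.log(random.random()) / lam[x]   # T_x ~ Exp(lam[x])
        occupation[x] += min(hold, horizon - t)      # clip the last interval
        t += hold
        # Choose the next state j with probability p[x][j].
        u, acc = random.random(), 0.0
        for j, pj in enumerate(p[x]):
            acc += pj
            if u < acc:
                x = j
                break
    return [o / horizon for o in occupation]

fractions = simulate(0, 50_000.0)
```

Because the jump matrix here is doubly stochastic, the embedded chain's stationary distribution is uniform, so the long-run time fractions are proportional to the mean holding times 1/λ_i, i.e. roughly (6/11, 3/11, 2/11).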

Here are some properties of the exponential distribution:
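Two standard properties are especially useful here: the minimum of independent exponential variables is itself exponential with rate equal to the sum of the rates, and clock j "wins the race" with probability λ_j / Σ_k λ_k. A sketch checking both by simulation (the three rates are arbitrary):

```python
import math
import random

random.seed(1)

# Three independent exponential clocks with hypothetical rates;
# the first clock to ring determines the next transition.
rates = [1.0, 2.0, 3.0]
total = sum(rates)
n = 200_000

mins, winners = [], [0, 0, 0]
for _ in range(n):
    clocks = [-math.log(random.random()) / r for r in rates]
    m = min(clocks)
    mins.append(m)
    winners[clocks.index(m)] += 1

# Property 1: min of independent Exp(lam_j) is Exp(total), so its mean is 1/total.
mean_min = sum(mins) / n

# Property 2: clock j wins with probability lam_j / total.
win_probs = [w / n for w in winners]
```

With the rates above, `mean_min` is close to 1/6 and `win_probs` close to (1/6, 2/6, 3/6).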

## Birth and death process

Thus, if μ_{i} denotes the parameter of the exponential random variable associated with state i, we can represent the continuous-time Markov chain as follows:

We can see the embedded discrete-time Markov chain within it, which makes it possible to study the discrete model. It should be noted that there is no notion of periodicity in this context.

If we consider that the process moves from state i to state j after a time T_{ij}, and that this time is an exponential random variable with rate μ_{ij}, then the continuous-time Markov chain can be written as a birth-death model:

Be careful: there is an important distinction between the stochastic nature of the movement from one state to another and the continuous nature of time. It is important to understand that the transition matrix of a continuous-time Markov chain is always a birth-death model with the following properties:

This matrix is called the infinitesimal generator.
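The standard properties of the infinitesimal generator can be illustrated numerically: off-diagonal entries are transition rates, each diagonal entry is minus the sum of the other rates in its row (so every row sums to zero), and P(t) = exp(Qt) is a stochastic matrix for every t ≥ 0. A sketch on a hypothetical three-state birth-death generator, with a series-based matrix exponential for illustration:

```python
# Hypothetical birth-death generator on states {0, 1, 2}: births at rate
# lam, deaths at rate mu; each row of Q sums to zero.
lam, mu = 1.0, 2.0
Q = [[-lam,         lam,      0.0],
     [mu,   -(lam + mu),      lam],
     [0.0,          mu,       -mu]]

def mat_mul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

def expm(A, terms=60):
    """Matrix exponential by truncated Taylor series (fine for small matrices)."""
    n = len(A)
    result = [[float(i == j) for j in range(n)] for i in range(n)]  # identity
    term = [row[:] for row in result]
    for k in range(1, terms):
        term = mat_mul(term, [[a / k for a in row] for row in A])   # A^k / k!
        result = [[result[i][j] + term[i][j] for j in range(n)] for i in range(n)]
    return result

t = 0.5
P = expm([[q * t for q in row] for row in Q])   # P(t) = exp(Qt)
```

Every row of Q sums to zero, and every row of P(t) sums to one with non-negative entries, so exp(Qt) is indeed a valid transition matrix.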

Thus, from the following discrete Markov graph (the exponential distribution is the same in all three states):

It is possible to obtain the following Birth-Death model: