# Field:

- Random process
- Martingales
- Brownian motion

## Discrete time, order 1:

- Markov chains in discrete time
- Recurrence and transience
- Invariant law and asymptotic behavior
- Time to reach a state
- Probability of absorption of a state

## Continuous time, order 1:

# Probability

A random experiment, denoted E, is an experiment whose outcome is random. We denote by Ω the set of all possible outcomes of this experiment; Ω is called the universe, the space of possibilities, or the state space. An outcome of E is an element of Ω, denoted ω.

For example, in the coin-flipping game, the universe of the experiment « tossing a coin » is Ω = {H, T}. For the experiment « tossing two coins one after the other », the universe is Ω = {HH, HT, TH, TT}.

A random event A related to the experiment E is a subset of Ω for which one can tell, from the outcome of the experiment, whether it is realized or not. In the previous example, the random event « get Head » is easily observed by tossing a coin. A random event is a set and therefore obeys the usual properties of set theory.

Basic operations on parts of a set:

- Intersection: the intersection of sets A and B, denoted A ∩ B, is the set of points belonging to both A and B.
- Union: the union of two sets A and B, denoted A ∪ B, is the set of points belonging to at least one of the two sets.
- Empty set: the empty set, denoted Ø, is the set containing no element.
- Disjoint sets: sets A and B are said to be disjoint if A ∩ B = Ø.
- Complementary: the complement of the set A ⊂ Ω in Ω, denoted A^{c} or Ω \ A, is the set of elements not belonging to A. The sets A and A^{c} are disjoint.
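These set operations can be illustrated with Python's built-in `set` type (a sketch; the universe Ω and the events A and B below are example values for the two-coin experiment, not taken from the text):

```python
# Universe of « tossing two coins » and two example events
omega = {"HH", "HT", "TH", "TT"}
A = {"HH", "HT"}   # event « the first coin shows Head »
B = {"HH", "TH"}   # event « the second coin shows Head »

print(A & B)            # intersection A ∩ B
print(A | B)            # union A ∪ B
print(omega - A)        # complement A^c = Ω \ A
print(A & (omega - A))  # A and A^c are disjoint: empty set
```

The operators `&`, `|`, and `-` map directly onto ∩, ∪, and set difference.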

Set operations:

- Not: the contrary event of A is represented by A^{c}: the outcome of the experiment does not belong to A.
- And: the event « A and B are realized » is represented by A ∩ B; the outcome of the experiment is in both A and B.
- Or: the event « A or B is realized » is represented by A ∪ B; the outcome of the experiment is in A or B (or both).
- Implication: the fact that the realization of the event A causes the realization of B is implied by A ⊂ B.
- Incompatibility: if A∩B = Ø, A and B are said to be incompatible. A result of the experiment can not be in both A and B.

To each event, we seek to associate a measure (which we will not define formally in this course) between 0 and 1, representing the probability that the event is realized. For an event A, this measure is denoted P(A).

Formally, let E be a random experiment with universe Ω. We call a probability measure on Ω (or, more simply, a probability) a map P which associates with any random event A a real number P(A) such that: (i) for every A such that P(A) is defined, 0 ≤ P(A) ≤ 1; (ii) P(Ø) = 0 and P(Ω) = 1; (iii) A ∩ B = Ø implies that P(A ∪ B) = P(A) + P(B).

When all outcomes are equally likely (uniform probability), the probability of an event can be computed as P(A) = number of favorable cases / number of possible cases. The « number of cases » is the cardinal of the event or of the universe.
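This counting rule can be checked by enumeration (a sketch for the two-coin universe; the event « get at least one Head » is an illustrative choice):

```python
from fractions import Fraction

# Universe of « tossing two coins », all outcomes equally likely
omega = ["HH", "HT", "TH", "TT"]

# Event A = « get at least one Head »
A = [w for w in omega if "H" in w]

# P(A) = number of favorable cases / number of possible cases
p_A = Fraction(len(A), len(omega))
print(p_A)  # 3/4
```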

## Random variables and probability

A random variable is a function whose value depends on the outcome of a random experiment E with universe Ω. We say that a random variable X is discrete if it takes a finite or countable number of values. The set of outcomes ω on which X takes a fixed value x forms the event {ω : X(ω) = x}, which we denote [X = x]. The probability of this event is denoted P(X = x).

The function p_{X} : x → P(X = x) is called the law of the random variable X. If {x_{1}, x_{2}, …} is the set of possible values for X, we have:

∑_{i} P(X = x_{i}) = 1.

Let S_{2} be the number of Tails (in French, Tails is denoted P for « pile » and Heads is denoted F for « face ») obtained when throwing two coins. The set of possible values for S_{2} is {0, 1, 2}. If we equip the universe Ω associated with this random experiment with the uniform probability P, we obtain:

P(S_{2} = 0) = 1/4, P(S_{2} = 1) = 1/2, P(S_{2} = 2) = 1/4.
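The law of S_{2} can be recovered by enumerating the uniform universe (a sketch; here each outcome is a pair of coin faces and we count occurrences of T):

```python
from collections import Counter
from fractions import Fraction
from itertools import product

# Universe of « throw two coins »: 4 equally likely outcomes
omega = list(product("HT", repeat=2))

# S2(ω) = number of Tails in the outcome ω; count how often each value occurs
counts = Counter(w.count("T") for w in omega)

# Law of S2: value -> probability
law = {x: Fraction(n, len(omega)) for x, n in counts.items()}
print(law)  # P(S2=0) = 1/4, P(S2=1) = 1/2, P(S2=2) = 1/4
```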

We call the expectation (or mean) of a discrete random variable X, when it exists (the expectation is always defined if X takes a finite number of values, or if X takes only positive values), the quantity denoted E(X) defined by:

E(X) = ∑_{i} x_{i} P(X = x_{i}).

When it exists, the variance of a discrete random variable X is the quantity denoted Var(X) defined by:

Var(X) = E[(X − E(X))^{2}] = E(X^{2}) − E(X)^{2}.
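Both definitions can be applied directly to the law of S_{2} from the two-coin example (a sketch; the law values used here follow from the uniform probability on the four outcomes):

```python
from fractions import Fraction

# Law of S2 (number of Tails over two fair coins), derived from the
# uniform probability on {HH, HT, TH, TT}
law = {0: Fraction(1, 4), 1: Fraction(1, 2), 2: Fraction(1, 4)}

# E(X) = sum over i of x_i * P(X = x_i)
mean = sum(x * p for x, p in law.items())

# Var(X) = E(X^2) - E(X)^2
var = sum(x**2 * p for x, p in law.items()) - mean**2

print(mean)  # E(S2) = 1
print(var)   # Var(S2) = 1/2
```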

The basic idea of conditioning is as follows: additional information about the experiment modifies the likelihood assigned to the event being studied.

For example, for a throw of two dice (one red and one blue), the probability of the event « the sum is greater than or equal to 10 » is 1/6 without additional information. On the other hand, if we know that the result of the red die is 6, it is equal to 1/2, whereas it is equal to 0 if the result of the red die is 2.
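The dice example can be verified by enumerating all 36 equally likely outcomes (a sketch; each outcome is a pair (red, blue)):

```python
from fractions import Fraction
from itertools import product

# All 36 equally likely outcomes (red, blue)
omega = list(product(range(1, 7), repeat=2))

def prob(event):
    # uniform probability: favorable cases / possible cases
    return Fraction(len(event), len(omega))

A = [w for w in omega if sum(w) >= 10]   # « sum >= 10 »
B6 = [w for w in omega if w[0] == 6]     # « red die shows 6 »
B2 = [w for w in omega if w[0] == 2]     # « red die shows 2 »

def cond(A, B):
    # P(A|B) computed as favorable cases within B / cases in B
    inter = [w for w in A if w in B]
    return Fraction(len(inter), len(B))

print(prob(A))      # 1/6 without additional information
print(cond(A, B6))  # 1/2 knowing the red die is 6
print(cond(A, B2))  # 0 knowing the red die is 2
```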

Let P be a probability measure on Ω and B an event such that P(B) > 0. The conditional probability of A knowing B is the real number P(A|B) defined by:

P(A|B) = P(A ∩ B) / P(B).

Events A and B are said to be independent if:

P(A ∩ B) = P(A) P(B).

We can extend independence to n events. Let A_{1}, A_{2}, …, A_{n} be events. They are said to be independent (as a whole) if for all k ∈ {1, …, n} and for any set of distinct integers {i_{1}, …, i_{k}} ⊂ {1, …, n}, we have:

P(A_{i_{1}} ∩ … ∩ A_{i_{k}}) = P(A_{i_{1}}) ⋯ P(A_{i_{k}}).

Events can be independent two by two (pairwise) without being independent in their entirety.
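A classic illustration (an assumed example, not from the text): toss two fair coins and let A = « the first coin shows Head », B = « the second coin shows Head », C = « both coins show the same side ». Each pair of these events is independent, yet the three are not independent as a whole:

```python
from fractions import Fraction
from itertools import product

omega = list(product("HT", repeat=2))  # 4 equally likely outcomes

def prob(event):
    return Fraction(len(event), len(omega))

A = [w for w in omega if w[0] == "H"]   # first coin is Head
B = [w for w in omega if w[1] == "H"]   # second coin is Head
C = [w for w in omega if w[0] == w[1]]  # both coins show the same side

def inter(*events):
    return [w for w in omega if all(w in e for e in events)]

# Every pair satisfies P(X ∩ Y) = P(X) P(Y)
pairwise = all(
    prob(inter(X, Y)) == prob(X) * prob(Y)
    for X, Y in [(A, B), (A, C), (B, C)]
)

# But P(A ∩ B ∩ C) = 1/4, while P(A) P(B) P(C) = 1/8
whole = prob(inter(A, B, C)) == prob(A) * prob(B) * prob(C)

print(pairwise)  # True
print(whole)     # False
```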