# Markov Chain

### Some Definitions (from MATH 115)

**Probability Vector**
A vector $s \in \mathbb{R}^n$ is called a *probability vector* if its entries are nonnegative and sum to 1.

**Stochastic Matrix**
A square matrix is called *stochastic* if its columns are probability vectors.
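The two definitions above can be checked numerically. Below is a minimal sketch (not from the course notes) using NumPy; the helper name `is_stochastic` and the example matrix are illustrative:

```python
import numpy as np

def is_stochastic(P, tol=1e-9):
    """Check that every column of P is a probability vector:
    nonnegative entries that sum to 1 (within a tolerance)."""
    P = np.asarray(P, dtype=float)
    return bool((P >= -tol).all()) and np.allclose(P.sum(axis=0), 1.0, atol=tol)

# Illustrative 2x2 example: each column sums to 1.
P = np.array([[0.9, 0.2],
              [0.1, 0.8]])
print(is_stochastic(P))  # True
```

The tolerance guards against floating-point rounding when the entries come from computed probabilities rather than exact fractions.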

**State Vectors, Markov Chains**
Given a stochastic matrix $P$, a *Markov Chain* is a sequence of probability vectors $s_0, s_1, s_2, \dots$ where
$s_{k+1} = P s_k$
for every nonnegative integer $k$. In a Markov Chain, the probability vectors $s_k$ are called *state vectors*.
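The recurrence $s_{k+1} = P s_k$ is just repeated matrix–vector multiplication. A short sketch (the matrix and initial state are hypothetical, chosen only for illustration):

```python
import numpy as np

# Hypothetical stochastic matrix: column j holds the transition
# probabilities out of state j.
P = np.array([[0.9, 0.2],
              [0.1, 0.8]])

s = np.array([1.0, 0.0])  # initial state vector s_0

# Generate the Markov Chain by computing s_{k+1} = P s_k.
for k in range(5):
    s = P @ s
    print(f"s_{k+1} = {s}")
```

Each iterate remains a probability vector, since $P$ is stochastic; for this example the iterates drift toward $(2/3, 1/3)$, which previews the steady-state behaviour described below.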

**Steady-State Vector**
If $P$ is a stochastic matrix, then a state vector $s$ is called a *steady-state vector* for $P$ if $Ps = s$. It can be shown that every stochastic matrix has a steady-state vector.

**Regular Matrix Definition**
An $n \times n$ stochastic matrix $P$ is called *regular* if, for some positive integer $k$, the matrix $P^k$ has all positive entries.
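Regularity can be tested directly by computing successive powers of $P$. A minimal sketch, with a hypothetical cutoff `max_power` since the definition only asks for *some* $k$:

```python
import numpy as np

def is_regular(P, max_power=50):
    """Return True if some power P^k (1 <= k <= max_power) has all
    strictly positive entries."""
    P = np.asarray(P, dtype=float)
    Pk = P.copy()
    for _ in range(max_power):
        if (Pk > 0).all():
            return True
        Pk = Pk @ P
    return False

# P itself has a zero entry, but P^2 is all positive, so P is regular.
P = np.array([[0.0, 0.5],
              [1.0, 0.5]])
print(is_regular(P))  # True
```

By contrast, the identity matrix is stochastic but not regular: every power $I^k = I$ keeps its zero entries.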

### To solve a Markov Chain problem (in MATH 115)

**Theorem 29.6.** Let $P$ be a regular $n \times n$ stochastic matrix. Then $P$ has a unique steady-state vector $s$, and for any initial state vector $s_0 \in \mathbb{R}^n$, the resulting Markov Chain converges to the steady-state vector $s$.

- Read and understand the problem
- Determine the stochastic matrix $P$ and verify that $P$ is regular
- Determine the initial state vector $s_{0} $ if required
- Solve the homogeneous system $(P - I)s = 0$
- Choose values for any parameters resulting from solving the above system so that $s$ is a probability vector
- Conclude by Theorem 29.6 that $s$ is the steady-state vector
- Interpret the entries of $s$ in terms of the original problem as needed
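The steps above can be sketched numerically. The notes solve $(P - I)s = 0$ by row reduction and then scale the parameter; the sketch below reaches the same vector by taking the eigenvector of $P$ for eigenvalue $1$ (an equivalent condition to $Ps = s$) and rescaling it to a probability vector. The matrix is hypothetical:

```python
import numpy as np

# Hypothetical regular stochastic matrix.
P = np.array([[0.9, 0.2],
              [0.1, 0.8]])

# Ps = s is the eigenvalue equation with eigenvalue 1, which is the
# same condition as (P - I)s = 0. Find that eigenvector numerically.
eigvals, eigvecs = np.linalg.eig(P)
k = np.argmin(np.abs(eigvals - 1.0))
s = np.real(eigvecs[:, k])

# Choose the scaling so the entries sum to 1, making s a probability
# vector (the "choose values for the parameter" step).
s = s / s.sum()

print(s)                      # steady-state vector
print(np.allclose(P @ s, s))  # verify Ps = s
```

For this $P$, row reduction of $(P - I)s = 0$ gives $s_1 = 2s_2$, and requiring $s_1 + s_2 = 1$ yields $s = (2/3, 1/3)$; by Theorem 29.6 this is the unique steady-state vector, and the chain converges to it from any initial state.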