Markov Chain
Some Definitions (from MATH 115)
Probability Vector. A vector is called a probability vector if its entries are nonnegative and sum to 1.
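For instance (numbers chosen for illustration, not from the notes):

```latex
% entries are nonnegative and sum to 1, so this is a probability vector
\mathbf{x} = \begin{bmatrix} 0.3 \\ 0.7 \end{bmatrix},
\qquad 0.3 \ge 0,\; 0.7 \ge 0,\; 0.3 + 0.7 = 1
```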
Stochastic Matrix. A square matrix $P$ is called stochastic if its columns are probability vectors.
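For example, with illustrative numbers of my own choosing (reused in the examples that follow), the matrix below is stochastic because each column is a probability vector:

```latex
P = \begin{bmatrix} 0.9 & 0.2 \\ 0.1 & 0.8 \end{bmatrix}
% column sums: 0.9 + 0.1 = 1 and 0.2 + 0.8 = 1, all entries nonnegative
```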
State Vectors, Markov Chains. Given a stochastic matrix $P$, a Markov Chain is a sequence of probability vectors $\mathbf{x}_0, \mathbf{x}_1, \mathbf{x}_2, \ldots$ where $\mathbf{x}_{k+1} = P\mathbf{x}_k$ for every nonnegative integer $k$. In a Markov Chain, the probability vectors $\mathbf{x}_k$ are called state vectors.
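Continuing with the illustrative $P$ above and an assumed initial state $\mathbf{x}_0 = (0.5, 0.5)^T$, one step of the chain is a matrix-vector product:

```latex
\mathbf{x}_1 = P\mathbf{x}_0
  = \begin{bmatrix} 0.9 & 0.2 \\ 0.1 & 0.8 \end{bmatrix}
    \begin{bmatrix} 0.5 \\ 0.5 \end{bmatrix}
  = \begin{bmatrix} 0.55 \\ 0.45 \end{bmatrix}
% x_1 is again a probability vector: 0.55 + 0.45 = 1
```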
Steady-State Vector. If $P$ is a stochastic matrix, then a state vector $\mathbf{q}$ is called a steady-state vector for $P$ if $P\mathbf{q} = \mathbf{q}$. It can be shown that every stochastic matrix has a steady-state vector.
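For the same illustrative $P$, a direct check shows that $\mathbf{q} = \left(\tfrac{2}{3}, \tfrac{1}{3}\right)^T$ satisfies the definition (how to find such a $\mathbf{q}$ is the subject of the procedure below):

```latex
P\mathbf{q}
  = \begin{bmatrix} 0.9 & 0.2 \\ 0.1 & 0.8 \end{bmatrix}
    \begin{bmatrix} 2/3 \\ 1/3 \end{bmatrix}
  = \begin{bmatrix} 0.9 \cdot \tfrac{2}{3} + 0.2 \cdot \tfrac{1}{3} \\[2pt]
                    0.1 \cdot \tfrac{2}{3} + 0.8 \cdot \tfrac{1}{3} \end{bmatrix}
  = \begin{bmatrix} 2/3 \\ 1/3 \end{bmatrix}
  = \mathbf{q}
```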
Regular Matrix. An $n \times n$ stochastic matrix $P$ is called regular if, for some positive integer $k$, the matrix $P^k$ has all positive entries.
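Note that $P$ itself need not have all positive entries. In the illustrative matrix below (again my own numbers), $P$ has a zero entry, but $P^2$ is all-positive, so $P$ is regular:

```latex
P = \begin{bmatrix} 0.5 & 1 \\ 0.5 & 0 \end{bmatrix},
\qquad
P^2 = \begin{bmatrix} 0.75 & 0.5 \\ 0.25 & 0.5 \end{bmatrix}
% every entry of P^2 is positive, so P is regular
```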
To solve a Markov Chain problem (in MATH 115)
Theorem 29.6. Let $P$ be a regular stochastic matrix. Then $P$ has a unique steady-state vector $\mathbf{q}$, and for any initial state vector $\mathbf{x}_0$, the resulting Markov Chain converges to the steady-state vector $\mathbf{q}$.
- Read and understand the problem
- Determine the stochastic matrix $P$ and verify that $P$ is regular
- Determine the initial state vector $\mathbf{x}_0$ if required
- Solve the homogeneous system $(P - I)\mathbf{x} = \mathbf{0}$
- Choose values for any parameters resulting from solving the above system so that the solution $\mathbf{q}$ is a probability vector
- Conclude by Theorem 29.6 that $\mathbf{q}$ is the steady-state vector
- Interpret the entries of $\mathbf{q}$ in terms of the original problem as needed
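As a sketch of the whole procedure, applied to the illustrative $P = \begin{bmatrix} 0.9 & 0.2 \\ 0.1 & 0.8 \end{bmatrix}$ from above (numbers mine, not from the notes):

```latex
% P is regular since all its entries are already positive (k = 1)
(P - I)\mathbf{x} = \mathbf{0}:\quad
\begin{bmatrix} -0.1 & 0.2 \\ 0.1 & -0.2 \end{bmatrix}
\begin{bmatrix} x_1 \\ x_2 \end{bmatrix}
= \begin{bmatrix} 0 \\ 0 \end{bmatrix}
\;\Rightarrow\;
\mathbf{x} = t \begin{bmatrix} 2 \\ 1 \end{bmatrix},\quad t \in \mathbb{R}
```

Choosing the parameter $t = \tfrac{1}{3}$ makes the entries nonnegative and sum to 1, giving $\mathbf{q} = \left(\tfrac{2}{3}, \tfrac{1}{3}\right)^T$. By Theorem 29.6, this $\mathbf{q}$ is the unique steady-state vector, and every chain $\mathbf{x}_0, \mathbf{x}_1, \ldots$ converges to it.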