Markov Chain

Some Definitions (from MATH 115)

Probability Vector A vector is called a probability vector if the entries in the vector are nonnegative and sum to 1.

Stochastic Matrix A square matrix is called stochastic if its columns are probability vectors.
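These two definitions are easy to check numerically. The following is a small sketch (not part of the original notes) using NumPy; the matrix used in the example is a made-up two-state transition matrix.

```python
import numpy as np

def is_probability_vector(v, tol=1e-12):
    # Nonnegative entries that sum to 1.
    v = np.asarray(v, dtype=float)
    return bool(np.all(v >= -tol) and abs(v.sum() - 1.0) < tol)

def is_stochastic(P, tol=1e-12):
    # Square matrix whose columns are probability vectors.
    P = np.asarray(P, dtype=float)
    return P.shape[0] == P.shape[1] and all(
        is_probability_vector(P[:, j], tol) for j in range(P.shape[1])
    )

# Hypothetical example: each column is a probability vector, so P is stochastic.
P = np.array([[0.9, 0.2],
              [0.1, 0.8]])
print(is_stochastic(P))  # True
```

Note that the condition is on the columns, not the rows; some textbooks use the row convention instead.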

State Vectors, Markov Chains Given a stochastic matrix P, a Markov Chain is a sequence of probability vectors x_0, x_1, x_2, ... where x_{k+1} = P x_k for every nonnegative integer k. In a Markov Chain, the probability vectors x_k are called state vectors.
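The recursion x_{k+1} = P x_k can be sketched directly; this toy example (with a made-up matrix and initial vector) also shows that multiplying by a stochastic matrix keeps each state vector a probability vector.

```python
import numpy as np

# Hypothetical two-state transition matrix and initial state vector x_0.
P = np.array([[0.9, 0.2],
              [0.1, 0.8]])
x = np.array([0.5, 0.5])

# Build the Markov Chain x_0, x_1, x_2, x_3 via x_{k+1} = P x_k.
chain = [x]
for _ in range(3):
    x = P @ x
    chain.append(x)

for k, xk in enumerate(chain):
    print(k, xk, xk.sum())  # each state vector still sums to 1
```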

Steady-State Vector If P is a stochastic matrix, then a state vector q is called a steady-state vector for P if Pq = q. It can be shown that every stochastic matrix has a steady-state vector.
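Verifying a candidate steady-state vector is just the matrix-vector product Pq = q. For the hypothetical matrix used above, the vector q = (2/3, 1/3) works:

```python
import numpy as np

P = np.array([[0.9, 0.2],
              [0.1, 0.8]])

# q is a steady-state vector for P exactly when Pq = q.
q = np.array([2/3, 1/3])
print(np.allclose(P @ q, q))  # True
```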

Regular Matrix An n x n stochastic matrix P is called regular if for some positive integer k, the matrix P^k has all positive entries.
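A sketch of a regularity check, assuming it is enough in practice to test powers up to some cutoff (the cutoff is an implementation convenience, not part of the definition):

```python
import numpy as np

def is_regular(P, max_power=50):
    # True if some power P^k (k <= max_power) has all positive entries.
    Pk = np.asarray(P, dtype=float)
    for _ in range(max_power):
        if np.all(Pk > 0):
            return True
        Pk = Pk @ P
    return False

# This matrix has a zero entry, but P^2 is strictly positive, so P is regular.
P = np.array([[0.0, 0.5],
              [1.0, 0.5]])
print(is_regular(P))  # True
```

By contrast, the identity matrix is stochastic but not regular: every power of I keeps the same zero entries off the diagonal.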

To solve a Markov Chain problem (in MATH 115)

Theorem 29.6. Let P be a regular stochastic matrix. Then P has a unique steady-state vector q, and for any initial state vector x_0, the resulting Markov Chain converges to the steady-state vector q.
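The convergence claimed by Theorem 29.6 can be observed numerically: starting the chain from several different initial state vectors, repeated multiplication by a regular P lands on the same q. (The matrix and its steady-state vector here are the illustrative ones used earlier, not from the notes.)

```python
import numpy as np

P = np.array([[0.9, 0.2],
              [0.1, 0.8]])
q = np.array([2/3, 1/3])  # the unique steady-state vector of this regular P

# Iterate x_{k+1} = P x_k from several initial state vectors x_0.
for x0 in (np.array([1.0, 0.0]), np.array([0.0, 1.0]), np.array([0.3, 0.7])):
    x = x0
    for _ in range(100):
        x = P @ x
    print(x, np.allclose(x, q))  # each run ends near q
```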

  1. Read and understand the problem
  2. Determine the stochastic matrix P and verify that P is regular
  3. Determine the initial state vector x_0 if required
  4. Solve the homogeneous system (P - I)q = 0
  5. Choose values for any parameters resulting from solving the above system so that q is a probability vector
  6. Conclude by Theorem 29.6 that q is the steady-state vector
  7. Interpret the entries of q in terms of the original problem as needed
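Steps 4-6 above can be sketched in NumPy. This is one illustrative way to carry them out (solving (P - I)q = 0 via the null space from an SVD, then scaling); the two-state "weather" matrix is hypothetical, and in MATH 115 the system would be solved by row reduction instead.

```python
import numpy as np

# Hypothetical model: state 0 = "sunny", state 1 = "rainy"; P is regular.
P = np.array([[0.9, 0.2],
              [0.1, 0.8]])

# Step 4: solve the homogeneous system (P - I)q = 0.
A = P - np.eye(2)
# The null space of A is spanned by the last right-singular vector
# (the one for the zero singular value).
_, _, Vt = np.linalg.svd(A)
q = Vt[-1]

# Step 5: scale the free solution so that q is a probability vector.
q = q / q.sum()

print(q)                      # approximately [2/3, 1/3]
print(np.allclose(P @ q, q))  # Step 6: q is indeed the steady-state vector
```

Step 7 would then read off the entries: in the long run the weather is sunny about 2/3 of the time and rainy about 1/3 of the time, regardless of today's weather.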