# Transition Matrices of Markov Chains


Suppose that if it is sunny today, there is a 60% chance it will be sunny tomorrow, and that if it is not sunny today, there is a 20% chance it will be sunny tomorrow. Use the four transition probabilities sunny → sunny, sunny → not sunny, not sunny → sunny, and not sunny → not sunny to form the transition matrix K.

```
K = | .6  .2 |
    | .4  .8 |
```

If we assume today's sunniness depends only on yesterday's sunniness (and not on previous days), then this system is an example of a Markov chain, an important type of stochastic process. Powers of the transition matrix can be used to compute the long-term probability of the system being in either of the two states. As the power grows, the entries in the first row will all approach the long-term probability that the system is in the first state (sunny).
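As a minimal sketch of this computation (assuming NumPy is available), we can raise the transition matrix to the fifth power to get the five-day transition probabilities:

```python
import numpy as np

# Transition matrix from the text: columns are "today" states
# (sunny, not sunny); rows are "tomorrow" states.
K = np.array([[0.6, 0.2],
              [0.4, 0.8]])

# K^n gives the n-step transition probabilities.
K5 = np.linalg.matrix_power(K, 5)
print(K5)
# → [[0.34016 0.32992]
#    [0.65984 0.67008]]
```

Both entries in the first row are already close to 1/3, illustrating how the columns of Kⁿ converge to the same vector as n grows.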

If it is sunny today, there is about a 1/3 chance of sun in five days. If it is not sunny today, there is also about a 1/3 chance of sun in five days. Thus, today's weather doesn't matter for the long-term prediction!
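The exact long-term probability can be found without taking large powers: the steady state is the eigenvector of K for eigenvalue 1, normalized to sum to 1. A sketch of that check, again assuming NumPy:

```python
import numpy as np

K = np.array([[0.6, 0.2],
              [0.4, 0.8]])

# The steady-state distribution satisfies K v = v, i.e. it is the
# eigenvector for eigenvalue 1, normalized so its entries sum to 1.
vals, vecs = np.linalg.eig(K)
i = np.argmin(np.abs(vals - 1.0))
steady = vecs[:, i] / vecs[:, i].sum()
print(steady)  # → approximately [1/3, 2/3]
```

So in the long run the system is sunny 1/3 of the time and not sunny 2/3 of the time, regardless of the starting state.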