Markov Chains

A Markov chain, named for Andrey Markov (1856–1922), is a stochastic process that describes transitions over time from one state to another and has the property that the future is independent of the past, given the present.
May 13, 2017—Christopher DeFiglia

Discrete-Time Markov Chain (DTMC)

We begin with a set of states for our system.
We define our state space to be 1, 2, 3:
In[]:=
dtmc = DiscreteMarkovProcess[1, {{0.2, 0.3, 0.5}, {0.7, 0.2, 0.1}, {0.3, 0.3, 0.4}}];
s = {1, 2, 3}
Out[]=
{1,2,3}
Let’s take a look at the transition probability matrix for these states.
Make a matrix of probabilities where each row sums to 1:
In[]:=
p = {{0.2, 0.3, 0.5}, {0.7, 0.2, 0.1}, {0.3, 0.3, 0.4}}; MatrixForm[p]
Out[]=
0.2  0.3  0.5
0.7  0.2  0.1
0.3  0.3  0.4
Each entry p_(i,j) is the probability of entering state j from state i.
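Outside of Mathematica, the transition matrix can be sketched in plain Python (a hypothetical illustration, not part of the notebook), including a check that each row is a valid probability distribution:

```python
# Transition probability matrix for the three-state chain above.
# Row i lists the probabilities of moving from state i to each state,
# so every row must sum to 1.
p = [
    [0.2, 0.3, 0.5],  # from state 1
    [0.7, 0.2, 0.1],  # from state 2
    [0.3, 0.3, 0.4],  # from state 3
]

for row in p:
    # Allow for floating-point rounding when checking the row sums.
    assert abs(sum(row) - 1.0) < 1e-12, "each row must sum to 1"
```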
For this particular DTMC, notice that you can enter any state from any other state in a single step, since every entry of p is positive.
Compute the probability of entering state 3 from state 2, which we do by picking out p_(2,3):
In[]:=
p[[2,3]]
Out[]=
0.1
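The same lookup can be sketched in Python; note that Python lists are 0-indexed, so the Mathematica entry p[[2, 3]] corresponds to p[1][2]:

```python
# Transition matrix from the notebook, as nested Python lists.
p = [[0.2, 0.3, 0.5],
     [0.7, 0.2, 0.1],
     [0.3, 0.3, 0.4]]

# Mathematica's 1-based p[[2, 3]] is Python's 0-based p[1][2].
prob_2_to_3 = p[1][2]
print(prob_2_to_3)  # 0.1
```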
Now let’s simulate this system for 20 time steps.
We then make a plot of how the system evolved over time:
In[]:=
data1=RandomFunction[dtmc,{0,20}]
Out[]=
TemporalData[Time: 0 to 20, Data points: 21, Paths: 1]

In[]:=
ListLinePlot[data1, Filling -> Axis, Ticks -> {Automatic, {1, 2, 3}}, InterpolationOrder -> 0]
Out[]=
(step plot of the simulated states over the 20 time steps)

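For readers outside Mathematica, the same simulation can be sketched step by step in Python using random.choices to sample each transition (an illustrative sketch, not the notebook's code):

```python
import random

# Transition probabilities for the chain above, keyed by current state.
p = {1: [0.2, 0.3, 0.5],
     2: [0.7, 0.2, 0.1],
     3: [0.3, 0.3, 0.4]}
states = [1, 2, 3]

random.seed(0)  # fixed seed for reproducibility; any seed works

# Start in state 1 and take 20 steps, sampling each next state
# from the row of p corresponding to the current state.
path = [1]
for _ in range(20):
    current = path[-1]
    path.append(random.choices(states, weights=p[current])[0])

print(path)  # 21 states: the initial state plus 20 transitions
```

Plotting this path with a step (zero-order) interpolation reproduces the kind of trajectory shown above.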
Continuous-Time Markov Chain (CTMC)

Applications
