A Markov chain X0, X1, X2, … has the transition probability matrix
and initial distribution p0 = 0.5 and p1 = 0.5. Determine the probabilities Pr{X2 = 0} and Pr{X3 = 0}.
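The transition matrix itself is not reproduced above, but the computation pattern is standard: the distribution after n steps is the row vector p0 multiplied by the n-th power of the transition matrix, so Pr{Xn = 0} is the first entry of p0 · Pⁿ. The sketch below assumes a hypothetical two-state matrix P purely as a placeholder; substitute the matrix from the original problem to get the actual answers.

```python
import numpy as np

# Hypothetical placeholder matrix -- NOT the one from the original problem,
# which was lost in extraction. Replace with the actual P.
P = np.array([[0.7, 0.3],
              [0.6, 0.4]])

# Initial distribution: Pr{X0 = 0} = 0.5, Pr{X0 = 1} = 0.5.
p0 = np.array([0.5, 0.5])

# Distribution after n steps: p_n = p0 @ P^n.
p2 = p0 @ np.linalg.matrix_power(P, 2)
p3 = p0 @ np.linalg.matrix_power(P, 3)

print("Pr{X2 = 0} =", p2[0])
print("Pr{X3 = 0} =", p3[0])
```

For the placeholder matrix above this gives Pr{X2 = 0} = 0.665 and Pr{X3 = 0} = 0.6665; with the problem's own matrix the same two lines of algebra apply.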