How do you see that a Markov chain is irreducible?

I have some trouble understanding the Markov chain property of irreducibility.

A chain is said to be irreducible if the stochastic process can “go from any state to any state”.

But what determines whether the chain can go from state $i$ to state $j$, or cannot?


The Wikipedia page gives the formalisation:

State $j$ is accessible from state $i$ (written $i \to j$) if there exists an integer $n_{ij} > 0$ such that

$$P(X_{n_{ij}} = j \mid X_0 = i) = p_{ij}^{(n_{ij})} > 0.$$

States $i$ and $j$ then communicate if $i \to j$ and $j \to i$.

From these, irreducibility is supposed to follow somehow, but I do not see exactly how.
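
In matrix terms, the accessibility condition says that $j$ is accessible from $i$ exactly when some power of the transition matrix $P$ has a positive $(i, j)$ entry, and for a chain with $k$ states it is enough to check the powers $P^1, \dots, P^k$. A minimal NumPy sketch of that check (the function `accessible` is only an illustration of the definition, not a library routine):

```python
import numpy as np

def accessible(P, i, j):
    """Return True if state j is accessible from state i, i.e. if
    p_ij^(n) > 0 for some n >= 1. For a chain with k states it is
    enough to check n = 1, ..., k, since any positive-probability
    path can be shortened to one without repeated states."""
    k = P.shape[0]
    Pn = np.eye(k)
    for _ in range(k):
        Pn = Pn @ P            # Pn now holds the n-step matrix P^n
        if Pn[i, j] > 0:       # positive n-step transition probability
            return True
    return False
```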

Answer

Here are three example transition matrices: the first two illustrate the reducible case, the last the irreducible one.

$$P_1 = \begin{pmatrix} 0.5 & 0.5 & 0 & 0 \\ 0.9 & 0.1 & 0 & 0 \\ 0 & 0 & 0.2 & 0.8 \\ 0 & 0 & 0.7 & 0.3 \end{pmatrix} \qquad P_2 = \begin{pmatrix} 0.1 & 0.1 & 0.4 & 0.4 \\ 0.5 & 0.1 & 0.1 & 0.3 \\ 0.2 & 0.4 & 0.2 & 0.2 \\ 0 & 0 & 0 & 1 \end{pmatrix}$$
For $P_1$, when you are in state 3 or 4, you will stay in the set $\{3, 4\}$, and the same holds for states 1 and 2: there is no way to get from state 1 to state 3 or 4, for example.

For $P_2$, you can reach any state from states 1 through 3, but once you are in state 4, you will stay there (state 4 is absorbing).
$$P_3 = \begin{pmatrix} 0.5 & 0.5 & 0 & 0 & 0 & 0 \\ 0.9 & 0 & 0 & 0 & 0 & 0.1 \\ 0 & 0 & 0 & 0.8 & 0 & 0.2 \\ 0.7 & 0 & 0.1 & 0 & 0.2 & 0 \\ 0 & 0 & 0 & 0.1 & 0.9 & 0 \\ 0.9 & 0 & 0 & 0 & 0.1 & 0 \end{pmatrix}$$
For this example, you may start in any state and can still reach any other state, although not necessarily in one step.
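
To check this mechanically, one can view $P$ as a directed graph with an edge $i \to j$ whenever $p_{ij} > 0$: the chain is irreducible exactly when every state can reach every other state, which holds if and only if all entries of $(I + A)^{k-1}$ are positive, where $A$ is the 0/1 pattern of $P$ and $k$ is the number of states. Here is a minimal NumPy sketch of that test (the helper `is_irreducible` is my own illustration, not a library function):

```python
import numpy as np

def is_irreducible(P):
    """A Markov chain is irreducible iff every state is accessible from
    every other state. Graph test: with A the 0/1 adjacency pattern of P,
    (I + A)^(k-1) has all entries positive iff every state reaches every
    other state in at most k-1 steps."""
    k = P.shape[0]
    A = (P > 0).astype(int)                       # edge i -> j iff p_ij > 0
    R = np.linalg.matrix_power(np.eye(k, dtype=int) + A, k - 1)
    return bool((R > 0).all())                    # all states mutually reachable

P1 = np.array([[0.5, 0.5, 0,   0  ],
               [0.9, 0.1, 0,   0  ],
               [0,   0,   0.2, 0.8],
               [0,   0,   0.7, 0.3]])
P2 = np.array([[0.1, 0.1, 0.4, 0.4],
               [0.5, 0.1, 0.1, 0.3],
               [0.2, 0.4, 0.2, 0.2],
               [0,   0,   0,   1  ]])
P3 = np.array([[0.5, 0.5, 0,   0,   0,   0  ],
               [0.9, 0,   0,   0,   0,   0.1],
               [0,   0,   0,   0.8, 0,   0.2],
               [0.7, 0,   0.1, 0,   0.2, 0  ],
               [0,   0,   0,   0.1, 0.9, 0  ],
               [0.9, 0,   0,   0,   0.1, 0  ]])

for name, P in [("P1", P1), ("P2", P2), ("P3", P3)]:
    print(name, "irreducible:", is_irreducible(P))
# Prints: P1 False, P2 False, P3 True
```

The test agrees with the discussion above: $P_1$ and $P_2$ are reducible, while $P_3$ is irreducible.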

Attribution
Source: Link, Question Author: mavavilj, Answer Author: Christoph Hanck
