# How do you see that a Markov chain is irreducible?

I have some trouble understanding the Markov chain property of irreducibility.

Irreducible is said to mean that the stochastic process can “go from any state to any state”.

But what defines whether it can go from state $i$ to state $j$, or cannot go?

State $j$ is accessible from state $i$ (written $i\rightarrow j$) if there exists an integer $n_{ij}>0$ such that

$$\Pr(X_{n_{ij}} = j \mid X_0 = i) = \left(P^{n_{ij}}\right)_{ij} > 0,$$

and states $i$ and $j$ communicate (written $i \leftrightarrow j$) if both $i\rightarrow j$ and $j \rightarrow i$.

From these, irreducibility follows: a Markov chain is irreducible if every state communicates with every other state, i.e. the whole state space forms a single communicating class.
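These definitions can be checked mechanically. Below is a minimal sketch (my own illustration, not from the question) that tests accessibility by a graph search over the edges with positive one-step probability, assuming `P` is a NumPy row-stochastic transition matrix:

```python
import numpy as np

def accessible(P, i, j):
    """True if state j is accessible from state i, i.e. (P^n)[i, j] > 0
    for some n > 0. Equivalent to a path of positive-probability edges."""
    n = P.shape[0]
    seen = set()
    stack = [i]
    while stack:
        k = stack.pop()
        for m in range(n):
            if P[k, m] > 0 and m not in seen:
                seen.add(m)
                stack.append(m)
    return j in seen

def irreducible(P):
    """A chain is irreducible iff every pair of states communicates."""
    n = P.shape[0]
    return all(accessible(P, i, j) for i in range(n) for j in range(n))
```

The graph search is preferable to computing matrix powers directly, since only the pattern of zero/nonzero entries matters for accessibility, not the actual probabilities.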

For $P_1$, once you are in state 3 or 4 you stay within those two states, and the same holds for states 1 and 2. There is no way to get from state 1 to state 3 or 4, for example, so the chain is not irreducible.
For $P_2$, you can reach any state from states 1 through 3, but once you are in state 4, you stay there forever, so $P_2$ is not irreducible either.
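Since the actual matrices $P_1$ and $P_2$ are not shown, here are hypothetical matrices consistent with the descriptions above, used to illustrate how the zero entries of $P^n$ reveal inaccessibility:

```python
import numpy as np

# Hypothetical stand-ins for P_1 and P_2 (the question's matrices are
# not reproduced here; these merely match the described behaviour).
P1 = np.array([[0.5, 0.5, 0.0, 0.0],   # states {1, 2} form a closed class
               [0.5, 0.5, 0.0, 0.0],
               [0.0, 0.0, 0.5, 0.5],   # states {3, 4} form a closed class
               [0.0, 0.0, 0.5, 0.5]])

P2 = np.array([[0.25, 0.25, 0.25, 0.25],  # from states 1-3 every state
               [0.25, 0.25, 0.25, 0.25],  # is reachable in one step
               [0.25, 0.25, 0.25, 0.25],
               [0.0,  0.0,  0.0,  1.0 ]]) # state 4 is absorbing

# (P^n)[i, j] is the probability of moving from state i+1 to state j+1
# in exactly n steps; if it is zero for all n, state j+1 is inaccessible.
for n in (1, 5, 20):
    print(np.linalg.matrix_power(P1, n)[0, 2])  # stays 0: 3 unreachable from 1
    print(np.linalg.matrix_power(P2, n)[3, 0])  # stays 0: cannot leave state 4
```

For $P_1$ the block-diagonal zeros persist in every power, and for $P_2$ the last row remains $(0, 0, 0, 1)$, which is exactly what "once you are in state 4, you stay there" means.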