- (Markov chains:)
- For any two states A and Z, the set of times n at which it is impossible to reach Z starting from A is finite (bounded).
This is best explained by an example. Define a Markov chain with two states, A and B. If the probability of reaching B in one step from A is 1, and the probability of reaching A in one step from B is 1, then the sequence of states will always look like

`ABABABABAB...`; such a Markov chain is not acyclic -- starting from A, you can only reach B at odd times, never at even ones. In fact, the above Markov chain has a unique stable distribution, P(A) = P(B) = 0.5, but no other initial distribution converges to it: for instance, if at time n P(A) = 0.2 and P(B) = 0.8, then at time n+1 P(A) = 0.8 and P(B) = 0.2; the two distributions alternate forever, and neither converges to P(A) = P(B) = 0.5.
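The oscillation above can be checked directly. This is a minimal sketch of the two-state chain from the example; the starting distribution P(A) = 0.2, P(B) = 0.8 is the one used in the text, and the number of steps printed is an arbitrary choice.

```python
def step(dist):
    """One step of the chain: A goes to B and B goes to A with
    probability 1, so the new P(A) is the old P(B) and vice versa."""
    p_a, p_b = dist
    return (p_b, p_a)

dist = (0.2, 0.8)  # P(A) = 0.2, P(B) = 0.8 at time n
for t in range(4):
    print(f"time {t}: P(A)={dist[0]}, P(B)={dist[1]}")
    dist = step(dist)

# The distribution flips between (0.2, 0.8) and (0.8, 0.2) forever,
# never approaching the stable distribution (0.5, 0.5) -- except when
# it starts there: step((0.5, 0.5)) == (0.5, 0.5).
```

Note that (0.5, 0.5) is a fixed point of `step`, which is exactly what makes it the stable distribution even though nothing else converges to it.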

# acyclic (idea)


Containing no cycles.