Containing no cycles.
(Markov chains:)
A Markov chain is acyclic if, for any two states A and Z, the set of times n at which it is impossible to reach Z starting from A is finite (bounded).

This is best explained by an example. Define a Markov chain with two states, A and B. If the probability of reaching B in one step from A is 1, and the probability of reaching A in one step from B is 1, then the sequence of states will always look like ABABABABAB...; such a Markov chain is not acyclic -- starting from A, you can never be at B at an even time, and there are infinitely many such times.

In fact, the above Markov chain has a unique stationary distribution, P(A)=P(B)=0.5, but no other initial distribution converges to it: if at time n P(A)=0.2 and P(B)=0.8, then at time n+1 P(A)=0.8 and P(B)=0.2. The two distributions alternate forever, but never converge to P(A)=P(B)=0.5.
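A minimal sketch of this oscillation, in Python with NumPy (an illustration added here, not part of the original writeup; the state encoding 0 = A, 1 = B is my own choice):

```python
import numpy as np

# Transition matrix for the two-state chain described above:
# from A you always move to B, and from B you always move to A.
P = np.array([[0.0, 1.0],
              [1.0, 0.0]])

# Start from the distribution P(A)=0.2, P(B)=0.8 and iterate.
dist = np.array([0.2, 0.8])
for n in range(6):
    print(f"time {n}: P(A)={dist[0]:.1f}, P(B)={dist[1]:.1f}")
    dist = dist @ P  # advance the distribution one step

# The stationary distribution (0.5, 0.5) satisfies pi @ P == pi,
# but the iterates above flip between (0.2, 0.8) and (0.8, 0.2)
# forever instead of converging to it.
```

Running this prints the distribution flipping back and forth at every step, which is exactly the failure to converge described above: only the initial distribution (0.5, 0.5) is fixed under this chain.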

A·cyc′lic, a. [Pref. a- not + cyclic.]

Not cyclic; not disposed in cycles or whorls; as:

(a) (Bot.)

Of a flower, having its parts inserted spirally on the receptacle.

(b) (Org. Chem.)

Having an open-chain structure; aliphatic.


© Webster 1913
