I've looked at raincomplex's homenode picture a million times, and thought, eh, yeah, a state transition graph on his whiteboard. Big deal. Until today.

Today I was bored and looked at it again, and thought it might be fun to see what kind of string of states I'd get if I walked through the graph a number of times. And then, because I was REALLY bored, I wrote out the state transition matrix and wondered what it would look like if I multiplied it by itself a bunch of times. Here are the results.

The diagram on his homenode gives a set of transition rules for a six-sided die, as if the faces were labeled 0,1,2,3,4,5 instead of the usual 1,2,3,4,5,6. The die face is the state; the relabeling changes nothing, and the results are exactly the same. The transition rules are as follows (a short code sketch after the list re-encodes them):

  1. When the die face is 0, the next possible die face is either 0 or 3, with equal likelihood of either appearing.
    (Note: in his notation, you're rolling another imaginary die, and if that second imaginary die shows 0, 2, or 4, then your first die face remains 0, and if the second one shows 1, 3, or 5, then the first die face changes to a 3. Obviously, with a fair die, either outcome is equiprobable.)
    Let's make up some notation and write it this way: P{0|0} = 1/2, P{3|0} = 1/2,
    meaning, in words, "The probability of the next state being 3, given that the present state is 0, is 1/2."
    So the general notation is P{next state | present state} = (some number).
  2. When the die face is 1, the next possible die face is either 0 or 3, with equal likelihood of either appearing.
    P{0|1} = 0.5
    P{3|1} = 0.5
  3. When the die face is 2, the next possible die face is either 1 or 4, with equal likelihood of either appearing.
    P{1|2} = 0.5
    P{4|2} = 0.5
  4. When the die face is 3, the next possible die face is either 1 or 4, with equal likelihood of either appearing.
    P{1|3} = 0.5
    P{4|3} = 0.5
  5. When the die face is 4, the next possible die face is either 2 or 5, with equal likelihood of either appearing.
    P{2|4} = 0.5
    P{5|4} = 0.5
  6. When the die face is 5, the next possible die face is either 2 or 5, with equal likelihood of either appearing.
    P{2|5} = 0.5
    P{5|5} = 0.5
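
Here's that promised sketch: a minimal Python encoding of rules 1 through 6. The dictionary name NEXT_STATES and the step function are my own invention, not raincomplex's; each state just maps to its two equally likely successors.

  import random

  # Rules 1-6 above: each state maps to its two equally likely successors.
  NEXT_STATES = {
      0: (0, 3),
      1: (0, 3),
      2: (1, 4),
      3: (1, 4),
      4: (2, 5),
      5: (2, 5),
  }

  def step(state):
      """Roll the imaginary second die: either successor with probability 1/2."""
      return random.choice(NEXT_STATES[state])

Call step(0) a handful of times and you'll get back 0s and 3s in roughly equal measure, just as rule 1 promises.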

You can rewrite this as a probability transition matrix, T, where the element in the ith row and the jth column is the probability of transition from state i to state j, with i and j each running over {0,1,2,3,4,5}. (Note that the matrix starts with a zeroth row and a zeroth column, as in C or Python; Matlab, by contrast, starts counting at one.)

 1/2   0    0   1/2   0    0
 1/2   0    0   1/2   0    0
  0   1/2   0    0   1/2   0
  0   1/2   0    0   1/2   0
  0    0   1/2   0    0   1/2 
  0    0   1/2   0    0   1/2 
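
Before multiplying anything, a sanity check is in order. Here's a sketch (in Python with NumPy, my choice of tools, not anything from the original): build T and confirm that every row sums to 1, as every probability transition matrix must.

  import numpy as np

  # T[i, j] = P{next state j | present state i}, rows and columns indexed from 0.
  T = np.array([
      [0.5, 0,   0,   0.5, 0,   0  ],
      [0.5, 0,   0,   0.5, 0,   0  ],
      [0,   0.5, 0,   0,   0.5, 0  ],
      [0,   0.5, 0,   0,   0.5, 0  ],
      [0,   0,   0.5, 0,   0,   0.5],
      [0,   0,   0.5, 0,   0,   0.5],
  ])

  # Every row of a probability transition matrix must sum to 1.
  assert np.allclose(T.sum(axis=1), 1.0)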

Multiplying the probability transition matrix T by itself is squaring it, and the element in row i, column j of T^2 is the probability of getting from state i to state j in exactly two throws. So T^2 is:

 1/4  1/4   0   1/4  1/4   0
 1/4  1/4   0   1/4  1/4   0
 1/4   0   1/4  1/4   0   1/4
 1/4   0   1/4  1/4   0   1/4
  0   1/4  1/4   0   1/4  1/4
  0   1/4  1/4   0   1/4  1/4

Let's keep going. We're suckers for deducing patterns, aren't we? T^3 is:

 1/4  1/8  1/8  1/4  1/8  1/8
 1/4  1/8  1/8  1/4  1/8  1/8
 1/8  1/4  1/8  1/8  1/4  1/8
 1/8  1/4  1/8  1/8  1/4  1/8
 1/8  1/8  1/4  1/8  1/8  1/4
 1/8  1/8  1/4  1/8  1/8  1/4
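
If you'd rather not multiply matrices by hand, a few lines of NumPy will reproduce T^2 and T^3, and the fractions module makes the 1/4s and 1/8s recognizable. (A sketch, assuming the same row-to-column convention as above.)

  import numpy as np
  from fractions import Fraction

  # Rebuild T from the successor rules: 0,1 -> {0,3}; 2,3 -> {1,4}; 4,5 -> {2,5}.
  T = np.zeros((6, 6))
  for i, (a, b) in enumerate([(0, 3), (0, 3), (1, 4), (1, 4), (2, 5), (2, 5)]):
      T[i, a] = T[i, b] = 0.5

  for n in (2, 3):
      print(f"T^{n}:")
      for row in np.linalg.matrix_power(T, n):
          print("  ", [str(Fraction(x)) for x in row])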

It turns out that if you keep going, and let N go to infinity, then T^N approaches:

 1/6  1/6  1/6  1/6  1/6  1/6
 1/6  1/6  1/6  1/6  1/6  1/6
 1/6  1/6  1/6  1/6  1/6  1/6
 1/6  1/6  1/6  1/6  1/6  1/6
 1/6  1/6  1/6  1/6  1/6  1/6
 1/6  1/6  1/6  1/6  1/6  1/6

Pretty cool, huh? This says that every state is equally probable in the long run.
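
You can watch that convergence happen numerically. This sketch (same T as above) raises T to the 50th power; by my count the second-largest eigenvalue of T has magnitude 1/2, and (1/2)^50 is so tiny that every row has already flattened to the uniform 1/6 distribution.

  import numpy as np

  T = np.zeros((6, 6))
  for i, (a, b) in enumerate([(0, 3), (0, 3), (1, 4), (1, 4), (2, 5), (2, 5)]):
      T[i, a] = T[i, b] = 0.5

  # After 50 steps, every row has flattened out to the uniform distribution.
  T50 = np.linalg.matrix_power(T, 50)
  print(T50.round(6))                    # every entry prints as 0.166667
  assert np.allclose(T50, 1.0 / 6.0)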

It is NOT equiprobable in the short run! If you know that you are in one state, then the rules of transition still apply: if you are in state 1, for example, you KNOW that your next state is going to be either 0 or 3, and no others! But if you run this pseudorandom sequence over a long enough time, the probability of your being in any given state settles down to the same value, 1/6.

If you rolled this crazy die, here's what a sequence of throws would look like:

 0 3 4 2 4 5 5 5 2 1 3 4 5 2 1 0 0 3 1 0 3 4 2 4 5 2 1
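
Here's one way to roll it yourself (a sketch; the exact run above is the author's, but any implementation of the rules will produce sequences like it, and the helper names here are mine). It also tallies faces over a long run, so you can watch the frequencies creep toward 1/6.

  import random
  from collections import Counter

  NEXT_STATES = {0: (0, 3), 1: (0, 3), 2: (1, 4), 3: (1, 4), 4: (2, 5), 5: (2, 5)}

  def roll_sequence(n, state=0):
      """Walk the chain for n throws, starting from the given state."""
      sequence = []
      for _ in range(n):
          state = random.choice(NEXT_STATES[state])
          sequence.append(state)
      return sequence

  print(roll_sequence(27))               # a short run like the one above

  # Long run: each face shows up about 1/6 of the time, per the T^N limit.
  counts = Counter(roll_sequence(600_000))
  for face in range(6):
      print(face, counts[face] / 600_000)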

I'm not sure what use user raincomplex has for this little rule-based algorithm, but perhaps he can post that himself.

Probability transition matrices are used in coding and information theory, the theory of random processes, Markov chains (and more generally, Markov processes, like the ones our favorite bot, E2D2, uses to generate something close to natural language comments in the catbox), mathematical economics, queueing theory, and even the design of communications protocols. You don't want to be stuck in a certain state for which there is no way out, or a loop out of which you can never transition.
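
That last worry, getting stuck, has a quick mechanical check. A chain with no traps is called irreducible: every state can reach every other. For our six-state T, a sketch like this confirms it; the (i, j) entry of the summed powers is positive exactly when state j is reachable from state i within 6 steps.

  import numpy as np

  T = np.zeros((6, 6))
  for i, (a, b) in enumerate([(0, 3), (0, 3), (1, 4), (1, 4), (2, 5), (2, 5)]):
      T[i, a] = T[i, b] = 0.5

  # Positive (i, j) entry <=> state j reachable from state i in 1 to 6 steps.
  reach = sum(np.linalg.matrix_power(T, n) for n in range(1, 7))
  print(np.all(reach > 0))               # True: no state ever traps you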

My friend Steve Heppe created a probability transition matrix for all the squares of Monopoly, plus the cards you'd get that would instantly shuffle you off to jail or Go, or the nearest railroad, and he determined that, long term, some squares were more likely than others, so you ought to buy those and put up hotels on them. Every Monopoly player is aware of the tendency to land on certain squares more than others. Steve did this when he was in high school. Smart feller. Of course, it was TJHSST, the best public school in the country.

REFERENCES

  1. Richard W. Hamming, Coding and Information Theory, Prentice-Hall, (c)1980, pp. 80-85.
  2. Norman Abramson, Information Theory and Coding, McGraw-Hill, (c)1963, pp. 95 ff.
  3. Athanasios Papoulis, Probability, Random Variables, and Stochastic Processes, McGraw-Hill, (c)1965, pp. 532 ff. The magnificent, magisterial first edition. It's probably on more bookshelves of more communications engineers and coding/information theorists than any other book. My advisor at Georgia Tech, Dr. Aubrey Bush, had this book thoroughly memorized. He could tell me what page a certain topic or theorem was on. "The Price theorem? Ah yeah, that suckah's on page 226. Ol' Bob Price was at MIT with me when he came up with that. Good man, Bob. Smart feller."
  4. Athanasios Papoulis and S. Unnikrishna Pillai, Probability, Random Variables, and Stochastic Processes, 4th ed., McGraw-Hill, (c)2002, pp. 698 ff. Some day I'm going to write a node about Papoulis. His very name strikes fear into the hearts of all first year graduate students in electrical engineering, because it was synonymous with the dreaded random processes classes we all had to take. His books are wonderful.