The logistic map is a specific (and very simple) dynamical system used to model population growth. It looks like this:

n_{t+1} = n_{t} r (1 - n_{t})

where n_{t} is the population (sorta) at time t and r is the growth rate.

See, looks just like population growth, doesn't it? Okay, maybe not. Here's the basic idea though: an unchecked population will grow at a rate proportional to the number of individuals in the population (more individuals = more breeding = more kids). This gives the equation:

n_{t+1} = n_{t} r

But, eventually resources will get low, and things won't survive so well. The rate of population growth decreases as more individuals are brought into the population. This can be modelled by making n a value between 0 and 1 (hence the above "sorta"; you can think of this as a percentage of some theoretical capacity if it helps), and changing the equation to:

n_{t+1} = n_{t} r (1 - n_{t})

Now, as n gets close to one, the (1 - n) factor gets close to zero, and the rate of population growth slows down. r is generally constrained to the interval between 0 and 4 in order to keep the equation a map from the interval [0, 1] into [0, 1]; thus, it always maps back into itself.
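One step of the map is a one-liner in code. Here's a minimal Python sketch (the name `logistic` is just my own choice, not standard notation):

```python
def logistic(n, r):
    """One step of the logistic map: n_{t+1} = n_t * r * (1 - n_t)."""
    return r * n * (1 - n)

# Why 0 <= r <= 4 keeps the map inside [0, 1]: the parabola r*n*(1-n)
# peaks at n = 0.5 with height r/4, which stays at or below 1.
print(logistic(0.5, 4.0))   # the peak at r = 4: exactly 1.0
print(logistic(0.1, 2.0))   # about .18
```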

At low values of r, this equation shows the standard bio 101 behavior. The population increases quickly at first, but gradually levels off at some carrying capacity.

However, for higher values of r, some interesting things can happen. Lemme show you some examples:

If we start with n_{0} as .1 and set our growth rate, r, equal to 2, we can solve the equation once, and we get:

n_{1} = .18

Plugging this back in, we get:

n_{2} = .295 (rounded)

We can do this over and over again to get the orbit of the system, as follows (fellow math geeks feel free to follow along on your calculators at home):

{.1, .18, .295, .416, .486, .4996, .4999, .5, .5, .5, .5, ...}

Eventually, the population size settles down to .5 and stays there; this is called a fixed point. This is the typical bio 101 behavior I was talking about. With a little experimenting, it's easy to discover that the fixed point is independent of the initial population size.
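If you'd rather not mash the equals key on a calculator, a short loop does the same job. A sketch in Python (the helper name `orbit` is my own):

```python
def orbit(n0, r, steps):
    """Return the orbit [n_0, n_1, ..., n_steps] of the logistic map."""
    ns = [n0]
    for _ in range(steps):
        ns.append(r * ns[-1] * (1 - ns[-1]))
    return ns

pts = orbit(0.1, 2.0, 20)
print([round(p, 4) for p in pts[:5]])   # the first few points of the orbit
print(round(pts[-1], 6))                # settled at the fixed point: 0.5
```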

Now let's try a value of r which is a bit higher, say 3.2, with the same initial population.

{.1, .288, .656, .721, .642, .735, .623, .752, .597, .769, .567, .785, .539, .795, ... , .799, .513, .799, .513, .799, .513, ...}

In this case, the orbit hasn't settled down to a single fixed point. It alternates between two points. This is called a period two orbit. Just as above, the final periodic orbit is independent of initial conditions.
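You can check the period-2 claim numerically by letting the transient die out first and then stepping twice more. A quick sketch (the 1000 throwaway steps are an arbitrary choice of mine):

```python
def iterate(n, r, steps):
    """Apply the logistic map `steps` times and return the final value."""
    for _ in range(steps):
        n = r * n * (1 - n)
    return n

r = 3.2
a = iterate(0.1, r, 1000)   # burn off the transient
b = r * a * (1 - a)         # the next point on the attractor
c = r * b * (1 - b)         # and the next
print(round(a, 4), round(b, 4), round(c, 4))  # a and c match: period 2
```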

Now, we're really going to have some fun. Crank that r value all the way up to 4. Ready? Let's go:

{.1, .36, .922, .289, .821, .585, .970, .113, .401, .961, .147, .503, .999, .0002, .0009, .003, .015, .062, .231, ...(massive repeated pressing of enter)..., .093, .337, .893, .379, .942, .219,...}

Well, we can stop now. The point is that this isn't settling down at all. It's jumping all over the place. In fact, no matter how many times you hit enter, you'll never get the exact same value twice. This is a completely non-periodic orbit. Furthermore, if you change your initial conditions only slightly, say from .1 to .10001, and compare your results, you will notice that very quickly they begin to look totally different. This is called sensitive dependence on initial conditions. Very similar initial conditions yield very different results.
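You can even measure how quickly two nearby orbits fly apart. A sketch (the 0.1 separation threshold is an arbitrary choice of mine):

```python
def steps_to_diverge(a, b, r, tol=0.1, max_steps=200):
    """Run two orbits side by side; count steps until they differ by tol."""
    for t in range(max_steps):
        if abs(a - b) > tol:
            return t
        a = r * a * (1 - a)
        b = r * b * (1 - b)
    return max_steps

# Starting a mere .00001 apart at r = 4:
print(steps_to_diverge(0.1, 0.10001, 4.0))
```

At r = 4 the gap between the orbits roughly doubles every step, so a head start of one part in a hundred thousand only buys a couple dozen steps of agreement.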

**Welcome to Chaos.**

This system, depending on the r value, can have periodic orbits of any period, as well as chaos. As r is increased, it goes through a period-doubling cascade. First there is a fixed point (period 1), then a period 2 orbit, then period 4, then 8, then 16, and so on. At some point, these bifurcations break down into semi-chaotic noise (true chaos doesn't occur until r = 4, I think). However, within this noise there are small windows of periodic behavior, including odd periods. Within these windows of periodic behavior are windows of further semi-chaotic behavior. And on and on. This can be drawn schematically, and it has a fractal structure. It ain't as pretty as the Mandelbrot set, but it's still a fractal. This one equation has a lot of fascinating properties, and you can learn a lot about chaos from it. Unfortunately, it's hard to explain a lot of it without the ability to draw pictures.
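One thing you can still do without pictures is detect the period numerically at each r. A rough sketch (the function, its transient length, and its tolerance are all my own arbitrary choices, not a standard algorithm):

```python
def attractor_period(r, n0=0.1, transient=2000, max_period=64, tol=1e-6):
    """Iterate past the transient, then find the smallest p such that the
    orbit repeats after p steps (within tol). None means no short period
    was found: probably chaos (or a window with period > max_period)."""
    n = n0
    for _ in range(transient):
        n = r * n * (1 - n)
    pts = [n]
    for _ in range(max_period):
        pts.append(r * pts[-1] * (1 - pts[-1]))
    for p in range(1, max_period + 1):
        if abs(pts[p] - pts[0]) < tol:
            return p
    return None

for r in (2.0, 3.2, 3.5):
    print(r, attractor_period(r))   # 1, then 2, then 4: the doubling cascade
```

Sweeping r finely from 0 to 4 with this and plotting the attractor points is exactly how the famous bifurcation diagram is drawn.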