A dynamical system is a system which consists of a set of possible states and a rule which determines future states from past states.

Dynamical systems are a good way to use math to model things from the real world. A ball flying through the air makes a nice example. Its future state (where it's going and what it will be doing when it gets there) depends on its current state (the direction it's headed, how fast, etc.). This is the foundation of the idea that if you know how hard somebody threw the ball a minute ago, or how it was moving at any point after that, you can use a set of rules (either Newtonian physics or your brain's circuitry) to figure out where it will end up, so that you can catch it.
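The ball example can be sketched in code as a "rule that determines future states from past states": a minimal Euler-step simulation under Newtonian gravity. The time step, initial velocity, and the decision to ignore air resistance are all illustrative assumptions, not anything from the text.

```python
# A minimal sketch of a dynamical system: a thrown ball under gravity.
# State is (x, y, vx, vy); the rule maps the state at time t to t + dt.
# dt and the initial throw are arbitrary illustrative choices.

G = 9.81  # gravitational acceleration in m/s^2

def step(state, dt):
    """Advance the ball's state by one time increment (Euler method)."""
    x, y, vx, vy = state
    return (x + vx * dt,   # horizontal position
            y + vy * dt,   # vertical position
            vx,            # horizontal velocity is constant (no drag)
            vy - G * dt)   # gravity pulls the vertical velocity down

state = (0.0, 0.0, 10.0, 10.0)  # thrown from the origin at roughly 45 degrees
for _ in range(100):            # simulate 1 second in 0.01 s increments
    state = step(state, 0.01)
```

Applying the same rule over and over is all the "dynamics" there is: each state is computed entirely from the one before it.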

Mathematically speaking, dynamical systems usually take the form of discrete time maps or differential equations.

A discrete time map is a dynamical system that works in increments: it takes the conditions at some time t and gives the conditions at a later time, t + a. A good example of such a map is the logistic map, which is a population growth model:

n_{t+1} = r n_t (1 - n_t)
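The logistic map above is easy to iterate directly. This is a minimal sketch; the values r = 3.2 and n_0 = 0.5 are illustrative choices, not from the text.

```python
# Iterating the logistic map n_{t+1} = r * n_t * (1 - n_t).
# n is the population as a fraction in [0, 1]; r controls the growth rate.

def logistic_map(n, r):
    """One step of the logistic map: state at t -> state at t + 1."""
    return r * n * (1 - n)

def iterate(n0, r, steps):
    """Return the trajectory n_0, n_1, ..., n_steps."""
    trajectory = [n0]
    for _ in range(steps):
        trajectory.append(logistic_map(trajectory[-1], r))
    return trajectory

trajectory = iterate(0.5, 3.2, 5)
```

Starting from n_0 = 0.5 with r = 3.2, the first step gives 3.2 × 0.5 × 0.5 = 0.8, the next gives 3.2 × 0.8 × 0.2 = 0.512, and so on; each value is determined entirely by the one before it.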