Basics

Equilibrium, in the sciences, is a property attributed to a particular state of a system. It is defined by requiring that the state should not change over time.

Equivalently, in equilibrium the rates of change given by the system's equations of motion are all zero. If X is any quantity, observable at least in principle, that describes the state of the system, then

dX/dt = 0.
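As a small numerical sketch (the relaxation law and the constants here are invented purely for illustration), one can watch a quantity settle to a value at which its rate of change vanishes:

    # Toy example: a quantity X relaxing towards a fixed value X_eq.
    # The law dX/dt = -k*(X - X_eq) and the numbers are assumptions for illustration.
    k, X_eq = 0.5, 10.0          # made-up rate constant and equilibrium value
    X, dt = 0.0, 0.01            # initial state and time step

    for step in range(10_000):
        dXdt = -k * (X - X_eq)   # rate of change of the state
        X += dXdt * dt           # simple Euler update

    print(X, -k * (X - X_eq))    # X is ~10.0 and dX/dt is ~0: an equilibrium state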

This is elementary enough. But we know that in the real world nothing actually exists in equilibrium, since the Universe as a whole is evolving and nothing is completely isolated from the rest of the world. So, how can the concept of equilibrium be meaningfully defined, and why is it useful?

Definitions and types of equilibrium

Static vs. dynamic

Static equilibrium is when the system stays the same because nothing is happening at all. If we had a perfect crystal of diamond at absolute zero in zero gravity, it would be in static equilibrium. In classical or Newtonian mechanics, where objects are modelled as rigid bodies made of a homogeneous and inert medium, static equilibrium means that the total force acting on a body (which determines its acceleration through Newton's Second Law) is zero, and likewise the total torque, so the body neither accelerates nor begins to rotate. (This is also known as mechanical equilibrium.)

In real life, everything has a nonzero temperature and nothing is completely rigid. Even a star made of a perfect octahedral diamond crystal would, over time, become spherical as its atoms relocated to minimize their potential energy. Nevertheless, static equilibrium may be a good approximation if we want to answer a question for which we expect thermal excitations and atomic rearrangements to be negligible. For example, the question "If I put a plate on this table tonight, where will it be in the morning?".
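Here is a minimal sketch of that force-balance condition for the plate on the table; the mass is made up, and the example only checks that the two forces cancel:

    import numpy as np

    # A plate resting on a table: gravity pulls down, the normal force from the
    # table pushes up. The mass is an arbitrary choice for illustration.
    m, g = 0.4, 9.81                       # mass in kg, gravitational acceleration in m/s^2
    gravity = np.array([0.0, -m * g])      # weight of the plate
    normal  = np.array([0.0,  m * g])      # reaction force from the table

    net_force = gravity + normal
    print(net_force, np.allclose(net_force, 0.0))   # [0. 0.] True: the plate stays put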

Dynamic equilibrium is a situation where some elements of the system are moving, but the state of the system as a whole is not changing. For example, consider a frictionless, perfectly circular flywheel spinning in a perfect vacuum. All parts of it are moving, but its state at any future time will be exactly the same as its state now. Or consider a section of pipe carrying a constant (non-turbulent) flow of fluid. Related, but not really equilibrium, is the situation in which a system goes through cycles, returning to the same point in each cycle.

Again, exact dynamic equilibrium does not happen in reality, but it may be a good approximation depending on the questions we want to ask about the system. For example, much of the interior of the Sun can be thought of as being in dynamic equilibrium with a constant outward flow of energy, although the Sun does, and will, evolve over billions of years. On a more mundane level, a person is likely to be in more or less the same state every week due to regular patterns of eating and excreting, so on average, over short periods, we can treat people approximately as dynamic equilibrium systems.
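The pipe example can be caricatured in a few lines; the flow rates and tank size below are invented, and the point is only that matter keeps passing through while the state (the volume held) stays constant:

    # Dynamic equilibrium: fluid flows through the system, but the state is constant.
    # A tank with equal inflow and outflow (rates and volume invented for illustration).
    inflow, outflow = 2.0, 2.0     # litres per second
    volume, dt = 50.0, 0.1         # current volume in litres, time step in seconds
    total_through = 0.0            # how much fluid has passed through so far

    for step in range(1000):
        volume += (inflow - outflow) * dt   # dV/dt = 0, so the state never changes
        total_through += inflow * dt

    print(volume, total_through)   # still 50.0 litres held, yet 200 litres have flowed through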

Stable vs. unstable vs. metastable

The stability of an equilibrium state (usually discussed for a static equilibrium) has to do with its response to small perturbations away from the point of equilibrium. For a stable equilibrium, the system will move back towards the equilibrium after the perturbation is applied; for an unstable equilibrium, the system will move further and further away from equilibrium, no matter how small the initial perturbation.

The classic example is a ball sitting in the bottom of a valley (stable) or on top of a hill (unstable). With a perfectly spherical, perfectly smooth ball and hill, the equilibrium would truly be unstable and it would be impossible to balance the ball on top. There is also, of course, the saddle point, which is stable in one direction but unstable in the perpendicular direction. Anyone trying to ride a horse for the first time knows that the saddle point is, strictly, an unstable equilibrium.

In real life, balls sitting on top of hills are in metastable equilibrium: this means that after a small perturbation (a knock) they return towards their original position, but after a large enough perturbation they roll far away. Also, of course, the ever-present thermal and quantum fluctuations mean that no system can sit indefinitely in a truly unstable equilibrium.
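The ball-on-a-landscape picture can be made concrete with a one-dimensional potential; the tilted double well below is an arbitrary choice for illustration. Equilibria are where the slope vanishes, the curvature separates stable from unstable, and a local minimum that is not the global one plays the role of a metastable state:

    import numpy as np

    # A tilted double-well potential U(x) = x^4/4 - x^2/2 + 0.1*x (made up for illustration).
    # Equilibria are where U'(x) = 0; the sign of U''(x) tells stable from unstable.
    U   = lambda x: x**4 / 4 - x**2 / 2 + 0.1 * x
    dU  = np.poly1d([1, 0, -1, 0.1])          # U'(x) = x^3 - x + 0.1
    d2U = lambda x: 3 * x**2 - 1              # U''(x)

    equilibria = [r.real for r in dU.roots if abs(r.imag) < 1e-9]
    global_min = min(equilibria, key=U)

    for x in sorted(equilibria):
        if d2U(x) < 0:
            kind = "unstable (top of the hill)"
        elif np.isclose(x, global_min):
            kind = "stable (global minimum)"
        else:
            kind = "metastable (local, but not global, minimum)"
        print(f"x = {x:+.3f}  {kind}")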

Metastable equilibrium is an extremely important concept in both everyday life and science: a house is just a collection of building materials in metastable equilibrium. In the study of phase transitions, interesting things happen when a substance is pushed into a metastable state: one gets phenomena like supersaturated solutions or superheated liquids (the mug of water from the microwave at over 100°C that suddenly boils), which can result in a sudden release of energy, or the formation of structures (e.g. drops of dew on a spider's web).

Alan Turing's classic work on pattern formation during embryonic development (morphogenesis) makes use of unstable and metastable equilibrium. To put it simply, the leopard got its spots because a uniform distribution of chemicals over its body became, at some point, an unstable state; at that point tiny random fluctuations in the concentrations started to grow exponentially, the so-called Turing instability. Similar scenarios occur in cosmology, for example in the formation of monopoles and cosmic strings.
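Very roughly, the calculation behind a Turing instability examines small perturbations of the uniform state one wavelength at a time. The sketch below does this for a generic two-chemical reaction-diffusion model; the Jacobian and diffusion constants are invented, Turing-style numbers (a self-activating chemical plus a fast-diffusing inhibitor), not Turing's own:

    import numpy as np

    # Linear stability of a uniform state in a two-species reaction-diffusion system:
    # a perturbation with wavenumber k grows or decays according to the largest
    # eigenvalue of J - k^2 D. Parameter values are assumptions for illustration.
    J = np.array([[1.0, -1.0],
                  [2.0, -1.5]])     # reaction Jacobian: stable on its own (trace < 0, det > 0)
    D = np.diag([0.01, 1.0])        # the inhibiting species diffuses much faster

    for k in np.linspace(0.0, 5.0, 11):
        growth = np.linalg.eigvals(J - k**2 * D).real.max()
        fate = "grows" if growth > 0 else "decays"
        print(f"k = {k:3.1f}   max growth rate = {growth:+.3f}   perturbation {fate}")

The uniform (k = 0) state is stable, but a band of finite wavelengths grows exponentially, which is how tiny fluctuations get amplified into spots and stripes.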

Statistical equilibrium, thermodynamic equilibrium and fluctuations

Systems that are not at absolute zero (i.e. everything in the real world) will have nonzero amounts of kinetic energy or thermal energy, which means that their degrees of freedom (meaning, roughly, the ways in which they can move about) will be excited. In normal matter, this will include vibrational modes, rotational modes, hyperfine structure, phonons, in fact all sorts of excited states. In short, things move around a lot.

In order to do thermodynamics or statistical mechanics, following Boltzmann, we consider a situation in which all of this riotous motion can be described on average by a statistical distribution. We accept that we are going to be ignorant of the exact configuration of the system, but we gain a very powerful general framework for talking about its bulk properties.

What does this have to do with equilibrium? Well, equilibrium is about the most basic concept in thermodynamics and stat. mech., without which we couldn't talk about such things as temperature or entropy. In order to measure temperature, you need to get two or more bodies (a thermometer and the thing it is measuring, say) into thermodynamic equilibrium with one another, such that on average no energy is being exchanged between them.

Actually, microscopic processes that we are ignorant of are continually transferring energy back and forth, but the things which we can measure, and the statistical distribution of energy, are not changing. If we consider all the microscopic processes that could influence the distribution, then average over them, the condition that the distribution function be the same over time is an extremely powerful one and leads to the Maxwell-Boltzmann distribution law.

In thermodynamic equilibrium, to put it simply, every (quadratic) degree of freedom carries an average energy of (1/2)kT, where k is Boltzmann's constant and T is the temperature. This is known as equipartition. (Compare the virial theorem.) Quantum mechanics alters this picture somewhat, but it's important to note that thermal fluctuations would exist even in the absence of quantum effects. Thermodynamic equilibrium is a way of applying equilibrium to real objects, at the cost of admitting our ignorance of their precise microscopic details.
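Equipartition is easy to check numerically for one velocity component of a gas atom; the temperature and the (roughly argon-sized) mass below are arbitrary choices for the example:

    import numpy as np

    # Equipartition check: a velocity component drawn from the Maxwell-Boltzmann
    # distribution should carry an average kinetic energy of (1/2) k T.
    k_B  = 1.380649e-23     # Boltzmann's constant, J/K
    T    = 300.0            # temperature in kelvin (arbitrary choice)
    mass = 6.6e-26          # roughly the mass of an argon atom, kg

    rng = np.random.default_rng(0)
    v   = rng.normal(0.0, np.sqrt(k_B * T / mass), size=1_000_000)   # one velocity component

    print(np.mean(0.5 * mass * v**2))   # average kinetic energy per degree of freedom
    print(0.5 * k_B * T)                # (1/2) k T ~ 2.07e-21 J: the two should agree closely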

Chemical equilibrium

Chemical equilibrium is a similar concept, but rather than being concerned with the distribution of energy, it applies to the numbers of the different species present in a system. For example, a system containing atoms, electrons and ions, like a plasma or an aqueous solution, is in chemical equilibrium when, on average, the microscopic processes leave the number of each species unchanged over time.
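A toy sketch of the idea, with invented rate constants: in a reversible reaction A <=> B, individual molecules never stop converting back and forth, but the concentrations settle to values at which the forward and backward rates balance.

    # Chemical equilibrium for a reversible reaction A <=> B.
    # Rate constants and starting concentrations are invented for illustration.
    kf, kb = 2.0, 1.0            # forward and backward rate constants, 1/s
    A, B   = 1.0, 0.0            # concentrations, mol/L
    dt     = 0.001               # time step, s

    for step in range(20_000):
        forward  = kf * A * dt   # amount of A turning into B in this step
        backward = kb * B * dt   # amount of B turning back into A
        A += backward - forward
        B += forward - backward

    print(A, B, kf * A, kb * B)  # A ~ 1/3, B ~ 2/3, and the two rates are now equal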

The story of a system approaching equilibrium, and its ultimate fate

However, even these statistical types of equilibrium are not precisely realistic, because it takes an infinite time for any system to reach equilibrium. A system will typically follow an exponential decay in its approach to equilibrium, getting arbitrarily close to it but never reaching it. It's fairly easy to see why: if the system is changing state, it's not in equilibrium; but in order to approach equilibrium, it has to be changing state, and the processes that make it do so get weaker and weaker the closer it gets. So it doesn't get there in any finite time. But the concept is still useful, because if we wait long enough the system will get as close as we like to equilibrium, so eventually we can measure its temperature (say) to any given accuracy.
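The exponential approach is familiar from a cooling cup of tea; the numbers below are made up, and the point is that the gap to equilibrium shrinks by the same factor over each interval but never quite reaches zero:

    import numpy as np

    # Exponential approach to equilibrium: a drink cooling towards room temperature,
    # T(t) = T_room + (T_0 - T_room) * exp(-t / tau). All values are invented.
    T_room, T_0, tau = 20.0, 90.0, 5.0        # degrees C, degrees C, minutes

    for t in [0, 5, 10, 20, 40, 80]:
        T = T_room + (T_0 - T_room) * np.exp(-t / tau)
        print(f"after {t:2d} min: {T:9.5f} C   ({T - T_room:.2e} C above equilibrium)")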

The notion of equilibrium also has very powerful consequences in conjunction with the Second Law of Thermodynamics: it allows us to predict what will happen in the far future of any system. The Second Law states that the entropy of a closed system can only increase (or stay the same) over time. Putting this together with the notion of a thermal equilibrium state, we deduce that the thermal equilibrium state of a closed system is the state with maximum entropy.

Now, given any initial conditions, a closed system will, given enough time, always end up in a state of thermal equilibrium (or heat death). (See also The Heat Death of the Universe. Of course, whether the Universe is a closed system is debatable.) This is an interesting, if depressing, example of scientific prediction: without knowing any details of the initial state of a system, we can predict (up to our ignorance of its microscopic behaviour) its ultimate fate.

Away from thermal equilibrium: chance would be a fine thing

We should always remember that the Second Law, like all of statistical mechanics, is only a statistical law about what is likely to happen. In fact, the law relies on the mathematical fact that states with greater entropy are overwhelmingly more likely than states with less entropy, for the simple reason that they correspond to many, many, many more microscopic configurations. Without going into any details, you can deduce that there is a non-zero probability that, just by chance, the entropy of a system will decrease. And the smaller the decrease in entropy, the larger the chance of it happening.
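A standard toy model makes the counting explicit; the particle number here is an arbitrary choice. Put N gas molecules in a box, each equally likely to be in the left or right half: the even split has by far the most arrangements (highest entropy), and the probability of a given imbalance, a lower-entropy situation, drops off ferociously as the imbalance grows.

    from math import comb

    # Toy counting argument: N molecules, each independently in the left or right
    # half of a box. N is an arbitrary choice for illustration.
    N = 100
    total = 2**N                                 # total number of microscopic arrangements

    for extra_on_left in [0, 5, 10, 20, 50]:
        k = N // 2 + extra_on_left               # molecules found in the left half
        p = comb(N, k) / total                   # probability of exactly this split
        print(f"{k:3d} of {N} on the left: probability {p:.3e}")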

So, if we take our closed system in equilibrium with the maximum amount of entropy, it will continually be fluctuating into states with very slightly less entropy (and back again). It is possible, although really quite incredibly unlikely, that there will be a large negative fluctuation in entropy and the system will be, as it were, resurrected from its heat death.

In fact, there is a theorem proved by Henri Poincaré that for any closed system satisfying deterministic physical laws (with other assumptions I won't go into), the system will, eventually, return arbitrarily close to its original state. This is the "Poincaré recurrence". So if we start out in a state with low entropy, the system must at some point in the future return to it (or very nearly to it), apparently violating the Second Law!
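The theorem is about real physical systems, but the flavour of it can be seen in any deterministic, invertible map on a finite set of states; the cat-map-style rule and grid size below are arbitrary choices for the toy:

    # A toy 'recurrence': the map (x, y) -> (2x + y, x + y) mod N is deterministic
    # and invertible on an N x N grid, so every starting point must eventually
    # come back to exactly where it began. N and the start point are arbitrary.
    N = 101
    start = (1, 0)
    x, y = start

    steps = 0
    while True:
        x, y = (2 * x + y) % N, (x + y) % N
        steps += 1
        if (x, y) == start:
            break

    print(f"returned to the starting point after {steps} steps")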

However, the time this takes is much, much, much longer than the time it takes for a system to get to equilibrium, or at least so close to it as makes no difference. The history of the system while it waits for its Poincaré recurrence will be extremely boring, consisting of very short periods fluctuating away from equilibrium and æons at, or very close to, maximum entropy. So, although technically the Second Law isn't always true, and thermal equilibrium isn't always the final state of a closed system, the concepts remain valid for all practical and scientific purposes, and you can stop waiting for your slice of buttered toast to reverse the trajectory that has just sent it into stable equilibrium on the carpet.