The second law of thermodynamics states that the entropy of an isolated system does not decrease. That's simple, except for the 'What is entropy?' part, which trips up a lot of people. Entropy is the logarithm of the multiplicity of a system. What is the multiplicity?

Consider the set of exact states a system can take on, and some much smaller set of metrics one can apply to measure the system. For example, one metric could be the height of the center of mass. Another could be the total kinetic energy. Another could be how many particles the system has. The multiplicity of a given set of values from these metrics is the number of exact states that yield those same values. So, say we pick these metric values: a center-of-mass height of 1 meter, a total kinetic energy of 100 joules, and a particle count of 1500. If there are 25 trillion exact states the system could be in that yield those values, then that set of measurements has multiplicity 25 trillion.
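To make the counting concrete, here is a toy system (entirely hypothetical, chosen for illustration): four particles, each holding 0 to 3 units of energy, with total energy as the single metric. The multiplicity of a metric value is just the number of exact states that produce it.

```python
import itertools
import math

# Toy system: 4 particles, each with 0-3 units of energy.
# An "exact state" is the tuple of individual energies.
states = list(itertools.product(range(4), repeat=4))

# The metric: total energy. Count the exact states that yield the value 6.
total_energy = 6
multiplicity = sum(1 for s in states if sum(s) == total_energy)

# Entropy is the logarithm of the multiplicity.
entropy = math.log(multiplicity)
print(multiplicity)  # 44 of the 256 exact states share this metric value
```

The same counting works for any metric you can compute from an exact state; only the size of the state space changes.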

The most useful metrics are things that are conserved and either extensive or intensive, because then you can do a special trick. It starts by splitting the system into parts, one much larger than the other. You have a fixed total number of states - 25 trillion - with those fixed total values. You can calculate these same values for the little part and the large part separately. If the center of mass of one happens to be higher, the center of mass of the other is lower. Similarly, if the little system contains 200 particles, then the large one contains 1300. You can learn a lot about the system by looking at the multiplicities of the various combinations - for example, you may find that the multiplicity of states in which the little system contains a certain fraction of the particles is much higher than for any other fraction. The same goes for the kinetic energy and the center of mass. In this way, simply by looking at the states the system can take on and doing some counting, you determine what sorts of fluctuations there can be.
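The trick above can be sketched with a standard toy model (an Einstein solid, my choice of illustration, not something the text specifies): split 1500 oscillators into a small part of 200 and a large part of 1300, sharing 100 units of energy, and count the microstates for each possible split of the energy.

```python
import math

def omega(N, q):
    # Multiplicity of an Einstein solid: ways to put q energy
    # units among N oscillators, C(q + N - 1, q).
    return math.comb(q + N - 1, q)

N_small, N_large, q_total = 200, 1300, 100

# Multiplicity of each way of dividing the conserved total energy.
mult = {q1: omega(N_small, q1) * omega(N_large, q_total - q1)
        for q1 in range(q_total + 1)}

# The dominant split: the one with by far the most microstates.
best = max(mult, key=mult.get)
print(best)  # 13 - close to the small system's share, 200/1500 of 100
```

The peak sits near the "fair" division of energy in proportion to size, and the multiplicity falls off steeply on either side - that falloff is exactly what limits fluctuations.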

Another case is, say, a large binary number. One metric is the number of 1s in the number. You can calculate an entropy for this case too.
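A sketch of that case: for an N-bit number, take "number of 1 bits" as the metric. The multiplicity of the value k is the binomial coefficient C(N, k), and the entropy is its logarithm.

```python
import math

N = 100  # bits in the number

def entropy(k):
    # Multiplicity of "k ones among N bits" is C(N, k);
    # entropy is the log of that count (base 2, since these are bits).
    return math.log2(math.comb(N, k))

print(entropy(0))   # only one all-zeros state, so entropy 0
print(entropy(50))  # the half-ones value has the most states
```

As you would expect, entropy peaks when the bits are split evenly between 0s and 1s, because that value of the metric is realized by the most bit patterns.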

One interesting thing to note is that the more metrics you apply, the lower the multiplicity (and thus the entropy) is - you are categorizing the system more finely. If your metrics give you enough information to uniquely identify the state, the multiplicity is 1, so the entropy is 0. You know it all. In real life, for real systems, this is impossible - imprecision leaks into whatever you do.
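Continuing the bit-string sketch, adding a second metric refines the count. Among N-bit strings with k ones, additionally demanding that the leading bit be 1 (an arbitrary second metric I picked for illustration) leaves only C(N-1, k-1) states:

```python
import math

N, k = 100, 40

# One metric: k ones among N bits.
coarse = math.comb(N, k)

# Two metrics: k ones AND a leading 1 bit. The remaining
# k - 1 ones are distributed among the other N - 1 bits.
fine = math.comb(N - 1, k - 1)

print(fine < coarse)  # True: the finer categorization has lower multiplicity
```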

So, all that said, the heart of entropy is this: there are a lot more states of high entropy than low entropy.

Already, one can see where the second law might be coming from - if there are a lot more high entropy states out there than low entropy ones, it will take precise aim to avoid them.

Once upon a time, a node was created to ask questions on the subject of entropy and the second law of thermodynamics, and these were my answers:

Question 1: It makes no sense to say that the universe began with perfect order, right?
Well, if our theories of the big bang are correct, all matter started out completely uniform over space, in an ultra-dense form. That uniformity was disturbed by quantum fluctuations, which were then blown up to cosmic proportions by inflation - so, yes, the universe started in 'perfect order'.

Question 2: If the universe's entropy is always increasing, but can't increase without bound, then it has to stop or even decrease some time, right?
Just because something always increases does not mean that it will reach its limit. For example, -1/X approaches 0 for large values of X, but it never reaches 0 for any finite value of X. It is an always-increasing function, but it nonetheless never reaches 0.
Correspondingly, as the free energy of a system approaches zero, its rate of losing that free energy slows down - and the corresponding increase in entropy slows down as well. So the fact that entropy is bounded does not prove that the second law must eventually fail.
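The -1/X example from the answer, spelled out numerically:

```python
# -1/x increases with x but never reaches 0.
xs = [1, 10, 100, 1000]
vals = [-1 / x for x in xs]
print(vals)  # [-1.0, -0.1, -0.01, -0.001]

# Strictly increasing along the sequence, yet every value stays below 0.
assert all(a < b for a, b in zip(vals, vals[1:]))
assert all(v < 0 for v in vals)
```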

Question 3: Doesn't the idea of perfect order within the universe contradict the existence of God (by various means)?
Questions about God must refer to theology. Theology may refer to science as it sees fit. However, I see no reason that perfect order would imply God, nor the opposite.

Question 4: Should we really be spending effort on these questions?
Thermodynamics is very relevant. Cosmology may become relevant, and, as pure science, it tends to pay off in spades... but ponderings on the fate of the universe are at this point philosophical speculations rather than practical questions. This hardly makes them irrelevant.

Question 5: What if we're not quite right about the laws of the universe? Wouldn't we have a way out then?
Within any possible set of laws of physics in which one can define a quantity with properties like those of energy (primarily, being conserved), one can derive the three laws of thermodynamics. Thus, entropy cannot decrease without a very dramatic change in the laws of the universe. Astronomers have yet to see anywhere with dramatically different laws than those here... unless they changed in such a way as to very carefully disguise themselves as the same. Possible. Unlikely.