A short story by Thomas Pynchon, anthologized in the collection Slow Learner. The story recounts the goings-on in several rooms of a small apartment building over the course of an evening, using human interaction as a metaphor for the Heat Death of the Universe.

... is minus the derivative of free energy, taken with respect to temperature under constant volume.

... rearranges my socks in the night.

... is the monster that led you to kiss me. is the wonder that made me want you, disregarding the glint in your eye that told me of far off places you’d rather be.

... is triscuits.

... will suck the life out of the universe someday.

... lets my mind drift while my feet are glued and heavy.

... dictates Everything in the world’s slow decline toward disorder.

... makes me think of your lips while i’m sleeping. puts their memory on my breath as i wake.

... guides the doodles i make and the poems i write instead of learning this shit in physics class.
In Mage the Ascension, Entropy is the Sphere that allows one to change Destiny. As the universe is fated towards disorganization, masters of Entropy understand that it is easier to ride the tide of Fate than to go against it, and with their skill, are able to quantify probability energy (more commonly understood as Fate, Destiny, or Fortune) for their own purposes.

Apprentices in this Sphere are able to judge probabilities with an uncanny accuracy and, later, to alter the course of simple probabilities, while masters are able to affect complex life processes with curses or blessings, as well as alter someone's worldview by pointing their thoughts along a certain path. Masters of this Sphere tend either towards a chaotic, random way of life, or towards a clean, orderly path in manner and thought.

The Euthanatos have adopted this Sphere as their own.

There is a tremendous amount of loose talk, even in thermodynamics textbooks, about bulls in china shops and scrambled eggs in the context of entropy. Both these analogies are misleading because they involve work being done: the bull does work pushing the china through distances, and Delia does work with her fork moving the eggs around the pan.

To simplify slightly: imagine a metal spring is stretched by a small amount, dx, which it resists with a force F. The thermodynamic identity says dU = d Work = F.dx (i.e. the small change in the spring's total energy equals the small amount of work done on the spring by stretching it, which equals the force times the amount of stretch).

Returning to the spring: this time add energy to the spring in the form of heat, that is, touch the spring with something at a higher temperature than T, the temperature of the spring. This adds a small amount of heat energy, d Heat, to the spring. Thus the thermodynamic identity is: dU = d Heat = T.dS (where dS is the small increase in the entropy of the spring).

The analogy between the corresponding quantities in the two instances of the identity appears to be utterly fundamental to nature. The spring is pushed through a distance, stretched, by exposing it to a greater force, giving it work energy. Correspondingly the spring is pushed through an entropy by exposing it to a greater temperature, giving it heat energy.
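To put rough numbers on the analogy, here is a minimal sketch in Python of the two forms of the identity side by side; the force, stretch, temperature and heat values are invented purely for illustration.

# Two forms of the thermodynamic identity, with made-up illustrative numbers.

F = 2.0       # force resisting the stretch, in newtons
dx = 0.001    # small stretch, in metres
dU_work = F * dx           # work form: dU = F.dx
print(f"energy added as work: {dU_work:.4f} J")

T = 300.0     # temperature of the spring, in kelvin
dQ = 0.002    # small amount of heat added, in joules
dS = dQ / T                # heat form rearranged: dS = d Heat / T
print(f"entropy added with the heat: {dS:.2e} J/K")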

Talk of bulls and china hardly calls attention to temperature difference. Scrambling eggs calls attention to the mixing action of the fork, not temperature.

The appropriate picture to give to a citizen enquiring after entropy is of a fried egg. Here it is clear that it is the hot pan which is making the white go white. The error would quickly become apparent if a film of an egg being fried were run backward.

(probability theory, information theory, based on statistical mechanics:)

The entropy of a random variable X is

H(X) = sup_{X1,...,Xn} ∑_{i=1}^{n} P(X ∈ Xi) log2( 1 / P(X ∈ Xi) )

where the supremum is taken over all partitions of the range of X.

 

As partitions become finer, the finite sum above cannot decrease.  So, when the range of X is finite and X∈{x1,...,xm}, we simply have the "well known" formula for entropy

H(X) = ∑_{i=1}^{m} P(X = xi) log2( 1 / P(X = xi) ).
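As a quick sketch, the finite formula in Python (the function name and the example probabilities are mine, just for illustration):

from math import log2

def entropy(probabilities):
    """Shannon entropy in bits: sum of p * log2(1/p) over outcomes with p > 0."""
    return sum(p * log2(1 / p) for p in probabilities if p > 0)

# A fair coin carries 1 bit per toss; a biased coin carries less.
print(entropy([0.5, 0.5]))    # 1.0
print(entropy([0.9, 0.1]))    # about 0.469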

 

We can also go in the other direction, and start with the above finite version. The entropy of any variable X with any range is the most you can get by taking the above and applying it to all projections of X onto a finite set. If X takes on finitely many values, there is no difference -- the "best" partition turns out to be the finest one, i.e. the one isolating each value of X into its own cell. But the first formula lets you compute the entropy of other X's, ones that have an infinite range.

If X is a continuous random variable with probability density function (PDF) p(x), then you get the expected formula for entropy

H(X) = ∫_{-∞}^{∞} p(x) log2( 1 / p(x) ) dx
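As a sanity check on that integral, here is a rough numerical evaluation for a standard normal density (my own sketch); the closed-form answer, 0.5 * log2(2πe) ≈ 2.05 bits, is printed alongside for comparison.

from math import exp, log2, pi, sqrt, e

# Riemann-sum approximation of the entropy integral for a standard normal density.
def p(x):
    return exp(-x * x / 2) / sqrt(2 * pi)

dx = 0.001
H = 0.0
x = -10.0
while x < 10.0:
    H += p(x) * log2(1 / p(x)) * dx
    x += dx

print(H, 0.5 * log2(2 * pi * e))   # both come out around 2.05 bits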

 

The second law of thermodynamics states that the Entropy of an isolated system does not decrease. That's simple, except for that 'What is Entropy?' part, which trips up a lot of people. Entropy is the logarithm of the multiplicity of a system. What is the multiplicity?

Consider the set of exact states a system can take on, and some much smaller set of metrics one can apply to measure the system. For example, one metric could be the height of the center of mass. Another could be the total kinetic energy. Another would be how many particles the system has. The multiplicity of a given set of values from these metrics is the number of exact states that yield the same values. So, for example, let's pick the metric values: a center of mass height of 1 meter, a total kinetic energy of 100 joules, and 1500 particles in the system. If there are 25 trillion states the system could be in that yield those values, then that set of measurements has multiplicity 25 trillion.

The most useful metrics are things that are conserved and either extensive or intensive, because then you can do a special trick. It starts by splitting the system into parts, one much larger than the other. You have a fixed total number of states - 25 trillion - with those fixed total values. You can calculate these same values for the little part and the large part separately. If the center of mass of one happens to be higher, the center of mass of the other is lower. Similarly, if the little system contains 200 particles, then the large one contains 1300. You can learn a lot about the system by looking at the multiplicities of the various combinations - for example, you may find that the multiplicity of states in which the little system contains a certain fraction of the particles is much higher than for any other fraction of the particles. The same goes for the kinetic energy and the center of mass. In this way, simply by looking at the states the system can take on and doing some counting, you determine what sorts of fluctuations there can be.

Another case is, say, a large binary number. One metric is the number of 1s in the number. You can calculate an entropy for this case too.
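For instance, taking a 100-bit number with "number of 1s" as the only metric, a short sketch of the counting (the base-2 logarithm here is just a convenient choice of units, not something fixed by the writeup above):

from math import comb, log2

# Multiplicity of a 100-bit number under the single metric "number of 1s",
# and the corresponding entropy, taken here as the base-2 logarithm.
n = 100
for ones in (0, 10, 50):
    multiplicity = comb(n, ones)   # how many n-bit strings have exactly this many 1s
    print(ones, multiplicity, round(log2(multiplicity), 1))

The half-and-half case has by far the largest multiplicity, which is the sense in which "there are a lot more states of high entropy than low entropy."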

One interesting thing to note is that the more metrics you apply, the lower the multiplicity (and thus the entropy) is - you are categorizing the system more finely. If your metrics give you enough information to uniquely identify the state, the multiplicity is 1, so the entropy is 0. You know it all. In real life, for real systems, this is impossible - imprecision leaks in to whatever you do.

So, all that said, the heart of entropy is this: there are a lot more states of high entropy than low entropy.

Already, one can see where the second law might be coming from - if there are a lot more high entropy states out there than low entropy ones, it will take precise aim to avoid them.


Once upon a time, a node was created to ask questions on the subject of entropy and the second law of thermodynamics, and these were my answers:

Question 1: It makes no sense to say that the universe began with perfect order, right?
Well, if our theories of the big bang are correct, all matter started out completely uniform over space, in an ultra-dense form. It was disturbed by quantum fluctuations which were then blown up to cosmic proportions by inflation - so, yes, the universe started in 'perfect order'.

Question 2: If the universe's entropy is always increasing, but can't increase without bound, then it has to stop or even decrease some time, right?
Just because something always increases does not mean that it will reach its limit. For example, -1/X will approach 0 for large values of X, but it will never reach 0 for any finite value of X. It is an always increasing function, but nonetheless does not reach 0.
Correspondingly, as the free energy of a system approaches zero, its rate of losing that free energy slows down - and the corresponding increase in entropy slows down as well. So the assumption that entropy always increases does not lead to a contradiction.

Question 3: Doesn't the idea of perfect order within the universe contradict the existence of God (by various means)?
Questions about God must refer to theology. Theology may refer to science as it sees fit. However, I see no reason that perfect order would imply God, nor the opposite.

Question 4: Should we really be spending effort on these questions?
Thermodynamics is very relevant. Cosmology may become relevant, and, as pure science, it tends to pay off in spades... but ponderings on the fate of the universe are at this point philosophical speculations rather than practical questions. This hardly makes them irrelevant.

Question 5: What if we're not quite right about the laws of the universe? Wouldn't we have a way out then?
In any possible set of laws of physics in which one can define a quantity with properties like those of energy (primarily, being conserved), one can derive the three laws of thermodynamics. Thus, entropy cannot decrease without a very dramatic change in the laws of the universe. Astronomers have yet to see anywhere with dramatically different laws than those here... unless they differ in such a way as to very carefully disguise themselves as the same. Possible. Unlikely.

Entropy, a measure of the distribution of energy in a system, is quoted in units of J mol⁻¹ K⁻¹, that is, joules per mole per kelvin. The entropy of a system is given by the equation:

S = k ln w

where S is entropy, k is the Boltzmann constant and w is the number of ways of arranging the energy in the system.

As far as chemistry is concerned, the entropy of elements and compounds (known as the standard molar entropy) is given the symbol S^θ and refers to the entropy of that substance under standard conditions.

The physical state of a substance has a strong effect on its standard entropy - gases have the most freedom to distribute themselves randomly, so they tend to have the highest entropies, while at the opposite end of the scale, solids tend to have the lowest entropies. Among the elements, for example, gaseous oxygen has a standard molar entropy of 102.5 J mol⁻¹ K⁻¹, liquid mercury one of 76.0, and solid iron 27.3.

When there is a chemical change, there will also be an entropy change. The total entropy change, ΔS^θ(total), will be the sum of the entropy change of the system and the entropy change of the surroundings. Since the universe tends towards increasing entropy, only those reactions with a positive total entropy change will occur spontaneously, and those with strongly negative values will be hard to achieve even artificially.

The entropy change of the system, ΔS^θ(system), is simply the entropy of the products minus the entropy of the reactants. The entropy change of the surroundings, ΔS^θ(surroundings), is given by minus the enthalpy change of the reaction, ΔH, quoted in joules (NOT kilojoules), divided by the absolute temperature of the surroundings, which under standard conditions is 298 K.

Given that values for standard entropies and enthalpy changes have been systematically measured and recorded, it is possible to calculate the feasibility of a reaction from published data. If the total entropy change is more than +200 J mol⁻¹ K⁻¹, the reaction will probably go to completion. Between +200 and -200, it will be reversible. If the total entropy change is less than -200, the reaction probably won't go at all.
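A sketch of that bookkeeping in Python, with invented values standing in for data-book entries; note the minus sign in the surroundings term, which is what makes exothermic reactions entropy-friendly.

# Feasibility estimate from made-up illustrative values, not data-book figures.
dS_system = -120.0      # J mol^-1 K^-1, entropy of products minus reactants
dH = -150_000.0         # enthalpy change of the reaction, in joules (NOT kilojoules)
T = 298.0               # standard temperature, in kelvin

dS_surroundings = -dH / T            # positive for an exothermic reaction
dS_total = dS_system + dS_surroundings
print(round(dS_surroundings, 1), round(dS_total, 1))

if dS_total > 200:
    print("probably goes to completion")
elif dS_total >= -200:
    print("probably reversible")
else:
    print("probably won't go at all")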

The entropy change of the system tends to be positive when the products have a more entropic physical state than the reactants, e.g. solid to liquid or liquid to gas. An increase in the number of moles also increases the entropy. Conversely, the opposite processes have the opposite effect.

The entropy change of the surroundings is positive for exothermic reactions, because heat is being given out, so the surroundings have more energy and thus more ways of arranging their energy. Endothermic reactions, on the other hand, lead to a negative entropy change of surroundings, and the majority of spontaneous reactions are exothermic.


Reference: Revised Nuffield Advanced Science Book of Data, 1984

The Entropy of a Language

Entropy is basically a measure of randomness. This is all well and good, but it also makes entropy a measure of several other things which really all turn out to be the same thing. Rather than repeat the obscure and uninformative definition I was treated to in no fewer than 3 courses this year, here is an exploration of the way I understand the concept of Shannon entropy.

Let's reduce our alphabet to just 28 characters: letters a to z, the space and the full stop. With this, we can write words and arrange them into sentences. Take a string of these characters of length n. How many possible strings exist? 28^n (because for each character we have 28 choices). We then ask ourselves how many of these strings are legal in a given language. The higher the proportion of legal strings, the higher the entropy.

Note that here we define a language as a set of rules which define whether a string is legal or not. This includes natural languages such as English, binary code, random languages and the rule that any string is permitted so long as it contains alternating vowels and consonants. You should also note that the space and full stop characters are being treated like any other, so among the possible strings are "..  ..   " and "   d     ".

Entropy is a measure of surprise

Suppose we choose a language in which only strings consisting entirely of the letter l are allowed. You are reading a book written in this language. You have just finished the first page (which was covered in ls). Quick! Before turning to the next page, what is the following character going to be?

Were you surprised at discovering the l on the following page? No? This is because such a language has zero entropy. Now consider a language where all strings are allowed. Can you predict the following character in this string? uqoaxul jtb.yjhdcn. Because it could be any of the 28, you have only a slim chance of being correct. This is the language whose entropy is maximal. The concept carries over to English: Like most natural languages, English has a relatively low entropy, which is why you are easily able to guess the last 6 characters of this sen... Because many natural languages have a fixed set of legal words, it is more interesting to look at word entropy. But for purposes of simplicity, we'll stick to character entropy in this article.

Entropy is a measure of information

Now, let's see how much information is carried by the two languages discussed in the previous paragraph. Measuring information is rather tricky. Clearly, in the language composed only of ls, each character conveys no information, because if you know what the language is, you already know what character comes next. We'll give it an entropy of 0: no information per character. Remember twenty questions? The fact that we are able to pin down a concept with yes-no questions suggests that one yes-no answer can be taken as one piece of information. As such, a random binary code has an entropy of 1 bit per character. This will be our measure of entropy.

You might think that, in a 28-character alphabet, each character conveys 28 pieces of information; namely the presence of one and the absence of the others. This is not quite correct: because the presence of one character automatically forces the others' absence, we don't get that much information. It is as if I told you that a cloud was white and then went on to say it wasn't black: no additional information. Binary code again shows us how to measure the entropy of our random language: because it takes log2(n) bits to code an n-character alphabet, the random language has an entropy per character of log2(28) ≈ 4.8. That means that a 28-character alphabet can encode a maximum of 4.8 bits (or pieces) of information with each character.
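A rough way to see these numbers in Python (the sample sentence and the frequency-counting estimate are mine; counting single characters overestimates the true entropy of English, because it ignores the dependence between neighbouring characters):

from collections import Counter
from math import log2

# Zeroth-order estimate: per-character entropy from single-character frequencies.
def per_character_entropy(text):
    counts = Counter(text)
    total = len(text)
    return sum((c / total) * log2(total / c) for c in counts.values())

print(log2(28))   # about 4.81 bits: the 28-character random language
print(per_character_entropy("the quick brown fox jumps over the lazy dog."))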

What is entropy for?

In natural languages, entropy is quite hard to measure. Most of them have an entropy somewhere around 1.5 bits per character. It might seem that this is a bit of a waste (entropy at higher levels, such as word and sentence, is also very low). But this high level of redundancy is what allows us to read only the shape of words, and what allows us to understand each other even in noisy environments. Computers that recognize human speech try to use this low entropy to make sure they didn't misinterpret a sound, but this doesn't always work: the redundancy only exists because a very large number of rules comes with the language, and it is difficult to tell a computer about all of them.

In fact, the only place where low entropy is a problem is in computer science. When saving or sending information as plain text, it is not random: it follows the rules of English, Java or XML and so contains much redundant information. This is what compression programs exploit: they take a file with low entropy and convert it into a language (set of rules) which has a higher entropy, thus saving space.
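A quick illustration using Python's standard zlib module (the sample strings are arbitrary): the redundant, low-entropy text shrinks a great deal, while random bytes barely shrink at all.

import os
import zlib

# Redundant (low-entropy) text compresses well; random-looking bytes do not.
redundant = b"the cat sat on the mat. " * 100
random_bytes = os.urandom(len(redundant))

print(len(redundant), len(zlib.compress(redundant)))        # large reduction
print(len(random_bytes), len(zlib.compress(random_bytes)))  # little or no reduction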

Entropy also comes in handy when defining the concept of unicity distance in cryptography. This basically tells us that if we know what cipher is used, and we try each of the keys in turn to decode the secret message, and we know that the decoded message should be in English, then because of English's low entropy it is highly unlikely that there will be more than one decoding which actually follows the rules of English. This problem is usually solved by having a very high number of keys, thus increasing the effort needed to try each of them in turn and increasing the probability of having more than one meaningful output.
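The usual back-of-the-envelope estimate (not spelled out above, so treat it as an outside illustration) divides the entropy of the key by the per-character redundancy of the language; for a simple substitution cipher on English it comes out at around 28 characters of ciphertext.

from math import factorial, log2

# Unicity distance estimate for a simple substitution cipher on English,
# assuming English carries roughly 1.5 bits per character out of a possible log2(26).
key_entropy = log2(factorial(26))        # about 88 bits, since there are 26! keys
redundancy = log2(26) - 1.5              # about 3.2 bits of redundancy per character
print(round(key_entropy / redundancy, 1))  # roughly 28 characters needed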

Does this entropy have anything to do with the entropy I learnt in physics?

Yes and no. Language entropy is obviously not a physical concept and has little to do with thermodynamics. Physical entropy, however, can also be seen as a measure of randomness or information. The classical example I learnt was that of a basin full of red and blue balls: the second law of thermodynamics tells us that the basin will eventually become a random mix of red and blue balls. So we can liken this random state to the random language, where each particle contains maximal information (its position). And we can liken the state where the red balls are on one side and the blue on the other to the l language, where each ball tells us nothing, because we already know where it is. As noded elsewhere, physical entropy doesn't work at ball-sized level but at particle level.

This concludes our little foray into the world of entropy. Just remember that, as in all things, there are formal definitions and layman's explanations. When you do learn about language Entropy in a CS course, you can only use this as a conceptual aid: Entropy is a measure of randomness, information, surprise and number of rules.

This node brought to you by 3 CS courses which left me thoroughly confused and one kind professor who had me help him write next year's course thanks to whom I finally understood entropy by having to write an introduction to it myself. (That introduction was specifically for cryptography and thus bears little resemblance to this article.)

You play a game of pool. You rack the balls, shoot, scratch, shoot again, etc., and we film the whole thing just with closeups of each ball. And now I show you the whole game, one ball at a time (you can't see the cue), only in reverse. Did you even notice? Our physics is all time symmetric. It doesn't matter which way the balls go or which way time flows; it's all legal. While it would be strange if a number of pool balls all reconfigured into a perfect triangle, there's nothing to suggest that it's impossible.

Entropy is commonly understood as a measure of the disorder of a system. Now according to particle theory everything, even you and me, is made out of tiny particles which are constantly in motion (unless it gets really, really cold in here), have attractive forces between them, etc. So here's the deal: We consider temperature to be a measure of the average kinetic energy of a system of particles. Note: Kinetic! So these particles are moving, and they may be moving fast or slow but they're essentially moving randomly. Now consider one more thing, Boltzmann's equation for entropy, S = k ln W, which expresses a proportionality between entropy (S) and the natural logarithm of the number of possible state configurations (W). So, by the third law of thermodynamics, all systems above 0 kelvin have entropy and are thus always shifting randomly among different configurations. But since all this is just random, we could conceive that something "random" could just as soon happen. Suppose all the particles in your eternally vibrating body pulled downwards simultaneously. Well, you'd probably cough before you started to melt, but these particles are acting erratically, so who's to say it wouldn't happen? And when you consider that on an infinite timeline the probability of everything goes to one, if we wait long enough we should get to see our whole universe melt or jump or fragment into a million little pieces (which would be phenomenal considering the number of pieces the universe really has).

Now you're probably thinking something like, "but there are rules against this sort of behavior young man! How dare you discredit that great works of Newton or Faraday! I will not hear another word of this nonsense! Recant!"

And so verily I say unto thee: "Probability!

"But Sir! Newton hardly constructed 'Laws' inasmuch as he developed rules of thumb for big things that aren't going too fast. Do you really believe that the great and mighty universe would bow to the meager suggestions of lesser beings? The world does what it wants to and out of respect for humanity, and a general slothful and apathetic nature, it occasionally yields to our suggestion!

"My friends! Systems of particles are far too numerous to colonize. The atomic census is and always has been a nightmare! Members of the molecular parliament are disassociating left and right out of sheer embarrassment. Electrons don't unionize, they ionize! Simply put, the organization is lacking. Unity at the atomic level is a picopipedream! It's the sort of thing that could only occur in the nether regions of the infinite timeline."

So keep your petty solid mechanics or vibrations. But fear the day when atoms unbind and bound together. Do not be so naive as to believe that anything is permanent or rigid. Everything is in a constant state of indecision, of entropy.

En"tro*py (?), n. [Gr. a turning in; in + a turn, fr. to turn.] Thermodynamics

A certain property of a body, expressed as a measurable quantity, such that when there is no communication of heat the quantity remains constant, but when heat enters or leaves the body the quantity increases or diminishes. If a small amount, h, of heat enters the body when its temperature is t in the thermodynamic scale, the entropy of the body is increased by h/t. The entropy is regarded as measured from some standard temperature and pressure. Sometimes called the thermodynamic function.

The entropy of the universe tends towards a maximum. Clausius.

 

© Webster 1913.
