A (perhaps) surprising fact is that the concepts of entropy as used in thermodynamics and information theory are connected. If you want to store information in a system, it must be in an orderly state (with low entropy), or the patterns you are imprinting in it will be lost in thermal noise.

If you want to lower the entropy of a system (delete information from it), you must dissipate heat to the outside, or the second law of thermodynamics would be violated. More precisely, for each bit you delete, at least kT ln 2 joules of heat must be dissipated, where k is Boltzmann's constant and T is the temperature.
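As a quick numerical check, the bound kT ln 2 can be evaluated at room temperature; the choice of 300 K is an illustrative assumption, not something fixed by the argument:

```python
import math

# Landauer bound: erasing one bit dissipates at least k*T*ln(2) joules.
k = 1.380649e-23   # Boltzmann's constant in J/K (exact value in the 2019 SI)
T = 300.0          # assumed room temperature in kelvin (illustrative)

landauer_bound = k * T * math.log(2)
print(f"Minimum heat per erased bit at {T} K: {landauer_bound:.3e} J")
```

This comes out near 3 × 10⁻²¹ joules per bit, far below the energy real hardware dissipates per operation.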

Conventional computers delete at least one bit of information for each logic operation they carry out. That does not matter much at present, because they give off far more heat than the thermodynamic limit anyway. However, as computers become more efficient (perhaps using nanotechnology), it could become a serious problem.
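The sense in which an ordinary logic gate deletes information can be made concrete. An AND gate, for instance, maps four distinct input pairs onto two output values, so distinct inputs collide and the input cannot be recovered from the output. A minimal sketch:

```python
from itertools import product

# Group the 4 possible input pairs of an AND gate by their output bit.
# The mapping is many-to-one, so the gate discards information.
preimages = {}
for a, b in product((0, 1), repeat=2):
    preimages.setdefault(a & b, []).append((a, b))

for out, ins in sorted(preimages.items()):
    print(f"output {out} <- inputs {ins}")
# Output 0 has three preimages, so observing a 0 does not determine the input.
```

Because the two input bits are reduced to one output bit, roughly one bit is erased per gate evaluation, which is what ties ordinary logic to the Landauer bound above.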

That has created interest in computing designs that do not delete their input but only permute it in some way, so that the answer can be read off. Such a computer could, in principle, use arbitrarily little energy to perform a calculation. That could have consequences for cryptography, because thermodynamic arguments are often made to demonstrate, for example, that keys of a certain length cannot be broken by brute force.
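The kind of thermodynamic argument meant here can be sketched numerically. Assume (as such arguments do) that a brute-force search performs at least one irreversible bit operation per candidate key, each costing at least kT ln 2; the key length of 128 bits and the temperature of 300 K are illustrative assumptions:

```python
import math

k = 1.380649e-23      # Boltzmann's constant, J/K
T = 300.0             # assumed room temperature (illustrative)
bits = 128            # assumed key length for the example

# Lower bound: at least 2**bits irreversible bit operations,
# each dissipating at least k*T*ln(2) joules (Landauer's principle).
min_energy = (2 ** bits) * k * T * math.log(2)
print(f"Minimum energy to enumerate a {bits}-bit keyspace: {min_energy:.3e} J")
```

The result is on the order of 10¹⁸ joules, a macroscopically enormous amount of energy, which is why such arguments are taken as evidence that 128-bit keys are safe from brute force. A fully reversible computer would sidestep exactly this bound, since it need not erase anything.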