The act of encoding a message using as few symbols (in some alphabet, usually bits) as possible. This is done by removing redundancy: the compressed message is as short as possible while retaining all of the information of the original. Information theory uses entropy to quantify how much information a message contains, and thus how small it can be made by lossless compression. In this sense, the opposite of compression is error-correcting coding, which deliberately adds redundancy to a message.
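A minimal sketch of the entropy idea in Python: estimating the first-order (per-symbol) entropy of a message gives a lower bound on its compressed size under an independent-symbol model, which can be compared with what a real compressor such as zlib achieves. The message contents here are illustrative.

```python
import math
import zlib
from collections import Counter

def shannon_entropy(data: bytes) -> float:
    """Entropy in bits per symbol, estimated from symbol frequencies."""
    counts = Counter(data)
    n = len(data)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

msg = b"ab" * 16                  # highly redundant example message
h = shannon_entropy(msg)          # exactly 1 bit/symbol: two equally likely symbols
bound_bytes = h * len(msg) / 8    # first-order entropy lower bound, in bytes
compressed = zlib.compress(msg, 9)

print(f"original: {len(msg)} bytes, "
      f"entropy bound: {bound_bytes:.1f} bytes, "
      f"zlib: {len(compressed)} bytes")
```

Note that first-order entropy ignores structure between symbols, and a real compressor also pays fixed overhead (headers, checksums), so the two figures bracket rather than equal each other.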