Entropy

The thermodynamic entropy S, often simply called the entropy in the context of chemistry and thermodynamics, is a measure of the amount of energy in a physical system that cannot be used to do work. It is also a measure of the disorder present in a system. http://en.wikipedia.org/wiki/Entropy
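
For reference, the two classical formulations (standard results, not spelled out above): Clausius defined the entropy change of a reversible process as dS = dQ_rev / T, and Boltzmann related the entropy of a system to the number W of its accessible microstates by S = k ln W, where k is Boltzmann's constant. A system whose energy is spread over many microstates (large W) has high entropy and correspondingly little capacity for useful work.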

Claude Shannon defined a measure of entropy that, when applied to an information source, determines the capacity of the channel required to transmit the source (signal) as encoded binary digits. Shannon's measure of entropy came to be taken as a measure of the information contained in a message, as opposed to the portion of the message that is strictly determined (and hence predictable) by inherent structure, such as redundancy in the structure of a language or the statistical frequencies with which particular letter or word pairs, triplets, and so on occur. http://www.wikipedia.org/wiki/Information_entropy
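
As a concrete illustration (a minimal sketch, not part of the original text): for a source with symbol probabilities p(x), Shannon's entropy is H = -sum over x of p(x) log2 p(x), in bits per symbol. The short Python sketch below estimates H from the empirical symbol frequencies of a string; the function name shannon_entropy is a placeholder of ours.

    import math
    from collections import Counter

    def shannon_entropy(message: str) -> float:
        """Estimate entropy in bits per symbol from empirical symbol frequencies."""
        counts = Counter(message)
        total = len(message)
        return -sum((n / total) * math.log2(n / total) for n in counts.values())

    # A redundant, predictable message carries fewer bits per symbol
    # than one whose symbols are evenly distributed.
    print(shannon_entropy("aaaaaaab"))  # about 0.54 bits/symbol
    print(shannon_entropy("abcdefgh"))  # 3.0 bits/symbol, the maximum for 8 distinct symbols

The redundancy of natural language shows up the same way: measured this way, English text comes out well below the log2(26), roughly 4.7, bits per letter that a uniformly random alphabet would give.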
