# Dictionary:Entropy


1. (en′ trə pē) A thermodynamic quantity that measures the energy in a system unavailable for doing work. Higher entropy corresponds to greater disorder. According to the second law of thermodynamics, the entropy of an isolated system never decreases. See thermodynamic functions and Figure T-2.

2. A set G has the entropy H(G):

${\displaystyle H(G)=\log _{2}N}$,

where N is the minimum number of elements needed to specify G.
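As a quick illustration of this definition (a minimal sketch; the value N = 8 is an invented example, not from the source): a set that requires a minimum of 8 elements to specify has entropy log₂ 8 = 3 bits.

```python
import math

# H(G) = log2(N), where N is the minimum number of elements
# needed to specify the set G. Example with N = 8:
N = 8
H = math.log2(N)
print(H)  # 3.0
```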

3. A measure of the uncertainty in a message. If P(m_i) is the probability that message m_i was transmitted, then the entropy H, summed over all possible messages m_i, is given by

${\displaystyle H=-\sum \nolimits _{i}P(m_{i})\log _{2}(P(m_{i}))}$.

The entropy of a situation with no uncertainty is zero. Entropy is a measure of the average information content of a message.
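The formula above can be sketched directly (the function name `entropy` and the sample distributions are illustrative assumptions, not part of the source): a certain outcome gives zero entropy, while four equally likely messages carry 2 bits of average information.

```python
import math

def entropy(probs):
    """Shannon entropy H = -sum_i P(m_i) * log2(P(m_i)), in bits.

    Terms with probability 0 are skipped, since p*log2(p) -> 0 as p -> 0.
    """
    return sum(-p * math.log2(p) for p in probs if p > 0)

# No uncertainty: a single certain message has zero entropy.
print(entropy([1.0]))       # 0.0
# Four equally likely messages: 2 bits of average information.
print(entropy([0.25] * 4))  # 2.0
```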