Entropy
In information theory, the entropy of a random variable is the average level of “information”, or “surprise”, inherent in the variable's possible outcomes. The core idea of information theory is that the “informational value” of a communicated message depends on the degree to which its content is surprising. If a highly likely event occurs, the message carries very little information. If a highly unlikely event occurs, the message is much more informative. For instance, learning that some particular number will not be the winning number of a lottery provides very little information, because any particular chosen number will almost certainly not win. By contrast, learning that a particular number will win the lottery has high informational value, because it communicates the outcome of a very low probability event.
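This “surprise” of a single event with probability $p$ is quantified by its self-information $-\log p$; entropy, defined below, is the average of this quantity. As a minimal illustration in Python (the function name `surprisal` and the lottery probability are assumptions made for this sketch, not taken from the text):

```python
import math

def surprisal(p: float) -> float:
    """Self-information, in bits, of observing an event with probability p."""
    return -math.log2(p)

# A roughly 1-in-10-million lottery: learning that a ticket won is highly
# informative, while learning that it lost carries almost no information.
p_win = 1e-7
print(surprisal(p_win))      # ~23.25 bits
print(surprisal(1 - p_win))  # ~1.4e-7 bits
```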
Statements
Definition. Let $(\Omega, \mathcal{F}, P)$ be a probability space and $X$ a random variable on it. The entropy of $X$ is defined by the formula
$$H(X) = \mathbb{E}\bigl[-\log p(X)\bigr],$$
where $p$ is the
- probability mass function of $X$, when $X$ is discrete, so that $H(X) = -\sum_{x} p(x)\log p(x)$;
- probability density function of $X$, when $X$ is continuous, so that $H(X) = -\int p(x)\log p(x)\,dx$. In this case, the entropy is also called the differential entropy.
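As a quick numerical check of the discrete case, here is a minimal Python sketch (the helper name `entropy` and the coin distributions are illustrative assumptions; logarithms are taken base 2, so the result is in bits):

```python
import math

def entropy(pmf: dict[str, float]) -> float:
    """Shannon entropy H(X) = -sum_x p(x) * log2 p(x), in bits.

    Terms with p(x) = 0 are skipped, following the convention 0 * log 0 = 0.
    """
    return -sum(p * math.log2(p) for p in pmf.values() if p > 0)

# A fair coin attains the maximum entropy for two outcomes (1 bit);
# a heavily biased coin is far more predictable, hence lower entropy.
print(entropy({"heads": 0.5, "tails": 0.5}))    # 1.0
print(entropy({"heads": 0.99, "tails": 0.01}))  # ~0.081
```

For a continuous random variable the sum is replaced by an integral over the density, which yields the differential entropy mentioned above.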