Shannon Entropy

This online calculator computes Shannon entropy for a given event probability table and for a given message.

This page exists due to the efforts of the following people:

Timur

Michele

Created: 2013-06-04 15:04:43, Last updated: 2021-09-30 12:33:45

In information theory, entropy is a measure of the uncertainty in a random variable. In this context, the term usually refers to the Shannon entropy, which quantifies the expected value of the message's information.
Claude E. Shannon introduced the formula for entropy in his 1948 paper "A Mathematical Theory of Communication."

H(X) = - \sum_{i=1}^np(x_i)\log_b p(x_i)

The minus sign is used because the logarithm is negative for probabilities less than 1. However, since

-\log a = \log \frac{1}{a},

the formula can be expressed as

H(X)= \sum_{i=1}^np(x_i)\log_b \frac{1}{p(x_i)}

The expression
\log_b \frac{1}{p(x_i)}
is also called the uncertainty or surprise u_i: the lower the probability p(x_i), i.e. p(x_i) → 0, the higher the uncertainty or potential surprise, i.e. u_i → ∞, for the outcome x_i.

In this case, the formula expresses the mathematical expectation of uncertainty, which is why information entropy and information uncertainty can be used interchangeably.
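For example, for a fair coin with two equally likely outcomes, p(x_1) = p(x_2) = 1/2, the entropy is

H(X) = -\left(\frac{1}{2}\log_2\frac{1}{2} + \frac{1}{2}\log_2\frac{1}{2}\right) = 1

so a single fair coin toss carries exactly one bit of information.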

This calculator computes Shannon entropy for the given probabilities of events.
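The calculation it performs can be sketched in Python as follows. This is a minimal illustration, not the PLANETCALC source; the function name shannon_entropy, the base-2 default, and the fair-die example are assumptions made for the sketch.

import math

def shannon_entropy(probabilities, base=2):
    """H(X) = -sum(p * log_b(p)) over the given probability table."""
    # Zero-probability entries contribute nothing (p*log p -> 0 as p -> 0),
    # so they are skipped to avoid a math domain error.
    return -sum(p * math.log(p, base) for p in probabilities if p > 0)

# Example: a fair six-sided die, rounded to two decimal places
print(round(shannon_entropy([1/6] * 6), 2))  # 2.58 bits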

PLANETCALC, Shannon Entropy

Digits after the decimal point: 2
Entropy, bits

This calculator computes Shannon entropy for symbol frequencies of a given message.
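Its calculation can likewise be sketched in Python (again only an illustration, not the calculator's actual code): symbol probabilities are estimated as relative frequencies counted with collections.Counter. The function name message_entropy and the sample string are assumptions made for the sketch.

import math
from collections import Counter

def message_entropy(message, base=2):
    """Shannon entropy of a message, estimated from its symbol frequencies."""
    counts = Counter(message)   # how often each symbol occurs
    total = len(message)        # message length
    # Each symbol's probability is its relative frequency in the message.
    return -sum((c / total) * math.log(c / total, base) for c in counts.values())

# Example: all four symbols are equally likely, so H = log2(4) = 2 bits
print(round(message_entropy("abcd"), 2))  # 2.0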

PLANETCALC, Shannon Entropy

Digits after the decimal point: 2
Entropy, bits

