
Basics of Information Theory

According to Shannon, the entropy of an information source S is defined as:

$H(S) = \eta = \sum_i p_i \log_2 \frac{1}{p_i}$

where $p_i$ is the probability that symbol $S_i$ in $S$ will occur. Since the logarithm is taken to base 2, the entropy is measured in bits per symbol; it gives the average amount of information conveyed by each symbol, and hence a lower bound on the average code length achievable by any lossless code for the source.
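As a concrete illustration (not part of the original notes), the entropy can be computed directly from the symbol probabilities. This short Python sketch assumes the probabilities are supplied as a list summing to 1:

```python
import math

def entropy(probs):
    """Shannon entropy H(S) = sum_i p_i * log2(1 / p_i), in bits per symbol.

    Terms with p_i == 0 contribute nothing, since p * log2(1/p) -> 0
    as p -> 0.
    """
    return sum(p * math.log2(1.0 / p) for p in probs if p > 0)

# A source with four equally likely symbols has entropy log2(4) = 2 bits.
print(entropy([0.25, 0.25, 0.25, 0.25]))  # -> 2.0

# A certain outcome (p = 1) carries no information.
print(entropy([1.0]))  # -> 0.0
```

Note that a skewed distribution such as {0.5, 0.25, 0.25} gives an entropy of 1.5 bits, below the 2 bits that a fixed-length code for four symbols would use, which is exactly the gap that entropy coding exploits.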



Dave Marshall
10/4/2001