According to Shannon, the entropy of an information source *S* is
defined as:

$$\eta = H(S) = \sum_{i=1}^{n} p_i \log_2 \frac{1}{p_i} = -\sum_{i=1}^{n} p_i \log_2 p_i$$

where *p*_{i} is the probability that symbol *S*_{i} in *S* will occur.

- The term log_2(1/*p*_{i}) indicates the amount of information contained in
*S*_{i}, i.e., the number of bits needed to code *S*_{i}.
- For example, in an image with a uniform distribution of gray-level
intensities, i.e., *p*_{i} = 1/256 for every level, the number of bits needed
to code each gray level is log_2 256 = 8 bits. The entropy of this image is
therefore 8.
- Q: How about an image in which half of the pixels are white (I = 220) and
half are black (I = 10)?
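The entropy formula above can be checked numerically. The following is a minimal sketch (the function name `entropy` is illustrative, not from the source) that computes *H* for the uniform 256-level image and for the two-level white/black image from the question:

```python
import math

def entropy(probs):
    """Shannon entropy in bits: H = -sum(p_i * log2(p_i)) over nonzero p_i."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Uniform gray-level image: 256 levels, each with probability 1/256
uniform = [1 / 256] * 256
print(entropy(uniform))   # 8.0 -> each gray level needs 8 bits

# Half the pixels white (I = 220), half black (I = 10):
# only two symbols occur, each with probability 1/2
bilevel = [0.5, 0.5]
print(entropy(bilevel))   # 1.0 -> one bit per pixel suffices
```

This also answers the question: with only two equiprobable intensities, the entropy is log_2 2 = 1 bit per pixel, regardless of the actual intensity values 220 and 10.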