Regarding that Information Entropy equation, a friend of mine provides the answer:
Shannon entropy: for a discrete random variable X with possible outcomes x, the entropy (the average information content) is given by:
H(X) = SUM over x { p(x) * log_2(1/p(x)) }
where p(x) is the probability of outcome x.
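As a quick sanity check of the formula, here is a minimal sketch in Python (the function name `entropy` is just illustrative) that sums p * log_2(1/p) over a probability distribution:

```python
import math

def entropy(probs):
    """Shannon entropy H = sum p * log2(1/p), in bits.
    Outcomes with p == 0 contribute nothing, since p*log2(1/p) -> 0 as p -> 0."""
    return sum(p * math.log2(1.0 / p) for p in probs if p > 0)

# Fair coin: two outcomes with p = 0.5 each.
print(entropy([0.5, 0.5]))        # 1.0 bit
# Four equally likely outcomes.
print(entropy([0.25] * 4))        # 2.0 bits
```

A fair coin gives exactly 1 bit, and doubling the number of equally likely outcomes adds one bit each time, which matches the base-2 logarithm in the formula.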