Old 2006-04-27, 07:46   Link #50
Psieye
Regarding that information entropy equation, a friend of mine provides the answer:

Shannon's information content: for a random variable x taking values in a set A_X, the average information content (entropy) is

H(X) = SUM over x in A_X of { p(x) * log_2 (1/p(x)) }

where p(x) is the probability of outcome x. Each term log_2(1/p(x)) is the information content of a single outcome, measured in bits; H(X) is the average of those terms weighted by p.
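As a quick illustration (my own sketch, not part of the original explanation), the formula can be computed directly by summing p * log_2(1/p) over the outcomes; the function name `entropy` here is just a convenience:

```python
import math

def entropy(probs):
    """Shannon entropy H = sum p * log2(1/p), in bits.

    Outcomes with p == 0 are skipped, since p * log2(1/p) -> 0
    as p -> 0."""
    return sum(p * math.log2(1.0 / p) for p in probs if p > 0)

# A fair coin is maximally uncertain: exactly 1 bit per toss.
print(entropy([0.5, 0.5]))   # 1.0

# A biased coin is more predictable, so it carries less than 1 bit.
print(entropy([0.9, 0.1]))

# A certain outcome carries no information at all.
print(entropy([1.0]))        # 0.0
```

Note the units depend on the logarithm base: log_2 gives bits, while the natural log would give nats.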