Old 2006-04-27, 16:54   Link #52
lavarock
It's interesting to see this in Haruhi. I took Information Theory last semester, so here's a brief summary and how it may relate to Haruhi.

Shannon entropy is basically a way to quantify information. For a discrete random variable, the entropy is H(x) = -sum( p(x) log2(p(x)) ), where p(x) is the probability mass function. A discrete random variable is one whose probability is concentrated on a set of distinct outcomes, each with a definite probability. For example, for an unbiased coin toss, Pr(X=0) = 0.5, meaning the probability that X = 0 (say, heads) equals 0.5. With base-2 logs, the unit of entropy is bits.
Entropy should be thought of as the uncertainty of a random outcome. Entropy is highest when the probability is evenly distributed over the possible outcomes. For the coin toss, the uncertainty is highest when the coin is completely unbiased, so that's where H reaches its maximum (1 bit).
BTW, the entropy of a continuous RV (differential entropy) is defined slightly differently: the summation becomes an integration.
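Here's a quick Python sketch of the discrete formula above, just to make it concrete (the function name and example distributions are my own, nothing official):

[CODE]
import math

def entropy(pmf):
    """Shannon entropy in bits of a discrete probability mass function.

    pmf is a list of probabilities summing to 1; terms with p = 0
    contribute nothing (p*log2(p) -> 0 as p -> 0).
    """
    return -sum(p * math.log2(p) for p in pmf if p > 0)

# Fair coin: maximum uncertainty, H = 1 bit
print(entropy([0.5, 0.5]))    # 1.0

# Biased coin: less uncertain, so the entropy drops
print(entropy([0.9, 0.1]))    # ~0.469 bits

# Fair six-sided die: H = log2(6) ~ 2.585 bits
print(entropy([1/6] * 6))
[/CODE]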

Entropy is used to calculate the capacity of a transmission channel by
C = max over p(x) of I(x;y),
where I is the mutual information between the output y and input x:
I(x;y)=H(y)-H(y|x)
H(y|x) is the conditional entropy of y given x, so the mutual information is the difference between the uncertainty of the output y on its own and the uncertainty of y once the input x is known. A communication channel's capacity can therefore be understood as the maximum of the mutual information between output and input, taken over all input distributions p(x); equivalently, it measures how much knowing the input reduces the uncertainty about the output. The capacity determines the maximum rate at which data can be sent reliably over that channel.
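To make that concrete, here's a small Python sketch for the classic textbook case, a binary symmetric channel that flips each bit with probability eps (the function names and the brute-force sweep over input distributions are just my illustration; for the BSC the closed form C = 1 - H(eps) is the known answer):

[CODE]
import math

def h2(p):
    """Binary entropy function H(p) in bits."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def bsc_mutual_information(p_x1, eps):
    """I(X;Y) = H(Y) - H(Y|X) for a binary symmetric channel.

    p_x1 is Pr(X = 1); eps is the crossover (bit-flip) probability.
    For the BSC, H(Y|X) = H(eps) no matter what the input distribution is.
    """
    p_y1 = p_x1 * (1 - eps) + (1 - p_x1) * eps   # Pr(Y = 1)
    return h2(p_y1) - h2(eps)

eps = 0.1
# Sweep over input distributions; capacity is the maximum mutual information.
capacity = max(bsc_mutual_information(p / 1000, eps) for p in range(1001))
print(capacity)              # ~0.531 bits per channel use (at the uniform input)
print(1 - h2(eps))           # closed form C = 1 - H(eps), same value
[/CODE]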

Entropy is also a big part of coding theory. Coding is used primarily for data compression and error correction, and is a major part of every computer geek's daily life.
It can be proved that the optimal lossless code cannot beat the entropy of the source (Shannon's source coding theorem). What that means is, for any uniquely decodable code, the average codeword length has to be greater than or equal to the entropy of the source.
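As a sanity check, here's a rough Huffman-coding sketch in Python (my own toy implementation, not a real library): the average codeword length it produces never drops below the source entropy, and matches it exactly when the probabilities are powers of 1/2.

[CODE]
import heapq, itertools, math

def huffman_lengths(pmf):
    """Codeword lengths of a binary Huffman code for pmf (dict symbol -> probability).

    Returns a dict symbol -> codeword length in bits.
    """
    counter = itertools.count()   # tie-breaker so the heap never compares dicts
    heap = [(p, next(counter), {sym: 0}) for sym, p in pmf.items()]
    heapq.heapify(heap)
    while len(heap) > 1:
        # Merge the two least probable subtrees; every symbol inside them
        # gets one more bit on its codeword.
        p1, _, lens1 = heapq.heappop(heap)
        p2, _, lens2 = heapq.heappop(heap)
        merged = {s: l + 1 for s, l in {**lens1, **lens2}.items()}
        heapq.heappush(heap, (p1 + p2, next(counter), merged))
    return heap[0][2]

pmf = {'a': 0.5, 'b': 0.25, 'c': 0.125, 'd': 0.125}
lengths = huffman_lengths(pmf)
avg_len = sum(pmf[s] * lengths[s] for s in pmf)
H = -sum(p * math.log2(p) for p in pmf.values())
print(lengths)        # e.g. {'a': 1, 'b': 2, 'c': 3, 'd': 3}
print(avg_len, H)     # 1.75 1.75 -- average length >= entropy, equal here
                      # because all probabilities are powers of 1/2
[/CODE]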

Now, back to Haruhi: since Yuki represents an alien race, the Integrated Data Entity, it makes perfect sense to have this formula in the OP.

Now I feel like such a nerd, typing out boring stuff like this on an internet forum :P