Here's a post for science buffs, meaning myself. X and Y are uncertain events. The mutual information between two events X and Y is the amount by which we could reduce our uncertainty about X by observing Y. So if X is a coin toss and Y is a different, independent coin toss, the mutual information between X and Y is zero. But if X is a coin toss and Y is a record of the outcome of that toss, we stand to gain all information about X by observing Y (the mutual information is 1 bit).
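The two examples above can be checked by brute force. Here's a little Python sketch (the `entropy` and `mi` helpers are my own, computed by enumerating the joint distribution; none of these names come from any library):

```python
from collections import Counter
from itertools import product
from math import log2

def entropy(joint, which):
    """Shannon entropy (in bits) of the marginal over the given variable indices."""
    marginal = Counter()
    for outcome, p in joint.items():
        marginal[tuple(outcome[i] for i in which)] += p
    return -sum(p * log2(p) for p in marginal.values() if p > 0)

def mi(joint, a, b):
    """Mutual information I(A;B) = H(A) + H(B) - H(A,B)."""
    return entropy(joint, a) + entropy(joint, b) - entropy(joint, a + b)

# Two independent fair coin tosses: every (x, y) pair is equally likely.
independent = {(x, y): 0.25 for x, y in product((0, 1), repeat=2)}
print(mi(independent, (0,), (1,)))  # 0.0 bits

# Y is a perfect record of X: only matching outcomes occur.
record = {(x, x): 0.5 for x in (0, 1)}
print(mi(record, (0,), (1,)))  # 1.0 bit
```

Independent tosses give 0 bits; the perfect copy gives the full 1 bit, just as claimed.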
Now suppose that X and Y are independent coin tosses, and Z=0 if the outcomes of X and Y are equal, or Z=1 if they are not. The three-way mutual information of this setup is negative. Here are some equations:
I(X;Y;Z) = I(X;Y) - I(X;Y|Z)
I(X;Y) = 0 (that's the mutual information of X and Y)
but I(X;Y|Z) = H(X|Z) - H(X|Y,Z) = 1 - 0 = 1 bit > 0
where H(X|Z) means "the uncertainty of X given the result of Z" and H(X|Y,Z) means "the uncertainty of X given the results of Y and Z". Knowing Z alone tells you nothing about X, so H(X|Z) is still 1 bit; but knowing both Y and Z pins X down exactly, so H(X|Y,Z) = 0. Putting it together: I(X;Y;Z) = 0 - 1 = -1 bit.
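You can verify the whole calculation numerically. A sketch in Python, enumerating the joint distribution of (X, Y, Z) where Z is 1 exactly when the tosses differ (the `entropy` helper is my own, not a library function):

```python
from collections import Counter
from math import log2

def entropy(joint, which):
    """Shannon entropy (in bits) of the marginal over the given variable indices."""
    marginal = Counter()
    for outcome, p in joint.items():
        marginal[tuple(outcome[i] for i in which)] += p
    return -sum(p * log2(p) for p in marginal.values() if p > 0)

# Joint distribution of (X, Y, Z): Z = 1 iff the two tosses disagree.
joint = {(x, y, x ^ y): 0.25 for x in (0, 1) for y in (0, 1)}
X, Y, Z = (0,), (1,), (2,)

# I(X;Y) = H(X) + H(Y) - H(X,Y)
i_xy = entropy(joint, X) + entropy(joint, Y) - entropy(joint, X + Y)

# I(X;Y|Z) = H(X,Z) + H(Y,Z) - H(Z) - H(X,Y,Z)
i_xy_given_z = (entropy(joint, X + Z) + entropy(joint, Y + Z)
                - entropy(joint, Z) - entropy(joint, X + Y + Z))

# I(X;Y;Z) = I(X;Y) - I(X;Y|Z)
print(i_xy, i_xy_given_z, i_xy - i_xy_given_z)  # 0.0 1.0 -1.0
```

Sure enough: zero unconditional mutual information, one full bit once Z is known, for a three-way mutual information of -1 bit.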
What does it mean!?