The relation between the AMI and the other entropies can be identified as follows:

I_{i,o} = H_i − H_{i|o} = H_o − H_{o|i}.

This relation states explicitly that the information equals the decrease in the entropy associated with the inflows once the outflows are known (or, equivalently, the decrease in the entropy of the outputs once the inputs are known), and that the AMI is symmetric.
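As a numerical illustration (not from the article), the two forms of this relation can be checked on a small example joint distribution over inflow and outflow states; the distribution below is arbitrary and chosen only for the sketch.

```python
import math

# Hypothetical joint probabilities p[i][o]: rows = inflow states,
# columns = outflow states (an arbitrary example, not the article's data).
p = [[0.30, 0.10],
     [0.05, 0.55]]

p_i = [sum(row) for row in p]          # marginal distribution of inflows
p_o = [sum(col) for col in zip(*p)]    # marginal distribution of outflows

def H(dist):
    """Shannon entropy (in bits) of a probability list."""
    return -sum(q * math.log2(q) for q in dist if q > 0)

H_joint = H([q for row in p for q in row])   # H_{i,o}
I = H(p_i) + H(p_o) - H_joint                # AMI, I_{i,o}

H_i_given_o = H_joint - H(p_o)               # H_{i|o}
H_o_given_i = H_joint - H(p_i)               # H_{o|i}

# AMI as a decrease in entropy, in both directions, with the same value:
assert math.isclose(I, H(p_i) - H_i_given_o)
assert math.isclose(I, H(p_o) - H_o_given_i)   # symmetry of the AMI
```

Both assertions hold for any joint distribution, since they follow algebraically from the definitions above.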
If we write the relation between entropies and information as H_{i,o} = H_i + H_o − I_{i,o},
we can nicely associate each quantity with a Venn diagram. The diagrams are presented in Figure 4.
Starting from the top left (where the two entropies are sketched as intersecting circles), we can read off the entropy of each process (middle left) and the joint entropy as the union of the two circles (bottom left). The AMI is represented by the intersection of the two circles (top right), the sum of the conditional entropies by the union minus the intersection (middle right), and, finally, each conditional entropy as a difference (bottom right). All the identities and inequalities presented above can be written using this simple representation. This representation also makes the significance of the AMI easier to grasp: this number represents the decrease in uncertainty about one process once the other is known. For example, associating process a with the inputs and process b with the outputs, we can directly translate identity [21] as: the joint entropy (bottom left) equals the conditional entropy of inputs given outputs (bottom right) plus the entropy of the outputs (right circle).
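The Venn-diagram identities can also be verified numerically. In this sketch (an assumed example, not the article's data), process a stands for the inputs and process b for the outputs, and each Venn region is computed from an arbitrary joint distribution.

```python
import math

# Arbitrary example joint probabilities p[a][b] (a = inputs, b = outputs).
p = [[0.25, 0.15],
     [0.10, 0.50]]

def H(dist):
    """Shannon entropy (in bits) of a probability list."""
    return -sum(q * math.log2(q) for q in dist if q > 0)

H_a = H([sum(row) for row in p])          # left circle: entropy of inputs
H_b = H([sum(col) for col in zip(*p)])    # right circle: entropy of outputs
H_ab = H([q for row in p for q in row])   # union of the circles: joint entropy
I = H_a + H_b - H_ab                      # intersection: the AMI
H_a_given_b = H_ab - H_b                  # left circle minus the intersection
H_b_given_a = H_ab - H_a                  # right circle minus the intersection

# Identity [21]: joint entropy = conditional entropy of inputs given
# outputs + entropy of outputs.
assert math.isclose(H_ab, H_a_given_b + H_b)

# Union minus intersection = sum of the two conditional entropies.
assert math.isclose(H_ab - I, H_a_given_b + H_b_given_a)
```

Each assertion mirrors one region of the diagram: the first is the bottom-left panel decomposed into the bottom-right panel plus the right circle, the second the middle-right panel.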