The relation between AMI and other entropies can be identified as follows:
This relation makes explicit that the information equals the decrease in the entropy associated with the inflows once the outflows are known (or, equivalently, the decrease in the entropy of the outputs once the inputs are known), and that the AMI is symmetric.
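The decrease-in-entropy reading and the symmetry of the AMI can be checked numerically. The sketch below uses a small hypothetical joint distribution over two inputs and two outputs (the numbers are illustrative, not taken from the article) and computes the information both ways:

```python
import math

# A hypothetical 2x2 joint distribution p(input, output); the numbers
# are illustrative only, not taken from the article.
p = {("a", "x"): 0.4, ("a", "y"): 0.1,
     ("b", "x"): 0.1, ("b", "y"): 0.4}

def entropy(dist):
    """Shannon entropy (in bits) of a distribution given as {event: probability}."""
    return -sum(q * math.log2(q) for q in dist.values() if q > 0)

# Marginal distributions of the inputs (i) and the outputs (o)
p_i, p_o = {}, {}
for (i, o), q in p.items():
    p_i[i] = p_i.get(i, 0.0) + q
    p_o[o] = p_o.get(o, 0.0) + q

H_io, H_i, H_o = entropy(p), entropy(p_i), entropy(p_o)

# Conditional entropies via the chain rule: H(i|o) = H(i,o) - H(o)
H_i_given_o = H_io - H_o
H_o_given_i = H_io - H_i

# The information computed both ways: the decrease in the entropy of the
# inputs once the outputs are known, and vice versa.
I_from_inputs = H_i - H_i_given_o
I_from_outputs = H_o - H_o_given_i

# Symmetry of the AMI: both differences give the same number
assert abs(I_from_inputs - I_from_outputs) < 1e-12
```

Any valid joint distribution can be substituted for `p`; the two differences always coincide, since both equal H_i + H_o - H_{i,o}.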
If we write the relation between entropies and information as H_{i,o} = H_i + H_o - I_{i;o},
we can nicely associate each quantity with a Venn diagram. The diagrams are presented in Figure 4.
Starting from the top left (where the two entropies are sketched as intersecting circles), we see the entropy of each process (middle left) and the joint entropy as the union of the two circles (bottom left). The AMI is represented by the intersection of the two circles (top right), the sum of the conditional entropies by the union minus the intersection (middle right), and finally the conditional entropy as a difference (bottom right). All the identities and inequalities presented above can be written using this simple representation, which also makes the meaning of the AMI easier to grasp: it is the decrease in uncertainty about one process once the other is known. For example, associating process a with the inputs and process b with the outputs, we can translate the identity H_{i,o} = H_{i|o} + H_o directly as: the joint entropy (bottom left) equals the conditional entropy of the inputs given the outputs (bottom right) plus the entropy of the outputs (right circle).
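The Venn-diagram identity relating the joint entropy to a conditional entropy and a marginal entropy can also be verified directly, computing H(i|o) as an average over the output states rather than through the chain rule. The joint distribution below is a hypothetical example chosen for illustration:

```python
import math

# Hypothetical joint distribution p(input, output); illustrative numbers only.
p = {("a", "x"): 0.4, ("a", "y"): 0.1,
     ("b", "x"): 0.1, ("b", "y"): 0.4}

def H(dist):
    """Shannon entropy (in bits) of {event: probability}."""
    return -sum(q * math.log2(q) for q in dist.values() if q > 0)

# Marginal distribution of the outputs
p_o = {}
for (i, o), q in p.items():
    p_o[o] = p_o.get(o, 0.0) + q

# H(i|o) computed from its definition as an average over output states:
# H(i|o) = sum over o of p(o) * H(i | O = o)
H_i_given_o = 0.0
for o_val, q_o in p_o.items():
    conditional = {i: q / q_o for (i, o), q in p.items() if o == o_val}
    H_i_given_o += q_o * H(conditional)

# The identity: joint entropy = conditional entropy of the inputs given
# the outputs + entropy of the outputs
assert abs(H(p) - (H_i_given_o + H(p_o))) < 1e-9
```

The same check with the roles of inputs and outputs exchanged verifies the companion identity H_{i,o} = H_{o|i} + H_i.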