We may define conditional probabilities and entropies associated with events of the form 'a particle that is now in compartment i moves to compartment j'. In this case we know that the particle is currently in compartment i, but we want to measure the uncertainty associated with its next destination.
The conditional probability will be of the type 'probability of input to j given the output from i' with the following formal structure:
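The formula itself did not survive here; a plausible sketch, assuming $f_{ij}$ denotes the flow from compartment $i$ to compartment $j$ and $f_{i\cdot} = \sum_k f_{ik}$ the total output of $i$ (this flow notation is an assumption):
\[
P_{I|O}(j \mid i) = \frac{f_{ij}}{f_{i\cdot}} = \frac{f_{ij}}{\sum_k f_{ik}}.
\]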
In the same way, we can define the conditional probability of a particle coming from compartment i once we know that it has arrived in compartment j as:
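A plausible sketch of this second conditional probability, again assuming flows $f_{ij}$ from $i$ to $j$ and writing $f_{\cdot j} = \sum_k f_{kj}$ for the total input to compartment $j$:
\[
P_{O|I}(i \mid j) = \frac{f_{ij}}{f_{\cdot j}} = \frac{f_{ij}}{\sum_k f_{kj}}.
\]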
The associated entropies thus become
\[
H_{I|O} = -\sum_i \sum_j P(i,j)\,\log P_{I|O}(j \mid i) = -\sum_i \sum_j \frac{f_{ij}}{f_{\cdot\cdot}} \log \frac{f_{ij}}{f_{i\cdot}}
\]
\[
H_{O|I} = -\sum_i \sum_j P(i,j)\,\log P_{O|I}(i \mid j) = -\sum_i \sum_j \frac{f_{ij}}{f_{\cdot\cdot}} \log \frac{f_{ij}}{f_{\cdot j}}
\]
where $f_{ij}$ is the flow from compartment $i$ to compartment $j$, $f_{i\cdot} = \sum_k f_{ik}$ is the total output of $i$, $f_{\cdot j} = \sum_k f_{kj}$ is the total input to $j$, and $f_{\cdot\cdot} = \sum_{i,k} f_{ik}$ is the total flow.
Having defined these entropy measures, we now need some identities and inequalities in order to define the AMI. Let us start with the following identity:
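The identity in question is presumably the chain rule for the joint entropy $H(I,O)$, sketched here in the notation of the preceding definitions:
\[
H(I,O) = H_I + H_{O|I} = H_O + H_{I|O}.
\]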
This relation tells us that the joint entropy equals the entropy associated with the inputs (outputs) plus the conditional entropy of the outputs given the inputs (of the inputs given the outputs). It also exhibits the symmetry of the joint entropy. Moreover, the joint entropy is less than or equal to the sum of the two processes' entropies:
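The inequality referred to is presumably the subadditivity of the joint entropy:
\[
H(I,O) \le H_I + H_O.
\]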
Equality holds if and only if the two processes are independent.
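As a concrete illustration, here is a short Python sketch computing these quantities from a flow matrix; the helper name `entropies` and the NumPy representation of the flows are assumptions for illustration, not part of the original text. The AMI is obtained as $H_I + H_O - H(I,O)$, which vanishes exactly when the two processes are independent.

```python
import numpy as np

def entropies(F):
    """Joint, marginal, and conditional entropies (in nats) for a
    flow matrix F, where F[i, j] is the flow from compartment i to
    compartment j.  Hypothetical helper, not from the original text."""
    F = np.asarray(F, dtype=float)
    P = F / F.sum()                    # joint probability P(i, j)
    p_out = P.sum(axis=1)              # P_O(i): particle leaves i
    p_in = P.sum(axis=0)               # P_I(j): particle arrives in j
    nz = P > 0                         # convention: 0 log 0 = 0
    H_joint = -np.sum(P[nz] * np.log(P[nz]))
    H_O = -np.sum(p_out[p_out > 0] * np.log(p_out[p_out > 0]))
    H_I = -np.sum(p_in[p_in > 0] * np.log(p_in[p_in > 0]))
    # chain rule: H_{I|O} = H(I,O) - H_O  and  H_{O|I} = H(I,O) - H_I
    H_I_given_O = H_joint - H_O
    H_O_given_I = H_joint - H_I
    ami = H_I + H_O - H_joint          # average mutual information
    return H_joint, H_I, H_O, H_I_given_O, H_O_given_I, ami
```

For a rank-one flow matrix (independent input and output processes) the AMI is zero and the joint entropy attains the bound $H_I + H_O$; for a diagonal (fully determined) flow matrix the conditional entropies vanish.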