## Conditional Entropies

We may also define conditional probabilities and entropies, associated with events of the form 'a particle that is now in compartment i moves to compartment j'. In this case we know that the particle is currently in compartment i, but we want to measure the uncertainty associated with its next destination.

The conditional probability is of the type 'probability of input to j given the output from i', with the following formal structure:

$$p_{i|o}(j \mid i) = \frac{p(i,j)}{p_o(i)}$$

where $p(i,j)$ is the joint probability of an output from i and an input to j, and $p_o(i)$ is the marginal probability of an output from i.

In the same way, we can define the conditional probability that a particle came from compartment i once we know that it has arrived in compartment j as:

$$p_{o|i}(i \mid j) = \frac{p(i,j)}{p_i(j)}$$

where $p_i(j)$ is the marginal probability of an input to j.

The associated entropies thus become:

$$H_{i|o} = -\sum_i \sum_j p(i,j)\,\log p_{i|o}(j \mid i) \qquad [14]$$

$$H_{o|i} = -\sum_i \sum_j p(i,j)\,\log p_{o|i}(i \mid j) \qquad [15]$$
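As a minimal sketch (not from the source), the two conditional entropies can be computed directly from a joint probability matrix with NumPy. The matrix `p` below is a made-up example: entry `[i, j]` is the probability that a particle leaves compartment i and enters compartment j.

```python
import numpy as np

# Hypothetical joint probability matrix p(i, j) over two compartments.
p = np.array([[0.30, 0.10],
              [0.20, 0.40]])

p_out = p.sum(axis=1)  # marginal probability of an output from i
p_in = p.sum(axis=0)   # marginal probability of an input to j

# Eq. [14]: H_{i|o} = -sum_{i,j} p(i,j) log p(j|i), with p(j|i) = p(i,j)/p_out(i)
H_i_given_o = -np.sum(p * np.log(p / p_out[:, None]))

# Eq. [15]: H_{o|i} = -sum_{i,j} p(i,j) log p(i|j), with p(i|j) = p(i,j)/p_in(j)
H_o_given_i = -np.sum(p * np.log(p / p_in[None, :]))
```

Broadcasting (`p_out[:, None]`, `p_in[None, :]`) divides each row or column of the joint matrix by the corresponding marginal, giving the conditional probabilities inside the logarithms.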

Having defined these entropy measures, we now need a few identities and inequalities in order to define the AMI. Let us start with the following identity:

$$H = H_i + H_{o|i} = H_o + H_{i|o} \qquad [16]$$

Relation [16] tells us that the joint entropy equals the entropy associated with inputs (outputs) plus the conditional entropy of outputs given inputs (of inputs given outputs). It also shows the symmetry of the joint entropy. Moreover, the joint entropy is less than or equal to the sum of the two processes' entropies:

$$H \le H_i + H_o \qquad [17]$$

Equality holds if and only if the two processes are independent.
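A quick numerical check of identity [16] and of the inequality, again with made-up matrices (a correlated one, where the inequality is strict, and an independent one built as an outer product of marginals, where it becomes an equality):

```python
import numpy as np

def entropies(p):
    """Joint, marginal, and conditional entropies of a joint matrix p(i, j)."""
    p_out, p_in = p.sum(axis=1), p.sum(axis=0)
    H = -np.sum(p * np.log(p))                      # joint entropy
    H_i = -np.sum(p_in * np.log(p_in))              # entropy of inputs
    H_o = -np.sum(p_out * np.log(p_out))            # entropy of outputs
    H_io = -np.sum(p * np.log(p / p_out[:, None]))  # H_{i|o}, eq. [14]
    H_oi = -np.sum(p * np.log(p / p_in[None, :]))   # H_{o|i}, eq. [15]
    return H, H_i, H_o, H_io, H_oi

# Correlated flows: identity [16] holds, the inequality is strict.
p = np.array([[0.30, 0.10], [0.20, 0.40]])
H, H_i, H_o, H_io, H_oi = entropies(p)
assert np.isclose(H, H_i + H_oi) and np.isclose(H, H_o + H_io)  # [16]
assert H < H_i + H_o                                            # strict

# Independent flows: p(i, j) = p_out(i) * p_in(j), so H = H_i + H_o exactly.
q = np.outer([0.4, 0.6], [0.5, 0.5])
H, H_i, H_o, *_ = entropies(q)
assert np.isclose(H, H_i + H_o)
```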
