Conditional Entropies

We may define conditional probabilities and entropies associated with events of the form 'a particle that is now in compartment i moves to compartment j'. In this case we know that the particle is currently in compartment i, but we want to measure the uncertainty associated with its next destination.

The conditional probability will be of the type 'probability of input to j given the output from i', with the following formal structure:

p_{i|o}(j \mid i) = \frac{T_{ij}}{T_{i\cdot}}, \qquad T_{i\cdot} = \sum_j T_{ij},

where T_{ij} denotes the flow from compartment i to compartment j and T_{i\cdot} is the total output of compartment i.

In the same way, we can define the conditional probability that a particle came from compartment i, once we know that it has arrived in compartment j, as:

p_{o|i}(i \mid j) = \frac{T_{ij}}{T_{\cdot j}}, \qquad T_{\cdot j} = \sum_i T_{ij},

where T_{\cdot j} is the total input to compartment j.
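
As a concrete numerical sketch (the 3-compartment flow matrix T below is invented for illustration and is not from the text), these conditional probabilities can be computed directly from the flows, and each conditional distribution must sum to one:

```python
# Hypothetical 3-compartment flow matrix: T[i][j] = flow from compartment i to j.
# Values are illustrative only.
T = [[0.0, 4.0, 1.0],
     [2.0, 0.0, 3.0],
     [1.0, 2.0, 0.0]]
n = len(T)

row = [sum(T[i]) for i in range(n)]                        # T_i. : total output of i
col = [sum(T[i][j] for i in range(n)) for j in range(n)]   # T_.j : total input to j

# p(j|i): probability that a particle leaving compartment i enters compartment j.
p_in_given_out = [[T[i][j] / row[i] for j in range(n)] for i in range(n)]

# p(i|j): probability that a particle arriving in compartment j came from i.
p_out_given_in = [[T[i][j] / col[j] for j in range(n)] for i in range(n)]

# Each conditional distribution is normalized.
for i in range(n):
    assert abs(sum(p_in_given_out[i]) - 1.0) < 1e-12
for j in range(n):
    assert abs(sum(p_out_given_in[i][j] for i in range(n)) - 1.0) < 1e-12
```

Rows of p_in_given_out (and columns of p_out_given_in) summing to one is exactly the normalization the formal definitions require.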

The associated entropies thus become

H_{i|o} = -\sum_i \sum_j p(i,j)\,\log p_{i|o}(j \mid i) = -\sum_i \sum_j \frac{T_{ij}}{T} \log \frac{T_{ij}}{T_{i\cdot}} \qquad [14]

H_{o|i} = -\sum_i \sum_j p(i,j)\,\log p_{o|i}(i \mid j) = -\sum_i \sum_j \frac{T_{ij}}{T} \log \frac{T_{ij}}{T_{\cdot j}} \qquad [15]

where p(i,j) = T_{ij}/T is the joint probability and T = \sum_i \sum_j T_{ij} is the total flow.
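
A minimal numerical sketch of [14] and [15] follows; the 3-compartment flow matrix T is invented for illustration and is not from the text:

```python
import math

# Hypothetical flow matrix: T[i][j] = flow from compartment i to j (illustrative).
T = [[0.0, 4.0, 1.0],
     [2.0, 0.0, 3.0],
     [1.0, 2.0, 0.0]]
n = len(T)
tot = sum(map(sum, T))                                     # T   : total flow
row = [sum(T[i]) for i in range(n)]                        # T_i.: total output of i
col = [sum(T[i][j] for i in range(n)) for j in range(n)]   # T_.j: total input to j

# H_{i|o} = -sum_ij (T_ij / T) log(T_ij / T_i.)   [14]
H_i_given_o = -sum((T[i][j] / tot) * math.log(T[i][j] / row[i])
                   for i in range(n) for j in range(n) if T[i][j] > 0)

# H_{o|i} = -sum_ij (T_ij / T) log(T_ij / T_.j)   [15]
H_o_given_i = -sum((T[i][j] / tot) * math.log(T[i][j] / col[j])
                   for i in range(n) for j in range(n) if T[i][j] > 0)

print(H_i_given_o, H_o_given_i)
```

Zero flows are skipped, following the usual convention 0 log 0 = 0; both results are nonnegative and bounded by log n.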

Having defined these entropy measures, we now need some identities and inequalities in order to define AMI. Let us start with the following identity:

H = H_i + H_{o|i} = H_o + H_{i|o} \qquad [16]

where H is the joint entropy and H_i (H_o) is the entropy associated with the inputs (outputs).

Relation [16] tells us that the joint entropy is equal to the entropy associated with the inputs (outputs) plus the conditional entropy of the outputs given the inputs (of the inputs given the outputs). It also shows the symmetry of the joint entropy. Moreover, the joint entropy is less than or equal to the sum of the entropies of the two processes:

H \leq H_i + H_o \qquad [17]

Equality is reached only if the two processes are independent.
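
The identity and the inequality above can be checked numerically; this sketch uses the same invented 3-compartment flow matrix as before (not from the text):

```python
import math

# Hypothetical flow matrix: T[i][j] = flow from compartment i to j (illustrative).
T = [[0.0, 4.0, 1.0],
     [2.0, 0.0, 3.0],
     [1.0, 2.0, 0.0]]
n = len(T)
tot = sum(map(sum, T))
row = [sum(T[i]) for i in range(n)]                        # T_i.
col = [sum(T[i][j] for i in range(n)) for j in range(n)]   # T_.j

def H_terms(pairs):
    """Entropy-style sum -sum p*log(q), skipping zero-probability terms."""
    return -sum(p * math.log(q) for p, q in pairs if p > 0)

H    = H_terms([(T[i][j] / tot, T[i][j] / tot)
                for i in range(n) for j in range(n)])       # joint entropy
H_i  = H_terms([(col[j] / tot, col[j] / tot)
                for j in range(n)])                         # input entropy
H_o  = H_terms([(row[i] / tot, row[i] / tot)
                for i in range(n)])                         # output entropy
H_io = H_terms([(T[i][j] / tot, T[i][j] / row[i])
                for i in range(n) for j in range(n)])       # H_{i|o}
H_oi = H_terms([(T[i][j] / tot, T[i][j] / col[j])
                for i in range(n) for j in range(n)])       # H_{o|i}

assert abs(H - (H_o + H_io)) < 1e-12   # H = H_o + H_{i|o}
assert abs(H - (H_i + H_oi)) < 1e-12   # H = H_i + H_{o|i}
assert H <= H_i + H_o + 1e-12          # joint entropy is subadditive
```

The slack in the last inequality, H_i + H_o - H, is exactly the (nonnegative) mutual information between inputs and outputs, which vanishes only when the two processes are independent.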
