## Entropy

What follows is the derivation of entropy taken from information theory, or Shannon entropy. Note that this formulation of entropy resembles the form of thermodynamic entropy (or Gibbs entropy), but the connection between the two concepts is only metaphorical. If, at a given time, we mark at random a particle that is traveling in the ecosystem, we can compute the probability associated with the event 'the particle is moving from compartment i to compartment j' as the flow from i to j divided by the total flow in the system:

$$p_{ij} = \frac{f_{ij}}{\sum_{k,l} f_{kl}}$$

Recalling the correspondence between flows and edges in the network, we can refer to the above quantity as the probability associated with the edge e_j.
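As a minimal sketch of this computation, assuming the flows are stored in a matrix `F` where `F[i, j]` is the flow from compartment i to compartment j (the matrix values are illustrative):

```python
import numpy as np

# Hypothetical flow matrix: F[i, j] = flow from compartment i to compartment j
F = np.array([
    [0.0, 4.0, 1.0],
    [0.0, 0.0, 3.0],
    [2.0, 0.0, 0.0],
])

# Probability that a randomly marked particle is crossing edge (i, j):
# the edge's flow divided by the total flow in the system.
P = F / F.sum()

print(P.sum())  # the edge probabilities sum to 1
```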

Repeating this experiment many times (i.e., each time marking a particle at random and checking which edge it is crossing) will yield a sequence of edges. Some edges will be represented more often than others, as their associated probability is larger.
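This repeated marking experiment can be simulated by sampling edges according to their probabilities (the edge labels and probabilities below are illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical edge labels and their probabilities (flow / total flow)
edges = ["e1", "e2", "e3", "e4"]
probs = [0.4, 0.3, 0.2, 0.1]

# Mark 10,000 particles at random and record which edge each one crosses
sequence = rng.choice(edges, size=10_000, p=probs)

# Edges with larger probability appear more often in the sequence
labels, counts = np.unique(sequence, return_counts=True)
freqs = dict(zip(labels, counts / len(sequence)))
print(freqs)
```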

Shannon introduced the concept of entropy as a measure of the uncertainty associated with this sort of sequence:

$$H_X = -\sum_{i=1}^{m} p_i \log(p_i)$$

Entropy is minus the sum, over each possible outcome i, of the probability of i times the logarithm of that probability (note that in this formula 0 log(0) is taken to be 0). Some properties of the entropy are: (1) H_X ≥ 0, as the probabilities p_i ≤ 1; (2) H_X ≤ log(m), where m is the number of possible events; equality holds if p_i = 1/m for every i.
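A minimal sketch of the entropy computation and its two bounds (the probability values are illustrative):

```python
import numpy as np

def shannon_entropy(p):
    """H = -sum(p_i * log(p_i)), with the convention 0*log(0) = 0."""
    p = np.asarray(p, dtype=float)
    nz = p[p > 0]  # drop zero-probability events: 0*log(0) := 0
    return -np.sum(nz * np.log(nz))

# Edge probabilities from a hypothetical flow network
p = np.array([0.4, 0.3, 0.2, 0.1])
m = len(p)

H = shannon_entropy(p)

print(H)           # satisfies 0 <= H <= log(m)
print(np.log(m))   # the upper bound log(m)
print(shannon_entropy(np.full(m, 1 / m)))  # uniform case: equals log(m)
```

The uniform distribution maximizes uncertainty: when every edge is equally likely, observing the sequence tells us nothing about which edge to expect next.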
