What follows is the derivation of entropy from information theory, or Shannon entropy. Note that this formulation of entropy resembles the form of thermodynamic entropy (or Gibbs entropy), but the connection between the two concepts is purely metaphorical. If, at a given time, we mark at random a particle that is traveling in the ecosystem, we can compute the probability associated with the event 'the particle is moving from compartment i to compartment j' as the flow along that edge divided by the total flow in the network:

p_ij = f_ij / Σ_{k,l} f_kl
Recalling the correspondence between flows and edges in the network, we can refer to the above quantity as the probability associated with the edge e_j.
Repeating this experiment many times (i.e., each time we mark a particle at random and check which edge it crosses) will yield a sequence of edges. Some edges will appear more often than others, because their associated probabilities are larger.
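The marking experiment can be sketched as follows. The flow values below are illustrative assumptions (the text does not specify a particular network); each marked particle is assigned to an edge with probability proportional to that edge's flow.

```python
import random
from collections import Counter

# Hypothetical flow network: flows[(i, j)] is the flow from compartment i to j.
# The numbers are illustrative; the text leaves the actual flows unspecified.
flows = {("A", "B"): 40.0, ("B", "C"): 30.0, ("A", "C"): 20.0, ("C", "A"): 10.0}
edges = list(flows)

def mark_particles(n, rng=random):
    """Mark n particles at random; each lands on an edge with
    probability proportional to that edge's flow."""
    return rng.choices(edges, weights=[flows[e] for e in edges], k=n)

# Repeating the experiment yields a sequence in which high-flow
# edges appear more often than low-flow ones.
counts = Counter(mark_particles(10_000))
```

With these flows, the edge ("A", "B") carries 40% of the total throughput and should dominate the resulting sequence.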
Shannon introduced the concept of entropy as a measure of the uncertainty associated with sequences of this sort:

H_X = - Σ_i p_i log(p_i)
Entropy is the negative of the sum, over each possible outcome i, of the probability p_i times the logarithm of p_i (note that in this formula 0 log(0) is taken to be 0). Some properties of the entropy are: (1) H_X ≥ 0, as the probabilities p_i ≤ 1; (2) H_X ≤ log(m), where m is the number of possible events. Equality holds if p_i = 1/m for every i.
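The definition and the two properties can be verified with a short sketch. The edge probabilities below are illustrative assumptions, not values from the text:

```python
import math

def shannon_entropy(probs):
    """H_X = -sum_i p_i * log(p_i), with the convention 0*log(0) = 0."""
    return -sum(p * math.log(p) for p in probs if p > 0)

# Illustrative edge probabilities (an assumption; any distribution works).
probs = [0.5, 0.25, 0.125, 0.125]
m = len(probs)
H = shannon_entropy(probs)

# Property (1): H_X >= 0, since every p_i <= 1.
assert H >= 0
# Property (2): H_X <= log(m), with equality for the uniform distribution.
assert H <= math.log(m)
assert abs(shannon_entropy([1 / m] * m) - math.log(m)) < 1e-12
```

A single certain outcome (p = 1) gives H_X = 0, the minimum; the uniform distribution gives log(m), the maximum.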