Entropy

What follows is the derivation of entropy as defined in information theory, also known as Shannon entropy. Note that this formulation of entropy resembles the form of thermodynamic entropy (or Gibbs entropy), but the connection between the two concepts is only metaphorical. If, at a given time, we mark at random a particle that is traveling in the ecosystem, we can compute the probability associated with the event 'the particle is moving from compartment i to compartment j' as:

p_ij = t_ij / T

where t_ij is the flow from compartment i to compartment j and T is the total flow in the system (the sum of all flows).

Recalling the correspondence between flows and edges in the network, we can regard the above quantity as the probability associated with the edge e_j.
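
To make this concrete, here is a minimal sketch, assuming a small hypothetical flow matrix t in which t[i][j] is the flow from compartment i to compartment j (the values are illustrative, not taken from the text); each edge probability is simply the flow on that edge divided by the total flow.

```python
import numpy as np

# Hypothetical 3-compartment flow matrix: t[i][j] is the flow from
# compartment i to compartment j (arbitrary units).
t = np.array([
    [0.0, 4.0, 1.0],
    [0.0, 0.0, 3.0],
    [2.0, 0.0, 0.0],
])

# Total flow in the system: the sum over all edges.
T = t.sum()

# Probability that a randomly marked particle is crossing edge i -> j.
p = t / T

print(p)        # edge probabilities
print(p.sum())  # sums to 1 by construction
```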

Repeating this experiment many times (i.e., each time we mark a particle at random and check which edge, or flow coefficient, it crosses) will yield a sequence of edges. Some edges will be represented more often than others, because their associated probabilities are larger.
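
This marking experiment can be simulated directly. The sketch below assumes the hypothetical edge probabilities from the previous example (flattened so that each nonzero entry corresponds to one edge) and draws a long sequence of edges; the empirical frequencies approach the underlying probabilities.

```python
import numpy as np

rng = np.random.default_rng(0)

# Edge probabilities from the hypothetical flow matrix above, flattened
# so that each entry corresponds to one edge e_k of the network.
p = np.array([0.0, 0.4, 0.1,
              0.0, 0.0, 0.3,
              0.2, 0.0, 0.0])

# Mark 10,000 particles at random; each trial records the edge crossed.
sequence = rng.choice(len(p), size=10_000, p=p)

# Edges with larger probabilities appear more often in the sequence.
counts = np.bincount(sequence, minlength=len(p))
print(counts / counts.sum())  # empirical frequencies, close to p
```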

Shannon introduced the concept of entropy as a measure of the uncertainty associated with this sort of sequence:

H_X = -∑_i p_i log(p_i)

Entropy is equal to minus the sum, over each possible outcome i, of the probability of i times the logarithm of the probability of i (note that in this formula 0 log(0) is taken to be 0). Some properties of the entropy are: (1) H_X ≥ 0, as the probabilities p_i ≤ 1 and therefore log(p_i) ≤ 0; (2) H_X ≤ log(m), where m is the number of possible events, with equality if p_i = 1/m for every i.
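
As an illustration, the sketch below computes the entropy of a discrete distribution using the 0 log(0) = 0 convention and checks both properties numerically; the function name shannon_entropy and the example probabilities are placeholders, not notation from the text.

```python
import numpy as np

def shannon_entropy(p):
    """H_X = -sum_i p_i * log(p_i), with the convention 0*log(0) = 0."""
    p = np.asarray(p, dtype=float)
    nz = p[p > 0]                      # drop zero-probability events
    return -np.sum(nz * np.log(nz))

# Placeholder edge probabilities (nonzero entries of the earlier example).
p = [0.4, 0.1, 0.3, 0.2]
m = len(p)

H = shannon_entropy(p)
print(H)                              # entropy of the distribution
print(0 <= H <= np.log(m))            # True: 0 <= H_X <= log(m)

# The uniform distribution attains the maximum log(m).
print(np.isclose(shannon_entropy([1 / m] * m), np.log(m)))  # True
```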
