The concept of stability is one of the most important in ecology (for details see Chapter 6). It is evident that only a stable community can exist over a rather long time, and clearly that is possible only if the sizes of the populations constituting the community do not undergo large fluctuations. This definition is close to the thermodynamic (or, more correctly, the statistical-physics) notion of system stability. In thermodynamics (statistical physics) a system is considered stable when large fluctuations, which could take the system far from equilibrium or even destroy it, are unlikely (see, for instance, Landau and Lifshitz, 1995). Evidently, the general thermodynamic concepts (for instance, the stability principle associated in the case of closed systems with the Second Law and, in the case of open systems, with Prigogine's theorem) should be applicable to biological (and, in particular, ecological) systems. As an illustration, we consider the well-known problem of the relationship between the species diversity of a community and its stability.

Ecologists consider it almost axiomatic that communities which are more complex in structure and richer in species are more stable. Any popular ecological textbook (for instance, E. Odum's Fundamentals of Ecology) will convince you of this. The usual explanation is the following: different species adapt differently to environmental variations, so a community containing many species can respond more successfully to varying environmental conditions than a community composed of only a few species; hence the former will be more stable. In other words, the more diverse the community, the more stable it is. Perhaps this motivated the suggestion of Shannon's information entropy as a measure of species diversity (Margalef, 1951, 1968; MacArthur, 1955):

$$D = -\sum_{i=1}^{n} p_i \log_2 p_i, \qquad (4.1)$$

where $p_i = N_i / \sum_{j=1}^{n} N_j$, $n$ is the number of species in the community and $N_i$ is the population size of the $i$th species. Comparing Eqs. (3.1) and (4.1) we see that they are fully identical; only the interpretation of the notation differs. Margalef and MacArthur also suggested using the value of D as a measure of stability: the greater D is, the more stable the community. Therefore, when a community moves towards its climax, its diversity increases. In accordance with this "logic", the community is most stable when D is maximal. But, as is readily shown, in this case the community structure is such that specimens of every species occur with the same frequency ($\max_p D$ is attained at $p_i^* = 1/n$). In other words, the diversity of a community is maximal when the distribution of species is uniform, i.e. when there are no abundant or rare species and no structure. However, observations of real communities show that this is never the case: there is always a hierarchical structure with dominant species. What is the reason for this contradiction? It probably lies in the formal application of models and concepts taken from physics and information theory to systems that do not fit this type of description. Both Boltzmann's entropy in thermodynamics and statistical physics and Shannon's entropy in information theory make sense only for populations of weakly interacting particles. A typical example of such a system is the ideal gas: its macroscopic state is an additive function of the microscopic states of its molecules.
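The claim that $\max_p D$ is attained at the uniform distribution $p_i^* = 1/n$ is easy to check numerically. Below is a minimal Python sketch (the function name and the example species counts are ours, chosen purely for illustration) computing Eq. (4.1) for a uniform and for a hierarchically structured community:

```python
import math

def shannon_diversity(counts):
    """Shannon diversity D (Eq. 4.1) in bits per individual.

    counts: population sizes N_i of each species in the community.
    """
    total = sum(counts)
    ps = [n / total for n in counts if n > 0]  # p_i = N_i / sum_j N_j
    return -sum(p * math.log2(p) for p in ps)

# A uniform community of n species attains the maximum D = log2(n);
# a community with a dominant species always scores lower.
uniform = [25, 25, 25, 25]   # four equally abundant species
dominant = [70, 20, 7, 3]    # hierarchical structure with a dominant species

print(shannon_diversity(uniform))   # 2.0 = log2(4), the maximum for n = 4
print(shannon_diversity(dominant))  # strictly less than 2.0
```

Running this shows the uniform community at exactly log2(4) = 2 bits per individual, while the hierarchically structured one falls below it, which is precisely the tension discussed above: real communities always exhibit the second pattern.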

Let us recall the original formulation of Boltzmann's entropy: $S_B \sim \ln W$, where W is the probability of the state of the system. In this general formulation, Boltzmann's formula is applicable to any system, not only to systems with weak interactions. But as soon as we use the standard formula $S = -k \sum_{i=1}^{n} p_i \ln p_i$, we implicitly use the classic thermodynamic model of the ideal gas.
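The connection between the two formulations can be checked directly: for W equiprobable microstates, $p_i = 1/W$, the standard formula collapses to $k \ln W$, recovering Boltzmann's expression. A minimal sketch, with Boltzmann's constant set to 1 for simplicity (the function name is ours):

```python
import math

def gibbs_entropy(ps, k=1.0):
    """S = -k * sum_i p_i ln p_i (Boltzmann's constant k set to 1 here)."""
    return -k * sum(p * math.log(p) for p in ps if p > 0)

# For W equiprobable microstates, p_i = 1/W, the sum collapses to k ln W.
W = 8
print(gibbs_entropy([1 / W] * W))  # matches math.log(8) up to rounding
print(math.log(W))
```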

The use of the entropy measure for such objects as the ideal gas is well founded. Moreover, the stability of the equilibrium (entropy is maximal in this state) is associated with the Second Law. However, the stable structure of a biological community is a consequence of interactions between populations rather than a function of the characteristics of individual species; i.e. the biological community is a typical system with strongly interacting elements. As soon as we deal with such systems, the entropy measure is no longer appropriate. There is one more argument against the use of diversity as a goal function related to stability. Entropy increases (tending to a maximum) only in closed systems, but any biological system is open in the thermodynamic sense, so its total entropy may change in an arbitrary way. When the system is in equilibrium (we speak of a dynamic equilibrium), the rate of entropy production inside the system is positive and minimal; this is Prigogine's theorem. In this case, in relation to stability, the goal function is the rate of entropy production, not the entropy itself.

Notice, however, that in many competitive communities at the initial stages of their successions, far from climax, an increase in diversity may be observed. It seems that in these cases diversity is a "good" goal function for stability. This is explained in the following way: at the initial stages, far from equilibrium, competition is still weak, and the community may well be regarded as a system with weak interactions. Moreover, and interestingly, for communities of aquatic organisms (especially phyto- and zooplankton) diversity increases along the entire transition from the initial state to climax, which can be considered a kind of dynamic equilibrium. The reason is the same: aquatic communities are systems with weak interactions.

Summarising all the arguments considered above, we can say that the causal link between diversity and stability is not as evident and unambiguous as it once seemed. Nevertheless, there are empirical facts which lead us to think about some very special properties of diversity as applied to biological communities, and about "linguistic" analogies between "natural" alphabetic languages (English, Russian, etc.), social systems and biological communities. The values of D for many communities tend to concentrate within a fairly narrow interval, with a supremum of about five bits per individual (Margalef, 1995). One has the impression that Nature avoids both very low and very high diversity. The same picture is seen in alphabetic languages, where the information per letter, as a rule, does not exceed five bits (Ebeling et al., 1990). If Shannon's entropies are estimated for the distribution of the human population over professional groups in developed countries, their values also do not exceed this limit. However, an analogous estimation made, for instance, on a beach gives a much higher value (Margalef, 1995).

So, briefly summarising, the maximum diversity principle can be considered basically true, but with certain constraints, which account at least for the non-uniform structure of a given biological community.
