Exergy and information

Introducing the new variables N = ∑_{i=1}^n N_i and p_i = N_i/N, where N is the total number of particles (matter) in the system, we can rewrite expression (4.3) for exergy as

Ex = RT [N ∑_{i=1}^n p_i ln(p_i/p_i^0) + N ln(N/N^0) − (N − N^0)].   (5.1)

The vector of intensive variables p = {p_1, …, p_n} describes the system composition; N is an extensive variable. The value K = ∑_{i=1}^n p_i ln(p_i/p_i^0), within the constant factor 1/ln 2, is the so-called Kullback measure of the increment of information (see Chapter 4).

Let us bear in mind the exact meaning of Kullback's measure. From the information point of view, any distribution p contains a certain quantity of information determined by Shannon's formula. The quantity of information increases as a result of the transition from one distribution to another (from p^0 to p). It is precisely this increment of information (per particle, or per unit of matter) that is determined by Kullback's measure. The product NK can then be interpreted as a measure of the total information accumulated in the course of the transition from some reference state to the current one.
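As an illustration, Kullback's measure and the total information NK can be computed directly from the definition; a minimal sketch in Python, with purely hypothetical compositions p^0 and p:

```python
import math

def kullback(p, p0):
    """Kullback measure K(p, p0) = sum_i p_i ln(p_i / p0_i), in nats."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, p0) if pi > 0)

p0 = [0.7, 0.2, 0.1]      # hypothetical reference ("pre-biological") composition
p  = [0.2, 0.3, 0.5]      # hypothetical current composition

K = kullback(p, p0)       # increment of information per particle (nats)
K_bits = K / math.log(2)  # the factor 1/ln 2 converts nats to bits
N = 1000                  # hypothetical total number of particles
print(K, K_bits, N * K)   # N*K = total accumulated information
```

Note that K is non-negative and vanishes only when the current composition coincides with the reference one.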

We can present the expression for exergy in the form

Ex = RT (Ex_inf + Ex_mat),   (5.2)

where Ex_inf = N K(p, p^0) ≥ 0 and Ex_mat = N ln(N/N^0) − (N − N^0) ≥ 0, i.e. as the sum of two terms: the first results from structural changes in the system, and the second is caused by a change in the total mass of the system.
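Both terms of the decomposition are non-negative, which is easy to check numerically; a small sketch with hypothetical values of N, N^0 and the compositions:

```python
import math

def exergy_terms(N, N0, p, p0):
    """Exergy (in units of RT) split into Ex_inf = N*K and Ex_mat, as in Eq. (5.2)."""
    K = sum(pi * math.log(pi / qi) for pi, qi in zip(p, p0) if pi > 0)
    ex_inf = N * K                                  # informational (structural) part
    ex_mat = N * math.log(N / N0) - (N - N0)        # material (total-mass) part
    return ex_inf, ex_mat

# Hypothetical system: total biomass grew from 1000 to 1200 units while the
# composition shifted away from the reference state.
ex_inf, ex_mat = exergy_terms(1200, 1000, [0.2, 0.3, 0.5], [0.7, 0.2, 0.1])
print(ex_inf, ex_mat, ex_inf + ex_mat)
```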

We have shown above that exergy increases with time, but this was proved only in the vicinity of thermodynamic equilibrium. It was assumed that the statement holds along the entire system trajectory in the course of the system's evolution from thermodynamic equilibrium, which can be identified with some "pre-biological" situation, to the current state (Jørgensen, 1992c). Then, if we follow one of the main biological paradigms, namely that "ontogenesis always repeats phylogenesis", we can generalise the previous statement in the form of the so-called "exergy maximum principle": any ecosystem, in the process of evolution towards its climax state, tends to increase its own exergy.

Thus, we postulate that dEx/dt ≥ 0. Differentiating Eq. (5.2),

d(Ex)/dt = d(Ex_inf)/dt + d(Ex_mat)/dt = (dN/dt)[K + ln(N/N^0)] + N (dK/dt),

denoting J = ln(N/N^0) and taking into account that N > 0, we get the evolutionary criterion in the form

dK/dt + (K + J) dJ/dt ≥ 0.   (5.3)
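The decomposition of d(Ex)/dt used here is an ordinary chain rule and can be verified by finite differences; the trajectories N(t) and p(t) below are arbitrary assumptions chosen only for the check:

```python
import math

p0 = [0.7, 0.2, 0.1]   # hypothetical reference composition
N0 = 1000.0            # hypothetical reference total number of particles

def K_of(p):
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, p0))

def p_of(t):           # hypothetical composition trajectory on the simplex
    w = [2.0 + t, 3.0, 5.0]
    s = sum(w)
    return [wi / s for wi in w]

def N_of(t):           # hypothetical growth of the total particle number
    return 1000.0 + 50.0 * t

def Ex(t):             # exergy in units of RT, Eq. (5.2)
    N, K = N_of(t), K_of(p_of(t))
    return N * K + N * math.log(N / N0) - (N - N0)

t, h = 1.0, 1e-6
dEx = (Ex(t + h) - Ex(t - h)) / (2 * h)            # numerical d(Ex)/dt
dN  = (N_of(t + h) - N_of(t - h)) / (2 * h)
dK  = (K_of(p_of(t + h)) - K_of(p_of(t - h))) / (2 * h)
N, K, J = N_of(t), K_of(p_of(t)), math.log(N_of(t) / N0)
rhs = dN * (K + J) + N * dK                        # chain-rule decomposition
print(dEx, rhs)
```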

While the positiveness of dJ/dt means an increase in the total biomass in the course of evolution, the positiveness of dK/dt can be interpreted as an increase in the specific information content (per unit of biomass). It is obvious that if both the total biomass and its specific information content increase, then exergy also increases. However, if the total biomass is constant (dJ/dt = 0), then the system can evolve only if the information content of its biomass is growing. This could also be interpreted as a growth of diversity. On the other hand, the information content can decrease (dK/dt < 0); but if the total biomass grows sufficiently quickly (dJ/dt ≫ 1), then exergy grows and the system evolves. Evolution also takes place if the biomass is decreasing but the information content of the biomass is growing (sufficiently quickly). Finally, there is a paradoxical situation in which exergy increases while both the total biomass and its information content decrease. If N < N^0 then J < 0, and J(dJ/dt) > 0. From Eq. (5.3) we have

dK/dt + (|J| − K)|dJ/dt| ≥ 0.   (5.4)

It is obvious that this inequality can be realised if |J| ≫ K and |dK/dt| ≪ 1, i.e. if the information content is sufficiently low (K ≪ 1) and the process of its further decrease is very slow. Note that the inequality |J| ≫ 1 holds only if the condition N ≪ N^0 is fulfilled. In this case we can say that the system is "paying" for its evolution with its own biomass. In the vicinity of thermodynamic equilibrium, at the initial stage of evolution, K ≈ J ≈ 0. Then from Eq. (5.3) we have dK/dt > 0, i.e. at this stage, in order to evolve, the system has to increase the information content of its own biomass.
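The qualitative regimes discussed above can be tabulated against criterion (5.3); all numerical values below are hypothetical sign patterns chosen only to illustrate each case:

```python
# Evolutionary criterion (5.3): dK/dt + (K + J) dJ/dt >= 0.
def evolves(dK, dJ, K, J):
    return dK + (K + J) * dJ >= 0

# Hypothetical values of (dK/dt, dJ/dt, K, J) for the regimes in the text.
scenarios = {
    "biomass and information both grow":          ( 0.10,  0.10, 0.50,  0.2),
    "constant biomass, growing information":      ( 0.10,  0.00, 0.50,  0.2),
    "fast biomass growth, declining information": (-0.05,  1.00, 0.50,  0.2),
    "paying with biomass (|J| >> K)":             (-0.01, -0.50, 0.05, -3.0),
    "info declining faster than biomass grows":   (-0.50,  0.10, 0.50,  0.2),
}
results = {name: evolves(*v) for name, v in scenarios.items()}
for name, ok in results.items():
    print(f"{name}: {'evolves' if ok else 'does not evolve'}")
```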

It is interesting that the exergy maximum principle possesses certain selective properties. To show this, we consider the partial case when the total biomass is constant. Then the maximum of exergy coincides with the maximum of K, which is attained on the faces of the simplex ∑_{i=1}^n p_i = 1, p_i ≥ 0, i = 1, …, n. It is possible to show that max_p K = max_i [ln(1/p_i^0)]. This result can be interpreted in the following way.

The system with constant biomass, which increases its own exergy in the course of evolution, tends to eliminate all the elements (substances) except one: the one with the minimal initial concentration. In a "pre-biological" (reference) state, some "living substance" has a minimal concentration. In other words, the system that increases its own exergy selects among its components the one which was present in a minimal quantity at the beginning of evolution. But among the eliminated elements there are some which are necessary for the maintenance of life, and the system must retain them. How can this contradiction be resolved? We can do so by introducing some constraints. For instance, one requirement would be to maintain a certain level of system diversity. Formally this means (in this partial case) that we look for the maximum of K under the additional constraint H = −∑_{i=1}^n p_i ln p_i = const. This implies that exergy has to increase while a certain non-arbitrary level of diversity is maintained. We think there is a very deep analogy here with Fisher's fundamental theorem of natural selection (see, for instance, Svirezhev and Passekov, 1982).
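That the maximum of K over the simplex is attained at the vertex corresponding to the minimal reference concentration can be checked numerically; the reference composition below is hypothetical:

```python
import math, random

p0 = [0.5, 0.3, 0.15, 0.05]   # hypothetical reference composition

def K(p):
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, p0) if pi > 0)

# On a vertex of the simplex (p_j = 1, all other p_i = 0), K = ln(1/p0_j),
# so the maximum is reached at the component with the smallest p0_j.
vertex_K = [math.log(1.0 / q) for q in p0]
best = max(range(len(p0)), key=lambda j: vertex_K[j])

# Random interior points of the simplex never exceed the best vertex value.
random.seed(0)
interior_max = 0.0
for _ in range(1000):
    w = [random.random() for _ in p0]
    s = sum(w)
    interior_max = max(interior_max, K([x / s for x in w]))

print(best, vertex_K[best], interior_max)
```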

Until now, the connection between exergy and information has rested on the fact that one of the important terms in the expression for exergy has the form of one of the main information measures, namely Kullback's measure. However, the connection is much deeper. We shall try to show this using a method very popular in statistics: the urn scheme.

Let there be n urns corresponding to n different sorts of particles, each urn containing N_i^0 (i = 1, …, n) particles. We assume that all operations with the ith urn do not depend on operations with the other urns and, vice versa, operations with the others do not influence operations with the ith urn. In accordance with Shannon (or Boltzmann), the information (or entropy) contained in such an urn is proportional to the logarithm of the number of ways in which its N_i^0 particles can be ordered. This number is (N_i^0)!. Using Stirling's formula, ln[(N_i^0)!] ≈ N_i^0 ln N_i^0 for N_i^0 ≫ 1, we get that the information per particle of the ith urn is equal to s_i^0 = λ ln N_i^0, where λ = 1/ln 2, so that the information is expressed in bits. It is necessary to explain what is meant by the word "ordering". For this we call on our Ecodemon to help us. In order to determine the number of particles, it has to enumerate them. It can do this by associating them with some ordered structure, for instance a row of N_i^0 miniboxes, each of which can contain not more than one particle. When all the particles are distributed among the miniboxes, Ecodemon can say that this urn contains N_i^0 particles. To distribute these particles among the miniboxes it can use (N_i^0)! different ways.
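The quality of the truncated Stirling approximation ln(N!) ≈ N ln N used here improves as N grows, which is easy to verify:

```python
import math

# Relative accuracy of the truncated Stirling formula ln(N!) ~ N ln N
# (the form used in the text) for growing N.
errs = []
for N in (10, 100, 1000, 10000):
    exact = math.lgamma(N + 1)   # ln(N!) computed via the log-gamma function
    approx = N * math.log(N)     # leading Stirling term
    errs.append((approx - exact) / exact)
    print(N, round(errs[-1], 4))
```

The relative error decreases roughly like 1/(ln N − 1), so the approximation is only meaningful for large particle numbers.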

There is another model which gives the same result. It is known that the entropy of a system of N^0 identical particles, as a measure of uncertainty, is equal to S(N^0) = N^0 S(1) − k ln(N^0!), where S(1) is the entropy of a single particle and k is a dimensionless coefficient (Landau and Lifshitz, 1995). If the system consists of different particles, then its entropy is simply equal to the sum of the entropies of the individual particles. But such an entropy could not be a characteristic of a substance, since a system consisting of different particles cannot represent a single type of matter. Note that the entropy here is not additive: the proper entropy of a particle decreases by the value k ln N^0, which is equivalent to a rise of information caused by the joining of identical particles into a subsystem (urn).

Then we shall change the number of particles in the ith urn by adding or removing a small quantity ΔN_i^0 (ΔN_i^0 > 0 if particles are added and ΔN_i^0 < 0 if they are removed). In the chemical interpretation these particles are molecules of a certain chemical substance, and their change is a result of chemical reactions occurring in the environment.

Thus, this experiment can be considered as a simple statistical model describing the change in the intensity and spectral composition of solar radiation as a result of its reflection, absorption and spectral transformation by some active surface (see below for details).

Let the new number of particles in the ith urn be N_i. Then the information contained in one particle in this new situation will be equal to s_i = λ ln N_i, and the change of information (per particle) as a result of the transition from the initial situation to the new one is equal to

s_i − s_i^0 = λ ln(N_i/N_i^0).   (5.5)

In order to get the total increment of information δI for the whole system, we must multiply this value, s_i − s_i^0, by the number of new particles, i.e. by δN_i, and sum these partial terms over all urns:

δI = ∑_i δI_i = ∑_{i=1}^n (s_i − s_i^0) δN_i = λ ∑_{i=1}^n ln(N_i/N_i^0) δN_i.   (5.6)

By integrating Eq. (5.6) from the initial state N^0 to the current state N, we obtain the expression for the total increase of information (in bits):

I − I^0 = λ ∑_{i=1}^n [N_i ln(N_i/N_i^0) − (N_i − N_i^0)].   (5.7)
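The passage from Eq. (5.6) to Eq. (5.7) is a term-by-term integration; written out for the ith urn it reads:

```latex
\int_{N_i^0}^{N_i} \lambda \ln\frac{x}{N_i^0}\,\mathrm{d}x
  = \lambda\left[x\ln\frac{x}{N_i^0} - x\right]_{N_i^0}^{N_i}
  = \lambda\left[N_i\ln\frac{N_i}{N_i^0} - \left(N_i - N_i^0\right)\right],
```

and summation over i gives Eq. (5.7).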

Comparing Eq. (5.7) with the expression for exergy (formula (4.3)), we can see that

Ex = (RT/λ)(I − I^0),   (5.8)

where the exergy is measured in energy units.
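The agreement between the urn-scheme result and the exergy decomposition can be confirmed numerically; the temperature and particle numbers below are hypothetical (R is the gas constant):

```python
import math

lam = 1.0 / math.log(2)      # lambda = 1/ln 2: information measured in bits
R, T = 8.314, 300.0          # gas constant (J/(mol K)) and hypothetical temperature

N0 = [700.0, 200.0, 100.0]   # hypothetical reference particle numbers N_i^0
N  = [300.0, 400.0, 500.0]   # hypothetical current particle numbers N_i

# Total information increment, Eq. (5.7), in bits:
dI = lam * sum(n * math.log(n / n0) - (n - n0) for n, n0 in zip(N, N0))

# Exergy via the decomposition (5.2): RT [N K + N ln(N/N0) - (N - N0)]:
Nt, N0t = sum(N), sum(N0)
p  = [n / Nt for n in N]
p0 = [n / N0t for n in N0]
K  = sum(pi * math.log(pi / qi) for pi, qi in zip(p, p0))
Ex = R * T * (Nt * K + Nt * math.log(Nt / N0t) - (Nt - N0t))

print(Ex, (R * T / lam) * dI)   # the two routes agree: Ex = (RT/lambda)(I - I0)
```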

Note that by using the definition of Kullback's measure we can obtain the same expression for exergy in a simpler way. Indeed, the process of the increment of information can be considered as a sum of two independent processes. The first process is a transition from the distribution {p_i^0 = N_i^0/∑_{j=1}^n N_j^0} to the new distribution {p_i = N_i/∑_{j=1}^n N_j}, i = 1, …, n. This process is accompanied by the following increment of information:

ΔI_1 = λ N K(p, p^0),

where K is Kullback's measure and N is the current total number of particles. Note that here we implicitly assume that the total number of particles does not change.

The second process is a small change in the total number of particles (it is precisely because this change is small that the two processes can be considered independent). Using the above arguments for calculating the information increment resulting from a change in the number of particles in the ith urn, we can write the following expression for the second process:
