then

$$S \approx k\Bigl(N \ln N - \sum_{i=1}^{n} N_i \ln N_i\Bigr) = -kN \sum_{i=1}^{n} p_i \ln p_i \qquad (1.3)$$

where p_i = N_i/N. The frequencies p_i can be interpreted as the probabilities that a given particle has the ith set of properties. It is obvious that the entropy S is maximal (at fixed N) when all the probabilities are equal to each other: any particle can then be found at any point of the space of properties, or, in other words, the system is not ordered. In this case S_max = kN ln n. In the opposite case, when all p_i are equal to zero except a single p_k = 1, i.e. all particles are concentrated in the kth box, S_min = 0. Out of n possible states of the system a single state has been selected, so we can say that the system is maximally ordered.

Since Boltzmann's constant k is very small, the number N must be enormous for the entropy to take values comparable with the thermal effects in real physical systems. This is the case when our particles are molecules: for instance, the number of molecules in 1 mol, the so-called Avogadro number, is N_A ≈ 6 × 10^23. Then, even if some substance has only two energy levels (in reality any substance has many more), 1 mol of the substance can have the entropy S_max ≈ 5.7 J/K. This is a very reasonable value.

However, when we deal with "biological" particles, for instance specimens, their number within such an integrity as an ecosystem is, as a rule, not large enough to yield entropy values comparable with the ecosystem's energetics. Certainly, we can increase the number by increasing the volume of the ecosystem, but we cannot do so to an unlimited degree without destroying its integrity. This contradiction forces us to ask: could we apply the entropy concept to an ecosystem directly? Or would we perhaps have to use another coefficient of proportionality, instead of Boltzmann's constant, in the relation between entropy and the logarithm of the number of possible states? We shall try to answer these questions in Section 4.2.
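To make the arithmetic above concrete, here is a minimal numerical sketch in Python (the function names and the chosen box counts are ours, purely for illustration). It evaluates both sides of Eq. (1.3) from a set of occupation numbers N_i and reproduces the limiting cases discussed in the text: S_max = kN ln n for equal frequencies, S_min = 0 when all particles sit in one box, and S_max ≈ 5.7 J/K for one mole of a hypothetical two-level substance.

```python
import numpy as np

K_B = 1.380649e-23      # Boltzmann's constant, J/K
N_A = 6.02214076e23     # Avogadro's number, particles per mole

def entropy_from_counts(counts):
    """Right-hand form of Eq. (1.3): S = -k N sum_i p_i ln p_i, with p_i = N_i / N."""
    counts = np.asarray(counts, dtype=float)
    N = counts.sum()
    p = counts[counts > 0] / N          # empty boxes contribute nothing
    return -K_B * N * np.sum(p * np.log(p))

def entropy_from_counts_alt(counts):
    """Left-hand form of Eq. (1.3): S = k (N ln N - sum_i N_i ln N_i)."""
    counts = np.asarray(counts, dtype=float)
    counts = counts[counts > 0]
    N = counts.sum()
    return K_B * (N * np.log(N) - np.sum(counts * np.log(counts)))

# Equal frequencies over n = 4 boxes: both forms agree and equal k N ln n.
n = 4
uniform = np.full(n, N_A / n)
print(entropy_from_counts(uniform),
      entropy_from_counts_alt(uniform),
      K_B * N_A * np.log(n))

# One mole of a two-level substance: S_max = k N_A ln 2 ≈ 5.76 J/K.
print(entropy_from_counts(np.full(2, N_A / 2)))

# All particles concentrated in a single box: S_min = 0 (maximal order).
print(entropy_from_counts([N_A, 0.0]))
```

The two forms of Eq. (1.3) agree to within floating-point roundoff, and the scale of the results (a few J/K per mole) illustrates why N of the order of the Avogadro number is needed before kN becomes thermodynamically noticeable.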
