Entropy and probability

The Boltzmann formula $S = k \ln W$ connects the entropy $S$ with the "thermodynamic" probability $W$, which is equal to the number of possible states in which the system can be found; here $k = 1.38 \times 10^{-23}$ J/K is Boltzmann's constant, and it is in these units that the physical entropy is usually evaluated. Note that the expression "the number of possible states in which the system can be found" needs additional explanation.
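For instance, for a system that can be found in only two equally accessible states, $W = 2$ and $S = k \ln 2 \approx 0.96 \times 10^{-23}$ J/K.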

Let the system contain $N$ particles, which do not depend on each other and can be combined in different combinations. The number of these combinations, which we shall call "possible states", is equal to $W = N^N$. Then $S = k \ln(N^N) = kN \ln N$.
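As a quick numerical illustration (a minimal Python sketch of our own, not from the source; the constant and function names are ours), the entropy $S = kN \ln N$ can be evaluated directly without ever forming the enormous number $N^N$:

    import math

    K_B = 1.38e-23  # Boltzmann's constant, J/K

    def entropy_independent(n: int) -> float:
        """S = k * ln(N^N), evaluated as k * N * ln(N) to avoid the huge intermediate N^N."""
        return K_B * n * math.log(n)

    for n in (10, 100, 1000):
        print(f"N = {n:>4}:  S = {entropy_independent(n):.3e} J/K")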

Now let these particles possess some properties: in classical thermodynamics this can be energy, in ecology membership in a specific species, etc., and let the distribution of particles in relation to these properties be known, described by the vector $\mathbf{N} = \{N_1, \ldots, N_n\}$, $\sum_{i=1}^{n} N_i = N$. Here the system is divided into $n$ boxes, each of which corresponds to a certain combination of properties, for instance to a certain interval of energy or to a certain biological species. In other words, if there is a space of parameters describing the properties, and the space is divided into a rather large number of cells, then these cells are our boxes. It is assumed that these boxes are isolated from each other. This means that a particle from one box cannot pass to the others, and particles of the $i$th sort cannot produce particles of another sort. This is certainly a very strict constraint if we deal with biological particles. The value $N_i$ is the number of particles in the $i$th box, i.e. the number of particles possessing such values of the properties as define the $i$th box. The total number of realisations of this kind of distribution is equal to

$$W(\mathbf{N}) = \frac{N!}{N_1! \, N_2! \cdots N_n!}.$$
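To make the count concrete, here is a short Python sketch (our own illustration, not from the source; the function names are ours) that evaluates $W(\mathbf{N})$ in logarithmic form, via log-factorials so that large $N$ does not overflow, together with the corresponding entropy $S = k \ln W$. The most even distribution over the boxes gives the largest $W$ and hence the largest entropy, while putting all particles into one box gives $W = 1$ and $S = 0$:

    import math

    K_B = 1.38e-23  # Boltzmann's constant, J/K

    def log_multinomial(counts: list[int]) -> float:
        """ln W for W = N! / (N_1! * N_2! * ... * N_n!), computed with lgamma."""
        n_total = sum(counts)
        return math.lgamma(n_total + 1) - sum(math.lgamma(c + 1) for c in counts)

    def entropy(counts: list[int]) -> float:
        """Boltzmann entropy S = k ln W of a given distribution over the boxes."""
        return K_B * log_multinomial(counts)

    # Example: N = 10 particles distributed over n = 3 boxes.
    print(entropy([3, 3, 4]))   # most even split -> largest W, largest S
    print(entropy([10, 0, 0]))  # all in one box  -> W = 1, so S = 0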
