All the laws of Newton's mechanics are symmetric with respect to the direction of time: the Past and the Future do not differ from each other in Newton's world. However, in our world heat flows only from a hot body to a cold one, and never vice versa; the "arrow of time" really does exist, and there are always experiments allowing the "Observer" to distinguish the future from the past. Therefore, stepping outside the framework of "normal physics", and in order to ensure the link of our world to reality, we formulate the following statement: in an isolated system a certain physical quantity, determined by the energy of the system, always increases or remains constant, but never decreases. This quantity is entropy. With this we have formulated the so-called Second Law of Thermodynamics. It is interesting that an enormous number of attempts ("their name is legion") have been made, and are still being made, to find special conditions under which the Second Law could be violated... but without success. On the other hand, the Second Law can only be confirmed experimentally, not theoretically.

The equations of classical physics, and even those of quantum mechanics and the general theory of relativity, presume that time is reversible, which is not the case according to the Second Law of Thermodynamics. Prigogine and Stengers (1979) write that although quantum mechanics and the general theory of relativity are revolutionary, as far as the concept of time is concerned they are direct descendants of classical dynamics and carry a radical negation of the irreversibility of time. Einstein claimed that the irreversibility of time was an illusion. The Second Law of Thermodynamics clearly breaks the symmetry of time. Prigogine introduces a natural time ordering of dynamic states. For instance, a drop of blue ink in a glass of water will after some time inevitably colour all the water light blue; the opposite process, in which the light blue colour in a glass of water gathers into a single drop of concentrated blue ink, has a vanishingly small probability of occurring. The example illustrates the "arrow of time", which has an ecological implication: evolution.
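The overwhelming improbability of the ink un-mixing can be made quantitative with a toy model: if each of N ink molecules is equally likely to be found in either half of the glass, the probability of all of them spontaneously gathering in one chosen half is (1/2)^N. A minimal Python sketch (the function name and the molecule counts are illustrative assumptions, not from the text):

```python
def unmixing_probability(n_molecules: int) -> float:
    """Probability that n independent molecules, each equally likely to sit
    in either half of the glass, are all found in one chosen half."""
    return 0.5 ** n_molecules

# Even a microscopic "drop" of 100 molecules is effectively never seen
# re-concentrating: the probability is already below 1e-30.
for n in (10, 100, 1000):
    print(n, unmixing_probability(n))
```

For any macroscopic drop, with N on the order of 10^20, the probability is so small that the reverse process is never observed: this is the statistical content of the arrow of time.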

Keep in mind that the energy conservation law can be proved theoretically if the basic equations, the so-called "equations of motion" that fully determine the dynamics of all particles participating in the process, are known. But all attempts to prove the law of increasing (non-decreasing) entropy have been unsuccessful. Moreover, Henri Poincaré proved that it is impossible to calculate entropy even if the co-ordinates, velocities, masses and momenta of all particles are known. Generally speaking, there is no theoretical proof of the correctness of the First and Second Laws; all the proofs are experimental, a consequence of our own experience of our world. The problem of induction does exist: maybe there are extreme conditions under which the First and Second Laws become inapplicable, either partly or wholly. In other words, maybe there exist virtual worlds, with characteristic sizes and times differing from ours, in which energy can be created or destroyed and the entropy of an isolated system decreases. Fortunately, we are living in the "right" world, where the biosphere and ecology obey the First and Second Laws.

Apparently, a certain "strangeness" of the Second Law has given rise to a number of equivalent formulations:

• It is impossible to take heat from a reservoir and convert it into work without at the same time transferring heat from a hot to a cold reservoir.

• It is impossible to transfer heat from a cold to a hot reservoir without converting a certain amount of work into heat in the same process.

• Entropy of an isolated system always increases during irreversible processes.

• All real processes produce entropy.

• Time has only one direction—which is formulated in everyday language as "do not cry over spilt milk". This may be called the "arrow of time".

• All real processes are irreversible.

• All real processes result in the partial conversion of other energy forms into heat, which unfortunately cannot be fully utilised to do work, both because of the difficulty of providing a reservoir at the absolute zero of temperature and because the heat is released at the temperature of the environment.

It seems to us that such a diversity of definitions makes it much easier to find suitable "ecological" interpretations. However, one natural question arises: is entropy a really existing physical quantity (like energy), which can be measured, or is it a theoretical fiction, needed only to represent observations in a more or less elegant form? The positive answer is given by the famous Boltzmann formula:

S = k ln W.

Carved on the Boltzmann memorial at the Vienna cemetery, this formula lives on under the endless sky over the grave of the great Boltzmann. It is one of the greatest formulas, building bridges between thermodynamics and other sciences: probability theory, information theory, dynamical systems theory, etc. It connects the entropy, S, and the "thermodynamic" probability of a state, W; the coefficient of proportionality k is the so-called Boltzmann constant. The thermodynamic probability is equal to the number of possible states in which the system can be found. The Boltzmann constant is k = 1.38 × 10^−23 J/K; the entropy is usually evaluated in these units. It is obvious that entropy is a measure of order (or rather, of disorder) in the system. Entropy increases as the temperature rises and decreases as it falls. There exists a temperature at which the system can be in only a single state, i.e. W = 1, so that S = 0. This temperature, T = 0, is called absolute zero, and it is equal to t₀ = −273.15 °C. The statement that entropy is equal to zero at the absolute zero of temperature is called Nernst's theorem. It is obvious that S > 0 at T > 0. We see that the symmetry is again violated, since the temperature cannot be negative. It is interesting that formally the temperature could be negative, but any such body would strive to split spontaneously into scattering parts. In other words, for T < 0, bodies in equilibrium cannot exist (Landau and Lifshitz, 1995).
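As a quick numerical illustration of the Boltzmann formula, the sketch below (the function name and the chosen state counts are our own, not from the text) evaluates S = k ln W, recovering Nernst's theorem for W = 1 and the additivity of entropy for independent subsystems, for which W_total = W1 · W2:

```python
import math

K_B = 1.380649e-23  # Boltzmann constant in J/K (exact SI value since 2019)

def boltzmann_entropy(w: int) -> float:
    """Entropy S = k * ln(W) of a system with W equally probable states."""
    return K_B * math.log(w)

# Nernst's theorem in miniature: a single accessible state gives S = 0.
s_zero = boltzmann_entropy(1)

# Entropy is additive for independent subsystems, since W_total = W1 * W2
# and ln(W1 * W2) = ln(W1) + ln(W2).
s_combined = boltzmann_entropy(4 * 8)
s_parts = boltzmann_entropy(4) + boltzmann_entropy(8)
```

The logarithm is exactly what makes entropy additive while the number of states is multiplicative; this is the bridge to information theory mentioned above.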

Entropy can only be created; it cannot be destroyed. Let us represent (as we did above for energy and matter) the entropy differential as the sum of two terms: dS = d_eS + d_iS, where d_eS corresponds to the entropy exchange between the system and its environment, and d_iS describes the entropy production within the system. In accordance with the Second Law of Thermodynamics, d_iS ≥ 0.

Concerning the exchange term d_eS, Clausius suggested that d_eS = δQ/T,

where δQ is the heat input into the system from its environment (gained heat). In an isolated system, where d_eS = 0, we have dS = d_iS ≥ 0. In a closed system d_eS = δQ/T, and since d_iS ≥ 0, dS = d_eS + d_iS ≥ δQ/T. The latter inequality means that if heat is brought into the system, then its entropy and, correspondingly, the disorder increase. On the contrary, if heat is withdrawn, then entropy decreases and the order increases. In a thermally (though not completely) isolated system, where δQ = 0, we have dS ≥ 0 too. Since

dS = (δQ + δA_irrev − Σ_m μ_m d_i n_m)/T and d_eS = δQ/T,

then

d_iS = (δA_irrev − Σ_m μ_m d_i n_m)/T. (3.4)

From this it follows that in a thermally isolated system (δQ = 0), within which there are neither chemical nor biological transformations (all d_i n_m = 0) and all processes are reversible (δA_irrev = 0), dS = 0, i.e. the entropy is constant.
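The bookkeeping dS = d_eS + d_iS, together with Clausius's d_eS = δQ/T, can be checked on the textbook case of heat flowing between two reservoirs. A sketch with illustrative numbers (reservoir temperatures are assumed constant, i.e. the reservoirs are ideal):

```python
def total_entropy_change(q: float, t_hot: float, t_cold: float) -> float:
    """Entropy change of the composite system when heat q (in J) flows from
    a reservoir at t_hot (in K) to one at t_cold; d_eS = Q/T for each side."""
    ds_hot = -q / t_hot    # the hot reservoir loses heat, so its d_eS < 0
    ds_cold = q / t_cold   # the cold reservoir gains the same heat, d_eS > 0
    return ds_hot + ds_cold

# 1000 J flowing from 400 K to 300 K: the cold side gains more entropy than
# the hot side loses, so the total is positive, as the Second Law demands.
ds = total_entropy_change(1000.0, 400.0, 300.0)
```

Because each reservoir only exchanges entropy (d_iS = 0 inside an ideal reservoir), the positive total is pure entropy production located in the irreversible transfer itself; reversing the flow would make the total negative, which the Second Law forbids.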

A system is said to be in equilibrium when it has no further tendency to change its properties. As previously mentioned, a dynamic equilibrium (steady state) can be maintained by equal process rates in opposite directions. The entropy of a thermally isolated system will increase until no further spontaneous changes can occur, that is, until all gradients have been eliminated and thermodynamic equilibrium has been reached. The entropy therefore reaches its maximum at thermodynamic equilibrium. The criterion for thermodynamic equilibrium is, in other words, that entropy is at its maximum.
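The maximum-entropy criterion can be illustrated with the same two-half-box picture used for the ink example: the multiplicity W(n) = C(N, n) of the macrostate with n of N particles in the left half is largest for the even split, so S = k ln W is maximal precisely when all gradients are gone. A small sketch (the particle number N = 100 is an arbitrary choice for illustration):

```python
from math import comb

# Multiplicity W(n) = C(N, n): the number of microstates that place n of the
# N particles in the left half of the box.  The macrostate with the largest
# W (and hence the largest S = k ln W) is the one seen at equilibrium.
N = 100
multiplicities = [comb(N, n) for n in range(N + 1)]
most_probable = max(range(N + 1), key=lambda n: multiplicities[n])
# most_probable is the even split n = N/2: all gradients eliminated
```

For macroscopic N the peak at n = N/2 is so sharp that fluctuations away from the even split are unobservable, which is why equilibrium looks static even though the microscopic dynamics never stop.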
