Fig. 6.2. Phase portrait of Eq. (2.1) for two values of a: a = +1 and a = −1; K = 5.
the stored exergy is dissipated in order to "organise" a new equilibrium and to maintain it. Naturally, the non-trivial equilibrium can exist only in an open system, in which the processes of exergy accumulation and dissipation take place simultaneously, balancing each other at the dynamic equilibrium.
In reality, a population always undergoes permanent random perturbations. Since real processes are irreversible, their effects accumulate, and these small perturbations can "shatter" and destroy the population, so that after some time the system again reaches thermodynamic equilibrium; without perturbations it could exist indefinitely. A Lyapunov function, constructed in a special way, can help us answer the question: "How long can a population exist in a random environment?" (Svirezhev and Logofet, 1978).
Since the Lyapunov function $L(N)$ is a quasi-potential, $-(\partial L/\partial N) = f(N)$, and after integrating we get
\[
L(N) = -\int f(N)\,dN = -\frac{aN^2}{2} + \frac{aN^3}{3K} + \frac{aK^2}{6},
\]
where the integration constant is determined from the condition $L(K) = 0$. The function is shown in Fig. 6.3.
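A small numerical sketch of this integration (variable names are ours, and the closed form is what one obtains by integrating the logistic $f(N) = aN(1 - N/K)$): the quasi-potential can be computed by quadrature and checked against $L(K) = 0$ and $L(0) = aK^2/6$.

```python
import numpy as np

a, K = 1.0, 5.0  # growth rate and carrying capacity (values as in Fig. 6.2)

def f(N):
    # logistic growth law, Eq. (2.1): dN/dt = a*N*(1 - N/K)
    return a * N * (1.0 - N / K)

def L(N):
    # quasi-potential: L(N) = -integral of f from K to N, so that L(K) = 0
    xs = np.linspace(K, N, 20001)
    ys = f(xs)
    return -float(np.sum(0.5 * (ys[1:] + ys[:-1]) * np.diff(xs)))

# compare with the closed form -a*N^2/2 + a*N^3/(3*K) + a*K^2/6
print(L(0.0), a * K**2 / 6)  # both are about 4.1667
```

Along trajectories $dL/dt = (\partial L/\partial N)\,dN/dt = -f(N)^2 \le 0$, which is the defining property of a Lyapunov function for this model.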
As a model of random perturbations, we select "white noise" with small amplitude $\sigma \ll 1$. Let the population be near the stable equilibrium $N^* = K$, and let $V_K: 0 < N < \infty$ be the attraction domain of this point. It is natural to take, as a measure of stability of this equilibrium against random perturbations, the mathematical expectation of the mean time needed for a trajectory initiated in $V_K$ to leave this domain. This value is $\tau \sim \exp(2L(0)/\sigma^2) = \exp(aK^2/3\sigma^2)$. It is interesting that this time depends (apart, of course, from the intensity of the random perturbations) only on the value of the Lyapunov function at the zero equilibrium. We can say that the two equilibria are not independent: the characteristics of the thermodynamic equilibrium significantly influence the stability of the dynamic equilibrium even at a long distance.
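A hedged simulation sketch of this escape time: we assume the simplest additive-noise model $dN = aN(1 - N/K)\,dt + \sigma\,dW$ (the text does not specify the noise model), integrate it with the Euler–Maruyama scheme, and record the first time the trajectory leaves the attraction domain through $N = 0$. All names and parameter values are illustrative.

```python
import numpy as np

def escape_time(a=1.0, K=5.0, sigma=3.0, dt=0.01, t_max=2000.0, seed=0):
    """Euler-Maruyama integration of dN = a*N*(1 - N/K) dt + sigma dW,
    started at the stable equilibrium N = K; returns the first time the
    trajectory leaves the attraction domain (here: reaches N <= 0)."""
    rng = np.random.default_rng(seed)
    N, t = K, 0.0
    while t < t_max:
        N += a * N * (1.0 - N / K) * dt + sigma * np.sqrt(dt) * rng.standard_normal()
        t += dt
        if N <= 0.0:
            return t
    return t_max  # no escape observed within t_max

# theory: mean escape time scales as exp(2*L(0)/sigma^2) = exp(a*K^2/(3*sigma^2))
```

Averaging over many realizations for several values of $\sigma$ and plotting $\log\tau$ against $1/\sigma^2$ would exhibit the exponential dependence $\tau \sim \exp(aK^2/3\sigma^2)$.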
We continue to consider the problem of Lyapunov functions from the viewpoint of classical thermodynamics. If the entropy is a sufficiently smooth function of the phase variables then, in the vicinity of equilibrium, the entropy of an open system can be presented in the form
\[
S = S_{eq} + \delta S + \tfrac{1}{2}\,\delta^2 S. \tag{2.3}
\]
Differentiating Eq. (2.3) with respect to time, we get ($\Delta S = S - S_{eq}$)
\[
\frac{d(\Delta S)}{dt} = \frac{d(\delta S)}{dt} + \frac{1}{2}\,\frac{d(\delta^2 S)}{dt}.
\]
As usual, we represent the differential $dS$ as the sum of two items, $dS = d_eS + d_iS$, where $d_iS$ is the entropy change caused by internal processes and $d_eS$ is the entropy change caused by independent changes in the interaction between the system and its environment. If we assume that $d_iS$ has a higher order of smallness than $d_eS$, then
\[
\frac{d_e S}{dt} = \frac{d(\delta S)}{dt}, \qquad \frac{d_i S}{dt} = \frac{1}{2}\,\frac{d(\delta^2 S)}{dt}.
\]
In accordance with the Second Law, the entropy produced within any system must increase (not decrease). Then
\[
\frac{d_i S}{dt} = \frac{1}{2}\,\frac{d(\delta^2 S)}{dt} \ge 0.
\]
In thermodynamics the second variation $\delta^2 S$ is presented as a quadratic form in $\delta N_i = N_i - N_i^0$, and if it is negative definite then the corresponding equilibrium is stable (in a thermodynamic sense).
Setting $\delta^2 S = 2L$, we can consider $\delta^2 S$ as a Lyapunov function. In accordance with Lyapunov's stability theorem, if $\delta^2 S < 0$ (this inequality is a condition for thermodynamic stability) and $d(\delta^2 S)/dt \ge 0$ (this is the Second Law), then the equilibrium is stable (in Lyapunov's sense).
If the system considered is chemical and its variables $c_i$, $i = 1, \ldots, n$, are molar chemical concentrations, then
\[
\delta^2 S = -\frac{1}{T}\sum_{i=1}^{n}\sum_{j=1}^{n}\frac{\partial \mu_i}{\partial c_j}\,\delta c_i\,\delta c_j,
\]
where $\mu_i = \mu_i(c_1, \ldots, c_n)$ is the chemical potential of the $i$th component, which in the general case can depend on all the other concentrations. For so-called Onsager systems (see Chapter 3):
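A minimal check of this quadratic form, under the illustrative assumption of an ideal (dilute) mixture with $\mu_i = \mu_i^0 + RT\ln c_i$ (this choice of chemical potential is ours, not the text's): then $\partial\mu_i/\partial c_j = RT\,\delta_{ij}/c_i$ is a positive definite matrix, so $\delta^2 S$ is negative definite.

```python
import numpy as np

R, T = 8.314, 298.0             # gas constant and temperature (illustrative values)
c = np.array([0.1, 0.5, 2.0])   # molar concentrations of three components (illustrative)

# ideal mixture: mu_i = mu_i^0 + R*T*ln(c_i), hence dmu_i/dc_j = R*T*delta_ij/c_i
dmu_dc = np.diag(R * T / c)

def delta2_S(dc):
    # quadratic form: delta^2 S = -(1/T) * sum_ij (dmu_i/dc_j) * dc_i * dc_j
    return -(1.0 / T) * dc @ dmu_dc @ dc

# positive definiteness of dmu_dc implies negative definiteness of delta^2 S
eigvals = np.linalg.eigvalsh(dmu_dc)
print(all(v > 0 for v in eigvals))  # prints True
```

For this diagonal case the conclusion is immediate (all diagonal entries $RT/c_i$ are positive); the eigenvalue check is the general-purpose test one would apply when the potentials couple the concentrations.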