Information is a concept intuitively clear to everybody and quite correctly associated with knowledge or, similarly, with the elimination of uncertainty. Knowledge is naturally considered useful, as it increases the efficiency of a person's activities, ensures better adaptability to changing environments, and therefore enhances vital capacity and sustainability. In fact, however, this is not always so, and excessive knowledge may be dangerous. Whether knowledge proves constructive or destructive for an entity (subject) depends mostly on the subject's own state or structure. Turning our attention to ourselves, we can confidently assert that the information (knowledge) obtained is capable of changing the organization (structure, order, regulations) of our thoughts, of technological processes, engineering structures, social communities, and so on, the latter gaining in efficiency and stability in the process.
C. Shannon derived the law governing the rate of information transfer from transmitter to receiver over a noisy channel of any physical nature. The law follows from a random-process model and has a direct analogy with the models of thermostatics, and therefore with models of diversity.
The capacity C of a channel of band W perturbed by white thermal noise of power N, when the average transmitter power is limited to P, is given by

C = W log2((P + N)/N)
where N = W N0 (N0 is the noise power per unit of frequency band).
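The capacity law can be evaluated numerically. This is a minimal sketch; the band, power, and noise values below are illustrative and not taken from the article.

```python
import math

def channel_capacity(W, P, N0):
    """Shannon capacity (bits/s) of a band-limited channel with white
    noise: C = W * log2((P + N) / N), where N = W * N0."""
    N = W * N0  # total noise power within the band
    return W * math.log2((P + N) / N)

# Illustrative numbers: a 3 kHz band with a signal-to-noise
# ratio P/N = 1000 (about 30 dB).
W, P = 3000.0, 1.0
N0 = 1.0 / (1000.0 * 3000.0)
print(round(channel_capacity(W, P, N0), 1))
```

Note that capacity grows only logarithmically with signal power but linearly with bandwidth, which is the asymmetry the discussion below builds on.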
It is easily seen that this law is nothing other than a logarithmic form of the allometric relationship, which occurs widely both in living organisms and in inorganic nature. Under certain conditions, the law of information transfer gives rise to a fractal set. In linguistics, the frequency band is related to the length of the alphabet, and the signal power to the length of a word. For biological systems, the frequency band is associated with the level of specialization (the narrower the band, the more specialized the system). The signal power (dispersion) depends on the strength (energy) and diversity of the environment. There is a linear dependence between noise and frequency band: the narrower the band, the fewer the errors, though the channel capacity decreases accordingly. Assuming that accumulated errors become lethal with some probability, the individual stability of the receiver (in terms of error-free operation) increases as the frequency band narrows (as specialization increases). Taking improvement of stability to be a target function of evolution, we conclude that specialization is a natural route to this target. A specialized system, however, has a lower channel capacity and therefore less resistance to fluctuations in the environment. If we assume, for example, that a population of organisms must maintain a certain minimum of diversity, it is easy to see that the most specialized and least fertile organisms are likely to inhabit tropical rainforests, while the least specialized organisms of maximum fertility would be found in the cold climates of taiga and tundra, and partly in deserts. This dependence is a matter of common knowledge. It also follows from the law of channel capacity that there is a limit to the amount of information that can be transmitted per unit of frequency band, equal to 1.443 (log2 e) natural units.
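The 1.443 limit can be seen numerically: with the signal power P and noise density N0 held fixed, widening the band raises capacity only toward a ceiling of (P/N0) log2 e. A small sketch, with made-up unit values of P and N0:

```python
import math

def capacity(W, P, N0):
    """Channel capacity C = W * log2(1 + P / (W * N0)), in bits/s."""
    return W * math.log2(1.0 + P / (W * N0))

# As W grows with P and N0 fixed, C / (P / N0) approaches
# log2(e) = 1.4427 natural units -- the wideband limit.
P, N0 = 1.0, 1.0
for W in (1, 10, 100, 10_000):
    print(W, round(capacity(W, P, N0) / (P / N0), 4))
```

The printed ratio climbs from 1.0 toward 1.4427, so a broader band helps less and less: the reason, in this article's reading, why no single receiver can profitably cover an arbitrarily wide band.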
All these limitations bring us to the conclusion that no supersystem can exist that receives information within an arbitrarily large frequency band; a number of receiving systems, mutually complementary in alphabet, are necessary for effective transformation of the information. Hence it immediately follows that diversity is necessary in the receiving system, and the more powerful the transmitter, the greater the number of different receivers needed for complete transformation of the information. Under certain simple assumptions, it may be inferred from the capacity equation that the diversity is given approximately by
Number of species: S = aN^b, where b < 1, N is the sample size, and N = f(area, habitat capacity). This is identical to the relationships derived from thermostatics. The connection between the quantitative information model and the thermostatic model is determined by their common mathematical basis: information is defined as the negative of entropy.
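The power-law form S = aN^b can be illustrated with a short numerical sketch; the values a = 3.0 and b = 0.25 are made up purely for illustration.

```python
# Species-number relationship S = a * N**b with b < 1
# (a and b are hypothetical values, chosen only to show the shape).
a, b = 3.0, 0.25

for N in (10, 100, 1000, 10_000):
    S = a * N**b
    print(N, round(S, 1))

# On logarithmic axes the relation is linear, log S = log a + b*log N,
# which is the "logarithmic form of the allometric relationship"
# noted above: each tenfold increase in N multiplies S by 10**b.
```

Because b < 1, species number grows far more slowly than sample size, so diversity saturates as area or habitat capacity increases.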
If a noisy channel is fed by a source, two statistical processes are at work: the source and the noise. Thus a number of entropies can be calculated. First there is the entropy H(x) of the source, or of the input to the channel (these will be equal if the transmitter is nonsingular). The entropy of the output of the channel, that is, of the received signal, will be denoted by H(y). In the noiseless case H(y) = H(x). The joint entropy of input and output will be H(x,y). Finally, there are two conditional entropies, Hx(y) and Hy(x): the entropy of the output when the input is known, and conversely. Among these quantities we have the relations

H(x,y) = H(x) + Hx(y) = H(y) + Hy(x)

All of these entropies can be measured on a per-second or a per-symbol basis. The rate of transmission I can be written in two other forms due to the identities noted above:

I = H(x) - Hy(x) = H(y) - Hx(y) = H(x) + H(y) - H(x,y)
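These identities can be checked directly on any joint distribution of channel input and output. A minimal sketch, using a small hypothetical joint distribution p(x, y):

```python
import math

def H(probs):
    """Shannon entropy (bits) of an iterable of probabilities."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Hypothetical joint distribution p(x, y) of channel input and output.
pxy = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.2, (1, 1): 0.3}
px = {0: 0.5, 1: 0.5}   # marginal of the input x
py = {0: 0.6, 1: 0.4}   # marginal of the output y

Hx, Hy, Hxy = H(px.values()), H(py.values()), H(pxy.values())

# Conditional entropies Hx(y) = H(y|x) and Hy(x) = H(x|y).
Hx_of_y = -sum(p * math.log2(p / px[x]) for (x, y), p in pxy.items())
Hy_of_x = -sum(p * math.log2(p / py[y]) for (x, y), p in pxy.items())

# The relations quoted above hold:
assert abs(Hxy - (Hx + Hx_of_y)) < 1e-12
assert abs(Hxy - (Hy + Hy_of_x)) < 1e-12

# ...and the three expressions for the transmission rate I coincide.
I = Hx - Hy_of_x
assert abs(I - (Hy - Hx_of_y)) < 1e-12
assert abs(I - (Hx + Hy - Hxy)) < 1e-12
print(round(I, 4))
```

The printed rate is strictly positive here, since input and output are statistically dependent; for an independent pair it would be zero.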
Entropy differs from diversity in that the quantity of information within a closed-system transmitter increases, rather than decreases, with time: the uncertainty of the transmitter decreases with time, and its behavior can be predicted by the receiver with growing reliability. If, however, the transmitter is an open system, its uncertainty generally does not depend on the duration of transmission, and its behavior remains unpredictable at a constant level. As the model of channel capacity is homologous to the thermostatic model, information may be considered a phenomenon of universal occurrence. It is the transmission of information from the environment to an object within a certain frequency band that controls the existing order or structure of the object. Even when the external action, or the transfer of information from outside, ceases, the structure remains steady for a long time in the existing environment. In that case, there is good reason to speak of stored information.
The rate of information flow received by a system within a certain time interval may be estimated from the difference in diversity between the moments of time under comparison. Information may also be measured by the Kullback entropy relative to the diversity under conditions of equilibrium or steady (stationary) state. There have been practically no studies aimed at measuring quantitative information flows. There was a rather keen interest in information theory as applied to the natural sciences, and to biology in particular, in the 1950s and 1960s. One branch of information theory (coding theory) made a considerable contribution to solving the problems of the genetic code and molecular synthesis. Limited possibilities for measurement and inadequate equipment hindered fruitful application of information theory in ecological research. Though a connection between information theory and thermodynamics was evident as early as the 1950s, a real integration of the two branches became possible only on the basis of the developed theory of nonequilibrium thermodynamics and synergetics. All of the above accounts for the exponential growth, over the last 10-15 years, of published papers dealing with this problem. These studies are mainly focused on explaining the evolution of both living matter and human society.
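The Kullback measure mentioned above is straightforward to compute. A minimal sketch; the species frequencies below are hypothetical, with a uniform distribution standing in for the equilibrium reference:

```python
import math

def kullback_entropy(p, q):
    """Kullback divergence K(p, q) = sum_i p_i * ln(p_i / q_i), in
    natural units: a measure of how far an observed distribution p
    deviates from a reference (e.g. equilibrium) distribution q."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

# Hypothetical example: observed species frequencies compared with a
# uniform "equilibrium" distribution over the same four species.
observed = [0.7, 0.1, 0.1, 0.1]
equilibrium = [0.25, 0.25, 0.25, 0.25]

print(round(kullback_entropy(observed, equilibrium), 4))  # positive: away from equilibrium
print(kullback_entropy(equilibrium, equilibrium))         # 0.0: no deviation
```

The divergence is zero only when the two distributions coincide, so it behaves as the distance-from-equilibrium measure the text describes.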
Evidently, the law of quantitative information transfer does not cover all aspects of what we instinctively associate with knowledge. A received signal may be meaningless in the receiver's perception and leave its state unchanged; conversely, a signal of negligible strength may induce drastic changes. Accordingly, information includes both a quantitative and a semantic component. In the simplest case, the latter may be treated in terms of the decoding of signals passing from transmitter to receiver. This implies an outside observer who establishes the rules of decoding and records the signal characteristics at the input together with the consequent changes in the receiver's state. Formally, it is a problem of statistical analysis aimed at a search for invariants of the receiver with respect to the transmitter. This important and by no means trivial problem of biosemiotics is related to the analysis of partial interactions and is potentially capable of simulating all possible partial relations. However, its solution does not necessarily give insight into the problem at the macroscopic level.
It should be stressed that, as follows unambiguously from our experience, an interaction between two systems may produce new systems whose structure and properties appear completely unpredictable, even given complete knowledge of the initially interacting systems (emergence). Generally, it is impossible even to define the set of possible outcomes, that is, the expected uncertainty. Therefore, the appearance (emergence) of a new, previously unknown structure may be defined as the origination of new information. The only condition for it is some energy input to the system. On the other hand, any formerly locally stable structure may disappear, together with the related information. It seems conceivable, therefore, that conservation laws do not apply to information at the macroscopic level. Being a measure of order, information arises from chaos and returns to it; evolution based on memory (the selection of locally stable structures) proceeds by the progressive retrieval of order and accumulation of information. An open macrosystem receives energy in various forms from its conventionally separated environment and generates a flow of information and its continuous increase.
Actually, it is this phenomenon that forces us to change our understanding of entropy as a measure of disorder; it makes us revise the classical thermodynamic model (which admits only mechanical forms of energy conversion), thus eliminating the discrepancy between the observed evolution and the second principle of thermodynamics. Numerous studies deal with this problem; more than 60 monographs have recently been published by Springer-Verlag. S. D. Khaitun, in particular, gives a meticulous review of existing opinions on thermodynamic irreversibility and concludes that it is advisable to return to the wording of the second principle as stated by W. Thomson (Lord Kelvin): mechanical energy dissipates (depreciates) in the course of irreversible processes - its amount decreases as it passes into other kinds of energy. This is a mechanical approach, in which all processes in the system are described by the movement of its constituent particles, and its state is exhaustively characterized in terms of coordinates and impulses, so that the energy appears as their function. Mechanical energy differs from nonmechanical energy in that its movement may be completely described by a set of coordinates and impulses; in other words, the energy is described by the Hamiltonian function. Nonmechanical energy is related to entropy information, because part of the energy is spent on the synthesis and maintenance of new structure and on the synthesis of new information. Considered together with the law of the information transfer rate over a communication channel, these results lead to the conclusion that the power (energy) of any external action is spent partly on the synthesis of elements of a known type and partly on the creation of new structures with unknown characteristics; the latter enlarge the band (within which the external actions are reproduced) and reduce the noise level in every individual case of information reception.
S. E. Jorgensen and Yu. M. Svirezhev introduce information into a biological system through the Kullback entropy, the latter being a measure of the system's deviation from the stationary state. In their model, the evolution of the system is governed by the consumed energy and by the inner order generated by the system itself, which controls the exergy (useful work). Evolution is aimed at an increase in exergy, that is, at the synthesis of structures far from equilibrium or the stationary state. Demonstrating the fact of information synthesis, A. M. Khasen supplemented the nonequilibrium thermodynamics model developed by I. Prigogine. He considered entropy information as a function of complex variables, which made it possible to resolve it into two constituents, namely basic information and semantic information. The expanded model generates new structures and increases the entropy information within the self-developing and self-organizing system. It is natural to suggest that a constant analogous to the Boltzmann constant (the length of a word or the width of the frequency band) appears as a function of self-development and creates a hierarchy (of the word-phrase-paragraph type). The hierarchy arises from the limited transmission capacity at the currently accepted level of energy transformation. The increase in transmissivity is due to the self-organization of the synthesized systems into systems of the next (higher) level, with a narrower frequency band. Accordingly, the number of hierarchic levels increases with total signal strength, while diversity decreases at every higher level.
The descriptive (qualitative) models of information synthesis and transformation considered here, in common with other analogous models, predict an exponential growth of information in the biosphere and therefore 'cancel' the danger of the 'heat death' that is imminent according to the second principle of thermodynamics. Within the framework of those models, the biosphere is considered a system of practically unlimited growth of informational complexity. That does not mean that individual elements cannot fail; but every lost element is replaced by two to four new ones, so that the rate of diversity synthesis grows progressively. It should be noted, however, that there is no universally accepted model of information processes in the biosphere; at present, we can only speak of a search for an adequate theory.