At present there is no general theory of information. In the quantitative (Shannon) theory, information is defined as the elimination of uncertainty. This definition presupposes a finite set of possible states or relationships together with their prior probabilities. In a broader sense, information is understood as the appearance (emergence) of order or structure with previously unknown characteristics out of chaos. In that case there is no closed set of states and no prior probabilities; the information can be measured only post factum, for example as a distance between the emerged structure and its stationary analog, or by some other means.

There is a distinct trend toward including information in the thermostatic model as the missing variable that controls evolution and its irreversibility. The very fact of the evolution of living matter (including the evolution of human beings) shows that over time the set of its stages grows in cardinality and new locally stabilized systems appear; they become more and more complicated and require an increasing flow of energy for their maintenance. The growth of the consumed energy flow is compensated by an enhanced total transmission capacity. A central problem in the synthesis of new structures is the balance between the memory that controls the admissible variants of new structures (targeted evolution) and the influence of the environment, exerted either through selection or through direct or indirect perception of its properties by the evolving object. Under real conditions information flows are measurable, although experience with such measurements is still scarce.
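The quantitative definition mentioned above, information as the elimination of uncertainty over a finite set of states with known prior probabilities, can be illustrated with a minimal sketch. The distributions below are illustrative assumptions, not taken from the source: an observation rules out half of four equally likely states, and the information received equals the uncertainty eliminated.

```python
import math

def entropy(probs):
    """Shannon entropy, in bits, of a finite probability distribution."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Hypothetical system with four equally likely states: 2 bits of uncertainty.
prior = [0.25, 0.25, 0.25, 0.25]

# An observation rules out two states, leaving two equally likely ones: 1 bit.
posterior = [0.5, 0.5, 0.0, 0.0]

# Information gained = uncertainty eliminated by the observation.
information_gained = entropy(prior) - entropy(posterior)  # 1.0 bit
```

Note that this measure exists only because the set of states and their priors are fixed in advance; for emergent order with no closed state set, as the text notes, no such prior-based measure applies and the information can only be assessed after the fact.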