Entropy in Classical Science

The second law of thermodynamics states that, in any physical or chemical process, the quality of the energy in a system (i.e., its free energy or exergy content) degrades; this expresses the irreversibility of natural processes.

According to Clausius, for any cyclic process

\oint \frac{\delta Q}{T} \leq 0

where Q is the heat transferred (measured in cal or J) and T is the absolute temperature (K). S is an extensive state function, namely 'entropy' (cal K⁻¹ or J K⁻¹), and it measures the irreversibility of a process; for an infinitesimal reversible transformation it follows that

dS = \frac{\delta Q_{\mathrm{rev}}}{T}

The term 'entropy' was coined by Clausius from the Greek τροπή (transformation) and ἐντροπία (evolution, mutation, or even confusion). For an isolated system ΔS ≥ 0, where equality holds only for ideal reversible processes (e.g., a Carnot cycle). In an ideal reversible transformation between two states of a system, the entropy variation is given by the sum of the ratios between the heat quantities exchanged with the environment during successive short intervals of the transformation and the corresponding absolute temperatures; in the limit of infinitesimal intervals, and thus infinitesimal heat quantities, this sum becomes an integral.

Thus, in classical thermodynamics, entropy is a function of state whose variation, in the transition of a system from one state to another, can be calculated.
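As a compact restatement (a standard formulation supplied here for clarity, not reproduced from the article's original figures), the entropy variation between two states A and B is obtained by integrating over any reversible path connecting them; for heat Q exchanged reversibly at a constant temperature T it reduces to Q/T:

\Delta S = S_B - S_A = \int_A^B \frac{\delta Q_{\mathrm{rev}}}{T},
\qquad
\Delta S\big|_{T=\text{const}} = \frac{Q_{\mathrm{rev}}}{T}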

In real, irreversible transformations of an isolated system, according to the second law of thermodynamics, the entropy variation is always positive, so entropy tends toward a maximum that marks the end of the system's spontaneous evolution. Any real process can only proceed in a direction that results in an entropy increase. Heat always flows spontaneously from a hotter reservoir to a colder one until there is no longer a temperature difference or gradient; gas always flows from high pressure to low pressure until there is no longer a pressure difference or gradient. This principle has been extended to the whole universe, suggesting the hypothesis of a trend toward thermal death; entropy thus represents the extent to which nature becomes more disordered or random. Entropy is also an indicator, or even evidence, of the existence of time (the arrow of time), because it gives a direction to the succession of states of a system.
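As a numerical illustration (the figures are chosen for this example and are not from the original text), let Q = 1000 J flow spontaneously from a hot reservoir at T_h = 400 K to a cold reservoir at T_c = 300 K. Treating each reservoir as exchanging heat reversibly at its own temperature, the entropy change of the isolated pair is positive:

\Delta S_{\mathrm{tot}} = \frac{Q}{T_c} - \frac{Q}{T_h}
= \frac{1000\ \mathrm{J}}{300\ \mathrm{K}} - \frac{1000\ \mathrm{J}}{400\ \mathrm{K}}
\approx 3.33 - 2.50 = +0.83\ \mathrm{J\,K^{-1}} > 0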

In statistical mechanics, entropy is an increasing function of the probability of the macroscopic state of a system. According to Boltzmann, entropy reflects the number of different ways the energy microstates of matter can be combined to give a particular macrostate; it is proportional to the logarithm of the number of possible microconfigurations for a given macrostate:

S = k \ln W

where W represents the number of microstates in which the matter-energy of the system can be arranged and k is Boltzmann's constant.

The larger the number of microstates for a given macrostate, the larger the entropy. Thus, the tendency of the entropy of an isolated system to increase corresponds to the fact that the system evolves toward its most probable macrostate: real systems tend to the macrostate that has the largest number of accessible microstates.
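The following Python sketch (an illustrative toy model added here, not part of the original article; the system size N and the sampled macrostates are assumed for the example) counts the microstates W of a system of N two-state particles and evaluates the Boltzmann entropy S = k ln W for several macrostates:

import math

k_B = 1.380649e-23  # Boltzmann's constant, J/K

def boltzmann_entropy(num_microstates):
    """Entropy S = k ln W of a macrostate realized by num_microstates microstates."""
    return k_B * math.log(num_microstates)

N = 100  # assumed toy system: 100 two-state particles ("spins" up or down)
for n_up in (0, 25, 50, 75, 100):   # macrostate = number of "up" particles
    W = math.comb(N, n_up)          # number of microstates for this macrostate
    S = boltzmann_entropy(W)
    print(f"n_up = {n_up:3d}   W = {W:.3e}   S = {S:.3e} J/K")

Running it shows that the evenly mixed macrostate (n_up = 50) has the largest number of microstates and therefore the largest entropy, mirroring the statement above that real systems tend toward the macrostate with the most accessible microstates.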

In general, the probability of a state is inversely related to its level of organization and order; thus, entropy is also conceived as a measure of the disorder and lack of differentiation of a system.
