
Definition: entropy from Philip's Encyclopedia

Quantity that specifies the disorder of a physical system; the greater the disorder, the greater the entropy. In thermodynamics, it expresses the degree to which thermal energy is available for work: the less available it is, the greater the entropy. According to the second law of thermodynamics, a system's change in entropy is either zero or positive in any process.


Summary Article: entropy from The Columbia Encyclopedia

(ĕn'trəpē), quantity specifying the amount of disorder or randomness in a system bearing energy or information. Originally defined in thermodynamics in terms of heat and temperature, entropy indicates the degree to which a given quantity of thermal energy is available for doing useful work: the greater the entropy, the less available the energy.

For example, consider a system composed of a hot body and a cold body; this system is ordered because the faster, more energetic molecules of the hot body are separated from the less energetic molecules of the cold body. If the bodies are placed in contact, heat will flow from the hot body to the cold one. This heat flow can be utilized by a heat engine (a device that turns thermal energy into mechanical energy, or work), but once the two bodies have reached the same temperature, no more work can be done. Furthermore, the combined lukewarm bodies cannot unmix themselves into hot and cold parts in order to repeat the process. Although no energy has been lost in the heat transfer, the energy can no longer be used to do work; thus the entropy of the system has increased.

According to the second law of thermodynamics, during any process the change in entropy of a system and its surroundings is either zero or positive. In other words, the entropy of the universe as a whole tends toward a maximum. This means that although energy cannot vanish, because of the law of conservation of energy (see conservation laws), it tends to be degraded from useful forms to useless ones. It should be noted that the second law of thermodynamics is statistical rather than exact; nothing in principle prevents the faster molecules from separating from the slow ones, but such an occurrence is so improbable as to be impossible from a practical point of view. In information theory, the term entropy denotes the average information content, or uncertainty, of the data in a message.
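The two notions of entropy described above can both be put in numbers. As a minimal illustrative sketch (the bodies, masses, and temperatures below are hypothetical, not from the article): for two bodies brought into contact, each body's entropy changes by m·c·ln(T_final/T_initial), and the sum is always positive; for a message, Shannon entropy is the expected information content of its symbols.

```python
import math

def mixing_entropy_change(m1, c1, T1, m2, c2, T2):
    """Entropy change (J/K) when two bodies reach thermal equilibrium.

    Assumes constant specific heats and no heat lost to the surroundings.
    Temperatures are absolute (kelvin).
    """
    # Final temperature follows from conservation of energy
    Tf = (m1 * c1 * T1 + m2 * c2 * T2) / (m1 * c1 + m2 * c2)
    # Each body contributes dS = m * c * ln(Tf / Ti)
    dS = m1 * c1 * math.log(Tf / T1) + m2 * c2 * math.log(Tf / T2)
    return Tf, dS

def shannon_entropy(probs):
    """Shannon entropy in bits: the average information content per symbol."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Hypothetical example: 1 kg of water at 373.15 K in contact with 1 kg at 273.15 K
Tf, dS = mixing_entropy_change(1.0, 4186.0, 373.15, 1.0, 4186.0, 273.15)
print(Tf)  # equilibrium temperature, 323.15 K
print(dS)  # positive: entropy of the combined system has increased

# A fair coin flip carries 1 bit of entropy
print(shannon_entropy([0.5, 0.5]))
```

Note that dS comes out positive even though no energy is lost, mirroring the article's point: the energy is conserved but degraded, and the lukewarm pair cannot spontaneously unmix.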

The Columbia Encyclopedia, © Columbia University Press 2017
