entropy: Meaning and Definition

en•tro•py

Pronunciation: (en'tru-pē)
— n.
    1. (on a macroscopic scale) a function of thermodynamic variables, as temperature, pressure, or composition, that is a measure of the energy that is not available for work during a thermodynamic process. A closed system evolves toward a state of maximum entropy.
    2. (in statistical mechanics) a measure of the randomness of the microscopic constituents of a thermodynamic system. Symbol: S
    3. (in data transmission and information theory) a measure of the loss of information in a transmitted signal or message.
    4. (in cosmology) a hypothetical tendency for the universe to attain a state of maximum homogeneity in which all matter is at a uniform temperature.
    5. a doctrine of inevitable social decline and degeneration.
Random House Unabridged Dictionary, Copyright © 1997, by Random House, Inc., on Infoplease.
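As a supplementary note (not part of the dictionary entry above), the quantities behind senses 1-3 are conventionally written with the Clausius, Boltzmann, and Shannon formulas, respectively:

    dS = \delta Q_{rev} / T          (thermodynamic entropy, sense 1)
    S = k_B \ln W                    (statistical-mechanical entropy, sense 2)
    H(X) = -\sum_i p_i \log_2 p_i    (information-theoretic entropy, sense 3)

where Q_rev is heat transferred reversibly at absolute temperature T, k_B is Boltzmann's constant, W is the number of accessible microstates, and the p_i are the probabilities of the possible messages or symbols.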