entropy



en·tro·py

 (ĕn′trə-pē)
n. pl. en·tro·pies
1. Symbol S For a closed thermodynamic system, a quantitative measure of the amount of thermal energy not available to do work.
2. A measure of the disorder or randomness in a closed system.
3. A measure of the loss of information in a transmitted message.
4. The tendency for all matter and energy in the universe to evolve toward a state of inert uniformity.
5. Inevitable and steady deterioration of a system or society.

[German Entropie : Greek en-, in; see en-2 + Greek tropē, transformation; see trep- in Indo-European roots.]

en·tro′pic (ĕn-trō′pĭk, -trŏp′ĭk) adj.
en·tro′pi·cal·ly adv.
American Heritage® Dictionary of the English Language, Fifth Edition. Copyright © 2016 by Houghton Mifflin Harcourt Publishing Company. Published by Houghton Mifflin Harcourt Publishing Company. All rights reserved.

entropy

(ˈɛntrəpɪ)
n, pl -pies
1. (General Physics) a thermodynamic quantity that changes in a reversible process by an amount equal to the heat absorbed or emitted divided by the thermodynamic temperature. It is measured in joules per kelvin. Symbol: S See also law of thermodynamics
2. (General Physics) a statistical measure of the disorder of a closed system, expressed by S = k log P + c, where P is the probability that a particular state of the system exists, k is the Boltzmann constant, and c is another constant (see the sketch following this entry)
3. lack of pattern or organization; disorder
4. (Communications & Information) a measure of the efficiency of a system, such as a code or language, in transmitting information
[C19: from en-2 + -trope]
Collins English Dictionary – Complete and Unabridged, 12th Edition 2014 © HarperCollins Publishers 1991, 1994, 1998, 2000, 2003, 2006, 2007, 2009, 2011, 2014
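
To make the statistical definition above concrete, here is a minimal Python sketch. It uses the closely related Boltzmann form S = k ln W for a macrostate realised by W equally likely microstates, dropping the additive constant c in the Collins formula; the helper name statistical_entropy is hypothetical and used only for illustration.

import math

k_B = 1.380649e-23  # Boltzmann constant, in joules per kelvin

def statistical_entropy(W):
    # Entropy S = k * ln(W) of a macrostate realised by W equally
    # likely microstates; the additive constant is omitted.
    return k_B * math.log(W)

# Doubling the number of accessible microstates raises the entropy
# by k * ln(2), roughly 9.57e-24 J/K.
print(statistical_entropy(2) - statistical_entropy(1))

The tiny numerical value illustrates why macroscopic entropies of a few joules per kelvin correspond to astronomically large microstate counts.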

en•tro•py

(ˈɛn trə pi)

n.
1. a function of thermodynamic variables, as temperature or pressure, that is a measure of the energy that is not available for work in a thermodynamic process. Symbol: S
2. (in data transmission and information theory) a measure of the loss of information in a transmitted signal.
3. (in cosmology) a hypothetical tendency for the universe to attain a state of maximum homogeneity in which all matter is at a uniform temperature.
4. a state of disorder, as in a social system, or a hypothetical tendency toward such a state.
[< German Entropie (1865); see en-2, -tropy]
en•tro•pic (ɛnˈtroʊ pɪk, -ˈtrɒp ɪk) adj.
en•tro′pi•cal•ly, adv.
Random House Kernerman Webster's College Dictionary, © 2010 K Dictionaries Ltd. Copyright 2005, 1997, 1991 by Random House, Inc. All rights reserved.

en·tro·py

(ĕn′trə-pē)
A measure of the amount of disorder in a system. Entropy generally increases as the system's temperature increases. For example, when an ice cube melts and becomes liquid water, the bonds that held the water molecules in a rigid crystal are broken, and the arrangement of the molecules becomes more random, or disordered, than it was in the ice cube.
The American Heritage® Student Science Dictionary, Second Edition. Copyright © 2014 by Houghton Mifflin Harcourt Publishing Company. Published by Houghton Mifflin Harcourt Publishing Company. All rights reserved.
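
The ice-cube example above can be put in numbers with the thermodynamic definition given in the entries above (heat absorbed reversibly divided by absolute temperature). A minimal sketch, assuming the commonly quoted latent heat of fusion of ice of about 334 joules per gram; the helper name delta_S is hypothetical.

def delta_S(Q_joules, T_kelvin):
    # Entropy change for heat Q absorbed reversibly at constant
    # temperature T, in joules per kelvin.
    return Q_joules / T_kelvin

# Melting 1 g of ice at 273.15 K absorbs roughly 334 J, so the entropy
# of the water rises by about 334 / 273.15, i.e. ~1.22 J/K.
print(delta_S(334.0, 273.15))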
Noun 1. entropy - (communication theory) a numerical measure of the uncertainty of an outcome; "the signal contained thousands of bits of information" (see the sketch following this entry)
communication theory, communications - the discipline that studies the principles of transmitting information and the methods by which it is delivered (as print or radio or television etc.); "communications is his major field of study"
information measure - a system of measurement of information based on the probabilities of the events that convey information
2. entropy - (thermodynamics) a thermodynamic quantity representing the amount of energy in a system that is no longer available for doing mechanical work; "entropy increases as matter and energy in the universe degrade to an ultimate state of inert uniformity"
physical property - any property used to characterize matter and energy and their interactions
conformational entropy - entropy calculated from the probability that a state could be reached by chance alone
thermodynamics - the branch of physics concerned with the conversion of different forms of energy
Based on WordNet 3.0, Farlex clipart collection. © 2003-2012 Princeton University, Farlex Inc.
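
To illustrate the information-theoretic sense above (a numerical measure of the uncertainty of an outcome), here is a minimal Python sketch, assuming base-2 logarithms so the result is expressed in bits; the helper name shannon_entropy is hypothetical.

import math

def shannon_entropy(probabilities):
    # Shannon entropy H = -sum(p * log2(p)) of a discrete distribution,
    # in bits; zero-probability outcomes contribute nothing.
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

# A fair coin is maximally uncertain (1 bit per toss);
# a heavily biased coin carries far less uncertainty.
print(shannon_entropy([0.5, 0.5]))   # 1.0
print(shannon_entropy([0.9, 0.1]))   # ~0.47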
Translations

entropy

[ˈentrəpɪ] N entropía f
Collins Spanish Dictionary - Complete and Unabridged 8th Edition 2005 © William Collins Sons & Co. Ltd. 1971, 1988 © HarperCollins Publishers 1992, 1993, 1996, 1997, 2000, 2003, 2005

entropy

[ˈɛntrəpi] n entropie f
Collins English/French Electronic Resource. © HarperCollins Publishers 2005

entropy

n Entropie f
Collins German Dictionary – Complete and Unabridged 7th Edition 2005. © William Collins Sons & Co. Ltd. 1980 © HarperCollins Publishers 1991, 1997, 1999, 2004, 2005, 2007

entropy

[ˈɛntrəpɪ] n entropia
Collins Italian Dictionary 1st Edition © HarperCollins Publishers 1995

en·tro·py

n. entropía, disminución de la capacidad de convertir la energía en trabajo (entropy, a decrease in the capacity to convert energy into work).
English-Spanish Medical Dictionary © Farlex 2012
References in periodicals archive
Finthammer develops, implements, evaluates, and improves the very first algorithms tailor-made for solving the maximum entropy optimization problem under aggregating semantics.
This study aimed to determine hysteresis, enthalpy, entropy, enthalpy-entropy compensation theory and Gibbs free energy related to water adsorption and desorption in 'Malagueta' pepper seeds.
Even though the method of [5] performs well, using finite values of entropy for fault identification leads to a narrowed field of approach.
Strange, surreal, and symbolic, Aaron Costain's graphic novel, Entropy, probes some of the deepest questions about creation with a funny, absurd sensibility that makes it all go down smoothly.
Clausius coined the term entropy in 1865 and explained his preference for the word by its etymology.
A component of theories of human consciousness, the concept of entropy has become a greater research focus with recent improvements in the ability of functional magnetic resonance imaging (fMRI) to track chemical activity patterns in the brain.
In the past few years, distance, similarity, inclusion, and information entropy measures for IVIFSs have been very important topics.
In the present work, we examine the validity of GSLT by assuming various forms of entropy on apparent and event horizons.
Entropy (3) was first characterized by Havrda and Charvat [1].