Word | entropy
Definition | entropy, n.

1. Physics and Chemistry.

a. Originally: a thermodynamic quantity related to the separation and dispersal of molecules under the action of heat (see disgregation n.), later interpreted as a measure of the unavailability of energy in a thermodynamic system for doing work (work n. 10). In later use: a measure of the dispersal of energy in a system, frequently interpreted as representing the degree of disorder or randomness in that system.

Note: Entropy was first defined by the German physicist Rudolf Clausius (1822–88). The Scottish physicists Peter Guthrie Tait (1831–1901) and James Clerk Maxwell (1831–79) were the first to interpret entropy as a measure of the unavailability of energy for work. The modern mathematical definition of entropy, in terms of the possible microstates (see microstate n. 1) of a thermodynamic system, first appears in the work of the Austrian physicist Ludwig Boltzmann (1844–1906), who viewed entropy as a measure of the disorder of a system (see the illustrative note at the end of this entry).

Thesaurus:
the world > action or operation > inaction > disinclination to act or listlessness > [noun] > lack of vigour or energy: neshness (OE); thowlessness (1489); fecklessness (1637); nervelessness (1857); entropy (1867)
the world > matter > chemistry > physical chemistry > thermochemistry > [noun] > entropy: entropy (1867)
the world > matter > physics > mechanics > dynamics > thermodynamics > [noun] > entropy: entropy (1867)

Quotations:
1867 tr. R. Clausius, Mech. Theory Heat ix. 357: I have intentionally formed the word entropy so as to be as similar as possible to the word energy.
1889 Proc. Amer. Acad. Arts & Sci. 1888–9 24 461: Boltzmann has even been able to determine the precise nature of the functions which Clausius called entropy and disgregation.
1933 W. E. Orchard, From Faith to Faith xi. 280: The deduction which one of our greatest physicist astronomers draws from the second law of thermodynamics: namely, that since there must be a maximum entropy, there must have been once its maximum opposite.
1955 Sci. Amer. May 124/2: Certain combinations of balls yield a greater change in entropy than others. Those combinations in which entropy change reaches maximum value lead to solutions.
2013 A. Rutherford, Creation: Origin of Life iv. 75: The entropy of the universe is bound to only ever increase, thereby ultimately creating a more balanced but less ordered existence.

Further quotations:
1868 P. G. Tait, Sketch Thermodynamics i. 29: We shall..use the excellent term Entropy in the opposite sense to that in which Clausius has employed it,—viz., so that the Entropy of the Universe tends to zero.
1885 H. W. Watson & S. H. Burbury, Math. Theory Electr. & Magn. I. 245: As in the working of a heat engine, the entropy of the system must be diminished by the process, that is, there must be equalisation of temperature.

2. figurative. A state of or tendency towards disorder; an irreversible dissipation of energy resulting in stagnation or inactivity. Frequently with modifying word.

Quotations:
1925 A. Strachey & J. Strachey tr. S. Freud, Coll. Papers III. v. 599: In considering the conversion of psychical energy no less than of physical, we must make use of the concept of an entropy, which opposes the undoing of what has already occurred.
1965 Financial Times 11 Aug.: Moralising by those whose industrial entropy is an accepted fact of life is neither likely to persuade the workers nor assist the trade unions in the task of trying to meet the nation's difficulties.
1985 Anthropos 80 228/1: For some investigators..any departure from the traditional lifeways is necessarily accompanied by cultural entropy.
2017 Irish Times (Nexis) 28 Oct. 34: Irish writers have sought to extract creative energy from that political entropy.

3. a. Statistics. Any of various statistical measures of the uncertainty of outcomes in a given probability distribution; esp. the quantity −Σ P(x_i) log P(x_i), where P(x_i) is the probability of an event x_i occurring.

Note: The entropy of a fair coin toss, for example, is 1, representing maximum uncertainty, while that of a toss using a coin which always lands on heads is 0 (see the first sketch following the entry). Although the mathematician Claude Shannon (1916–2001) coined the term in the context of information theory (see sense 3b), he first used it in this more general sense.

Quotations:
1948 C. E. Shannon in Bell Syst. Techn. Jrnl. 27 394: If x is a chance variable we will write H(x) for its entropy.
1968 P. A. P. Moran, Introd. Probability Theory i. 50: Since −x log x is a convex function the entropy of a finite set of events is a maximum when their probabilities are equal.
1989 Jrnl. Amer. Statist. Assoc. 84 159/2: The entropy of a discrete probability distribution is positive.
2003 Isis 94 550/1: The present lectures mainly involve the normal/Gaussian, binomial, multinomial, beta/gamma probability distributions, which are two of the three continuous maximum entropy (ME) distributions.

b. In information theory: the quantity −Σ p_i log p_i, where p_i is the probability of a particular sign, symbol, value, etc., occurring in an item of data, frequently regarded as a measure of the number of bits required to encode the information (information n. 2c) contained therein (see the second sketch following the entry).

Quotations:
1948 C. E. Shannon in Bell Syst. Techn. Jrnl. 27 396: The entropy of the source will be defined as the average of these H_i weighted in accordance with the probability of occurrence of the states in question.
1964 Language 40 210: The basic probability concept, 'entropy', and its quantum, the 'bit', are now part of the metalanguage of linguistics.
2016 R. Pettigrew, Accuracy & Laws of Credence xiii. 177: We can now define a measure of entropy corresponding to any inaccuracy measure.

This entry has been updated (OED Third Edition, September 2018; most recently modified version published online March 2022). Noun; first attested 1867.
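To make the quantity in sense 3a concrete, here is a minimal Python sketch; the function name shannon_entropy and the choice of a base-2 logarithm are illustrative assumptions rather than part of the entry, though base 2 matches the entry's coin-toss figures (entropy 1 for a fair coin, 0 for a certain one).

from math import log2

def shannon_entropy(probs):
    # H = -sum(p * log2(p)) over outcomes with nonzero probability;
    # zero-probability outcomes contribute nothing, by the usual
    # convention that 0 * log 0 = 0.
    return -sum(p * log2(p) for p in probs if p > 0)

print(shannon_entropy([0.5, 0.5]))  # fair coin: 1.0 bit, maximum uncertainty
print(shannon_entropy([1.0, 0.0]))  # coin that always lands heads: -0.0, i.e. 0 bits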
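Sense 3b reads the same quantity as bits per symbol in an item of data. The sketch below assumes that each symbol's probability is estimated by its relative frequency in the data; the function name data_entropy is ours, not the entry's.

from collections import Counter
from math import log2

def data_entropy(data):
    # Per-symbol entropy of an item of data, in bits: estimate each
    # symbol's probability as its relative frequency, then apply
    # H = -sum(p_i * log2(p_i)).
    counts = Counter(data)
    n = len(data)
    return -sum((c / n) * log2(c / n) for c in counts.values())

print(data_entropy("aaaa"))  # -0.0, i.e. 0 bits per symbol: no uncertainty
print(data_entropy("abab"))  # 1.0 bit per symbol: two equiprobable symbols
print(data_entropy("abca"))  # 1.5 bits per symbol: p = 0.5, 0.25, 0.25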
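On the microstate definition mentioned in the note to sense 1a: the standard statistical-mechanical formulas, given here in LaTeX for illustration (k_B is Boltzmann's constant, W the number of accessible microstates, and p_i the probability of microstate i; none of these symbols appear in the entry itself):

S = k_B \ln W, \qquad S = -k_B \sum_i p_i \ln p_i

When all W microstates are equally likely, so that p_i = 1/W, the second expression reduces to the first; dropping k_B and taking logarithms to base 2 yields exactly the Shannon quantity of sense 3.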