History of entropy

From Wikipedia, the free encyclopedia

In the history of physics, the concept of entropy developed in response to the observation that a certain amount of functional energy released from combustion reactions is always lost to dissipation or friction and is thus not transformed into useful work. Early heat-powered engines such as Thomas Savery's (1698), the Newcomen engine (1712) and Nicolas-Joseph Cugnot's steam tricycle (1769) were inefficient, converting less than two percent of the input energy into useful work output.[citation needed] Over the next two centuries, physicists investigated this puzzle of lost energy; the result was the concept of entropy.

In the early 1850s, Rudolf Clausius set forth the concept of the thermodynamic system and posited the argument that in any irreversible process a small amount of heat energy δQ is incrementally dissipated across the system boundary. Clausius continued to develop his ideas of lost energy, and coined the term entropy.

Since the mid-20th century the concept of entropy has found application in the field of information theory, describing an analogous loss of data in information transmission systems.

Classical thermodynamic views


In 1803, mathematician Lazare Carnot published a work entitled Fundamental Principles of Equilibrium and Movement. This work includes a discussion on the efficiency of fundamental machines, i.e. pulleys and inclined planes. Carnot saw through all the details of the mechanisms to develop a general discussion on the conservation of mechanical energy. Over the next three decades, Carnot's theorem was taken as a statement that in any machine the accelerations and shocks of the moving parts all represent losses of moment of activity, i.e. the useful work done. From this Carnot drew the inference that perpetual motion was impossible. This loss of moment of activity was the first-ever rudimentary statement of the second law of thermodynamics and the concept of 'transformation-energy' or entropy, i.e. energy lost to dissipation and friction.[1]

Carnot died in exile in 1823. During the following year his son Sadi Carnot, having graduated from the École Polytechnique training school for engineers, but now living on half-pay with his brother Hippolyte in a small apartment in Paris, wrote Reflections on the Motive Power of Fire. In this book, Sadi visualized an ideal engine in which any heat (i.e., caloric) converted into work could be reinstated by reversing the motion of the cycle, a concept subsequently known as thermodynamic reversibility. Building on his father's work, Sadi postulated the concept that "some caloric is always lost" in the conversion into work, even in his idealized reversible heat engine, which excluded frictional losses and other losses due to the imperfections of any real machine. He also discovered that this idealized efficiency was dependent only on the temperatures of the heat reservoirs between which the engine was working, and not on the types of working fluids. Any real heat engine could not realize the Carnot cycle's reversibility, and was condemned to be even less efficient. This loss of usable caloric was a precursory form of the increase in entropy as we now know it. Though formulated in terms of caloric, rather than entropy, this was an early insight into the second law of thermodynamics.

Clausius' definitions


1854 definition

Rudolf Clausius, originator of the concept of "entropy"

In his 1854 memoir, Clausius first develops the concepts of interior work, i.e. that "which the atoms of the body exert upon each other", and exterior work, i.e. that "which arise from foreign influences [to] which the body may be exposed", which may act on a working body of fluid or gas, typically functioning to work a piston. He then discusses the three categories into which heat Q may be divided:

  1. Heat employed in increasing the heat actually existing in the body.
  2. Heat employed in producing the interior work.
  3. Heat employed in producing the exterior work.

Building on this logic, and following a mathematical presentation of the first fundamental theorem, Clausius then presented the first-ever mathematical formulation of entropy, although at this point in the development of his theories he called it "equivalence-value", perhaps referring to the concept of the mechanical equivalent of heat which was developing at the time, rather than entropy, a term which was to come into use later.[2] He stated:[3]

the second fundamental theorem in the mechanical theory of heat may thus be enunciated:

If two transformations which, without necessitating any other permanent change, can mutually replace one another, be called equivalent, then the generation of the quantity of heat Q from work at the temperature T has the equivalence-value:

\frac{Q}{T}

and the passage of the quantity of heat Q from the temperature T1 to the temperature T2 has the equivalence-value:

Q\left(\frac{1}{T_2} - \frac{1}{T_1}\right)

wherein T is a function of the temperature, independent of the nature of the process by which the transformation is effected.

In modern terminology, that is, the terminology introduced by Clausius himself in 1865, we think of this equivalence-value as "entropy", symbolized by S. Thus, using the above description, we can calculate the entropy change ΔS for the passage of the quantity of heat Q from the temperature T1, through the "working body" of fluid, which was typically a body of steam, to the temperature T2 as shown below:

Diagram of Sadi Carnot's heat engine, 1824

If we make the assignment:

S = \frac{Q}{T}

Then, the entropy change or "equivalence-value" for this transformation is:

\Delta S = S_{\mathrm{final}} - S_{\mathrm{initial}}

which equals:

\Delta S = \left(\frac{Q}{T_2} - \frac{Q}{T_1}\right)

and by factoring out Q, we have the following form, as was derived by Clausius:

\Delta S = Q\left(\frac{1}{T_2} - \frac{1}{T_1}\right)
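
As a rough numerical illustration of this equivalence-value (not part of Clausius's memoir), the short Python sketch below evaluates ΔS = Q(1/T2 − 1/T1) for a quantity of heat passing from a hotter to a colder temperature; the values of Q, T1 and T2 are invented for the example.

    # Sketch of Clausius's 1854 "equivalence-value" (the modern entropy change)
    # for heat Q passing from temperature T1 (hot) to T2 (cold).
    # Q, T1 and T2 below are illustrative numbers, not historical data.

    def equivalence_value(q, t1, t2):
        """Return Delta S = Q * (1/T2 - 1/T1), e.g. in joules per kelvin."""
        return q * (1.0 / t2 - 1.0 / t1)

    q = 1000.0   # heat transferred, J
    t1 = 500.0   # temperature the heat is taken from, K
    t2 = 300.0   # temperature the heat is delivered to, K
    print(equivalence_value(q, t1, t2))  # 1.333... J/K, positive for heat flowing hot -> cold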

1856 definition


In 1856, Clausius stated what he called the "second fundamental theorem in the mechanical theory of heat" in the following form:

\int \frac{\delta Q}{T} = -N

where N is the "equivalence-value" of all uncompensated transformations involved in a cyclical process. This equivalence-value was a precursory formulation of entropy.[4]

1862 definition

Main article: Disgregation

In 1862, Clausius stated what he calls the "theorem respecting the equivalence-values of the transformations", or what is now known as the second law of thermodynamics, as follows:

The algebraic sum of all the transformations occurring in a cyclical process can only be positive, or, as an extreme case, equal to nothing.

Quantitatively, Clausius states the mathematical expression for this theorem as follows.

Let δQ be an element of the heat given up by the body to any reservoir of heat during its own changes, heat which it may absorb from a reservoir being here reckoned as negative, and T the absolute temperature of the body at the moment of giving up this heat, then the equation:

\int \frac{\delta Q}{T} = 0

must be true for every reversible cyclical process, and the relation:

\int \frac{\delta Q}{T} \geq 0

must hold good for every cyclical process which is in any way possible.

This was an early formulation of the second law and one of the original forms of the concept of entropy.
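
A minimal sketch (not from Clausius) of how this criterion can be checked numerically, using the sign convention stated above in which heat given up by the body counts as positive: the sum of δQ/T over an ideal Carnot cycle vanishes, while for an irreversible cycle, here bare conduction of heat through the body with no work produced, the sum is strictly positive. All temperatures and heats are illustrative values.

    # Sketch of Clausius's 1862 criterion: sum delta-Q / T around a cycle.
    # Sign convention as in the text: heat GIVEN UP by the body is positive,
    # heat absorbed is negative. The numbers are illustrative, not historical.

    T_HOT, T_COLD = 500.0, 300.0   # reservoir temperatures, K
    q_hot = 1000.0                 # heat absorbed from the hot reservoir, J

    # Reversible Carnot cycle: give up q_cold at T_COLD with q_cold/q_hot = T_COLD/T_HOT.
    carnot = [(-q_hot, T_HOT), (q_hot * T_COLD / T_HOT, T_COLD)]

    # Irreversible cycle: the same heat is merely conducted through the body.
    conduction = [(-q_hot, T_HOT), (q_hot, T_COLD)]

    def transformation_sum(steps):
        """Algebraic sum of (heat given up)/T over the steps of a cyclical process."""
        return sum(dq / t for dq, t in steps)

    print(transformation_sum(carnot))      # ~0.0: reversible cycle
    print(transformation_sum(conduction))  # ~1.33 > 0: irreversible but possible cycle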

1865 definition


In 1865, Clausius gave irreversible heat loss, or what he had previously been calling "equivalence-value", a name:[5][6][7]

I propose that S be taken from the Greek words, 'en-tropie' [intrinsic direction]. I have deliberately chosen the word entropy to be as similar as possible to the word energy: the two quantities to be named by these words are so closely related in physical significance that a certain similarity in their names appears to be appropriate.[8]

Clausius did not specify why he chose the symbol S to represent entropy, and it is almost certainly untrue that Clausius chose S in honor of Sadi Carnot; the given names of scientists are rarely if ever used this way.[9]

Later developments


In 1876, physicist J. Willard Gibbs, building on the work of Clausius, Hermann von Helmholtz and others, proposed that the measurement of "available energy" ΔG in a thermodynamic system could be mathematically accounted for by subtracting the "energy loss" TΔS from the total energy change of the system ΔH. These concepts were further developed by James Clerk Maxwell [1871][citation needed] and Max Planck [1903].[10]
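
A one-line arithmetic sketch of this bookkeeping (the values are invented for illustration): the "available energy" is what remains of the total energy change after the term TΔS is subtracted.

    # Sketch of Gibbs's "available energy": Delta G = Delta H - T * Delta S.
    # The numbers below are made up for illustration only.

    def available_energy(delta_h, temperature, delta_s):
        """All quantities in consistent units, e.g. J, K and J/K."""
        return delta_h - temperature * delta_s

    print(available_energy(delta_h=-50_000.0, temperature=298.15, delta_s=-100.0))
    # -50000 - 298.15 * (-100) = -20185 J of "available energy"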

Statistical mechanics


In 1877, Ludwig Boltzmann developed a statistical mechanics evaluation of the entropy S of a body in its own given macrostate of internal thermodynamic equilibrium. It may be written as:

S = k_{\mathrm{B}} \ln \Omega

where

kB denotes the Boltzmann constant and
Ω denotes the number of microstates consistent with the given equilibrium macrostate.

Boltzmann himself did not actually write this formula expressed with the named constant kB; that form is due to Planck's reading of Boltzmann.[11]
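
A short sketch of this relation (the value of Ω is an arbitrary toy number; for real macroscopic systems Ω is astronomically large, so one usually works with ln Ω directly):

    import math

    # Sketch of S = k_B * ln(Omega), Omega being the number of microstates
    # consistent with the macrostate. The example Omega is arbitrary.

    K_B = 1.380649e-23  # Boltzmann constant, J/K (exact SI value)

    def boltzmann_entropy(num_microstates):
        """Entropy in J/K for a macrostate realized by num_microstates microstates."""
        return K_B * math.log(num_microstates)

    print(boltzmann_entropy(1e23))  # ~7.3e-22 J/K for this toy value of Omega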

Boltzmann saw entropy as a measure of statistical "mixedupness" or disorder. This concept was soon refined by Gibbs, and is now regarded as one of the cornerstones of the theory of statistical mechanics.

Relation to living systems


Erwin Schrödinger made use of Boltzmann's work in his 1944 book What is Life?[12] to explain why living systems have far fewer replication errors than would be predicted from statistical mechanics. Schrödinger used the Boltzmann equation in a different form to show the increase of entropy:

S = k_{\mathrm{B}} \ln D

where D is the number of possible energy states in the system that can be randomly filled with energy. He postulated a local decrease of entropy for living systems when (1/D) represents the number of states that are prevented from randomly distributing, such as occurs in replication of the genetic code:

-S = k_{\mathrm{B}} \ln(1/D)

Without this correction Schrödinger claimed that statistical mechanics would predict one thousand mutations per million replications, and ten mutations per hundred replications, following the rule for the square root of n, far more mutations than actually occur.
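
Since ln(1/D) = −ln D, the two forms quoted above are the same relation; the sketch below (with arbitrary toy state counts) checks this and shows that restricting the number of states that can be randomly filled lowers S, the local entropy decrease Schrödinger invoked.

    import math

    # The two forms are equivalent because ln(1/D) = -ln(D).
    # State counts below are arbitrary toy numbers, not Schrödinger's.

    K_B = 1.380649e-23  # J/K

    def entropy(states):
        return K_B * math.log(states)

    d_free = 1e6         # states available for random filling
    d_constrained = 1e3  # fewer states once some are prevented from randomly distributing

    print(math.isclose(entropy(d_free), -K_B * math.log(1 / d_free)))  # True: same relation
    print(entropy(d_constrained) - entropy(d_free))                    # negative: local decrease in S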

Schrödinger's separation of random and non-random energy states is one of the few explanations for why entropy could be low in the past, but continually increasing now. It has been proposed as an explanation of localized decrease of entropy[13] in radiant energy focusing in parabolic reflectors and during dark current in diodes, which would otherwise be in violation of statistical mechanics.

Information theory


An analog to thermodynamic entropy is information entropy. In 1948, while working at Bell Telephone Laboratories, electrical engineer Claude Shannon set out to mathematically quantify the statistical nature of "lost information" in phone-line signals. To do this, Shannon developed the very general concept of information entropy, a fundamental cornerstone of information theory. Although the story varies, initially it seems that Shannon was not particularly aware of the close similarity between his new quantity and earlier work in thermodynamics. In 1939, however, when Shannon had been working on his equations for some time, he happened to visit the mathematician John von Neumann. During their discussions regarding what Shannon should call the "measure of uncertainty" or attenuation in phone-line signals with reference to his new information theory, according to one source:[14]

My greatest concern was what to call it. I thought of calling it ‘information’, but the word was overly used, so I decided to call it ‘uncertainty’. When I discussed it with John von Neumann, he had a better idea. Von Neumann told me, ‘You should call it entropy, for two reasons: In the first place your uncertainty function has been used in statistical mechanics under that name, so it already has a name. In the second place, and more important, nobody knows what entropy really is, so in a debate you will always have the advantage.’

According to another source, when von Neumann asked him how he was getting on with his information theory, Shannon replied:[15]

The theory was in excellent shape, except that he needed a good name for "missing information". "Why don’t you call it entropy", von Neumann suggested. "In the first place, a mathematical development very much like yours already exists in Boltzmann's statistical mechanics, and in the second place, no one understands entropy very well, so in any discussion you will be in a position of advantage."

In 1948 Shannon published his seminal paper A Mathematical Theory of Communication, in which he devoted a section to what he calls Choice, Uncertainty, and Entropy.[16] In this section, Shannon introduces an H function of the following form:

H = -K \sum_{i=1}^{k} p(i) \log p(i),

where K is a positive constant. Shannon then states that "any quantity of this form, where K merely amounts to a choice of a unit of measurement, plays a central role in information theory as measures of information, choice, and uncertainty." Then, as an example of how this expression applies in a number of different fields, he references Richard C. Tolman's 1938 Principles of Statistical Mechanics, stating that

the form of H will be recognized as that of entropy as defined in certain formulations of statistical mechanics where pi is the probability of a system being in cell i of its phase space ... H is then, for example, the H in Boltzmann's famous H theorem.
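
As an illustration of this H function (not taken from Shannon's paper), the sketch below evaluates H for a few simple distributions with K = 1 and base-2 logarithms, so the result is in bits:

    import math

    # Shannon's H = -K * sum(p_i * log p_i); here K = 1 and logs are base 2 (bits).
    # The example distributions are invented for illustration.

    def shannon_entropy(probs, k=1.0, base=2.0):
        """Entropy of a discrete distribution; terms with p = 0 contribute nothing."""
        return -k * sum(p * math.log(p, base) for p in probs if p > 0)

    print(shannon_entropy([0.5, 0.5]))   # 1.0 bit: a fair coin
    print(shannon_entropy([0.9, 0.1]))   # ~0.47 bits: a biased coin is less uncertain
    print(shannon_entropy([0.25] * 4))   # 2.0 bits: four equally likely outcomes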

As such, in the decades since this statement was made, the two concepts have frequently been conflated, or even asserted to be exactly the same.

Shannon's information entropy is a much more general concept than statistical thermodynamic entropy. Information entropy is present whenever there are unknown quantities that can be described only by a probability distribution. In a series of papers by E. T. Jaynes starting in 1957,[17][18] the statistical thermodynamic entropy can be seen as just a particular application of Shannon's information entropy to the probabilities of particular microstates of a system occurring in order to produce a particular macrostate.

Popular use


The term entropy is often used in popular language to denote a variety of unrelated phenomena. One example is the concept of corporate entropy as put forward somewhat humorously by authors Tom DeMarco and Timothy Lister in their 1987 classic publication Peopleware, a book on growing and managing productive teams and successful software projects. Here, they treat red tape and business team inefficiency as a form of entropy, i.e. energy lost to waste. This concept has caught on and is now common jargon in business schools.

In another example, entropy is the central theme in Isaac Asimov's short story "The Last Question" (first copyrighted in 1956). The story plays with the idea that the most important question is how to stop the increase of entropy.

Terminology overlap


When necessary, to disambiguate between the statistical thermodynamic concept of entropy and entropy-like formulae put forward by different researchers, the statistical thermodynamic entropy is most properly referred to as the Gibbs entropy. The terms Boltzmann–Gibbs entropy or BG entropy, and Boltzmann–Gibbs–Shannon entropy or BGS entropy are also seen in the literature.


References

  1. ^ Mendoza, E. (1988). Reflections on the Motive Power of Fire – and other Papers on the Second Law of Thermodynamics by E. Clapeyron and R. Clausius. New York: Dover Publications. ISBN 0-486-44641-7.
  2. ^ Mechanical Theory of Heat, by Rudolf Clausius, 1850–1865.
  3. ^ Published in Poggendorff's Annalen, December 1854, vol. xciii, p. 481; translated in the Journal de Mathematiques, vol. xx, Paris, 1855, and in the Philosophical Magazine, August 1856, s. 4, vol. xii, p. 81.
  4. ^ Clausius, Rudolf (1856). "On the Application of the Mechanical Theory of Heat to the Steam-Engine." As found in: Clausius, R. (1865). The Mechanical Theory of Heat – with its Applications to the Steam Engine and to Physical Properties of Bodies. London: John van Voorst, 1 Paternoster Row. MDCCCLXVII.
  5. ^ Laidler, Keith J. (1995). The Physical World of Chemistry. Oxford University Press. pp. 104–105. ISBN 0-19-855919-4.
  6. ^ OED, Second Edition, 1989, "Clausius (Pogg. Ann. CXXV. 390), assuming (unhistorically) the etymological sense of energy to be ‘work-contents’ (werk-inhalt), devised the term entropy as a corresponding designation for the ‘transformation-contents’ (Verwandlungsinhalt) of a system".
  7. ^ Baierlein, Ralph (December 1992). "How entropy got its name". American Journal of Physics. 60 (12): 1151. Bibcode:1992AmJPh..60.1151B. doi:10.1119/1.16966.
  8. ^Clausius, Rudolf (1865)."Ueber verschiedene für die Anwendung bequeme Formen der Hauptgleichungen der mechanischen Wärmetheorie (Vorgetragen in der naturforsch. Gesellschaft zu Zürich den 24. April 1865)".Annalen der Physik und Chemie.125 (7):353–400.Bibcode:1865AnP...201..353C.doi:10.1002/andp.18652010702. "Sucht man fürS einen bezeichnenden Namen, so könnte man, ähnlich wie von der GröſseU gesagt ist, sie sey derWärme- und Werkinhalt des Körpers, von der GröſseS sagen, sie sey derVerwandlungsinhalt des Körpers. Da ich es aber für besser halte, die Namen derartiger für die Wissenschaft wichtiger Gröſsen aus den alten Sprachen zu entnehmen, damit sie unverändert in allen neuen Sprachen angewandt werden können, so schlage ich vor, die GröſseS nach dem griechischen Worte ἡ τροπή, die Verwandlung, dieEntropie des Körpers zu nennen. Das WortEntropie habei ich absichtlich dem WorteEnergie möglichst ähnlich gebildet, denn die beiden Gröſsen, welche durch diese Worte benannt werden sollen, sind ihren physikalischen Bedeutungen nach einander so nahe verwandt, daſs eine gewisse Gleichartigkeit in der Benennung mir zweckmäſsig zu seyn scheint." (p. 390).
  9. ^ Girolami, G. S. (2020). "A Brief History of Thermodynamics, As Illustrated by Books and People". J. Chem. Eng. Data. 65 (2): 298–311. doi:10.1021/acs.jced.9b00515. S2CID 203146340.
  10. ^ Planck, M. (1903). Treatise on Thermodynamics. Ogg, A. (transl.). London: Longmans, Green & Co. OL 7246691M.
  11. ^ Partington, J.R. (1949), An Advanced Treatise on Physical Chemistry, vol. 1, Fundamental Principles, The Properties of Gases, London: Longmans, Green and Co., p. 300.
  12. ^ Schrödinger, Erwin (2004). What is Life? (11th reprinting ed.). Cambridge: Canto. pp. 72–73. ISBN 0-521-42708-8.
  13. ^"Random and Non Random States". 27 August 2014.
  14. ^ M. Tribus, E.C. McIrvine, "Energy and information", Scientific American, 224 (September 1971).
  15. ^ Avery, John (2003). Information Theory and Evolution. World Scientific. ISBN 981-238-400-6.
  16. ^ C.E. Shannon, "A Mathematical Theory of Communication", Bell System Technical Journal, vol. 27, pp. 379–423, 623–656, July, October 1948. Eprint, Archived 1998-01-31 at the Wayback Machine, PDF.
  17. ^ E. T. Jaynes (1957) "Information theory and statistical mechanics", Physical Review 106: 620.
  18. ^ E. T. Jaynes (1957) "Information theory and statistical mechanics II", Physical Review 108: 171.