Stanford Encyclopedia of Philosophy

Information Processing and Thermodynamic Entropy

First published Tue Sep 15, 2009

Are principles of information processing necessary to demonstrate the consistency of statistical mechanics? Does the physical implementation of a computational operation have a fundamental thermodynamic cost, purely by virtue of its logical properties? These two questions lie at the centre of a large body of literature concerned with the Szilard engine (a variant of the Maxwell's demon thought experiment), Landauer's principle (supposed to embody the fundamental principle of the thermodynamics of computation) and possible connections between the two. A variety of attempts to answer these questions have illustrated many open questions in the foundations of statistical mechanics.

1. Maxwell, Szilard and Landauer

1.1 Maxwell's demon

Maxwell's demon was first mentioned in a letter written to Tait in 1867. Maxwell was one amongst a number of researchers in the developing field of thermodynamics who was interested in seeking an understanding of thermal phenomena in terms of an underlying atomic physics. However, unlike Boltzmann and Clausius, who were attempting to prove the law of entropy increase from such atomic physics, Maxwell had realised that if thermodynamics was ultimately grounded in atomic theory, then the second law of thermodynamics could have only a statistical validity.

Maxwell's demon: the partitioned container

The standard demon is supposed to be able to create a temperature difference in a gas, without expending work. The gas is in a container, divided in two by an insulated partition, but there is a hole in the partition just large enough for a single molecule to pass through. The gas has been allowed to equilibrate at some well defined temperature, and as a result the average kinetic energy of each molecule is (3/2)kT (we ignore internal degrees of freedom and assume the gas is monatomic), where T is the absolute (Kelvin) temperature scale and k is Boltzmann's constant.

The demon is equipped with a shutter that is able to block the hole. If a molecule that is moving faster than average approaches the hole from the left hand side, the demon closes the hole with the shutter, and the molecule is elastically reflected back to the left. If a molecule approaches from the left hand side, but moving slower than average, the demon leaves the hole unblocked, and the molecule proceeds through to the right. When a molecule approaches the hole from the right hand side, the demon's sorting procedure is reversed: slow molecules are blocked and fast molecules are allowed to pass through. The result is a gradual accumulation of faster molecules on the left and slower molecules on the right. Further collisions between molecules will distribute this kinetic energy throughout each side, resulting in the gas in the left side getting hotter and the right side getting cooler. As the collisions with the shutter are elastic, and moving the shutter is frictionless, no work is performed by the demon. The temperature difference that develops could be exploited by a conventional heat engine to extract work, in violation of the second law of thermodynamics.

Maxwell's demon: the standard demon at work

A simpler demon could be constructed simply by always blocking molecules coming from the left and never blocking molecules approaching from the right. This pressure demon would cause a pressure difference to develop between the two sides of the partition and again, a conventionally operating engine could exploit such a difference to extract work.

His thought experiment was intended to demonstrate the possibility of a gas evolving from a higher to a lower entropy state. The later time reversal and recurrence arguments of Loschmidt and Poincaré similarly challenged the H-theorem of Boltzmann (see Uffink (2006), Section 4, for a discussion of these arguments), but time reversal requires great sensitivity and recurrence takes a very long time. Although they show the law of entropy increase is not absolute, we might still be surprised to actually witness a large decrease in entropy happen through these means. With the demon we appear to need another explanation of why such systems are not seen to occur spontaneously in nature, or why we cannot arrange for such a decrease in entropy to occur.

Maxwell's original discussions emphasised that the demon needed to have powers of perception and handling of the individual molecules far greater than our own. This makes its operation simply a matter of scale and the statistical nature of the second law not probabilistic, but due to our inability to discriminate the exact state of a large number of particles (similar to our inability to exploit Loschmidt's reversibility objection). This leaves open the possibility of a device which could discriminate fluctuations in individual atomic velocities and it is not clear that any probabilistic argument would prevent work being extracted from this. The explanation of Brownian motion by Einstein in 1905, as the effect of statistical mechanical fluctuations, made them appear directly observable and open to exploitation. Earman and Norton (1998) includes a historical review of early attempts to design such devices.

Smoluchowski (1914) is generally credited with having proposed the explanation that prevents their operation. He replaced the demon by a physical device, in this case a gentle spring that presses the trapdoor against the side of the partition. The combined spring and trapdoor is supposed to act as a valve. The trapdoor is held shut if a molecule collides from the left, but is opened by a fast collision from the right, so that a pressure difference develops. However, the spring is a system with its own kinetic and potential energy. The collisions will transfer energy to the spring, making it oscillate. When the internal energy of the spring matches the temperature of the gas, it is flapping back and forth, effectively randomly. It becomes as likely as not to be spontaneously open when a molecule arrives from the left and to be swinging shut when a molecule arrives from the right. This balance means pressure or temperature differences are no more likely to occur than they would spontaneously if the hole was simply left open. If the internal energy of the spring is out of equilibrium with the gas, it will be heated (or cooled) by the collisions with the molecules until equilibrium is reached.

The trapdoor may appear to violate the second law over short periods, but the behaviour is such that it is not violated in the long run. Smoluchowski suggested that a modified second law should express the inability of a device to produce continuous, reliable reductions in entropy.

1.2 Szilard's engine

Smoluchowski also left open the possibility of an exception even to a modified second law:

As far as we know today, there is no automatic, permanently effective perpetual motion machine, in spite of the molecular fluctuations, but such a device might, perhaps, function regularly if it were appropriately operated by intelligent beings.

Szilard (1929) attempted to investigate this special case of intelligently operated devices by considering a box containing only a single molecule. He argued that in order to achieve the entropy reduction, the intelligent being must acquire knowledge of which fluctuation occurs and so must perform a measurement. The second law would not be threatened provided there was a compensating cost to performing this measurement, regardless of the character of the intelligent being.

The Szilard engine

The Szilard engine consists of a box, containing a single molecule, in thermal contact with a heat bath, and a partition. Thermal contact transfers energy, through random fluctuations, back and forth between the molecule and the heat bath. The molecule bounces randomly throughout the box with this thermal energy.

The partition is capable of being inserted into the box, dividing it into two separate volumes, and is also capable of sliding, frictionlessly, along the box to the left or to the right. When the partition is inserted in the box, collisions with the molecule exert a pressure on the partition. If the partition moves in the direction of the pressure, it may be coupled to a pulley and the force used to lift a weight. If the partition moves against the pressure, this requires the coupled pulley to lower a weight, working against the pressure exerted by the molecule.

If the partition is inserted at the middle point, the molecule is caught on one side or the other with equal probability. Now if it is known which side the molecule is on, it is possible to connect up the partition to a pulley and extract work. Based upon the ideal gas law, PV = NkT, for the case where N = 1, it is a standard calculation to show that the maximum work extracted as the partition moves to the side of the box is kT ln 2. As the molecule is assumed to be in thermal contact with the heat bath at all times, the kinetic energy of the molecule is maintained at (3/2)kT, and the work extracted is drawn from the heat bath. Once the partition reaches the side of the box, it can be removed and the cycle has been completed. Heat has been extracted from the heat bath and converted into work, with apparent certainty. The process may be repeated indefinitely to continue the certain extraction of work. If this succeeds, Smoluchowski's modified second law appears to be violated.
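The "standard calculation" behind the kT ln 2 figure integrates the single-molecule pressure P = kT/V′ as the partition sweeps from the middle of the box to the far wall. A minimal numerical sketch (the temperature, box volume and integration scheme are illustrative assumptions, not values from the text):

```python
import math

# Work extracted as the partition moves from the middle of the box (volume
# V/2 behind it) to the far wall (volume V): with PV = NkT and N = 1, the
# pressure is P = kT/V', and the maximum work is the integral of P dV'.
k = 1.380649e-23  # Boltzmann's constant, J/K
T = 300.0         # illustrative temperature, K
V = 1.0           # box volume in arbitrary units; the result is independent of V

# Midpoint-rule integration of kT/V' over [V/2, V].
n = 100_000
dv = (V - V / 2) / n
work = sum(k * T / (V / 2 + (i + 0.5) * dv) * dv for i in range(n))

print(work)                 # numerically matches the closed form below
print(k * T * math.log(2))  # the closed-form maximum, kT ln 2
```

The result is independent of the chosen box volume, since ∫ dV′/V′ from V/2 to V is ln 2 for any V.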

Szilard's analysis emphasised the necessity of knowing which side the molecule is on, for work to be extracted. Without this it would not be possible to know in which direction the partition needs to move. This connects the apparent second law violation to the state of a demon's knowledge. Szilard argued that the second law would be saved if the acquisition of knowledge by the demon came with a compensating entropy cost. Much of the succeeding literature is concerned with whether, and how, this knowledge comes with the cost that Szilard believed, or with whether such knowledge is, in fact, necessary for the engine's operation.

1.3 Landauer's principle

Landauer (1961) investigated the question of what are the physical limitations on building a device to implement a computation. At the time he wrote, an influential body of work had been developed, by Brillouin (1951, 1956), Gabor (1964) and Rothstein (1951), arguing that the acquisition of information through a measurement required a dissipation of at least kT ln 2 of energy for each bit of information gathered. von Neumann (1949) had also suggested, on the basis of Szilard's work, that every act of information processing was necessarily accompanied by this level of energy dissipation.

Starting by representing logical operations as abstract maps defined from one set of discrete logical states to another set, Landauer argued that a physical system that was designed to implement the logical operation must have physical states to correspond to the logical states. He then distinguished between logically reversible and logically irreversible operations: an operation is logically reversible if the input state can be uniquely identified from the output state.

The NOT operation, for example, is a logically reversible operation. If the output is logical state one, then the input must have been logical state zero, and vice versa. An example of a logically irreversible operation is the AND operation. If the output is logical state zero, then there are three possible combinations of input logical states that could have produced that output: (zero, zero); (zero, one); and (one, zero).
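The distinction can be restated in terms of truth tables: an operation is logically reversible exactly when every output has a unique preimage. A small sketch (the dictionary encoding of the truth tables is our own illustrative choice):

```python
# Truth tables as maps from input states to output states.
NOT = {0: 1, 1: 0}
AND = {(0, 0): 0, (0, 1): 0, (1, 0): 0, (1, 1): 1}

# NOT: each output has exactly one preimage, so the input is recoverable.
preimages_not = {out: [x for x in NOT if NOT[x] == out] for out in (0, 1)}
print(preimages_not)  # {0: [1], 1: [0]}

# AND: output 0 has three preimages, so the inputs cannot be recovered.
preimages_and = {out: [x for x in AND if AND[x] == out] for out in (0, 1)}
print(len(preimages_and[0]))  # 3
```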

As logically reversible operations need to be 1:1 maps, Landauer argued they can be implemented by physical devices which do not compress the physical state space. Logically irreversible operations reduce the logical state space, so must compress the physical state space. Landauer argued that this must be accompanied by a corresponding entropy increase in the environment, in the form of heat dissipation. Most familiar logical operations are irreversible and so by this argument must generate heat.

Szilard engine for irreversible logical operation

To quantify the heat generation Landauer considered the most basic logically irreversible operation to be resetting a bit. This operation takes two input logical states (conventionally zero and one) and always outputs logical state zero (in some papers reset to one is considered instead).

For a physical implementation of this operation, it is usual to consider a device similar to the Szilard engine. A box, in thermal contact with a heat bath, contains a single molecule and a partition which divides the box in two. If the molecule is on the left hand side, then the physical state represents logical state zero and if the molecule is on the right hand side, it represents logical state one.

The partition is then removed from the centre of the box, so the molecule is free to travel throughout the box. The partition is inserted into the far right hand side of the box and, maintaining thermal contact, is slowly moved to the centre of the box. Once again, collisions with the molecule exert a pressure on the partition, requiring work to be performed, and the energy from the work is transferred via the molecule to heat in the heat bath. Standard calculations show this requires at least kT ln 2 work. One expression of what has now become known as “Landauer's principle” is that there are no possible physical implementations of the resetting operation that can do better than this, that is, reset a bit to zero converting less than kT ln 2 of work into heat.
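For a sense of scale, the Landauer bound can be evaluated numerically; the room temperature chosen here is an illustrative assumption:

```python
import math

# Landauer's bound: the minimum work converted to heat when resetting one
# bit, evaluated at an illustrative room temperature.
k = 1.380649e-23  # Boltzmann's constant, J/K
T = 300.0         # illustrative temperature, K

bound = k * T * math.log(2)
print(bound)                     # joules per bit reset (order 1e-21 J)
print(bound / 1.602176634e-19)   # the same figure in electronvolts
```

The bound is tiny compared with the switching energies of practical electronics, which is part of why it is discussed as a limit of principle rather than of engineering.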

Landauer referred to this operation as resetting, although in much of the succeeding literature this has become known as erasure. This has caused some confusion as “erasing” information could be taken just to mean an operation which destroys the original information, without necessarily leaving the system determinately in the zero state. An example of such ‘erasure-by-destruction’ would be to simply remove the partition from the box, wait long enough for thermalisation to randomise the location of the molecule, then reinsert the partition in the centre of the box. Such an operation would clearly destroy the original information, represented by the original location of the molecule, but requires no work to be performed. However, it is also clearly not an implementation of the reset operation.

Landauer's argument suggests that, purely from the abstract properties of a logical operation, one can deduce a thermodynamic constraint upon any physical system which is required to act as an embodiment of that logical operation. It suggests there is a non-trivial thermodynamics of computation, a non-trivial connection between abstract logical functions and their physical implementation in thermal systems.

Landauer did not, in his 1961 paper, directly address the question of whether a measurement was logically reversible or irreversible, but only questioned whether the concept of a measurement had been defined sufficiently well in the work of Brillouin and others. At that point in time, he regarded logical irreversibility as an essential part of computation and believed this was responsible for a necessary minimum heat generation in information processing. His arguments were presented as making more precise the arguments of Brillouin and von Neumann.

Bennett (1973) built upon Landauer's work, but argued that logical irreversibility could be avoided in computation in general. In (Bennett 1982) he argued that measurement could also be represented by a logically reversible process, avoiding any need to generate heat. This represented a major change from von Neumann and Brillouin's arguments, and Bennett's presentation of Landauer's principle rapidly became accepted as the fundamental principle of the thermodynamics of computation.

2. Statistical Mechanics and the Second Law

The literature devoted to analysing the second law of thermodynamics in the context of statistical mechanics starts with the development of statistical mechanics in the late 19th century. Considerable confusion has arisen simply due to the fact that as the subject has developed, the meanings of key terms have changed or become ambiguous. When one paper speaks of the second law being violated or of entropy decreasing and another of it being saved or being non-decreasing, it is not necessarily the case that they are referring to the same things. While a review of the foundations of thermal physics is beyond the scope of this entry (though see related entries by Sklar and Uffink), some important distinctions need to be borne in mind.

2.1 Which entropy?

Even in phenomenological thermodynamics, the definition of thermodynamic entropy is difficult to make precise and may be approached in a number of ways (see (Uffink 2001) for an extensive treatment of this issue). The traditional approach is based upon the work of Carnot, Kelvin and Clausius, one version of which will be given here.

A closed thermodynamic system has contact with the rest of the world only through work and heat exchange. Work performed upon the system lowers a weight through a gravitational potential, while work extracted is used to raise the weight through the potential. This work may be performed through the manipulation of externally controllable parameters of the system (such as adjusting the volume of a sealed box containing a gas) or may be by other means (such as driving a paddle wheel within the gas, stirring the gas). Heat is exchanged with heat baths brought into thermal contact with the system. Many heat baths may be used, and they may be at different temperatures to each other. A closed cycle is a sequence of operations that leaves the system in the same thermodynamic state at the end of the sequence as it was at the start of the sequence, but may change the position of the weight in the gravitational potential, and may involve quantities of heat being deposited in, or extracted from, the individual heat baths.

It was empirically observed that in any closed cycle, whose sole result is the generation of heats Qi in heat baths at temperatures Ti (requiring work W = ∑i Qi to be performed), the Clausius inequality:

∑i Qi/Ti ≥ 0

holds. Both the Kelvin version of the second law of thermodynamics:

it is impossible to perform a cyclic process with no other result than that heat is extracted from a heat bath, and a weight is raised

and the Clausius version:

it is impossible to perform a cyclic process with no other result than that heat is extracted from a heat bath with a low temperature and deposited in a heat bath with a higher temperature

are special cases of this inequality. Kelvin and Clausius, building on the work of Carnot, suggested that the inequality must hold, for all closed cycles, as a general law. The temperature scale is the absolute temperature scale (up to a positive multiplicative constant rescaling: T′ = aT, with a > 0), which may be measured with an ideal gas thermometer.
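A reversible Carnot cycle gives a concrete case where the equality in the Clausius inequality is attained, while an irreversible cycle gives a strictly positive sum. A numerical sketch (the bath temperatures and heats are illustrative assumptions; as above, Qi is the heat generated in bath i, so heat drawn from a bath counts as negative):

```python
# Reversible Carnot cycle between two heat baths: the Clausius sum vanishes.
T_hot, T_cold = 500.0, 300.0  # illustrative bath temperatures, K

Q_drawn = 100.0                    # heat drawn from the hot bath, J
Q_hot = -Q_drawn                   # heat generated in the hot bath
Q_cold = Q_drawn * T_cold / T_hot  # heat deposited in the cold bath (reversible)

clausius_sum = Q_hot / T_hot + Q_cold / T_cold
print(clausius_sum)  # 0.0: the reversible cycle attains the equality

# An irreversible cycle dumps extra heat into the cold bath, and the sum
# becomes strictly positive.
Q_cold_irrev = Q_cold + 10.0
print(Q_hot / T_hot + Q_cold_irrev / T_cold)  # > 0
```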

Now let us suppose there exists a process that transforms the system from a thermodynamic state A to thermodynamic state B, while generating heats qi in heat baths at temperatures Ti, and an opposite process, from B to A, generates heats q′i in the same heat baths, in such a way that the equality:

∑i qi/Ti + ∑i q′i/Ti = 0

is reached. It follows from the inequality that, if there is any process that transforms the system from state A to state B, while generating heats Qi in heat baths at temperatures Ti, then:

∑i Qi/Ti + ∑i q′i/Ti ≥ 0

and so:

∑i Qi/Ti ≥ ∑i qi/Ti.

The term

∑i qi/Ti

defines a minimum quantity associated with the heats generated by any possible process that transforms the system from state A to state B. Clausius had the insight that this could be used to define a function of the thermodynamic state, through the measurement of heat transferred to heat baths, as the system changes between two states. The function, the thermodynamic entropy S, is defined by

S(A) − S(B) = ∑i qi/Ti
For any other process,

∑i Qi/Ti ≥ ∑i qi/Ti
so for any process to be possible:

∑i Qi/Ti ≥ S(A) − S(B)

An adiabatic process (one which does not generate any heat) from state A to state B is therefore only possible if it is entropy increasing: S(A) ≤ S(B).

This definition of thermodynamic entropy depends upon cyclic processes that can reach the equality, which are called reversible processes. The existence of such processes between thermodynamic states allows the entropy differences between those states to be determined and, by extension to all states, defines a thermodynamic entropy function that is globally unique (up to a rescaling S′ = a⁻¹S + b, where a and b are constants, and a is the multiplicative constant from the temperature scale). It may be noted that if there exist states that cannot be connected by a reversible process, it is still always possible to define an entropy function that satisfies

i 
Qi
Ti
 ≥ S(A) − S(B)

for all possible processes, but its value will not be determineduniquely (i.e., there will exist a number of functions that satisfythe inequality).

To reach the equality for a cycle generally requires quasistatic reversible processes. These are processes for which the system goes through infinitesimally small changes in the state variables (such as the temperature, the volume and the pressure of a gas) and for which the change can go in either direction with equal and opposite infinitesimal heat exchanges with heat baths. These heat exchanges are generally only reversible if the system is in thermal equilibrium with the heat bath.

For the changes in the state variables to be infinitesimal, the state space must be continuous. A sequence of states will then be represented by a continuous curve in the state space. The curve connecting A to B through these infinitesimal changes replaces the summation with an integral. The Ti can be replaced by the temperature of the system T, and the heat, dQ, is now the heat absorbed by the system, to give:

S(B) − S(A) = ∫AB dQ/T

The Clausius inequality ensures that this value is the same for all quasistatic reversible paths from A to B. It should be noted that a quasistatic reversible path is an idealisation that is reachable only in the limit of infinitely slow processes.

This thermodynamic entropy is a consistently defined single valued function of the thermodynamic state only if the Clausius inequality holds. If a Maxwellian demon exists, however, then it would appear possible the Clausius inequality would not hold. To investigate further it is necessary to consider statistical mechanical generalisations of entropy.

For statistical mechanics we need to consider a microscopic state space and a dynamical evolution of states in that space. Classically this will be a phase space, with an N-body system having 3N position degrees of freedom and 3N momentum degrees of freedom. A single point in the phase space corresponds to the combined physical state of all the N bodies. The dynamics is almost always supposed to be Hamiltonian. A Hamiltonian flow preserves the measure dX3N dP3N. This measure can be used to define the volume, VR, of a region of phase space, R, as:

VR = ∫R dX3N dP3N
A very important consequence of this is Liouville's Theorem, which shows that the volume of phase space occupied by a set of states does not change when that set of states evolves through a Hamiltonian evolution.
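Volume preservation can be illustrated with the simplest Hamiltonian system: for a harmonic oscillator the flow is a rigid rotation of phase space, so any region keeps its area under time evolution. A Monte Carlo sketch (the oscillator, the chosen region and the sample size are our own illustrative choices, and the estimate is statistical):

```python
import math
import random

# Liouville's Theorem for a harmonic oscillator with H = (x² + p²)/2:
# Hamiltonian evolution by time t rotates phase space by angle t, so the
# image of the unit square [0,1]×[0,1] must still have area 1.
random.seed(0)
t = 1.3          # evolution time (arbitrary)
n = 200_000      # Monte Carlo samples

# Sample a bounding box containing the rotated square, and count points
# whose pre-image (rotating back by t) lies in the original square.
hits = 0
for _ in range(n):
    x = random.uniform(-2.0, 2.0)
    p = random.uniform(-2.0, 2.0)
    x0 = x * math.cos(t) - p * math.sin(t)  # inverse rotation
    p0 = x * math.sin(t) + p * math.cos(t)
    hits += (0.0 <= x0 <= 1.0) and (0.0 <= p0 <= 1.0)

area = hits / n * 16.0  # bounding box area is 4 × 4 = 16
print(area)             # ≈ 1.0, as Liouville's Theorem requires
```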

For quantum mechanical systems, the microscopic state space is a Hilbert space. Dynamic evolution is generally through a unitary operator, but with doubly stochastic transitions if wavefunction collapse occurs. Volumes of regions of the state space are associated with the dimensionality of the smallest subspace containing the region, and an analogue of Liouville's Theorem holds for unitary evolution (doubly stochastic transitions may increase, but cannot decrease, this state space volume). For the most part there is little difference between the classical and quantum treatments of Szilard's engine and Landauer's principle and, unless stated otherwise, we will use the classical treatment. We will consider some of the suggested differences in Section 5.

The Boltzmann entropy, SB = k ln W, is widely regarded as being the most natural analog within statistical mechanics for the thermodynamic entropy. It is a property of an individual microstate. The state space is divided up into a number of distinct regions and SB is defined in terms of the volume, W, of the region of state space to which the microstate belongs. All microstates within a given region have the same Boltzmann entropy.

There are many approaches to defining the division of the state space into distinct regions. The most common group together sets of microstates that match criteria such as being macroscopically or observationally indistinguishable, or being accessible over time to the microstate evolution. For the systems considered here, these approaches generally define the same regions. We may conventionally refer to these regions as macrostates, while acknowledging this terminology is a little inappropriate when describing systems consisting only of a single molecule. In the case of the Szilard engine, for example, the macrostate of the system, when the partition is absent, consists of the set of all the microstates for which the molecule is in the box. When the partition is inserted in the box, the macrostate is the set of all the microstates for which the position of the molecule is on the same side of the partition as the actual location of the molecule. We will sometimes refer to the Boltzmann entropy of a macrostate: this is simply the Boltzmann entropy of the microstates within that macrostate.

The Boltzmann entropy, SB, is not guaranteed to be non-decreasing. While decreases of SB are known to be possible through the reversibility and recurrence objections to Boltzmann's H-theorem, such decreases would be seen as surprising if they occurred in practice. While an individual microstate can evolve from a high volume macrostate to a low volume macrostate, Liouville's Theorem guarantees that only a fraction of the microstates from the larger macrostate can end up in the smaller macrostate under a Hamiltonian evolution. From the logarithmic form of Boltzmann entropy the proportion, by volume, is p ≤ e−ΔSB/k (with ΔSB the reduction in the Boltzmann entropy between the two macrostates). If it can be assumed that the probability of the microstate being in a given subregion of the macrostate is proportional to the phase space volume of the subregion, this gives Einstein's fluctuation formula. Although widespread use is made of this assumption, its justification is one of the more significant challenges for the foundations of statistical mechanics.
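As an illustration of the bound, consider the probability of finding all the molecules of a gas spontaneously in the left half of their box: halving the accessible volume per molecule reduces SB by Nk ln 2, so the bound is 2⁻ᴺ. A quick numerical check (the choice of fluctuation and the values of N are illustrative):

```python
import math

# The fluctuation bound p ≤ exp(−ΔSB/k) for an N-molecule gas found
# entirely in the left half of its box: ΔSB = N·k·ln 2, so p ≤ 2⁻ᴺ.
k = 1.380649e-23  # Boltzmann's constant, J/K

for N in (1, 10, 100):
    delta_S = N * k * math.log(2)      # the reduction in Boltzmann entropy
    p_bound = math.exp(-delta_S / k)   # Einstein fluctuation bound
    print(N, p_bound)  # bounds of roughly 0.5, 1e-3 and 8e-31
```

For a single molecule the bound is an unremarkable 1/2; for macroscopic N it is so small that such fluctuations are never witnessed, which is the sense in which the second law holds statistically.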

By Liouville's Theorem, if all the microstates (up to a set of measure zero) in an initial macrostate evolve into the same final macrostate, then the Boltzmann entropy of the final macrostate cannot be less than the Boltzmann entropy of the initial macrostate. This is a macroscopically deterministic process. A macroscopically indeterministic process is one in which microstates starting in the same initial macrostate end up in different final macrostates. Penrose (1970, Chapters V, VI) analysed the problem of the change in Boltzmann entropies for these kinds of processes in detail. After attempting a justification of probabilities being proportional to phase space volume, he argued that even the average of the Boltzmann entropy can decrease for macroscopically indeterministic processes. For a system initially in a macrostate with Boltzmann entropy SB, and evolving, with probability pi, into a macrostate i with Boltzmann entropy SBi, it is possible that ∑i pi SBi < SB (it is even possible that SBi < SB for all i). However, Penrose also showed that this decrease is bounded: ∑i pi (SBi − k ln pi) ≥ SB. He suggested that, when there is a probability distribution over distinct macrostates, a modified, statistical entropy, SP = ∑i pi (SBi − k ln pi), should be used. This statistical entropy is non-decreasing even for macroscopically indeterministic processes.
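Penrose's bound can be checked for partition insertion in the Szilard engine, which is exactly such a macroscopically indeterministic process: the average Boltzmann entropy drops by k ln 2, but the −k ln pi term restores it. A sketch in units of k (the initial entropy value is an arbitrary illustrative choice):

```python
import math

# Penrose's bound for inserting the partition in the Szilard engine,
# working in units of k. Each half-box macrostate has half the phase space
# volume, so S_Bi = S_B − ln 2, and each occurs with probability 1/2.
S_B = 5.0  # arbitrary illustrative initial Boltzmann entropy (units of k)
p = [0.5, 0.5]
S_Bi = [S_B - math.log(2), S_B - math.log(2)]

average = sum(pi * si for pi, si in zip(p, S_Bi))
S_P = sum(pi * (si - math.log(pi)) for pi, si in zip(p, S_Bi))

print(average)  # S_B − ln 2: the average Boltzmann entropy has decreased
print(S_P)      # equals S_B: the statistical entropy has not
```

For this process the bound is attained exactly, since each −k ln pi = k ln 2 exactly compensates the entropy reduction of each half-box macrostate.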

The Gibbs approach to statistical mechanics is based upon probability distributions, p(X3N, P3N), over the state space, rather than the properties of individual microstates. The Gibbs entropy of the distribution is defined by SG = −k ∫ p(X3N, P3N) ln p(X3N, P3N) dX3N dP3N. If the state space is divided into a number of distinct macrostates, Ri, then the Gibbs entropy of the ith macrostate is

SGi = −k ∫Ri p(X3N, P3N | i) ln p(X3N, P3N | i) dX3N dP3N,

with p(X3N, P3N | i) = p(X3N, P3N, i)/pi and pi = ∫Ri p(X3N, P3N) dX3N dP3N. This gives SG = ∑i pi (SGi − k ln pi). (N.B. p(X3N, P3N, i) = 0 outside of Ri and p(X3N, P3N, i) = p(X3N, P3N) inside of Ri.)

It is a consequence of Liouville's Theorem that this entropy is constant under Hamiltonian flows. While this guarantees entropy is non-decreasing, it presents a problem for providing an account of the appearance of entropy increase. The standard approach to dealing with this is called coarse graining. Coarse graining replaces the probability distribution over each macrostate, p(X3N, P3N | i), with a ‘smoother’ probability distribution, p′(X3N, P3N | i), typically the ‘uniform’ distribution:

p′(X3N, P3N | i) = 1/∫Ri dX3N dP3N

inside Ri and p′(X3N, P3N | i) = 0 outside Ri. The coarse grained entropy for the macrostate now satisfies

S′Gi = −k ∫Ri p′(X3N, P3N | i) ln p′(X3N, P3N | i) dX3N dP3N ≥ SGi.

The entropy of the overall coarse grained probability distribution p′(X3N, P3N) = ∑i p′(X3N, P3N | i) pi is

S′G = −k ∫ p′(X3N, P3N) ln p′(X3N, P3N) dX3N dP3N = ∑i pi (S′Gi − k ln pi) ≥ SG.

Coarse graining the probability distribution avoids Liouville's Theorem, and successive coarse grainings increase the coarse grained Gibbs entropy. The justification for coarse graining is usually attributed to our observations being insensitive to the fine grained structure of the original probability distribution p(X3N, P3N | i). The acceptability of this practice is one of the major problems for the Gibbs approach to statistical mechanics.

For Szilard's engine and Landauer's principle, it is largely agreed that numerical evaluations allow us to set SBi = SGi = S′Gi for all macrostates, by the adjustment of a single additive constant (this is equivalent to saying that entropy differences between macrostates are the same for each definition of entropy). Where there is no uncertainty over the macroscopic state, it can also be assumed SB = SP = SG = S′G. This has allowed a large amount of discussion to take place in the literature referring to ‘entropy’, without specifying which entropy is meant. Although this practice can cause confusion, it avoids unnecessarily cumbersome language in situations where the different entropies are in agreement. However, when an argument is only valid for a particular entropy, or when these entropies disagree, it should be clearly stated which entropy is involved.

2.2 Which second law?

Atomic physics allows the possibility of evolutions to states that would usually be characterised as having a lower thermodynamic entropy. An exorcism of Maxwell's demon, therefore, cannot be taken to be the claim that this entropy cannot go down, as this is definitely possible. Smoluchowski proposed, not an outright exorcism of Maxwell's demon, but an argument that Maxwell's demon could not violate a modified second law. He offered a modified formulation of the second law which states the demon is unable to reliably, continuously produce work. Such a demon, while not exorcised, might be considered “tamed”. A tame demon could produce (in the terminology of Earman and Norton (1998)) “straight” violations of the second law, but not “embellished” violations, where the straight violation is exploited in such a way as to reliably, continuously produce such work.

Such a formulation leaves much to be clarified: reliability is not clearly defined and the requirement that the demon only fail in the infinite time limit seems to allow arbitrarily large violations on any finite time scale we might care about, with probabilities as close to one as we please, provided they do not actually reach one. A common reworking of the modified form, but which is clearly stronger, is that the demon cannot operate a cycle, in finite time, in which the expectation value for work produced is positive, as repetition of such a cycle could produce arbitrarily large amounts of work with probability arbitrarily close to one.

Not only are these formulations inequivalent, they leave open the possibility of other types of violation. Is there a demon which continuously, reliably produces work without ever completing a cycle, or perhaps a demon which produces arbitrarily large amounts of work and completes a cycle with a probability arbitrarily close to one, but which still does not succeed on average because it suffers a catastrophically large loss on the remote possibility that it fails? Whether a cycle needs to complete with certainty, or just with probability arbitrarily close to one, and whether a completed cycle means the system must return to exactly its initial state or need only return to an equivalent state, would seem to require clarification as part of such a modified law.

Formulations of modified second laws in terms of entropy must first identify the entropy function being used. Here the problem is not so much to define a modified law. One may easily define ones that are already known to be true (the fine-grained Gibbs entropy cannot decrease) or false (the Boltzmann entropy cannot decrease, or cannot decrease on average). The challenge must be to show that a given modified law is, in fact, the law one should really be concerned about. Its violation would show that untamed demons clearly exist, while its proof would ensure that all demons are tamed.

In this entry a constrained violation is one in which the unmodified second law is violated, but with some constraint upon the form of the violation, so that some form of modified second law may still be possible. An unconstrained violation is one in which all attempts to construct meaningful modifications of the second law are also invalidated. Although this is still ambiguous, there seems general agreement that a cycle which could complete in finite time and with probability one, and have no other effect than to convert a quantity of heat into work, would certainly constitute an unconstrained violation.

3. Statistical Mechanics Requiring Information Processing

Szilard's own answer to the dilemma posed by his engine was to presume that a demon could not operate the device continuously and reliably. He then tried to deduce where it must go wrong. He imposed the postulate that the second law must be obeyed and, after eliminating all other sources of entropy production, he argued that a minimum entropy production occurs during measurement. The postulated second law is a modified one, requiring that the average entropy production in a measurement process must equal the average entropy reduction as a result of that measurement. Although Szilard did not identify a specific definition of entropy, it is clear from the context that he was considering the entropy of specific macrostates. The most significant aspect of Szilard's argument is, in effect, that if statistical mechanics is incompatible with the operation of untamed demons, then it is required that there be an entropic cost associated with the demon's acquisition of information.

3.1 Measurement with light

While Szilard gave a specific example of a measurement process that was supposed to demonstrate the entropic cost, he did not attempt a general argument that no physical measurement process could do better. Instead the argument went that if one did exist, it would lead to an untamed demon. In light of later arguments, it should also be pointed out that it is not, in fact, measurement but the erasure of the measurement outcome that generates the entropy production in Szilard's example.

Szilard's argument was developed further after Shannon identified that the measure −Σ p ln p had operational significance for information theory, suggestive of a deeper connection between entropy and information. To illustrate the idea further, both Gabor (1964) and Brillouin (1951) constructed specific models of dissipative measurement that involve shining a light into one side of the engine to see whether that side contains the molecule. The light is scattered if the molecule is on that side, but is not scattered if the molecule is not present. However, if the system, including the electromagnetic field itself, is in thermal equilibrium, then there is background radiation at that temperature with a blackbody spectrum. To see the scattered light it must be distinguishable from the background radiation. This requires a photon with energy much higher than the mean energy of the blackbody spectrum, so hν ≫ kT. A photon must be used whose energy is greater than the energy gained from the operation of the engine, thereby precluding a net conversion of heat into work.
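A rough numerical sketch makes the scale of Brillouin's point concrete. The temperature of 300 K and the factor of 10 (standing in for the "much greater than" condition hν ≫ kT) are assumptions chosen for illustration, not values from the original arguments:

```python
import math

k = 1.380649e-23   # Boltzmann constant, J/K
T = 300.0          # assumed bath temperature, K

# Work obtainable per cycle of the Szilard engine.
w_cycle = k * T * math.log(2)

# To stand out against the blackbody background, the probe photon
# needs energy well above kT; take h*nu = 10 kT as an illustration.
e_photon = 10 * k * T

print(f"work per cycle  kT ln 2 = {w_cycle:.3e} J")
print(f"probe photon    10 kT   = {e_photon:.3e} J")
print(f"photon cost / work gain = {e_photon / w_cycle:.1f}")
```

On these assumptions the detection photon costs over fourteen times the work the cycle could yield, which is the sense in which a net conversion of heat into work is precluded.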

Brillouin, in particular, attempted to develop this idea into a general theory of the relationship between information and entropy. Brillouin now identified entropy with the Gibbs entropy of a system. He distinguished two kinds of information: ‘bound’ information and ‘free’ information. Bound information refers strictly to information which is embodied in the states of physical systems. Information which is, for example, only contained in someone's mind is, according to Brillouin, free, not bound. Performing a measurement could reduce the thermodynamic entropy of a system, but to do so requires the creation of an equivalent quantity of bound information in the device which holds the outcome of the measurement. However, it is not entirely clear whether Brillouin's argument was that the creation of bound information is necessarily associated with compensating entropy production, as the arguments based upon scattered photons suggest, or whether the bound information is itself an additional term that must be added to the normal entropy to produce a generalised second law.

3.2 Engines without demons

A counter-argument to Szilard was developed originally by Popper, although it first appears in print with Feyerabend (1966), and it has been rediscovered repeatedly. Its goal is to reject the idea that statistical mechanical entropy is a subjective quantity. The primary purpose of the counter-argument is to show that work can be extracted without an intelligent being needing to be involved. The concept of information is argued not to carry any real burden in understanding the Szilard engine. The measurement can be performed, and the operation of the engine effected, without there needing to be a demon to find out the result of the measurement. The description of the measurement outcome as information is superfluous.

Popper-Szilard engine diagram

The Popper-Szilard engine attaches a pulley and weight to each side of the partition, but positions a floor beneath the engine in such a way that when the partition is in the centre of the box, both weights are resting on the floor and the pulley is taut. The partition has a hole in it, allowing the molecule access to both sides of the engine. At an arbitrary time, the hole is blocked, without any measurement being performed on the location of the molecule. The collisions of the molecule against the partition will now exert a force on one or the other weight, lifting it against gravity.

What is frequently left unclear in discussions of these devices is whether a compensating entropy increase must still take place and, if not, what implications this has. To some, most notably Feyerabend, no compensation need occur, and the argument is continued to claim explicitly that the second law of thermodynamics is unconditionally violated. To others, such as Popper, it indicates that the second law of thermodynamics is only applicable to large systems with many degrees of freedom. The engine is used to demonstrate the domain of validity of thermodynamics, in a similar manner to Maxwell's original demon. From this point of view, the Smoluchowski trapdoor may still be viewed as a thermodynamic system but the Szilard engine is not. Advances in technology, such as in nanotechnology and quantum computation, may render this argument problematic, as the ability to construct reliable devices manipulating the states of individual molecules will, if there is not some additional physical constraint, eventually allow the construction of macroscopically large numbers of such devices. If each device could reliably, continuously extract even microscopic amounts of heat, then in aggregate there would again be the production of macroscopically significant quantities of work.
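A back-of-envelope estimate shows how microscopic gains aggregate. The device count and cycle rate below are invented purely for illustration; only the kT ln 2 per cycle comes from the text:

```python
import math

k = 1.380649e-23   # Boltzmann constant, J/K
T = 300.0          # assumed operating temperature, K

per_cycle = k * T * math.log(2)   # heat converted to work per engine cycle, ~3e-21 J

devices = 1e18     # hypothetical number of molecular-scale engines
rate = 1e6         # hypothetical cycles per second per device

power = per_cycle * devices * rate
print(f"aggregate power: {power:.0f} W")
```

Even with each engine delivering only zeptojoules per cycle, an array on this (assumed) scale would output kilowatts, which is why reliable molecular-scale devices would make the violation macroscopically significant.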

3.3 Memory and erasure

Landauer's work led indirectly to criticisms of the Szilard argument, although Landauer did not directly address the Szilard engine or Maxwell's demon. According to his own comments in (Landauer 1986), it appears he believed that logically irreversible operations were a necessary part of a computation and that these generated heat. Although it was known to be possible to simulate logically irreversible operations with logically reversible ones, this came with its own cost of additional bits of information needing to be stored. To avoid this storage, and to complete a thermodynamic cycle, required the additional bits to be reset to zero, with a corresponding entropy cost.

A similar argument was developed from first principles by Penrose (1970, Chapters V and VI, with particular reference to VI.3), who had deduced that the Boltzmann entropy could go down on average, but only during macroscopically indeterministic processes. This reduction was bounded, such that, if a macroscopically indeterministic process started in a determinate macrostate of Boltzmann entropy S_0, and produced distinct macrostates of Boltzmann entropy S_i, with probability p_i, then S_0 − ⟨S_i⟩ ≤ −k⟨ln p_i⟩. He further argued this meant that a process which started with distinct macrostates of Boltzmann entropy S_i, each occurring with probability p_i, could not end up in a macrostate with Boltzmann entropy S_f with certainty, unless S_f − ⟨S_i⟩ ≥ −k⟨ln p_i⟩ as, otherwise, the two processes in succession could lead to a macroscopically deterministic process with decreasing Boltzmann entropy (S_f < S_0).
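Penrose's bound can be checked numerically for the simplest case, a two-outcome indeterministic process with equal probabilities. Entropies are in units of k, and S_0 is set to zero since only entropy differences matter; this is a sketch of the inequality, not a derivation:

```python
import math

# Work in units of Boltzmann's constant k; set S0 = 0 for the
# initial macrostate (only entropy differences matter).
S0 = 0.0

# Two possible outcome macrostates, each with probability 1/2, and
# Boltzmann entropy reduced by k ln 2 relative to the initial state.
p = [0.5, 0.5]
S = [S0 - math.log(2), S0 - math.log(2)]

mean_S = sum(pi * Si for pi, Si in zip(p, S))
lhs = S0 - mean_S                            # average entropy reduction
rhs = -sum(pi * math.log(pi) for pi in p)    # -k <ln p_i>

print(f"S0 - <Si>  = {lhs:.4f} k")
print(f"-k<ln pi>  = {rhs:.4f} k")
print("bound satisfied:", lhs <= rhs + 1e-12)
```

Both sides equal k ln 2 here, so this process saturates the bound: the partition insertion achieves the largest average entropy reduction its outcome probabilities allow.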

Penrose then directly applied this to the problem of the Szilard engine. The insertion of the partition corresponds to a macroscopically indeterministic process. After the insertion the molecule is in one of two possible macrostates, in either case reducing the Boltzmann entropy by k ln 2. However, this reduction cannot be directly exploited to extract kT ln 2 of heat and leave the molecule in a macrostate occupying the entire box again. If there were a process which could directly exploit this reduction, it could combine with the insertion of the partition to give a macroscopically deterministic process that reduces the Boltzmann entropy (via the extraction of heat from the heat bath). Penrose had already argued, from first principles, that a macroscopically deterministic process cannot reduce the Boltzmann entropy.

He then considered adding a demon, and applied the same argument to the combined system of engine and demon. Now he concluded that work can be extracted, leaving the molecule in a macrostate occupying the entire box, but only by leaving the demon indeterministically in one of a number of macrostates. The additional statistical entropy of the probability distribution over the demon's macrostates compensates for the entropy reduction in the heat bath. Eliminating the statistical distribution over the demon's states is possible only at the cost of an entropy increase in the heat bath.

Penrose concluded that the demon needs to measure the location of the molecule, store that location in its memory, and can then extract the work. At the end of the operation, the demon retains the memory of where the molecule was located, maintaining the indeterministic outcome. He considered a demon repeating this process many times, gradually filling its memory up with the outcomes of previous measurements. If the demon's memory is finite, eventually it will run out of space, leading to one of two possibilities: either the demon ceases to be able to operate, or the demon must reset some of its memory back to zero, which requires a corresponding increase in Boltzmann entropy elsewhere.

Penrose's solution was effectively rediscovered independently by Bennett (1982), after Bennett had demonstrated that logically reversible computation could avoid the necessity of storing large quantities of additional information (Bennett 1973). Bennett presented an explicit physical model for reversible measurement. In logical terms, he represented a measurement by the state of a measuring system becoming correlated to the state of the measured system. The measuring system starts in a fixed state (which can be taken to be logical state zero) and moves, through a correlated interaction, into a copy of the logical state, zero or one, of the system being measured. For (measuring system, measured system), a combined input logical state of (zero, zero) is left unaffected, with the output logical state (zero, zero), while a combined input logical state of (zero, one) becomes the output logical state (one, one). As the combined output state can be used to uniquely identify the combined input state, the operation is logically reversible.
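The logical contrast between measurement and reset can be captured in a few lines. The dictionaries are simply truth tables for the two maps described above, with the measurement map restricted to the inputs that actually occur (measuring system initialised to zero):

```python
# Bennett's correlating "measurement" on (measuring system, measured system):
# (0, 0) -> (0, 0) and (0, 1) -> (1, 1).
measure = {(0, 0): (0, 0), (0, 1): (1, 1)}

# Every output identifies a unique input, so the map is invertible on
# its domain: logically reversible.
assert len(set(measure.values())) == len(measure)

# Reset-to-zero on the measuring system alone: 0 -> 0 and 1 -> 0.
reset = {0: 0, 1: 0}

# Two inputs share one output, so the input cannot be recovered from
# the output: logically irreversible.
assert len(set(reset.values())) < len(reset)

print("measurement reversible, reset irreversible")
```

The same counting argument is what drives Landauer's principle: the reset map merges two logical states into one, while the measurement map merely permutes the states that can actually occur.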

For the physical process, Bennett made the molecule in the Szilard engine diamagnetic and the measuring device a one-domain ferromagnet with a fixed initial polarisation. With careful manipulation, he argued, it is possible to use the perturbation of the magnetic field by the diamagnet to correlate the polarisation of the ferromagnet non-dissipatively to the diamagnet's location on one side or other of the engine. By the same kind of manipulation, he argued that to reset the polarisation of the ferromagnet to its initial state without using the correlated location of the diamagnet, there is a cost of kT ln 2 in heat generation, in accordance with Landauer's principle. Extracting work from the Szilard engine allows the diamagnetic molecule to move freely throughout the engine, so losing the correlation. The heat-generating reset operation is then the only way to restore the ferromagnet to its initial polarisation.

With the ferromagnet representing the demon's memory, Bennett had given an explicit counter-example to the arguments of Szilard and Brillouin that measurement is necessarily dissipative. At the same time he argued that the process of resetting the demon's memory is a necessarily logically irreversible step, which incurs the heat generation cost identified by Landauer, and that this cost saves a modified form of the second law. Although Bennett gave a particular model for resetting the demon's memory, no direct proof was given that there can be no better way. Unlike Penrose's argument, Landauer's principle seems to be assumed, rather than derived.

Supporters of this resolution argue that the demonless engines of Popper et al. are also examples of this kind of process. In each case of a demonless engine, there is another mechanism that is left in one of two distinct possible states after the work is extracted, depending upon which side of the box the molecule was located. In the case of the Popper-Szilard engine, the partition is on the left or right hand side and a different weight has been lifted. This state still encodes the information about which outcome occurred. To truly have completed a cycle, reliably, it is necessary to restore the partition to the centre of the box, regardless of the side to which it moved, and to have a definite, single weight raised. If Landauer's principle is correct, this is a resetting operation and necessarily incurs a cost that, on average, offsets the work extracted from the operation of the engine.

3.4 Algorithmic complexity

The Landauer-Penrose-Bennett resolution depends for its consistency on continuing to take into account a probability distribution over the possible outcomes of the measurement, after the measurement has occurred. There are many reasons why one might feel that this raises problems. For example, a Boltzmannian approach to statistical mechanics takes the entropy of a system to depend only upon a confined region of the microstate space, usually either that of microstates compatible with a given macrostate, or those accessible to the microstate over time. Probability distributions over inaccessible or macroscopically distinct regions should not, from this point of view, have thermodynamic significance. From an opposite point of view, a subjective Bayesian might argue that probability distributions are only expressions of an agent's uncertainty. If the intelligent demon is supposed to be the agent, it can hardly be uncertain of its own state once it has performed the measurement.

Zurek (1989a; 1989b) proposed a development of the Landauer-Penrose-Bennett position, building on a suggestion of Bennett's, that would attempt to address some of these concerns, and also the question of whether a sufficiently cleverly programmed computer might be able to do better than Landauer's principle would suggest. If a demon performed a large number of cycles without erasing its memory, as Penrose suggested, then its memory would contain a randomly generated bit string. Zurek's suggestion was that the Boltzmannian entropy of the physical states representing the bit string needed to have added to it the algorithmic complexity of the bit string itself, and that it is this total entropy that is the subject of a modified second law.

The algorithmic complexity of a bit string is a measure of how much a given bit string can be compressed by a computer. A long, but algorithmically simple, bit string can be compressed into a much shorter bit string. This shorter bit string could apparently be reset to zero, by resetting each individual bit, for a much lower cost than resetting each individual bit in the original longer bit string. A clever demon might, therefore, be able to compress its memory and so avoid the full erasure cost that would be needed to ensure a modified second law holds.

However, long randomly generated bit strings are, with high probability, incompressible, so unless the demon were very lucky, this could not work. Furthermore, the average algorithmic complexity of a statistical mixture of bit strings is not less than the Shannon information of the statistical distribution. It can be shown, therefore, that even if the best compression algorithms are available, the clever demon cannot do better, on average, than simply resetting each bit individually in its memory.
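The contrast between lucky and typical demons can be illustrated with an ordinary compressor. Here zlib is only a crude stand-in for ideal algorithmic compression (true algorithmic complexity is uncomputable), and the memory size is an arbitrary choice:

```python
import random
import zlib

random.seed(0)
n = 4096  # demon memory: n bytes of recorded outcomes (arbitrary size)

# A "lucky" demon whose memory happens to be algorithmically simple.
simple = bytes(n)  # all outcomes identical

# A typical demon: memory filled by unbiased coin flips.
typical = bytes(random.getrandbits(8) for _ in range(n))

len_simple = len(zlib.compress(simple, 9))
len_typical = len(zlib.compress(typical, 9))

print("simple memory compresses to :", len_simple, "bytes")
print("typical memory compresses to:", len_typical, "bytes")
```

The all-zero memory shrinks to a handful of bytes, while the randomly generated memory comes out essentially as large as it went in: compression offers the typical demon no escape from the erasure cost.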

Zurek's proposed total entropy, given by the sum of the Boltzmannian entropy and the algorithmic complexity, is a property of the individual microstates, rather than of the probability distribution over the outcomes. It can decrease during macroscopically indeterministic processes, if a particularly algorithmically simple sequence of outcomes happens to occur. However, unlike the Boltzmann entropy, even during macroscopically indeterministic processes it does not decrease on average.

3.5 Sound vs. profound dilemma

Earman and Norton (1999) presented a dilemma for all attempts to exorcise the Szilard engine by information theoretic arguments. Would-be exorcists who want to use information theory to save the second law must, they urge, choose between the “sound” and “profound” horns of a dilemma.

The sound horn proposes a resolution in which the demon, as well as the system, is assumed to be a ‘canonical thermal system’ and so subject to a, possibly modified, form of the second law. By imposing some form of the second law, consideration of what information processing a demon may need to undertake would lead to a conclusion as to what the thermodynamic cost of that information processing must be. This approach led Szilard to conclude that information acquisition had an intrinsic thermodynamic cost. It is possible to read some papers on the use of Landauer's principle in the Szilard engine in a similar way: the justification for believing that the resetting operation must carry a thermodynamic cost is that, otherwise, even a modified second law would be false.

This kind of eliminative argument has significant weaknesses. All possible sources of entropy increase are eliminated and it is concluded that whatever is left must be responsible for any remaining entropy increase. It is hard to see how it can be considered an explanation of why demons do not exist, as by imposing the second law in the first place, one has simply ruled out their possible existence. At most it can provide a demonstration of the consistency of assuming that some particular operation has a thermodynamic cost attached to it. It is also hard to see what confidence one can have in the conclusions of such an argument. Szilard, Gabor and Brillouin thought that such arguments led to the conclusion that information acquisition had an intrinsic cost. By contrast, Landauer, Penrose and Bennett concluded that information acquisition does not have an intrinsic cost, but information erasure does. What are the grounds for supposing Landauer's principle is any more secure than Szilard's argument?

A sound resolution proceeds by assuming a second law to be true, and deducing the consequences for information processing. The other horn of the dilemma, a profound resolution, introduces Landauer's principle (or some other information theoretic principle) as an independent axiom, one that cannot be derived within statistical mechanics. By adding this additional axiom, a modified second law may be derived, and the absence of untamed demons deduced. This in itself raises questions. It would suggest that statistical mechanics is incomplete without being supplemented with this new principle of information processing. If Landauer's principle is genuinely the reason the Szilard engine fails, a profound resolution implies that without it, statistical mechanics would allow unconstrained violations of the second law. Again, if we have no other independent grounds for believing in Landauer's principle, then how can we be confident that a clever device cannot be constructed to produce such an unconstrained violation?

3.6 Demons exist

There is, of course, another response to the Szilard engine, and Maxwell's demon in general. It is to argue that these demons do, in fact, exist. This has been suggested in recent years by Albert (2000) and developed by Hemmo and Shenker (2007). The principal argument depends on the observation that, in macroscopically indeterministic processes, the Boltzmann entropy can go down. Adopting a Boltzmannian point of view, Albert argued that the Boltzmannian entropy of the macrostate that the system inhabits is the only meaningful measure of thermodynamic entropy. The fact that in indeterministic processes it can be made to go down systematically is therefore just a fact of the world. In the Szilard engine, inserting the partition in the centre of the box is just such an indeterministic process, one that reduces the Boltzmann entropy regardless of the side on which the molecule is located. As the demonless engines show, extracting the work from this reduction is possible without the intervention of intelligence.

Hemmo and Shenker considered whether this work extraction requires a record to remain of the initial location of the molecule, as supporters of the Landauer-Penrose-Bennett resolution would suggest. They argue that this is not necessary. Instead they perform an operation equivalent to ‘erasure-by-destruction’ on the auxiliary system. This destroys any remaining macroscopic trace of the information about where the molecule was located, without incurring any thermodynamic cost (see also Maroney (2005), who considered similar processes from within a Gibbsian framework), but leaves the auxiliary in one of several low Boltzmann entropy macrostates.

However, it seems also clearly acknowledged in these arguments that the reduction can only be achieved through macroscopically indeterministic processes. The system and auxiliary cannot both be restored to their initial macrostate with certainty, without heat being generated. It would appear that the violation is still constrained and the demon may be tame. Albert, at least, appeared to acknowledge this, and the corresponding possibility that a modified second law may still be possible.

Finally, it may be noted that it is possible to conceive of untamed Maxwell's demons, producing unconstrained violations of the second law, by simply modifying the microscopic laws in order to make it so. Zhang and Zhang (1992) provided an example of this, replacing the partition in the centre of Maxwell's box with a velocity-dependent potential capable of creating a pressure difference in the gas. This potential is non-Hamiltonian, so Liouville's theorem does not hold. The phase volume of the macrostate of the gas can then be compressed, and both the Boltzmann and Gibbs entropies can be systematically reduced, even in macroscopically deterministic processes. Such examples, while not necessarily being presented as candidates for demons that can be constructed in the real world, are nevertheless useful in clarifying what attendant assumptions are being made when exorcisms are proposed.

4. Information Processing Using Statistical Mechanics

In his earliest paper, Landauer derived the cost of performing the reset operation by simply assuming that there is an equivalence between the Shannon information of a distribution of logical states and the thermodynamic entropy of a physical system that can represent those logical states. A reduction in the Shannon information content of the logical states would then reduce the thermodynamic entropy of the physical system. Landauer then further assumed that the second law holds true and that this reduction must produce a thermodynamic entropy increase elsewhere. Viewed in this way, his argument appears to contain elements of both the profound and the sound horns of Earman and Norton's (1998, 1999) dilemma. It follows the profound horn in identifying Shannon information with thermodynamic entropy, and the sound horn in assuming the validity of the second law.

It is less clear that Landauer himself thought he was introducing a new principle. It seems more plausible that he was taking for granted the Gibbsian approach to statistical mechanics, which identified −Σ p ln p as the statistical mechanical entropy long before Shannon's work, together with the non-decrease of this entropy through coarse-graining. As such, the validity of Landauer's principle would still require the structure of Gibbsian statistical mechanics to be self-consistent and an appropriate representation of thermal systems. At the very least this cannot be taken for granted unless it has already been established that untamed demons do not exist, and so the unquestioned use of Landauer's principle in exorcisms of the demon would still appear to be circular.

Nevertheless, this cannot be taken to imply that analysing the thermodynamics of computation is an altogether pointless task. The question of whether or not one can deduce the thermodynamic consequences of physically implementing logical operations from the structure of the logical operation itself may be a well-founded question (although it clearly cannot be wholly divorced from what fundamental physics says about the construction of such devices). Furthermore, were it to be shown that Landauer's principle was not correct, and that information could be reset at arbitrarily low cost, it would seem that either untamed demons must be possible or some further source of heat generation must be discovered in the operation of Szilard's engine.

At one extreme, von Neumann argued that any logical operation necessarily generated a minimum quantity of heat. Gabor and Brillouin had argued that measurement, at least, generated heat. Landauer and Bennett argued that only logically irreversible operations need generate heat, but that measurement is logically reversible. This was backed up by the presentation of physical processes that could, in principle, perform a measurement with arbitrarily little heat generation. Bennett argued that while the measurement processes discussed by Gabor and Brillouin, using scattered light, generate heat, they were incorrect to generalise this to the claim that all measurement processes must necessarily do the same. However, a similar charge may be levelled at Landauer's principle. Although explicit models for resetting operations have been constructed that get arbitrarily close to a minimum heat generation of kT ln 2, what is needed is a proof that no physical process can do better. A literature now exists examining the strength of this claim, and in particular what physical and statistical assumptions underpin such arguments.

4.1 Liouvillean proofs

Schematically, the simplest of the arguments in favour of Landauer's principle is based directly on phase space volume arguments and Liouville's theorem.

Volume of phase space diagram

The microscopic state space of the system and environment is assumed to be divided into logical, or information bearing, degrees of freedom and environmental, or non-information bearing, degrees of freedom. These can be simplified to just two dimensions. In the figure, the horizontal axis represents the information bearing degrees of freedom. The system being in logical state zero or logical state one corresponds to the microscopic state lying within a particular region of the information bearing degree of freedom. The reset-to-zero operation is required to be a Hamiltonian evolution of the state space of the system and environment that leaves the system in logical state zero, regardless of the initial logical state.

As the region of the state space available to the logical degrees of freedom is reduced by a factor of two, Liouville's theorem requires that the region of state space available to the environmental degrees of freedom must have doubled. It is then argued that this must involve heat generation in the environment. This last step requires a little justification. In the Boltzmannian approach, the temperature of a macrostate is generally defined by the formula

(∂S_B/∂E)_V = 1/T.

If this system is very large, it is assumed that its temperature undergoes negligible changes from the absorption of small quantities of heat. It follows that if the physical volume is kept fixed, so that the only change in energy is through heating, then the heat and entropy are related by ΔQ = T ΔS_B. Systems that satisfy these conditions may be considered to be good heat baths. A doubling in phase volume of the heat bath implies an increase in Boltzmann entropy of k ln 2, and so requires heat absorption of kT ln 2.
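These conditions can be checked numerically for a concrete bath. The temperature and particle number below are assumptions for the sketch, with the bath modelled as a monatomic ideal gas:

```python
import math

k = 1.380649e-23   # Boltzmann constant, J/K
T = 300.0          # assumed bath temperature, K
N = 1e20           # assumed number of bath particles (monatomic ideal gas)

# Doubling the bath's phase volume raises its Boltzmann entropy by
# k ln 2, so at fixed volume the heat absorbed is dQ = T * dS_B.
dQ = T * (k * math.log(2))
print(f"heat absorbed: {dQ:.3e} J")

# Check the 'good heat bath' condition: the temperature barely moves.
C_V = 1.5 * N * k          # heat capacity of the bath at constant volume
dT = dQ / C_V
print(f"temperature change: {dT:.3e} K  (fraction {dT / T:.1e} of T)")
```

The absorbed heat is of order 10⁻²¹ J while the fractional temperature change is smaller than 10⁻²⁰, which is the sense in which the bath's temperature can be treated as constant during the reset.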

Although this simple argument seems compelling, doubts may easily be raised. The representation of the logical states and the environment is so simplified that it should at least raise concerns as to what may have been lost in this simplification. Shenker (2000) and Hemmo and Shenker (2007) raise a number of such issues, as part of their arguments over the existence of a demon. There seems no reason to require the environment to absorb heat, or even to be in a macrostate which has twice the phase volume. Instead, it could be in one of two distinct regions, corresponding to macroscopically distinct, but otherwise quite unimportant, degrees of freedom (such as the location of a shoe dropped on the floor). All that is required is that the environmental degrees of freedom must end in one of a number of distinct macrostates, whose total phase volume over all possible macrostates must be twice that of the original macrostate. The use of an ‘erasure by destruction’ operation can then ensure that the specific macrostates bear no trace of the original information.

However, as noted above in Section 3.6, Hemmo and Shenker acknowledge that some tighter formulations of Landauer's principle exist that prevent this kind of move (although they question the reasonableness of these formulations): if one requires that the resetting operation leaves unchanged the macrostate of any environmental degree of freedom that is not a heat bath, then the Liouvillean argument seems to hold.

4.2 Gibbsian proofs

The earliest attempts to construct general, rigorous proofs of Landauer's principle, from within statistical mechanics, were by Shizume (1995) and Piechocinska (2000). (Although Penrose (1970, Chapter VI) does, in effect, derive an entropic cost for the reset operation, it appears his work was motivated quite independently of Landauer's). Shizume's proof relies upon treating thermal contact as a Gaussian white noise field in a Langevin equation, while Piechocinska assumes Hamiltonian dynamics and a heat bath that is initially uncorrelated with the system and represented by a Gibbs canonical distribution.

These proofs, and those that follow on from them, are in the tradition of making standard assumptions that are generally accepted within the context of practical calculations in statistical mechanics. It is taken for granted that a probability distribution over at least some region of state space is a meaningful characterisation of the state of a system, and further constraints are usually assumed about what kind of probability distribution this could be. From the point of view of investigating questions about the fundamental consistency of statistical mechanics, this may appear unsatisfactory. If the aim is more modest, of simply investigating the generality of the circumstances under which Landauer's principle might hold, such proofs may still be seen to be of value. It is important to distinguish, for example, the rather hard problem of explaining why thermalised systems should be represented by a canonical distribution over the accessible state space from the somewhat easier task of deriving the consequences of the empirically justified observation that thermalised systems are well represented by a canonical distribution over the accessible state space.

Piechocinska provided proofs for both the classical and quantum cases. A brief sketch of Piechocinska's quantum proof is possible. First simply define a function Γ(i, f, m, n) = ln p_i − ln p_f + β(E_n − E_m), where p_i is the occupation probability of an initial state i of the system, p_f the occupation probability of a final state f, β = 1/kT the dispersion parameter of the canonical distribution of the heat bath, and E_m and E_n the energies of the initial and final eigenstates of the heat bath. No particular physical significance is attached to this function. However, from the requirement that the overall evolution be unitary, it can be shown that ⟨e^{−Γ}⟩ = 1, and so by convexity ⟨Γ⟩ ≥ 0. The mean heat generated in the heat bath can then easily be shown to satisfy ⟨Q⟩ ≥ −ΔH kT ln 2, where ΔH = ∑_i p_i log p_i − ∑_f p_f log p_f is the change in Shannon information over the states of the system during the operation. If there are assumed to be two equiprobable input states, the requirement that the operation be a resetting operation and leave the system determinately in a specific output state leads to ΔH = −1, and the usual expression of Landauer's principle, ⟨Q⟩ ≥ kT ln 2, is obtained.
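The unitarity identity at the heart of this sketch can be checked numerically. The toy calculation below is an illustration, not Piechocinska's own model: the two-state "system", the four-level "bath" and the permutation standing in for the unitary evolution are all assumptions made here for the sake of a self-contained check that ⟨e^{−Γ}⟩ = 1, from which ⟨Γ⟩ ≥ 0 follows by convexity.

```python
import math
import random

# Toy check of <exp(-Gamma)> = 1.  The two-state "system", four-level
# "bath" and the permutation standing in for the unitary are illustrative
# assumptions, not Piechocinska's own model.

random.seed(1)
beta = 1.0                               # inverse temperature of the bath
p_sys = [0.5, 0.5]                       # equiprobable input logical states
E_bath = [0.0, 0.7, 1.3, 2.0]            # bath eigenstate energies
Z = sum(math.exp(-beta * E) for E in E_bath)
p_bath = [math.exp(-beta * E) / Z for E in E_bath]

# Any permutation of the joint (system, bath) basis states is unitary.
joint = [(i, m) for i in range(2) for m in range(len(E_bath))]
image = joint[:]
random.shuffle(image)
U = dict(zip(joint, image))              # (i, m) -> (f, n)

# Final marginal occupation probabilities p_f of the system states.
p_fin = [0.0, 0.0]
for (i, m), (f, n) in U.items():
    p_fin[f] += p_sys[i] * p_bath[m]

avg_exp, avg_Gamma = 0.0, 0.0
for (i, m), (f, n) in U.items():
    w = p_sys[i] * p_bath[m]             # probability of this trajectory
    Gamma = (math.log(p_sys[i]) - math.log(p_fin[f])
             + beta * (E_bath[n] - E_bath[m]))
    avg_exp += w * math.exp(-Gamma)
    avg_Gamma += w * Gamma

print(round(avg_exp, 10))                # 1.0, whichever permutation is drawn
print(avg_Gamma >= 0.0)                  # True, by convexity of exp
```

The identity holds exactly for any choice of permutation, because the sum collapses to (∑_f p_f)(∑_n e^{−βE_n}/Z) = 1; the bound ⟨Γ⟩ ≥ 0 then carries no dependence on the details of the dynamics.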

This considers only pure quantum states to represent logical states, although in the classical case her proof allows logical states to be represented by macrostates. Maroney (2009) generalised the method further, to input and output states represented by macrostates with variable mean internal energies, entropies and temperatures, and considered the consequences for more general logical operations than the reset operation. Turgut (2009), building on Penrose's work, derived similar—though apparently more powerful—results using the Gibbs microcanonical distribution approach.

4.3 Phenomenological proofs

Groisman, Ladyman, Short and Presnell (2007) presented a phenomenological argument in favour of the proposition that all logically irreversible operations are thermodynamically irreversible. In this they were responding in part to Norton's (2005) criticism of the generality of previous proofs of Landauer's principle (including Piechocinska's), and in part to Maroney's (2005) argument that the equality in Piechocinska's proof is reachable, and that when reached the heat is generated in a thermodynamically reversible fashion.

Their intention seems to have been to provide a general proof that holds independently of any particular physical model of the resetting process. To achieve this without presuming anything about the physics governing a device that performs the resetting, they explicitly accepted the sound approach to Earman and Norton's Sound vs. Profound dilemma. This means they assumed that a particular modified form of the second law of thermodynamics must hold true. The modified version that they adopted is that there are no cyclic processes whose sole result is to restore a system to its initial state while having a strictly positive expectation value for the conversion of heat into work.

Their method was, in essence, equivalent to that of Szilard and Bennett. They considered the operation of the Szilard engine, using a device to measure and store the location of the molecule, and a correlated operation to extract heat from it. Then they reset the device. Unlike Bennett, they did not assume that resetting has a cost and argue that this cost saves the second law. Rather, like Szilard, they imposed a modified form of the second law, and from it deduced that resetting has a cost. The generality of their claim is based upon the fact that they made no attempt to characterise the device physically, although they did assume it was capable of performing non-dissipative measurements.

4.4 Counter arguments and examples

Early criticisms of Landauer's principle tended to focus on the claim that logically reversible operations could be implemented in a thermodynamically reversible fashion, and instead to defend the position of von Neumann, Gabor and Brillouin (see the references in Landauer's response (Porod 1988) and in Bennett (2003) for examples of this debate). The increasingly explicit models developed by workers such as Bennett, Fredkin and Toffoli are now generally considered to have established that thermodynamically reversible computation is indeed possible, if implemented by logically reversible computers.

Recent criticisms (Shenker 2000 [in Other Internet Resources], Norton 2005) focussed upon whether heat generation must necessarily be associated with logically irreversible operations. A particular objection raised by Shenker, and a similar objection by Norton, concerned the use of probability distributions over macroscopically distinct states. Both Shenker and Norton argued that the thermodynamic entropy of a system is only properly defined by considering the region of state space accessible to the given microstate. For physical representations of logical states it is essential that the physical system cannot jump from one logical state to another. Consequently, regions of state space corresponding to different logical states are not accessible to one another, or as Shenker referred to it, they are not interaccessible. In Landauer's original paper, he simply identified the p ln p term of the probability distribution over the distinct logical states with thermodynamic entropy. More developed Gibbsian approaches still calculate the Gibbs entropy of a statistical distribution over regions of state space which are not interaccessible. Norton argued in detail that this is an illegitimate calculation, and that the resulting entropy has nothing to do with thermodynamic entropy.

The objection extends far beyond the scope of Landauer's principle. It shares much with general Boltzmannian arguments against the Gibbs approach to statistical mechanics. However, it is questionable whether proofs following the approach of Piechocinska are genuinely vulnerable to the accusation. Piechocinska did not, in fact, attribute a thermodynamic entropy to the Gibbs entropy measure (entropy was barely even mentioned in her paper) nor, contrary to some criticisms, did she assume a specific model of how the reset operation is performed, at least in the quantum case. The assumption that generates the logarithmic form of Landauer's principle is that heat baths are initially canonically distributed (an assumption which Norton, at least, appeared willing to concede), combined with the requirement that a single Hamiltonian (or unitary, in the case of quantum theory) dynamics is used to describe the combined evolution of the system and environment, independently of the input logical state. Similar comments may be made about Maroney (2009), while Turgut (2009) managed to go further and deduce a stronger constraint, which implies the usual version of Landauer's principle, but without even needing to consider probability distributions over the input logical states.

A more concrete counterexample is that of Allahverdyan and Nieuwenhuizen (2001), who argued it is possible to effect an erasure process with a lower cost than Landauer's principle suggests, but only in the low temperature quantum regime. A typical assumption in statistical mechanical proofs is that the interaction energy between a system and a heat bath can be treated as negligible (at least before the start and after the end of the process). Allahverdyan and Nieuwenhuizen's example exploits this by considering situations where the temperature is sufficiently low that the interaction energy can no longer be treated as negligible when compared to kT. In this situation the heat bath can no longer be treated as a canonical distribution, independently of the system, and the standard proofs do not hold. It should be noted that while Zhang and Zhang's non-Hamiltonian demon is not considered actually physically possible, Allahverdyan and Nieuwenhuizen explicitly claimed that in the low temperature regime, Landauer's principle can, in fact, be broken.

5. The Role of Quantum Theory

The relationship between quantum theory, measurement and irreversibility is a complex one, and considering the effect of quantum theory on the argument has led to some surprising claims being made.

Von Neumann explicitly referred to Szilard's argument in (von Neumann 1932, Chapter V.2), when discussing the irreversibility of wavefunction collapse on measurement, although it is unclear precisely what role this was intended to play. Both Gabor's and Brillouin's measurement procedures, using light, required the quantised treatment of the electromagnetic field to produce dissipation. Gabor stated quite clearly his belief that measurement by classical electromagnetic fields could take place non-dissipatively and would lead to unconstrained violations of the second law. The Landauer-Penrose-Bennett argument does not require measurements to generate heat, so for them, classical statistical mechanics need not lead to untamed demons. Although it might still be argued that quantum electrodynamics dissipates heat if used for measurement, the fact that some physical processes for measurement dissipate heat does not undermine Bennett's claim that other physical processes can perform measurements non-dissipatively.

Still, wavefunction collapse is frequently associated with thermodynamic irreversibility. It might seem there is a contradiction here, as measurement, in the Szilard engine, is supposed to lead to the possibility of a decrease in entropy, while measurement, in quantum theory, is supposed to increase entropy. Alternatively, it may be wondered if the entropy increase in wavefunction collapse offsets the reduction from measurement. These thoughts are largely a confusion which can be easily remedied. A projective, von Neumann, measurement will increase the Gibbs-von Neumann entropy of a density matrix unless the observable being measured commutes with the density matrix, in which case the measurement will leave it constant. This, however, applies to the density matrix that describes the statistical distribution over all the measurement outcomes. The subensemble density matrix that corresponds to a particular outcome having occurred can still have a lower Gibbs-von Neumann entropy than the ensemble over all measurement outcomes.
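Both entropy claims can be illustrated with a single qubit. In the sketch below, the density matrix and the measurement basis are arbitrary choices made here for illustration: a projective measurement in the computational basis deletes the off-diagonal terms, raising the Gibbs-von Neumann entropy of the ensemble density matrix, while the subensemble for a definite outcome is a pure state of zero entropy.

```python
import math

def entropy(eigenvalues):
    # Gibbs-von Neumann entropy S = -sum p ln p over the eigenvalues.
    return -sum(p * math.log(p) for p in eigenvalues if p > 1e-12)

# An arbitrary qubit density matrix rho = [[a, b], [b, 1 - a]] (real b != 0,
# so the measured observable does not commute with rho).
a, b = 0.7, 0.3

# Eigenvalues of a 2x2 Hermitian matrix with unit trace.
disc = math.sqrt(1.0 - 4.0 * (a * (1 - a) - b * b))
S_before = entropy([(1 + disc) / 2, (1 - disc) / 2])

# A projective measurement in the computational basis erases the
# off-diagonal terms; diag(a, 1 - a) describes the statistical
# distribution over all measurement outcomes.
S_ensemble = entropy([a, 1 - a])

# The subensemble for a particular outcome is the pure state |0><0|.
S_subensemble = entropy([1.0, 0.0])

print(S_ensemble >= S_before)        # True: measurement cannot lower it
print(S_subensemble < S_ensemble)    # True: a definite outcome has less
```

Setting b = 0 (the commuting case) makes S_ensemble equal to S_before, matching the exception noted above.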

Zurek (1986) is the most prominent example of attempts to relate quantum measurement to the Szilard engine. Zurek suggested that a molecule being in a superposition of being on either side of the partition, in the Szilard engine, is not subjective uncertainty as to which side the molecule is located. Instead it is an objective fact that the quantum state occupies both sides. Even with the partition present, the molecule occupies the entire box. Until a measurement is performed, collapsing the quantum state to one side or the other, work cannot be extracted. Zurek still regarded the resetting operation as the source of compensating entropy increase. This is justified as necessary on the basis that the demon, having performed the measurement and extracted the work, is now in a statistical mixture of having observed each of the possible outcomes. For the cost of resetting the demon, Zurek appealed directly to Landauer's principle.

The logic here is hard to follow. If there was something ambiguous or troubling, in the classical situation, about having to refer to subjective uncertainty over the location of the molecule while there is an objective fact on the matter, then it should be at least as troubling to have to refer to the statistical mixture of the demon's states when deciding to perform the resetting operation. Zurek did not seem to suggest that the demon is to be regarded as being in a superposition of measurement outcomes (if it were, then it could presumably be reset to a standard state with a lower cost). It seems assumed that there is a matter of fact about which outcome occurred, and therefore which state the demon is in. Also ambiguous is whether we are to understand the measurement as a non-unitary wavefunction collapse to one outcome or the other. If wavefunction collapse is to be regarded as a necessary component in understanding the absence of demons, where does this leave no-collapse interpretations?

It is also important to bear in mind that all attempts to derive Landauer's principle, to date, have been based upon classical information processing. While it would appear that a lower bound, very similar in form to Landauer's principle, can be derived for quantum computational operations, unlike the classical case there appears to be no proof as yet of the existence of processes that can in principle reach this bound. It remains possible, therefore, that quantum computations may need to incur additional thermodynamic costs. This appears to be true even for the quantum analog of logically reversible operations: Bennett's (1973) procedure for avoiding the cost of storing additional bits involves an operation which cannot in general be applied to quantum operations (Maroney 2004 [in Other Internet Resources]). Finally, as noted above, Allahverdyan and Nieuwenhuizen argued in the opposite direction, that the derivations of this lower bound involve assumptions which can be violated by quantum theory in the low temperature regime.

6. Discussion

Throughout the literature, there are widely different motivations and standards of proof accepted, so that at times it can be hard to see any coherent body of work. Maxwell's original intention was to demonstrate that the second law had only limited validity, and that violations were possible. Although he originally described the demon as a living being, he later reduced its role to that of a valve. Smoluchowski sought to tame the demon, by formulating a modified second law that it did not violate. His exorcism applied only to mechanical devices, and he left open the question of whether intelligent intervention could reverse the second law. As the literature developed, both the scope of the exorcism and the nature of the demon changed. The concept of intelligence became reduced to that of information processing performed by a mechanical device. Rather than investigate how a tame demon might be constrained by physical laws, the aim became to exclude demons altogether. Much of the resulting literature proceeds on a case-by-case basis, where individual examples of demons are argued to fail and it is simply extrapolated that all demons will fail for similar reasons.

Szilard's original intention was to analyse whether intelligence could be used to defeat the second law. His argument did not analyse what was meant by intelligence. He argued that any being, however intelligent, would still have to perform measurements. If these measurements came with a sufficient entropic cost, the second law would be safe without needing to consider further the constitution of an intelligent being. This clearly does not require intelligence to be reduced to performing measurements and processing the results of such measurements.

Under the analysis of von Neumann, Brillouin and Gabor, a more general conception developed of information processing having a fundamental entropic cost. The possession of information could indeed be regarded as reducing entropy and therefore capable of producing thermodynamic work. However, as the acquisition of information, through measurement, was still believed to require a dissipation of heat, there was no threat to a generalised form of the second law. No explicit consideration needed to be made regarding exactly what kind of being the possessor of information had to be.

Landauer's analysis led to the argument that measurement did not reduce the degrees of freedom needed to represent the combined system of object and measuring device, so did not lead to the necessity of heat generation. The models of measurement proposed by Szilard, Gabor and Brillouin were insufficiently general. Landauer's early papers did not explicitly consider the Szilard engine and the problem of Maxwell's demon, so they did not address what implications this had for the consistency of statistical mechanics. It appears that Landauer believed that logically reversible computation could not be achieved without accumulating a large amount of irrelevant information. That accumulation would ultimately require erasure, so that the cost of a logically irreversible computation could not be avoided. It took Bennett to demonstrate that this cost could be avoided in logically reversible computation.

Nevertheless, Bennett also argued that the Szilard engine was a process where the logically irreversible erasure step could not be avoided to complete the cycle. The information acquired during the non-dissipative measurement had to be stored in the state of the demon's memory. While this information allowed the demon to extract work from the engine, this memory could not be reset without incurring an erasure cost at least equal to the work extracted. Unlike the analysis from Szilard to Brillouin, the consistency of statistical mechanics now requires us to say at least something about the demon itself as a physical being, constrained by a particular kind of physical law. The demon may be able to perform measurements non-dissipatively, but its memory must be representable by a physical device, and this device must be included in the statistical mechanical description. This contrasts sharply with Brillouin. Brillouin characterised information as “bound” if it was embodied in the states of a physical device, but he explicitly stated that information contained only in the mind was “free”, not “bound”.

The relationship between entropy and information now becomes less clear. With the demon left out of the system, it is free to be considered an agent who has information about the system. The uncertainty in the description of the system may be thought of as the demon's lack of knowledge about the exact state of the system. If the demon has more information, the entropy of the system is less. However, once the demon can acquire information non-dissipatively, the entropy of the system goes down and the only compensation seems to be an increase in the uncertainty of the demon's state itself.

If the state of the demon's own mind must now be included in the system, with uncertainty over which state the demon is in, that raises the question: “Whose uncertainty?”. Perhaps it is the uncertainty the demon has at the start of the process about which state it will be in? Perhaps the statistical distribution should be abandoned, on the grounds that the demon is not uncertain about its own state, and that another property—such as the proposal of Zurek—has changed by a compensating amount? Perhaps the demon has been naturalised to the point that it no longer represents an intelligent agent, possessing and acting on information, and the uncertainty is now another, external agent's uncertainty? In the latter case it would appear any hope of using the Szilard engine to answer Smoluchowski's original concern, about intelligent intervention, has been lost entirely.

For Penrose and Bennett's resolution to succeed, it is necessary that the demon be conceived of as a particular type of physical system, as the physical constitution of the demon's memory must now be taken into account. If the demon is presumed to be representable by a Hamiltonian system, then it has been reduced to a mechanical device, not too dissimilar to Smoluchowski's trapdoor and spring. It is not so surprising that being a Hamiltonian system, and subject to Liouville's Theorem, might naturally lay some physical constraint upon a demon, rendering it tamable. However, the existence and exact nature of this constraint, and the possibility of a corresponding modified second law, remain open to debate and depend in large measure on the approach taken to understanding statistical mechanics.

In the dissipationless measurements of Landauer et al., measurement still requires the development of correlations between physical systems, through interaction. The demonless engines of Popper et al. indicate that the only physical aspect of this process that is relevant is precisely this correlated interaction, and that the question of whether the interaction is for the purpose of information gathering is not, in itself, of significance. While advocates of the information theoretic approach might argue that the correlation is information, the arguments of Maroney and of Shenker and Hemmo would appear to challenge whether maintaining this correlation is even necessary to understand the operation of the engine. It may be asked whether the characterisation of the correlation as “information” is actually playing any role, or whether this is just a trivial relabelling exercise. If the term information adds nothing to the understanding of statistical mechanics then it could be purged from the description (of course, this still leaves open the possibility that statistical mechanics has non-trivial implications for the physics of information processing).

Both the Szilard engine and Landauer's principle seem to raise a similar problem about the relationship between knowledge and thermodynamic entropy: if one could know which side of the engine the molecule was located, one could extract work; if one could know which logical state the device was in, one could set it to zero without work. Without this knowledge, it is necessary to design a process that acts independently of the specific state the system is in. But it is plain that this does not tell us that, even without the knowledge, it is impossible to design a clever process that can still extract the work from the engine without compensation, or a clever process that can still reset the bit without work. Hamiltonian mechanics and Liouville's Theorem seem to play a vital, if largely unnoticed, role. As Zhang and Zhang's demon demonstrates, unconstrained violations of the second law are clearly possible given non-Hamiltonian flows, and no appeal to information theory or computation would seem able to avoid this.

Bibliography

A comprehensive annotated bibliography of the subject, up until 2003, is included in the collection (Leff and Rex 2003), which also contains many key articles, mainly supporting the Landauer-Penrose-Bennett position. A detailed analysis of the history of Maxwell's demon, the Szilard engine and critiques of information theoretic exorcisms is in Earman and Norton (1998, 1999), with Norton (2005) providing a similar critique of Landauer's principle.

  • Albert, D.Z., 2001, Time and Chance, Cambridge, Massachusetts: Harvard University Press.
  • Allahverdyan, A.E. and T.M. Nieuwenhuizen, 2001, “Breakdown of the Landauer bound for information erasure in the quantum regime”, Physical Review E, 64: 056117.
  • Bennett, C.H., 1973, “Logical reversibility of computation”, IBM Journal of Research and Development, 17: 525–532.
  • Bennett, C.H., 1982, “The thermodynamics of computation—a review”, International Journal of Theoretical Physics, 21(12): 905–940.
  • Bennett, C.H., 2003, “Notes on Landauer's principle, reversible computation, and Maxwell's demon”, Studies in the History and Philosophy of Modern Physics, 34: 501–510.
  • Brillouin, L., 1951, “Maxwell's demon cannot operate: Information and entropy I”, Journal of Applied Physics, 22: 334–337.
  • Brillouin, L., 1956, Science and Information Theory, New York: Academic Press.
  • Earman, J. and J.D. Norton, 1998, “Exorcist XIV: The wrath of Maxwell's demon. Part I. From Maxwell to Szilard”, Studies in the History and Philosophy of Modern Physics, 29: 435–471.
  • Earman, J. and J.D. Norton, 1999, “Exorcist XIV: The wrath of Maxwell's demon. Part II. From Szilard to Landauer”, Studies in the History and Philosophy of Modern Physics, 30: 1–40.
  • Feyerabend, P.K., 1966, “On the possibility of a perpetuum mobile of the second kind”, in Mind, Matter and Method: Essays in Philosophy and Science in Honor of Herbert Feigel, P.K. Feyerabend and G. Maxwell (eds.), Minneapolis, Minnesota: University of Minnesota Press, pp. 409–412.
  • Gabor, D., 1964, “Light and Information”, Progress in Optics, 1: 111–153.
  • Groisman, B., J. Ladyman, S. Presnell, and T. Short, 2007, “The connection between logical and thermodynamic irreversibility”, Studies in the History and Philosophy of Modern Physics, 38: 58–79.
  • Landauer, R., 1961, “Irreversibility and heat generation in the computing process”, IBM Journal of Research and Development, 5: 183–191.
  • Leff, H.S. and A.F. Rex, 1990, Maxwell's Demon: Entropy, Information, Computing, Princeton, New Jersey: Princeton University Press.
  • Leff, H.S. and A.F. Rex, 2003, Maxwell's Demon 2: Entropy, Classical and Quantum Information, Computing, Philadelphia, Pennsylvania: Institute of Physics Publishing.
  • Maroney, O.J.E., 2005, “The (absence of a) relationship between thermodynamic and logical irreversibility”, Studies in the History and Philosophy of Modern Physics, 36: 355–374.
  • Maroney, O.J.E., 2009, “Generalising Landauer's principle”, Physical Review E, 79: 031105.
  • Maxwell, J.C., 1867, Letter to P.G. Tait, 11 December 1867, in Life and Scientific Work of Peter Guthrie Tait, C.G. Knott (author), Cambridge: Cambridge University Press, 1911, pp. 213–215.
  • Norton, J.D., 2005, “Eaters of the lotus: Landauer's principle and the return of Maxwell's demon”, Studies in the History and Philosophy of Modern Physics, 36: 375–411.
  • Penrose, O., 1970, Foundations of Statistical Mechanics, Oxford: Pergamon Press.
  • Piechocinska, B., 2000, “Information erasure”, Physical Review A, 61: 1–9.
  • Porod, W., 1988, “Comment on ‘Energy requirements in communication’”, Applied Physics Letters, 52: 2191.
  • Rothstein, J., 1951, “Information, measurement and quantum mechanics”, Science, 114: 171–175.
  • Shizume, K., 1995, “Heat generation required by erasure”, Physical Review E, 52: 3495–3499.
  • Smoluchowski, M. von, 1914, “Gültigkeitsgrenzen des zweiten Hauptsatzes der Wärmetheorie”, in Vorträge über die Kinetische Theorie der Materie und der Elektrizität, Leipzig: Teubner, pp. 89–121.
  • Szilard, L., 1929, “On the Decrease of Entropy in a Thermodynamic System by the Intervention of Intelligent Beings”, Zeitschrift für Physik, 53: 840–856. English translation in The Collected Works of Leo Szilard: Scientific Papers, B.T. Feld and G. Weiss Szilard (eds.), Cambridge, Massachusetts: MIT Press, 1972, pp. 103–129.
  • Turgut, S., 2009, “Relations between entropies produced in non-deterministic thermodynamic processes”, Physical Review E, 79: 041102.
  • Uffink, J., 2001, “Bluff your way in the second law of thermodynamics”, Studies in the History and Philosophy of Modern Physics, 32: 305–394.
  • Uffink, J., 2006, “Compendium of the foundations of classical statistical physics”, in Philosophy of Physics (Handbook of the Philosophy of Science), J. Butterfield and J. Earman (eds.), Amsterdam: North Holland, Part B, pp. 923–1074.
  • von Neumann, J., 1932, Mathematical Foundations of Quantum Mechanics, English translation, Princeton, New Jersey: Princeton University Press, 1955.
  • von Neumann, J., 1949, “The Role of High and of Extremely High Complication”, Fourth University of Illinois lecture, in Theory of Self-Reproducing Automata, A.W. Burks (ed.), Champaign, Illinois: University of Illinois Press, 1966, pp. 64–73.
  • Zhang, K. and K. Zhang, 1992, “Mechanical models of Maxwell's demon with noninvariant phase volume”, Physical Review A, 46: 4598–4605.
  • Zurek, W.H., 1986, “Maxwell's demon, Szilard's engine and quantum measurements”, in Frontiers of Nonequilibrium Statistical Physics, G.T. Moore and M.O. Scully (eds.), New York: Plenum Press, pp. 151–161.
  • Zurek, W.H., 1989a, “Algorithmic randomness and physical entropy”, Physical Review A, 40: 4731–4751.
  • Zurek, W.H., 1989b, “Thermodynamic cost of computation, algorithmic complexity and the information metric”, Nature, 347: 119–124.

Other Internet Resources

Acknowledgments

Many thanks to John Norton for his helpful suggestions and editing during the production of this article.

Copyright © 2009 by Owen Maroney <owen.maroney@philosophy.ox.ac.uk>
