This is an extensive review of recent work on the foundations of statistical mechanics.
Roughly speaking, classical statistical physics is the branch of theoretical physics that aims to account for the thermal behaviour of macroscopic bodies in terms of a classical mechanical model of their microscopic constituents, with the help of probabilistic assumptions. In the last century and a half, a fair number of approaches have been developed to meet this aim. This study of their foundations assesses their coherence and analyzes the motivations for their basic assumptions, and the interpretations of their central concepts. The most outstanding foundational problems are the explanation of time-asymmetry in thermal behaviour, the relative autonomy of thermal phenomena from their microscopic underpinning, and the meaning of probability. A more or less historical survey is given of the work of Maxwell, Boltzmann and Gibbs in statistical physics, and the problems and objections to which their work gave rise. Next, we review some modern approaches to (i) equilibrium statistical mechanics, such as ergodic theory and the theory of the thermodynamic limit; and to (ii) non-equilibrium statistical mechanics as provided by Lanford's work on the Boltzmann equation, the so-called Bogolyubov-Born-Green-Kirkwood-Yvon approach, and stochastic approaches such as 'coarse-graining' and the 'open systems' approach. In all cases, we focus on the subtle interplay between probabilistic assumptions, dynamical assumptions, initial conditions and other ingredients used in these approaches.
While the fundamental laws of physics are time-reversal invariant, most macroscopic processes are irreversible. Given that the fundamental laws are taken to underpin all other processes, how can the fundamental time-symmetry be reconciled with the asymmetry manifest elsewhere? In statistical mechanics, progress can be made with this question. What I dub the 'Zwanzig–Zeh–Wallace framework' can be used to construct the irreversible equations of SM from the underlying microdynamics. Yet this framework uses coarse-graining, a procedure that has faced much criticism. I focus on two objections in the literature: claims that coarse-graining makes time-asymmetry 'illusory' and 'anthropocentric'. I argue that these objections arise from an unsatisfactory justification of coarse-graining prevalent in the literature, rather than from coarse-graining itself. This justification relies on the idea of measurement imprecision. By considering the role that abstraction and autonomy play, I provide an alternative justification and offer replies to the illusory and anthropocentric objections. Finally, I consider the broader consequences of this alternative justification: the connection to debates about inter-theoretic reduction and the implication that the time-asymmetry in SM is weakly emergent. Contents: 1 Introduction; 1.1 Prospectus; 2 The Zwanzig–Zeh–Wallace Framework; 3 Why Does This Method Work?; 3.1 The special conditions account; 3.2 When is a density forwards-compatible?; 4 Anthropocentrism and Illusion: Two Objections; 4.1 The two objections in more detail; 4.2 Against the justification by measurement imprecision; 5 An Alternative Justification; 5.1 Abstraction and autonomy; 5.2 An illustration: the Game of Life; 6 Reply to Illusory; 7 Reply to Anthropocentric; 8 The Wider Landscape: Concluding Remarks; 8.1 Inter-theoretic relations; 8.2 The nature of irreversibility.
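As a rough schematic of the construction at issue (our own sketch, with notation assumed rather than taken from the paper: $P$ a linear coarse-graining projection on distributions and $U_t$ the reversible microdynamics):

```latex
\rho \;\longmapsto\; P\rho \quad\text{(coarse-graining)},
\qquad
P\,U_t\,\rho \;\approx\; C_t\,(P\rho) \quad\text{for } t>0 ,
```

where $C_t$ is an autonomous, typically irreversible, dynamics for the coarse-grained distribution; densities for which the approximation holds forwards but not backwards in time are, roughly, the 'forwards-compatible' ones asked about in Section 3.2.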
This chapter will review selected aspects of the terrain of discussions about probabilities in statistical mechanics (with no pretensions to exhaustiveness, though the major issues will be touched upon), and will argue for a number of claims. None of the claims to be defended is entirely original, but all deserve emphasis. The first, and least controversial, is that probabilistic notions are needed to make sense of statistical mechanics. The reason for this is the same reason that convinced Maxwell, Gibbs, and Boltzmann that probabilities would be needed, namely, that the second law of thermodynamics, which in its original formulation says that certain processes are impossible, must, on the kinetic theory, be replaced by a weaker formulation according to which what the original version deems impossible is merely improbable. The second is that we ought not to take the standard measures invoked in equilibrium statistical mechanics as giving, in any sense, the correct probabilities about microstates of the system. We can settle for a much weaker claim: that the probabilities for outcomes of experiments yielded by the standard distributions are effectively the same as those yielded by any distribution that we should take as representing probabilities over microstates. Lastly (and most controversially): in asking about the status of probabilities in statistical mechanics, the familiar dichotomy between epistemic probabilities (credences, or degrees of belief) and ontic (physical) probabilities is insufficient; the concept of probability that is best suited to the needs of statistical mechanics is one that combines epistemic and physical considerations.
Entropy is ubiquitous in physics, and it plays important roles in numerous other disciplines ranging from logic and statistics to biology and economics. However, a closer look reveals a complicated picture: entropy is defined differently in different contexts, and even within the same domain different notions of entropy are at work. Some of these are defined in terms of probabilities, others are not. The aim of this chapter is to arrive at an understanding of some of the most important notions of entropy and to clarify the relations between them. After setting the stage by introducing the thermodynamic entropy, we discuss notions of entropy in information theory, statistical mechanics, dynamical systems theory and fractal geometry.
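For quick reference, the main notions compared in such a survey are standardly defined as follows (textbook forms, not quoted from the chapter):

```latex
\mathrm{d}S = \frac{\delta Q_{\mathrm{rev}}}{T}
\quad\text{(thermodynamic/Clausius entropy)},
\qquad
S_B = k_B \ln W
\quad\text{(Boltzmann entropy, } W \text{ the number of compatible microstates)},

S_G = -k_B \int \rho(x)\,\ln\rho(x)\,\mathrm{d}x
\quad\text{(Gibbs entropy)},
\qquad
H(p) = -\sum_i p_i \log_2 p_i
\quad\text{(Shannon entropy)}.
```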
Theoretical explication of a growing body of empirical data on consciousness-related anomalous phenomena is unlikely to be achieved in terms of known physical processes. Rather, it will first be necessary to formulate the basic role of consciousness in the definition of reality before such anomalous experience can adequately be represented. This paper takes the position that reality is constituted only in the interaction of consciousness with its environment, and therefore that any scheme of conceptual organization developed to represent that reality must reflect the processes of consciousness as well as those of its environment. In this spirit, the concepts and formalisms of elementary quantum mechanics, as originally proposed to explain anomalous atomic-scale physical phenomena, are appropriated via metaphor to represent the general characteristics of consciousness interacting with any environment. More specifically, if consciousness is represented by a quantum mechanical wave function, and its environment by an appropriate potential profile, Schrödinger wave mechanics defines eigenfunctions and eigenvalues that can be associated with the cognitive and emotional experiences of that consciousness in that environment. To articulate this metaphor it is necessary to associate certain aspects of the formalism, such as the coordinate system, the quantum numbers, and even the metric itself, with various impressionistic descriptors of consciousness, such as its intensity, perspective, approach/avoidance attitude, balance between cognitive and emotional activity, and receptive/assertive disposition. With these established, a number of the generic features of quantum mechanics, such as the wave/particle duality, and the uncertainty, indistinguishability, and exclusion principles, display metaphoric relevance to familiar individual and collective experiences. Similarly, such traditional quantum theoretic exercises as the central force field and atomic structure, covalent molecular bonds, barrier penetration, and quantum statistical collective behavior become useful analogies for representation of a variety of consciousness experiences, both normal and anomalous, and for the design of experiments to study these systematically.
Or better: time asymmetry in thermodynamics. Better still: time asymmetry in thermodynamic phenomena. “Time in thermodynamics” misleadingly suggests that thermodynamics will tell us about the fundamental nature of time. But we don’t think that thermodynamics is a fundamental theory. It is a theory of macroscopic behavior, often called a “phenomenological science.” And to the extent that physics can tell us about the fundamental features of the world, including such things as the nature of time, we generally think that only fundamental physics can. On its own, a science like thermodynamics won’t be able to tell us about time per se. But the theory will have much to say about everyday processes that occur in time; and in particular, the apparent asymmetry of those processes. The pressing question of time in the context of thermodynamics is about the asymmetry of things in time, not the asymmetry of time, to paraphrase Price. I use the title anyway, to underscore what is, to my mind, the centrality of thermodynamics to any discussion of the nature of time and our experience in it. The two issues—the temporal features of processes in time, and the intrinsic structure of time itself—are related. Indeed, it is in part this relation that makes the question of time asymmetry in thermodynamics so interesting. This, plus the fact that thermodynamics describes a surprisingly wide range of our ordinary experience. We’ll return to this. First, we need to get the question of time asymmetry in thermodynamics out on the table.
I argue that the theory of chance proposed by David Lewis has three problems: (i) it is time asymmetric in a manner incompatible with some of the chance theories of physics, (ii) it is incompatible with statistical mechanical chances, and (iii) the content of Lewis's Principal Principle depends on how admissibility is cashed out, but there is no agreement as to what admissible evidence should be. I propose two modifications of Lewis's theory which resolve these difficulties. I conclude by tentatively proposing a third modification of Lewis's theory, one which explains many of the common features shared by the chance theories of physics.
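For orientation, the Principal Principle referred to here is standardly stated as a deference requirement on an initial credence function $Cr$ (a textbook rendering, not quoted from the paper):

```latex
Cr\!\left(A \,\middle|\, \mathrm{ch}_t(A)=x \;\wedge\; E\right) = x,
\qquad \text{provided } E \text{ is admissible at } t ,
```

and, as the abstract notes, what the principle amounts to turns on how 'admissible' is spelled out.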
There are two theoretical approaches in statistical mechanics, one associated with Boltzmann and the other with Gibbs. The theoretical apparatuses of the two approaches offer distinct descriptions of the same physical system, with no obvious way to translate the concepts of one formalism into those of the other. This raises the question of the status of one approach vis-à-vis the other. We answer this question by arguing that the Boltzmannian approach is a fundamental theory while Gibbsian statistical mechanics (GSM) is an effective theory, and we describe circumstances under which Gibbsian calculations coincide with the Boltzmannian results. We then point out that regarding GSM as an effective theory has important repercussions for a number of projects, in particular attempts to turn GSM into a nonequilibrium theory.
We often use symmetries to infer outcomes’ probabilities, as when we infer that each side of a fair coin is equally likely to come up on a given toss. Why are these inferences successful? I argue against answering this with an a priori indifference principle. Reasons to reject that principle are familiar, yet instructive. They point to a new, empirical explanation for the success of our probabilistic predictions. This has implications for indifference reasoning in general. I argue that a priori symmetries need never constrain our probability attributions, even when it comes to our initial credences.
The process of recognition or isolation of one or several entities from among many possible entities is termed intellego perception. It is shown that not only are many of our everyday percepts of this type, but perception of microscopic events using the methods of quantum mechanics is also intellego in nature. Information theory seems to be a natural language in which to express perceptual activity of this type. It is argued that the biological organism quantifies its sensations using an information theoretical measure. This, in turn, sets the stage for a mathematical theory of sensory perception.
A persistent question about the de Broglie–Bohm interpretation of quantum mechanics concerns the understanding of Born’s rule in the theory. Where do the quantum mechanical probabilities come from? How are they to be interpreted? These are the problems of emergence and interpretation. In more than 50 years no consensus regarding the answers has been achieved. Indeed, mirroring the foundational disputes in statistical mechanics, the answers to each question are surprisingly diverse. This paper is an opinionated survey of this literature. While acknowledging the pros and cons of various positions, it defends particular answers to how the probabilities emerge from Bohmian mechanics and how they ought to be interpreted.
The so-called ergodic hierarchy (EH) is a central part of ergodic theory. It is a hierarchy of properties that dynamical systems can possess. Its five levels are ergodicity, weak mixing, strong mixing, Kolmogorov, and Bernoulli. Although EH is a mathematical theory, its concepts have been widely used in the foundations of statistical physics, accounts of randomness, and discussions about the nature of chaos. We introduce EH and discuss its applications in these fields.
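For reference, the lowest levels of the hierarchy can be stated for a measure-preserving system $(X,\mu,T)$ as follows (standard definitions, not quoted from the article; the Kolmogorov and Bernoulli levels impose successively stronger conditions):

```latex
\text{Ergodicity:}\quad
\lim_{N\to\infty}\frac{1}{N}\sum_{n=0}^{N-1} f(T^n x) = \int_X f\,\mathrm{d}\mu
\quad\text{for } \mu\text{-almost every } x ;

\text{Weak mixing:}\quad
\lim_{N\to\infty}\frac{1}{N}\sum_{n=0}^{N-1}
\bigl|\mu(T^{-n}A\cap B)-\mu(A)\mu(B)\bigr| = 0 ;
\qquad
\text{Strong mixing:}\quad
\lim_{n\to\infty}\mu(T^{-n}A\cap B)=\mu(A)\,\mu(B).
```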
Gibbsian statistical mechanics (GSM) is the most widely used version of statistical mechanics among working physicists. Yet a closer look at GSM reveals that it is unclear what the theory actually says and how it bears on experimental practice. The root cause of the difficulties is the status of the Averaging Principle, the proposition that what we observe in an experiment is the ensemble average of a phase function. We review different stances toward this principle, and eventually present a coherent interpretation of GSM that provides an account of the status and scope of the principle.
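In symbols, the Averaging Principle identifies the outcome of a measurement of a phase function $f$ with its ensemble average (a standard rendering, with $\rho$ the ensemble distribution over phase space $\Gamma$; the notation is ours, not the article's):

```latex
f_{\mathrm{obs}} \;\approx\; \langle f\rangle_\rho
\;=\; \int_\Gamma f(x)\,\rho(x)\,\mathrm{d}x .
```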
I will argue, pace a great many of my contemporaries, that there's something right about Boltzmann's attempt to ground the second law of thermodynamics in a suitably amended deterministic time-reversal invariant classical dynamics, and that in order to appreciate what's right about (what was at least at one time) Boltzmann's explanatory project, one has to fully apprehend the nature of microphysical causal structure, time-reversal invariance, and the relationship between Boltzmann entropy and the work of Rudolf Clausius.
The thermodynamics of computation assumes that computational processes at the molecular level can be brought arbitrarily close to thermodynamic reversibility and that thermodynamic entropy creation is unavoidable only in data erasure or the merging of computational paths, in accord with Landauer’s principle. The no-go result shows that fluctuations preclude completion of thermodynamically reversible processes. Completion can be achieved only by irreversible processes that create thermodynamic entropy in excess of the Landauer limit.
This paper addresses the question of how we should regard the probability distributions introduced into statistical mechanics. It will be argued that it is problematic to take them either as purely ontic, or purely epistemic. I will propose a third alternative: they are almost objective probabilities, or epistemic chances. The definition of such probabilities involves an interweaving of epistemic and physical considerations, and thus they cannot be classified as either purely epistemic or purely ontic. This conception, it will be argued, resolves some of the puzzles associated with statistical mechanical probabilities: it explains how probabilistic posits introduced on the basis of incomplete knowledge can yield testable predictions, and it also bypasses the problem of disastrous retrodictions, that is, the fact that the standard equilibrium measures yield high probability of the system being in equilibrium in the recent past, even when we know otherwise. As the problem does not arise on the conception of probabilities considered here, there is no need to invoke a Past Hypothesis as a special posit to avoid it.
The received wisdom in statistical mechanics is that isolated systems, when left to themselves, approach equilibrium. But under what circumstances does an equilibrium state exist and an approach to equilibrium take place? In this paper we address these questions from the vantage point of the long-run fraction of time definition of Boltzmannian equilibrium that we developed in two recent papers. After a short summary of Boltzmannian statistical mechanics and our definition of equilibrium, we state an existence theorem which provides general criteria for the existence of an equilibrium state. We first show how the theorem works with a toy example, which allows us to illustrate the various elements of the theorem in a simple setting. After a look at the ergodic programme, we discuss equilibria in a number of different gas systems: the ideal gas, the dilute gas, the Kac gas, the stadium gas, the mushroom gas and the multi-mushroom gas. In the conclusion we briefly summarise the main points and highlight open questions.
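A sketch of the shape of that definition, in our own notation (with $\phi_t$ the flow, $\mu$ the invariant measure, and parameters $\alpha$ close to 1 and $\varepsilon$ small): a macrostate $M_{\mathrm{eq}}$ counts as an equilibrium state roughly when

```latex
\lim_{T\to\infty}\frac{1}{T}\int_0^T
\mathbf{1}_{M_{\mathrm{eq}}}\!\bigl(\phi_t(x)\bigr)\,\mathrm{d}t \;\ge\; \alpha
\qquad\text{for all initial conditions } x \text{ outside a set of measure at most } \varepsilon .
```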
This paper examines the problem of founding irreversibility on reversible equations of motion from the point of view of the Brussels school's recent developments in the foundations of quantum statistical mechanics. A detailed critique of both their 'subdynamics' and 'transformation' theory is given. It is argued that the subdynamics approach involves a generalized form of 'coarse-graining' description, whereas transformation theory cannot lead to truly irreversible processes pointing to a preferred direction of time. It is concluded that the Brussels school's conception of microscopic temporal irreversibility, as such, is tacitly assumed at the macroscopic level. Finally a logical argument is provided which shows, independently of the mathematical formalism of the theory concerned, that statistical reasoning alone is not sufficient to explain the arrow of time.
Landauer’s principle is, roughly, the principle that logically irreversible operations cannot be performed without dissipation of energy, with a specified lower bound on that dissipation. Although widely accepted in the literature on the thermodynamics of computation, it has been the subject of considerable dispute in the philosophical literature. Proofs of the principle have been questioned on the grounds of insufficient generality and on the grounds of the assumption, used in the proofs, of the availability of reversible processes at the microscale. The relevance of the principle, should it be true, has also been questioned, as it has been argued that microscale fluctuations entail dissipation that always greatly exceeds the Landauer bound. In this article Landauer’s principle is treated within statistical mechanics, and a proof of the principle is given that neither relies on neglect of fluctuations nor assumes the availability of thermodynamically reversible processes. In addition, it is argued that microscale fluctuations are no obstacle to approximating thermodynamic reversibility, in the appropriate sense, as closely as one would like.
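For reference, the Landauer bound at issue is standardly stated as follows (textbook form, not quoted from the article): erasing one bit at temperature $T$ dissipates heat

```latex
Q_{\mathrm{diss}} \;\ge\; k_B T \ln 2 ,
\qquad\text{equivalently}\qquad
\Delta S_{\mathrm{env}} \;\ge\; k_B \ln 2
\quad\text{per bit erased}.
```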
Let us begin with a characteristic example. Consider a gas that is confined to the left half of a box. Now we remove the barrier separating the two halves of the box. As a result, the gas quickly disperses, and it continues to do so until it homogeneously fills the entire box. This is illustrated in Figure 1.
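The following is a minimal illustrative simulation of this kind of spreading (our own toy model, not taken from the text): non-interacting particles start in the left half of a unit box, take small random steps, and the fraction found on the left drifts from 1 toward 1/2.

```python
import random

random.seed(0)
N, STEPS, DX = 1000, 2000, 0.02          # particles, time steps, step size

# all particles start in the left half of the unit interval [0, 1]
xs = [random.uniform(0.0, 0.5) for _ in range(N)]

for step in range(STEPS + 1):
    if step % 500 == 0:
        left_fraction = sum(x < 0.5 for x in xs) / N
        print(f"step {step:5d}: fraction in left half = {left_fraction:.3f}")
    # random walk with reflecting boundaries at x = 0 and x = 1
    for i in range(N):
        x = xs[i] + random.uniform(-DX, DX)
        if x < 0.0:
            x = -x
        elif x > 1.0:
            x = 2.0 - x
        xs[i] = x
```

Run as a script, the printed fraction settles near 0.5 and stays there, which is the macroscopic behaviour the example is meant to make vivid.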
I contrast two possible attitudes towards a given branch of physics: as inferential, and as dynamical. I contrast these attitudes in classical statistical mechanics, in quantum mechanics, and in quantum statistical mechanics; in this last case, I argue that the quantum-mechanical and statistical-mechanical aspects of the question become inseparable. Along the way various foundational issues in statistical and quantum physics are illuminated.
There is a long tradition of thinking of thermodynamics, not as a theory of fundamental physics, but as a theory of how manipulations of a physical system may be used to obtain desired effects, such as mechanical work. On this view, the basic concepts of thermodynamics, heat and work, and with them, the concept of entropy, are relative to a class of envisaged manipulations. This article is a sketch and defense of a science of manipulations and their effects on physical systems. I call this science thermo-dynamics, or ΘΔcs for short, to highlight that it may be different from the science of thermodynamics, as the reader conceives it. An upshot of the discussion is a clarification of the roles of the Gibbs and von Neumann entropies. Light is also shed on the use of coarse-grained entropies.
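The two entropies whose roles are clarified are standardly defined as (textbook forms, not quoted from the article):

```latex
S_{\mathrm{G}}[\rho] = -k_B \int_\Gamma \rho(x)\,\ln\rho(x)\,\mathrm{d}x
\quad\text{(classical Gibbs entropy)},
\qquad
S_{\mathrm{vN}}[\hat\rho] = -k_B\,\mathrm{Tr}\bigl(\hat\rho\ln\hat\rho\bigr)
\quad\text{(von Neumann entropy)}.
```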
The thesis proposes, defends, and applies a new model of inter-theoretic reduction, called "Neo-Nagelian" reduction. There are numerous accounts of inter-theoretic reduction in the philosophy of science literature but the most well-known and widely-discussed is the Nagelian one. In the thesis I identify various kinds of problems which the Nagelian model faces. Whilst some of these can be resolved, pressing ones remain. In lieu of the Nagelian model, other models of inter-theoretic reduction have been proposed, chief amongst which are so-called "New Wave" models. I show these to be no more adequate than the original Nagelian model. I propose a new model of inter-theoretic reduction, Neo-Nagelian reduction. This model is structurally similar to the Nagelian one, but differs in substantive ways. In particular I argue that it avoids the problems pertaining to both the Nagelian and New Wave models. Multiple realizability looms large in discussions about reduction: it is claimed that multiply realizable properties frustrate the reduction of one theory to another in various ways. I consider these arguments and show that they do not undermine the Neo-Nagelian reduction of one theory to another. Finally, I apply the model to statistical mechanics. Statistical mechanics is taken to be a reductionist enterprise: one of the aims of statistical mechanics is to reduce thermodynamics. Without an adequate model of inter-theoretic reduction one cannot assess whether it succeeds; I use the Neo-Nagelian model to critically discuss whether it does. Specifically, I consider two very recent derivations of the Second Law of thermodynamics, one from Boltzmannian classical statistical mechanics and another from quantum statistical mechanics. I argue that they are partially successful, and that each makes for a promising line of future research.
Simple models have played an important role in the discussion of foundational issues in statistical mechanics. Among them the spin-echo system is of particular interest since it can be realized experimentally. This has led to inferences being drawn about approaches to the foundations of statistical mechanics, particularly with respect to the use of coarse-graining. We examine these claims with the help of computer simulations.
Schulman (Entropy 7(4):221–233, 2005) has argued that Boltzmann’s intuition, that the psychological arrow of time is necessarily aligned with the thermodynamic arrow, is correct. Schulman gives an explicit physical mechanism for this connection, based on the brain being representable as a computer, together with certain thermodynamic properties of computational processes. Hawking (Physical Origins of Time Asymmetry, Cambridge University Press, Cambridge, 1994) presents similar, if briefer, arguments. The purpose of this paper is to critically examine the support for the link between thermodynamics and an arrow of time for computers. The principal arguments put forward by Schulman and Hawking will be shown to fail. It will be shown that any computational process that can take place in an entropy-increasing universe can equally take place in an entropy-decreasing universe. This conclusion does not automatically imply a psychological arrow can run counter to the thermodynamic arrow. Some alternative possible explanations for the alignment of the two arrows will be briefly discussed.
According to the principle of indifference, when a set of possibilities is evidentially symmetric for you – when your evidence no more supports any one of the possibilities over any other – you're required to distribute your credences uniformly among them. Despite its intuitive appeal, the principle of indifference is often thought to be unsustainable due to the problem of multiple partitions: Depending on how a set of possibilities is divided, it seems that sometimes, applying indifference reasoning can require you to assign incompatible credences to equivalent possibilities. This paper defends the principle of indifference from the problem of multiple partitions by offering two guides for how to respond. The first is for permissivists about rationality, and is modeled on permissivists' arguments for the claim that a body of evidence sometimes does not uniquely determine a fully rational credence function. The second is for impermissivists about rationality, and is modeled on impermissivists' arguments for the claim that a body of evidence does always uniquely determine a fully rational credence function. What appears to be a decisive objection against the principle of indifference is in fact an instance of a general challenge taking different forms familiar to both permissivists and impermissivists.
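A standard illustration of the multiple-partitions problem (van Fraassen's cube factory, not an example from this paper): a factory produces cubes with side length $\ell \in (0,1]$ cm. Being indifferent over side length and being indifferent over face area assign different credences to the same proposition:

```latex
P\bigl(\ell \le \tfrac{1}{2}\bigr) = \tfrac{1}{2}
\quad\text{(uniform over } \ell \in (0,1]\text{)},
\qquad
P\bigl(\ell \le \tfrac{1}{2}\bigr) = P\bigl(\ell^2 \le \tfrac{1}{4}\bigr) = \tfrac{1}{4}
\quad\text{(uniform over } \ell^2 \in (0,1]\text{)}.
```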
I discuss a broad critique of the classical approach to the foundations of statistical mechanics (SM) offered by N. S. Krylov. He claims that the classical approach is in principle incapable of providing the foundations for interpreting the "laws" of statistical physics. Most intriguing are his arguments against adopting a de facto attitude towards the problem of irreversibility. I argue that the best way to understand his critique is as setting the stage for a positive theory which treats SM as a theory in its own right, involving a completely different conception of a system's state. As the orthodox approach treats SM as an extension of the classical or quantum theories (one which deals with large systems), Krylov is advocating a major break with the traditional view of statistical physics.
The question why natural processes tend to flow along a preferred direction has always been considered from within the perspective of the Second Law of Thermodynamics, especially its statistical formulation due to Maxwell and Boltzmann. In this article, we re-examine the subject from the perspective of a new historico-philosophical formulation based on the careful use of selected theoretical elements taken from three key modern thinkers: Hans Reichenbach, Ilya Prigogine, and Roger Penrose, who are seldom considered together in the literature. We emphasize in our analysis how the entropy concept was introduced in response to the desire to extend the applicability of the Second Law to the cosmos at large (Reichenbach and Penrose), and to examine whether intrinsic irreversibility is a fundamental universal characteristic of nature (Prigogine). While the three thinkers operate with vastly different technical proposals and belong to quite distinct intellectual backgrounds, some similarities are detected in their thinking. We philosophically examine these similarities but also bring into focus the uniqueness of each approach. Our purpose is not to provide an exhaustive derivation of logical concepts identified in one thinker in terms of ideas found in the others. Instead, the main objective of this work is to stimulate historico-philosophical investigations and inquiries into the problem of the direction of time in nature by way of cross-disciplinary examinations of previous theories commonly treated in the literature as disparate domains.
Huw Price (1996, 2002, 2003) argues that causal-dynamical theories that aim to explain thermodynamic asymmetry in time are misguided. He points out that in seeking a dynamical factor responsible for the general tendency of entropy to increase, these approaches fail to appreciate the true nature of the problem in the foundations of statistical mechanics (SM). I argue that it is Price who is guilty of misapprehension of the issue at stake. When properly understood, causal-dynamical approaches in the foundations of SM offer a solution for a different problem; a problem that unfortunately receives no attention in Price’s celebrated work.
In this paper I will evaluate whether some knowledge states that are interpretatively derived from statistical mechanical probabilities could be somehow relevant in actual practices, as famously rejected by Albert (2000). On one side, I follow Frigg (2010a) in rejecting the causal relevance of knowledge states as a mere byproduct of misinterpreting this theoretical field. On the other side, I will argue against Uffink (2011) that probability-represented epistemic states cannot be explanatorily relevant, because (i) probabilities cannot faithfully represent significant epistemic states, and (ii) those states cannot satisfactorily account for why an agent should theoretically believe or expect something.
In (Weaver 2021), I showed that Boltzmann’s H-theorem does not face a significant threat from the reversibility paradox. I argue that my defense of the H-theorem against that paradox can be used yet again for the purposes of resolving the recurrence paradox without having to endorse heavy-duty statistical assumptions outside of the hypothesis of molecular chaos. As in (Weaver 2021), lessons from the history and foundations of physics reveal precisely how such resolution is achieved.
This paper addresses the question of how we should regard the probability distributions introduced into statistical mechanics. It will be argued that it is problematic to take them either as purely subjective credences, or as objective chances. I will propose a third alternative: they are "almost objective" probabilities, or "epistemic chances". The definition of such probabilities involves an interweaving of epistemic and physical considerations, and so cannot be classified as either purely subjective or purely objective. This conception, it will be argued, resolves some of the puzzles associated with statistical mechanical probabilities; it explains how probabilistic posits introduced on the basis of incomplete knowledge can yield testable predictions, and it also bypasses the problem of disastrous retrodictions, that is, the fact that the standard equilibrium measures yield high probability of the system being in equilibrium in the recent past, even when we know otherwise. As the problem does not arise on the conception of probabilities considered here, there is no need to invoke a Past Hypothesis as a special posit to avoid it.
One popular approach to statistical mechanics understands statistical mechanical probabilities as measures of rational indifference. Naive formulations of this 'indifference approach' face reversibility worries: while they yield the right prescriptions regarding future events, they yield the wrong prescriptions regarding past events. This paper begins by showing how the indifference approach can overcome the standard reversibility worries by appealing to the Past Hypothesis. But, the paper argues, positing a Past Hypothesis doesn't free the indifference approach from all reversibility worries. For while appealing to the Past Hypothesis allows it to escape one kind of reversibility worry, it makes it susceptible to another: the Meta-Reversibility Objection. And there is no easy way for the indifference approach to escape the Meta-Reversibility Objection. As a result, reversibility worries pose a steep challenge to the viability of the indifference approach.
A serious flaw in Hartry Field’s instrumental account of applied mathematics, namely that Field must overestimate the extent to which many of the structures of our mathematical theories are reflected in the physical world, underlies much of the criticism of this account. After reviewing some of this criticism, I illustrate through an examination of the prospects for extending Field’s account to classical equilibrium statistical mechanics how this flaw will prevent any significant extension of this account beyond field theories. I note in the conclusion that this diagnosis of Field’s program also points the way to modifications that may work.
Complexity theory attempts to explain, at the most general possible level, the interesting behaviors of complex systems. Two such behaviors are the emergence of simple or stable high-level behavior from relatively complex low-level behavior, and the emergence of sophisticated high-level behavior from relatively simple low-level behavior; they are often found nested in the same system. Concerning the emergence of simplicity, this essay examines Herbert Simon's explanation from near-decomposability and a stochastic explanation that generalizes the approach of statistical physics. A more general notion of an abstract difference-making structure is introduced with examples, and a discussion of evolvability follows. Concerning the emergence of sophistication, this essay focuses on, first, the energetics approach associated with dissipative structures and the fourth law of thermodynamics, and second, the notion of a complex adaptive system.
We investigate whether small perturbations can cause relaxation to quantum equilibrium over very long timescales. We consider in particular a two-dimensional harmonic oscillator, which can serve as a model of a field mode on expanding space. We assume an initial wave function with small perturbations to the ground state. We present evidence that the trajectories are highly confined so as to preclude relaxation to equilibrium even over very long timescales. Cosmological implications are briefly discussed.
A generalization of Ehrenfest's urn model is suggested. This will allow us to treat a wide class of stochastic processes describing the changes of microscopic objects. These processes are homogeneous Markov chains. The generalization proposed is presented as an abstract conditional (relative) probability theory. The probability axioms of such a theory and some simple additional conditions yield both transition probabilities and equilibrium distributions. The resulting theory, interpreted in terms of particles and single-particle states, leads to the usual formulae of quantum and classical statistical mechanics; in terms of chromosomes and allelic types, it allows the deduction of many genetical models including the Ewens sampling formula; in terms of agents' strategies, it gives a justification of the 'herd behaviour' typical of a population of heterogeneous economic agents.
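For concreteness, here is a minimal simulation of the classical (ungeneralized) Ehrenfest urn, the starting point of the abstract (our own illustration; the paper's abstract conditional-probability framework is not reproduced here). The chain is a homogeneous Markov chain whose equilibrium distribution is Binomial(N, 1/2).

```python
import random
from math import comb
from collections import Counter

random.seed(1)
N, STEPS = 50, 200_000        # number of balls, number of Markov-chain steps

k = N                          # balls currently in urn A; start with all in A
visits = Counter()

for _ in range(STEPS):
    # pick a ball uniformly at random; with probability k/N it is in urn A
    if random.random() < k / N:
        k -= 1                 # move it from A to B
    else:
        k += 1                 # move it from B to A
    visits[k] += 1

# compare empirical occupation frequencies with the binomial equilibrium
for state in (15, 20, 25, 30, 35):
    empirical = visits[state] / STEPS
    equilibrium = comb(N, state) / 2**N
    print(f"k={state:2d}: empirical {empirical:.4f}   Binomial(N, 1/2) {equilibrium:.4f}")
```

The empirical frequencies approach the binomial values, which is the sense in which such an urn dynamics fixes both transition probabilities and an equilibrium distribution.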
With reference to two specific modalities of sensation, the taste of saltiness of chloride salts, and the loudness of steady tones, it is shown that the laws of sensation (logarithmic and power laws) are expressions of the entropy per mole of the stimulus. That is, the laws of sensation are linear functions of molar entropy. In partial verification of this hypothesis, we are able to derive an approximate value for the gas constant, a fundamental physical constant, directly from psychophysical measurements. The significance of our observation lies in the linking of the phenomenon of “sensation” directly to a physical measure. It suggests that if the laws of physics are universal, the laws of sensation and perception are similarly universal. It also connects the sensation of a simple, steady physical signal with the molecular structure of the signal: the greater the number of microstates or complexions of the stimulus signal, the greater the magnitude of the sensation (saltiness or loudness). The hypothesis has so far been tested on these two sensory modalities.
In a previous work (M. Campisi, Stud. Hist. Phil. M. P. 36 (2005) 275-290) we have addressed the mechanical foundations of equilibrium thermodynamics on the basis of the Generalized Helmholtz Theorem. It was found that the volume entropy provides a good mechanical analogue of thermodynamic entropy because it satisfies the heat theorem and it is an adiabatic invariant. This property explains the 'equal' sign in Clausius' principle ($S_f \geq S_i$) in a purely mechanical way and suggests that the volume entropy might explain the 'larger than' sign (i.e. the Law of Entropy Increase) if non-adiabatic transformations were considered. Based on the principles of microscopic (quantum or classical) mechanics, here we prove that, provided the initial equilibrium satisfies the natural condition of decreasing ordering of probabilities, the expectation value of the volume entropy cannot decrease for arbitrary transformations performed by some external sources of work on an insulated system. This can be regarded as a rigorous quantum mechanical proof of the Second Law. We discuss how this result relates to the Minimal Work Principle and improves over previous attempts. The natural evolution of entropy is towards larger values because the natural state of matter is at positive temperature. Actually the Law of Entropy Decrease holds in artificially prepared negative-temperature systems.
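The volume entropy in question is the 'Gibbs volume entropy' (standard definition, not quoted from the abstract):

```latex
S(E) = k_B \ln \Omega(E),
\qquad
\Omega(E) = \int_{H(x)\le E} \mathrm{d}x ,
```

i.e. the logarithm of the phase-space volume enclosed by the energy surface; its adiabatic invariance is what underwrites the 'equal' sign in $S_f \geq S_i$ for quasi-static adiabatic processes.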
Bose-Einstein statistics may be characterized in terms of the multinomial distribution. From this characterization, an information theoretic analysis is made for an Einstein–Podolsky–Rosen-like situation, using Shannon's measure of entropy.
How should our beliefs change over time? The standard answer to this question is the Bayesian one. But while the Bayesian account works well with respect to beliefs about the world, it breaks down when applied to self-locating or de se beliefs. In this work I explore ways to extend Bayesianism in order to accommodate de se beliefs. I begin by assessing, and ultimately rejecting, attempts to resolve these issues by appealing to Dutch books and chance-credence principles. I then propose and examine several accounts of the dynamics of de se beliefs. These examinations suggest that an extension of Bayesianism to de se beliefs will require some uncomfortable choices. I conclude by laying out the options available, and assessing the prospects of each.
We present an analysis of Szilard's one-molecule Maxwell's demon, including a detailed entropy accounting, that suggests a general theory of the entropy cost of information. It is shown that the entropy of the demon increases during the expansion step, due to the decoupling of the molecule from the measurement information. It is also shown that there is an entropy symmetry between the measurement and erasure steps, whereby the two steps additively share a constant entropy change, but the proportion that occurs during each of the two steps is arbitrary. Therefore the measurement step may be accompanied by an entropy increase, a decrease, or no change at all, and likewise for the erasure step. Generalizing beyond the demon, decorrelation between a physical system and information about that system always causes an entropy increase in the joint system comprised of both the original system and the information. Decorrelation causes a net entropy increase in the universe unless, as in the Szilard demon, the information is used to decrease entropy elsewhere before the correlation is lost. Thus, information is thermodynamically costly precisely to the extent that it is not used to obtain work from the measured system.
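The quantitative bookkeeping behind the one-molecule engine is the standard one (figures not quoted from the paper): the post-measurement isothermal expansion can extract at most

```latex
W_{\max} = k_B T \ln 2 ,
```

while losing the correlation between the molecule and the one-bit record costs at least $\Delta S = k_B \ln 2$, so over a complete cycle no net work is obtained unless the information is used before the correlation is lost.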