
In various interpretations of quantum mechanics, wave function collapse, also called reduction of the state vector,[1] occurs when a wave function—initially in a superposition of several eigenstates—reduces to a single eigenstate due to interaction with the external world. This interaction is called an observation and is the essence of a measurement in quantum mechanics, which connects the wave function with classical observables such as position and momentum. Collapse is one of the two processes by which quantum systems evolve in time; the other is the continuous evolution governed by the Schrödinger equation.[2]
In the Copenhagen interpretation, wave function collapse connects quantum to classical models, with a special role for the observer. By contrast, objective-collapse theories propose an origin in physical processes. In the many-worlds interpretation, collapse does not exist; all wave function outcomes occur, while quantum decoherence accounts for the appearance of collapse.
Historically, Werner Heisenberg was the first to use the idea of wave function reduction to explain quantum measurement.[3][4]
In quantum mechanics each measurable physical quantity of a quantum system is called an observable, which could be, for example, the position and the momentum but also energy, components of spin ($s_x$, $s_y$, $s_z$), and so on. The observable acts as a linear operator on the states of the system; its eigenvectors correspond to quantum states (i.e. eigenstates) and its eigenvalues to the possible values of the observable. The collection of eigenstate/eigenvalue pairs represents all possible values of the observable. Writing $|\phi_i\rangle$ for an eigenstate and $\lambda_i$ for the corresponding observed value, any arbitrary state of the quantum system can be expressed as a vector using bra–ket notation:
$$|\psi\rangle = \sum_i c_i |\phi_i\rangle.$$
The kets $\{|\phi_i\rangle\}$ specify the different available quantum "alternatives", i.e., particular quantum states.
The wave function is a specific representation of a quantum state. Wave functions can therefore always be expressed as eigenstates of an observable, though the converse is not necessarily true.
To account for the experimental result that repeated measurements of a quantum system give the same results, the theory postulates a "collapse" or "reduction of the state vector" upon observation,[5]: 566  abruptly converting an arbitrary state into a single-component eigenstate of the observable:
$$|\psi\rangle = \sum_i c_i |\phi_i\rangle \;\rightarrow\; |\phi_j\rangle,$$
where the arrow represents a measurement of the observable corresponding to the $\phi$ basis.[6] For any single event, only one eigenvalue $\lambda_j$ is measured, chosen randomly from among the possible values.
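The repeatability postulated here can be sketched numerically. The following is a minimal illustration, not a definitive implementation: it assumes a finite-dimensional system measured in the observable's eigenbasis (taken as the standard basis), and the function name `measure` is illustrative.

```python
import numpy as np

rng = np.random.default_rng(1)

def measure(state, rng):
    """Measure in the standard basis: pick eigenstate j with
    probability |c_j|^2 and collapse the state onto |phi_j>."""
    probs = np.abs(state) ** 2
    j = rng.choice(len(state), p=probs)
    collapsed = np.zeros_like(state)
    collapsed[j] = 1.0
    return j, collapsed

# Equal superposition of three eigenstates.
psi = np.array([1, 1, 1], dtype=complex) / np.sqrt(3)

first, psi = measure(psi, rng)   # random outcome, state collapses
second, psi = measure(psi, rng)  # state already collapsed: same outcome
```

Because the first measurement leaves the system in a single eigenstate, the immediate repetition returns the same eigenvalue index with certainty, as the postulate requires.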
The complex coefficients $\{c_i\}$ in the expansion of a quantum state in terms of eigenstates,
$$|\psi\rangle = \sum_i c_i |\phi_i\rangle,$$
can be written as a (complex) overlap of the corresponding eigenstate and the quantum state:
$$c_i = \langle \phi_i | \psi \rangle.$$
They are called the probability amplitudes. The square modulus $|c_i|^2$ is the probability that a measurement of the observable yields the eigenstate $|\phi_i\rangle$. The sum of the probabilities over all possible outcomes must be one:[7]
$$\langle\psi|\psi\rangle = \sum_i |c_i|^2 = 1.$$
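As a concrete sketch of computing amplitudes as overlaps, consider a spin-1/2 particle prepared "up along z" and measured along x. The vectors below are the standard textbook representations; this is an illustrative calculation, not part of the article's cited material.

```python
import numpy as np

# Spin-1/2 state "up along z", written in the z basis.
psi = np.array([1, 0], dtype=complex)

# Eigenstates of the spin-x observable, written in the same z basis.
phi_plus = np.array([1, 1], dtype=complex) / np.sqrt(2)
phi_minus = np.array([1, -1], dtype=complex) / np.sqrt(2)

# Probability amplitudes c_i = <phi_i|psi>; np.vdot conjugates the bra.
c_plus = np.vdot(phi_plus, psi)
c_minus = np.vdot(phi_minus, psi)

# Born rule: each spin-x outcome occurs with probability |c_i|^2.
p_plus = abs(c_plus) ** 2    # 1/2
p_minus = abs(c_minus) ** 2  # 1/2
```

The two probabilities each come out to 1/2 and sum to one, matching the normalization condition above.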
As examples, individual counts in a double-slit experiment with electrons appear at random locations on the detector; after many counts are summed, the distribution shows a wave interference pattern.[8] In a Stern–Gerlach experiment with silver atoms, each particle appears in one of two areas unpredictably, but the final tally shows equal numbers of events in each area.
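The Stern–Gerlach statistics can be mimicked with a toy simulation, assuming equal amplitudes for the two outcomes; the labels and event count below are illustrative.

```python
import random

random.seed(42)

# Each silver atom lands in the "up" or "down" region with probability
# |c_up|^2 = |c_down|^2 = 1/2; each individual outcome is unpredictable.
counts = {"up": 0, "down": 0}
for _ in range(10_000):
    outcome = random.choice(["up", "down"])
    counts[outcome] += 1

# Only over many events does the predicted 50/50 distribution emerge.
ratio = counts["up"] / 10_000
```

No single run of the loop body is predictable, but the accumulated ratio converges toward 1/2, which is all the theory predicts.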
This statistical aspect of quantum measurements differs fundamentally from classical mechanics. In quantum mechanics the only information we have about a system is its wave function, and measurements of its wave function can only give statistical information.[5]: 17
The two terms "reduction of the state vector" (or "state reduction" for short) and "wave function collapse" are used to describe the same concept. A quantum state is a mathematical description of a quantum system; a quantum state vector uses Hilbert-space vectors for the description.[9]: 159  Reduction of the state vector replaces the full state vector with a single eigenstate of the observable.
The term "wave function" is typically used for a different mathematical representation of the quantum state, one that uses spatial coordinates also called the "position representation".[9]: 324 When the wave function representation is used, the "reduction" is called "wave function collapse".
The Schrödinger equation describes quantum systems but does not describe their measurement. Solutions to the equation include all possible observable values for measurements, but measurements only result in one definite outcome. This difference is called the measurement problem of quantum mechanics. To predict measurement outcomes from quantum solutions, the orthodox interpretation of quantum theory postulates wave function collapse and uses the Born rule to compute the probable outcomes.[10] Despite the widespread quantitative success of these postulates, scientists remain dissatisfied and have sought more detailed physical models. Rather than suspending the Schrödinger equation during the process of measurement, the measurement apparatus should be included and governed by the laws of quantum mechanics.[11]: 127
Quantum theory offers no dynamical description of the "collapse" of the wave function. If the theory is viewed as purely statistical, no such description is expected. As Fuchs and Peres put it, "collapse is something that happens in our description of the system, not to the system itself".[12]
Various interpretations of quantum mechanics attempt to provide a physical model for collapse.[13]: 816  Three treatments of collapse can be found among the common interpretations. The first group includes hidden-variable theories like de Broglie–Bohm theory; here random outcomes result only from unknown values of hidden variables. Results from tests of Bell's theorem show that these variables would need to be non-local. The second group models measurement as quantum entanglement between the quantum state and the measurement apparatus. This results in a simulation of classical statistics called quantum decoherence. This group includes the many-worlds interpretation and consistent-histories models. The third group postulates an additional, but as yet undetected, physical basis for the randomness; this group includes, for example, the objective-collapse interpretations. While models in all groups have contributed to better understanding of quantum theory, no alternative explanation for individual events has emerged as more useful than collapse followed by statistical prediction with the Born rule.[13]: 819
The significance ascribed to the wave function varies from interpretation to interpretation and even within an interpretation (such as the Copenhagen interpretation). If the wave function merely encodes an observer's knowledge of the universe, then the wave function collapse corresponds to the receipt of new information. This is somewhat analogous to the situation in classical physics, except that the classical "wave function" does not necessarily obey a wave equation. If the wave function is physically real, in some sense and to some extent, then the collapse of the wave function is also seen as a real process, to the same extent.[citation needed]
Quantum decoherence explains why a system interacting with an environment transitions from being a pure state, exhibiting superpositions, to a mixed state, an incoherent combination of classical alternatives.[14] This transition is fundamentally reversible, as the combined state of system and environment is still pure, but for all practical purposes irreversible in the same sense as in the second law of thermodynamics: the environment is a very large and complex quantum system, and it is not feasible to reverse their interaction. Decoherence is thus very important for explaining the classical limit of quantum mechanics, but cannot explain wave function collapse, as all classical alternatives are still present in the mixed state, and wave function collapse selects only one of them.[15][16][14]
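The distinction between decoherence and collapse can be made concrete with density matrices. The following is a toy sketch, assuming a single qubit and an ad hoc exponential decoherence factor; it is not a model of any specific environment.

```python
import numpy as np

# Pure superposition (|0> + |1>)/sqrt(2) as a density matrix.
psi = np.array([1, 1], dtype=complex) / np.sqrt(2)
rho_pure = np.outer(psi, psi.conj())  # off-diagonals encode coherence

# Decoherence suppresses the off-diagonal terms; here a toy
# decoherence factor exp(-t/tau) with t >> tau.
decay = np.exp(-10.0)
rho_mixed = rho_pure * np.array([[1, decay], [decay, 1]])

purity_before = np.trace(rho_pure @ rho_pure).real   # 1.0: pure state
purity_after = np.trace(rho_mixed @ rho_mixed).real  # ~0.5: fully mixed

# The diagonal entries (the classical alternatives) are untouched:
# both outcomes remain with probability 1/2 each. Decoherence has
# destroyed the interference terms but has not selected an outcome.
```

The purity drops from 1 to about 1/2, signaling the pure-to-mixed transition, while the diagonal probabilities stay at 1/2 each: both alternatives survive, which is exactly why decoherence alone cannot account for collapse.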
The form of decoherence known as environment-induced superselection proposes that when a quantum system interacts with its environment, the superpositions apparently reduce to mixtures of classical alternatives. The combined wave function of the system and environment continues to obey the Schrödinger equation throughout this apparent collapse.[17] More importantly, this is not enough to explain actual wave function collapse, as decoherence does not reduce the state to a single eigenstate.[15][14]
The concept of wavefunction collapse was introduced by Werner Heisenberg in his 1927 paper on the uncertainty principle, "Über den anschaulichen Inhalt der quantentheoretischen Kinematik und Mechanik", and incorporated into the mathematical formulation of quantum mechanics by John von Neumann, in his 1932 treatise Mathematische Grundlagen der Quantenmechanik.[4] Heisenberg did not try to specify exactly what the collapse of the wavefunction meant. However, he emphasized that it should not be understood as a physical process.[18] Niels Bohr never mentions wave function collapse in his published work, but he repeatedly cautioned that we must give up a "pictorial representation". Despite the differences between Bohr and Heisenberg, their views are often grouped together as the "Copenhagen interpretation", of which wave function collapse is regarded as a key feature.[19]
John von Neumann's influential 1932 work Mathematical Foundations of Quantum Mechanics took a more formal approach, developing an "ideal" measurement scheme[20][21]: 1270  that postulated that there were two processes of wave function change:
1. The probabilistic, non-unitary, discontinuous change brought about by observation and measurement (collapse).
2. The deterministic, unitary, continuous time evolution of an isolated system that obeys the Schrödinger equation.
In 1957 Hugh Everett III proposed a model of quantum mechanics that dropped von Neumann's first postulate. Everett observed that the measurement apparatus was also a quantum system and that its quantum interaction with the system under observation should determine the results. He proposed that the discontinuous change is instead a splitting of a wave function representing the universe.[21]: 1288  While Everett's approach rekindled interest in foundational quantum mechanics, it left core issues unresolved. Two key issues relate to the origin of the observed classical results: what causes quantum systems to appear classical, and what resolves with the observed probabilities of the Born rule.[21]: 1290 [20]: 5
Beginning in 1970, H. Dieter Zeh sought a detailed quantum-decoherence model for the discontinuous change without postulating collapse. Further work by Wojciech H. Zurek in 1980 led eventually to a large number of papers on many aspects of the concept.[22] Decoherence assumes that every quantum system interacts quantum mechanically with its environment and that such interaction is not separable from the system, a concept called an "open system".[21]: 1273  Decoherence has been shown to work very quickly and within a minimal environment, but as yet it has not succeeded in providing a detailed model replacing the collapse postulate of orthodox quantum mechanics.[21]: 1302
By explicitly dealing with the interaction of object and measuring instrument, von Neumann[2] described a quantum mechanical measurement scheme consistent with wave function collapse. However, he did not prove the necessity of such a collapse. Von Neumann's projection postulate was conceived based on experimental evidence available during the 1930s, in particular Compton scattering. Later work refined the notion of measurements into the more easily discussed first kind, which gives the same value when immediately repeated, and the second kind, which gives different values when repeated.[23][24][25]
Among Bohr scholars it is common to assert that Bohr never mentions the wave function collapse (see e.g. Howard, 2004 and Faye, 2008). It is true that in Bohr's published writings, he does not discuss the status or existence of this standard component in the popular image of the Copenhagen interpretation.