
Quantum information is the information of the state of a quantum system. It is the basic entity of study in quantum information science,[1][2][3] and can be manipulated using quantum information processing techniques. Quantum information refers to both the technical definition in terms of Von Neumann entropy and the general computational term.
It is an interdisciplinary field that involves quantum mechanics, computer science, information theory, philosophy and cryptography among other fields.[4][5][6] Its study is also relevant to disciplines such as cognitive science, psychology and neuroscience.[7][8][9][10] Its main focus is on extracting information from matter at the microscopic scale. Observation in science is one of the most important ways of acquiring information, and measurement is required in order to quantify the observation, making this crucial to the scientific method. In quantum mechanics, due to the uncertainty principle, non-commuting observables cannot be precisely measured simultaneously, as an eigenstate in one basis is not an eigenstate in the other basis. According to the eigenstate–eigenvalue link, an observable is well-defined (definite) when the state of the system is an eigenstate of the observable.[11] Since any two non-commuting observables are not simultaneously well-defined, a quantum state can never contain definitive information about both non-commuting observables.[8]
Data can be encoded into the quantum state of a quantum system as quantum information.[12] While quantum mechanics deals with examining properties of matter at the microscopic level,[13][8] quantum information science focuses on extracting information from those properties,[8] and quantum computation manipulates and processes information – performs logical operations – using quantum information processing techniques.[14]
Quantum information, like classical information, can be processed using digital computers, transmitted from one location to another, manipulated with algorithms, and analyzed with computer science and mathematics. Just as the basic unit of classical information is the bit, the basic unit of quantum information is the qubit.[15] Quantum information can be measured using Von Neumann entropy.
Recently, the field of quantum computing has become an active research area because of its potential to disrupt modern computation, communication, and cryptography.[14][16]
The history of quantum information theory began at the turn of the 20th century when classical physics was revolutionized into quantum physics. The theories of classical physics were predicting absurdities such as the ultraviolet catastrophe, or electrons spiraling into the nucleus. At first these problems were brushed aside by adding ad hoc hypotheses to classical physics. Soon, it became apparent that a new theory must be created in order to make sense of these absurdities, and the theory of quantum mechanics was born.[2]
Quantum mechanics was formulated by Erwin Schrödinger using wave mechanics and Werner Heisenberg using matrix mechanics.[17] The equivalence of these methods was proven later.[18] Their formulations described the dynamics of microscopic systems but had several unsatisfactory aspects in describing measurement processes. Von Neumann formulated quantum theory using operator algebra in a way that it described measurement as well as dynamics.[19] These studies emphasized the philosophical aspects of measurement rather than a quantitative approach to extracting information via measurements.
| Evolution of: | Schrödinger picture (S) | Heisenberg picture (H) | Interaction picture (I) |
| --- | --- | --- | --- |
| Ket state | $\|\psi_S(t)\rangle = e^{-iHt/\hbar}\|\psi_S(0)\rangle$ | constant | $\|\psi_I(t)\rangle = e^{iH_{0}t/\hbar}\|\psi_S(t)\rangle$ |
| Observable | constant | $A_H(t) = e^{iHt/\hbar}A_S\,e^{-iHt/\hbar}$ | $A_I(t) = e^{iH_{0}t/\hbar}A_S\,e^{-iH_{0}t/\hbar}$ |
| Density matrix | $\rho_S(t) = e^{-iHt/\hbar}\rho_S(0)e^{iHt/\hbar}$ | constant | $\rho_I(t) = e^{iH_{0}t/\hbar}\rho_S(t)e^{-iH_{0}t/\hbar}$ |
In the 1960s, Ruslan Stratonovich, Carl Helstrom and Gordon[20] proposed a formulation of optical communications using quantum mechanics. This was the first historical appearance of quantum information theory. They mainly studied error probabilities and channel capacities for communication.[20][21][22] Later, Alexander Holevo obtained an upper bound of communication speed in the transmission of a classical message via a quantum channel.[23][24]
In the 1970s, techniques for manipulating single-atom quantum states, such as the atom trap and the scanning tunneling microscope, began to be developed, making it possible to isolate single atoms and arrange them in arrays. Prior to these developments, precise control over single quantum systems was not possible, and experiments used coarser, simultaneous control over a large number of quantum systems.[2] The development of viable single-state manipulation techniques led to increased interest in the field of quantum information and computation.
In the 1980s, interest arose in whether it might be possible to use quantum effects to disprove Einstein's theory of relativity. If it were possible to clone an unknown quantum state, it would be possible to use entangled quantum states to transmit information faster than the speed of light, disproving Einstein's theory. However, the no-cloning theorem showed that such cloning is impossible. The theorem was one of the earliest results of quantum information theory.[2]
Despite all the excitement and interest over studying isolated quantum systems and trying to find a way to circumvent the theory of relativity, research in quantum information theory became stagnant in the 1980s. However, around the same time another avenue started dabbling into quantum information and computation: cryptography. In a general sense, cryptography is the problem of doing communication or computation involving two or more parties who may not trust one another.[2]
Bennett and Brassard developed a communication channel on which it is impossible to eavesdrop without being detected, a way of communicating secretly at long distances using the BB84 quantum cryptographic protocol.[25] The key idea was the use of the fundamental principle of quantum mechanics that observation disturbs the observed: introducing an eavesdropper into a secure communication line will immediately let the two communicating parties know of the eavesdropper's presence.
Alan Turing's revolutionary idea of a programmable computer, the Turing machine, showed that any real-world computation can be translated into an equivalent computation involving a Turing machine.[26][27] This is known as the Church–Turing thesis.
Soon enough, the first computers were made, and computer hardware grew at such a fast pace that the growth, through experience in production, was codified into an empirical relationship called Moore's law. This 'law' is a projective trend which states that the number of transistors in an integrated circuit doubles every two years.[28] As transistors became smaller and smaller in order to pack more power per surface area, quantum effects started to show up in the electronics, resulting in inadvertent interference. This led to the advent of quantum computing, which uses quantum mechanics to design algorithms.
At this point, quantum computers showed promise of being much faster than classical computers for certain specific problems. One such example problem was developed by David Deutsch and Richard Jozsa, known as the Deutsch–Jozsa algorithm. This problem however held little to no practical applications.[2] Peter Shor in 1994 devised algorithms for two very important and practical problems: finding the prime factors of an integer and the discrete logarithm problem. Both could theoretically be solved efficiently on a quantum computer but not, by any known method, on a classical computer, suggesting that quantum computers are more powerful than classical computers for such tasks.
Around the time computer science was making a revolution, so was information theory and communication, through Claude Shannon.[29][30][31] Shannon developed two fundamental theorems of information theory: the noiseless channel coding theorem and the noisy channel coding theorem. He also showed that error correcting codes could be used to protect information being sent.
Quantum information theory also followed a similar trajectory: Ben Schumacher in 1995 made an analogue to Shannon's noiseless coding theorem using the qubit. A theory of error correction also developed, which allows quantum computers to make efficient computations regardless of noise and to communicate reliably over noisy quantum channels.[2]
Quantum information differs strongly from classical information, epitomized by the bit, in many striking and unfamiliar ways. While the fundamental unit of classical information is the bit, the most basic unit of quantum information is the qubit. Classical information is measured using Shannon entropy, while the quantum mechanical analogue is Von Neumann entropy. Given a statistical ensemble of quantum mechanical systems with the density matrix $\rho$, it is given by[2]

$$S(\rho) = -\operatorname{Tr}(\rho \log_2 \rho).$$

Many of the same entropy measures in classical information theory can also be generalized to the quantum case, such as Holevo entropy[32] and the conditional quantum entropy.
Unlike classical digital states (which are discrete), a qubit is continuous-valued, describable by a direction on the Bloch sphere. Despite being continuously valued in this way, a qubit is the smallest possible unit of quantum information, and despite the qubit state being continuous-valued, it is impossible to measure the value precisely. Five famous theorems describe the limits on manipulation of quantum information.[2]
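To make this continuous-valued nature concrete, a pure qubit state can be written in the standard Bloch-sphere parametrization, where the two angles $\theta$ and $\phi$ pick out a point on the sphere:

$$|\psi\rangle = \cos\frac{\theta}{2}\,|0\rangle + e^{i\phi}\sin\frac{\theta}{2}\,|1\rangle, \qquad 0 \le \theta \le \pi,\quad 0 \le \phi < 2\pi.$$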
These theorems are proven from unitarity, which according to Leonard Susskind is the technical term for the statement that quantum information within the universe is conserved.[33]: 94 The five theorems open possibilities in quantum information processing.
The state of a qubit contains all of its information. This state is frequently expressed as a vector on the Bloch sphere. This state can be changed by applying linear transformations or quantum gates to them. These unitary transformations are described as rotations on the Bloch sphere. While classical gates correspond to the familiar operations of Boolean logic, quantum gates are physical unitary operators.
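As a minimal illustration of these ideas (a Python/NumPy sketch, not part of the original article), the following applies the Hadamard and Pauli-X gates, represented as 2×2 unitary matrices, to the state $|0\rangle$ and checks unitarity:

```python
import numpy as np

# Single-qubit gates as 2x2 unitary matrices
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)   # Hadamard
X = np.array([[0, 1], [1, 0]], dtype=complex)                 # Pauli-X (bit flip)

ket0 = np.array([1, 0], dtype=complex)                        # |0>

plus = H @ ket0          # H|0> = (|0> + |1>)/sqrt(2), an equal superposition
flipped = X @ ket0       # X|0> = |1>

print(plus)                                      # [0.707..., 0.707...]
print(flipped)                                   # [0, 1]
print(np.allclose(H.conj().T @ H, np.eye(2)))    # True: H is unitary
```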
The study of the above topics and differences comprises quantum information theory.
Quantum mechanics is the study of how microscopic physical systems change dynamically in nature. In the field of quantum information theory, the quantum systems studied are abstracted away from any real world counterpart. A qubit might for instance physically be a photon in a linear optical quantum computer, an ion in a trapped ion quantum computer, or it might be a large collection of atoms as in a superconducting quantum computer. Regardless of the physical implementation, the limits and features of qubits implied by quantum information theory hold as all these systems are mathematically described by the same apparatus of density matrices over the complex numbers. Another important difference with quantum mechanics is that while quantum mechanics often studies infinite-dimensional systems such as a harmonic oscillator, quantum information theory is concerned with both continuous-variable systems[34] and finite-dimensional systems.[8][35][36]
Entropy measures the uncertainty in the state of a physical system.[2] Entropy can be studied from the point of view of both the classical and quantum information theories.
Classical information is based on the concepts of information laid out by Claude Shannon. Classical information can, in principle, be stored as a string of binary digits (bits); any system having two states can serve as a bit.[37]
Shannon entropy is the quantification of the information gained by measuring the value of a random variable. Another way of thinking about it is by looking at the uncertainty of a system prior to measurement. As a result, entropy, as pictured by Shannon, can be seen either as a measure of the uncertainty prior to making a measurement or as a measure of information gained after making said measurement.[2]
Shannon entropy, written as a function of a discrete probability distribution $P(x_1), P(x_2), \ldots, P(x_n)$ associated with events $x_1, \ldots, x_n$, can be seen as the average information associated with this set of events, in units of bits:

$$H(X) = -\sum_{i=1}^{n} P(x_i) \log_2 P(x_i)$$
This definition of entropy can be used to quantify the physical resources required to store the output of an information source. The ways of interpreting Shannon entropy discussed above are usually only meaningful when the number of samples of an experiment is large.[35]
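As a small worked example of the definition above (a Python sketch; the function name `shannon_entropy` is just illustrative), the entropy of a fair coin comes out to exactly one bit:

```python
import numpy as np

def shannon_entropy(probs):
    """Shannon entropy H(X) = -sum p_i log2 p_i, in bits."""
    p = np.asarray(probs, dtype=float)
    p = p[p > 0]                       # 0 log 0 is taken as 0
    return float(-np.sum(p * np.log2(p)))

print(shannon_entropy([0.5, 0.5]))     # 1.0 bit: a fair coin
print(shannon_entropy([1.0, 0.0]))     # 0.0 bits: a certain outcome
print(shannon_entropy([0.9, 0.1]))     # ~0.469 bits: a biased coin
```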
The Rényi entropy is a generalization of Shannon entropy defined above. The Rényi entropy of order $r$, written as a function of a discrete probability distribution $P(x_1), P(x_2), \ldots, P(x_n)$ associated with events $x_1, \ldots, x_n$, is defined as:[37]

$$H_r(X) = \frac{1}{1-r} \log_2 \sum_{i=1}^{n} P(x_i)^r$$

for $0 < r < \infty$ and $r \neq 1$.

We arrive at the definition of Shannon entropy from Rényi when $r \to 1$, of Hartley entropy (or max-entropy) when $r \to 0$, and of min-entropy when $r \to \infty$.
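A short Python sketch (illustrative only) shows how the Hartley, Shannon and min-entropy values emerge as limiting cases of the order $r$:

```python
import numpy as np

def renyi_entropy(probs, r):
    """Rényi entropy of order r (r > 0, r != 1), in bits."""
    p = np.asarray(probs, dtype=float)
    p = p[p > 0]
    return float(np.log2(np.sum(p ** r)) / (1 - r))

p = [0.5, 0.25, 0.25]
print(renyi_entropy(p, 0.001))   # ~1.585 = log2(3): Hartley / max-entropy limit (r -> 0)
print(renyi_entropy(p, 0.999))   # ~1.5: Shannon entropy limit (r -> 1)
print(renyi_entropy(p, 100.0))   # ~1.0 = -log2(max p_i): min-entropy limit (r -> infinity)
```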
Quantum information theory is largely an extension of classical information theory to quantum systems. Classical information is produced when measurements of quantum systems are made.[37]
One interpretation of Shannon entropy was the uncertainty associated with a probability distribution. When we want to describe the information or the uncertainty of a quantum state, the probability distributions are simply replaced by density operators $\rho$:

$$S(\rho) = -\operatorname{Tr}(\rho \log_2 \rho) = -\sum_i \lambda_i \log_2 \lambda_i,$$

where $\lambda_i$ are the eigenvalues of $\rho$.
Von Neumann entropy plays a role in quantum information similar to the role Shannon entropy plays in classical information.
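The eigenvalue form of the definition above translates directly into a few lines of code. The following Python/NumPy sketch (illustrative, not from the original article) computes $S(\rho)$ for a pure state and for the maximally mixed qubit state:

```python
import numpy as np

def von_neumann_entropy(rho):
    """S(rho) = -Tr(rho log2 rho), computed from the eigenvalues of rho."""
    eigvals = np.linalg.eigvalsh(rho)         # rho is Hermitian
    eigvals = eigvals[eigvals > 1e-12]        # drop zero eigenvalues (0 log 0 = 0)
    return float(-np.sum(eigvals * np.log2(eigvals)))

pure = np.array([[1, 0], [0, 0]], dtype=complex)   # |0><0|, a pure state
mixed = np.eye(2, dtype=complex) / 2               # maximally mixed qubit state
print(von_neumann_entropy(pure))    # 0.0
print(von_neumann_entropy(mixed))   # 1.0
```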
Quantum communication is one of the applications of quantum physics and quantum information. There are some famous theorems, such as the no-cloning theorem, that illustrate important properties in quantum communication. Dense coding and quantum teleportation are also applications of quantum communication. They are two opposite ways to communicate using qubits. While teleportation transfers one qubit from Alice to Bob by communicating two classical bits, under the assumption that Alice and Bob have a pre-shared Bell state, dense coding transfers two classical bits from Alice to Bob by using one qubit, again under the same assumption that Alice and Bob have a pre-shared Bell state.
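As a hedged sketch of the dense-coding side of this trade-off (plain NumPy linear algebra, no quantum SDK assumed), the following encodes each of the four two-bit messages by applying $I$, $X$, $Z$ or $XZ$ to Alice's half of a shared Bell state and decodes them with a Bell-basis measurement:

```python
import numpy as np

# Computational basis states and single-qubit gates
zero = np.array([1, 0], dtype=complex)
one = np.array([0, 1], dtype=complex)
I = np.eye(2, dtype=complex)
X = np.array([[0, 1], [1, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)

# Pre-shared Bell state |Phi+> = (|00> + |11>)/sqrt(2); Alice holds qubit 0, Bob qubit 1
bell = (np.kron(zero, zero) + np.kron(one, one)) / np.sqrt(2)

# Alice encodes two classical bits by acting only on her own qubit
encodings = {
    (0, 0): I,
    (0, 1): X,
    (1, 0): Z,
    (1, 1): X @ Z,
}

# The four Bell states Bob measures against
bell_basis = {
    (0, 0): (np.kron(zero, zero) + np.kron(one, one)) / np.sqrt(2),
    (0, 1): (np.kron(zero, one) + np.kron(one, zero)) / np.sqrt(2),
    (1, 0): (np.kron(zero, zero) - np.kron(one, one)) / np.sqrt(2),
    (1, 1): (np.kron(zero, one) - np.kron(one, zero)) / np.sqrt(2),
}

for bits, gate in encodings.items():
    state = np.kron(gate, I) @ bell               # Alice applies her gate, then sends her qubit
    probs = {out: abs(np.vdot(b, state)) ** 2 for out, b in bell_basis.items()}
    decoded = max(probs, key=probs.get)           # Bob's Bell-basis measurement outcome
    print(bits, "->", decoded)                    # each two-bit message is recovered exactly
```

Teleportation runs the same resources in the opposite direction: one pre-shared Bell pair plus two classical bits move one qubit of quantum information from Alice to Bob.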
One of the best known applications of quantum cryptography is quantum key distribution, which provides a theoretical solution to the security issue of a classical key. The advantage of quantum key distribution is that it is impossible to copy a quantum key because of the no-cloning theorem. If someone tries to read the encoded data, the quantum state being transmitted will change. This can be used to detect eavesdropping.
The first quantum key distribution scheme, BB84, was developed by Charles Bennett and Gilles Brassard in 1984. It is usually explained as a method of securely communicating a private key from one party to another for use in one-time pad encryption.[2]
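The basis-sifting step of BB84 can be illustrated with a toy classical simulation (Python; no eavesdropper or channel noise is modeled, and the variable names are just illustrative):

```python
import random

n = 32
alice_bits  = [random.randint(0, 1) for _ in range(n)]
alice_bases = [random.choice("+x") for _ in range(n)]   # '+' rectilinear, 'x' diagonal

# Bob measures each photon in a randomly chosen basis; a wrong basis gives a random bit
bob_bases = [random.choice("+x") for _ in range(n)]
bob_bits = [alice_bits[i] if alice_bases[i] == bob_bases[i] else random.randint(0, 1)
            for i in range(n)]

# Sifting: over a public channel they keep only the positions where the bases matched
key_alice = [alice_bits[i] for i in range(n) if alice_bases[i] == bob_bases[i]]
key_bob   = [bob_bits[i]   for i in range(n) if alice_bases[i] == bob_bases[i]]
assert key_alice == key_bob   # identical in the absence of eavesdropping and noise
```

In a real run, Alice and Bob would additionally compare a random subset of the sifted key to estimate the error rate that an eavesdropper would have introduced.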
E91 was made by Artur Ekert in 1991. His scheme uses entangled pairs of photons. These two photons can be created by Alice, by Bob, or by a third party, including the eavesdropper Eve. One of the photons is distributed to Alice and the other to Bob so that each ends up with one photon from the pair.
This scheme relies on two properties of quantum entanglement:
- When Alice and Bob measure their photons in the same basis, their outcomes are perfectly correlated, so the matching measurements can be used to generate a shared key.
- Any attempt at eavesdropping by Eve disturbs the entanglement and weakens these correlations, which Alice and Bob can detect by testing a Bell inequality (such as the CHSH inequality sketched below) on a subset of their measurements.
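The Bell-test step can be sketched numerically. The following Python/NumPy fragment (illustrative only) computes the CHSH value for a shared Bell state with a standard choice of measurement angles, exceeding the classical bound of 2:

```python
import numpy as np

# Shared Bell state |Phi+> = (|00> + |11>)/sqrt(2)
phi_plus = np.array([1, 0, 0, 1], dtype=complex) / np.sqrt(2)

def spin(angle):
    """Measurement along `angle` in the X-Z plane: cos(a) Z + sin(a) X."""
    return np.array([[np.cos(angle), np.sin(angle)],
                     [np.sin(angle), -np.cos(angle)]], dtype=complex)

def correlation(a, b, state=phi_plus):
    """E(a, b) = <state| A(a) (x) B(b) |state>."""
    return np.vdot(state, np.kron(spin(a), spin(b)) @ state).real

a1, a2, b1, b2 = 0.0, np.pi / 2, np.pi / 4, -np.pi / 4
S = correlation(a1, b1) + correlation(a1, b2) + correlation(a2, b1) - correlation(a2, b2)
print(S)   # ~2.828 = 2*sqrt(2), beyond the classical CHSH bound of 2
```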
B92 is a simpler version of BB84.[38]
The main difference between B92 and BB84 is that BB84 uses four polarization states, while B92 encodes the key in only two non-orthogonal states.
As in BB84, Alice transmits to Bob a string of photons encoded with randomly chosen bits, but this time the bit values determine the bases she must use. Bob still randomly chooses a basis by which to measure, but if he chooses the wrong basis, he will not measure anything, an outcome guaranteed by quantum mechanics. Bob can simply tell Alice after each bit she sends whether or not he measured it correctly.[39]
The most widely used model in quantum computation is the quantum circuit, which is based on the quantum bit, or "qubit". The qubit is somewhat analogous to the bit in classical computation. Qubits can be in a 1 or 0 quantum state, or they can be in a superposition of the 1 and 0 states. However, when qubits are measured, the result of the measurement is always either a 0 or a 1; the probabilities of these two outcomes depend on the quantum state that the qubits were in immediately prior to the measurement.
Any quantum computation algorithm can be represented as a network of quantum logic gates.
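As a sketch of such a network (plain NumPy rather than any particular quantum programming framework), the following two-gate circuit, a Hadamard followed by a CNOT, turns $|00\rangle$ into a Bell state and prints the resulting measurement probabilities:

```python
import numpy as np

H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)
I = np.eye(2, dtype=complex)
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=complex)

state = np.zeros(4, dtype=complex)
state[0] = 1.0                               # start in |00>

state = CNOT @ (np.kron(H, I) @ state)       # H on qubit 0, then CNOT -> (|00> + |11>)/sqrt(2)

for idx, amp in enumerate(state):
    print(format(idx, "02b"), abs(amp) ** 2)  # measurement probabilities: 0.5, 0, 0, 0.5
```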
If a quantum system were perfectly isolated, it would maintain coherence perfectly, but it would be impossible to test the entire system. If it is not perfectly isolated, for example during a measurement, coherence is shared with the environment and appears to be lost with time; this process is called quantum decoherence. As a result of this process, quantum behavior is apparently lost, just as energy appears to be lost by friction in classical mechanics.
Quantum error correction (QEC) is used in quantum computing to protect quantum information from errors due to decoherence and other quantum noise. Quantum error correction is essential if one is to achieve fault-tolerant quantum computation that can deal not only with noise on stored quantum information, but also with faulty quantum gates, faulty quantum preparation, and faulty measurements.
Peter Shor first discovered this method of formulating a quantum error correcting code by storing the information of one qubit onto a highly entangled state of ancilla qubits. A quantum error correcting code protects quantum information against errors.
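Shor's original code uses nine qubits; as a simplified, hedged illustration of the same idea, the following Python/NumPy sketch implements the three-qubit bit-flip repetition code: it encodes one logical qubit into an entangled three-qubit state, identifies a single bit-flip error from the stabilizer syndrome, and corrects it:

```python
import numpy as np

I = np.eye(2, dtype=complex)
X = np.array([[0, 1], [1, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)

def op(single, pos, n=3):
    """Tensor product placing a single-qubit operator at position pos among n qubits."""
    mats = [single if i == pos else I for i in range(n)]
    out = mats[0]
    for m in mats[1:]:
        out = np.kron(out, m)
    return out

# Encode a|0> + b|1>  ->  a|000> + b|111>
a, b = 0.6, 0.8
encoded = np.zeros(8, dtype=complex)
encoded[0b000] = a
encoded[0b111] = b

# A bit-flip error on qubit 1
corrupted = op(X, 1) @ encoded

# Syndrome: expectation values of the stabilizers Z0Z1 and Z1Z2 (-1 means the pair disagrees)
s1 = int(round(np.vdot(corrupted, op(Z, 0) @ op(Z, 1) @ corrupted).real))
s2 = int(round(np.vdot(corrupted, op(Z, 1) @ op(Z, 2) @ corrupted).real))
flipped = {(1, 1): None, (-1, 1): 0, (-1, -1): 1, (1, -1): 2}[(s1, s2)]

# Correct by applying X to the identified qubit
recovered = op(X, flipped) @ corrupted if flipped is not None else corrupted
print(np.allclose(recovered, encoded))   # True: the logical state is restored
```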
Many journals publish research in quantum information science, although only a few are dedicated to this area. Among these are: