Quantum contextuality
Quantum contextuality is a feature of the phenomenology of quantum mechanics whereby measurements of quantum observables cannot simply be thought of as revealing pre-existing values. Any attempt to do so in a realistic hidden-variable theory leads to values that are dependent upon the choice of the other observables which are simultaneously measured. More formally, the measurement result of a quantum observable is dependent upon which other commuting observables are within the same measurement set.
Contextuality was first demonstrated to be a feature of quantum phenomenology by the Bell–Kochen–Specker theorem. The study of contextuality has developed into a major topic of interest in quantum foundations as the phenomenon crystallises certain non-classical and counter-intuitive aspects of quantum theory. A number of powerful mathematical frameworks have been developed to study and better understand contextuality, from the perspective of sheaf theory, graph theory, hypergraphs, algebraic topology, and probabilistic couplings.
Nonlocality, in the sense of Bell's theorem, may be viewed as a special case of the more general phenomenon of contextuality, in which measurement contexts contain measurements that are distributed over spacelike separated regions. This follows from the Fine–Abramsky–Brandenburger theorem.
Quantum contextuality has been identified as a source of quantum computational speedups and quantum advantage in quantum computing. Contemporary research has increasingly focused on exploring its utility as a computational resource.
Kochen and Specker
Simon Kochen and Ernst Specker, and separately John Bell, constructed proofs that any realistic hidden-variable theory able to explain the phenomenology of quantum mechanics is contextual for systems of Hilbert space dimension three and greater. The Kochen–Specker theorem proves that realistic noncontextual hidden-variable theories cannot reproduce the empirical predictions of quantum mechanics. Such a theory would suppose the following.
- All quantum-mechanical observables may be simultaneously assigned definite values. These global value assignments may deterministically depend on some 'hidden' classical variable which, in turn, may vary stochastically for some classical reason. The measured assignments of observables may therefore change stochastically. This stochasticity is, however, epistemic and not ontic as in the standard formulation of quantum mechanics.
- Value assignments pre-exist and are independent of the choice of any other observables that are measured alongside the given one (observables which, in standard quantum mechanics, are described as commuting with it).
- Some functional constraints on the assignments of values for compatible observables are assumed; for example, the value assigned to a product of commuting observables must equal the product of their assigned values (see the sketch below).
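A minimal sketch of how these assumptions jointly fail, using the Mermin–Peres magic square, a standard state-independent proof of the Kochen–Specker theorem for two qubits: nine observables are arranged in a 3×3 grid so that the operator product along each row and each of the first two columns is +I, while the product along the third column is −I. A brute-force search confirms that no assignment of ±1 values respects all six product constraints.

```python
# Brute-force check that no noncontextual +/-1 value assignment satisfies
# the Mermin-Peres magic square constraints: each row product must be +1,
# the first two column products +1, and the third column product -1.
from itertools import product

found = False
for v in product([+1, -1], repeat=9):
    grid = [v[0:3], v[3:6], v[6:9]]
    rows_ok = all(r[0] * r[1] * r[2] == +1 for r in grid)
    cols = [[grid[r][c] for r in range(3)] for c in range(3)]
    cols_ok = (cols[0][0] * cols[0][1] * cols[0][2] == +1 and
               cols[1][0] * cols[1][1] * cols[1][2] == +1 and
               cols[2][0] * cols[2][1] * cols[2][2] == -1)
    if rows_ok and cols_ok:
        found = True
print("noncontextual value assignment exists:", found)  # -> False
```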
Frameworks for contextuality
Sheaf-theoretic framework
The sheaf-theoretic, or Abramsky–Brandenburger, approach to contextuality initiated by Samson Abramsky and Adam Brandenburger is theory-independent and can be applied beyond quantum theory to any situation in which empirical data arises in contexts. As well as being used to study forms of contextuality arising in quantum theory and other physical theories, it has also been used to study formally equivalent phenomena in logic, relational databases, natural language processing, and constraint satisfaction.

In essence, contextuality arises when empirical data is locally consistent but globally inconsistent. Analogies may be drawn with impossible figures like the Penrose staircase, which in a formal sense may also be said to exhibit a kind of contextuality.
This framework gives rise in a natural way to a qualitative hierarchy of contextuality.
- Probabilistic contextuality may be witnessed in measurement statistics, e.g. by the violation of an inequality. A representative example is the KCBS proof of contextuality.
- Logical contextuality may be witnessed in the 'possibilistic' information about which outcome events are possible and which are not. A representative example is Hardy's proof of nonlocality.
- Strong contextuality is a maximal form of contextuality. Whereas contextuality arises when measurement statistics cannot be reproduced by a mixture of global value assignments, strong contextuality arises when no global value assignment is even compatible with the possible outcome events. A representative example is the original Kochen–Specker proof of contextuality (see also the sketch following this list).
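A minimal sketch of the strongest level of this hierarchy, using the support of the Popescu–Rohrlich box, a standard strongly contextual empirical model: outcomes (x, y) for binary measurement settings (a, b) are possible only when x ⊕ y = a·b, and a brute-force search confirms that no global value assignment is compatible with the possible events.

```python
# Strong contextuality as the absence of any global value assignment lying
# within the support of every context, shown for the Popescu-Rohrlich box.
from itertools import product

def possible(a, b, x, y):
    return (x ^ y) == (a & b)  # support of the PR box

# g = (value of a0, a1, b0, b1); keep assignments consistent with all contexts
consistent = [g for g in product([0, 1], repeat=4)
              if all(possible(a, b, g[a], g[2 + b])
                     for a in (0, 1) for b in (0, 1))]
print("global assignments consistent with the support:", consistent)  # -> []
```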
Graph and hypergraph frameworks
Adán Cabello, Simone Severini, and Andreas Winter introduced a general graph-theoretic framework for studying contextuality in different physical theories. Within this framework experimental scenarios are described by graphs, and certain invariants of these graphs were shown to have particular physical significance. One way in which contextuality may be witnessed in measurement statistics is through the violation of noncontextuality inequalities. With respect to certain appropriately normalised inequalities, the independence number, Lovász number, and fractional packing number of the graph of an experimental scenario provide tight upper bounds on the degree to which classical theories, quantum theory, and generalised probabilistic theories, respectively, may exhibit contextuality in an experiment of that kind. A more refined framework based on hypergraphs rather than graphs is also used.
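As a minimal sketch of these invariants at work, consider the KCBS scenario, whose exclusivity graph is the 5-cycle C5: the independence number (here 2) gives the noncontextual bound of the appropriately normalised KCBS inequality, while the Lovász number √5 ≈ 2.236 gives the quantum bound.

```python
# Independence number of the 5-cycle by brute force; the Lovász number of C5
# is known in closed form (sqrt(5)).
from itertools import combinations

edges = {(i, (i + 1) % 5) for i in range(5)}  # exclusivity graph: 5-cycle

def independent(S):
    return all((i, j) not in edges and (j, i) not in edges
               for i, j in combinations(S, 2))

alpha = max(len(S) for r in range(6) for S in combinations(range(5), r)
            if independent(S))
print("independence number (noncontextual bound):", alpha)  # 2
print("Lovász number of C5 (quantum bound):", 5 ** 0.5)     # ~2.236
```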
Contextuality-by-Default (CbD) framework
In the CbD approach, developed by Ehtibar Dzhafarov, Janne Kujala, and colleagues, contextuality is treated as a property of any system of random variables, defined as a set in which each random variable $R_q^c$ is labeled by its content $q$, the property it measures, and its context $c$, the set of recorded circumstances under which it is recorded; $c \vdash q$ stands for "$q$ is measured in $c$." The variables within a context are jointly distributed, but variables from different contexts are stochastically unrelated, being defined on different sample spaces. A coupling of the system is defined as a system $S$ in which all variables are jointly distributed and, in any context $c$, $R^c = \{R_q^c : c \vdash q\}$ and $S^c = \{S_q^c : c \vdash q\}$ are identically distributed. The system is considered noncontextual if it has a coupling $S$ such that the probabilities $\Pr[S_q^c = S_q^{c'}]$ are maximal possible for all contexts $c, c'$ and contents $q$ such that $c \vdash q$ and $c' \vdash q$. If such a coupling does not exist, the system is contextual. For the important class of cyclic systems of dichotomous ($\pm 1$) random variables, it has been shown that such a system is noncontextual if and only if

$$s_{\mathrm{odd}}\left(\langle R_1^1 R_2^1\rangle, \ldots, \langle R_n^n R_1^n\rangle\right) \le n - 2 + \Delta,$$

where

$$\Delta = \sum_{k=1}^{n} \left|\langle R_k^k\rangle - \langle R_k^{k\ominus 1}\rangle\right|$$

and

$$s_{\mathrm{odd}}(x_1, \ldots, x_n) = \max \sum_{i=1}^{n} \lambda_i x_i,$$

with the maximum taken over all $\lambda_i = \pm 1$ whose product is $-1$. If $R_q^c$ and $R_q^{c'}$, measuring the same content $q$ in different contexts, are always identically distributed, the system is called consistently connected. Except for certain logical issues, in this case CbD specializes to traditional treatments of contextuality in quantum physics. In particular, for consistently connected cyclic systems ($\Delta = 0$) the noncontextuality criterion above reduces to $s_{\mathrm{odd}}\left(\langle R_1^1 R_2^1\rangle, \ldots, \langle R_n^n R_1^n\rangle\right) \le n - 2$, which includes the Bell/CHSH inequality ($n = 4$), the KCBS inequality ($n = 5$), and other famous inequalities. That nonlocality is a special case of contextuality follows in CbD from the fact that being jointly distributed is, for random variables, equivalent to being measurable functions of one and the same random variable. CbD essentially coincides with the probabilistic part of Abramsky's sheaf-theoretic approach if the system is strongly consistently connected, which means that the joint distributions of $\{R_q^c\}$ and $\{R_q^{c'}\}$ coincide whenever the same contents are measured in the contexts $c$ and $c'$. However, unlike most approaches to contextuality, CbD allows for inconsistent connectedness, with $R_q^c$ and $R_q^{c'}$ differently distributed. This makes CbD applicable to physics experiments in which the no-disturbance condition is violated, as well as to human behavior where this condition is violated as a rule. In particular, Víctor Cervantes, Ehtibar Dzhafarov, and colleagues have demonstrated that random variables describing certain paradigms of simple decision making form contextual systems, whereas many other decision-making systems are noncontextual once their inconsistent connectedness is properly taken into account.
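A minimal sketch of this criterion in use, assuming a consistently connected rank-4 cyclic system (a CHSH-type experiment) with product expectations at the quantum (Tsirelson) bound: s_odd evaluates to 2√2 ≈ 2.83, which exceeds n − 2 + Δ = 2, so the system is contextual.

```python
# CbD noncontextuality criterion for a cyclic system of rank n = 4,
# assuming consistent connectedness (Delta = 0) and Tsirelson-bound
# product expectations <R_k^k R_{k+1}^k>.
from itertools import product

def s_odd(xs):
    # maximum signed sum over sign vectors with an odd number of -1 entries
    return max(sum(s * x for s, x in zip(signs, xs))
               for signs in product([+1, -1], repeat=len(xs))
               if signs.count(-1) % 2 == 1)

n = 4
corr = [2 ** -0.5, 2 ** -0.5, 2 ** -0.5, -(2 ** -0.5)]
delta = 0.0  # sum of |<R_q^c> - <R_q^c'>| over same-content pairs; 0 here
lhs, rhs = s_odd(corr), n - 2 + delta
print(lhs, "<=", rhs, "->", lhs <= rhs)  # 2.828... <= 2 -> False: contextual
```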
Operational framework
An extended notion of contextuality due to Robert Spekkens applies to preparations and transformations as well as to measurements, within a general framework of operational physical theories. With respect to measurements, it removes the assumption of determinism of value assignments that is present in standard definitions of contextuality. This breaks the interpretation of nonlocality as a special case of contextuality, and does not treat irreducible randomness as nonclassical. Nevertheless, it recovers the usual notion of contextuality when outcome determinism is imposed.

Spekkens' contextuality can be motivated using Leibniz's law of the identity of indiscernibles. The law applied to physical systems in this framework mirrors the extended definition of noncontextuality. This was further explored by Simmons et al., who demonstrated that other notions of contextuality could also be motivated by Leibnizian principles, and could be thought of as tools enabling ontological conclusions from operational statistics.
Other frameworks and extensions
- A form of contextuality that may be present in the dynamics of a quantum system was introduced by Shane Mansfield and Elham Kashefi, and has been shown to relate to computational quantum advantages. As a notion of contextuality that applies to transformations it is inequivalent to that of Spekkens. Examples explored to date rely on additional memory constraints which have a more computational than foundational motivation. Contextuality may be traded off against Landauer erasure to obtain equivalent advantages.
Fine–Abramsky–Brandenburger theorem
Fine's theorem establishes that, in Bell-type scenarios, a local hidden-variable model exists if and only if the measurement statistics admit a joint probability distribution over the outcomes of all observables. Abramsky and Brandenburger generalised this result to arbitrary measurement scenarios, showing that an empirical model admits a noncontextual hidden-variable explanation precisely when it admits such a global distribution; in this sense Bell nonlocality arises as a special case of contextuality.
Measures of contextuality
Contextual fraction
A number of methods exist for quantifying contextuality. One approach is to measure the degree to which some particular noncontextuality inequality is violated, e.g. the KCBS inequality, the Yu–Oh inequality, or some Bell inequality. A more general measure of contextuality is the contextual fraction.

Given a set of measurement statistics $e$, consisting of a probability distribution over joint outcomes for each measurement context, we may consider factoring $e$ into a noncontextual part $e^{NC}$ and some remainder $e'$,

$$e = \lambda e^{NC} + (1 - \lambda) e'.$$

The maximum value of $\lambda$ over all such decompositions is the noncontextual fraction of $e$, denoted $NCF(e)$, while the remainder $CF(e) = 1 - NCF(e)$ is the contextual fraction of $e$. The idea is that we look for a noncontextual explanation for the highest possible fraction of the data, and what is left over is the irreducibly contextual part. Indeed, for any such decomposition that maximises $\lambda$, the leftover $e'$ is strongly contextual.
It has also been proved that CF is an upper bound on the extent to which e violates any normalised noncontextuality inequality. Here normalisation means that violations are expressed as fractions of the algebraic maximum violation of the inequality. Moreover, the dual linear program to that which maximises λ computes a noncontextual inequality for which this violation is attained. In this sense the contextual fraction is a more neutral measure of contextuality, since it optimises over all possible noncontextual inequalities rather than checking the statistics against one inequality in particular.
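A minimal sketch of computing the contextual fraction by linear programming, taking the CHSH-type scenario and the Popescu–Rohrlich box as input statistics: the noncontextual fraction is the largest total weight of a sub-probability mixture of deterministic global assignments dominated componentwise by e. The use of scipy here is an illustrative assumption, not part of the original formulation.

```python
# Contextual fraction via the primal LP: maximise the total weight of
# deterministic global assignments whose mixture is dominated by e.
from itertools import product
import numpy as np
from scipy.optimize import linprog

outcomes = [0, 1]
contexts = list(product([0, 1], [0, 1]))  # settings (a, b)

def pr_box(a, b, x, y):
    # PR box statistics: x XOR y = a AND b, uniformly random otherwise
    return 0.5 if (x ^ y) == (a & b) else 0.0

e = np.array([pr_box(a, b, x, y)
              for (a, b) in contexts for x in outcomes for y in outcomes])

# deterministic global assignments g = (a0, a1, b0, b1)
assignments = list(product(outcomes, repeat=4))
A = np.array([[1.0 if (x == g[a] and y == g[2 + b]) else 0.0
               for g in assignments]
              for (a, b) in contexts for x in outcomes for y in outcomes])

# maximise sum(c) subject to A @ c <= e, c >= 0
res = linprog(c=-np.ones(len(assignments)), A_ub=A, b_ub=e, bounds=(0, None))
ncf = -res.fun
print("NCF =", ncf, " CF =", 1 - ncf)  # PR box: NCF = 0, CF = 1
```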
Measures of (non)contextuality within the Contextuality-by-Default (CbD) framework
Several measures of the degree of contextuality in contextual systems have been proposed within the CbD framework, but only one of them, denoted CNT2, has been shown to naturally extend into a measure of noncontextuality in noncontextual systems, NCNT2. This is important because, at least in the non-physical applications of CbD, contextuality and noncontextuality are of equal interest. Both CNT2 and NCNT2 are defined as the L1-distance between a probability vector representing a system and the surface of the noncontextuality polytope representing all possible noncontextual systems with the same single-variable marginals. For cyclic systems of dichotomous random variables, closed-form expressions are known for CNT2 when the system is contextual and for NCNT2 when it is noncontextual, the latter involving the L1-distance from the vector to the surface of the box circumscribing the noncontextuality polytope. More generally, NCNT2 and CNT2 are computed by means of linear programming. The same is true for other CbD-based measures of contextuality. One of them, denoted CNT3, uses the notion of a quasi-coupling, which differs from a coupling in that the probabilities in the joint distribution of its values are replaced with arbitrary reals. The class of quasi-couplings maximizing the probabilities $\Pr[S_q^c = S_q^{c'}]$ is always nonempty, and the minimal total variation of the signed measure in this class is a natural measure of contextuality.
Contextuality as a resource for quantum computing
Recently, quantum contextuality has been investigated as a source of quantum advantage and computational speedups in quantum computing.

Magic state distillation
Magic state distillation is a scheme for quantum computing in which quantum circuits constructed only of Clifford operators, which by themselves are fault-tolerant but efficiently classically simulable, are injected with certain "magic" states that promote the computational power to universal fault-tolerant quantum computing. In 2014, Mark Howard et al. showed that contextuality characterises magic states for qudits of odd prime dimension and for qubits with real wavefunctions. Extensions to the qubit case have been investigated by Juani Bermejo-Vega et al. This line of research builds on earlier work by Ernesto Galvão, which showed that Wigner function negativity is necessary for a state to be "magic"; it later emerged that Wigner negativity and contextuality are in a sense equivalent notions of nonclassicality.

Measurement-based quantum computing
Measurement-based quantum computing (MBQC) is a model for quantum computing in which a classical control computer interacts with a quantum system by specifying measurements to be performed and receiving measurement outcomes in return. The measurement statistics for the quantum system may or may not exhibit contextuality. A variety of results have shown that the presence of contextuality enhances the computational power of an MBQC.

In particular, researchers have considered an artificial situation in which the power of the classical control computer is restricted to only being able to compute linear Boolean functions, i.e. to solve problems in the Parity L complexity class ⊕L. For interactions with multi-qubit quantum systems a natural assumption is that each step of the interaction consists of a binary choice of measurement which in turn returns a binary outcome. An MBQC of this restricted kind is known as an l2-MBQC.
Anders and Browne
In 2009, Janet Anders and Dan Browne showed that two specific examples of nonlocality and contextuality were sufficient to compute a non-linear function. This in turn could be used to boost computational power to that of a universal classical computer, i.e. to solve problems in the complexity class P. This is sometimes referred to as measurement-based classical computation. The specific examples made use of the Greenberger–Horne–Zeilinger nonlocality proof and the supra-quantum Popescu–Rohrlich box.
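A minimal sketch of the Popescu–Rohrlich-box half of this result: an XOR-only (⊕L) control computer can compute the non-linear AND function by feeding its inputs to the box and XOR-ing the returned outcomes, since the box guarantees x ⊕ y = a·b.

```python
# A parity-limited control computer plus a PR box computes AND:
# the box returns outcomes x, y with x XOR y = a AND b.
import random

def pr_box(a, b):
    x = random.randint(0, 1)  # locally random outcome
    y = x ^ (a & b)           # correlated so that x ^ y == a & b
    return x, y

for a in (0, 1):
    for b in (0, 1):
        x, y = pr_box(a, b)
        assert (x ^ y) == (a & b)   # XOR of outcomes equals AND of inputs
        print(a, b, "->", x ^ y)
```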
Raussendorf
In 2013, Robert Raussendorf showed more generally that access to strongly contextual measurement statistics is necessary for an l2-MBQC to deterministically compute a non-linear function. He also showed that computing non-linear Boolean functions with sufficiently high probability requires contextuality.
Abramsky, Barbosa and Mansfield
A further generalisation and refinement of these results due to Samson Abramsky, Rui Soares Barbosa and Shane Mansfield appeared in 2017, proving a precise quantifiable relationship between the probability of successfully computing any given non-linear function and the degree of contextuality present in the l2-MBQC as measured by the contextual fraction. Specifically,

$$1 - p_s \ge NCF(e) \cdot \nu(f),$$

where $p_s$, $CF(e) = 1 - NCF(e)$, and $\nu(f)$ are the probability of success, the contextual fraction of the measurement statistics $e$, and a measure of the non-linearity of the function $f$ to be computed, respectively.
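A minimal sketch of the non-linearity measure appearing in the inequality, here taken (as an illustrative assumption) to be the minimum fractional Hamming distance from f to any affine XOR-based Boolean function, evaluated for the two-bit AND function:

```python
# nu(f): minimum fractional Hamming distance from f to any affine Boolean
# function c0 ^ (c1 & x1) ^ ... ^ (cn & xn).
from itertools import product

def nu(f, n):
    pts = list(product([0, 1], repeat=n))
    dists = []
    for c in product([0, 1], repeat=n + 1):
        def g(xs, c=c):
            val = c[0]
            for ci, xi in zip(c[1:], xs):
                val ^= ci & xi
            return val
        dists.append(sum(f(xs) != g(xs) for xs in pts) / len(pts))
    return min(dists)

AND = lambda xs: xs[0] & xs[1]
print("nu(AND) =", nu(AND, 2))  # 0.25: failure probability at least NCF(e)/4
```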
Further examples
- The above inequality was also shown to relate quantum advantage in non-local games to the degree of contextuality required by the strategy and an appropriate measure of the difficulty of the game.
- Similarly, the inequality arises in a transformation-based model of quantum computation analogous to l2-MBQC, where it relates the degree of sequential contextuality present in the dynamics of the quantum system to the probability of success and the degree of non-linearity of the target function.
- Preparation contextuality has been shown to enable quantum advantages in cryptographic random-access codes and in state-discrimination tasks.
- In classical simulations of quantum systems, contextuality has been shown to incur memory costs.