Quantum contextuality
Quantum contextuality is a feature of the phenomenology of quantum mechanics whereby measurements of quantum observables cannot simply be thought of as revealing pre-existing values. Any attempt to do so in a realistic hidden-variable theory leads to values that are dependent upon the choice of the other (compatible) observables which are simultaneously measured (the measurement context). More formally, the measurement result (assumed pre-existing) of a quantum observable is dependent upon which other commuting observables are within the same measurement set.

Contextuality was first demonstrated to be a feature of quantum phenomenology by the Bell–Kochen–Specker theorem. The study of contextuality has developed into a major topic of interest in quantum foundations, as the phenomenon crystallises certain non-classical and counter-intuitive aspects of quantum theory. A number of powerful mathematical frameworks have been developed to study and better understand contextuality, from the perspectives of sheaf theory, graph theory, hypergraphs, algebraic topology, and probabilistic couplings.

Nonlocality, in the sense of Bell's theorem, may be viewed as a special case of the more general phenomenon of contextuality, in which measurement contexts contain measurements that are distributed over spacelike-separated regions. This follows from Fine's theorem.

Quantum contextuality has been identified as a source of quantum computational speedups and quantum advantage in quantum computing. Contemporary research has increasingly focused on exploring its utility as a computational resource.


Kochen and Specker

The need for contextuality was discussed informally in 1935 by Grete Hermann, but it was more than 30 years later when Simon B. Kochen and Ernst Specker, and separately John Bell, constructed proofs that any realistic hidden-variable theory able to explain the phenomenology of quantum mechanics is contextual for systems of Hilbert space dimension three and greater. The Kochen–Specker theorem proves that realistic noncontextual hidden-variable theories cannot reproduce the empirical predictions of quantum mechanics. Such a theory would suppose the following.

1. All quantum-mechanical observables may be simultaneously assigned definite values (this is the realism postulate, which is false in standard quantum mechanics, since there are observables which are indefinite in every given quantum state). These global value assignments may deterministically depend on some 'hidden' classical variable which, in turn, may vary stochastically for some classical reason (as in statistical mechanics). The measured assignments of observables may therefore finally stochastically change. This stochasticity is, however, epistemic and not ontic as in the standard formulation of quantum mechanics.
2. Value assignments pre-exist and are independent of the choice of any other observables which, in standard quantum mechanics, are described as commuting with the measured observable and are measured along with it.
3. Some functional constraints on the assignments of values for compatible observables are assumed (e.g., they are additive and multiplicative; there are, however, several versions of this functional requirement).

In addition, Kochen and Specker constructed an explicitly noncontextual hidden-variable model for the two-dimensional qubit case in their paper on the subject [S. Kochen and E.P. Specker, "The problem of hidden variables in quantum mechanics", ''Journal of Mathematics and Mechanics'' 17, 59–87 (1967)], thereby completing the characterisation of the dimensionality of quantum systems that can demonstrate contextual behaviour. Bell's proof invoked a weaker version of Gleason's theorem, reinterpreting the theorem to show that quantum contextuality exists only in Hilbert space dimension greater than two [Gleason, A. M., "Measures on the closed subspaces of a Hilbert space", ''Journal of Mathematics and Mechanics'' 6, 885–893 (1957)].


Frameworks for contextuality


Sheaf-theoretic framework

The sheaf-theoretic, or Abramsky–Brandenburger, approach to contextuality initiated by Samson Abramsky and Adam Brandenburger is theory-independent and can be applied beyond quantum theory to any situation in which empirical data arises in contexts. As well as being used to study forms of contextuality arising in quantum theory and other physical theories, it has also been used to study formally equivalent phenomena in logic, relational databases, natural language processing, and constraint satisfaction.

In essence, contextuality arises when empirical data is ''locally consistent but globally inconsistent''. This framework gives rise in a natural way to a qualitative hierarchy of contextuality.

* (Probabilistic) contextuality may be witnessed in measurement statistics, e.g. by the violation of an inequality. A representative example is the KCBS proof of contextuality.
* Logical contextuality may be witnessed in the 'possibilistic' information about which outcome events are possible and which are not. A representative example is Hardy's proof of nonlocality.
* Strong contextuality is a maximal form of contextuality. Whereas (probabilistic) contextuality arises when measurement statistics cannot be reproduced by a mixture of global value assignments, strong contextuality arises when no global value assignment is even compatible with the possible outcome events. A representative example is the original Kochen–Specker proof of contextuality.

Each level in this hierarchy strictly includes the next. An important intermediate level that lies strictly between the logical and strong contextuality classes is all-versus-nothing contextuality, a representative example of which is the Greenberger–Horne–Zeilinger proof of nonlocality.


Graph and hypergraph frameworks

Adán Cabello, Simone Severini, and Andreas Winter introduced a general graph-theoretic framework for studying contextuality in different physical theories. Within this framework experimental scenarios are described by graphs, and certain invariants of these graphs were shown to have particular physical significance. One way in which contextuality may be witnessed in measurement statistics is through the violation of noncontextuality inequalities (also known as generalized Bell inequalities). With respect to certain appropriately normalised inequalities, the independence number, Lovász number, and fractional packing number of the graph of an experimental scenario provide tight upper bounds on the degree to which classical theories, quantum theory, and generalised probabilistic theories, respectively, may exhibit contextuality in an experiment of that kind. A more refined framework based on hypergraphs rather than graphs is also used.
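For a concrete instance of these graph invariants, the exclusivity graph of the KCBS scenario is the 5-cycle, for which the independence number (the classical bound) can be computed by brute force and the Lovász number (the quantum bound) has a known closed form for odd cycles, equal to √5 for the pentagon. A minimal sketch:

```python
from itertools import combinations
from math import cos, pi, sqrt

# Exclusivity graph of the KCBS scenario: the 5-cycle C5.
n = 5
edges = {frozenset((i, (i + 1) % n)) for i in range(n)}

def is_independent(vertices):
    return all(frozenset(pair) not in edges
               for pair in combinations(vertices, 2))

# Independence number alpha(C5): the classical (noncontextual) bound
# of the appropriately normalised KCBS inequality.
alpha = max(len(s) for r in range(n + 1)
            for s in combinations(range(n), r) if is_independent(s))

# Lovász number theta(C_n) for odd n, in closed form: the quantum bound.
theta = n * cos(pi / n) / (1 + cos(pi / n))

print(alpha)                         # 2
print(abs(theta - sqrt(5)) < 1e-12)  # True
```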


Contextuality-by-Default (CbD) framework

In the CbD approach, developed by Ehtibar Dzhafarov, Janne Kujala, and colleagues, (non)contextuality is treated as a property of any ''system of random variables'', defined as a set \mathcal{R}=\left\{ R_q^c : c \in C, q \in Q, q \prec c \right\}, in which each random variable R_q^c is labeled by its ''content'' q, the property it measures, and its ''context'' c, the set of recorded circumstances under which it is recorded (including but not limited to which other random variables it is recorded together with); q \prec c stands for "q is measured in c". The variables within a context are jointly distributed, but variables from different contexts are ''stochastically unrelated'', defined on different sample spaces. A ''(probabilistic) coupling'' of the system \mathcal{R} is defined as a system S in which all variables are jointly distributed and, in any context c, R^c = \left\{ R_q^c : q \prec c \right\} and S^c = \left\{ S_q^c : q \prec c \right\} are identically distributed. The system \mathcal{R} is considered noncontextual if it has a coupling S such that the probabilities \Pr\left[ S_q^c = S_q^{c'} \right] are maximal possible for all contexts c, c' and contents q such that q \prec c and q \prec c'. If such a coupling does not exist, the system is contextual. For the important class of ''cyclic systems'' of dichotomous (\pm 1) random variables, \mathcal{C}_n = \left\{ R_i^i, R_{i \oplus 1}^i : i = 1, \ldots, n \right\} (n \geq 2, with \oplus denoting the cyclic shift, n \oplus 1 = 1), it has been shown that such a system is noncontextual if and only if D\left(\mathcal{C}_n\right) \leq \Delta\left(\mathcal{C}_n\right), where \Delta\left(\mathcal{C}_n\right) = (n-2) + \left|\left\langle R_1^1\right\rangle - \left\langle R_1^n\right\rangle\right| + \left|\left\langle R_2^2\right\rangle - \left\langle R_2^1\right\rangle\right| + \ldots + \left|\left\langle R_n^n\right\rangle - \left\langle R_n^{n-1}\right\rangle\right| and D\left(\mathcal{C}_n\right) = \max\left(\lambda_1 \left\langle R_1^1 R_2^1 \right\rangle + \lambda_2 \left\langle R_2^2 R_3^2 \right\rangle + \ldots + \lambda_n \left\langle R_n^n R_1^n \right\rangle\right), with the maximum taken over all \lambda_i = \pm 1 whose product is -1. If R_q^c and R_q^{c'}, measuring the same content in different contexts, are always identically distributed, the system is called ''consistently connected'' (satisfying the "no-disturbance" or "no-signaling" principle).
Except for certain logical issues, in this case CbD specializes to traditional treatments of contextuality in quantum physics. In particular, for consistently connected cyclic systems the noncontextuality criterion above reduces to D\left(\mathcal{C}_n\right) \leq n-2, which includes the Bell/CHSH inequality (n=4), the KCBS inequality (n=5), and other famous inequalities. That nonlocality is a special case of contextuality follows in CbD from the fact that being jointly distributed for random variables is equivalent to being measurable functions of one and the same random variable (this generalizes Arthur Fine's analysis of Bell's theorem). CbD essentially coincides with the probabilistic part of Abramsky's sheaf-theoretic approach if the system is ''strongly consistently connected'', which means that the joint distributions of \left\{ R_{q_1}^c, \ldots, R_{q_k}^c \right\} and \left\{ R_{q_1}^{c'}, \ldots, R_{q_k}^{c'} \right\} coincide whenever the contents q_1, \ldots, q_k are measured in both contexts c and c'. However, unlike most approaches to contextuality, CbD allows for ''inconsistent connectedness'', with R_q^c and R_q^{c'} differently distributed. This makes CbD applicable to physics experiments in which the no-disturbance condition is violated, as well as to human behavior, where this condition is violated as a rule. In particular, Víctor Cervantes, Ehtibar Dzhafarov, and colleagues have demonstrated that random variables describing certain paradigms of simple decision making form contextual systems, whereas many other decision-making systems are noncontextual once their inconsistent connectedness is properly taken into account.
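As an illustrative sketch (the function name and interface are ours, assuming the cyclic-system criterion above), one can evaluate D and Δ numerically; for a consistently connected CHSH-type system (n = 4) at the Tsirelson bound, D − Δ = 2√2 − 2 > 0, so the system is contextual.

```python
from itertools import product
from math import sqrt

def contextuality_measure(corr, means_1, means_2):
    """D - Delta for a cyclic system of rank n; positive means contextual.

    corr:   the n correlations <R_i^i R_{i+1}^i>
    means_1, means_2: the two expectations of each content in the two
    contexts containing it (their discrepancy quantifies inconsistent
    connectedness).
    """
    n = len(corr)
    # D: maximum of sum(lambda_i * corr_i) over sign patterns whose
    # product is -1, i.e. with an odd number of -1 entries.
    D = max(sum(l * c for l, c in zip(lam, corr))
            for lam in product((-1, 1), repeat=n)
            if lam.count(-1) % 2 == 1)
    # Delta: (n - 2) plus the total discrepancy between the two
    # distributions of each content.
    Delta = (n - 2) + sum(abs(a - b) for a, b in zip(means_1, means_2))
    return D - Delta

# CHSH-type cyclic system (n = 4) at the Tsirelson bound, consistently
# connected (all single-variable expectations agree across contexts).
corr = [1 / sqrt(2)] * 3 + [-1 / sqrt(2)]
zeros = [0.0] * 4
print(contextuality_measure(corr, zeros, zeros))  # 2*sqrt(2) - 2, approx 0.828
```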


Operational framework

An extended notion of contextuality due to Robert Spekkens applies to preparations and transformations as well as to measurements, within a general framework of operational physical theories. With respect to measurements, it removes the assumption of determinism of value assignments that is present in standard definitions of contextuality. This breaks the interpretation of nonlocality as a special case of contextuality, and does not treat irreducible randomness as nonclassical. Nevertheless, it recovers the usual notion of contextuality when outcome determinism is imposed. Spekkens' contextuality can be motivated using Leibniz's law of the identity of indiscernibles. The law applied to physical systems in this framework mirrors the extended definition of noncontextuality. This was further explored by Simmons ''et al.'', who demonstrated that other notions of contextuality could also be motivated by Leibnizian principles, and could be thought of as tools enabling ontological conclusions from operational statistics.


Extracontextuality and extravalence

Given a pure quantum state |\psi\rangle, Born's rule tells us that the probability to obtain another state |\phi\rangle in a measurement is |\langle\phi|\psi\rangle|^2. However, such a number does not define a full probability distribution, i.e. values over a set of mutually exclusive events summing up to 1. In order to obtain such a set one needs to specify a context, that is, a complete set of commuting operators (CSCO), or equivalently a set of N orthogonal projectors |\phi_n\rangle\langle\phi_n| that sum to the identity, where N is the dimension of the Hilbert space. Then one has \sum_n |\langle\phi_n|\psi\rangle|^2 = 1 as expected. In that sense, one can say that a state vector |\psi\rangle alone is predictively incomplete as long as a context has not been specified. The actual physical state, now defined by |\phi_n\rangle within a specified context, has been called a modality by Auffèves and Grangier. Since it is clear that |\psi\rangle alone does not define a modality, what is its status? If N \geq 3, one sees easily that |\psi\rangle is associated with an equivalence class of modalities, belonging to different contexts but connected between themselves with certainty, even if the different CSCO observables do not commute. This equivalence class is called an extravalence class, and the associated transfer of certainty between contexts is called extracontextuality. As a simple example, the usual singlet state for two spins 1/2 can be found in the (non-commuting) CSCOs associated with the measurement of the total spin (with S=0, m=0) or with a Bell measurement, and actually it appears in infinitely many different CSCOs, but obviously not in all possible ones.

The concepts of extravalence and extracontextuality are very useful for spelling out the role of contextuality in quantum mechanics, which is neither noncontextual (as classical physics would be) nor fully contextual, since modalities belonging to incompatible (non-commuting) contexts may be connected with certainty. Taking extracontextuality as a postulate, the fact that certainty can be transferred between contexts, and is then associated with a given projector, is the very basis of the hypotheses of Gleason's theorem, and thus of Born's rule. Also, associating a state vector with an extravalence class clarifies its status as a mathematical tool for calculating probabilities connecting modalities, which correspond to the actual observed physical events or results.
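The completeness of the Born probabilities within any fixed context can be checked numerically; in this minimal sketch (our own illustration) the context is an arbitrary orthonormal basis obtained from a QR decomposition of a random complex matrix.

```python
import numpy as np

rng = np.random.default_rng(0)

# A random pure state |psi> in a Hilbert space of dimension N = 3.
N = 3
psi = rng.normal(size=N) + 1j * rng.normal(size=N)
psi /= np.linalg.norm(psi)

# A context: any orthonormal basis {|phi_n>}, here the Q factor of a
# QR decomposition of a random complex matrix.
basis, _ = np.linalg.qr(rng.normal(size=(N, N)) + 1j * rng.normal(size=(N, N)))

# The Born probabilities |<phi_n|psi>|^2 form a full probability
# distribution only once the context is fixed; they sum to 1 for
# every choice of orthonormal basis.
probs = np.abs(basis.conj().T @ psi) ** 2
print(np.isclose(probs.sum(), 1.0))  # True
```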


Other frameworks and extensions

A form of contextuality that may be present in the dynamics of a quantum system was introduced by Shane Mansfield and Elham Kashefi, and has been shown to relate to computational quantum advantages. As a notion of contextuality that applies to transformations it is inequivalent to that of Spekkens. Examples explored to date rely on additional memory constraints, which have a more computational than foundational motivation. Contextuality may be traded off against Landauer erasure to obtain equivalent advantages.


Fine's theorem

The Kochen–Specker theorem proves that quantum mechanics is incompatible with realistic noncontextual hidden-variable models. On the other hand, Bell's theorem proves that quantum mechanics is incompatible with factorisable hidden-variable models in an experiment in which measurements are performed at distinct spacelike-separated locations. Arthur Fine showed that in the experimental scenario in which the famous CHSH inequalities and proof of nonlocality apply, a factorisable hidden-variable model exists if and only if a noncontextual hidden-variable model exists. This equivalence was proven to hold more generally in any experimental scenario by Samson Abramsky and Adam Brandenburger. It is for this reason that we may consider nonlocality to be a special case of contextuality.


Measures of contextuality


Contextual fraction

A number of methods exist for quantifying contextuality. One approach is to measure the degree to which some particular noncontextuality inequality is violated, e.g. the KCBS inequality, the Yu–Oh inequality, or some Bell inequality. A more general measure of contextuality is the contextual fraction. Given a set of measurement statistics ''e'', consisting of a probability distribution over joint outcomes for each measurement context, we may consider factoring ''e'' into a noncontextual part ''e''^{NC} and some remainder ''e''',

e = \lambda e^{NC} + (1-\lambda)e' \, .

The maximum value of \lambda over all such decompositions is the noncontextual fraction of ''e'', denoted NCF(''e''), while the remainder CF(''e'') = 1 - NCF(''e'') is the contextual fraction of ''e''. The idea is that we look for a noncontextual explanation for the highest possible fraction of the data, and what is left over is the irreducibly contextual part. Indeed, for any such decomposition that maximises \lambda, the leftover ''e''' is known to be strongly contextual. This measure of contextuality takes values in the interval [0,1], where 0 corresponds to noncontextuality and 1 corresponds to strong contextuality. The contextual fraction may be computed using linear programming. It has also been proved that CF(''e'') is an upper bound on the extent to which ''e'' violates ''any'' normalised noncontextuality inequality. Here normalisation means that violations are expressed as fractions of the algebraic maximum violation of the inequality. Moreover, the dual linear program to that which maximises \lambda computes a noncontextual inequality for which this violation is attained. In this sense the contextual fraction is a more neutral measure of contextuality, since it optimises over all possible noncontextual inequalities rather than checking the statistics against one inequality in particular.
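As a sketch of the linear-programming computation (the scenario and variable names are illustrative), the noncontextual fraction is the maximal total weight of a sub-probability mixture of deterministic global assignments dominated entrywise by the empirical model; for the strongly contextual Popescu–Rohrlich box this optimum is 0, so CF = 1.

```python
from itertools import product
from scipy.optimize import linprog

# Empirical model e: the Popescu–Rohrlich box, P(a,b|x,y) = 1/2 iff a^b = x&y.
contexts = [(x, y) for x in (0, 1) for y in (0, 1)]
def e(x, y, a, b):
    return 0.5 if a ^ b == (x & y) else 0.0

# Deterministic global assignments (a0, a1, b0, b1).
assignments = list(product((0, 1), repeat=4))

# LP: maximise the total weight of a sub-probability mixture of
# deterministic assignments dominated by e; the optimum is NCF(e).
rows, rhs = [], []
for (x, y), (a, b) in product(contexts, product((0, 1), repeat=2)):
    rows.append([1.0 if (g[x], g[2 + y]) == (a, b) else 0.0
                 for g in assignments])
    rhs.append(e(x, y, a, b))
res = linprog(c=[-1.0] * len(assignments),  # minimise -(sum of weights)
              A_ub=rows, b_ub=rhs, bounds=(0, None))
ncf = -res.fun
print(round(1 - ncf, 6))  # contextual fraction of the PR box: 1.0
```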


Measures of (non)contextuality within the Contextuality-by-Default (CbD) framework

Several measures of the degree of contextuality in contextual systems have been proposed within the CbD framework, but only one of them, denoted CNT_2, has been shown to naturally extend into a measure of noncontextuality in noncontextual systems, NCNT_2. This is important because, at least in the non-physical applications of CbD, contextuality and noncontextuality are of equal interest. Both CNT_2 and NCNT_2 are defined as the L_1-distance between a probability vector \mathbf{p} representing a system and the surface of the ''noncontextuality polytope'' \mathbb{P} representing all possible noncontextual systems with the same single-variable marginals. For cyclic systems of dichotomous random variables, it has been shown that if the system is contextual (i.e., D\left(\mathcal{C}_n\right) > \Delta\left(\mathcal{C}_n\right)), then \mathrm{CNT}_2 = D\left(\mathcal{C}_n\right) - \Delta\left(\mathcal{C}_n\right), and if it is noncontextual (D\left(\mathcal{C}_n\right) \leq \Delta\left(\mathcal{C}_n\right)), then \mathrm{NCNT}_2 = \min\left(\Delta\left(\mathcal{C}_n\right) - D\left(\mathcal{C}_n\right), m\left(\mathcal{C}_n\right)\right), where m\left(\mathcal{C}_n\right) is the L_1-distance from the vector \mathbf{p} \in \mathbb{P} to the surface of the box circumscribing the noncontextuality polytope. More generally, NCNT_2 and CNT_2 are computed by means of linear programming. The same is true for other CbD-based measures of contextuality. One of them, denoted CNT_3, uses the notion of a ''quasi-coupling'', which differs from a coupling in that the probabilities in the joint distribution of its values are replaced with arbitrary reals (allowed to be negative but summing to 1). The class of quasi-couplings S maximizing the probabilities \Pr\left[ S_q^c = S_q^{c'} \right] is always nonempty, and the minimal total variation of the signed measure in this class is a natural measure of contextuality.


Contextuality as a resource for quantum computing

Recently, quantum contextuality has been investigated as a source of quantum advantage and computational speedups in quantum computing.


Magic state distillation

Magic state distillation is a scheme for quantum computing in which quantum circuits constructed only of Clifford operators, which by themselves are fault-tolerant but efficiently classically simulable, are injected with certain "magic" states that promote the computational power to universal fault-tolerant quantum computing. In 2014, Mark Howard ''et al.'' showed that contextuality characterizes magic states for qudits of odd prime dimension and for qubits with real wavefunctions. Extensions to the qubit case have been investigated by Juani Bermejo-Vega ''et al.'' This line of research builds on earlier work by Ernesto Galvão, which showed that Wigner function negativity is necessary for a state to be "magic"; it later emerged that Wigner negativity and contextuality are in a sense equivalent notions of nonclassicality.


Measurement-based quantum computing

Measurement-based quantum computation (MBQC) is a model for quantum computing in which a classical control computer interacts with a quantum system by specifying measurements to be performed and receiving measurement outcomes in return. The measurement statistics for the quantum system may or may not exhibit contextuality. A variety of results have shown that the presence of contextuality enhances the computational power of an MBQC. In particular, researchers have considered an artificial situation in which the power of the classical control computer is restricted to computing only linear Boolean functions, i.e. to solving problems in the complexity class Parity-L (⊕L). For interactions with multi-qubit quantum systems a natural assumption is that each step of the interaction consists of a binary choice of measurement which in turn returns a binary outcome. An MBQC of this restricted kind is known as an ''l2''-MBQC.


Anders and Browne

In 2009, Janet Anders and Dan Browne showed that two specific examples of nonlocality and contextuality were sufficient to compute a non-linear function. This in turn could be used to boost computational power to that of a universal classical computer, i.e. to solve problems in the complexity class P. This is sometimes referred to as measurement-based classical computation. The specific examples made use of the Greenberger–Horne–Zeilinger nonlocality proof and the supra-quantum Popescu–Rohrlich box.
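The Popescu–Rohrlich half of their construction can be sketched as follows (our own illustration of the idea): a control computer restricted to XOR (parity) post-processing deterministically computes the non-linear AND function when given access to a PR box, simulated here directly from its defining correlation.

```python
import random

def pr_box(x, y):
    """Simulate a (supra-quantum) Popescu–Rohrlich box: the outcomes
    satisfy a XOR b = x AND y, with uniformly random marginals."""
    a = random.randint(0, 1)
    return a, a ^ (x & y)

def and_via_pr_box(x, y):
    """A parity-limited control computer: it may only XOR the box's
    outcomes, yet this computes the non-linear AND function."""
    a, b = pr_box(x, y)
    return a ^ b

# Deterministic success on all inputs, boosting XOR-only (⊕L-style)
# post-processing to a non-linear computation.
print(all(and_via_pr_box(x, y) == (x & y)
          for x in (0, 1) for y in (0, 1)))  # True
```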


Raussendorf

In 2013, Robert Raussendorf showed more generally that access to ''strongly contextual'' measurement statistics is necessary and sufficient for an ''l2''-MBQC to compute a non-linear function. He also showed that to compute non-linear Boolean functions with sufficiently high probability requires contextuality.


Abramsky, Barbosa and Mansfield

A further generalization and refinement of these results due to Samson Abramsky, Rui Soares Barbosa and Shane Mansfield appeared in 2017, proving a precise quantifiable relationship between the probability of successfully computing any given non-linear function and the degree of contextuality present in the ''l2''-MBQC as measured by the contextual fraction. Specifically,

(1-p_s) \geq \left( 1-CF(e) \right) \cdot \nu(f),

where p_s, CF(e), \nu(f) \in [0,1] are the probability of success, the contextual fraction of the measurement statistics ''e'', and a measure of the non-linearity of the function f to be computed, respectively.
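As a numerical illustration of the bound (taking ν(AND) = 1/4, its normalised distance to the closest linear function under uniform inputs, as an assumed value), noncontextual resources cannot exceed the classical success probability 3/4 for AND, while a strongly contextual resource allows success probability 1.

```python
# Rearranging (1 - p_s) >= (1 - CF(e)) * nu(f) gives an upper bound
# on the success probability attainable with contextual fraction cf.
def max_success_probability(cf, nu):
    return 1 - (1 - cf) * nu

NU_AND = 0.25  # assumed: normalised distance of AND to the closest linear function

print(max_success_probability(0.0, NU_AND))  # 0.75 (noncontextual bound)
print(max_success_probability(1.0, NU_AND))  # 1.0  (strong contextuality)
```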


Further examples

* The above inequality was also shown to relate quantum advantage in non-local games to the degree of contextuality required by the strategy and an appropriate measure of the difficulty of the game.
* Similarly, the inequality arises in a transformation-based model of quantum computation analogous to ''l2''-MBQC, where it relates the degree of sequential contextuality present in the dynamics of the quantum system to the probability of success and the degree of non-linearity of the target function.
* Preparation contextuality has been shown to enable quantum advantages in cryptographic random-access codes and in state-discrimination tasks.
* In classical simulations of quantum systems, contextuality has been shown to incur memory costs. [Kleinmann, M., Gühne, O., Portillo, J. R., Larsson, J.-Å., and Cabello, A., "Memory cost of quantum contextuality", ''New Journal of Physics'' 13, 113011 (2011)]


See also

* Kochen–Specker theorem
* Mermin–Peres square
* KCBS pentagram
* Quantum nonlocality
* Quantum foundations
* Quantum indeterminacy

