A generalized probabilistic theory (GPT) is a general framework to describe the operational features of arbitrary physical theories. A GPT must specify what kind of physical systems one can find in the lab, as well as rules to compute the outcome statistics of any experiment involving labeled preparations, transformations and measurements. The framework of GPTs has been used to define hypothetical non-quantum physical theories which nonetheless possess quantum theory's most remarkable features, such as entanglement or teleportation. Notably, a small set of physically motivated axioms is enough to single out the GPT representation of quantum theory.
The mathematical formalism of GPTs has been developed since the 1950s and 1960s by many authors, and rediscovered independently several times. The earliest ideas are due to Segal and Mackey, although the first comprehensive and mathematically rigorous treatment can be traced back to the work of Ludwig, Dähn, and Stolz, all three based at the University of Marburg.
While the formalism in these earlier works is less similar to the modern one, by the early 1970s the ideas of the Marburg school had matured and the notation had developed towards the modern usage, thanks also to the independent contribution of Davies and Lewis.
The books by Ludwig and the proceedings of a conference held in Marburg in 1973 offer a comprehensive account of these early developments.
The term "generalized probabilistic theory" itself was coined by Jonathan Barrett in 2007, based on the version of the framework introduced by Lucien Hardy. Note that some authors use the term ''operational probabilistic theory'' (OPT). OPTs are an alternative way to define hypothetical non-quantum physical theories, based on the language of category theory, in which one specifies the axioms that should be satisfied by observations.
Definition
A GPT is specified by a number of mathematical structures, namely (a schematic sketch follows the list):
* a family of state spaces, each of which represents a physical system;
* a composition rule (usually corresponding to a tensor product), which specifies how joint state spaces are formed;
* a set of measurements, which map states to probabilities and are usually described by an effect algebra;
* a set of possible physical operations, i.e., transformations that map state spaces to state spaces.
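These ingredients can be illustrated with a minimal sketch. The following Python code is a hypothetical encoding of a finite-dimensional GPT (the class and function names are illustrative, not part of any standard library): states are vectors, effects are linear functionals returning probabilities, transformations are matrices acting on states, and systems compose via the tensor (Kronecker) product.

```python
import numpy as np

class GPTSystem:
    """A toy finite-dimensional GPT system: states are vectors in R^n,
    effects are linear functionals giving probabilities in [0, 1]."""

    def __init__(self, states, effects, transformations):
        self.states = states                    # list of allowed state vectors
        self.effects = effects                  # list of effect vectors (functionals)
        self.transformations = transformations  # list of matrices acting on states

    def probability(self, effect, state):
        # An effect assigns an outcome probability to every state.
        return float(np.dot(effect, state))

def compose(system_a, system_b):
    """Joint system built with the tensor-product composition rule
    (only product states/effects/operations are constructed in this sketch)."""
    states = [np.kron(sa, sb) for sa in system_a.states for sb in system_b.states]
    effects = [np.kron(ea, eb) for ea in system_a.effects for eb in system_b.effects]
    transformations = [np.kron(ta, tb) for ta in system_a.transformations
                       for tb in system_b.transformations]
    return GPTSystem(states, effects, transformations)
```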
It can be argued that if one can prepare a state \omega_1 and a different state \omega_2, then one can also toss a (possibly biased) coin which lands on one side with probability \lambda and on the other with probability 1-\lambda, and prepare either \omega_1 or \omega_2, depending on the side the coin lands on. The resulting state is a statistical mixture of the states \omega_1 and \omega_2, and in GPTs such statistical mixtures are described by convex combinations, in this case \lambda \omega_1 + (1-\lambda) \omega_2. For this reason all state spaces are assumed to be convex sets. Following a similar reasoning, one can argue that also the set of measurement outcomes and the set of physical operations must be convex.
Additionally it is always assumed that measurement outcomes and physical operations are affine maps, i.e. that if T is a physical transformation, then we must have T(\lambda \omega_1 + (1-\lambda) \omega_2) = \lambda T(\omega_1) + (1-\lambda) T(\omega_2), and similarly for measurement outcomes. This follows from the argument that we should obtain the same outcome if we first prepare a statistical mixture and then apply the physical operation, or if we prepare a statistical mixture of the outcomes of the physical operations.
Note that physical operations are a subset of all affine maps which transform states into states, as we must require that a physical operation yields a valid state even when it is applied to a part of a system (the notion of "part" is subtle: it is specified by explaining how different system types compose and how the global parameters of the composite system are affected by local operations).
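As a concrete illustration (a minimal sketch, with illustratively chosen numbers), one can check numerically that a stochastic map acting on classical probability vectors is affine: applying it to a mixture gives the same result as mixing the outputs.

```python
import numpy as np

# Two classical states (probability distributions over three outcomes).
omega_1 = np.array([1.0, 0.0, 0.0])
omega_2 = np.array([0.2, 0.5, 0.3])
lam = 0.7  # mixing probability of the (biased) coin

# A physical operation on classical states: a column-stochastic matrix.
T = np.array([[0.9, 0.1, 0.0],
              [0.1, 0.8, 0.5],
              [0.0, 0.1, 0.5]])

mixture_then_map = T @ (lam * omega_1 + (1 - lam) * omega_2)
map_then_mixture = lam * (T @ omega_1) + (1 - lam) * (T @ omega_2)

# Affinity: both orders of operation give the same state.
assert np.allclose(mixture_then_map, map_then_mixture)
```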
For practical reasons it is often assumed that a general GPT is embedded in a finite-dimensional vector space, although infinite-dimensional formulations exist.
Classical, quantum, and beyond
Classical theory is a GPT where states correspond to probability distributions and both measurements and physical operations are stochastic maps. One can see that in this case all state spaces are simplexes.
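For instance (an illustrative sketch), the state space of a classical three-level system is the 2-simplex: every state decomposes as a convex combination of the deterministic states sitting at the simplex vertices, and a measurement outcome is a functional with entries in [0, 1].

```python
import numpy as np

# Deterministic states of a classical three-level system: the simplex vertices.
vertices = np.eye(3)

# A generic state is a probability distribution over the three outcomes.
state = np.array([0.5, 0.3, 0.2])

# Its decomposition into vertices is unique: the weights are the entries themselves.
weights = state
assert np.allclose(sum(w * v for w, v in zip(weights, vertices)), state)

# A measurement outcome ("effect") has entries in [0, 1];
# the outcome probability is the inner product with the state.
effect = np.array([1.0, 1.0, 0.0])   # "was the outcome 0 or 1?"
print(effect @ state)                # 0.8
```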
Standard quantum information theory is a GPT where system types are described by a natural number d which corresponds to the complex Hilbert space dimension. States of the systems of Hilbert space dimension d are described by the normalized positive semidefinite d \times d matrices, i.e. by the density matrices. Measurements are identified with positive operator-valued measures (POVMs), and the physical operations are completely positive maps. Systems compose via the tensor product of the underlying complex Hilbert spaces.
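A minimal sketch of these quantum ingredients (illustrative numbers only): a qubit density matrix, a two-outcome POVM, the Born-rule probabilities, and composition via the tensor product.

```python
import numpy as np

# A qubit state: a normalized positive semidefinite 2x2 matrix (density matrix).
rho = np.array([[0.75, 0.25],
                [0.25, 0.25]], dtype=complex)
assert np.isclose(np.trace(rho).real, 1.0)

# A two-outcome POVM: positive operators summing to the identity.
E0 = np.array([[0.8, 0.0],
               [0.0, 0.3]], dtype=complex)
E1 = np.eye(2) - E0

# Outcome probabilities via the Born rule  p(i) = Tr(rho E_i).
p0 = np.trace(rho @ E0).real
p1 = np.trace(rho @ E1).real
assert np.isclose(p0 + p1, 1.0)

# Two systems compose via the tensor product of the underlying Hilbert spaces,
# so a product state of two qubits is the Kronecker product of density matrices.
rho_joint = np.kron(rho, rho)
assert rho_joint.shape == (4, 4)
```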
Real quantum theory is the GPT which is obtained from standard quantum information theory by restricting the theory to real Hilbert spaces. It does not satisfy the axiom of local tomography.
The framework of GPTs has provided examples of consistent physical theories which cannot be embedded in quantum theory and indeed exhibit very non-quantum features. One of the first was Box-world, the theory with maximal non-local correlations. Other examples are theories with third-order interference and the family of GPTs known as generalized bits.
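As an illustration of Box-world's maximal non-locality (a standard construction, sketched here for orientation): the Popescu-Rohrlich (PR) box outputs bits a, b satisfying a XOR b = x AND y with uniform local marginals, which pushes the CHSH value to its algebraic maximum of 4, beyond the quantum Tsirelson bound of 2√2.

```python
import numpy as np

def pr_box(a, b, x, y):
    """PR-box probability P(a, b | x, y): correlated so that a XOR b = x AND y."""
    return 0.5 if (a ^ b) == (x & y) else 0.0

def correlator(x, y):
    # E(x, y) = sum_{a,b} (-1)^(a XOR b) P(a, b | x, y)
    return sum((-1) ** (a ^ b) * pr_box(a, b, x, y) for a in (0, 1) for b in (0, 1))

chsh = correlator(0, 0) + correlator(0, 1) + correlator(1, 0) - correlator(1, 1)
print(chsh)                 # 4.0 -- the algebraic maximum
print(2 * np.sqrt(2))       # ~2.83 -- the quantum (Tsirelson) bound
```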
Many features that were considered purely quantum are actually present in all non-classical GPTs. These include the impossibility of universal broadcasting, i.e., the no-cloning theorem; the existence of incompatible measurements; and the existence of entangled states or entangled measurements.
See also
*Quantum foundations
References
{{reflist|colwidth=30em}}