Partial Information Decomposition is an extension of
information theory
Information theory is the mathematical study of the quantification (science), quantification, Data storage, storage, and telecommunications, communication of information. The field was established and formalized by Claude Shannon in the 1940s, ...
, that aims to generalize the pairwise relations described by information theory to the interaction of multiple variables.
Motivation
Information theory can quantify the amount of information a single source variable X_1 has about a target variable Y via the mutual information I(X_1;Y). If we now consider a second source variable X_2, classical information theory can only describe the mutual information of the joint variable (X_1, X_2) with Y, given by I(X_1,X_2;Y). In general, however, it would be interesting to know how exactly the individual variables X_1 and X_2 and their interactions relate to Y.
Consider that we are given two independent, uniformly distributed binary source variables X_1 and X_2, and a target variable Y = XOR(X_1, X_2). In this case the total mutual information is I(X_1,X_2;Y) = 1 bit, while the individual mutual informations are I(X_1;Y) = I(X_2;Y) = 0 bit. That is, there is ''synergistic'' information arising from the interaction of X_1 and X_2 about Y, which cannot be easily captured with classical information theoretic quantities.
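The XOR example can be checked numerically from the joint distribution of the three variables. The following sketch (the helper name `mi` is illustrative, not from any particular library) computes the mutual informations directly from the joint probability mass function:

```python
from math import log2

# Joint pmf of (x1, x2, y) for the XOR example: x1 and x2 are independent
# fair bits and y = x1 XOR x2, so each consistent triple has mass 1/4.
joint = {(a, b, a ^ b): 0.25 for a in (0, 1) for b in (0, 1)}

def mi(p, src):
    """I(X_src; Y) in bits, with Y stored at index 2 of each triple."""
    def marg(idx):
        # Marginalize the joint pmf onto the given coordinate indices.
        out = {}
        for k, pr in p.items():
            key = tuple(k[i] for i in idx)
            out[key] = out.get(key, 0.0) + pr
        return out
    ps, py, psy = marg(src), marg((2,)), marg(src + (2,))
    return sum(pr * log2(pr / (ps[k[:-1]] * py[k[-1:]]))
               for k, pr in psy.items())

print(mi(joint, (0, 1)))  # I(X1,X2;Y) = 1.0 bit
print(mi(joint, (0,)))    # I(X1;Y) = 0.0 -- X1 alone says nothing about Y
print(mi(joint, (1,)))    # I(X2;Y) = 0.0
```

Jointly the two sources determine Y exactly (1 bit), yet each source individually is statistically independent of Y, which is precisely the synergy described above.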
Definition
Partial information decomposition further decomposes the mutual information between the source variables X_1, X_2 and the target variable Y as

I(X_1,X_2;Y) = Unq(X_1;Y \ X_2) + Unq(X_2;Y \ X_1) + Syn(X_1,X_2;Y) + Red(X_1,X_2;Y)

Here the individual information atoms are defined as
* Unq(X_1;Y \ X_2) is the ''unique'' information that X_1 has about Y, which is not in X_2 (and analogously for Unq(X_2;Y \ X_1))
* Syn(X_1,X_2;Y) is the ''synergistic'' information that is in the interaction of X_1 and X_2 about Y
* Red(X_1,X_2;Y) is the ''redundant'' information that is in both X_1 and X_2 about Y
There is, thus far, no universal agreement on how these terms should be defined, with different approaches that decompose information into redundant, unique, and synergistic components appearing in the literature.
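One simple concrete choice, appearing in the literature as the minimal-mutual-information redundancy, is to take Red(X_1,X_2;Y) = min(I(X_1;Y), I(X_2;Y)) and solve for the remaining atoms from the defining equations. The sketch below implements the decomposition under that assumption (the function names `mi` and `pid_mmi` are illustrative, not from any particular library):

```python
from math import log2

def mi(p, src):
    """I(X_src; Y) in bits from a joint pmf over triples (x1, x2, y)."""
    def marg(idx):
        out = {}
        for k, pr in p.items():
            key = tuple(k[i] for i in idx)
            out[key] = out.get(key, 0.0) + pr
        return out
    ps, py, psy = marg(src), marg((2,)), marg(src + (2,))
    return sum(pr * log2(pr / (ps[k[:-1]] * py[k[-1:]]))
               for k, pr in psy.items())

def pid_mmi(p):
    """Decompose I(X1,X2;Y) assuming Red = min(I(X1;Y), I(X2;Y));
    the remaining three atoms then follow from the defining equations."""
    i1, i2, i12 = mi(p, (0,)), mi(p, (1,)), mi(p, (0, 1))
    red = min(i1, i2)
    unq1, unq2 = i1 - red, i2 - red
    syn = i12 - unq1 - unq2 - red
    return {"unq1": unq1, "unq2": unq2, "red": red, "syn": syn}

# XOR target: the whole bit is synergistic.
xor = {(a, b, a ^ b): 0.25 for a in (0, 1) for b in (0, 1)}
print(pid_mmi(xor))  # {'unq1': 0.0, 'unq2': 0.0, 'red': 0.0, 'syn': 1.0}

# Fully redundant target (Y = X1 = X2): the whole bit is redundant.
dup = {(0, 0, 0): 0.5, (1, 1, 1): 0.5}
print(pid_mmi(dup))  # {'unq1': 0.0, 'unq2': 0.0, 'red': 1.0, 'syn': 0.0}
```

Other proposed measures differ precisely in how the redundancy (or, equivalently, one of the other atoms) is defined; only the defining equations relating the four atoms to the classical mutual informations are common ground.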
Applications
Despite the lack of universal agreement, partial information decomposition has been applied to diverse fields, including climatology, neuroscience, sociology, and machine learning. Partial information decomposition has also been proposed as a possible foundation on which to build a mathematically robust definition of emergence in complex systems, and may be relevant to formal theories of consciousness.
See also
* Mutual information
* Total correlation
* Dual total correlation
* Interaction information