In information theory, specific-information is the generic name given to the family of state-dependent measures that in expectation converge to the mutual information. There are currently three known varieties of specific information, usually denoted I_V, I_S, and I_. The specific-information between a random variable X and a state Y = y is written as I(X; Y = y).
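As a concrete illustration of how such state-dependent measures behave, the sketch below (an assumption for illustration, not taken from the cited paper) computes two commonly discussed decompositions for a toy discrete joint distribution: a "specific surprise" defined as the KL divergence between P(X | Y = y) and P(X), and a "specific information" defined as the entropy reduction H(X) − H(X | Y = y). Both average, over Y, to the mutual information I(X; Y); the variable names and the exact correspondence to the denotations I_V and I_S above are hypothetical.

```python
import numpy as np

# Toy joint distribution p(x, y): rows index x, columns index y.
p_xy = np.array([[0.20, 0.10],
                 [0.10, 0.30],
                 [0.05, 0.25]])
p_x = p_xy.sum(axis=1)  # marginal P(X)
p_y = p_xy.sum(axis=0)  # marginal P(Y)

def specific_surprise(y):
    """KL divergence D( P(X|Y=y) || P(X) ): one state-dependent measure.
    It is non-negative for every state y."""
    p_x_given_y = p_xy[:, y] / p_y[y]
    return float(np.sum(p_x_given_y * np.log2(p_x_given_y / p_x)))

def specific_information(y):
    """Entropy reduction H(X) - H(X|Y=y): another state-dependent measure.
    Unlike the KL form, it can be negative for individual states."""
    p_x_given_y = p_xy[:, y] / p_y[y]
    h_x = -float(np.sum(p_x * np.log2(p_x)))
    h_x_given_y = -float(np.sum(p_x_given_y * np.log2(p_x_given_y)))
    return h_x - h_x_given_y

# Mutual information computed directly from the joint distribution.
mi = sum(p_xy[i, j] * np.log2(p_xy[i, j] / (p_x[i] * p_y[j]))
         for i in range(p_xy.shape[0]) for j in range(p_xy.shape[1]))

# Both families of measures converge to the mutual information in
# expectation over the states y.
mi_from_surprise = sum(p_y[j] * specific_surprise(j) for j in range(len(p_y)))
mi_from_specinfo = sum(p_y[j] * specific_information(j) for j in range(len(p_y)))
```

The two per-state quantities generally differ for any particular y, yet their expectations under P(Y) coincide with I(X; Y), which is exactly the defining property of the specific-information family described above.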


References

* Butts, Daniel A. (2003). "How much information is associated with a particular stimulus?". Network: Computation in Neural Systems 14 (2): 177–187. doi:10.1088/0954-898X/14/2/301. PMID 12790180.