Entropic Uncertainty
In quantum mechanics, information theory, and Fourier analysis, the entropic uncertainty or Hirschman uncertainty is defined as the sum of the temporal and spectral Shannon entropies. It turns out that Heisenberg's uncertainty principle can be expressed as a lower bound on the sum of these entropies. This is ''stronger'' than the usual statement of the uncertainty principle in terms of the product of standard deviations. In 1957, Hirschman considered a function ''f'' and its Fourier transform ''g'' such that :g(y) \approx \int_{-\infty}^\infty \exp (-2\pi ixy) f(x)\, dx,\qquad f(x) \approx \int_{-\infty}^\infty \exp (2\pi ixy) g(y)\, dy ~, where the "≈" indicates convergence in L^2, and normalized so that (by Plancherel's theorem) : \int_{-\infty}^\infty |f(x)|^2\, dx = \int_{-\infty}^\infty |g(y)|^2 \,dy = 1~. He showed that for any such functions the sum of the Shannon entropies is non-negative, : H(|f|^2) + H(|g|^2) \equiv - \int_{-\infty}^\infty |f(x)|^2 \log |f(x)|^2\, dx - \int_{-\infty}^\infty |g(y)|^2 \log |g(y)|^2 \,dy \ge 0 ~.
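As an illustrative numerical sketch (not from the source; the Gaussian test function, grid, and FFT bookkeeping below are my own assumptions), one can discretize a unit-norm Gaussian, approximate its Fourier transform under the 2π-in-the-exponent convention above, and confirm that the two Shannon entropies sum to a non-negative value:

```python
# Minimal sketch: check Hirschman's inequality H(|f|^2) + H(|g|^2) >= 0
# for a Gaussian, using a Riemann-sum entropy estimate and a scaled DFT.
import numpy as np

sigma = 0.3                                   # width of |f|^2 (arbitrary choice)
N, L = 4096, 40.0
x = np.linspace(-L / 2, L / 2, N, endpoint=False)
dx = L / N
f = (2 * np.pi * sigma**2) ** -0.25 * np.exp(-x**2 / (4 * sigma**2))
f = f / np.sqrt(np.sum(np.abs(f) ** 2) * dx)  # enforce \int |f|^2 dx = 1

# g(y) ~ \int exp(-2*pi*i*x*y) f(x) dx, approximated by a scaled DFT;
# only |g|^2 enters the entropy, so the DFT's overall phase is irrelevant.
g = np.fft.fft(f) * dx
dy = 1.0 / (N * dx)

def shannon(amplitudes, step):
    # Riemann-sum estimate of -sum p*ln(p)*step for the density p = |amplitude|^2
    p = np.abs(amplitudes) ** 2
    p = p[p > 1e-300]
    return float(-np.sum(p * np.log(p)) * step)

total = shannon(f, dx) + shannon(g, dy)
print(total)   # about ln(e/2) ~ 0.307 for a Gaussian, consistent with the bound >= 0
```

The printed value is essentially independent of the chosen width, since rescaling a Gaussian only trades entropy between the two terms without changing their sum.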
Uncertainty Principle
In quantum mechanics, the uncertainty principle (also known as Heisenberg's uncertainty principle) is any of a variety of mathematical inequalities asserting a fundamental limit to the accuracy with which the values for certain pairs of physical quantities of a particle, such as position, ''x'', and momentum, ''p'', can be predicted from initial conditions. Such variable pairs are known as complementary variables or canonically conjugate variables; and, depending on interpretation, the uncertainty principle limits to what extent such conjugate properties maintain their approximate meaning, as the mathematical framework of quantum physics does not support the notion of simultaneously well-defined conjugate properties expressed by a single value. The uncertainty principle implies that it is in general not possible to predict the value of a quantity with arbitrary certainty, even if all initial conditions are specified. Introduced first in 1927 by the German physicist Werner Heisenberg, it states that the more precisely a particle's position is determined, the less precisely its momentum can be predicted from initial conditions, and vice versa.
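For reference (the standard quantitative form, not quoted from the excerpt above), the position–momentum case is usually stated as the Kennard bound on the standard deviations \sigma_x and \sigma_p of position and momentum,
: \sigma_x \, \sigma_p \ \ge\ \frac{\hbar}{2} ~,
where \hbar is the reduced Planck constant.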
Rényi Entropy
In information theory, the Rényi entropy is a quantity that generalizes various notions of entropy, including Hartley entropy, Shannon entropy, collision entropy, and min-entropy. The Rényi entropy is named after Alfréd Rényi, who looked for the most general way to quantify information while preserving additivity for independent events. In the context of fractal dimension estimation, the Rényi entropy forms the basis of the concept of generalized dimensions. The Rényi entropy is important in ecology and statistics as an index of diversity. The Rényi entropy is also important in quantum information, where it can be used as a measure of entanglement. In the Heisenberg XY spin chain model, the Rényi entropy as a function of the order ''α'' can be calculated explicitly because it is an automorphic function with respect to a particular subgroup of the modular group. In theoretical computer science, the min-entropy is used in the context of randomness extractors. Definition The Rényi entropy of order ''α'', where ''α'' ≥ 0 and ''α'' ≠ 1, of a discrete random variable ''X'' with probability distribution (p_1, \dots, p_n) is defined as : \mathrm{H}_\alpha(X) = \frac{1}{1-\alpha} \log \left( \sum_{i=1}^n p_i^\alpha \right) ~.
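A small illustrative sketch (the distribution and the set of orders are arbitrary choices of mine) that evaluates this definition and its ''α'' → 1 limit, which recovers the Shannon entropy:

```python
# Rényi entropy H_alpha(p) = log(sum_i p_i**alpha) / (1 - alpha), with the
# alpha -> 1 limit handled as the Shannon entropy and alpha = inf as min-entropy.
import numpy as np

def renyi_entropy(p, alpha):
    p = np.asarray(p, dtype=float)
    if np.isclose(alpha, 1.0):                # Shannon limit
        q = p[p > 0]
        return float(-np.sum(q * np.log(q)))
    return float(np.log(np.sum(p ** alpha)) / (1.0 - alpha))

p = [0.5, 0.25, 0.125, 0.125]
for alpha in (0.0, 0.5, 0.999, 1.0, 2.0, np.inf):
    if np.isinf(alpha):
        h = -np.log(max(p))                   # min-entropy: -log(max_i p_i)
    else:
        h = renyi_entropy(p, alpha)
    print(alpha, h)
```

Here the values at ''α'' = 0, 2, and ∞ reproduce the Hartley, collision, and min-entropies named above, and the ''α'' = 0.999 value is close to the Shannon entropy.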
Quantum Mechanical Entropy
In physics, a quantum (plural quanta) is the minimum amount of any physical entity (physical property) involved in an interaction. The fundamental notion that a physical property can be "quantized" is referred to as "the hypothesis of quantization". This means that the magnitude of the physical property can take on only discrete values consisting of integer multiples of one quantum. For example, a photon is a single quantum of light (or of any other form of electromagnetic radiation). Similarly, the energy of an electron bound within an atom is quantized and can exist only in certain discrete values. (Atoms and matter in general are stable because electrons can exist only at discrete energy levels within an atom.) Quantization is one of the foundations of the much broader physics of quantum mechanics. Quantization of energy and its influence on how energy and matter interact (quantum electrodynamics) is part of the fundamental framework for understanding and describing nature.
Riesz–Thorin Theorem
In mathematics, the Riesz–Thorin theorem, often referred to as the Riesz–Thorin interpolation theorem or the Riesz–Thorin convexity theorem, is a result about ''interpolation of operators''. It is named after Marcel Riesz and his student G. Olof Thorin. This theorem bounds the norms of linear maps acting between L^p spaces. Its usefulness stems from the fact that some of these spaces have rather simpler structure than others. Usually that refers to L^2, which is a Hilbert space, or to L^1 and L^\infty. Therefore one may prove theorems about the more complicated cases by proving them in two simple cases and then using the Riesz–Thorin theorem to pass from the simple cases to the complicated cases. The Marcinkiewicz theorem is similar but applies also to a class of non-linear maps. Motivation First we need the following definition: :Definition. Let p_0, p_1 be two numbers such that 0 < p_0 < p_1 \le \infty. Then for 0 < t < 1 define p_t by: \frac{1}{p_t} = \frac{1-t}{p_0} + \frac{t}{p_1}. By splitting up a function in L^{p_t} as a product and applying Hölder's inequality, one obtains the log-convexity of the L^p norms, \|f\|_{p_t} \le \|f\|_{p_0}^{1-t} \, \|f\|_{p_1}^{t}, which is foundational in the study of L^p spaces.
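For reference, the conclusion of the theorem itself (paraphrased here for complex scalar L^p spaces with exponents in [1, \infty], not quoted from the excerpt above): if a linear operator T is bounded from L^{p_0} to L^{q_0} with norm M_0 and from L^{p_1} to L^{q_1} with norm M_1, and the exponents p_t, q_t are interpolated as in the definition above, then
: \|T\|_{L^{p_t} \to L^{q_t}} \ \le\ M_0^{1-t} M_1^{t} \qquad (0 < t < 1) ~.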
Logarithmic Schrödinger Equation
In theoretical physics, the logarithmic Schrödinger equation (sometimes abbreviated as LNSE or LogSE) is one of the nonlinear modifications of Schrödinger's equation. It is a classical wave equation with applications to extensions of quantum mechanics, quantum optics, nuclear physics, transport and diffusion phenomena, open quantum systems and information theory, effective quantum gravity and physical vacuum models, and the theory of superfluidity and Bose–Einstein condensation. Its relativistic version (with the d'Alembertian instead of the Laplacian and a first-order time derivative) was first proposed by Gerald Rosen. It is an example of an integrable model. The equation The logarithmic Schrödinger equation is a partial differential equation. In mathematics and mathematical physics one often uses its dimensionless form: : i \frac{\partial \psi}{\partial t} + \nabla^2 \psi + \psi \ln |\psi|^2 = 0 for the complex-valued function ''ψ'' of the particle's position vector \mathbf{r} at time ''t'', where : \nabla^2 \psi = \frac{\partial^2 \psi}{\partial x^2} + \frac{\partial^2 \psi}{\partial y^2} + \frac{\partial^2 \psi}{\partial z^2} is the Laplacian of ''ψ'' in Cartesian coordinates.
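As a purely illustrative numerical sketch (my own construction, not from the source: it is one-dimensional, uses a periodic box, a Gaussian initial state, and Strang split-step Fourier time stepping), the dimensionless equation above can be integrated as follows:

```python
# Strang split-step Fourier integration of i*psi_t + psi_xx + psi*ln|psi|^2 = 0
# on a periodic 1D grid (grid, time step and initial state are arbitrary choices).
import numpy as np

N, L = 1024, 40.0
x = np.linspace(-L / 2, L / 2, N, endpoint=False)
k = 2 * np.pi * np.fft.fftfreq(N, d=L / N)        # angular wavenumbers
dt, steps = 1e-3, 2000

psi = np.exp(-x**2 / 2).astype(complex)           # Gaussian initial state

for _ in range(steps):
    # half-step of the nonlinear part: psi_t = i*psi*ln|psi|^2 leaves |psi| fixed
    psi *= np.exp(1j * (dt / 2) * np.log(np.abs(psi)**2 + 1e-300))
    # full linear step in Fourier space: psi_xx -> -k^2 * psi_hat
    psi = np.fft.ifft(np.exp(-1j * k**2 * dt) * np.fft.fft(psi))
    # second nonlinear half-step (Strang splitting)
    psi *= np.exp(1j * (dt / 2) * np.log(np.abs(psi)**2 + 1e-300))

# The L^2 norm of psi should be (approximately) conserved by the scheme.
print(np.sum(np.abs(psi)**2) * (L / N))           # ~ sqrt(pi) ~ 1.772 here
```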
Inequalities In Information Theory
Inequalities are very important in the study of information theory. There are a number of different contexts in which these inequalities appear. Entropic inequalities Consider a tuple X_1,X_2,\dots,X_n of n finitely (or at most countably) supported random variables on the same probability space. There are 2^n subsets, for which (joint) entropies can be computed. For example, when ''n'' = 2, we may consider the entropies H(X_1), H(X_2), and H(X_1, X_2). They satisfy the following inequalities (which together characterize the range of the marginal and joint entropies of two random variables):
* H(X_1) \ge 0
* H(X_2) \ge 0
* H(X_1) \le H(X_1, X_2)
* H(X_2) \le H(X_1, X_2)
* H(X_1, X_2) \le H(X_1) + H(X_2).
In fact, these can all be expressed as special cases of a single inequality involving the conditional mutual information, namely :I(A; B \mid C) \ge 0, where A, B, and C each denote the joint distribution of some arbitrary (possibly empty) subset of our collection of random variables.
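A quick numerical check of the two-variable inequalities above (the 2×2 joint distribution is a hypothetical example of mine):

```python
# Verify the listed entropy inequalities for two discrete random variables X1, X2
# described by a small joint probability mass function.
import numpy as np

p = np.array([[0.3, 0.2],
              [0.1, 0.4]])                    # joint pmf p(x1, x2), sums to 1

def H(pmf):
    # Shannon entropy (in bits) of a pmf given as an array of probabilities
    q = pmf[pmf > 0]
    return float(-np.sum(q * np.log2(q)))

H1, H2, H12 = H(p.sum(axis=1)), H(p.sum(axis=0)), H(p)
assert 0 <= H1 <= H12 and 0 <= H2 <= H12      # non-negativity, marginals <= joint
assert H12 <= H1 + H2 + 1e-12                 # subadditivity
print(H1, H2, H12)                            # e.g. 1.0, ~0.971, ~1.846
```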
Lebesgue Measure
In measure theory, a branch of mathematics, the Lebesgue measure, named after French mathematician Henri Lebesgue, is the standard way of assigning a measure to subsets of ''n''-dimensional Euclidean space. For ''n'' = 1, 2, or 3, it coincides with the standard measure of length, area, or volume. In general, it is also called ''n''-dimensional volume, ''n''-volume, or simply volume. It is used throughout real analysis, in particular to define Lebesgue integration. Sets that can be assigned a Lebesgue measure are called Lebesgue-measurable; the measure of the Lebesgue-measurable set ''A'' is here denoted by ''λ''(''A''). Henri Lebesgue described this measure in the year 1901, followed the next year by his description of the Lebesgue integral. Both were published as part of his dissertation in 1902. Definition For any interval I = [a, b], or I = (a, b), in the set \mathbb{R} of real numbers, let \ell(I) = b - a denote its length. For any subset E \subseteq \mathbb{R}, the Lebesgue outer measure \lambda^*(E) is defined as the infimum of \sum_k \ell(I_k) over all sequences of open intervals (I_k) whose union contains E.
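Written out explicitly (a standard formulation, added for concreteness rather than quoted from the excerpt above):
: \lambda^*(E) = \inf\left\{ \sum_{k=1}^\infty \ell(I_k) \;:\; (I_k)_{k \in \mathbb{N}} \text{ is a sequence of open intervals with } E \subseteq \bigcup_{k=1}^\infty I_k \right\} ~.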
Normal Distribution
In statistics, a normal distribution or Gaussian distribution is a type of continuous probability distribution for a real-valued random variable. The general form of its probability density function is : f(x) = \frac{1}{\sigma\sqrt{2\pi}} \, e^{-\frac{1}{2}\left(\frac{x-\mu}{\sigma}\right)^2} ~. The parameter \mu is the mean or expectation of the distribution (and also its median and mode), while the parameter \sigma is its standard deviation. The variance of the distribution is \sigma^2. A random variable with a Gaussian distribution is said to be normally distributed, and is called a normal deviate. Normal distributions are important in statistics and are often used in the natural and social sciences to represent real-valued random variables whose distributions are not known. Their importance is partly due to the central limit theorem. It states that, under some conditions, the average of many samples (observations) of a random variable with finite mean and variance is itself a random variable—whose distribution converges to a normal distribution as the number of samples increases.
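A small illustration of the central limit theorem just mentioned (the sample sizes and the uniform source distribution are arbitrary choices of mine): averages of many independent uniform samples are approximately normal.

```python
# Empirical check: means of 50 i.i.d. Uniform(0,1) draws are roughly Gaussian
# with mean 1/2 and standard deviation sqrt(1/(12*50)).
import numpy as np

rng = np.random.default_rng(0)
averages = rng.uniform(0.0, 1.0, size=(100_000, 50)).mean(axis=1)

print(averages.mean())    # ~ 0.5
print(averages.std())     # ~ sqrt(1/600) ~ 0.0408
```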
Calculus Of Variations
The calculus of variations (or variational calculus) is a field of mathematical analysis that uses variations, which are small changes in functions and functionals, to find maxima and minima of functionals: mappings from a set of functions to the real numbers. Functionals are often expressed as definite integrals involving functions and their derivatives. Functions that maximize or minimize functionals may be found using the Euler–Lagrange equation of the calculus of variations. A simple example of such a problem is to find the curve of shortest length connecting two points. If there are no constraints, the solution is a straight line between the points. However, if the curve is constrained to lie on a surface in space, then the solution is less obvious, and possibly many solutions may exist. Such solutions are known as ''geodesics''. A related problem is posed by Fermat's principle: light follows the path of shortest optical length connecting two points, which depends upon the medium through which the light travels.
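For reference (a standard statement, not quoted from the excerpt): when the functional has the form J[y] = \int_{x_1}^{x_2} L(x, y(x), y'(x)) \, dx, its stationary functions satisfy the Euler–Lagrange equation
: \frac{\partial L}{\partial y} - \frac{d}{dx} \frac{\partial L}{\partial y'} = 0 ~.
For the shortest-curve example above, L = \sqrt{1 + (y')^2}, the equation forces y'' = 0, and the extremal is indeed a straight line.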
Differential Entropy
Differential entropy (also referred to as continuous entropy) is a concept in information theory that began as an attempt by Claude Shannon to extend the idea of (Shannon) entropy, a measure of average surprisal of a random variable, to continuous probability distributions. Unfortunately, Shannon did not derive this formula, and rather just assumed it was the correct continuous analogue of discrete entropy, but it is not. The actual continuous version of discrete entropy is the limiting density of discrete points (LDDP). Differential entropy (described here) is commonly encountered in the literature, but it is a limiting case of the LDDP, and one that loses its fundamental association with discrete entropy. In terms of measure theory, the differential entropy of a probability measure is the negative relative entropy from that measure to the Lebesgue measure, where the latter is treated as if it were a probability measure, despite being unnormalized. Definition Let X be a random variable with probability density function f whose support is a set \mathcal{X}. The differential entropy h(X) is defined as : h(X) = -\int_{\mathcal{X}} f(x) \log f(x) \, dx ~.
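A brief sketch (assuming SciPy is available) of one way differential entropy differs from its discrete counterpart: it can be negative, as for a uniform density on an interval of length less than one.

```python
# Differential entropies (in nats) of two simple densities via scipy.stats.
import numpy as np
from scipy.stats import norm, uniform

print(uniform(loc=0.0, scale=0.5).entropy())   # ln(1/2) ~ -0.693: negative!
print(norm(loc=0.0, scale=1.0).entropy())      # 0.5*ln(2*pi*e) ~ 1.419
print(0.5 * np.log(2 * np.pi * np.e))          # closed form for the Gaussian case
```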
Variance
In probability theory and statistics, variance is the expectation of the squared deviation of a random variable from its population mean or sample mean. Variance is a measure of dispersion, meaning it is a measure of how far a set of numbers is spread out from their average value. Variance has a central role in statistics, where some ideas that use it include descriptive statistics, statistical inference, hypothesis testing, goodness of fit, and Monte Carlo sampling. Variance is an important tool in the sciences, where statistical analysis of data is common. The variance is the square of the standard deviation, the second central moment of a distribution, and the covariance of the random variable with itself, and it is often represented by \sigma^2, s^2, \operatorname{Var}(X), V(X), or \mathbb{V}(X). An advantage of variance as a measure of dispersion is that it is more amenable to algebraic manipulation than other measures of dispersion such as the expected absolute deviation.
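In symbols (the standard identity, added here for concreteness): for a random variable X with mean \mu = \operatorname{E}[X],
: \operatorname{Var}(X) = \operatorname{E}\left[(X - \mu)^2\right] = \operatorname{E}[X^2] - \left(\operatorname{E}[X]\right)^2 ~,
and the standard deviation is \sigma = \sqrt{\operatorname{Var}(X)}.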