In probability theory and directional statistics, a wrapped probability distribution is a continuous probability distribution that describes data points that lie on a unit ''n''-sphere. In one dimension, a wrapped distribution consists of points on the unit circle. If \phi is a random variate in the interval (-\infty,\infty) with probability density function (PDF) p(\phi), then z = e^{i\phi} is a circular variable distributed according to the wrapped distribution p_{wz}(z) and \theta = \arg(z) is an angular variable in the interval (-\pi,\pi] distributed according to the wrapped distribution p_w(\theta).

Any probability density function p(\phi) on the line can be "wrapped" around the circumference of a circle of unit radius. That is, the PDF of the wrapped variable

:\theta=\phi \bmod 2\pi in some interval of length 2\pi

is

:p_w(\theta)=\sum_{k=-\infty}^{\infty} p(\theta+2\pi k),

which is a periodic summation of period 2\pi. The preferred interval is generally (-\pi<\theta\le\pi), for which \arg(e^{i\theta})=\theta.
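The periodic summation above is easy to check numerically. The following sketch (an illustration, not part of the original article; the helper name wrapped_pdf is ad hoc) wraps a normal density and verifies that the wrapped density still integrates to one over a single period:

```python
import numpy as np

def wrapped_pdf(pdf, theta, n_terms=50):
    """Periodic summation p_w(theta) = sum_k pdf(theta + 2*pi*k)."""
    return sum(pdf(theta + 2 * np.pi * k) for k in range(-n_terms, n_terms + 1))

# Wrap a normal density with mean 1.0 and standard deviation 2.0.
mu, sigma = 1.0, 2.0
normal_pdf = lambda x: np.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))

# Uniform grid over one period; the rectangle rule is highly accurate
# for smooth periodic integrands.
theta = np.linspace(-np.pi, np.pi, 10_000, endpoint=False)
pw = wrapped_pdf(normal_pdf, theta)

total = pw.sum() * (2 * np.pi / theta.size)
print(total)  # ~1.0: the wrapped density is normalized over one period
```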


Theory

In most situations, a process involving circular statistics produces angles (\phi) which lie in the interval (-\infty,\infty) and are described by an "unwrapped" probability density function p(\phi). However, a measurement will yield an angle \theta which lies in some interval of length 2\pi (for example, 0 to 2\pi). In other words, a measurement cannot tell whether the true angle \phi or a wrapped angle \theta = \phi+2\pi a, where a is some unknown integer, has been measured.

If we wish to calculate the expected value of some function of the measured angle, it will be:

:\langle f(\theta)\rangle=\int_{-\infty}^{\infty} p(\phi)f(\phi+2\pi a)\,d\phi.

We can express the integral as a sum of integrals over periods of 2\pi:

:\langle f(\theta)\rangle=\sum_{k=-\infty}^{\infty} \int_{2\pi k}^{2\pi(k+1)} p(\phi)f(\phi+2\pi a)\,d\phi.

Changing the variable of integration to \theta'=\phi-2\pi k and exchanging the order of integration and summation, we have

:\langle f(\theta)\rangle= \int_0^{2\pi} p_w(\theta')f(\theta'+2\pi a')\,d\theta'

where p_w(\theta') is the PDF of the wrapped distribution and a' is another unknown integer (a'=a+k). The unknown integer a' introduces an ambiguity into the expected value of f(\theta), similar to the problem of calculating the angular mean. This can be resolved by introducing the parameter z=e^{i\theta}, since z has an unambiguous relationship to the true angle \phi:

:z=e^{i\theta}=e^{i\phi}.

Calculating the expected value of a function of z will yield unambiguous answers:

:\langle f(z)\rangle= \int_0^{2\pi} p_w(\theta')f(e^{i\theta'})\,d\theta'.

For this reason, the z parameter is preferred over measured angles \theta in circular statistical analysis. This suggests that the wrapped distribution function may itself be expressed as a function of z, such that:

:\langle f(z)\rangle= \oint p_{wz}(z)f(z)\,dz

where p_{wz}(z) is defined such that p_w(\theta)\,|d\theta| = p_{wz}(z)\,|dz|. This concept can be extended to the multivariate context by an extension of the simple sum to a number of F sums that cover all dimensions in the feature space:

:p_w(\vec\theta)=\sum_{k_1=-\infty}^{\infty}\cdots\sum_{k_F=-\infty}^{\infty} p(\vec\theta+2\pi k_1\mathbf{e}_1+\dots+2\pi k_F\mathbf{e}_F)

where \mathbf{e}_k=(0,\dots,0,1,0,\dots,0)^{\mathsf{T}} is the kth Euclidean basis vector.
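The ambiguity resolved by z=e^{i\theta} can be seen numerically. The sketch below (an illustration, not from the article) draws angles clustered near \pi, reduces them modulo 2\pi, and shows that the naive linear mean of \theta is badly distorted by wrapping, while expectations of functions of z are unaffected:

```python
import numpy as np

rng = np.random.default_rng(0)

# True (unwrapped) angles clustered near pi.
phi = rng.normal(loc=np.pi, scale=0.3, size=100_000)

# A measurement only sees the angle modulo 2*pi, here reduced to (-pi, pi].
theta = np.mod(phi + np.pi, 2 * np.pi) - np.pi

# The naive linear mean of theta is misleading: the samples split between
# values near +pi and near -pi, so the mean lands near 0, not near pi.
naive_mean = theta.mean()

# But z = e^{i*theta} equals e^{i*phi} exactly, so expectations of
# functions of z are unambiguous.
z_mean_wrapped = np.exp(1j * theta).mean()
z_mean_true = np.exp(1j * phi).mean()

print(naive_mean)                            # near 0, not near pi
print(abs(z_mean_wrapped - z_mean_true))     # ~0 (rounding error only)
print(np.angle(z_mean_wrapped))              # circular mean, near +/- pi
```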


Expression in terms of characteristic functions

A fundamental wrapped distribution is the Dirac comb, which is a wrapped Dirac delta function:

:\Delta_{2\pi}(\theta)=\sum_{k=-\infty}^{\infty}\delta(\theta-2\pi k).

Using the delta function, a general wrapped distribution can be written

:p_w(\theta)=\sum_{k=-\infty}^{\infty}\int_{-\infty}^{\infty} p(\theta')\delta(\theta-\theta'+2\pi k)\,d\theta'.

Exchanging the order of summation and integration, any wrapped distribution can be written as the convolution of the unwrapped distribution and a Dirac comb:

:p_w(\theta)=\int_{-\infty}^{\infty} p(\theta')\Delta_{2\pi}(\theta-\theta')\,d\theta'.

The Dirac comb may also be expressed as a sum of exponentials, so we may write:

:p_w(\theta)=\frac{1}{2\pi}\int_{-\infty}^{\infty} p(\theta')\sum_{n=-\infty}^{\infty}e^{in(\theta-\theta')}\,d\theta'.

Again exchanging the order of summation and integration:

:p_w(\theta)=\frac{1}{2\pi}\sum_{n=-\infty}^{\infty}\int_{-\infty}^{\infty} p(\theta')e^{in(\theta-\theta')}\,d\theta'.

Using the definition of \phi(s), the characteristic function of p(\theta), this yields a Laurent series about zero for the wrapped distribution in terms of the characteristic function of the unwrapped distribution:

:p_w(\theta)=\frac{1}{2\pi}\sum_{n=-\infty}^{\infty} \phi(n)\,e^{-in\theta}

or

:p_{wz}(z)=\frac{1}{2\pi}\sum_{n=-\infty}^{\infty} \phi(n)\,z^{-n}.

By analogy with linear distributions, \phi(m) is referred to as the characteristic function of the wrapped distribution (or, more accurately, the characteristic sequence). This is an instance of the Poisson summation formula, and it can be seen that the coefficients of the Fourier series for the wrapped distribution are simply the coefficients of the Fourier transform of the unwrapped distribution at integer values.
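As a concrete check of this series (an illustration assuming a wrapped normal distribution, whose characteristic function is \phi(n)=e^{in\mu-n^2\sigma^2/2}), the Fourier form can be compared against the direct periodic summation:

```python
import numpy as np

mu, sigma = 0.5, 1.0
theta = np.linspace(-np.pi, np.pi, 200, endpoint=False)

# Characteristic function of the unwrapped normal at integer arguments.
n = np.arange(-30, 31)
phi_n = np.exp(1j * n * mu - n ** 2 * sigma ** 2 / 2)

# Fourier-series form: p_w(theta) = (1/2pi) * sum_n phi(n) * e^{-i*n*theta}.
pw_fourier = (phi_n[None, :] * np.exp(-1j * n[None, :] * theta[:, None])
              ).sum(axis=1).real / (2 * np.pi)

# Direct periodic summation of the normal density for comparison.
k = np.arange(-20, 21)
shifted = theta[:, None] + 2 * np.pi * k[None, :]
pw_direct = (np.exp(-0.5 * ((shifted - mu) / sigma) ** 2)
             / (sigma * np.sqrt(2 * np.pi))).sum(axis=1)

err = np.max(np.abs(pw_fourier - pw_direct))
print(err)  # agreement to near machine precision
```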


Moments

The moments of the wrapped distribution p_{wz}(z) are defined as:

:\langle z^m \rangle = \oint p_{wz}(z)z^m \, dz.

Expressing p_{wz}(z) in terms of the characteristic function and exchanging the order of integration and summation yields:

:\langle z^m \rangle = \frac{1}{2\pi}\sum_{n=-\infty}^{\infty} \phi(n)\oint z^{m-n}\,dz.

From the residue theorem we have

:\oint z^{m-n}\,dz = 2\pi \delta_{m-n}

where \delta_k is the Kronecker delta function. It follows that the moments are simply equal to the characteristic function of the unwrapped distribution for integer arguments:

:\langle z^m \rangle = \phi(m).
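A quick Monte Carlo sanity check of \langle z^m \rangle = \phi(m) (illustrative, using a wrapped normal with \phi(m)=e^{im\mu-m^2\sigma^2/2}):

```python
import numpy as np

rng = np.random.default_rng(1)
mu, sigma = 0.8, 0.7

# Sample from the unwrapped normal; z = e^{iX} wraps automatically.
x = rng.normal(mu, sigma, size=200_000)
z = np.exp(1j * x)

errs = []
for m in (1, 2, 3):
    sample_moment = (z ** m).mean()                        # <z^m> from samples
    char_fn = np.exp(1j * m * mu - (m * sigma) ** 2 / 2)   # phi(m)
    errs.append(abs(sample_moment - char_fn))

print(errs)  # all small: <z^m> matches phi(m) up to Monte Carlo error
```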


Generation of random variates

If X is a random variate drawn from a linear probability distribution P, then Z=e^{iX} is a circular variate distributed according to the wrapped P distribution, and \theta=\arg(Z) is the angular variate distributed according to the wrapped P distribution, with -\pi < \theta \leq \pi.
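For example (an illustrative sketch, not part of the article), wrapped Cauchy variates can be generated this way; the Cauchy characteristic function e^{i\mu t-\gamma|t|} evaluated at t=1 then predicts \langle z\rangle = e^{i\mu-\gamma}:

```python
import numpy as np

rng = np.random.default_rng(2)
mu, gamma = 1.0, 0.5

# Inverse-CDF sampling from the linear Cauchy distribution...
x = mu + gamma * np.tan(np.pi * (rng.random(200_000) - 0.5))

# ...then wrap: theta = arg(e^{iX}) lies in (-pi, pi].
theta = np.angle(np.exp(1j * x))

# The wrapped Cauchy satisfies <z> = phi(1) = exp(i*mu - gamma).
z_mean = np.exp(1j * theta).mean()
expected = np.exp(1j * mu - gamma)
print(abs(z_mean - expected))  # small Monte Carlo error
```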


Entropy

The information entropy of a circular distribution with probability density p_w(\theta) is defined as:

:H = -\int_\Gamma p_w(\theta)\,\ln(p_w(\theta))\,d\theta

where \Gamma is any interval of length 2\pi. If both the probability density and its logarithm can be expressed as a Fourier series (or more generally, any integral transform on the circle), the orthogonal basis of the series can be used to obtain a closed-form expression for the entropy.

The moments of the distribution \phi(n) are the Fourier coefficients for the Fourier series expansion of the probability density:

:p_w(\theta)=\frac{1}{2\pi}\sum_{n=-\infty}^{\infty} \phi_n e^{-in\theta}.

If the logarithm of the probability density can also be expressed as a Fourier series:

:\ln(p_w(\theta))=\sum_{m=-\infty}^{\infty} c_m e^{im\theta}

where

:c_m=\frac{1}{2\pi}\int_\Gamma \ln(p_w(\theta))e^{-im\theta}\,d\theta,

then, exchanging the order of integration and summation, the entropy may be written as:

:H=-\frac{1}{2\pi}\sum_{m=-\infty}^{\infty}\sum_{n=-\infty}^{\infty} c_m \phi_n \int_\Gamma e^{i(m-n)\theta}\,d\theta.

Using the orthogonality of the Fourier basis, the integral may be reduced to:

:H=-\sum_{n=-\infty}^{\infty} c_n \phi_n.

For the particular case when the probability density is symmetric about the mean, c_{-m}=c_m and the logarithm may be written:

:\ln(p_w(\theta))= c_0 + 2\sum_{m=1}^{\infty} c_m \cos(m\theta)

and

:c_m=\frac{1}{2\pi}\int_\Gamma \ln(p_w(\theta))\cos(m\theta)\,d\theta

and, since normalization requires that \phi_0=1, the entropy may be written:

:H=-c_0-2\sum_{n=1}^{\infty} c_n \phi_n.
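As a worked check (illustrative; for the wrapped Cauchy with mean resultant length \rho, one finds \phi_n=\rho^n, c_0=\ln((1-\rho^2)/(2\pi)) and c_n=\rho^n/n, so the series above sums to the closed form H=\ln(2\pi(1-\rho^2))), the series result can be compared to direct numerical integration:

```python
import numpy as np

rho = 0.5  # mean resultant length of a wrapped Cauchy (zero mean direction)

# Wrapped Cauchy density on one period:
theta = np.linspace(-np.pi, np.pi, 100_000, endpoint=False)
pw = (1 - rho**2) / (2 * np.pi * (1 + rho**2 - 2 * rho * np.cos(theta)))

# Direct numerical entropy H = -int p ln p dtheta (rectangle rule,
# very accurate for a smooth periodic integrand):
H_numeric = -(pw * np.log(pw)).sum() * (2 * np.pi / theta.size)

# Series form H = -c_0 - 2*sum_n c_n*phi_n with phi_n = rho^n and
# c_n = rho^n / n collapses to the closed form below:
H_series = np.log(2 * np.pi * (1 - rho**2))

print(H_numeric, H_series)
```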


See also

* Wrapped normal distribution
* Wrapped Cauchy distribution
* Wrapped exponential distribution




External links


Circular Values Math and Statistics with C++11: a C++11 infrastructure for circular values (angles, time-of-day, etc.) mathematics and statistics