Vysochanskij–Petunin Inequality

In probability theory, the Vysochanskij–Petunin inequality gives a lower bound for the probability that a random variable with finite variance lies within a certain number of standard deviations of the variable's mean, or equivalently an upper bound for the probability that it lies further away. The sole restrictions on the distribution are that it be unimodal and have finite variance. (This implies that it is a continuous probability distribution except at the mode, which may have a non-zero probability.)

Theorem

Let X be a random variable with a unimodal distribution, and let \alpha \in \mathbb{R}. If we define \rho = \sqrt{\operatorname{E}[(X-\alpha)^2]}, then for any r > 0,

: \operatorname{P}(|X-\alpha| \ge r) \le \begin{cases} \dfrac{4\rho^2}{9r^2} & r \ge \sqrt{8/3}\,\rho \\[4pt] \dfrac{4\rho^2}{3r^2} - \dfrac{1}{3} & r \le \sqrt{8/3}\,\rho. \end{cases}

Relation to Gauss's inequality

Taking \alpha equal to a mode of X yields the first case of Gauss's inequality.

Tightness of Bound

Without loss of generality, assume \alpha = 0 and \rho = 1.

* If r ...
* If 1 \le r \le \sqrt{8/3}, the ...
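The theorem above can be checked numerically. The sketch below (an illustrative Monte Carlo check, not part of the original article; the helper name vp_bound is ours) compares the empirical tail probability of a unimodal exponential(1) variable, centred at its mean, against the Vysochanskij–Petunin bound.

```python
import math
import random

random.seed(0)
n = 200_000
samples = [random.expovariate(1.0) for _ in range(n)]  # unimodal, mode 0

alpha = 1.0  # centre at the mean of Exp(1)
# Empirical estimate of rho^2 = E[(X - alpha)^2] (exactly 1 for Exp(1)).
rho2 = sum((x - alpha) ** 2 for x in samples) / n

def vp_bound(r, rho2):
    """Vysochanskij-Petunin upper bound on P(|X - alpha| >= r)."""
    if r * r >= (8.0 / 3.0) * rho2:
        return 4.0 * rho2 / (9.0 * r * r)
    return 4.0 * rho2 / (3.0 * r * r) - 1.0 / 3.0

for r in (1.0, 2.0, 3.0):
    tail = sum(abs(x - alpha) >= r for x in samples) / n
    # The observed tail mass never exceeds the bound.
    assert tail <= vp_bound(r, rho2) + 1e-9, (r, tail)
```

For r = 3 standard deviations this gives the well-known bound 4/81, roughly 4.9%, compared with about 11.1% from Chebyshev's inequality.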



Probability Theory
Probability theory is the branch of mathematics concerned with probability. Although there are several different probability interpretations, probability theory treats the concept in a rigorous mathematical manner by expressing it through a set of axioms. Typically these axioms formalise probability in terms of a probability space, which assigns a measure taking values between 0 and 1, termed the probability measure, to a set of outcomes called the sample space. Any specified subset of the sample space is called an event. Central subjects in probability theory include discrete and continuous random variables, probability distributions, and stochastic processes (which provide mathematical abstractions of non-deterministic or uncertain processes or measured quantities that may either be single occurrences or evolve over time in a random fashion). Although it is not possible to perfectly p ...
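The axioms mentioned above can be illustrated with a small finite probability space (a sketch of our own, for a fair die; the names sample_space, P, and prob are illustrative):

```python
# A finite probability space for one fair die roll.
sample_space = {1, 2, 3, 4, 5, 6}
P = {outcome: 1 / 6 for outcome in sample_space}  # the probability measure

def prob(event):
    """P(A) for an event A, i.e. a subset of the sample space."""
    return sum(P[o] for o in event)

assert all(p >= 0 for p in P.values())        # non-negativity
assert abs(prob(sample_space) - 1) < 1e-12    # total measure is 1
even, odd = {2, 4, 6}, {1, 3, 5}
# Additivity for disjoint events: P(even or odd) = P(even) + P(odd).
assert abs(prob(even | odd) - (prob(even) + prob(odd))) < 1e-12
```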



Mode (statistics)
The mode is the value that appears most often in a set of data values. If X is a discrete random variable, the mode is the value at which the probability mass function takes its maximum value. In other words, it is the value that is most likely to be sampled. Like the statistical mean and median, the mode is a way of expressing, in a (usually) single number, important information about a random variable or a population. The numerical value of the mode is the same as that of the mean and median in a normal distribution, and it may be very different in highly skewed distributions. The mode is not necessarily unique to a given discrete distribution, since the probability mass function may take the same maximum value at several points. The most extreme case occurs in uniform distributions, where all values occur equally frequently. When the probability density function of a continuous distribution has multiple local maxima it is common to refer to all of the ...
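A quick sketch of the definition in code (our own example data), including the non-uniqueness noted above:

```python
from collections import Counter

# The mode is the most frequent value in a data set.
data = [1, 2, 2, 3, 3, 3, 4]
counts = Counter(data)
top = max(counts.values())
modes = sorted(v for v, c in counts.items() if c == top)
assert modes == [3]  # 3 appears most often

# A data set can be multimodal: here both 2 and 3 appear twice.
counts2 = Counter([1, 2, 2, 3, 3])
top2 = max(counts2.values())
modes2 = sorted(v for v, c in counts2.items() if c == top2)
assert modes2 == [2, 3]
```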



Rule Of Three (statistics)
In statistical analysis, the rule of three states that if a certain event did not occur in a sample with n subjects, the interval from 0 to 3/n is a 95% confidence interval for the rate of occurrences in the population. When n is greater than 30, this is a good approximation of results from more sensitive tests. For example, if a pain-relief drug is tested on 1500 human subjects and no adverse event is recorded, then from the rule of three it can be concluded with 95% confidence that fewer than 1 person in 500 (or 3/1500) will experience an adverse event. By symmetry, in the case of only successes, the 95% confidence interval is [1 − 3/n, 1]. The rule is useful in the interpretation of clinical trials generally, particularly in phase II and phase III, where there are often limitations in duration or statistical power. The rule of three applies well beyond medical research, to any trial done n times. If 300 parachutes are randomly tested and all open successfully, then it is concluded with 95% confiden ...
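The approximation can be verified directly: with zero events in n trials, the exact one-sided 95% upper limit p solves (1 − p)^n = 0.05, and since ln 0.05 ≈ −3, this p is close to 3/n. A small check (illustrative, using the article's n = 1500):

```python
# Exact one-sided 95% upper confidence limit with 0 events in n trials:
# solve (1 - p)**n = 0.05 for p.
n = 1500
exact = 1 - 0.05 ** (1 / n)
rule = 3 / n  # the rule-of-three approximation

# For n = 1500 the two agree to better than 1%.
assert abs(exact - rule) / exact < 0.01, (exact, rule)
```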


Cantelli's Inequality
In probability theory, Cantelli's inequality (also called the Chebyshev–Cantelli inequality and the one-sided Chebyshev inequality) is an improved version of Chebyshev's inequality for one-sided tail bounds. The inequality states that, for \lambda > 0,

: \Pr(X - \mathbb{E}[X] \ge \lambda) \le \frac{\sigma^2}{\sigma^2 + \lambda^2},

where

:X is a real-valued random variable,
:\Pr is the probability measure,
:\mathbb{E}[X] is the expected value of X,
:\sigma^2 is the variance of X.

Applying the Cantelli inequality to -X gives a bound on the lower tail,

: \Pr(X - \mathbb{E}[X] \le -\lambda) \le \frac{\sigma^2}{\sigma^2 + \lambda^2}.

While the inequality is often attributed to Francesco Paolo Cantelli, who published it in 1928, it originates in Chebyshev's work of 1874. When bounding the event that a random variable deviates from its mean in only one direction (positive or negative), Cantelli's inequality gives an improvement over Chebyshev's inequality. The Chebyshev inequality has "higher moments versions" and "vector versions", and so does the Cantelli inequality ...
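The claimed improvement is easy to see numerically. This sketch (illustrative; helper names are ours) compares Cantelli's one-sided bound with the two-sided Chebyshev bound for a variance-1 variable:

```python
def cantelli(lmbda, var=1.0):
    """Cantelli bound on the one-sided tail P(X - E[X] >= lambda)."""
    return var / (var + lmbda * lmbda)

def chebyshev(lmbda, var=1.0):
    """Chebyshev bound on the two-sided tail P(|X - E[X]| >= lambda)."""
    return var / (lmbda * lmbda)

# For lambda = 2 and sigma^2 = 1: Cantelli gives 1/5 for one tail,
# while Chebyshev gives 1/4 for both tails combined.
assert cantelli(2.0) == 0.2
assert chebyshev(2.0) == 0.25
assert all(cantelli(l) < chebyshev(l) for l in (1.0, 2.0, 5.0))
```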


Control Chart
A control chart is a graph used in production control to determine whether quality and manufacturing processes are being controlled under stable conditions (ISO 7870-1). The hourly status is arranged on the graph, and the occurrence of abnormalities is judged based on the presence of data that differs from the conventional trend or deviates from the control limit line. Control charts are classified into Shewhart individuals control charts (ISO 7870-2) and CUSUM (cumulative sum) control charts (ISO 7870-4). Control charts, also known as Shewhart charts (after Walter A. Shewhart) or process-behavior charts, are a statistical process control tool used to determine if a manufacturing or business process is in a state of control. It is more appropriate to say that control charts are the graphical device for statistical process monitoring (SPM). Traditional control charts are mostly designed to monitor process parameters when the underlying form of the process distributio ...
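A minimal sketch of the control-limit idea, assuming a Shewhart-style chart with limits at the mean ± 3 standard deviations (the data and variable names here are invented for illustration):

```python
import statistics

# Hypothetical hourly measurements; the last point drifts out of control.
measurements = [10.1, 9.9, 10.0, 10.2, 9.8, 10.1, 9.9, 14.0]

# Estimate the baseline from the stable portion of the data.
mu = statistics.mean(measurements[:-1])
sigma = statistics.stdev(measurements[:-1])
ucl = mu + 3 * sigma  # upper control limit
lcl = mu - 3 * sigma  # lower control limit

# Flag any point beyond the control limit lines.
out_of_control = [x for x in measurements if x > ucl or x < lcl]
assert out_of_control == [14.0]
```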


Chebyshev's Inequality
In probability theory, Chebyshev's inequality (also called the Bienaymé–Chebyshev inequality) guarantees that, for a wide class of probability distributions, no more than a certain fraction of values can be more than a certain distance from the mean. Specifically, no more than 1/''k''2 of the distribution's values can be ''k'' or more standard deviations away from the mean (or equivalently, at least 1 − 1/''k''2 of the distribution's values are less than ''k'' standard deviations away from the mean). In statistics, the rule, which concerns the range of standard deviations around the mean, is often called Chebyshev's theorem. The inequality has great utility because it can be applied to any probability distribution in which the mean and variance are defined. For example, it can be used to prove the weak law of large numbers. Its practical usage is similar to the 68–95–99.7 rule, which applies only to normal distributions. Chebyshev's inequality is more general, stating th ...
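The 1/''k''2 statement can be checked empirically on an arbitrary distribution. This sketch (our own illustration) uses a uniform sample:

```python
import random

random.seed(1)
xs = [random.uniform(0, 1) for _ in range(100_000)]

mu = sum(xs) / len(xs)
var = sum((x - mu) ** 2 for x in xs) / len(xs)
sd = var ** 0.5

for k in (1.5, 2.0, 3.0):
    # Fraction of values k or more standard deviations from the mean.
    frac = sum(abs(x - mu) >= k * sd for x in xs) / len(xs)
    assert frac <= 1 / k**2, (k, frac)  # never exceeds Chebyshev's 1/k^2
```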




Gauss's Inequality
In probability theory, Gauss's inequality (or the Gauss inequality) gives an upper bound on the probability that a unimodal random variable lies more than any given distance from its mode. Let ''X'' be a unimodal random variable with mode ''m'', and let ''τ''2 be the expected value of (''X'' − ''m'')2. (''τ''2 can also be expressed as (''μ'' − ''m'')2 + ''σ''2, where ''μ'' and ''σ'' are the mean and standard deviation of ''X''.) Then for any positive value of ''k'',

: \Pr(|X - m| > k) \le \begin{cases} \left(\dfrac{2\tau}{3k}\right)^2 & \text{if } k \ge \dfrac{2\tau}{\sqrt{3}} \\[6pt] 1 - \dfrac{k}{\tau\sqrt{3}} & \text{if } 0 \le k \le \dfrac{2\tau}{\sqrt{3}}. \end{cases}

The theorem was first proved by Carl Friedrich Gauss in 1823.

Extensions to higher-order moments

Winkler in 1866 extended Gauss's inequality to ''r''th moments (Winkler A. (1866) Math-Natur theorie Kl. Akad. Wiss Wien Zweite Abt 53, 6–41), where ''r'' > 0 and the distribution is unimodal with a mode ...
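As with the Vysochanskij–Petunin inequality, the bound can be checked by simulation. This sketch (illustrative; the helper name gauss_bound is ours) uses an exponential(1) variable, whose mode is m = 0 and for which τ2 = E[X2] = 2:

```python
import math
import random

def gauss_bound(k, tau2):
    """Gauss's upper bound on P(|X - m| > k) for a unimodal X with mode m."""
    tau = math.sqrt(tau2)
    if k >= 2 * tau / math.sqrt(3):
        return (2 * tau / (3 * k)) ** 2
    return 1 - k / (tau * math.sqrt(3))

random.seed(0)
xs = [random.expovariate(1.0) for _ in range(200_000)]  # mode m = 0
tau2 = 2.0  # E[(X - 0)^2] is exactly 2 for Exp(1)

for k in (1.0, 2.0, 4.0):
    tail = sum(abs(x) > k for x in xs) / len(xs)  # P(|X - m| > k)
    assert tail <= gauss_bound(k, tau2), (k, tail)
```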



Continuous Probability Distribution
In probability theory and statistics, a probability distribution is the mathematical function that gives the probabilities of occurrence of different possible outcomes for an experiment. It is a mathematical description of a random phenomenon in terms of its sample space and the probabilities of events (subsets of the sample space). For instance, if X is used to denote the outcome of a coin toss ("the experiment"), then the probability distribution of X would take the value 0.5 (1 in 2 or 1/2) for X = heads, and 0.5 for X = tails (assuming that the coin is fair). Examples of random phenomena include the weather conditions at some future date, the height of a randomly selected person, the fraction of male students in a school, the results of a survey to be conducted, etc. Introduction A probability distribution is a mathematical description of the probabilities of events, subsets of the sample space. The sample space, often denoted by \Omega, is the set of all possible outcomes of a random ...


Yuri Petunin
Yuri Ivanovich Petunin was a Soviet and Ukrainian mathematician. Petunin was born in the city of Michurinsk (USSR) on September 30, 1937. After graduating from the Tambov State Pedagogical Institute he began his studies at Voronezh State University under the supervision of S. G. Krein. He completed his postgraduate studies in 1962, and in 1968 he received his Doctor of Science degree, the highest scientific degree awarded in the Soviet Union. In 1970 he joined the faculty of the computational mathematics department at Kyiv State University. Yuri Petunin is highly regarded for his results in functional analysis. He developed the theory of scales in Banach spaces (S. G. Krein and Yu. I. Petunin, Scales of Banach spaces, 1966, Russ. Math. Surv. 21, 85–129), the theory of characteristics of linear manifolds in conjugate Banach spaces (Yu. I. Petunin and A. N. Plichko, The Theory of the Characteristics of Subspaces and Its Applications [in Russian], Vishcha Shkola, Kyiv (1980)), and with S.G. ...



Unimodal Function
In mathematics, unimodality means possessing a unique mode. More generally, unimodality means there is only a single highest value, somehow defined, of some mathematical object. Unimodal probability distribution In statistics, a unimodal probability distribution or unimodal distribution is a probability distribution which has a single peak. The term "mode" in this context refers to any peak of the distribution, not just to the strict definition of mode which is usual in statistics. If there is a single mode, the distribution function is called "unimodal". If it has more modes it is "bimodal" (two modes), "trimodal" (three modes), etc., or in general, "multimodal". Figure 1 illustrates normal distributions, which are unimodal. Other examples of unimodal distributions include the Cauchy distribution, Student's ''t''-distribution, chi-squared distribution and exponential distribution. Among discrete distributions, the binomial distribution and Poisson distribution can be seen as unimodal, tho ...
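The binomial case mentioned above can be checked directly by counting the peaks of the probability mass function (a sketch of our own; the peak-counting criterion is the usual "at least as large as both neighbours"):

```python
from math import comb

# The binomial(10, 0.5) pmf should have a single peak, at k = 5.
n, p = 10, 0.5
pmf = [comb(n, k) * p**k * (1 - p) ** (n - k) for k in range(n + 1)]

# A local maximum is a point at least as large as both neighbours.
peaks = [k for k in range(len(pmf))
         if (k == 0 or pmf[k] >= pmf[k - 1])
         and (k == len(pmf) - 1 or pmf[k] >= pmf[k + 1])]
assert peaks == [5]  # unimodal: exactly one peak
```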



Probability Distribution
In probability theory and statistics, a probability distribution is the mathematical function that gives the probabilities of occurrence of different possible outcomes for an experiment. It is a mathematical description of a random phenomenon in terms of its sample space and the probabilities of events (subsets of the sample space). For instance, if X is used to denote the outcome of a coin toss ("the experiment"), then the probability distribution of X would take the value 0.5 (1 in 2 or 1/2) for X = heads, and 0.5 for X = tails (assuming that the coin is fair). Examples of random phenomena include the weather conditions at some future date, the height of a randomly selected person, the fraction of male students in a school, the results of a survey to be conducted, etc. Introduction A probability distribution is a mathematical description of the probabilities of events, subsets of the sample space. The sample space, often denoted by \Omega, is the set of all possible outcomes of a ra ...