Chisini Mean
In mathematics, a function ''f'' of ''n'' variables ''x''1, ..., ''x''''n'' leads to a Chisini mean ''M'' if, for every vector ⟨''x''1, ..., ''x''''n''⟩, there exists a unique ''M'' such that

:''f''(''M'', ''M'', ..., ''M'') = ''f''(''x''1, ''x''2, ..., ''x''''n'').

The arithmetic, harmonic, geometric, generalised, Heronian and quadratic means are all Chisini means, as are their weighted variants. While Oscar Chisini was arguably the first to deal with "substitution means" in some depth in 1929, the idea of defining a mean in this way is quite old, appearing (for example) in early works of Augustus De Morgan (De Morgan, Augustus, "Mean", in ''Penny Cyclopaedia'', 1839).

See also
* Fréchet mean
* Generalized mean
* Jensen's inequality
* Quasi-arithmetic mean
* Stolarsky mean

Stolarsky Mean
In mathematics, the Stolarsky mean is a generalization of the logarithmic mean. It was introduced by Kenneth B. Stolarsky in 1975.

Definition

For two positive real numbers x and y the Stolarsky mean ...
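The substitution property that defines a Chisini mean can be checked numerically. Below is a minimal sketch (not from the source): it solves f(M, ..., M) = f(x1, ..., xn) for M by bisection, assuming f(M, ..., M) is monotone in M so the solution is unique, and recovers the arithmetic, geometric and harmonic means from the corresponding choices of f.

```python
import math

def chisini_mean(f, xs, lo=1e-9, hi=1e9):
    """Solve f(M, ..., M) = f(*xs) for M by bisection.

    Assumes f applied to a constant vector is monotone in that constant,
    so exactly one root exists in [lo, hi].
    """
    target = f(xs)
    same = lambda m: f([m] * len(xs))
    for _ in range(200):
        mid = (lo + hi) / 2
        # Keep the half-interval where the sign change (the root) lies.
        if (same(mid) - target) * (same(lo) - target) <= 0:
            hi = mid
        else:
            lo = mid
    return (lo + hi) / 2

xs = [1.0, 4.0, 4.0]
arith = chisini_mean(sum, xs)                            # f = sum -> arithmetic mean
geom = chisini_mean(math.prod, xs)                       # f = product -> geometric mean
harm = chisini_mean(lambda v: sum(1 / x for x in v), xs) # f = sum of reciprocals -> harmonic mean
```

The sign-change test works whether f(M, ..., M) is increasing or decreasing in M, which is why the same routine handles the harmonic case.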
Mathematics
Mathematics is a field of study that discovers and organizes methods, theories and theorems that are developed and proved for the needs of empirical sciences and mathematics itself. There are many areas of mathematics, which include number theory (the study of numbers), algebra (the study of formulas and related structures), geometry (the study of shapes and the spaces that contain them), analysis (the study of continuous changes), and set theory (presently used as a foundation for all mathematics). Mathematics involves the description and manipulation of abstract objects that consist of either abstractions from nature or, in modern mathematics, purely abstract entities that are stipulated to have certain properties, called axioms. Mathematics uses pure reason to prove properties of objects, a ''proof'' consisting of a succession of applications of ...
Arithmetic Mean
In mathematics and statistics, the arithmetic mean, arithmetic average, or just the ''mean'' or ''average'' is the sum of a collection of numbers divided by the count of numbers in the collection. The collection is often a set of results from an experiment, an observational study, or a survey. The term "arithmetic mean" is preferred in some contexts in mathematics and statistics because it helps to distinguish it from other types of means, such as the geometric and harmonic means. Arithmetic means are also frequently used in economics, anthropology, history, and almost every other academic field to some extent. For example, per capita income is the arithmetic average of the income of a nation's population. While the arithmetic mean is often used to report central tendencies, it is not a robust statistic: it is greatly influenced by outliers (values much larger or smaller than ...
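A minimal sketch of the definition and the outlier sensitivity described above; the income figures are made up for illustration:

```python
# Sketch: arithmetic mean as sum divided by count.
def arithmetic_mean(values):
    return sum(values) / len(values)

# Hypothetical incomes: the mean sits right among the typical values...
incomes = [30_000, 32_000, 35_000, 31_000]
typical = arithmetic_mean(incomes)                  # 32_000.0
# ...but a single outlier drags it far above every non-outlier value,
# which is what "not a robust statistic" means here.
skewed = arithmetic_mean(incomes + [1_000_000])     # 225_600.0
```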
Harmonic Mean
In mathematics, the harmonic mean is a kind of average, one of the Pythagorean means. It is the most appropriate average for ratios and rates such as speeds, and is normally only used for positive arguments. The harmonic mean is the reciprocal of the arithmetic mean of the reciprocals of the numbers, that is, the generalized f-mean with f(x) = \frac{1}{x}. For example, the harmonic mean of 1, 4, and 4 is

:\left(\frac{1^{-1} + 4^{-1} + 4^{-1}}{3}\right)^{-1} = \frac{3}{\frac{1}{1} + \frac{1}{4} + \frac{1}{4}} = \frac{3}{1.5} = 2\,.

Definition

The harmonic mean ''H'' of the positive real numbers x_1, x_2, \ldots, x_n is

:H(x_1, x_2, \ldots, x_n) = \frac{n}{\frac{1}{x_1} + \frac{1}{x_2} + \cdots + \frac{1}{x_n}} = \frac{n}{\sum_{i=1}^n \frac{1}{x_i}}.

It is the reciprocal of the arithmetic mean of the reciprocals, and vice versa:

:\begin{align} H(x_1, x_2, \ldots, x_n) &= \frac{1}{A\left(\frac{1}{x_1}, \frac{1}{x_2}, \ldots, \frac{1}{x_n}\right)}, \\ A(x_1, x_2, \ldots, x_n) &= \frac{1}{H\left(\frac{1}{x_1}, \frac{1}{x_2}, \ldots, \frac{1}{x_n}\right)}, \end{align}

where the arithmetic mean is A(x_1, x_2, \ldots, x_n) = \tfrac1n \sum_{i=1}^n x_i. The harmonic mean is a Schur-concave function, and is greater than or equal to the minimum of its arguments: for positive a ...
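The definition can be sketched directly. The speed example below is a standard illustration of why the harmonic mean is the right average for rates, assumed here rather than taken from the text:

```python
# Sketch: harmonic mean = reciprocal of the arithmetic mean of reciprocals.
def harmonic_mean(values):
    return len(values) / sum(1 / x for x in values)

h = harmonic_mean([1, 4, 4])        # the worked example from the text: 2.0
# Rates over equal distances: driving the same distance at 60 km/h and
# then at 20 km/h averages H(60, 20) = 30 km/h, not the arithmetic 40.
avg_speed = harmonic_mean([60, 20])
```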
Geometric Mean
In mathematics, the geometric mean is a mean or average which indicates a central tendency of a finite collection of positive real numbers by using the product of their values (as opposed to the arithmetic mean, which uses their sum). The geometric mean of ''n'' numbers is the ''n''th root of their product, i.e., for a collection of numbers x_1, x_2, \ldots, x_n, the geometric mean is defined as

:\sqrt[n]{x_1 x_2 \cdots x_n}.

When the collection of numbers and their geometric mean are plotted in logarithmic scale, the geometric mean is transformed into an arithmetic mean, so the geometric mean can equivalently be calculated by taking the natural logarithm of each number, finding the arithmetic mean of the logarithms, and then returning the result to linear scale using the exponential function:

:\sqrt[n]{x_1 x_2 \cdots x_n} = \exp\left(\frac{\ln x_1 + \ln x_2 + \cdots + \ln x_n}{n}\right).

The geometric mean of two numbers is the square root of their product; for example, with the numbers 2 and 8 the geometric mean is \textstyle \sqrt{2 \cdot 8} = 4. The geometric mean o ...
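A sketch of the two equivalent computations just described: the n-th root of the product, and the exponential of the arithmetic mean of the logarithms (the log-domain form avoids overflow for long lists of large numbers):

```python
import math

# Geometric mean via logs: exp of the mean of the natural logarithms.
def geometric_mean(values):
    return math.exp(sum(math.log(x) for x in values) / len(values))

# Geometric mean via the direct definition: n-th root of the product.
def geometric_mean_direct(values):
    return math.prod(values) ** (1 / len(values))

g = geometric_mean([2, 8])          # sqrt(2 * 8) = 4
```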
Generalised Mean
In mathematics, generalised means (or power means or Hölder means, from Otto Hölder) are a family of functions for aggregating sets of numbers. These include as special cases the Pythagorean means (arithmetic, geometric, and harmonic means).

Definition

If p is a non-zero real number, and x_1, \dots, x_n are positive real numbers, then the generalized mean or power mean with exponent p of these positive real numbers is

:M_p(x_1,\dots,x_n) = \left( \frac{1}{n} \sum_{i=1}^n x_i^p \right)^{1/p}.

(See ''p''-norm.) For p = 0 we set it equal to the geometric mean (which is the limit of means with exponents approaching zero, as proved below):

:M_0(x_1, \dots, x_n) = \left(\prod_{i=1}^n x_i\right)^{1/n}.

Furthermore, for a sequence of positive weights w_i we define the weighted power mean as

:M_p(x_1,\dots,x_n) = \left(\frac{\sum_{i=1}^n w_i x_i^p}{\sum_{i=1}^n w_i} \right)^{1/p}

and when p = 0, it is equal to the weighted geometric mean:

:M_0(x_1,\dots,x_n) = \left(\prod_{i=1}^n x_i^{w_i}\right)^{1/\sum_{i=1}^n w_i}.

The unweighted means correspond to setting all w_i = 1.

Special cases

A few particular v ...
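The definition above, with the p = 0 case handled as the geometric-mean limit, can be sketched as:

```python
import math

# Sketch: unweighted power mean M_p, treating p = 0 as the geometric mean.
def power_mean(values, p):
    n = len(values)
    if p == 0:
        return math.exp(sum(math.log(x) for x in values) / n)
    return (sum(x ** p for x in values) / n) ** (1 / p)

xs = [1.0, 4.0, 4.0]
# Special cases: p = -1 harmonic, p = 0 geometric, p = 1 arithmetic,
# p = 2 quadratic; they are non-decreasing in p.
means = [power_mean(xs, p) for p in (-1, 0, 1, 2)]
```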
Heronian Mean
In mathematics, the Heronian mean ''H'' of two non-negative real numbers ''A'' and ''B'' is given by the formula

:H = \frac{1}{3}\left(A + \sqrt{AB} + B \right).

It is named after Hero of Alexandria.

Properties

Just like all means, the Heronian mean is symmetric (it does not depend on the order in which its two arguments are given) and idempotent (the mean of any number with itself is the same number). The Heronian mean of the numbers ''A'' and ''B'' is a weighted mean of their arithmetic and geometric means:

:H = \frac{2}{3}\cdot\frac{A+B}{2} + \frac{1}{3}\cdot\sqrt{AB}.

Therefore, it lies between these two means, and between the two given numbers.

Application in solid geometry

The Heronian mean may be used in finding the volume of a frustum of a pyramid or cone. The volume is equal to the product of the height of the frustum and the Heronian mean of the areas of the opposing parallel faces. A version of this formula, for square frusta, appears in the ...
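A sketch of the formula and the frustum-volume application; the numbers are illustrative, chosen so that the frustum degenerates to a pyramid (top area 0), where the formula must reduce to the familiar V = hA/3:

```python
import math

# Heronian mean of two non-negative reals.
def heronian_mean(a, b):
    return (a + math.sqrt(a * b) + b) / 3

# Frustum volume: height times the Heronian mean of the two parallel
# face areas, as stated in the text.
def frustum_volume(height, base_area, top_area):
    return height * heronian_mean(base_area, top_area)

# Degenerate check: top shrinks to a point, giving a pyramid.
v = frustum_volume(6, 9, 0)     # 6 * (9 + 0 + 0) / 3 = 18
```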
Quadratic Mean
In mathematics, the root mean square (abbreviated RMS or rms) of a set of values is the square root of the set's mean square. Given a set x_i, its RMS is denoted as either x_\mathrm{RMS} or \mathrm{RMS}_x. The RMS is also known as the quadratic mean (denoted M_2), a special case of the generalized mean. The RMS of a continuous function is denoted f_\mathrm{RMS} and can be defined in terms of an integral of the square of the function. In estimation theory, the root-mean-square deviation of an estimator measures how far the estimator strays from the data.

Definition

The RMS value of a set of values (or a continuous-time waveform) is the square root of the arithmetic mean of the squares of the values, or the square of the function that defines the continuous waveform. In the case of a set of ''n'' values \{x_1, x_2, \ldots, x_n\}, the RMS is

:x_\text{RMS} = \sqrt{\frac{1}{n}\left(x_1^2 + x_2^2 + \cdots + x_n^2\right)}.

The corresponding formula for a continuous function (or waveform) ''f''(''t'') defined over the interval T_1 \le t \le T_2 is

:f_\text{RMS} = \sqrt{\frac{1}{T_2 - T_1} \int_{T_1}^{T_2} [f(t)]^2 \, dt},

and the RMS for ...
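A sketch of the discrete definition, plus a numerical check of the waveform definition using a sine wave (whose RMS is amplitude divided by the square root of 2, a standard fact assumed here):

```python
import math

# Sketch: RMS of a finite set of values.
def rms(values):
    return math.sqrt(sum(x * x for x in values) / len(values))

r = rms([1, 2, 3, 4])               # sqrt((1 + 4 + 9 + 16) / 4)

# Approximate the continuous definition by sampling sin(t) over one
# full period; the result should be close to 1 / sqrt(2).
n = 100_000
samples = [math.sin(2 * math.pi * k / n) for k in range(n)]
sine_rms = rms(samples)
```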
Oscar Chisini
Oscar Chisini (14 March 1889 – 10 April 1967) was an Italian mathematician. He introduced the Chisini mean in 1929.

Biography

Chisini was born in .... In 1929, he founded the ''Institute of Mathematics'' (''Istituto di Matema ...
Augustus De Morgan
Augustus De Morgan (27 June 1806 – 18 March 1871) was a British mathematician and logician. He is best known for De Morgan's laws, relating logical conjunction, disjunction, and negation, and for coining the term "mathematical induction", the underlying principles of which he formalized. De Morgan's contributions to logic are heavily used in many branches of mathematics, including set theory and probability theory, as well as other related fields such as computer science.

Biography

Childhood

Augustus De Morgan was born in Madurai, in the Carnatic region of India, in 1806. His father was Lieutenant-Colonel John De Morgan (1772–1816), who held various appointments in the service of the East India Company, and his mother, Elizabeth (née Dodson, 1776–1856), was the granddaughter of James Dodson, who computed a table of anti-logarithms (inverse logarithms). Augustus De Morgan became blind in one eye within a few months of his bi ...
Fréchet Mean
In mathematics and statistics, the Fréchet mean is a generalization of centroids to metric spaces, giving a single representative point or central tendency for a cluster of points. It is named after Maurice Fréchet. The Karcher mean is a renaming of the Riemannian centre-of-mass construction developed by Karsten Grove and Hermann Karcher. On the real numbers, the arithmetic mean, median, geometric mean, and harmonic mean can all be interpreted as Fréchet means for different distance functions.

Definition

Let (''M'', ''d'') be a complete metric space. Let ''x''1, ''x''2, …, ''x''''N'' be points in ''M''. For any point ''p'' in ''M'', define the Fréchet variance to be the sum of squared distances from ''p'' to the ''x''''i'':

:\Psi(p) = \sum_{i=1}^N d^2(p, x_i).

The Karcher means are then those points, ''m'' of ''M'', which minimise Ψ:

:m = \operatorname{arg\,min}_{p \in M} \sum_{i=1}^N d^2(p, x_i).

If there is a unique ''m'' in ''M'' that strictly minimises Ψ, then it is the Fréchet mean.

Examples of Fréchet ...
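The definition can be illustrated on the real line, where minimising the Fréchet variance under the usual distance |p − q| recovers the arithmetic mean. The grid search below is a deliberately crude sketch for illustration, not a practical optimiser:

```python
# Fréchet variance: sum of squared distances from p to the data points.
def frechet_variance(p, points, dist):
    return sum(dist(p, x) ** 2 for x in points)

# Sketch: minimise the variance over a fine grid on [lo, hi].
def frechet_mean_grid(points, dist, lo, hi, steps=100_000):
    grid = (lo + (hi - lo) * k / steps for k in range(steps + 1))
    return min(grid, key=lambda p: frechet_variance(p, points, dist))

pts = [1.0, 2.0, 6.0]
m = frechet_mean_grid(pts, lambda p, q: abs(p - q), 0.0, 10.0)
# m agrees (up to grid resolution) with the arithmetic mean 3.0
```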
Jensen's Inequality
In mathematics, Jensen's inequality, named after the Danish mathematician Johan Jensen, relates the value of a convex function of an integral to the integral of the convex function. It was proved by Jensen in 1906, building on an earlier proof of the same inequality for doubly-differentiable functions by Otto Hölder in 1889. Given its generality, the inequality appears in many forms depending on the context, some of which are presented below. In its simplest form the inequality states that the convex transformation of a mean is less than or equal to the mean applied after the convex transformation (or equivalently, the opposite inequality for concave transformations). Jensen's inequality generalizes the statement that the secant line of a convex function lies ''above'' the graph of the function, which is Jensen's inequality for two points: the secant line consists of weighted means of the convex function, for ''t'' ∈ [0,1],

:t f(x_1) + (1-t) f(x_2),

while the g ...
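The two-point form can be checked numerically. The sketch below computes the gap between the chord value t·f(x1) + (1−t)·f(x2) and the function value at the weighted mean; the specific test points are illustrative choices, not from the text:

```python
import math

# Two-point Jensen gap: chord value minus graph value at the weighted mean.
# Non-negative when f is convex; non-positive when f is concave.
def jensen_gap(f, x1, x2, t):
    chord = t * f(x1) + (1 - t) * f(x2)
    graph = f(t * x1 + (1 - t) * x2)
    return chord - graph

gap = jensen_gap(math.exp, 0.0, 2.0, 0.25)       # exp is convex: gap > 0
log_gap = jensen_gap(math.log, 1.0, 9.0, 0.5)    # log is concave: gap < 0
```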