In
statistics, inverse-variance weighting is a method of aggregating two or more
random variables to minimize the
variance
of the
weighted average
. Each random variable is weighted in
inverse proportion
to its variance, i.e. proportional to its
precision.
Given a sequence of independent observations $y_i$ with variances $\sigma_i^2$, the inverse-variance weighted average is given by
: $\hat{y} = \frac{\sum_i y_i / \sigma_i^2}{\sum_i 1/\sigma_i^2}$
The inverse-variance weighted average has the least variance among all weighted averages, which can be calculated as
: $\mathrm{Var}(\hat{y}) = \frac{1}{\sum_i 1/\sigma_i^2}$
If the variances of the measurements are all equal, then the inverse-variance weighted average becomes the simple average.
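The two formulas above can be computed directly; the following is a minimal sketch (the observation values and variances are illustrative, not from the text):

```python
def inverse_variance_mean(ys, variances):
    """Inverse-variance weighted average of observations ys and its variance.

    Each observation is weighted by 1/variance, i.e. by its precision.
    """
    weights = [1.0 / v for v in variances]
    mean = sum(w * y for w, y in zip(weights, ys)) / sum(weights)
    var = 1.0 / sum(weights)
    return mean, var

# Hypothetical observations with unequal variances: the most precise
# observation (smallest variance) pulls the average hardest.
mean, var = inverse_variance_mean([9.9, 10.3, 10.0], [0.5, 2.0, 1.0])

# With equal variances, the result reduces to the simple average.
m_eq, _ = inverse_variance_mean([1.0, 2.0, 6.0], [0.7, 0.7, 0.7])
assert abs(m_eq - 3.0) < 1e-9  # (1 + 2 + 6) / 3
```

Note that the returned variance depends only on the input variances, not on the observed values themselves.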
Inverse-variance weighting is typically used in statistical
meta-analysis
or
sensor fusion
to combine the results from independent measurements.
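In a sensor-fusion setting, for instance, two independent, unbiased readings of the same quantity can be fused so that the combined estimate has lower variance than either reading alone. A hypothetical two-sensor sketch (the numbers are illustrative):

```python
# Two independent, unbiased sensors measuring the same quantity,
# with different noise levels (variances).
y1, var1 = 10.2, 1.0   # noisier sensor
y2, var2 = 9.8, 0.25   # more precise sensor

# Inverse-variance weights: the more precise sensor dominates.
w1, w2 = 1.0 / var1, 1.0 / var2
fused = (w1 * y1 + w2 * y2) / (w1 + w2)
fused_var = 1.0 / (w1 + w2)

# The fused variance is smaller than either individual variance,
# so fusing the sensors always reduces uncertainty.
assert fused_var < min(var1, var2)
```

The same scalar combination rule appears, for example, in the measurement-update step of a one-dimensional Kalman filter.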
Context
Suppose an experimenter wishes to measure the value of a quantity, say the acceleration due to
gravity of Earth
, whose true value happens to be $\mu$. A careful experimenter makes multiple measurements, which we denote with $n$ random variables $X_1, X_2, \ldots, X_n$. If they are all noisy but unbiased, i.e., the measuring device does not systematically overestimate or underestimate the true value and the errors are scattered symmetrically, then the
expectation value $\mathrm{E}[X_i] = \mu$. The scatter in the measurement is then characterised by the
variance
of the random variables, $\mathrm{Var}(X_i) = \sigma_i^2$, and if the measurements are performed under identical scenarios, then all the $\sigma_i^2$ are the same, which we shall refer to by $\sigma^2$. Given the $n$
measurements, a typical
estimator
for $\mu$, denoted as $\hat{\mu}$, is given by the simple
average
$\hat{\mu} = \frac{1}{n} \sum_{i=1}^n X_i$. Note that this empirical average is also a random variable, whose expectation value