In probability theory and statistics, the coefficient of variation (CV), also known as relative standard deviation (RSD), is a standardized measure of dispersion of a probability distribution or frequency distribution. It is often expressed as a percentage, and is defined as the ratio of the standard deviation \sigma to the mean \mu (or its absolute value, |\mu|). The CV or RSD is widely used in analytical chemistry to express the precision and repeatability of an assay. It is also commonly used in fields such as engineering or physics when doing quality assurance studies and ANOVA gauge R&R, by economists and investors in economic models, and in neuroscience.


Definition

The coefficient of variation (CV) is defined as the ratio of the standard deviation \sigma to the mean \mu:

:c_{\rm v} = \frac{\sigma}{\mu}.

It shows the extent of variability in relation to the mean of the population. The coefficient of variation should be computed only for data measured on scales that have a meaningful zero (ratio scale) and hence allow relative comparison of two measurements (i.e., division of one measurement by the other). The coefficient of variation may not have any meaning for data on an interval scale. For example, most temperature scales (e.g., Celsius, Fahrenheit, etc.) are interval scales with arbitrary zeros, so the computed coefficient of variation would be different depending on the scale used. On the other hand, Kelvin temperature has a meaningful zero, the complete absence of thermal energy, and thus is a ratio scale. In plain language, it is meaningful to say that 20 Kelvin is twice as hot as 10 Kelvin, but only on this scale with a true absolute zero. While a standard deviation (SD) can be measured in Kelvin, Celsius, or Fahrenheit, the value computed is only applicable to that scale. Only the Kelvin scale can be used to compute a valid coefficient of variation. Measurements that are log-normally distributed exhibit a stationary CV; in contrast, the SD varies depending upon the expected value of measurements. A more robust possibility is the quartile coefficient of dispersion, half the interquartile range divided by the average of the quartiles (the midhinge), \frac{(Q_3 - Q_1)/2}{(Q_1 + Q_3)/2}. In most cases, a CV is computed for a single independent variable (e.g., a single factory product) with numerous, repeated measures of a dependent variable (e.g., error in the production process). However, data that are linear or even logarithmically non-linear and include a continuous range for the independent variable with sparse measurements across each value (e.g., scatter-plot) may be amenable to single CV calculation using a maximum-likelihood estimation approach.
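A minimal sketch of the two dispersion measures discussed above, assuming NumPy is available; the function names and the example data are illustrative choices of this sketch, not part of the original text.

```python
import numpy as np

def coefficient_of_variation(x):
    """Ratio of the (population) standard deviation to the mean."""
    x = np.asarray(x, dtype=float)
    return np.std(x) / np.mean(x)

def quartile_coefficient_of_dispersion(x):
    """Half the interquartile range divided by the midhinge."""
    x = np.asarray(x, dtype=float)
    q1, q3 = np.percentile(x, [25, 75])
    return (q3 - q1) / (q3 + q1)

data = [1, 5, 6, 8, 10, 40, 65, 88]
print(coefficient_of_variation(data))            # ~1.10 (population CV)
print(quartile_coefficient_of_dispersion(data))  # robust alternative
```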


Examples

In the examples below, we will take the values given as randomly chosen from a larger population of values.

* The data set [100, 100, 100] has constant values. Its standard deviation is 0 and its average is 100, giving a coefficient of variation of 0 / 100 = 0
* The data set [90, 100, 110] has more variability. Its standard deviation is 10 and its average is 100, giving a coefficient of variation of 10 / 100 = 0.1
* The data set [1, 5, 6, 8, 10, 40, 65, 88] has still more variability. Its standard deviation is 32.9 and its average is 27.9, giving a coefficient of variation of 32.9 / 27.9 = 1.18

In these examples, we will take the values given as the entire population of values.

* The data set [100, 100, 100] has a population standard deviation of 0 and a coefficient of variation of 0 / 100 = 0
* The data set [90, 100, 110] has a population standard deviation of 8.16 and a coefficient of variation of 8.16 / 100 = 0.0816
* The data set [1, 5, 6, 8, 10, 40, 65, 88] has a population standard deviation of 30.8 and a coefficient of variation of 30.8 / 27.9 = 1.10
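The figures above can be checked with a short script, assuming NumPy: the sample CV uses the sample standard deviation (ddof=1), while the population CV uses the population standard deviation (ddof=0).

```python
import numpy as np

for data in ([100, 100, 100], [90, 100, 110], [1, 5, 6, 8, 10, 40, 65, 88]):
    x = np.asarray(data, dtype=float)
    sample_cv = np.std(x, ddof=1) / np.mean(x)      # 0, 0.1, ~1.18
    population_cv = np.std(x, ddof=0) / np.mean(x)  # 0, ~0.0816, ~1.10
    print(data, round(sample_cv, 4), round(population_cv, 4))
```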


Estimation

When only a sample of data from a population is available, the population CV can be estimated using the ratio of the sample standard deviation s \, to the sample mean \bar{x}:

:\widehat{c_{\rm v}} = \frac{s}{\bar{x}}

But this estimator, when applied to a small or moderately sized sample, tends to be too low: it is a biased estimator. For normally distributed data, an unbiased estimator for a sample of size n is:

:\widehat{c_{\rm v}}^{\,*} = \bigg(1+\frac{1}{4n}\bigg)\widehat{c_{\rm v}}
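As a minimal sketch of this bias correction, assuming NumPy and an illustrative normal sample; the helper name cv_unbiased and the simulation parameters are inventions of this example.

```python
import numpy as np

def cv_unbiased(x):
    """Bias-corrected CV estimate for (approximately) normal data."""
    x = np.asarray(x, dtype=float)
    n = x.size
    cv_hat = np.std(x, ddof=1) / np.mean(x)
    return (1.0 + 1.0 / (4.0 * n)) * cv_hat

rng = np.random.default_rng(0)
sample = rng.normal(loc=50.0, scale=5.0, size=20)
print(cv_unbiased(sample))   # close to the true CV of 0.1 on average
```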


Log-normal data

In many applications, it can be assumed that data are log-normally distributed (evidenced by the presence of skewness in the sampled data). In such cases, a more accurate estimate, derived from the properties of the log-normal distribution, is defined as:

:\widehat{cv}_{\rm raw} = \sqrt{\mathrm{e}^{s_{\ln}^2}-1}

where s_{\ln} \, is the sample standard deviation of the data after a natural log transformation. (In the event that measurements are recorded using any other logarithmic base, b, their standard deviation s_b \, is converted to base e using s_{\ln} = s_b \ln(b) \,, and the formula for \widehat{cv}_{\rm raw} \, remains the same.) This estimate is sometimes referred to as the "geometric CV" (GCV) in order to distinguish it from the simple estimate above. However, "geometric coefficient of variation" has also been defined by Kirkwood as:

:\mathrm{GCV_K} = \mathrm{e}^{s_{\ln}} - 1

This term was intended to be ''analogous'' to the coefficient of variation, for describing multiplicative variation in log-normal data, but this definition of GCV has no theoretical basis as an estimate of c_{\rm v} \, itself. For many practical purposes (such as sample size determination and calculation of confidence intervals) it is s_{\ln} \, which is of most use in the context of log-normally distributed data. If necessary, this can be derived from an estimate of c_{\rm v} \, or GCV by inverting the corresponding formula.
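A sketch of the two log-normal-based quantities above, assuming NumPy; the function names, seed, and simulated sample are illustrative.

```python
import numpy as np

def cv_lognormal(x):
    """CV estimate sqrt(exp(s_ln^2) - 1) from log-transformed data."""
    s_ln = np.std(np.log(x), ddof=1)
    return np.sqrt(np.exp(s_ln**2) - 1.0)

def gcv_kirkwood(x):
    """Kirkwood's 'geometric CV', exp(s_ln) - 1."""
    s_ln = np.std(np.log(x), ddof=1)
    return np.exp(s_ln) - 1.0

rng = np.random.default_rng(1)
sample = rng.lognormal(mean=0.0, sigma=0.25, size=500)
print(cv_lognormal(sample))   # roughly sqrt(exp(0.25**2) - 1) ≈ 0.25
print(gcv_kirkwood(sample))   # roughly exp(0.25) - 1 ≈ 0.28
```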


Comparison to standard deviation


Advantages

The coefficient of variation is useful because the standard deviation of data must always be understood in the context of the mean of the data. In contrast, the actual value of the CV is independent of the unit in which the measurement has been taken, so it is a dimensionless number. For comparison between data sets with different units or widely different means, one should use the coefficient of variation instead of the standard deviation.


Disadvantages

* When the mean value is close to zero, the coefficient of variation will approach infinity and is therefore sensitive to small changes in the mean. This is often the case if the values do not originate from a ratio scale.
* Unlike the standard deviation, it cannot be used directly to construct confidence intervals for the mean.
* CVs are not an ideal index of the certainty of measurement when the number of replicates varies across samples, because the CV is invariant to the number of replicates while the certainty of the mean improves with increasing replicates. In this case, standard error in percent is suggested to be superior (see the sketch below).
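The last point can be illustrated numerically, assuming NumPy; "standard error in percent" is here taken to mean 100 \cdot (s/\sqrt{n})/\bar{x}, which is an assumption of this sketch rather than a definition from the text.

```python
import numpy as np

rng = np.random.default_rng(2)
for n in (5, 20, 80):
    x = rng.normal(loc=100.0, scale=10.0, size=n)
    cv_pct = 100.0 * np.std(x, ddof=1) / np.mean(x)               # roughly constant
    se_pct = 100.0 * np.std(x, ddof=1) / (np.sqrt(n) * np.mean(x))  # shrinks with n
    print(n, round(cv_pct, 2), round(se_pct, 2))
```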


Applications

The coefficient of variation is also common in applied probability fields such as renewal theory, queueing theory, and reliability theory. In these fields, the exponential distribution is often more important than the normal distribution. The standard deviation of an exponential distribution is equal to its mean, so its coefficient of variation is equal to 1. Distributions with CV < 1 (such as an Erlang distribution) are considered low-variance, while those with CV > 1 (such as a hyper-exponential distribution) are considered high-variance. Some formulas in these fields are expressed using the squared coefficient of variation, often abbreviated SCV. In modeling, a variation of the CV is the CV(RMSD). Essentially, the CV(RMSD) replaces the standard deviation term with the root mean square deviation (RMSD). While many natural processes indeed show a correlation between the average value and the amount of variation around it, accurate sensor devices need to be designed in such a way that the coefficient of variation is close to zero, i.e., yielding a constant absolute error over their working range. In actuarial science, the CV is known as unitized risk. In industrial solids processing, the CV is particularly important for measuring the degree of homogeneity of a powder mixture. Comparing the calculated CV to a specification makes it possible to determine whether a sufficient degree of mixing has been reached.
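A minimal sketch, assuming NumPy, of two quantities mentioned above: the SCV of simulated exponential data (close to 1), and CV(RMSD), here taken, as an assumption of this sketch, to be the RMSD between observations and model predictions divided by the mean of the observations.

```python
import numpy as np

rng = np.random.default_rng(3)
waits = rng.exponential(scale=2.0, size=10_000)
scv = (np.std(waits) / np.mean(waits)) ** 2
print(scv)   # ≈ 1 for exponential data

observed  = np.array([10.0, 12.0, 9.0, 11.0, 13.0])   # illustrative values
predicted = np.array([10.5, 11.5, 9.5, 10.5, 12.5])
rmsd = np.sqrt(np.mean((observed - predicted) ** 2))
print(rmsd / np.mean(observed))   # CV(RMSD) under the normalization assumed here
```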


Laboratory measures of intra-assay and inter-assay CVs

CV measures are often used as quality controls for quantitative laboratory assays. While intra-assay and inter-assay CVs might be assumed to be calculated by simply averaging CV values across multiple samples within one assay or by averaging multiple inter-assay CV estimates, it has been suggested that these practices are incorrect and that a more complex computational process is required. It has also been noted that CV values are not an ideal index of the certainty of a measurement when the number of replicates varies across samples; in this case standard error in percent is suggested to be superior. If measurements do not have a natural zero point then the CV is not a valid measurement and alternative measures such as the intraclass correlation coefficient are recommended.


As a measure of economic inequality

The coefficient of variation fulfills the requirements for a measure of economic inequality. If x (with entries x_i) is a list of the values of an economic indicator (e.g. wealth), with x_i being the wealth of agent i, then the following requirements are met:

* Anonymity – c_v is independent of the ordering of the list x. This follows from the fact that the variance and mean are independent of the ordering of x.
* Scale invariance – c_v(x) = c_v(\alpha x), where \alpha is a positive real number.
* Population independence – If \{x, x\} is the list x appended to itself, then c_v(\{x, x\}) = c_v(x). This follows from the fact that the variance and mean both obey this principle.
* Pigou–Dalton transfer principle – When wealth is transferred from a wealthier agent i to a poorer agent j (i.e. x_i > x_j) without altering their rank, then c_v decreases, and vice versa.

c_v assumes its minimum value of zero for complete equality (all x_i are equal). Its most notable drawback is that it is not bounded from above, so it cannot be normalized to be within a fixed range (unlike, for example, the Gini coefficient, which is constrained to be between 0 and 1). It is, however, more mathematically tractable than the Gini coefficient. These properties can be checked numerically, as in the sketch below.
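A numerical sanity check of the listed properties, assuming NumPy; the wealth vector and the size of the transfer are illustrative choices of this sketch.

```python
import numpy as np

def cv(x):
    x = np.asarray(x, dtype=float)
    return np.std(x) / np.mean(x)

wealth = np.array([1.0, 2.0, 4.0, 8.0])

# Scale invariance and population independence
print(np.isclose(cv(wealth), cv(3.0 * wealth)))
print(np.isclose(cv(wealth), cv(np.concatenate([wealth, wealth]))))

# Pigou–Dalton: a rank-preserving transfer from richest to poorest lowers the CV
transfer = wealth.copy()
transfer[3] -= 1.0   # take from the wealthiest agent...
transfer[0] += 1.0   # ...give to the poorest agent
print(cv(transfer) < cv(wealth))
```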


As a measure of standardisation of archaeological artefacts

Archaeologists often use CV values to compare the degree of standardisation of ancient artefacts. Variation in CVs has been interpreted to indicate different cultural transmission contexts for the adoption of new technologies. Coefficients of variation have also been used to investigate pottery standardisation relating to changes in social organisation. Archaeologists also use several methods for comparing CV values, for example the modified signed-likelihood ratio (MSLR) test for equality of CVs.


Examples of misuse

Comparing coefficients of variation between parameters using relative units can result in differences that may not be real. If we compare the same set of temperatures in Celsius and Fahrenheit (both relative units, where the Kelvin and Rankine scales are their associated absolute scales):

Celsius: [0, 10, 20, 30, 40]

Fahrenheit: [32, 50, 68, 86, 104]

The sample standard deviations are 15.81 and 28.46, respectively. The CV of the first set is 15.81/20 = 79%. For the second set (which are the same temperatures) it is 28.46/68 = 42%. If, for example, the data sets are temperature readings from two different sensors (a Celsius sensor and a Fahrenheit sensor) and you want to know which sensor is better by picking the one with the least variance, then you will be misled if you use CV. The problem here is that you have divided by a relative value rather than an absolute one. Comparing the same data set, now in absolute units:

Kelvin: [273.15, 283.15, 293.15, 303.15, 313.15]

Rankine: [491.67, 509.67, 527.67, 545.67, 563.67]

The sample standard deviations are still 15.81 and 28.46, respectively, because the standard deviation is not affected by a constant offset. The coefficients of variation, however, are now both equal to 5.39%. Mathematically speaking, the coefficient of variation is not entirely linear. That is, for a random variable X, the coefficient of variation of aX + b is equal to the coefficient of variation of X only when b = 0. In the above example, Celsius can only be converted to Fahrenheit through a linear transformation of the form ax + b with b \neq 0, whereas kelvins can be converted to Rankines through a transformation of the form ax.
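A sketch reproducing the numbers above, assuming NumPy: the CV disagrees for the interval scales (Celsius, Fahrenheit) but agrees for the ratio scales (Kelvin, Rankine).

```python
import numpy as np

celsius = np.array([0.0, 10.0, 20.0, 30.0, 40.0])
fahrenheit = celsius * 9.0 / 5.0 + 32.0
kelvin = celsius + 273.15
rankine = fahrenheit + 459.67

for name, t in [("Celsius", celsius), ("Fahrenheit", fahrenheit),
                ("Kelvin", kelvin), ("Rankine", rankine)]:
    cv_pct = 100.0 * np.std(t, ddof=1) / np.mean(t)
    print(f"{name}: {cv_pct:.2f}%")   # ~79%, ~42%, ~5.39%, ~5.39%
```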


Distribution

Provided that negative and small positive values of the sample mean occur with negligible frequency, the probability distribution of the coefficient of variation for a sample of size n of i.i.d. normal random variables has been shown by Hendricks and Robey to be

:\mathrm{d}F_{c_{\rm v}} = \frac{2}{\pi^{1/2} \, \Gamma\left(\frac{n-1}{2}\right)} \; \mathrm{e}^{-\frac{n}{2\left(\frac{\sigma}{\mu}\right)^{2}}\cdot\frac{c_{\rm v}^{2}}{1+c_{\rm v}^{2}}} \frac{c_{\rm v}^{\,n-2}}{\left(1+c_{\rm v}^{2}\right)^{n/2}} \sideset{}{'}\sum_{i=0}^{n-1} \frac{(n-1)! \, \Gamma\left(\frac{n-i}{2}\right)}{(n-1-i)! \, i!} \, \frac{n^{i/2}}{2^{i/2}\left(\frac{\sigma}{\mu}\right)^{i}} \, \frac{1}{\left(1+c_{\rm v}^{2}\right)^{i/2}} \, \mathrm{d}c_{\rm v} ,

where the symbol \sideset{}{'}\sum indicates that the summation is over only even values of n - 1 - i, i.e., if n is odd, sum over even values of i, and if n is even, sum only over odd values of i. This is useful, for instance, in the construction of hypothesis tests or confidence intervals. Statistical inference for the coefficient of variation in normally distributed data is often based on McKay's chi-square approximation for the coefficient of variation.


Alternative

According to Liu (2012), Lehmann (1986) "also derived the sample distribution of CV in order to give an exact method for the construction of a confidence interval for CV"; it is based on a non-central t-distribution. (Lehmann, E. L. (1986). ''Testing Statistical Hypotheses''. 2nd ed. New York: Wiley.)


Similar ratios

Standardized moments are similar ratios, \mu_k/\sigma^k, where \mu_k is the ''k''th moment about the mean; they are also dimensionless and scale invariant. The variance-to-mean ratio, \sigma^2/\mu, is another similar ratio, but it is not dimensionless and hence not scale invariant. See Normalization (statistics) for further ratios. In signal processing, particularly image processing, the reciprocal ratio \mu/\sigma (or its square) is referred to as the signal-to-noise ratio in general and the signal-to-noise ratio (imaging) in particular. Other related ratios include:

* Efficiency, \sigma^2 / \mu^2
* Standardized moment, \mu_k/\sigma^k
* Variance-to-mean ratio (or relative variance), \sigma^2/\mu
* Fano factor, \sigma^2_W/\mu_W (windowed VMR)
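A small sketch, assuming NumPy, of some of the related ratios evaluated on one simulated Poisson sample; the whole-sample variance-to-mean ratio is used in place of the windowed Fano factor, which is an assumption of this example.

```python
import numpy as np

rng = np.random.default_rng(4)
x = rng.poisson(lam=4.0, size=10_000).astype(float)

mu, sigma = np.mean(x), np.std(x)
efficiency = sigma**2 / mu**2                 # ≈ 0.25 for Poisson(4)
skew = np.mean((x - mu) ** 3) / sigma**3      # 3rd standardized moment, ≈ 0.5
vmr = sigma**2 / mu                           # ≈ 1 for Poisson data
print(efficiency, skew, vmr)
```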


See also

* Omega ratio
* Sampling (statistics)
* Sharpe ratio
* Variance function


References


External links


* cvequality – R package to test for significant differences between multiple coefficients of variation