In probability theory and statistics, the coefficient of variation (CV), also known as normalized root-mean-square deviation (NRMSD), percent RMS, and relative standard deviation (RSD), is a standardized measure of dispersion of a probability distribution or frequency distribution. It is defined as the ratio of the standard deviation <math>\sigma</math> to the mean <math>\mu</math> (or its absolute value, <math>|\mu|</math>), and is often expressed as a percentage ("%RSD"). The CV or RSD is widely used in analytical chemistry to express the precision and repeatability of an assay. It is also commonly used in fields such as engineering or physics when doing quality assurance studies and ANOVA gauge R&R, by economists and investors in economic models, in epidemiology, and in psychology/neuroscience.
Definition
The coefficient of variation (CV) is defined as the ratio of the standard deviation <math>\sigma</math> to the mean <math>\mu</math>:
:<math>c_{\rm v} = \frac{\sigma}{\mu}.</math>
It shows the extent of variability in relation to the mean of the population.
The coefficient of variation should be computed only for data measured on scales that have a meaningful zero (ratio scale) and hence allow relative comparison of two measurements (i.e., division of one measurement by the other). The coefficient of variation may not have any meaning for data on an interval scale. For example, most temperature scales (e.g., Celsius, Fahrenheit) are interval scales with arbitrary zeros, so the computed coefficient of variation would differ depending on the scale used. The Kelvin scale, on the other hand, has a meaningful zero, the complete absence of thermal energy, and thus is a ratio scale. In plain language, it is meaningful to say that 20 kelvin is twice as hot as 10 kelvin, but only on a scale with a true absolute zero. While a standard deviation (SD) can be measured in kelvin, Celsius, or Fahrenheit, the value computed is only applicable to that scale. Only the Kelvin scale can be used to compute a valid coefficient of variation.
Measurements that are
log-normally distributed exhibit stationary CV; in contrast, SD varies depending upon the expected value of measurements.
A more robust possibility is the quartile coefficient of dispersion, half the interquartile range <math>(Q_3 - Q_1)/2</math> divided by the average of the quartiles (the midhinge), <math>(Q_1 + Q_3)/2</math>:
:<math>\frac{(Q_3 - Q_1)/2}{(Q_1 + Q_3)/2} = \frac{Q_3 - Q_1}{Q_3 + Q_1}.</math>
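A minimal Python sketch of this quantity (not from the original article), using NumPy's default percentile interpolation; quartile conventions vary, so exact values may differ slightly between implementations:

<syntaxhighlight lang="python">
import numpy as np

def quartile_coefficient_of_dispersion(x):
    # Half the interquartile range divided by the midhinge:
    # ((Q3 - Q1) / 2) / ((Q1 + Q3) / 2) = (Q3 - Q1) / (Q3 + Q1)
    q1, q3 = np.percentile(x, [25, 75])
    return (q3 - q1) / (q3 + q1)

# Illustrative data set (the same one reused in the examples below)
print(quartile_coefficient_of_dispersion([1, 5, 6, 8, 10, 40, 65, 88]))
</syntaxhighlight>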
In most cases, a CV is computed for a single independent variable (e.g., a single factory product) with numerous, repeated measures of a dependent variable (e.g., error in the production process). However, data that are linear or even logarithmically non-linear and include a continuous range for the independent variable with sparse measurements across each value (e.g., scatter-plot) may be amenable to single CV calculation using a
maximum-likelihood estimation approach.
Examples
In the examples below, we will take the values given as randomly chosen from a larger population of values.
* The data set {100, 100, 100} has constant values. Its standard deviation is 0 and its average is 100, giving the coefficient of variation as 0 / 100 = 0
* The data set {90, 100, 110} has more variability. Its standard deviation is 10 and its average is 100, giving the coefficient of variation as 10 / 100 = 0.1
* The data set {1, 5, 6, 8, 10, 40, 65, 88} has still more variability. Its standard deviation is 32.9 and its average is 27.9, giving a coefficient of variation of 32.9 / 27.9 = 1.18
In these examples, we will take the values given as the entire population of values.
* The data set {100, 100, 100} has a population standard deviation of 0 and a coefficient of variation of 0 / 100 = 0
* The data set {90, 100, 110} has a population standard deviation of 8.16 and a coefficient of variation of 8.16 / 100 = 0.0816
* The data set {1, 5, 6, 8, 10, 40, 65, 88} has a population standard deviation of 30.8 and a coefficient of variation of 30.8 / 27.9 = 1.10
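The figures above can be reproduced with a minimal Python sketch using the standard library's statistics module (statistics.stdev applies the sample formula with <math>n - 1</math>, statistics.pstdev the population formula with <math>n</math>):

<syntaxhighlight lang="python">
import statistics

data_sets = [
    [100, 100, 100],
    [90, 100, 110],
    [1, 5, 6, 8, 10, 40, 65, 88],
]

for values in data_sets:
    mean = statistics.mean(values)
    sample_cv = statistics.stdev(values) / mean       # sample SD (divides by n - 1)
    population_cv = statistics.pstdev(values) / mean  # population SD (divides by n)
    print(f"mean={mean:.1f}  sample CV={sample_cv:.2f}  population CV={population_cv:.2f}")
</syntaxhighlight>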
Estimation
When only a sample of data from a population is available, the population CV can be estimated using the ratio of the sample standard deviation <math>s</math> to the sample mean <math>\bar{x}</math>:
:<math>\widehat{c_{\rm v}} = \frac{s}{\bar{x}}</math>
But this estimator, when applied to a small or moderately sized sample, tends to be too low: it is a biased estimator. For normally distributed data, an unbiased estimator for a sample of size <math>n</math> is:
:<math>\widehat{c_{\rm v}}^* = \left(1 + \frac{1}{4n}\right)\widehat{c_{\rm v}}</math>
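A minimal Python sketch of both the plug-in estimator and the small-sample correction above, assuming approximately normal data:

<syntaxhighlight lang="python">
import statistics

def cv_estimates(sample):
    n = len(sample)
    cv_hat = statistics.stdev(sample) / statistics.mean(sample)  # plug-in estimate s / x-bar
    cv_corrected = (1 + 1 / (4 * n)) * cv_hat                    # correction for normal data
    return cv_hat, cv_corrected

# For the small sample {90, 100, 110}: plug-in 0.100, corrected about 0.108
print(cv_estimates([90, 100, 110]))
</syntaxhighlight>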
Log-normal data
Many datasets follow an approximately log-normal distribution. In such cases, a more accurate estimate, derived from the properties of the log-normal distribution, is defined as:
:<math>\widehat{cv}_{\rm raw} = \sqrt{\mathrm{e}^{s_{\ln}^2} - 1}</math>
where <math>s_{\ln}</math> is the sample standard deviation of the data after a natural log transformation. (In the event that measurements are recorded using any other logarithmic base, <math>b</math>, their standard deviation <math>s_b</math> is converted to base e using <math>s_{\ln} = s_b \ln(b)</math>, and the formula for <math>\widehat{cv}_{\rm raw}</math> remains the same.) This estimate is sometimes referred to as the "geometric CV" (GCV) in order to distinguish it from the simple estimate above. However, "geometric coefficient of variation" has also been defined by Kirkwood as:
:<math>\mathrm{GCV_K} = \mathrm{e}^{s_{\ln}} - 1</math>
This term was intended to be ''analogous'' to the coefficient of variation, for describing multiplicative variation in log-normal data, but this definition of GCV has no theoretical basis as an estimate of <math>c_{\rm v}</math> itself.
For many practical purposes (such as sample size determination and calculation of confidence intervals) it is <math>s_{\ln}</math> which is of most use in the context of log-normally distributed data. If necessary, this can be derived from an estimate of <math>c_{\rm v}</math> or GCV by inverting the corresponding formula.
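A minimal Python sketch of the log-normal estimates above: <math>s_{\ln}</math> is the sample SD of the natural-log data, <math>\widehat{cv}_{\rm raw}</math> estimates <math>c_{\rm v}</math> itself, and Kirkwood's GCV is shown for comparison. The data set here is illustrative only:

<syntaxhighlight lang="python">
import math
import statistics

def lognormal_cv_estimates(sample):
    # Sample SD of the data after a natural-log transformation
    s_ln = statistics.stdev([math.log(x) for x in sample])
    cv_raw = math.sqrt(math.exp(s_ln ** 2) - 1)   # estimate of c_v for log-normal data
    gcv_kirkwood = math.exp(s_ln) - 1             # Kirkwood's "geometric CV"
    return cv_raw, gcv_kirkwood

print(lognormal_cv_estimates([1, 5, 6, 8, 10, 40, 65, 88]))
</syntaxhighlight>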
Comparison to standard deviation
Advantages
The coefficient of variation is useful because the standard deviation of data must always be understood in the context of the mean of the data.
In contrast, the actual value of the CV is independent of the unit in which the measurement has been taken, so it is a dimensionless number.
For comparison between data sets with different units or widely different means, one should use the coefficient of variation instead of the standard deviation.
Disadvantages
* When the mean value is close to zero, the coefficient of variation will approach infinity and is therefore sensitive to small changes in the mean. This is often the case if the values do not originate from a ratio scale.
* Unlike the standard deviation, it cannot be used directly to construct
confidence intervals for the mean.
Applications
The coefficient of variation is also common in applied probability fields such as renewal theory, queueing theory, and reliability theory. In these fields, the exponential distribution is often more important than the normal distribution.
The standard deviation of an exponential distribution is equal to its mean, so its coefficient of variation is equal to 1. Distributions with CV < 1 (such as an
Erlang distribution) are considered low-variance, while those with CV > 1 (such as a
hyper-exponential distribution) are considered high-variance. Some formulas in these fields are expressed using the squared coefficient of variation, often abbreviated SCV. In modeling, a variation of the CV is the CV(RMSD). Essentially the CV(RMSD) replaces the standard deviation term with the
Root Mean Square Deviation (RMSD). While many natural processes indeed show a correlation between the average value and the amount of variation around it, accurate sensor devices need to be designed in such a way that the coefficient of variation is close to zero, i.e., yielding a constant
absolute error over their working range.
In actuarial science, the CV is known as unitized risk.
In industrial solids processing, the CV is particularly important for measuring the degree of homogeneity of a powder mixture. Comparing the calculated CV to a specification allows one to determine whether a sufficient degree of mixing has been reached.
In fluid dynamics, the CV, also referred to as Percent RMS, %RMS, %RMS Uniformity, or Velocity RMS, is a useful measure of flow uniformity for industrial processes. The term is used widely in the design of pollution control equipment, such as electrostatic precipitators (ESPs), selective catalytic reduction (SCR) systems, scrubbers, and similar devices. The Institute of Clean Air Companies (ICAC) references RMS deviation of velocity in the design of fabric filters (ICAC document F-7). The guiding principle is that many of these pollution control devices require "uniform flow" entering and through the control zone. This can be related to uniformity of the velocity profile, temperature distribution, gas species (such as ammonia for an SCR, or activated carbon injection for mercury absorption), and other flow-related parameters. The Percent RMS is also used to assess flow uniformity in combustion systems, HVAC systems, ductwork, inlets to fans and filters, air handling units, etc., where the performance of the equipment is influenced by the incoming flow distribution.
Laboratory measures of intra-assay and inter-assay CVs
CV measures are often used as quality controls for quantitative laboratory assays. While intra-assay and inter-assay CVs might be assumed to be calculated by simply averaging CV values across multiple samples within one assay or by averaging multiple inter-assay CV estimates, it has been suggested that these practices are incorrect and that a more complex computational process is required. It has also been noted that CV values are not an ideal index of the certainty of a measurement when the number of replicates varies across samples; in this case the standard error in percent is suggested to be superior.
If measurements do not have a natural zero point then the CV is not a valid measurement and alternative measures such as the
intraclass correlation coefficient are recommended.
As a measure of economic inequality
The coefficient of variation fulfills the requirements for a measure of economic inequality.
If <math>x</math> (with entries <math>x_i</math>) is a list of the values of an economic indicator (e.g. wealth), with <math>x_i</math> being the wealth of agent <math>i</math>, then the following requirements are met:
* Anonymity – <math>c_{\rm v}</math> is independent of the ordering of the list <math>x</math>. This follows from the fact that the variance and mean are independent of the ordering of <math>x</math>.
* Scale invariance – <math>c_{\rm v}(x) = c_{\rm v}(\alpha x)</math>, where <math>\alpha</math> is a real number.
* Population independence – if <math>\{x, x\}</math> is the list <math>x</math> appended to itself, then <math>c_{\rm v}(\{x, x\}) = c_{\rm v}(x)</math>. This follows from the fact that the variance and mean both obey this principle.
* Pigou–Dalton transfer principle – when wealth is transferred from a wealthier agent <math>i</math> to a poorer agent <math>j</math> (i.e. <math>x_i > x_j</math>) without altering their rank, then <math>c_{\rm v}</math> decreases, and vice versa.
<math>c_{\rm v}</math> assumes its minimum value of zero for complete equality (all <math>x_i</math> are equal), as the sketch below illustrates numerically.
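The scale-invariance, population-independence, and transfer properties can be checked with a minimal Python sketch; the wealth list here is hypothetical and used purely for illustration:

<syntaxhighlight lang="python">
import statistics

def cv(x):
    # Population CV: population standard deviation divided by the mean
    return statistics.pstdev(x) / statistics.mean(x)

wealth = [10, 20, 30, 100]                       # hypothetical wealth of four agents

# Scale invariance: multiplying every entry by a constant leaves c_v unchanged
assert abs(cv(wealth) - cv([5 * w for w in wealth])) < 1e-12

# Population independence: the list appended to itself has the same c_v
assert abs(cv(wealth) - cv(wealth + wealth)) < 1e-12

# Pigou-Dalton: a rank-preserving transfer from a richer to a poorer agent lowers c_v
assert cv([10, 20, 35, 95]) < cv(wealth)         # 5 units moved from the richest agent
</syntaxhighlight>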
Its most notable drawback is that it is not bounded from above, so it cannot be normalized to be within a fixed range (e.g., like the Gini coefficient, which is constrained to be between 0 and 1).
It is, however, more mathematically tractable than the Gini coefficient.
As a measure of standardisation of archaeological artefacts
Archaeologists often use CV values to compare the degree of standardisation of ancient artefacts. Variation in CVs has been interpreted to indicate different cultural transmission contexts for the adoption of new technologies. Coefficients of variation have also been used to investigate pottery standardisation relating to changes in social organisation. Archaeologists also use several methods for comparing CV values, for example the modified signed-likelihood ratio (MSLR) test for equality of CVs.
Examples of misuse
Comparing coefficients of variation between parameters using relative units can result in differences that may not be real. If we compare the same set of temperatures in Celsius and Fahrenheit (both relative units, where the Kelvin and Rankine scales are their associated absolute scales):
Celsius: {0, 10, 20, 30, 40}
Fahrenheit: {32, 50, 68, 86, 104}
The
sample standard deviations are 15.81 and 28.46, respectively. The CV of the first set is 15.81/20 = 79%. For the second set (which are the same temperatures) it is 28.46/68 = 42%.
If, for example, the data sets are temperature readings from two different sensors (a Celsius sensor and a Fahrenheit sensor) and you want to know which sensor is better by picking the one with the least variance, then you will be misled if you use the CV. The problem here is that you have divided by a relative value rather than an absolute one.
Comparing the same data set, now in absolute units:
Kelvin: {273.15, 283.15, 293.15, 303.15, 313.15}
Rankine: {491.67, 509.67, 527.67, 545.67, 563.67}
The
sample standard deviations are still 15.81 and 28.46, respectively, because the standard deviation is not affected by a constant offset. The coefficients of variation, however, are now both equal to 5.39%.
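A minimal Python sketch reproducing the figures in this example (sample standard deviations via statistics.stdev):

<syntaxhighlight lang="python">
import statistics

def sample_cv(x):
    return statistics.stdev(x) / statistics.mean(x)

celsius    = [0, 10, 20, 30, 40]
fahrenheit = [c * 9 / 5 + 32 for c in celsius]    # same temperatures on an interval scale
kelvin     = [c + 273.15 for c in celsius]        # same temperatures on a ratio scale
rankine    = [f + 459.67 for f in fahrenheit]

print(sample_cv(celsius), sample_cv(fahrenheit))  # about 0.79 and 0.42: scale-dependent
print(sample_cv(kelvin), sample_cv(rankine))      # both about 0.0539: scale-independent
</syntaxhighlight>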
Mathematically speaking, the coefficient of variation is not entirely linear. That is, for a random variable <math>X</math>, the coefficient of variation of <math>aX + b</math> is equal to the coefficient of variation of <math>X</math> only when <math>b = 0</math>. In the above example, Celsius can only be converted to Fahrenheit through a linear transformation of the form <math>ax + b</math> with <math>b \neq 0</math>, whereas Kelvins can be converted to Rankines through a transformation of the form <math>ax</math>.
Distribution
Provided that negative and small positive values of the sample mean occur with negligible frequency, the probability distribution of the coefficient of variation for a sample of size <math>n</math> of i.i.d. normal random variables has been derived in closed form by Hendricks and Robey. In their expression, the summation is taken over only even values of <math>n - 1 - i</math>: that is, if <math>n</math> is odd, the sum runs over even values of <math>i</math>, and if <math>n</math> is even, it runs over odd values of <math>i</math> only.
This is useful, for instance, in the construction of
hypothesis tests or
confidence intervals.
Statistical inference for the coefficient of variation in normally distributed data is often based on
McKay's chi-square approximation for the coefficient of variation.
Alternative
Liu (2012) reviews methods for the construction of a confidence interval for the coefficient of variation. Notably, Lehmann (1986) derived the sampling distribution for the coefficient of variation using a
non-central t-distribution to give an exact method for the construction of the CI.
[Lehmann, E. L. (1986). ''Testing Statistical Hypotheses''. 2nd ed. New York: Wiley.]
Similar ratios
Standardized moments are similar ratios, <math>{\mu_k}/{\sigma^k}</math>, where <math>\mu_k</math> is the <math>k</math>th moment about the mean, which are also dimensionless and scale invariant. The variance-to-mean ratio, <math>\sigma^2/\mu</math>, is another similar ratio, but it is not dimensionless and hence not scale invariant. See Normalization (statistics) for further ratios.
In signal processing, particularly image processing, the reciprocal ratio <math>\mu/\sigma</math> (or its square) is referred to as the signal-to-noise ratio in general and signal-to-noise ratio (imaging) in particular.
Other related ratios include:
* Efficiency, <math>\sigma^2 / \mu^2</math>
* Standardized moment, <math>\mu_k / \sigma^k</math>
* Variance-to-mean ratio (or relative variance), <math>\sigma^2 / \mu</math>
* Fano factor, <math>\sigma^2_W / \mu_W</math> (windowed VMR)
See also
* Standard score
* Information ratio
* Omega ratio
* Sampling (statistics)
* Variance function
References
External links
* cvequality – an R package to test for significant differences between multiple coefficients of variation
{{DEFAULTSORT:Coefficient Of Variation}}
Statistical deviation and dispersion
Statistical ratios
Income inequality metrics