Partial correlation
In probability theory and statistics, partial correlation measures the degree of association between two random variables, with the effect of a set of controlling random variables removed. When determining the numerical relationship between two variables of interest, using their correlation coefficient will give misleading results if there is another confounding variable that is numerically related to both variables of interest. This misleading information can be avoided by controlling for the confounding variable, which is done by computing the partial correlation coefficient. This is precisely the motivation for including other right-side variables in a multiple regression; but while multiple regression gives unbiased results for the effect size, it does not give a numerical measure of the strength of the relationship between the two variables of interest.

For example, given economic data on the consumption, income, and wealth of various individuals, consider the relationship between consumption and income. Failing to control for wealth when computing a correlation coefficient between consumption and income would give a misleading result, since income might be numerically related to wealth which in turn might be numerically related to consumption; a measured correlation between consumption and income might actually be contaminated by these other correlations. The use of a partial correlation avoids this problem.

Like the correlation coefficient, the partial correlation coefficient takes on a value in the range from –1 to 1. The value –1 conveys a perfect negative correlation controlling for some variables (that is, an exact linear relationship in which higher values of one variable are associated with lower values of the other); the value 1 conveys a perfect positive linear relationship, and the value 0 conveys that there is no linear relationship.

The partial correlation coincides with the conditional correlation if the random variables are jointly distributed as the multivariate normal, other elliptical, multivariate hypergeometric, multivariate negative hypergeometric, multinomial, or Dirichlet distribution, but not in general otherwise.


Formal definition

Formally, the partial correlation between ''X'' and ''Y'' given a set of ''n'' controlling variables Z = {''Z''1, ''Z''2, ..., ''Z''''n''}, written ''ρ''''XY''·Z, is the correlation between the residuals ''e''''X'' and ''e''''Y'' resulting from the linear regression of ''X'' with Z and of ''Y'' with Z, respectively. The first-order partial correlation (i.e., when ''n'' = 1) is the difference between a correlation and the product of the removable correlations, divided by the product of the coefficients of alienation of the removable correlations:

:\rho_{XY\cdot Z} = \frac{\rho_{XY} - \rho_{XZ}\rho_{ZY}}{\sqrt{1-\rho_{XZ}^2}\sqrt{1-\rho_{ZY}^2}}

The coefficient of alienation, and its relation with joint variance through correlation, are available in Guilford (1973, pp. 344–345).


Computation


Using linear regression

A simple way to compute the sample partial correlation for some data is to solve the two associated linear regression problems and calculate the correlation between the residuals. Let ''X'' and ''Y'' be random variables taking real values, and let Z be the ''n''-dimensional vector-valued random variable. Let ''x''''i'', ''y''''i'' and z''i'' denote the ''i''th of ''N'' i.i.d. observations from some joint probability distribution over real random variables ''X'', ''Y'', and Z, with z''i'' having been augmented with a 1 to allow for a constant term in the regression. Solving the linear regression problem amounts to finding (''n''+1)-dimensional regression coefficient vectors \mathbf{w}_X^* and \mathbf{w}_Y^* such that

:\mathbf{w}_X^* = \arg\min_{\mathbf{w}} \left\{ \sum_{i=1}^N \left(x_i - \langle \mathbf{w}, \mathbf{z}_i \rangle\right)^2 \right\}
:\mathbf{w}_Y^* = \arg\min_{\mathbf{w}} \left\{ \sum_{i=1}^N \left(y_i - \langle \mathbf{w}, \mathbf{z}_i \rangle\right)^2 \right\}

where ''N'' is the number of observations, and \langle \mathbf{w}, \mathbf{z}_i \rangle is the scalar product between the vectors \mathbf{w} and \mathbf{z}_i. The residuals are then

:e_{X,i} = x_i - \langle \mathbf{w}_X^*, \mathbf{z}_i \rangle
:e_{Y,i} = y_i - \langle \mathbf{w}_Y^*, \mathbf{z}_i \rangle

and the sample partial correlation is then given by the usual formula for sample correlation, but between these new ''derived'' values:

:\begin{align} \hat{\rho}_{XY\cdot\mathbf{Z}} &= \frac{N \sum_{i=1}^N e_{X,i} e_{Y,i} - \sum_{i=1}^N e_{X,i} \sum_{i=1}^N e_{Y,i}}{\sqrt{N \sum_{i=1}^N e_{X,i}^2 - \left(\sum_{i=1}^N e_{X,i}\right)^2} \sqrt{N \sum_{i=1}^N e_{Y,i}^2 - \left(\sum_{i=1}^N e_{Y,i}\right)^2}} \\ &= \frac{N \sum_{i=1}^N e_{X,i} e_{Y,i}}{\sqrt{N \sum_{i=1}^N e_{X,i}^2} \sqrt{N \sum_{i=1}^N e_{Y,i}^2}}. \end{align}

In the first expression the three terms after minus signs all equal 0 since each contains the sum of residuals from an ordinary least squares regression.
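
A minimal R sketch of this procedure (the function name pcor_residuals is illustrative, not from any package): regress each variable of interest on the controls and correlate the residuals. Note that lm() includes an intercept by default, which plays the role of the constant term above.

pcor_residuals <- function(x, y, z) {
  # Residuals of x and y after linear regression on the controls in z;
  # lm() adds the intercept (the "1" augmenting z_i) automatically.
  ex <- resid(lm(x ~ z))
  ey <- resid(lm(y ~ z))
  cor(ex, ey)  # sample partial correlation of x and y given z
}

The worked example in the next section applies exactly this computation to a small data set.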


Example

Consider the following data on three variables, ''X'', ''Y'', and ''Z'' (as entered in the code below):

 X:  2  4  15  20
 Y:  1  2   3   4
 Z:  0  0   1   1

Computing the Pearson correlation coefficient between variables ''X'' and ''Y'' results in approximately 0.970, while computing the partial correlation between ''X'' and ''Y'', using the formula given above, gives a partial correlation of 0.919. The computations were done using R with the following code.

> X <- c(2, 4, 15, 20)
> Y <- c(1, 2, 3, 4)
> Z <- c(0, 0, 1, 1)
> mm1 <- lm(X ~ Z)
> res1 <- mm1$residuals
> mm2 <- lm(Y ~ Z)
> res2 <- mm2$residuals
> cor(res1, res2)
[1] 0.919145
> cor(X, Y)
[1] 0.9695016
> generalCorr::parcorMany(cbind(X, Y, Z))
     nami namj partij   partji rijMrji
[1,] "X"  "Y"  "0.8844" "1"    "-0.1156"
[2,] "X"  "Z"  "0.1581" "1"    "-0.8419"

The lower part of the above code reports the generalized nonlinear partial correlation coefficient between ''X'' and ''Y'' after removing the nonlinear effect of ''Z'' to be 0.8844, and the generalized partial correlation coefficient between ''X'' and ''Z'' after removing the nonlinear effect of ''Y'' to be 0.1581. See the R package `generalCorr' and its vignettes for details. Simulation and other details are in Vinod (2017), "Generalized correlation and kernel causality with applications in development economics," Communications in Statistics – Simulation and Computation, vol. 46, pp. 4513–4534, available online 29 Dec 2015: https://doi.org/10.1080/03610918.2015.1122048.


Using recursive formula

It can be computationally expensive to solve the linear regression problems. Actually, the ''n''th-order partial correlation (i.e., with |Z| = ''n'') can be easily computed from three (''n'' − 1)th-order partial correlations. The zeroth-order partial correlation ''ρ''''XY''·Ø is defined to be the regular correlation coefficient ''ρ''''XY''. It holds, for any Z_0 \in \mathbf{Z}, that

:\rho_{XY\cdot\mathbf{Z}} = \frac{\rho_{XY\cdot\mathbf{Z}\setminus\{Z_0\}} - \rho_{XZ_0\cdot\mathbf{Z}\setminus\{Z_0\}}\,\rho_{Z_0 Y\cdot\mathbf{Z}\setminus\{Z_0\}}}{\sqrt{1 - \rho_{XZ_0\cdot\mathbf{Z}\setminus\{Z_0\}}^2} \sqrt{1 - \rho_{Z_0 Y\cdot\mathbf{Z}\setminus\{Z_0\}}^2}}

Naïvely implementing this computation as a recursive algorithm yields an exponential time complexity. However, this computation has the overlapping subproblems property, such that using dynamic programming or simply caching the results of the recursive calls yields a complexity of \mathcal{O}(n^3). Note in the case where ''Z'' is a single variable, this reduces to:

:\rho_{XY\cdot Z} = \frac{\rho_{XY} - \rho_{XZ}\rho_{ZY}}{\sqrt{1 - \rho_{XZ}^2} \sqrt{1 - \rho_{ZY}^2}}
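
A minimal R sketch of this recursion with caching (a hypothetical helper, not a library function): given a full correlation matrix R, it returns a function computing the partial correlation between variables i and j given the variables indexed by z.

pcor_recursive <- function(R) {
  cache <- new.env(hash = TRUE)  # memoization store for overlapping subproblems
  pc <- function(i, j, z) {
    key <- paste(i, j, paste(sort(z), collapse = ","), sep = "|")
    if (exists(key, envir = cache, inherits = FALSE))
      return(get(key, envir = cache))
    val <- if (length(z) == 0) {
      R[i, j]                      # zeroth order: the plain correlation
    } else {
      z0 <- z[1]; rest <- z[-1]    # peel off one controlling variable Z_0
      rij <- pc(i, j, rest)
      riz <- pc(i, z0, rest)
      rzj <- pc(z0, j, rest)
      (rij - riz * rzj) / (sqrt(1 - riz^2) * sqrt(1 - rzj^2))
    }
    assign(key, val, envir = cache)
    val
  }
  pc
}

For instance, pc <- pcor_recursive(cor(cbind(X, Y, Z))); pc(1, 2, 3) evaluates ''ρ''''XY''·''Z'' for the example data above.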


Using matrix inversion

The partial correlation can also be written in terms of the joint precision matrix. Consider a set of random variables, \mathbf{V} = \{X_1, \dots, X_n\} of cardinality ''n''. We want the partial correlation between two variables X_i and X_j given all others, i.e., \mathbf{V} \setminus \{X_i, X_j\}. Suppose the (joint/full) covariance matrix \Sigma = (\sigma_{ij}) is positive definite and therefore invertible. If the precision matrix is defined as \Omega = (p_{ij}) = \Sigma^{-1}, then

:\rho_{X_i X_j \cdot \mathbf{V} \setminus \{X_i, X_j\}} = -\frac{p_{ij}}{\sqrt{p_{ii} p_{jj}}} \qquad (1)

Computing this requires \Sigma^{-1}, the inverse of the covariance matrix \Sigma, which runs in \mathcal{O}(n^3) time (using the sample covariance matrix to obtain a sample partial correlation). Note that only a single matrix inversion is required to give ''all'' the partial correlations between pairs of variables in \mathbf{V}.

To prove Equation (1), return to the previous notation (i.e. X, Y, \mathbf{Z} \leftrightarrow X_i, X_j, \mathbf{V} \setminus \{X_i, X_j\}) and start with the definition of partial correlation: ''ρ''''XY''·Z is the correlation between the residuals ''e''''X'' and ''e''''Y'' resulting from the linear regression of ''X'' with Z and of ''Y'' with Z, respectively. First, suppose \beta, \gamma are the coefficients for the linear regression fits; that is,

:\beta = \operatorname{argmin}_\beta \mathbb{E} \|X - \beta^T Z\|^2
:\gamma = \operatorname{argmin}_\gamma \mathbb{E} \|Y - \gamma^T Z\|^2

Write the joint covariance matrix for the vector (X, Y, Z^T)^T as

:\Sigma = \begin{bmatrix} \Sigma_{XX} & \Sigma_{XY} & \Sigma_{XZ} \\ \Sigma_{YX} & \Sigma_{YY} & \Sigma_{YZ} \\ \Sigma_{ZX} & \Sigma_{ZY} & \Sigma_{ZZ} \end{bmatrix} = \begin{bmatrix} C_{11} & C_{12} \\ C_{21} & C_{22} \end{bmatrix}

where

:C_{11} = \begin{bmatrix} \Sigma_{XX} & \Sigma_{XY} \\ \Sigma_{YX} & \Sigma_{YY} \end{bmatrix}, \qquad C_{12} = \begin{bmatrix} \Sigma_{XZ} \\ \Sigma_{YZ} \end{bmatrix}, \qquad C_{21} = \begin{bmatrix} \Sigma_{ZX} & \Sigma_{ZY} \end{bmatrix}, \qquad C_{22} = \Sigma_{ZZ}

Then the standard formula for linear regression gives

:\beta = \left(\Sigma_{ZZ}\right)^{-1} \Sigma_{ZX}

Hence, the residuals can be written as

:R_X = X - \beta^T Z = X - \Sigma_{XZ} \left(\Sigma_{ZZ}\right)^{-1} Z

and similarly for R_Y. Note that R_X has expectation zero because of the inclusion of an intercept term in Z. Computing the covariance now gives

:\operatorname{Cov}(R_X, R_Y) = \Sigma_{XY} - \Sigma_{XZ} \left(\Sigma_{ZZ}\right)^{-1} \Sigma_{ZY} \qquad (2)

and likewise for \operatorname{Cov}(R_X, R_X) and \operatorname{Cov}(R_Y, R_Y). Next, write the precision matrix \Omega = \Sigma^{-1} in a similar block form:

:\Omega = \begin{bmatrix} \Omega_{XX} & \Omega_{XY} & \Omega_{XZ} \\ \Omega_{YX} & \Omega_{YY} & \Omega_{YZ} \\ \Omega_{ZX} & \Omega_{ZY} & \Omega_{ZZ} \end{bmatrix} = \begin{bmatrix} P_{11} & P_{12} \\ P_{21} & P_{22} \end{bmatrix}

Then, by Schur's formula for block-matrix inversion,

:P_{11}^{-1} = C_{11} - C_{12} C_{22}^{-1} C_{21}

The entries of the right-hand-side matrix are precisely the covariances previously computed in (2), giving

:P_{11}^{-1} = \begin{bmatrix} \operatorname{Cov}(R_X, R_X) & \operatorname{Cov}(R_X, R_Y) \\ \operatorname{Cov}(R_Y, R_X) & \operatorname{Cov}(R_Y, R_Y) \end{bmatrix}

Using the formula for the inverse of a 2×2 matrix gives

:\begin{align} P_{11}^{-1} &= \frac{1}{\det(P_{11})} \begin{bmatrix} [P_{11}]_{22} & -[P_{11}]_{12} \\ -[P_{11}]_{21} & [P_{11}]_{11} \end{bmatrix} \\ &= \frac{1}{\det(P_{11})} \begin{bmatrix} p_{YY} & -p_{XY} \\ -p_{YX} & p_{XX} \end{bmatrix} \end{align}

So indeed, the partial correlation is

:\rho_{XY \cdot \mathbf{Z}} = \frac{\operatorname{Cov}(R_X, R_Y)}{\sqrt{\operatorname{Cov}(R_X, R_X) \operatorname{Cov}(R_Y, R_Y)}} = \frac{-\frac{1}{\det(P_{11})} p_{XY}}{\sqrt{\frac{1}{\det(P_{11})} p_{YY} \cdot \frac{1}{\det(P_{11})} p_{XX}}} = -\frac{p_{XY}}{\sqrt{p_{XX} p_{YY}}}

as claimed in (1).
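
A minimal R sketch of Equation (1) (the function name is illustrative): a single inversion of the sample covariance matrix yields every pairwise partial correlation at once.

pcor_all <- function(data) {
  omega <- solve(cov(data))   # precision matrix Omega = Sigma^{-1}
  d <- 1 / sqrt(diag(omega))
  pc <- -omega * outer(d, d)  # entry (i,j) is -p_ij / sqrt(p_ii * p_jj)
  diag(pc) <- 1               # the diagonal is 1 by convention
  pc
}

Applied to cbind(X, Y, Z) from the example above, pcor_all(cbind(X, Y, Z))[1, 2] reproduces the residual-based value 0.919 up to rounding, since with a single control the "all others" set is just ''Z''.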


Interpretation


Geometrical

Let three variables ''X'', ''Y'', ''Z'' (where ''Z'' is the "control" or "extra variable") be chosen from a joint probability distribution over ''n'' variables V. Further, let v''i'', 1 ≤ ''i'' ≤ ''N'', be ''N'' ''n''-dimensional i.i.d. observations taken from the joint probability distribution over V. The geometrical interpretation comes from considering the ''N''-dimensional vectors x (formed by the successive values of ''X'' over the observations), y (formed by the values of ''Y''), and z (formed by the values of ''Z'').

It can be shown that the residuals ''e''''X'',''i'' coming from the linear regression of ''X'' on Z, if also considered as an ''N''-dimensional vector e''X'', have a zero scalar product with the vector z generated by Z. This means that the residuals vector lies on an (''N''–1)-dimensional hyperplane ''S''z that is perpendicular to z. The same also applies to the residuals ''e''''Y'',''i'' generating a vector e''Y''. The desired partial correlation is then the cosine of the angle ''φ'' between the projections e''X'' and e''Y'' of x and y, respectively, onto the hyperplane perpendicular to z.
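
A short R illustration of this view, reusing X, Y, Z from the example above: the cosine of the angle between the two residual vectors equals the partial correlation (the residuals have mean zero because the regressions include an intercept, so the cosine coincides with the sample correlation).

ex <- resid(lm(X ~ Z))                      # residual vector e_X, orthogonal to z
ey <- resid(lm(Y ~ Z))                      # residual vector e_Y
sum(ex * ey) / sqrt(sum(ex^2) * sum(ey^2))  # cos(phi) = 0.919145, matching cor(ex, ey)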


As conditional independence test

With the assumption that all involved variables are multivariate Gaussian, the partial correlation ''ρ''''XY''·Z is zero if and only if ''X'' is conditionally independent from ''Y'' given Z. This property does not hold in the general case.

To test if a sample partial correlation \hat{\rho}_{XY\cdot\mathbf{Z}} implies that the true population partial correlation differs from 0, Fisher's ''z-transform of the partial correlation'' can be used:

:z(\hat{\rho}_{XY\cdot\mathbf{Z}}) = \frac{1}{2} \ln\left(\frac{1 + \hat{\rho}_{XY\cdot\mathbf{Z}}}{1 - \hat{\rho}_{XY\cdot\mathbf{Z}}}\right)

The null hypothesis is H_0: \rho_{XY\cdot\mathbf{Z}} = 0, to be tested against the two-tail alternative H_A: \rho_{XY\cdot\mathbf{Z}} \neq 0. H_0 can be rejected if

:\sqrt{N - |\mathbf{Z}| - 3} \cdot |z(\hat{\rho}_{XY\cdot\mathbf{Z}})| > \Phi^{-1}(1 - \alpha/2)

where \Phi is the cumulative distribution function of a Gaussian distribution with zero mean and unit standard deviation, \alpha is the chosen significance level, and ''N'' is the sample size. This ''z''-transform is approximate, and the actual distribution of the sample (partial) correlation coefficient is not straightforward. However, an exact t-test based on a combination of the partial regression coefficient, the partial correlation coefficient, and the partial variances is available. The distribution of the sample partial correlation was described by Fisher.
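
A minimal R sketch of this test (the function name is illustrative): given a sample partial correlation r, sample size N, and the number of controlling variables k = |Z|, it computes the test statistic and compares it with the Gaussian critical value.

pcor_ztest <- function(r, N, k, alpha = 0.05) {
  z <- 0.5 * log((1 + r) / (1 - r))     # Fisher z-transform of r
  stat <- sqrt(N - k - 3) * abs(z)      # approximately N(0,1) under H0
  crit <- qnorm(1 - alpha / 2)          # Phi^{-1}(1 - alpha/2)
  list(statistic = stat,
       p.value = 2 * (1 - pnorm(stat)), # two-tailed p-value
       reject = stat > crit)
}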


Semipartial correlation (part correlation)

The semipartial (or part) correlation statistic is similar to the partial correlation statistic; both compare variations of two variables after certain factors are controlled for. However, to calculate the semipartial correlation, one holds the third variable constant for either ''X'' or ''Y'' but not both; whereas for the partial correlation, one holds the third variable constant for both. The semipartial correlation compares the unique variation of one variable (having removed variation associated with the ''Z'' variable(s)) with the unfiltered variation of the other, while the partial correlation compares the unique variation of one variable to the unique variation of the other.

The semipartial correlation can be viewed as more practically relevant "because it is scaled to (i.e., relative to) the total variability in the dependent (response) variable" (StatSoft, Inc. (2010), "Semi-Partial (or Part) Correlation", Electronic Statistics Textbook, Tulsa, OK: StatSoft, accessed January 15, 2011). Conversely, it is less theoretically useful because it is less precise about the role of the unique contribution of the independent variable.

The absolute value of the semipartial correlation of ''X'' with ''Y'' is always less than or equal to that of the partial correlation of ''X'' with ''Y''. The reason is this: Suppose the correlation of ''X'' with ''Z'' has been removed from ''X'', giving the residual vector ''e''''X''. In computing the semipartial correlation, ''Y'' still contains both unique variance and variance due to its association with ''Z''. But ''e''''X'', being uncorrelated with ''Z'', can only explain some of the unique part of the variance of ''Y'' and not the part related to ''Z''. In contrast, with the partial correlation, only ''e''''Y'' (the part of the variance of ''Y'' that is unrelated to ''Z'') is to be explained, so there is less variance of the type that ''e''''X'' cannot explain.
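
In R, the distinction can be shown with the variables from the example above (a minimal sketch, not a library routine): residualize on ''Z'' for one variable only (semipartial) or for both (partial).

ex <- resid(lm(X ~ Z))  # X with the linear effect of Z removed
ey <- resid(lm(Y ~ Z))  # Y with the linear effect of Z removed
cor(ex, Y)              # semipartial (part) correlation: Z removed from X only
cor(ex, ey)             # partial correlation: Z removed from both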


Use in time series analysis

In time series analysis, the partial autocorrelation function (sometimes "partial correlation function") of a time series is defined, for lag ''h'', as

:\varphi(h) = \rho_{x_0 x_h \cdot \{x_1, \dots, x_{h-1}\}}

This function is used to determine the appropriate lag length for an autoregression.
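
In R, the sample partial autocorrelation function is available directly as pacf(); for instance, for a simulated AR(2) series the PACF should cut off after lag 2 (the simulation parameters here are illustrative).

set.seed(1)
y <- arima.sim(model = list(ar = c(0.6, 0.3)), n = 500)  # simulate an AR(2) process
pacf(y)  # significant spikes at lags 1 and 2 suggest fitting an AR(2) autoregression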


See also

* Linear regression
* Conditional independence
* Multiple correlation


References


External links

* Mathematical formulae in the "Description" section of the IMSL Numerical Library PCORR routine
