Partial Correlation

 

Partial correlation is a method used to describe the relationship between two variables whilst removing the effects of another variable, or several other variables, on this relationship.

 

Partial correlation is best thought of in terms of multiple regression; StatsDirect shows the partial correlation coefficient r with its main results from multiple linear regression.

 

A different way to calculate partial correlation coefficients, which does not require a full multiple regression, is shown below for the sake of further explanation of the principles:

 

Consider a correlation matrix for variables A, B and C (note that the multiple linear regression function in StatsDirect will output correlation matrices for you as one of its options):

 

        A        B        C
A:      *
B:      r(AB)    *
C:      r(AC)    r(BC)    *
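
As a rough illustration (hypothetical data, not StatsDirect output), such a matrix can be computed with numpy; the three off-diagonal entries are the pairwise coefficients used below:

import numpy as np

# Hypothetical data for variables A, B and C (any three numeric samples of equal length)
rng = np.random.default_rng(0)
A = rng.normal(size=100)
B = 0.6 * A + rng.normal(size=100)
C = 0.4 * A + 0.3 * B + rng.normal(size=100)

# Pairwise Pearson correlation matrix, rows and columns in the order A, B, C
R = np.corrcoef(np.vstack([A, B, C]))
r_AB, r_AC, r_BC = R[0, 1], R[0, 2], R[1, 2]
print(R.round(3))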

 

The partial correlation of A and B adjusted for C is:

r(AB|C) = [r(AB) - r(AC)·r(BC)] / sqrt( [1 - r(AC)²] [1 - r(BC)²] )

 

The same can be done using Spearman's rank correlation coefficient.
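
Continuing the sketch above, the formula can be applied directly to the Pearson coefficients, and the rank-based version is obtained by substituting Spearman's coefficients (via scipy) into the same expression:

from scipy import stats

def partial_corr(r_ab, r_ac, r_bc):
    # Partial correlation of A and B adjusted for C, from the three pairwise coefficients
    return (r_ab - r_ac * r_bc) / np.sqrt((1.0 - r_ac**2) * (1.0 - r_bc**2))

# Pearson-based partial correlation, using r_AB, r_AC and r_BC from the matrix above
r_AB_given_C = partial_corr(r_AB, r_AC, r_BC)

# Spearman-based version: feed rank correlation coefficients into the same formula
rho, _ = stats.spearmanr(np.column_stack([A, B, C]))   # 3 x 3 matrix of Spearman coefficients
rho_AB_given_C = partial_corr(rho[0, 1], rho[0, 2], rho[1, 2])

print(round(float(r_AB_given_C), 3), round(float(rho_AB_given_C), 3))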

 

The hypothesis test for the partial correlation coefficient is performed in the same way as for the usual correlation coefficient, but it is based upon n-3 degrees of freedom.
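
Continuing the sketch, a minimal version of this test (one adjusting variable, hence n - 3 degrees of freedom) is:

n = len(A)                                  # sample size from the sketch above
df = n - 3                                  # n - 3 degrees of freedom with one adjusting variable
r = r_AB_given_C
t = r * np.sqrt(df / (1.0 - r**2))          # same form of t statistic as for the usual coefficient
p = 2.0 * stats.t.sf(abs(t), df)            # two-sided P value
print(round(float(t), 3), round(float(p), 4))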

 

Please note that this sort of relationship between three or more variables is more usefully investigated using multiple regression itself (Altman, 1991).

 

The general form of partial correlation from a multiple regression is as follows:

r(partial) = tk / sqrt( tk² + ν )

- where tk is the Student t statistic for the kth term in the linear model and ν is the residual degrees of freedom for that model.
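
A minimal numpy sketch of this relationship, continuing the hypothetical data above: regressing B on A and C and converting the t statistic for the A term recovers the same value as r(AB|C) calculated from the pairwise coefficients.

# Ordinary least squares fit of B on A and C, with an intercept
X = np.column_stack([np.ones(n), A, C])
beta, *_ = np.linalg.lstsq(X, B, rcond=None)
resid = B - X @ beta
nu = n - X.shape[1]                          # residual degrees of freedom of the model
sigma2 = resid @ resid / nu                  # residual variance estimate
se = np.sqrt(sigma2 * np.diag(np.linalg.inv(X.T @ X)))
t_A = beta[1] / se[1]                        # Student t statistic for the A term
r_from_t = t_A / np.sqrt(t_A**2 + nu)        # general form of the partial correlation
print(round(float(r_from_t), 3), round(float(r_AB_given_C), 3))   # the two values agree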