In statistics, the coefficient of multiple correlation is a measure of how well a given variable can be predicted using a linear function of a set of other variables. It is the correlation between the variable's values and the best predictions that can be computed linearly from the predictive variables.[1]
The coefficient of multiple correlation takes values between 0 and 1. Higher values indicate higher predictability of the dependent variable from the independent variables, with a value of 1 indicating that the predictions are exactly correct and a value of 0 indicating that no linear combination of the independent variables is a better predictor than is the fixed mean of the dependent variable.[2]
| Correlation Coefficient (r) | Direction and Strength of Correlation |
| --- | --- |
| 1 | Perfectly positive |
| 0.8 | Strongly positive |
| 0.5 | Moderately positive |
| 0.2 | Weakly positive |
| 0 | No association |
| -0.2 | Weakly negative |
| -0.5 | Moderately negative |
| -0.8 | Strongly negative |
| -1 | Perfectly negative |
The coefficient of multiple correlation is the square root of the coefficient of determination, but only under the particular assumptions that an intercept is included and that the best possible linear predictors are used. The coefficient of determination, by contrast, is defined for more general cases, including nonlinear prediction and cases in which the predicted values have not been derived from a model-fitting procedure.
The coefficient of multiple correlation, denoted $R$, is a scalar that is defined as the Pearson correlation coefficient between the predicted and the actual values of the dependent variable in a linear regression model that includes an intercept.
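A minimal sketch of this definition, using NumPy and made-up data (the variable names and coefficients below are illustrative, not from the source): fit an ordinary least-squares regression with an intercept, then take the Pearson correlation between the fitted and actual values. Its square should match the usual coefficient of determination for that fit.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical data: two predictors and one dependent variable.
n = 200
X = rng.normal(size=(n, 2))
y = 1.5 * X[:, 0] - 0.7 * X[:, 1] + rng.normal(scale=2.0, size=n)

# Ordinary least squares with an intercept column.
A = np.column_stack([np.ones(n), X])
beta, *_ = np.linalg.lstsq(A, y, rcond=None)
y_hat = A @ beta

# Coefficient of multiple correlation: Pearson correlation between
# the fitted values and the actual values of the dependent variable.
R = np.corrcoef(y_hat, y)[0, 1]
print(R)        # between 0 and 1
print(R ** 2)   # equals the coefficient of determination of this fit
```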
The square of the coefficient of multiple correlation can be computed using the vector $\mathbf{c} = (r_{x_1 y}, r_{x_2 y}, \dots, r_{x_N y})^\top$ of correlations $r_{x_n y}$ between the predictor variables $x_n$ (independent variables) and the target variable $y$ (dependent variable), and the correlation matrix $R_{xx}$ of correlations between predictor variables. It is given by

$$R^2 = \mathbf{c}^\top R_{xx}^{-1}\, \mathbf{c},$$

where $\mathbf{c}^\top$ is the transpose of $\mathbf{c}$, and $R_{xx}^{-1}$ is the inverse of the matrix

$$R_{xx} = \begin{pmatrix} r_{x_1 x_1} & r_{x_1 x_2} & \dots & r_{x_1 x_N} \\ r_{x_2 x_1} & r_{x_2 x_2} & \dots & r_{x_2 x_N} \\ \vdots & \vdots & \ddots & \vdots \\ r_{x_N x_1} & r_{x_N x_2} & \dots & r_{x_N x_N} \end{pmatrix}.$$
If all the predictor variables are uncorrelated, the matrix $R_{xx}$ is the identity matrix and $R^2$ simply equals $\mathbf{c}^\top \mathbf{c}$, the sum of the squared correlations with the dependent variable. If the predictor variables are correlated among themselves, the inverse of the correlation matrix $R_{xx}$ accounts for this.
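A short NumPy illustration of the matrix formula above, again on made-up data (the correlation structure is an assumption chosen so the predictors are correlated with each other):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical sample with correlated predictors x1, x2 and a target y.
n = 500
x1 = rng.normal(size=n)
x2 = 0.6 * x1 + rng.normal(size=n)          # x1 and x2 are correlated
y = 2.0 * x1 - 1.0 * x2 + rng.normal(size=n)
X = np.column_stack([x1, x2])

# Vector c of correlations between each predictor and y,
# and the correlation matrix R_xx among the predictors.
c = np.array([np.corrcoef(X[:, j], y)[0, 1] for j in range(X.shape[1])])
R_xx = np.corrcoef(X, rowvar=False)

# Squared coefficient of multiple correlation: R^2 = c' R_xx^{-1} c
# (np.linalg.solve applies the inverse without forming it explicitly).
R_squared = c @ np.linalg.solve(R_xx, c)
print(R_squared)
```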
The squared coefficient of multiple correlation can also be computed as the fraction of variance of the dependent variable that is explained by the independent variables, which in turn is 1 minus the unexplained fraction. The unexplained fraction can be computed as the sum of squares of residuals—that is, the sum of the squares of the prediction errors—divided by the sum of squares of deviations of the values of the dependent variable from its expected value.
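In sample terms (a sketch, with the sample mean standing in for the expected value of the dependent variable), this "1 minus the unexplained fraction" computation looks like:

```python
import numpy as np

def r_squared(y, y_hat):
    """Fraction of variance of y explained by the predictions y_hat:
    1 minus (sum of squared residuals / total sum of squares)."""
    ss_res = np.sum((y - y_hat) ** 2)          # unexplained variation
    ss_tot = np.sum((y - np.mean(y)) ** 2)     # total variation about the mean
    return 1.0 - ss_res / ss_tot
```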
With more than two variables being related to each other, the value of the coefficient of multiple correlation depends on the choice of dependent variable: a regression of $y$ on $x$ and $z$ will in general have a different $R$ than will a regression of $z$ on $x$ and $y$. For example, suppose that in a particular sample the variable $z$ is uncorrelated with both $x$ and $y$, while $x$ and $y$ are linearly related to each other. Then a regression of $z$ on $y$ and $x$ will yield an $R$ of zero, while a regression of $y$ on $x$ and $z$ will yield a strictly positive $R$. This follows since the correlation of $y$ with its best predictor based on $x$ and $z$ is in all cases at least as large as the correlation of $y$ with its best predictor based on $x$ alone, and in this case, with $z$ providing no explanatory power, it will be exactly as large.
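The asymmetry can be checked numerically. The sketch below (NumPy, with a simulated sample constructed so that $z$ is independent of $x$ and $y$; the helper `multiple_R` is a hypothetical name introduced here) regresses each candidate dependent variable on the other two:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 10_000

# Simulated sample: y depends linearly on x, while z is generated
# independently, so its sample correlations with x and y are near zero.
x = rng.normal(size=n)
y = 0.8 * x + rng.normal(scale=0.5, size=n)
z = rng.normal(size=n)

def multiple_R(dep, predictors):
    """Pearson correlation between dep and its least-squares fit
    on the given predictors (with an intercept)."""
    A = np.column_stack([np.ones(len(dep))] + list(predictors))
    fit = A @ np.linalg.lstsq(A, dep, rcond=None)[0]
    return np.corrcoef(fit, dep)[0, 1]

print(multiple_R(z, [x, y]))   # near 0: x and y do not help predict z
print(multiple_R(y, [x, z]))   # clearly positive: x predicts y, z adds nothing
```

In a finite sample the first value is only approximately zero, since the sample correlations of $z$ with $x$ and $y$ are not exactly zero.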