R Square

The residual sum of squares

$\displaystyle S_{\mbox{error}} = \displaystyle\sum_{i=1}^{n} (Y_i - \hat{\beta}_0 - \hat{\beta}_1 x_i)^2$

and the total sum of squares

$\displaystyle S_{\mbox{total}} = \displaystyle\sum_{i=1}^{n} (Y_i - \bar{Y})^2$

are introduced. They are used to calculate $ R^2$ and the error variance $ \hat{\sigma}^2$.
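Both sums can be computed directly from the data once the least-squares estimates $\hat{\beta}_0$ and $\hat{\beta}_1$ are in hand. A minimal sketch in Python, using a made-up illustrative data set (the $x_i$ and $Y_i$ values below are not from the text):

```python
# Made-up illustrative data (not from the text)
x = [1, 2, 3, 4, 5]
Y = [2, 4, 5, 4, 5]
n = len(x)

x_bar = sum(x) / n
Y_bar = sum(Y) / n

# Least-squares estimates of slope and intercept
b1 = sum((xi - x_bar) * (Yi - Y_bar) for xi, Yi in zip(x, Y)) \
     / sum((xi - x_bar) ** 2 for xi in x)
b0 = Y_bar - b1 * x_bar

# Residual sum of squares: deviations from the fitted line
S_error = sum((Yi - b0 - b1 * xi) ** 2 for xi, Yi in zip(x, Y))

# Total sum of squares: deviations from the sample mean of Y
S_total = sum((Yi - Y_bar) ** 2 for Yi in Y)
```

For this data set the fitted line is $\hat{Y} = 2.2 + 0.6x$, giving $S_{\mbox{error}} = 2.4$ and $S_{\mbox{total}} = 6$.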

The coefficient of determination

The data set consists of

  1. the explanatory variables $ x_i$;
  2. the dependent variables $ Y_i$.

$ R^2 = 1 - \displaystyle\frac{S_{\mbox{error}}}{S_{\mbox{total}}}$

takes a value between 0 and 1, and represents the proportion of the total variation in the $ Y_i$'s which can be explained by the linear regression. The larger $ R^2$ is, the closer the data points lie to the regression line; in simple linear regression $ R^2$ is simply the square of the sample correlation coefficient $ \hat{\rho}$.
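The identity $R^2 = \hat{\rho}^2$ can be checked numerically. A short Python sketch on a made-up data set (the values are illustrative only):

```python
import math

# Made-up illustrative data (not from the text)
x = [1, 2, 3, 4, 5]
Y = [2, 4, 5, 4, 5]
n = len(x)
x_bar, Y_bar = sum(x) / n, sum(Y) / n

# Centered sums of squares and cross-products
Sxx = sum((xi - x_bar) ** 2 for xi in x)
Syy = sum((Yi - Y_bar) ** 2 for Yi in Y)
Sxy = sum((xi - x_bar) * (Yi - Y_bar) for xi, Yi in zip(x, Y))

b1 = Sxy / Sxx
b0 = Y_bar - b1 * x_bar

# R^2 from the sums-of-squares definition
S_error = sum((Yi - b0 - b1 * xi) ** 2 for xi, Yi in zip(x, Y))
R2 = 1 - S_error / Syy

# Sample correlation coefficient; R2 agrees with rho_hat squared
rho_hat = Sxy / math.sqrt(Sxx * Syy)
```

For this data both $R^2$ and $\hat{\rho}^2$ come out to $0.6$ (up to floating-point rounding).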

$ \hat{\sigma}^2 = \displaystyle\frac{S_{\mbox{error}}}{n-2}$

can be obtained as the point estimate of the variance $ \sigma^2$ of the error terms; the divisor $ n-2$ reflects the two estimated parameters $ \hat{\beta}_0$ and $ \hat{\beta}_1$.
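Continuing the same made-up example, the error-variance estimate is the residual sum of squares divided by $n-2$:

```python
# Made-up illustrative data (not from the text)
x = [1, 2, 3, 4, 5]
Y = [2, 4, 5, 4, 5]
n = len(x)
x_bar, Y_bar = sum(x) / n, sum(Y) / n

# Least-squares estimates
b1 = sum((xi - x_bar) * (Yi - Y_bar) for xi, Yi in zip(x, Y)) \
     / sum((xi - x_bar) ** 2 for xi in x)
b0 = Y_bar - b1 * x_bar

# Point estimate of the error variance: S_error / (n - 2)
S_error = sum((Yi - b0 - b1 * xi) ** 2 for xi, Yi in zip(x, Y))
sigma2_hat = S_error / (n - 2)
```

Here $S_{\mbox{error}} = 2.4$ with $n = 5$, so $\hat{\sigma}^2 = 2.4 / 3 = 0.8$.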

© TTU Mathematics