Inference on Two Variables

The multiple linear regression model

$\displaystyle Y_j = \beta_0 + \beta_1 x_{1j} + \beta_2 x_{2j} + \epsilon_j,
\quad j=1,\ldots,n,$

is specified in terms of the following variables:
  1. Response (dependent variable) for $ Y_j$'s;
  2. 1st predictor (1st explanatory variable) for $ x_{1j}$'s;
  3. 2nd predictor (2nd explanatory variable) for $ x_{2j}$'s.

The standard error $ S_i$ of the estimate $ \hat{\beta}_i$ gives rise to the null hypothesis

$\displaystyle H_0:\: \beta_i = 0$

for $ i = 0,1,2$, which tests whether the response depends on the $ i$-th predictor. Under the null hypothesis the test statistic $ T_i = \displaystyle\frac{\hat{\beta}_i}{S_i}$ follows the t-distribution with $ (n-3)$ degrees of freedom. Thus, we reject $ H_0$ at significance level $ \alpha$ if $ \vert T_i\vert > t_{\alpha/2,n-3}$. Equivalently, computing the p-value $ p_i^*$, we reject $ H_0$ if $ p_i^* < \alpha$.
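The computation above can be sketched numerically. The following is a minimal illustration, assuming simulated data with hypothetical true coefficients $\beta_0 = 1$, $\beta_1 = 2$, $\beta_2 = 0$ and $n = 30$; the estimates $\hat{\beta}_i$, standard errors $S_i$, statistics $T_i$, and p-values $p_i^*$ are computed directly from the least-squares formulas rather than a regression library:

```python
import numpy as np
from scipy import stats

# Simulated data (illustrative choices: beta0=1, beta1=2, beta2=0, n=30)
rng = np.random.default_rng(0)
n = 30
x1 = rng.normal(size=n)
x2 = rng.normal(size=n)
y = 1.0 + 2.0 * x1 + 0.0 * x2 + rng.normal(scale=0.5, size=n)

# Design matrix with an intercept column for beta_0
X = np.column_stack([np.ones(n), x1, x2])

# Least-squares estimates: beta_hat = (X'X)^{-1} X'y
XtX_inv = np.linalg.inv(X.T @ X)
beta_hat = XtX_inv @ X.T @ y

# Residual variance estimate on (n - 3) degrees of freedom
resid = y - X @ beta_hat
df = n - 3
s2 = resid @ resid / df

# Standard errors S_i, test statistics T_i, and two-sided p-values
se = np.sqrt(s2 * np.diag(XtX_inv))
t_stat = beta_hat / se
p_values = 2 * stats.t.sf(np.abs(t_stat), df)

for i in range(3):
    print(f"beta_{i}: estimate={beta_hat[i]:.3f}, SE={se[i]:.3f}, "
          f"T={t_stat[i]:.2f}, p={p_values[i]:.4f}")
```

With this simulated data, $H_0:\beta_1=0$ should be rejected at any conventional $\alpha$ (the signal in $x_1$ is strong), while the p-value for $\beta_2$ will typically be large, reflecting that the response was generated without any dependence on the second predictor.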