1. (2 points) In Lecture 12, we showed that, using the geometric property of simple regression, we could derive the least-squares estimator for $\beta_1$ by solving $\mathbf{x}^{*\prime}\mathbf{e} = 0$, where $\mathbf{x}^*$ is in mean-deviation form and $\mathbf{e}$ is the residual vector. Take this approach and derive the least-squares estimator for $\boldsymbol{\beta}$ in the multiple regression $\mathbf{y} = \mathbf{X}\boldsymbol{\beta} + \boldsymbol{\varepsilon}$.
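A minimal numerical sanity check of this approach (not the requested derivation; the simulated data and NumPy usage are my own assumptions for illustration): the closed-form estimator $\hat{\boldsymbol{\beta}} = (\mathbf{X}'\mathbf{X})^{-1}\mathbf{X}'\mathbf{y}$ is exactly the vector that makes the residuals orthogonal to every column of $\mathbf{X}$, the multiple-regression analogue of the orthogonality condition above.

```python
# Check numerically that beta_hat = (X'X)^{-1} X'y satisfies X'e = 0,
# i.e. the residuals are orthogonal to the column space of X.
# Data are simulated for illustration only.
import numpy as np

rng = np.random.default_rng(0)
n = 50
X = np.column_stack([np.ones(n), rng.normal(size=(n, 2))])  # intercept + 2 predictors
beta = np.array([1.0, 2.0, -0.5])
y = X @ beta + rng.normal(scale=0.3, size=n)

beta_hat = np.linalg.solve(X.T @ X, X.T @ y)  # (X'X)^{-1} X'y without an explicit inverse
e = y - X @ beta_hat                          # residual vector

print(X.T @ e)  # ~ zero vector: residuals orthogonal to every column of X
```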
2. (2 points) Consider a simple linear model in mean-deviation form. We have $x^* = x - \bar{x}$ and $y^* = y - \bar{y}$. Then $\hat{y}^* = \hat{\beta}_1 x^*$. Prove
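As a brief numerical sketch of the mean-deviation setup (the simulated data and NumPy usage are my own assumptions for illustration): regressing $y^*$ on $x^*$ through the origin recovers the same slope as the ordinary simple regression of $y$ on $x$.

```python
# Compare the slope from the mean-deviation (through-the-origin) regression
# with the slope from the ordinary simple regression of y on x.
import numpy as np

rng = np.random.default_rng(1)
x = rng.normal(size=100)
y = 3.0 + 1.5 * x + rng.normal(scale=0.5, size=100)

x_star, y_star = x - x.mean(), y - y.mean()      # mean-deviation form
b1_star = (x_star @ y_star) / (x_star @ x_star)  # slope through the origin
b1 = np.polyfit(x, y, 1)[0]                      # ordinary simple-regression slope

print(b1_star, b1)  # the two slopes agree
```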
3. (2 points) Exercise 10.7 in the textbook. Use the Prestige data.
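The Prestige data ship with R; one possible way to load them outside R (this tooling choice is my own assumption, not part of the exercise) is through statsmodels, which can fetch datasets from the R carData package.

```python
# Fetch the Prestige data from the R carData package via statsmodels.
import statsmodels.api as sm

prestige = sm.datasets.get_rdataset("Prestige", "carData").data
print(prestige.head())  # columns: education, income, women, prestige, census, type
```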
4. (3 points) Consider the model $y_i = \beta_0 + \beta_1 x_{i1} + \beta_2 x_{i2} + \varepsilon_i$. Recall that the matrix $\mathbf{V}_{11}$ is defined as the square submatrix consisting of the entries in the $q$ rows and $q$ columns of $(\mathbf{X}'\mathbf{X})^{-1}$ that pertain to the coefficients in $\hat{\boldsymbol{\beta}}_1 = [\hat{\beta}_1, \cdots, \hat{\beta}_q]'$ (see Equation 9.16 on page 218 in the textbook). Show that $\mathbf{V}_{11}^{-1}$ for the slope coefficients $\beta_1$ and $\beta_2$ contains the mean-deviation sums of squares and products for the explanatory variables; that is,

$$\mathbf{V}_{11}^{-1} = \begin{bmatrix} \sum_i (x_{i1}-\bar{x}_1)^2 & \sum_i (x_{i1}-\bar{x}_1)(x_{i2}-\bar{x}_2) \\ \sum_i (x_{i1}-\bar{x}_1)(x_{i2}-\bar{x}_2) & \sum_i (x_{i2}-\bar{x}_2)^2 \end{bmatrix}.$$
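A numerical sketch of the claim (the simulated two-predictor design and NumPy usage are my own assumptions for illustration): extract the slope block of $(\mathbf{X}'\mathbf{X})^{-1}$, invert it, and compare it with the mean-deviation sums-of-squares-and-products matrix.

```python
# Verify numerically that the inverse of the slope block of (X'X)^{-1}
# equals the mean-deviation SSCP matrix of the two explanatory variables.
import numpy as np

rng = np.random.default_rng(2)
n = 40
x1, x2 = rng.normal(size=n), rng.normal(size=n)
X = np.column_stack([np.ones(n), x1, x2])  # intercept, x1, x2

XtX_inv = np.linalg.inv(X.T @ X)
V11 = XtX_inv[1:, 1:]                      # 2x2 block for the slope coefficients

Xc = np.column_stack([x1 - x1.mean(), x2 - x2.mean()])
S = Xc.T @ Xc                              # mean-deviation SSCP matrix

print(np.allclose(np.linalg.inv(V11), S))  # True
```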