Exercise 5.17 For the regression in-sample predicted values $\hat{y}_i$ show that $\hat{y}_i \mid X \sim \mathrm{N}\left(x_i'\beta,\ \sigma^2 h_{ii}\right)$ where $h_{ii}$ are the leverage values (3.41).

Exercise 5.18 In the normal regression model, show that the leave-one-out prediction errors $\tilde{e}_i$ and the standardized residuals $\bar{e}_i$ are independent of $\widehat{\beta}$, conditional on $X$. Hint: Use (3.46) and (4.24).

Exercise 5.19 In the normal regression model, show that the robust covariance matrices $\widehat{V}_{\widehat{\beta}}^{\mathrm{HC0}}$, $\widehat{V}_{\widehat{\beta}}^{\mathrm{HC1}}$, $\widehat{V}_{\widehat{\beta}}^{\mathrm{HC2}}$, and $\widehat{V}_{\widehat{\beta}}^{\mathrm{HC3}}$ are independent of the OLS estimator $\widehat{\beta}$, conditional on $X$.

Exercise 5.20 Let $F(u)$ be the distribution function of a random variable $X$ whose density is symmetric about zero. (This includes the standard normal and the student $t$.) Show that $F(-u) = 1 - F(u)$.

Exercise 5.21 Let $C_\beta = [L, U]$ be a $1-\alpha$ confidence interval for $\beta$, and consider the transformation $\theta = g(\beta)$ where $g(\cdot)$ is monotonically increasing. Consider the confidence interval $C_\theta = [g(L), g(U)]$ for $\theta$. Show that $\Pr\left(\theta \in C_\theta\right) = \Pr\left(\beta \in C_\beta\right)$. Use this result to develop a confidence interval for $\sigma$.

Exercise 5.22 Show that the test "Reject $H_0$ if $LR \ge c_1$" for $LR$ defined in (5.21), and the test "Reject $H_0$ if $F \ge c_2$" for $F$ defined in (5.22), yield the same decisions if $c_2 = \left(\exp(c_1/n) - 1\right)(n-k)/q$. Why does this mean that the two tests are equivalent?

Exercise 5.23 Show (5.23).

Exercise 5.24 In the normal regression model, let $s^2$ be the unbiased estimator of the error variance $\sigma^2$ from (4.26).

(a) Show that $\mathrm{var}\left(s^2\right) = 2\sigma^4/(n-k)$.

(b) Show that $\mathrm{var}\left(s^2\right)$ is strictly larger than the Cramér-Rao Lower Bound for $\sigma^2$.
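Not part of the exercise set: the following is a minimal Monte Carlo sketch that can be used to sanity-check the claim in Exercise 5.17, namely that conditional on $X$ the fitted value $\hat{y}_i$ has mean $x_i'\beta$ and variance $\sigma^2 h_{ii}$. The design matrix, coefficient vector, error standard deviation, and sample sizes below are illustrative assumptions, not values from the text.

```python
import numpy as np

rng = np.random.default_rng(0)
n, k = 50, 3
# Fixed (conditioned-on) design with an intercept; illustrative choice.
X = np.column_stack([np.ones(n), rng.normal(size=(n, k - 1))])
beta = np.array([1.0, 2.0, -0.5])   # illustrative true coefficients
sigma = 1.5                          # illustrative error standard deviation

H = X @ np.linalg.inv(X.T @ X) @ X.T   # hat matrix; its diagonal gives the leverages h_ii
h_ii = np.diag(H)

reps = 20_000
yhat_0 = np.empty(reps)                # fitted value for observation i = 0 in each replication
for r in range(reps):
    y = X @ beta + sigma * rng.normal(size=n)
    b = np.linalg.lstsq(X, y, rcond=None)[0]   # OLS estimate
    yhat_0[r] = X[0] @ b

print("simulated mean:", yhat_0.mean(), " theory:", X[0] @ beta)
print("simulated var :", yhat_0.var(),  " theory:", sigma**2 * h_ii[0])
```

With many replications, the simulated mean and variance should be close to $x_0'\beta$ and $\sigma^2 h_{00}$, consistent with the stated distribution.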
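Similarly, a rough simulation sketch for Exercise 5.24(a): it checks numerically that the unbiased variance estimator $s^2$ has variance close to $2\sigma^4/(n-k)$ under normal errors. Again all numerical settings are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)
n, k = 40, 4
X = np.column_stack([np.ones(n), rng.normal(size=(n, k - 1))])  # illustrative fixed design
beta = rng.normal(size=k)    # illustrative true coefficients
sigma = 2.0                  # illustrative error standard deviation

reps = 50_000
s2 = np.empty(reps)
for r in range(reps):
    y = X @ beta + sigma * rng.normal(size=n)
    e = y - X @ np.linalg.lstsq(X, y, rcond=None)[0]   # OLS residuals
    s2[r] = e @ e / (n - k)                            # unbiased estimator of sigma^2, as in (4.26)

print("simulated var(s^2):", s2.var())
print("theory 2*sigma^4/(n-k):", 2 * sigma**4 / (n - k))
# Part (b) asserts this variance strictly exceeds the Cramer-Rao lower bound for sigma^2.
```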