Regression analysis

  1. (8 points) In class we showed that $E(SSReg) = \sigma^2 + \beta'X'(H - \frac{1}{n}J)X\beta$. By evaluating $\beta'X'(H - \frac{1}{n}J)X\beta$ (i.e. by carrying out the matrix multiplications), prove that
     $$\beta'X'\left(H - \tfrac{1}{n}J\right)X\beta = \beta_1^2\, SS_{XX}$$
     for the simple linear regression model.
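Before attempting the algebraic proof, it can help to confirm the identity numerically. The sketch below does this with numpy on a made-up simple-regression design (the x values and coefficients are arbitrary, not from the assignment); it is a sanity check, not the required proof.

```python
# Numeric check of the Question 1 identity (not a proof):
# beta' X' (H - J/n) X beta should equal beta_1^2 * SSXX
# for a simple linear regression design.
import numpy as np

x = np.array([2.0, 3.0, 4.0, 5.0, 6.0, 7.0])   # arbitrary predictor values
beta = np.array([1.5, 2.0])                     # arbitrary (beta_0, beta_1)
n = len(x)

X = np.column_stack([np.ones(n), x])            # design matrix [1, x]
H = X @ np.linalg.inv(X.T @ X) @ X.T            # hat matrix
J = np.ones((n, n))                             # n x n matrix of ones

lhs = beta @ X.T @ (H - J / n) @ X @ beta
rhs = beta[1] ** 2 * np.sum((x - x.mean()) ** 2)   # beta_1^2 * SSXX
print(np.isclose(lhs, rhs))
```

Note that $HX = X$ (the columns of $X$ lie in the column space of $X$), which is the key simplification in the hand proof as well.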
  2. (8 points) In class we showed that $Var(\hat{Y}_h) = Var(X_h'B) = \sigma^2 X_h'(X'X)^{-1}X_h$. By evaluating $X_h'(X'X)^{-1}X_h$ (i.e. by carrying out the matrix multiplications), prove that
     $$X_h'(X'X)^{-1}X_h = \frac{1}{n} + \frac{(x_h - \bar{x})^2}{SS_{XX}}$$
     for the simple linear regression model.
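As with the previous question, the identity can be checked numerically before proving it by hand. The values below (the x vector and $x_h$) are made up for illustration only.

```python
# Numeric check of the Question 2 identity (not a proof):
# x_h' (X'X)^{-1} x_h should equal 1/n + (x_h - xbar)^2 / SSXX
# for a simple linear regression design.
import numpy as np

x = np.array([2.0, 3.0, 4.0, 5.0, 6.0, 7.0])    # arbitrary predictor values
n = len(x)
xh = 4.5                                        # arbitrary new x value

X = np.column_stack([np.ones(n), x])            # design matrix [1, x]
Xh = np.array([1.0, xh])                        # design row for x_h

lhs = Xh @ np.linalg.inv(X.T @ X) @ Xh
SSXX = np.sum((x - x.mean()) ** 2)
rhs = 1 / n + (xh - x.mean()) ** 2 / SSXX
print(np.isclose(lhs, rhs))
```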
  3. (a) (2 points) Find the symmetric matrix $A$ of the quadratic form
     $$78Y_1^2 + 57Y_1Y_2 + 96Y_2^2.$$
     (i.e. find a symmetric matrix $A$ such that $Y'AY = 78Y_1^2 + 57Y_1Y_2 + 96Y_2^2$.)
     (b) (3 points) Find the symmetric matrix $A$ of the quadratic form
     $$85Y_1^2 + 39Y_2^2 + 35Y_3^2 + 72Y_1Y_2 - 65Y_1Y_3 + 91Y_2Y_3.$$
     (i.e. find a symmetric matrix $A$ such that $Y'AY = 85Y_1^2 + 39Y_2^2 + 35Y_3^2 + 72Y_1Y_2 - 65Y_1Y_3 + 91Y_2Y_3$.)
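The general recipe for both parts: the coefficient of each squared term $Y_i^2$ goes on the diagonal of $A$, and each cross-term coefficient is split in half between the two symmetric off-diagonal positions. The sketch below illustrates this with generic made-up coefficients (not the ones in the question, so as not to give the answer away).

```python
# Building the symmetric matrix of a quadratic form a*Y1^2 + b*Y1*Y2 + c*Y2^2:
# squared-term coefficients on the diagonal, half of each cross-term
# coefficient in the two symmetric off-diagonal entries.
import numpy as np

a, b, c = 2.0, 6.0, 4.0                 # made-up coefficients
A = np.array([[a,     b / 2],
              [b / 2, c    ]])          # symmetric matrix of the form

# Verify Y'AY reproduces the quadratic form at an arbitrary point:
Y = np.array([3.0, -1.0])
form = a * Y[0] ** 2 + b * Y[0] * Y[1] + c * Y[1] ** 2
print(np.isclose(Y @ A @ Y, form))
```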
  4. A regression data set with a dependent variable Y and two independent
    variables x1 and x2 was stored in R. A printout of that data set is given
    below. For all parts of this question, you may use R. You may also assume that all
    the necessary assumptions are satisfied by the data.

reg_data
y x1 x2
[1,] 48 50 51
[2,] 57 36 46
[3,] 66 40 48
[4,] 70 41 44
[5,] 89 28 43
[6,] 36 49 54
[7,] 46 42 50
[8,] 54 45 48
[9,] 26 52 62
[10,] 77 29 50
(a) (4 points) Fit the least squares regression model of Y on x1 and x2.
(b) (3 points) Test whether there is a regression relation. Use α = 0.05.
(c) (2 points) Calculate a 95% confidence interval for the coefficient of x1.
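The question asks for R's `lm`; the same least-squares quantities can also be computed directly from the printed data, which makes the underlying formulas explicit. The numpy sketch below mirrors that workflow (the t and F quantiles themselves would come from tables or from R's `qt`/`qf`); it is an illustration, not the graded R answer.

```python
# Least-squares fit of y on x1 and x2 for the printed data, done via the
# matrix formulas rather than R's lm().
import numpy as np

y  = np.array([48, 57, 66, 70, 89, 36, 46, 54, 26, 77], dtype=float)
x1 = np.array([50, 36, 40, 41, 28, 49, 42, 45, 52, 29], dtype=float)
x2 = np.array([51, 46, 48, 44, 43, 54, 50, 48, 62, 50], dtype=float)

n = len(y)
X = np.column_stack([np.ones(n), x1, x2])       # design matrix [1, x1, x2]
p = X.shape[1]                                  # number of parameters (3)

XtX_inv = np.linalg.inv(X.T @ X)
b = XtX_inv @ X.T @ y                           # (a) least squares estimates

e = y - X @ b                                   # residuals
SSE = e @ e
MSE = SSE / (n - p)
SSTO = np.sum((y - y.mean()) ** 2)
SSR = SSTO - SSE
F = (SSR / (p - 1)) / MSE                       # (b) compare with F(0.95; 2, 7)

s_b1 = np.sqrt(MSE * XtX_inv[1, 1])             # (c) s{b1}; the interval is
# b[1] +/- t(0.975; n - p) * s_b1, with the quantile from R: qt(0.975, 7)
print(b, F, s_b1)
```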

  5. Note: For this question, use the matrix methods that we discussed in the lectures
    on chapters 5 and 6. Some parts can be answered using the simpler methods discussed
    in the earlier lectures (without matrix-based methods), but please don't do that. I can only
    give zero credit for answers using only the basic formulas whenever they can also be
    done using the methods we discussed in lectures on matrix methods. The idea of this
    question is to use the matrix-based methods that we discussed in lectures.
    Consider the following data set for a regression analysis with two variables x and
    y. Assume that a normal error regression model is appropriate for these data. Also
    assume that the errors $\varepsilon_1, \ldots, \varepsilon_n$ are i.i.d. $N(0, \sigma^2)$.

    x  2  3  4  5  6  7
    y 11 12 10 16 19 22
    i.e. the design matrix is
    $$X = \begin{pmatrix} 1 & 2 \\ 1 & 3 \\ 1 & 4 \\ 1 & 5 \\ 1 & 6 \\ 1 & 7 \end{pmatrix}.$$
    (a) (3 points) Calculate $X'X$.
    (b) (3 points) Calculate $(X'X)^{-1}$.
    (c) (3 points) Calculate $X'Y$.
    (d) (2 points) Calculate $(X'X)^{-1}X'Y$.
    (e) (3 points) Calculate the hat matrix $H$.
    (f) (6 points) Calculate MSE. Calculate this using two methods: first calculate
    MSE using the matrix methods that we discussed in the lectures on the use of linear
    algebra, and then calculate it again using the lm function in R that was discussed
    earlier in lectures.
    (g) (3 points) Calculate $s\{b_1\}$.
    (h) (2 points) Calculate $Cov(b_0, b_1)$.
    (i) (3 points) Calculate $Var(e_2)$. (Note: $e_1, \ldots, e_n$ are the residuals.)
    Note: This requires the value of $\sigma^2$. For this, you may use MSE as the value of $\sigma^2$.
    (j) (2 points) Calculate $Cov(e_1, e_2)$.
    Note: This requires the value of $\sigma^2$. For this, you may use MSE as the value of $\sigma^2$.
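The parts above form one chain of matrix computations. The sketch below traces the whole chain in numpy for the given data (the course does these in R or by hand; numpy is used here only to illustrate how each quantity feeds the next). The variance-covariance facts used in (g)-(j) are $s^2\{b\} = MSE \cdot (X'X)^{-1}$ and $Var(e) = \sigma^2(I - H)$, with MSE standing in for $\sigma^2$.

```python
# The Question 5 matrix computations, parts (a)-(j), for the given data.
import numpy as np

x = np.array([2, 3, 4, 5, 6, 7], dtype=float)
y = np.array([11, 12, 10, 16, 19, 22], dtype=float)
n = len(x)

X = np.column_stack([np.ones(n), x])    # design matrix [1, x]

XtX = X.T @ X                           # (a) X'X
XtX_inv = np.linalg.inv(XtX)            # (b) (X'X)^{-1}
XtY = X.T @ y                           # (c) X'Y
b = XtX_inv @ XtY                       # (d) b = (X'X)^{-1} X'Y
H = X @ XtX_inv @ X.T                   # (e) hat matrix

e = y - H @ y                           # residuals
MSE = (e @ e) / (n - 2)                 # (f) MSE via the matrix method
s2_b = MSE * XtX_inv                    # variance-covariance matrix of (b0, b1)
# (g) s{b1} = sqrt(s2_b[1, 1]);  (h) Cov(b0, b1) = s2_b[0, 1]
var_e = MSE * (np.eye(n) - H)           # Var(e) = sigma^2 (I - H), MSE for sigma^2
# (i) Var(e2) = var_e[1, 1];  (j) Cov(e1, e2) = var_e[0, 1]
print(b, MSE)
```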