Multiple Linear Regression

We see from the output that for the trees data our parameter estimates are b = [−58.0, 4.7, 0.3]ᵀ, and consequently our estimate of the mean response µ̂ is given by

    µ̂(x₁, x₂) = b₀ + b₁x₁ + b₂x₂,       (10)
              ≈ −58.0 + 4.7x₁ + 0.3x₂.  (11)

We could see the entire model matrix X with the model.matrix function.

> head(model.matrix(trees.lm))
  (Intercept) Girth Height
1           1   8.3     70
2           1   8.6     65
3           1   8.8     63
4           1  10.5     72
5           1  10.7     81
6           1  10.8     83

2.3 Point Estimates of the Regression Surface

The parameter estimates b make it easy to find the fitted values, Ŷ. We write them individually as Ŷᵢ, i = 1, 2, . . . , n, and recall that they are defined by

    Ŷᵢ = µ̂(x₁ᵢ, x₂ᵢ),                              (12)
       = b₀ + b₁x₁ᵢ + b₂x₂ᵢ,  i = 1, 2, . . . , n.  (13)

They are expressed more compactly by the matrix equation

    Ŷ = Xb.  (14)

From Equation 9 we know that b = (XᵀX)⁻¹XᵀY, so we can rewrite

    Ŷ = X[(XᵀX)⁻¹XᵀY],  (15)
      = HY,             (16)

where H = X(XᵀX)⁻¹Xᵀ is the hat matrix. Some facts about H are

• H is a symmetric square matrix, of dimension n × n.
• The diagonal entries hᵢᵢ satisfy 0 ≤ hᵢᵢ ≤ 1.
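The computation above can be carried out directly in R. This is a minimal sketch, assuming trees.lm was fit as lm(Volume ~ Girth + Height, data = trees) from the built-in trees data (the model fit itself is not shown in this excerpt, but it matches the coefficients and model matrix displayed above):

```r
# Assumed model fit, consistent with the coefficients shown above
trees.lm <- lm(Volume ~ Girth + Height, data = trees)

X <- model.matrix(trees.lm)                      # n x 3 model matrix
Y <- trees$Volume                                # response vector

b <- solve(t(X) %*% X, t(X) %*% Y)               # b = (X'X)^{-1} X'Y, Equation 9

H <- X %*% solve(t(X) %*% X) %*% t(X)            # hat matrix H = X (X'X)^{-1} X'
Yhat <- H %*% Y                                  # fitted values, Yhat = HY, Equation 16

# Agrees with R's built-in fitted values
all.equal(as.numeric(Yhat), as.numeric(fitted(trees.lm)))  # TRUE

# The diagonal entries h_ii lie between 0 and 1
range(diag(H))
```

Forming the full n × n matrix H explicitly is fine for illustration, but in practice R computes leverages hᵢᵢ directly (see hatvalues) without building H.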
