
Multiple Linear Regression


                   2.5 %       97.5 %
(Intercept) 0.0002561078  0.006783093
log(Girth)  6.2276411645  8.468066317
log(Height) 2.0104387829  4.645475188

(Note that we did not update the row labels of the matrix to show that we exponentiated, and so they are misleading as written.) We do predictions just as before. Remember to transform the response variable back to the original scale after prediction.

> new
> exp(predict(treesNonlin.lm, newdata = new, interval = "confidence"))
       fit      lwr      upr
1 11.90117 11.25908 12.57989
2 20.82261 20.14652 21.52139
3 28.93317 27.03755 30.96169

The predictions and intervals are slightly different from those calculated earlier, but they are close. Note that we did not need to transform the Girth and Height arguments in the data frame new. All transformations are done for us automatically.

9.2 Real Nonlinear Regression

We saw with the trees data that a nonlinear model might be more appropriate for the data based on theoretical considerations, and we were lucky because the functional form of µ allowed us to take logarithms to transform the nonlinear model to a linear one. The same trick will not work in other circumstances, however. We need techniques to fit general models of the form

    Y = µ(X) + ε,    (64)

where µ is some crazy function that does not lend itself to linear transformations.

There are a host of methods to address problems like these, which are studied in advanced regression classes. The interested reader should see Neter et al. [?] or Tabachnick and Fidell [?]. It turns out that John Fox has posted an appendix to his book [?] which discusses some of the methods and issues associated with nonlinear regression; see

http://cran.r-project.org/doc/contrib/Fox-Companion/appendix.html
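For reference, the back-transformed confidence intervals shown above can be reproduced along the following lines. This is a sketch: the text does not show how treesNonlin.lm was fit, so the formula below (a log-log model with log(Volume) as the response, on the built-in trees data) is an assumption inferred from the row labels log(Girth) and log(Height).

```r
# Sketch: assumed log-log fit on the built-in trees data.
# The response log(Volume) is an assumption; the row labels
# log(Girth) and log(Height) come from the text above.
data(trees)
treesNonlin.lm <- lm(log(Volume) ~ log(Girth) + log(Height), data = trees)

# Exponentiate the interval endpoints to return to the original
# (volume) scale; the row labels still refer to the log scale.
exp(confint(treesNonlin.lm))
```

Exponentiating the endpoints works here because exp is monotone increasing, so the coverage of the interval is preserved under the transformation.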

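Base R's nls() is one route to fitting models of the form (64) directly. The sketch below is purely illustrative: the power-law mean function and the starting values are assumptions for demonstration, not a model proposed by the text.

```r
# Sketch: nonlinear least squares for a mean function fit directly,
# without linearizing. The power-law form Volume = a * Girth^b and
# the starting values are illustrative assumptions only.
data(trees)
treesNonlin.nls <- nls(Volume ~ a * Girth^b,
                       data  = trees,
                       start = list(a = 0.1, b = 2))
coef(treesNonlin.nls)
```

Unlike lm(), nls() requires starting values for the parameters, and a poor choice can prevent convergence; this sensitivity is one of the practical issues discussed in the references above.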