
7 Moving Beyond Linearity

So far in this book, we have mostly focused on linear models. Linear models are relatively simple to describe and implement, and have advantages over other approaches in terms of interpretation and inference. However, standard linear regression can have significant limitations in terms of predictive power. This is because the linearity assumption is almost always an approximation, and sometimes a poor one. In Chapter 6 we see that we can improve upon least squares using ridge regression, the lasso, principal components regression, and other techniques. In that setting, the improvement is obtained by reducing the complexity of the linear model, and hence the variance of the estimates. But we are still using a linear model, which can only be improved so far! In this chapter we relax the linearity assumption while still attempting to maintain as much interpretability as possible. We do this by examining very simple extensions of linear models like polynomial regression and step functions, as well as more sophisticated approaches such as splines, local regression, and generalized additive models.

• Polynomial regression extends the linear model by adding extra predictors, obtained by raising each of the original predictors to a power. For example, a cubic regression uses three variables, X, X^2, and X^3, as predictors. This approach provides a simple way to obtain a nonlinear fit to data.

• Step functions cut the range of a variable into K distinct regions in order to produce a qualitative variable. This has the effect of fitting a piecewise constant function.
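The cubic regression described above can be sketched in a few lines. The book's own examples use R; the NumPy version below is a minimal illustration on simulated data (the data-generating function and seed are arbitrary choices, not from the text), showing that a polynomial fit is just least squares on the augmented predictors X, X^2, X^3.

```python
import numpy as np

# Simulated data: a nonlinear relationship y = f(x) + noise.
rng = np.random.default_rng(0)
x = np.linspace(0, 10, 200)
y = np.sin(x) + 0.3 * rng.standard_normal(x.size)

# Cubic regression via np.polyfit: least squares on x, x^2, x^3
# (an intercept is included automatically).
coefs = np.polyfit(x, y, deg=3)
y_hat = np.polyval(coefs, x)

# Equivalently, build the design matrix by hand and solve least squares:
# columns are 1, x, x^2, x^3.
X = np.column_stack([x**p for p in range(4)])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
```

Both routes produce the same fitted values, which makes the point of this chapter's opening: polynomial regression is still a linear model, just in transformed predictors.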
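The step-function bullet can be made concrete the same way. The sketch below (again NumPy rather than the book's R, with an arbitrary simulated dataset and K = 4 equal-width regions as illustrative assumptions) cuts the range of x into K regions and fits the least-squares constant in each region, i.e. the within-region mean of y.

```python
import numpy as np

# Simulated data, as before.
rng = np.random.default_rng(1)
x = np.linspace(0, 10, 200)
y = np.sin(x) + 0.3 * rng.standard_normal(x.size)

# Cut the range of x into K = 4 equal-width regions, producing a
# qualitative variable (the region index for each observation).
K = 4
edges = np.linspace(x.min(), x.max(), K + 1)
region = np.clip(np.searchsorted(edges, x, side="right") - 1, 0, K - 1)

# Piecewise constant fit: within each region, the least-squares
# prediction is simply the mean response in that region.
means = np.array([y[region == k].mean() for k in range(K)])
y_hat = means[region]
```

The result is a step function: the prediction jumps at each region boundary and is flat in between, which is exactly the "piecewise constant" behavior described above.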

G. James et al., An Introduction to Statistical Learning: with Applications in R, Springer Texts in Statistics 103, DOI 10.1007/978-1-4614-7138-7_7, © Springer Science+Business Media New York 2013
