Using R for Introductory Statistics : John Verzani

Two extensions of the linear model

> curve(g(x, Y=53.903, k=1.393, t0=1.958), add=TRUE)

Finally, so that we can compare, we find the AIC:

> AIC(res.g)
[1] 1559

Next, we fit the Richards model. First, we try the same starting values, to see if they will work (graph not shown):

> curve(f(x, Y=53.903, k=1.393, t0=1.958, m=1), add=TRUE)
> legend(4, 20, legend=c("logistic growth", "Richards"), lty=1:2)

It is not a great fit, but we try these as starting points for the algorithm anyway:

> res.f = nls(size ~ f(age, Y, k, t0, m), data=urchin.growth,
+   start=c(Y=53, k=1.393, t0=1.958, m=1))
Error in numericDeriv(form[[3]], names(ind), env) :
        Missing value or an Infinity produced when
        evaluating the model

This is one of the error messages that can occur when the initial guess isn't good or the model doesn't fit well.

Using a little hindsight, we think that the problem might be t0 and k. For this model, a few exploratory graphs indicate that we should have t ≥ t0 for a growth model, as the graphs decay until t0. So, we should start with t0 ≤ 0 and a smaller k:

> res.f = nls(size ~ f(age, Y, k, t0, m),
+   start=c(Y=53, k=.5, t0=0, m=1), data=urchin.growth)
> res.f
Nonlinear regression model
  model: size ~ f(age, Y, k, t0, m)
   data: urchin.growth
       Y        k       t0        m
 57.2649   0.7843  -0.8587   6.0636
 residual sum-of-squares: 6922
> curve(f(x, Y=57.26, k=0.78, t0=-0.8587, m=6.0636),
+   add=TRUE, lty=2)

Now we have convergence. The residual sum-of-squares, 6,922, is less than the 7,922 of the logistic model. This is a good thing, but it often happens simply because we added parameters.† So we compare the models here with AIC:

> AIC(res.f)
[1] 1548
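For reference, the model functions g (logistic growth) and f (Richards growth) are defined earlier in the chapter. A minimal sketch consistent with the calls above follows; the exact forms are assumed here, not quoted from the book:

> g = function(x, Y, k, t0) Y / (1 + exp(-k*(x - t0)))       # logistic growth (assumed form)
> f = function(x, Y, k, t0, m) Y * (1 - exp(-k*(x - t0)))^m  # Richards growth (assumed form)

Written this way, the Richards base 1 - exp(-k*(x - t0)) is negative whenever x < t0, and a negative base raised to a non-integer power m gives NaN in R. That is consistent with the "Missing value or an Infinity" error above and with the advice to keep t ≥ t0.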
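Since a smaller AIC is better, the Richards model is preferred here. As an aside (not part of the book's transcript), AIC() accepts several fitted models at once and prints a small comparison table, which saves typing the calls separately:

> AIC(res.g, res.f)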
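Another way to avoid hand-picking starting values, at least for the logistic fit, is a self-starting model such as SSlogis, which computes its own initial estimates. This is an alternative to the approach above, sketched under the assumption that the logistic parameterization maps as Asym = Y, xmid = t0, and scal = 1/k:

> res.ss = nls(size ~ SSlogis(age, Asym, xmid, scal), data=urchin.growth)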
