Dynamical Systems in Neuroscience: The Geometry of Excitability and Bursting

Bifurcations

19. [M.S.] A leaky integrate-and-fire model has the same asymptotic firing rate (1/ln) as a system near a saddle homoclinic orbit bifurcation. Explore the possibility that integrate-and-fire models describe neurons near such a bifurcation.

20. [M.S.] (blue-sky catastrophe) Prove that

    ϕ̇ = ω, ẋ = a + x², if x = +∞, then x ← −∞ and ϕ ← 0,

is the canonical model (see Sect. 8.1.5) for the blue-sky catastrophe. This model without the reset of ϕ is canonical for the fold limit cycle on homoclinic torus bifurcation. The model with the reset x ← b + sin ϕ is canonical for the Lukyanov-Shilnikov bifurcation of a fold limit cycle with non-central homoclinics (Shilnikov and Cymbalyuk 2004, Shilnikov et al. 2005). Here, ϕ is the phase variable on the unit circle, and a and b are bifurcation parameters. (A numerical sketch of this model follows the list.)

21. [M.S.] Define topological equivalence and the notion of a bifurcation for piecewise continuous flows.

22. [Ph.D.] Use the definition above to classify codimension-1 bifurcations in piecewise continuous flows.

23. [M.S.] The bifurcation sequence in Fig. 6.40 seems to be typical in 2-dimensional neuronal models. Develop the theory of the Bogdanov-Takens bifurcation with a global reentrant orbit.

24. [Ph.D.] Develop an automated dynamic clamp protocol (Sharp et al. 1993) that analyzes bifurcations in neurons in vitro, similar to what AUTO, XPPAUT, or MATCONT do in models.
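The following is a minimal numerical sketch of the canonical model from Exercise 20, assuming forward Euler integration and a finite cutoff X_MAX standing in for x = ±∞; the names (simulate, OMEGA, A, X_MAX, DT) and the parameter values are illustrative choices, not part of the original text. For a > 0, x escapes to +∞ in finite time, so the reset recurs with inter-reset interval close to the passage time π/√a of ẋ = a + x² from −∞ to +∞.

import math

# Euler simulation of the canonical model in Exercise 20:
#     phi' = omega,  x' = a + x^2;  when x reaches +infinity: x <- -infinity, phi <- 0.
# The infinite threshold and reset are replaced by a finite cutoff X_MAX
# (an assumption of this sketch, not part of the model itself).

OMEGA = 1.0    # frequency of the phase variable (illustrative value)
A = 0.1        # bifurcation parameter a > 0, so x blows up in finite time
X_MAX = 100.0  # finite stand-in for infinity
DT = 1e-4      # Euler time step

def simulate(t_end=50.0, omega=OMEGA, a=A):
    """Integrate the model with forward Euler; return the reset (spike) times."""
    x, phi = -X_MAX, 0.0
    reset_times = []
    t = 0.0
    while t < t_end:
        x += DT * (a + x * x)
        phi = (phi + DT * omega) % (2.0 * math.pi)
        if x >= X_MAX:          # x "reached +infinity"
            x = -X_MAX          # reset x <- -infinity
            phi = 0.0           # reset phi <- 0
            reset_times.append(t)
        t += DT
    return reset_times

if __name__ == "__main__":
    spikes = simulate()
    isis = [later - earlier for earlier, later in zip(spikes, spikes[1:])]
    print(f"{len(spikes)} resets; mean inter-reset interval {sum(isis)/len(isis):.2f}")
    print(f"passage time pi/sqrt(a) = {math.pi/math.sqrt(A):.2f}")

Replacing the reset line x = -X_MAX with x = b + math.sin(phi) would give the variant with the reset x ← b + sin ϕ mentioned in the exercise (the Lukyanov-Shilnikov case).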
