Untitled - UFRJ


11.07.2015

Particle Learning for Fat-tailed Distributions
Nick Polson
Chicago, USA

We develop a sequential Monte Carlo method known as particle learning (PL) for fat-tailed error distributions. Fat tails are a common feature of many economic and financial time series and can be incorporated into state space models alongside other features such as stochastic volatility. A natural framework for addressing fat tails is the class of scale mixtures of normals, which yields a conditionally Gaussian dynamic model and hence a mixture Kalman filter. In particular, we focus on learning the tail behavior of the time series by assuming that the errors follow a t_ν-distribution, where the researcher sequentially computes the posterior distribution of the tail thickness, p(ν | y^t), as new data arrive. This framework is flexible enough to entertain anything from infinite-variance Cauchy errors (ν = 1) to standard Gaussian errors (ν = ∞). Finally, we show how a variant of the Savage-Dickey density ratio can be used to calculate a sequential Bayes factor for fat-tailed errors versus normal errors. Comparisons are made to standard Monte Carlo and MCMC approaches and to approximate inference for latent Gaussian processes. This is joint work with Hedibert F. Lopes.
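The scale-mixture representation behind this conditionally Gaussian structure can be sketched numerically: a t_ν draw is a normal draw whose variance is itself drawn from an inverse-gamma distribution. The snippet below is a minimal illustration of that standard identity, not the authors' code; the variable names (`nu`, `lam`, `eps`) are ours.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
nu, n = 5.0, 200_000  # degrees of freedom (nu > 4 keeps the variance check stable)

# t_nu as a scale mixture of normals:
#   lam_i ~ InverseGamma(nu/2, nu/2),  eps_i | lam_i ~ N(0, lam_i)  =>  eps_i ~ t_nu
lam = stats.invgamma.rvs(a=nu / 2, scale=nu / 2, size=n, random_state=rng)
eps = rng.normal(0.0, np.sqrt(lam))

# Sanity check: the t_nu variance is nu / (nu - 2)
print(eps.var(), nu / (nu - 2))
```

Conditioning on the mixing variables `lam` is what makes the state space model Gaussian given the mixture, which is the structure the mixture Kalman filter exploits.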

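The Savage-Dickey idea used in the abstract above, namely that the Bayes factor for a sharp null equals the ratio of posterior to prior density at the null value, can be illustrated on a toy conjugate model where both densities are available in closed form. This is a generic sketch, not the sequential fat-tail test from the talk; the model and all names are our own assumptions.

```python
import numpy as np
from scipy import stats

# Savage-Dickey density ratio on a toy model: y_i ~ N(mu, 1) with prior
# mu ~ N(0, tau2); test the sharp null H0: mu = 0 (nested in H1).
rng = np.random.default_rng(1)
y = rng.normal(0.3, 1.0, size=50)
tau2 = 1.0

n = len(y)
post_var = 1.0 / (n + 1.0 / tau2)       # conjugate posterior variance
post_mean = post_var * n * y.mean()     # conjugate posterior mean

# BF_01 = p(mu = 0 | y) / p(mu = 0): posterior over prior density at the null
bf01 = (stats.norm.pdf(0.0, post_mean, np.sqrt(post_var))
        / stats.norm.pdf(0.0, 0.0, np.sqrt(tau2)))

# Cross-check against the exact marginal-likelihood ratio m0(y) / m1(y),
# using y ~ N(0, I + tau2 * 11') under H1
log_m0 = stats.norm.logpdf(y, 0.0, 1.0).sum()
log_m1 = stats.multivariate_normal.logpdf(y, mean=np.zeros(n),
                                          cov=np.eye(n) + tau2)
print(bf01, np.exp(log_m0 - log_m1))
```

In the sequential setting of the abstract, the same ratio would be evaluated at each time point from the particle approximation to p(ν | y^t), giving a running Bayes factor for fat-tailed versus normal errors.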
Bayesian Non-Parametric Inference for Diffusion Processes
Gareth Roberts
University of Warwick, United Kingdom

This talk will consider Bayesian inference for diffusions in a non-parametric framework. The presentation covers a complete treatment for continuous data and goes on to consider the case of discretely observed data. The work is motivated by problems from molecular dynamics and is illustrated by simple examples using molecular dynamics data.

