Untitled - UFRJ
Particle Learning for Fat-tailed Distributions
Nick Polson (Chicago, USA)

We develop a sequential Monte Carlo method known as particle learning (PL) for fat-tailed error distributions. Fat tails are a common feature of many economic and financial time series and can be incorporated into state space models alongside other features such as stochastic volatility. A natural framework for fat tails is the class of scale mixtures of normals, which yields a conditionally Gaussian dynamic model and hence a mixture Kalman filter. In particular, we focus on learning the tail behavior of the time series by assuming that the errors follow a t_ν-distribution, where the researcher sequentially computes the posterior distribution of the tail thickness p(ν | y^t) as new data arrive. This framework is flexible enough to entertain everything from infinite-variance Cauchy errors (ν = 1) to standard Gaussian errors (ν = ∞). Finally, we show how a variant of the Savage-Dickey density ratio can be used to compute a sequential Bayes factor of the fat-tailed error model versus the normal. Comparisons are made to standard Monte Carlo and MCMC approaches and to approximate inference for latent Gaussian processes. This is joint work with Hedibert F. Lopes.
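The two building blocks of the abstract can be sketched in a few lines: the scale-mixture-of-normals representation of the t_ν distribution, and a sequential update of p(ν | y^t) as observations arrive. The sketch below is illustrative only: it uses i.i.d. t errors with no latent state, and a discrete grid over ν with a flat prior stands in for the particle learning algorithm described in the talk.

```python
import numpy as np
from scipy.stats import t as t_dist

rng = np.random.default_rng(42)

def sample_t_scale_mixture(nu, size, rng):
    """Draw t_nu variates via the scale-mixture-of-normals representation:
    lambda ~ InvGamma(nu/2, nu/2), then y | lambda ~ N(0, lambda)."""
    # 1/Gamma(shape=nu/2, scale=2/nu) is InvGamma(nu/2, rate nu/2).
    lam = 1.0 / rng.gamma(shape=nu / 2.0, scale=2.0 / nu, size=size)
    return rng.normal(0.0, np.sqrt(lam))

# Simulate Cauchy observations (nu = 1) through the mixture representation.
y = sample_t_scale_mixture(nu=1.0, size=500, rng=rng)

# Sequential posterior for the tail-thickness parameter on a discrete grid,
# under a flat prior: each new observation adds one log-likelihood term,
# mimicking how p(nu | y^t) is updated as data arrive.
nu_grid = np.array([1.0, 2.0, 4.0, 8.0, 16.0, 32.0])
log_post = np.zeros(len(nu_grid))
for y_t in y:
    log_post += t_dist.logpdf(y_t, df=nu_grid)
post = np.exp(log_post - log_post.max())
post /= post.sum()

print(dict(zip(nu_grid, post.round(3))))  # mass concentrates at small nu
```

Because the data are Cauchy, the posterior mass piles up at the heavy-tailed end of the grid; with Gaussian data it would drift toward the largest ν, which is the behavior the sequential Bayes factor in the abstract formalizes.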
Bayesian Non-Parametric Inference for Diffusion Processes
Gareth Roberts (University of Warwick, United Kingdom)

This talk considers Bayesian inference for diffusions in a non-parametric framework. The presentation gives a complete treatment for continuous data and then turns to the case of discretely observed data. The work is motivated by problems from molecular dynamics and is illustrated by simple examples using molecular dynamics data.