Statistics and Data Science Seminar
Prof. Hedibert Lopes
University of Chicago
Particle Methods for General Mixtures
Abstract: This paper develops efficient sequential learning methods for the estimation of general mixture models by working directly with particles based on conditional sufficient information. We provide an alternative to existing inference techniques that will be especially relevant in on-line estimation settings and for large, high-dimensional datasets. With each new observation, particles are updated in two steps: first, resampling with weights proportional to the implied predictive probability distribution and, second, propagating the next latent mixture allocation and implicitly sampling the next particle vector. We introduce the methodology in the context of finite mixture models before extending it to any nonparametric mixture model with an available predictive probability function, focusing in particular on Dirichlet process mixture models. In addition, we show that the algorithm provides a natural estimate of sequential Bayes factors and can facilitate selection between competing dynamic models. The framework is illustrated with numerous real and simulated data examples. (This is joint work with Carlos Carvalho, Nicholas Polson and Matt Taddy.)
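The resample-propagate update described above can be sketched in code. The following is a minimal illustrative Python sketch, not the authors' implementation, in a deliberately simplified setting: a finite mixture with known component densities and an unknown weight vector, where each particle carries only the conditional sufficient statistics (allocation counts) under a Dirichlet prior. Names such as component_pdfs, alpha, and n_particles are assumptions introduced for the example.

    # Minimal sketch of a resample-propagate particle update for a finite mixture.
    # Simplifying assumptions: component densities f_k are known; only the mixture
    # weights are unknown, with a Dirichlet(alpha) prior; each particle stores the
    # allocation counts n_k (its conditional sufficient statistics).
    import numpy as np
    from scipy.stats import norm

    rng = np.random.default_rng(0)

    component_pdfs = [norm(-2.0, 1.0).pdf, norm(2.0, 1.0).pdf]  # assumed known f_k
    K = len(component_pdfs)
    alpha = np.ones(K)        # Dirichlet prior on mixture weights (assumption)
    n_particles = 1000

    # Particle i is a vector of allocation counts n_k.
    counts = np.zeros((n_particles, K))

    def predictive(y, counts):
        """Predictive p(y | particle) = sum_k E[w_k | counts] * f_k(y)."""
        w = (counts + alpha) / (counts.sum(axis=1, keepdims=True) + alpha.sum())
        f = np.array([pdf(y) for pdf in component_pdfs])   # shape (K,)
        return w @ f                                        # shape (n_particles,)

    def update(y, counts):
        # Step 1: resample particles with weights proportional to the predictive.
        p = predictive(y, counts)
        idx = rng.choice(n_particles, size=n_particles, p=p / p.sum())
        counts = counts[idx]
        # Step 2: propagate by sampling the latent allocation for y and updating
        # each particle's sufficient statistics.
        f = np.array([pdf(y) for pdf in component_pdfs])
        probs = (counts + alpha) * f                        # unnormalised p(z = k | y)
        probs /= probs.sum(axis=1, keepdims=True)
        z = np.array([rng.choice(K, p=row) for row in probs])
        counts[np.arange(n_particles), z] += 1.0
        return counts

    # Stream simulated data through the filter one observation at a time.
    data = np.concatenate([rng.normal(-2, 1, 150), rng.normal(2, 1, 50)])
    rng.shuffle(data)
    for y in data:
        counts = update(y, counts)

    # Posterior mean of the mixture weights, averaged over particles.
    weights = (counts + alpha) / (counts.sum(axis=1, keepdims=True) + alpha.sum())
    print("estimated mixture weights:", weights.mean(axis=0))

In this simplified form, the per-particle predictive values computed in step 1 can also be accumulated across observations to track the marginal likelihood, which is the quantity behind the sequential Bayes factors mentioned in the abstract.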
Wednesday October 7, 2009 at 3:00 PM in SEO 636