Emily Fox; Leveraging Optimization Techniques to Scale Bayesian Inference

February 24, 2015, 4:00pm
GUG 204
Emily Fox, Department of Statistics, UW.

Abstract: Data streams of increasing complexity are being collected in fields ranging from neuroscience, genomics, and environmental monitoring to e-commerce, enabled by technologies and infrastructures previously unavailable. With the advent of Markov chain Monte Carlo (MCMC), combined with the computational power to implement such algorithms, deploying increasingly expressive models has been a focus in recent decades. Unfortunately, traditional algorithms for Bayesian inference in these models, such as MCMC and variational inference, typically do not scale to the large datasets encountered in practice. Nor do these algorithms apply to the increasingly common situation where an unbounded amount of data arrives as a stream and inferences must be made on the fly. In this talk, we will present a series of algorithms (stochastic gradient Hamiltonian Monte Carlo, stochastic variational inference for HMMs, and streaming Bayesian nonparametric inference) that address various aspects of the challenge of scaling Bayesian inference; all of our algorithms deploy stochastic gradients and work within an optimization framework. We demonstrate our methods on a variety of applications, including online movie recommendations, segmenting a human chromatin dataset with 250 million observations, and clustering a stream of New York Times documents.
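As a rough illustration of the stochastic-gradient idea, the Python sketch below runs the stochastic gradient HMC update (Chen, Fox & Guestrin, 2014) on a toy one-dimensional Gaussian target. The target, step size, friction constant, and noise model here are illustrative assumptions, not details from the talk.

import numpy as np

# Minimal SGHMC sketch on a toy target U(theta) = theta^2 / 2,
# i.e. a standard normal posterior. Minibatch noise is simulated by
# perturbing the exact gradient; all constants are illustrative.
rng = np.random.default_rng(0)

def noisy_grad_U(theta):
    # Stochastic estimate of the gradient of the negative log posterior.
    return theta + rng.normal(scale=0.5)

eps = 0.01   # step size
C = 1.0      # friction; counteracts the extra gradient noise
theta, v = 0.0, 0.0
samples = []
for _ in range(50_000):
    theta += eps * v
    # Friction (-eps * C * v) plus injected noise N(0, 2*C*eps) preserve
    # the target distribution despite the noisy gradient (ignoring the
    # small noise-estimate correction from the paper).
    v += (-eps * noisy_grad_U(theta) - eps * C * v
          + rng.normal(scale=np.sqrt(2 * C * eps)))
    samples.append(theta)

burn = samples[5_000:]
print(np.mean(burn), np.var(burn))   # should be close to 0 and 1

The friction term is the key design choice: it absorbs the extra variance introduced by the stochastic gradient, so the sampler stays near the correct stationary distribution without a Metropolis correction step.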

Joint work with Tianqi Chen, Nick Foti, Dillon Laird, Alex Tank, and Jason Xu.