Stochastic Gradient MCMC Methods for Hidden Markov Models

  • Authors: Yi-An Ma (Univ. of Washington), Nicholas Foti (Univ. of Washington), Emily B. Fox (Univ. of Washington)
  • Publication ID: P091068
  • Publication Type: Paper
  • Received Date: 2-Jun-2017
  • Last Edit Date: 5-Jun-2017
  • Research: 2386.003 (California Institute of Technology)

Abstract

Stochastic gradient MCMC (SG-MCMC) algorithms have been successfully applied to Bayesian inference on large datasets under the assumption of i.i.d. data. We instead develop an SG-MCMC algorithm to learn the parameters of hidden Markov models (HMMs) for time-dependent data. The challenge in applying SG-MCMC to dependent data is the need to break the dependencies when forming mini-batches of observations. We propose an algorithm that harnesses the inherent memory decay of the process. We demonstrate the effectiveness of our algorithm on synthetic experiments and on an ion channel recording dataset, and show that it significantly outperforms the corresponding batch MCMC algorithm in runtime.
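The core idea the abstract describes, using mini-batches of *contiguous* observations padded with buffers so that memory decay absorbs the broken dependencies at the subchain boundaries, can be illustrated with a minimal sketch. This is not the paper's HMM algorithm: it runs stochastic gradient Langevin dynamics (SGLD) on the mean of a simple working model over autocorrelated data, and all names, sizes, and step values below are illustrative assumptions. In the actual HMM setting, the full buffered window would feed local forward-backward message passing; in this conjugate toy, only the core subchain enters the gradient.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy AR(1)-style dependent series with true mean 2.0 (illustrative stand-in
# for time-dependent data; not the paper's HMM setup).
T = 10_000
x = np.empty(T)
x[0] = 2.0
for t in range(1, T):
    x[t] = 2.0 + 0.8 * (x[t - 1] - 2.0) + rng.normal(scale=0.5)

def buffered_minibatch(x, start, length, buffer):
    """Return (buffered window, core subchain).

    The buffer on each side absorbs boundary dependence -- the "memory
    decay" idea from the abstract. For an HMM, local message passing would
    run over the buffered window; here only the core is used.
    """
    lo = max(start - buffer, 0)
    hi = min(start + length + buffer, len(x))
    return x[lo:hi], x[start:start + length]

# SGLD on the mean of a Gaussian working model with known variance
# (hypothetical hyperparameters, flat prior).
theta, step, sigma2 = 0.0, 1e-4, 1.0
L, B = 100, 20          # core subchain length and buffer width (assumed)
samples = []
for it in range(2000):
    s = rng.integers(0, T - L)
    _window, core = buffered_minibatch(x, s, L, B)
    # Stochastic gradient of the log-likelihood, rescaled by T/L so the
    # mini-batch gradient is unbiased for the full-data gradient.
    grad = (T / L) * np.sum(core - theta) / sigma2
    theta += 0.5 * step * grad + rng.normal(scale=np.sqrt(step))
    if it >= 1000:      # discard burn-in
        samples.append(theta)

theta_hat = float(np.mean(samples))
```

With these settings the posterior-mean estimate `theta_hat` lands near the true value of 2.0; the runtime advantage the abstract reports comes from each update touching only `L + 2B` observations rather than the full chain.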

