Stochastic Gradient MCMC Methods for Hidden Markov Models

  • Authors:
    Yi-An Ma, Nicholas Foti, and Emily B. Fox (University of Washington)
    Publication ID:
    2386.003 (California Institute of Technology)


Stochastic gradient MCMC (SG-MCMC) algorithms have been successfully applied to Bayesian inference with large datasets under an assumption of i.i.d. data. We instead develop an SG-MCMC algorithm to learn the parameters of hidden Markov models (HMMs) for time-dependent data. The challenge in applying SG-MCMC to dependent data is the need to break the dependencies when considering minibatches of observations. We propose an algorithm that harnesses the inherent memory decay of the process. We demonstrate the effectiveness of our algorithm on synthetic experiments and on an ion channel recording dataset. In terms of runtime, our algorithm significantly outperforms the corresponding batch MCMC algorithm.
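The core idea of subsampling dependent data can be sketched as follows: sample a contiguous subchain and pad it with buffer observations on each side, so that the memory decay of the chain absorbs the broken dependencies at the minibatch boundaries. The sketch below is illustrative only, not the paper's algorithm: it runs stochastic gradient Langevin dynamics (SGLD) on the mean of a simple Gaussian emission model rather than a full HMM, and the data, subchain length `L`, buffer size `B`, and step size `eps` are all hypothetical choices.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic 1-D observation sequence standing in for HMM output (hypothetical data).
T = 10_000
y = rng.normal(loc=1.5, scale=1.0, size=T)

def buffered_minibatch(y, L, B, rng):
    """Sample a contiguous subchain of length L plus buffers of length B on
    each side; the buffers shield the core from boundary effects caused by
    breaking the chain's dependencies."""
    start = rng.integers(B, len(y) - L - B)  # valid start for the core
    return y[start - B : start + L + B]

# SGLD on the emission mean mu (flat prior), a stand-in for the full HMM
# parameter update; eps is a tuning assumption, not a recommended value.
mu, eps, L, B = 0.0, 1e-5, 50, 10
for _ in range(2000):
    sub = buffered_minibatch(y, L, B, rng)
    core = sub[B : B + L]                      # gradient uses the core only
    grad = (len(y) / L) * np.sum(core - mu)    # rescaled stochastic gradient of the log-likelihood
    mu += 0.5 * eps * grad + np.sqrt(eps) * rng.normal()  # SGLD step with injected noise

# After burn-in, mu fluctuates around the data mean (about 1.5 here).
```

In this toy model the buffers carry no information (the observations are independent), but for a genuine HMM the gradient of the subchain likelihood would be computed with forward-backward messages initialized inside the buffers, which is where the memory-decay argument does its work.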
