An evaluation of Markov Chain Monte Carlo samplers for models with discrete parameters

Publication date

DOI

Document Type

Master Thesis

Collections

License

CC-BY-NC-ND

Abstract

MCMC samplers are widely used in Bayesian inference. Samplers for models with continuous parameters are highly efficient and scalable, but designing comparably efficient algorithms for models with discrete parameters has proven considerably more challenging. W. Grathwohl et al. claimed that their newly proposed sampler, Gibbs with Gradients, outperforms the best existing samplers. We evaluated these claims on a series of randomly generated models as well as on Ising and Potts models, comparing Gibbs with Gradients against the plain Gibbs sampler. We show empirically that while Gibbs with Gradients reduces the autocorrelation of the draws, its additional computational cost results in a lower effective sample size per second, making it the worse choice in practice.
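To make the baseline in the comparison above concrete, the following is a minimal sketch of a single-site Gibbs sampler for a 2D Ising model, one of the benchmark families the abstract mentions. All parameter names and values (lattice size, inverse temperature, number of sweeps) are illustrative assumptions, not taken from the thesis.

```python
import numpy as np

def gibbs_ising(n=8, beta=0.4, sweeps=20, seed=None):
    """Single-site Gibbs sampler for an n-by-n Ising model with
    periodic boundaries. Parameters here are illustrative only."""
    rng = np.random.default_rng(seed)
    spins = rng.choice([-1, 1], size=(n, n))
    for _ in range(sweeps):
        for i in range(n):
            for j in range(n):
                # Sum of the four nearest neighbours on the torus.
                nb = (spins[(i - 1) % n, j] + spins[(i + 1) % n, j]
                      + spins[i, (j - 1) % n] + spins[i, (j + 1) % n])
                # Conditional probability that spin (i, j) is +1
                # given all other spins.
                p_up = 1.0 / (1.0 + np.exp(-2.0 * beta * nb))
                spins[i, j] = 1 if rng.random() < p_up else -1
    return spins
```

One full pass over the lattice (a "sweep") resamples every spin from its exact conditional distribution; the per-draw cost of such updates, versus the gradient-informed proposals of Gibbs with Gradients, is exactly the trade-off the thesis measures via effective sample size per second.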

Keywords

Markov Chain Monte Carlo, Statistics, Sampling, Bayesian

Citation