hmc function

Fit a generic model using Hamiltonian Monte Carlo (HMC)

This function runs the HMC algorithm on a generic model, given user-supplied logPOSTERIOR and gradient glogPOSTERIOR functions. All parameters specified within the list param are passed to these two functions. The tuning parameters epsilon and L are passed to the leapfrog algorithm.
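
A minimal sketch of this "generic model" interface is shown below (it is not taken from the package documentation): the target is a bivariate standard normal, log_post and glog_post are illustrative names, and the tuning values are arbitrary.

library(hmclearn)

# Log density of a bivariate standard normal, up to an additive constant
log_post <- function(theta, ...) -0.5 * sum(theta^2)
# Gradient of that log density with respect to theta
glog_post <- function(theta, ...) -theta

fit <- hmc(N = 1000, theta.init = c(0, 0), epsilon = 0.1, L = 20,
           logPOSTERIOR = log_post, glogPOSTERIOR = glog_post,
           varnames = c("theta1", "theta2"))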

hmc(
  N = 10000,
  theta.init,
  epsilon = 0.01,
  L = 10,
  logPOSTERIOR,
  glogPOSTERIOR,
  randlength = FALSE,
  Mdiag = NULL,
  constrain = NULL,
  verbose = FALSE,
  varnames = NULL,
  param = list(),
  chains = 1,
  parallel = FALSE,
  ...
)

Arguments

  • N: Number of MCMC samples
  • theta.init: Vector of initial values for the parameters
  • epsilon: Step-size parameter for the leapfrog algorithm
  • L: Number of leapfrog steps
  • logPOSTERIOR: Function to calculate and return the log posterior given a vector of values of theta
  • glogPOSTERIOR: Function to calculate and return the gradient of the log posterior given a vector of values of theta
  • randlength: Logical indicating whether to randomize the number of leapfrog steps L
  • Mdiag: Optional vector of the diagonal of the mass matrix M. Defaults to unit diagonal.
  • constrain: Optional vector indicating which parameters in theta are constrained to take positive values only. By default, all parameters take values over the whole real line
  • verbose: Logical to determine whether to display the progress of the HMC algorithm
  • varnames: Optional vector of theta parameter names
  • param: List of additional parameters for logPOSTERIOR and glogPOSTERIOR
  • chains: Number of MCMC chains to run
  • parallel: Logical indicating whether multiple MCMC chains should be run in parallel (see the sketch after this list)
  • ...: Additional parameters for logPOSTERIOR
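
As a hedged illustration of the chains and parallel arguments, and of passing data through param, the sketch below reuses the built-in linear_posterior pair on simulated data; the initial values and tuning settings are arbitrary choices rather than recommendations.

set.seed(143)
X <- cbind(1, rnorm(100))                  # design matrix with an intercept column
y <- X %*% c(1, 2) + rnorm(100, sd = 0.3)  # simulated response

fit2 <- hmc(N = 500, theta.init = c(0, 0, 1), epsilon = 0.01, L = 10,
            logPOSTERIOR = linear_posterior,
            glogPOSTERIOR = g_linear_posterior,
            varnames = c("beta0", "beta1", "log_sigma_sq"),
            param = list(y = y, X = X),    # data passed to both posterior functions
            chains = 2, parallel = FALSE)  # two chains, run sequentially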

Returns

Object of class hmclearn

Elements for hmclearn objects

  • N: Number of MCMC samples
  • theta: Nested list of length N of the sampled values of theta for each chain
  • thetaCombined: List of data frames containing the sampled values, one for each chain
  • r: List of length N of the sampled momenta
  • theta.all: Nested list of all parameter values of theta sampled prior to the accept/reject step, for each chain
  • r.all: List of all values of the momenta r sampled prior to accept/reject
  • accept: Number of accepted proposals. The ratio accept / N is the acceptance rate (see the sketch after this list)
  • accept_v: Vector of length N indicating which samples were accepted
  • M: Mass matrix used in the HMC algorithm
  • algorithm: HMC, for Hamiltonian Monte Carlo
  • varnames: Optional vector of parameter names
  • chains: Number of MCMC chains
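
A short sketch of working with these elements, using the fit object from the sketch in the description above (the burnin value passed to the package's summary method is arbitrary):

fit$accept / fit$N              # overall acceptance rate
head(fit$thetaCombined[[1]])    # sampled values from the (single) chain
summary(fit, burnin = 200)      # posterior summaries after discarding burn-in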

Available logPOSTERIOR and glogPOSTERIOR functions

  • linear_posterior: Linear regression: log posterior
  • g_linear_posterior: Linear regression: gradient of the log posterior
  • logistic_posterior: Logistic regression: log posterior
  • g_logistic_posterior: Logistic regression: gradient of the log posterior
  • poisson_posterior: Poisson (count) regression: log posterior
  • g_poisson_posterior: Poisson (count) regression: gradient of the log posterior
  • lmm_posterior: Linear mixed effects model: log posterior
  • g_lmm_posterior: Linear mixed effects model: gradient of the log posterior
  • glmm_bin_posterior: Logistic mixed effects model: log posterior
  • g_glmm_bin_posterior: Logistic mixed effects model: gradient of the log posterior
  • glmm_poisson_posterior: Poisson mixed effects model: log posterior
  • g_glmm_poisson_posterior: Poisson mixed effects model: gradient of the log posterior

Examples

# Linear regression example
set.seed(521)
X <- cbind(1, matrix(rnorm(300), ncol = 3))
betavals <- c(0.5, -1, 2, -3)
y <- X %*% betavals + rnorm(100, sd = .2)

fm1_hmc <- hmc(N = 500,
               theta.init = c(rep(0, 4), 1),
               epsilon = 0.01,
               L = 10,
               logPOSTERIOR = linear_posterior,
               glogPOSTERIOR = g_linear_posterior,
               varnames = c(paste0("beta", 0:3), "log_sigma_sq"),
               param = list(y = y, X = X),
               parallel = FALSE,
               chains = 1)

summary(fm1_hmc, burnin = 100)

# Poisson regression example
set.seed(7363)
X <- cbind(1, matrix(rnorm(40), ncol = 2))
betavals <- c(0.8, -0.5, 1.1)
lmu <- X %*% betavals
y <- sapply(exp(lmu), FUN = rpois, n = 1)

fm2_hmc <- hmc(N = 500,
               theta.init = rep(0, 3),
               epsilon = 0.01,
               L = 10,
               logPOSTERIOR = poisson_posterior,
               glogPOSTERIOR = g_poisson_posterior,
               varnames = paste0("beta", 0:2),
               param = list(y = y, X = X),
               parallel = FALSE,
               chains = 1)

summary(fm2_hmc, burnin = 100)

References

Neal, Radford. 2011. MCMC Using Hamiltonian Dynamics. In Handbook of Markov Chain Monte Carlo, edited by Steve Brooks, Andrew Gelman, Galin L. Jones, and Xiao-Li Meng, 116–62. Chapman & Hall/CRC.

Betancourt, Michael. 2017. A Conceptual Introduction to Hamiltonian Monte Carlo.

Thomas, S., Tu, W. 2020. Learning Hamiltonian Monte Carlo in R.

Author(s)

Samuel Thomas (samthoma@iu.edu), Wanzhu Tu (wtu@iu.edu)

  • Maintainer: Samuel Thomas
  • License: GPL-3
  • Last published: 2020-10-05
