Plot sequences of Kullback distance estimates for comparison of several MCMC algorithms for a same target density
This function draws, on the same plot, several sequences of estimates of the Kullback distance K(pt, f), i.e. the convergence criterion against time (iteration t), for each MCMC algorithm for which the convergence criterion has been computed.
plot_Kblist(Kb, which = 1, lim = NULL, ylim = NULL)
Arguments
Kb: A list of objects of class "KbMCMC", such as the ones returned by EntropyMCMC or EntropyParallel, or their HPC versions.
which: Controls the level of detail in the legend added to the plot (see Details).
lim: For zooming over the iterations 1:lim only.
ylim: Limits on the y axis for zooming, passed to plot.
Details
The purpose of this plot is to compare K MCMC algorithms (typically based on K different simulation strategies or kernels) for convergence or efficiency in estimating the same target density f. For the kth algorithm, the user must first compute the convergence criterion, i.e. the sequence K(pt(k), f) for t = 1 up to the chosen number of iterations, where pt(k) is the estimated pdf of the algorithm at time t.
For the legend, which=1 displays each MCMC's name together with some technical information depending on the algorithm's definition (e.g. the proposal variance for the RWHM algorithm) and the method used for entropy estimation. The legend for which=2 is shorter, displaying only each MCMC's name together with the number of parallel chains used for it, typically to compare the effect of that number for a single MCMC algorithm.
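As an illustrative sketch of the two legend levels, assuming the EntropyMCMC package is available and that e1 and e2 are "KbMCMC" objects such as those built in the Examples section below:

```r
## Hypothetical sketch: e1 and e2 are assumed to be "KbMCMC" objects
## produced beforehand, e.g. as in the Examples section.
library(EntropyMCMC)

## which = 1: detailed legend (MCMC names, kernel settings,
## entropy estimation method)
plot_Kblist(list(e1, e2), which = 1)

## which = 2: shorter legend (MCMC names and number of parallel
## chains), here also zooming on the first 100 iterations via lim
plot_Kblist(list(e1, e2), which = 2, lim = 100)
```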
Returns
The graphic is drawn on the current device.
References
Chauveau, D. and Vandekerkhove, P. (2013), Smoothness of Metropolis-Hastings algorithm and application to entropy estimation. ESAIM: Probability and Statistics, 17, 419--431. DOI: http://dx.doi.org/10.1051/ps/2012004
Chauveau D. and Vandekerkhove, P. (2014), Simulation Based Nearest Neighbor Entropy Estimation for (Adaptive) MCMC Evaluation, In JSM Proceedings, Statistical Computing Section. Alexandria, VA: American Statistical Association. 2816--2827.
Chauveau, D. and Vandekerkhove, P. (2014), The Nearest Neighbor entropy estimate: an adequate tool for adaptive MCMC evaluation. Preprint HAL: http://hal.archives-ouvertes.fr/hal-01068081.
Author(s)
Didier Chauveau.
See Also
EntropyMCMC, EntropyMCMC.mc
Examples
## Toy example using the bivariate centered gaussian target
## with default parameter values, see target_norm_param
d = 2               # state space dimension
n = 300; nmc = 100  # number of iterations and iid Markov chains
## initial distribution, located in (2,2), "far" from target center (0,0)
Ptheta0 <- DrawInit(nmc, d, initpdf = "rnorm", mean = 2, sd = 1)

## MCMC 1: Random-Walk Hastings-Metropolis
varq = 0.05  # variance of the proposal (chosen too small)
q_param = list(mean = rep(0, d), v = varq*diag(d))

## using Method 1: simulation with storage, and *then* entropy estimation
## simulation of the nmc iid chains, single core here
s1 <- MCMCcopies(RWHM, n, nmc, Ptheta0, target_norm,
                 target_norm_param, q_param)
summary(s1)            # method for "plMCMC" object
e1 <- EntropyMCMC(s1)  # computes Entropy and Kullback divergence

## MCMC 2: Independence Sampler with large enough gaussian proposal
varq = 1; q_param <- list(mean = rep(0, d), v = varq*diag(d))

## using Method 2: simulation & estimation for each t, forgetting the past
## HPC with 2 cores here (using a parallel socket cluster,
## not available on Windows machines)
e2 <- EntropyParallel.cl(HMIS_norm, n, nmc, Ptheta0, target_norm,
                         target_norm_param, q_param,
                         cltype = "PAR_SOCK", nbnodes = 2)

## Compare these two MCMC algorithms
plot_Kblist(list(e1, e2))  # MCMC 2 (HMIS, red plot) converges faster.