Bayesian Deep Gaussian Processes using MCMC
Active Learning Cohn for Sequential Design
Continues MCMC sampling
Calculates CRPS
Package deepgp
MCMC sampling for one-layer GP
MCMC sampling for three-layer deep GP
MCMC sampling for two-layer deep GP
Integrated Mean-Squared (prediction) Error for Sequential Design
Plots object from deepgp package
Generates joint posterior samples from a trained GP/DGP
Predicts posterior mean and variance/covariance
Calculates RMSE
Calculates score
Calculates squared pairwise distances
Converts non-Vecchia object to its Vecchia version
Trims/thins MCMC iterations
Performs Bayesian posterior inference for deep Gaussian processes following Sauer, Gramacy, and Higdon (2023, <doi:10.48550/arXiv.2012.08015>). See Sauer (2023, <http://hdl.handle.net/10919/114845>) for comprehensive methodological details and <https://bitbucket.org/gramacylab/deepgp-ex/> for a variety of coding examples. Models are trained through MCMC, including elliptical slice sampling of latent Gaussian layers and Metropolis-Hastings sampling of kernel hyperparameters. Gradient enhancement and gradient predictions are offered following Booth (2025, <doi:10.48550/arXiv.2512.18066>). Vecchia approximation for faster computation is implemented following Sauer, Cooper, and Gramacy (2023, <doi:10.48550/arXiv.2204.02904>). Optional monotonic warpings are implemented following Barnett et al. (2025, <doi:10.48550/arXiv.2408.01540>). Downstream tasks include sequential design through active learning Cohn/integrated mean-squared error (ALC/IMSE; Sauer, Gramacy, and Higdon, 2023), optimization through expected improvement (EI; Gramacy, Sauer, and Wycoff, 2022, <doi:10.48550/arXiv.2112.07457>), and contour location through entropy (Booth, Renganathan, and Gramacy, 2025, <doi:10.48550/arXiv.2308.04420>). Models extend up to three layers deep; a one-layer model is equivalent to standard Gaussian process regression. Incorporates OpenMP and SNOW parallelization and uses C/C++ under the hood.
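
The sketch below illustrates the core workflow: fit a two-layer deep GP by MCMC, continue and trim the chain, predict at new inputs, and score a candidate grid with ALC for sequential design. The toy function, chain lengths, and burn-in/thinning settings are illustrative choices rather than recommendations, and the assumed return structures (e.g., fit$mean, fit$s2, alc$value) follow the package documentation; see ?fit_two_layer, ?predict, and ?ALC for full argument lists and return values.

library(deepgp)

# Toy one-dimensional training data (illustrative test function)
f <- function(x) sin(2 * pi * x) + 0.5 * cos(6 * pi * x)
x <- matrix(seq(0, 1, length.out = 20), ncol = 1)
y <- f(x[, 1])

# Fit a two-layer deep GP via MCMC (elliptical slice sampling of the
# latent layer, Metropolis-Hastings for kernel hyperparameters)
fit <- fit_two_layer(x, y, nmcmc = 2000)

# Optionally append more MCMC iterations, then discard burn-in and thin
fit <- continue(fit, 1000)
fit <- trim(fit, burn = 1500, thin = 2)

# Posterior predictive mean and pointwise variance at new inputs
x_new <- matrix(seq(0, 1, length.out = 100), ncol = 1)
fit <- predict(fit, x_new)
head(fit$mean)   # posterior predictive mean
head(fit$s2)     # posterior predictive variance
plot(fit)        # predictive surface with uncertainty (one-dimensional input)

# Sequential design: ALC acquisition over the candidate grid
# (alc$value assumed per the package documentation)
alc <- ALC(fit, x_new)
x_next <- x_new[which.max(alc$value), , drop = FALSE]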