Particle Learning of Gaussian Processes
Add data to pall
Supply GP data to PL
Metropolis-Hastings draw for GP parameters
2-d Exponential Hessian Data
Initialize particles for GPs
Log-Predictive Probability Calculation for GPs
Extending apply to particles
Extract parameters from GP particles
Particle Learning Skeleton Method
Internal plgp Functions
Particle Learning of Gaussian Processes (package overview)
Prediction for GPs
Generate priors for GP models
PL propagate rule for GPs
Un/Scale data in a bounding rectangle
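
Concretely, the entries above are the building blocks of a particle-learning run: initialize a particle cloud, weight and resample the particles by the predictive probability of each new observation (computed on the log scale), and then propagate each particle with a Metropolis-Hastings draw of its GP parameters. The toy below is written from scratch for illustration only (all names are invented here, the nugget is held fixed, and nothing is copied from the package's source); it sketches that resample/propagate cycle for a 1-d GP with an unknown isotropic lengthscale under an assumed exponential prior.

## toy sketch, not plgp code: PL for a 1-d GP with unknown lengthscale d
set.seed(1)

## squared-exponential correlation between two sets of 1-d inputs
corrmat <- function(X1, X2, d) exp(-outer(X1, X2, function(a, b) (a - b)^2) / d)

## GP log marginal likelihood of Y at X given lengthscale d and fixed nugget g
loglik <- function(X, Y, d, g = 0.01) {
  cK <- chol(corrmat(X, X, d) + diag(g, length(X)))
  alpha <- backsolve(cK, forwardsolve(t(cK), Y))
  -0.5 * sum(Y * alpha) - sum(log(diag(cK))) - 0.5 * length(Y) * log(2 * pi)
}

## log predictive probability of a new observation (x, y) given data so far
lpred <- function(x, y, X, Y, d, g = 0.01) {
  K <- corrmat(X, X, d) + diag(g, length(X))
  kx <- corrmat(X, x, d)
  m <- drop(crossprod(kx, solve(K, Y)))
  s2 <- 1 + g - drop(crossprod(kx, solve(K, kx)))
  dnorm(y, m, sqrt(s2), log = TRUE)
}

## simulated data stream: noisy observations of a smooth function
Xall <- runif(50)
Yall <- sin(2 * pi * Xall) + rnorm(50, sd = 0.1)

## initialize P particles (each is just a lengthscale here) from the prior
P <- 100
d <- rexp(P, rate = 2)

## seed the shared data pool with a small starting design
n0 <- 5
X <- Xall[1:n0]; Y <- Yall[1:n0]

for (i in (n0 + 1):length(Xall)) {
  x <- Xall[i]; y <- Yall[i]

  ## (1) resample: weights are the predictive probabilities of the new point
  lw <- sapply(d, function(di) lpred(x, y, X, Y, di))
  d <- d[sample.int(P, P, replace = TRUE, prob = exp(lw - max(lw)))]

  ## (2) absorb the new observation into the shared data pool
  X <- c(X, x); Y <- c(Y, y)

  ## (3) propagate: one Metropolis-Hastings step on each particle's lengthscale
  d <- sapply(d, function(di) {
    dp <- di * exp(rnorm(1, sd = 0.25))                   # random walk on log(d)
    la <- loglik(X, Y, dp) - loglik(X, Y, di) +           # likelihood ratio
      dexp(dp, 2, log = TRUE) - dexp(di, 2, log = TRUE) + # exponential prior ratio
      log(dp) - log(di)                                   # Jacobian of the log-scale proposal
    if (log(runif(1)) < la) dp else di
  })
}

## the particle cloud approximates the lengthscale posterior after 50 points;
## predictions would average each particle's GP predictive, roughly the role of
## the "Prediction for GPs" and "Extending apply to particles" entries above
summary(d)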
Performs sequential Monte Carlo (SMC) inference for fully Bayesian Gaussian process (GP) regression and classification models by particle learning (PL), following Gramacy & Polson (2011) <arXiv:0909.5262>. The sequential nature of the inference, together with the active learning (AL) hooks provided, facilitates thrifty sequential design (by entropy) for classification models and optimization (by improvement) for regression models. The package provides a generic PL interface together with functions (supplied as arguments to that interface) which implement the GP models and AL heuristics. Functions for a special, linked regression/classification GP model and an integrated expected conditional improvement (IECI) statistic enable optimization in the presence of unknown constraints. Separable and isotropic Gaussian correlation functions are supported, as well as single-index correlation functions. See the examples section of ?plgp and demo(package="plgp") for an index of demos.
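
As the description explains, PL() is a generic skeleton and the GP model enters only through the functions passed to it as arguments. The lines below show the entry points named in the description for finding worked examples; the commented-out call after them is a hypothetical outline of that wiring, with argument names and the *.GP helper signatures assumed for illustration (following the naming pattern in the Gramacy & Polson paper) rather than taken from the package documentation, so consult ?plgp, ?PL, and the demos before adapting it.

library(plgp)

## entry points named in the description above
## ?plgp                   # examples section of the package help page
demo(package = "plgp")     # index of the demos shipped with the package

## Hypothetical outline only -- the argument names below are assumptions,
## not the documented signature of PL(); see ?PL and the demos for real usage.
## out <- PL(dstream   = data.GP,        # supplies the next (x, y) pair(s)
##           start     = 10, end = 100,  # range of time steps to process
##           init      = init.GP,        # draw the initial particle cloud
##           lpredprob = lpredprob.GP,   # per-particle resampling weights
##           propagate = propagate.GP,   # MH update of each particle's GP parameters
##           prior     = prior.GP(2),    # prior specification for 2-d inputs
##           params    = params.GP)      # extract parameter traces for monitoring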