bhetgp-package {bhetGP}    R Documentation
Package bhetGP
Description
Performs Bayesian posterior inference for heteroskedastic Gaussian processes (hetGP; Binois, M., Gramacy, R., and Ludkovski, M. (2018) <doi:10.48550/arXiv.1611.05902>). Models are trained through MCMC, including elliptical slice sampling (ESS) of latent noise processes and Metropolis-Hastings sampling of kernel hyperparameters. For large data, the Vecchia approximation is leveraged for faster computation (Sauer, A., Cooper, A., and Gramacy, R. (2023) <doi:10.48550/arXiv.2204.02904>). Incorporates 'OpenMP' and SNOW parallelization and utilizes 'C'/'C++' under the hood.
Important Functions
- bhetGP: conducts MCMC sampling of hyperparameters and the latent noise layer for a heteroskedastic GP.
- bhomGP: conducts MCMC sampling of hyperparameters for a homoskedastic GP.
- trim: cuts off burn-in and optionally thins samples.
- predict: calculates posterior mean and variance over a set of input locations (optionally calculates EI or entropy).
- plot: produces trace plots, hidden layer plots, and posterior predictive plots.
Author(s)
Parul Vijay Patil parulvijay@vt.edu
References
Binois, M., Gramacy, R. B., and Ludkovski, M. (2018). Practical heteroskedastic Gaussian process modeling for large simulation experiments. Journal of Computational and Graphical Statistics, 27(4), 808-821.
Katzfuss, M., Guinness, J., and Lawrence, E. (2022). Scaled Vecchia approximation for fast computer-model emulation. SIAM/ASA Journal on Uncertainty Quantification, 10(2), 537-554.
Sauer, A., Cooper, A., and Gramacy, R. B. (2023). Vecchia-approximated deep Gaussian processes for computer experiments. Journal of Computational and Graphical Statistics, 32(3), 824-837.
Examples
# More examples including real-world computer experiments are available at:
# https://bitbucket.org/gramacylab/bhgp/src/main/examples
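
# A minimal workflow sketch on a toy 1-d heteroskedastic problem.
# Argument names used below (nmcmc, burn, thin) are assumptions following
# conventions of related packages; consult ?bhetGP, ?trim, and ?predict
# for the actual interfaces.

# toy simulator: smooth mean with input-dependent noise
f <- function(x) 2 * sin(4 * pi * x)
x <- matrix(runif(200), ncol = 1)
y <- f(x) + rnorm(nrow(x), sd = 0.1 + 0.4 * x[, 1])  # noise grows with x

# fit the heteroskedastic GP via MCMC (nmcmc is an assumed argument name)
fit <- bhetGP(x, y, nmcmc = 2000)

# discard burn-in and thin the chains (burn/thin are assumed argument names)
fit <- trim(fit, burn = 1000, thin = 2)

# posterior mean and variance at new input locations
xp <- matrix(seq(0, 1, length.out = 100), ncol = 1)
fit <- predict(fit, xp)

# trace plots and posterior predictive plots
plot(fit)

# a homoskedastic fit follows the same pattern, e.g. bhomGP(x, y, nmcmc = 2000)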