Tools for Post-Processing Predicted Values
Add a class_pred column
Coerce to a class_pred object
Truncate a numeric prediction column
Applies a calibration to a set of existing predictions
Probability calibration table
Uses a Beta calibration model to calculate new probabilities
Uses a bootstrapped Isotonic regression model to calibrate probabilities
Uses an Isotonic regression model to calibrate model predictions
Uses a linear regression model to calibrate numeric predictions
Uses a logistic regression model to calibrate probabilities
Uses a Multinomial calibration model to calculate new probabilities
Do not calibrate model predictions
Probability calibration plots via binning
Probability calibration plots via logistic regression
Regression calibration plots
Probability calibration plots via moving windows
Measure performance with and without using Beta calibration
Measure performance with and without using bagged isotonic regression calibration
Measure performance with and without using isotonic regression calibration
Measure performance with and without using linear regression calibration
Measure performance with and without using logistic calibration
Measure performance with and without using multinomial calibration
Measure performance without using calibration
Create a class prediction object
Obtain and format metrics produced by calibration validation
Obtain and format predictions produced by calibration validation
Controlling the numeric details for conformal inference
Butcher methods for conformal inference intervals
Prediction intervals via conformal inference CV+
Prediction intervals via conformal inference
Prediction intervals via conformal inference and quantile regression
Prediction intervals via split conformal inference
Test if an object inherits from class_pred
Extract class_pred levels
Locate equivocal values
Create a class_pred vector from class probabilities
Prediction intervals from conformal methods
probably: Tools for Post-Processing Predicted Values
Objects exported from other packages
Calculate the reportable rate
S3 methods to track which additional packages are needed for specific calibrations
S3 methods to track which additional packages are needed for prediction intervals
Generate performance metrics across probability thresholds
Models can be improved by post-processing class probabilities via recalibration, conversion of probabilities to hard class predictions, assessment of equivocal zones, and other operations. 'probably' contains tools for conducting these operations, as well as calibration tools and conformal inference techniques for regression models.
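As a minimal sketch of the equivocal-zone workflow described above (assuming the probably and dplyr packages are installed; `segment_logistic` is an example data set of class probabilities shipped with probably):

```r
library(probably)
library(dplyr)

data(segment_logistic)  # predicted probabilities plus the true Class column

segment_logistic %>%
  mutate(
    # make_two_class_pred() converts probabilities to hard class predictions;
    # a buffer of 0.05 marks probabilities in [0.45, 0.55] as equivocal
    .pred = make_two_class_pred(
      estimate = .pred_good,
      levels = levels(Class),
      buffer = 0.05
    )
  ) %>%
  count(.pred)
```

Rows flagged as equivocal are excluded from hard class counts, which is what the reportable-rate tooling above summarizes.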
Useful links