Cross-Validation for Model Selection
Create baseline evaluations for binary classification
Create baseline evaluations for regression models
Create baseline evaluations
Select metrics for binomial evaluation
Generate model formulas by combining predictors
Create a confusion matrix
Cross-validate custom model functions for model selection
Cross-validate regression models for model selection
cvms: A package for cross-validating regression and classification models
Evaluate residuals from a regression task
Evaluate your model's performance
Create a list of font settings for plots
Select metrics for Gaussian evaluation
Examples of model_fn functions
Find the data points that were hardest to predict
Generate a multiclass probability tibble
Select columns with evaluation metrics and model definitions
Select metrics for multinomial evaluation
Plot a confusion matrix
Density plot for a metric
Plot ECDF for the predicted probabilities
Plot predicted probabilities
Examples of predict_fn functions
Examples of preprocess_fn functions
A set of process information object constructors
Reconstruct model formulas from results tibbles
Render Table of Contents
Select model definition columns
Simplify formula with inline functions
Create a list of settings for the sum tiles in plot_confusion_matrix()
Summarize metrics with common descriptors
Check and update hyperparameters
Validate a custom model function on a test set
Validate regression models on a test set
Cross-validate one or multiple regression and classification models and get relevant evaluation metrics in a tidy format. Validate the best model on a test set and compare it to a baseline evaluation. Alternatively, evaluate predictions from an external model. Currently supports regression and classification (binary and multiclass). Described in Chapter 5 of Jeyaraman, B. P., Olsen, L. R., & Wambugu, M. (2019, ISBN: 9781838550134).
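
Below is a minimal sketch of the typical workflow: create folds, cross-validate competing model formulas, and compare the tidy metrics. It assumes the `participant.scores` dataset bundled with cvms and the `fold()` function from the groupdata2 package; argument names follow the current CRAN documentation and may differ between package versions.

```r
# Minimal sketch of the cvms workflow (assumes the bundled
# participant.scores dataset and the groupdata2 package).
library(cvms)
library(groupdata2)  # fold() adds the ".folds" column used by cross_validate()

set.seed(1)

# Create balanced folds, keeping each participant within a single fold
data <- groupdata2::fold(
  participant.scores,
  k = 4,
  cat_col = "diagnosis",
  id_col = "participant"
)

# Cross-validate two competing linear models and get tidy evaluation metrics
cv <- cross_validate(
  data,
  formulas = c("score ~ diagnosis", "score ~ diagnosis + age"),
  family = "gaussian"
)

# Compare the model definitions on, e.g., RMSE and MAE
cv[, c("Fixed", "RMSE", "MAE")]
```

From there, `validate()` fits the chosen formula on a training partition and evaluates it on a held-out test set, while `baseline()` and its family-specific wrappers (e.g., `baseline_gaussian()`) produce the baseline evaluations to compare against.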