A Common API to Modeling and Analysis Functions
Functions required for parsnip-adjacent packages
Add a column of row numbers to a data frame
Augment data with predictions
Automatic Machine Learning
Create a ggplot for a model object
Ensembles of MARS models
Ensembles of neural networks
Ensembles of decision trees
Developer functions for predictions via BART models
Bayesian additive regression trees (BART)
Boosted trees
C5.0 rule-based classification models
Boosted trees via C5.0
Determine if case weights are used
Using case weights with parsnip
Calculations for inverse probability of censoring weights (IPCW)
Check to ensure that ellipses are empty
Condense control object into strictly smaller control object
Control the fit function
Helper functions to convert between formula and matrix interface
Convenience function for intervals
A wrapper function for conditional inference tree models
Cubist rule-based regression models
Decision trees
Data Set Characteristics Available when Fitting Models
Automatic machine learning via h2o
Bagged MARS via earth
Bagged neural networks via nnet
Bagged trees via C5.0
Bagged trees via rpart
Bayesian additive regression trees via dbarts
Boosted trees via C5.0
Boosted trees via h2o
Boosted trees via lightgbm
Boosted trees via mboost
Boosted trees via Spark
Boosted trees via xgboost
C5.0 rule-based classification models
Cubist rule-based regression models
Decision trees via C5.0
Decision trees via partykit
Decision trees via CART
Decision trees via Spark
Flexible discriminant analysis via earth
Linear discriminant analysis via MASS
Linear discriminant analysis via flexible discriminant analysis
Linear discriminant analysis via James-Stein-type shrinkage estimation
Linear discriminant analysis via regularization
Quadratic discriminant analysis via MASS
Quadratic discriminant analysis via regularization
Regularized discriminant analysis via klaR
Generalized additive models via mgcv
Linear regression via brulee
Linear regression via generalized estimating equations (GEE)
Linear regression via glm
Linear regression via generalized mixed models
Linear regression via glmnet
Linear regression via generalized least squares
Linear regression via h2o
Linear regression via keras/tensorflow
Linear regression via lm
Linear regression via mixed models
Linear regression via mixed models
Linear quantile regression via the quantreg package
Linear regression via spark
Linear regression via hierarchical Bayesian methods
Linear regression via Bayesian methods
Logistic regression via brulee
Logistic regression via generalized estimating equations (GEE)
Logistic regression via glm
Logistic regression via mixed models
Logistic regression via glmnet
Logistic regression via h2o
Logistic regression via keras
Logistic regression via LiblineaR
Logistic regression via spark
Logistic regression via hierarchical Bayesian methods
Logistic regression via stan
Multivariate adaptive regression splines (MARS) via earth
Multilayer perceptron via brulee with two hidden layers
Multilayer perceptron via brulee
Multilayer perceptron via h2o
Multilayer perceptron via keras
Multilayer perceptron via nnet
Multinomial regression via brulee
Multinomial regression via glmnet
Multinomial regression via h2o
Multinomial regression via keras
Multinomial regression via nnet
Multinomial regression via spark
Naive Bayes models via h2o
Naive Bayes models via klaR
Naive Bayes models via naivebayes
K-nearest neighbors via kknn
Partial least squares via mixOmics
Poisson regression via generalized estimating equations (GEE)
Poisson regression via glm
Poisson regression via mixed models
Poisson regression via glmnet
Poisson regression via h2o
Poisson regression via pscl
Poisson regression via hierarchical Bayesian methods
Poisson regression via stan
Poisson regression via pscl
Proportional hazards regression
Proportional hazards regression
Oblique random survival forests via aorsf
Generalized random forests via grf
Random forests via h2o
Random forests via partykit
Random forests via randomForest
Random forests via ranger
Random forests via spark
RuleFit models via h2o
RuleFit models via xrf
Parametric survival regression
Flexible parametric survival regression
Parametric survival regression
Linear support vector machines (SVMs) via kernlab
Linear support vector machines (SVMs) via LiblineaR
Polynomial support vector machines (SVMs) via kernlab
Radial basis function support vector machines (SVMs) via kernlab
Flexible discriminant analysis
Linear discriminant analysis
Quadratic discriminant analysis
Regularized discriminant analysis
Tools for documenting engines
Extract survival status
Extract survival time
Obtain names of prediction columns for a fitted model or workflow
Translate names of model tuning parameters
Evaluate parsnip model arguments
Model Specification Checking
Extract elements of a parsnip model object
Control the fit function
Fit a Model Specification to a Dataset
Internal functions that format predictions
Generalized additive models (GAMs)
Working with the parsnip model environment
Construct a single row summary "glance" of a model, fit, or other object
Fit a grouped binomial outcome from a data set with case weights
Organize glmnet predictions
Helper functions for checking the penalty of glmnet models
Technical aspects of the glmnet model
Tools for models that predict on sub-models
Activation functions for neural networks in keras
Simple interface to MLP models via keras
Wrapper for keras class predictions
Knit engine-specific documentation
Linear regression
Locate and show errors/warnings in engine-specific documentation
Logistic regression
Make a parsnip call expression
Prepend a new class
Multivariate adaptive regression splines (MARS)
Reformat quantile predictions
Determine largest value of mtry from formula
Fuzzy conversions
Execution-time data dimension checks
Single layer neural network
Model Fit Objects
Formulas with special terms in tidymodels
Print helper for model objects
Model Specifications
Model predictions across many sub-models
Multinomial regression
Naive Bayes models
K-nearest neighbors
Null model
Fit a simple, non-informative model
Other predict methods
Start an RStudio Addin that can write model specifications
Updating a model specification
parsnip
Partial least squares (PLS)
Poisson regression models
Model predictions
Prepare data based on parsnip encoding information
Proportional hazards regression
Random forest
Objects exported from other packages
Repair a model call object
Determine required packages for a model
Determine required packages for a model
RuleFit models
Change elements of a model specification
Declare a computational engine and specific arguments
Tools to Register Models
Set seed in R and TensorFlow at the same time
Print the model call
Display currently available engines for a model
Using sparse data with parsnip
Wrapper for stan confidence intervals
Parametric survival regression
Parametric survival regression
Linear support vector machines
Polynomial support vector machines
Radial basis function support vector machines
tidy methods for glmnet models
tidy methods for LiblineaR models
Turn a parsnip model object into a tidy tibble
Tidy method for null models
Resolve a Model Specification for a Computational Engine
Succinct summary of parsnip object
Save information about models
Determine varying arguments
A placeholder function for argument values
Boosted trees via xgboost
parsnip provides a common interface for specifying a model without having to remember the different argument names used across modeling functions or computational engines (e.g. 'R', 'Spark', 'Stan', 'H2O').
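As a minimal sketch of that common interface (the random forest example, the ranger and randomForest engines, and the mtcars data below are illustrative choices, not drawn from this index), the same specification can be fitted with different engines by changing only set_engine():

    library(parsnip)

    # One model specification, written with parsnip's standardized argument
    # names (trees, mtry) rather than each engine's own names (e.g. num.trees, ntree).
    rf_spec <- rand_forest(mode = "regression", trees = 500, mtry = 2)

    # The same specification can target different computational engines;
    # only set_engine() changes, not the model definition itself.
    # (Requires the ranger and randomForest packages to be installed.)
    rf_ranger <- set_engine(rf_spec, "ranger")
    rf_rf     <- set_engine(rf_spec, "randomForest")

    # Fitting uses the usual formula interface; mtcars is just an example data set.
    fit_ranger <- fit(rf_ranger, mpg ~ ., data = mtcars)
    fit_rf     <- fit(rf_rf, mpg ~ ., data = mtcars)

    # predict() returns predictions in a standardized tibble, regardless of engine.
    predict(fit_ranger, new_data = head(mtcars))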