Gradient Boosting
Compute an Upper Bound on the Second Derivative of the Loss
Internal Function
Boosting for Classification and Regression
Function to Select the Number of Predictors
Control Parameters for Boosting
Cross-Validation for Boosting
Cross-Validation for One-vs-All AdaBoost on Multi-class Problems
Cross-Validation for Multi-class Boosting
Cross-Validation for Multi-class Hinge Boosting
Cross-Validation for One-vs-All HingeBoost on Multi-class Problems
Cross-Validation for Nonconvex Loss Boosting
Cross-Validation for Nonconvex Multi-class Loss Boosting
Compute Prediction Errors
Generating Three-class Data with 50 Predictors
Multi-class AdaBoost
Boosting for Multi-class Classification Problems
Boosting for Multi-class Classification
Multi-class HingeBoost
Find the Number of Variables in Multi-class Boosting Iterations
Robust Boosting for Robust Loss Functions
Robust Boosting Path for Nonconvex Loss Functions
Robust Boosting for Multi-class Robust Loss Functions
A functional gradient descent algorithm for a variety of convex and nonconvex loss functions, for both classical and robust regression and classification problems. See Wang (2011) <doi:10.2202/1557-4679.1304>, Wang (2012) <doi:10.3414/ME11-02-0020>, Wang (2018a) <doi:10.1080/10618600.2018.1424635>, and Wang (2018b) <doi:10.1214/18-EJS1404>.
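To illustrate the idea behind the description above, the following is a minimal sketch (not the package's implementation) of functional gradient descent boosting with squared-error loss and componentwise least-squares base learners; the function and variable names are illustrative assumptions, not part of the package API.

```r
# Componentwise L2-boosting sketch: at each iteration, fit every single
# predictor to the negative gradient (here, the residuals) and take a small
# step nu along the best-fitting one.
boost_ls <- function(x, y, mstop = 100, nu = 0.1) {
  x <- as.matrix(x)
  f <- rep(mean(y), length(y))        # start from the constant model
  offset <- f[1]
  coefs <- numeric(ncol(x))
  for (m in seq_len(mstop)) {
    u <- y - f                        # negative gradient of 0.5 * (y - f)^2
    # residual sum of squares for a no-intercept fit on each predictor
    rss <- sapply(seq_len(ncol(x)), function(j) {
      b <- sum(x[, j] * u) / sum(x[, j]^2)
      sum((u - b * x[, j])^2)
    })
    j <- which.min(rss)               # best single predictor this round
    b <- sum(x[, j] * u) / sum(x[, j]^2)
    coefs[j] <- coefs[j] + nu * b     # shrink the update by nu
    f <- f + nu * b * x[, j]
  }
  list(offset = offset, coef = coefs, fitted = f)
}

# Illustrative use: the training error should fall below that of the
# constant (intercept-only) model.
set.seed(1)
x <- matrix(rnorm(200 * 5), 200, 5)
y <- 2 * x[, 1] - x[, 3] + rnorm(200, sd = 0.1)
fit <- boost_ls(x, y, mstop = 200)
mean((y - fit$fitted)^2)
```

Swapping in another loss only changes the negative-gradient line `u`; the robust and multi-class variants listed above follow the same template with nonconvex or multi-class losses and their own step rules.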