Compute feature importance in a model
Creates a data.table of feature importances in a model.
lgb.importance(model, percentage = TRUE)
model
: object of class lgb.Booster.
percentage
: whether to show importance in relative percentage.
For a tree model, a data.table with the following columns:
Feature
: Feature names in the model.
Gain
: The total gain of this feature's splits.
Cover
: The number of observations related to this feature.
Frequency
: The number of times this feature is used in splits across all trees.

library(lightgbm)

data(agaricus.train, package = "lightgbm")
train <- agaricus.train
dtrain <- lgb.Dataset(train$data, label = train$label)

params <- list(
    objective = "binary"
    , learning_rate = 0.1
    , max_depth = -1L
    , min_data_in_leaf = 1L
    , min_sum_hessian_in_leaf = 1.0
    , num_threads = 2L
)

model <- lgb.train(
    params = params
    , data = dtrain
    , nrounds = 5L
)

# importance shown as relative percentages
tree_imp1 <- lgb.importance(model, percentage = TRUE)

# raw, unscaled importance values
tree_imp2 <- lgb.importance(model, percentage = FALSE)
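The returned object is an ordinary data.table, so it can be inspected and ordered directly. The lines below are a minimal follow-up sketch, assuming the tree_imp1 and tree_imp2 objects from the example above and that percentage = TRUE rescales each measure to a relative share; lgb.plot.importance() is the package's plotting helper for this table, shown here purely as an illustration.

library(data.table)

# with percentage = TRUE the measures are relative shares, so Gain sums to ~1
sum(tree_imp1$Gain)

# order features by their share of total gain, most important first
tree_imp1[order(-Gain)]

# raw totals from the unscaled table
head(tree_imp2)

# the same table can be passed to the plotting helper
lgb.plot.importance(tree_imp1, top_n = 5L, measure = "Gain")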