lightgbm 4.5.0 package

Light Gradient Boosting Machine

  • dim: Dimensions of an lgb.Dataset
  • dimnames.lgb.Dataset: Handling of column names of lgb.Dataset
  • get_field: Get one attribute of a lgb.Dataset
  • lgb.Dataset.save: Save lgb.Dataset to a binary file
  • getLGBMThreads: Get default number of threads used by LightGBM
  • lgb_shared_dataset_params: Shared Dataset parameter docs
  • lgb_shared_params: Shared parameter docs
  • lgb.configure_fast_predict: Configure Fast Single-Row Predictions
  • lgb.convert_with_rules: Data preparator for LightGBM datasets with rules (integer)
  • lgb.cv: Main CV logic for LightGBM
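
A rough sketch of how lgb.cv is typically invoked, using the agaricus demo data that ships with the package; the parameter choices here are illustrative assumptions, not recommendations:

    library(lightgbm)
    data(agaricus.train, package = "lightgbm")
    dtrain <- lgb.Dataset(agaricus.train$data, label = agaricus.train$label)
    # 3-fold cross-validation of a binary classifier; settings are illustrative
    cv <- lgb.cv(
      params = list(objective = "binary", metric = "auc"),
      data = dtrain,
      nrounds = 10L,
      nfold = 3L
    )
    cv$best_iter  # iteration with the best mean validation score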

  • lgb.Dataset.construct: Construct Dataset explicitly
  • lgb.Dataset.create.valid: Construct validation data
  • lgb.Dataset: Construct lgb.Dataset object
  • lgb.Dataset.set.categorical: Set categorical feature of lgb.Dataset
  • lgb.Dataset.set.reference: Set reference of lgb.Dataset
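
A minimal sketch of building a training Dataset and a linked validation Dataset, again using the bundled agaricus data:

    library(lightgbm)
    data(agaricus.train, package = "lightgbm")
    data(agaricus.test, package = "lightgbm")
    # wrap the raw feature matrix and labels in an lgb.Dataset
    dtrain <- lgb.Dataset(agaricus.train$data, label = agaricus.train$label)
    # validation data references the training Dataset so bin mappings match
    dvalid <- lgb.Dataset.create.valid(dtrain, agaricus.test$data, label = agaricus.test$label)
    lgb.Dataset.construct(dtrain)  # force construction explicitly (optional)
    dim(dtrain)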

  • lgb.drop_serialized: Drop serialized raw bytes in a LightGBM model object
  • lgb.dump: Dump LightGBM model to json
  • lgb.get.eval.result: Get record evaluation result from booster
  • lgb.importance: Compute feature importance in a model
  • lgb.interprete: Compute feature contribution of prediction
  • lgb.load: Load LightGBM model
  • lgb.make_serializable: Make a LightGBM object serializable by keeping raw bytes
  • lgb.model.dt.tree: Parse a LightGBM model json dump
  • lgb.plot.importance: Plot feature importance as a bar graph
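
A sketch of inspecting and plotting feature importance; it assumes `model` is a fitted lgb.Booster (such as one returned by lgb.train below), and top_n is an illustrative choice:

    # gain/cover/frequency table for each feature used by the model
    imp <- lgb.importance(model, percentage = TRUE)
    head(imp)
    # bar chart of the 10 most important features by gain
    lgb.plot.importance(imp, top_n = 10L, measure = "Gain")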

  • lgb.plot.interpretation: Plot feature contribution as a bar graph
  • lgb.restore_handle: Restore the C++ component of a de-serialized LightGBM model
  • lgb.save: Save LightGBM model
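
Saving and reloading a model might look like the following sketch; the file path is illustrative and `model` is assumed to be a fitted lgb.Booster:

    # write the model to a file in LightGBM's native text format
    lgb.save(model, filename = "lightgbm.model")
    # read it back as a new lgb.Booster
    model2 <- lgb.load(filename = "lightgbm.model")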

  • lgb.slice.Dataset: Slice a dataset
  • lgb.train: Main training logic for LightGBM
  • lightgbm: Train a LightGBM model
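
A sketch of the lower-level lgb.train interface, continuing from the dtrain/dvalid objects constructed above; the parameter values are illustrative assumptions:

    params <- list(
      objective = "binary",
      metric = "auc",
      num_leaves = 31L,
      learning_rate = 0.1
    )
    model <- lgb.train(
      params = params,
      data = dtrain,
      nrounds = 10L,
      valids = list(valid = dvalid)
    )
    # retrieve the recorded AUC values for the validation set
    lgb.get.eval.result(model, data_name = "valid", eval_name = "auc")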

  • predict.lgb.Booster: Predict method for LightGBM model
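
Prediction is the usual S3 predict() call; a sketch, assuming `model` was trained on the agaricus data above:

    # probability predictions on the test feature matrix
    preds <- predict(model, agaricus.test$data)
    # raw scores before the sigmoid transformation
    raw <- predict(model, agaricus.test$data, type = "raw")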

  • print.lgb.Booster: Print method for LightGBM model
  • set_field: Set one attribute of a lgb.Dataset object
  • setLGBMThreads: Set maximum number of threads used by LightGBM
  • summary.lgb.Booster: Summary method for LightGBM model

Tree-based algorithms can be improved by introducing boosting frameworks. 'LightGBM' is one such framework, based on Ke, Guolin et al. (2017) <https://papers.nips.cc/paper/6907-lightgbm-a-highly-efficient-gradient-boosting-decision-tree>. This package offers an R interface to work with it. It is designed to be distributed and efficient, with the following advantages:

  1. Faster training speed and higher efficiency.
  2. Lower memory usage.
  3. Better accuracy.
  4. Support for parallel learning.
  5. Capability to handle large-scale data.

In recognition of these advantages, 'LightGBM' has been widely used in many winning solutions of machine learning competitions. Comparison experiments on public datasets suggest that 'LightGBM' can outperform existing boosting frameworks on both efficiency and accuracy, with significantly lower memory consumption. In addition, parallel experiments suggest that in certain circumstances, 'LightGBM' can achieve a linear speed-up in training time by using multiple machines.
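
As a quick-start sketch of the high-level lightgbm() interface described above, using the bundled agaricus binary-classification data; the round count and parameters are illustrative:

    library(lightgbm)
    data(agaricus.train, package = "lightgbm")
    data(agaricus.test, package = "lightgbm")
    # one-call training on a feature matrix and a label vector
    model <- lightgbm(
      data = agaricus.train$data,
      label = agaricus.train$label,
      params = list(objective = "binary"),
      nrounds = 10L
    )
    preds <- predict(model, agaricus.test$data)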

  • Maintainer: James Lamb
  • License: MIT + file LICENSE
  • Last published: 2024-07-26