mlr_learners.mlp function

My Little Pony

Fully connected feed-forward network with dropout after each activation function. The features can either be a single lazy_tensor or one or more numeric columns (but not both).

Dictionary

This Learner can be instantiated using the sugar function lrn():

lrn("classif.mlp", ...)
lrn("regr.mlp", ...)

Properties

  • Supported task types: 'classif', 'regr'

  • Predict Types:

    • classif: 'response', 'prob'
    • regr: 'response'
  • Feature Types: integer , numeric , lazy_tensor

  • Required Packages: mlr3, mlr3torch, torch

Parameters

Parameters from LearnerTorch, as well as:

  • activation :: [nn_module]

    The activation function. Initialized to nn_relu.

  • activation_args :: named list()

    A named list with initialization arguments for the activation function. This is initialized to an empty list.

  • neurons :: integer()

    The number of neurons per hidden layer. By default there is no hidden layer. Setting this to c(10, 20) would create a first hidden layer with 10 neurons and a second with 20.

  • n_layers :: integer()

    The number of layers. This parameter must only be set when neurons has length 1.

  • p :: numeric(1)

    The dropout probability. Is initialized to 0.5.

  • shape :: integer() or NULL

    The input shape of length 2, e.g. c(NA, 5). Only needs to be present when there is a lazy tensor input with unknown shape (NULL). Otherwise the input shape is inferred from the number of numeric features.
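The parameters above can be set directly when constructing the learner. A minimal sketch, assuming the mlr3torch sugar function lrn() and the torch activation module nn_tanh (the specific values here are illustrative, not defaults):

```r
library(mlr3torch)

# Sketch: an MLP with two hidden layers, tanh activations, and 30% dropout
learner = lrn("classif.mlp",
  activation      = nn_tanh,    # swap the default nn_relu for tanh
  activation_args = list(),     # extra constructor arguments for the activation
  neurons         = c(32, 16),  # first hidden layer: 32 neurons, second: 16
  p               = 0.3         # dropout probability after each activation
)
```

Because neurons here has length greater than 1, n_layers must not be set; it is only valid when neurons has length 1, in which case that single value is repeated n_layers times.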

Examples

# Define the Learner and set parameter values
learner = lrn("classif.mlp")
learner$param_set$set_values(
  epochs = 1, batch_size = 16, device = "cpu", neurons = 10
)

# Define a Task
task = tsk("iris")

# Create train and test set
ids = partition(task)

# Train the learner on the training ids
learner$train(task, row_ids = ids$train)

# Make predictions for the test rows
predictions = learner$predict(task, row_ids = ids$test)

# Score the predictions
predictions$score()

References

Gorishniy Y, Rubachev I, Khrulkov V, Babenko A (2021). "Revisiting Deep Learning Models for Tabular Data." arXiv, 2106.11959.

See Also

Other Learner: mlr_learners.tab_resnet, mlr_learners.torch_featureless, mlr_learners_torch, mlr_learners_torch_image, mlr_learners_torch_model

Super classes

mlr3::Learner -> mlr3torch::LearnerTorch -> LearnerTorchMLP

Methods

Public methods

Method new()

Creates a new instance of this R6 class.

Usage

LearnerTorchMLP$new(
  task_type,
  optimizer = NULL,
  loss = NULL,
  callbacks = list()
)

Arguments

  • task_type: (character(1))

     The task type, either `"classif"` or `"regr"`.
    
  • optimizer: (TorchOptimizer)

     The optimizer to use for training. By default, **adam** is used.
    
  • loss: (TorchLoss)

     The loss used to train the network. By default, **mse** is used for regression and **cross_entropy** for classification.
    
  • callbacks: (list() of TorchCallbacks)

     The callbacks. Must have unique ids.
    

Method clone()

The objects of this class are cloneable with this method.

Usage

LearnerTorchMLP$clone(deep = FALSE)

Arguments

  • deep: Whether to make a deep clone.