Deep Learning with 'mlr3'
Convert to Data Descriptor
Convert to Lazy Tensor
Convert to CallbackSetLRScheduler
Convert to a TorchCallback
Convert to a list of Torch Callbacks
Convert to TorchLoss
Convert to TorchOptimizer
Assert Lazy Tensor
Auto Device
Batchgetter for Categorical Data
Batchgetter for Numeric Data
Create a Set of Callbacks for Torch
Cross Entropy Loss
Data Descriptor
Compare lazy tensors
Infer Shapes
Ingress Token for Categorical Features
Ingress Token for Lazy Tensor Feature
Ingress Token for Numeric Features
Check for lazy tensor
Shape of Lazy Tensor
Create a lazy tensor
Materialize a Lazy Tensor
Materialize Lazy Tensor Columns
Lazy Data Backend
Checkpoint Callback
History Callback
OneCycle Learning Rate Scheduling Callback
Reduce On Plateau Learning Rate Scheduler
Learning Rate Scheduling Callback
Progress Callback
Base Class for Callbacks
TensorBoard Logging Callback
Unfreezing Weights Callback
Context for Torch Learner
Image Learner
Learner Torch Model
Base Class for Torch Learners
FT-Transformer
Multi Layer Perceptron
Learner Torch Module
Tabular ResNet
Featureless Torch Learner
AlexNet Image Classifier
Center Crop Augmentation
Color Jitter Augmentation
Crop Augmentation
Horizontal Flip Augmentation
Random Affine Augmentation
Random Choice Augmentation
Random Crop Augmentation
Random Horizontal Flip Augmentation
Random Order Augmentation
Random Resized Crop Augmentation
Random Vertical Flip Augmentation
Resized Crop Augmentation
Rotate Augmentation
Vertical Flip Augmentation
Class for Torch Module Wrappers
1D Adaptive Average Pooling
2D Adaptive Average Pooling
3D Adaptive Average Pooling
1D Average Pooling
2D Average Pooling
3D Average Pooling
1D Batch Normalization
2D Batch Normalization
3D Batch Normalization
Block Repetition
CELU Activation Function
Transpose 1D Convolution
Transpose 2D Convolution
Transpose 3D Convolution
1D Convolution
2D Convolution
3D Convolution
Dropout
ELU Activation Function
Flattens a Tensor
Custom Function
CLS Token for FT-Transformer
Single Transformer Block for the FT-Transformer
GeGLU Activation Function
GELU Activation Function
GLU Activation Function
Hard Shrink Activation Function
Hard Sigmoid Activation Function
Hard Tanh Activation Function
Output Head
Identity Layer
Layer Normalization
Leaky ReLU Activation Function
Linear Layer
Log Sigmoid Activation Function
1D Max Pooling
2D Max Pooling
3D Max Pooling
Merge by Concatenation
Merge by Product
Merge by Summation
Merge Operation
PReLU Activation Function
ReGLU Activation Function
ReLU Activation Function
ReLU6 Activation Function
Reshape a Tensor
RReLU Activation Function
SELU Activation Function
Sigmoid Activation Function
Softmax
SoftPlus Activation Function
Soft Shrink Activation Function
SoftSign Activation Function
Squeeze a Tensor
Tanh Activation Function
Tanh Shrink Activation Function
Threshold Activation Function
Categorical Tokenizer
Numeric Tokenizer
Unsqueeze a Tensor
Base Class for Lazy Tensor Preprocessing
Callback Configuration
Torch Entry Point for Categorical Features
Ingress for Lazy Tensor
Torch Entry Point for Numeric Features
Entrypoint to Torch Network
Loss Configuration
PipeOp Torch Classifier
Torch Regression Model
PipeOp Torch Model
Optimizer Configuration
Base Class for Torch Module Constructor Wrappers
Adjust Brightness Transformation
Adjust Gamma Transformation
Adjust Hue Transformation
Adjust Saturation Transformation
Grayscale Transformation
Normalization Transformation
Padding Transformation
Resizing Transformation
RGB to Grayscale Transformation
CIFAR Classification Tasks
Iris Classification Task
Melanoma Image Classification
MNIST Image Classification
Tiny ImageNet Classification Task
Dictionary of Torch Callbacks
Loss Functions
Optimizers
mlr3torch: Deep Learning with 'mlr3'
Create a Torch Learner from a ModelDescriptor
Create a nn_graph from ModelDescriptor
Union of ModelDescriptors
Represent a Model with Meta-Info
CLS Token for FT-Transformer
Single Transformer Block for FT-Transformer
GeGLU Module
Graph Network
Concatenates multiple tensors
Product of multiple tensors
Sum of multiple tensors
ReGLU Module
Reshape
Squeeze
Categorical Tokenizer
Numeric Tokenizer
Unsqueeze
Create a Neural Network Layer
Network Output Dimension
Create Torch Preprocessing PipeOps
No Transformation
Reshaping Transformation
Replace the Head of a Network
Selector Functions for Character Vectors
Sugar Function for Torch Callback
Loss Function Quick Access
Optimizers Quick Access
Create a Dataset from a Task
Create a Callback Descriptor
Torch Callback
Base Class for Torch Descriptors
Torch Ingress Token
Torch Loss
Torch Optimizer
Deep learning library that extends the mlr3 framework by building upon the 'torch' package. It allows users to conveniently build, train, and evaluate deep learning models without having to worry about low-level details. Custom architectures can be created using the graph language defined in 'mlr3pipelines'.
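A minimal sketch of the intended workflow, assuming the `classif.mlp` learner listed above ("Multi Layer Perceptron") and its `epochs`, `batch_size`, and `neurons` hyperparameters; the concrete values here are illustrative, not recommendations:

```r
library(mlr3)
library(mlr3torch)

# Construct a multi-layer perceptron classifier as an mlr3 learner.
learner = lrn("classif.mlp",
  epochs = 10,          # number of training epochs
  batch_size = 32,      # minibatch size
  neurons = c(64, 64)   # two hidden layers with 64 units each
)

# Train and evaluate on a built-in task, using the standard mlr3 API.
task = tsk("iris")
learner$train(task)
learner$predict(task)$score(msr("classif.acc"))
```

Because the learner follows the standard mlr3 interface, it can be combined with 'mlr3pipelines' graphs, resampling, and tuning in the usual way.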
Useful links