Deep Learning with 'mlr3'
Convert to Data Descriptor
Unfreezing Weights Callback
Iris Classification Task
Melanoma Image Classification
Featureless Torch Learner
Center Crop Augmentation
Adjust Gamma Transformation
Color Jitter Augmentation
Adjust Hue Transformation
Crop Augmentation
Horizontal Flip Augmentation
Adjust Saturation Transformation
Grayscale Transformation
Normalization Transformation
Random Affine Augmentation
Random Choice Augmentation
Padding Transformation
Resizing Transformation
RGB to Grayscale Transformation
CIFAR Classification Tasks
Random Crop Augmentation
Random Horizontal Flip Augmentation
Random Order Augmentation
Random Resized Crop Augmentation
Random Vertical Flip Augmentation
Resized Crop Augmentation
Convert to Lazy Tensor
Convert to CallbackSetLRScheduler
Convert to a TorchCallback
Convert to a list of Torch Callbacks
Convert to TorchLoss
Convert to TorchOptimizer
Assert Lazy Tensor
Auto Device
Batchgetter for Categorical Data
Batchgetter for Numeric Data
Create a Set of Callbacks for Torch
Data Descriptor
Compare lazy tensors
Check for lazy tensor
Create a lazy tensor
Materialize a Lazy Tensor
Materialize Lazy Tensor Columns
Lazy Data Backend
Checkpoint Callback
History Callback
Learning Rate Scheduling Callback
Progress Callback
Base Class for Callbacks
TensorBoard Logging Callback
Context for Torch Learner
Image Learner
Learner Torch Model
Base Class for Torch Learners
My Little Pony / Multi Layer Perceptron
Tabular ResNet
AlexNet Image Classifier
Class for Torch Module Wrappers
1D Adaptive Average Pooling
2D Adaptive Average Pooling
3D Adaptive Average Pooling
1D Average Pooling
2D Average Pooling
3D Average Pooling
1D Batch Normalization
2D Batch Normalization
Rotate Augmentation
3D Batch Normalization
Block Repetition
CELU Activation Function
Transpose 1D Convolution
Transpose 2D Convolution
Transpose 3D Convolution
1D Convolution
2D Convolution
3D Convolution
Vertical Flip Augmentation
Dropout
ELU Activation Function
Flattens a Tensor
GELU Activation Function
Adjust Brightness Transformation
GLU Activation Function
Hard Shrink Activation Function
Hard Sigmoid Activation Function
Hard Tanh Activation Function
Log Sigmoid Activation Function
Output Head
Layer Normalization
Leaky ReLU Activation Function
Linear Layer
1D Max Pooling
2D Max Pooling
3D Max Pooling
Merge by Concatenation
Merge by Product
Merge by Summation
Merge Operation
PReLU Activation Function
ReLU Activation Function
ReLU6 Activation Function
Reshape a Tensor
RReLU Activation Function
Optimizer Configuration
SELU Activation Function
Sigmoid Activation Function
Softmax
SoftPlus Activation Function
Base Class for Torch Module Constructor Wrappers
Soft Shrink Activation Function
SoftSign Activation Function
Squeeze a Tensor
Tanh Activation Function
Tanh Shrink Activation Function
Threshold Activation Function
Unsqueeze a Tensor
Base Class for Lazy Tensor Preprocessing
Callback Configuration
Torch Entry Point for Categorical Features
Ingress for Lazy Tensor
Torch Entry Point for Numeric Features
Entrypoint to Torch Network
Loss Configuration
PipeOp Torch Classifier
Torch Regression Model
PipeOp Torch Model
MNIST Image Classification
Tiny ImageNet Classification Task
Dictionary of Torch Callbacks
Loss Functions
Optimizers
mlr3torch: Deep Learning with 'mlr3'
Create a Torch Learner from a ModelDescriptor
Create a nn_graph from ModelDescriptor
Union of ModelDescriptors
Represent a Model with Meta-Info
Graph Network
Concatenates multiple tensors
Product of multiple tensors
Sum of multiple tensors
Reshape
Squeeze
Unsqueeze
Create a Neural Network Layer
Create Torch Preprocessing PipeOps
No Transformation
Reshaping Transformation
Replace the Head of a Network
Selector Functions for Character Vectors
Sugar Function for Torch Callback
Loss Function Quick Access
Optimizers Quick Access
Create a Dataset from a Task
Create a Callback Descriptor
Torch Callback
Base Class for Torch Descriptors
Torch Ingress Token
Torch Loss
Torch Optimizer
Deep learning library that extends the 'mlr3' framework by building upon the 'torch' package. It allows you to conveniently build, train, and evaluate deep learning models without having to worry about low-level details. Custom architectures can be created using the graph language defined in 'mlr3pipelines'.
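As a brief illustration of the workflow described above (a minimal sketch; the learner key `classif.mlp` and the parameter names `epochs`, `batch_size`, and `neurons` are assumed from the package's learner dictionary and may differ across versions):

```r
library(mlr3)
library(mlr3torch)

# Define a multi-layer perceptron learner with basic training settings
learner = lrn("classif.mlp",
  epochs = 10,         # number of training epochs
  batch_size = 32,     # minibatch size
  neurons = c(64, 64)  # two hidden layers with 64 units each
)

# Train and evaluate on the built-in iris classification task
task = tsk("iris")
learner$train(task)
prediction = learner$predict(task)
prediction$score(msr("classif.acc"))
```

More complex architectures can be assembled from the `PipeOpTorch` operators listed above, connected with `%>>%` as in any 'mlr3pipelines' graph.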