Learning Rate Scheduling Callback
Changes the learning rate based on the schedule specified by a torch::lr_scheduler.
As of this writing, the following are available:
torch::lr_cosine_annealing()
torch::lr_lambda()
torch::lr_multiplicative()
torch::lr_one_cycle()
torch::lr_reduce_on_plateau()
torch::lr_step()
torch::lr_scheduler()
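For illustration, here is a minimal sketch of attaching a step-decay schedule to a torch learner via t_clbk(). The callback key "lr_step", the learner setup, and all argument values are assumptions for illustration, not taken from this page:

library(mlr3torch)

# Assumed: the step scheduler is registered under the key "lr_step";
# step_size and gamma are forwarded to torch::lr_step().
cb = t_clbk("lr_step", step_size = 10, gamma = 0.5)

learner = lrn("classif.mlp",
  epochs = 30, batch_size = 32,
  callbacks = cb
)
learner$train(tsk("iris"))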
Super class: mlr3torch::CallbackSet -> CallbackSetLRScheduler
scheduler_fn
: (lr_scheduler_generator)
The `torch` function that creates a learning rate scheduler.
scheduler
: (LRScheduler)
The learning rate scheduler wrapped by this callback.
new()
Creates a new instance of this R6 class.
CallbackSetLRScheduler$new(.scheduler, step_on_epoch, ...)
.scheduler
: (lr_scheduler_generator)
The `torch` scheduler generator (e.g. `torch::lr_step`).
step_on_epoch
: (logical(1))
Whether the scheduler steps after every epoch (otherwise after every batch).
...
: (any)
The scheduler-specific arguments, passed on to the scheduler generator.
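For example, the constructor can also be called directly with a generator from the torch package; the argument values below are illustrative only:

# Halve the learning rate every 10 epochs; step_size and gamma are
# forwarded via ... to torch::lr_step().
cb = CallbackSetLRScheduler$new(
  .scheduler = torch::lr_step,
  step_on_epoch = TRUE,
  step_size = 10,
  gamma = 0.5
)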
on_begin()
Creates the scheduler using the optimizer from the context.
CallbackSetLRScheduler$on_begin()
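Conceptually, scheduler construction is deferred until training begins, because the scheduler must wrap the optimizer, which only exists then. A hypothetical sketch (the ctx field name is an assumption, not taken from this page):

# At the start of training, bind the scheduler to the optimizer held
# by the training context, along with the ... args from $new().
self$scheduler = self$scheduler_fn(self$ctx$optimizer)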
clone()
The objects of this class are cloneable with this method.
CallbackSetLRScheduler$clone(deep = FALSE)
deep
: Whether to make a deep clone.