Higher-Level Interface of 'torch' Package to Auto-Train Neural Networks
Activation Functions Specification Helper
Activation Function Arguments Helper
Tunable hyperparameters for kindling models
Extract depth parameter values from n_hlayer argument
FFNN Implementation
Depth-Aware Grid Generation for Neural Networks
Base models for Neural Network Training in kindling
Base model wrappers for tidymodels
Variable Importance Methods for kindling Models
{kindling}: Higher-Level Interface of 'torch' Package to Auto-Train Neural Networks
Register kindling engines with parsnip
Multi-Layer Perceptron (Feedforward Neural Network) via kindling
Functions to generate nn_module (language) expression
Ordinal Suffixes Generator
Predict method for kindling basemodel fits
Prepare arguments for kindling models
Print method for ffnn_fit objects
Print method for rnn_fit objects
Objects exported from other packages
RNN Implementation
Recurrent Neural Network via kindling
Safe sampling function
Summarize and Display a Two-Column Data Frame as a Formatted Table
Validate device and get default device
Provides a higher-level interface to the 'torch' package for defining, training, and fine-tuning neural networks, including their depth, powered by code generation. The package currently supports feedforward (multi-layer perceptron) and recurrent architectures (Recurrent Neural Network (RNN), Long Short-Term Memory (LSTM), and Gated Recurrent Unit (GRU)), reducing boilerplate 'torch' code while remaining fully interoperable with 'torch'. The model-training methods also bridge to major machine-learning frameworks in R, notably the 'tidymodels' ecosystem, enabling 'parsnip' model specifications, workflows, recipes, and tuning tools.
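The tidymodels bridge described above can be sketched as follows. This is a minimal, unverified example: it assumes kindling registers a `"kindling"` engine for `parsnip::mlp()` (suggested by the "Register kindling engines with parsnip" and "Multi-Layer Perceptron (Feedforward Neural Network) via kindling" topics); the argument names follow the standard parsnip interface, not a confirmed kindling API.

```r
# Hypothetical usage sketch -- the "kindling" engine name is an assumption.
library(parsnip)
library(kindling)

# Standard parsnip model specification, routed to the kindling engine
spec <- mlp(hidden_units = 16, epochs = 50) |>
  set_engine("kindling") |>
  set_mode("regression")

# Fit and predict with the usual tidymodels verbs
fit_obj <- fit(spec, mpg ~ ., data = mtcars)
predict(fit_obj, new_data = mtcars)
```

Because the engine plugs into parsnip, the same specification should also slot into workflows, recipes, and tune grids without kindling-specific code.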
Useful links