RReLU module
Applies the randomized leaky rectified linear unit function, element-wise, as described in the paper:
nn_rrelu(lower = 1/8, upper = 1/3, inplace = FALSE)
lower
: lower bound of the uniform distribution. Default: 1/8
upper
: upper bound of the uniform distribution. Default: 1/3
inplace
: can optionally do the operation in-place. Default: FALSE
Empirical Evaluation of Rectified Activations in Convolutional Network.
The function is defined as:

$$
\text{RReLU}(x) =
\begin{cases}
x & \text{if } x \geq 0 \\
a x & \text{otherwise}
\end{cases}
$$

where $a$ is randomly sampled from the uniform distribution $\mathcal{U}(\text{lower}, \text{upper})$. See: https://arxiv.org/pdf/1505.00853.pdf
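To make the definition concrete, here is a minimal base-R sketch of the formula (an illustration only, not the package's implementation; rrelu_manual is a hypothetical helper name): each negative element is scaled by its own slope drawn from the uniform distribution.

# Sketch of the RReLU formula in plain R: non-negative values pass through,
# negative values are multiplied by a slope a ~ U(lower, upper).
rrelu_manual <- function(x, lower = 1/8, upper = 1/3) {
  a <- runif(length(x), min = lower, max = upper)  # one random slope per element
  ifelse(x >= 0, x, a * x)
}
rrelu_manual(c(-2, -0.5, 0, 1.5))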
Shape:

- Input: (N, *) where * means any number of additional dimensions
- Output: (N, *), same shape as the input

Examples:

if (torch_is_installed()) {
  m <- nn_rrelu(0.1, 0.3)
  input <- torch_randn(2)
  m(input)
}
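As a follow-up, a small hedged sketch of the randomness in the definition above: repeated forward passes on the same negative input generally produce different outputs, because the slope a is re-sampled on each call (assuming a newly created module is in training mode).

library(torch)
if (torch_is_installed()) {
  m <- nn_rrelu(lower = 0.1, upper = 0.3)
  x <- torch_tensor(c(-1, -1, -1))
  m(x)  # negative entries scaled by slopes sampled from U(0.1, 0.3)
  m(x)  # a second forward pass generally gives different values
}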