Training of the evidential neural network classifier
proDSfit performs parameter optimization for the evidential neural network classifier.
proDSfit(x, y, param, lambda = 1/max(as.numeric(y)), mu = 0, optimProto = TRUE, options = list(maxiter = 500, eta = 0.1, gain_min = 1e-04, disp = 10))
Arguments
x: Input matrix of size n x d, where n is the number of objects and d the number of attributes.
y: Vector of class labels (of length n). May be a factor, or a vector of integers from 1 to M (number of classes).
param: Initial parameters (see proDSinit).
lambda: Parameter of the cost function. If lambda=1, the cost function measures the error between the plausibilities and the 0-1 target values. If lambda=1/M, where M is the number of classes (default), the pignistic probabilities are used in the cost function. If lambda=0, the beliefs are used.
mu: Regularization hyperparameter (default=0).
optimProto: Boolean. If TRUE, the prototypes are optimized (default). Otherwise, they are fixed.
options: A list of parameters for the optimization algorithm: maxiter (maximum number of iterations), eta (initial step of gradient variation), gain_min (minimum gain in the optimization loop), disp (integer; if >0, intermediate results are displayed every disp iterations). See the sketch after this list.
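For example, the optimizer settings and the cost-function weight can be supplied explicitly. This is a sketch only; x, y and param are assumed to already hold the training data and an initial parameter set returned by proDSinit:

M <- max(as.numeric(y))                          # number of classes
fit <- proDSfit(x, y, param, lambda = 1/M, mu = 0,
                options = list(maxiter = 1000, eta = 0.1,
                               gain_min = 1e-04, disp = 100))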
Returns
A list with three elements:
param: Optimized network parameters.
cost: Final value of the cost function.
err: Training error rate.
Details
If optimProto=TRUE (default), the prototypes are optimized. Otherwise, they are fixed to their initial value.
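Example
A minimal illustration on the iris data (three classes, four attributes). It assumes that proDSinit(x, y, nproto) returns an initial parameter set with nproto prototypes; the data set and the number of prototypes are chosen purely for illustration:

x <- as.matrix(iris[, 1:4])
y <- iris$Species                        # factor with M = 3 levels
param0 <- proDSinit(x, y, 6)             # initial parameters, 6 prototypes (assumed interface)
fit <- proDSfit(x, y, param0,
                lambda = 1/3,            # pignistic-probability cost (M = 3)
                optimProto = TRUE)       # also optimize the prototypes
fit$cost                                 # final value of the cost function
fit$err                                  # training error rate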