The kernel generating functions provided in kernlab.
The Gaussian RBF kernel k(x,x′) = exp(−σ∥x−x′∥²)
The Polynomial kernel k(x,x′) = (scale <x,x′> + offset)^degree
The Linear kernel k(x,x′) = <x,x′>
The Hyperbolic tangent kernel k(x,x′) = tanh(scale <x,x′> + offset)
The Laplacian kernel k(x,x′) = exp(−σ∥x−x′∥)
The Bessel kernel k(x,x′) = (−Bessel_(ν+1)^n (σ∥x−x′∥²))
The ANOVA RBF kernel k(x,x′) = Σ_{1 ≤ i_1 < … < i_D ≤ N} Π_{d=1}^D k(x_{i_d}, x′_{i_d}), where k(x,x) is a Gaussian RBF kernel.
The Spline kernel Π_{d=1}^D 1 + x_i x_j + x_i x_j min(x_i, x_j) − ((x_i + x_j)/2) min(x_i, x_j)² + min(x_i, x_j)³/3
The String kernels (see stringdot).
sigma: The inverse kernel width used by the Gaussian, the Laplacian, the Bessel and the ANOVA kernel
degree: The degree of the polynomial, Bessel or ANOVA kernel function. This has to be a positive integer.
scale: The scaling parameter of the polynomial and tangent kernel is a convenient way of normalizing patterns without the need to modify the data itself
offset: The offset used in a polynomial or hyperbolic tangent kernel
order: The order of the Bessel function to be used as a kernel
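As a brief sketch of how these hyperparameters are supplied (the constructor names rbfdot, polydot and besseldot are the standard kernlab generating functions, which are not listed explicitly in this section), a kernel generating function returns a kernel object that can be evaluated directly on two vectors:

library(kernlab)

rbf  <- rbfdot(sigma = 0.1)                         # Gaussian RBF kernel
poly <- polydot(degree = 2, scale = 1, offset = 1)  # inhomogeneous quadratic kernel
bess <- besseldot(sigma = 1, order = 1, degree = 1) # Bessel kernel

x <- rnorm(10)
y <- rnorm(10)

rbf(x, y)    # kernel evaluation, i.e. the inner product in feature space
poly(x, y)
bess(x, y)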
Details
The kernel generating functions are used to initialize a kernel function which calculates the dot (inner) product between two feature vectors in a Hilbert space. These functions can be passed as a kernel argument to almost all functions in kernlab (e.g., ksvm, kpca, etc.).
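For instance (a hedged sketch; the spam and iris data sets and the specific parameter values are only illustrative assumptions):

library(kernlab)
data(spam)

## pass a ready-made kernel object ...
filter <- ksvm(type ~ ., data = spam[1:200, ], kernel = rbfdot(sigma = 0.05), C = 5)

## ... or pass the kernel by name, with its parameters supplied through kpar
pc <- kpca(~ ., data = iris[, -5], kernel = "rbfdot", kpar = list(sigma = 0.2), features = 2)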
Although using one of the existing kernel functions as a kernel argument in various functions in kernlab has the advantage that optimized code is used to calculate various kernel expressions, any other function implementing a dot product of class kernel can also be used as a kernel argument. This allows the user to use, test and develop special kernels for a given data set or algorithm. For details on the string kernels see stringdot.
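As an illustrative sketch of such a user-defined kernel (the function name mykernel and the promotergene data set are assumptions for this example), any R function taking two vectors can be used once it carries the class "kernel":

library(kernlab)

## a hypothetical mixed polynomial/RBF dot product
mykernel <- function(x, y) (sum(x * y) + 1) * exp(-0.001 * sum((x - y)^2))
class(mykernel) <- "kernel"

data(promotergene)
gene <- ksvm(Class ~ ., data = promotergene, kernel = mykernel, C = 5)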
Returns
Returns an S4 object of class kernel which extends the function class. The resulting function implements the given kernel, calculating the inner (dot) product between two vectors.
kpar: a list containing the kernel parameters (hyperparameters) used.
The kernel parameters can be accessed by the kpar function.
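For example (a minimal sketch, assuming the rbfdot constructor):

library(kernlab)

rbf <- rbfdot(sigma = 0.1)
kpar(rbf)   # returns the hyperparameter list, here list(sigma = 0.1)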
If the offset in the Polynomial kernel is set to 0, we obtain homogeneous polynomial kernels; for positive values, we obtain inhomogeneous kernels. Note that for negative values the kernel does not satisfy Mercer's condition and thus the optimizers may fail.
In the Hyperbolic tangent kernel, if the offset is negative the likelihood of obtaining a kernel matrix that is not positive definite is much higher (since then even some diagonal elements may be negative); hence, if this kernel has to be used, the offset should always be positive. Note, however, that this is no guarantee that the kernel will be positive definite.
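One way to check this in practice (a sketch, assuming the tanhdot constructor and the kernelMatrix helper) is to inspect the eigenvalues of a kernel matrix computed on some data:

library(kernlab)

set.seed(1)
X <- matrix(rnorm(40), ncol = 4)

K <- kernelMatrix(tanhdot(scale = 1, offset = 1), X)
min(eigen(as.matrix(K), symmetric = TRUE, only.values = TRUE)$values)
## a clearly negative smallest eigenvalue means the matrix is not positive semi-definite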