Sigma1: symmetric, positive-definite matrix. The scatter matrix of the first distribution.
Sigma2: symmetric, positive-definite matrix. The scatter matrix of the second distribution.
distribution: the probability distribution. It can be "mggd" (multivariate generalized Gaussian distribution), "mcd" (multivariate Cauchy distribution) or "mtd" (multivariate t distribution).
beta1, beta2: numeric. If distribution = "mggd", the shape parameters of the first and second distributions. NULL if distribution is "mcd" or "mtd".
nu1, nu2: numeric. If distribution = "mtd", the degrees of freedom of the first and second distributions. NULL if distribution is "mggd" or "mcd".
eps: numeric. Precision for the computation of the Lauricella D-hypergeometric function if distribution = "mggd" (see kldggd), or of its partial derivative if distribution = "mcd" or "mtd" (see kldcauchy or kldstudent). Default: 1e-06.
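The scatter matrices must be symmetric and positive-definite. The sketch below (illustrative only, not part of the package) shows one way to build valid inputs in R, using crossprod of a full-rank matrix plus a small diagonal ridge:

    set.seed(1)
    p <- 3
    A1 <- matrix(rnorm(p * p), nrow = p)
    A2 <- matrix(rnorm(p * p), nrow = p)
    ## crossprod(A) = t(A) %*% A is symmetric positive semi-definite;
    ## adding a small multiple of the identity makes it strictly positive-definite
    Sigma1 <- crossprod(A1) + diag(1e-3, p)
    Sigma2 <- crossprod(A2) + diag(1e-3, p)
    ## sanity checks: symmetry and strictly positive eigenvalues
    isSymmetric(Sigma1)
    all(eigen(Sigma1, symmetric = TRUE, only.values = TRUE)$values > 0)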
Returns
A numeric value: the Kullback-Leibler divergence between the two distributions, with two attributes: attr(, "epsilon") (precision of the Lauricella D-hypergeometric function or of its partial derivative) and attr(, "k") (number of iterations).
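A minimal usage sketch follows. It assumes the documented function is named kld (the name is not shown in this excerpt); the arguments and attributes are taken from the descriptions above.

    Sigma1 <- matrix(c(0.8, 0.3, 0.2,
                       0.3, 0.2, 0.1,
                       0.2, 0.1, 0.2), nrow = 3)
    Sigma2 <- matrix(c(1.0, 0.3, 0.2,
                       0.3, 0.5, 0.1,
                       0.2, 0.1, 0.7), nrow = 3)
    ## multivariate generalized Gaussian: shape parameters beta1, beta2 are required
    kl_mggd <- kld(Sigma1, Sigma2, distribution = "mggd", beta1 = 0.74, beta2 = 0.55)
    ## multivariate Cauchy: neither beta nor nu parameters are used
    kl_mcd <- kld(Sigma1, Sigma2, distribution = "mcd")
    ## multivariate t: degrees of freedom nu1, nu2 are required
    kl_mtd <- kld(Sigma1, Sigma2, distribution = "mtd", nu1 = 2, nu2 = 4)
    ## the attained precision and the number of iterations are stored as attributes
    attr(kl_mggd, "epsilon")
    attr(kl_mggd, "k")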
References
N. Bouhlel and A. Dziri (2019), Kullback-Leibler Divergence Between Multivariate Generalized Gaussian Distributions. IEEE Signal Processing Letters, vol. 26, no. 7, July 2019. doi:10.1109/LSP.2019.2915000
N. Bouhlel and D. Rousseau (2022), A Generic Formula and Some Special Cases for the Kullback-Leibler Divergence between Central Multivariate Cauchy Distributions. Entropy, 24, 838, July 2022. doi:10.3390/e24060838
N. Bouhlel and D. Rousseau (2023), Exact Rényi and Kullback-Leibler Divergences Between Multivariate t-Distributions. IEEE Signal Processing Letters. doi:10.1109/LSP.2023.3324594