Kullback-Leibler Divergence between Centered Multivariate Cauchy Distributions
Computes the Kullback-Leibler divergence between the distributions of two random vectors distributed according to multivariate Cauchy distributions (MCD) with zero location vector.
Usage
kldcauchy(Sigma1, Sigma2, eps = 1e-06)
Arguments
Sigma1: symmetric, positive-definite matrix. The scatter matrix of the first distribution.
Sigma2: symmetric, positive-definite matrix. The scatter matrix of the second distribution.
eps: numeric. Precision for the computation of the partial derivative of the Lauricella D-hypergeometric function (see Details). Default: 1e-06.
Returns
A numeric value: the Kullback-Leibler divergence between the two distributions, with two attributes: attr(, "epsilon") (precision of the partial derivative of the Lauricella D-hypergeometric function, see Details) and attr(, "k") (number of iterations).
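For instance, the return value and its attributes can be inspected as follows (a minimal sketch, assuming the package providing kldcauchy is loaded; the scatter matrices are taken from the Examples below):

# Inspecting the returned divergence and its attributes
Sigma1 <- matrix(c(1, 0.6, 0.2, 0.6, 1, 0.3, 0.2, 0.3, 1), nrow = 3)
Sigma2 <- matrix(c(1, 0.3, 0.1, 0.3, 1, 0.4, 0.1, 0.4, 1), nrow = 3)
kl <- kldcauchy(Sigma1, Sigma2)
as.numeric(kl)       # the divergence itself, stripped of attributes
attr(kl, "epsilon")  # precision of the partial derivative (see Details)
attr(kl, "k")        # number of iterations performed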
Details
Let X1 be a random vector of R^p distributed according to the MCD with parameters (0, Σ1), and X2 a random vector of R^p distributed according to the MCD with parameters (0, Σ2).

Let λ1, …, λp be the eigenvalues of the square matrix Σ1 Σ2^(-1), sorted in increasing order:

λ1 < ⋯ < λ(p-1) < λp

Depending on the values of these eigenvalues, the computation of the Kullback-Leibler divergence of X1 from X2 uses different representations of the partial derivative of the Lauricella D-hypergeometric function, which is evaluated iteratively until the precision eps is reached (see the reference below).
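As an illustration (a minimal sketch using base R only; the matrices are taken from the Examples below), the eigenvalues driving the computation can be inspected directly:

# Eigenvalues of Sigma1 %*% solve(Sigma2), sorted in increasing order.
# They are real and positive, since Sigma1 %*% solve(Sigma2) is similar
# to a symmetric positive-definite matrix.
Sigma1 <- matrix(c(0.5, 0, 0, 0, 0.4, 0, 0, 0, 0.3), nrow = 3)
Sigma2 <- diag(1, 3)
lambda <- sort(eigen(Sigma1 %*% solve(Sigma2), only.values = TRUE)$values)
lambda  # 0.3 0.4 0.5: all eigenvalues are < 1 in this case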
Examples
Sigma1 <- matrix(c(1, 0.6, 0.2, 0.6, 1, 0.3, 0.2, 0.3, 1), nrow = 3)
Sigma2 <- matrix(c(1, 0.3, 0.1, 0.3, 1, 0.4, 0.1, 0.4, 1), nrow = 3)
kldcauchy(Sigma1, Sigma2)
kldcauchy(Sigma2, Sigma1)

Sigma1 <- matrix(c(0.5, 0, 0, 0, 0.4, 0, 0, 0, 0.3), nrow = 3)
Sigma2 <- diag(1, 3)
# Case when all eigenvalues of Sigma1 %*% solve(Sigma2) are < 1
kldcauchy(Sigma1, Sigma2)
# Case when all eigenvalues of Sigma1 %*% solve(Sigma2) are > 1
kldcauchy(Sigma2, Sigma1)
References
N. Bouhlel, D. Rousseau, A Generic Formula and Some Special Cases for the Kullback–Leibler Divergence between Central Multivariate Cauchy Distributions. Entropy, 24, 838, July 2022. doi:10.3390/e24060838