Kullback-Leibler Divergence Between Centered Multivariate Generalized Gaussian Distributions
Description
Computes the Kullback-Leibler divergence between two random vectors distributed according to multivariate generalized Gaussian distributions (MGGD) with zero means.
Usage
kldggd(Sigma1, beta1, Sigma2, beta2, eps = 1e-06)
Arguments
Sigma1: symmetric, positive-definite matrix. The dispersion matrix of the first distribution.
beta1: positive real number. The shape parameter of the first distribution.
Sigma2: symmetric, positive-definite matrix. The dispersion matrix of the second distribution.
beta2: positive real number. The shape parameter of the second distribution.
eps: numeric. Precision for the computation of the Lauricella D-hypergeometric function (see lauricella). Default: 1e-06.
Returns
A numeric value: the Kullback-Leibler divergence between the two distributions, with two attributes: attr(, "epsilon") (precision of the result of the Lauricella D-hypergeometric function) and attr(, "k") (number of iterations). These attributes are absent when the distributions are univariate.
Details
Let X1 be a random vector of R^p (p > 1) distributed according to the MGGD with parameters (0, Σ1, β1), and X2 a random vector of R^p distributed according to the MGGD with parameters (0, Σ2, β2).
The Kullback-Leibler divergence between X1 and X2 is given by:

KL(X1||X2) = ln( (β1 |Σ1|^(-1/2) Γ(p/(2β2))) / (β2 |Σ2|^(-1/2) Γ(p/(2β1))) )
             + (p/2) (1/β2 - 1/β1) ln(2) - p/(2β1)
             + 2^(β2/β1 - 1) (Γ(β2/β1 + p/(2β1)) / Γ(p/(2β1))) λp^β2
               F_D^(p-1)(-β2; 1/2, ..., 1/2; p/2; 1 - λ(p-1)/λp, ..., 1 - λ1/λp)

where λ1 < ... < λ(p-1) < λp are the eigenvalues of the matrix Σ1 Σ2^(-1), the parameter 1/2 is repeated p-1 times, and F_D^(p-1) is the Lauricella D-hypergeometric function (see lauricella).
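As a concrete illustration of this formula (not part of the package), here is a minimal R sketch for the case p = 2, where F_D^(1) reduces to the Gauss hypergeometric function 2F1. The helper names kl_mggd_p2 and hyp2f1 are hypothetical, and the series is truncated once the current term falls below eps, mimicking the role of the eps argument:

kl_mggd_p2 <- function(Sigma1, beta1, Sigma2, beta2, eps = 1e-6) {
  p <- 2
  # Eigenvalues of Sigma1 %*% solve(Sigma2) are real and positive when both
  # inputs are symmetric positive-definite; Re() drops numerical noise.
  lambda <- sort(Re(eigen(Sigma1 %*% solve(Sigma2), only.values = TRUE)$values))
  # Truncated power series for the Gauss hypergeometric function
  # 2F1(a, b; c; x); converges for |x| < 1, which holds here since
  # 0 <= 1 - lambda[1]/lambda[2] < 1.
  hyp2f1 <- function(a, b, c, x) {
    term <- 1
    total <- 1
    k <- 0
    while (abs(term) > eps) {
      term <- term * (a + k) * (b + k) / ((c + k) * (k + 1)) * x
      total <- total + term
      k <- k + 1
    }
    total
  }
  log((beta1 * det(Sigma1)^(-1/2) * gamma(p / (2 * beta2))) /
      (beta2 * det(Sigma2)^(-1/2) * gamma(p / (2 * beta1)))) +
    (p / 2) * (1 / beta2 - 1 / beta1) * log(2) - p / (2 * beta1) +
    2^(beta2 / beta1 - 1) *
      gamma(beta2 / beta1 + p / (2 * beta1)) / gamma(p / (2 * beta1)) *
      lambda[p]^beta2 * hyp2f1(-beta2, 1/2, p / 2, 1 - lambda[1] / lambda[p])
}

Setting beta1 = beta2 = 1 recovers the familiar Gaussian result (1/2)(tr(Σ2^(-1)Σ1) - p + ln(|Σ2|/|Σ1|)), which is a useful sanity check on the expression.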
When p = 1 (univariate case): let X1 be a random variable distributed according to the centered generalized Gaussian distribution with parameters (0, σ1, β1), and X2 a random variable distributed according to the centered generalized Gaussian distribution with parameters (0, σ2, β2). The divergence is then given by the same expression with p = 1, where the Lauricella function reduces to 1 and λ1 = σ1/σ2:

KL(X1||X2) = ln( (β1 σ2^(1/2) Γ(1/(2β2))) / (β2 σ1^(1/2) Γ(1/(2β1))) )
             + (1/2) (1/β2 - 1/β1) ln(2) - 1/(2β1)
             + 2^(β2/β1 - 1) (Γ(β2/β1 + 1/(2β1)) / Γ(1/(2β1))) (σ1/σ2)^β2
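For illustration, this univariate expression can be evaluated directly with base R (kl_ggd_univ is a hypothetical helper name, not part of the package):

kl_ggd_univ <- function(sigma1, beta1, sigma2, beta2) {
  log((beta1 * sqrt(sigma2) * gamma(1 / (2 * beta2))) /
      (beta2 * sqrt(sigma1) * gamma(1 / (2 * beta1)))) +
    (1 / 2) * (1 / beta2 - 1 / beta1) * log(2) - 1 / (2 * beta1) +
    2^(beta2 / beta1 - 1) *
      gamma(beta2 / beta1 + 1 / (2 * beta1)) / gamma(1 / (2 * beta1)) *
      (sigma1 / sigma2)^beta2
}
# Sanity check: for beta1 = beta2 = 1 (Gaussian case) this reduces to
# 0.5 * (sigma1 / sigma2 - 1 - log(sigma1 / sigma2))
kl_ggd_univ(1, 1, 2, 1)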
References
N. Bouhlel, A. Dziri, "Kullback-Leibler Divergence Between Multivariate Generalized Gaussian Distributions," IEEE Signal Processing Letters, vol. 26, no. 7, July 2019. doi:10.1109/LSP.2019.2915000
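Examples
A brief usage sketch; it assumes the function is loaded from its host package (presumably mggd on CRAN), and the parameter values below are purely illustrative.

Sigma1 <- matrix(c(4, 1, 1, 9), nrow = 2)
Sigma2 <- matrix(c(1, 0.3, 0.3, 1), nrow = 2)
kl <- kldggd(Sigma1, 0.74, Sigma2, 1.2)
kl                   # the divergence value
attr(kl, "epsilon")  # precision of the Lauricella function result
attr(kl, "k")        # number of iterations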