kldggd function

Kullback-Leibler Divergence Between Centered Multivariate Generalized Gaussian Distributions

Computes the Kullback-Leibler divergence between two random vectors distributed according to multivariate generalized Gaussian distributions (MGGD) with zero means.

kldggd(Sigma1, beta1, Sigma2, beta2, eps = 1e-06)

Arguments

  • Sigma1: symmetric, positive-definite matrix. The dispersion matrix of the first distribution.
  • beta1: positive real number. The shape parameter of the first distribution.
  • Sigma2: symmetric, positive-definite matrix. The dispersion matrix of the second distribution.
  • beta2: positive real number. The shape parameter of the second distribution.
  • eps: numeric. Precision for the computation of the Lauricella D-hypergeometric function (see lauricella). Default: 1e-06.

Returns

A numeric value: the Kullback-Leibler divergence between the two distributions, with two attributes, attr(, "epsilon") (precision of the result of the Lauricella D-hypergeometric function) and attr(, "k") (number of iterations), except when the distributions are univariate.

Details

Given \mathbf{X}_1, a random vector of \mathbb{R}^p (p > 1) distributed according to the MGGD with parameters (\mathbf{0}, \Sigma_1, \beta_1), and \mathbf{X}_2, a random vector of \mathbb{R}^p distributed according to the MGGD with parameters (\mathbf{0}, \Sigma_2, \beta_2), the Kullback-Leibler divergence between \mathbf{X}_1 and \mathbf{X}_2 is given by:

\displaystyle{ KL(\mathbf{X}_1||\mathbf{X}_2) = \ln{\left(\frac{\beta_1 |\Sigma_1|^{-1/2} \Gamma\left(\frac{p}{2\beta_2}\right)}{\beta_2 |\Sigma_2|^{-1/2} \Gamma\left(\frac{p}{2\beta_1}\right)}\right)} + \frac{p}{2} \left(\frac{1}{\beta_2} - \frac{1}{\beta_1}\right) \ln{2} - \frac{p}{2\beta_1} + 2^{\frac{\beta_2}{\beta_1}-1} \frac{\Gamma\left(\frac{\beta_2}{\beta_1} + \frac{p}{2\beta_1}\right)}{\Gamma\left(\frac{p}{2\beta_1}\right)} \lambda_p^{\beta_2} \times F_D^{(p-1)}\left(-\beta_2; \underbrace{\frac{1}{2},\dots,\frac{1}{2}}_{p-1}; \frac{p}{2}; 1-\frac{\lambda_{p-1}}{\lambda_p},\dots,1-\frac{\lambda_1}{\lambda_p}\right) }

where \lambda_1 < \dots < \lambda_{p-1} < \lambda_p are the eigenvalues of the matrix \Sigma_1 \Sigma_2^{-1}, and F_D^{(p-1)} is the Lauricella D-hypergeometric function, defined for p variables by:

\displaystyle{ F_D^{(p)}\left(a; b_1, \dots, b_p; g; x_1, \dots, x_p\right) = \sum\limits_{m_1 \geq 0} \dots \sum\limits_{m_p \geq 0} \frac{(a)_{m_1+\dots+m_p} (b_1)_{m_1} \dots (b_p)_{m_p}}{(g)_{m_1+\dots+m_p}} \frac{x_1^{m_1}}{m_1!} \dots \frac{x_p^{m_p}}{m_p!} }

where (a)_k = a(a+1)\dots(a+k-1) denotes the Pochhammer symbol (rising factorial).

This computation uses the lauricella function.
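For illustration only, the computation can be sketched outside R with a naively truncated Lauricella series. This is not the package's algorithm (kldggd/lauricella use an adaptive stopping rule controlled by eps), and all function names below are hypothetical. The sketch restricts to the bivariate case (p = 2), where the eigenvalues of \Sigma_1 \Sigma_2^{-1} have a closed form via trace and determinant:

```python
from itertools import product
from math import factorial, gamma, log, sqrt

def poch(a, n):
    """Pochhammer symbol (rising factorial) (a)_n."""
    r = 1.0
    for k in range(n):
        r *= a + k
    return r

def lauricella_fd(a, b, g, x, m_max=60):
    """Naively truncated Lauricella F_D series; converges for |x_i| < 1."""
    total = 0.0
    for m in product(range(m_max + 1), repeat=len(x)):
        s = sum(m)
        term = poch(a, s) / poch(g, s)
        for bi, xi, mi in zip(b, x, m):
            term *= poch(bi, mi) * xi ** mi / factorial(mi)
        total += term
    return total

def kl_mggd_2d(S1, beta1, S2, beta2):
    """KL divergence between centered bivariate (p = 2) MGGDs,
    transcribing the closed-form expression above (sketch only)."""
    p = 2
    det1 = S1[0][0] * S1[1][1] - S1[0][1] * S1[1][0]
    det2 = S2[0][0] * S2[1][1] - S2[0][1] * S2[1][0]
    # M = Sigma1 %*% solve(Sigma2); eigenvalues from trace and determinant
    inv2 = [[S2[1][1] / det2, -S2[0][1] / det2],
            [-S2[1][0] / det2, S2[0][0] / det2]]
    M = [[sum(S1[i][k] * inv2[k][j] for k in range(2)) for j in range(2)]
         for i in range(2)]
    tr, det = M[0][0] + M[1][1], det1 / det2
    d = sqrt(max(tr * tr - 4 * det, 0.0))  # guard tiny negative round-off
    lam1, lam2 = (tr - d) / 2, (tr + d) / 2  # lam1 <= lam2
    fd = lauricella_fd(-beta2, [0.5], p / 2, [1 - lam1 / lam2])
    return (log(beta1 * det1 ** -0.5 * gamma(p / (2 * beta2))
                / (beta2 * det2 ** -0.5 * gamma(p / (2 * beta1))))
            + p / 2 * (1 / beta2 - 1 / beta1) * log(2)
            - p / (2 * beta1)
            + 2 ** (beta2 / beta1 - 1)
              * gamma(beta2 / beta1 + p / (2 * beta1))
              / gamma(p / (2 * beta1))
              * lam2 ** beta2 * fd)
```

Two quick sanity checks on the formula: identical parameters give a divergence of zero, and beta1 = beta2 = 1 recovers the classical Gaussian KL divergence 0.5 (ln(|Σ2|/|Σ1|) − p + tr(Σ2^{-1} Σ1)).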

When p = 1 (univariate case): let X_1 be a random variable distributed according to the centered generalized Gaussian distribution with parameters (0, \sigma_1, \beta_1), and X_2 a random variable distributed according to the generalized Gaussian distribution with parameters (0, \sigma_2, \beta_2). Then:

KL(X_1||X_2) = \displaystyle{ \ln{\left(\frac{\frac{\beta_1}{\sqrt{\sigma_1}} \Gamma\left(\frac{1}{2\beta_2}\right)}{\frac{\beta_2}{\sqrt{\sigma_2}} \Gamma\left(\frac{1}{2\beta_1}\right)}\right)} + \frac{1}{2} \left(\frac{1}{\beta_2} - \frac{1}{\beta_1}\right) \ln{2} - \frac{1}{2\beta_1} + 2^{\frac{\beta_2}{\beta_1}-1} \frac{\Gamma\left(\frac{\beta_2}{\beta_1} + \frac{1}{2\beta_1}\right)}{\Gamma\left(\frac{1}{2\beta_1}\right)} \left(\frac{\sigma_1}{\sigma_2}\right)^{\beta_2} }
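The univariate closed form needs no Lauricella function, so it can be transcribed directly. As an illustrative sketch only (the hypothetical kl_ggd_1d below is not the package implementation; kldggd itself handles this case):

```python
from math import gamma, log, sqrt

def kl_ggd_1d(sigma1, beta1, sigma2, beta2):
    """Closed-form KL divergence between centered univariate generalized
    Gaussian distributions, transcribing the p = 1 formula above."""
    term_log = log(beta1 / sqrt(sigma1) * gamma(1 / (2 * beta2))
                   / (beta2 / sqrt(sigma2) * gamma(1 / (2 * beta1))))
    term_ln2 = 0.5 * (1 / beta2 - 1 / beta1) * log(2)
    term_mom = (2 ** (beta2 / beta1 - 1)
                * gamma(beta2 / beta1 + 1 / (2 * beta1))
                / gamma(1 / (2 * beta1))
                * (sigma1 / sigma2) ** beta2)
    return term_log + term_ln2 - 1 / (2 * beta1) + term_mom
```

With beta1 = beta2 = 1 (and sigma as the variance) this reduces to the classical KL divergence between centered Gaussians, 0.5 (ln(σ2/σ1) + σ1/σ2 − 1), which gives a quick sanity check.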

Examples

beta1 <- 0.74
beta2 <- 0.55
Sigma1 <- matrix(c(0.8, 0.3, 0.2, 0.3, 0.2, 0.1, 0.2, 0.1, 0.2), nrow = 3)
Sigma2 <- matrix(c(1, 0.3, 0.2, 0.3, 0.5, 0.1, 0.2, 0.1, 0.7), nrow = 3)

# Kullback-Leibler divergence
kl12 <- kldggd(Sigma1, beta1, Sigma2, beta2)
kl21 <- kldggd(Sigma2, beta2, Sigma1, beta1)
print(kl12)
print(kl21)

# Distance (symmetrized Kullback-Leibler divergence)
kldist <- as.numeric(kl12) + as.numeric(kl21)
print(kldist)

References

N. Bouhlel, A. Dziri, "Kullback-Leibler Divergence Between Multivariate Generalized Gaussian Distributions," IEEE Signal Processing Letters, vol. 26, no. 7, July 2019. doi: 10.1109/LSP.2019.2915000

See Also

dmggd: probability density of an MGGD.

Author(s)

Pierre Santagostini, Nizar Bouhlel

  • Maintainer: Pierre Santagostini
  • License: GPL (>= 3)
  • Last published: 2024-12-20