kl_mvn function

Kullback-Leibler Divergence

Compute the Kullback-Leibler (KL) divergence between the true and estimated multivariate normal distributions, parameterized by their precision matrices.

kl_mvn(true, estimate, stein = FALSE)

Arguments

  • true: Matrix. The true precision matrix (inverse of the covariance matrix).
  • estimate: Matrix. The estimated precision matrix (inverse of the covariance matrix).
  • stein: Logical. Should Stein's loss be computed instead (defaults to FALSE)? Note that KL divergence is half of Stein's loss (see the sketch after this list).
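
For reference, below is a minimal sketch of how this divergence can be computed for zero-mean multivariate normals parameterized by precision matrices. The helper kl_sketch is hypothetical and is not the package's internal implementation; its arguments mirror those described above.

kl_sketch <- function(true, estimate, stein = FALSE) {
  p <- ncol(true)
  # Stein's loss of the estimated precision matrix relative to the true one:
  # tr(estimate %*% solve(true)) - p + log det(true) - log det(estimate)
  stein_loss <- sum(diag(estimate %*% solve(true))) - p +
    as.numeric(determinant(true, logarithm = TRUE)$modulus) -
    as.numeric(determinant(estimate, logarithm = TRUE)$modulus)
  # KL divergence is half of Stein's loss
  if (stein) stein_loss else stein_loss / 2
}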

Returns

A numeric value corresponding to the KL divergence (or Stein's loss when stein = TRUE).

Note

A lower value is better, with a score of zero indicating that the estimated precision matrix is identical to the true precision matrix.
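
As a quick illustration of this property (a sketch, assuming GGMncv is loaded), passing the same precision matrix as both arguments should return zero:

library(GGMncv)

# identical true and estimated precision matrices -> divergence of zero
Theta <- diag(3)
kl_mvn(true = Theta, estimate = Theta)
# expected: 0 (up to numerical precision)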

Examples

library(GGMncv)

# number of nodes
p <- 20

# generate a true network and its correlation matrix
main <- gen_net(p = p, edge_prob = 0.15)

# simulate data from the true model
y <- MASS::mvrnorm(250, rep(0, p), main$cors)

# lasso
fit_l1 <- ggmncv(R = cor(y), n = nrow(y),
                 penalty = "lasso", progress = FALSE)
kl_mvn(fit_l1$Theta, solve(main$cors))

# atan penalty
fit_atan <- ggmncv(R = cor(y), n = nrow(y),
                   penalty = "atan", progress = FALSE)
kl_mvn(fit_atan$Theta, solve(main$cors))
  • Maintainer: Donald Williams
  • License: GPL-2
  • Last published: 2021-12-15