Kullback-Leibler Divergence
Compute the Kullback-Leibler (KL) divergence between two multivariate normal distributions, given the true and estimated precision matrices.
Usage

kl_mvn(true, estimate, stein = FALSE)

Arguments

true
: Matrix. The true precision matrix (inverse of the covariance matrix).

estimate
: Matrix. The estimated precision matrix (inverse of the covariance matrix).

stein
: Logical. Should Stein's loss be computed instead of the KL divergence (defaults to FALSE)? Note that the KL divergence is half of Stein's loss.

Value

Numeric corresponding to the KL divergence.
A lower value is better, with a score of zero indicating that the estimated precision matrix is identical to the true precision matrix.
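
For mean-zero multivariate normals, the divergence has a closed form in the two precision matrices: with M = estimate %*% solve(true), Stein's loss is tr(M) - log det(M) - p, and the KL divergence is half that. The following is a minimal sketch of this computation for illustration only (not GGMncv's internal implementation; the name kl_mvn_sketch is hypothetical):

# Sketch of the closed-form KL divergence for mean-zero
# multivariate normals (hypothetical helper, not GGMncv internals)
kl_mvn_sketch <- function(true, estimate, stein = FALSE) {
  p <- nrow(true)
  # M = estimated precision times true covariance
  M <- estimate %*% solve(true)
  # Stein's loss: tr(M) - log det(M) - p
  loss <- sum(diag(M)) -
    as.numeric(determinant(M, logarithm = TRUE)$modulus) - p
  # the KL divergence is half of Stein's loss
  if (stein) loss else loss / 2
}

# sanity check: identical matrices give a divergence of zero
kl_mvn_sketch(true = diag(3), estimate = diag(3))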
Examples

library(GGMncv)

# number of nodes
p <- 20

# generate a true network and sample data from it
main <- gen_net(p = p, edge_prob = 0.15)
y <- MASS::mvrnorm(250, rep(0, p), main$cors)

# lasso
fit_l1 <- ggmncv(R = cor(y), n = nrow(y), penalty = "lasso", progress = FALSE)
kl_mvn(true = solve(main$cors), estimate = fit_l1$Theta)

# atan
fit_atan <- ggmncv(R = cor(y), n = nrow(y), penalty = "atan", progress = FALSE)
kl_mvn(true = solve(main$cors), estimate = fit_atan$Theta)
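Comparing the two returned values indicates which penalty recovered a precision matrix closer to the truth: the estimate with the lower KL divergence performed better for these data.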