Compute the de-sparsified (sometimes called "de-biased") glasso estimator with the approach described in Equation 7 of Jankova and van de Geer (2015). The basic idea is to undo $\ell_1$-regularization, in order to compute p-values and confidence intervals (i.e., to make statistical inference).
desparsify(object, ...)
Arguments
object: An object of class ggmncv.
...: Currently ignored.
Returns
The de-sparsified estimates, including
Theta: De-sparsified precision matrix
P: De-sparsified partial correlation matrix
Details
According to Jankova and van de Geer (2015), the de-sparsified estimator, $\hat{\Theta}_{d}$, is defined as
$$\hat{\Theta}_{d} = 2\hat{\Theta} - \hat{\Theta} R \hat{\Theta},$$
where $\hat{\Theta}$ denotes the graphical lasso estimator of the precision matrix and $R$ is the sample correlation matrix. Further details can be found in Section 2 ("Main Results") of Jankova and van de Geer (2015).
This approach is built upon earlier work on the de-sparsified lasso estimator (Javanmard and Montanari 2014; van de Geer et al. 2014; Zhang and Zhang 2014).
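To make the definition concrete, the matrix algebra above can be carried out directly. The following is a minimal sketch of Equation 7 only (not the package's internal code), assuming the fitted ggmncv object stores the precision estimate in $Theta:

# minimal sketch of the de-sparsified estimator (illustration only)
library(GGMncv)
Y <- GGMncv::Sachs[, 1:5]
R <- cor(Y)
fit <- ggmncv(R, n = nrow(Y), penalty = "lasso", progress = FALSE)
Theta <- fit$Theta

# Equation 7: Theta_d = 2 * Theta - Theta %*% R %*% Theta
Theta_d <- 2 * Theta - Theta %*% R %*% Theta

# implied partial correlations
P_d <- -stats::cov2cor(Theta_d)
diag(P_d) <- 0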
Note
This assumes (reasonably) Gaussian data, and should not be expected to work for, say, polychoric correlations. Further, all work to date has only looked at the graphical lasso estimator, and not at de-sparsifying nonconvex regularization. Accordingly, it is probably best to set penalty = "lasso" in ggmncv.
This function only provides the de-sparsified estimator and not p-values or confidence intervals (see inference).
Examples
# data
Y <- GGMncv::Sachs[, 1:5]

n <- nrow(Y)
p <- ncol(Y)

# fit model
# note: fix lambda, as in the reference
fit <- ggmncv(cor(Y), n = nrow(Y),
              progress = FALSE,
              penalty = "lasso",
              lambda = sqrt(log(p) / n))

# fit model
# note: no regularization
fit_non_reg <- ggmncv(cor(Y), n = nrow(Y),
                      progress = FALSE,
                      penalty = "lasso",
                      lambda = 0)

# remove (some) bias and sparsity
That <- desparsify(fit)

# graphical lasso estimator
fit$P

# de-sparsified estimator
That$P

# mle
fit_non_reg$P
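As noted above, this function does not return p-values; those are provided by inference. Purely as an illustrative sketch continuing the example (and assuming the asymptotic variance (Theta_ii * Theta_jj + Theta_ij^2) / n given in Jankova and van de Geer 2015), approximate z-scores and p-values could be formed as follows:

# illustrative sketch only; see inference for the package's implementation
Theta_d <- That$Theta

# plug-in standard errors from the assumed asymptotic variance
sds <- sqrt((diag(Theta_d) %o% diag(Theta_d) + Theta_d^2) / n)

# two-sided p-values for the off-diagonal elements
z <- Theta_d / sds
p_values <- 2 * stats::pnorm(abs(z), lower.tail = FALSE)
round(p_values, 3)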