Statistical Inference for Regularized Gaussian Graphical Models
Compute p-values for each relation based on the de-sparsified glasso estimator (Jankova and van de Geer, 2015).
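As a sketch of the underlying idea (assumed form, following Jankova and van de Geer, 2015; GGMncv's internal implementation may differ in detail), the de-sparsified estimate takes the glasso precision matrix Theta and the sample covariance Sigma and computes 2 * Theta - Theta Sigma Theta:

```r
# Sketch of the de-sparsified glasso estimator (assumed form,
# following Jankova and van de Geer, 2015; not GGMncv's internal code):
#   Theta_d = 2 * Theta - Theta %*% Sigma %*% Theta
desparsify <- function(Theta, Sigma) {
  2 * Theta - Theta %*% Sigma %*% Theta
}

# Sanity check: if Theta is the exact inverse of Sigma (no
# regularization bias), de-sparsifying returns Theta unchanged.
Sigma <- matrix(c(1, 0.3, 0.3, 1), 2, 2)
Theta <- solve(Sigma)
max(abs(desparsify(Theta, Sigma) - Theta))  # effectively zero
```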
Arguments

object: An object of class ggmncv.

method: Character string. The correction method for multiple comparisons (defaults to "fdr"). Can be abbreviated. See p.adjust.
alpha: Numeric. Significance level (defaults to 0.05).
...: Currently ignored.
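The corrections available for method are those of base R's p.adjust, referenced above. A quick illustration of two of the options (this is plain base R, not GGMncv code):

```r
# Illustration of base R's p.adjust: the same correction methods
# referenced by the method argument above.
p <- c(0.001, 0.01, 0.04, 0.20)

p.adjust(p, method = "fdr")         # Benjamini-Hochberg (the default here)
p.adjust(p, method = "bonferroni")  # 0.004 0.040 0.160 0.800
```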
Returns
Theta: De-sparsified precision matrix.

adj: Adjacency matrix based on the p-values.

pval_uncorrected: Uncorrected p-values.

pval_corrected: Corrected p-values.

method: The approach used for multiple comparisons.

alpha: Significance level.
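A hypothetical illustration of how adj relates to pval_corrected and alpha (assumed logic, not GGMncv internals): an edge is retained when its corrected p-value falls below alpha.

```r
# Hypothetical sketch (not GGMncv's internal code): edges whose
# corrected p-value is below alpha are kept in the adjacency matrix.
pvals <- matrix(c(NA, 0.001, 0.30,
                  0.001, NA, 0.02,
                  0.30, 0.02, NA), nrow = 3)
alpha <- 0.05

adj <- ifelse(pvals < alpha, 1, 0)
diag(adj) <- 0  # no self-loops
adj
```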
Note
This assumes (reasonably) Gaussian data, and should not be expected to work for, say, polychoric correlations. Further, all work to date has only examined the graphical lasso estimator, not de-sparsifying nonconvex regularization. Accordingly, it is probably best to set penalty = "lasso" in ggmncv.
Further, whether the de-sparsified estimator provides nominal error rates remains to be seen, at least across a range of conditions. For example, the simulation results in Williams (2021) demonstrated that the confidence intervals can have (severely) compromised coverage properties (whereas non-regularized methods had coverage at the nominal level).
Examples
# data
Y <- GGMncv::ptsd[, 1:5]

# fit model
fit <- ggmncv(cor(Y), n = nrow(Y),
              progress = FALSE,
              penalty = "lasso")

# statistical inference
inference(fit)

# alias
all.equal(inference(fit), significance_test(fit))