Compute the maximum likelihood estimate of the precision matrix, given a known graphical structure (i.e., an adjacency matrix). This approach was originally described in "The Elements of Statistical Learning" (see pg. 631, Hastie, Tibshirani, & Friedman, 2009).
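In symbols, with S denoting the sample covariance (or correlation) matrix and A the adjacency matrix adj (notation introduced here only for illustration), the estimate solves the standard constrained Gaussian likelihood problem

\max_{\Theta \succ 0} \; \log\det(\Theta) - \operatorname{tr}(S\Theta) \quad \text{subject to} \quad \Theta_{ij} = 0 \ \text{whenever} \ A_{ij} = 0, \ i \neq j.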
Arguments
adj: Adjacency matrix that encodes the constraints, where a zero indicates that the corresponding element of the precision matrix is constrained to be zero.
Returns
A list containing the following (a brief example of accessing these components follows the list):
Theta: Inverse of the covariance matrix (precision matrix).
Sigma: Covariance matrix.
wadj: Weighted adjacency matrix, corresponding to the partial correlation network.
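For instance (a minimal illustration, assuming the ptsd data that ships with GGMncv and a previously constructed adjacency matrix adj):

# hypothetical fit; adj must already encode the desired zeros
fit <- mle_known_graph(cor(ptsd), adj)
fit$Theta  # precision matrix with the constrained elements set to zero
fit$Sigma  # implied covariance matrix
fit$wadj   # partial correlation (weighted adjacency) matrix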
Note
The algorithm is written in C++ and should scale nicely to high dimensions.
Note that there are a variety of algorithms for this purpose. Simulation studies indicated that this approach is both accurate and computationally efficient (see Emmert-Streib et al., 2019, and references therein).
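To give a sense of how such an estimate can be obtained, below is a rough R sketch of one textbook approach, the regression-based procedure described in Hastie, Tibshirani, & Friedman (2009, pg. 631 onward). It is an illustration only, not the package's C++ routine, and the function name mle_given_graph_sketch is made up for this example.

mle_given_graph_sketch <- function(S, adj, tol = 1e-8, max_iter = 100){
  p <- ncol(S)
  W <- S                               # working covariance estimate
  beta_store <- matrix(0, p - 1, p)    # regression coefficients per node

  for(iter in seq_len(max_iter)){
    W_old <- W
    for(j in seq_len(p)){
      idx  <- setdiff(seq_len(p), j)
      conn <- which(adj[idx, j] != 0)  # nodes connected to node j
      beta <- numeric(p - 1)
      if(length(conn) > 0){
        # solve the reduced system over the connected nodes only
        beta[conn] <- solve(W[idx, idx][conn, conn, drop = FALSE],
                            S[idx, j][conn])
      }
      # update the jth column/row of the working covariance
      W[idx, j] <- W[idx, idx] %*% beta
      W[j, idx] <- W[idx, j]
      beta_store[, j] <- beta
    }
    if(max(abs(W - W_old)) < tol) break
  }

  # recover the precision matrix from the final regression coefficients
  Theta <- matrix(0, p, p)
  for(j in seq_len(p)){
    idx <- setdiff(seq_len(p), j)
    theta_jj      <- 1 / (S[j, j] - sum(W[idx, j] * beta_store[, j]))
    Theta[j, j]   <- theta_jj
    Theta[idx, j] <- -beta_store[, j] * theta_jj
  }

  list(Theta = Theta, Sigma = W)
}

# illustrative usage (adj_new as constructed in the Examples below):
# mle_given_graph_sketch(cor(ptsd), adj_new)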
Examples
# data
y <- ptsd

# fit model
fit <- ggmncv(cor(y),
              n = nrow(y),
              penalty = "lasso",
              progress = FALSE)

# set negatives to zero (sign restriction)
adj_new <- ifelse(fit$P <= 0, 0, 1)

check_zeros <- TRUE

# track tries
iter <- 0

# iterate until all positive
while(check_zeros){
  iter <- iter + 1
  fit_new <- constrained(cor(y), adj = adj_new)
  check_zeros <- any(fit_new$wadj < 0)
  adj_new <- ifelse(fit_new$wadj <= 0, 0, 1)
}

# alias (mle_known_graph)

# data
y <- ptsd

# nonreg (lambda = 0)
fit <- ggmncv(cor(y),
              n = nrow(y),
              lambda = 0,
              progress = FALSE)

# set values less than |0.1| to zero
adj_new <- ifelse(abs(fit$P) <= 0.1, 0, 1)

# mle given the graph
mle_known_graph(cor(y), adj_new)