Bayesian Inference for Directed Acyclic Graphs
Converting a single BiDAG chain to mcmc object
Converting multiple BiDAG chains to mcmc.list
Deriving an adjacency matrix of a full DBN
Comparing two graphs
Comparing two DBNs
Deriving connected subgraph
Calculating the BGe/BDe score of a single DAG
Calculating the BGe/BDe score of a single DBN
Estimating posterior probabilities of single edges
Deriving a compact adjacency matrix of a DBN
Extracting adjacency matrix (DAG) from MCMC object
Extracting score from MCMC object
Extracting runtime
Extracting scorespace from MCMC object
Deriving subgraph
Extracting trace from MCMC object
Deriving an adjacency matrix of a graph
Structure learning with an iterative order MCMC algorithm on an expand...
iterativeMCMC class structure
Performance assessment of iterative MCMC scheme against a known Bayesi...
Bayesian network structure learning
Deriving a graph from an adjacency matrix
Estimating a graph corresponding to a posterior probability threshold
Structure learning with the order MCMC algorithm
orderMCMC class structure
DAG structure sampling with partition MCMC
partitionMCMC class structure
Highlighting similarities between two graphs
Plotting a DBN
Plotting difference between two graphs
Plotting difference between two DBNs
Comparing posterior probabilities of single edges
Plotting posterior probabilities of single edges
Bayesian network structure sampling from the posterior distribution
Performance assessment of sampling algorithms against a known Bayesian...
Calculating the score of a sample against a DAG
Score against DBN
Initializing score object
Prints 'scorespace' object
scorespace class structure
Deriving interactions matrix
Implementation of a collection of MCMC methods for Bayesian structure learning of directed acyclic graphs (DAGs), from both continuous and discrete data. For efficient inference on larger DAGs, the space of DAGs is pruned according to the data. To filter the search space, the algorithms employ a hybrid approach that combines constraint-based learning with search and score: a reduced search space is first defined from a skeleton obtained with the PC algorithm and is then iteratively improved with search and score. Search and score itself is performed with one of two approaches, order MCMC or partition MCMC. The BGe score is implemented for continuous data and the BDe score for binary or categorical data. The algorithms can return either the maximum a posteriori (MAP) graph or a sample (a collection of DAGs) from the posterior distribution given the data. All algorithms are also applicable to structure learning and sampling for dynamic Bayesian networks. References: J. Kuipers, P. Suter, G. Moffa (2022) <doi:10.1080/10618600.2021.2020127>, N. Friedman and D. Koller (2003) <doi:10.1023/A:1020249912095>, J. Kuipers and G. Moffa (2017) <doi:10.1080/01621459.2015.1133426>, M. Kalisch et al. (2012) <doi:10.18637/jss.v047.i11>, D. Geiger and D. Heckerman (2002) <doi:10.1214/aos/1035844981>, P. Suter, J. Kuipers, G. Moffa, N. Beerenwinkel (2023) <doi:10.18637/jss.v105.i09>.
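A minimal usage sketch in R (not taken from the package documentation) illustrating the workflow described above. It assumes the functions scoreparameters(), orderMCMC(), getDAG(), edgep() and compareDAGs() listed in the index behave as their titles suggest; exact argument names and defaults may differ between BiDAG versions.

    library(BiDAG)

    set.seed(1)
    ## Simulated continuous data: 200 observations of 10 variables.
    ## Any numeric matrix with named columns works for the BGe score;
    ## the data simulated here carries no particular structure.
    n <- 10
    data <- matrix(rnorm(200 * n), ncol = n,
                   dimnames = list(NULL, paste0("V", seq_len(n))))

    ## Initialize a score object for the BGe score ("bde" for binary data).
    myscore <- scoreparameters("bge", data)

    ## MAP structure learning with order MCMC on a pruned search space.
    mapfit <- orderMCMC(myscore)
    mapdag <- getDAG(mapfit)        # adjacency matrix of the MAP DAG

    ## Sampling DAGs from the posterior and estimating edge posteriors.
    samplefit <- orderMCMC(myscore, MAP = FALSE, chainout = TRUE)
    postprobs <- edgep(samplefit)   # posterior probabilities of single edges

    ## Compare the MAP graph with a thresholded consensus graph
    ## (the 0.5 cutoff is an arbitrary choice for illustration).
    consensus <- 1 * (postprobs > 0.5)
    compareDAGs(mapdag, consensus)

The same workflow applies with iterativeMCMC() or partitionMCMC() in place of orderMCMC(), and with the DBN-specific helpers listed above when learning dynamic Bayesian networks.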