BayesMallows: Bayesian Preference Learning with the Mallows Rank Model
Trace Plots from Metropolis-Hastings Algorithm
Assign Assessors to Clusters
Asymptotic Approximation of Partition Function
Set the burnin
See the burnin
Compute Consensus Ranking
Compute exact partition function
Expected value of metrics under a Mallows rank model
Compute Mixtures of Mallows Models
Estimate the Bayesian Mallows Model Sequentially
Estimate Partition Function
Preference Learning with the Mallows Rank Model
Frequency distribution of the ranking sequences
Compute Posterior Intervals
Distance between a set of rankings and a given rank sequence
Convert between ranking and ordering
Get Acceptance Ratios
Get cardinalities for each distance
Likelihood and log-likelihood evaluation for a Mallows mixture model
Get transitive closure
Heat plot of posterior probabilities
Plot Within-Cluster Sum of Distances
Plot Top-k Rankings with Pairwise Preferences
Plot Posterior Distributions
Plot SMC Posterior Distributions
Predict Top-k Rankings with Pairwise Preferences
Print Method for BayesMallows Objects
Sample from the Mallows distribution
Random Samples from the Mallows Rank Model
Sample from prior distribution
Specify options for computation
Set initial values of scale parameter and modal ranking
Set options for Bayesian Mallows model
Set prior parameters for Bayesian Mallows model
Set progress report options for MCMC algorithm
Set SMC compute options
Setup rank data
Update a Bayesian Mallows model with new users
An implementation of the Bayesian version of the Mallows rank model (Vitelli et al., Journal of Machine Learning Research, 2018 <https://jmlr.org/papers/v18/15-481.html>; Crispino et al., Annals of Applied Statistics, 2019 <doi:10.1214/18-AOAS1203>; Sorensen et al., R Journal, 2020 <doi:10.32614/RJ-2020-026>; Stein, PhD Thesis, 2023 <https://eprints.lancs.ac.uk/id/eprint/195759>). Both Metropolis-Hastings and sequential Monte Carlo algorithms are available for estimating the models. The models support the Cayley, footrule, Hamming, Kendall, Spearman, and Ulam distances. The rank data to be analyzed can take the form of complete rankings, top-k rankings, partially missing rankings, and consistent or inconsistent pairwise preferences. Several functions are provided for plotting and studying the posterior distributions of parameters. The package also provides functions for estimating the partition function (normalizing constant) of the Mallows rank model, both with the importance sampling algorithm of Vitelli et al. and with an asymptotic approximation based on the IPFP algorithm (Mukherjee, Annals of Statistics, 2016 <doi:10.1214/15-AOS1389>).
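A minimal sketch tying several of the functions listed above together, assuming the `setup_rank_data()`/`compute_mallows()` interface and the `potato_visual` example dataset bundled with the package; argument names and defaults may differ across package versions:

```r
# Load the package; potato_visual is a matrix of complete rankings
# shipped with BayesMallows (assumed available here).
library(BayesMallows)

# Set up the rank data object expected by the model-fitting functions.
dat <- setup_rank_data(rankings = potato_visual)

# Fit the Bayesian Mallows model with the Metropolis-Hastings algorithm.
fit <- compute_mallows(data = dat)

# Discard warm-up iterations before summarizing the posterior.
burnin(fit) <- 500

# Posterior summaries: plot() shows the scale parameter by default,
# and compute_consensus() gives a consensus ranking of the items.
plot(fit)
compute_consensus(fit)
```

The same fitted object can then be passed to the other posterior-summary functions in the index, such as `compute_posterior_intervals()` or the heat-plot method.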
Useful links