Shapley computes feature contributions for single predictions with the Shapley value, an approach from cooperative game theory. The feature values of an instance cooperate to achieve the prediction. The Shapley value fairly distributes the difference between the instance's prediction and the dataset's average prediction among the features.
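The sampling-based estimation this class uses can be sketched in base R. The following is a minimal illustration of the Strumbelj and Kononenko (2014) idea, not the iml implementation: for each feature, average the change in prediction when that feature's value from the instance of interest is added to a random coalition built on a random background row. The toy model `pred.fun` and the helper `shapley.estimate` are hypothetical names for this sketch.

```r
set.seed(42)
X <- data.frame(x1 = rnorm(100), x2 = rnorm(100), x3 = rnorm(100))
pred.fun <- function(d) 2 * d$x1 + d$x2   # toy linear model; x3 is irrelevant
x.interest <- X[1, ]

shapley.estimate <- function(pred.fun, X, x.interest, sample.size = 100) {
  p <- ncol(X)
  phi <- numeric(p)
  for (j in seq_len(p)) {
    contribs <- numeric(sample.size)
    for (m in seq_len(sample.size)) {
      z <- X[sample(nrow(X), 1), ]    # random background row
      perm <- sample(p)               # random feature order
      pos <- which(perm == j)
      keep <- perm[seq_len(pos)]      # features taken from x.interest (incl. j)
      x.with <- z
      x.with[keep] <- x.interest[keep]
      x.without <- x.with
      x.without[j] <- z[j]            # coalition without feature j
      contribs[m] <- pred.fun(x.with) - pred.fun(x.without)
    }
    phi[j] <- mean(contribs)          # average marginal contribution
  }
  setNames(phi, names(X))
}

shapley.estimate(pred.fun, X, x.interest)
```

Because the toy model ignores `x3`, its estimated contribution is exactly zero; the contributions of `x1` and `x2` approach beta_j * (x_j - mean(X_j)) as `sample.size` grows.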
library("iml")
library("rpart")

# First we fit a machine learning model on the Boston housing data
data("Boston", package = "MASS")
rf <- rpart(medv ~ ., data = Boston)
X <- Boston[-which(names(Boston) == "medv")]
mod <- Predictor$new(rf, data = X)

# Then we explain the first instance of the dataset with the Shapley method:
x.interest <- X[1, ]
shapley <- Shapley$new(mod, x.interest = x.interest)
shapley

# Look at the results in a table
shapley$results

# Or as a plot
plot(shapley)

# Explain another instance
shapley$explain(X[2, ])
plot(shapley)

## Not run:
# Shapley() also works with multiclass classification
rf <- rpart(Species ~ ., data = iris)
X <- iris[-which(names(iris) == "Species")]
mod <- Predictor$new(rf, data = X, type = "prob")

# Then we explain the first instance of the dataset with the Shapley() method:
shapley <- Shapley$new(mod, x.interest = X[1, ])
shapley$results

plot(shapley)

# You can also focus on one class
mod <- Predictor$new(rf, data = X, type = "prob", class = "setosa")
shapley <- Shapley$new(mod, x.interest = X[1, ])
shapley$results

plot(shapley)
## End(Not run)
References
Strumbelj, E., Kononenko, I. (2014). Explaining prediction models and individual predictions with feature contributions. Knowledge and Information Systems, 41(3), 647-665. https://doi.org/10.1007/s10115-013-0679-x
See Also
Shapley
A different way to explain predictions: LocalModel
Super class
iml::InterpretationMethod -> Shapley
Public fields
x.interest: data.frame
Single row with the instance to be explained.
y.hat.interest: numeric
Predicted value for the instance of interest.
y.hat.average: numeric(1)
Average predicted value for data `X`.
sample.size: numeric(1)
The number of times coalitions/marginals are sampled from data `X`. The higher the value, the more accurate the explanations become.