Introduction to model exploration, with code examples in R and Python.
Welcome to the “BASIC XAI with DALEX” series.
In this post, we present Shapley values, a model-agnostic method that can be used with any type of model.
Previous parts of this series are available:
- BASIC XAI with DALEX — Part 1: Introduction.
- BASIC XAI with DALEX — Part 2: Permutation-based variable importance.
- BASIC XAI with DALEX — Part 3: Partial Dependence Profile.
- BASIC XAI with DALEX — Part 4: Break Down method.
So, shall we start?
First — what do Shapley values deliver?
The Shapley value is a model-agnostic method, so we can use it for any type of model. A key benefit of Shapley values is their additive feature attribution property. It is a local explanation, meaning it explains the prediction for a single observation — think of a patient, a bank customer, or a telecommunication client. In the previous part of the BASIC XAI series, we introduced the Break Down method. The Shapley value is a generalization of it: a Break Down explanation uses one particular order of variables, while Shapley values consider all possible orders. If we have p features in our dataset, there are p! orders, and the final attribution is the average over these possible orders.
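The averaging over orders can be made concrete with a minimal, self-contained sketch (plain Python, not DALEX; the toy model, baseline, and observation below are made up for illustration). For each of the p! orders, we switch features one by one from a baseline to their observed values and record the change in prediction; the Shapley value of a feature is the average of its recorded changes.

```python
from itertools import permutations
from statistics import mean

# Toy "model": a hypothetical prediction function of three features.
def predict(x):
    return 10 * x["district"] + 2 * x["floor"] + 0.5 * x["surface"]

observation = {"district": 1, "floor": 7, "surface": 93}  # explained instance
baseline = {"district": 0, "floor": 0, "surface": 0}      # reference instance

features = list(observation)
contributions = {f: [] for f in features}

# Break Down uses one order; Shapley values average over all p! orders.
for order in permutations(features):
    x = dict(baseline)
    prev = predict(x)
    for f in order:
        x[f] = observation[f]  # switch feature f to its observed value
        contributions[f].append(predict(x) - prev)
        prev = predict(x)

shap_values = {f: mean(v) for f, v in contributions.items()}
print(shap_values)
# The contributions add up to predict(observation) - predict(baseline).
```

Because this toy model is additive, every order yields the same contribution for each feature; for a model with interactions, the contributions differ between orders, which is exactly why the averaging is needed.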
Second — Intuition of Shapley values
SHapley Additive exPlanations (SHAP) are based on Shapley values, developed by Lloyd Shapley in cooperative game theory.
The Shapley value method, like Break Down, decomposes a prediction into parts attributed to individual variables. The difference is that it averages the contribution of a given variable over all, or over a large number of, possible variable orders.
An important practical limitation of the general model-agnostic method is that, for large models, the calculation of Shapley values is time-consuming. In specific situations, however, they can be calculated quickly — for example, for additive models and for tree-based models.
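When p! is too large to enumerate, a common remedy — and what sampling-based Shapley estimators do — is to average over B randomly drawn orders instead of all of them. A self-contained sketch with another made-up model; this one contains an interaction, so contributions genuinely differ between orders and the averaging matters:

```python
import random
from statistics import mean

random.seed(0)  # reproducibility of the sampled orders

# Hypothetical model with an interaction between x1 and x2, so a
# variable's contribution depends on the order in which it enters.
def predict(x):
    return x["x1"] * x["x2"] + x["x3"]

observation = {"x1": 2, "x2": 3, "x3": 5}
baseline = {"x1": 0, "x2": 0, "x3": 0}

features = list(observation)
B = 200  # number of sampled orders (dalex exposes this as the B argument)
contributions = {f: [] for f in features}

for _ in range(B):
    order = features[:]
    random.shuffle(order)  # one random variable order
    x = dict(baseline)
    prev = predict(x)
    for f in order:
        x[f] = observation[f]
        contributions[f].append(predict(x) - prev)
        prev = predict(x)

estimate = {f: mean(v) for f, v in contributions.items()}
print(estimate)
# Thanks to the telescoping sum, the estimates always add up exactly to
# predict(observation) - predict(baseline), even with few sampled orders.
```

Here the exact Shapley values are x1 = 3, x2 = 3, x3 = 5 (the x1*x2 interaction of 6 is split equally between x1 and x2), and the sampled estimates approach them as B grows.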
Third — let’s get a model in R and Python
Let’s write some code. We are still working with the DALEX apartments data. To calculate Shapley values we use the predict_parts() function with type = ‘shap’. We need the explainer object and the observation for which we want to calculate the explanation.
Let’s now look at the plot for the apartment considered in the previous post. The biggest influence on the apartment’s price comes from the “Ochota” district, which is close to the city center. However, the price is negatively affected by the fact that the apartment is not in the “Srodmiescie” district, the city center itself. Moreover, the floor number of 7 and the area of 93 square meters also have negative contributions to the price.
If you are interested in other posts about explainable, fair, and responsible ML, follow #ResponsibleML on Medium.
In order to see more R related content visit https://www.r-bloggers.com