[forecast][LSTM+SHAP] Applying SHAP to a polynomial-equation case with an LSTM model
keywords: Python, LSTM, SHAP, weights
This article demonstrates the Python SHAP package's ability to explain an LSTM trained on a known model. You will learn how to apply the SHAP package and evaluate its accuracy.
Suppose a model with five input states, each with its own weight factor, whose weighted sum produces the result vector Y. The weight vector is 0.15, 0.4, 0.65, 0.85, and 0.95. Our task is to recover the weight-ratio values, i.e. the impact factor of each input: each weight divided by the sum of all weights. In this case, the ratios should be 0.05, 0.13, 0.22, 0.28, and 0.32.
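The ratio arithmetic above can be checked in a couple of lines of NumPy:

```python
import numpy as np

# The known weight vector from the example model
weights = np.array([0.15, 0.4, 0.65, 0.85, 0.95])

# Impact factor of each weight: its share of the total weight
ratios = weights / weights.sum()
print(ratios.round(2))  # → [0.05 0.13 0.22 0.28 0.32]
```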
The code is shown below.
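The original data-setup listing is not reproduced here; a minimal sketch of the setup, assuming the five input states X1~X5 are drawn uniformly from [0, 1] and the sample count is illustrative, could look like:

```python
import numpy as np

rng = np.random.default_rng(0)
weights = np.array([0.15, 0.4, 0.65, 0.85, 0.95])

# 1000 samples of the five input states X1..X5 (assumed uniform in [0, 1])
X = rng.random((1000, 5))
Y = X @ weights  # known weighted-sum target

# Keras LSTMs expect (samples, timesteps, features); one timestep here
X_seq = X.reshape(-1, 1, 5)
```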
The LSTM model is a plain vanilla stacked LSTM. After compiling, the loss stabilizes after four epochs.
Loss table comparison: training becomes stable after the second epoch.
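The article's model code is not shown here; a minimal Keras sketch of a vanilla LSTM could look like the following. The single LSTM layer with 100 units matches the article's LSTM_memory_unit = 100, but the optimizer, epoch count, and data generation are assumptions, and a batch size of 32 is used here for speed even though the article reports LSTM_batch = 1.

```python
import numpy as np
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Input, LSTM, Dense

weights = np.array([0.15, 0.4, 0.65, 0.85, 0.95])
rng = np.random.default_rng(1)
X = rng.random((1000, 5))
y = X @ weights
X_seq = X.reshape(-1, 1, 5)  # (samples, timesteps, features)

# Vanilla LSTM: one LSTM layer (100 memory units, per the article) + linear output
model = Sequential([
    Input(shape=(1, 5)),
    LSTM(100),
    Dense(1),
])
model.compile(optimizer="adam", loss="mse")

# 900 training samples / 100 prediction samples, as in the article;
# batch size 32 here (the article uses LSTM_batch = 1)
model.fit(X_seq[:900], y[:900], batch_size=32, epochs=5, verbose=0)
pred = model.predict(X_seq[900:], verbose=0)
```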
OK, we can see the LSTM model predicts correctly: the Real and Predict traces below are almost identical.
OK, let's try to recover the weight factors for the X1~X5 input states. The reference code is below.
In the result comparison, the SHAP explainer's output is very close to the weight-ratio values. The numbers of training samples, prediction samples, LSTM_batch, and LSTM_memory_unit are 900, 100, 1, and 100, respectively.
Conclusion:
In this article, you have seen the SHAP explainer's ability to recover the weight factors. The implementation code is not difficult and is very straightforward.