[forecast][LSTM+SHAP] Applying SHAP to a polynomial equation case with an LSTM algorithm

IJN-Kasumi 1939–1945
2 min read · Dec 11, 2019


Keywords: Python, LSTM, SHAP, weights

Somewhere in Taiwan. All rights reserved.

This article demonstrates the Python SHAP package's ability to explain an LSTM model trained on a known model. You will learn how to apply the SHAP package and evaluate its accuracy.

Suppose a given model with five input states, where each state has its own weight and the weighted states sum to an output vector Y. The weight vector is 0.15, 0.4, 0.65, 0.85, and 0.95. Our task is to recover the ratios of this weight vector, that is, the impact factor of each input. In this case, they should be 0.05, 0.13, 0.22, 0.28, and 0.32.
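The expected ratios above are simply each weight divided by the sum of all the weights. A quick check in Python (this helper snippet is mine, not from the article's screenshots):

```python
import numpy as np

# Weight vector given in the article.
weights = np.array([0.15, 0.4, 0.65, 0.85, 0.95])

# Each weight's impact factor is its share of the total weight.
ratios = weights / weights.sum()
print(np.round(ratios, 2))  # [0.05 0.13 0.22 0.28 0.32]
```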

The code is shown below.

create the model-matrix with X1~X5 and Y_out. All rights reserved.
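Since the original code appears only as a screenshot, here is a sketch of how such a model matrix could be generated. The column layout (X1~X5, Y_out), the weight vector, and the 900/100 train/predict split come from the article; the uniform random inputs and the seed are my assumptions.

```python
import numpy as np

# Weight vector from the article; the data generator itself is a sketch.
WEIGHTS = np.array([0.15, 0.4, 0.65, 0.85, 0.95])

rng = np.random.default_rng(0)
X = rng.random((1000, 5))   # five input states X1..X5
y = X @ WEIGHTS             # Y_out is the weighted sum of the states

# 900 training samples and 100 prediction samples, as in the article.
X_train, y_train = X[:900], y[:900]
X_test, y_test = X[900:], y[900:]
```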

The LSTM model is a normal vanilla stack. After compiling, the loss is stable after four epochs.

establish the LSTM model. All rights reserved.
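The model code is also a screenshot, so here is a minimal Keras sketch of such a vanilla LSTM. The 100 memory units come from the article; treating each sample as a single timestep with five features is my assumption.

```python
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import LSTM, Dense

# Vanilla LSTM: 100 memory units, input shape (timesteps=1, features=5).
model = Sequential([
    LSTM(100, input_shape=(1, 5)),
    Dense(1),  # single regression output Y
])
model.compile(optimizer="adam", loss="mse")
model.summary()

# The LSTM layer expects 3-D input: (samples, timesteps, features),
# so a (900, 5) training matrix would be reshaped to (900, 1, 5).
```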

Loss table comparison: the training becomes stable after the second epoch.

LSTM loss table. All rights reserved.

We can see the LSTM model does its job of predicting correctly; the Real and Predict traces below are almost identical.

Real and Predict comparison chart. All rights reserved.

Now let's try to find the weight factors for the states X1~X5. The reference code is below.

In the result comparison, the SHAP explainer's output is very close to the weight vector ratio values. The numbers of training samples, prediction samples, LSTM_batch, and LSTM_memory_unit are 900, 100, 1, and 100, respectively.

SHAP explainer results versus the given weight factors. All rights reserved.

Conclusion:

In this article, you have seen the SHAP explainer's ability to find the weight factors. The implementation code is not difficult and is very straightforward.

Reference:

https://github.com/slundberg/shap
