[forecast][MLP] Comparing Multilayer Perceptron (MLP), LSTM, and XGBoost models on a random-array prediction case, with SHAP explainer results

IJN-Kasumi 1939–1945
2 min read · Dec 28, 2019


A church in a mountain area of Taiwan (photo by author, all rights reserved)

Scope: This article demonstrates the implementation of a multilayer perceptron (MLP) model and its SHAP explainer results.

The background theory of the MLP is not the main scope of this article; the implementation is sketched below. The main difference from the LSTM version is that no tensor-format data conversion is needed; the model creation routine is otherwise very similar, and the SHAP explainer call is also simpler.

MLP implementation with SHAP explainer
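The embedded gist is not reproduced here, so the following is a minimal sketch of an equivalent implementation, assuming a Keras Sequential network, a shap.KernelExplainer, and hypothetical preset weights `w`; layer sizes and sample counts are illustrative, not the original values.

```python
# Minimal sketch of the MLP with a SHAP explainer (assumed reconstruction;
# not the author's exact gist). Layer sizes and the preset weights are
# illustrative.
import numpy as np
import shap
from tensorflow import keras

rng = np.random.default_rng(0)
n_samples, n_features = 1000, 5
w = np.array([0.1, 0.2, 0.3, 0.2, 0.2])   # hypothetical preset weights
X = rng.random((n_samples, n_features))
y = X @ w                                  # random-association target

# Model creation: a plain dense stack. Unlike the LSTM, the 2-D input
# needs no reshaping into a [samples, timesteps, features] tensor.
model = keras.Sequential([
    keras.layers.Dense(32, activation="relu", input_shape=(n_features,)),
    keras.layers.Dense(32, activation="relu"),
    keras.layers.Dense(1),
])
model.compile(optimizer="adam", loss="mse")
model.fit(X, y, epochs=50, batch_size=32, verbose=0)

# SHAP explainer: the model-agnostic KernelExplainer takes the 2-D arrays
# directly, with a small background sample as the baseline.
explainer = shap.KernelExplainer(
    lambda a: model.predict(a, verbose=0).ravel(), X[:50])
shap_values = explainer.shap_values(X[:100])
shap.summary_plot(shap_values, X[:100])
```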

In the previous study, the LSTM and XGBoost methods were discussed. Adding the MLP prediction results, there is no significant difference among the three. Please note that this is a random-association training model, without any connecting equations between the features.

Comparison of the predicted results across the three methods

The SHAP explainer results below indicate that the MLP is the most accurate of the three methods: its SHAP values are the closest to the preset weight values.
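As a rough check of that claim, the normalized mean absolute SHAP value per feature can be compared against the normalized preset weights (continuing the sketch above; `shap_values` and `w` are the hypothetical names defined there):

```python
# Rough check (continuing the sketch above): for the linear target y = X @ w,
# the normalized mean |SHAP| per feature should track the preset weights w.
mean_abs = np.abs(shap_values).mean(axis=0)
print("SHAP share:  ", mean_abs / mean_abs.sum())
print("weight share:", w / w.sum())
```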

The computation-time comparison shows that the MLP method is faster than the other two.

Computation time (seconds)
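One simple way to produce such a timing, shown here for the MLP and reusing the `model`, `X`, and `y` from the sketch above (illustrative, not the author's original benchmark code):

```python
# Illustrative fit + predict timing with time.perf_counter, reusing the
# model and data from the sketch above; not the author's benchmark code.
import time

start = time.perf_counter()
model.fit(X, y, epochs=50, batch_size=32, verbose=0)
model.predict(X, verbose=0)
print(f"MLP fit + predict: {time.perf_counter() - start:.2f} s")
```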

The node-memory comparison indicates that the MLP method uses less memory than the LSTM.

MLP model summary
LSTM model summary
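The summaries above are Keras `model.summary()` tables; the parameter counts behind the memory comparison can also be read programmatically (`lstm_model` is a hypothetical handle for the LSTM from the earlier study):

```python
# The parameter counts behind the memory comparison can be read directly;
# `lstm_model` is a hypothetical handle for the LSTM from the earlier study.
model.summary()                                  # prints the MLP layer table
print("MLP parameters:", model.count_params())
# print("LSTM parameters:", lstm_model.count_params())
```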

END.
