[forecast][XGBoost] Prediction method comparison between LSTM and XGBoost

IJN-Kasumi 1939–1945
2 min read · Dec 17, 2019


Somewhere in North-East Taiwan

Scope: This article provides a quick comparison between LSTM and XGBoost on the same prediction task, including extraction of the weight values. You will learn about: a. the XGBoost method setup, b. prediction performance, c. weight-value comparison, and d. computation time.

Python code for the LSTM and SHAP methods: please refer to the previous article.

Python code for the XGBoost method:

Set up the XGBoost regression and feature-importance extraction
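A minimal sketch of this setup is shown below. The synthetic data, the preset weights (0.4, 0.3, 0.2, 0.1), and the hyper-parameter values are placeholders, not the exact settings used in this experiment.

```python
# Minimal XGBoost regression + importance sketch (placeholder data and settings).
import numpy as np
import xgboost as xgb

rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 4))                      # stand-in feature matrix
y = X @ np.array([0.4, 0.3, 0.2, 0.1])              # assumed preset weights

model = xgb.XGBRegressor(
    n_estimators=200,
    max_depth=4,
    learning_rate=0.1,
    objective="reg:squarederror",
)
model.fit(X, y)

y_pred = model.predict(X)

# Feature-importance scores from the fitted booster
# (the default importance type depends on the XGBoost version).
importance = model.feature_importances_
print(importance)
```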

The prediction comparison chart below shows the real values together with the LSTM and XGBoost predictions. In the general view, there is no significant difference between the two methods.

Prediction comparison of the real, LSTM-predicted, and XGBoost-predicted results

The weight-value extraction results show that the LSTM method gives a better result than the XGBoost method.

Weight-value comparison. The XGBoost result (green) does not match the preset weight values.
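One way to make this comparison concrete is to normalize the extracted importances and set them against the preset weights, as sketched below. This continues from the XGBoost sketch above (`model`), and `preset_weights` is an assumed placeholder rather than the actual values used here.

```python
# Compare normalized XGBoost importances with assumed preset weights.
import numpy as np

preset_weights = np.array([0.4, 0.3, 0.2, 0.1])      # assumed ground-truth weights
xgb_importance = model.feature_importances_
xgb_importance = xgb_importance / xgb_importance.sum()  # normalize to sum to 1

# The LSTM+SHAP values from the previous article can be normalized the same
# way (e.g. mean |SHAP| per feature) before plotting all three side by side.
print("preset :", preset_weights)
print("xgboost:", xgb_importance)
```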

Computation time:

LSTM (89.215) + SHAP (3.739) = 92.954 (seconds)

XGBoost = 2.176 (seconds)
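These figures are machine-dependent. A simple way to reproduce such a timing is sketched below, reusing `model`, `X`, and `y` from the XGBoost sketch above.

```python
# Rough wall-clock timing of the XGBoost fit; exact values vary by machine.
import time

start = time.perf_counter()
model.fit(X, y)
print(f"XGBoost fit: {time.perf_counter() - start:.3f} s")
```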

Conclusion:

With correctly tuned parameters, XGBoost is faster than the LSTM method at equal prediction precision. The drawback is that its feature importance is not as accurate as the LSTM+SHAP combination.

Remark:

Any comments on how to improve the feature-importance accuracy are welcome.
