From the gap between your training and test set loss, and the last plot, it seems to me your model…
Claus Herther @ calogica.com

Overfitting is when test error is high and training error is low. Mine is the opposite, and the two are not far apart anyway (0.00062 vs. 0.00040). A training loss slightly above the test loss is possible because of the dropout layer, which is only active during training. So this is not overfitting.
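
A minimal sketch of that point, assuming a Keras-style regression model (the layer sizes and data here are hypothetical, not from the original model): the loss logged during `fit()` is computed with dropout active, while `evaluate()` disables dropout, so the logged training loss can come out higher than the test loss even without overfitting.

```python
import numpy as np
from tensorflow import keras
from tensorflow.keras import layers

# Toy regression data (hypothetical, just to illustrate the effect)
rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 10))
y = X @ rng.normal(size=(10, 1)) + 0.01 * rng.normal(size=(1000, 1))
X_train, X_test = X[:800], X[800:]
y_train, y_test = y[:800], y[800:]

model = keras.Sequential([
    layers.Dense(64, activation="relu", input_shape=(10,)),
    layers.Dropout(0.5),   # dropout inflates the loss reported during training
    layers.Dense(1),
])
model.compile(optimizer="adam", loss="mse")
history = model.fit(X_train, y_train, epochs=20, verbose=0)

# Loss logged during training (dropout ON) vs. the same data re-evaluated (dropout OFF)
print("last logged training loss (dropout active):", history.history["loss"][-1])
print("train loss re-evaluated (dropout off):   ", model.evaluate(X_train, y_train, verbose=0))
print("test loss (dropout off):                 ", model.evaluate(X_test, y_test, verbose=0))
```

Comparing the re-evaluated training loss (dropout off) against the test loss is the fairer check for overfitting; the gap between the logged training loss and the test loss partly reflects dropout, not generalization error.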