Our father who art in n-dimensions
hallowed be thy backprop,
thy loss be minimized,
thy gradients unvanished,
on earth as it is in Euclidean space.
Give us this day our daily hyperparameters,
and forgive us our large learning rates,
as we forgive those whose parameters diverge,
and lead us not into discrete optimization,
but deliver us from local optima.
For thine are the dimensions,
and the GPUs, and the glory,
forever and ever. Dropout.