What exactly is an optimizer, and why do we need one? An optimizer updates the parameters of your neural network, such as its weights, using quantities like the learning rate, in order to minimize the loss (cost) during training via backpropagation.

Gradient Descent

Gradient descent is an iterative optimization algorithm used in machine learning to reduce the…
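To make the idea concrete, here is a minimal sketch of gradient descent on a simple one-dimensional function. The function, its gradient, and all names here are illustrative choices, not part of the original text: we minimize f(x) = (x - 3)^2, whose gradient is f'(x) = 2(x - 3).

```python
def gradient_descent(grad, x0, learning_rate=0.1, steps=100):
    """Repeatedly step against the gradient to reduce the loss."""
    x = x0
    for _ in range(steps):
        x -= learning_rate * grad(x)  # update rule: x <- x - lr * f'(x)
    return x

# Gradient of the hypothetical loss f(x) = (x - 3)^2 is 2 * (x - 3).
minimum = gradient_descent(lambda x: 2 * (x - 3), x0=0.0)
print(minimum)  # converges toward the true minimum at x = 3
```

Each iteration moves the parameter a small step in the direction that decreases the loss; the learning rate controls the step size, which is why it appears alongside the weights in the optimizer's update.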