Batch Normalization — Normalizes the values in a neural network layer to values between 0 and 1. This helps train the neural network faster.

The Deep Learning(.ai) Dictionary
Jan Zawadzki
Max Schumacher
Aug 31, 2018 · 1 min read

No, batch normalization does not map values into the range 0 to 1. It normalizes a layer's activations to have zero mean and unit variance, computed per feature over each mini-batch.
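A minimal NumPy sketch of the point being corrected (function name, shapes, and the epsilon term are illustrative assumptions, and the learnable scale/shift used in practice is omitted):

```python
import numpy as np

def batch_norm(x, eps=1e-5):
    """Normalize activations x of shape (batch_size, features)
    to zero mean and unit variance per feature."""
    mean = x.mean(axis=0)            # per-feature mean over the batch
    var = x.var(axis=0)              # per-feature variance over the batch
    return (x - mean) / np.sqrt(var + eps)

rng = np.random.default_rng(0)
x = rng.normal(loc=3.0, scale=2.0, size=(64, 4))
y = batch_norm(x)

# Per-feature mean is ~0 and variance is ~1; note the normalized
# values themselves are NOT confined to the interval [0, 1].
print(y.mean(axis=0))
print(y.var(axis=0))
print(y.min(), y.max())
```

Running this shows outputs well outside [0, 1] (roughly between -3 and 3 for Gaussian inputs), which is exactly why "between 0 and 1" misstates what the technique does.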
