FAU Lecture Notes in Deep Learning

Activations, Convolutions, and Pooling — Part 2

Modern Activations

Andreas Maier
Towards Data Science
9 min read · Jun 28, 2020

Deep Learning at FAU. Image under CC BY 4.0 from the Deep Learning Lecture

These are the lecture notes for FAU’s YouTube Lecture “Deep Learning”. This is a full transcript of the lecture video together with the matching slides. We hope you enjoy these notes as much as the videos. Of course, this transcript was largely created automatically with deep learning techniques, and only minor manual corrections were made. If you spot mistakes, please let us know!

Navigation

Previous Lecture / Watch this Video / Top Level / Next Lecture

The rectified linear unit (ReLU) was one of the first modern activations. Image under CC BY 4.0 from the Deep Learning Lecture.

Welcome back to Part 2 of activation functions and convolutional neural networks! Now, we want to continue talking about activation functions, in particular the newer ones used in deep learning. One of the most famous examples is the rectified linear unit (ReLU). We have already encountered the ReLU earlier: the idea is simply to set the negative half-space to zero and to keep the positive half-space as x. This results in a derivative of one over the entire positive half-space and zero everywhere else. This is very nice because it leads to good generalization. Due…
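As a quick illustration of the description above, here is a minimal NumPy sketch of the ReLU and its derivative; the function names are chosen only for this example and are not part of the lecture code.

```python
import numpy as np

def relu(x):
    # Set the negative half-space to zero, keep the positive half-space as x.
    return np.maximum(0.0, x)

def relu_derivative(x):
    # Derivative is 1 on the entire positive half-space and 0 everywhere else.
    return (x > 0).astype(x.dtype)

x = np.array([-2.0, -0.5, 0.0, 1.0, 3.0])
print(relu(x))             # [0.  0.  0.  1.  3. ]
print(relu_derivative(x))  # [0. 0. 0. 1. 1.]
```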

