FAU Lecture Notes in Pattern Recognition

In the Beginning was the Perceptron, through them all Nets were made

The Rosenblatt Perceptron

Andreas Maier · Published in CodeX · 15 min read · Apr 7, 2021


Image under CC BY 4.0 from the Pattern Recognition Lecture

These are the lecture notes for FAU’s YouTube Lecture “Pattern Recognition”. This is a full transcript of the lecture video and the matching slides. The sources for the slides are available here. We hope you enjoy this as much as the videos. This transcript was almost entirely machine-generated using AutoBlog, and only minor manual modifications were performed. If you spot mistakes, please let us know!

Navigation

Previous Chapter / Watch this Video / Next Chapter / Top Level

Welcome everybody to pattern recognition! Today we want to look a bit into the early stages of neural networks, and in particular into the Rosenblatt Perceptron. We will look into its optimization, the actual convergence proof, and its convergence behavior.

Image under CC BY 4.0 from the Pattern Recognition Lecture

So let’s start looking into the Rosenblatt Perceptron. It was developed as early as 1957, and the main idea behind the perceptron is that we want to compute a linear decision boundary…
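To make this idea concrete, here is a minimal sketch of a perceptron that classifies points by the sign of a linear decision function and is trained with the classical perceptron update rule. This is not the lecture’s code; the function names, the NumPy dependency, and the toy data are assumptions made purely for illustration.

```python
# Minimal perceptron sketch (illustrative only, not the lecture's implementation).
import numpy as np

def predict(x, weights, bias):
    """Classify a sample by the sign of the linear decision function w^T x + b."""
    return 1 if np.dot(weights, x) + bias >= 0 else -1

def train(samples, labels, learning_rate=1.0, epochs=100):
    """Classical perceptron rule: update weights only on misclassified samples."""
    weights = np.zeros(samples.shape[1])
    bias = 0.0
    for _ in range(epochs):
        errors = 0
        for x, y in zip(samples, labels):
            if y * (np.dot(weights, x) + bias) <= 0:  # misclassified sample
                weights += learning_rate * y * x
                bias += learning_rate * y
                errors += 1
        if errors == 0:  # all samples on the correct side: stop early
            break
    return weights, bias

# Toy example with two linearly separable classes
X = np.array([[2.0, 1.0], [1.5, 2.0], [-1.0, -1.5], [-2.0, -0.5]])
y = np.array([1, 1, -1, -1])
w, b = train(X, y)
print(w, b, [predict(x, w, b) for x in X])
```

If the two classes are linearly separable, as in this toy example, the update loop stops once every sample lies on the correct side of the learned boundary; the convergence proof discussed later in the lecture makes this guarantee precise.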


Andreas Maier, writer for CodeX

I do research in Machine Learning. My positions include being Prof @FAU_Germany, President @DataDonors, and Board Member for Science & Technology @TimeMachineEU