FAU Lecture Notes in Pattern Recognition

How to train your SVM

Support Vector Machines — Optimization

Andreas Maier
Published in CodeX · 12 min read · Apr 20, 2021

Image under CC BY 4.0 from the Pattern Recognition Lecture

These are the lecture notes for FAU’s YouTube Lecture “Pattern Recognition”. This is a full transcript of the lecture video and matching slides. The sources for the slides are available here. We hope you enjoy these as much as the videos. This transcript was almost entirely machine-generated using AutoBlog, and only minor manual modifications were performed. If you spot mistakes, please let us know!

Navigation

Previous Chapter / Watch this Video / Next Chapter / Top Level

Welcome back to pattern recognition. Today we want to talk about support vector machines. But we also want to recall what we learned about duality and convex optimization and apply it to our support vector machines here. So, let’s see what more we can learn about support vector machines.

Image under CC BY 4.0 from the Pattern Recognition Lecture

This is the second part on support vector machines, and we’re back to the hard margin problem. Here we see that the SVM can be formulated as the minimization of the squared norm of our α vector. You remember α is essentially the normal vector of our…
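For reference, the hard-margin primal problem referred to here can be sketched as follows. The notation uses α for the normal vector as in the lecture; the bias term, written α₀ below, is an assumption based on common notation and may be denoted differently on the slides:

```latex
% Hard-margin SVM primal problem (sketch; \alpha_0 is the assumed bias term)
\begin{aligned}
\min_{\boldsymbol{\alpha},\,\alpha_0} \quad & \tfrac{1}{2}\,\lVert \boldsymbol{\alpha} \rVert_2^2 \\
\text{subject to} \quad & y_i \left( \boldsymbol{\alpha}^\top \boldsymbol{x}_i + \alpha_0 \right) \ge 1, \qquad i = 1, \dots, m
\end{aligned}
```

Minimizing the squared norm of α is equivalent to maximizing the margin 2/‖α‖ between the two classes, which is what makes this a convex optimization problem amenable to the duality tools from the previous chapter.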


Andreas Maier

I do research in Machine Learning. My positions include being Prof @FAU_Germany, President @DataDonors, and Board Member for Science & Technology @TimeMachineEU