Statistical Learning in Julia. Part 3 — Support Vector Machine

10 min read · Jan 7, 2023
Photo by Markus Winkler on Unsplash

Here we are with part 3! Take a seat, open your IDE, gather your cognitive resources and code along. Today we are dealing again with a classification problem, this time using a powerful algorithm: the Support Vector Machine. Moreover, this algorithm gives us the opportunity to introduce the kernel trick.

Introduction

To be precise, there are three different classifiers people refer to with the term Support Vector Machine (SVM), each built on top of the previous one: the Maximal Margin Classifier, the Support Vector Classifier and, finally, the Support Vector Machine itself.

The Maximal Margin Classifier

This algorithm tries to find the best hyperplane that separates our classes.

We can define a hyperplane in a p-dimensional space as a "flat affine subspace of dimension p − 1" [1]. For instance, in a two-dimensional space, a hyperplane is defined by the equation

β₀ + β₁X₁ + β₂X₂ = 0

As you can see, the hyperplane in a two-dimensional space is just a straight line. We can intuitively understand that…
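To make the idea concrete, here is a minimal sketch in Julia: evaluating the left-hand side of the hyperplane equation and taking its sign tells us which side of the line a point falls on, which is exactly the decision rule a maximal margin classifier uses. The coefficients below are arbitrary choices for illustration, not values from the article.

```julia
# Illustrative 2-D hyperplane: β0 + β1*X1 + β2*X2 = 0
# (coefficients are an arbitrary, assumed example)
β = (1.0, -2.0, 3.0)                      # (β0, β1, β2)

# Left-hand side of the hyperplane equation
f(x1, x2) = β[1] + β[2] * x1 + β[3] * x2

# sign(f) > 0 on one side of the line, < 0 on the other, 0 exactly on it
side(x1, x2) = sign(f(x1, x2))

println(side(3.0, 2.0))   # f = 1 - 6 + 6 = 1  → one side
println(side(2.0, -1.0))  # f = 1 - 4 - 3 = -6 → the other side
println(side(2.0, 1.0))   # f = 1 - 4 + 3 = 0  → on the hyperplane
```

Classifying a point therefore reduces to checking the sign of a linear expression; the maximal margin classifier's job is to pick the β's that separate the two classes with the widest possible margin.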

Written by Nicola Corbellini

I'm a PhD student in Computer Science and a Machine Learning enthusiast who believes AI can have a positive impact on human behavior understanding and social good.
