Various Types of Support Vector Machines in Machine Learning

In machine learning, there are many models to choose from. In this article, we give a brief introduction to several types of support vector machines, or SVMs for short.

The Original SVM

The original model was designed to distinguish two classes of data; in other words, it was a binary classifier. Its goal was to find the line or hyperplane with the biggest margin to the data of each class, so we can say it was trying to maximize the distance from the separating hyperplane to the nearest data points of both classes.
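As a minimal sketch of this idea, scikit-learn's SVC with a linear kernel finds exactly such a maximum-margin hyperplane; the toy data below is made up for illustration:

```python
import numpy as np
from sklearn.svm import SVC

# Two linearly separable toy classes (made-up data for illustration)
X = np.array([[0, 0], [0, 1], [1, 0],
              [3, 3], [3, 4], [4, 3]], dtype=float)
y = np.array([-1, -1, -1, 1, 1, 1])

# A very large C approximates a hard-margin SVM: no points may fall
# inside the margin, so the margin width itself is maximized.
clf = SVC(kernel="linear", C=1e6).fit(X, y)

w, b = clf.coef_[0], clf.intercept_[0]  # separating hyperplane w.x + b = 0
margin = 2.0 / np.linalg.norm(w)        # width of the maximized margin
```

The margin width `2/||w||` is what the optimization maximizes: the two classes sit on opposite sides of the strip of that width centered on the hyperplane.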

Original SVM with Support for Multiple Classes

The problem with the original model was that it could only classify two classes; in other words, it was a binary classifier. There are multiple ways to extend it into a multi-class classifier.
One way is to take the original binary classifier, apply it to every pair of classes, and finally hold a vote among the results of these pairwise classifications. This method is often called one-vs-one.
Another method, called one-vs-rest, treats the multi-class problem as a series of binary classifications: at each step it picks one class and treats all the remaining classes as a single class. Again, at the end of the process there is a voting step to find the most suitable class for each data point.
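Both strategies are available off the shelf. As a sketch using scikit-learn (the blob dataset here is synthetic), OneVsOneClassifier and OneVsRestClassifier wrap the binary SVC:

```python
from sklearn.datasets import make_blobs
from sklearn.multiclass import OneVsOneClassifier, OneVsRestClassifier
from sklearn.svm import SVC

# Three synthetic classes of 2-D points
X, y = make_blobs(n_samples=150, centers=3, random_state=0)

# one-vs-one: trains one binary SVC per pair of classes (3 pairs here),
# then lets the pairwise winners vote for the final label
ovo = OneVsOneClassifier(SVC(kernel="linear")).fit(X, y)

# one-vs-rest: trains one binary SVC per class (class k vs. everything
# else), then picks the class whose machine is most confident
ovr = OneVsRestClassifier(SVC(kernel="linear")).fit(X, y)

ovo_acc = ovo.score(X, y)
ovr_acc = ovr.score(X, y)
```

One-vs-one trains K(K-1)/2 small machines while one-vs-rest trains only K larger ones, which is the usual trade-off between the two.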

Twin Support Vector Machine

Another type of SVM is the twin support vector machine, or TWSVM [1]. This type of SVM is again a binary classifier, but it aims to find two hyperplanes to distinguish the data. Using two hyperplanes changes things: we are no longer looking for the biggest margin. The goal here is to find two non-parallel hyperplanes, each fitting one class as closely as possible, meaning each hyperplane has the closest relationship with its class (think of a regression line for each class). At prediction time, each test point is assigned to the class of the hyperplane it lies closest to.
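To make the idea concrete, here is a minimal NumPy sketch of the least-squares variant of the twin idea, which replaces TWSVM's two quadratic programs with two small linear systems and therefore has a closed form. The toy data and the regularization values c1, c2 are assumptions for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)
# Two made-up 2-D classes
A = rng.normal(loc=[0, 0], scale=0.3, size=(50, 2))  # class +1
B = rng.normal(loc=[2, 2], scale=0.3, size=(50, 2))  # class -1

def twin_planes(A, B, c1=1.0, c2=1.0):
    """Fit two non-parallel hyperplanes, one hugging each class."""
    H = np.hstack([A, np.ones((len(A), 1))])  # [A  e], bias absorbed
    G = np.hstack([B, np.ones((len(B), 1))])
    # Plane 1 stays close to class A while pushing class B away
    z1 = -np.linalg.solve(H.T @ H / c1 + G.T @ G, G.T @ np.ones(len(B)))
    # Plane 2 stays close to class B while pushing class A away
    z2 = np.linalg.solve(G.T @ G / c2 + H.T @ H, H.T @ np.ones(len(A)))
    return z1, z2  # each is [w1, w2, b]

def predict(X, z1, z2):
    """Assign each point to the class whose hyperplane is closer."""
    Xa = np.hstack([X, np.ones((len(X), 1))])
    d1 = np.abs(Xa @ z1) / np.linalg.norm(z1[:-1])
    d2 = np.abs(Xa @ z2) / np.linalg.norm(z2[:-1])
    return np.where(d1 <= d2, 1, -1)

z1, z2 = twin_planes(A, B)
y_pred = predict(np.vstack([A, B]), z1, z2)
y_true = np.array([1] * len(A) + [-1] * len(B))
acc = (y_pred == y_true).mean()
```

The original TWSVM solves two quadratic programs with inequality constraints instead; the least-squares form above trades that exactness for two cheap matrix solves.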


K-SVCR

This type of support vector machine is suited to classification with more than two classes. It is a mixed classification-and-regression model that uses a method called one-vs-one-vs-rest [2]. To get a better sense of how this model works: for each pair of classes it trains a machine that maximizes the margin between those two classes, while the remaining classes are also taken into account during that pairwise training. This type of SVM is closer to the original SVM than TWSVM is, and its computational complexity is higher than the original SVM's, because the remaining classes add to the work of each pairwise classification.
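As a rough, hedged sketch of the one-vs-one-vs-rest decision scheme (the data is synthetic, and an ordinary SVC stands in for the actual K-SVCR optimization, which additionally constrains the "rest" classes to lie near the separating hyperplane):

```python
from itertools import combinations

import numpy as np
from sklearn.datasets import make_blobs
from sklearn.svm import SVC

# Three synthetic classes, labeled 0..2
X, y = make_blobs(n_samples=150, centers=3, random_state=0)
classes = np.unique(y)

# One machine per pair (i, j): class i -> +1, class j -> -1,
# every remaining class -> 0 ("rest")
machines = []
for i, j in combinations(classes, 2):
    t = np.where(y == i, 1, np.where(y == j, -1, 0))
    clf = SVC(kernel="linear").fit(X, t)  # stand-in for the K-SVCR solver
    machines.append((i, j, clf))

def predict(X_new):
    votes = np.zeros((len(X_new), len(classes)))
    for i, j, clf in machines:
        out = clf.predict(X_new)
        votes[out == 1, i] += 1    # +1 output votes for class i
        votes[out == -1, j] += 1   # -1 output votes for class j
        # a 0 ("rest") output casts no vote for this pair
    return classes[np.argmax(votes, axis=1)]

acc = (predict(X) == y).mean()
```

The key difference from plain one-vs-one is that every training point participates in every pairwise machine, via the third "rest" output, rather than being ignored when its class is not one of the pair.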


Least Squares Twin Multi-Class Support Vector Machine

The last type of SVM we are going to introduce is the least squares twin multi-class support vector machine [3]. This type of SVM is a mixture of the two SVMs introduced above: to find the optimal solution it builds two hyperplanes and uses the one-vs-one-vs-rest method. It was proposed by Márcio Dias de Lima, Nattane Costa, and Rommel Melgaço Barbosa; to learn more, read their article "Improvements on least squares twin multi-class classification support vector machine".


[1] Twin Support Vector Machines for Pattern Classification

[2] K-SVCR: A Support Vector Machine for Multi-class Classification

[3] Improvements on Least Squares Twin Multi-class Classification Support Vector Machine


