Machine Learning Basics with Examples — Part 2: Supervised Learning

Canburak Tümer
3 min read · Aug 26, 2018


Supervised Learning


Supervised learning is one of the sub-disciplines of machine learning. In supervised learning we search for an optimal function (a.k.a. a model) that maps input features to an output. To find that function, we have a set of data containing both the features and the outputs. By learning the patterns in this data, we obtain the mapping function.

Supervised learning branches into two sub-fields depending on the type of output. If we are predicting a group, category, or class, it is called classification; if we are predicting a value (usually continuous), it is called regression.

Let’s have a quick look at both of them.

Classification

[Figure: a classification example with two classes, apples and cupcakes]

As described above, classification tries to find the category or class of the input data. Beforehand, it needs to see some sample data for each class so it can learn how to differentiate them: which properties make an item belong to a class, and so on.

For example, looking at the image above, we can tell there are two classes: apples and cupcakes. We have a set of training data for each. Our algorithm learns the characteristics of both classes and can then predict the class of a new, unseen object.

So that’s what classification is and how it works in a couple of sentences; you can find more information in my classification post.

Regression

[Figure: a scatter plot showing a roughly linear relationship between X and Y]

In regression, the aim is to find a function that gives us a value for our input. Usually the output is a continuous number. Regression algorithms try to fit a function to the given data points; once the best-fitting function is found, it is used to calculate output values from input features.

For example, we can easily see a roughly linear correlation between the X and Y axes on the graph, so we can fit a function of the form Y = aX + b. After searching for the best a and b values, we will be able to predict outputs for new incoming data.

That’s how regression works in a couple of sentences; we will dive deeper in the related post.

Roadmap

My blog post series will cover the topics below, and this roadmap will be updated as each post is published.

  • Introduction
  • Supervised Learning (this post)
  • Classification
  • Decision Trees
  • Random Forests
  • SVM
  • Naive Bayes
  • Regression
  • Unsupervised Learning
  • Clustering
  • Feature Selection and PCA
  • Send Models to Production

CTA

Thanks for reading. If you have any questions or comments, please do not hesitate to leave a comment on the post.

If you liked the post and found it useful, please share it or give some claps. Thank you!
