How K-NN is different from K-Means

Kelvin Jose
Published in Analytics Vidhya · Feb 5, 2020

KNN, i.e. K-Nearest Neighbors, is a supervised classification algorithm used to classify datapoints into different categories, say category-alpha and category-beta. If you don’t understand what is meant by a supervised classification algorithm, or if you want a quick brush-up, you had better look at this and come back. Coming back to our topic: we have a number of datapoints, i.e. labeled data, i.e. a train set, which the algorithm uses to understand the underlying dynamics and hidden feature patterns. Once that is done, we show the model unseen data, i.e. a test set, to see how it responds and performs in an approximately real-world scenario.

Let’s look at the steps to do K-NN:

  1. Get the data labeled and split it into train and test sets.
  2. Select K, a hyperparameter that refers to the number of closest neighbors we consider while doing the majority voting of target labels.
  3. Pick a similarity metric and an evaluation metric.
  4. Run K-NN a few times, changing K and checking the evaluation measure.
  5. In each iteration, the K nearest neighbors vote; the majority vote wins and becomes the final prediction.
  6. Optimize K by picking the one with the best evaluation measure.
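The voting step above can be sketched in plain Python. The toy category-alpha/category-beta dataset and the function names here are made up for illustration, and Euclidean distance is just one possible similarity metric:

```python
import math
from collections import Counter

def euclidean(a, b):
    # straight-line distance between two feature vectors
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def knn_predict(train_X, train_y, query, k=3):
    # rank all labeled training points by distance to the query point
    neighbors = sorted(zip(train_X, train_y), key=lambda p: euclidean(p[0], query))
    # majority vote among the labels of the k closest neighbors
    votes = Counter(label for _, label in neighbors[:k])
    return votes.most_common(1)[0][0]

# toy labeled dataset: two well-separated categories
train_X = [(1, 1), (1, 2), (2, 1), (8, 8), (8, 9), (9, 8)]
train_y = ["alpha", "alpha", "alpha", "beta", "beta", "beta"]

print(knn_predict(train_X, train_y, (2, 2), k=3))  # alpha
print(knn_predict(train_X, train_y, (9, 9), k=3))  # beta
```

To pick K in practice, you would loop this over a held-out test set for several values of K and keep the one with the best evaluation measure.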

K-Means is an unsupervised algorithm used to cluster similar datapoints together. Unlike supervised learning mechanisms, it uses unlabeled data and relies only on the independent features of the underlying data. K-Means clustering is a method of vector quantization, originally from signal processing, that is popular for cluster analysis in data mining. It aims to partition n observations into k clusters in which each observation belongs to the cluster with the nearest mean.

Look at the pseudocode given below:

  1. Choose K centroids arbitrarily.
  2. Assign each datapoint to the closest centroid.
  3. Move each centroid to the mean of its cluster.
  4. Repeat steps 2 and 3 until the centroids don’t change much.
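The pseudocode above can be turned into a short, self-contained sketch. The toy dataset is made up for illustration, and initializing centroids by sampling random data points is just one common choice:

```python
import random

def kmeans(points, k, iters=100, seed=0):
    rng = random.Random(seed)
    # step 1: choose K centroids arbitrarily (here: K random data points)
    centroids = rng.sample(points, k)
    for _ in range(iters):
        # step 2: assign each datapoint to the closest centroid
        clusters = [[] for _ in range(k)]
        for p in points:
            idx = min(range(k),
                      key=lambda i: sum((a - b) ** 2 for a, b in zip(p, centroids[i])))
            clusters[idx].append(p)
        # step 3: move each centroid to the mean of its cluster
        new_centroids = []
        for i, cluster in enumerate(clusters):
            if cluster:
                new_centroids.append(tuple(sum(dim) / len(cluster)
                                           for dim in zip(*cluster)))
            else:
                new_centroids.append(centroids[i])  # keep a centroid with no points
        # step 4: stop once the centroids no longer move
        if new_centroids == centroids:
            break
        centroids = new_centroids
    return centroids, clusters

# toy unlabeled dataset with two obvious groups
points = [(1, 1), (1, 2), (2, 1), (8, 8), (8, 9), (9, 8)]
centroids, clusters = kmeans(points, k=2)
print(sorted(centroids))
```

Notice the contrast with K-NN: there are no labels anywhere, and K here is the number of clusters to find, not the number of neighbors to poll.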

Yeah, that’s a wrap! Hold on tight. New posts are coming up.

Peace Power Pleasure
