K-Nearest Neighbors

The k-nearest neighbors (KNN) algorithm is a very basic supervised learning algorithm that can be used for both classification and regression. KNN is easy to interpret, and training is essentially free, since all the computation is deferred to prediction time.

## Intuition behind KNN

Let us take an example where we are given the features of a flower (such as sepal length, sepal width, petal length, and petal width) and its class (Iris Setosa, Iris Versicolour, or Iris Virginica). Our task is to classify a new flower into one of these classes on the basis of its attributes.

For this we select a **k value**. For example, k = 3 means we select the 3 points that are least distant from the new point. To measure distance we generally use the **Euclidean distance**. The class that occurs most frequently among these k nearest points determines the class of the new point.
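The procedure above can be sketched in plain Python. This is a minimal illustration, not a production implementation; the toy feature values below are made up for the example, loosely following the iris setting with two features per point.

```python
import math
from collections import Counter

def euclidean(a, b):
    """Euclidean distance between two feature vectors."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def knn_predict(train_X, train_y, query, k=3):
    # sort training indices by distance from the query point
    order = sorted(range(len(train_X)), key=lambda i: euclidean(train_X[i], query))
    # labels of the k nearest neighbors
    top = [train_y[i] for i in order[:k]]
    # majority vote among the k neighbors decides the class
    return Counter(top).most_common(1)[0][0]

# made-up (sepal length, sepal width) values for illustration
X = [(5.1, 3.5), (4.9, 3.0), (6.7, 3.1), (6.3, 3.3), (5.8, 2.7)]
y = ["setosa", "setosa", "versicolor", "versicolor", "virginica"]
print(knn_predict(X, y, (5.0, 3.4), k=3))  # → setosa
```

The query point's two nearest neighbors are both "setosa", so the 3-neighbor vote assigns it that class even though the third neighbor is "virginica".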

— KNN is non-parametric, which means it makes no assumptions about the distribution of the underlying data.

— One of the most common applications of KNN is in recommender systems.

KNN is a lazy learner (or lazy algorithm): it does not learn a discriminative function from the training data, but instead "memorizes" the training dataset and defers all computation until prediction time.
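This laziness can be made concrete with a small sketch. In the hypothetical class below, `fit` does nothing but store the data; all the work of measuring distances and voting happens inside `predict`.

```python
import math
from collections import Counter

class LazyKNN:
    """Sketch of a lazy learner: fit() only memorizes the training set."""

    def __init__(self, k=3):
        self.k = k

    def fit(self, X, y):
        # no model is built here; the training data is simply stored
        self.X, self.y = X, y
        return self

    def predict(self, query):
        # all computation happens at prediction time
        nearest = sorted(
            zip(self.X, self.y),
            key=lambda pair: math.dist(pair[0], query),
        )[: self.k]
        return Counter(label for _, label in nearest).most_common(1)[0][0]

clf = LazyKNN(k=3).fit([(0, 0), (0, 1), (5, 5), (6, 5)], ["a", "a", "b", "b"])
print(clf.predict((0.5, 0.5)))  # → a
```

Because nothing is precomputed, prediction must scan the whole stored dataset, which is why KNN can be slow at query time on large datasets.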