Day 59 of 100DaysofML

Charan Soneji · Published in 100DaysofMLcode
2 min read · Aug 17, 2020

Leave-One-Out Cross-Validation is another cross-validation method I wanted to mention and talk about. It is abbreviated as LOOCV and is closely related to K-Fold.

LOOCV is the cross-validation technique in which the size of each fold is 1, with k set to the number of observations in the data. This variation is useful when the training data is of limited size and the number of parameters to be tested is small.

When using sklearn for this process, we can use model_selection.LeaveOneOut() to split our data into training and testing sets according to the LOOCV criterion. In my upcoming blogs, I shall be doing a comparative analysis of K-Fold and a few other cross-validation techniques.
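Here is a minimal sketch of what that looks like; the tiny toy arrays X and y are purely for illustration and not from any real dataset:

```python
import numpy as np
from sklearn.model_selection import LeaveOneOut

# Toy data: 4 observations with 2 features each (illustrative only).
X = np.array([[1, 2], [3, 4], [5, 6], [7, 8]])
y = np.array([0, 1, 0, 1])

loo = LeaveOneOut()
print(loo.get_n_splits(X))  # 4 — one split per observation, so k = n

for train_index, test_index in loo.split(X):
    # Each iteration holds out exactly one observation as the test set.
    print("TRAIN:", train_index, "TEST:", test_index)
```

Note that LeaveOneOut() takes no arguments: the number of splits is determined entirely by the number of rows in X, which is exactly the k = n behavior described above.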


The learning algorithm is applied once for each instance, using all the other instances as the training set and the selected instance as a single-item test set.
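As a sketch of that loop, scikit-learn's cross_val_score can run it end to end when passed LeaveOneOut() as the cv argument; the choice of LogisticRegression on the iris dataset here is an arbitrary illustration, not something from the post:

```python
from sklearn.datasets import load_iris
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import LeaveOneOut, cross_val_score

X, y = load_iris(return_X_y=True)
model = LogisticRegression(max_iter=1000)

# One fit per observation: train on n-1 points, test on the one held out.
scores = cross_val_score(model, X, y, cv=LeaveOneOut())

# Each score is 0 or 1 (a single-item test set), so the mean is the
# overall accuracy across all n held-out predictions.
print(scores.mean())
```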

The number of possible train/test splits is therefore equal to the number of data points in the original sample, n. Cross-validation is a very useful technique for assessing the effectiveness of your model, particularly in cases where you need to mitigate overfitting.

That’s it for today. Thanks for reading. Keep Learning.

Cheers.
