Feature Scaling

Baran Aksakal
2 min readJul 24, 2021

Why do we use Feature Scaling in Machine Learning Algorithms?

Feature scaling is one of the most popular topics in machine learning and data science courses.

Why is it so important? Let’s examine it together.

Feature scaling is a method that normalizes or standardizes the range of the independent variables (features) of a dataset. Why do we normalize datasets? There are several reasons.

→ The first reason is that some machine learning algorithms do not work properly on raw, unscaled data, for example Linear Discriminant Analysis (LDA) and Naive Bayes. These algorithms assign weights to features, and if you do not use feature scaling, you can see unwanted side effects in your implementation.

[Figure: example dataset with three columns — city name, population, and average age]

As we can see, the data table above has three features. The first is the city’s name, the second is each city’s population, and the last is the average age of the people who live in each city. A human would never compare populations with average ages, but the machine does not know that: to the machine, they are simply comparable numbers. Moreover, a feature with a wide range looks more important to the machine. In other words, a wide-range feature dominates the other features, the model gives it more weight during training, and the resulting predictions can be poor. That is why we should use feature scaling.
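To make the range problem concrete, here is a minimal sketch of min-max normalization applied to made-up population and average-age values (the numbers are illustrative, not from the table above). After scaling, both features live in the same [0, 1] range, so neither dominates the other:

```python
def min_max_scale(values):
    """Rescale a list of numbers to the [0, 1] range."""
    lo, hi = min(values), max(values)
    return [(v - lo) / (hi - lo) for v in values]

# Hypothetical city data: populations span millions, ages span a few years.
populations = [15_462_000, 5_639_000, 3_056_000]
avg_ages = [31.2, 28.9, 33.4]

# Before scaling, population differences dwarf age differences;
# after scaling, both features are comparable.
print(min_max_scale(populations))
print(min_max_scale(avg_ages))
```

The scaled values of both features now fall between 0 and 1, so a model that compares or weights them no longer sees the raw population numbers as inherently more important.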

→ The second reason is that gradient descent converges faster with feature scaling, which matters especially in deep learning.

Don’t worry: normalization is one of the feature scaling methods, and we will cover it soon. As we can see, using feature scaling saves time, and models trained on scaled data perform better than models trained without it.
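Standardization is the other common feature scaling method, and it is the one usually paired with gradient descent: shifting each feature to mean 0 and standard deviation 1 makes the loss surface better conditioned, so the optimizer takes more direct steps. A minimal sketch, again using made-up population values:

```python
import statistics

def standardize(values):
    """Shift values to mean 0 and scale to standard deviation 1 (z-scores)."""
    mean = statistics.mean(values)
    std = statistics.stdev(values)
    return [(v - mean) / std for v in values]

# Hypothetical populations: huge raw magnitudes, small after standardizing.
populations = [15_462_000, 5_639_000, 3_056_000]
print(standardize(populations))
```

When every feature is standardized this way, no single dimension stretches the loss surface into a narrow valley, which is why gradient descent tends to need fewer iterations to converge.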

Thank you for reading. This was an introduction to feature scaling. In upcoming posts, I will explain the methods used in feature scaling, such as normalization and standardization. Stay healthy until then.
