The effect normalisation has on making predictions on datasets

As I have been studying Udacity's Introduction to Machine Learning course, I have come across the lesson on normalisation. Normalisation is the process of rescaling the values in a dataset to the range zero to one, because it is often easier to get an accurate prediction when features share a common scale. The formula for normalising a value x is:

x' = (x − x_min) / (x_max − x_min)

where x_min and x_max are the smallest and largest values of that feature in the dataset, so the smallest value maps to 0 and the largest to 1.
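As a minimal sketch of how this formula plays out in code (the function name and example data are my own, not from the course), min-max normalisation can be written as:

```python
def min_max_normalise(values):
    """Rescale a list of numbers to the range [0, 1]."""
    lo, hi = min(values), max(values)
    if hi == lo:
        # All values are identical; map everything to 0.0 to avoid dividing by zero.
        return [0.0 for _ in values]
    return [(v - lo) / (hi - lo) for v in values]

heights = [150, 160, 180, 200]
print(min_max_normalise(heights))  # [0.0, 0.2, 0.6, 1.0]
```

In practice you would rarely write this by hand: libraries such as scikit-learn provide the same transformation as MinMaxScaler.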



