# Cost function — A simple explanation

Cost functions are functions that measure the performance of a Machine Learning model given a set of data.

Take a simple example: a linear regression model. The cost function of a Linear Regression model is given as

$$J(\theta) = \frac{1}{2m}\sum_{i=1}^{m}\left(h_\theta(x^{(i)}) - y^{(i)}\right)^2$$

The function calculates the difference between the actual value and the predicted value and squares it. This is done for every example, and the squared differences are then summed.

The 1/2m term averages the result over the m training examples; the extra factor of 1/2 is a convenience that cancels when the cost is differentiated for gradient descent. Now that we have covered that, let us move on to the cost function of Logistic Regression.
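As a minimal sketch of the formula above, here is the squared-error cost for a one-feature linear model. The parameter names (`theta0`, `theta1`) and the toy data are illustrative, not from the original article:

```python
import numpy as np

def linear_cost(theta0, theta1, x, y):
    """Squared-error cost: J = 1/(2m) * sum((h(x) - y)^2)."""
    m = len(x)
    predictions = theta0 + theta1 * x          # h_theta(x) for each example
    squared_errors = (predictions - y) ** 2    # per-example squared error
    return squared_errors.sum() / (2 * m)

# Toy data lying exactly on the line y = 2x + 1
x = np.array([0.0, 1.0, 2.0, 3.0])
y = 2 * x + 1

print(linear_cost(1.0, 2.0, x, y))  # perfect fit -> cost 0.0
print(linear_cost(0.0, 0.0, x, y))  # worse fit -> larger cost
```

Note that a perfect fit drives the cost to zero, and any deviation from the data increases it.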

Logistic Regression is used for classification problems, whereas Linear Regression is used for regression problems, i.e. problems that deal with continuous values. The cost function of Logistic Regression is stated as

$$J(\theta) = -\frac{1}{m}\sum_{i=1}^{m}\left[y^{(i)}\log h_\theta(x^{(i)}) + \left(1 - y^{(i)}\right)\log\left(1 - h_\theta(x^{(i)})\right)\right]$$
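A minimal sketch of this log-loss cost for a one-feature model, with a sigmoid producing the predicted probabilities. Again, the parameter names and toy data are assumptions for illustration:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def logistic_cost(theta0, theta1, x, y):
    """Log loss: J = -1/m * sum(y*log(h) + (1-y)*log(1-h))."""
    m = len(x)
    h = sigmoid(theta0 + theta1 * x)  # predicted probabilities in (0, 1)
    return -(y * np.log(h) + (1 - y) * np.log(1 - h)).sum() / m

# Toy binary data: negatives on the left, positives on the right
x = np.array([-2.0, -1.0, 1.0, 2.0])
y = np.array([0.0, 0.0, 1.0, 1.0])

print(logistic_cost(0.0, 1.0, x, y))   # reasonable fit -> small cost
print(logistic_cost(0.0, -1.0, x, y))  # flipped model -> large cost
```

The cost punishes confident wrong predictions heavily: flipping the sign of the weight makes the loss several times larger on the same data.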

An interesting fact is that these cost functions can have a huge impact on Gradient Descent. Consider the above examples of Linear Regression and Logistic Regression. The cost function of Linear Regression can technically be modified to suit the needs of Logistic Regression. So why don't we use it that way?

To answer that, we need to take a look at 2 types of functions, namely **Convex** and **Non-Convex** functions.

A non-convex function is a function that has multiple local minima.

So when we run gradient descent, it is prone to get stuck at one of the local minima, which is not optimal in any scenario. So the next question is: how are convex functions better than this?

Convex functions look like parabolas when plotted on a graph. The main advantage of this type of function over the former is that it has a single point of convergence when gradient descent is executed. This point is called the **global minimum**.

When there is a single global minimum, the Gradient Descent algorithm converges faster and cannot get stuck in a local minimum, because the only minimum is the global one.
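A quick sketch of that guarantee, using a made-up convex function f(x) = (x − 2)² (not from the article): no matter where gradient descent starts, it lands on the same global minimum.

```python
def grad(x):
    # Gradient of the convex function f(x) = (x - 2)^2.
    return 2 * (x - 2)

def gradient_descent(x, lr=0.1, steps=200):
    for _ in range(steps):
        x -= lr * grad(x)
    return x

# Every starting point converges to the single global minimum at x = 2.
for start in (-10.0, 0.0, 5.0):
    print(round(gradient_descent(start), 6))  # -> 2.0 each time
```

This is exactly why the log-loss cost is used for Logistic Regression instead of plugging its sigmoid predictions into the squared-error cost: the log-loss is convex in the parameters, so gradient descent is not at the mercy of its starting point.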

Thank you for reading! I’ll catch you later in my next article!