Neural networks are a fascinating field of machine learning, but they can be difficult to optimize and explain, partly because they have several hyperparameters. The most common one to tune is the number of neurons in the hidden layer. Let’s see how to find the best number of neurons for our dataset.
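As a minimal sketch of what such a search can look like (assuming scikit-learn and a generic synthetic classification dataset; the candidate sizes are illustrative, not from the article):

```python
from sklearn.datasets import make_classification
from sklearn.model_selection import GridSearchCV
from sklearn.neural_network import MLPClassifier

# A generic synthetic dataset just to demonstrate the search
X, y = make_classification(n_samples=200, n_features=10, random_state=0)

# Cross-validated grid search over candidate sizes for a single hidden layer
grid = GridSearchCV(
    MLPClassifier(max_iter=2000, random_state=0),
    param_grid={"hidden_layer_sizes": [(4,), (8,), (16,)]},
    cv=3,
)
grid.fit(X, y)
print(grid.best_params_["hidden_layer_sizes"])
```

The best size is the one with the highest mean cross-validation score; on a real dataset you would widen the candidate list and possibly search over deeper architectures too.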

A neural network is a particular model that tries to capture the correlation between the features and the target by transforming the dataset through one or more layers of neurons. Several books have been written about neural networks, and it’s not in the…

The first thing I have learned as a data scientist is that feature selection is one of the most important steps of a machine learning pipeline. Fortunately, some models can help us accomplish this goal by giving us their own interpretation of feature importance. One such model is Lasso regression.

I have already talked about Lasso regression in a previous blog post. Let me summarize the main properties of such a model. It is a linear model that uses this cost function:
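In its standard form, Lasso minimizes the sum of squared errors plus an L1 penalty on the coefficients:

$$J(w) = \sum_{i=1}^{n}\left(y_i - w_0 - \sum_{j=1}^{p} w_j x_{ij}\right)^2 + \lambda \sum_{j=1}^{p} |w_j|$$

The L1 term is what drives some coefficients exactly to zero, which is why Lasso can act as a feature selector.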

Stochastic processes theory is wonderful and full of theoretical opportunities for those who are interested in quantitative trading. Sometimes, in order to test a trading strategy, simulating a stock market can be useful. Let’s see how the theory helps and how to put it into practice using Python.
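As a hedged sketch of where this leads: geometric Brownian motion is a common model for simulating stock prices, and it can be implemented in a few lines of NumPy (the drift and volatility values below are illustrative, not from the article):

```python
import numpy as np

rng = np.random.default_rng(42)

# Geometric Brownian motion: S_{t+1} = S_t * exp((mu - sigma^2/2)*dt + sigma*sqrt(dt)*Z)
s0, mu, sigma = 100.0, 0.05, 0.2   # initial price, drift, volatility (illustrative)
dt, n_steps = 1 / 252, 252         # daily steps over one trading year

z = rng.standard_normal(n_steps)
log_returns = (mu - 0.5 * sigma**2) * dt + sigma * np.sqrt(dt) * z
prices = s0 * np.exp(np.cumsum(log_returns))
print(prices[-1])  # simulated price after one year
```

Running the simulation many times gives a distribution of price paths against which a trading strategy can be back-tested.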

***Note from Towards Data Science’s editors:** While we allow independent authors to publish articles in accordance with our **rules and guidelines**, we do not endorse each author’s contribution. You should not rely on an author’s works without seeking professional advice. See our **Reader Terms** for details.*

More than the price, when…


Quantitative traders know that they need to extract as much information as possible from price charts. Price action is a very useful technique to spot trading opportunities, but it’s sometimes difficult to read because candlestick charts might be messy or noisy. So, traders need a tool to *smooth* price action and remove noise. Heikin…
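The Heikin-Ashi formulas themselves are standard: each HA close is the average of the raw OHLC values, and each HA open is the average of the previous HA open and close. A minimal pandas sketch (the function name and column names are my assumptions):

```python
import pandas as pd

def heikin_ashi(df):
    """Compute Heikin-Ashi candles from raw OHLC data (columns: open, high, low, close)."""
    ha = pd.DataFrame(index=df.index)
    ha["close"] = (df["open"] + df["high"] + df["low"] + df["close"]) / 4
    # The first HA open is conventionally the average of the first raw open and close
    ha_open = [(df["open"].iloc[0] + df["close"].iloc[0]) / 2]
    for i in range(1, len(df)):
        ha_open.append((ha_open[i - 1] + ha["close"].iloc[i - 1]) / 2)
    ha["open"] = ha_open
    ha["high"] = pd.concat([df["high"], ha["open"], ha["close"]], axis=1).max(axis=1)
    ha["low"] = pd.concat([df["low"], ha["open"], ha["close"]], axis=1).min(axis=1)
    return ha

# Tiny illustrative OHLC frame
df = pd.DataFrame({"open": [10, 11], "high": [12, 13], "low": [9, 10], "close": [11, 12]})
print(heikin_ashi(df))
```

Because each candle averages in the previous one, consecutive HA candles of the same color form smoother trends than raw candlesticks.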

Professional data scientists know that data must be prepared before it is fed to any model. Data pre-processing is probably the most important part of a machine learning pipeline, and its importance is sometimes underestimated.

Power transformations are a set of transformations that are very useful in certain situations. Let’s see when.

A power transform is a family of functions that transform data using power laws. The idea is to apply a transformation to each feature of our dataset.

What’s the purpose of a power transform? The idea is to increase the symmetry of the distribution of the features. …
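A quick sketch of the effect using scikit-learn’s `PowerTransformer` (Yeo-Johnson by default) on a deliberately right-skewed synthetic feature:

```python
import numpy as np
from scipy.stats import skew
from sklearn.preprocessing import PowerTransformer

rng = np.random.default_rng(0)
x = rng.lognormal(size=(1000, 1))   # heavily right-skewed feature

pt = PowerTransformer()             # Yeo-Johnson by default; also standardizes the output
x_t = pt.fit_transform(x)

print(skew(x.ravel()), skew(x_t.ravel()))  # skewness shrinks toward 0
```

The transformed feature is much closer to symmetric, which is exactly the situation in which power transforms pay off.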

Linear models are some of the **simplest models** in machine learning. They are very powerful and, sometimes, they are really able to **avoid overfitting** and give us useful information about feature importance.

Let’s see how they work.

All linear regression models share the idea of modeling the target variable as a **linear combination** of the input features.
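In symbols, the prediction is a weighted sum of the features plus an intercept:

$$\hat{y} = w_0 + w_1 x_1 + w_2 x_2 + \dots + w_p x_p$$

The different linear models (ordinary least squares, Ridge, Lasso) differ only in how the weights $w_j$ are estimated.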

Data is everything, and models sometimes struggle with data they don’t understand. That’s why we need to apply a particular set of transformations that fall under the name “pre-processing”.

Feature scaling is one such transformation. Let’s see how it works and why we should use it.

Scaling is a set of linear transformations that make all the features comparable. Imagine you have a feature A with values around 10 and a feature B with values around 1000. We don’t want our model to consider B more important than A only because it has a higher…
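A minimal sketch with scikit-learn’s `StandardScaler`, using two features on the scales from the example above (the data values are illustrative):

```python
import numpy as np
from sklearn.preprocessing import StandardScaler

# Feature A takes values around 10, feature B around 1000
X = np.array([[9.0, 900.0], [10.0, 1000.0], [11.0, 1100.0]])

X_scaled = StandardScaler().fit_transform(X)
print(X_scaled)  # both columns now have mean 0 and unit variance
```

After scaling, both features live on the same scale, so a model can no longer be biased by the raw magnitudes.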

Quantitative traders always look for trading opportunities based on technical indicators and advanced charts. One of the best kinds of indicators traders look for is made by key levels at which the price could bounce. Some of these levels are Fibonacci retracements. Let’s see how they work.

Fibonacci retracements are key levels calculated from two swings, i.e. inversion points. The idea behind Fibonacci retracements is to create key levels in the price range between these two swings according to pre-defined fractions. The most important fraction is 0.618, which is related to the golden ratio (1.618…). That’s why…
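The calculation itself is simple arithmetic on the swing range. A minimal sketch (the function name is mine, and measuring the levels down from the upper swing is one common convention):

```python
def fibonacci_retracements(low_swing, high_swing):
    """Key levels inside the range between two swings, at standard Fibonacci fractions."""
    fractions = [0.236, 0.382, 0.5, 0.618, 0.786]
    span = high_swing - low_swing
    # Each level sits at a fixed fraction of the range, measured down from the high
    return {f: high_swing - f * span for f in fractions}

levels = fibonacci_retracements(100.0, 200.0)
print(levels[0.618])  # → 138.2
```

For an uptrend ending at 200 after starting at 100, the 0.618 retracement sits at 138.2; traders watch these levels as possible bounce points.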

I’ve recently launched my online course about Supervised Machine Learning in Python. In this post, I’ll give you 5 reasons why it’s worth joining.

In my experience as a **Data Scientist** and **Physicist**, I’ve understood that, sometimes, **practice** is more important than theory. Data Science is a very practical discipline, and a data scientist is like a **craftsman** working with data just like wood. He needs to know how the wood responds to his tools, but his job is more practical than theoretical.

Theoretical Physicist, Data Scientist and fiction author. I teach Data Science, statistics and SQL on YourDataTeacher.com