[Week 4 — House Price Prediction]

Halis Taha Şahin
Published in bbm406f18
2 min read · Dec 31, 2018

Team Member: Harun Özbay, Halis Taha Şahin

Photo by Jesse Roberts on Unsplash

Hi there,

Last week we worked on the training data, applied three different advanced regression techniques, and shared their results.

This week we reviewed the data features and observed the effect of the changes we made. We also optimized the parameters of the regression techniques.

Two important tasks in machine learning are cross-validation and hyperparameter tuning (e.g., with GridSearchCV). These tasks are generally performed together.

Cross-validation is the process of training learners on one set of data and testing them on a different set. Parameter tuning is the process of selecting the values for a model’s parameters that maximize the accuracy of the model. We used the grid search technique (GridSearchCV) from the sklearn library for parameter tuning.
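As a minimal sketch of how these two tasks fit together, the snippet below tunes a random forest with GridSearchCV; the parameter grid and the synthetic data are illustrative assumptions, not our actual house-price setup:

```python
from sklearn.datasets import make_regression
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import GridSearchCV

# Synthetic regression data standing in for the real training set
X, y = make_regression(n_samples=200, n_features=10, noise=10.0, random_state=0)

# Illustrative parameter grid (assumed values, not our tuned ones)
param_grid = {
    "n_estimators": [50, 100],
    "max_depth": [None, 5],
}

# GridSearchCV runs 5-fold cross-validation for every parameter combination
search = GridSearchCV(RandomForestRegressor(random_state=0), param_grid, cv=5)
search.fit(X, y)

print(search.best_params_)  # best combination found by the grid search
print(search.best_score_)   # mean cross-validated score of that combination
```

Because every candidate is scored by cross-validation rather than on the training fit, the selected parameters are less likely to overfit.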

We also included the support vector regression technique in our tests. Our latest results are as follows:

Linear Regression: 0.134

Random Forest: 0.051

Gradient Boosting: 0.051

Support Vector Regression: 0.411
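A comparison like the one above can be sketched with cross_val_score; the synthetic data and the RMSE metric here are assumptions for illustration, not our exact evaluation setup:

```python
from sklearn.datasets import make_regression
from sklearn.ensemble import GradientBoostingRegressor, RandomForestRegressor
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVR

# Synthetic data standing in for the house-price training set
X, y = make_regression(n_samples=200, n_features=10, noise=10.0, random_state=0)

models = {
    "Linear Regression": LinearRegression(),
    "Random Forest": RandomForestRegressor(random_state=0),
    "Gradient Boosting": GradientBoostingRegressor(random_state=0),
    "Support Vector Regression": SVR(),
}

for name, model in models.items():
    # sklearn negates RMSE so that higher is always better; negate back to report it
    scores = cross_val_score(model, X, y, cv=5, scoring="neg_root_mean_squared_error")
    print(f"{name}: {-scores.mean():.3f}")
```

Reporting a cross-validated error for every model keeps the comparison fair, since each technique is scored on the same folds.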

So far we have evaluated four different regression techniques. We examined these techniques and their relationship with the features of the data set.

Future Work

We will evaluate bagging and decision tree regression techniques. In addition, we will continue to evaluate the data features.
