Machine learning with linear regression and gradient descent, part 4 (THE CODE!!)

Somto Achu
Chatbots bible and More.
3 min read · Jul 24, 2017

So we embarked on this mysterious journey of linear regression and gradient descent. It's been hard, I get that, but it comes to an end in this part, where I turn math into code like magic. The truth is that most machine learning problems are usually solved with libraries like NumPy, Keras and TensorFlow. Libraries like these provide some very deep abstractions, and let's be honest, if you asked someone like me to do some things without those libraries it would take a while to get done. All the same, it's cool to understand some of the hidden things. That's why I like doing simple problems like this without libraries: it keeps a part of your brain active and gives you a better understanding when you do use the libraries.

So let's get started. First of all, let's define our cost function in code, which is

E = Σ(y − y′)² / N

function costfunction(data, m, c) {
  var result = 0;
  for (var i = 0; i < data.x.length; i++) {
    result = result + (data.y[i] - linecalc(m, data.x[i], c)) ** 2;
  }
  return result / data.x.length;
}

That’s our cost function in code

Next is our actual predicted value of y; let's call it linecalc:

y’=mx + c

function linecalc(m, x, c) {
  return m * x + c;
}
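As a quick sanity check, the two functions so far can be run on a tiny made-up data set (both are repeated here so the snippet stands on its own):

```javascript
// Repeated from above so this snippet runs standalone.
function linecalc(m, x, c) {
  return m * x + c;
}

function costfunction(data, m, c) {
  var result = 0;
  for (var i = 0; i < data.x.length; i++) {
    result = result + (data.y[i] - linecalc(m, data.x[i], c)) ** 2;
  }
  return result / data.x.length;
}

// For points lying exactly on y = x, the line m = 1, c = 0 has zero cost,
// while a worse guess like m = 0, c = 0 has a bigger one.
var sample = { x: [1, 2, 3, 4], y: [1, 2, 3, 4] };
console.log(costfunction(sample, 1, 0)); // 0
console.log(costfunction(sample, 0, 0)); // (1 + 4 + 9 + 16) / 4 = 7.5
```

The lower the cost, the better the line fits the data, which is exactly what gradient descent will drive down next.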

Next is our gradient step: the new value is the previous value minus the product of our learning rate and the derivative of our cost function.

m = m′ − a · (−2/N) · Σ((y − (mx + c)) · x)

c = c′ − a · (−2/N) · Σ(y − (mx + c))

(here m′ and c′ are the previous values, a is the learning rate, and the −2/N comes from differentiating the cost function above)

function stepgradient(params2, data, lr) {
  var params = { m: 0, c: 0 };
  for (var i = 0; i < data.x.length; i++) {
    params.m = params.m - 2 / data.x.length * data.x[i] * (data.y[i] - ((params2.m * data.x[i]) + params2.c));
    params.c = params.c - 2 / data.x.length * (data.y[i] - ((params2.m * data.x[i]) + params2.c));
  }
  params.m = params2.m - (lr * params.m);
  params.c = params2.c - (lr * params.c);
  return params;
}
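To see a single step in action, here's a minimal sketch (stepgradient repeated so it runs standalone): starting from m = 0, c = 0 on two points of y = x, one step nudges both parameters up toward the true line.

```javascript
// Repeated from above so this snippet runs standalone.
function stepgradient(params2, data, lr) {
  var params = { m: 0, c: 0 };
  for (var i = 0; i < data.x.length; i++) {
    // accumulate the derivative of the cost with respect to m and c
    params.m = params.m - 2 / data.x.length * data.x[i] * (data.y[i] - ((params2.m * data.x[i]) + params2.c));
    params.c = params.c - 2 / data.x.length * (data.y[i] - ((params2.m * data.x[i]) + params2.c));
  }
  // move the previous parameters against the gradient, scaled by the learning rate
  params.m = params2.m - (lr * params.m);
  params.c = params2.c - (lr * params.c);
  return params;
}

var tiny = { x: [1, 2], y: [1, 2] };
var step1 = stepgradient({ m: 0, c: 0 }, tiny, 0.001);
console.log(step1); // roughly { m: 0.005, c: 0.003 } — both moving up toward m = 1, c = 0
```

Each call takes the previous parameters and returns slightly better ones; repeating it is all the training loop does.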

Yeah, by now I'm sure you've figured it out: I love using arrays, it keeps the code tidy.

So here we are at our last function. This one is kinda long, so I'm gonna explain it, so pay attention… just joking. You've pretty much done all you need to do; by simplifying everything into smaller modular functions you've saved yourself some time. All you need now is a function to put everything together, and that's:

function findparams(data) {
  var process_ = { m: 0, c: 0 };
  for (var i = 0; i < 100; i++) {
    process_ = stepgradient(process_, data, 0.001);
    ctx.clearRect(0, 0, canvas.width, canvas.height);

    // draw the current line of best fit (everything is scaled by 4 to fit the canvas)
    ctx.beginPath();
    ctx.moveTo(0, canvas.height - linecalc(process_.m, 0, process_.c) * 4);
    ctx.lineTo(canvas.width, canvas.height - linecalc(process_.m, canvas.width / 4, process_.c) * 4);
    ctx.stroke();

    // redraw the data points on top of the line
    for (var j = 0; j < data.x.length; j++) {
      ctx.fillRect(data.x[j] * 4, canvas.height - data.y[j] * 4, 4, 4);
    }

    console.log(process_);
  }
}

Then you call the function and pass in the data set. Mine is made up, so don't worry. It logs the parameters to the console so you can check them:

var canvas = document.getElementById("canvas");
var ctx = canvas.getContext("2d");
ctx.fillStyle = 'purple';
ctx.textAlign = "center";
var data = {
  x: [1,2,3,4,5,6,7,8,9,10,11,12,13,14,15,16,17,18,19,20,21,22,23,24,25,26,27,28,29,30,31,32,33,34,35,36,37,38,39,40],
  y: [1,2,3,4,5,6,7,8,9,10,11,12,13,14,15,16,17,18,19,20,21,22,23,24,25,26,27,28,29,30,31,32,33,34,35,36,37,38,39,40]
};

Then plot the data points and call the function:

for (var i = 0; i < data.x.length; i++) {
  ctx.fillRect(data.x[i] * 4, canvas.height - data.y[i] * 4, 4, 4);
}

findparams(data)

Here is another set of random data you can test with

var data = {
  x: [11,23,2,3,2,5,4,4,12,13,23,24,23,3,2,3,3,2,2,23,2,23,23,23,23,23,45,2,2,2,24,3,2,32,24,14,21,2,2,4],
  y: [1,2,3,4,5,6,27,8,9,10,31,12,13,14,11,23,23,2,2,13,12,33,23,3,23,23,45,2,2,2,24,3,22,32,4,14,12,12,2,4]
};
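If you'd rather run the whole loop outside the browser (no canvas), here's a minimal sketch: a hypothetical findparamsHeadless helper (not from the article) that just iterates stepgradient and returns the final parameters, with the helper repeated so it runs on its own in Node. On the straight-line data set from before it should end up with m close to 1 and c close to 0.

```javascript
// Repeated from above so this snippet runs standalone.
function stepgradient(params2, data, lr) {
  var params = { m: 0, c: 0 };
  for (var i = 0; i < data.x.length; i++) {
    params.m = params.m - 2 / data.x.length * data.x[i] * (data.y[i] - ((params2.m * data.x[i]) + params2.c));
    params.c = params.c - 2 / data.x.length * (data.y[i] - ((params2.m * data.x[i]) + params2.c));
  }
  params.m = params2.m - (lr * params.m);
  params.c = params2.c - (lr * params.c);
  return params;
}

// Headless version of findparams: no drawing, just the training loop.
function findparamsHeadless(data, iterations, lr) {
  var process_ = { m: 0, c: 0 };
  for (var i = 0; i < iterations; i++) {
    process_ = stepgradient(process_, data, lr);
  }
  return process_;
}

// Straight-line data: y = x for x from 1 to 40.
var data = { x: [], y: [] };
for (var i = 1; i <= 40; i++) {
  data.x.push(i);
  data.y.push(i);
}

var fit = findparamsHeadless(data, 100, 0.001);
console.log(fit); // m approaches 1 and c stays near 0
```

This is handy for checking that the math converges before worrying about the drawing code.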

And that's all for the project and the implementation in JavaScript. I wanna thank anyone that took his or her time to go through my four-part tutorial. I'm gonna be dropping more tutorials on machine learning and will soon make YouTube videos (seems everyone is doing it, so why not).

My next tutorial is gonna be on using linear regression and gradient descent for multiple variables, not just m and c. Then logistic regression, then we'll move on to SVMs (support vector machines), then land at neural nets and do a combined tutorial on recurrent neural nets and genetic algorithms (and yeah, I've got a very cool example for that), and finally move on to real-world examples and applications. Thanks guys, and see you in the next part. Thanks…

Download the code at

https://github.com/virgincodes/machinelearning_introduction_linearregression_gradientdecent


Developer living in Nigeria, into robotics and AI, trying to make an impact with the little I know.