DAY#1

Alex Code
Published in Team snAIl
4 min read · May 1, 2018
Team brainstorming — Irfan(left) and Vivian(right) discussing different applications of ML on a self-driven car.

Day 1 out of 10:

Do you ever get stuck in traffic? Have you ever been annoyed by how inefficient other people are at the wheel? In a world where more and more machines are automated, one may wonder if the day when we all get home in cars that can drive themselves is still far off.

For that to happen, the technology behind that idea has to be deeply studied and experimented with. To fully understand and make better use of the tools at our disposal, the four authors of this project decided to immerse themselves in the study of machine learning and how to apply it to a reduced-scale self-driven car.

Let me briefly contextualise and explain what the following 10 blog posts are all about.

Giacomo Ninniri, Irfan Durrani, Vivian Allen and myself, Alejandro Pitarch, make up an international team of four full-stack software engineers. Software development aside, we come from a wide range of backgrounds, from the care industry to entrepreneurship, science, and mechanical and aerospace engineering. Our different perspectives and skill sets give us a whole bunch of fresh and very diverse ideas to begin with.

Deciding the scope of our ML project — Giacomo drawing the different scenarios in which our self-driven car may have to take decisions based on the input it perceives from the environment.

Our first mission is to define the rules of the game. The project should have directions and specifications that we will use to delimit its scope. Once that point is agreed, we start by defining our Minimum Viable Product, also known as MVP (as per our approach to programming and team projects, you may find many references to techniques that best represent the Agile values and principles).

Our MVP may well be expressed as follows:

“We aim to create a self-driven car which uses ML to take the best decisions (turning right, turning left, driving straight).
We will implement our solution on a small-scale car that will drive itself through a random circuit.
Given a random circuit, we assume that the car keeps a constant speed and that it will face X-degree turns.”
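Framed in code, the MVP boils down to a three-way classification problem: for each moment, pick one of the three decisions. Here is a minimal sketch of that framing in plain Python (the scoring function is a placeholder, not our actual model, which we had yet to build at this point):

```python
import math

# The three decisions our MVP car can take at any moment.
DECISIONS = ["left", "straight", "right"]

def softmax(scores):
    """Turn raw model scores into probabilities that sum to 1."""
    exps = [math.exp(s - max(scores)) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

def decide(scores):
    """Pick the decision with the highest probability."""
    probs = softmax(scores)
    return DECISIONS[probs.index(max(probs))]

# Example: hypothetical scores a model might output for one camera frame.
print(decide([0.2, 1.5, 0.3]))  # "straight" has the highest score
```

Whatever model ends up producing the scores, the car's job at each step reduces to exactly this: three numbers in, one decision out.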

Here is a hand-drawn sketch that represents what was expressed above in words:

MVP Sketch — Clear exception to the saying ‘’a picture is worth a thousand words”.

Ok, that sketch may not be self-explanatory, but we all understood it and, to us, it meant the same as the paragraph above. This was the minimum graphic representation that we needed to stay on the same page and efficiently move on.

After defining our common goal, the limits around it and the learning process and objectives that we wanted to follow, we agreed on a daily structure based on early morning stand-ups, late evening retrospectives and pair programming sessions in between.

We continue our first day by splitting up and dividing some of the early tasks, bearing in mind that we intend to be all set to start writing code on DAY 2.

Irfan sets up a GitHub organisation for the team and starts it with a repo containing the hardware instructions: https://github.com/snAIl-ML

In order to allow the car to receive input from the environment, Vivian selects and orders the webcam that will be installed on top of the car, connected to the Raspberry Pi.

While I create some designs for the team logo, Giacomo carries out some ML research and Vivian designs a control schema for hardware and ML.

Hardware ML flow diagram — Vivian putting our thoughts and project structure into a neat and well-organised schema.
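At its core, Vivian's schema is a sense → decide → act loop: grab a frame from the webcam, ask the ML model for a decision, send the corresponding command to the motors, repeat. A hedged sketch of that loop, with all the hardware and model pieces stubbed out as placeholder functions (none of these names are our real code):

```python
def control_loop(get_frame, predict, send_command, steps):
    """Run the sense -> decide -> act cycle `steps` times and log decisions."""
    log = []
    for _ in range(steps):
        frame = get_frame()        # sense: webcam image via the Raspberry Pi
        decision = predict(frame)  # decide: "left", "straight" or "right"
        send_command(decision)     # act: drive the motors accordingly
        log.append(decision)
    return log

# Stubbed example: a fake two-pixel "camera" and a trivial rule as the model.
frames = iter([[0, 1], [1, 0], [0, 1]])
decisions = control_loop(
    get_frame=lambda: next(frames),
    predict=lambda f: "left" if f[0] else "right",
    send_command=lambda d: None,  # no real motors here
    steps=3,
)
print(decisions)  # ['right', 'left', 'right']
```

Keeping the loop ignorant of how frames are captured or how motors are driven is what lets the hardware and ML work proceed in parallel from here on.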

The day concludes with three more hours during which we all drive our learning through a TensorFlow tutorial, followed by a retrospective of the challenges and wins of the first day.

I cannot finish this post and hand over the next one to one of my teammates without first introducing you to the main character of this story:

Self-driven car — Formed by 4 small electric motors whose spinning shafts make the wheels rotate. The motors are fed by batteries at the rear of the car. It has 2 ultrasonic distance sensors, 1 at the front and 1 at the back. The sensors are connected to a Raspberry Pi computer, to which the future USB webcam will be linked. On top of the platform that constitutes the structure of the car, a big black box can be seen: the battery that supplies the energy for the Raspberry Pi.
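To illustrate what those two ultrasonic sensors buy us, here is a hedged sketch of a simple obstacle check built on their distance readings (the threshold and names are made up for illustration, not values we have tuned):

```python
SAFE_DISTANCE_CM = 20  # hypothetical safety threshold, not a tuned value

def obstacle_check(front_cm, rear_cm):
    """Report which directions are blocked, given the two sensor readings."""
    return {
        "front_blocked": front_cm < SAFE_DISTANCE_CM,
        "rear_blocked": rear_cm < SAFE_DISTANCE_CM,
    }

print(obstacle_check(15, 100))  # front blocked, rear clear
```

A check like this could act as a safety net around whatever the ML model decides: however confident the model is about driving straight, a blocked front sensor should win.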
