Our Ambassadors' experience at DevFest Lille

Matthieu Cornillon
Published in ADEO Tech Blog
Dec 16, 2021 · 6 min read
Part of the Adeo Leroy Merlin team at its booth. Emmanuel Demey, organizer of the DevFest, wears a VR headset and is photographed without realizing it.

Lille's DevFest took place a few weeks ago at the "Kinépolis" theater in Lomme. Adeo Services and Leroy Merlin France took part in the conference with a team of a dozen people, which allowed us to attend many talks while being strong contributors to the partners' village. There was a very festive atmosphere: each sponsor had set up competitions, contests and challenges.

My name is Matthieu Cornillon, and I've compiled our notes to share our key learnings from these talks with you.

Before letting you read our ambassadors' feedback, I would like to thank you for reading this Medium publication, and above all to thank the volunteer organizers of the event and all the speakers. Clap for this article to show your support!

Google BigQuery 101 by Aurélien Allienne

Feedback from Anaïs Tournois (Adeo Services)

Aurélien Allienne is a developer at SFEIR and an authorized trainer for Google Cloud Platform (GCP), so he has in-depth knowledge of BigQuery. At the DevFest, he gave us a crash course on this product. Here are a few things I learned:

  • BigQuery is often used for processing Big Data (think gigabytes and beyond) but can also be used on small datasets.
  • It uses SQL syntax, but it is not relational by default.
  • It relies on three Google products: Dremel as the query engine, Colossus as the global storage system, and Jupiter as Google's network. This means compute and storage are fully separated.
  • You can create Authorized Views to share a view without sharing the whole dataset (a small sketch follows this list).
  • Because of the way it's built, a query can return in 17 seconds even when the total processing time adds up to 8 hours.
  • Data is denormalized to improve performance, and storage is column-oriented rather than row-oriented.
  • BigQuery isn't made for massive insert, update or merge operations.
  • There are no indexes.
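
As a rough idea of what Authorized Views involve, here is a minimal sketch using the Node.js client `@google-cloud/bigquery`: the view is added to the access list of the dataset that holds the raw tables, so consumers can query the view without being granted access to the whole dataset. All project, dataset and view names are invented; treat it as an illustration rather than the speaker's own example.

```typescript
// Sketch: authorize the view `reporting.sales_view` on the private dataset
// `raw_sales`, so consumers can query the view without reading its tables.
import {BigQuery} from '@google-cloud/bigquery';

const bigquery = new BigQuery();

async function authorizeView(): Promise<void> {
  const rawDataset = bigquery.dataset('raw_sales');
  const [metadata] = await rawDataset.getMetadata();

  // Append the view to the source dataset's access list.
  metadata.access.push({
    view: {
      projectId: 'my-project', // invented project id
      datasetId: 'reporting',
      tableId: 'sales_view',
    },
  });

  await rawDataset.setMetadata(metadata);
}

authorizeView().catch(console.error);
```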

He also shared a few tips and tricks to make your queries faster:

  • For better performance, use partitioning and clustering (a query sketch follows this list).
  • LIMIT only limits the size of the result output, not the processing time.
  • To speed up processing, filter with WHERE clauses as early as possible.
  • Also good to know: query results are cached for 24 hours.
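
To make the partitioning and clustering tip concrete, here is a minimal sketch using the Node.js client `@google-cloud/bigquery`. The project, dataset, table and column names are invented, and it assumes a table partitioned on `event_date` and clustered on `store_id`.

```typescript
// Minimal sketch, not from the talk: querying a table assumed to be
// partitioned on `event_date` and clustered on `store_id`.
import {BigQuery} from '@google-cloud/bigquery';

const bigquery = new BigQuery();

async function main(): Promise<void> {
  // Filtering on the partitioning column up front lets BigQuery prune
  // partitions instead of scanning the whole table; filtering on the
  // clustering column further reduces the data read.
  const query = `
    SELECT store_id, COUNT(*) AS visits
    FROM \`my-project.analytics.events\`
    WHERE event_date BETWEEN DATE '2021-11-01' AND DATE '2021-11-30'
      AND store_id = 59
    GROUP BY store_id
  `;

  const [rows] = await bigquery.query({query});
  console.log(rows);
}

main().catch(console.error);
```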

The presentation ended with a quick introduction to some extra BigQuery tools:

  • Dataflow, a streaming analytics service
  • BigQuery Omni for multi-cloud data analytics
  • BigQuery ML for creating and executing machine learning models.

All in all, I found it a really good introduction to BigQuery, highlighting its main principles and tools and sharing some useful advice.

The GitOps of which you’re the hero by Louis Tournayre

Feedback from Ahmed Kaci (Adeo Services)

It has been said that DevFest is a developers' festival, but what about us, the Ops people, the Cloud people? This year I wanted to be part of it in order to share and to meet people who love IT as much as I do.

Nowadays you can read a lot about GitOps, its tooling, its misuses, and so on. But it's important to care about GitOps fundamentals, and that is exactly what Louis Tournayre brilliantly offered.

Beyond the GitOps mindset, you need to adopt a methodology. Even after picking the best tool (one that is maintained, up to date and suitable for your use case, ArgoCD for example), you have to keep good practices in mind.

Louis delivered a great show, with storytelling that reminded me of the following: it's important to be fully involved in the GitOps adventure and to maintain a stable process. This means not chasing every new trend or whim, which is the main pitfall when adopting GitOps.

In the corridor after the talk, I had the opportunity to touch base with Louis. We chatted about what our teams are doing and about our next steps, and I'm now very confident that we are on the right path using ArgoCD with the workflow we have already implemented.

Unit tests with JavaScript, ‘to infinity and beyond’ by Mathilde Rigabert Lemée and Raphaël Verdier

Feedback from Mathieu Sporta (Leroy Merlin France)

Why is it important to write unit tests?

Unit tests provide comfort and improve development quality by enabling:

  • Efficiency
  • Durability
  • A good working environment (they create trust between developers)

There are different types of tests: integration tests, unit tests and functional tests (E2E, end-to-end).

It is very important to differentiate these tests, as it is quite dangerous to mix tests that serve different purposes. A good test describes what is being tested, but it also has to live in the right place and keep a very targeted focus on what it tests.

From the speakers' experience, the right approach is the test pyramid, in this order:

  • unit tests,
  • integration tests,
  • functional tests.

How to introduce these practices?

It is necessary to remind people of the benefits of testing, to convince developers of their usefulness, and to remind them that testing leads to more solid, higher-quality code.

Tip: Go step by step, starting with the simplest tests, or refactor your code to make testing easier.

The most commonly used test objects:

Developers use objects such as “Mock” and “Stub” in their unit tests. These objects reproduce the behaviour of real objects in a controlled way.

“Mocks” are used in more complex cases, typically when the mocked method does not return anything. For example, when you need to verify that MySQL is called at least once.

“Mocks” should only be used when necessary.

If writing a test makes us change the existing code, it means that the test has been badly chosen.

“Stubs” are used in simpler cases, where each call simply returns something every time.
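
To make the distinction concrete, here is a small Jest-style sketch in TypeScript (the talk was about JavaScript; the same code works without the type annotations). `getTotal` and the price repository are hypothetical names used only for this illustration: the same test double acts as a stub when it returns a canned value, and as a mock when we verify how it was called.

```typescript
// Hypothetical code under test: it depends on a repository that we don't
// want to call for real in a unit test.
type PriceRepository = { findPrice: (reference: string) => number };

function getTotal(references: string[], repository: PriceRepository): number {
  return references.reduce((total, ref) => total + repository.findPrice(ref), 0);
}

it('sums the prices returned by the repository', () => {
  // Stub side: every call simply returns a canned value.
  const findPrice = jest.fn((reference: string) => 10);

  expect(getTotal(['ref-1', 'ref-2'], { findPrice })).toBe(20);

  // Mock side: we also verify the interaction itself.
  expect(findPrice).toHaveBeenCalledTimes(2);
});
```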

The “given / when / then” pattern makes the initial state explicit and writing tests more efficient: we state what the function should do, so the test is easier for us, and for someone else, to read later.
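
As a tiny sketch of this structure, with a hypothetical `cartTotal` function (again Jest-style TypeScript; plain JavaScript works the same):

```typescript
// Hypothetical function under test.
type Cart = { items: { price: number }[] };

function cartTotal(cart: Cart): number {
  return cart.items.reduce((total, item) => total + item.price, 0);
}

it('computes the total of the cart', () => {
  // Given: the initial state is made explicit.
  const cart: Cart = { items: [{ price: 40 }, { price: 60 }] };

  // When: the behaviour under test is exercised.
  const total = cartTotal(cart);

  // Then: the expected outcome is asserted.
  expect(total).toBe(100);
});
```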

Tip: Write more small tests rather than a few big tests.

Artificial Intelligence to the rescue of accessibility by Guillaume Laforge and Aurélie Vache

Feedback from Saad Amal (Adeo Services)

“Artificial Intelligence to the rescue of accessibility” is a talk given by Aurélie Vache, a developer at OVH Cloud, and Guillaume Laforge, a developer advocate for Google Cloud.

The main idea of this talk is to emphasize that accessibility tools are not only for people with disabilities.

There are a lot of APIs that can help developers; let's focus on some of them.

  • The Speech-to-Text API converts audio to text by applying powerful neural network models behind an easy-to-use API. This is obviously helpful for deaf and hard-of-hearing people, but it can also be used to make vocal content searchable or to generate subtitles for videos, podcasts… The Google Cloud API documentation is available here: https://cloud.google.com/speech-to-text (a small sketch follows this list). Distinguishing speakers' voices is possible through the “diarization” concept. Other services are available, like https://sonix.ai
  • Conversely, the Text-to-Speech API can be used to help blind people, kids who struggle with reading, or dyslexic people. It can also be used in navigation apps or to turn text into audiobooks. Documentation for the Google API is available here: https://cloud.google.com/text-to-speech; other APIs exist, like lightico, pureConnect…
  • Live Transcribe is a built-in accessibility feature that provides a live transcription of what the smartphone's microphone hears.
  • Other APIs, like the Vision API or the Video Intelligence API, provide a lot of metadata about an image or a video input.
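
As a rough idea of what calling such an API looks like, here is a minimal sketch using the Node.js client `@google-cloud/speech`. The bucket and file names are invented, and a real project would also need authentication configured; treat it as an illustration rather than the speakers' own code.

```typescript
// Minimal sketch: transcribe an audio file stored in Cloud Storage.
import {SpeechClient} from '@google-cloud/speech';

async function transcribe(): Promise<void> {
  const client = new SpeechClient();

  const [response] = await client.recognize({
    audio: { uri: 'gs://my-bucket/interview.flac' }, // invented path
    config: {
      encoding: 'FLAC',
      sampleRateHertz: 16000,
      languageCode: 'fr-FR',
    },
  });

  // Each result carries one or more alternatives; keep the most likely one.
  const transcript = (response.results ?? [])
    .map((result) => result.alternatives?.[0]?.transcript ?? '')
    .join('\n');

  console.log(transcript);
}

transcribe().catch(console.error);
```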

I really enjoyed the enthusiasm of Aurélie Vache and Guillaume Laforge during this talk. It was very interesting, easy to follow, and it made me realise that accessibility is for all of us!

More is coming!

More contributions are expected soon, so make sure to subscribe to Adeo’s Tech Publication on Medium to get a notification when we publish more content.

Have a nice day 🙏
