A Baby monitor Kotlin app using Jetpack Compose and HMS ML Kit sound detection

Photo by Tim Bish on Unsplash

Adding complex features to a mobile app is becoming easier and easier. Over the years we have seen a lot of SDKs, libraries, and utilities
that help us as developers fulfil the trickiest needs of our users.

Years ago I could not have imagined how difficult it would be to develop something like a baby monitor app for our smartphones:
something that activates the microphone, automatically recognizes the sound of a baby crying, and triggers effects
like sending a notification, maybe playing a song, or other more useful features.

Today we have machine learning; yes, we could train a model to recognize…

Photo by Jacek Dylag on Unsplash

Water is a finite resource, and we should be careful not to waste it.

It’s estimated that the average American uses about 575 litres of water a day, while the average European uses around 250 litres of water.
Can you imagine how much is 250 litres?

Imagine a room filled with 250 of those big one-litre bottles!

Sometimes it is hard to realize how much water we waste through simple gestures and habits.

With the help of current technologies, we can quickly increase this awareness.
I’ve developed a demo app using Machine Learning to detect when water is running…

Allow your users the freedom to choose their Android platform while providing the same features

A Classic Word Search game

Some time ago I developed a word search game solver Android application using the services of Firebase ML Kit.

It was an interesting journey, discovering the features of a framework that lets developers use AI capabilities without knowing all the rocket science behind them.

Specifically, I used the document recognition feature to extract text from a word search game image.

After the text recognition phase, the output was cleaned and arranged into a matrix to be processed by the solver algorithm. This algorithm looked for all the words that could be formed by grouping the letters…
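The core of such a solver, scanning the letter matrix in all eight straight-line directions for each dictionary word, can be sketched in plain Kotlin. The grid, dictionary, and function names below are illustrative assumptions, not the original app's code:

```kotlin
// The 8 straight-line directions a word may run in: (row delta, column delta).
val directions = listOf(
    -1 to -1, -1 to 0, -1 to 1,
     0 to -1,          0 to 1,
     1 to -1,  1 to 0,  1 to 1
)

// Return the start cell (row, col) of the first match of `word`, or null.
fun findWord(grid: List<String>, word: String): Pair<Int, Int>? {
    for (r in grid.indices) for (c in grid[r].indices) {
        for ((dr, dc) in directions) {
            var rr = r; var cc = c; var i = 0
            // Walk along the direction while letters keep matching.
            while (i < word.length &&
                   rr in grid.indices && cc in grid[rr].indices &&
                   grid[rr][cc] == word[i]) {
                rr += dr; cc += dc; i++
            }
            if (i == word.length) return r to c
        }
    }
    return null
}

// Map each dictionary word that appears in the grid to its start cell.
fun solve(grid: List<String>, dictionary: List<String>): Map<String, Pair<Int, Int>> =
    dictionary.mapNotNull { w -> findWord(grid, w)?.let { w to it } }.toMap()

fun main() {
    val grid = listOf(
        "CATX",
        "ODOG",
        "DNAB"
    )
    println(solve(grid, listOf("CAT", "DOG", "BAND", "COD")))
}
```

A brute-force scan like this is O(rows × cols × 8 × wordLength) per word, which is more than fast enough for the small grids a camera capture produces.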

Boggle (https://upload.wikimedia.org/wikipedia/commons/thumb/f/f4/Boggle.jpg/512px-Boggle.jpg)

Some months ago, while idly scrolling Twitter, I spotted a tweet from Reto Meier:

This looked like a really interesting challenge and a fun way to play with Android development and algorithms. There was also a comment from Hoi Lam:

Adding a little ML Kit to it was a really good idea: it would allow capturing the input for the algorithm in a modern, AI-powered way.

So I decided to take this journey, which I completed in roughly a couple of weeks.

Then I decided to dedicate more time to the project, to learn…

A week ago I read Sara Robinson's article about how to add computer vision to an iOS app.

It is a great article built around a cool idea: developing a serverless application that combines the Firebase APIs (Cloud Storage, Cloud Functions, and Cloud Firestore) with a cloud service like the Google Vision API, which offers machine-learning-powered image recognition.

I started thinking about a cool application using these services, but I'm an Android developer. So I tried to bring the same functionality to an Android app written in Kotlin.

This app allows you to upload a picture…

Dev log of the Teamwork 2 Project for Udacity VR Nanodegree

Testing Cardboard VR

Udacity Teamwork is a super cool opportunity to "learn by doing" what you are studying. So when I received the email about the start of VR Teamwork 2, I immediately answered YES! "Colors" was the theme!

I have some experience organizing teams, so I decided to be the project leader for Team Sorrento :) (I chose the Italian name to honor my origins).

My team of four was composed entirely of beginners, so I had the big responsibility of guiding them to learn while having fun.

First thing…

How to start developing augmented reality apps with ARCore

Augmented reality applications are spreading around us thanks to the evolution of computer vision algorithms and the relative ease of development using powerful frameworks such as Vuforia and ARKit.

Even Google announced its own AR framework at the end of August 2017, offering developers a new software-only solution to create augmented reality applications in an easy way.


Google has finally released the 1.0 version of ARCore.


I've updated the sample source code to the new SDK. Have a look :) to compare the changes in the API.


Daydream is the high-quality VR platform by Google. It was presented at Google I/O '16 by Clay Bavor. Today this platform is compatible with only one headset: Google Daydream View, a sort of Cardboard 3.0 with a 3DOF controller. And only a small subset of high-end smartphones is compatible with this platform.

Published on December 15th, 2015 | by Giovanni Laquidara

Back in August 2011, Mark Andreessen stated in the Wall Street Journal that "Software is eating the world".

And now? How is it doing? Almost everything around us is based on software. Every business is, or at some point will be, based on software. Nowadays every one of us has software running in our pockets (on a smartphone) and on our wrists (on a smartwatch). We drive software-based cars, and soon they will become software-driven cars. We use software to play, to organize our lives, to look for our mates…

Giovanni Laquidara

Developer Advocate @ Huawei, Android, ML and VR/AR lover
