Everyday AI …

Vaibhav Satpathy
Published in Grey Matter AI
6 min read · Apr 25, 2021

AI — your daily companion

For many people, the first things that come to mind when I say AI are probably names such as WALL-E, Terminator, Chappie, or maybe everyone's hot favourite, JARVIS. Although it is true that all of them are AI, our human civilisation is decades away from achieving any such milestone. People often ask me why that is. The answer is very simple.

What we have today is synthetic intelligence that is capable of performing some mundane tasks exceptionally well, even better than humans for that matter, but it can only perform one task at a time. Multi-tasking is still a great wall left to jump over. Which seems fair, after all, AI has only been around for a decade. But what if I told you that is wrong?

It’s true, AI has been a part of our world since the day it was conceived, though it really came into significance only in the late 1950s. Most of the 50s and 60s were spent on extensive mathematical modelling and research that led us pretty much nowhere. Although that research laid the grounds on which our current AI stands, we, like always, ignored the hardware limitations and the growth needed to implement it at the time.

It was in the 2010s that our world finally started realising the importance and scope of AI, and again it was a stumbled-upon discovery. Nvidia, which had no intention of building GPUs for neural computation, is now the leading brand in that market. It built GPUs for heavy-duty gaming and faster computation; it was gradual research by the community that paved the road for Nvidia to imagine the scope and importance of GPUs in AI, and how they could revolutionise the century.

Since then, pretty much everything in our lives has AI in it. In this article we will take a look at some of the basic, fundamental activities of our lives being carried out by AI without our knowledge. In very simple language —

Every action that has a pattern can be automated, and such an application is called ARTIFICIAL INTELLIGENCE.

Mobile keyboard

Well, all your mobile keyboards have a recommendation engine personalised to your typing patterns. But that is something we are all aware of. The additional hidden technology alongside it is something you can control from your settings. It’s called the Swype keyboard.

Many phones with this functionality allow the user to swipe a finger across the keyboard, covering the letters that make up the word they want to type. The underlying algorithm takes all the letters fed to it in a single swipe and creates a plethora of possible combinations out of them. These combinations are then filtered based on the frequency of the words you prefer; in addition, based on your sentence structure, the next set of possible words from the same swipe is also recommended to the user.

At a high level you could say that is just a recommendation system, but that’s not true. It encompasses —
Predictive model
Language model
Auto-correction model
Reinforcement engine
Recommendation system

Face and Finger print unlock

Most devices today come with built-in face and fingerprint unlock. Here’s a thought to ponder: we have always been told that it requires huge volumes of data to train a classification engine to satisfactory accuracy. Then how is it that our personal devices can learn our facial features, our fingerprints, or even our voice prints with such great accuracy from barely 5 samples?

Well, what is obvious is that we use AI to recognise and predict these features, so we need not dwell on that. But the catch in the story is the concept of —
Transfer Learning
Incremental Learning
Reinforcement Learning
Federated Learning

Every one of the branches mentioned above is a major requirement to make even the smallest and simplest of applications foolproof and production ready.

The learnings from models existing on various edge devices should potentially be re-usable across multiple other devices — Transfer learning

Gradually increasing the number of categories the system can identify, without forgetting the original learnings — Incremental learning
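To make the incremental-learning idea concrete, here is a minimal sketch using a nearest-centroid classifier: enrolling a new category only stores a new mean feature vector, so the centroids already learned are untouched. The 2-D points stand in for real face or fingerprint embeddings and are entirely made up.

```python
# Incremental learning sketch: adding a class never retrains or overwrites
# the existing ones, so the "original learnings" are never forgotten.

def centroid(samples):
    n, dims = len(samples), len(samples[0])
    return tuple(sum(s[d] for s in samples) / n for d in range(dims))

class NearestCentroid:
    def __init__(self):
        self.centroids = {}  # label -> mean feature vector

    def add_class(self, label, samples):
        # Enrolling a new category does not modify existing centroids.
        self.centroids[label] = centroid(samples)

    def predict(self, x):
        def dist2(c):
            return sum((a - b) ** 2 for a, b in zip(x, c))
        return min(self.centroids, key=lambda lbl: dist2(self.centroids[lbl]))

clf = NearestCentroid()
clf.add_class("alice", [(0.9, 0.1), (1.1, -0.1)])   # learned first
clf.add_class("bob",   [(-1.0, 0.2), (-0.8, 0.0)])  # added later, no retraining
print(clf.predict((1.0, 0.0)))  # alice
```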

Every piece of user feedback needs to be taken into account to improve and upgrade the model over time — Reinforcement learning

Gathering the learnings from multiple personalised edge devices without accessing their data — Federated learning
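The core of the federated idea can be sketched as federated averaging: each device trains locally and sends only its model weights, and the server combines them weighted by how much data each device holds, so raw user data never leaves the device. The weights below are plain lists of floats standing in for real model tensors, and the numbers are invented.

```python
# Federated averaging (FedAvg) sketch: combine client weight vectors into
# one global model, weighted by each client's number of local samples.

def federated_average(client_updates):
    """client_updates: list of (num_samples, weight_vector) pairs."""
    total = sum(n for n, _ in client_updates)
    dims = len(client_updates[0][1])
    return [
        sum(n * w[d] for n, w in client_updates) / total
        for d in range(dims)
    ]

# Three hypothetical phones, holding differing amounts of local data.
updates = [
    (100, [0.2, 0.4]),
    (300, [0.4, 0.0]),
    (100, [0.2, 0.9]),
]
print(federated_average(updates))  # data-weighted mean of the weights
```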

Search engines

Every search engine built within an application uses multiple layers of AI to enhance its performance. Some of the essential building blocks of such an engine are as follows —
Auto-correction Model
Language Model
TF-IDF Model (Term Frequency-Inverse Document Frequency)
Relational Mapping

It’s essential for the system to make corrections to the user’s input when necessary; if the user is searching for something, it often means the individual is uncertain about it, hence correction is a must.

TF-IDF is a must to move beyond rule-based dictionary search, where the results rest on the assumption that the query is spelled exactly right. Instead of matching words one by one, it weighs each term of the search by how informative it is: a term that appears in nearly every document carries little signal, while a rare term narrows the results down sharply.
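The standard TF-IDF formula is term frequency times the log of inverse document frequency. Here is a toy version over three tiny invented "documents", showing how a rare term ("jaguar") outweighs a ubiquitous one ("car"):

```python
# Toy TF-IDF scoring: rare terms across the corpus score high, common
# terms score low. The corpus is invented purely for illustration.
import math

docs = {
    "d1": "car speed car engine".split(),
    "d2": "car price".split(),
    "d3": "jaguar car habitat".split(),
}

def tf_idf(term, doc_id):
    doc = docs[doc_id]
    tf = doc.count(term) / len(doc)                 # term frequency
    df = sum(term in d for d in docs.values())      # document frequency
    idf = math.log(len(docs) / df)                  # inverse document frequency
    return tf * idf

# "car" appears in every document, so idf = log(3/3) = 0 and it carries
# no signal; "jaguar" appears in only one, so it dominates d3's score.
print(tf_idf("car", "d1"), tf_idf("jaguar", "d3"))
```

Real engines add smoothing and length normalisation on top, but the weighting intuition is exactly this.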

Relational mapping is most crucial for such a task because, as vast as the world has become today, there are hundreds of results catering to different topics under the same terminology. Hence it’s a must to understand the user’s intent and relate it to the relevant information the individual requires.

Google maps

One of the most relevant examples of our times is Google Maps. Not only does it have a recommendation engine based on multiple parameters extracted after years of feature engineering, it has also been extended with 3D models and predictive modelling, not just for the time a journey consumes but for how weather patterns, user feedback and the dynamic nature of our environment affect that journey.
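One strand of that predictive modelling can be sketched very simply: keep running statistics of observed travel times per road segment, conditioned on the weather, and fold in each user-reported trip as feedback. This is only a hedged toy, nothing like Google's actual models; the class, segment names and numbers are all invented.

```python
# Sketch of feedback-driven travel-time prediction: a running average of
# reported trip times per (segment, weather) bucket.
from collections import defaultdict

class EtaModel:
    def __init__(self):
        self.stats = defaultdict(lambda: [0.0, 0])  # key -> [sum_minutes, count]

    def report(self, segment, weather, minutes):
        # Each user-reported trip updates the bucket's running statistics.
        s = self.stats[(segment, weather)]
        s[0] += minutes
        s[1] += 1

    def predict(self, segment, weather, fallback=10.0):
        s = self.stats.get((segment, weather))
        return s[0] / s[1] if s and s[1] else fallback

eta = EtaModel()
for m in (12.0, 14.0, 13.0):
    eta.report("bridge", "rain", m)   # feedback gathered in rainy conditions
eta.report("bridge", "clear", 8.0)

print(eta.predict("bridge", "rain"))   # 13.0
print(eta.predict("bridge", "clear"))  # 8.0
```

The real system learns from far richer features (time of day, live traffic, road type), but the principle of conditioning predictions on observed context is the same.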

These are some very common use cases identified by visionaries over time to help make society a better place to live in. Even basic activities such as walking, talking, writing and reading are based on human cognitive patterns, and every one of them can be leveraged over time to develop an AI, not only to ease the process but sometimes to lend it an empathetic, human touch.

Believe it or not, such everyday technologies incorporate such intricacies of AI and software engineering that sometimes all you need is a little perspective, and you can work wonders for society.

I hope this article triggers curiosity and creativity in some part of your brain, to look at the world from a different perspective and bring out the best in it. 😁
