Building The Obvious

Lessons from the MobileXGenAI Hackathon

Sohel Kanaan
AI monks.io
Jun 17, 2023


Joining a hackathon is incredibly exciting. The teamwork, the stress, and the bright ideas bouncing around make for a mix of tiring and thrilling moments. We were lucky to be part of the ‘MobileXGenAI Hackathon’, a one-of-a-kind event all about Generative AI (GenAI) for mobile apps, with a particular focus on solutions that run directly on the device rather than in the cloud.

The Birth of LLMobile

Out of all the ideas we encountered, one shone brightly and became our guiding light: LLMobile. This innovative solution gave developers the power to use prompt engineering with Large Language Models (LLMs) directly on mobile devices, without having to ship an LLM of their own. As we worked hard to bring this idea to life, it felt like we were on the brink of a significant breakthrough.

The Technical Backbone

LLMobile is built directly into the mobile operating system. It uses Inter-Process Communication (IPC), a mechanism within the operating system, to let any app interact with the Large Language Model (LLM) hosted on the device. Crucially, developers connect to the LLM through a specialized SDK that handles the IPC for them. This lets them use LLMs efficiently, on-device, without enlarging their app or depending on a cloud-based model.
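To make the idea concrete, here is a minimal, hypothetical sketch of the kind of plumbing such an SDK could hide, using Android’s Messenger-based IPC. The class name, intent action, package, and message keys are all invented for illustration; this is not the actual LLMobile API.

```kotlin
import android.content.ComponentName
import android.content.Context
import android.content.Intent
import android.content.ServiceConnection
import android.os.Bundle
import android.os.Handler
import android.os.IBinder
import android.os.Looper
import android.os.Message
import android.os.Messenger

class LlmobileClient(private val context: Context) {

    private var service: Messenger? = null
    private var onReply: ((String) -> Unit)? = null

    // Receives completions sent back by the (hypothetical) on-device LLM service.
    private val replyMessenger = Messenger(object : Handler(Looper.getMainLooper()) {
        override fun handleMessage(msg: Message) {
            msg.data.getString("completion")?.let { onReply?.invoke(it) }
        }
    })

    private val connection = object : ServiceConnection {
        override fun onServiceConnected(name: ComponentName, binder: IBinder) {
            service = Messenger(binder) // wrap the remote binder for IPC
        }

        override fun onServiceDisconnected(name: ComponentName) {
            service = null
        }
    }

    // Bind to the OS-level LLM service; the intent action and package are made up.
    fun connect() {
        val intent = Intent("com.example.llmobile.BIND_LLM")
            .setPackage("com.example.llmobile.service")
        context.bindService(intent, connection, Context.BIND_AUTO_CREATE)
    }

    // Send a prompt across the process boundary; the completion comes back
    // asynchronously through replyMessenger.
    fun generate(prompt: String, callback: (String) -> Unit) {
        onReply = callback
        val msg = Message.obtain().apply {
            data = Bundle().apply { putString("prompt", prompt) }
            replyTo = replyMessenger
        }
        service?.send(msg)
    }

    fun disconnect() = context.unbindService(connection)
}
```

With something like this in place, an app would simply call `client.generate("Summarize this note") { text -> /* use the completion */ }` and never bundle any model weights of its own.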

This concept wasn’t completely new. Google had already built similar functionality into Google Play Services to power its ML Kit, giving developers ready-made machine-learning solutions and making AI integration for some tasks much easier.

(Figure) Effortless Sharing: How Apps and Google Play Services Talk to Each Other
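For a sense of what that “ready-made” experience looks like in practice, here is a short sketch using ML Kit’s public text-recognition API. It follows the documented Kotlin surface; Gradle setup and the choice between the bundled and Play Services-delivered model are omitted.

```kotlin
import android.graphics.Bitmap
import android.util.Log
import com.google.mlkit.vision.common.InputImage
import com.google.mlkit.vision.text.TextRecognition
import com.google.mlkit.vision.text.latin.TextRecognizerOptions

fun recognizeText(bitmap: Bitmap) {
    // The model and inference live behind this client, not inside the app.
    val recognizer = TextRecognition.getClient(TextRecognizerOptions.DEFAULT_OPTIONS)
    val image = InputImage.fromBitmap(bitmap, /* rotationDegrees = */ 0)

    recognizer.process(image)
        .addOnSuccessListener { result -> Log.d("MLKit", result.text) }
        .addOnFailureListener { e -> Log.e("MLKit", "Recognition failed", e) }
}
```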

However, we decided to push the boundaries. With LLMobile, we created a Minimum Viable Product (MVP) using the LLaMA model, an LLM that can run directly on mobile devices. This meant developers could harness the power of LLMs without suffering from increased latency, privacy issues, or bloated app sizes.
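On the other side of the IPC boundary sits the service process that hosts the model. The sketch below is hypothetical: `LocalLlamaModel` is a stand-in for whatever native wrapper actually loads and runs the LLaMA weights in the MVP, and the message keys simply mirror the client sketch above.

```kotlin
import android.app.Service
import android.content.Intent
import android.os.Bundle
import android.os.Handler
import android.os.IBinder
import android.os.Looper
import android.os.Message
import android.os.Messenger

// Stand-in for a native (e.g. JNI) wrapper around an on-device LLaMA runtime.
class LocalLlamaModel {
    fun complete(prompt: String): String {
        // Real inference would run here; this stub just echoes the prompt.
        return "completion for: $prompt"
    }
}

class LlmService : Service() {

    private val model = LocalLlamaModel()

    // Handles prompts arriving over IPC and replies via the Messenger each
    // client attaches to its request. A real implementation would move the
    // inference call off this handler onto a background thread.
    private val handler = object : Handler(Looper.getMainLooper()) {
        override fun handleMessage(msg: Message) {
            val prompt = msg.data.getString("prompt") ?: return
            val reply = Message.obtain().apply {
                data = Bundle().apply { putString("completion", model.complete(prompt)) }
            }
            msg.replyTo?.send(reply)
        }
    }

    private val messenger = Messenger(handler)

    // Client apps bind here and receive the Messenger's binder for IPC.
    override fun onBind(intent: Intent): IBinder = messenger.binder
}
```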

Facing the Judges

As we presented our solution to the panel, one judge’s comment struck us.

“You are building the obvious,” he said.

At first, it seemed a strange remark. But upon reflection, we realized what he meant. From a business perspective, this solution was something that the likes of Google’s ML Kit team or Apple’s Core ML team might naturally consider and be working on.

The judge’s comment revealed an essential facet of innovation: the notion of “obvious” is often subjective. What may seem groundbreaking to one person might be an evident next step to another. It’s a reminder that disruptive ideas can sometimes be the most obvious ones, lying in wait for someone to execute them.

This realization, however, did not diminish our pride or accomplishment. Instead, it added a layer of depth to our understanding of the innovation landscape. Whether our solution was ‘obvious’ or ‘revolutionary’ didn’t matter. What mattered was that we identified a problem, designed a solution, and brought it to life.

Final Thoughts

The MobileXGenAI Hackathon was an invaluable experience. We walked away not only with technical insights and a promising solution, but also with a profound lesson in innovation and business strategy.

To the innovators and entrepreneurs out there: don’t shy away from the ‘obvious’. After all, today’s obvious can become tomorrow’s indispensable.

