And a cool new product for you to try!

When we created Snips a few years ago, we did so because we believed in using Artificial Intelligence to solve everyday problems. From predicting passenger flow in public transport to anticipating car accidents, we always tried to find a way to bring the power of machine learning to consumers.

But then, we started thinking more long term, about what will happen with all of those connected devices, bots and apps coming onto the market. What we realized is that there is no way we will be able to cope with so much technology, at least not with the current way we interact with it.

After all, with only 2 devices per person on average today (a phone and a computer), we are already overwhelmed by technology. Who hasn’t felt the urge to check their phone when receiving a notification, or felt desperate looking at their email inbox growing faster than they can handle? Even simple things, like retrieving your hotel reservation and booking a taxi to get there, require you to keep a mental map of where your data and apps are located, going back and forth between them.

The problem is that the number of devices we interact with is growing exponentially, and will surpass 100 billion by 2025. Just imagine getting into a room and needing to use 20 different apps to control your lights, sound system, TV, thermostat, etc. What a nightmare!

So if humans alone can’t handle the complexity of technology, why not create an Artificial Intelligence that can take care of it for us? Given enough data about our lives, such an AI could become our digital alter-ego, answering any question we might have, doing anything we ask it to, and automating devices we don’t want to handle. It would be ubiquitous, always available, and able to replicate our behavior. A bit like in the movie Her.

Once the capabilities of AIs grow beyond the complexity that connected devices bring, technology will simply disappear into the background. This era, called “Ubiquitous Computing”, is when everything is hyper-connected but actually feels unplugged. We will finally have the peace of mind to focus on what we, as humans, really care about.

Making AIs understand us

Building an AI that truly understands us is a very difficult task, as we have seen from the various attempts in the last few years. It is important to note that here, we are talking about a personal AI that each of us will have, rather than some super AI that is shared amongst all of us.

One of the main reasons is that it’s fairly easy to understand a well-structured, unambiguous query (e.g. “Book an Uber to 18 rue Saint-Marc”), but very hard to understand one that includes ambiguous personal references (e.g. “Book an Uber to Rand’s birthday party”). In the first example, a simple parser can detect “Uber” as the action and “18 rue Saint-Marc” as the address: the query is self-contained.

A self-contained query that is straightforward for an AI to understand
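To make this concrete, here is a minimal sketch of such a parser: a toy regex-based example (the pattern and field names are our own illustration, not Snips’ actual implementation):

```python
import re

def parse_query(query):
    """Toy parser: extract the action and a literal address from a
    self-contained query like 'Book an Uber to 18 rue Saint-Marc'."""
    match = re.match(r"Book an? (\w+) to (.+)", query)
    if not match:
        return None  # the query doesn't fit the simple pattern
    return {"action": match.group(1), "address": match.group(2)}

print(parse_query("Book an Uber to 18 rue Saint-Marc"))
# {'action': 'Uber', 'address': '18 rue Saint-Marc'}
```

Since both the action and the address are spelled out in the sentence, no external knowledge is needed.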

But to be able to understand the second example, the AI needs something more: a working memory.

Human memory is a complex network of neurons that constantly form new connections as new information becomes available. Rather than storing information sequentially like a traditional database, the brain scatters it across regions, linking it through neural pathways. This unique structure enables it to store complex, interrelated concepts, forming what we call a “knowledge graph”.

Example of a knowledge graph generated between someone on my team and me

Using this graph, the human brain is able to retrieve information by navigating through linked concepts, using whatever is easiest to remember as its entry point. For example, if we are sitting next to each other, and I say “meet me at the office in an hour”, your brain will first recall the concept of “Rand” (the easiest, since I am right there with you), and from there navigate to the concept of “Snips”, and finally to the actual address “18 rue Saint-Marc”.
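As a rough sketch, such a graph can be modeled as nodes linked by labeled relations. The nodes and relation names below are illustrative, mirroring the example above:

```python
# A minimal knowledge graph: concepts as nodes, relations as labeled edges.
graph = {
    "Rand":  {"works_at": "Snips"},
    "Snips": {"located_at": "18 rue Saint-Marc"},
}

def resolve(start, path):
    """Follow a chain of relations from an entry-point concept."""
    node = start
    for relation in path:
        node = graph[node][relation]
    return node

# "meet me at the office": start from the concept "Rand" (the entry point),
# hop to "Snips", then to the actual address.
print(resolve("Rand", ["works_at", "located_at"]))
# 18 rue Saint-Marc
```

The entry point can be whatever concept is easiest to reach; the traversal does the rest.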

Replicating this in an AI is quite challenging though, since the user’s data is fragmented between several providers, often without any way to access it. Nonetheless, we can already access quite a few sources, such as contacts, emails, calendars, locations visited, photos and messages, and link them together to create the user’s knowledge graph. For example, a calendar event will be linked both to the place where it occurs and to the corresponding contacts, themselves linked to their social media profiles.

Taking again our previous example of booking an Uber to Rand’s birthday party, the AI would thus follow these steps to understand the query:

  1. search for “Rand” in contacts
  2. search for a related event that happens soon and contains “birthday”
  3. retrieve the address of the event
  4. book an Uber, giving it the address of the event as a parameter
An ambiguous query that requires the knowledge graph to be understood by the AI
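The four steps above can be sketched in code as follows. The data structures and the `book_uber` placeholder are hypothetical, standing in for real contact, calendar and ride-hailing APIs:

```python
from datetime import datetime, timedelta

# Mock user data; all names and fields are illustrative.
contacts = [{"name": "Rand Hindi"}]
events = [{"title": "Rand's birthday party",
           "contact": "Rand Hindi",
           "when": datetime.now() + timedelta(days=2),
           "address": "18 rue Saint-Marc"}]

def book_uber(address):
    # Placeholder for a real ride-hailing API call.
    return f"Uber booked to {address}"

def handle(person, keyword):
    # 1. search for the person in contacts
    contact = next(c for c in contacts if person in c["name"])
    # 2. find an upcoming event linked to that contact containing the keyword
    event = next(e for e in events
                 if e["contact"] == contact["name"]
                 and keyword in e["title"].lower()
                 and e["when"] > datetime.now())
    # 3. retrieve the address of the event
    address = event["address"]
    # 4. book the ride, passing the address as a parameter
    return book_uber(address)

print(handle("Rand", "birthday"))
# Uber booked to 18 rue Saint-Marc
```

Each step navigates one more hop through the linked data, exactly as the graph traversal described earlier.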

Without the underlying memory and knowledge graph, the AI would not have been able to understand what the user wanted. This is the reason AI products today don’t work well: they simply lack the memory layer.

Introducing Snips: Your Intelligent Memory

Rather than jumping straight to giving Snips natural language capabilities, we decided to focus first on its memory layer. By building your personal knowledge graph, and eventually adding language on top of it, we are ensuring Snips will actually understand what you ask.

Today, we are proud to announce the release of our first product: an Intelligent Memory for iOS!

It allows you to store and retrieve the information buried deep in your phone, removing the need to juggle apps and copy-paste stuff around. Instead, Snips acts as your single entry point to retrieve your data.

The way it works is quite simple: connect your calendar, contacts, location and email, and Snips will link them to create your personal knowledge graph. You can then search anything stored in Snips’ memory, using any related keyword. From there, you can navigate to related pieces of information, and deeplink into your apps.
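Conceptually, the keyword search works like this minimal sketch (the items and their fields are made up for illustration, not Snips’ actual index):

```python
# Toy in-memory index over linked items; fields are illustrative.
memory = [
    {"kind": "event",   "title": "Dinner with Clara", "place": "Le Mary Celeste"},
    {"kind": "contact", "title": "Clara Dupont",      "email": "clara@example.com"},
]

def search(keyword):
    """Return every item whose fields mention the keyword."""
    keyword = keyword.lower()
    return [item for item in memory
            if any(keyword in str(value).lower() for value in item.values())]

# One related keyword surfaces both the event and the linked contact.
for item in search("clara"):
    print(item["kind"], "->", item["title"])
```

Because items are linked, any related keyword can serve as the entry point, and from each result you can hop to its neighbors or deeplink into the owning app.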

If Snips can’t find something in your data, it will look outside, querying Foursquare and Google Maps to return relevant results (we will add more providers over time).

Whether it’s an upcoming appointment, the name of the person you had a meeting with, or simply the reservation number for the next gig you’re going to see, it all becomes intuitively searchable in Snips.

It gets better though: the more you use Snips, the better it understands you, suggesting things before you even search for them!

How Snips learns to predict where you want to go next

Take it for a spin and give us your feedback! This is a long term project we really care about, so we could use your help :-)

ps: Android is coming soon, with even more cool features!

Privacy by Design

But entrusting an Artificial Intelligence with this much personal data poses major privacy issues. This is why we have taken a radically new approach: Privacy by Design.

Everything Snips does either runs locally on your smartphone, or uses cutting-edge cryptographic techniques such as Homomorphic Encryption, to ensure your data can never be seen by anyone else but you.
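To illustrate the homomorphic idea (not Snips’ actual scheme), here is a toy example using textbook RSA’s multiplicative homomorphism with tiny primes. It is completely insecure and only shows the principle: a server can compute on ciphertexts without ever decrypting them.

```python
# Textbook RSA with the classic toy parameters p=61, q=53 (never use in practice).
n, e, d = 3233, 17, 2753

def enc(m):
    return pow(m, e, n)

def dec(c):
    return pow(c, d, n)

a, b = 6, 7
# Multiplying two ciphertexts yields a valid encryption of the product
# of the plaintexts, computed without ever seeing a or b.
product_cipher = (enc(a) * enc(b)) % n
print(dec(product_cipher))
# 42
```

Schemes used in practice support richer operations, but the principle is the same: the data stays encrypted end to end.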

This is important, because if we stored your data unencrypted in the cloud, we would become a target for every hacker and government on the planet. After all, why would they go and hack 50 providers if we already did the job of centralizing everything? At Snips, we consider personal data a toxic asset: we want as little of it as possible, and we get rid of it as fast as possible.

This privacy guarantee is unlike that of any other consumer AI product currently on the market, and is built upon years of experience in both machine learning and cryptography. I cannot tell you how hard it is to make stuff run on a single device. It forces us to rethink every machine learning algorithm, while having to manage memory, CPU, battery, etc.

To help us push the boundaries of AI and privacy, we have an amazing entrepreneur who just joined our team: Javier Aguera, founder of Blackphone.

The future

Snips is already capable of many things, improving the way we interact with our phone on a daily basis.

Going forward, we will focus on integrating even more sources of personal data (photos, social networks, …) and external providers (movies, songs, tickets, …). This will ensure Snips becomes relevant in even more scenarios, until the point where you won’t need to use your phone’s homescreen to access your apps anymore.

We are also working hard on adding natural language capabilities to Snips, so that you can ask it to do things directly, rather than having to search for it and then click a deeplink. To get there, we are exploring a few cutting-edge technologies, such as Neural Turing Machines, for which we published the first open source library. You can check some of our projects on our research page.

I hope you will enjoy using our product as much as we do here, and that we can work together towards building a Personal Artificial Intelligence that also protects our privacy!

If you enjoyed this article, it would really help if you hit recommend below, and show your support on Product Hunt!

You can also follow us on Twitter: @randhindi & @snips

Oh, and we are hiring!
