The dawn of contextual artificial intelligence

David McNally
4 min read · Feb 24, 2016

Early one morning I walked into our kitchen and said, “Alexa, good morning!”

Our Amazon Echo replied with a cheery “Good morning,” and then proceeded to tell me that today was the anniversary of an invention by Thomas Edison.

It, or she, as I’m prone to refer to her, included a humorous play on words that brought a smile to my face.

Good on the Amazon software engineers who programmed this interaction. I am both impressed and hungry for more.

What lies ahead for artificial intelligence?

The Echo is an amazing gadget. My entire family asks Alexa questions, sets timers and alarms, and gets news and weather on a whim.

However, my wife and I wonder if we’ll ever see a truly helpful contextual assistant.

What if you were to give Alexa a checklist of activities you want your kids to do every night?

For example:

  • Change into pajamas
  • Use the facilities
  • Wash hands and face
  • Brush teeth

The kids would tell Alexa, “Good night!”

Alexa would respond, “Good night. Let’s make sure you’re really ready for bed.” And then she would interactively ask if each item on the list was accomplished.

In the case of a checklist for children, contextual A.I. would become a critical management tool.
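I’m imagining something like the following rough sketch (in Python, since that’s easy to read). Nothing here is a real Alexa API; the checklist comes straight from the list above, and the dialogue wording and the `ask` callback are invented purely to illustrate the flow.

```python
# Hypothetical sketch of a bedtime-checklist interaction, not the real Alexa Skills Kit API.
# The checklist items come from the list above; the dialogue wording is invented.

BEDTIME_CHECKLIST = [
    "Change into pajamas",
    "Use the facilities",
    "Wash hands and face",
    "Brush teeth",
]

def run_bedtime_check(ask):
    """Walk through each checklist item, asking the child to confirm it.

    `ask` is any callable that poses a yes/no question and returns True/False,
    standing in for whatever voice interface the assistant actually uses.
    """
    missed = []
    print("Good night! Let's make sure you're really ready for bed.")
    for item in BEDTIME_CHECKLIST:
        if not ask(f"Did you {item.lower()}?"):
            missed.append(item)
    if missed:
        print("Almost there! Please go back and: " + ", ".join(missed))
    else:
        print("All done. Sweet dreams!")

if __name__ == "__main__":
    # Simple console stand-in for a voice yes/no exchange.
    run_bedtime_check(lambda q: input(q + " (y/n) ").strip().lower().startswith("y"))
```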

Is this a possible future for a contextual artificial intelligent agent?

If our devices are to become true partners, they will become an extension of the human brain. They will tell us what we need to accomplish based on time, location and priority.

For example, I want to know what it is that I might have forgotten as I head out the door!

If I’m driving to a meeting in another city, I want my intelligent car to know about the upcoming journey and let me know if there are any maintenance issues or things I need to take care of before the trip. Like, is the tank full? I should go through a verbal checklist with the intelligent agent to ensure everything that goes into getting me safely to my destination is taken care of.
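The rule I have in mind is simple enough to sketch. In this Python illustration, the trip distance, fuel range, fuel buffer and maintenance items are all made-up inputs; a real car or calendar would supply them through whatever interfaces it actually exposes.

```python
# Hypothetical sketch of the pre-trip check described above.  The calendar trip,
# the car telemetry fields, and the thresholds are invented for illustration;
# no real in-car or calendar API is being used here.

def pre_trip_checklist(trip_miles, fuel_range_miles, open_maintenance_items):
    """Return the things the agent should mention before a known upcoming drive."""
    reminders = []
    if fuel_range_miles < trip_miles * 1.2:          # assume a 20% fuel buffer
        reminders.append("Fuel up before you leave; the tank won't cover the trip.")
    for item in open_maintenance_items:
        reminders.append(f"Outstanding maintenance: {item}.")
    if not reminders:
        reminders.append("The car is ready for the drive.")
    return reminders

# Example: a 180-mile drive with 150 miles of range and one open service item.
for line in pre_trip_checklist(180, 150, ["tire rotation overdue"]):
    print(line)
```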

Throughout the day, A.I. could adapt to new data like time and location.

New reminders and tips would be provided based on context. Advanced A.I. would anticipate my needs and communicate accordingly.

In the near future, the task of communication will hand off seamlessly from household devices to mobile devices, with a constant reach back to the Cloud. You’ll never have to restart a conversation with your agent. It will already know what you’re going to talk about; it’s just the continuation of a conversation, regardless of device.
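In sketch form, that handoff is just conversation state that lives with the user instead of the device. The dictionary standing in for the Cloud and the device names below are my own placeholders, not any real assistant backend.

```python
# Hypothetical sketch of conversation state living in the cloud so any device
# can pick up where another left off.  The storage (a dict keyed by user) and
# the device names are stand-ins; a real assistant would use an actual backend.

CLOUD_CONVERSATIONS = {}   # user_id -> list of (device, utterance) turns

def say(user_id, device, utterance):
    """Record a turn from whichever device the user happens to be near."""
    CLOUD_CONVERSATIONS.setdefault(user_id, []).append((device, utterance))

def resume(user_id, device):
    """On a new device, surface the last turn so the conversation just continues."""
    turns = CLOUD_CONVERSATIONS.get(user_id, [])
    if turns:
        prev_device, last = turns[-1]
        print(f"[{device}] Picking up where we left off on the {prev_device}: '{last}'")
    else:
        print(f"[{device}] Starting a fresh conversation.")

say("david", "Echo", "Remind me to pack the camera for Saturday's trip.")
resume("david", "iPhone")   # the phone continues the same thread on the way out the door
```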

Imagine how such an assistant would empower your life! You would never forget anything again.

An A.I. assistant would review emails and messages and proactively alert me. In my experience, this is where current systems falter.

My Apple Watch, for example, promises alerts but almost always falls short. Customizing which alerts get through, with an eye toward prioritization, is a necessity.

Let’s say I want messages from my wife to get top priority. I envision a wireless earphone, similar to a hearing aid, that would unobtrusively ask, “Excuse me David, may I give you an important message from your wife?”

I would tap the earphone, or verbally respond, “Yes.” What follows would depend on the message. If it’s an audio message or voicemail, the actual audio would stream from the iPhone to the earphone. If it’s a text message, as I flick my wrist, the message would flash on my Apple Watch, or if I raise my iPhone or iPad, the message would instantly appear.

Somehow, I should be able to signal my assistant that I would like to reply, either verbally, or through a preset response.
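Boiled down, the prioritization rule I’m imagining is tiny. This sketch invents the sender labels, the permission prompt, and the routing targets purely for illustration; it is not any real Apple or Amazon notification API.

```python
# Hypothetical sketch of the alert-prioritization rule described above.
# Sender labels, priority levels, and routing targets are all assumptions.

from dataclasses import dataclass

PRIORITY_SENDERS = {"wife"}          # contacts whose messages always get through

@dataclass
class Message:
    sender: str        # relationship label, e.g. "wife", "coworker"
    kind: str          # "audio" or "text"
    body: str

def route_message(msg: Message, user_accepts) -> str:
    """Decide whether and where to deliver a message.

    `user_accepts` stands in for the earphone tap or verbal "yes".
    Returns a short description of what the assistant would do.
    """
    if msg.sender not in PRIORITY_SENDERS:
        return "held for later summary"
    if not user_accepts(f"Excuse me, may I give you an important message from your {msg.sender}?"):
        return "deferred until asked"
    if msg.kind == "audio":
        return "streaming audio to the earphone"
    return "flashing text on the watch or phone"

# Example: a priority text message the user agrees to hear about.
print(route_message(Message("wife", "text", "Pick up milk"), lambda prompt: True))
```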

On the run

Another example of a potential A.I. chain of action might go something like this: “Time for a run!” Right now I can tell Siri on my Apple Watch, “Start my run!” Often, it doesn’t work. And it’s oh, so slow; sometimes it takes more than 45 seconds to start. Hint: it should start immediately, or upon sensor recognition that I’ve started to move!

But I want even more than speed. I want my smart devices to know my preferences and act upon them.

I only run with the app “Charity Miles,” for example. Also, I always track my runs with the apps “Runkeeper” and “Argus.”

Hey smart device, if I do this every time I run, why not learn my routine and ask me if you should take care of this in the background each time? And please, do it instantly!

When I’m done running, my device would recognize that the running has stopped (not just paused for traffic). Again, a quiet voice in my ear: “Are you finished?” Maybe it would cue off data from Runkeeper showing I just met the 5K or 10K distance goal I usually run. The data is there; I just need my contextual A.I. agent to interpret it, suggest options and then take action.
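Here’s one way to read “learn my routine”: if the same apps start every recent run, offer to launch them, and use my usual distance goal to guess when I’m done. The app names come from my own habit above; the three-run rule and the 5K check are assumptions made for the sake of the sketch.

```python
# Hypothetical sketch of the "learn my running routine" idea above.
# The app names come from the post; the learning rule (launch apps seen at the
# start of every recent run) and the 5K finish check are invented for illustration.

from collections import Counter

RECENT_RUN_LAUNCHES = [
    ["Charity Miles", "Runkeeper", "Argus"],
    ["Charity Miles", "Runkeeper", "Argus"],
    ["Charity Miles", "Runkeeper", "Argus"],
]

def learned_routine(history, min_runs=3):
    """Apps launched at the start of every one of the last `min_runs` runs."""
    if len(history) < min_runs:
        return []
    counts = Counter(app for run in history[-min_runs:] for app in run)
    return [app for app, n in counts.items() if n == min_runs]

def on_run_start(history):
    routine = learned_routine(history)
    if routine:
        print(f"Starting your run. Shall I also launch {', '.join(routine)} in the background?")
    else:
        print("Starting your run.")

def on_motion_stopped(distance_km, usual_goal_km=5.0):
    """Cue off tracker data: if the usual goal was met and motion stopped, ask quietly."""
    if distance_km >= usual_goal_km:
        print("Are you finished? You've hit your usual 5K.")

on_run_start(RECENT_RUN_LAUNCHES)
on_motion_stopped(5.2)
```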

By now, you probably get that the vision of an intelligent agent assisting us through our day is attainable in the near future.

We’re almost there. We just need smart programmers to connect the dots and give us apps that learn from established behaviors, suggest potential courses of action, and then act on them!

I’m not a programmer. But there are a lot of really smart people out there who are. I’m fascinated by the possibilities.

Alexa (or Siri), take me to the future now!


David McNally

Photographer, writer, futurist, technologist and retired soldier