A.I.mpact, Part 1: Don’t Wait for A.I., It’s Already Here

John McCarthy once made a telling complaint that still applies to how people have perceived — or, rather, failed to perceive — the arrival of artificial intelligence in their lives: “As soon as it works, no one calls it AI anymore.”

He was more than qualified to carp about the fact, as a Turing Award winner and one of the founding fathers of AI. He’s even credited with coining the very term artificial intelligence back in 1956.

Here’s how McCarthy’s complaint relates to what’s going on today:

Innovations like AI often sneak up on us, gradually merging into the workaday and commonplace. They do it without any of the abrupt upheavals or manic melodramatics we’ve been conditioned to expect of AI by TV, the movies and (bad) science fiction. Or by the bolder speculations of futurists like Vernor Vinge and Ray Kurzweil, whose notion of an eventual Singularity has fired the imagination of a lot of people, including the occasional hack screenwriter.

When a sweeping transformation enters our lives incrementally, in a thousand small, pedestrian and often invisible ways, we don’t recognize it’s even arrived until that evolution has nearly run its course.

Truth be told, artificial intelligence won’t unexpectedly arrive in the shape of a cheesed-off robot anytime soon.

That’s because AI is already here. And it’s everywhere.

Know your AI (because it’ll know you)

One reason most of us, including the media, have wrongheaded expectations of AI is that we see it as a monolith, not realizing the term is a catch-all for different levels of capability:

Artificial Narrow Intelligence (ANI), or “Weak AI,” specializes in a single area, like the chess program that can beat a Grandmaster, but will never understand how to operate a toaster. It’s the task-specific level of AI that’s most common today.

Artificial General Intelligence (AGI), or “Strong AI” or “Human-Level AI”, means a system that’s as smart as a human being at every intellectual task, including being able to reason, problem-solve, think in abstractions, and learn from experience. The Blue Brain project used IBM’s Blue Gene supercomputing platform to simulate the neural network of a rat, with 10,000 neurons and 10⁸ synapses; the ability to duplicate a human brain’s functionality is vastly harder.

Artificial Superintelligence (ASI) is the trope of sci-fi movies and TV shows that gives the public a poor idea of AI, depicting machines smarter and faster than humans at every possible job. ASI presents us with the vision of an AI-dominated future that gives pause to many of us, though as long as we’ve got a Captain Kirk around we can probably talk them to death.*

ANI is becoming pervasive across our lives, from how we do online search to what route we drive to the appliance store to pick up a new fridge or dishwasher that’s also incorporating AI. Through deep learning, our smartphones, autos, wearables and homes already understand our needs and behaviors, and figure out the best way to serve them.

*Here’s a game to play with the Trekkies in your life: How many times did Kirk outsmart an AI during the run of the Original Series? There’s an answer here.

Teaching AI to learn, assist…and evolve

A lay person who wants some idea of what AI is all about and how it’s already affecting modern life could easily just Google “artificial intelligence” to find out more. Or he or she could visit Google’s own AI Experiments page, where a series of interactive exercises help users explore machine learning. For Google, AI is hugely important, the backbone of every future aspect of their business.

In 2015, they rolled out RankBrain, a deep learning system that helps generate answers to search queries. RankBrain is a perfect example of deep learning through a neural network, where software and hardware approximate the lattice of neurons that makes up our own brains, carrying out certain tasks far more quickly and accurately than human wetware ever could. It’s called deep learning because information is processed through many successive layers of the network, by ever-faster computers.
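
To make the idea of “layers” concrete, here’s a toy neural network with two layers of weights, trained on the classic XOR problem (which a single layer famously can’t solve). This is purely illustrative: RankBrain’s actual architecture isn’t public, and real deep networks stack many more layers than this.

```python
# Illustrative sketch only -- NOT RankBrain's architecture, which is not public.
# A tiny two-layer neural network learning XOR via backpropagation.
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# XOR: output 1 only when exactly one input is 1.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

# Two layers of weights: input -> hidden -> output.
W1 = rng.normal(size=(2, 8))
W2 = rng.normal(size=(8, 1))

losses = []
lr = 1.0
for _ in range(5000):
    # Forward pass: each layer transforms the output of the one before it.
    h = sigmoid(X @ W1)        # layer 1: learned intermediate features
    out = sigmoid(h @ W2)      # layer 2: prediction from those features
    err = out - y
    losses.append(float((err ** 2).mean()))
    # Backward pass: push the error back through each layer in turn.
    d_out = err * out * (1 - out)
    d_h = (d_out @ W2.T) * h * (1 - h)
    W2 -= lr * h.T @ d_out
    W1 -= lr * X.T @ d_h

print(f"loss: {losses[0]:.3f} -> {losses[-1]:.3f}")
```

Each layer transforms the output of the layer before it; stacking more of those transformations is exactly what puts the “deep” in deep learning.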

If you search for “automobile” in Google Photos, you’ll be shown every picture of an auto you’ve stored, because Google’s AI has analyzed each image on file and recognized the patterns that match your query.

When Google debuted a huge range of new hardware devices in early October, from its Pixel smartphone to its Home household gadget, they were really Trojan horses, as Wired put it, for Google Assistant, their AI helper app. Assistant works invisibly, predicting user needs, acting on requests and giving succinct answers to questions, while tapping into resources like Google’s Knowledge Graph, which warehouses more than 70 billion facts used to enhance search.

Other platforms like Google Translate rely on machine learning to do their jobs, too. And recently, Google announced that three of its experimental AIs, named Alice, Bob and Eve, had learned on their own how to construct encryption methods allowing them to communicate privately with each other, while also learning what data deserved to be kept safe.

“Knowing how to encrypt is seldom enough for security and privacy,” the Google team said. “Interestingly, neural networks can also learn what to encrypt in order to achieve a desired secrecy property, while maximizing utility.”

Facebook recently launched DeepText, able to read and comprehend a thousand posts per second in over 20 languages. Imagine typing in a Messenger text about needing to order pizza, and Facebook conveniently serves up links or prompts about where to order in your area: that’s DeepText at work.

DeepText will also monitor comments on a post you’re interested in to remove the dross while pulling up the ones you’d actually like to read. Eventually, its deep learning capabilities will allow text and visual content to be analyzed together. That’ll be useful on a channel where 400,000 new stories and 125,000 comments on public posts are shared every minute. In fact, Facebook already claims its AI systems report more offensive photos than humans do across its network.

Putting AI in the Everyday

Frankly? It’s getting harder and harder to find a walk of life or commonplace device that isn’t incorporating AI in some way.

  • Smartphones are already replete with AI. An assistant like Siri is only a marquee example; from your navigation apps to your music recommendations, deep learning is at work. As we move forward, microprocessor chips designed specifically for AI will be a key feature of next-gen devices, allowing complex processing tasks to be done within your phone, without having to connect to the Cloud; you’ll be carrying a neural network in your pocket with unimaginable potential.
  • If you’re using email, you’re probably using AI: the spam filters that separate the messages you actually want from those offers to open a massage parlor franchise! or make $100,000 a day in just your spare time! depend on ANI that learns your specific preferences. Which may, for all we know, include the haunting desire to own a massage parlor franchise.
  • Household smart devices like Nest thermostats are examples of how AI learns and predicts based on your regular routines, linked into the Internet of Things (IoT). But more sophisticated executions include Amazon Echo, the spearpoint of a new inrush of AI-enabled appliances (we’ve already mentioned Google’s Home device). Gartner has predicted that the average home may include no fewer than 500 smart devices by the year 2022, all of them dependent on AI.
  • Wearables are an AI avenue, too; already, sales of devices such as the Apple Watch and Fitbit products, among others, are up over 67% versus 2015; it’s a market estimated to be worth over $34 billion by 2020. The fitness tracker you’re wearing may have embedded AI, and communicates with your smartphone, uploading data to Cloud-based platforms that often use AI-driven analytics to make sense of your exercise routines, sleep patterns, diet and more…and share that data with brands who can then reach out with predictive offers based on your behaviors.
  • Apparel is another realm where AI is making inroads; when somebody says your new outfit “looks smart,” there’s a chance you can take it literally. Manufacturers already use AI to optimize their supply chains (that’s critical in a segment where styles and tastes can be tough to predict), but retailers like Macy’s are trying to use AI to help shoppers find their ideal look, while others are launching AI stylists that understand the nuances of fashion and personal style. And the kinds of AI chips that’ll show up in smartphones and appliances may very soon be embedded in our clothing, too.
  • Automobiles are rife with ANI, with computers controlling everything from your engine to your brakes, your entertainment touchscreen and air conditioning. The autonomous car or Tesla’s Autopilot system may get all the press, but the cars and trucks we use for every commute or delivery are already leveraging the technology in hundreds of ways.
  • Done any airline travel lately? Then you may be happy — or maybe not so happy — to learn you were in the hands of AI systems helping to control the plane in flight, suss out the best route around weather, and even decide what gate you should be using at your destination. The system-wide logistics of baggage handling are informed by AI, but now some of the baggage handlers may be running on AI, too. At least you’ll save on tips.
  • Toys and games are deploying AI, bringing even classic brands like Hot Wheels into the 21st century as it uses special tracks, sensor-embedded cars and chipped controllers to make sure cars stick to the track instead of leaving it to attempt an Evel Knievel jump over the family dog. Until Junior switches off the AI driving assist, of course.
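
Of the examples above, the email spam filter is the easiest to sketch. Here’s a toy version of the core idea: count the words in mail you’ve marked spam versus mail you’ve kept, then score new messages with a naive Bayes rule. This is a classic textbook approach, not the implementation any particular email provider actually uses.

```python
# Toy naive Bayes spam filter -- a textbook sketch, not any real provider's code.
import math
from collections import Counter

class NaiveSpamFilter:
    def __init__(self):
        self.spam_words = Counter()
        self.ham_words = Counter()
        self.spam_total = 0
        self.ham_total = 0

    def train(self, message, is_spam):
        # "Learning your preferences" = updating word counts per verdict.
        words = message.lower().split()
        if is_spam:
            self.spam_words.update(words)
            self.spam_total += len(words)
        else:
            self.ham_words.update(words)
            self.ham_total += len(words)

    def spam_score(self, message):
        # Log-odds of spam vs. ham, with add-one smoothing for unseen words.
        vocab = len(set(self.spam_words) | set(self.ham_words))
        score = 0.0
        for w in message.lower().split():
            p_spam = (self.spam_words[w] + 1) / (self.spam_total + vocab)
            p_ham = (self.ham_words[w] + 1) / (self.ham_total + vocab)
            score += math.log(p_spam / p_ham)
        return score  # > 0 means "more spam-like than ham-like"

f = NaiveSpamFilter()
f.train("make money fast in your spare time", True)
f.train("open your massage parlor franchise today", True)
f.train("lunch on friday with the project team", False)
f.train("notes from the project meeting attached", False)
print(f.spam_score("make money in your spare time"))
```

Every time you mark a message as spam or not-spam, the counts shift and the scoring shifts with them: narrow, task-specific learning of exactly the kind ANI describes.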

These are just the tip of the AI iceberg in the consumer products universe. When it comes to B2B, manufacturing, healthcare, finance and other sectors, AI is already at the forefront:

  • Financial firms utilize AI everywhere from the customer service department, via chatbots, to the trading floor: AI-driven traders account for over half of the equity trades in U.S. markets. But AI may scale down to personal finance, as well: a San Francisco startup, Wallet.AI, hopes to collect all the bread crumbs people leave behind as they eat, shop and otherwise spend money, combining that with mass data from other sources to recognize and analyze patterns and help users better manage their finances.
  • Healthcare enterprises invested $400 million in AI in 2015, a figure expected to balloon to $3 billion by 2020. Some of the areas where it’s being applied include robotic surgery, where simple surgeries are already being performed by robots. Devices such as Pillo, touted as the “world’s first artificially-intelligent healthcare companion,” can answer health questions, dispense meds and vitamins, sync with wireless and wearable devices and connect patients with caregivers via mobile alerts. Another firm, Beyond Verbal, uses an AI-powered research platform to analyze vocal biomarkers to detect heart problems, and claims it can identify other illnesses such as ALS and Parkinson’s.
  • The U.S. military has invested heavily in AI through the Defense Advanced Research Projects Agency (DARPA) for years, of course. One recent revelation was how AI permits airborne drones to distinguish when an on-the-ground individual is armed — and the test they shared with the media didn’t even utilize a military unit, but an off-the-shelf retail drone. It might be reassuring to note that the Pentagon doesn’t intend to give “full autonomy” to AI weaponry…but on the other hand, the drone in this test was nicknamed Bender, of Futurama fame, who’s often gone off about his intent to “kill all humans.”
  • Architecture and construction innovators are adopting AI on various fronts; a recent search of Indeed.com using “artificial intelligence architect” as keywords found over 130 job listings. They’re particularly interested in AI integration into what they call additive construction — or what the rest of us think of as large-format 3D printing. By combining AI and machine vision, a printed structure can be better designed, constructed more quickly with fewer defects, fine-tuned during the actual build to suit its context or site, and built using materials more efficiently. Here, as in other fields, there’s serious investigation of AI-guided robots to take over construction tasks.

How will AI evolve marketing?

Deep learning is already making its mark on marketing, and very soon it’ll bootstrap marketing automation to the next level. In the same way AI is reaching into every aspect of technology and life, it’ll compel sea-changes in marketing, too.

The actualization of personalization and 1-to-1 marketing engagement are obvious benefits, and they’re going to be essential in an increasingly mobile, omnichannel digital age. But there’s more to how AI and marketing will work together than that.

In our future A.I.mpact posts, we’ll dig into exactly where and how deep learning will evolve marketing, not just in terms of strategies, tactics and technologies, but in the very makeup of marketing departments and the skills they’ll need to leverage AI and avoid obsolescence.

In the meantime, we’d be happy to hear your thoughts and questions on the impact of AI. Mail us here!

Originally posted on www.marianaiq.com