AI News Roundup — June 2019

by Gabriella Runnels and Macon McLean

Opex Analytics
The Opex Analytics Blog
June 28, 2019


The Opex AI Roundup provides you with our take on the coolest and most interesting Artificial Intelligence (AI) news and developments each month. Stay tuned and feel free to comment with any stories you think we missed!

_________________________________________________________________

This A.I. Is Starting on the Right Foot

Photo by Alex Blăjan on Unsplash

This month, researchers from the University of Michigan and the Shirley Ryan AbilityLab presented an open-source bionic leg that uses sensor data and artificial intelligence to anticipate and respond to users’ movements. This innovation represents a significant advance in the field, as bionic legs are a notorious challenge in prosthetic limb design. Although its underlying technology is open-source, the total price to build one of these legs clocks in at around $28,500 — not exactly the budget of your typical DIY project. But by making this technology open-source, researchers are hoping to tap into the knowledge of the general public to further improve this life-changing technology.

Now That’s One Hot Model

Image by Free-Photos from Pixabay

According to a recent study, building and training certain AI models can produce “more than 626,000 pounds of carbon dioxide equivalent” — roughly the carbon footprint of the average American over seventeen years. In fact, the type of model that powers Siri and other AI voice assistants may be one of the worst culprits. This set of algorithms, part of a field called “Natural Language Processing” (NLP), has made impressive strides in the last few years, but evidently at a cost. Training these complex models on huge datasets requires enormous computing power, and therefore enormous amounts of energy. If the robot uprising isn’t the downfall of humanity, AI might still wipe us out from the greenhouse gases alone.
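
As a back-of-the-envelope check of that comparison (a quick sketch; the per-capita figure below is an assumption roughly consistent with common U.S. estimates, not a number from the study):

```python
# Back-of-the-envelope: how many years of an average American's carbon
# footprint does one large-model training run represent?
training_emissions_lbs = 626_000       # CO2-equivalent, from the study
avg_american_lbs_per_year = 36_800     # assumed per-capita figure (~16.7 metric tons CO2e/year)

years_equivalent = training_emissions_lbs / avg_american_lbs_per_year
print(round(years_equivalent))  # ~17 years, matching the comparison above
```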

Reduce, Reuse… Reinforcement Learning?

Photo by Gary Chan on Unsplash

While the sheer amount of energy consumed by NLP models and other machine learning algorithms is alarming, we can take some comfort in knowing that not all AI is environmentally destructive. The Colorado-based startup AMP Robotics and the Norwegian company TOMRA are both developing cutting-edge AI technologies to improve the efficiency and accuracy of the recycling process. For example, AMP’s robotic technology can differentiate between materials that recycle differently, but that humans tend to lump together, while maintaining a very high processing speed. To teach the system to correctly identify and sort recyclable materials, a convolutional neural network is trained on millions of images. (Let’s hope the carbon emissions from model training don’t offset the recycling benefits!)
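
The convolution operation at the heart of such a network can be sketched in a few lines. This is an illustrative toy, not AMP’s actual system: the kernel here is hand-set to respond to vertical edges, whereas a real CNN learns millions of such kernel weights from labeled images:

```python
import numpy as np

def conv2d(image, kernel):
    # "Valid" convolution (no padding): the core operation a CNN layer applies
    kh, kw = kernel.shape
    h, w = image.shape
    out = np.zeros((h - kh + 1, w - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    return out

# Toy "image": a vertical edge between a dark (0) and a bright (1) region
image = np.array([[0, 0, 1, 1]] * 4, dtype=float)
# Hand-set vertical-edge detector; in a trained CNN these weights are learned
kernel = np.array([[-1, 1]] * 2, dtype=float)

response = conv2d(image, kernel)
# The response peaks exactly where the edge sits, and is zero elsewhere
```

Stacking many learned kernels like this one, plus nonlinearities and pooling, is what lets a CNN distinguish, say, a PET bottle from an HDPE jug at conveyor-belt speed.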

Deepfake Detective

Photo by Kevin Ku on Unsplash

If the idea of deepfakes scares you (and it should), you’ve probably wondered what can be done to protect us against them. Good news — researchers have identified what they call a “soft-biometric signature,” which serves as an ersatz watermark to verify the authenticity of a video. Using generative adversarial networks, data scientists have found ways to detect an individual’s set of unique facial movements that serves as their own personal speech signature. Misinformation and false statements are a major concern, especially for world leaders; our ability to thwart bad actors will have a significant impact on our future, in ways both obvious and insidious.

AI at the Speed of Light

Image by Gerd Altmann from Pixabay

Bill Gates has just invested in a new startup that designs high-performance computer chips that use light instead of electrons to perform computations. Luminous, the company in question, has made waves recently with news of their prototype chip that ditches energy-intensive electrons for comparatively efficient photons. Light waves are bent within the chip via “waveguides,” which move data more quickly than traditional processes.

Given a new Moore’s Law-esque finding that says “the amount of computing power needed to train the largest AI models is doubling every three and a half months,” the development of this new chip is a welcome sign of progress. Speed won’t be sacrificed either; in fact, these new processors will likely be far faster than their electrical predecessors. Even better, with widespread adoption, these chips would significantly reduce AI’s environmental impact.
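
To put that doubling rate in perspective, it compounds to roughly an order of magnitude more compute every year (a quick sanity check, assuming clean exponential growth):

```python
# "Doubling every 3.5 months" compounds to roughly 10.8x more compute per year
doubling_period_months = 3.5
growth_per_year = 2 ** (12 / doubling_period_months)
print(f"{growth_per_year:.1f}x per year")
```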

That’s it for this month! In case you missed it, here’s last month’s roundup with even more cool AI news. Check back in July for more of the most interesting developments in the AI community (from our point of view, of course).

_________________________________________________________________

If you liked this blog post, check out more of our work, follow us on social media (Twitter, LinkedIn, and Facebook), or join us for our free monthly Academy webinars.
