The future of AI: The human component Part 2

Joanna Strom · Published in The Startup · Jan 3, 2019 · 6 min read

The Inside/Outside World

Augmented Reality is reaching almost every industry. Its growth potential, marketing innovation, and financial upside are reshaping how we incorporate this technology at home, on the go, and in our work environments.

Amazon Lab126 is working on a domestic robot called Vesta, overseen by Greg Zehr, who runs the lab's R&D. This project is separate from the robots used in Amazon warehouses, and the consumer robot market it targets is expected to be worth $15 billion by 2023.

Lens, Google's visual search tool, works with your phone's camera to identify objects, text, and landmarks in whatever it sees.

The AR video game market is expected to reach $11.6 billion by 2025.

  • Lumus is the maker of AR visors using Reflective Waveguide Technology.
  • Epson has designed the Moverio AR glasses, which offer a 55-degree viewing angle using micro-projection technology. Moverio is leading the way toward a fully immersive AR experience.
  • You can even order prescription lenses through Rochester Optical, which has partnered with Epson.

RoboFly is a laser-powered flying robot with gossamer wings. Created by the Autonomous Insect Robotics Lab at the University of Washington, this miniature 190 mg insect drone uses the piezoelectric effect to flap its wings, and it is not the only one of its kind.

Harvard's Wyss Institute has been researching RoboBees, autonomous aerial-aquatic microbots. In addition to flying, they can dive into water, swim, and propel themselves back out of it.

The Micro Air Vehicle Laboratory (MAVLab) at Delft University of Technology in the Netherlands, in collaboration with Wageningen University & Research, is testing a flying robot, the DelFly Nimble, which hovers like a hummingbird. It flaps its wings 17 times per second and has a top speed of 25 km/h.

Wageningen's harvesting robot, SWEEPER, can pick a ripe fruit in 24 seconds with a success rate of 62%. This horticulture robot could be the answer to harvesting crops in the future.

MEMS (microelectromechanical systems) are tiny devices that combine mechanical elements with an integrated circuit on a silicon chip; many rely on the piezoelectric effect, which converts mechanical energy into electrical energy and back. To keep it fun for a second, think of how a gyroscope spins.
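
As a concrete (and entirely illustrative) example of how such a sensor gets used, here is a minimal Python sketch of reading a common hobbyist MEMS part, the MPU-6050 accelerometer/gyroscope, over I2C on a Raspberry Pi. The register addresses and scale factors follow that part's datasheet; nothing here comes from the projects mentioned above.

```python
# Minimal sketch: reading a MEMS accelerometer/gyro (MPU-6050) over I2C.
# Assumes a Raspberry Pi with the smbus package and the sensor at its
# default address 0x68; registers follow the MPU-6050 datasheet.
import smbus

MPU_ADDR = 0x68        # default I2C address of the MPU-6050
PWR_MGMT_1 = 0x6B      # power management register
ACCEL_XOUT_H = 0x3B    # first accelerometer data register
GYRO_XOUT_H = 0x43     # first gyroscope data register

bus = smbus.SMBus(1)
bus.write_byte_data(MPU_ADDR, PWR_MGMT_1, 0)  # wake the sensor from sleep

def read_word(reg):
    """Read a signed 16-bit value from two consecutive registers."""
    high = bus.read_byte_data(MPU_ADDR, reg)
    low = bus.read_byte_data(MPU_ADDR, reg + 1)
    value = (high << 8) | low
    return value - 65536 if value >= 0x8000 else value

# Convert raw counts using the default full-scale ranges (+/-2 g, +/-250 dps).
accel_x_g = read_word(ACCEL_XOUT_H) / 16384.0
gyro_x_dps = read_word(GYRO_XOUT_H) / 131.0
print(f"accel X: {accel_x_g:.2f} g, gyro X: {gyro_x_dps:.2f} deg/s")
```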

In 2001, Brett Warneke, Ph.D., published his findings on autonomous, bi-directionally communicating mote sensors in an IEEE Technical Digest. Dr. Warneke and Professor Kristofer Pister (Electrical Engineering and Computer Sciences, University of California, Berkeley) have been involved with the Smart Dust project. Its unique potential is that the motes can sense and measure many things: light, acceleration, position, stress, pressure, humidity, sound, and vibration. The most covert use would be surveillance, but the sensors can also detect biochemical warfare agents, radiation, and more. At some point in the future, that may come in handy.

The military has already been testing algorithms to assist Electronic Warfare Officers (Air Force aerial navigators). Artificial intelligence and machine learning prototypes are being inserted into electronic warfare systems to collect data for the Army's Tactical Electronic Warfare System.

MEDICAL Breakthroughs

Nanotechnology is behind some of the most fascinating technological breakthroughs in medicine.

Smart Dust MEMS sensors and an optical laser diode are small enough to be inserted through a syringe. Nanorobots (think of Magneto in X-Men) have been used to target cancer cells and then administer drugs in the bloodstream, while "Neural Dust" is implanted in the brain, piggybacking optically on neural tissue to help researchers learn more about how the brain works.

Johnson & Johnson gives allergy sufferers up-to-date pollen counts through skills built for voice assistants such as Google Assistant and Amazon Alexa. The company has also used this marketing opportunity to promote its allergy medicine Zyrtec, along with the AllergyCast app and its forecast tool.
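
To give a sense of how such a voice-assistant skill is wired up, here is a hypothetical sketch using the Alexa Skills Kit SDK for Python. The PollenCountIntent name and the get_pollen_count() helper are illustrative stand-ins, not pieces of the actual Zyrtec skill.

```python
# Hypothetical sketch of an Alexa skill handler that reports a pollen count.
# Uses the real ask-sdk-core package; the intent name and data source are made up.
from ask_sdk_core.skill_builder import SkillBuilder
from ask_sdk_core.dispatch_components import AbstractRequestHandler
from ask_sdk_core.utils import is_intent_name


def get_pollen_count(zip_code: str) -> int:
    """Stand-in for a real pollen-data API call."""
    return 42  # placeholder value


class PollenCountIntentHandler(AbstractRequestHandler):
    def can_handle(self, handler_input):
        # Fire only when the user asks for the pollen count.
        return is_intent_name("PollenCountIntent")(handler_input)

    def handle(self, handler_input):
        count = get_pollen_count("10001")
        speech = f"Today's pollen count in your area is {count}."
        return handler_input.response_builder.speak(speech).response


sb = SkillBuilder()
sb.add_request_handler(PollenCountIntentHandler())
lambda_handler = sb.lambda_handler()  # entry point for an AWS Lambda deployment
```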

Nanorobots may one day help fight cancer. They can deliver medicine, but their truly unique value lies in detecting a wide range of diseases and then treating them.

Mini-brains (organoids that can even develop retinal cells) lack an oxygen supply of their own, but the trick to using them is embedding the baby brain balls in a nutrient gel so that blood vessels can form. This "vascularization" is touted as a breakthrough in bioengineering.

The University of Toronto has taken skin grafting to a new level with a handheld 3D skin printer that deposits a patient's own stem cells directly onto a wound. This is a great breakthrough for burn victims.

Electronic skin, or e-skin, has been around since 2005. It can measure temperature, touch sensitivity, and pressure while displaying the wearer's heartbeat (an EKG, or electrocardiogram). Professor Takao Someya has been combining it with VR technology.

AI and skin cancer screening: the Doctor Hazel project uses image recognition to distinguish between benign lesions and skin cancer. Peter Ma and Mike Borozdin, both Intel employees, created the prototype using a high-powered endoscope camera (bought on Amazon), an Intel Xeon processor, and a Movidius Neural Compute Stick.
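
Doctor Hazel's actual model isn't public, but the general recipe, fine-tuning a pretrained convolutional network to sort lesion photos into benign versus suspicious, looks roughly like the following PyTorch sketch. The dataset folder and the two class labels are placeholders, not the project's real data or weights.

```python
# Rough sketch of the kind of image classifier behind a skin-lesion screening
# tool: a pretrained ResNet fine-tuned for two classes (benign vs. suspicious).
import torch
import torch.nn as nn
from torchvision import datasets, models, transforms

transform = transforms.Compose([
    transforms.Resize((224, 224)),          # ResNet expects 224x224 inputs
    transforms.ToTensor(),
    transforms.Normalize([0.485, 0.456, 0.406], [0.229, 0.224, 0.225]),
])

# Expects folders like lesions/train/benign and lesions/train/suspicious.
train_data = datasets.ImageFolder("lesions/train", transform=transform)
loader = torch.utils.data.DataLoader(train_data, batch_size=32, shuffle=True)

model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
model.fc = nn.Linear(model.fc.in_features, 2)   # replace the head: 2 classes

criterion = nn.CrossEntropyLoss()
optimizer = torch.optim.Adam(model.fc.parameters(), lr=1e-3)

model.train()
for images, labels in loader:               # one pass is enough for a sketch
    optimizer.zero_grad()
    loss = criterion(model(images), labels)
    loss.backward()
    optimizer.step()
```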

Amazon's Rekognition is a technology primarily for facial identification, and it has been pitched to law enforcement agencies as a way to find criminals. (The residual deep learning neural network, or ResNet, that underpins much of modern visual recognition began as a Microsoft Research project.) However, Amazon is in hot water with the ACLU, which claims the company has become a surveillance business by pairing the technology with law enforcement body cameras that could be used to spy on immigrants, protesters, and others.
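
Developers reach Rekognition through the AWS SDK. As a hedged illustration of the face-matching workflow described above, here is a minimal boto3 call that compares a face in one photo against another; the image file names are placeholders and valid AWS credentials are assumed.

```python
# Minimal illustration of Amazon Rekognition's face comparison via boto3.
# The image files are placeholders; AWS credentials must be configured locally.
import boto3

client = boto3.client("rekognition", region_name="us-east-1")

with open("reference_face.jpg", "rb") as source, open("crowd_photo.jpg", "rb") as target:
    response = client.compare_faces(
        SourceImage={"Bytes": source.read()},
        TargetImage={"Bytes": target.read()},
        SimilarityThreshold=90,   # only return matches above 90% similarity
    )

for match in response["FaceMatches"]:
    print(f"Match with {match['Similarity']:.1f}% similarity")
```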

SFootBD applies a similar deep learning approach to a different biometric: instead of your face, it identifies you by a behavioral signal, your exact gait, the way you walk.

Again, I think this borders on infringing our right to privacy.

There is a more important issue, though: we are addicted to technology. As for the human condition, can we train AI to know good from evil? According to a recent Quartz article by Ambarish Mitra, "We can train AI to identify good and evil, and then use it to teach us morality," morality is about how the human condition responds to whatever dilemma we are subjected to.

Aeon's article "Robot cognition requires machines that both think and feel"* describes how we need emotion to judge the personal significance or value of what we perceive.

*Luiz Pessoa, the article's author, is the director of the Maryland Neuroimaging Center, principal investigator at the Laboratory of Cognition and Emotion, and professor of psychology at the University of Maryland.

SENSORY/IMMERSIVE Experiences with Augmented Reality

Next we have "smart walls." Carnegie Mellon University and Disney Research have together created Wall++, a touchpad-style wall that combines capacitive sensing and electromagnetic (EM) sensing.

Google's Tacotron 2 is a new text-to-speech system that produces a remarkably humanized voice.

Sensiks, an Amsterdam-based startup, has built a sensory reality pod, a telephone-booth-like environment that pairs with a virtual reality app called "Tree" to immerse you in the Amazon rainforest.

SOLVING Mental Health issues such as drug addiction, ADHD, depression, psychosis, and autism, and reducing reliance on pain medication, are just some of the uses for AR/VR headsets. Dr. Alan K. Louie, MD, professor, associate chair, and director of education at Stanford University, believes this technology will become affordable and will offer patients experiencing emotional distress a way to calm themselves.

We have ventured into areas you might not imagine. For those who do not know what a hydrogel is: it is a material that expands in water. A student named Fanfan Fu from Southeastern University in China and colleagues have developed a "butterfly" robot that flaps and changes color. The hydrogel, marked with nanocrystals, allows researchers to measure how heart cells respond to a drug that lowers the heart rate.

There is also a "self-healing" elastomer (a type of polymer) laced with liquid metal that works as a sealant, rerouting connections around damage in robots. Carnegie Mellon University's Soft Materials Laboratory has developed droplets made from a gallium alloy (think of the Terminator movies).

NeuralEye aims to lead the way in facial recognition.

Stay tuned for Part 3!
