Human-Robot Interaction in the Age of A.I.

Alisha Kelkar
UNC Blue Sky Innovations
Nov 15, 2023
Evolution of Robotics

The way robots and computers work with people has changed a lot over the years. Traditionally, robots were associated only with factories, but they have slowly become more commonplace. Whether it is a Roomba cleaning your house or PARO, the interactive therapeutic robot, most of us have either heard of or directly interacted with robots like these.

However, as robotic technologies advance and move beyond factories, it is important to consider how to design robots to seamlessly fit into the real world and help us in our everyday lives. While rigid machines that were designed for specific tasks might have worked in rudimentary factory settings, they lack the flexibility needed for diverse, everyday interactions.

Artificial intelligence is one tool that can be leveraged to make robots more user-friendly. In fact, here at Blue Sky Innovations, we have our very own robotic arm, ARIA. ARIA is an xArm5 that uses artificial intelligence, specifically reinforcement learning, to detect, locate, and move objects without human input. We are using AI to make ARIA more autonomous while recognizing that, at the end of the day, it is humans who will interact with the robotic arm.
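To give a feel for the reinforcement learning mentioned above: an agent learns from trial-and-error reward rather than explicit programming. Below is a minimal tabular Q-learning sketch on a toy one-dimensional "reach the object" task. Everything here (the grid, rewards, and hyperparameters) is invented for illustration; ARIA's actual training setup is far more involved and is not described by this sketch.

```python
import random

random.seed(0)

N = 5                      # positions 0..4; the target object sits at position 4
ACTIONS = (-1, +1)         # move the gripper left or right by one cell
ALPHA, GAMMA, EPS = 0.5, 0.9, 0.3
Q = {(s, a): 0.0 for s in range(N) for a in ACTIONS}

for _ in range(2000):
    s = random.randrange(N - 1)        # exploring starts: begin at a random cell
    for _ in range(20):
        # epsilon-greedy action selection
        if random.random() < EPS:
            a = random.choice(ACTIONS)
        else:
            a = max(ACTIONS, key=lambda b: Q[(s, b)])
        s2 = min(max(s + a, 0), N - 1)
        r = 1.0 if s2 == N - 1 else 0.0
        # standard Q-learning update
        Q[(s, a)] += ALPHA * (r + GAMMA * max(Q[(s2, b)] for b in ACTIONS) - Q[(s, a)])
        s = s2
        if s == N - 1:                 # reached the target: end the episode
            break

# After training, the greedy policy should always move toward the target.
policy = [max(ACTIONS, key=lambda b: Q[(s, b)]) for s in range(N - 1)]
print(policy)  # expect [1, 1, 1, 1]
```

The point of the sketch is only the learning loop itself: no one ever tells the agent "move right"; the behavior emerges from the reward signal, which is the same idea, at a much larger scale, behind training an arm to reach for objects.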

Google DeepMind has also been doing groundbreaking research in this field. They have been developing a robotic arm (RT-2) that uses computer vision and language models to guide its actions.

In a recent demonstration, they showed how their robotic arm can interpret natural language and use it to decide which action to take. On a table in front of the arm, they placed a plastic lion, whale, and dinosaur. Next, they instructed the arm to “pick up the extinct animal.” After processing for a few seconds, the arm extended, lowered its claw, grabbed the plastic dinosaur, and placed it in a bin. Just like that, it interpreted natural language and made a decision!
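The reasoning chain the demo illustrates, from instruction to attribute to object, can be sketched as a toy Python function. The object list and attribute table below are invented for illustration; RT-2 itself uses a large vision-language model, not a hand-coded lookup like this.

```python
# Toy grounding of a language instruction to an object on the table.
# The attribute table is invented for this sketch; a real system infers
# attributes like "extinct" from its language model, not from a lookup.
OBJECT_ATTRIBUTES = {
    "plastic lion": {"animal", "lion", "extant"},
    "plastic whale": {"animal", "whale", "extant"},
    "plastic dinosaur": {"animal", "dinosaur", "extinct"},
}

def pick_object(instruction: str) -> str:
    """Pick the object whose attributes best overlap the instruction's words."""
    words = set(instruction.lower().split())
    return max(OBJECT_ATTRIBUTES, key=lambda obj: len(OBJECT_ATTRIBUTES[obj] & words))

print(pick_object("pick up the extinct animal"))  # -> plastic dinosaur
```

Note that all three toys share the attribute "animal"; what disambiguates the dinosaur is the extra match on "extinct", which is exactly the logical leap the demonstration showed.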

The fact that this robotic arm could make the logical leap from “extinct animal” to “plastic dinosaur” shows how artificial intelligence is going to change the way we interact with robots. Language models can now give robots the ability to reason about a natural-language instruction and decide what action to take. We are in a new age of technology, and it is fascinating to see how computer vision, natural language processing, and robotics can all come together to create smarter robots.

I am looking forward to seeing how advances in robots' understanding of linguistic nuance develop across different countries and languages. It will be fascinating to watch human-computer interaction evolve alongside vernacular and linguistic technologies. For example, I would be interested to see how these language models handle code-switching, and whether they can be trained to recognize words across multiple languages.

There are many potential use cases for these language-model-driven robotic arms across multicultural contexts, and I am excited to watch them unfold.
