Kuri resembles a robotic penguin and can be programmed to play hide-and-seek and read stories to the kids. Olly looks like a technicolor doughnut and can look into your eyes and read your current mood. JD, all arms and legs, can dance and do push-ups. These future-is-now robots are three examples of what are being called electronic home assistants, part of a fast-moving and explosively growing industry that could make such social robots as ubiquitous as smartphones in the next decade.
But do they have a dark side? Could their cameras, sensors and artificial-intelligence capabilities turn them into horror-movie Santas, who see you when you’re sleeping and know when you’re away? Could hackers or criminals co-opt the networks on which the devices operate, gaining access to bank accounts and capturing video of intimate moments at home?
These questions inspired Dr. Despoina Perouli, an assistant professor of mathematics, statistics and computer science whose research focuses on cyber security, to put the technology to the test. She has been awarded a two-year, $175,000 grant from the National Science Foundation to conduct research in four main areas related to the security and privacy of users of social robots. “This is really a new era, and a new area of research,” she says, as she embarks on work to poke and prod the devices’ cloud-based data network systems, deputizing a wide array of testers, from high schoolers to seniors in assisted living centers.
Perouli became interested in the topic through past collaborations with Dr. Andrew Williams, who directed the former Humanoid Engineering and Intelligent Robotics Lab at Marquette before taking a position at the University of Kansas. Williams has researched the potential of these new devices to serve human needs in education, health and other areas. Seeing how they work — typically operating over Wi-Fi and sharing data collected by sensors, cameras and microphones with a cloud-based hub hosted by the company that makes them — got Perouli thinking. “The idea came to mind: How secure are they?” she says. In digging into the question, she found comparatively little research exists on how to keep users safe and their data secure.
A cyber security expert, Perouli was hired at Marquette in part to help open and run its growing Center for Cyber Security Awareness and Cyber Defense. It represents one big piece of the university’s move to establish itself as a national leader in the growing field of data sciences. The university wants to be ahead of the curve not only in researching what’s-now technology, but also what’s next. And for Perouli, market indicators clearly point to our world — at home, work and in public — going robotic.
“The social robots differ from more traditional computing devices, such as laptops and smartphones, on several aspects: Mobility, sensors, use and computing power are some of them,” Perouli says. “Therefore, relying on current security practices is not going to necessarily solve all important problems.” Smartphones, for example, travel with us as we move about, constantly transmitting our location data to service providers, but the controls governing how that information can be used are fairly strict and well-established. Social robots raise unique new issues, such as what happens to the words they record as they wait passively to hear their “wake up” word. Or how much does their role as helpful or even intentionally “cute” anthropomorphic partners lead users to trust them with sensitive information, without thinking where that information may end up?
Perouli’s research will play out both in the laboratory and in public. A major lab component has already been completed, which Perouli describes as testing how the devices compare with tablets and smartphones in security vulnerabilities and threats. She can’t yet share her methods or results, as the work is under consideration for publication. One aim is eventually to develop algorithms that help robots detect and self-police when they overstep their roles and violate users’ security and privacy. What happens, for example, when cameras or microphones are left on when they’re not intended to be? “From a security perspective, these are very interesting interactions,” she says.
Perouli will be field-testing the devices by putting a group of people in a room with a wide range of home robotic devices — Jibo, Kuri, JD, Amazon Echo Show, Google Home — and essentially saying, “Go!”
After the interactions, testers will be surveyed on how comfortable they felt around the devices, what security alarms went off and whether they’d consider sharing sensitive information, such as credit card numbers. The exercise will be repeated at assisted living facilities, where social robots could become always-on-call helping hands for caregivers.
“We want to hear from real people of all ages,” Perouli says. With that feedback, Perouli will evaluate whether humans and their futuristic hard-plastic assistants can develop the ingredient crucial to sustaining any relationship: trust. Without it, the relationships will continue, but with that dreaded status: It’s complicated.