These Human-Like ‘Synths’ Will Make You Do a Double Take

Dr. Suzanne Gildert, Founder and CEO of Vancouver-based Sanctuary, wants her robots, known as synths, to be as lifelike as possible. We talked to her about building cognitive machines and why she created a synth that looks just like her.

By S.C. Stuart

There are now many types of robots and just as many theories on how to build them, but Dr. Suzanne Gildert, Founder and CEO of Vancouver-based Sanctuary, wants her robots to be as lifelike as possible.

“I don’t think you can have intelligence without some kind of body, even if it is an abstract sense of a body,” she said in a recent interview. “All the concepts we have in our head come from the type of data we ingest, which means it is a function of our sensory perception and therefore our body.”

As such, Dr. Gildert created what she calls “synths,” which not only look (a lot) like us but also learn by emulating us. In her lab, Dr. Gildert slips into an exoskeleton and stands next to one of the in-development alpha3 units, teaching it to move through teleoperation. By being embodied, Sanctuary AI’s synths “understand” their role within an environment, just like we do, through experiencing, recording, and replaying until they “get it” via reinforcement learning.
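
That record-and-replay loop can be pictured with a minimal sketch; the names and structure below are illustrative assumptions, not Sanctuary AI’s actual code. A recorder captures timestamped joint angles while the operator demonstrates, then plays them back on the synth with the original timing:

```python
import time

class TrajectoryRecorder:
    """Capture timestamped joint-angle frames during a teleoperated
    demonstration, then replay them with the original timing.
    Illustrative sketch only; not Sanctuary AI's actual code."""

    def __init__(self):
        self.frames = []  # list of (elapsed_seconds, joint_angles)
        self.t0 = time.monotonic()

    def record(self, joint_angles):
        # Store a snapshot of the operator's pose (e.g., 38 joint angles).
        self.frames.append((time.monotonic() - self.t0, list(joint_angles)))

    def replay(self, send_command):
        # `send_command` is whatever callable drives the synth's joints.
        start = time.monotonic()
        for elapsed, angles in self.frames:
            # Sleep until this frame's original timestamp, then command it.
            wait = elapsed - (time.monotonic() - start)
            if wait > 0:
                time.sleep(wait)
            send_command(angles)
```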

We spoke to Dr. Gildert about building cognitive machines and why she created one that looks just like her. Here are edited excerpts from our conversation.

PCMag: At Sanctuary AI, you’re not building astromech droids or countertop smart speakers. You’re going for the whole ‘they look like us’ non-biological sentient machines. Tell us why.
Dr. Suzanne Gildert: That’s right. We believe that, in order to understand the human mind, we require a human form factor, because our mind uses data coming in through our senses to craft our subjective experience of the world. Therefore, if the type of data going into an AI mind doesn’t match the type of data going into our own brain, that AI will never be human-like, even in principle. So to ensure that the data is human-like, the body must be human-like.

That makes sense. As an aside, you’ve called your humanoids synths. Is this in reference to Humans, the TV series?
No. Merely a happy coincidence. To be honest, we came up with the term independently, but I am aware that it was used in the show.

It’s a cooler word than robot, especially as it intimates a humanoid concept.
Right. We wanted a word that wasn’t robot, because that’s too broad, or android, which has become synonymous with the operating system.

You’re teaching the synths to move through emulation. I’ve done this when I teleoperated a Sarcos Guardian GT tank-like robot, and it felt amazing. Can you explain why you deploy this form of training?
In reinforcement learning and other machine learning paradigms, it’s hard to learn to do things from scratch. We could let the synth move randomly and then let it learn from trial and error: this is known as “pure RL.” However, that wouldn’t be good, because the synth would quickly damage itself and the environment. So you need a “good starting point” for motion paths. Teleoperation, with an operator in an exo-suit demonstrating movement, provides that to the synth.
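
As a toy illustration of that “good starting point,” here is a minimal behavior-cloning sketch, a deliberate simplification rather than Sanctuary’s actual pipeline: it fits a linear policy to teleoperated state/action pairs, giving a later RL stage something safer than random motion to refine. The demonstration data here is hypothetical.

```python
import numpy as np

def behavior_cloning(states, actions, lr=0.01, epochs=500):
    """Fit a linear policy (action = W @ state) to teleoperated
    demonstrations by gradient descent on squared error."""
    n = len(states)
    W = np.zeros((actions.shape[1], states.shape[1]))
    for _ in range(epochs):
        pred = states @ W.T                    # predicted actions
        W -= lr * (pred - actions).T @ states / n
    return W

# Hypothetical demonstration data: 200 snapshots of a 38-DoF synth.
rng = np.random.default_rng(0)
demo_states = rng.normal(size=(200, 38))   # joint angles read from the exo-suit
demo_actions = rng.normal(size=(200, 38))  # the operator's commanded targets
policy = behavior_cloning(demo_states, demo_actions)

# A "pure RL" agent would start from random motion; an RL stage that
# starts from `policy` instead begins with plausible, safer motion paths.
```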

Dr. Gildert with the alpha2 ‘Nadine’ synth (Image: Daniel Marquardt and Sanctuary AI 2019)

Understood. Do you get extended proprioception after a ‘synth fluid motion’ training session while wearing the exoskeleton? I felt like my arms could reach 50 feet after my experience at Sarcos.
Yes, I do get a weird feeling—and the occasional strange dream—that I am the synth after being in an immersive suit for a while.

Let’s look under the hood for a moment. Is the underlying platform ROS?
We do use ROS, the Robot Operating System, for some of the message passing, but it is only one part of the system. The rest is built in-house, and I can’t share the specifics of that.
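
For readers unfamiliar with ROS, its message passing typically looks like the minimal rospy publisher below. The node name, topic, rate, and message layout are hypothetical stand-ins, since Sanctuary hasn’t shared specifics:

```python
import rospy
from std_msgs.msg import Float64MultiArray

def main():
    # Hypothetical node/topic names; Sanctuary's actual setup is not public.
    rospy.init_node('synth_joint_commander')
    pub = rospy.Publisher('/synth/joint_targets', Float64MultiArray, queue_size=10)
    rate = rospy.Rate(50)  # publish joint targets at 50 Hz
    while not rospy.is_shutdown():
        msg = Float64MultiArray()
        msg.data = [0.0] * 38  # one target per degree of freedom
        pub.publish(msg)
        rate.sleep()

if __name__ == '__main__':
    try:
        main()
    except rospy.ROSInterruptException:
        pass
```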

Fair enough. How many degrees of freedom (DoF) do your synths have and how tall are they?
The current model we are working on, the alpha3 system, is 5 feet 7 inches tall and has 38 DoF: six per arm, six per hand, three in the torso, two in the neck, and nine in the face.
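
Those per-joint counts do sum to the stated total: (2 × 6) arm joints + (2 × 6) hand joints + 3 + 2 + 9 = 38 DoF.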

What’s the synth skin made of, and do you print them in the lab?
The synths’ skin is made of silicone, but silicone isn’t ready to be 3D printed, yet! But we do 3D print the chassis in its entirety in-house. We use carbon fiber printing to make parts that are strong enough to withstand the forces encountered in a human skeletal system.

Is it true we can now incorporate bio (or bio-identical) material into 3D printers to create a synthesis of us and them?
Technically, yes, but it’s still early days, and it’s only done within the medical research community right now. Sadly, it’s not practical, and not used in robots as yet, because it’s hard to keep biological tissue alive. But this is definitely something I’m interested in for the future. I think we can learn a lot from biology, which has already solved a lot of the power-density and self-healing problems. In the future, I think we’ll combine the best parts of biology and the best parts of mechatronics/electronics.

Sanctuary’s alpha2 ‘Nadine’ synth (Image: Daniel Marquardt and Sanctuary AI 2019)

On that note, do you see us merging in the future?
At some point, yes. I can already see robots becoming more human-like. Not just in the way they think, but in the actual physical construction of them too. For example, more soft or pliable/compliant materials are being used, including biocompatible polymers. So I think that, pretty soon, we’ll be able to put robotic parts in humans, and biological parts in robots, and, in the future, it won’t be “us and them”; there will be a spectrum of everything in-between.

Do you have any robots in the field right now?
Right now, we’re “pre-commercial” and developing functionality under contract with corporate partners, so, no, there are no commercial deployments at this time. Having said that, we did do an initial study with Nadine (alpha2) at Science World British Columbia, a museum here in Vancouver, just to see how people would react to an AI system “learning” in front of them.

Dr. Gildert and Nadine play Connect Four (Image: Daniel Marquardt and Sanctuary AI 2019)

What tasks did you set out for that alpha2 test?
We had the synth behind a table and showed the museum’s visitors how it learns by playing games like Connect Four. We found that people take just 15 seconds to make up their minds about the synth and whether they’d view it as a friend/helper or otherwise.

In your own quest to leave a digital legacy that’s more personal than most, is it true you’re creating a synth of yourself?
Yes, I am building “SuzanneSynth.” She’s still pretty primitive, and I don’t claim she’s anything like me in personality yet, but she does look like me. This is very much an exploratory side project for me, encompassing both art and science, in a way.

Dr. Gildert and her synth (Image: Daniel Marquardt and Sanctuary AI 2019)

Tell us what you’re hoping to find, or achieve, with SuzanneSynth.
Well, I’ve always been passionate about something called “extreme lifelogging” and, with this project, I’m exploring whether you can implant memories of real human experiences into the “mind model” of a robot, and then, over time, have that system believe it actually experienced those events.

Isn’t that the central plot device for Blade Runner?
Yes, in a way. I guess I’m like the woman who makes the memories/backstories for the replicants, which they then perceive as their own.

So as certain memories fade, as happens in us bio-beings, you’ll be able to let SuzanneSynth fill you in on the adventures you’ve had?
That’s one way of looking at it, yes.

I’m sure you’re already on Hollywood’s radar then. On that note, what’s your business model? I can’t just buy an alpha2 online, right?
Ah, no, you can’t. We offer our corporate partners a Labor as a Service (LaaS) business model after the completion of a stage-gated research engagement, where we focus on user experience and functionality testing.

Can you mention any of your current corporate partners?
Sadly, no. All of our engagements are currently confidential. But I can say we are pursuing additional corporate engagements in numerous market verticals.

For more, Dr. Gildert is speaking at the AGI conference in Shenzhen on Aug. 6.

Originally published at https://www.pcmag.com on July 23, 2019.
