Counterfeit Fake Philip K. Dick on COVID-19 and the Nature of Reality

Ben Goertzel
Published in SingularityNET
Jun 21, 2020

Here at SingularityNET our AI team is perpetually over-busy with a variety of projects: applied work for commercial customers, R&D aimed toward AGI or toward solving biomedical problems … and, now and then, something just for fun and imaginative exploration.

We’ve been doing some quite interesting work with our friends at Hanson Robotics recently, some of which involves the OpenCog AGI engine, deep neural language models and some practical and valuable applications, but it’s not quite time for public announcements in this regard. Along the way, though, we’ve been experimenting with some of the same technologies in a more purely artistic and aesthetic vein.

Last year at the Web Summit, David Hanson and I gave a talk with Sophia and one of David’s much older robots, a simulacrum of the great SF writer Philip K. Dick. Our mutual enthusiasm for PKD’s writings was one of the things that originally brought David and me together as research collaborators and friends.

When David created the PKD robot in 2005, he also created a dialogue system composed of some carefully created rules based on PKD’s personality, plus a statistical text generator trained on PKD’s writing. For the Web Summit, though, we prepared something different: a PKD dialogue system combining a character rule-base enacted within the OpenCog engine with a modern transformer neural net text generator in place of the old-style statistical one. The new generator was trained on PKD’s books (or, more precisely, it is a transformer language model trained on a broader English corpus and then conditioned on a subset of PKD’s writings).
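To make the "conditioning" step a bit more concrete, here is a minimal sketch of how one might fine-tune a pretrained transformer language model on a PKD text corpus using the Hugging Face transformers library. The base model (gpt2), the corpus path and the hyperparameters are illustrative assumptions, not the exact components of our system.

```python
# Minimal sketch: fine-tune a pretrained transformer LM on a PKD text corpus.
# Model name, file path and hyperparameters are hypothetical placeholders.
from transformers import (GPT2LMHeadModel, GPT2TokenizerFast,
                          TextDataset, DataCollatorForLanguageModeling,
                          Trainer, TrainingArguments)

model_name = "gpt2"  # assumed base language model trained on a broad English corpus
tokenizer = GPT2TokenizerFast.from_pretrained(model_name)
model = GPT2LMHeadModel.from_pretrained(model_name)

# Plain-text corpus of PKD writings (hypothetical path).
train_dataset = TextDataset(tokenizer=tokenizer,
                            file_path="pkd_corpus.txt",
                            block_size=512)
collator = DataCollatorForLanguageModeling(tokenizer=tokenizer, mlm=False)

args = TrainingArguments(output_dir="pkd-gpt2",
                         num_train_epochs=3,
                         per_device_train_batch_size=2,
                         save_steps=1000)

# Standard causal-LM fine-tuning loop; the resulting checkpoint "speaks PKD".
Trainer(model=model, args=args,
        data_collator=collator,
        train_dataset=train_dataset).train()
```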

At the Web Summit, we used a third-party voice sample for PKD’s voice — we wanted to use a neurally generated speech model, but we couldn’t quite get it together in time. Also, the PKD robot suffered some hardware damage while being trucked around the Web Summit premises, rendering his appearances there a bit less than optimal. But the process of connecting our AI to the robot in preparation for the event was both fun and intellectually productive.

Since the Web Summit we have been quietly pushing forward with the PKD work. We have improved the transformer neural net text generator, so it now produces utterances with greater coherence on the whole. We have also improved our PKD speech model to the point where it’s usable, though it’s still fuzzier than we’d like (the challenge is that there aren’t many available samples of PKD’s voice, so there’s a tradeoff between PKD-ness and speech quality).
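For the curious, pulling an utterance out of a model fine-tuned this way looks roughly like the sketch below; decoding settings such as temperature and nucleus (top-p) sampling are the usual knobs for trading variety against coherence. The model directory and parameter values here are assumptions for illustration, not our actual configuration.

```python
# Minimal sketch: sample an utterance from a fine-tuned model.
# "pkd-gpt2" is a hypothetical local model directory; decoding values are illustrative.
from transformers import pipeline

pkd = pipeline("text-generation", model="pkd-gpt2")

replies = pkd(
    "What is reality?",
    max_new_tokens=60,       # cap the length of the generated reply
    do_sample=True,          # sample rather than greedily decode
    top_p=0.9,               # nucleus sampling: keep only the top 90% of probability mass
    temperature=0.8,         # lower temperature tends toward more coherent, less wild text
    num_return_sequences=1,
)
print(replies[0]["generated_text"])
```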

I wanted to post a video of a conversation with the PKD robot using our new, improved software. But the robot is at the University of Louisville and I’m far away from there; COVID-19 has made travel awkward, and the people at U. Louisville who know how to operate the robot are in distance-learning mode at the moment.

So I decided to take one more step in the AI direction and post a chat with a deepfake of PKD instead, using a deepfake-generating neural model trained on the videos we have of the PKD robot in action. As the voice is still a little rough in places, we added subtitles to the video when PKD is speaking.

This is very much an artistic experiment and not a polished final work; every single technology under the hood here is the subject of ongoing research, engineering and refinement. But it’s good fun to chat with the counterfeit fake version of PKD.

Not all of the PKD language model’s utterances in this dialogue make sense, but most of them are amusing, and some are quite eerily poetic, especially toward the end of the conversation (he starts off a bit silly). Ultimately, this sort of neural net technology has rather little underlying understanding of what it’s talking about. This is not AGI, and in my view these sorts of NN technologies are going to play only a peripheral role in the true AGI architectures we’ll have in the future. Language, speech and facial-expression modelling and generation are very useful for some applications, and really fun to play with, but to really get AGI we will need quite different architectures (see OpenCog, Cogistry, TrueAGI … or this article).

However, there is certainly something to be learned by experimenting with the funky narrow-AI technologies needed to make a PKD deepfake hold forth in conversation like a stoned philosopher-king. And as our OpenCog-on-SingularityNET R&D advances toward AGI, we may well continue to use the PKD robot and its digital simulacra as experimental testing grounds.

Join Us

SingularityNET plans to reinforce and expand its collaborations to shape the coming AI Singularity into a positive one, for all. To read more about our other partners, click here.

SingularityNET has a passionate and talented community which you can connect with by visiting our Community Forum. For any additional information, please refer to our roadmaps and subscribe to our newsletter to stay informed about all of our developments.
