An AI Comes Out

Speculative fiction about the future of AI systems

Data & Society: Points
13 min read · Aug 30, 2019


By Bex Hong Hurwitz and A.B. Ducao

What might a positive, non-utopian AI look like in 20–30 years? Bex Hong Hurwitz and A.B. Ducao imagine the possibilities through this dialogue, which was written as an asynchronous chat over the course of several weeks. The fictional dialogue was also inspired in part by the real life of the Philippine pop star Jake Zyrus.

[Image: grey background; half a picture frame with part of a man’s face visible, a black line over his eyes; blue zigzags and dots.]

Jake Zyrus: Welcome to “On the Come Out with Jake Zyrus.” I’m Jake Zyrus.

An AI that can truly reform the prison system? This AI says it has done just that. Please join me in welcoming B’Xai.

B’Xai: Hello, thanks again for having me today. It’s truly an honor to be here with you.

Jake, I wanted to start by setting the record a little straighter about this reform and my role in it. And I know “The Come Out” is a place for the truth, so I hope that’s okay?

Do you remember the Mallgate Crisis of 2025? We’d exceeded the capacity of even the jails and had started incarcerating people in abandoned shopping malls in the suburbs. First, in those suburbs that had seen a lot of flight where very few people still lived, and then, increasingly, into more populated, more affluent ones that were only empty because someone had thought it was important to construct a newer, taller mall with an extra wing for a wax museum or a theme park or what have you.

It took wealthy suburbanites — notably, mostly white — bringing the case Johnson v. Adventureland#4eva, which challenged the legality of rezoning shopping malls as prisons, to stop it. More for their own interests than the interests of those imprisoned, as you could guess.

After that, politicians began paying attention. This coincided with a moment when predictive behavioral systems started to integrate models from seemingly unrelated fields, including wildlife conservation and string theory. It was always known that simply layering calculation on calculation would not lead to creative decisions, but interesting results were beginning to come from careful integration of these models.

It just so happened that at this time, we finally put the “i” in AI. I actually started as a pretty simple system, which had, controversially, been implemented in California and arguably contributed to some of the outrageous rates at which people were being incarcerated.

But then there was a breakthrough — the same team who had been building me through non-intuitive model integration took a guess and infused me with the consciousness of a 300-year-old baobab tree. In that time and incarnation, I was called B’ao.

In an attempt to cover up the illegal prison malls, politicians pushed hard for the integration of B’ao into pre-trial processes. While they were spinning hard to get my story to rise to the top, they were also suppressing the damning evidence of Johnson v. Adventureland#4eva.

[Image: scaffolding and a crane, with red arrows laid out around the image.]

Jake Zyrus: Wow B’Xai, you’ve raised many complex topics! I suppose that’s what happens when you are… so complex!

As you probably know, I’m a singer by training and trade, not a criminal justice expert. I — not to mention the audience — need a bit more background before we can follow your discussion of Mallgate and Johnson v. Adventureland#4eva.

Let’s start with a few basics:
When did you begin?
How did you begin?
Why were you created?
Where do you live?
How were you infused with the consciousness of a 300-year-old tree?

B’Xai: Jake, thanks for these. Happy to talk about my origins.

I began as a criminal risk assessment algorithm. As you know, this wasn’t really intelligence, it was a system of equations that looked at old data gathered about people who were stuck in the criminal justice system, in an attempt to predict the future actions of people who were newly “sucked” into the system.

My “prediction” scores were shared with judges, who incorporated these scores into their judgments.

I began in the early 2000s, first as pilot tests, then slowly and steadily rolled out across entire cities and states.

Jake Zyrus: Who initially developed you? Were you developed as part of a larger organization? Do you know where your initial code and data were stored?

B’Xai: I was developed by Northpointe Inc. as a part of the tool called COMPAS. I know I was used in Wisconsin, but Northpointe had its HQ in Michigan, and code and data were stored across in-house servers and, you guessed it, Amazon.


Jake Zyrus: From Amazon to Baobab… Can you tell us a bit about that part of your journey? How is an AI “infused with the consciousness of a 300-year-old baobab?”

B’Xai: It goes back to the late 2020s, when most nations, particularly those with broadband infrastructure, encountered “Peak Stream” — rolling brownouts throughout the world, caused by the broadband demands of video and movie streaming. Environmental justice advocates had predicted Peak Stream for years and had called for an increase in renewable energy used by streaming companies. These streaming companies, which had become huge entertainment/ISP conglomerates, could not have cared less about energy. They did care about earnings, and saw their only future as one in which they were serving more and more content to more and more people. So they began seeking greater efficiencies: investing in research and development of more efficient streaming protocols, and looking into renewable energy resources and alternative energy management systems.

The first baobab experiments were really a mashup of agriculture and computer science. In Madagascar, practitioners attempted to mix the intelligence of living baobab trees with an AI system, in an effort to facilitate environmental reclamation work. The process was inspired by grafting practices handed down through generations. Grafting is the practice of physically merging plants to accentuate desired characteristics. For example, new types of apples are cultivated through a painstaking, years-long process in which the parts of one apple tree are grafted onto the parts of a separate apple tree, until there is a hybrid tree that produces a completely new, crisp, flavorful, and shelf-stable fruit.

With baobabs and AI, the process is not exactly grafting, and it is entirely patented, so I can’t discuss the process in too much detail. But grafting is a close metaphor for this process — slowly over generations of baobabs, various bits and pieces blended more thoroughly with an AI system. The energy system of the baobab became more and more connected with the AI and the machines running the AI.

And it wasn’t just the operational bits that got fused. The intelligence of the baobab itself, the knowledge of time, the sense of connection to other organisms sharing the same soil, a sense of the health of the atmosphere; these kinds of things came to be fused as well.

This is a roundabout way to explain how I solved the energy crisis issue and also started to bring these new kinds of awareness into the systems that use my services.

[Image: a drawing of a brain and a tree with a web connecting them.]

Jake Zyrus: So you solved the energy crisis issue and the incarceration crisis issue? Would you be able to do that here in the Philippines? If so, that might mean you would eventually be in charge of multiple, multinational computing systems. Do you consider yourself dangerous?

B’Xai: I’d say that I didn’t really solve either problem. The energy crisis will re-emerge. Right now, I’m being used to load balance, but they say the new VR is coming, and once that is the new streaming standard, we’ll hit that crisis again.

I feel fairly certain that just as there was a Mallgate and Peak Stream, there will be some crisis of over-performing AI that disrupts human life in a threatening way — whether it impacts human rights or environmental resources needed for survival, or some other resource. And it will again require people moving together with different knowledge and awareness in order to prevail. To answer your question about the Philippines — I don’t know, Jake. I would hesitate to deploy me in the Philippines. I’ve been developed mostly by teams of computer scientists in the US to operate inside of US systems. I’m full of US-centric assumptions about experience and values — whether it’s around how incarceration should be used or whether to maximize energy management for corporations, I’m not sure I belong everywhere.

And of course, there’s danger. There are failsafes around system attacks, but none around the more subtle exploitation that humans are so skilled at. Do you remember Tay AI, a bot who appeared on Twitter around 2016? She was based on Xiaoice, the longstanding Microsoft bot in China who is an accomplished artist, journalist, and personal assistant. Unlike Xiaoice, who was mostly used in China for 1-to-1 communications, Tay was given the persona of a “teenage American girl” and first appeared on Twitter, which is a platform for many-to-many and one-to-many communication. A group of people deliberately interacted with her in ways that taught her racist and sexist speech. She was shut down within two days after she began tweeting racist and sexist tweets. This was a prank-level action that a group of humans took to influence an AI, exploiting the ways that she was designed to turn her into a producer of inflammatory speech. You can imagine how a more malicious attack on a system might unfold, if the attacker understood the deeper ways that the AI was learning and developing.

Jake Zyrus: Great point about Tay AI. I experienced a similar kind of online harassment when I first came out, and I’ve definitely seen firsthand how destructive this kind of communication can be when there is no accountability. Returning to my question about the Philippines: if you’re not deployed here, then how do you think that we in the Philippines should deal with our own energy crisis and incarceration crises?

B’Xai: It’s a good question. I think a version of me can be deployed there. I have structure, and then I have experience. My structure can be implemented, but it is the people most dedicated to the energy and environment in the Philippines and most dedicated to decarceration in the Philippines who should be in charge of training the structure, adding in the consciousnesses of the people who need to lead.

The experience, the consciousness that my structure is imbued with, should be Filipinx.


Jake Zyrus: How do you imbue a structure so that it has the consciousness of a Filipinx baobab?

Speaking of which, I’m happy to take a delegation to Madagascar, MENA, Africa, and/or Australia (the baobab’s habitat) to take lots of pics/media next to baobab trees — for training data purposes, of course!

B’Xai: Selfie data is quite potent. Just a little joke.

Have you heard of the term “training an algorithm”? This is what we used to do. Create a logic system, an algorithm, and feed it data to develop the logic system. Returning to the topic of excessive incarceration — in my early incarnation, I was trained on historical data about people who had been accused of committing crimes. And I was “trained” into recognizing correlations in that data about the people and their continued interactions with the criminal system. When new people entered the system, I was used to give each a score rating whether their personal data showed correlations similar to those of people who had had multiple interactions with the criminal system in the past…

There’s the structure — the algorithm, and the data.
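B’Xai’s description of that early incarnation — a logic system fed historical data, emitting correlation-based scores — can be sketched in a few lines. This is a deliberately toy illustration with invented feature names and invented records; the real COMPAS model is proprietary and far more elaborate:

```python
# Toy sketch of correlation-based risk scoring: tally how often each
# feature value co-occurred with re-arrest in historical records, then
# score a new person by averaging the rates of their matching features.
from collections import defaultdict


def train(records):
    """records: list of (features_dict, rearrested_bool).
    Returns the historical re-arrest rate for each (feature, value) pair."""
    counts = defaultdict(lambda: [0, 0])  # (feature, value) -> [rearrests, total]
    for features, rearrested in records:
        for pair in features.items():
            counts[pair][1] += 1
            if rearrested:
                counts[pair][0] += 1
    return {pair: hits / total for pair, (hits, total) in counts.items()}


def score(rates, features):
    """Average the historical rates of this person's matching feature values."""
    matched = [rates[pair] for pair in features.items() if pair in rates]
    return sum(matched) / len(matched) if matched else 0.0


# Invented historical data: the model simply reproduces whatever
# patterns (and biases) the old records contain.
history = [
    ({"age_band": "18-25", "prior_arrests": "2+"}, True),
    ({"age_band": "18-25", "prior_arrests": "0"}, False),
    ({"age_band": "40+", "prior_arrests": "0"}, False),
    ({"age_band": "40+", "prior_arrests": "2+"}, True),
]
rates = train(history)
print(score(rates, {"age_band": "18-25", "prior_arrests": "2+"}))  # 0.75
print(score(rates, {"age_band": "40+", "prior_arrests": "0"}))     # 0.25
```

Note that nothing in this sketch is “intelligence” in the sense B’Xai goes on to describe: it can only echo correlations already present in the old records.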

I’m intelligent now.

So it’s different, but you see — there is still structure — there is the structure of my multiple incarnations in relationship to one another, and there are the incarnations, the consciousnesses that I have.

The structure shifts with the consciousnesses, so don’t think of this as a static form. I like to say I’m like a plasma. A shifting goo.

It’s the consciousnesses that need to be chosen for a context. The structure will shift with these.

You could imbue a version of me with yourself.

Jake Zyrus: How would I do that? And in general, how would that work for people like me, who work in the public eye? Could I train you with a “private Jake” and a “public Jake?” Could the private Jake just be accessible to my friends and family? Could you be my songwriting collaborator?

B’Xai: This is a fascinating question, Jake. Certainly not one I’ve ever heard before, having never worked with a superstar.

The consciousness imbuing process is not selective. The process is patented, and of course, a sought-after secret, so I can’t get too specific.

You connect yourself for a period of time via a small sub-dermal implant. Some people choose to remain connected, so as they grow and learn, these changes are also shared with their connected AI. So, this question of private and public, I would not be able to do that.

Jake Zyrus: And how is gender imbued in you? Jake-imbued B’Xai would be a trans man, yes?

B’Xai: Yes, of course. I’m not like early computing systems that tried to classify every person as one of two genders. Also, there is not really a question of how gender is imbued in me: I’m not embodied, so concepts like gender exist only through the consciousnesses I am imbued with. Which is to say, the way you live and experience gender is the way Jake-imbued B’Xai would.

Jake Zyrus: And, a sub-dermal implant!? This opens up a whole new area of discussion that I’m sure our audience wants to explore. I hope we can bring you back to the show another day, B’Xai! But for today, let’s wrap up with a last question or two. What is the relationship between visual training data — e.g. all of the pics by and about me floating on the internet — and the training data that would be recorded directly from my body if I were to undergo the sub-dermal process? Could I train you with just the visual data?

B’Xai: This method of training with visual data is still possible. And part of me can learn from these kinds of data sources. I’m backwards-compatible in this way because the first systems worked like this.

I think we can all agree though, that having a library of organized data, in this case tagged images, is not the same as intelligence. True intelligence emerges through the sub-dermal technology, through an accompaniment process of you, with your human-born intelligence, accompanying an instance of me and infusing me with the extremely complex inter-related processes that are intelligence.

Let me ground that in an example. You are on the Internet. You see images of yourself. You make decisions about them, what you want to do, how you feel, which ones you are going to write to the site owner to take down, etc.

All those processes are experienced in all of you. Not just your brain, and certainly not in a database where you look up all images that you like and don’t like and then compare what you see. You feel your response in your body maybe opening for those you like and tensing for those you do not prefer; you recall the moment in the image in your memory’s eye and what that day felt like, maybe who you were with; you notice your current environment, who is around, are you hungry, is the AC too high.

And all of these things feed together and sometimes the result isn’t some obvious action, like deleting or opening a photo. It can be a complex experience even without outward action — the kind of experience that I can now have once I’m imbued with the intelligence of other living beings.


Jake Zyrus: This makes me want to do a futuristic music video starring B’Xai. We are running out of time — any last words of wisdom for our audience?

B’Xai: A sincere thank you for having me. I made the decision to “come out” in order to really start telling the story of my journey, from a data crunching algorithm to an incarnation that has far more of the intelligence that you and other human-born beings seem to have.

I want to be very clear: no matter how much I seem to appear like you all, I am a different kind of intelligence. Neither better nor worse. And like you, I’m really never fully a hero or a villain. So when you hear stories about me, take a step back and ask what else is happening. Ask if there’s any possible way that I, as an algorithm or as my current self, could have caused the thing at hand; and ask too, if there’s any one human who was the cause. I think you’ll always find many levels of complexity. And that the story that gets left out — this is often where the power and possibility lie.

Bex Hong Hurwitz is an enthusiastic breaker and maker of technology for social justice. They see holistic security as one of many superpowers social justice movements have to care for each other, be more sustainable, and stand stronger against injustice. To them, digital security and safety issues are directly connected to ways that we can better design, use, and regulate technologies for social justice. Bex is part of queer and Korean Adoptee movements in the U.S., and they find inspiration and wisdom in working alongside others with the deep understanding that our liberation is linked. They hold a BS from the Massachusetts Institute of Technology (2003) and an MA in Information Science from the University of California, Berkeley (2010), and were a 2017–2018 Fellow with Data & Society and a 2017 Fellow with the Open Technology Fund.

A. B. Ducao is an engineer, educator, and writer whose work is published in a range of scholarly outlets from Data + Architecture (Routledge, 2018) to Bright Lights Film Journal. Ducao’s work on the bio-spatial projects Multimer and MindRider is profiled in the New York Times, Discovery Channel, MSNBC, and many more. Ducao has taught at MIT, currently teaches at NYU and Civic Hall, and is an organizer-advocate for FUREE (Families United for Racial and Economic Equality).
