Storyboarding a Metaverse design fiction using AI-generated images

Rob Scott
6 min read · Feb 28, 2023

I often find it really hard to explain what the future holds in terms of Immersive Technology (especially as “Metaverse” has become a buzzword with little real meaning). I don’t quite have the skills to make a prototype, or even a smoke-and-mirrors video.

Last year, I decided to go lo-fi and just try to write a short story that imagined how digital eyewear might converge with other technologies in the not-too-distant future. I enjoyed writing it, and shared the story with a few colleagues. They seemed to like it too, but I didn’t take it any further.

Recently, I’ve been playing with AI image creation and have been musing on how AI models such as Stable Diffusion might help me create storyboards to aid in communicating concepts at work.

And so, here we are! I’ve lifted the original story and tried to generate some images that communicate the essence of what the prose lays out. I hope they help bring the potential future to life.
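For anyone curious about the mechanics, the workflow is simple: take a line or two of the prose, turn it into a prompt, and ask the model for a frame. Below is a minimal sketch of how you might do this yourself, assuming the Hugging Face diffusers library and the publicly available runwayml/stable-diffusion-v1-5 checkpoint; the prompts shown are illustrative, not the exact ones I used.

```python
# A minimal storyboard-generation sketch using Stable Diffusion via diffusers.
# Assumes a CUDA-capable GPU; swap "cuda" for "cpu" (and drop float16) if needed.
from diffusers import StableDiffusionPipeline
import torch

pipe = StableDiffusionPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5",
    torch_dtype=torch.float16,
)
pipe = pipe.to("cuda")

# One prompt per storyboard frame, each lifted from a moment in the story.
prompts = [
    "A woman in sleek smart glasses brushing her teeth, an AR dashboard "
    "reflected in the bathroom mirror, soft morning light, cinematic",
    "A driverless taxi pulling out of a side street on a bright city morning, "
    "seen from the pavement, cinematic",
]

for i, prompt in enumerate(prompts):
    image = pipe(prompt, num_inference_steps=30, guidance_scale=7.5).images[0]
    image.save(f"storyboard_frame_{i:02d}.png")
```

In practice most of the effort goes into iterating on the prompts until each frame captures the feel of the scene, which is part of why the imagery below is a little inconsistent.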

NB: A little while ago I read an excellent design fiction by Melanie Subin that was featured in the FTI Tech Trends Newsletter. My story owes a lot to hers.

Prepping for a workshop in 2045

Elvi is a 32-year-old professional, working at an environmental NGO.

After waking up in her hotel room, Elvi puts on her SmartGlasses and walks to the bathroom to get ready. Before brushing her teeth, she transfers some saliva to an OctaFlow diagnostic device. She looks back to the mirror and sees her AR dashboard, with her day laid out before her. The 10:00am conference workshop has been pushed back half an hour. Immediately her Assistant pops up a question:

“Do you want to delay your AutoCab booking by 30 minutes?”

Elvi nods, and the notification marks itself resolved and vanishes into her mirror. As she brushes her teeth, headlines from her trusted services scroll by, each dismissed by a flick of her wrist (as detected by her SmartWatch). She notices a BBC report of an earthquake in Indonesia, and jumps into a Holocapture of the relief effort projected through the mirror. It’s heartbreaking to watch.

Composing herself, she moves on to check current crypto exchange rates and glances at the weather back home.

As she finishes brushing her teeth, she looks across at her OctaFlow and sees that the Rhinovirus she had been carrying the previous week has completely left her system. Detecting her gaze, her Assistant again pops up:

“Report OctaFlow results to trusted parties?”

Again, Elvi nods. This time, the notification surfaces a response:

“3 GovTokens received for sharing your data with NHS Infection Monitoring.
11 LibraCoins received for sharing your data with MediResearchCorp Genetic Susceptibility Project”

Elvi scrunches her fist to close the view down and finishes getting ready.

With an extra half hour before she needs to leave, Elvi spends some time prepping for her workshop. Standing in the hotel room’s remote workspace, she activates her Immersive Environment. From her calendar, she jumps into the shared activity layer and reviews the exercises for the conference session she is facilitating.

There are three main activities laid out across the virtual whiteboard. Elvi reviews her private facilitation notes attached to the introduction to make sure everything is up to date. As she reviews the rest of the session, her Assistant pops up to let her know that her co-facilitator is also entering the activity layer:

“Frances Chan has entered the layer. Do you want to collaborate?”

Elvi nods eagerly. Almost immediately, Frances confirms the merge and her professional avatar appears at Elvi’s side.

“Morning!” Elvi chirps. “Using the extra time to do a last minute check?”

“Of course!” replies Frances. “Hey did you see the news about Indonesia? It gave me an idea…”

The second exercise is a shared spatial walkthrough of areas of environmental concern in Southeast Asia. Frances brings in a map marker for the BBC News article and it automatically drops over the earthquake epicentre.

She then asks her Assistant to overlay any geo-relevant news stories of environmental or socio-economic interest from the last six months. The result gives a more human story, showing how the earthquake further compounds existing challenges faced by local people.

After a few more tweaks, Elvi and Frances close the collaboration down and disconnect. Feeling energised, Elvi prepares to leave for the conference centre. Her cab should be 10 minutes away. She asks her Assistant to check, and sure enough it’s right on time. The Assistant asks:

“Do you want to visually track your AutoCab?”

Elvi nods, and a small cab marker appears in her peripheral vision, showing which direction it’s coming from and how far away it is. She leaves her hotel room and heads down to the lobby to catch the cab.

As she waits outside, she checks her wallet. She decides to reallocate her GovTokens for the next 2 months to help aid efforts in Indonesia; after that they’ll switch back to the community youth DAO in her home town. They’re closed for the summer anyway, and from inspecting their outgoings she can see the next few months are well catered for.

Elvi hears an audio ping off to her right to let her know the AutoCab is approaching. She looks around just as it pulls out from a side street. As she gets in, the cab marker vanishes and her dashboard pops up anchored to the wallspace inside the car, with the journey details and ETA included. Once her seatbelt is fastened, the car pulls away.

Again, her Assistant pops up:

“Do you want to share your real-time journey with your colleagues?”

Elvi nods, and immediately gets a response:

“Your real-time journey is now being shared with Frances Chan and Tiago Alvarez”

A few moments later, she gets a follow-up:

“Tiago is now sharing his journey with you.”

As she looks at her dashboard, it updates to show that Tiago is just entering a local Grindmaster a mile or so from the conference venue. Elvi decides to call and ask him to grab her a coffee too.

“Hey Tiago! Can you grab me an Americano please? I didn’t have time to pick one up at the hotel and to be honest, it isn’t the best…”

“Sure thing Elvi, but only if you pick me up on the way?!”

“Ha — no worries!” she says, and closes the call. She then asks her Assistant to reroute the AutoCab to pick up Tiago on the way. The journey details on her dashboard update to reflect the detour, and she sees she’ll be with Tiago in less than 10 minutes. Elvi closes her eyes to relax a little ahead of the workshop.

Her Assistant calmly pops up with one final, audible question:

“Elvi, would you like to listen to your pre-workshop playlist?”

“Yes,” she replies, and music pours out of the car’s speakers.

Thanks to Emily Heath for her thoughts on this post, especially on the mildly inconsistent imagery.

Thanks also to my colleagues Emma Young and Spencer Marsden for their thoughts on the original story.

