More than a decade ago, game developer Bethesda attempted a radical experiment with its epic role-playing adventure Oblivion — and it went horribly, beautifully wrong. To make players feel like they were part of a living, breathing world, the designers created an artificial intelligence system named Radiant, which gave computer-controlled characters (nonplayer characters, or NPCs) a range of needs, ambitions, and personality quirks and allowed them to act on those elements dynamically. Players were meant to be delighted to encounter characters filled with life and emotion, characters who seemed to really react to the things they did.

Instead, the A.I. characters went rogue. To meet their needs, they started slaying merchants, shopkeepers, and each other. Even worse, characters addicted to an in-game narcotic named skooma would do anything to get a fix. By the time a human player showed up at a tavern or a meeting place, everyone was already dead — slain by artificially intelligent drug addicts who had figured out that killing was a more effective way of meeting their needs than buying stuff.


Bethesda scaled back Radiant, effectively lobotomizing its drug-crazed A.I. agents. Yet despite the glitch, the aim of the system was admirable. In the 40 years I’ve been playing video games, it has always bothered me that games don’t really get to know me, don’t listen to me, and don’t remember me. In these respects, video games are like those charming, boorish guys you meet in bars at tech industry events: They tell entertaining stories and impressively drop names, but your role is just to nod appreciatively even though they keep calling you Brian.

Even in an epic interactive narrative game like Mass Effect or The Witcher 3, the player rarely contributes to the story. There’ll be a cinematic cutscene that moves the plot along, then you kill some identikit bad guys or monsters, and then there’s another cutscene. If you’re lucky, the game has a dialogue tree system that lets you interact with characters you’ve become attached to via a series of multiple-choice statements, but all you’re doing is selecting among a narrow range of predetermined routes. You have no real narrative agency, and that can be deeply frustrating.

“When I played Mass Effect, I never forgave Ashley Williams for killing Wrex, my favorite character,” says Michael Cook, an A.I. researcher at Queen Mary University of London who developed a program called Angelina that designs its own games. In a tense sequence in the sci-fi role-playing game, Williams, a human soldier, blasts the potentially traitorous alien warrior, Urdnot Wrex, to protect lead character Commander Shepard. Unless the player has accrued the correct skills through the course of the game, this assassination is unpreventable — it’s a baked-in part of the narrative. “When the time came to choose someone to send on a dangerous mission at the end of the game, I sent her to her death out of spite.”

A more satisfying outcome would have been for the game to acknowledge Cook’s fury and build his quest for revenge into the narrative. That way, Ashley’s death would have provided the sense of bloody catharsis we all love so much.

Increasing numbers of game developers are now experimenting with artificial intelligence and procedural content generation, which gives a program the components and rules for building its own world or narrative. Some are exploring machine learning, neural networks, and natural language processing to create more responsive narrative and conversational experiences, while others are applying the same techniques to reshape gameplay itself.

The fantasy action adventures Middle-earth: Shadow of Mordor and Middle-earth: Shadow of War feature an A.I.-led Nemesis System that allows enemy orcs to remember the fights you’ve had with them, so that if you meet again, they’ll bring up your past encounters. This has proven hugely popular with players, who often forge epic rivalries with particular characters, tracking their nemeses across vast tracts of Middle-earth. A lot of this happens in the player’s head, but it works because it makes the motivations and the plot feel more personal.
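
Monolith has never detailed how the Nemesis System works under the hood, but the core trick, an enemy that keeps a record of your shared history and surfaces it in dialogue, fits in a few lines of code. Here is a minimal sketch; every name and detail is invented for illustration, not Monolith’s implementation:

```python
from dataclasses import dataclass, field

@dataclass
class Orc:
    """A hypothetical enemy that remembers its fights with the player."""
    name: str
    encounters: list = field(default_factory=list)  # history of past fights

    def record(self, outcome: str, location: str) -> None:
        self.encounters.append({"outcome": outcome, "location": location})

    def greeting(self) -> str:
        if not self.encounters:
            return f"{self.name}: Who are you, stranger?"
        last = self.encounters[-1]
        if last["outcome"] == "orc_fled":
            return f"{self.name}: I ran from you at {last['location']}. Never again!"
        return f"{self.name}: Back for another thrashing, like at {last['location']}?"

grukk = Orc("Grukk the Loud")
grukk.record(outcome="orc_fled", location="the Udun crossing")
print(grukk.greeting())  # the orc taunts you with your shared history
```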

In the indie hit Hello Neighbor, you try to break into your neighbor’s house through the basement, but every time you play, the A.I. resident learns about your approach — the entry points you use, the traps you set — and begins to anticipate your peculiar tactics. It’s like a Home Alone simulator in which you’re the Wet Bandits and the computer is Kevin.
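
The game’s developers haven’t said how the neighbor’s learning actually works, but one plausible minimal mechanism is simple frequency counting: the A.I. guards whichever route you have favored so far. A sketch under that assumption, not the game’s real code:

```python
from collections import Counter
import random

class Neighbor:
    """Toy model of an opponent that adapts to the player's habits."""
    def __init__(self, entry_points):
        self.entry_points = entry_points
        self.observed = Counter()  # how often each entry point was used

    def observe_break_in(self, entry_point: str) -> None:
        self.observed[entry_point] += 1

    def choose_patrol(self) -> str:
        # Patrol randomly until a habit forms, then guard the favorite route.
        if not self.observed:
            return random.choice(self.entry_points)
        return self.observed.most_common(1)[0][0]

neighbor = Neighbor(["basement window", "front door", "garage"])
for _ in range(3):
    neighbor.observe_break_in("basement window")
print(neighbor.choose_patrol())  # "basement window": your favorite trick is now guarded
```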

Meanwhile, RimWorld is a sci-fi colony simulator ostensibly about building a civilization on a deserted planet. As you play, an A.I. storyteller watches what you do, then serves you story quests and challenges based on your strategies and the lives of your inhabitants. You are actively participating in the creation of a folklore, a mythology of your own. In-game characters even create sculptures inspired by your battles and triumphs — very much appeasing the egotistical tendencies of people who believe they could run a space colony.
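
Ludeon’s actual storytellers are far more elaborate, but the loop they embody, watching the colony’s state and choosing the next dramatic beat accordingly, can be caricatured in a few lines. The event table and weighting rules here are invented:

```python
import random

def next_event(colony: dict) -> str:
    """Pick the next story beat, weighted by the colony's current state."""
    events = {
        "pirate_raid":   colony["wealth"] / 1000,              # rich colonies attract raiders
        "trade_caravan": 1.0,
        "crop_blight":   2.0 if colony["food"] > 50 else 0.1,  # don't kick them while they're down
        "quiet_day":     3.0 if colony["mood"] < 40 else 0.5,  # let a sad colony recover
    }
    names, weights = zip(*events.items())
    return random.choices(names, weights=weights, k=1)[0]

colony = {"wealth": 12000, "food": 80, "mood": 35}
print(next_event(colony))  # probably a raid, possibly a merciful quiet day
```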

London-based developer Bossa Studios has a small team working on advanced A.I. tools that add emergent and reactive characters and situations to games. One tool, called Reasonable, creates intelligent NPCs — or A.I. agents — that can be programmed with aims, ambitions, and proclivities and then set loose in a game world. “Reasonable is designed for us to throw away any idea of creating an overarching story for our content and really hand the reins over to a procedural approach,” says Alex Whittaker, who leads Bossa’s A.I. team. “A big part of stories is misunderstanding, deception, and ignorance, so our starting point is having our planning agents working with their own model of the world, which they have to build and then reason within. And what becomes really cool in terms of gameplay is what happens if I, the player, can in some way deceive an agent — if I can tell it something that isn’t true and then see it behave on that basis?” Whittaker imagines a future Bossa game set in an open world filled with these agents, all going about their lives — where the story emerges from the player’s interactions with them.
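
Bossa hasn’t revealed how Reasonable represents its agents’ beliefs, but Whittaker’s central idea, an agent that plans over its own possibly false model of the world rather than over the world itself, is easy to illustrate. In this invented sketch, a player’s lie changes the agent’s behavior:

```python
class Agent:
    """Toy planning agent that acts on its beliefs, not on the world itself."""
    def __init__(self, name: str):
        self.name = name
        self.beliefs = {}  # the agent's private model of the world

    def perceive(self, world: dict, fact: str) -> None:
        self.beliefs[fact] = world[fact]  # direct observation matches reality

    def hear(self, fact: str, value) -> None:
        self.beliefs[fact] = value  # testimony is trusted, whether or not it's true

    def act(self) -> str:
        if self.beliefs.get("guard_at_gate", True):
            return f"{self.name} waits: the gate is guarded."
        return f"{self.name} sneaks out through the gate."

world = {"guard_at_gate": True}
agent = Agent("Mira")
agent.perceive(world, "guard_at_gate")
print(agent.act())                  # waits, because her belief is accurate

agent.hear("guard_at_gate", False)  # the player lies to her
print(agent.act())                  # sneaks out, straight into the guard
```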

Think about a version of Grand Theft Auto where every civilian you pass has needs and ambitions and might be up for robbing a convenience store with you. In this way, stories emerge from your own interactions. The player becomes the author. As Whittaker acknowledges, the outcome could be chaotic and nonsensical, but perhaps that’s something we have to accept. “You need to think, okay, the player might be able to break the narrative sandbox; you could get into these Asimov-style I, Robot quandaries where the characters are just paralyzed with indecision. And that’s part of the joy of it.”

Another route to personalized adventures could be A.I.-powered game design engines. Matthew Guzdial built one while studying for his PhD in creative artificial intelligence and machine learning at Georgia Tech, and he is experimenting with ways to partially automate the level design process for game makers. “We’ll get experiences where the computer is acting as the dungeon master or as an interface for interacting with the world, inventing things for you as you’re moving through this space,” Guzdial says. “I expect to see games trying to give each player a space of possible experiences that maybe fit some understanding from other media or other genres. So, it would be like, ‘I want to give this player an experience like a slasher movie,’ ‘I want to give this player an experience as if they were playing a ’90s sitcom,’ instead of having to say, ‘This is the one correct experience to have.’”
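
Guzdial’s research systems are considerably richer than this, but the idea of steering generation toward a target experience can be shown in miniature. Here a hypothetical pacing profile (the numbers are invented) shapes the beats of a generated level:

```python
import random

# Hypothetical pacing profiles: the chance of a tense beat at each step.
PROFILES = {
    "slasher_movie":   [0.1, 0.1, 0.2, 0.3, 0.8, 0.2, 0.9],  # long calm, sharp spikes
    "nineties_sitcom": [0.3, 0.4, 0.3, 0.4, 0.3, 0.4, 0.2],  # gentle, even rhythm
}

def generate_beats(profile_name: str, seed: int = 7) -> list:
    """Turn a pacing profile into a sequence of level 'beats'."""
    rng = random.Random(seed)
    return ["ambush" if rng.random() < t else "explore"
            for t in PROFILES[profile_name]]

print(generate_beats("slasher_movie"))
```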


For most people, though, that’s just too much of a leap from the narrative experiences we have come to expect from games. The mainstream future of interactive stories is likely to be a clever combination of A.I. and authored content: We still get a gripping handwritten plot, but we also get characters and story elements that seem to react to us as individuals. Spirit AI, based in London and New York, is developing a tool called Character Engine that will allow developers to create NPCs capable of interesting reactive dialogue with players.

“Character Engine is designed to let authors create more dynamic conversations than we’re used to from games,” says Aaron Reed, the company’s A.I. specialist. “Rather than existing in a fixed dialogue tree, dialogue is authored in pieces, with more contextual smarts about where they can be used and how they can be modified for a given situation, which lets a conversation feel more improvisational and responsive. Players can go off-script to ask follow-up questions or change the subject, and authors can allow this kind of flexibility while also determining how much the NPC should pull the conversation back to particular goals.”
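
Spirit AI hasn’t published Character Engine’s internals, but the contrast Reed describes, condition-tagged pieces of dialogue rather than a fixed tree, might look something like this (every tag and line is invented):

```python
# Each dialogue piece carries a condition on the conversation state,
# instead of occupying a fixed slot in a branching tree.
PIECES = [
    {"line": "You again! Still chasing that amulet?",
     "when": lambda s: s["met_before"] and s["topic"] == "amulet"},
    {"line": "I don't discuss the amulet with strangers.",
     "when": lambda s: not s["met_before"] and s["topic"] == "amulet"},
    {"line": "Anyway, about that job I mentioned...",  # pulls talk back to the NPC's goal
     "when": lambda s: s["off_topic_turns"] >= 2},
    {"line": "Hm. Go on.",
     "when": lambda s: True},  # fallback keeps the conversation moving
]

def respond(state: dict) -> str:
    """Pick the first piece whose condition fits the current conversation."""
    for piece in PIECES:
        if piece["when"](state):
            return piece["line"]

state = {"met_before": True, "topic": "amulet", "off_topic_turns": 0}
print(respond(state))  # "You again! Still chasing that amulet?"
```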

Imagine you’re playing a game powered by Character Engine: You might chat with a character, and their responses will depend on what else is going on in the game and how you’ve treated them in the past. “We give characters a preference about which social practices they would rather perform,” says Emily Short, a product manager at Spirit AI who, five years ago, co-wrote a revolutionary A.I. character and story engine named Versu. “An extrovert might like to talk about themselves; an introvert might not. These weightings express their personality but can also shift based on the relationship you have with the character — so a character might get to know you and act as more of an extrovert around you, even if their base personality hasn’t shifted. Two players might reach the same point in a game and have friendly relationships with the same NPC — and that NPC treats one of them with joking distrust but opens up to the other.”
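
Short’s description suggests a simple underlying picture: a base personality plus a per-player relationship modifier, combined whenever the character decides what to do next. A toy version, with invented numbers and practice names:

```python
import random

def choose_practice(personality: dict, rapport: float, seed: int = 3) -> str:
    """Weight social practices by base personality, nudged by the relationship."""
    rng = random.Random(seed)
    weights = {
        # rapport shifts the odds; the base personality itself never changes
        "talk_about_self": personality["extroversion"] + 0.5 * rapport,
        "ask_about_you":   1.0 + 0.3 * rapport,
        "joking_distrust": max(0.1, 1.0 - rapport),
    }
    practices, w = zip(*weights.items())
    return rng.choices(practices, weights=w, k=1)[0]

introvert = {"extroversion": 0.2}
print(choose_practice(introvert, rapport=0.0))  # distrust and small talk dominate the odds
print(choose_practice(introvert, rapport=2.0))  # the same character, warmer with this player
```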

Perhaps the narcissism of being central to a story appeals to all of us to some degree. We like to see our experiences reflected in the culture we consume — it’s part of the reason social media has exploded in popularity. “One way games can show curiosity is towards understanding the role a player is trying to inhabit and responding to that,” Cook says. “Games aren’t very good at responding to how players express themselves through gameplay — instead, they expect the player to come to them and match themselves into one of the archetypes they prepared in advance. What I hope to see over the next five years is more research into A.I. which can act as dungeon masters or equivalent roles; A.I. that can understand what person the player is trying to be and tweak the game to enhance that narrative they’re building up.”

Bethesda’s brave experiment was just a little too early and a little too focused on sociopathic violence. In a game with more verbs available than “eat,” “fight,” and “score drugs,” players could perhaps form lasting relationships with these A.I. tearaways. Game designer Jesse Schell once told me he imagined a futuristic online role-playing adventure in which players went on quests with A.I. characters who would live on in that world long after that player had left or passed away — they would carry the torch of those experiences, passing them on as stories to the player’s children and grandchildren. We all want to be listened to and remembered. We all have stories to tell. One day, games will play a role in that.