To Find Hidden Bias, Use an AI Lens
Analyzing film and TV scripts reveals biases we can’t always see on our own.
One way to know that recent cultural shifts are beginning to soak into our everyday lives: we don't see familiar things the way we used to. Today, in art, entertainment, and even everyday life, character types and interactions that once seemed acceptably realistic and unremarkable now read as stereotyped, racist, or sexist. They make us uncomfortable. We are seeing the same things through a new lens.
Unfortunately, we’re training our new lens on work that has already been produced; work that, in fact, was probably developed years earlier. For change-makers, that’s too late.
In the film and TV industry, writers and producers are asking how they can evaluate their own work at the manuscript stage, to push back against unconscious bias and, among other things, produce more balanced gender representation in film, TV, and books. A recent New York Times article discussed screenwriting software tools and add-ons that help screenwriters discover unconscious gender imbalances in their own writing. This is a good first step, offering writers a quick evaluation of a work in progress.
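Tools like the ones the Times describes typically rely on simple counting. As a rough illustration (not any particular product's algorithm), here is a minimal Python sketch that tallies dialogue lines per character in a screenplay-formatted text and aggregates them by a hand-labeled gender map. The character names, the gender map, and the ALL-CAPS character-cue convention are assumptions made for the example:

```python
# Illustrative counting-based check, in the spirit of the tools the
# Times article describes. Not any specific tool's algorithm.
from collections import Counter

def dialogue_counts(script_lines):
    """Count dialogue lines per character.

    Assumes a common screenplay convention: a character cue is a line
    in ALL CAPS, and the non-blank lines that follow (until a blank
    line) are that character's dialogue.
    """
    counts = Counter()
    current = None
    for line in script_lines:
        stripped = line.strip()
        if stripped and stripped.isupper():
            current = stripped          # new character cue
        elif stripped and current:
            counts[current] += 1        # dialogue line for current cue
        else:
            current = None              # blank line ends the speech
    return counts

def gender_balance(counts, gender_map):
    """Aggregate dialogue-line counts by gender via a hand-labeled map."""
    totals = Counter()
    for name, n in counts.items():
        totals[gender_map.get(name, "unknown")] += n
    return dict(totals)

# Invented toy excerpt for demonstration only.
script = [
    "ALICE",
    "We leave at dawn.",
    "",
    "BOB",
    "Fine.",
    "",
    "ALICE",
    "Bring the maps.",
    "And the radio.",
]
counts = dialogue_counts(script)
print(gender_balance(counts, {"ALICE": "female", "BOB": "male"}))
# → {'female': 3, 'male': 1}
```

A real checker would also need to handle parentheticals, dual dialogue, and scene headings; the point here is only that a line count is a crude but fast signal.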
My company, StoryFit, of Austin, Texas, does that, and a lot more. Our data scientists have assembled more than 3,000 screenplays and manuscripts spanning several decades and used natural language processing to analyze them according to multiple criteria. We can evaluate new scripts against, for example, financial performance: How do they stack up against box-office smashes? Against esteemed award winners? Against scripts with similar story themes?
As we reported earlier this year, the results of our analyses are often surprising. For example, most female film characters display a far more limited emotional range than male characters. (That doesn’t sound like the real women I know!)
Our findings are always interesting, but the solution to ingrained gender imbalance (or any other cultural imbalance, for that matter) in scripts is not as simple as adding or subtracting words or lines. A writer who uses counting-based tools like those in the Times article will no doubt produce a better script, but it's just a start; capabilities and insights have moved beyond that point. With our deep database and analytic tools, we can show just where a script stands: the emotional ranges of its characters, their contribution to the story's success, and even the hidden messages about society contained in its pages.
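To make the idea of "emotional range" concrete, here is a toy sketch that scores each dialogue line against a tiny hand-made valence lexicon and reports the spread of scores per character. The lexicon and sample lines are invented for illustration; StoryFit's actual NLP models are, of course, far richer than a word-list lookup:

```python
# Toy "emotional range" measure: per-character spread (max minus min)
# of line-level valence scores. Illustrative only; not StoryFit's method.

VALENCE = {  # hypothetical mini-lexicon of word valences in [-1, 1]
    "love": 1.0, "happy": 0.8, "fine": 0.2,
    "sad": -0.7, "hate": -1.0, "afraid": -0.6,
}

def line_score(text):
    """Average valence of the lexicon words in one line (0.0 if none)."""
    words = [w.strip(".,!?").lower() for w in text.split()]
    hits = [VALENCE[w] for w in words if w in VALENCE]
    return sum(hits) / len(hits) if hits else 0.0

def emotional_range(lines_by_character):
    """Map each character to max(score) - min(score) over their lines."""
    ranges = {}
    for name, lines in lines_by_character.items():
        scores = [line_score(line) for line in lines]
        ranges[name] = max(scores) - min(scores)
    return ranges

# Invented sample dialogue.
sample = {
    "ALICE": ["I love this plan.", "I'm afraid we'll fail."],
    "BOB": ["Fine.", "Fine, I guess."],
}
print(emotional_range(sample))
```

Here ALICE swings from a strongly positive line to a negative one, while BOB stays flat, so her measured range is wider. Real models would account for context, negation, and far subtler emotion categories than a single valence axis.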
And, lest we forget: film, TV, and publishing are businesses. Studios and producers, inundated with screenplays, use StoryFit to streamline the evaluation process and discover the scripts they really want. And when it's time to make a decision, the additional insights from StoryFit can help producers choose a script that is also a good investment. StoryFit's database continues to grow, and AI technology enables us to add analytic capabilities and deepen our insights.
A feature film or television show is a complex, collaborative work, and its success depends on many factors. The film and TV industries must always balance audience likes and dislikes, shifting societal trends, and other considerations in making their investment and artistic decisions. StoryFit provides a keen, advisory eye at an early stage. This, we believe, is one way to enable real change.
The Times article on checking for gender bias quoted our friend and fellow Techstars CEO Guy Goldstein, founder of WriterDuet, Inc.: “Technology can do this, and technology should be doing this.”
At StoryFit, we are doing this. And much more.