Midjourney-generated image for prompt: a factory floor filled with robots writing screenplays

The WGA’s next move

Matt Aldrich
AI monks.io
Nov 27, 2023


Hollywood writers emerged from the strike as clear victors, especially in the area of AI. But the new contract raises a significant question: when a writer elects to use AI, who should get the credit?

In the 90s, screenwriter Ron Bass was in high demand. His produced credits include critical and commercial successes like “Rain Man”, “The Joy Luck Club”, “Sleeping with the Enemy”, “My Best Friend’s Wedding”, and “Waiting to Exhale” (among many others). He was dubbed Hollywood’s first Billion Dollar Writer, based on the earnings of his films to date.

As his reputation as a hitmaker became common knowledge, so did his eccentric writing routine. He would start his days at four AM (his company was named Predawn Productions). He employed a team of development assistants, all women, casually dubbed “The Ronettes”, who would help him with research, script notes, outlines, etc. It was widely speculated (though never confirmed) that the Ronettes wrote parts or even whole passes of Bass’ scripts. After all, how else could he churn out as many as seven screenplays a year (including multiple drafts of each)?

Of course, Bass was not the first to blur the line between assistance and authorship. In the 40s and 50s, screenwriter Philip Yordan acted as a front for blacklisted screenwriters and employed surrogates to write his scripts for him. And famously, Flemish master Peter Paul Rubens would have apprentices paint ahead of him, working off his sketches, and would then apply the finishing touches himself.

What do these three men have in common? They took the credit.

With the advent of generative AI, every screenwriter now has their own squad of Ronettes to help with research, offer notes on structure, and even write entire drafts. Seven screenplays a year has become child’s play; a person today can generate seven an hour, with time to spare.

The WGA won major protections surrounding AI, which the studios were loath to concede. According to the new contract, studios are prohibited from using AI to rewrite a script, generate a script, or generate source material on which a script will be based. Nor can studios force writers to use AI as a way to speed up the process. AI can also never receive credit on a given project; only humans can.

However: the contract stipulates that writers may elect to use AI, as long as they disclose it to the studios and comply with studio guidelines. What are those guidelines? No one knows, because they haven’t been written yet. The studios are still grappling with AI’s threat to their own copyrights and trademarks, so how they will allow the tech to be employed remains to be seen. But we can make an educated guess. Studios will likely develop proprietary AI script generators trained on their own back catalogues. These tools would be made available to writers hired to write, say, the next Marvel or DC movie, without fear of copyright infringement.

The AI would not receive credit on such a film; that would go to the writer. The Guild must now ask itself, is that fair and accurate?

Those outside the business might find it surprising to learn that studios don’t determine credit on films. The Guild does. Here’s how it works (briefly): whenever credit is in dispute on a given project, the writers seeking credit participate in a Guild arbitration. A panel of three writers reads every draft by every writer, along with all underlying material, compares them to the final shooting script, and determines which writer or writers should have their names up on the screen. Scripts are read blind (no names, only Writer A, Writer B, and so on) and are evaluated according to criteria laid out in what’s called the Screen Credits Manual (SCM). Each arbiter reaches their decision independently, and if there is disagreement, they meet to deliberate and render a final recommendation.

Currently, there are no provisions in the SCM for AI. Writers in arbitration do not need to disclose whether they used AI at any step of the writing process. Their scripts are labeled the same as those of writers who did not employ AI, leaving the arbiters in the dark. This cannot stand. Arbiters need all the information they can get to reach a fair and accurate determination. Credit is not a matter of mere vanity; real money is on the line in the form of bonuses and residuals.

The SCM must be amended to address this new technology and to protect the integrity of the credit. Some might argue for waiting until the courts settle questions of authorship in the age of AI. But the WGA already has a long and vigorous track record of protecting the writing credit on its own. For instance, the threshold for a director or producer to “grab” credit on a script is higher than that for another writer; they need to demonstrate a contribution of more than 50% to obtain shared credit. The Guild has also fought against the “film by” credit in cases in which the director did not contribute significantly to the screenplay. By rule, the number of people or teams that can receive credit is limited to three, so as not to dilute the credit’s meaning. And most recently, the Guild adopted the “additional literary materials by” credit to better represent the work of members who made significant contributions to a film but whose work does not meet the credit threshold.

The Guild has shown a willingness both to uphold long-held standards and to adapt with the times. It’s time for the union to decide whether a writer who uses AI to draft some or all of a script should be granted equal footing with one who does not. It is, essentially, an anti-doping question. Will the union ban performance-enhancing technology or let it run rampant in the shadows?

The good news is that, at least theoretically, the Guild can act on this issue unilaterally. Changes to the SCM can be made by referendum, voted on by the membership in a process akin to a state ballot initiative. The Guild can, and should, require writers in arbitration to disclose whether they used AI to write a script. That information could then serve as a basis for a) disqualification, b) higher threshold requirements, or c) the assignment of a new “script generated by” credit.

In practice, such an effort would face two big hurdles. First, if a new “generated by” credit is to be minted, the Guild would likely need some level of studio cooperation, if only to help sell the idea to the membership. This is not impossible; as stated, the studios have their own concerns over copyright infringement, and transparency of credit could go a long way toward limiting their exposure. While it may be hard to fathom after this hot labor summer, the Guild and the studios rely on each other, and deals can be struck when interests are aligned.

Second, any disclosure requirement the Guild might put in place would rely on self-reporting. AI detection software will always be one step behind the generation software. And if a “script generated by” credit carries a certain foul odor, writers will be inclined to conceal the fact that they used AI to do the work for them. However, the studios could be an ally here, too. The new deal requires writers to disclose their use of AI ahead of time. Writers’ contracts will soon include standard AI disclosures in order to preserve a clear chain of title. So by the time a script reaches arbitration, AI disclosure would already be a matter of record, something easily verified without the need for cutting-edge software or pinky promises.

Would a “script generated by” credit carry a certain stink? Yes. As it should. Using AI to generate a script is not the same as writing one. To assign the same credit would be an insult to the vast majority of WGA members who wouldn’t even task AI with writing a to-do list. It would also be an insult to audiences, who have the right to know whether their movies, TV shows, news articles, novels, children’s books, etc. were crafted by actual human beings. If we have resigned ourselves to a world with AI, we must all demand transparency. Then, as they say, we can let the market decide.

When the question of authorship was limited to outliers like Ron Bass and Philip Yordan, the Guild did not need to establish new rules and norms. But that was then. The WGA was the first union in the country to strike over AI issues, and won concessions and protections that were regarded as unattainable. The Guild must now look inward and decide what it means to write, at the most fundamental level, and protect that as well.

Midjourney-generated image for prompt: a laptop that spits out hundreds of pages, out of control
