Shining Moonlight On IMDb
IMDb ratings contain more than just film opinions
I hope you have seen Moonlight by now. The acting is superb; the storytelling and character development are unique. The story sits defiantly at the intersection of race and sexuality. It is brave, entertaining and thought-provoking.
Since early 2017, Moonlight has been trashed by some of IMDb’s most vocal users. Some of those reviews express opinions on race, homosexuality and politics, suggesting Moonlight’s ratings may be more than movie-based. This article shares my exploration into the data behind Moonlight’s current “above-average” rating.
Background
I’m a white, heterosexual male who grew up upper-middle class in a semi-small town in Canada. I’m working towards understanding my privilege in a world that seems to bend over backwards to provide me with opportunity. I’ve never really faced the discrimination and violence that too many experience as part of their daily existence. Moonlight is the kind of movie that helps me feel, if briefly, with those who are discriminated against. In so doing, it hunts down my own latent prejudices, reflecting them back with a bright light. The more films like Moonlight I see, the more of my hidden prejudices I see.
Movies like Moonlight that challenge our societal norms face an uphill battle in terms of funding and opportunity. Popular movies are green-lit, directed and starred in by straight white guys. Few women, few people of color, few speaking characters from the LGBTQ community. And there is minimal evidence that Hollywood is changing the way it operates. For example, F-rated (a basic standard for the presence of women in film) founder Holly Tarquini recently stated that “films by and featuring women often have significantly less spent on promotion, so they are more difficult for audiences to find.”
Less on promotion. More difficult to find. Hold that thought.
Last November my girlfriend and I went to see Moonlight. We had both read great reviews and heard from others that it was a must-see. I remember looking up Moonlight on IMDb; at that time it had scored around an 8.4 (stars, out of 10), a very positive vote in favor of going.
Since then the rating has dropped to 7.5. I noticed this a couple of months ago by accident while looking up another movie. My experience of IMDb is that a 7.5 rating implies a movie is good; an 8.4 rating implies it is one of the best of the year. Every 0.1 counts, and every 1.0 is seismic.
I’m a math nerd, and I was skeptical of this drop from 8.4 to 7.5. What’s behind it? When and why did it happen?
7.5 Is Unusually Low
A 7.5 places Moonlight’s IMDb rating in the bottom eight of the past 45 films nominated for a Best Picture Oscar (i.e. since 2012). An 8.4 would have left Moonlight in the top 3 of the same list.
In terms of professional criticism, Moonlight has a Metacritic rating of 99/100 and a Rotten Tomatoes score of 98/100. This year, the 6000 voting members of the Academy Awards chose Moonlight as the Best Picture of 2016. In the past five years, no other film has been rated this highly by professional critics, been nominated for (let alone won) Best Picture, and still carried such a low IMDb rating. Falling from 8.4 to 7.5 is a big deal.
Background And Data
IMDb, owned by Amazon, enables users to rate movies, and it aggregates those ratings into a summary rating. Until I did preliminary research, I assumed that an IMDb rating was a simple average of all user ratings. Instead, IMDb has a secret formula that attempts to minimize ratings influence from motivated groups of users (ballot stuffing). As per Wikipedia, this “sometimes produces an extreme difference between the weighted average and the arithmetic mean.”
As of June 2017, Moonlight had 145,000 user ratings, though each individual rating is not available on IMDb. Besides aggregated statistics, however, IMDb also publishes individual written reviews which include a date and a star rating. In Moonlight’s case, there are about 550 written reviews which can be used to investigate the broader dataset. It is not a perfect proxy; these written reviews represent 0.4% of total ratings. Also, I do not know what formula IMDb uses to calculate the overall rating, so a comparison is even harder. No conclusions can be made, but better questions can be asked.
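Because IMDb’s actual formula is unpublished, any comparison has to lean on illustration. IMDb did at one point document a Bayesian-weighted formula for its Top 250 list, which shows how a weighted rating can diverge sharply from the arithmetic mean. The sketch below uses that formula shape with placeholder parameters; the `m` and `prior` values are my assumptions, not IMDb’s actual settings:

```python
def weighted_rating(ratings, m=25000, prior=6.9):
    """Bayesian-style weighted rating in the shape IMDb once documented
    for its Top 250 list: WR = v/(v+m)*R + m/(v+m)*C, where v is the
    vote count, R the arithmetic mean, C a site-wide prior, and m a
    damping constant. m and prior here are placeholders, not IMDb's
    real parameters."""
    v = len(ratings)
    r = sum(ratings) / v  # arithmetic mean of this film's votes
    return v / (v + m) * r + m / (v + m) * prior

# 100 perfect scores barely move the weighted rating off the prior...
print(round(weighted_rating([10] * 100), 2))       # 6.91
# ...while 200,000 of them dominate it.
print(round(weighted_rating([10] * 200_000), 2))   # 9.66
```

The point of the illustration: with a small sample like the 550 written reviews, a simple average and a damped weighted average can tell very different stories, which is why the written reviews are a proxy and not a substitute.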
Moonlight Falls Fast After The Oscars
Generally, early written-reviewers loved Moonlight. Written-review ratings, however, briefly dropped in late January before recovering mid-February. Then from late February to early March they crashed from an average written-review rating of around 8 to around 5.
The timing suggests the decline is related to the Oscars held on February 26th. Here are some headlines from reviews on February 27th:
In terms of the most negative numerical ratings: prior to Oscar night, Moonlight received 11 written-review ratings of 1/10, about 5% of written reviews up to that point. In the following two weeks it received 50 written-review ratings of 1/10, about 25% of the written reviews in that period.
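That before/after comparison is straightforward to reproduce once the written reviews are collected into (date, rating) pairs. A minimal sketch, with hypothetical records standing in for the real scraped data:

```python
from datetime import date

# Hypothetical (review date, star rating) records; the real data would
# come from scraping IMDb's written-review pages for Moonlight.
reviews = [
    (date(2016, 11, 20), 9), (date(2016, 12, 5), 8),
    (date(2017, 1, 15), 8), (date(2017, 2, 27), 1),
    (date(2017, 3, 2), 1), (date(2017, 3, 5), 5),
]

OSCAR_NIGHT = date(2017, 2, 26)

def share_of_ones(subset):
    """Fraction of reviews in a subset rated 1/10."""
    return sum(1 for _, stars in subset if stars == 1) / len(subset)

pre = [rv for rv in reviews if rv[0] <= OSCAR_NIGHT]
post = [rv for rv in reviews if rv[0] > OSCAR_NIGHT]
print(round(share_of_ones(pre), 2), round(share_of_ones(post), 2))  # 0.0 0.67
```

The same split applied to the ~550 real written reviews is what produces the 5%-to-25% jump described above.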
This Has Not Happened Before
I looked at written reviews for the past eight Best Picture winners. The Hurt Locker saw its written-review ratings generally fall from around 7 to around 5 from mid-January to mid-March. Birdman saw its written-review ratings drop in a similar pattern. It appears an Oscar win can bring negative ratings in response. That said, the violence of the overnight drop experienced by Moonlight is unique.
How did other 2016 Best Picture nominees perform on IMDb (i.e. is this drop a recent, broad phenomenon)? For the most part the written-review ratings remained steady over time. La La Land, Manchester By The Sea and Arrival received consistently poor written reviews. Yet it is clear that Oscar night specifically affected Moonlight much more drastically than any other nominated movie.
Reviewer Motivations
Of the twenty most popular (“useful”) written reviews on IMDb, fourteen were negative; many found the movie boring and slow-paced. True, Moonlight is not a cookie-cutter script. By virtue of doing something new and different, Moonlight was bound to disappoint some people who have a different vision of a movie’s proper structure and pacing.
What is concerning is that after declaring Moonlight a terrible movie, some reviewers started to justify its Best Picture nomination as a result of the #OscarsSoWhite movement. For example, here is a section of the most popular written review for Moonlight:
544 out of 906 people found the following review useful: 5/10 Stars
With all the controversy back in 2016 over the #Oscarssowhite shambles, it seems that in 2017 the Academy has made a conscious effort to include as much diversity into the show as they possibly can. Unfortunately, the downside of that is that films like ‘Moonlight’, which are in reality very average, get recognition they don’t deserve and people are fooled into thinking they are better than they actually are.
Here’s a section of a more opinionated review:
280 out of 540 people found the following review useful: 1/10 Stars
I hate to say it, but I think the only reason it was nominated and reviewed so highly by the powers that be, is because of white guilt, and that’s the simple and plain truth. Hollywood is trying to avoid the mad black actors from last year (Hi Jada S.) that were mad at there being no people of color being nominated, so this pile of crap was given its affirmative action place with some other movies that were actually well made.
To emphasize, 280 out of 540 who bothered to rate this review itself found it “useful,” which is as concerning as the racist review itself. These two reviews (highlighted in orange below) are not the only ones filled with racial conjecture about the Oscar nomination, but these are two of the most popular.
Similarly, other reviewers postulated that Moonlight’s Oscar win was due to a liberal agenda that wanted to see a “gay Oscar winner,” rather than to the film’s merit. These reviews do not have the same popularity in terms of up-votes as the racist reviews above, but they are numerous.
The ratings drop may have been exacerbated by differences in attitudes towards race and homosexuality in major urban centers vs outside of them. As Moonlight’s acclaim grew, the film’s release widened from 650 theaters (November 18) to 1564 (post-Oscars). As Moonlight reached a wider and potentially more conservative audience, the negativity towards the film increased.
Better Understanding
It is possible that the significant drop-off in written-review ratings — along with the overall drop from 8.4 to 7.5 — could simply be due to the nuance, pacing and structure of Moonlight’s script. But the ratings fall also coincides with a wave of racial and homophobic conjecture in the written reviews, and that coincidence warrants further investigation.
Rather than answers then, questions:
- Did Moonlight’s overall IMDb rating drop after the Oscars as quickly as the written-review ratings?
- Did #OscarsSoWhite backlash and homophobia motivate the larger drop from 8.4 to 7.5 or increase the velocity of the fall?
- How does IMDb generally manage a ratings system influenced by racism and homophobia (and other bigotry)?
At this point I think only IMDb can provide greater clarity, given that they hold all the data.
Some Fixes
What’s in a rating? When I visit IMDb, is the rating an aggregation of movie opinions, or is it influenced by political and social views? Many people do not question the crowd-sourced rating systems that influence the content they choose. More of us are aware of the filter bubble that is our newsfeed, from Facebook to Twitter. But what about sources like IMDb?
Personally, I need to be more mindful of my filters. Clearly, a four-star Yelp rating for sushi in Tokyo means more than it does for sushi in Toledo. But now I see that a 7.5 IMDb rating for Moonlight could mean more than an 8.2 for Hacksaw Ridge.
We also need better tools, or demand more from the providers of the tools we now use. For example, IMDb allows you to search for movies with a stronger female presence. However, you would have to know that this F-rated standard exists in the first place and then you would have to do an advanced IMDb search. There are no filters (e.g. the Racial Bechdel Test) on IMDb that help users find movies with a stronger presence from people of color.
New filters could reward or prioritize films that pass diversity tests, or penalize/eliminate films within a general search that fail all diversity tests. Other tools could amalgamate ratings from multiple sources — from IMDb to Rotten Tomatoes to F-rated to the Racial Bechdel Test — and combine them into a modified rating based on customizable settings.
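As a sketch of what such an amalgamated rating could look like — the source names, scores, weights and 0–10 normalization below are all assumptions for illustration, not a real API:

```python
# Hypothetical blended rating: normalize each source to a 0-10 scale,
# then take a weighted average with user-chosen weights.
def blended_rating(scores, weights):
    """scores: {source: (value, scale_max)}; weights: {source: weight}."""
    total = sum(weights[s] for s in scores)
    return sum(
        weights[s] * (value / scale_max * 10)
        for s, (value, scale_max) in scores.items()
    ) / total

scores = {"imdb": (7.5, 10), "metacritic": (99, 100), "rt": (98, 100)}
weights = {"imdb": 1.0, "metacritic": 2.0, "rt": 2.0}  # trust critics 2x
print(round(blended_rating(scores, weights), 2))  # 9.38
```

With critic sources weighted up, Moonlight’s blend lands far above its 7.5 IMDb rating alone — exactly the kind of customizable view such a tool could offer.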
What Can You Do?
Actively consider whether the next potential films you want to see have a fair opportunity to compete for your time and dollars. This is probably your most powerful vote. Actively search for and enjoy films that prioritize diversity.
Ask more from Amazon/IMDb. Amazon has built an empire on our own data; the least it can do is provide better solutions that address racism, homophobia and sexism. Tweet at them and ask how they look for bigotry in their ratings systems. Demand clarity and data transparency so that their ratings can be properly audited. Here is a possible tweet:
@IMDb @amazon you want to keep making money off of my data? Fix the bias in your ratings #BigotRatings #Transparency bit.ly/mnlghtmv
Burst My Filter Bubble
Here’s why I care so much about what happened to Moonlight’s rating: what would my November 2016 self have thought if I had seen Moonlight’s current 7.5 rating instead of the 8.4 I actually saw back then? I cannot say for sure, but on the margin it would have decreased my chances of seeing Moonlight, and that is a damn shame.
If movies that push back against Hollywood norms, against societal norms, against narrow thinking — if these movies lack promotional support and are attacked on some of our most popular crowd-sourced platforms, then our challenges multiply. This is a world that needs more empathy and compassion; we cannot afford filters that include stealth bigotry.
###
I appreciate that by virtue of writing this post, I highlight my own naivety. The idea that filters may contain prejudice is not new, though I hope my analysis adds to the discussion. As a friend put it, “IMDb and their ratings exist in a racist world. It can’t escape the systemic racism that is all around us, and influences every user of the site.” In the meantime, I welcome criticism and I appreciate your time in reading and responding. As well as those credited by Medium below, thanks to Kaz Brecher, Katie Aholt, Becca Shipps and sianpierre for their insights and encouragement.