Whose ratings should you trust? IMDB, Rotten Tomatoes, Metacritic, or Fandango?
Alexandru Olteanu

I understand the motivation to do so, but I'm not sure you can back up the claim that the quality of most movies actually sits in the middle of the graph, since that would mean most movies are only barely watchable. Isn't it more likely that most movies are fine, and mostly good entertainment, and that it's actually rarer for a movie to be merely middling, which to a lot of people signals it might not be entertaining at all? As you noted, this is simply a choice one has to make in such research, but it seems an odd choice, and it implies you're substituting your own personal intuition about most movies for some other, less biased way of determining what the distribution should be.

Wouldn't it be less prone to personal bias to do something like take the average across all the available review aggregators and then pick the distribution closest to that average (or some similar method)? I'm not saying that would be the correct method, only that I'm not sure you've actually justified that your chosen method was the right one. For instance, you describe the bulk of movies as "a whole bunch of average ones" and say you can't even remember their plots anymore, but it's unclear why remembering the plot at some arbitrary point in the future bears on a film's entertainment value at all, since you can't know whether you'll remember a film when you're reviewing it right after watching it for the first time. Even just checking each site's distribution against the CinemaScore distribution might carry less bias than your criteria do. I'm just not sure you've accomplished what you set out to do, since your initial search criteria return inherently biased results.
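To make that suggestion concrete, here is a minimal sketch of the "compare each site against the pooled average" idea. The file name, column names, and the assumption that all ratings have already been normalized to a common 0–5 scale are mine, not from the article; the distance measure (total variation) is just one reasonable choice among several.

```python
# Minimal sketch: compare each aggregator's rating distribution to the
# average distribution pooled across all aggregators.
# Assumptions: "movie_ratings_normalized.csv" (hypothetical file) has one
# row per movie and one column per site, with ratings already on a 0-5 scale.
import numpy as np
import pandas as pd

df = pd.read_csv("movie_ratings_normalized.csv")  # hypothetical file
sites = ["imdb", "rotten_tomatoes", "metacritic", "fandango"]

bins = np.linspace(0, 5, 11)  # half-star bins on a 0-5 scale

# Each site's ratings as a probability distribution over the bins.
dists = {
    site: np.histogram(df[site], bins=bins)[0] / len(df)
    for site in sites
}

# The pooled "average" distribution across all aggregators.
avg_dist = np.mean(list(dists.values()), axis=0)

# Total variation distance from the average; the smallest value marks the
# site whose distribution is least unusual relative to the others.
for site, dist in dists.items():
    tv = 0.5 * np.abs(dist - avg_dist).sum()
    print(f"{site}: distance from average distribution = {tv:.3f}")
```

The same comparison could be run against an external benchmark such as the CinemaScore distribution instead of the pooled average, which is the other check mentioned above.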
