It’s happened to all of us before. You’re browsing local theatre listings, scrolling through Netflix or flipping through Shaw on Demand, and you see a movie that catches your eye. When it first came out, the ads looked good and you made a mental note: I want to see that. It was a bare calculation then, a simple judgement call with nothing to fall back on.
Now you can look it up on Rotten Tomatoes.
The moment that page opens, your decision is made. You’re looking for that cardinal number. Please be decent, you think to yourself. If it’s over 75%, you’re seeing it for sure. If it falls below that magic line of 60%, receives one of those green splatters and a ‘rotten’ designation, it’s out of the conversation. It’s a bit of a shame, really. That pure, unadulterated first judgement call you came to when you first heard about the movie is wiped away in one fell swoop, without hesitation.
The truth is that the ‘Tomatometer’, as it’s called, is widely trusted; Rotten Tomatoes is the most popular film review aggregator on the web.
Rotten Tomatoes was founded in 1998 with the goal of gathering the reviews of multiple film critics in the United States in one place. Soon, the site expanded, collecting all of those reviews and assigning each film a blanket number, or percentage, like a badge. It has since been bought first by IGN and then by Flixster, which was in turn bought by Warner Bros. The site’s immense popularity has only grown since its launch, benefiting from a strong mobile presence under the Flixster app. And all of this success, with its reputation as the go-to place for film reviews, would be fine and dandy if the rating system weren’t flawed.
The site gathers critics’ reviews and sorts through each one, designating them as either favourable or non-favourable (“fresh” or “rotten”, if you will). Right off the bat, this is a little troubling. Not all reviews come with simple ratings out of five or ten, so interpreting a review is in and of itself a judgement call. A positive review is counted as fresh and a negative review as rotten, and from that point on, a percentage is generated. Fair enough, but this system clearly creates a discrepancy between what the rating actually means and what a site visitor (you) will think it means.
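That discrepancy can be shown with a toy calculation. The scores below are hypothetical, not real review data, and the cutoff of 6 out of 10 is an assumed stand-in for the fresh/rotten line:

```python
# Toy illustration of how a "percent fresh" score can diverge from the
# average score. Scores are hypothetical, out of 10; anything at or
# above 6 counts as "fresh", mimicking a simple positive/negative cutoff.

def tomatometer(scores, fresh_cutoff=6.0):
    """Percentage of reviews counted as fresh (positive)."""
    fresh = sum(1 for s in scores if s >= fresh_cutoff)
    return round(100 * fresh / len(scores))

def average(scores):
    """Mean score, rounded to one decimal place."""
    return round(sum(scores) / len(scores), 1)

# A charming film that every critic liked but nobody loved:
charmer = [7, 6, 7, 6, 8, 7, 6, 7, 6, 7]
# A divisive film that most critics rated very highly, a couple hated:
divisive = [10, 9, 10, 9, 4, 10, 9, 3, 10, 9]

print(tomatometer(charmer), average(charmer))    # 100% fresh, 6.7 average
print(tomatometer(divisive), average(divisive))  # 80% fresh, 8.3 average
```

The film nobody disliked ends up with the perfect Tomatometer score, even though critics, on average, rated it well below the divisive one.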
The movie Mud recently came out and turned some heads with a 99% rating on RT. And while Mud is certainly an entertaining and thought-provoking film, one that brings the nostalgia and charm of Stand by Me to the Mississippi River, it doesn’t belong in the upper echelon of all-time great films, which a 99% rating ostensibly denotes. A 99% rating does not mean that the critics settled on an average of 9.9 out of 10 for the movie. It just means that the charm Mud exudes made it impossible for reviewers to give it a rotten rating. It’s subjective.
The truth is that the site also offers an average rating on the review page (Mud got an 8.0), but it’s in a small font and poorly placed on the page, so it’s often overlooked. People tend to zero in on the first number they see and let it sear into their brains. Thus, an opinion of a movie is formed within a second of seeing that percentage, and the problem of movie ratings being dictated by a simple like-or-dislike button becomes a deep one. A film like Goodfellas rightfully garners a 97% rating on the site, but its average rating of 8.8 is significantly higher than Mud’s.
Another problem with depending so heavily on the Tomatometer is that critics’ reviews are biased from movie to movie, as each film brings a separate set of expectations that each critic carries into the viewing room. If you read individual critics’ takes on a film, you’re bound to get an actual sense of how they felt about it, whereas if you just see the Rotten Tomatoes rating, you will just see low scores for The Great Gatsby or Man of Steel because critics anticipated more than they got (even a Rotten Tomatoes editor thought the rating on Man of Steel was far too harsh). Juxtapose that with the rating the first Hangover received, a film from which next to nothing was expected, and you get a real sense that prior expectations can affect critics’ ratings.
Warner Bros. now owns the site, meaning an actual film studio runs Rotten Tomatoes. We’re not about to actively accuse the company of fucking with the reviews of its own films, but we’re definitely going to question whether that takes away some of the authenticity of the site. At the very least, It’s Complicated… (I never got around to seeing that… wonder what it got on RT?)
We’ve all watched a movie after checking its rating on RT, only to be shocked that it wasn’t rated higher/lower. What was your movie choice, what was the RT rating, and how big was the difference between expectation and reality?