Bottom’s Up: The IMDb Bottom 100 and the Art of Identifying “Worst” Movies…

Readers of the site might be aware that I co-host a weekly podcast called The 250 with my good friend Andrew Quinn, in which we pick a movie ranked on the Internet Movie Database as one of the best movies of all-time. It’s a dynamic and public list, which means that it covers a wide variety of films and tastes. In the past couple of months alone, we’ve covered everything from Mission: Impossible – Fallout to The Battle of Algiers to Paper Moon to The Secret in Their Eyes to The Prestige. I’m very proud of the podcast, and a lot of the discussions that we’ve had on it.

Part of this podcast has also involved looking at the list that the Internet Movie Database maintains of the worst movies ever made. We originally planned to rotate through both lists in an even-handed manner: five episodes of the top two hundred and fifty for every two episodes of the bottom one hundred. Indeed, we covered a number of the bottom one hundred as part of the show, in episodes on Crimea and United Passions. However, we moved away from covering the bottom one hundred because we found that the movies populating the list weren’t so much awful as just mind-numbingly dull: Lawnmower Man 2, Crossover.

However, something vaguely interesting happened in the middle of July. The Internet Movie Database made a change to its list of the one hundred worst movies of all time that radically revised the nature and composition of the list. Suddenly, a lot of the smaller and stranger titles disappeared. Fringe films like Space Mutiny, Die Hard Dracula, Invasion of the Neptune Men and Santa With Muscles were all wiped out in an instant, replaced by more familiar and recognisable films like Jaws 3D, S. Darko, Blair Witch II: Book of Shadows, The Wicker Man, Batman and Robin and Fifty Shades of Grey.

The result was a list that was suddenly a lot more fun to talk about, composed of films that people had actually seen instead of disastrously bad cult curiosities. Indeed, one very small consequence of this change is that we’re actually going to try to get back into talking about these terrible movies on a semi-regular basis on the podcast, because the list is now populated with films that will engender more interesting discussions both about the films themselves and their larger cultural context.

At the same time, it raises larger questions about what we consider to be the “worst” films, how we rank and assess bad cinema and what that actually means in the grander scheme of popular culture. The change implemented to the IMDb’s bottom one hundred list is a conscious attempt on the part of the organisation to answer these questions, to create a broad consensus about what it means to be the “worst” films ever made. It’s an intriguing effort, and arguably something very different from trying to pick the “best” films ever made.

After all, it’s broadly possible to forge something resembling a consensus on the best movies. Trying to identify the worst is a much more difficult proposition.

One of the interesting things about end of year lists is that there tends to be a broad consensus on them. A lot of critics’ lists at the end of the year will contain familiar and recognisable films, with a broad consensus forging around a dozen or so films as the best of a given year. Individual critics will inevitably have their own unique or esoteric choices, and obviously individuals will rank their films differently, but broadly speaking something resembling a consensus emerges towards the end of the year.

Obviously, the emphasis within this consensus shifts depending on the critical body. Establishment critics might be happier to recognise Three Billboards Outside Ebbing, Missouri than a younger generation of more socially conscious critics. Critics with a fondness for independent cinema might prefer The Florida Project to something like The Disaster Artist. More mainstream-oriented critics might elevate films like Logan while the Academy Awards would be more likely to go with something like The Post.

However, by and large, at the end of a given year, it is common to see the same names coming up time and time again. There are a variety of reasons for this. Some of these reasons may be cynical, suggesting the manner in which awards season is built around the lead-in to the Academy Awards and perhaps even concerns about homogeneity among these pools of critics. However, it also seems fair to concede that sometimes good films are elevated and do rise to the top of the cultural conversation.

It is heartening to think that good low-budget and independent films can be elevated by critical acknowledgement and force their way into the conversation. Films like Moonlight and Lady Bird are movies that could easily have been forgotten and lost in the mix, but were elevated because they were good. Critics praised Moonlight and Lady Bird, so more critics watched Moonlight and Lady Bird, and so Moonlight and Lady Bird reached a wider audience. Had the films floundered on the festival circuit, they would have had smaller distribution deals and smaller circulation.

Indeed, this perhaps hints at why there is so much overlap between various end of year lists constructed by critics. Films that are already regarded as good, that have already garnered praise and attention, are more likely to be considered “must-see” by institutions and organisations. Time is a precious commodity, and if films like Phantom Thread and Darkest Hour are deemed “essential” parts of the end-of-year conversation based on early buzz, then more critics are likely to have seen them, and so they are far more likely to factor into these lists.

After all, as appealing as the notion of an “underseen and unheralded classic” might be, the truth is that most of the best movies of a given year (or era) tend to have champions who trumpet them and that these recommendations are enough to ensure that these films are widely seen. After all, what self-professed film lover wouldn’t want to see the films that are commonly agreed to be the best films of a given year? Who doesn’t want to sit down and enjoy a good piece of cinema? As such, many of the “best” films are widely seen and so consensus can form around them.

This dynamic obviously breaks down with terrible films. After all, relatively few people actively seek out terrible films to watch. Although many film reviewers will watch any movie with a scheduled press screening and a wide enough release, like The Happytime Murders, there are plenty of horrible movies that languish in obscurity largely unseen, like Escape Plan 2: Hades or China Salesman. These movies are undoubtedly bad, but they are also not movies that people actively seek out. They rarely have wide mainstream distribution, in part because the buzz around them is not particularly exciting.

As a result, asking somebody to pick their least favourite film of a given year is a somewhat esoteric experience. Not only because dislikes tend to be highly personal, but also because there is rarely a larger pool of shared experiences of awfulness from which the larger group might draw. Undoubtedly, every year produces a handful of truly awful big budget blockbusters like Assassin’s Creed or King Arthur: Legend of the Sword or Justice League, but most of the time bad studio films are often bland rather than truly, horrendously, spectacularly awful.

Personally, very few of the worst films that I see in a given year will go on wide release. In fact, many of the worst films that I see in a given year, I see as part of film festivals or on the circuit. Bad reviews and word of mouth drive audiences and other critics away, and these films never generate the attention that bigger releases attract. Most years, two or three of the worst films that I see will be at the Dublin International Film Festival, just as two or three of the best will be. However, those two or three best films will inevitably get wider distribution. Films like The Meeting, Unless or Price of Desire will remain obscurities.

This creates a challenge in talking about the “worst” films of a given year, because it’s often the case that people having that conversation are working from very different frames of reference. This explains the desire to paint movies that really aren’t that bad (like the ambitious-but-flawed Batman v Superman) as touchstones for awful cinema, because they exist within a common frame of reference and so are more likely to generate an engaging discussion and a common sense of bonding among participants than something like Keloglan versus the Black Prince.

This is why, for example, the Golden Raspberries (the “Razzies”) tend to go for the lowest-hanging fruit when judging the worst of the year in cinema. Although the Academy Awards can often be predictable and stale when identifying the best films of a given year, the Razzies are even more straightforward. They award big-budget films starring recognisable actors that met with either exaggerated audience backlash or knee-jerk critical rejection. This isn’t to suggest that the Razzies nominate misunderstood classics, but they also decline to actively look for bad films outside of popular consciousness.

To nominate The Emoji Movie, The Mummy, Baywatch, Fifty Shades Darker and Transformers: The Last Knight as the worst movies of a given year suggests a profound lack of imagination, and an utter unwillingness to look outside a calcified mainstream consensus. While the Academy Awards serve to at least elevate indie films like Moonlight or Lady Bird from the festival circuit to the mainstream, the Golden Raspberries decline to do anything as interesting or worthwhile. They neither slaughter sacred cows nor highlight awfulness that exists off the beaten track.

This is where the poles of discussion about “bad” movies exist, between those people who actually wade through the darkness to find the obscure titles that genuinely merit the descriptor and those who seek to acknowledge popular consensus in identifying and listing bad films, a consensus that is often limited by the fact that (for completely understandable reasons) very few people invest the time and energy to become true connoisseurs of the terrible. The reworking of the Internet Movie Database’s Bottom 100 reflects the pull from one extreme to the other.

Although the lists of the Internet Movie Database are nominally democratic, the website itself takes care to carefully manage the system to prevent any particularly surreal outcomes. (This doesn’t always work, of course, and voting on the Internet Movie Database is becoming increasingly politicised.) It should be noted that these changes in policy are often reactive rather than proactive; the website often seems to be responding to particular controversies among its user base.

There was, for example, some anxiety when a large number of Indian and Turkish films began arriving on the list. Of course, this seems perfectly reasonable given the quantity of films produced in these markets. Statistically speaking, it seems highly unlikely that American, European and Japanese markets hold a monopoly on good cinema. However, the user base reacted to the influx of these movies with horror, and this led to a number of retools of the algorithm in order to minimise the arrival of these films on to the list.

The site carefully weights its users’ votes to prevent ballot-stuffing; a core demographic of users (“top 1000 voters”) find their votes given greater priority than others. Similarly, minimum thresholds have been introduced and constantly revised upwards, so that a certain number of people must see (and vote on) a movie before it can rank on the list. At the moment, the threshold is set at 25,000 for the top two hundred and fifty films. Of course, this has had a number of unintended side effects. Most obviously, many classic movies like The Sweet Smell of Success were cut from the list because they didn’t hit the quota.

Until recently, most of this tinkering had taken place on the site’s list of the top two hundred and fifty movies. This makes sense, given that this was the list that tended to garner the most attention and press coverage. Audiences were more likely to seek out good movies to watch rather than bad, and so being named one of the best movies of all time was more important to a film than being named one of the worst. Nevertheless, the Internet Movie Database made a pretty dramatic change to how it ranked the bottom one hundred films in July.

The mechanics of this change are not entirely transparent. The company does not reveal the finer points of its mechanics and does not announce these behind-the-scenes changes ahead of time. Indeed, the list seemed to change overnight, looking radically different on July 14th than it had on July 9th. Just observationally, it appeared that the company had increased the number of votes necessary for a film to place on the ranking. It seems like a movie now requires more than 10,000 votes to appear on the Internet Movie Database’s Bottom 100 Movies of All-Time.

This change to the entry criteria has a dramatic impact on the list. Foodfight, which was ranked the seventh worst movie of all-time in the earlier list, has only seven thousand five hundred votes. Titanic: The Legend Goes On, which was ranked the tenth worst movie of all-time in the earlier list, has around eight thousand votes. Movies that are genuinely awful tend to have a harder time attracting audience members, which means they have a harder time attracting IMDb voters, which means it is a lot harder for a truly spectacularly awful movie to hit that minimum threshold.

Indeed, a lot of the movies that we had covered from the list on the podcast were wiped overnight. Movies like Crimea, Keloglan versus the Black Prince, Lawnmower Man 2 and United Passions all failed to hit that bar, along with other cult “classics” like Chairman of the Board or Santa Claus Conquers the Martians. In their place, movies that were less aggressively disliked but more widely seen swarmed on to the list. It is telling that the highest score (out of ten) on the earlier list was 2.7, while the highest score (out of ten) on the current iteration is 4.2.

This is an interesting compromise, because it makes the list more interesting to discuss while also making it slightly less accurate. A much higher percentage of films on the bottom one hundred list have now had a major release, and many of them even had a serious impact on popular culture. There are two Nicolas Cage movies featured on the list at present. There are a host of sequels from RoboCop 3 to The Exorcist 2 to Superman IV: The Quest for Peace. An average cinema-goer might expect to have seen twenty to thirty films on the list, and to recognise around seventy to eighty of them.

The shift also makes the list more interesting to talk about as representative of the IMDb user base. As we noticed recording the podcast, the IMDb list reflects a particular kind of cinema-goer. There are a lot of male-oriented movies like Fight Club or The Dark Knight. There is a very strong emphasis on New Hollywood, as reflected by the high placement of films like The Godfather or One Flew Over the Cuckoo’s Nest. There is also strong representation from the nineties, with movies like Pulp Fiction, Reservoir Dogs, Forrest Gump and The Shawshank Redemption placing highly.

In particular, there is a strong emphasis on nerd culture on the list, particularly with new releases. The major releases from the past year to place on the list include Deadpool 2, Mission: Impossible – Fallout, Incredibles 2, Avengers: Infinity War and Thor: Ragnarok. The exclusion of the critically and commercially successful Black Panther from that list is perhaps very revealing. These are the movies that tend to attract both the quantity of votes necessary to qualify for the list and the high grades from those voters in order to place on the list.

As such, the changes made to the Bottom 100 are interesting in how they illustrate the same principles. The revised Bottom 100 is much more reflective of the kinds of films that would upset the kinds of voters who define the Top 250, turning the bottom list into the shadow self of the top list. While the top list reflects masculinity, the bottom list seems distinctly uncomfortable with femininity; Fifty Shades of Grey, Spice World and Crossroads all place on the list, films consciously aimed at niche female audiences.

The bottom one hundred is now dominated by films that inspire what might be described as “nerd rage”, the films most likely to upset the sorts of fans eagerly placing Infinity War among the best fifty films ever made. The bottom one hundred is populated with movies that these fans would consider humiliating or embarrassing. There are countless failed videogame adaptations like Street Fighter or Super Mario Brothers. There are early superhero films like Catwoman, Batman and Robin and Superman IV: The Quest for Peace. There are bad sequels like Speed 2: Cruise Control and The Flintstones in Viva Rock Vegas.

The result is a list of “worst” films that is surprisingly mainstream, perhaps a lot closer to the cultural consensus than the earlier iterations of the list had been. It is certainly more reflective of what the average cinema-goer could identify as a bad film. Of course, recent films with vocal fandoms are still somewhat insulated from inclusion on the list. Films like Transformers or Suicide Squad are unlikely to ever place on it despite the low cultural esteem in which they are held, in large part because those small but vocal fandoms protect them.

There is some value in this. Certainly, as the co-host of a podcast on which I have to talk about individual films for an extended period of time, it is a lot easier to have a casual and engaging conversation about Norbit or The Adventures of Pluto Nash than it would be to discuss Dracula 3000 or Fat Slags, if only because the cultural context of the former is a lot clearer and it’s a lot easier to talk about those films from a common frame of reference. Indeed, The 250 will be making an effort to reintroduce the Bottom 100 as a more regular feature, at least as part of a pilot scheme.

That said, it does feel like something esoteric has been lost in this transition, in that the average quality of movies on the list has increased significantly. Most of the list is still awful, of course, but very few of the movies on it would be among my worst of the year in question. It’s an interesting question to ponder, the gulf that exists between cultural consensus and individual experience, the concession that needs to be made towards mainstream awareness and experience when making arguments about the “worst” films ever made.

Certainly, there’s a lot to talk about.

The 250 will be discussing the changes to the Bottom 100 this weekend, with an episode focusing on “The Open House”, a Netflix movie which became the streaming giant’s first film to place on either the top two fifty or the bottom one hundred. We are provisionally looking at doing a Bottom 100 episode once a month, randomly mixed in with our regular schedule. Please let us know (a.) if you think this is a good idea and if you want greater/lesser coverage of the Bottom 100 and (b.) if there is a movie on the list that you want us to cover, which movie you’d like to see discussed.

