“Zombie Shows” and a Short Tangent on Ratings Accuracy
1 Introduction
Recently I was reminded of the existence of the show The Walking Dead, and something just didn’t sit right in my mind: how in God’s name was it still on air by 2022? I can’t say whether the show was still any good by that point (I stopped watching around season 5 in my youth), but I know, however anecdotal this is, that I did not hear a lick of news about it for the last 5 years. Actually, it’s been more than 5 years; I haven’t heard anything in the general consciousness since the death of Glenn, and even that was only a blip in a growing stream of silence. I don’t want to bash a show that has, mercifully, been over for 2 years now. Rather, my realization that The Walking Dead had crawled along for so long, and somehow with decent reviews online, presented me with a fun question to explore:
Are obvious metrics even reliable? Can we actually trust something like user reviews simply because they are collected in aggregate?
To explain the train of thought that led me to the question in the block quote above, I’ve provided the following extremely vital (and improperly formatted) flowchart(s):
The immediate answer to the question posed is that user reviews may suffer from selection bias. Sometimes, perhaps, we can trust user-review aggregates, but other times the selection bias is too strong. This is because user reviews are entirely voluntary at not one but two stages. First (excluding review-bombing), users presumably must choose to watch a show and form an opinion. Second, users must decide that their opinion is worth expressing online in a review. If, for example, a show is slowly dying and only the truest of true fans are left watching, the user reviews will be biased in aggregate, because those who might otherwise review the show negatively have already dropped out at stage one. This phenomenon isn’t particularly difficult to grasp, but I think The Walking Dead television show provides a very good example. For the next few sections, I’ll demonstrate how even obvious data can be difficult to trust by using a long-zombified, now-dead television show as an example. Specifically, I will be looking at viewership statistics and IMDB ratings for The Walking Dead over time.
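To make that two-stage filter concrete, here is a minimal simulation sketch with entirely made-up numbers (nothing here is measured from The Walking Dead). It assumes a hypothetical 1–10 opinion scale, that most people who dislike a season stop watching it entirely, and that enthusiastic watchers are somewhat more likely to bother leaving a rating; under those assumptions, the visible average rating falls far more slowly than the underlying quality does.

```python
import random

random.seed(42)

def simulate_season(true_quality, n_potential_viewers=10_000):
    """Simulate one season's visible average rating under two voluntary stages:
    1) people who dislike the season mostly stop watching it at all,
    2) only some watchers bother to rate it, enthusiasts a bit more often."""
    ratings = []
    for _ in range(n_potential_viewers):
        # Each person's private opinion of the season on a 1-10 scale (made up).
        opinion = max(1.0, min(10.0, random.gauss(true_quality, 1.5)))
        # Stage one: 80% of the people who dislike the season drop out entirely.
        if opinion < 6 and random.random() < 0.8:
            continue
        # Stage two: a watcher only sometimes reviews, more often if they loved it.
        review_prob = max(0.01, 0.05 + 0.02 * (opinion - 5))
        if random.random() < review_prob:
            ratings.append(round(opinion))
    return sum(ratings) / len(ratings)

# The underlying quality slides from 8 down to 5; the observed average falls far less.
for season, quality in enumerate([8.0, 7.5, 7.0, 6.0, 5.0], start=1):
    print(f"season {season}: true quality {quality:.1f}, "
          f"observed average rating {simulate_season(quality):.1f}")
```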
2 The Walking Dead
To call The Walking Dead a “Zombie Show” in its final season would be reflective not only of its content by then, but of its character as well. Despite facing declining average viewership as early as season 5, it trudged on for 6 more seasons. Before talking about reviews, I want to make sure that we are in agreement about the death of the show. Below is a graph displaying the television viewership counts in the United States of The Walking Dead per episode and split across seasons (“Sunday Cable Ratings [Multiple]” 2010-2018; “Showbuzz Daily Sunday Top 150 Cable Originals Network Finals [Multiple]” 2019-2022):
As the chart in Figure 1 shows, viewership counts have faced a sustained decline since around the middle of season 5, with a notable spike in the first episode of season 7, the death of Glenn. Looking at the data, there is a fair criticism right off the bat: these are cable television viewership stats, and people have been cord-cutting (Rainie 2021) for years. What if the show was not experiencing as dramatic a fall-off as I am depicting? Unfortunately, no, cord-cutting does not explain away the woes of the show. In fact, according to FCC (2019-2022), more people had cable subscriptions in 2018 than in any prior year of the 2010 to 2022 period, and we know The Walking Dead was losing viewers long before then.
Perhaps many with cable subscriptions just didn’t watch much cable anymore by then. As Molla (2018) points out, “Cable and satellite companies often include TV, internet and telephone for a single reduced price. For many [American cable subscribers], the cost they’d have to pay for internet alone isn’t much less than they would pay for all three services individually, so the perceived value can seem tempting. In turn, the cable companies are able to eke out higher overall monthly fees by throwing in those ‘extras.’” There could be many cable subscribers, then and now, who don’t actually take much advantage of their cable subscription. However, just because subscribers choose not to watch cable doesn’t mean they can’t. If someone has easy access to something, the only reason they wouldn’t use it is that they don’t want to: if someone already has cable, the reason they aren’t watching The Walking Dead is either that they don’t want to or that they don’t have time to, and a lack of time did not seem to hold back viewership in earlier seasons. Given what we know, the graph below is functionally the same as Figure 1, so I’ve put it in a collapsed note to avoid taking up too much space with graphs that convey similar points. If you’d like to view this second graph anyway, just click on the note to expand it.
Clearly the viewership decline was genuine, so what made people stop watching? If people could watch but did not want to, then presumably the show got worse, or at least worse than competing offerings in similar categories at the time.
3 The Problem with User Ratings
Critics have been around forever, but only with the advent of review aggregators can we combine reviews from around the world and seemingly arrive at a general critical consensus on a show. However, another, arguably more impactful, thing that aggregators did was “democratize” reviews. Now anyone from around the world can praise or hate on their media of the day. For the purpose of my guiding question from Section 1, I am focusing on user reviews, which are both more plentiful and often more diverse at the level of individual ratings. If we can trust these reviews, surely they should tell us that The Walking Dead got worse (and that would explain why people stopped watching when they otherwise could have). Below are two charts displaying Internet Movie Database (IMDB) viewer ratings of the show over time:
As seen in Figure 3, for the most part (excluding some real stinkers in season 10, apparently), reviews of the show remained positive throughout a dramatic and sustained loss in viewership. People tuned out, presumably because they didn’t like the show enough to keep watching, but the reviews didn’t exactly nosedive. If we know people could watch the show, and we see the reviews weren’t totally tanking, why were fewer people watching? There are a few main possibilities:
- Even if the show wasn’t bad, perhaps there were better shows. But this ignores the fact that people were already invested in this one. Completely dropping a show you are already five seasons deep into just because another catches your eye doesn’t sound likely.
- The reviews are not reflective of the tastes of the general viewership. This is the one I want you to believe rather than the straw-man alternative I created.
You see, I don’t believe these ratings, and despite their great sample sizes and all that averaging, I have a simple explanation for why: the selection bias problem from Section 1. The people who watch a show are the ones who review it. If people become bored enough with a show to simply leave it, they stop reviewing it as well. This makes taking user reviews and aggregators seriously much more difficult.
Interestingly, you can see that the decline in average ratings, per episode and across seasons, is sharpest somewhere around seasons 6 and 7, when there were still many people watching but the slow trickle away of viewers had already begun. Given the scale of viewership at the time, those ratings likely included many from the general audience rather than just die-hard fans, and they grow notably more negative right up until the viewer base thins out. This observation aligns with the idea that reviews near the end of the show were biased toward devoted viewers, and as such I do not trust those later reviews. Without having watched these later seasons, and with a decade between myself and the last time I watched the show, I look to the dynamics of viewership to tell me whether it is worth watching, and I don’t see good things.
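For anyone who wants to eyeball that divergence themselves, a rough sketch of the comparison might look like the following. The file and column names (walking_dead_episodes.csv, season, imdb_rating, num_votes, us_viewers_millions) are hypothetical stand-ins, not the actual names used by the sources cited above.

```python
import pandas as pd

# Hypothetical episode-level table; swap in however your scraped data is stored.
episodes = pd.read_csv("walking_dead_episodes.csv")

per_season = episodes.groupby("season").agg(
    mean_rating=("imdb_rating", "mean"),
    total_votes=("num_votes", "sum"),
    mean_viewers_millions=("us_viewers_millions", "mean"),
)

# Scale each column by its own peak so the decline rates are directly comparable.
# If viewership collapses while the mean rating only drifts down slightly, the
# late-season ratings are probably coming from the remaining die-hard audience.
print((per_season / per_season.max()).round(2))
```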
4 Conclusion
I don’t want to entirely discredit user reviews; that would be foolish. Instead, I propose a simple rule for anyone looking at reviews for a show or any similar episodic medium: give the most weight to early reviews, because that is when the broadest audience the show will ever have is watching and reviewing it. If you want to know what awaits you as a viewer down the line, check viewership numbers rather than reviews; if you see a steady decline, you can probably assume the show gets drab for the average viewer by that point. I know I have been a little cruel to The Walking Dead, but there is simply no way in hell the ratings were only down a point near the end for a show that had fallen to a tenth of its peak viewership, and that demonstrates the value of taking reviews, no matter how big the sample size, with a grain of salt.
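To make that rule of thumb slightly more concrete, here is a crude toy heuristic of my own, with made-up per-season figures and arbitrary thresholds rather than anything fit to the real data: flag a show when its audience has collapsed relative to its own peak while its ratings have barely moved.

```python
# Hypothetical per-season figures, purely for illustration.
mean_rating = [8.5, 8.6, 8.4, 8.3, 8.2, 7.9, 7.7, 7.5, 7.3, 7.0, 7.2]
mean_viewers_millions = [5.2, 6.9, 10.4, 13.3, 13.2, 11.7, 10.6, 7.8, 5.1, 3.5, 2.2]

def looks_like_zombie_show(ratings, viewers, viewer_drop=0.5, rating_drop=1.5):
    """Flag a show whose final-season audience has fallen below half of its peak
    while its average rating sits within 1.5 points of its peak."""
    viewers_collapsed = viewers[-1] < viewer_drop * max(viewers)
    ratings_held_up = (max(ratings) - ratings[-1]) < rating_drop
    return viewers_collapsed and ratings_held_up

# With these toy numbers the audience fell by over 80% while ratings dropped about
# 1.4 points, so the check flags the show as one whose late reviews I would distrust.
print(looks_like_zombie_show(mean_rating, mean_viewers_millions))
```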