Unlike many in the video game community, I don’t have fond memories of gaming as a child. It wasn’t until I was 13 or 14, when I got a PS2 for Christmas, that I got serious about gaming. Before that, video games were a mostly foreign concept to me, though I did play my sister’s Super Nintendo from time to time. We’d play Super Mario Bros., Donkey Kong Country, and Paperboy 2; it was the only time we weren’t trying to kill each other. My dad was into games back then too, playing old adventure games like Riven and Myst.
As my interest in gaming grew, so too did my interest in gaming’s history. Time and time again in my halfhearted and aimless research, I became aware of the fanboy subculture. In those days, I was a Sony fanboy, but the art of emotionally investing in a multi-million-dollar company that didn’t care about its individual customers was more refined at the height of the Sega and Nintendo wars.
Among the limited tools in that arsenal was the argument over “bits,” numbers brandished to quantify and settle which console manufacturer was better, Nintendo or Sega. Conventional wisdom says Nintendo ultimately won – they still make hardware today, while Sega is software-only – but Sega had Blast Processing! Blast Processing… how can you beat that?!
I think James “The Angry Video Game Nerd” Rolfe described the argument best, referring to it as “The Bit Wars,” a series of battles over the years fought over… well, no one was quite sure. Back when I was doing my research, a few years after the dust had settled, I struggled to find any information on what “bits” were, or what the term referred to. Apparently, I was looking in the wrong place.
The mainstream fans of Nintendo and Sega, the everyday kids with backwards baseball caps and Dubble Bubble, didn’t know what bits were and didn’t care. It was only a numbers game to them. It would be years later that I found the answer in the realm of computers. Simply put, “bits” refers to the word size of a console’s processor, roughly how much data the CPU can handle in a single operation. The more bits, the better the processor was assumed to be. Back in them olden days, Sega had their “Blast Processing” and a 16-bit beast in the Genesis. Both terms referred to the same thing, the CPU, which was supposedly better than the one in the Super Nintendo.
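For the curious, the practical difference those numbers describe is easy to illustrate. Here’s a quick, purely illustrative Python sketch of the largest unsigned value an n-bit register can hold in a single word:

```python
def max_unsigned(bits):
    """Largest unsigned integer an n-bit register can hold in one word."""
    return 2 ** bits - 1

print(max_unsigned(8))   # 255: the 8-bit NES era
print(max_unsigned(16))  # 65535: the Genesis / Super Nintendo era
```

More bits per word means bigger numbers and more data moved per operation, which is the kernel of truth the playground arguments were built on.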
The now-infamous “Blast Processing” marketing campaign has come to personify the debate. It was a semi-fabricated term, meaningless in the world outside school playgrounds. Sega did base it on “burst mode,” a real bit of computer jargon describing a device transferring a block of data in one continuous stream instead of in separate chunks. Fun fact… the Super Nintendo could do it too.
It’s kind of a real term, but it’s also useless information to anyone but hardcore tech enthusiasts. Think of it like the talk around the PS3’s “Cell Processor.” Sure, it’s pertinent information to some, but to most of us it’s meaningless, a marketing term meant to convince us how powerful the hardware is.
These days, bits are irrelevant as consoles become more and more like computers; in fact, the PS4 is only 64-bit, as is the Nintendo… oh, I forget the name of it. The “Bit Wars” more or less died when Sega dropped out of the console business and Sony came along boasting about CDs.
I bring up the old Genesis/Super NES rivalry because there’s a new trend in gaming that mirrors the old Bit Wars disturbingly well: frames per second, or FPS. It seems that before anyone on the Twitters or the Faced Books or Red Its can talk about a game’s actual content, they have to ask about the framerate. The comments under any new trailer fill up with framerate talk, and plenty of online warriors proudly attempt to boycott any game that runs at less than 60 FPS.
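For anyone who wants the numbers behind the shouting: framerate is simply frames rendered per second, so the real engineering constraint is the time budget each frame gets. A small illustrative Python sketch:

```python
def frame_budget_ms(fps):
    """Milliseconds a game has to simulate and render each frame."""
    return 1000.0 / fps

print(round(frame_budget_ms(30), 2))  # 33.33
print(round(frame_budget_ms(60), 2))  # 16.67
```

Dropping from 60 to 30 FPS doubles the time available per frame, which is why developers sometimes trade framerate for visual fidelity in the first place.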
Just like in them olden days, those rooting for the smaller number, whether it be 8-bit or 30 FPS, don’t stand a chance in any debate. The condescending “I hope this game is 5 FPS so it can look really cinematic” meme is practically gaming’s version of Grumpy Cat. The Internet has never moved faster, before or since, than when it decided that anyone who preferred 30 FPS over 60 FPS was the big bad villain who required a public execution.
How many frames per second a game runs at is hardly the be-all and end-all. Framerate is important; I’m not arguing otherwise. But letting it lead the debate over whether a game is good, whether the PC version is superior to the console version, or whether it’s worth throwing an Internet tantrum over is wrong. It harkens back to a time when the games didn’t really matter. Okay, that’s a vast overstatement; games have always mattered to those who played them, I’ll admit.
The ridiculous online commenters fighting over framerate are an extreme example, perhaps, but they exist, and theirs is the loudest voice, however small the group. There are still plenty of otherwise rational commentators and critics who put framerate and graphics above all else. What’s the first thing TotalBiscuit does when he plays a game? He goes to the options menu, talks about FOV, and mentions the framerate.
Speaking of TotalBiscuit, it’s worth pointing out that he’s started his own Steam Curator group, “The Framerate Police,” which flags games that are capped below 60 FPS. He’s encouraged his fans to tag games that are limited to 30 FPS, and there have been documented cases of those fans lashing out and attacking indie developers whose games run at the lower framerate. Maybe it’s because social media wasn’t around, but I don’t remember too many death threats being thrown around by grown men over bits back in my day.
The Bit Wars didn’t put the games first. It was about who had the better toys. It was about being part of a group that was, for some reason, at war with another for daring to hold opposing views on something that isn’t all that important. Today, the same thing happens when people argue and moan about FPS. Video games are made up of narrative, graphics, programming, art, voice acting, and sound design, and all of it factors into the overall quality. To say you won’t buy a game, or that a game is terrible, because its framerate is only 30 FPS is no different from saying you wouldn’t buy a game in 1990 because it was 8-bit.
This article may read like an old man reminiscing about a rose-tinted past that never happened, although that’s inevitable in something like this. It could very well end on the “why can’t we all go back to what matters, having fun” cliché, or the “graphics don’t matter” trope of a would-be game commentator with no leg to stand on. The truth is, fun can’t always be used as a shield, and graphics, and yes, even framerate, do matter.
What I will call for instead are two things. First, a proper big boy/girl discussion about what importance framerate holds, and hopefully the realization that we’ve had this argument before under a different banner. That includes talking with game developers and publishers and helping them understand what we want in our video games.
Second, a moratorium on gamers’ need to feel superior to other gamers, and on the willingness to use pieces of entertainment to wage proxy wars against one another. PC games generally have higher framerates; congratulations, no one cares. Consoles are easier to use and can be cheaper; again, see the previous sentence.
There are more pressing issues to worry about than framerate, just as there were more important things to worry about in the days of Sega versus Nintendo. Perhaps if we were more heavily invested in starting a conversation with developers about what we really wanted, things would be different today. We were young back then, as was the medium, so that can be forgiven. But the medium has grown up, and I’m beginning to fear we haven’t.