Crunch culture is one of the video games industry's most unethical practices, with game developers working long hours, often with no overtime pay, for weeks, months, or even years on end.

But, is there an argument to be made that crunch culture is necessary to make great games, despite how grueling it is? Let's dive in.

Some Great Video Games Have Had Crunch Culture…

If you're unfamiliar with the concept, we've got a quick explainer on crunch culture in video games for you to check out.

Simply put, crunch culture is an extended period during which video game developers work long hours, often unpaid, to deliver as polished a version of their game as they can by the deadline, under overwhelming pressure throughout. While crunch is usually not mandatory on paper, developers who don't want to crunch often risk losing their jobs to people who will.

We call it crunch “culture” because it's not just a brief, one-off period of crunch; it's a destructive lifestyle that's ingrained into workers in the video games industry. And, somehow, there are people who hail developers who crunch as “really productive,” as well as game developers who boast about crunching.

A common argument for crunch culture in video games is that it yields some incredible titles, and that these games wouldn't be of the quality they are without it.

As most AAA games involve some form of crunch—indie games too—you could give numerous examples. Stand-outs would include The Last of Us Part II, Red Dead Redemption 2 (you could choose any of Naughty Dog and Rockstar's last few games), and Halo 2, which had an especially brutal crunch.

You could say that, despite the tumultuous and unrelenting development cycle, the end result stands head and shoulders above its competition. That, when faced with a huge amount of work and pressure, the game developers have created something truly special.

So, then, isn't crunch culture necessary to make great games? Well, it might not be necessary at all.


… As Have Bad Video Games

We've seen some examples of how crunch culture creates great games, but there's the other side of the coin. Crunch culture in video games also spawns some not-so-great video games.

Games like Anthem and Cyberpunk 2077 demonstrate that, even with huge amounts of crunch, factors such as mismanagement, overambitious scope, and an inconsistent vision can still result in a poor video game.

These games underwent crunch and didn't succeed in the way their studios, publishers, and investors would've hoped. What crunch culture did here, as it does regardless of a game's quality, is cause burnout and damage the mental and physical health, family lives, and social lives of the developers who worked so hard on them.

Crunch culture, then, doesn't always lead to a great game. You can work long hours, but if those hours are spent for their own sake rather than on work that contributes to a holistic vision, then all that time spent crunching isn't productive.

Crunch Culture Doesn't Guarantee a Game's Quality

Looking at both sides of the argument, what we can confirm is that crunch, and the idea of crunch culture overall, doesn't guarantee that a game will be great. Both good games and bad games have been made under crunch.

What we can say is that, regardless of the quality of the game, crunch culture is detrimental to the mental and physical health of game developers, as well as their work-life balance and their family and social lives. The long hours often lead to burnout and tired, unproductive work, and leave some developers walking away from the industry completely. That's a high price to pay, especially if the game you're making doesn't pan out.

Looking at the three examples we gave of good games made under demanding crunch: yes, these games are arguably some of the best games ever made. And, yes, maybe crunch made their quality that bit better.

But to attribute these games' standout quality to crunch and crunch culture is to vastly underappreciate the excellent skill, work, and ideas the developers behind these projects have shown. It's to say that, if these incredibly skilled people hadn't crunched, we'd have sub-par games or games lacking in greatness, which just isn't true.

What makes great games isn't crunch or the practice of crunch culture. What makes great games is great developers.

We Can Get Great Games Without Crunch Culture

So we've seen that both great games and bad games have been made under crunch, and we've discussed that, by and large, crunch culture doesn't guarantee a game's quality. But what about games made without crunch?

Studios such as Obsidian Entertainment (Fallout: New Vegas, Pillars of Eternity I/II, The Outer Worlds), Supergiant Games (Bastion, Transistor, Hades), and Respawn Entertainment (Titanfall 1/2, Apex Legends, Star Wars Jedi: Fallen Order) all demonstrate that you don't need crunch culture to produce excellent games.

These studios and their games are examples of what developers can achieve with an incredibly skilled team and a well-managed, holistic project. Crunch culture is not a mandatory practice or mindset in this process. While people on these games most likely put in extra time or the occasional weekend, they didn't face overwhelming pressure to do so, or negative repercussions if they chose not to.

These are the developers we should support and look to as examples of how the video games industry should be operating.

Everyone Plays a Role in Crunch Culture

Crunch culture has ingrained itself in the video games industry. Regardless of whether it produces good or bad video games, crunch culture is a grueling and deeply harmful practice for game developers.

Crunch culture isn't just an internal industry problem, either; gamers feed into it too. Everyone plays a role in sustaining it, and the more we're aware of how harmful it is, the more we can do to stop it.