OPINION: Lower graphics quality in video games? Yes, please

It wasn’t too long ago that top-of-the-line video game graphics meant two white lines bouncing a ball from one side of the screen to the other. Since then, we have steadily evolved from the pixelated style of old arcades to advanced models that look increasingly indistinguishable from reality. It’s awe-inspiring, breathtaking, something that makes people giddy with anticipation over what kinds of games we might play in just a few short years.

When a new free-roam game takes the internet by storm and players rush to post screenshots of their grand adventures, you might see close-ups of a small detail that the eye would normally overlook: the leaves on a tree, the feathers on a bird or the texture of the water. Fans love to praise the effort that developers put into these minuscule things.

When I see these details, however, I sometimes can’t stop myself from looking at them in confusion. If these details are so small that they would otherwise be overlooked, why put extra effort into them?

As video games become more advanced, they need extra everything to pull off the job: a larger team, more time, more money and, to top it off, better processing power on the player’s end. A bigger environment demands a bigger effort to make it happen. Advanced consoles and PCs are increasingly the only machines that can effectively run newer games with top-of-the-line graphics, and some titles are exclusive to next-gen consoles. This, combined with the price increases that come from putting more people and development time toward advanced graphics, is partially responsible for the regular online backlash over how much video games cost. In recent years, $70 has become the normal price for big games.

However, money isn’t the only cost incurred when a game makes the most attractive visuals its top priority during development. All that effort could instead be directed toward providing more meat, more content for the game itself: more nonplayer characters in one location, more action taking place at once, more quests for players to do. That makes for a game that takes longer to complete, one a player can always come back to in their free time. More bang for the buck.

After all, video games survived for many years with graphics that we now see as dated, and those years gave us plenty of good games that have stood the test of time. It obviously wasn’t the visuals that made a classic a classic; it was the game itself. Game creators should make it a top priority to deliver an entertaining experience before one that merely wants to be easy on the eyes.

The way we play video games has advanced a lot since the arcade era, and that’s something anyone can appreciate. However, I think our priorities for what we should be advancing are misplaced. Graphics are really just something to make a game pretty and flashy; they don’t give players more of what they actually want. I would rather play a game with a rich story and graphics that can be called good enough than one that pours its effort into being flawless and visually groundbreaking.

To me, it seems like a simple trade: clunky leaves on the virtual trees in exchange for more story. Unpolished birds in the sky traded for more characters to interact with. Imperfectly rendered water if it means that a less advanced device can run the game with ease.

Why be more flashy when you can be more fun?


Ian Stash is a junior studying Journalism at the University of Arizona. In his free time, he loves video games and chilling with his cats.

