Lowering video game graphical quality compared to E3

The game Watch Dogs became infamous because its final release had drastically lower graphical quality (even at maximum settings) than what was showcased prior to launch at E3 (the Electronic Entertainment Expo). People were disappointed that the final version of the game didn't look anywhere near as good as what was shown at the expo.

The thing is, the footage shown at E3 was, ostensibly, video of real-time gameplay, not pre-rendered graphics. In other words, the developers themselves had a PC that could run the game at that graphical fidelity with a good framerate, and therefore most high-end PC gamers would likewise have had a gaming PC capable of running it. These gamers felt a bit cheated and robbed when what they actually got was visibly lower in graphical quality, even at full settings.

Watch Dogs is not the only example: Far Cry 3 and The Witcher 3: Wild Hunt did exactly the same thing. Both had pre-launch E3 demo videos with higher graphical quality than the final release.

But why? Why do they do this? What exactly is the purpose? Why show something at E3, usually real-time gameplay footage, and then deliberately lower the graphical quality?

It is not a question of making the game runnable on PCs with lower specs. That can be done with graphics options within the game: there is nothing stopping a video game from offering the full-quality graphics showcased in pre-release footage as its top settings, with lower options for slower PCs. The vast majority of PC games do this (including the three mentioned above). It's just that, for an unknown reason, these games were deliberately crippled before launch, with even the maximum graphics options available in the game lowered.
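To make that point concrete, here is a minimal sketch of how graphics presets normally work. Everything in it is made up for illustration (the setting names, values, and preset labels come from no real game's code): scaling down for weaker PCs is just a matter of shipping lower presets alongside the top one, so removing the top preset is a deliberate choice, not a technical necessity.

#include <cstdio>

// Hypothetical graphics settings; names and values are illustrative only.
struct GraphicsSettings {
    int         textureResolution;  // longest texture side, in pixels
    int         shadowMapSize;      // shadow map resolution, in pixels
    float       drawDistance;       // in meters
    bool        volumetricFog;
    const char* name;
};

// A game could simply keep the showcased fidelity as one more preset:
// the demo-level "Ultra" at the top, scaled-down presets below it.
static const GraphicsSettings kPresets[] = {
    { 4096, 4096, 2000.0f, true,  "Ultra (E3 demo fidelity)" },
    { 2048, 2048, 1000.0f, true,  "High"                     },
    { 1024, 1024,  500.0f, false, "Medium"                   },
    {  512,  512,  250.0f, false, "Low"                      },
};

int main() {
    for (const GraphicsSettings& p : kPresets) {
        std::printf("%-26s textures=%d shadows=%d draw=%.0fm fog=%s\n",
                    p.name, p.textureResolution, p.shadowMapSize,
                    p.drawDistance, p.volumetricFog ? "on" : "off");
    }
    return 0;
}

Lower-spec machines would simply select a lower entry from the table; nothing about supporting them requires deleting the top entry.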

Many have put forward the hypothesis that this is done deliberately to bring the PC version graphically closer to the console versions of the game.

There's no way around it: consoles always have lower specs than the highest-end PCs. (For example, the PlayStation 4, essentially the highest-powered game console currently on the market, has a graphics chip roughly as powerful as an Nvidia GTX 4xx or 5xx series card, which is laughably old. Even cheap modern mid-range Nvidia cards are twice as fast as that.)

The hypothesis is that they don't want the console version of the game to look drastically worse than the high-end PC version, which is why they cap the PC version so that, even at maximum settings, it looks the same as the console version.

If that's true, the question still remains: why? Maybe it's some kind of marketing tactic (misaimed or effective, who knows).

It really is kind of a dick move to first tease gamers with awesome-looking graphics and then take them away, with no way of ever getting them back.
