Is patching making games worse?

In the run-up to Christmas we’ve seen the release of a slew of AAA titles – the multi-million-dollar blockbusters of the gaming world. But there has been a bit of a theme: disappointment.

Perhaps most famously, there was the release of Assassin’s Creed: Unity, which was riddled with bugs – including the notorious missing-face glitch. Now another Ubisoft title, The Crew, has launched with a bucketload of problems unsolved – including, rather unfortunately, the ability to actually join a crew.

Ubisoft isn’t the only company having problems, either. Sony’s recent release of Driveclub was beset by issues that only a hefty 1GB+ patch, released more than a month late, seems to be starting to fix. Ubisoft too have promised to fix their games with update patches.

Back in June, fans of Moto GP were left frustrated when roughly half the game appeared to be missing at release – it took a substantial patch later on to add in entire game modes.

These are not isolated examples either – just particularly egregious ones. Pretty much every major game these days will start downloading a “Day 1 Patch” when you put the disc in your console on release day – the build on the disc inevitably being several months old.

There is a positive case to be made for patches: they let games come to market quicker, as developers can keep coding and testing right up until launch (and indeed, after it), and crucially they can fix broken parts of a game or add new functionality after release. So far, so great – but could the very fact that patching is now possible mean that developers feel safer releasing unfinished games?

Consider how it worked back in the old days: developers would spend months or years building and testing a game, and once it was pressed onto a cartridge or disc and shipped off to the shops, that was it – nothing more could be done, so it was essential that the game be perfect.

Now, though, the situation has reversed – and developers can afford to be lazy. “We can always patch it later”, they can say – leaving the one guy who lives on an oil rig, and doesn’t have his console connected to the internet, to make do. Far from being a means to enhance the quality of games, patching has resulted in a bit of a mess.

Heck, perhaps most offensive of all is the increasing trend for developers to outsource bug testing to you and me by running “public betas” of new games. Sure, it might be exciting to think you get to play a bit early, but the experience you’re getting is sure to be imperfect.

Imagine if Joss Whedon planned a “public beta” of Avengers: Age of Ultron – inviting a select few tens of thousands of people to see the film next year, before its release. Sure, it might be fun to see what Iron Man is up to, but wouldn’t it be a bit rubbish to see the Hulk portrayed by a bloke covered in ping-pong balls?

We wouldn’t accept half-finished films, so why do we so readily accept half-finished games? Let us know what you think in the comments.

James O’Malley
For latest tech stories go to TechDigest.tv