Patch It Later, It’ll Be Fine

For a long time now, games have been released before they meet a commercial launch standard. It continues to happen today, though it is mystifying why it happens at all. I am reminded of this practice by two games I've played recently that, for different reasons, did not launch to the public in any kind of playable condition.

Before moving on to the two games in question, I would like to ask you: is patching really an acceptable excuse? I've lost count of the times I've seen it used as a defence by people. I say people, but I mean fans. It's one of those go-to automated responses, blurted out without thinking through a better argument. Would it work in real life? "We're sorry that dam didn't work as promised, but we'll patch that hole up as soon as we can."

That example was totally unrealistic (as if a company would ever apologise). In my own way I've fallen back on the excuse as well, such as suggesting in my review of Scott Pilgrim: The Game that if a patch were to add online co-op play, the game could score an extra mark. It wasn't fandom that made me suggest it; I genuinely thought the game would deserve that extra point if the feature were added. But I overlooked how serious it was that the feature should have been there from the very start. That game really needed online co-op, and even if it were added later via a patch after every reviewer and player pointed out its absence, Ubisoft would deserve no kudos for capitulating to their opinions.

It is true that patching can remove errors, glitches and any other tiny hiccups that may have slipped through the QA testing phase, but at what point does it stop being a genuine mistake and start being something else? Laziness, perhaps? I once knew someone who play-tested games. It's one of those jobs that, as a kid, you imagine would be the best thing in the world, but when you actually get the position you find it's just as mind-numbing as any other nine-to-five. You sit there in a darkened room, half-eaten pizza from the last three days piled on one side, a pile of papers on the other, a bottle of pills between your legs, as you run your character into every wall, corner, and crevice trying to find a bug that slipped by the programmers. Given basic human error and that kind of bleak working situation, it is natural that things would be missed, but that's why games heading for release are tested by whole teams. Ever paid attention to the end credits of a game and seen just how many people apparently tested it? Perhaps this is one of those situations where we should take down those names, find their homes, and teach them about proper quality assurance while they can't possibly escape.

Alright. Sometimes little bugs can be funny.

When I was at university I had to take an extra course one semester to make up the points I needed to get into the next year, and decided on a brief class in moral philosophy. Don't doze off, I'm going somewhere with this. I remember one case we discussed in which an unnamed software giant (Microsoft) knowingly released a program to the market with bugs and glitches that would be fixed at a later date. We had to debate whether this was the right or wrong thing to do, from the angle that if they had held it back from release to get it perfect, costs would have spiralled out of control and resulted in job losses.

Back then I remember thinking that job losses seemed like a rather extreme, and somewhat unrealistic, consequence of a little extra QA. My cynical side would jump straight to laziness as the real reason glitches make it to market, but I think that's probably wrong as well, and the answer lies somewhere in the middle.

Take movie tie-in games. They are almost always terrible because they are rushed through production to meet the release date of the film, yet strangely most (that I've played, at least) were terrible for almost every reason except being bug-ridden. Pressure from the publisher on the developer to get things out quickly certainly sounds like a good reason for things to slip through the net, and is probably as close to the reality of the situation as we will get in most cases. There are always exceptions, however.

The first game I wanted to mention is Final Fantasy XIV, a game which never should have been released in the condition it was. It wasn't so much bugs or glitches that ruined it as a whole (though they were present) as the arrogance of a developer that chose to ignore the thousands of testers they invited into the beta. Even though this doesn't directly relate to crippling glitches, it is relevant for a couple of reasons.

Square Enix has (silently) acknowledged that they got it wrong by giving players an additional 30 days free on top of the days included with the box purchase. They also began spitting out upcoming patch details and yelling about them from the top of their headquarters at anyone who'd listen; unsurprisingly, those details included solutions to the main grievances players had raised (such as poor stability and no PC optimisation) as far back as the earliest stages of the beta. Granted, an MMORPG is a special case and constant patches adding and changing content are the done thing, but the interesting point here is that the players were ignored.

Part of the problem is that even if you have a play-tester group that numbers a dozen or two, you still can't account for every action or method a player will use to achieve an objective. For linear games this isn't so much of a problem, since choices are artificially limited (which makes it all the more irritating when a linear game does come with glitches and bugs at no extra cost – but I digress), but what about large, open-world games?

Fallout: New Vegas is one of the most shockingly poor builds I have ever seen declared gold and sent into production. Even ignoring the stability issues which have always plagued Obsidian's engines, it is astonishing to ship a game with glitches as numerous as these: rooms you can't exit once you enter, extended arms with floating guns, weapons which become giant red exclamation marks if you add custom parts to them (as seen in the video above), plot errors which result in game-breaking scenarios, constant freezes and general weirdness.

The lowest, most basic QA phase should have caught (and perhaps did catch) most, if not all, of the larger glitches that made it through. But no. We, the players, are now the play-testers. We go staggering forward, reporting glitch after glitch so that future generations a few months from now will enjoy a more stable experience. We need a memorial to the price we involuntary testers paid; I'm thinking a statue of the Fallout mascot with geeky glasses and a ripped shirt, hurling a game pad through an Obsidian office window.

That aside, who is really to blame? I suppose that's what I want to know. Were the testers lazy? Did they in fact find most of these problems, only to be ignored by a developer that thought it knew better? Was the publisher pressuring the developer to get it out the door no matter the condition? Were the developers in league with Satan and simply unconcerned that a glitch-ridden product was going out on the shelves? Was this all a conspiracy to get on my nerves?

Not that I need to underline the point any further, but in this specific case, if New Vegas were Dogmeat, the Lone Wanderer of Fallout 3 would be dragging it behind Megaton's grimy outer shell, shotgun in hand and tears in his eyes.


Written by Ian D

Misanthropic git. Dislikes: Most things. Likes: Obscure references.

5 comments

  1. John Sotckton /

    You’re stupid.

  2. KrazyFace /

    LOL at the comment above me!!! Care to elaborate John?

    Anyways, I totally agree with this whole article. That video you found looks like one of the grunts from MGS got such a fright from a moving just-a-box that he lost his "!" mark! Haha! Yeah, I don't think we need play testers anymore, coz some of the shocking bugs and glitches left in games for the public should be embarrassing to the developers… but they're not. The unfortunate reason for this, I fear, is that WE (gamers) just don't complain enough about it. In the case of Fallout New Vegas they decided to…

    ***** SPOILER ALERT *****

    … give us the Chinese stealth suit to wear from a previous DLC pack, BUT it's just a basic suit that does nothing at all. Why is this? Because they thought that giving the player a constant "stealth field" would be a game-breaking advantage. I'm sorry, but after my first hour with the game, I found my rescued deputy with his head jammed in the ceiling of the casino, his legs dangling around like a half-eaten gator lunch. And they're worried a SUIT might ruin a player's experience!? I WANT MY STEALTH SUIT DAMMIT!!!!

    • That video I made myself, although I wasn't the first to try sticking a weapon mod on the pre-order bonus weapon.

      I’ve heard a lot of people complain about what they did to the Stealth Suit, but I can understand why they did it. What I don’t understand is why they made Sneak pretty much pointless.

      On the few occasions you can use it to your advantage beyond a single shot (which is itself pointless, since even if you one-shot an enemy in New Vegas there are always plenty more to kill), you might as well just use a Stealth Boy, which gives you 100 Sneak.

      • KrazyFace /

        Wait, what!? They made Sneak pointless? What do you mean? I've only just started New Vegas and Sneaking is 80% of how I play. Are you telling me I'm going to HAVE to be an "all guns blazing" kinda guy?

        • I prefer to play that way as well, and yes, I found it pointless. The basic flaw is that there are a lot more enemies and they are often packed close together. Even ignoring the Stealth Suit, you can't sneak into a place and pick off guards – you pick off one, then every other one in the place comes straight for you.
