Friday, July 2, 2010

Why PC Gaming Sucks


I'm kicking off my new blog (that I will probably use for a month before neglecting) with a post about why I don't play PC games anymore.

I stopped playing PC games about 5 years ago. A lot of what I'm writing about in this post is based on my gaming experiences from then, but it still holds true.

The last game I really played on my computer was Half-Life 2. I had just bought a brand new system with a new video card, the works. I installed HL2 and started playing. The framerate was just OK: choppy in parts, but playable. The game also crashed a lot. Having played games on a PC for years, I was used to this sort of thing.


Purchased in September of 2005.

I can play a console with less hassle. Much less.

I gave up on Windows a while back. I'm more productive on Linux, so that's what I use for my operating system. Unfortunately, there aren't many games that run on it. I'm not into dual booting either. Too much rebooting every time I want to switch between tasks.

The last version of Windows I really used was XP, but I'm pretty sure Windows Vista and 7 have most of the same problems, because I hear friends complaining about them all the time and asking me how to fix them. How many times have you been playing a game when it craps out and boots you back to Windows, or worse, locks up your system completely and forces a reboot?

This seems to happen WAY too often, and the fix is always something stupid, like "upgrade your video card drivers to version XX" or "make this change to your registry" or some other bullshit. Granted, it's not always Windows' fault either; it could just as easily be a bug in the game, a hardware issue, or something else.

Anyways, it's happened to me countless times on a PC, while rarely ever being a problem with any of the consoles I've owned over the years.


Clippy shows up on Spock's viewer, followed by a blue screen.


Sitting at a computer desk sucks.
I'm already tied to a desk 40+ hours a week at my day job. I'm also on the computer a lot at home, surfing the internets, coding, writing e-mails, whatever. If I want to play games on my PC, I might as well install a shitter in my computer chair because it means I'm going to be spending most of my time there. With a console, I can lay back on the couch after a night of drinking and play Metal Gear Solid on a big screen. Granted, I could move my PC into my living room, but then I'd have to sit on the couch to do everything else that I use my computer for.

This is me levelling up my level 46 female Night Elf!


Expensive hardware upgrades.
When a new game is released, there's always a chance you're going to need to upgrade just to play it. It's an endless cycle. The other option is to lower the resolution, turn off lighting effects, disable the sound, or make some other half-assed attempt to squeeze an extra 3 frames per second out of the thing.

Yes, I understand that technology advances every few months, and newer games have steeper system requirements because their engines are doing a lot more. But in some cases, I think developers have just gotten lazy: instead of optimizing their game to run on a wide variety of hardware, they develop only for the latest and greatest tech and assume everyone will upgrade just to play it.

The result is that someone with a system that's only one or two years old either can't play the game, or has to fuck around and disable everything to get it running decently. Better shell out $200 for that new ATI card, and maybe some more RAM too. At least with a console I only have to upgrade every 5 years or so.

You can either disable 'splosions, or buy a new computer.


DRM makes me not want to buy your shitty product.
Lately there's been some real crappy DRM implemented in PC games in an attempt to curb piracy. The latest I've read about is Assassin's Creed 2's DRM, which requires a constant internet connection for the game to run. Apparently, if your connection drops out while you're playing, the game stops and you lose all of your progress. More about it here: http://www.computerandvideogames.com/article.php?id=235290


There's nothing stopping this from being implemented in a console game, but right now the focus seems to be on the PC because those games are so easily pirated. After I've purchased a legit copy, I shouldn't have to enter a 32-character activation code in order to play it. I shouldn't need an internet connection to "validate" it, or only be allowed to install it a limited number of times. I own it, and therefore I should be able to use it freely.

Crippling software with any sort of DRM is asinine. It only serves to piss off legitimate customers, who might decide to torrent the software instead.

Instead of installing a rootkit on my machine, just make me watch this on repeat during the installation.

PCs are for FPSes and World of Warcraft
Many PC games are available on consoles as well. I can't think of any PC exclusives that I'm really interested in playing. Diablo 3, maybe. Team Fortress 2, just because I have friends who play it on PC.

Just one example of a potentially great game that I'm missing.

That's all I have. Next time someone asks why I don't play PC games, I'll point them here.

