
Putting Up With Consolitis

Soulskill posted more than 3 years ago | from the my-keyboard-doesn't-have-a-trigger dept.

PC Games (Games)

An anonymous reader tips an article about 'consolitis,' the term given to game design decisions made for the console that spill over and negatively impact the PC versions of video games. "Perhaps the most obvious indicator of consolitis, a poor control scheme can single-handedly ruin the PC version of a game and should become apparent after a short time spent playing. Generally this has to do with auto-aim in a shooter or not being able to navigate menus with the mouse. Also, not enough hotkeys in an RPG — that one’s really annoying. ... Possibly the most disastrous outcome of an industry-wide shift to console-oriented development is that technological innovation will be greatly slowed. Though a $500+ video card is considered top of the line, a $250 one will now play pretty much any game at the highest settings with no problem. (Maybe that’s what everyone wanted?) Pretty soon, however, graphics chip makers won’t be able to sustain their rate of growth because the software is so far behind, which will be bad for gamers on consoles as well as PC."


369 comments

Nintendo Thumb (2, Funny)

zonker (1158) | more than 3 years ago | (#35134642)

Here I thought this was going to be about Nintendo Thumb.

Re:Nintendo Thumb (1)

Anonymous Coward | more than 3 years ago | (#35135186)

At first I read that title as "Putting Up With Clitoris". Look it up, Slashdotters. It's a word you don't know, for something you'll never see.

Re:Nintendo Thumb (4, Insightful)

crafty.munchkin (1220528) | more than 3 years ago | (#35135564)

I'm firmly of the opinion that some games need to be played on a console, and others need to be played on a PC. Porting one type of game to the other platform means the end result on the ported platform is a poor imitation of the original.

Thirsty for a firsty (0)

Anonymous Coward | more than 3 years ago | (#35134644)

Who cares?

What...? (2)

Windwraith (932426) | more than 3 years ago | (#35134672)

If anything games are becoming more like computer games overall. Traditional console RPGs look more like MMOs now, games require patching and even have DRM...a few quirks introduced by lazy companies that do lazy ports don't make "consolitis".

Re:What...? (3, Insightful)

nschubach (922175) | more than 3 years ago | (#35134768)

Heck, "Consolitis" has an effect on LCD Displays. Anything over 1080 horizontal lines is getting harder to find every day. I feel like things are going backwards for PC displays.

Re:What...? (2)

sortius_nod (1080919) | more than 3 years ago | (#35134968)

This is true. I have been keeping my 24" Samsung as my main screen because of its 1920x1200 resolution (my 2nd screen is an Asus 24" 1920x1080).

I think there's room for both consoles and PC games, as there always has been; it's just that some gaming companies are getting lazy and going console-style for everything. Sure, it works with things like DCUO - I use my 360 controller with that and it works great - but in other games it's just terribad.

The only way this will change is if people refuse to buy bad console-to-PC ports; then the designers will have to actually do the hard yards and create the PC version with PC players in mind.

Re:What...? (3, Insightful)

Spad (470073) | more than 3 years ago | (#35135288)

That's more to do with the fact that all the LCD production lines are churning out huge numbers of 16:9 panels for TVs at 720p and 1080p, so it makes sense to do so for PCs as well (made easier in no small part because the public have now been successfully sold on the idea that 16:9 = HD = Better than a 4:3 monitor somehow).

I managed to track down a pair of 21" 4:3 LCDs that do 1600x1200 for my PC, and I will hold onto them as long as humanly possible because I know it's going to be extremely hard to get a decent-sized 4:3 replacement in a few years' time. 16:9 for a PC is just a massive waste of screen space for most things, because 90% of apps and web pages are designed, if not with 4:3 in mind, then to support 4:3, so you end up with letterboxing at the sides all the time.

Re:What...? (3, Informative)

Chas (5144) | more than 3 years ago | (#35134770)

Take a look at Champions Online and its follow-up, Star Trek Online.

The engine was jacked around with to specifically enable a console port that was to be released simultaneously.

Then the developer realized that a console port was going to be unsupportable and simply COULD NOT give the flexibility necessary.

Boom, console port went away. But by that point, all the console-specific stuff was so firmly embedded into the system that it couldn't be excised.

So what did we get with CO and STO? A pair of SEVERELY half-ass MMOs that were little more than button-mash-fests.

Re:What...? (1)

hitmark (640295) | more than 3 years ago | (#35135030)

Explains why CO has an option to play with a gamepad (though such an option also shows up in DDO).

Re:What...? (1)

SuricouRaven (1897204) | more than 3 years ago | (#35135110)

Also contrast Deus Ex, a very highly respected PC FPS, with its far less respected sequel, Invisible War. The latter was made for consoles, and it shows.

Re:What..?(how this flamebait of TFA got through?) (2)

sznupi (719324) | more than 3 years ago | (#35134916)

It's just what we wanted. Yes, we, that almost certainly includes you. Remember those times when we were playing our precious games, misunderstood by surroundings? When we wished they would try, and understand?

Guess what - it happened! Be happy. Games are now made for general consumption (which also impacts traditional console games: many characteristic genres have almost disappeared, the possibilities of controllers are underutilized, presentation is not what it used to be, and even UIs often forget that scrollable nested menus are what works in this world). What the author whines about are hybrids - probably helped along by the Xbox and the very universal SDK it brought... and BTW, the way virtually every "PC magazine" from a decade ago marveled for some reason at the idea of a "PC console" (while openly shunning "old school" ones) looks even funnier now.

The general population also doesn't like constant upgrade cycles, BTW. But... no technical innovation? How hard would you have had to try to miss the recent ruckus about Kinect?

And overall, I really don't know what the problem is. Sure, a lot of games "sux"...but even when limiting myself to games of the past, I'm pretty sure I would have good things to play for the rest of my life.

Re:What..?(how this flamebait of TFA got through?) (2)

Windwraith (932426) | more than 3 years ago | (#35135252)

Don't generalize; I never wished for such a thing.
I just wanted, and still want, to have fun, period. All my friends and I have been gamers since the 80s, and most of us find the current game generation to be horrid outside of indie gaming or rare gems.

Re:What..?(how this flamebait of TFA got through?) (1)

sznupi (719324) | more than 3 years ago | (#35135402)

Well, generalization (etc.) is the force pushing the market in one direction or the other... and I remember quite a lot of young gamers being fed up with how many people "don't get it".

(BTW, return sometimes to those games from the 80s - and not only the ones you remember fondly, but a similarly broad selection to the one you denounce among current games... and suddenly the latter won't look so bad in comparison.)

Re:What...? (1)

hairyfeet (841228) | more than 3 years ago | (#35134960)

I look at it as a "some good, some bad" kinda thing. On the good side, whereas I used to have to blow a couple of hundred every year and a half just to have decent framerates, I've been playing for nearly 3 years now on an HD4650 with decent FPS, and will be getting an upgrade* to a 4830 from my GF (along with a nice home-cooked dinner and a weekend of wild monkey sex, yay me!) in a couple of weeks for my BDay. So cheap game playing (along with the cheaper game prices) I would put as a plus.

On the downside is what TFA is talking about, but frankly I've found the biggest offenders were shitty games anyway, so that is just like the rotten cherry on the shit sundae. For example I picked up turning point:fall of liberty in the bargain bin and it would have actually been enjoyable if it weren't for the fact that without an X360 controller the thing is completely unplayable. It is obvious they just tacked on PC controls without actually seeing that they worked, which of course they don't. But then again I've found plenty of bargain titles with GREAT controls, like the "Just Cause" series, which if you haven't tried it...imagine you mixed GTA and being a superhero, that's the only way I know to describe it. Stealing a chopper in midair is too much fun!

But considering how many titles we PC gamers have to choose from, dealing with the occasional shitty port seems like a small price to pay. Between Amazon and GOG I've got more games than I know what to do with, and I don't have to spend crazy money just to have some crazy fun... ehhh... we don't have much to complain about really. It ain't like the old days where you'd better break out your wallet if you wanted more than a slideshow from the current games, which I'll take over the occasional console port suckage any day.

*Yes, if it were me I'd have bought a 5770 for around $120, but my GF has had it rough at work lately and I wanted to make it a cheap gift from her. I told her I'd be happy with her naked wearing a ribbon, happy BDay Mr. President optional but appreciated, but that is my bonus gift apparently and she wants one that is presentable in front of my family, hence the 4830.

Re:What...? (1)

Gaygirlie (1657131) | more than 3 years ago | (#35135510)

Traditional console RPGs look more like MMOs now, games require patching and even have DRM

That is simply a product of improvements in technology: before, if there was a serious bug or flaw in a console game, there simply was nothing you could do about it; you just had to live with it. Now you can actually do something about it, patch things up and so on, and sometimes even provide an extra feature or two. DRM is a by-product of consoles being network-connected nowadays: the companies try to keep people from cheating in online games, and I can verily understand that. If you've ever seen, for example, someone shooting through several walls in CS or something, you know what I'm talking about. Then there are DLCs: they're a popular way nowadays for companies to squeeze just a few more quid out of gamers, and there simply is no way to make DLCs work without having both patching and DRM.

I'm mostly saying that having the ability to patch games, even when it might sometimes annoy a little, is a big benefit over having no such ability at all, and that certain kinds of DRM are quite understandable.

a few quirks introduced by lazy companies that do lazy ports don't make "consolitis".

A few quirks, you say? Lately all the games I've played have clearly been designed with a console in mind. Some of them make the transition to PC fairly well -- a good example is Batman: Arkham Asylum, an absolutely awesome game, noticeably created for consoles but working totally well on PC too -- and some make the transition fairly poorly: I just recently played through Mass Effect 1 and that one is surely an example of a poor transition. Menus worked so-so; sometimes you had to use the mouse to press a button, sometimes the space bar, sometimes enter, and some things didn't even work at all. And since I like to have my back and forward actions mapped to the left and right mouse buttons, I ran into problems when trying to use my armored vehicle: it's hardwired to shoot with the right mouse button. There were plenty of other issues too, too many to list them all here.

Just to put it shortly: many games designed for consoles have plenty of actions/keys completely hardwired and then when the game gets ported to PC those hardwired things are completely forgotten. Then you run into all kinds of illogical input issues, navigating menus is often a whole puzzle of its own and so on. It's not really what I would call "quirks" since those things often hamper playability and enjoyment quite a lot.

anonymous reader = blog owner (-1)

Anonymous Coward | more than 3 years ago | (#35134674)

wow you got your gay blog featured in a /. article

congratulations fuckwit

Wider release cycles (1)

exomondo (1725132) | more than 3 years ago | (#35134680)

Seems more like major revisions will come in line with console generations; this doesn't necessarily mean the pace of innovation will slow, just that releases will be further apart.

Re:Wider release cycles (2)

exomondo (1725132) | more than 3 years ago | (#35134700)

In fact it's likely to be a good thing, programmers will need to make the most of current hardware rather than skipping out on optimisations just because they know new faster hardware is always around the corner. Just look at the way the graphics quality of games on consoles increases over the lifetime even though the hardware stays the same.

Not first post, but... (0)

bennomatic (691188) | more than 3 years ago | (#35134686)

Certainly a first-world problem. Boo hoo.

Re:Not first post, but... (1)

Brucelet (1857158) | more than 3 years ago | (#35134858)

Seriously? You have a problem with an article about video game design on Slashdot?

Re:Not first post, but... (2)

Permutation Citizen (1306083) | more than 3 years ago | (#35135506)

Who (except graphics card manufacturers) regrets the time when you had to buy a super-expensive video card to play recent games? Of course everyone wants to run their games on the cheap PC they have.

Personally, I prefer game graphics to have great artistic design, instead of higher resolution.

Why do I think that COD: Black Ops (1)

Rooked_One (591287) | more than 3 years ago | (#35134696)

was in the author's mind when he wrote this? It was such a letdown compared to all of the other CODs.

I really feel where the author is coming from because of all the games you hear that "are awesome" on the consoles. You try them on the PC and they are just horrible. Jittery lag, poor graphics, horrendous controls, and the list goes on and on.

Re:Why do I think that COD: Black Ops (0)

socsoc (1116769) | more than 3 years ago | (#35134740)

Jittery lag, poor graphics, horrendous controls, and the list goes on and on.

Ah, so I see you've played it on PS3 too...

must be a whiny day on Slashdot (0)

Anonymous Coward | more than 3 years ago | (#35134704)

First the article about bloatware, then this?

The reason the world isn't moving in the direction you want is that there aren't enough of you spending money on things you like. That's not an indication of the world moving in some wrong direction.

patching (0)

Anonymous Coward | more than 3 years ago | (#35134710)

Well, the shit flows both ways. One generation ago (PS2) there was no console game patching - so developers had to finish making the game before shipping it. Now console gamers have to put up with Fallout 3 and New Vegas - "here's the beta, maybe we'll fix it later" shit.

Screen resolution drives video card performance (3, Interesting)

Revvy (617529) | more than 3 years ago | (#35134714)

Video cards push pixels, and the number of pixels has stalled in the last couple of years. 1920x1080 is the norm, and there appears to be no push to go higher. I read a great rant [10rem.net] last year that effectively summed it up. You can't blame console games for the fact that PC gamers have screens with the same resolution as their TVs. Blame either the manufacturers for failing to increase pixel density or consumers for failing to demand it. You've got to go to a 30" monitor to get a higher resolution, and the price of those beasts scares most people away. Why pay $800+ for a 30" when a pair of 24" 1080p monitors costs half that?

----------
Still waiting for my in-retina display.

6K gaming is not uncommon (0)

Anonymous Coward | more than 3 years ago | (#35134794)

Monitors are cheap, so a 3-monitor Eyefinity setup on Windows 7 is used by many, and held as the next upgrade by more gamers. For that setup to be of any use, you've got to have 3 monitors (duh!) and a good gfx card. Some people are prevented from doing that upgrade because they have a power supply, CPU or OS that doesn't allow it.

Re:Screen resolution drives video card performance (-1, Redundant)

Rosco P. Coltrane (209368) | more than 3 years ago | (#35134796)

1920x1080 is the norm, and there appears to be no push to go higher.

Tell that to my Dell 30" monitor and associated Nvidia 480GT. They seem to be displaying at 2560x1600 just fine.

Re:Screen resolution drives video card performance (2)

socsoc (1116769) | more than 3 years ago | (#35134816)

Did you stop reading GP halfway through?

Re:Screen resolution drives video card performance (1)

markass530 (870112) | more than 3 years ago | (#35134890)

No, I'm pretty sure he didn't read any of it at all. "1920x1080 is the norm" -- he counters with a ridiculously expensive Dell 30".

Re:Screen resolution drives video card performance (1)

Gordonjcp (186804) | more than 3 years ago | (#35135044)

Most small cars top out around 120mph and there appears to be no push to go faster. But, just you tell that to my Ariel Atom V8. Oh wait, that's around 15 times the price.

Re:Screen resolution drives video card performance (1)

antifoidulus (807088) | more than 3 years ago | (#35134852)

There's another party to share in the blame game too, OS makers. It's 2011 and we still don't have a truly resolution independent operating system (or flying cars, but that's another rant). Gamers are only a very tiny subset of the people who buy monitors, so very few manufacturers are going to cater to their needs expressly. Unlike gamers, normal users aren't really clamoring for denser monitors because their software doesn't play well with "unexpected" pixel densities, i.e. everything gets smaller.

We are still about as close to a truly "resolution independent" OS as we were 5 years ago, when Apple started saying that their next version of OS X was going to be resolution independent, only to quietly drop the feature, tout it again for the next release, then drop it again, ad nauseam. As far as I know, Windows and the various *nix window managers aren't even as close as Apple, so you may have to wait a while for your higher-density monitors.

Re:Screen resolution drives video card performance (2, Insightful)

Anonymous Coward | more than 3 years ago | (#35134932)

you might want to update your argument to this decade. Windows 7's DPI support is close to perfection. of course, this assumes you are rating the operating system, and not the flawed applications which run on it.

Re:Screen resolution drives video card performance (0)

antifoidulus (807088) | more than 3 years ago | (#35134948)

To be fair, I did say operating system, Windows doesn't really count as one of those. Plus, you have to use windows.

Re:Screen resolution drives video card performance (0)

Anonymous Coward | more than 3 years ago | (#35135002)

I don't think anyone actively likes Windows, but most of us are forced to use it in one way or another. so its feature support is still fairly important.

and how does Windows not count as an operating system? I hope you're not one of those guys who insists that kernel == operating system.

Re:Screen resolution drives video card performance (1, Informative)

Omestes (471991) | more than 3 years ago | (#35135146)

I don't think anyone actively likes Windows

Actually, I do like Windows 7. It's a first; the last time I came close was "grudgingly respecting" XP. Trying to get Linux to play nicely on my HTPC made me love Windows 7 (holy cow, Flash actually works! Imagine that, in 2011!), and even be somewhat nostalgic for the hog that is Vista SP2.

I still hate Microsoft... But with Win7 they actually proved that they can make quality software. It would be nice if it was OSS, or done by anyone other than MS or Apple... but... oh well.

Re:Screen resolution drives video card performance (1)

Anonymous Coward | more than 3 years ago | (#35135010)

You might want to update your argument to this decade. Windows as a shell-on-top-of-DOS has been completely gone for around 10 years. It's very much so an operating system.

Re:Screen resolution drives video card performance (0)

Anonymous Coward | more than 3 years ago | (#35135058)

seeing someone else use my own words made me realize how much of a douche I sounded like. I hereby apologize to antifoidulus.

- AC from three posts upwards.

Re:Screen resolution drives video card performance (1)

VortexCortex (1117377) | more than 3 years ago | (#35135094)

There's another party to share in the blame game too, OS makers. It's 2011 and we still don't have a truly resolution independent operating system

Nah, the graphics engines of games don't balk at high-res displays; they shouldn't, anyhow...

The OS has nothing to do with it. You can select font sizes for OS text in XP...

It's quite simple: you select a resolution, derive an aspect ratio, create a perspective transform, and presto, all 3D games can run at any resolution. Sure, you'll run into performance problems with lower-end (including console) hardware that doesn't support newer higher-res displays, but that's because those machines have limited processing power...

When you're talking 3D, resolution is something that only post-processing or per-pixel shaders have to deal with, not the OS; even old games can deal with uber res if they're coded correctly.
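
To make that concrete, here's a minimal sketch of the "derive an aspect ratio, create a perspective transform" step. GLM is assumed for the matrix math (the poster names no library, and any matrix code would do); the FOV and clip planes are example values:

    // Minimal sketch: derive the aspect ratio from whatever resolution the
    // player picked and rebuild the perspective transform from it. Nothing
    // in the renderer needs to know the display's "native" size.
    #include <glm/glm.hpp>
    #include <glm/gtc/matrix_transform.hpp>

    struct Projection {
        glm::mat4 matrix;
        float     aspect;
    };

    // Call whenever the player picks a new resolution (or the window resizes).
    Projection makeProjection(int width, int height, float verticalFovDeg)
    {
        Projection p;
        p.aspect = static_cast<float>(width) / static_cast<float>(height);

        // Same vertical FOV at any resolution; wider screens simply see more
        // to the sides. Near/far planes are arbitrary example values.
        p.matrix = glm::perspective(glm::radians(verticalFovDeg),
                                    p.aspect, 0.1f, 1000.0f);
        return p;
    }

    // makeProjection(2560, 1600, 60.0f) and makeProjection(1280, 720, 60.0f)
    // feed the same renderer; only the per-pixel cost changes.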

Re:Screen resolution drives video card performance (0)

Burning1 (204959) | more than 3 years ago | (#35134942)

Sorry, but why should I care about going above 1920x1080? Honestly, the only reason I would run my games at that resolution was because it's the native resolution of my monitor. When I had a CRT, I'd often play games at 1280x1024. High enough resolution for everything to appear clear, but low enough to maintain reasonable frame rates on a system that cost less than $1000 new.

IMO, the only reason to go higher than 1080p is when you're sitting up close to the aforementioned 30 inch display.

Re:Screen resolution drives video card performance (1)

cbope (130292) | more than 3 years ago | (#35134970)

Screen resolution may not be increasing by much these days, but that does not mean graphics capabilities and image quality are not improving. Higher and better levels of AA, anisotropic filtering, tessellation, increased geometry of models and world objects... all of these require more graphics card horsepower. Look at Crysis, even today... years after its release, there is still not a single GPU capable of pushing 60fps when running at the native resolution of a 30" panel on enthusiast settings. Not a single card.

Consolitis is real, and it's starting to turn me off of once-great PC gaming. Poor controls are probably the worst offenders, and some of the biggest games suffer from them, Bioshock 1 & 2 being prime examples. I was appalled at the mouse control in Bioshock 2 when I started playing it recently. It was horrible out of the box; laggy and unresponsive or hyper-aggressive were the only possible settings in-game. Even with some hacks to the ini files to calm down the mouse, it's still far from ideal. I read a preview of Dungeon Siege III yesterday in PC Gamer - still in alpha, but anyway, it had no keyboard/mouse controls at all! Only gamepad. This is a game being developed cross-platform, and they have a playable game state and still no working keyboard or mouse control on the PC!

I have reached the point where, when I hear an upcoming game is going to be cross-platform, I have to immediately lower my expectations for the game in the areas of controls and graphics, because it will obviously be dumbed down to run on consoles. Please bring back the good old days of games developed FOR the PC, not shooting for lowest-common-denominator consoles. Crappy ports from consoles have become the norm for most AAA titles in the last few years.

Re:Screen resolution drives video card performance (0)

Anonymous Coward | more than 3 years ago | (#35135116)

What about 3D (nvidia vision)?

Re:Screen resolution drives video card performance (1)

Jupix (916634) | more than 3 years ago | (#35135396)

Why pay $800+ for a 30" when a pair of 24" 1080p monitors costs half that?

Vertical resolution, PPI and having no bezel in the center of your display.

Re:Screen resolution drives video card performance (1)

Krneki (1192201) | more than 3 years ago | (#35135558)

Don't forget TV LCDs or plasmas; the prices are much better than monitors'.

But I don't see any point right now in higher resolution; on my 42" plasma I can't see the pixels, and I play 1m away. I do use 2xAA, though.

P.S: I use a dual monitor setup, so I still have my old LCD for everything else.

Re:Screen resolution drives video card performance (0)

Anonymous Coward | more than 3 years ago | (#35135580)

There's a very simple reason for that: People get older. You wouldn't believe how many people I know who set their display resolution to less than the physical resolution just so that everything is bigger. I've considered recommending 32" HDTVs as monitors to them. There's simply no point in giving them more pixels. They can't see the ones they have.

How is any of this bad? (3, Insightful)

ynp7 (1786468) | more than 3 years ago | (#35134720)

So you're complaining because you can spend a relatively modest sum to play any game that you want without having to worry about a $500-1000 upgrade every year or two? Get the fuck out! This is unequivocally a Good Thing(tm).

Re:How is any of this bad? (1)

Yaotzin (827566) | more than 3 years ago | (#35134848)

You should really find a new place to shop for computers. They aren't that expensive.

Re:How is any of this bad? (0)

Anonymous Coward | more than 3 years ago | (#35134902)

He's just referring to the video cards, and using the $$s quoted in TFS.

Re:How is any of this bad? (1)

Yaotzin (827566) | more than 3 years ago | (#35134946)

What? If you pay $500 for a video card, you shouldn't even be thinking about upgrading for at least three years.

Re:How is any of this bad? (1, Insightful)

Feinu (1956378) | more than 3 years ago | (#35134930)

Lower hardware requirements are definitely a bonus, but it comes at the cost of dumbed down controls. While using a keyboard, I have about 20 buttons under my left hand, and an accurate pointing device on my right, along with several buttons. Why would I want to cycle through potential targets by pushing a button? Why do I need to hold down a button (which also has a different function), instead of just pushing a different button? Now I enter a menu, and I have to lift my hand to get to the arrow keys to navigate the menu? Not user friendly at all.

Re:How is any of this bad? (1)

sznupi (719324) | more than 3 years ago | (#35134964)

That's if the games you're interested in have underlying mechanics revolving around pointing at things...

While even in the realm of first-person perspective games, a joypad can be quite great - especially a Dual Shock-like one, especially in a Descent-like game. And then, that's not the only type of user interaction possible / wasn't there something about gaming "innovation"?...

Re:How is any of this bad? (1)

mjwx (966435) | more than 3 years ago | (#35135320)

While even in the realm of first-person perspective games, a joypad can be quite great

So, you've never played a game with a keyboard and mouse before. It's OK, we don't hate you.

A control pad will never match the accuracy and speed of a mouse. For one, you cannot move a cursor directly from point to point; you must always base your movement on the centre of the joystick/thumbstick. For things like flight sims this is a good thing, but for looking and general movement it's terrible.

Secondly comes co-ordination. You can do far more operations per second with a keyboard and mouse doing separate things, provided you have a minimum level of co-ordination.

Third, the number of buttons. If you put 40-odd buttons on a joypad, it would be as big as a keyboard, so you may as well have used one in the first place.

Re:How is any of this bad? (1)

sznupi (719324) | more than 3 years ago | (#35135388)

The things you say are true only in games with underlying mechanics (or UI) revolving around pointing at things... I wrote that quite clearly.

And even there not exclusively, for example: light gun shooters.

(BTW, typical homo sapiens has 10 fingers)

Re:How is any of this bad? (1)

Pentium100 (1240090) | more than 3 years ago | (#35135244)

Also, why do I have to press up, up, down, down, left mouse button, enter to perform some action instead of pressing a single button? I mean, my keyboard has a bit more than 100 keys...

Re:How is any of this bad? (1)

Omestes (471991) | more than 3 years ago | (#35135268)

Huh... I always buy the exact middle-of-the-road video card ($100-130), and they generally last me around the life of the rest of my computer, meaning around 4-5 years. You don't need the bleeding edge, ever. Right now I've got an old ATI Radeon 4650; it's lasted me around 3 years now, and I can play Fallout 3, New Vegas, Dragon Age, UT3, and TF2 at the highest settings. WoW (when I played it) at close to the highest settings, and pretty much everything else I'd want to play at either "high" or "highest".

On average a computer (if bought smartly, aiming for the plateau between cheap crap and the bleeding-edge tax) will cost much less than keeping up with consoles. Well, it will be higher if you completely ignore the fact that your computer is multi-use, versus a console which is pretty much good for only one thing. This is a pet peeve of mine... Console fanboys like to claim that computers cost more, and completely ignore the fact that they already have and need a computer, whereas a console is completely optional. 90% of console games (at least the ones I want to play) end up on the PC. I already (obviously) have a PC, so why spend $200-300?

I just upgraded my video card, not because it failed me, but for a "hand-me-down" upgrade. My mom needs something to replace her ancient 1.20GHz Athlon; I'm giving her my old Core 2 Duo 2.0GHz box, which needs a video card to replace the crappy Intel GMA. So I'm replacing my 4650 with a 5770, giving the 4650 to my GF to replace her Nvidia 9400-whatnot, which is going into my mom's new computer. If not for this chain, my 4650 probably would have lasted another 2 years.

I do find it odd that my monitor from 6 years ago was much better, resolution wise, than the 28" 1920x1080 I'm typing this on now. I really wish it was possible to get beyond this 1080, "HD" crap. I've had monitors with greater pixel density for years, "HD" is a step down. It should be "MD", the "M" being "mediocre", or "moderate".

Re:How is any of this bad? (1)

SharpFang (651121) | more than 3 years ago | (#35135304)

The bad thing is that while you CAN play every game on a $200 card, none of them is WORTH playing.

Vapid piece of non-journalism (3, Interesting)

billcopc (196330) | more than 3 years ago | (#35134724)

The summary should have read "FiringSquad ad revenue is on the decline, here's an article about nothing, for you to linkspam".

Yeah, console games usually make for shitty PC ports, which is freakin' pathetic since the console title had to be developed on a PC in the first place, and today's middleware makes the distinctions largely irrelevant. This is not news. The same was true back in the 80's (minus the middleware).

My biggest peeve ? Not the shitty controls. Not the slightly degraded textures. Not the total lack of post-release fixes. No, my biggest peeve is when a stupid console port restricts your choice of display resolution. It is trivial to pull a list of API-sourced geometries and run with it, rather than hardcode for 720p and 1080p... or worse yet: 640x480, 800x600, 1024x768. Yeah ok, I was running 1024x768 fifteen years ago, it's kinda tired.
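
For illustration, a rough sketch of what "pulling a list of API-sourced geometries" looks like. SDL2 is an assumption here (a console port would more likely query DXGI/Direct3D, but the idea is identical):

    // Sketch: enumerate whatever display modes the driver reports instead of
    // hardcoding 720p/1080p.
    #include <SDL.h>
    #include <cstdio>
    #include <vector>

    std::vector<SDL_DisplayMode> listDisplayModes(int displayIndex = 0)
    {
        std::vector<SDL_DisplayMode> modes;
        const int count = SDL_GetNumDisplayModes(displayIndex);
        for (int i = 0; i < count; ++i) {
            SDL_DisplayMode mode;
            if (SDL_GetDisplayMode(displayIndex, i, &mode) == 0)
                modes.push_back(mode);   // SDL returns these largest-first
        }
        return modes;
    }

    int main()
    {
        if (SDL_Init(SDL_INIT_VIDEO) != 0) {
            std::fprintf(stderr, "SDL_Init failed: %s\n", SDL_GetError());
            return 1;
        }
        for (const SDL_DisplayMode& m : listDisplayModes())
            std::printf("%dx%d @ %dHz\n", m.w, m.h, m.refresh_rate);
        SDL_Quit();
        return 0;
    }

Feed that list straight into the video options menu and the hardcoded-resolution problem mostly disappears.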

Re:Vapid piece of non-journalism (1)

feepness (543479) | more than 3 years ago | (#35135462)

It is trivial to pull a list of API-sourced geometries and run with it, rather than hardcode for 720p and 1080p... or worse yet: 640x480, 800x600, 1024x768. Yeah ok, I was running 1024x768 fifteen years ago, it's kinda tired.

While it certainly shouldn't be impossible, it's not trivial. There are considerations for fixed-size graphical UI elements. You can't just blow things up or, even worse, shrink them down; HUD displays look terrible and text gets unreadable. There are also field of view [codinghorror.com] issues.

Now I think game makers should be professional enough to take these into account, but it certainly amounts to more than trivially making a couple of API calls.
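
To make the field-of-view point concrete, here's a small sketch of the usual "hor+" adjustment: the vertical FOV stays fixed and the horizontal FOV is derived from the aspect ratio (the 60-degree figure is just an example, not taken from any particular game):

    // Sketch: "hor+" FOV scaling. A wider aspect ratio sees more to the
    // sides instead of stretching or cropping the image.
    #include <cmath>
    #include <cstdio>

    double horizontalFovDeg(double verticalFovDeg, double aspect)
    {
        const double kPi  = 3.14159265358979323846;
        const double vRad = verticalFovDeg * kPi / 180.0;
        const double hRad = 2.0 * std::atan(std::tan(vRad / 2.0) * aspect);
        return hRad * 180.0 / kPi;
    }

    int main()
    {
        std::printf("4:3   -> %.1f deg\n", horizontalFovDeg(60.0, 4.0 / 3.0));
        std::printf("16:10 -> %.1f deg\n", horizontalFovDeg(60.0, 16.0 / 10.0));
        std::printf("16:9  -> %.1f deg\n", horizontalFovDeg(60.0, 16.0 / 9.0));
        return 0;
    }

The HUD scaling is the genuinely fiddly part; the projection math itself is a handful of lines.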

Consolitis bad for windows, good for Mac/Linux (0)

Anonymous Coward | more than 3 years ago | (#35134750)

I LOVE consolitis! It's holding up DirectX massively. The longer they take, the faster Linux catches up. We already have almost full DirectX 9 support in Wine, and DirectX 10/11 support is already being implemented. You can currently play pretty much any Windows game except .NET 3.5 / Games for Windows Live titles. Consolitis sucks for PC games in general, but for Linux/Mac gamers it's done more for compatibility than any OpenGL development in the last 10 years.

Worse... (1)

ikkonoishi (674762) | more than 3 years ago | (#35134754)

The one thing worse than consolitis is inline advertisements injected into the text of an article as fake links. D:

But going back to the subject at hand, the most glaring recent example of consolitis in a game has to be The Force Unleashed. That game had horrible mouse control, which made one boss fight basically impossible. With a gamepad you just had to hold both sticks down, but with the mouse you had to constantly move the mouse downwards for 30 seconds at a time. Arggg.

Penus-toe (-1)

Anonymous Coward | more than 3 years ago | (#35134776)

Foot binding was a unique Chinese sexual mutilation practice that was performed on girls of all classes. Like other fetishists, the Chinese were so afraid of the vagina as a dangerous, castrating organ that they could only feel erotic toward the woman's foot - mainly her big toe. As a Cheng Kuan-ying described foot binding in the nineteenth century: "When a child is four or five, or seven or eight, parents speak harshly to it, and frighten it with their looks, and oppress it in every conceivable manner so that the bones of its feet may be broken and its flesh may putrefy."(142) The girl undergoes this extremely painful process for from five to ten years, crying out in pain each night as she hobbles about the house to do her tasks while holding on to the walls for support.(143) As the bones became broken and the flesh deteriorated, her foot became a perfect penis - substitute, often losing several toes as they were bent under her foot in order to emphasize the big toe sticking out.

The penis-toe then became the focus of the man's perversion and of his sexual excitement during intercourse. "It formed an essential prelude to the sex act, and its manipulation excited and stimulated... The ways of grasping the foot in one's palms were both profuse and varied; ascending the heights of ecstasy, the lover transferred the foot from palm to mouth. Play included kissing, sucking, and inserting the foot in the mouth until it filled both cheeks, either nibbling at it or chewing it vigorously, and adoringly placing it against one's cheeks, chest, knees, or virile member.(144) Thus even sex with a female could simulate homosexual intercourse for Chinese males.

'consoleitis' not slowing uptake of video cards (4, Insightful)

Matthew Weigel (888) | more than 3 years ago | (#35134782)

"Though a $500+ video card is considered top of the line, a $250 one will now play pretty much any game at the highest settings with no problem. (Maybe that’s what everyone wanted?) Pretty soon, however, graphics chip makers won’t be able to sustain their rate of growth because the software is so far behind, which will be bad for gamers on consoles as well as PC."

Making content that looks good at 1080p (or 1920x1200 for some PC monitors) is hard. Some amazingly specialized people spend a lot of time working on it; the more powerful the graphics processor, the more that is possible, but the more art assets have to be created (along with all the associated maps to take advantage of lighting, special effects, shader effects...) and the more programming time has to be spent. Much like the number of pixels increases far faster than the perimeter of the screen, or the volume of a sphere increases faster than its surface area... the work to support ever-increasing graphics power grows faster than the visual difference in the image.

It's not sustainable, but those advancing graphics processors are a big part of why game developers are moving to consoles: a shinier graphics engine costs more money to develop, which increases the minimum returns for a project to be successful. Anyone who looks at the business side can see that the market of people who have $500 graphics cards is much tinier than the market of people who have an Xbox360 or Playstation3. If you're going to spend that much money on the shiny, of course you're going to shoot for a bigger return too!

When it takes a big team to develop something... well, that's generally not where the innovation is going to happen.

First to bitch about lack of Linux games! (3, Interesting)

pecosdave (536896) | more than 3 years ago | (#35134784)

I really, really miss Loki.

I still want to kick someone at Epic in the nuts for not following through with the promised Linux port of UT3. (My copy is still sitting there, waiting to be played for the first time.)

If you use SDL and OpenGL you can make it work on everything more easily! /rant complete, my version of PC gaming covered, go back to bitching about consoles and Windows, Microsoft weenie.
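
For anyone who hasn't tried it, a bare-bones sketch of that portable SDL + OpenGL platform layer (SDL2 is assumed here; at the time it would have been SDL 1.2, but the shape is the same, and error handling is trimmed for brevity):

    // The same code builds on Linux, Windows and OS X.
    #include <SDL.h>
    #include <SDL_opengl.h>

    int main(int, char**)
    {
        if (SDL_Init(SDL_INIT_VIDEO) != 0)
            return 1;

        SDL_Window* window = SDL_CreateWindow("portable game",
                                              SDL_WINDOWPOS_CENTERED,
                                              SDL_WINDOWPOS_CENTERED,
                                              1280, 720, SDL_WINDOW_OPENGL);
        SDL_GLContext gl = SDL_GL_CreateContext(window);

        bool running = true;
        while (running) {
            SDL_Event e;
            while (SDL_PollEvent(&e))
                if (e.type == SDL_QUIT)
                    running = false;

            glClearColor(0.1f, 0.1f, 0.1f, 1.0f);   // plain GL calls from here on
            glClear(GL_COLOR_BUFFER_BIT);
            SDL_GL_SwapWindow(window);
        }

        SDL_GL_DeleteContext(gl);
        SDL_DestroyWindow(window);
        SDL_Quit();
        return 0;
    }

Everything platform-specific sits behind SDL; the game itself only ever talks to SDL and GL.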

Re:First to bitch about lack of Linux games! (1)

Beelzebud (1361137) | more than 3 years ago | (#35134838)

That was very annoying. I bought UT3 also thinking there would be a Linux client for it. They even showed screenshots of it running in Linux, at one point. Frankly I've given up on Epic Games. It's a shame they went the way they did because every PC game they made until the UT3 engine had Linux clients. The thing I'm curious about is if id will actually follow through with a Linux client for Rage. Since they're not an independent shop anymore, I hope it doesn't impact Linux clients, and source code releases.

Re:First to bitch about lack of Linux games! (2)

pecosdave (536896) | more than 3 years ago | (#35134910)

Dude, Epic has been awesome ever since the old DOS pinball games they used to have!

Rumor has it pressure from Microsoft put a lid on the Linux version.

"It may be difficult to get a Linux game ported over to XBOX and certified, all the Linux code could make the certification process very difficult."
"But it's just GL and SDL code, there is no Linux code exactly".
  "Oh, there's Linux code in there alright...."

Re:First to bitch about lack of Linux games! (0)

Anonymous Coward | more than 3 years ago | (#35135476)

Eh. Most Xbox games use the unreal engine. What you suggest would be killing the goose that lays the golden eggs.

Re:First to bitch about lack of Linux games! (0)

Anonymous Coward | more than 3 years ago | (#35135466)

I think I speak for 64.44% of us (Jan 2011, statowl) when I say "Screw Linux, what we want is Ubuntu games."

Proprietary? Fine. DRMed? Just as long as it's not too bad. I want software that does what I want. Judging by OS market share, so does the other 99% of the world. If Mac OS X had had an Ubuntu-like app store half a decade ago and ran on PCs, I would be using it. But it didn't, and it doesn't. Ubuntu is superior, and I want games on my platform of choice. Until I get them I'll just find other ways to waste my time, like pursuing women, doing homework, working at my job, and posting on slashdot.

"Inflammation of the Console"? (1)

angus77 (1520151) | more than 3 years ago | (#35134804)

"Inflammation of the Console"?

C'mon now, you can butcher the language in more creative ways than that.

Sounds good to me (2)

Superdarion (1286310) | more than 3 years ago | (#35134822)

Pretty soon, however, graphics chip makers won’t be able to sustain their rate of growth because the software is so far behind

Well, that seems good to me. One of the deterrents of PC gaming is the ever-changing hardware specs. If consoles have already proved that we can live with hardware from 6 years ago and still make games that look quite impressive (at least, sufficiently good), perhaps it's time that computer video cards slow down and allow the population to catch up. It sucks buying a $250 video card just to have to replace it in 2 years, whereas this generation's consoles have lasted 6 years. The solution is, of course, to buy a $500 video card, which will be good for a few years, but with that money you can get a console with controllers and one or two bundled games, so why bother? Not to mention buying a decent mouse, keyboard, screen and speakers.

Perhaps we should even learn from the Wii and from indie games, which can run on computers 7 years old! Why must we have a new hyper-mega-powerful $600 video card every year?

Sure, one could argue that video-game developers could actually take advantage of the new hardware (DX 11, anyone?) and make amazing-looking games, but why bother? Do we really want more realism, graphics-wise, than the MOH and COD franchises currently offer? I think the success of those franchises, especially the last three CODs, speaks for itself. We don't need a new over-hyped video card every six months; we don't need a thousand different model names that no one understands; we don't need cutting-edge technology to make games. And we certainly don't need PC gaming to be such a hostile environment, which just turns away most would-be gamers.

That is, truly, what the consoles do right. You don't have to know anything about computers or videogames to pick up one and within minutes start playing your new videogame. You need not install, tweak or configure in any way your games or consoles. You need not update to the latest card drivers. You need not replace any part of your console (except the ones that stop working) every two years; you don't need to worry about system specs, and figuring out if your GT 250 is better or worse than a GT 260 or a HD 5730. Finally, while I'm on it, you need not worry about fucking DRM in your console games, although that's another story (and perhaps the trade is fair, for PC gamers need not fear that their PC manufacturer suddenly bricks their computer... unless sony is involved).

Besides, everyone keeps complaining how games nowadays focus on looking stunning and having great sound effects and, basically, taking too much effort into the media part of the game, while slacking off in other areas, like immersiveness, story, character development and all that. Now they're saying "we should have better graphics now!". I call bullshit.

Re:Sounds good to me (0)

Anonymous Coward | more than 3 years ago | (#35134874)

I generally agree with your post, but

Besides, everyone keeps complaining how games nowadays focus on looking stunning and having great sound effects and, basically, taking too much effort into the media part of the game, while slacking off in other areas, like immersiveness, story, character development and all that. Now they're saying "we should have better graphics now!". I call bullshit.

I hate when people do this. Ever figure that maybe different subsets of "everyone" like to bitch about different things?

Re:Sounds good to me (2)

cbope (130292) | more than 3 years ago | (#35135032)

It's all a tradeoff, or more accurately a price shift that occurs with consoles. Ever notice that console games tend to cost quite a bit more than their PC equivalents? Thanks, but I'll take my general-purpose PC that I can use for many different tasks, that is upgradeable, that can run games with better graphics than any console (unless it's a damn cross-platform title), and whose games are cheaper.

On a related note, practically every major RPG released for PC recently has been crippled as a cross-platform "port". I'm getting sick and tired of buying games built for lowest common denominator consoles, they are holding back PC gaming. Sure, be happy if you have a console and you like it, I'm fine with that. The problem is that the developers are holding back PC gaming which could be advancing a lot quicker were it not for consoles. I'm not only talking about graphics here, but things like physics, better and more intelligent AI, better storylines, etc. The capacity of the PC is far greater than any console and the possibilities are almost endless, but since everything has to be simultaneously developed for consoles, major compromises are made during development that hold back the PC version and limit what can go into the game development. It's all about fitting the game within the console's limitations.

Re:Sounds good to me (1)

MemoryDragon (544441) | more than 3 years ago | (#35135274)

The problem with the current generation of consoles is simply that they won't be upgradable. Lots of people bought into a console for the first time in their lives, and they will be in for a major disappointment when the next generation comes along: they have plunked hundreds of dollars into games, and once the next gen hits, there is a huge chance those games will not play on the new console anymore.
Every console so far has become a doorstop to some degree after a while, Nintendo being better than the others at trying to keep backwards compatibility. PS4 -> hell will probably freeze over before it's backwards compatible; Sony either has to stay on exactly the same hardware and ramp up the RAM and GHz, or drop the Cell processor line entirely.
Microsoft probably faces a similar dilemma with their custom processor. Nintendo might have a chance: the Wii is so underspecced that by now they can probably move the core to a SoC and put it into the Wii's successor.
Now compare that with a PC, where even the old Infocom adventures still run given some effort.
And that's the big problem most people simply aren't aware of yet!

Rip off (1)

Bensam123 (1340765) | more than 3 years ago | (#35134892)

A rip-off of the more generally known term "consolization". Just someone trying to coin a term for something a lot of people are already aware of. It's good it's actually getting an article now, though. :o

Not recognizing button changes (1)

DrHappyAngry (1373205) | more than 3 years ago | (#35134912)

I pretty much just chucked The Force Unleashed because I always play games with the numeric keypad, and the game kept telling me to press the wrong keys. Is it so hard to check what key is mapped to an action? And what's up with having to exit the game and configure it from the pre-launch menu just to change the key config? I think Overlord 2 was the last one I saw guilty of this.
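
It really is a small amount of code to get right. A hypothetical sketch (the action names and binding table are made up for illustration) of building prompts from the live key bindings instead of hardcoded defaults:

    // Prompts are generated from the player's current bindings, so someone
    // who moved everything to the numeric keypad sees the keys they actually
    // bound, not the developer's defaults.
    #include <cstdio>
    #include <map>
    #include <string>

    enum class Action { Jump, UseForce, OpenDoor };

    // Filled in from the in-game key-config screen.
    std::map<Action, std::string> g_bindings = {
        {Action::Jump,     "Keypad 8"},
        {Action::UseForce, "Keypad 5"},
        {Action::OpenDoor, "Keypad Enter"},
    };

    std::string promptFor(Action a, const std::string& verb)
    {
        const auto it = g_bindings.find(a);
        const std::string key = (it != g_bindings.end()) ? it->second : "<unbound>";
        return "Press [" + key + "] to " + verb;
    }

    int main()
    {
        // Prints "Press [Keypad Enter] to open the door", not "Press [E] to ..."
        std::printf("%s\n", promptFor(Action::OpenDoor, "open the door").c_str());
        return 0;
    }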

Halo (1)

DarwinSurvivor (1752106) | more than 3 years ago | (#35134914)

I remember playing Halo on the PC. I searched every setting I could possibly locate and yet could not figure out how to disable the bloody auto-aim. Sure, on a console it's useful (aiming with a joystick is a joke!), but it seriously destroys the entire game on the PC.

I'm running down a hallway and see an enemy (who hasn't spotted me yet) about 2/3 hidden by a wall, so I aim at his arm and fire a precision weapon (not the needler or anything) and guess what? The game goes "Oh, you are shooting at enemy FOO, let me just 'fix' your aim a little and make the bullet hit him in the chest". Now this MAY have helped had his chest not been BEHIND THE BLOODY WALL! Instead I am forced to flank around him until the CENTER of his body is visible and THEN shoot.

Played through it once and went "screw this". Haven't touched any Halo games since. The REALLY sad part is that Halo was originally going to be a PC game before Microsoft (ironically a PC operating system developer) decided to make it "console playable".

Forget the graphics already (1)

AdamHaun (43173) | more than 3 years ago | (#35134944)

Reducing the amount of money I have to spend on video cards is not a bad thing. Control and gameplay problems are. Dead Space on PC was totally unplayable because the mouse input was converted to an analog stick-style velocity input, capping its max speed and forcing me to flail wildly at my desk just to turn around. Mass Effect doesn't let me hit escape to back out of menus. Aliens Vs. Predator was about as interactive as Doom -- point at the glowing quest object and hold down the use key; repeat fifty times. With all this advanced technology, why does it feel like time is going backwards? None of the big-name console shooters can hold a candle to 2004's Half-Life 2 in any area except graphics, and even then the poor art direction cripples them (brown, gray, brown gray, dull green...). And let's not forget the big contribution of this console generation, DLC -- the sort of add-ons we used to get for free now have a price tag attached. The PC versions don't charge yet, but you know it's coming.

Don't get me wrong, I love consoles. I've played console games in every generation since the NES. I just don't see a lot of positive influence on PC gaming these days aside from standardized graphics requirements. The keyboard and mouse are just better for some kinds of games, and it saddens me to see those games dumbed down so they can fit in hardware that was never designed to accommodate them.

On the other hand, it could be worse -- the next big influence is probably going to be touchscreen phone games, and we all know how great touchscreens are for gaming, right?

Re:Forget the graphics already (1)

sznupi (719324) | more than 3 years ago | (#35135070)

It goes both ways. For example, titles also revolving around pointing at things, but a very different kind of it: proper light gun games. They virtually died out with the arrival of the current console generation, apparently sort of replaced by games offering a hybrid kind of gameplay.

(Yes, that's largely due to the abandonment of CRTs; not much of a... consolation.)

Preferred Gaming Platform (1)

inglorion_on_the_net (1965514) | more than 3 years ago | (#35134952)

What amazes me is that I sometimes get the impression that consoles are somehow the preferred platform for gaming. I mean, the rate at which new consoles get brought to market is so slow, and the rate at which the PC world moves is so fast, that before you know it, Linux is a better gaming platform than the consoles, and mobile phones have better hardware. I understand the benefits of developing for a stable and homogeneous platform, but PCs are going to be running circles around consoles pretty soon.

As far as a stable platform goes - modern PCs can still run software written for the PCs of the 1980s, X is from 1984, and OpenGL from 1992. The BSD and win32 APIs have been available on PCs since 1993 or thereabouts, and DirectX since 1995. Take your pick; all of those predate the current generation of consoles.

Re:Preferred Gaming Platform (1)

sznupi (719324) | more than 3 years ago | (#35135014)

And on Wii (for example) you can play quite a few games from NES, Sega Master System, C64, Neo Geo or MSX.

Re:Preferred Gaming Platform (1)

unwesen (241906) | more than 3 years ago | (#35135226)

One word: gatekeeper.

Of course any platform that allows manufacturers to act as gatekeepers is preferred. Duh.

Whassa big deal? Just take 'em away. (2)

macraig (621737) | more than 3 years ago | (#35134956)

It's no big deal, really... I had my consils removed when I was a kid and I turned out (mostly) fine. Now I game on PCs and I'm better for it.

Call of Duty: Modern Warfare 2 (1)

tirefire (724526) | more than 3 years ago | (#35134988)

CoD: Modern Warfare 2 is a pretty good example of consolitis, though certainly not as bad as Black Ops.

When MW2 came out there were a lot of complaints from PC gamers about the lack of a console, the lack of dedicated server support, the inability to change the field of view from the default, etc.

As a PC Call of Duty fan, imagine my surprise and joy when I stumbled upon AlterIW [alteriw.net], a community hacking project that fixes all that. To add insult to injury, the hack is designed to slipstream into a SKiDROW torrent of the game.

Elder Scrolls: Oblivion suffers from this (1)

jonwil (467024) | more than 3 years ago | (#35134990)

The Elder Scrolls IV: Oblivion suffers from "consolitis" in that the controls just aren't right for a PC. For example, why can't I click on a chest and have it open automatically to allow me to pick stuff up, without needing to press a button to open it?

On a console, having a separate "open chest" button made sense, but not on a PC with a mouse.

Re:Elder Scrolls: Oblivion suffers from this (1)

polyp2000 (444682) | more than 3 years ago | (#35135072)

Not sure which console version you are talking about - or maybe I am misunderstanding you. Elder Scrolls: Oblivion does not have a "separate" open chest button.
On my copy you use the same button to open doors and locks, and to talk to people.

The crosshair is context-sensitive, changing shape depending on whether the item underneath it is an NPC or another object you can interact with in the game.
The action button on the PS3 (X, I think) is the same button used for all these actions.

How does this work on a PC? The mechanism used in the console version sounds like it would be a no-brainer in conversion. Is it really that butchered?

Re:Elder Scrolls: Oblivion suffers from this (1)

jonwil (467024) | more than 3 years ago | (#35135206)

Yeah, I think it's the same on PC in that there is a single "action" button. But IMO it would be better if it was more like some PC RPGs, where you just click on things that are activatable or actionable.

Lack of configurable controls!!! (0)

Anonymous Coward | more than 3 years ago | (#35134994)

One of the worst things about the shitty console ports is that you can't easily map game controls to the controller of your choice.

For example, Heroes over Europe released without joystick support - WTF, a game based on air combat and no joystick... Just Cause 2 suffers from the same problems. I have a Logitech gamepad (yeah, I know it is a little old) and none of the recent console ports will easily allow me to map to it... If I want to be able to use my vast collection of non-console-based game controllers, I am restricted to titles that predate the 3rd-gen consoles...

Almost enough to make me give up gaming altogether if it wasn't for MW2

Blame the PC users, not the consoles (3, Insightful)

tlhIngan (30335) | more than 3 years ago | (#35134996)

10 years ago, a good chunk of gaming was done on PCs because consoles were crap - standard def, too-small TVs, and the like, so people bought nice high-end PCs and invested in them. Dropping $2000+ on a PC wasn't unheard of nor unusual.

These days, spending more than $500 on a PC is very unusual - only Apple and PC gamers do that stuff, and really, it's no surprise why. And that $500 gets you a monitor, keyboard, mouse, speakers and other accessories.

Who's the #1 graphics chip maker in the world? It's not nVidia or AMD, it's Intel. (Sure, nVidia and AMD have the discrete graphics market, but that's a really tiny chunk of the whole PC market). When PC prices plummeted below $1000 and then below $500 (and laptops became "netbooks" below $500) manufacturers know that the average PC buyer cares about Gigs (hard drive space), Gigs (RAM) and Gigs (CPU GHz). Nowhere do they really care about graphics - after all, Windows does just fine on Intel graphics, and that's all the user sees.

The higher-end PCs with discrete graphics sell far less; even one with low-end discrete graphics may be considered a gaming PC (and little Jonny's mom and pop aren't buying a PC for games, oh no, they want it so Jonny can work).

PC gaming is huge - after all, FarmVille and the like don't require super high end ultimate graphics chips and many popular indie tities have lightweight requirements that even the cheapest of netbooks can play them.

The problem is, as we all know, Intel graphics are crap (though they're supposed to get better now with nVidia), and can barely do 1080p video decoding and high-def gaming.

So people buy a console as well - and with HDTV, they get high-def and on the big ol' 52" HDTV versus their 17"/20" PC monitor (or whatever is free these days). They could buy it on a PC as well (it's easy enough to do), but that requires spending money buying more PC - they could build/configure a great PC for $600, but that's over the "cap" of PC prices of $500. (Everyone gasps at the price of a $1000 MacBook Air, comparing it to a $300 netbook (despite better graphics (Intel vs nVidia) and CPU (Core2Duo is old, but runs rings around Atom), SSD, RAM, etc.).

Hell, I tried to convince someone to spend $1000 to buy a decent laptop and they balked.

No, it's not consoles limiting graphics of games - it's PCs themselves. The number of people with high end $600+ video cards (or probably any nVidia or AMD graphics cards of the past say 4 years) is very small compared to the total PC market. And we know PC gaming is larger than console gaming, but they're all for games that can play on the #1 video card on the market.

And developers go for the money - there are more console gamers out there than hardcore PC gamers with power graphics cards (and the willingness to upgrade yearly or so) - even though there are more PC gamers in general. Other than that, consoles and PCs are pretty much plug-and-play (and Sony's making the PS3 a PC experience with installers, EULAs, serial keys, online DRM, oh my).

Re:Blame the PC users, not the consoles (1)

thegarbz (1787294) | more than 3 years ago | (#35135300)

No, it's not consoles limiting the graphics of games - it's PCs themselves. The number of people with high-end $600+ video cards (or probably any discrete nVidia or AMD graphics card from the past, say, four years) is very small compared to the total PC market. And we know PC gaming is larger than console gaming, but it's mostly games that will run on the #1 graphics chip on the market.

I wholeheartedly disagree. I have a Core 2 Duo from about four years ago and a two-and-a-half-year-old graphics card that cost me $160 new at the time. The most recent game I've played was Just Cause 2, which recommended settings of 1920x1200 with everything on max. I also played Split/Second recently, which is also a console-to-PC port; again the recommended settings were 1920x1200 with everything on max.

So ... where is my motivation to buy the latest and greatest video card again? This is a chicken-and-egg argument, only in this case we know what came first. Back in the day, people bought video cards so they could max out the graphics of the latest games. These days we spend $500 on a PC because ... it maxes out the graphics of the latest games. There are very few games these days that tax the PC. Crysis was the only game I've seen in a while that actually needed something as minor as turning off AA, and I still ran it at max resolution.

People don't spend $2000 on a PC because PCs are cheap these days, and the video card is only a small component. Multi-core CPUs that sit idle unless you're encoding three videos while playing a game, a 1000 W PSU that draws about 400 W under normal load, two video cards in SLI to push your framerate from 90 to 150 fps on a 60 Hz display, and 16 GB of RAM that spends most of its life empty - that is what $2000 gets you. I consider myself a power user, but my four-year-old processor and two-year-old graphics card run every application and game I throw at them just fine.

Re:Blame the PC users, not the consoles (0)

Anonymous Coward | more than 3 years ago | (#35135310)

Who's the #1 graphics chip maker in the world? It's not nVidia or AMD, it's Intel. ... And we know PC gaming is larger than console gaming, but it's mostly games that will run on the #1 graphics chip on the market.

I'm not sure how true that is. Small anecdote: I've been playing a lot of Battlefield Bad Company 2 recently, and I've been having fun keeping an eye on the stats. One set of numbers that keeps catching my eye is the total number of players per platform:

Platform   Online    Total
PC          8,572    1,673,453
360         7,945    1,570,904
PS3         5,340    1,527,433

Now, BC2 is definitely one of those games that will not be handled by Intel graphics, and it's also a release less than a year old that has had time to bed in on all platforms.

The number of people with a $600+ GPU might be a small fraction of the total PC market, but the total PC market far and away outstrips the number of consoles. Wikipedia reckons that as of January 2011 the 360 had sold 50+ million units. This article [dvhardware.net] from 2008 reckons that ATI had 40% of the discrete card market at ~22 million cards per year, which makes roughly 50 million discrete cards per year for AMD/ATI plus nVidia combined. Granted, not all of those will be high end, but the 360 has been around for roughly six years, so less than 20% of the discrete cards sold in that time would need to be mid-to-high end for there to be as many game-capable PCs as 360s - and those PCs would probably be more powerful than the 360. That's not exactly a small market, and at least for BC2 that seems to be borne out by the number of PC players being higher than on either of the consoles.
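
As a quick sanity check on that back-of-the-envelope estimate (all figures below are the ones quoted above, not independent data):

<ecode>
# Back-of-the-envelope check of the discrete-card estimate above.
discrete_per_year = 50e6     # ~50M discrete cards/year, ATI + nVidia (quoted above)
years = 6                    # rough Xbox 360 lifespan so far
xbox360_installed = 50e6     # ~50M consoles sold as of Jan 2011 (quoted above)

total_discrete = discrete_per_year * years           # ~300 million cards
needed_share = xbox360_installed / total_discrete    # share that must be mid-to-high end

print(f"{needed_share:.1%}")  # ~16.7%, i.e. under the 20% mentioned above
</ecode>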

Re:Blame the PC users, not the consoles (0)

Anonymous Coward | more than 3 years ago | (#35135516)

I know I'm doing my part. In the middle of last year I bought a $6500 Alienware M17x (yeah, I know, getting it from America would have been around $4000), and a friend just bought a Clevo/Sager/whatever for $7000 with nVidia SLI, 3D-capable gaming, an Intel 980X, SSDs in RAID 0, etc.

In other words, I haven't heard anyone balk at a $1000 laptop, especially after I told them how much mine was worth.

Consolitis helps linux (1)

DMJC (682799) | more than 3 years ago | (#35135022)

Because consoles are getting all the ports, DirectX is being held back, and Linux/Mac are catching up thanks to Wine's DirectX support. Wine can already run pretty much everything important except .NET 3.5 and Games for Windows Live.

A $250 dollar video card ... (1)

BitZtream (692029) | more than 3 years ago | (#35135048)

Is still WAY more expensive than a console ... because the entire console is going to cost about that much, and you don't need to buy hard drives, cases, motherboards, processors, RAM, and whatever else I'm not thinking of.

You also don't have to worry about drivers on your console, and you're not going to run into a hardware compatibility problem unless you use some unbadged knockoff add-on.

And in six months you'll need another $250 video card, and probably some more RAM now that we have 64-bit OSes.

Re:A $250 dollar video card ... (1)

thegarbz (1787294) | more than 3 years ago | (#35135410)

And in six months you'll need another $250 video card, and probably some more RAM now that we have 64-bit OSes.

You're missing the utility of the computer, and not putting two and two together with the summary. Yes, a console is cheap, but a PC is already there - I don't know of a house with a console that doesn't also have a PC of some kind. People surf the web, write documents, etc., and these days even the most lowly-specced computer you can buy, plus a $250 video card, will pretty much max out any game you throw at it.

So taking the summary further invalidates the six-monthly spend. My computer is now a four-year-old Core 2 Duo, and my graphics card is two years old and cost $160 at the time. These still run the latest games with graphics set nearly to max. The reason is that consolitis has existed for some time now: a stagnant console market has resulted in really cheap computers having more grunt than the consoles, so games designed for consoles have zero problems with the hardware specs of any remotely modern computer, and this will likely continue. Four years ago it was an entirely different industry - your video card wouldn't play the latest game at 1920x1200 unless it cost $400+ and sat in a computer with the latest and greatest processor. These days most laptops even make quite capable gaming machines, and they surf the net and fit in a backpack.

I for one... (0)

Anonymous Coward | more than 3 years ago | (#35135172)

... welcome our console overlords!

Seriously, though, that's about as biased a post as you can get on this issue. One could just as easily claim: "Possibly the most disastrous outcome of an industry-wide shift to graphics-oriented development is that gameplay innovation has been greatly slowed."

*sigh*

This has been going on for years (1)

MemoryDragon (544441) | more than 3 years ago | (#35135254)

I am still on a three-year-old mid-range PC graphics card that I got for $150 back then, and it still runs pretty much every new game that comes out on the PC at mid to high settings. The reason: the stalling of the upgrade cycle caused by the last console generation.

The funny thing is that if you want cheap gaming right now, it's the PC: the games are cheaper and usually hit the bargain bin earlier. And given that the consoles are severely lacking on the hardware side and PC-only development has come to a standstill or gone mostly independent, you don't even have to upgrade your graphics card. With the next console generation it will probably again just be a shift to the next mid-range graphics card one or two years into the consoles' lifespan, and then you are set for another 7-8 years, depending on how long the consoles last.

Does it hurt the PC graphics card makers? Sure. Does it hurt console gamers, who will no longer get such a huge jump in graphics with each generation? Sure. What can we do about it? Nothing, I guess - people flocked to the consoles, and that's what they get.

The graphics card makers are aware of that paradigm shift and are slowly moving in other directions. nVidia is currently moving into the supercomputer market, because their cards are more like modern vector machines than anything else, and also into the handheld market with their Tegra line. They seem to regard the PC market as something probably better left to Intel in the long term - no growth there anymore, and no big sales numbers for dedicated graphics solutions either. ATI is doing what AMD has always done: sticking with the PC market, but also integrating their graphics cores.

Re:This has been going on for years (1)

wildstoo (835450) | more than 3 years ago | (#35135490)

The funny thing is that if you want cheap gaming right now, it's the PC: the games are cheaper and usually hit the bargain bin earlier.

This.

It seems people never take the price of the actual games into consideration when making the PC vs. console price argument.

In general, AAA titles will run you 10-20% more at release on consoles than on PC. Also, there are thousands of indie games on the PC that never make it to consoles; some of them are awesome, and most of them are dirt cheap.

$600 might seem like a lot compared to $300 for the console hardware, but over their lifetimes the costs even out. Plus, the PC will let you do a lot more than a console.

If PC gaming dies, my interest in mainstream gaming will likely die with it.

PCitis (0)

Anonymous Coward | more than 3 years ago | (#35135328)

I have the opposite problem: I have an HTPC at home on which I would like to play some games with my gamepad, and maybe control some things with my remote, but I can't.

Most games require a mouse and/or keyboard to navigate the menus, when this could perfectly well be done with a gamepad. And I'm not talking only about commercial games, but also free ones like Tux Racer (or whatever the new name is), BZFlag, etc.

The only things that work well with a gamepad are console emulators like ZSNES, Gngeo and MAME, but even these lack LIRC or D-Bus support, so to integrate them with my HTPC I had to create workarounds. If only these emulators supported D-Bus, I could send commands from irexec and use my remote to pause the game, take screenshots, quit, etc.
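
For illustration, here is a minimal sketch of that kind of workaround: a tiny helper that irexec can invoke to fake a keypress in the emulator's window via xdotool. The window name and hotkeys are assumptions and would need to match the emulator's own configuration:

<ecode>
#!/usr/bin/env python3
# Hypothetical helper called from an .lircrc entry via irexec, e.g.
#   config = remote-hotkey.py pause
# It fakes a keypress in the emulator's window using xdotool, since the
# emulator itself exposes no LIRC or D-Bus interface.
import subprocess
import sys

# Remote action -> emulator hotkey (illustrative; match your emulator's bindings).
KEYMAP = {
    "pause": "p",
    "screenshot": "F12",
    "quit": "Escape",
}

def send_key(action: str, window_name: str = "MAME") -> None:
    key = KEYMAP[action]
    # Find the emulator window by name, raise it, then send the keystroke.
    subprocess.run(
        ["xdotool", "search", "--name", window_name,
         "windowactivate", "--sync", "key", key],
        check=True,
    )

if __name__ == "__main__":
    send_key(sys.argv[1])
</ecode>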

On another note, I have seen some arcade machines with integrated webcams that take a picture of the user to set their avatar. That would be nice for computer games as well.
