
Carmack: Next-Gen Console Games Will Still Aim For 30fps

Soulskill posted about a year ago | from the take-that-peter-jackson dept.

Graphics

An anonymous reader sends this excerpt from Develop: "Games developed for the next generation of consoles will still target a performance of 30 frames per second, claims id Software co-founder John Carmack. Taking to Twitter, the industry veteran said he could 'pretty much guarantee' developers would target the standard, rather than aiming for anything as high as 60 fps. id Software games such as Rage and the Call of Duty series both hit up to 60 fps, but many current-generation titles fall short, such as Battlefield 3, which runs at 30 fps on consoles. 'Unfortunately, I can pretty much guarantee that a lot of next gen games will still target 30 fps,' said Carmack."

230 comments

Detail (4, Insightful)

Dan East (318230) | about a year ago | (#42334123)

Would you rather have double the detail at 30 FPS, or half the detail at 60 FPS? Considering most people can't perceive frame rates faster than 30, it makes a bit of sense to push more polygons instead.

Re:Detail (4, Insightful)

Radres (776901) | about a year ago | (#42334135)

I think Carmack's point is that the other studios will push half the content at 30fps because they're lazy.

HOBBIT IN 48 FPS - YECHH! (-1)

Anonymous Coward | about a year ago | (#42334179)

I just came back from seeing The Hobbit in 48 FPS and I gotta say that sucked really badly. Makes me very happy to see things in the standard 24 FPS by comparison. No need for that crappy higher framerate, which doesn't improve the look of things at all.

Re:HOBBIT IN 48 FPS - YECHH! (5, Insightful)

epyT-R (613989) | about a year ago | (#42334231)

people who complain about higher framerates never seem to have a justification other than 'it's not what I'm used to'. What about the 48fps made it suck? Please avoid using 'audiophile-like' subjective/emotional terms.

Re:HOBBIT IN 48 FPS - YECHH! (0)

Anonymous Coward | about a year ago | (#42334331)

Erm... he thinks it sucked and you want him to avoid subjective terms? Okay then...

Re:HOBBIT IN 48 FPS - YECHH! (2)

TubeSteak (669689) | about a year ago | (#42334333)

Please avoid using 'audiophile-like' subjective/emotional terms.

Our expectations & emotional experience colors our subjective experience.
And it's a scientifically measurable effect.

That isn't to say objective measures are irrelevant, only that they are not all that is relevant.

Re:HOBBIT IN 48 FPS - YECHH! (0)

Anonymous Coward | about a year ago | (#42334485)

Please avoid using 'audiophile-like' subjective/emotional terms.

Our expectations & emotional experience colors our subjective experience.

Thanks for proving his point!

Re:HOBBIT IN 48 FPS - YECHH! (5, Insightful)

epyT-R (613989) | about a year ago | (#42334545)

Well, there's a bandwagon of snobbery out there about this issue. Kinda like people who say vinyl or vhs is superior to digital audio and video, I suspect this whole 'butt is it art' routine is more about social exclusivity and differentiation (and unhealthy doses of insecurity) than it is about their actual experience. I could understand if someone got motion sickness from the higher rate and didn't like that, but otherwise I cannot understand why someone would want animations deliberately choppy.

With today's style all about fast cuts and jerkycam, I think the higher framerate would help the viewer track the action. It helps in games and I suspect it would help me in such scenes, especially when they pile on the blur and urinal tournamint style colored lighting.

Re:HOBBIT IN 48 FPS - YECHH! (5, Interesting)

SecurityTheatre (2427858) | about a year ago | (#42334623)

That's exactly the problem I had.

The "Jerkycam" works BECAUSE of the 24fps.

The only time I found the 48fps showing to be uncomfortable and weird was during very fast action, jerky motion sequences. It suddenly feels like high-fidelity jerkyness, which makes it lose its tendency to portray "oh noez, stuff is blurry and out of control, even the camera", and just feels like "why is the dude shaking the camera so much?"

Re:HOBBIT IN 48 FPS - YECHH! (5, Interesting)

epyT-R (613989) | about a year ago | (#42334653)

I guess my interpretation of jerkycam was always "why the hell is he shaking the camera so much?" It's annoying and distracting, especially when it's every other scene. If the sharpness of movement isn't sufficient, it's because the movements aren't sharp enough. The lower framerate just hid that.

Re:HOBBIT IN 48 FPS - YECHH! (1)

aaaaaaargh! (1150173) | about a year ago | (#42334963)

Kinda like people who say vinyl or vhs is superior to digital audio and video, I suspect this whole 'butt is it art' routine is more about social exclusivity and differentiation (and unhealthy doses of insecurity) than it is about their actual experience.

What's your point? People are sometimes irrational in their choices, and, of course, sociological factors play a role in determining them. Otherwise a large part of high-end markets in all kinds of domains, as well as most corporate branding, would vanish overnight. Objective measures, e.g. whether people would fail a blind test or not, are fairly irrelevant if people do not consume blindly. The things we are talking about are meant to be interesting and primarily entertaining. Sure, you can spend a decent amount on marketing to "educate" people about what they "should" prefer, but the question is whether that money is really well spent if people already have other preferences. Experience always comes in one package, including all kinds of "side" factors. There is really no point in tasting wine out of plastic cups or having a gourmet meal in a fast food restaurant.

Re:HOBBIT IN 48 FPS - YECHH! (1)

Anonymous Coward | about a year ago | (#42334389)

The problem is you don't want it to be as lifelike as real life. The higher fidelity actually decreases the fantasy of the experience. With movies not set in the real world, and out-of-this-world CGI flitting about in massive droves, you don't want to have it 'shown'. You want it painted over, to sort of fudge it into a more appealing finished product even if it's technically inferior.

Re:HOBBIT IN 48 FPS - YECHH! (1)

epyT-R (613989) | about a year ago | (#42334493)

For me the higher framerate helps suspend disbelief because everything moves more fluidly. 24fps always gave me a headache too.

Re:HOBBIT IN 48 FPS - YECHH! (1)

dosius (230542) | about a year ago | (#42334951)

Uncanny valley?

Re:HOBBIT IN 48 FPS - YECHH! (1)

Joce640k (829181) | about a year ago | (#42335125)

You might want to put "Uncanny Valley" in quotes and capitalize it so that people don't think you're just posting random words...

I haven't seen The Hobbit yet so I can't comment but I don't see how it can be worse. Not really. Not unless you were going into the cinema thinking "Oh, I really MUST analyze the frame rate thing down to the last minute detail so I can have an opinion later".

I bet if it was the other way round, if we'd always had 48fps and Peter Jackson was experimenting with 24fps to give it an "analog" feel, the pseuds would be complaining just as loudly. It's what they do.

Re:HOBBIT IN 48 FPS - YECHH! (1)

Anonymous Coward | about a year ago | (#42335147)

It's more like it goes from "awesome, The Lord of the Rings put into movie format" to "it looks like my cousin Bob at a LARP".

Re:HOBBIT IN 48 FPS - YECHH! (2)

gigaherz (2653757) | about a year ago | (#42334567)

It's less blurry and doesn't give you headaches; why would ANYONE want to watch a movie that's NOT blurry, or that doesn't -- if seen in 3D -- give you headaches?

I do agree that it doesn't have the "cinematic" feel of standard movies, so it feels weird when you watch it -- different. But it's so clear, smooth and headache-free that it's worth losing that. In fact, I'd like to see a movie in 60 or 75fps someday.

Re:HOBBIT IN 48 FPS - YECHH! (0)

Anonymous Coward | about a year ago | (#42335043)

I even get headaches from 24fps 2D movies. Action movies especially are just a blurry mess; in a movie theatre I could actually see the shutter angle of the camera, where the blurry shapes were intermittently shown across the screen.

Re:HOBBIT IN 48 FPS - YECHH! (1)

Chrisq (894406) | about a year ago | (#42334713)

people who complain about higher framerates never seem to have a justification other than 'it's not what I'm used to'. What about the 48fps made it suck? Please avoid using 'audiophile-like' subjective/emotional terms.

I ended up liking it by the end of the film, but the "in your face" realism was quite a shock at first. I went into it thinking that there would not be much difference, but movements seem much more abrupt and real, and facial expressions seem more lifelike. I put this down to seeing every micro-expression, each twitch of the eye or slight tremble on a smile. I can see that some people wouldn't like it; probably a "Cal Lightman" would get sick of seeing that the expression of fear at seeing an Orc was really hiding an "oh no, this is the twenty-third take and I'm getting hungry".

Re:HOBBIT IN 48 FPS - YECHH! (1)

Chrisq (894406) | about a year ago | (#42334725)

Oh, I should add that there is a big difference between the Hobbit cinema HFR and the HD displays that I have seen. I don't know whether this is due to compression artefacts on fast-moving content or physical persistence in the monitors, but it is clearly very different.

Re:HOBBIT IN 48 FPS - YECHH! (2)

hack slash (1064002) | about a year ago | (#42334777)

Stupid AC, movies are not games, in games you want the highest framerate possible because this (usually) means quicker response times from keyboard/mouse/gamepad, increasing the feeling of immersion in the game.

This is especially so with the Oculus Rift type headgear being developed, the less lag between your input and the computer's visual output the more immersed you feel, with movies you're simply an outside observer.

Re:HOBBIT IN 48 FPS - YECHH! (1)

hattig (47930) | about a year ago | (#42335299)

Don't forget that in a movie, each frame is a snapshot over 1/24th (or 1/48th, in the case of The Hobbit) of a second, so motion blur comes for free.

In a computer game, motion blur is far harder to achieve, and most frames are instantaneous snapshots of a scene. That can be made up for by having a higher framerate.

Re:HOBBIT IN 48 FPS - YECHH! (0)

Anonymous Coward | about a year ago | (#42335103)

I must humbly say that I disagree. I came back, absolutely loving it, something I can't say about 3D vs 2D. Have you compared the HFR and the normal frame rate Hobbit?

Re:HOBBIT IN 48 FPS - YECHH! (1)

bluescrn (2120492) | about a year ago | (#42335207)

That's because of years of conditioning, your brain just accepts '24fps==cinematic', and it'll take a while to get used to change/improvement.

Re:HOBBIT IN 48 FPS - YECHH! (0)

Anonymous Coward | about a year ago | (#42335283)

I don't know how you deal with broadcast TV.

Re:Detail (0)

icsx (1107185) | about a year ago | (#42334191)

You can't put it like that; it's not a valid comparison. A level running at 60fps can be a lot cooler than a level at 30fps stuffed with useless crap and no creativity. Console hardware is usually so poor that games often run at a lower resolution which is then scaled to full screen just to achieve that 30fps in the first place.

And please, don't start that 30fps human eye thing. It's not the same as in games, because if you have 60 fps instead of 30 fps, you have 50% more frames, which makes the game run smoother. Then there is motion blur too, which is often used to hide the fact that the game runs badly.

Re:Detail (2)

Anonymous Coward | about a year ago | (#42334407)

if you have 60 fps instead of 30 fps, you have 50% more frames

None of your post really made much sense, but this bit isn't even correct math...

Re:Detail (0)

Anonymous Coward | about a year ago | (#42334747)

Prices going down 300%, 50% frames on 60 fps vs on 30 fps.

Why do idiots even try to use percentages?

Re:Detail (1)

Anonymous Coward | about a year ago | (#42334517)

The human eye can perceive far beyond 30fps. There are studies which show pilots picking out single frames from 500fps.

I was a hardcore gamer when I was younger, with some LAN achievements under my belt, and I dislike playing under 120Hz; I preferred my CRT to be sitting at 140Hz. People online claim this is placebo, but pretty much everyone at those 4000-man competitions would comment on how smooth the high-Hz/high-FPS combos were, and the vast majority of top players (I know everyone thinks they're amazing at games, but I'm specifically talking about those who earn money playing games) play at higher than 100Hz/FPS.

Obviously the difference between gaming and a movie is motion blur (deals with the smoothness) and the fact you aren't controlling the movie (meaning the input lag doesn't affect you).

Re:Detail (0)

Anonymous Coward | about a year ago | (#42334549)

A level running at 60fps can be a lot cooler than a level at 30fps stuffed with useless crap and no creativity.

That's only one consideration. The framerate is generally dependent on the max number of polygons which can be rendered on-screen at a time. You can increase the number of objects at the cost of detail per object, or increase the detail of individual objects at a cost of total objects on screen.
The total number of polygons on-screen at 60fps is going to be much lower than the total number at 30fps, given the same hardware. We simply don't have good enough hardware to support as many models with as much detail as consumers are demanding even at 30fps, so most studios aren't even considering a decrease in polygons in order to achieve the higher FPS numbers.
If consumers would pay for games based on gameplay rather than graphics appeal this wouldn't be nearly as much of an issue.

Re:Detail (1)

mwvdlee (775178) | about a year ago | (#42334795)

Here a "30fps human eye thing" for you; human eyes don't have anything remotely equivalent to a TV refresh rate.

Re:Detail (5, Insightful)

epyT-R (613989) | about a year ago | (#42334215)

Not this again. This assumption is based on perceived motion from frames containing captured motion blur, and even in such (24/30Hz) frames, motion is NOT transparent to most people. With games there is no temporal data in the frames, so it's VERY obvious. Even 60 is obvious to many gamers, which is why they opt for 120Hz panels (real 120Hz, not HDTV '120' interpolation, which looks terrible) and video cards that can push them.

Then there is input lag. The perceived turnaround time is very noticeable at 30fps, and if the rendering is not decoupled from the input polling/IRQ, the latter's latency actually does go up. id had to patch Quake 4 to make it acceptable to play because the 60Hz was dropping inputs and looked choppy as hell compared to previous releases. Enemy Territory: Quake Wars, which is also idTech 4, was locked at 30 and was deemed unplayable by many; I think it was one of the reasons the game tanked. It was actually painful to look at in motion.

Console devs always push excessive graphics at the expense of gameplay because the publishers want wow factor over playability. This was true in the 8-bit and 16-bit days too. Some games suffered so badly they were deemed unplayable. This is why PC gamers value useful graphics configuration capability in their games. Often what the publishers/devs considered 'playable' was not what the community thought was playable, not that this should shock anyone given today's 'quality' releases.

Re:Detail (2)

flayzernax (1060680) | about a year ago | (#42334317)

I can attest that framerates of 60 matter, at least from a long time playing Quake 3 and other similar FPSes competitively online. A consistent framerate that does not spike is also very helpful.

I would say it's very game-dependent though; games that are highly twitch-based and less strategic tend to get more mental attention in the FPS department. In other games like an MMO or RTS I hardly notice FPS, even if it drops to an abysmal 17fps, as long as I'm still immersed.

Then you have some displays that can barely handle motion at 20fps (some LCDs are still not quite up to par).

But as a professional basement-dwelling gamer, I know the studios who want to make money off online competitive play need to focus on FPS and level/game flow over everything else.

Re:Detail (4, Informative)

frinsore (153020) | about a year ago | (#42334415)

For a 60fps game there's about 16ms per frame, and with current-gen consoles about 8ms is lost to API call overhead on the render thread. Of course current-gen consoles are years behind and constrain rendering APIs to be called from a single thread, but I'd still be very surprised if there was a console that could support a triple-A game above 70fps in the next 10 years (for resolutions 720p and above).

You've barely scratched the surface of input-to-perception lag; here's an answer by Carmack to people questioning another one of his tweets:
http://superuser.com/questions/419070/transatlantic-ping-faster-than-sending-a-pixel-to-the-screen [superuser.com]
Of course most engines come from a single-threaded game mentality where they'd poll for input, apply input to game state, do some AI, do some animations, calculate physics, then render everything and repeat. Current-gen consoles have freed that up some, but most engines didn't go above 2 or 3 major threads, because it's a difficult problem to re-architect an entire engine while it's being used to make a game at the same time. Sadly, the better games gave user input its own thread, polled input every 15ms or so, queued it up, and then passed it on to the game thread when the game thread asked for it. Input wasn't lost as often, but it didn't get to the game any faster.
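
To make that last point concrete, here is a minimal sketch (not from any real engine; poll_hardware_input() is an invented placeholder for the platform's actual device read) of the dedicated-input-thread pattern described above: events are queued every ~15 ms and drained once per game tick, so presses are not lost, but they still wait for the next tick.

    // Sketch only: poll_hardware_input() stands in for the real platform call.
    #include <atomic>
    #include <chrono>
    #include <cstdio>
    #include <mutex>
    #include <thread>
    #include <vector>

    struct InputEvent { int button; bool pressed; };

    static std::mutex g_queue_mutex;
    static std::vector<InputEvent> g_pending;      // filled by the input thread
    static std::atomic<bool> g_running{true};

    static InputEvent poll_hardware_input() {      // stub standing in for the real device read
        return InputEvent{0, false};
    }

    static void input_thread() {
        while (g_running) {
            InputEvent e = poll_hardware_input();
            {
                std::lock_guard<std::mutex> lock(g_queue_mutex);
                g_pending.push_back(e);            // queued, so nothing is dropped between ticks
            }
            std::this_thread::sleep_for(std::chrono::milliseconds(15));
        }
    }

    static std::vector<InputEvent> drain_input() { // called by the game thread once per tick;
        std::lock_guard<std::mutex> lock(g_queue_mutex);
        std::vector<InputEvent> out;
        out.swap(g_pending);                       // latency is still bounded by the tick rate,
        return out;                                // which is exactly the point made above
    }

    int main() {
        std::thread t(input_thread);
        std::this_thread::sleep_for(std::chrono::milliseconds(50));   // pretend one game tick elapses
        std::vector<InputEvent> events = drain_input();
        std::printf("game tick saw %zu queued events\n", events.size());
        g_running = false;
        t.join();
        return 0;
    }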

Re:Detail (1)

epyT-R (613989) | about a year ago | (#42334607)

Yeah, I read about that. Some games/drivers/engines are absolutely terrible. I think I was spoiled by the earlier Quakes; of course they had bugs too, but today's games are terrible. I suppose not everything is a competitive shooter, but that doesn't mean it should drop or lag input. It makes the game incredibly frustrating to play.

Re:Detail (-1)

Anonymous Coward | about a year ago | (#42334557)

Obviously, nobody really gives a fuck about pixel-counting bullshit. Nobody's going to be doing QIII 180s with a pad anyway, and frankly that kind of game is out of date. You don't see soldiers in the field do 180 snapshots either.

And re your remark about 'quality' releases, sounds like you're butthurt that nobody wants to play old-style games any more. You just have to deal with it. I enjoy text adventures, but I'm hardly mad that they're a fringe activity now.

Re:Detail (5, Insightful)

epyT-R (613989) | about a year ago | (#42334645)

Let's count the fallacies, shall we?

1. Argument from antiquity (it's old so it sucks).
2. Argument from inverse popularity (no one does it now so it sucks).
3. Appeal to realism (when did I say Quake was realistic? I said a higher, steady framerate allows for better perception of the action).
4. Ad hominem. I'm not butthurt. Perhaps you prefer COD et al because you can't play something requiring more attention and a lower reaction time. It's alright, I'm not amazing at Quake either; I was only a bit above average as far as competent players go, but I enjoyed the fluid, fast gameplay much more than the tedious waiting and camping of CS, Action Quake and its subsequent 'realism' clones. There's no need for insults.

If anything, it's the dominant playerbase who reason like your post who are to blame for why so many games today lack actual gameplay learning curves. There's nothing to master: it's all about pressing the right button at the right time a la Dragon's Lair in single player, or having a real-time rendered backdrop for VoIP 'multiplayer' conversations, all while fumbling around with simplified gameplay mechanics that were dumbed down specifically to make the pad workable at all. That's not what I got into gaming for, but to each their own.

Re:Detail (1)

MrHanky (141717) | about a year ago | (#42334235)

Most people certainly can perceive frame rates faster than 30 FPS. The difference between 30 and 60 FPS when playing a game on a modern LCD display is huge. Stop perpetuating dumb myths.

Re:Detail (1)

Anonymous Coward | about a year ago | (#42335057)

I've seen 300fps on a prototype screen fed by a 300 fps camera. It improves the sharpness of moving objects immensely, because the eye can track objects on the screen perfectly and so can read fine detail. Those details are either lost in motion blur, or, when you use a small shutter angle, the object steps across the screen, making it difficult for the eyes to track it as a moving object.

Re:Detail (1)

El_Muerte_TDS (592157) | about a year ago | (#42334239)

Considering most people can't perceive frame rates faster than 30

[Citation Needed]

The difference is very noticeable, but the "problem" is reduced by the enormous input lag present in most console setups. Also, in action-heavy scenes you will notice less that everything is moving less smoothly.
The difference between 30 fps and 60 fps for cameras is less noticeable due to motion blur, unless you slow down the rendering. Sure, you can make 30 fps games look smoother by applying motion blur, but that only makes the end result blurrier.

Re:Detail (1)

PhrostyMcByte (589271) | about a year ago | (#42334255)

Would you rather have double the detail at 30 FPS, or half the detail at 60 FPS?

It depends entirely on the game. In a twitch shooter like Quake where you expect constant feedback, things feel drastically wrong at 30fps. In a single-player shooter? These are rarely built for competitive players, and don't need quick response. I can handle 30fps if it has decent motion blur, like Crysis. In an RPG? 30fps is mildly annoying but playable.

But that's only what I can tolerate, without shelving the game for a future video card. If I had a choice? I'd pick 60fps over 30fps every time. It's one of those things you don't realize you want until you've had it -- I guarantee console players would love it too if given the choice.

Re:Detail (0)

Nyder (754090) | about a year ago | (#42334271)

Would you rather have double the detail at 30 FPS, or half the detail at 60 FPS? Considering most people can't perceive frame rates faster than 30, it makes a bit of sense to push more polygons instead.

When it comes to games, you can tell the difference between 30 fps and 60 fps. TV/Movies, No, you can't. Video games, yes you can.

Re:Detail (5, Informative)

Nyder (754090) | about a year ago | (#42334345)

Would you rather have double the detail at 30 FPS, or half the detail at 60 FPS? Considering most people can't perceive frame rates faster than 30, it makes a bit of sense to push more polygons instead.

When it comes to games, you can tell the difference between 30 fps and 60 fps. TV/Movies, No, you can't. Video games, yes you can.

I should of mentioned the reason why.

When you shoot video you capture single pictures. When people are moving in these shots, they have motion blur. How much motion blur depends on how fast they are moving and how many shots per second you take. Our eyes see the motion blur and our mind fills in the rest, which is why we are okay with 24 & 30 fps for movies/videos.

With video games, each frame is crisp and doesn't have the motion blur that real-life video would have. Granted, games started adding in motion blur, but it's not the same. This is why more frames per second generally makes games look better and play better.

We did cover this in the Hobbit at 48fps submission.

Re:Detail (2, Insightful)

Anonymous Coward | about a year ago | (#42334677)

I should of mentioned the reason why.

I should have mentioned the reason why.

Just an FYI...

Re:Detail (1)

Anonymous Coward | about a year ago | (#42335003)

You are too nice.

I would of said "Should HAVE, you illiterate son of a bitch"

Re:Detail (0)

Anonymous Coward | about a year ago | (#42335135)

Saying that movies look better at lower FPS because of motion blur is exactly like saying that anti-aliasing improves image quality by blurring the picture: it's sort of right in some twisted way, but it's very misleading (if you don't know why, look up anti-aliasing and contrast it with blurring). Think about what anti-aliasing is and then apply that to the time domain, i.e., suppose you take the average of many frames for each displayed frame. That's the benefit movies have: temporal anti-aliasing. It's far better than just motion blur.
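
As a rough illustration of that idea (a toy sketch, not a real renderer; render_scene() and Framebuffer are invented stand-ins), temporal anti-aliasing amounts to rendering several sub-frames across the frame interval and averaging them, which is what a film camera's open shutter does physically.

    // Sketch only: average sub-frames over the frame interval ("temporal AA"),
    // as opposed to smearing a single instantaneous frame ("just motion blur").
    #include <cstddef>
    #include <vector>

    struct Framebuffer { std::vector<float> rgb; };

    static Framebuffer render_scene(double t) {          // stub for the real renderer
        return Framebuffer{std::vector<float>(3, static_cast<float>(t))};
    }

    static Framebuffer temporal_average(double frame_start, double frame_len, int subframes) {
        Framebuffer accum = render_scene(frame_start);
        for (int i = 1; i < subframes; ++i) {
            double t = frame_start + frame_len * i / subframes;
            Framebuffer sub = render_scene(t);            // sample the scene mid-interval
            for (std::size_t p = 0; p < accum.rgb.size(); ++p)
                accum.rgb[p] += sub.rgb[p];
        }
        for (float& c : accum.rgb)
            c /= static_cast<float>(subframes);           // moving edges smear along their path,
        return accum;                                     // static detail stays sharp
    }

    int main() {
        Framebuffer f = temporal_average(0.0, 1.0 / 24.0, 8);  // one 24 fps frame from 8 sub-frames
        return f.rgb.empty() ? 1 : 0;
    }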

Re:Detail (1)

somersault (912633) | about a year ago | (#42335187)

I could most definitely tell the difference between The Hobbit at 48fps and a normal movie. I didn't actually like the effect much, but I could tell the difference.

When gaming and constantly monitoring my FPS, 30 was playable, 60 was nice.

I remember with Quake 1 experimenting with different resolutions on my 486 with software rendering - 320x240 actually looked very "realistic" to me simply because it was rendering so smoothly. It looked like live action through a low resolution camera. I usually played at 640x480 though just because it's nice to be able to make enemies out at a distance..

Re:Detail (1)

Omestes (471991) | about a year ago | (#42334295)

Considering most people can't perceive frame rates faster than 30..

Can we please stop with this falsity already? In an FPS, you most assuredly can tell the difference between 30 and 60fps. More frames means smoother motion, which means higher accuracy. 30fps also looks a bit "juttery" with fast motion, especially with digital graphics, since there is no recorded motion blur to cover it up. Also, why all the brouhaha over the Hobbit being at 48fps rather than the standard 24, if no one could notice it?

Re:Detail (0)

Anonymous Coward | about a year ago | (#42334679)

Why do people complain about the lack of "warmth" in a CD versus vinyl? Or the purported loss of sound quality for that matter? A lot of that isn't quite perceived, but the eye can see that something isn't right.

I'm a bit shocked that people are bitching about it, but I haven't had a chance to see the movie. The main time that you should see a difference is on panning shots, which is I believe the reason why they moved to 48fps anyways.

Re:Detail (5, Informative)

Anonymous Coward | about a year ago | (#42334335)

... Considering most people can't perceive frame rates faster than 30 ...

This myth needs to die [100fps.com].

Everybody can perceive frame rates faster than 30 fps. In fact, almost everybody can perceive frame rates faster than 100. Check the linked article; this is really a tricky question. Some things to consider:

- Games have no motion blur, or, as many modern games are now doing, they use a pathetic fake imitation that looks worse than no motion blur at all. Hence, they need much higher frame rates to show fluid motion. At 60 fps with current technology (including so-called next-gen), motion will look much better than at 30.

- Decades of cinema have trained most people to perceive low-quality, blurred animation as 'film quality', and smooth, crisp animation as 'fake' or 'TV quality'. Many, many people consider a 48fps Hobbit to be worse than a 24fps one. This is a perception problem. Games could have the same issue, except they've evolved much faster and most people didn't have time to get used to bad quality.

- Consider the resolution problem. Higher resolution requires higher fidelity. At higher resolution, you'll demand higher quality textures and shading to reach similar levels of immersion, since details are now much more apparent. The same thing happens with animation and higher frame rates. This doesn't mean we should stay at low resolutions, 16 colors, black & white, or 30 fps. It just means we need to do better.

- And... a game is not film, and latency matters. A lot. At 30 fps, you wait twice as long to see any feedback from your input. In most games you will just train yourself to input commands in anticipation without knowing a word about latency, but in action games, where your reaction time matters, latency is a problem. And many other sources of latency add to the sum, such as clumsy 'smart' TVs post-processing your images, or badly engineered 'casual' motion Wi-Fi controllers.

In other words, yes, I'd rather have half the detail at 60 FPS -- except if your game is no game at all, and just a 6-to-10-hour movie. Since most of the top videogame chart entries fit that description today, I can see why many developers will remain in the 30 fps camp.
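
A back-of-the-envelope version of the latency point above (the numbers are assumptions, not measurements): with roughly a frame of buffering between sampling input and finishing the image, the wait scales with frame time, so 30 fps roughly doubles it relative to 60 fps before the display's own processing delay is even counted.

    // Rough arithmetic only; the 50 ms TV post-processing figure is an assumption.
    #include <cstdio>

    int main() {
        const double tv_processing_ms = 50.0;                     // assumed "smart" TV overhead
        const double rates[] = {30.0, 60.0};
        for (double fps : rates) {
            double frame_ms = 1000.0 / fps;                       // time to build one frame
            double game_side_ms = 2.0 * frame_ms;                 // sample input + render + wait for scanout
            std::printf("%2.0f fps: %5.1f ms per frame, ~%5.1f ms before the TV, ~%5.1f ms total\n",
                        fps, frame_ms, game_side_ms, game_side_ms + tv_processing_ms);
        }
        return 0;
    }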

Re:Detail (1, Interesting)

Gerzel (240421) | about a year ago | (#42334349)

It also depends on what they are doing with that extra processing power. Are you making a game that is more intuitive? That reacts and learns better? That has AI that is more intelligent and adds to the gameplay?

Really, 30fps is in the range of reasonable quality. You get diminishing returns as you increase fps, especially if the rest of the game doesn't perform to the same standard.

Re:Detail (0)

Anonymous Coward | about a year ago | (#42334435)

Half the detail at 60fps, of course. Actually, I'd rather play fast-paced shooters at 120fps on a 120Hz screen. Although a 144Hz monitor would be even nicer...

Re:Detail (0)

Anonymous Coward | about a year ago | (#42334487)

I have strong feelings against that "most". However, given the majority of these players will be playing on TVs with massive input lag, using a controller, playing a lot of single player games, and having aim correction built in, I doubt most of them will have the experience to know the difference.

Re:Detail (1)

dnaumov (453672) | about a year ago | (#42334995)

Would you rather have double the detail at 30 FPS, or half the detail at 60 FPS? Considering most people can't perceive frame rates faster than 30, it makes a bit of sense to push more polygons instead.

Stop with this misinformation. Most people definitely CAN perceive framerates faster than 30.

http://boallen.com/fps-compare.html [boallen.com]

If you honestly can't tell the difference between 30 and 60 in the above link, you might want to have yourself checked.

Re:Detail (1)

N!k0N (883435) | about a year ago | (#42335485)

Fun link. I couldn't really tell a difference, but I think it's more a failing of:
a. it's early, and the coffee hasn't kicked in
b. the shit monitor on this laptop is only tall enough to display one of the bouncy blocks.

The links on that page to other FPS examples that were closer together/side-by-side were easier to distinguish, to the point where 48 vs. 60 FPS had a slight (but still noticeable) difference. Since "The Hobbit" is such a hot topic of discussion in this thread: I did see it in 3D (48FPS), and found it *much* more enjoyable than 24FPS 3D movies I've seen (e.g. "The Avengers" earlier this year), simply due to the lessening of the motion blur. Really, the only thing that took "getting used to" was that I kept thinking "holy fucking shit, why haven't we done this sooner!?".

Thing is, I think most of the complaining is more hype than anything, and people wanting to "fit in" -- the friend I went with didn't know it was the 48FPS version (though he'd read reviews and kept warning us not to see it) and was praising how great it looked. It was fun as hell when the rest of us told him he'd just sat through the 48FPS version.

Re:Detail (0)

Anonymous Coward | about a year ago | (#42335387)

you CAN perceive... because game frame rates are generated and not captured, they are a totally different beast! (basically, you're comparing apples with oranges...)

big leap (1)

RedHackTea (2779623) | about a year ago | (#42334225)

Why the big jump from 30 to 60? How about you target 35 fps or 40 fps?

The most unnerving part about the article:
"...30 frames per second could also mean many displays of future console games will also come in at a resolution of 720p."

I predict the next posts to be about FPS standing for Fraps Per Second.

Fixed Refresh Rates (4, Insightful)

bazald (886779) | about a year ago | (#42334273)

A display (television or monitor) has a fixed refresh rate. Assuming vertical synchronization is turned on to avoid tearing, you're pretty much limited to a framerate which evenly divides into the true refresh rate of the display. If the refresh rate is 60 Hz, possible targets include 60 frames per second (providing 16.7 ms of computation time per frame), 30 FPS (providing 33.3 ms of computation time per frame), 15 FPS (providing 66.7 ms of computation time per frame), and so on. Anything below 30 FPS is kind of a joke, so nobody reputable would consider allowing more than 33 ms of computation per frame in a shipping game.
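
The arithmetic above in a few lines, assuming nothing beyond a 60 Hz refresh: with vsync, the steady frame rates are the refresh rate divided by a whole number, and the per-frame budget is that many refresh intervals.

    // With vsync on a fixed 60 Hz display, the only steady targets are 60/1, 60/2, 60/3, ...
    #include <cstdio>

    int main() {
        const double refresh_hz = 60.0;
        for (int divisor = 1; divisor <= 4; ++divisor) {
            double fps = refresh_hz / divisor;
            double budget_ms = divisor * 1000.0 / refresh_hz;     // refresh intervals per frame
            std::printf("%4.1f fps -> %5.1f ms of computation per frame\n", fps, budget_ms);
        }
        return 0;
    }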

Re:Fixed Refresh Rates (2)

stanjo74 (922718) | about a year ago | (#42334329)

Unless you use a technique called "triple buffering", in which case you can have a tear-free, variable frame rate at any rate. Unfortunately, none of the major 3D APIs have provisions for this. I've always wondered why such a fundamental omission was made in a graphics rendering API.

Re:Fixed Refresh Rates (0)

Anonymous Coward | about a year ago | (#42334453)

What are you talking about? You need support from the game, of course, but you can implement triple buffering in both OpenGL and Direct3D fairly easily.

Re:Fixed Refresh Rates (1)

stanjo74 (922718) | about a year ago | (#42334475)

Have you actually done vsynced triple buffering with OpenGL or DirectX or it just seems to you it ought to be doable?

Re:Fixed Refresh Rates (0)

Anonymous Coward | about a year ago | (#42334559)

Lots of games have a triple-buffering option, even some I own from the 20th century. Nvidia, at least, also has an option to force it on in the control panel.

Re:Fixed Refresh Rates (2)

stanjo74 (922718) | about a year ago | (#42334669)

http://www.anandtech.com/show/2794 [anandtech.com] "So, this article is as much for gamers as it is for developers. If you are implementing render ahead (aka a flip queue), please don't call it "triple buffering," as that should be reserved for the technique we've described here in order to cut down on the confusion. There are games out there that list triple buffering as an option when the technique used is actually a short render queue. We do realize that this can cause confusion, and we very much hope that this article and discussion help to alleviate this problem."

Re:Fixed Refresh Rates (0)

Anonymous Coward | about a year ago | (#42334735)

OK, but the only time you'd need to drop old frames is when you're already rendering much faster than the display can handle. So while the claim may be true, it's not particularly useful; anyone running a game fast enough to sustain 60fps with vsync on can just disable triple-buffering.

Re:Fixed Refresh Rates (1)

stanjo74 (922718) | about a year ago | (#42334841)

Triple-buffering with vsync lets you decouple the frame rate of the renderer from that of the presenter.

Say your rendering is slightly slower than 60 fps, e.g. 58 fps. With double-buffering and vsync you have to present at 30 fps. With proper triple-buffering and vsync you can present at 58 fps.

Most games don't care about vsync and will present at the rate of the renderer, causing mid-frame tearing. If you're lucky, the tear will occur at the top or bottom of the frame and won't be too bad.

Triple-buffering lets you present at the rate of the renderer or the presenter, whichever is lower, AND with vsync, without tearing.
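
A toy simulation of that 58 fps example (assumed numbers, no real swap-chain API): with double buffering the renderer stalls until a vblank frees a buffer, so a frame that takes slightly longer than 16.7 ms ends up presenting only every other refresh, while with proper triple buffering the renderer free-runs and each vblank shows the newest completed frame.

    // Sketch only: counts how many *new* frames reach the screen in one second at 60 Hz
    // when the renderer needs ~17.2 ms (58 fps) per frame.
    #include <cstdio>

    int main() {
        const double vblank_ms = 1000.0 / 60.0;   // display refresh interval
        const double render_ms = 1000.0 / 58.0;   // renderer slightly slower than 60 fps

        int shown_double = 0, shown_triple = 0;
        double next_ready_double = render_ms;     // double buffering: the next render can only
                                                  // start after a vblank frees the back buffer
        int last_done_triple = 0;                 // triple buffering: renderer never blocks

        for (int v = 1; v <= 60; ++v) {           // one second of vblanks
            double t = v * vblank_ms;

            if (t >= next_ready_double) {         // frame ready in time? present it, start the next
                ++shown_double;
                next_ready_double = t + render_ms;
            }                                     // otherwise the previous frame repeats

            int newest = static_cast<int>(t / render_ms);   // frames finished by this vblank
            if (newest > last_done_triple) {      // present the newest one, drop anything older
                ++shown_triple;
                last_done_triple = newest;
            }
        }
        std::printf("double buffering + vsync: ~%d new frames/s\n", shown_double);
        std::printf("triple buffering + vsync: ~%d new frames/s\n", shown_triple);
        return 0;
    }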

Re:Fixed Refresh Rates (1)

Ottibus (753944) | about a year ago | (#42335175)

But the vsync rate is still 60Hz, so if you don't display 60 different frames each second you are not going to get smooth motion. If you generate 58 fps with even spacing you will get a noticeable 2Hz alias signal, because every 30 frames you will get a duplicate rather than a new frame. In practice there will probably be enough jitter in the 58 Hz signal that the alias becomes noise and is less noticeable.

Re:Fixed Refresh Rates (5, Interesting)

Brulath (2765381) | about a year ago | (#42334575)

TechReport analysed the nVidia 680 a bit after its release and had a piece on adaptive vsync [techreport.com] which should answer your question.

Quoted from an nVidia software engineer:

There are two definitions for triple buffering. One applies to OGL and the other to DX. Adaptive v-sync provides benefits in terms of power savings and smoothness relative to both.

- Triple buffering solutions require more frame-buffer memory than double buffering, which can be a problem at high resolutions.

- Triple buffering is an application choice (no driver override in DX) and is not frequently supported.

- OGL triple buffering: The GPU renders frames as fast as it can (equivalent to v-sync off) and the most recently completed frame is displayed at the next v-sync. This means you get tear-free rendering, but entire frames are effectively dropped (never displayed), so smoothness is severely compromised and the effective time interval between successive displayed frames can vary by a factor of two. Measuring fps in this case will return the v-sync-off frame rate, which is meaningless when some frames are not displayed (can you be sure they were actually rendered?). To summarize: this implementation combines high power consumption and uneven motion sampling for a poor user experience.

- DX triple buffering is the same as double buffering but with three back buffers which allows the GPU to render two frames before stalling for display to complete scanout of the oldest frame. The resulting behavior is the same as adaptive vsync (or regular double-buffered v-sync=on) for frame rates above 60Hz, so power and smoothness are ok. It's a different story when the frame rate drops below 60 though. Below 60Hz this solution will run faster than 30Hz (i.e. better than regular double buffered v-sync=on) because successive frames will display after either 1 or 2 v-blank intervals. This results in better average frame rates, but the samples are uneven and smoothness is compromised.

- Adaptive vsync is smooth below 60Hz (even samples) and uses less power above 60Hz.

- Triple buffering adds 50% more latency to the rendering pipeline. This is particularly problematic below 60fps. Adaptive vsync adds no latency.

So triple buffering is bad because it could cause an intermediary frame to be dropped, resulting in a small visual stutter despite being 60fps. There's a video of adaptive vsync on YouTube [youtu.be].

Re:Fixed Refresh Rates (1)

Xest (935314) | about a year ago | (#42334829)

But is that mindset still even relevant now?

The days of monitors/TVs having refresh rates that were multiples of 15, or 30 or whatever seem to have gone out the door with CRT technology.

Re:Fixed Refresh Rates (0)

Anonymous Coward | about a year ago | (#42335171)

Other way around. CRT technology was very flexible, allowing you to pick 85Hz, 80Hz, 75Hz, even 72Hz or whatever you liked as long as your monitor could go that high. With CRT the only major reasons multiples were desirable (that I can think of) were that a lower FPS device could display a higher FPS sequence simply by skipping every second frame, and a higher FPS device could display a lower FPS sequence of frames evenly.

Re:big leap (1)

Nyder (754090) | about a year ago | (#42334313)

Why the big jump from 30 to 60? How about you target 35 fps or 40 fps? ....

LCD monitors and TVs tend to update the screen at 60 frames per second, that is, 60Hz. While 30 is okay because it goes evenly into 60, 60 is optimal because the framerate and the screen refresh (update) happen at the same time.

lack of proper triple buffering (4, Informative)

stanjo74 (922718) | about a year ago | (#42334243)

Neither DirectX nor OpenGL supports proper triple buffering to avoid tearing at variable frame rates. Because of that, if you want tear-free rendering but cannot keep up at 60 fps all the time, you must render at 30 fps or 15 fps, not at, say, 48 or 56 fps. You can render at any variable frame rate if you allow tearing (which most games do, avoiding the headache of v-sync altogether).

Oh Carmack... (0, Troll)

readnotpost (2785015) | about a year ago | (#42334249)

Sounds like Carmack is still more interested in targeting woefully outclassed console hardware instead of carving out an impressive PC-level engine for a change, particularly since Steam has shown there's still significant life and popularity (and money to be made) on PCs. But no, he prefers to target the lowest common denominator of hardware. Not quite the philosophy of the id Software I grew up with.

Re:Oh Carmack... (0)

Anonymous Coward | about a year ago | (#42334355)

Id hasn't made a good game in over a decade and they're totally outclassed on the technical end.

Re:Oh Carmack... (1)

bemymonkey (1244086) | about a year ago | (#42334731)

Now why would you want to keep that upgrade treadmill running? I for one quite enjoy the fact that I can play many of the latest games on a $100 video card and can focus on efficiency (just bought a Radeon 7750, which doesn't even need an additional power connector) instead of brute force... And the games look great. Does Battlefield 3 (the first PC game I've played that nearly *requires* a quad-core to run well) really look better than, say, Call of Duty MW3? MW3 feels like it needs about half the processing power that BF3 does, but visually, the difference is pretty minimal.

Am I missing something big here? Hell, I must be... I'm still gaming on a Core 2 Quad :p

pthf! (0)

Anonymous Coward | about a year ago | (#42334253)

60 fps is a waste of space and bandwidth.

Next Gen? (1, Insightful)

mjwx (966435) | about a year ago | (#42334279)

So next year's consoles are going to be inferior to last year's PCs? Personally, I think that between PC and mobile, the console is doomed. This will never happen with iDevices, but Android tablets already support HDMI out and input from Bluetooth controllers. All we need is for them to get a bit more powerful (Nvidia is advertising a 6 fold power increase between Tegra 2 and Tegra 3) and a method of transferring large games (SD card), and they will become plug-in replacements for consoles.

As for real cutting-edge games, these never left the PC because it was the only platform that could be counted on to increase in processing power.

Consoles were never about power; they were about the money. Carmack should know that. Casual games are now the big earners. This does not mean that traditional cutting-edge games have no place; they're earning better than ever, but it's still chicken feed compared to casual.

Console developers will follow the money, which is on mobile, and cutting-edge developers will concentrate on PC. Traditional consoles are left with first-party developers, which won't cut it. Even Nintendo, with its Mario and Zelda cash cows, would struggle if it doesn't adapt (i.e. release a tablet/console hybrid; they're halfway there with the Wii U).

Re:Next Gen? (0)

Anonymous Coward | about a year ago | (#42334433)

Actually, this has already happened. Mobile is eating away at the handheld console market. The PS Vita (built mostly from mobile parts) is essentially dead in the water. The 3DS, a more innovative product which tries hard not to be an iPod with a pad, fares better, but is still a failure compared to Nintendo's expectations or the previous generation.

Android (tablets especially) is an abysmal market for games. Piracy rates are close to 100%, and people have been trained to pay little to nothing for games and to value them as such. Considering there are now many more units out there compared to iDevices, app revenue is much worse [flurry.com]. And iOS is a bloody arena. Everybody wants to hear Angry Birds stories, but 95% of the games in the mobile app stores are not breaking even (and we're not talking about million-dollar investments, but hobby affairs and two-person startups working for a few months).

Even then, no, the money is not in mobile. Rovio making $100 million a year sounds great to John Doe, but that is peanuts. You need more than that just to develop a top console game, and five to ten times that number to market it. A single top franchise such as Call of Duty generates more revenue than the entire mobile game industry combined. Of course, this can change. Just as the handheld consoles are going the way of the dodo, tablets, Android TVs or OUYAs could eat away at the traditional console market. You may think this would make Call of Duty cost $3.99 and you'd get thousands of great games a month with Linux dominating your living room, but the actual reality would be more along the lines of millions of crap games worth zero and another '82 videogame crisis.

Re:Next Gen? (0)

Anonymous Coward | about a year ago | (#42334519)

The Call of Duty franchise made $500 million on the first day of its last 2 iterations. If casual games are the big earners as you say, could you tell me which casual games make that kind of money?

Sure, casual games sell more copies, but at $1 a throw, they are far from overtaking console titles. Console games earn chicken feed? Really?

Re:Next Gen? (0)

Anonymous Coward | about a year ago | (#42334807)

The Call of Duty franchise made $500 million on the first day of its last 2 iterations. If casual games are the big earners as you say, could you tell me which casual games make that kind of money?

Call of Duty.

Re:Next Gen? (0)

feepness (543479) | about a year ago | (#42334551)

With PC sales plunging 20% and Windows 8 considered a failure, I don't think PC is where the money is headed. Valve itself is making a set-top box because they see the writing on the wall.

Re:Next Gen? (0)

Anonymous Coward | about a year ago | (#42334723)

Console developers will follow the money

They already have; that's why they develop on consoles.

Re:Next Gen? (1)

Tagged_84 (1144281) | about a year ago | (#42334887)

You do realize that Nvidia can offer a 6 fold increase only because Tegra isn't anywhere near up to par with comparable hardware on the market? But I too foresee the death of consoles, down to a niche like the custom-built desktop.

Re:Next Gen? (0)

Anonymous Coward | about a year ago | (#42335123)

"Nvidia is advertising a 6 fold power increase between Tegra 2 and Tegra 3"

Nvidia is always selling smoke with its Tegra. The reality is another thing entirely.

Much ado about a single tweet (3, Insightful)

dirtyhippie (259852) | about a year ago | (#42334483)

Good lord, this entire article is based on one tweet - 107 characters. Surely we could have waited for Carmack to say something more detailed than this??

Re:Much ado about a single tweet (1)

Crypto Gnome (651401) | about a year ago | (#42335255)

In the beginning was The Word.

Now some would say Grease Is The Word, others claim The Bird is The Word.

Re:Much ado about a single tweet (1)

Crypto Gnome (651401) | about a year ago | (#42335269)

Also: for those with excessively large Slashdot IDs:

Oh! (OH!)
Yo! Pretty ladies around the world,
Got a weird thing to show you, so tell all the boys and girls.
Tell your brother, your sister, and your ma-mma too,
'Cause we're about to throw down and you'll know just what to do.
Wave your hands in the air like you don't care.
Glide by the people as they start to look and stare.
Do your dance, do your dance, do your dance quick,
Ma-mma, c'mon baby, tell me what's the word?

Ah word up!

Everybody say,
When you hear the call,
You've got to get it underway.

Word up!


It's the code word,
No matter where you say it, you'll know that you'll be heard.

WordUP [youtube.com]!

Re:Much ado about a single tweet (1)

Crypto Gnome (651401) | about a year ago | (#42335257)

Also:

Good lord, this entire article is based on one tweet - 107 characters. Surely we could have waited for Carmack to say something more detailed than this??

THIS!

Same as PC. But you can still go for 60. (3, Insightful)

Sarusa (104047) | about a year ago | (#42334583)

It's a given that most will target 30fps, since more shinies look better in screenshots and YouTube videos than 60fps does. And most consumers can't tell the difference until you put a 60 and a 30 fps version side by side and let them play.

The leaked/rumored PS4/XNext specs show them as equivalent to or slightly weaker than current mid-high gaming PCs, and those can't do a locked 60 fps on all the recent shiny games at 1920x1080 with all effects on (except those like CoD MP that specifically target it), so it's unlikely the consoles will. Cheap components are the driver, especially for the PS4.

But there's no reason a fighting game or FPS can't aim for 60fps on the new generation if it wants to. Use your shaders and effects wisely and there's no problem.

Re:Same as PC. But you can still go for 60. (0)

Anonymous Coward | about a year ago | (#42335163)

Once hardware is put into a console it automatically becomes much more powerful because games will target the exact configuration of hardware in the console. On PCs you have to work on all hardware - on consoles you only have to work on that exact configuration.

News Flash! (1, Insightful)

Moof123 (1292134) | about a year ago | (#42335085)

Gameplay is still more important than FPS; see: RAGE.

A good game with low FPS is tragic, but a lame game at even the highest FPS still just sucks.
