
The Quest For Frames Per Second In Games

simoniker posted more than 10 years ago | from the fps-issues-for-fps-titles dept.

First Person Shooters (Games) 72

VL writes "Ever wondered why it is exactly everyone keeps striving for more frames per second from games? Is it simply for bragging rights, or is there more to it than that? After all, we watch TV at 30fps, and that's plenty." This editorial at ViperLair.com discusses motion blur, the frame-rate difference between movies and videogames, and why "...the human eye is a marvelous, complex and very clever thing indeed, but... needs a little help now and then."


72 comments


it plays better (3, Insightful)

Song for the Deaf (608030) | more than 10 years ago | (#6891379)

With a higher framerate, a game just feels and plays better, it's as simple as that. 30 fps is just *not enough* to have good action and feel on most pc first person shooters.

Re:it plays better (2, Interesting)

ctr2sprt (574731) | more than 10 years ago | (#6891399)

If you get a constant, true 30fps and the game action isn't tied to that framerate (rounding errors), then that would be okay. Of course, that's like physicists talking about frictionless surfaces or perfectly spherical objects, and about as attainable.

Constant Frame Rate (1)

Detritus (11846) | more than 10 years ago | (#6891442)

You could design a game to always preserve a constant frame rate. The program would synchronize itself to vertical retrace or another periodic signal. It would have to be able to build the next frame buffer in a bounded amount of time. This might require it to drop detail or features when a lot of things were going on.
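To make the idea concrete, here is a minimal sketch in Python of the kind of frame-budget loop being described (render_scene and the detail numbers are hypothetical stand-ins, not any real engine's code): the loop measures how long the last frame took, sheds or restores detail to stay inside a fixed budget, and sleeps out the remainder as a crude stand-in for waiting on vertical retrace.

    import time

    FRAME_BUDGET = 1.0 / 30.0  # target: a constant 30 fps

    def render_scene(detail_level):
        """Stand-in for real rendering work; cost grows with detail."""
        time.sleep(0.004 * detail_level)

    def run(frames=100):
        detail = 10  # arbitrary detail scale, 1..10
        for _ in range(frames):
            start = time.perf_counter()
            render_scene(detail)
            elapsed = time.perf_counter() - start

            # Adapt detail so the next frame fits the budget.
            if elapsed > FRAME_BUDGET and detail > 1:
                detail -= 1          # shed detail when the budget is blown
            elif elapsed < 0.5 * FRAME_BUDGET and detail < 10:
                detail += 1          # restore detail when there is headroom

            # Sleep out the rest of the frame (a stand-in for vsync).
            remaining = FRAME_BUDGET - (time.perf_counter() - start)
            if remaining > 0:
                time.sleep(remaining)

    if __name__ == "__main__":
        run()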

uh (1, Informative)

Anonymous Coward | more than 10 years ago | (#6891532)

The problem with that is that if you're in a detailed indoor environment and then suddenly step outside, the game will look like ass because it will be constantly dropping or adding detail.

Re:it plays better (2, Insightful)

trompete (651953) | more than 10 years ago | (#6891516)

My largest problem isn't the graphics card's frame rate ability. It is that damned speed of light that is keeping me from getting a low ping when I play on European servers. Seriously... you can play most games with an average machine, but your frame rate is really limited by the propagation delay and all the hops between you and the server. Get me a lower ping, and I'll be one happy guy!

Re:it plays better (1)

Sigma 7 (266129) | more than 10 years ago | (#6934729)

30 fps is just *not enough* to have good action and feel on most pc first person shooters.


I've played Project IGI (the first one), which had smooth gameplay compared to some of the more modern FPS. It wasn't perfectly smooth, but it was hard to detect any significant jumps between frames.

I was surprised that it ran at 30FPS - constantly. There wasn't even a loading delay between indoor and outdoor areas. It was even smoother than D1X running at 30FPS.

Must be shoddy work (-1, Offtopic)

Anonymous Coward | more than 10 years ago | (#6891397)

To properly frame a great painting it takes hours of painstaking work. Paintings framed at those rates probably have hideous gaps in the corner joints. They could not have had the proper time to select colors and textures for matting and framing materials that would properly enhance the beauty of the artwork. They must be framing pictures in an Asian sweatshop with thousands of overworked and underpaid children with bleeding fingers.

Think of the children!

What happened to editors? (-1, Redundant)

ctr2sprt (574731) | more than 10 years ago | (#6891405)

3 grammatical errors (albeit identical ones) in the first three sentences of the article. Do the Slashdot editors work at that site too?

Ugh (4, Insightful)

Elwood P Dowd (16933) | more than 10 years ago | (#6891415)

That article reminds me of the TV ads with scientists explaining how our patented hydro-oxytane reaches deep into your pores and assassinates uglificating bacteria.

Author seems to understand about as much about the primate visual system as... well... anyone else that's never studied it. The visual cortex doesn't "add blur."

His general point is probably correct, but his reasoning is fucked.

Re:Ugh (0)

u-238 (515248) | more than 10 years ago | (#6892011)

yeah, the guy seems to have some serious problems with basic grammar as well...

very first fucking sentance

"Anyone who even remotely follows PC hardware websites/magazines will have noticed that a lot of the high end items for computers are tested for there ability to improve frame rates in games."

oh but it doesn't end there

"Picture it; Your playing your favourite First Person Shooter game and about to go for that head shot. Your running at 30 fps..."

Re:Ugh (0)

Anonymous Coward | more than 10 years ago | (#6892304)

When correcting others' grammar and spelling, check yours first.

Sentences start with a capital letter, and end with a full stop (OK, period to you Americans). The word "sentence" has no letter "a" in it.

Re:Ugh (1)

Daetrin (576516) | more than 10 years ago | (#6896536)

There's a big difference between a professional article and what people post on slashdot. I make a lot of spelling mistakes and gramemr mistakes when posting to message boards and writing email. (Although i usually manage to avoid the there/their and your/you're traps, though ocasionally i get caught by its/it's)

However when writing an article or paper that you expect to be published in some format you are expected to take a little more care. If someone were to post an article critisizing someone for spelling/grammer then i would consider it hypocritical for them to have mistakes themselves. However making mistakes in a message board post while pointing out that a professional article has mistakes in it isn't hypocritical, it's just acknowledging the different standards to which the two are supposed to be held.

Re:Ugh (1)

PainKilleR-CE (597083) | more than 10 years ago | (#6898405)

The primary thing he missed (although he almost got it) was that the biggest factor is the difference between your highest and lowest framerate in a given time frame. If you're running fat & happy at 100 fps and then (as someone mentioned earlier) walk outside, so to speak, everything slows to a crawl as it loads textures and tries to render an image much more complex (or at least with a much larger visible range). Until the card catches up you could be running below 30 fps, or you could be running 45 fps and it just feels like a crawl because you were running over twice as many fps before that. Eventually your eyes will adjust to the lower framerate, as long as it's tolerable, and the card will be doing fewer calculations (as long as the game's coded well), so you'll have a fairly constant, and just as tolerable, 50-60 fps from that point on and not really have a problem with it.

In the end, each person's ability to perceive framerates is different. Some people can tolerate low refresh rates on their monitors, as well. The most important things are to keep v-synch on (to prevent visual artifacts caused by frames being split on the v-synch) and to cap your framerate at a slightly-above tolerable level. Ideally, your framerate should never cut in half, but then in online games that's almost impossible to guarantee, so you should shoot for a range that is tolerable for you. Try it out for yourself, though, cap your framerate at something lower than you would normally think of as acceptable and see if you have any problems with it while playing. Check your framerates at times when you do have problems with the way the game is playing. I normally cap my framerates around my refresh rate (rounded up generally to give it some space), unless I'm playing id games, which generally seem more susceptible to having the gameplay tied to the framerate (and in these cases I still try to limit it to prevent severe drops in framerate, I'd much rather be able to play comfortably the entire time than to make that one rocket jump that requires 200 fps).
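To illustrate the capping idea itself (a rough sketch, not any particular game's or driver's implementation), a render loop can simply sleep off the rest of each frame interval so the rate never exceeds a chosen cap:

    import time

    def run_capped(render_frame, cap_fps=60, frames=300):
        """Render at most cap_fps frames per second by sleeping off the excess."""
        frame_time = 1.0 / cap_fps
        for _ in range(frames):
            start = time.perf_counter()
            render_frame()
            leftover = frame_time - (time.perf_counter() - start)
            if leftover > 0:
                time.sleep(leftover)  # idle instead of racing far ahead of the cap

    # Example: cap a trivial do-nothing "renderer" at 60 fps.
    if __name__ == "__main__":
        run_capped(lambda: None, cap_fps=60, frames=60)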

Motion Pictures (2, Insightful)

Detritus (11846) | more than 10 years ago | (#6891416)

Movie projectors cheat by displaying every frame twice, which doubles the frame rate from 24 fps to 48 fps. Cinematographers also avoid certain shots, like rapidly panning from left to right, which look terrible on a movie screen.

Re:Motion Pictures (2, Informative)

Murdock037 (469526) | more than 10 years ago | (#6891565)

Movie projectors cheat by displaying every frame twice, which doubles the frame rate from 24 fps to 48 fps.

Wrong. They show 24 fps. (There's also a bit of black in between each frame, otherwise the eye would register a blur; but it's still 24fps.)

If the projector was run at normal speed and showed each frame twice, it would look like choppy slow motion. If it was run faster at 48 fps, the motion would be fast, like how you often see old silent pictures.

You would need a print with every frame printed twice in a row for it to work, and then a faster projector than is safe for most film.

There are certain camera systems under development which would shoot film at 48 fps, and you'd then need a projector that could show the film at 48fps, but the standard rate for cameras and projector for the last fifty years, everything you've ever seen in a cinema, has been 24fps.

Cinematographers also avoid certain shots, like rapidly panning from left to right, which look terrible on a movie screen.

It's called a swish pan, and it makes for a nice transition, if you cut in between two of them. But you don't have to, and it doesn't look "terrible."

Whoever modded you up is embarrassingly ignorant of the topic at hand.

sorry murdock, but you are wrong. (2, Informative)

LittleBigLui (304739) | more than 10 years ago | (#6891664)

Nowadays, movies are INDEED filmed with 24 frames per second and therefore are also projected with 24 frames per second. But the frames are shown multiple times (i think two is standard, but i've heard about three, too).

And no, the film doesn't have to have the frames on it multiple times. The transport mechanism in a projector works like this: light off, move the film forward to the next frame, stop, light on, light off, move forward, stop, light on, .....

Now, instead of having ONE phase of light during a frame, modern projectors have TWO or THREE of them:

light off, move forward, stop, light on, light off, light on, light off, move forward ....

at least thats what i learned in school ;)

No, I'm not. (1)

Murdock037 (469526) | more than 10 years ago | (#6891836)

The bulb in the projector doesn't turn on and off continuously.

The film is pulled frame by frame in front of the lens, and you may get the impression of flicker, but that's only because of a misaligned shutter that's in front of the bulb-- it lets light through when the frame is aligned, and blocks the light as the next frame is being pulled down. This happens 24 times per second.

You may want to consult this article [howstuffworks.com] at How Stuff Works [howstuffworks.com] , specifically the fourth page [howstuffworks.com] , which deals with bulbs, shutters, etc.

What school was that?

Re:No, I'm not. (1)

LittleBigLui (304739) | more than 10 years ago | (#6892091)

The bulb in the projector doesn't turn on and off continuously.


of course not. my bad i didn't mention the shutter, but the effect is the same (light on and off).

The School was the Polytechnical University for Media Technology and Design in Hagenberg, Upper Austria, and while i think howstuffworks is a great resource, i'm sure what i learned there is correct.

if you check the first few paragraphs of this [grand-illusions.com] , you'll see that the concept isn't new either:


For much of the 'silent' period, films were shot at roughly 16 frames per second and shown on a projector with a three-bladed shutter. Each individual frame was shown three times, so around 48 screen images were projected every second, (close enough to fifty to give a reasonably flicker-free result).


Re:No, I'm not. (1)

spitzak (4019) | more than 10 years ago | (#6893113)

On those projectors pictured, the shutter rotates more than once per frame. Projectors I'm familiar with have more than one blade and rotate once per frame.

In any case there are usually 48 blinks of light per second (sometimes 72, but I believe that may only be on very old projectors designed to project silent films at 18 fps). The trick is that the same frame is shown in more than one blink.

A light blinking 24 times a second is quite obvious, while 48 times per second is approaching the limits of what people can see. This ability to detect blinking is much higher than the ability to detect non-smooth motion, which is why they can get away with fewer distinct pictures.

In fact most hand-drawn animation is done on "twos" and thus only 12fps. Still projected with 48 blinks of the light.
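A tiny back-of-the-envelope calculation in Python (standard, assumed values, just to make the arithmetic concrete) showing how the flash rate and the number of distinct images per second come apart:

    # Distinct images vs. flashes of light per second for a film projector.
    film_fps = 24          # frames pulled down per second
    blades = 2             # two-bladed shutter: each frame is flashed twice

    flashes_per_second = film_fps * blades     # 48 blinks of light
    print(flashes_per_second)                  # -> 48

    # Hand-drawn animation "on twos": each drawing is held for two film frames.
    drawings_per_second = film_fps // 2        # 12 distinct images
    print(drawings_per_second)                 # -> 12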

Re:No, I'm not. (0)

Anonymous Coward | more than 10 years ago | (#6894723)

I am a projectionist. A motion picture projector that shows 70mm, 35mm, 16mm, or 8mm (don't know about IMAX, no experience there) does show each frame two or three times - the standard shutter looks something like a circle missing two diagonally opposite quadrants. The pulldown mechanism operates once per revolution of the shutter. So, LittleBigLui is quite correct in his assertion. If you doubt it, might I suggest calling one of these projector manufacturers and checking with them - Christie, Strong, Century, Simplex, Kinoton.

Swish Pan (1)

Detritus (11846) | more than 10 years ago | (#6892152)

A swish pan may be a recognized effect but it wasn't what I was referring to. I was thinking of a shot of a landscape, where everything is in sharp focus, combined with a pan that is slow enough that instead of blurring, the landscape jerks across the screen, klunk, klunk, klunk, at 24 painfully obvious frames per second.

The projector does open the shutter twice for each frame to reduce the sensation of flicker.

Re:Motion Pictures (1)

Zathrus (232140) | more than 10 years ago | (#6899120)

It's called a swish pan, and it makes for a nice transition, if you cut in between two of them. But you don't have to, and it doesn't look "terrible."

Do a swish pan across a row of vertical lines (like a fence or vertical blinds) and it will, indeed, look horrible. The 24 fps isn't adequate to the job.

It gets even worse on NTSC/PAL though, since the interlaced nature of the picture starts breaking things up horribly.

For true pain, take a movie that does a horrible swish pan like that and then transfer it to NTSC.

They must like to part with their money. (0, Offtopic)

saden1 (581102) | more than 10 years ago | (#6891429)

I think some people take this FPS thing way too seriously. 50fps without hiccups is all I really need.

Speaking of graphics cards, I just purchased Madden 2004 and right after I bring it home I come to find out that it won't work with my GeForce 2 MX. I was planning on getting a card when HL2 comes out, but it looks like I'm going to have to buy a card this weekend or next week. A GeForce FX 5200 is all I really need and that is all I'm going to get. It should last me for at least 2 years.

Re:They must like to part with their money. (1)

Moonshadow (84117) | more than 10 years ago | (#6891927)

I wouldn't recommend getting a FX5200, myself - if you're going to go FX, get a 5600 Ultra. You'll pay a little more (I've seen them for $160) but the performance increase you'll see is definitely worth it. The GeforceFX 5200 performs about on par with a GF3 Ti200 (which I've got, great card, but getting old), but it has PS2.0 and such. The 5600 is really the "midlevel" card of the FX series.

Re:They must like to part with their money. (1)

@madeus (24818) | more than 10 years ago | (#6896215)

I'd add to this that if you're thinking of a GeForce FX 5600 Ultra, you'd actually be better off with a Radeon 9600 IMO.

No (4, Interesting)

RzUpAnmsCwrds (262647) | more than 10 years ago | (#6891439)

1: 30 frames per second is simply not enough. It's fine for movies and TV, but that is only because TV shows and movies are designed around the limits of the medium. Ever notice how TV shows and movies don't have a lot of quick, jerky movements? Those movements lead to motion sickness on TV and in movies, and they are exactly the movements that 3D games are full of. 30fps makes me sick; I can tolerate 60fps.

2: Remember, FPS is the *average* framerate. It may dip well below that mark. My goal is not to have the most FPS but to have a reasonably high resolution with FSAA and AF on, all the detail settings to full, and to never have the game dip below my monitor's refresh rate (75Hz).

Re:No (1)

danila (69889) | more than 10 years ago | (#6951621)

Sure, action movies have no quick and jerky movements whatsoever, and the only reason the Wachowski brothers introduced the slo-mo effects was to work around the limitations of the human eye.

The truth is that TV programs and movies are filmed WITH motion blur. That means that every frame (for films) is made during 1/24th of a second. It's actually a superposition of all the trillions of images that were projected onto the camera during that time. Our eye gets all the light that was destined for it; the only thing we lose is a bit of temporal resolution - we don't know what happened at the beginning of the frame and what at the end - it's all blurred. In computer games every frame is in fact a moment in time, not a superposition of moments. That's why 24 fps is not enough for a game (although many people can play at such fps, some can't, while 24fps in movies is generally ok for everyone). Additional motion blur would help, but the trick is that calculating motion blur for 3D games is almost as computationally intensive as just upping the fps, since you need to calculate all the frames anyway in order to superimpose them.
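One crude way to picture that superposition point is the sketch below (Python with NumPy; render(t) is a hypothetical function returning an image for a single instant): approximating a film-style frame means rendering and averaging several sub-frames across the shutter interval, which is exactly why it costs about as much as simply raising the frame rate.

    import numpy as np

    def motion_blurred_frame(render, t0, shutter=1.0 / 24.0, samples=8):
        """Approximate a film-style frame by averaging `samples` instants
        rendered across the shutter interval [t0, t0 + shutter). Each sample
        costs a full render, so blur is roughly as expensive as more fps."""
        acc = None
        for i in range(samples):
            t = t0 + shutter * (i / samples)
            frame = render(t).astype(np.float64)
            acc = frame if acc is None else acc + frame
        return acc / samples

    # Toy renderer: a bright dot sweeping across a black image.
    def toy_render(t, size=64):
        img = np.zeros((size, size, 3))
        img[size // 2, int(t * 600) % size] = 1.0   # ~600 px/s motion
        return img

    blurred = motion_blurred_frame(toy_render, t0=0.0)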

Consistency also matters (2, Interesting)

Anonymous Coward | more than 10 years ago | (#6891445)

If I can get a *SOLID* 30fps, I'd prefer that to a framerate that peaks at 60 and swoops down to 15 in places. I also can't stand it when vsync is turned off in games - tearing is horrible. A nice compromise is to keep VSync on when the framerate is high, turn it off if it drops below, say, 30fps.

I'm still waiting for the day when machines are good enough and code works well enough for games to be considered "real-time" (meaning having fixed steps at, say, 60Hz - and the game is NOT ALLOWED to take longer than that 1/60th sec to render a frame).

- Disgruntled Planetside player who wishes that game always ran at 60Hz. :(
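The "fixed steps at 60Hz" idea is usually implemented as a fixed-timestep loop that decouples simulation from rendering. A minimal sketch in Python (illustrative only, obviously not Planetside's code):

    import time

    DT = 1.0 / 60.0  # fixed simulation step

    def run(update, render, duration=5.0):
        """Advance the simulation in fixed DT steps regardless of how fast
        or slow rendering is, so game logic never depends on frame rate."""
        accumulator = 0.0
        previous = time.perf_counter()
        end = previous + duration
        while time.perf_counter() < end:
            now = time.perf_counter()
            accumulator += now - previous
            previous = now

            while accumulator >= DT:   # catch up in whole fixed steps
                update(DT)
                accumulator -= DT

            render()                   # draw as often as time allows

    if __name__ == "__main__":
        state = {"t": 0.0}
        run(lambda dt: state.update(t=state["t"] + dt), lambda: None, duration=0.5)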

movies are too slow (1)

Tuxinatorium (463682) | more than 10 years ago | (#6891467)

The well trained FPS player can practically see the individual frames in a standard 24fps movie. It's just too slow.

Re:movies are too slow (1)

CheeseCow (576966) | more than 10 years ago | (#6891847)

That's why I play them all at 2x speed, makes them twice as fun too.

Re:movies are too slow (1)

pixel_bc (265009) | more than 10 years ago | (#6895194)

> The well trained FPS player

Good god.

You make it sound like there's an FPS101 at Gamer College that's manditory. :)

Re:movies are too slow (0)

Anonymous Coward | more than 10 years ago | (#6896857)

You make it sound like there's an FPS101 at Gamer College that's manditory. :)

Wait, you mean you don't go there?

Re:movies are too slow (1)

pixel_bc (265009) | more than 10 years ago | (#6902520)

> Wait, you mean you don't go there?

I flunked out of Crate Opening 356. Ruined my entire academic career.

Some serious flaws render the piece useless (4, Informative)

Jerf (17166) | more than 10 years ago | (#6891491)

I like the ideas behind this article (I couldn't immediately Google for a good replacement so there may be room on the web for an article like this) but the author (and there is no nice way to put this) is talking out of his ass. For instance, from the second page:
This is the Visual Cortex adding motion blur to perceived imagery so that rather than seeing everything in great detail, we are still able to perceive the effect of motion and direction as we ride by. The imagery is smoothly flowing from one point to the next and there are no jumps or flickering to be seen. If the eye wasn't to add this motion blur, we would get to see all of the details still but the illusion of moving imagery would be lost on us, with the brick wall sort of fading in and out to different points. It's pretty simple to test this.

This is idiotically wrong. This entire paragraph is predicated on the false assumption that our eye somehow has a "framerate" itself. (Either that, or the false assumption that our eye is basically a CCD with infinite discrimination, also wrong.) It does not. Our eye is fully analog. (Go figure.) You get motion blur because the nerves and the chemical receptors can only react so quickly, and because nerves fire as light accumulates on the receptors. Since the receptors are moving quickly relative to the transmitting object, light rays from a given point are smeared across several cones/rods before the full processing of the image can take place. (Now, I'm simplifying because this isn't the place for a textbook on vision, but at least I know I'm simplifying.) In fact, there's nothing the visual cortex could do to remove the motion blur coming from our eyes, because the motion blur causes actual information loss! (It can (and does) do some reconstruction, but you can't fill in details that don't exist.)

(Note in the portion I italicized how he jumps from the "vision cortex" to "the eye"; the two are NOT the same and can't be lumped together like that in this context.)

This simple error renders the entire second page actively wrong.

Here's another, referring to interlacing:

Using a succession of moving images, the two refreshes per frame fool us into believing there is two frames for every one frame. With the motion blur the eye believes we are watching a smoothly flowing picture.

Uh, wrong wrong wrong. Interlacing was a cheap hack to save bandwidth. "Progressive scan" is universally considered superior to interlacing (in terms of quality alone), and many (such as myself) consider keeping interlaced video modes in HDTV to be a serious long-term mistake. It has nothing to do with convincing you you are seeing motion; in fact it has a strongly deleterious effect, because you can frequently see the "combing"; that's why TVs have "anti-comb" filters. You don't see it as "motion", you see it as weird "tearing".

Like the TV, your Computer Monitor (if it's a Cathode Ray Tube) refreshes by drawing the screen line by line horizontally, but unlike the TV, a Monitor and Video Card doesn't add extra frames. If your screen draws at 30 fps, you will GET 30 fps.

ALSO wrong. The computer monitor and video card will pump out X frames per second, period. It has to. If the CRT is going at 60 fps and the video card (as in the 3D hardware) is only pumping at 30 fps, every frame will be shown for two CRT cycles. What else is the video card (as in the rasterizer) going to display? You'd notice if the screen were blank every other cycle!

CRT Monitors are considered 'Flicker Free' at about 72Hz for a reason, and simply put it's to compensate for the lack of motion blur, afterimages and other trickery we live with every day in TV and Films.

Wrong again. CRTs at that frequency are "flicker free" because they exceed the frequency that the parts of our eyes more sensitive to motion (actually the peripheral vision, not the "primary" vision we're used to using) can see as flicker. A parrot can still detect flicker at around 100Hz (or so animal behaviorists tell me). It has nothing to do with "motion blur, afterimages, and other trickery" because this flicker is still detectable with a pure white, unchanging screen; in fact, the pure-white screen flickers the worst, since it is cycling from full-white to full-black, the largest difference a CRT can do.

Basically, this guy is so wrong it's not even funny. This isn't even a full accounting of what I saw wrong, or even the most egregiously wrong things; this is just what jumped out at me first. You'll come out of this article not better informed, but more MISinformed. Perhaps instead of lecturing, the author should be learning.

Re:Some serious flaws render the piece useless (1)

epine (68316) | more than 10 years ago | (#6891515)


A single use of the apostrophe key would do wonders to his prose. Maybe he thumbed the entire article. That would explain a lot.

Re:Some serious flaws render the piece useless (1)

Creepy Crawler (680178) | more than 10 years ago | (#6891598)

This is the Visual Cortex adding motion blur to perceived imagery so that rather than seeing everything in great detail, we are still able to perceive the effect of motion and direction as we ride by. The imagery is smoothly flowing from one point to the next and there are no jumps or flickering to be seen. If the eye wasn't to add this motion blur, we would get to see all of the details still but the illusion of moving imagery would be lost on us, with the brick wall sort of fading in and out to different points. It's pretty simple to test this.

>>>>In fact, there's nothing the visual cortex could do to remove the motion blur coming from our eyes, because the motion blur causes actual information loss! (It can (and does) do some reconstruction, but you can't fill in details that don't exist.)

Re:Some serious flaws render the piece useless (3, Interesting)

Creepy Crawler (680178) | more than 10 years ago | (#6891611)

I messed up the 'quote' delineation by putting open-brackets instead of close brackets. Sorry for the gibberish post.

This is the Visual Cortex adding motion blur to perceived imagery so that rather than seeing everything in great detail, we are still able to perceive the effect of motion and direction as we ride by. The imagery is smoothly flowing from one point to the next and there are no jumps or flickering to be seen. If the eye wasn't to add this motion blur, we would get to see all of the details still but the illusion of moving imagery would be lost on us, with the brick wall sort of fading in and out to different points. It's pretty simple to test this.

>>This is idiotically wrong. This entire paragraph is predicated on the false assumption that our eye somehow has a "framerate" itself.

It does. It's about 7000 FPS (+ or - for each individual).

The way bio-psychs tested this is by taking a high-speed controllable projector that ranged from 30FPS to 20000FPS. Subjects were led into a totally black room with a mic. Then they were directed to look at the projector screen by a red dot. Once the pattern started, the projector took a spread of 3 seconds and at 1 frame put a number on screen. The average FPS at which the subjects did NOT notice the number was about 7000FPS.

>>>>(Either that, or the false assumption that our eye is basically a CCD with infinite discrimination, also wrong.) It does not. Our eye is fully analog.

You just can't say that. The ion channels are directly countable and lead to a time-based binary system like that of Morse code. Not even biologists are sure about that.

>>>>>(Go figure.) You get motion blur because the nerves and the chemical receptors can only react so quickly, and because nerves fire as light accumulates on the receptors. Since the receptors are moving quickly relative to the transmitting object, light rays from a given point are smeared across several cones/rods before the full processing of the image can take place. (Now, I'm simplifying because this isn't the place for a textbook on vision, but at least I know I'm simplifying.)

It's not that the rods/cones (rods are black-white, cones are color) react quickly, it's that the chemical breakdown takes a while. Take the simple theater test. Go from the sunny outside into a theater. You pretty much can't see anything. It takes about 15 minutes to FULLY 'charge up' the rods back to full usage. But when you walk out of that sucky movie ;-), your eyes hurt (due to rapid depletion of rods) and your cones take effect very rapidly.

Another side effect of bright light is that you cannot see absolute 'white' or 'black'. Similarly, in dark rooms you cannot easily see color, as it takes high-energy photons to allow you to see it.

>>>>>In fact, there's nothing the visual cortex could do to remove the motion blur coming from our eyes, because the motion blur causes actual information loss! (It can (and does) do some reconstruction, but you can't fill in details that don't exist.)

Re:Some serious flaws render the piece useless (1)

Makoss (660100) | more than 10 years ago | (#6891761)

>>>>>>Another side effect of bright light is that you cannot see absolute 'white' or 'black'. Similarly, in dark rooms you cannot easily see color, as it takes high-energy photons to allow you to see it.

By 'high energy' I presume you mean a large volume of photons? Or perhaps a 'higher amount of total energy, caused by a greater number of photons'?

Because the only thing adding more energy to a red photon is going to get you is a green photon... (depending, of course, on how much higher the energy of that particular photon is.)

Energy, on a per-photon basis, is equivalent to wavelength. Energy as a "total amount of energy incident on a given area" (I paraphrase) is flux.

Re:Some serious flaws render the piece useless (1)

hamanu (23005) | more than 10 years ago | (#6896399)

TVs don't have anti-comb filters, they have COMB filters. A comb filter picks up several equally spaced frequency ranges, and has a frequency response that looks like a comb turned bristle-side-up. This is needed to separate the Y and C channels, since the C channel is made up of harmonics riding on the color subcarrier that sit in the middle of the Y signal's frequency spectrum.

Re:Some serious flaws render the piece useless (1)

pjp6259 (142654) | more than 10 years ago | (#6905227)

You get motion blur because the nerves and the chemical receptors can only react so quickly, and because nerves fire as light accumlates on the receptors.

Damn, I thought I got motion blur because of all the shroooms.

Re:Some serious flaws render the piece useless (1)

danila (69889) | more than 10 years ago | (#6951636)

Fortunately, my eyes just ignored the bullshit in this article and it wasn't even passed to my visual cortex. :) But thanks for the rebuttal, hopefully this will help some readers.

Relative motion (2, Interesting)

alyandon (163926) | more than 10 years ago | (#6891619)

FPS is important to FPS gamers because of one simple fact... relative motion.

If you have something travelling at a velocity of 600 pixels/s on your screen (not uncommon for objects in FPS games) it is much easier to track it at 100 FPS (relative motion of 6 pixels per frame) than 30 FPS.
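Spelled out with the numbers from the comment (a trivial Python calculation):

    # Pixels an on-screen object moves between consecutive frames.
    speed_px_per_s = 600

    for fps in (30, 60, 100):
        print(f"{fps:3d} fps -> {speed_px_per_s / fps:.0f} px per frame")
    # 30 fps -> 20 px, 60 fps -> 10 px, 100 fps -> 6 px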

Re:Relative motion (1)

PainKilleR-CE (597083) | more than 10 years ago | (#6898476)

If you have something travelling at a velocity of 600 pixels/s on your screen (not uncommon for objects in FPS games) it is much easier to track it at 100 FPS (relative motion of 6 pixels per frame) than 30 FPS.

Except that most gamers aren't using monitors that run at 100Hz at their gaming resolution, so they're not going to see every frame, and aren't going to see 6 pixels per frame. Never mind that it is uncommon for objects to move 600 pixels/sec unless you are moving your view quickly, which most people will ignore outright except to scan for basic images in the mess that goes by.

Of course, as an added bonus, a pixel isn't a fixed unit for most games, so if you're playing games at 640x480 my whole assessment that your monitor isn't displaying 100Hz would be off, but my assessment that things won't be moving at 600 pixels per second is even more accurate, unless you play with an extremely low fov (45).

Re:Relative motion (1)

sniser2 (624542) | more than 10 years ago | (#6926801)

Never mind that it is uncommon for objects to move 600 pixels/sec unless you are moving your view quickly, which most people will ignore outright except to scan for basic images in the mess that goes by.

Yeah, most people. Gamers on the other hand will turn their view a lot *and* track what's going on around them. A lot of this happens subconsciously (ie. it's fast) - for this a good framerate helps a lot. And an object crossing the screen in under one second isn't that uncommon, just think quake3 and jump pads.

the truth is... (3, Funny)

edmz (118519) | more than 10 years ago | (#6891699)

...they are trying to compensate for something.

Timing is important (2, Interesting)

kasperd (592156) | more than 10 years ago | (#6891720)

If you are going to look at a CRT screen for a long time, you certainly want a high refresh rate. How much is required to be enough probably depends on who you are, but 75Hz is not enough for me. But I can hardly tell the difference between 85Hz and 100Hz. I think 100Hz is enough for most people.

When you have chosen a refresh rate, the optimal FPS is exactly the same number. Generating more FPS is wasteful because it just gives worse quality. You would either be skipping frames, which harms animations, or you would be showing parts of different frames at the same time, which gives visible horizontal lines where the two parts don't match. And yes, you will spot those broken images even when they are only shown for 1/100th of a second.

But generating 100 FPS and showing 100 FPS is not enough, you have to ensure each frame is shown exactly once. It requires a little help from the graphics hardware, but nothing that is hard to implement. Having a little extra processing power is important, you must be able to produce every frame fast enough. You don't want to miss a deadline because occasionally one frame takes just a little more CPU time to render.

Re:Timing is important (1)

PainKilleR-CE (597083) | more than 10 years ago | (#6898514)

But I can hardly tell the difference between 85Hz and 100Hz. I think 100Hz is enough for most people.

I have to have a minimum of 85Hz in most lighting environments. I can tolerate refresh rates down to almost 60Hz with very low or no light, but once a light comes on it starts to interfere and the rate needs to come back up (I start getting headaches after about 30 minutes with 60-75Hz in a lit room).

When you have chosen a refresh rate, the optimal FPS is exactly the same number. Generating more FPS is wasteful because it just gives worse quality. You would either be skipping frames, which harms animations, or you would be showing parts of different frames at the same time, which gives visible horizontal lines where the two parts don't match. And yes, you will spot those broken images even when they are only shown for 1/100th of a second.

That's why they invented v-synch. Turn it on in your adapter's control panel, and leave it on, make sure the game isn't disabling it, either. Benchmarks will disable it because they don't want the monitor to limit the card's results, but for actual playing you want it on to reduce artifacts.

But generating 100 FPS and showing 100 FPS is not enough, you have to ensure each frame is shown exactly once. It requires a little help from the graphics hardware, but nothing that is hard to implement. Having a little extra processing power is important, you must be able to produce every frame fast enough. You don't want to miss a deadline because occasionally one frame takes just a little more CPU time to render.

Actually, this is where v-synch helps, too. If your monitor is set to 100Hz (like mine is right now), and your framerate is 200fps (which isn't likely unless you have a high end card and an older game), then you're generating an average of 2 frames for every frame displayed. This is also why games should use double (or triple) buffering, so that the frame being rendered currently is not the frame that's going to be displayed on the next refresh, but rather the frame after that (or the frame after that). At low framerates, the buffering can lead to sluggish controls, but at high framerates it's usually indistinguishable (because, again, it could be rendering an average of 2 frames for every 1 displayed, and displaying them at a high enough rate that the controls don't feel like they're behind what's displayed). Generally, you don't need 2x the refresh rate, or even 1x the refresh rate. You just need as consistent a rate as possible (above a personal limit based on what your own eyes perceive as good and what your monitor's refresh rate is set to). I may usually have a refresh rate of 85Hz, but I rarely allow my framerate to go much higher than 60 fps in multiplayer games (because it's bound to drop to 30 somewhere, eventually).

Is there anything correct in that article? (2, Interesting)

zenyu (248067) | more than 10 years ago | (#6891801)

I don't even play video games and I know the reason you need high FPS has nothing to do with the framerate at which you meld separate frames into motion. It's all about response time. When the game can render at 500 fps it means you have to wait 1/76 + 1/500 + 'AI time' seconds for a response to something you do on the controller. This assumes your refresh rate is 76 Hz. The 1/76 is fixed by your refresh rate because unless you can do the entire redraw in the vertical retrace period and have dual-ported RAM on the video card, you need to double buffer. Some rendering engines, not designed for games, are actually triple buffered for better throughput. Video games are all about response time, and here you will sacrifice 1000 fps for that 500 fps to avoid adding an extra 1/76 to that timing sum. There is of course a certain point at which that number is high enough that you don't need to double buffer; in reality those nv FX-2000's and ATI 98xx's are way too slow to approach that kind of framerate with the visual quality people want.

TV has an effective framerate of 60fps*, movies are 24 and cartoons are usually 12 fps. Those can all show motion just fine as long as you don't move things too fast for the medium. The average PC monitor has a refresh rate under 90hz, not really much better than the 60hz of television, so you still can't let an object move as quickly from one side of the screen to the other as we can perceive it in real life. As someone mentioned, setting the refresh rate at 72 or 80 or whatever keeps your eyes from hurting has nothing to do with our motion perception. In normal office use you want to set this as low as possible while still avoiding flicker, so that you don't waste cycles on displaying that one character you just typed into emacs a few ms faster. If you are playing a game you want to set it as high as your monitor will take it (up to 120hz at decent resolution on some monitors), while still keeping this number below the number of frames the game can render per second so that it doesn't have to show some frames twice and mess up the motion.

Film in a projector does not flicker like a monitor running at 24 hz. The reason a monitor flickers is because the phosphor brightness decays. A film screen is fully lit while the film is in front of the light. It flickers simply because the time it takes to change frames is not zero; doubling the frames to 48 frames per second would increase the time the screen was dark between frames.

*Yes, TV has 30 'frames', but this is just how many times you redraw the phosphors; as far as motion is concerned you have 60 separate images representing 60 different snapshots in time (assuming this is really shot as TV and not an up-converted film). Your eyes don't care that the samples are offset, it is not like you ever see one moment with the same receptors as the next, they need a regeneration time before they can sample again. And they are not synchronized at a specific FPS, so the flicker explanation was all wacky. The reason you see those nasty line artifacts when watching TV on your computer without a decent TV application like 'tvtime' is because simple TV apps like XawTV are showing two fields sampled at different times at the same time. Often for a variable 2-3 frames if your refresh rate is between 61 and 89 hz. If you show those in the standard 60 hz interlaced with a TV compatible resolution you won't see those artifacts outside a freeze frame, though you will get more flicker than a regular TV because the phosphors in monitors decay faster to avoid ghosting at the higher frequency and contrast they deal with.

Again, CRT flicker has nothing to do with frames rendered per second (fps), and everything to do with how long-lasting the phosphors are with respect to the screen refresh rate. Film projector flicker is a completely different beast. Heck, LCD flicker is completely unrelated to refresh rate and has everything to do with your backlight's ballast (fluorescent) or temperature (halogen). FPS above about 50-60 fps is all about response time, and has nothing to do with persistence of vision; you don't show any more frames per second to the gamer than the refresh rate. Finally, the human eye does not have a synchronized refresh cycle, but we do have integration happening in the brain that likes nearby samplers to report the same signal through time.

Re:Is there anything correct in that article? (1)

BrookHarty (9119) | more than 10 years ago | (#6891955)

True. 100hz refresh is flicker free. 100fps has smooth action.

The real bitch is we finally get cards that can push games past 30fps to 100fps, and then you enable AA and it drops like a rock.

Then GFX cards that rock at AA are released, and then the games push the polygons up so even 3GHz CPUs and 256MB cutting-edge GFX cards can only pump 30fps in a firefight.

CS with 6x AA and high poly skins looks awesome. Can't wait to see how HL2/Doom3 work on normal hardware.

Re:Is there anything correct in that article? (1)

nivedita (179357) | more than 10 years ago | (#6893904)

AFAIK your explanation of TV is backwards. NTSC is 60Hz interlaced. That means there are effectively 30 actual images per second and it draws half an image in 1/60 of a second. The reason TV on a monitor looks weird is because your monitor is usually not running a 60Hz interlaced mode. Thus the app has to fill in the blank lines with something, which can either be just the data from the previous frame, or some sort of filter applied to it.

Re:Is there anything correct in that article? (0)

Anonymous Coward | more than 10 years ago | (#6894899)

AFAIK your explanation of TV is backwards. NTSC is 60Hz interlaced. That means there are effectively 30 actual images per second and it draws half an image in 1/60 of a second. The reason TV on a monitor looks weird is because your monitor is usually not running a 60Hz interlaced mode. Thus the app has to fill in the blank lines with something, which can either be just the data from the previous frame, or some sort of filter applied to it.

No, there are 60 separate 768 x 240.5 images per second, called fields in the NTSC standard. Except there aren't really 768 pixels across; the technology is analog, so there are an infinite number of places along the line where the signal changes intensity. But we take something between 300 and 1300 samples out of that with a framegrabber. ~1300 samples are usually sufficient because the signal is bandwidth limited, though in theory you would want more because we don't use ideal reconstruction filters.

Just because you are running your monitor in progressive mode does not mean you must fill in those lines, you can simply draw them black. That is if you are running at a multiple of 60hz and with a vertical resolution that is a multiple of 480 lines. It still won't look as good as a television set because of the faster and less bright phosphors used in computer CRT's vs the ones used in television CRT's (i.e. the image will be dim and have more flicker.) If you want to avoid tearing completely with progressive scan the best option is to just run your monitor at H x 480 @ 60 hz, where H is some multiple of the number of samples your capture card can generate per line and for every frame grab you draw one field at a time with alternating blank and sampled lines. If you care more to have a decent picture but one where your monitor is useful for reading text at the same time, use some app like tvtime that reconstructs those missing lines and run your monitor at a refresh rate of 120hz*. I use TVTime because it lets you choose a different filter for high motion vs low motion. With low motion you can use the space displacement of the two fields to produce a signal with 480 lines of vertical resolution instead of the typical 240, with high motion you are stuck with 240 but you do get the 60 time independent samples per second and the screen is brighter than if you just drew the in-between lines black.

PS The popular bttv frame grabbers used by Hauppauge and Osprey are never going to give you the full quality of NTSC television, they simply don't sample the signal often enough. But I use them, they cost about 1/10th to 1/40th of what professional quality frame grabbers do. But, I think the bttv grabbers add to the confusion because they, like many grabbers, deliver a "frame" and not a "field" at a time, which means you have to deinterlace on the CPU to get back to untorn 60 "fields" per second.

*You must run your monitor at a multiple of 60 hz even with resampling TV viewing programs because no one reconstructs in-between fields, due to the obscene memory bandwidth costs. Hence, to not have to repeat some fields more often than others, creating motion jitter, you always want a refresh rate of 60, 120, 180, etc. (Actually you will still repeat a frame an extra time once in a great while since NTSC isn't exactly 60 hz... but I can live with that.)

The REAL difference between film and games. (2, Insightful)

iq in binary (305246) | more than 10 years ago | (#6891860)

The argument that 24 FPS should be enough for every medium is false, and here's why:

The reason film projection can smoothly present video is the blur on film caused by movement of the target on a slow-shutter camera. This blur actually helps because when displayed with the other frames in that second (all having the blur effect themselves) it looks rather fluid. Even digital movie cameras accomplish their video quality using the same trick.

Video cards, however, do not have the luxury of using this trick for video games. To show the movement of an avatar, for example, every single measurable instant of movement must be rendered. Those instants are misleadingly called "frames". Achieving higher framerates is actually critical for good gameplay because there are more instants in a given amount of time. That's why low fps feels sluggish in some games: 15/20/25/etc. instants are certainly not enough to show fluid movement. I myself feel right at home right around 75 fps on any first person shooter or whatnot. This is because the human brain registers information from the eyes at about 75 Hz (at least that's what I was taught).

So, next time you hear "24 fps is all you should need!", you can tell them why it's not.

Re:The REAL difference between film and games. (1)

nivedita (179357) | more than 10 years ago | (#6893923)

What I wonder is why a video card can't do the same thing to show more fluid motion. For ex, suppose your card is only capable of 30fps. Why can't it just add motion blur in each frame based on what the previous one was so it looks fluid instead of like a high-speed slide-show?

Re:The REAL difference between film and games. (1)

vadim_t (324782) | more than 10 years ago | (#6896479)

Because motion blur requires quite a lot of CPU time, that's why.

Motion blur is a natural effect on film, but on a computer it'd have to be specifically computed, which would only make things worse. If you only get 30 fps already, and motion blur slows it down to 10, then it's going to be too slow for the motion blur to be of much use.

Re:The REAL difference between film and games. (1)

SuiteSisterMary (123932) | more than 10 years ago | (#6945022)

This was the next big thing for 3dfx; their 'T-buffer' was designed to do things like motion blur.

The idea being, of course, that yes, thirty or sixty FPS really is all you need *so long as you get the same blurring effects you get from film/video.*

Having 200 frames per second merely means that the jump from position to position gets smaller and smaller, in effect building in motion blur.

As an example, roll a ball across a table at a speed such that it takes one second. Film that with a film camera at five FPS; in other words, the camera shutter will be open for 1/5th of a second, five times. Watch the film; you'll get blurring and whatnot.

Now, do the same thing, but take five snapshots with a camera with a really fast shutter speed. You get five pictures of a ball, each picture showing the ball sitting motionless, in five separate spots. Now, flip the pictures like an old flipbook. That's how a computer currently does it.

Grammar? (4, Insightful)

JonoPlop (626887) | more than 10 years ago | (#6891888)

I tried to RTFA, but I fainted mid-way during the first paragraph.

...computers are tested for there ability to improve frame rates in games.
...heard from your friends about the latest drivers for there system...
...gave them an extra 30 fps over there old card...

(They're all from the one paragraph introduction...)

Re:Grammar? (1)

jilles (20976) | more than 10 years ago | (#6892025)

Yep, the guy consistently uses 'there' instead of 'their'. Actually the word 'their' does not appear in the article at all. The word 'there' is used correctly only once. Combined with the factual inaccuracies, this is a pretty lousy article.

Well for me personally... (2, Insightful)

psxndc (105904) | more than 10 years ago | (#6891934)

Higher FPS means it can just handle more stuff happening on the screen at once. I don't need super whopping detail to start, I just need the game to not turn into a slideshow when 5 grenades explode around me at the same time. A video card that generates higher FPS means instead of 5 grenades it can handle 7, or 9, or 11, ad nauseam. Once it can handle a good amount of "stuff" on the screen, bump the resolution up a little or add more detail and we're back to only handling 7 grenades. Is this acceptable? Personal preference. Tweak up or down, lather, rinse, repeat.

psxndc

altered physics with high FPS is the reason (1)

ville (29367) | more than 10 years ago | (#6891942)

The reason for getting the highest FPS possible in the Quake line of games is physics. All(?) of them have let you run faster and jump higher with FPS above a certain limit, making certain trick jumps easier and otherwise impossible jumps possible. I think in Q2 it was 90 FPS, and in Q3 there were various limits, one being 125 FPS and another being 300+ FPS. // ville
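The underlying mechanism is that per-frame integration of movement gives slightly different results at different step sizes. A toy Python illustration with made-up Quake-like numbers (not the actual movement code; the direction and size of the drift depend on the integration order, the point is only that the result depends on frame rate):

    GRAVITY = 800.0      # units/s^2, a Quake-like value (assumed)
    JUMP_SPEED = 270.0   # initial upward velocity, units/s (assumed)

    def peak_jump_height(fps):
        """Integrate a jump with simple per-frame Euler steps and return the
        highest point reached; the answer shifts with the step size."""
        dt = 1.0 / fps
        z, vz, peak = 0.0, JUMP_SPEED, 0.0
        while vz > 0 or z > 0:
            z += vz * dt
            vz -= GRAVITY * dt
            peak = max(peak, z)
            if z < 0:
                break
        return peak

    for fps in (30, 60, 125, 333):
        print(f"{fps:3d} fps -> peak {peak_jump_height(fps):.2f} units")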

Physics approximations tied to frame rate (1)

Scorchio (177053) | more than 10 years ago | (#6898194)

Absolutely. Looks like everyone's got bogged down in details about perception and the biology of the eye, and overlooked some of the more mundane points.

The games development algorithms mailing list [sourceforge.net] has recently covered this topic in some depth. (Apologies, the archives don't seem to be working properly at the moment.)

The problem can lie in the collision detection working with sampled points along the player's trajectory during a jump, checking for collisions between those points. The lower the frame rate, the coarser the path, and the less accurate the collision response. Some games have limits after which they'll subdivide the path, giving a better approximation, but still an approximation. Higher and higher frame rates give even better approximations, so you're more likely to get a sample point at the peak of your jump, allowing you to successfully land on that ledge that is almost, but not entirely, out of reach.
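A sketch of the sampling problem in one dimension (hypothetical numbers, Python): if collisions are only tested at the sampled positions, a fast-moving point can jump clean over a thin obstacle at a coarse step, while subdividing the same frame catches it.

    def hits_wall(start, velocity, dt, wall_lo, wall_hi, substeps=1):
        """Test a 1-D moving point against a wall occupying [wall_lo, wall_hi]
        by checking only the sampled positions within one frame of length dt."""
        step = dt / substeps
        for i in range(1, substeps + 1):
            x = start + velocity * step * i
            if wall_lo <= x <= wall_hi:
                return True
        return False

    dt = 1.0 / 30.0   # one 30 fps frame
    v = 3000.0        # fast projectile: moves 100 units in this frame
    print(hits_wall(0.0, v, dt, 49.0, 51.0, substeps=1))    # False: tunnels through
    print(hits_wall(0.0, v, dt, 49.0, 51.0, substeps=50))   # True: subdivision catches it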

I guess the complete separation of game physics and rendering would help put everyone on a level playing field, so to speak.

Monitor refresh rates (1)

Mprx (82435) | more than 10 years ago | (#6891995)

One thing I don't understand is why some people play FPSs with low monitor refresh rates. If your monitor is retracing at 75Hz, it doesn't matter that your graphics card can render 300fps, you're only going to see 75 of them.

I play Quake 3 at 800x600, so my monitor can refresh at 160Hz. With everything at minimum detail I get about 200fps average, so my real fps is a solid 160fps (except in very crowded areas).

This makes a very big difference, especially in 1v1 with railguns. Here every few milliseconds count, so anyone with a lower framerate is at a big disadvantage.

Summary (1)

tsa (15680) | more than 10 years ago | (#6892124)

The average person needs at least 72 fps. Not 30. And that is because TVs work a little differently from computer monitors.

Go Back To School (1)

vigilology (664683) | more than 10 years ago | (#6892306)

Not one instance of "their" or "you're" in the whole article.

Stupid. (1)

jensend (71114) | more than 10 years ago | (#6892756)

The article is filled with obvious factual errors and is a badly-done apologetic for the obsessive and nonsensical quest for ludicrously high framerates.

His attempt to explain away the fact that 24-30 fps works fine for movies and television is an utter failure. Surrounding darkness is not why movies look smooth, and the feeling of continuity here has nothing to do with the afterimage effect. The refresh rate of televisions, resulting in "each frame being drawn twice", does not double the framerate of the television. If you had a 150 Hz display sync'd to flicker each frame of a continuous 30fps video stream 5 times, you would still have a 30 fps framerate; likewise, an LCD, which leaves the image intact on the screen and is thus indistinguishable (when showing a motionless picture) from a hypothetical infinite-refresh-rate CRT, doesn't affect your framerate either.

The most obvious factual errors are on the last page, where he deliberately tries to confuse readers into thinking the eye's capacity to discern between refresh rates has something to do with its capacity to discern between framerates. The difference between having the *entire screen* flicker from white to black and then to white again and the relatively minor changes in scenery that come with framerate is huge. Everybody can distinguish the headache-alleviating difference between 60Hz CRT refresh and 75Hz CRT refresh; not very many people can tell the difference between a 25 fps framerate and a 100 fps framerate.

Re:Stupid. (1)

zenyu (248067) | more than 10 years ago | (#6894961)

not very many people can tell the difference between a 25 fps framerate and a 100 fps framerate.

This is undoubtedly true, but anyone is capable of telling the difference between 25 and 100 fps with a little training. And would suffer from 25 fps in a video game without knowing why. Then again I consider myself pretty good at eyeing it, yet I'm sometimes off on my guess of the framerate by 60%. You need to be able to move an object fast enough for the motion to break down to get a good estimate of the framerate. You can also fool most viewers into thinking you have a great framerate at 15 fps, just by animating something properly for that framerate and not giving the viewer interactive control.

Translation (1)

cookd (72933) | more than 10 years ago | (#6892929)

Ok, this guy is mostly correct, but it is sure hard to read and get the real point. So here is a (hopefully more understandable) summary of what he was trying to say.

TV and movies can get away with 24 to 30 FPS mainly because the CAMERA that took the actor's picture kept the shutter open for almost the entire 1/24th of a second, so moving objects are blurry. That hides the fact that the actor moved quite a bit from one frame to the next, since the area between point A and point B is filled in with the blur. And since we don't have to shoot a railgun at the actor, this blur isn't a problem.

Games can't get away with that frame rate because they render for a single instant in time, and there isn't any blur to fill in the gap. Computers render instants, and figuring out blur isn't something that is really easy to do. And even if there were some blur to make the graphics move more smoothly, we might not like it as much as we do in movies -- you don't shoot railguns at blurs, you shoot them at actual heads.

The point at which the "instants" blur together in our brains is hard to determine. He claims that 72 FPS is good enough since we don't see the monitor flicker at that rate, but that is faulty reasoning. 72 Hz (or whatever) makes the flicker go away because the phosphors are still glowing pretty brightly by the time the next frame comes around, not because 72 Hz is good enough to fool our eyes. That is why the TV looks fine at 60 Hz interlaced, and the IBM CGA monitor could get away with 60 Hz interlaced, but a modern SVGA monitor at 60 Hz interlaced would drive you nuts -- the phosphors don't glow as long in a modern monitor. Nevertheless, he is still correct in saying that it doesn't do much good to get a higher FPS than your monitor refresh rate can display -- at least as far as jumpiness is concerned. (As an aside, we may see future games that blend together the five frames that were generated while waiting for the V Sync, thus creating the same blur that makes TV and movies smooth.)
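That frame-blending aside maps almost directly onto the old OpenGL accumulation buffer. A minimal sketch, assuming a GL context with an accumulation buffer is already set up; advance_simulation(), render_scene() and swap_buffers() are hypothetical placeholders, not anything from the article:

    #include <GL/gl.h>

    // Blend several rendered "instants" into one displayed frame.
    const int SUBFRAMES = 5;                         // instants blended per displayed frame
    const double frame_time = 1.0 / 60.0;            // assumed display refresh period

    glClear(GL_ACCUM_BUFFER_BIT);
    for (int i = 0; i < SUBFRAMES; ++i) {
        advance_simulation(frame_time / SUBFRAMES);  // step the game state a fraction of a frame
        render_scene();                              // draw one instant, as games do today
        glAccum(GL_ACCUM, 1.0f / SUBFRAMES);         // add an equally weighted copy to the accumulation buffer
    }
    glAccum(GL_RETURN, 1.0f);                        // write the blended result back to the framebuffer
    swap_buffers();                                  // present it on the next vertical sync

Each displayed frame then covers a span of simulated time rather than a single instant, which is exactly the blur a film camera's open shutter provides.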

However, for fast reaction games, jumpiness isn't the only problem. You also have to consider latency.
  1. Computer calculates and then records the position of items on the screen at a particular instant.
  2. Computer renders items to a frame. (Elapsed time: 1/FPS seconds.)
  3. Computer waits for V Sync (to avoid shearing) before switching to the newly rendered frame. (The average wait is half of the monitor's refresh period, so elapsed time is 1/FPS + 0.5/Hz seconds.)
  4. CRT or LCD displays frame. (Elapsed time is 1/FPS + 1.5/Hz seconds.)
This is the latency between when something happens (according to the computer) and when that event actually appears on the screen. Of course, the frame appears over the period of time 1/FPS + 0.5/Hz (when the top of the frame appears on the screen) through 1/FPS + 1.5/Hz (when the frame is completely drawn on the screen), but the complete image doesn't appear until the end. If FPS is larger than the monitor's refresh rate, you can reduce the average 0.5/Hz time waiting for refresh by rendering new frames until the monitor is ready to display one of them, making the latency 1.5/FPS + 1/Hz instead of 1/FPS + 1.5/Hz.
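Plugging numbers into those two expressions makes the difference concrete. A quick sketch; the 30/120 fps and 60 Hz figures are illustrative values, not the poster's:

    #include <cstdio>

    // Latency from "game state sampled" to "frame fully on screen", per the formulas above.
    double wait_for_vsync(double fps, double hz) { return 1.0 / fps + 1.5 / hz; }  // render, wait for vsync, scan out
    double render_ahead(double fps, double hz)   { return 1.5 / fps + 1.0 / hz; }  // keep rendering until vsync

    int main() {
        printf("30 fps on a 60 Hz monitor, waiting for vsync : %.1f ms\n", wait_for_vsync(30, 60) * 1000.0);  // ~58.3 ms
        printf("120 fps on a 60 Hz monitor, rendering ahead  : %.1f ms\n", render_ahead(120, 60) * 1000.0);   // ~29.2 ms
        return 0;
    }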

Finally, he makes the good point that it isn't the average FPS that matters, even though it is the average that is always reported in the benchmarks. It is the minimum! It doesn't matter that the new "Fee Fie Fo Fum" adventure averages 200 FPS if it always slows to 4 FPS right when you are chopping down the beanstalk. The giant will still get you every time.
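To see how much the average can hide, here is a toy calculation with invented frame times (not a benchmark of any real game):

    #include <algorithm>
    #include <cstdio>
    #include <vector>

    int main() {
        std::vector<double> frame_times(100, 0.005); // one hundred 5 ms frames...
        frame_times[50] = 0.25;                      // ...with a single 250 ms hitch mid-run

        double total = 0.0, worst = 0.0;
        for (double t : frame_times) { total += t; worst = std::max(worst, t); }

        printf("average fps: %.0f\n", frame_times.size() / total); // ~134 fps -- looks great
        printf("minimum fps: %.0f\n", 1.0 / worst);                // 4 fps -- the giant gets you
        return 0;
    }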

The Power of Suggestion (1)

FourPak (705017) | more than 10 years ago | (#6893915)

Test it yourself: if people are not told beforehand what the framerate is, 99% cannot distinguish whether a game is being displayed at 20 or 30 or 60 or 75 or whatever fps.

It's all hype and power of suggestion.

Take a 30 fps scene, tell someone it's running at 75, and they will tell you, yes, it looks m-u-c-h better.

Some other points (1)

cyranose (522976) | more than 10 years ago | (#6894494)

The article and some of the comments here are all over the map. So to avoid redundancy, I'll point out a few common sense things and a few places where research since the 60s has already addressed these issues.

1. A fixed framerate is better than a high but variable one. A solid 30 Hz can actually be better than one fluctuating between 30 and 120. A ton of research has gone into how to make the graphics pipeline effectively lock to a fixed rate, and there's a good reason: variable framerates make people sick; fixed framerates make things feel smooth and continuous. The article's point about motion blur seems an attempt to reconcile this, albeit incorrectly. The key is smooth, predictable motion. Note: this is harder than just locking to the vsync -- do it wrong and you randomly drop frames, which is exactly what you're trying to avoid.

2. No matter how high your instantaneous framerate, you will _never_ get more FPS than your monitor's refresh rate, no matter what the little FPS number in the corner says. For me and my LCD, that's 60 Hz. The rest is wasted energy. In fact, it's best if the fixed framerate is locked to the monitor refresh rate at some low multiple (1:1, 1:2, etc.) or it will effectively be a variable framerate (see: temporal aliasing, where every nth frame seems to skip ahead or behind in time).

3. I can also tell you from looking at the source code to various demos that FPS is often measured by draw time, not total time from frame to frame! If your game says it's getting 93 fps but you don't see any horizontal tearing (partial updates racing the raster scan) on a 72 Hz monitor, the game is not measuring the total frame time. It is, in effect, lying to you (a rough sketch of the two measurements follows after point 5).

4. I don't care what the peak framerate or even the average is. The only benchmark that matters is the _lowest_ framerate, because if that's too low, once you get sick, you'll stay sick for a while.
Personally, I can't stand games that optimize for empty rooms. Add too many characters or lots of action and it turns into a real-life vomitorium. Graphics developers need to tune for the worst case, not the benchmarks. And the best case, due to locked framerates, will seem no better. Programmers can find some other way to feel macho.

5. There are many ways to reduce latency and increase the fidelity of physical simulations. For one thing, rendering can be made asynchronous from input and simulation. User motion inputs can also be added late in the pipeline (right before draw) with the proper architecture. Imagine how much better a physical simulation could be if we weren't busy rendering frames no one will ever see. And given the variability of graphics and CPU performance on PCs, this seems critical to getting good gameplay across the spectrum of machines.
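On point 3, the gap between the two measurements looks roughly like this; now_seconds(), render_scene() and swap_buffers() are stand-ins for whatever clock and draw calls the engine actually uses:

    // Two ways of "measuring fps" inside the per-frame loop.
    static double last_frame_end = now_seconds();  // when the previous frame finished

    double draw_start = now_seconds();
    render_scene();                                // issue the draw calls only
    double draw_end   = now_seconds();
    swap_buffers();                                // may block here waiting for vertical sync
    double frame_end  = now_seconds();

    double fps_draw_only = 1.0 / (draw_end - draw_start);      // the flattering on-screen counter
    double fps_real      = 1.0 / (frame_end - last_frame_end); // what the player actually gets
    last_frame_end = frame_end;

If a game reports fps_draw_only, it can claim 93 fps on a 72 Hz monitor with no tearing; fps_real can never exceed the refresh rate while vsync is on.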

Poorly researched (and thought out) article :P (1)

ZeeCog (641179) | more than 10 years ago | (#6894638)

Unfortunately, I don't think the author of this article was ever able to really wrap his head around this concept.

Off the top of my head, I don't know where he's getting this "displayed twice" business. The closest thing I could think of to that is the technique of interlacing frames used for displaying images on a TV screen. But when a series of images is interlaced, the images are definitely not being displayed "twice"...more like one half at a time.

Also, motion blur is not "added" in the visual cortex. I don't know if anyone has really proved where the effect comes from, but I'd be willing to bet that it's a side effect of the way the eye continuously senses photons as they come in. In fact, I think that if games today attempted to calculate the appearance of that blur and display it, we'd end up with much more natural-looking motion. Interestingly enough, the author claims that this would NOT look natural to an observer. I think the author has misplaced his assumption of where the brain takes over. Just as he falsely stated earlier that the brain adds the effect of motion blur, he has similarly failed to appreciate the brain's capabilities here. A series of images with pre-applied motion blur may be somewhat indecipherable if displayed individually, but the human brain would make quick work of the series displayed in sequence. Applying motion blur to an image series is a very good way of tricking the brain into thinking it's observing the movement of real-world objects. The games of today make no such attempt at all, leaving the brain to compensate on its own.

I'd actually be willing to bet that a game with properly applied motion blur running at a near-constant or steady framerate above 30 could beat out another running without motion blur and at 2, 3, or even 4 times the framerate.

How to calculate minimum desirable fps (1)

extropy (669666) | more than 10 years ago | (#6895987)

Ok, I'm just speculating, but I think I see some things people are missing here...

First, the bare minimum fps should be the rate at which flicker is no longer detected (according to the article it's 72 fps). In reality, the decay rate and decay curve of the display device are the real factors here, but this will soon be irrelevant, as you'll see.

If you have an object moving across the screen with no motion blur (as in 3D games) at a low fps, you see multiple separate instances of that object rather than a single smeared object. The distance between the separate instances increases with the speed of the object and decreases with increasing fps, and the opposite goes for the number of separate instances. So the issue becomes minimizing the distance between the separate ghost images to the point that they are perceived as a smooth blur. I'm assuming that this distance would actually have to be less than the resolution of the eye itself (not that it has one exactly), but for argument's sake, let's say the fps has to keep the distance between ghost images under one pixel's equivalent for the human eye, with a display device that takes up the viewer's full field of view.

This means the final determining factor is the maximum possible speed of the IMAGE of moving objects ON THE VIEWER'S RETINA (or on the display, to be more realistic). On an application-by-application basis, one could determine the maximum speed of any given object. For example, in Quake 3 one could assume that rockets are the fastest-moving objects in the game, and that a rocket moves fastest across the display when it passes perpendicular to your direction of view, as close to the camera as possible. With these assumptions, and the other factors determined before, you should be able to determine an ideal minimum fps. Unfortunately, in more general applications there is probably no limit to the possible speed of images passing across the viewer's field of view, so plugging it into the equation would result in the need for infinite fps.

FPS = s

where s is the maximum speed of a moving image on the display in pixels per second. So an object that moves across 20 pixels in one second would require 20 frames per second to avoid skipping any pixels (of course this would need to be higher for sub-pixel antialiasing).

So, it depends on the application. OR, the answer is infinity.
OR, I'm missing something.
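To put some (entirely made-up) numbers on the application-specific case: suppose the fastest thing on screen is a rocket that crosses an 800-pixel-wide display in about 50 ms. The FPS = s rule then gives:

    #include <cstdio>

    // Back-of-the-envelope version of the FPS = s rule. The screen width and
    // crossing time are invented assumptions, not measurements from Quake 3.
    int main() {
        double screen_width_px   = 800.0;   // horizontal resolution
        double crossing_time_sec = 0.05;    // fastest object crosses the screen in 50 ms
        double speed_px_per_sec  = screen_width_px / crossing_time_sec;  // s = 16000 px/s

        printf("fps for a 1 px gap between ghost images : %.0f\n", speed_px_per_sec);        // 16000
        printf("fps for a 10 px gap between ghost images: %.0f\n", speed_px_per_sec / 10.0); // 1600
        return 0;
    }

Even the relaxed 10-pixel target is far beyond any display, which is the "OR, the answer is infinity" conclusion above.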

High End Cards + New Games (1)

illumina+us (615188) | more than 10 years ago | (#6896220)

Am I the only one bothered by the fact that the latest generation of graphics cards still can't output high framerates on new games such as UT2k3 and Doom III? The Radeon 9800 Pro 256MB and the GeForce FX 5900 Ultra run these games fine. Yet if you take a 9700 Pro or a 5800 Ultra, the games get sub-100 FPS, even sub-60 at 1024x768! These cards were released after the games were developed! This is just wrong. A high-end card should be able to output 100+ FPS at 1600x1200 resolutions on the latest game available when the card is released. I am almost afraid to run Deus Ex 2 or Doom III on my GeForce FX 5200.