
Frame Latency Spikes Plague Radeon Graphics Cards

Soulskill posted about 2 years ago | from the have-you-tried-turning-it-off-and-on-again dept.

AMD 158

crookedvulture writes "AMD is bundling a stack of the latest games with graphics cards like its Radeon HD 7950. One might expect the Radeon to perform well in those games, and it does. Sort of. The Radeon posts high FPS numbers, the metric commonly used to measure graphics performance. However, it doesn't feel quite as smooth as the competing Nvidia solution, which actually scores lower on the FPS scale. This comparison of the Radeon HD 7950 and GeForce 660 Ti takes a closer look at individual frame latencies to explain why. Turns out the Radeon suffers from frequent, measurable latency spikes that noticeably disrupt the smoothness of animation without lowering the FPS average substantially. This trait spans multiple games, cards, and operating systems, and it's 'raised some alarms' internally at AMD. Looks like Radeons may have problems with smooth frame delivery in new games despite boasting competitive FPS averages."


158 comments


nVidia (-1, Troll)

kc67 (2789711) | about 2 years ago | (#42265917)

Any serious gamer uses nVidia. Radeon has been behind for years and I am sure this issue isn't going to help.

Re:nVidia (5, Insightful)

Anonymous Coward | about 2 years ago | (#42266061)

Any serious gamer gets the best bang for their buck so they can spend more on games, rather than treating some misguided notion of brand loyalty as a significant indicator of anything.

Re:nVidia (1)

CodeReign (2426810) | about 2 years ago | (#42266407)

Agreed, but in this case, and in many others related to computer parts, it isn't a "misguided notion of brand loyalty" so much as an awareness of which brands are able to consistently produce quality goods versus brands that produce merely acceptable goods.

I wouldn't ever consider an HP machine, and it isn't because I don't like the brand. It's because they die within months. Sony, on the other hand, is a conniving cunt but still manages to produce consistently acceptable hardware.

Re:nVidia (3, Insightful)

poetmatt (793785) | about 2 years ago | (#42266463)

Except the statement "consistently producing quality goods" is simply inaccurate every time, because no brand consistently produces quality goods.

So the statement you are replying to was correct. That includes HP and Sony.

Re:nVidia (-1, Troll)

geekoid (135745) | about 2 years ago | (#42267127)

nVidia does.

Re:nVidia (0)

Anonymous Coward | about 2 years ago | (#42267447)

NVidia does.......not.

FTFY

Re:nVidia (0)

Anonymous Coward | about 2 years ago | (#42268859)

except a statement "consistently producing quality goods" is simply inaccurate all of the time, because no brand consistently produces quality goods.

This is something a lot of geeks seem unwilling to accept. All too often, people's opinions of various computer-component brands come down to them, or someone they know, getting burned. But then they hold on to that grudge much longer than it takes for the next generation of products to come out, at which point the brand that screwed up and the best brand of the generation swap places. Some companies manage to consistently suck, but for many of them the failures are line- and generation-specific.

I've seen way too many conversations that go along the lines of, "Don't buy X, buy Y, because I bought X once and it failed," "Wasn't that a couple of years ago and brand Y had similar failures earlier this year instead, although you haven't bought brand Y for a year or two anyways?" or "Don't buy X, I got one and it was DOA, and you can see ~10% of the reviews online complain of DOA problems." "But it looks like all of the major brands have about the same 10% complaints of DOA products at that price, including the one you recommended..."

Re:nVidia (1)

Pentium100 (1240090) | about 2 years ago | (#42269527)

Well, unless you are buying an old device, there is no way to know whether it will be crap or not, so using past experience with the company is quite valid.

Let's say I bought a device and it failed very quickly (or was just crap); the company says they've fixed the problems, so please buy the new model, it will work just fine. What if it doesn't? I'll wait a few years, and if the company produces good-quality products during that time I may buy from them again.

Losing trust is easy and fast, earning it back is not as easy or fast.

Some companies managed to consistently suck, but many of them the failures are rather line and generation specific.

Say a company's products in generations 1-5 were good quality, but generation 6 sucked (and the suckage was not immediately obvious; it showed up after a year or so of use, like the Nvidia chips that desoldered themselves from HP (and some other) laptops). Now generation 7 is announced, with all the flaws from gen 6 supposedly fixed. This could mean one of two things: gen 6 was a genuine mistake and gen 7 is good again, or gen 7 will suck just as much as gen 6 did because the company is going downhill. How do I know which it is?

Re:nVidia (-1)

Anonymous Coward | about 2 years ago | (#42266099)

9800 pro was my last ATI card. They are fucking shit. Plain and simple.

Re:nVidia (3, Informative)

americamatrix (658742) | about 2 years ago | (#42266109)

That statement isn't true at all.

For a long time nvidia may have had the FPS crown, but how the actual graphics looked on a Radeon was MUCH better.

Quality vs Quantity, as seen here:

http://www.xbitlabs.com/articles/graphics/display/quality_vs_quantity_3.html [xbitlabs.com] https://www.nordichardware.com/Graphics/ati-radeon-x1950xtx-part-1/Image-Quality.html [nordichardware.com]


-americamatrix

Re:nVidia (0, Flamebait)

Anonymous Coward | about 2 years ago | (#42266309)

"That statement isn't true at all. "

Steam hardware survey (which is statistically valid).

http://store.steampowered.com/hwsurvey [steampowered.com]

Gamers by a significant margin (20%) prefer nvidia over ATI/AMD.

Re:nVidia (2)

Klinky (636952) | about 2 years ago | (#42266355)

Which has nothing to do with the point the person you're replying to was making.

Re:nVidia (0, Flamebait)

g0bshiTe (596213) | about 2 years ago | (#42266405)

Why? Because Radeon was always glitchy and iffy, didn't always play well, and usually cost more than the nVidia offering, which has always just worked. I've been a fan of nVidia since the TNT2 card was introduced. Would I try Radeon? Yes, but it's difficult to forget more than 15 years of nVidia products working, and working properly, for me. In that time, dozens of people I've gamed with online have complained about Radeon cards not working properly, overheating, or locking up, and I've seen the same in forums. Mine isn't a choice of brand loyalty; if the cards started being crap I'd jump ship faster than a rat would a sinking ship.

Re:nVidia (5, Insightful)

fostware (551290) | about 2 years ago | (#42266575)

I always bought nVidia until the 7950 / 8800 / 9800 dry solder issues. After that many RMAs and arguments with wholesalers and retailers, I bought AMD out of retaliation, and this 5870 has been rock solid. I'll be going back to nVidia for my next refresh, but for me, nVidia has not 'always just worked'.

In fact, I'd wager no one brand 'just works' these days, since extreme capitalism is in these days, and they'll shop around for manufacturing plants and methods.

Re:nVidia (1)

Reverand Dave (1959652) | about 2 years ago | (#42266757)

This is the most accurate post here so far. You need to make sure you're getting a good card manufacturer, more than a good chipset designer. The chipset is only as good as the card it's bundled with.

Re:nVidia (1)

Anonymous Coward | about 2 years ago | (#42267085)

Except the G80/G92 disaster *was* a chipset issue.
Nvidia used lead-free balls between the die and interposer with the same soft underfill they used for leaded balls before, resulting in thermal stresses causing joint fractures.
Card manufacturer doesn't play a role in it other than "more extreme thermal loads/cycles accelerate the deterioration", so cards tuned to run hot/silent died quicker.

Re:nVidia (0)

LingNoi (1066278) | about 2 years ago | (#42269087)

So, one incident, caused by the fact that no one is allowed to use lead anymore for environmental reasons. People keep milking this; sure, it WAS a problem, but compared to the number of issues with AMD cards, in terms of both hardware and software, it's nothing.

Re:nVidia (0)

Anonymous Coward | about 2 years ago | (#42270587)

That wasn't Nvidia's fault, it was the fault of the PC manufacturers using boards that would warp enough to cause the solder to peel away. I know, because that exact problem happened to one of my old laptops. I ended up "fixing" it with a heat gun and an IR thermometer, but it could go out again at any time.

Re:nVidia (3, Insightful)

Anonymous Coward | about 2 years ago | (#42266807)

In fact, I'd wager no one brand 'just works' these days, since extreme capitalism is in these days, and they'll shop around for manufacturing plants and methods.

I'd argue that nVidia and AMD aren't brands in the traditional sense.

The majority of video card problems are caused by card manufacturers, rather than the chipset vendors. I've had nVidia cards that weren't worth the box they came in; I've had my share of Radeon-based cards that were complete crap. Nine times out of ten, though, whining about nVidia or AMD and video card problems is something along the lines of whining about Intel because your computer's Samsung hard drive failed. You should probably be bitching out Dell.

(Or in my case, XFX. Fool me once, shame on you. Fool me twice, I've started a list of manufacturers to avoid, because I forgot how inept you guys are.)

Re:nVidia (1)

lightknight (213164) | about 2 years ago | (#42270729)

XFX got you as well?

Re:nVidia (1)

Rasperin (1034758) | about 2 years ago | (#42266813)

I just went over to Radeon because of the multi-monitor support from a single card. I have 5 monitors attached to my current video card and I like it that way. Before then I bought nVidia because they worked so well, without issues. I have had multiple issues with the Radeon since purchasing it, but oh well, I finally got it to work.

Re:nVidia (0)

Anonymous Coward | about 2 years ago | (#42267145)

Man buys chipset from the cheapest manufacturer he can find. Blames chipset designers for solder issues.

Re:nVidia (0)

Anonymous Coward | about 2 years ago | (#42267879)

Man makes up shit to support his argument. Where in that post do you see that he went with the "cheapest manufacturer he could find"? Also, almost all of these manufacturers produce both nvidia and ati/amd boards, so if the problem was the manufacturer, you'd think the issue would plague more than just the nvidia boards.

Re:nVidia (2)

Tough Love (215404) | about 2 years ago | (#42268731)

Blames chipset designers for solder issues.

At that time nVidia had fallen behind AMD in throughput per die area and was forced to push up clock rates to compensate, causing heat and yield problems. Those were dark days for nVidia, now mostly a fading painful memory, but AMD still gets more graphics throughput out of the same die area.

Re:nVidia (0)

Anonymous Coward | about 2 years ago | (#42268541)

I was an nVidia guy back in the TNT days as well and went through several GeForce generations. A couple of years ago I got a couple of stinkers and switched to AMD out of spite, and I've had no problems with my last couple. I don't swap video cards like I did 10 years ago, and I have a couple-year-old HD 57xx that is still doing fine for the two-year-old games I buy in Steam's $5 specials. Meh, I'll give nVidia another chance on the next card.

AMD's low end Radeon are buggy (1)

rsilvergun (571051) | about 2 years ago | (#42268729)

If you're at the low end (Radeon 4850, GT 240, etc.), AMD's stuff is pretty useless. They just crash a lot on everything except the biggest titles. It drives me nuts, because the grandparent is right: AMD has much, much better image quality. I've heard their high end ($300+) doesn't have this issue, but I'm old and broke (family and such), so that ain't happening.

Re:nVidia (1)

ne0n (884282) | about 2 years ago | (#42267449)

You're obviously one of the lucky nVidia customers who avoided all the shitty laptop chipsets that cracked off the board because of cheap solder [nvidiadefect.com]. Same issue as the 7/8/9xxx desktop series, except you basically had to turf an otherwise perfectly good machine. nVidia shit the bed on both sides in that debacle, which spanned several years.

Re:nVidia (0)

Anonymous Coward | about 2 years ago | (#42267577)

You wouldn't get the clue even if I bought it for you. Go suck more green team weenie, moron. The TNT2 was an overpriced piece of shit. I had 11 of them. The Quantum 3D X-24s we had stomped them to death. Sure, you needed any POS card for 2D, but I never minded the cost, because a flat 85Hz and those FPS in Quake 2 were the absolute shiznit in 2000.

What of the 5xxx Nvidias?

The nuclear meltdown mobile chips?

The 8800-9800 series cards (25% failure in under 1 year)?

Please..... type back more nonsense like that steaming pile you just dropped. It will be fun to watch you flail as the facts beat the shit out of you.

Re:nVidia (2)

Klinky (636952) | about 2 years ago | (#42266377)

Is there anything more recent? Six years is a long time ago for graphics cards.

Re:nVidia (2)

Rockoon (1252108) | about 2 years ago | (#42266619)

The article does not justify its arguments. We do not know how it should look, only how it does look in the various pictures. It seems to me that the ATI method, while offering "higher sharpness", will suffer from temporal aliasing artifacts, much like you see on distant surfaces when using high-resolution textures in Minecraft.

Re:nVidia (0)

epyT-R (613989) | about 2 years ago | (#42267371)

That is ancient and irrelevant after the GeForce 7 era, as it was fixed for those cards. Today both chipsets give very similar results. The problem lies in driver quality. Radeon drivers are riddled with rendering bugs when used with anything but the 'latest' games they were 'optimized' (patched up) to work with. They're also BSOD-trigger-happy. While not perfect, nvidia's drivers behave far better, to the point where I'd rather have a previous-generation nvidia card in my machine than the latest ati/amd.

Re:nVidia (1)

FatdogHaiku (978357) | about 2 years ago | (#42267293)

I think that's all true, but this is a great opportunity for Weird Al to re-release his "White and Nerdy" [youtube.com] as "Fast but Jerky"...

Once again, a single measurement.... (4, Insightful)

earlzdotnet (2788729) | about 2 years ago | (#42265931)

It seems like everyone always wants a single measurement to judge how good something is. Graphics cards have FPS, CPUs have GHz, ISPs have MB/s. What's not shown in these single-number measurements are things like lag, overheating problems, or random spikes of instability.

Sigh. Maybe one day we'll learn that every product needs more than a single number to judge how good it is performance-wise.

Re:Once again, a single measurement.... (2, Informative)

Anonymous Coward | about 2 years ago | (#42265977)

The new measurement for graphics cards is ms/frame, and it's been used on some review sites for more than a year now.

Re:Once again, a single measurement.... (4, Insightful)

The Last Gunslinger (827632) | about 2 years ago | (#42266115)

So they just flipped the fraction and multiplied by 1000...brilliant! </sarcasm>

Re:Once again, a single measurement.... (3, Insightful)

TheLink (130905) | about 2 years ago | (#42270443)

There's a difference between measuring milliseconds per frame and frames per second.
With the former your minimum resolution is one frame.
With the latter your minimum resolution is one second.

Because of that even the minimum FPS rate doesn't necessarily tell you how jerky or smooth the rendering is - since it's averaged out over one second.

Taken to the extreme it could be rendering 119 frames in 1 millisecond and get stuck on one frame for 999 milliseconds, look like a frigging slide show but still show up as 120FPS. Whereas that sort of thing will stick out like a sore thumb on a milliseconds per frame graph. Hence measuring milliseconds per frame is better.

The extreme case shouldn't happen in practice, but as the article shows (and from personal experience), the max/high-latency stuff does happen. I've personally experienced this on my old ATI and Nvidia cards; my Nvidia 9800GT was slower but smoother than my current Radeon. I went ATI because the Nvidia cards were dying too often (they had a manufacturing/process issue back then), but my next card is probably going to be Nvidia. Even in Guild Wars 1, my ATI card can feel annoyingly "rough" when turning in the game: you see the FPS stay high, but it's rough on the eyes if you get 60 fps by getting a few frames fast, then a very short pause, then a few frames fast, then a pause, repeat ad nauseam.

On a related note it's good to see that at least some benchmark sites are also starting to take latency/consistency into account for stuff like SSDs. A maximum latency that is too high and occurs too often will result in worse user experience, even if the overall throughput is high, and even for storage/drives.
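The extreme case described above is easy to sketch numerically. Here's a toy calculation (hypothetical frame times, not measured data) showing how an FPS average hides a huge stall:

```python
# Two hypothetical cards, each rendering for exactly one second.
# Card A: 119 frames in ~1 ms total, then a single 999 ms stall.
# Card B: 50 frames at a steady 20 ms apiece.
card_a_ms = [1.0 / 119] * 119 + [999.0]  # frame times in milliseconds
card_b_ms = [20.0] * 50

def fps(frame_times_ms):
    """Average FPS over the run: frame count divided by total seconds."""
    return len(frame_times_ms) / (sum(frame_times_ms) / 1000.0)

def worst_frame_ms(frame_times_ms):
    """Longest single frame time -- the stall that the FPS average hides."""
    return max(frame_times_ms)

print(fps(card_a_ms), worst_frame_ms(card_a_ms))  # ~120 FPS, but a 999 ms stall
print(fps(card_b_ms), worst_frame_ms(card_b_ms))  # 50 FPS, worst frame only 20 ms
```

Card A wins on FPS yet plays like a slide show; the per-frame numbers make that obvious at a glance.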

Re:Once again, a single measurement.... (0)

verbatim_verbose (411803) | about 2 years ago | (#42266121)

That is fundamentally the same thing as FPS, unless there are some unstated assumptions that come along with this.

Re:Once again, a single measurement.... (0)

Anonymous Coward | about 2 years ago | (#42266275)

What part of " NEW measurement" did you not understand? It's NEW, and thus trendier and hipper! Therefore, anyone caught with old video cards using that stodgy OLD measurement that isn't trendy or hip will automatically lose at everything in life, even if they win at games and have more friends and better, longer-lasting relationships with attractive members of the appropriate sex. NEW.

Re:Once again, a single measurement.... (4, Informative)

PIBM (588930) | about 2 years ago | (#42266421)

Actually, the new part is that they are taking the 99th percentile of ms/frame, which gives something like a 'worst FPS you can expect 99% of the time' measure.
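As a rough sketch of that metric, using made-up frame times (the site's real tooling may compute percentiles differently, e.g. with interpolation):

```python
def percentile_ms(frame_times_ms, pct):
    """Nearest-rank percentile: the frame time that pct% of frames come in under.
    (A sketch -- real benchmark tools may interpolate instead.)"""
    ordered = sorted(frame_times_ms)
    rank = max(1, (pct * len(ordered) + 99) // 100)  # ceil(pct/100 * n)
    return ordered[rank - 1]

# Hypothetical run: 99 quick 10 ms frames plus one 60 ms spike.
times = [10.0] * 99 + [60.0]
p99 = percentile_ms(times, 99)
print(p99, 1000.0 / p99)          # 10.0 ms -> "100 FPS, 99% of the time"
print(percentile_ms(times, 100))  # 60.0 ms worst case
```

The single 60 ms spike drags the worst case way down while leaving the 99th percentile, and the FPS average, untouched.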

Re:Once again, a single measurement.... (1)

Kaenneth (82978) | about 2 years ago | (#42267533)

so, every 100th frame, pause for 100 ms to cool down.

graphics card makers have done worse to game benchmarks.

Re:Once again, a single measurement.... (0)

Anonymous Coward | about 2 years ago | (#42270525)

Really? This seems like an awful measurement of performance, even worse than simple lowest/highest/average FPS.
I'd much rather know the number of frames per interval that were rendered with high latency.

Re:Once again, a single measurement.... (1)

Anonymous Coward | about 2 years ago | (#42265983)

A person who knows all the different measurements usually doesn't just look at FPS. The single numbers are there for people who don't understand all the other things; they help them decide which card is supposed to be better.

Hell, even cars have an MPG sticker on each window. They don't go into how they got that MPG number, and most aren't reflected in real-world tests of the car.

Re:Once again, a single measurement.... (3, Funny)

MysteriousPreacher (702266) | about 2 years ago | (#42266673)

Hell even cars use a MPG sticker on each window, they don't go into how they get that MPG number, and most are not reflected in real world tests of the car.

Or battery life on laptops. Four hours, back in the day? Sure, if usage consists of running Nethack on a screen so dim it's like playing an ASCII adaptation of Doom 3.

Re:Once again, a single measurement.... (3, Informative)

Anonymous Coward | about 2 years ago | (#42266017)

How about next time you RTFA? For the past year or so, that site has had scatter plots showing frame render times in milliseconds; they show time and time again that despite similar FPS rates, performance isn't the same.

Re:Once again, a single measurement.... (0)

The Last Gunslinger (827632) | about 2 years ago | (#42266069)

This is exactly the sort of reason why I've chosen nVidia cards time after time for my system builds. I went with the GTX660 this time around, even though a slightly "higher" rated (per THG) Radeon offering was in the same price range. Had I gone the other route, I'd be spitting nails over this.

Re:Once again, a single measurement.... (0)

Anonymous Coward | about 2 years ago | (#42267965)

Spitting nails over an issue you almost assuredly would not even notice? Yeah, ok...

Re:Once again, a single measurement.... (1)

Anonymous Coward | about 2 years ago | (#42266127)

You didn't read the article, did you? The whole point of it was that despite the Radeon's often better FPS numbers, it suffers from spikes that make it feel less smooth.

Re:Once again, a single measurement.... (1, Informative)

Anaerin (905998) | about 2 years ago | (#42266663)

Oh, we have learned this. Which is why decent review sites don't just publish a "single number" representation of speed. They post a complete FPS graph [hardocp.com] for similar runs through each game, so you can compare side-by-side.

Re:Once again, a single measurement.... (1)

TheLink (130905) | about 2 years ago | (#42270465)

That's not the same as milliseconds per frame.

Frames-per-second graphs show numbers that are AVERAGED over one second. So the minimum number still might not reflect how crappy the rendering is.

To take an extreme example, you could have a card that does 120 FPS by drawing 119 frames in 1 millisecond and the last frame in 999 milliseconds.

Then you have a card that does 50FPS by drawing each and every frame in 20 milliseconds. The 50FPS card will appear smoother, whereas the 120FPS card will look like a slideshow.

This extreme case doesn't happen in practice since it'll look too obvious, but as the article shows, something milder but still detectable (even to humans) is happening with the radeon cards.

Furthermore, many game benchmarks have a part where most cards perform badly, so you get a single-digit FPS for that section. Since most cards perform badly there, that part is ignored, and so the minimum-FPS number becomes useless.

Whereas the article's latency vs percentile graph is more useful.
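To illustrate why the percentile view is more useful, here's a toy comparison of two invented cards that post the same average FPS but very different latency tails (all numbers made up for illustration):

```python
def latency_percentile_curve(frame_times_ms, points=(50, 90, 95, 99)):
    """(percentile, frame time in ms) pairs via simple nearest-rank indexing."""
    ordered = sorted(frame_times_ms)
    n = len(ordered)
    return [(p, ordered[min(n - 1, p * n // 100)]) for p in points]

# Both runs last exactly 1000 ms with 100 frames -> 100 FPS average each.
steady = [10.0] * 100                 # every frame takes 10 ms
spiky = [8.0] * 95 + [48.0] * 5       # mostly fast, with occasional 48 ms spikes

print(latency_percentile_curve(steady))  # flat 10 ms at every percentile
print(latency_percentile_curve(spiky))   # tail jumps to 48 ms past the 95th
```

An FPS graph scores these two runs identically; the percentile curve immediately exposes the spiky card's stutter.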

Re:Once again, a single measurement.... (1)

Anonymous Coward | about 2 years ago | (#42266869)

Sigh. Maybe one day we'll learn that every product needs more than a single number to judge how good it is performance-wise.

I understand what you're saying, but it's hard to take a comment seriously from someone with such a high uid...

Re:Once again, a single measurement.... (0)

Anonymous Coward | about 2 years ago | (#42266871)

You could quite easily make the "single measure" be the latency of frames outside of a certain z-score. That would capture the "1% of frames took 1s to render."
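A minimal sketch of that idea, assuming the "single measure" is simply a count of frames beyond a z-score cutoff (the cutoff and frame times below are made up; one could just as well report the summed latency of those frames):

```python
import statistics

def outlier_frame_count(frame_times_ms, z=3.0):
    """Count frames more than z standard deviations slower than the mean frame."""
    mean = statistics.mean(frame_times_ms)
    sd = statistics.pstdev(frame_times_ms)
    if sd == 0:
        return 0  # perfectly uniform frame times: no outliers possible
    return sum(1 for t in frame_times_ms if (t - mean) / sd > z)

# Hypothetical run: ninety-nine 16.7 ms frames (~60 FPS) and one 120 ms hitch.
times = [16.7] * 99 + [120.0]
print(outlier_frame_count(times))  # the single hitch is flagged
```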

Re:Once again, a single measurement.... (0)

Anonymous Coward | about 2 years ago | (#42266885)

Are you agreeing or disagreeing with the article? From what you said, it doesn't sound like you read it or at least didn't understand the point of it. In the article they explore the problem of relying on a single measurement, e.g., FPS and show how that doesn't tell the whole story with regard to subjective (perceptual) performance.

Most abused measurement (0)

Anonymous Coward | about 2 years ago | (#42269057)

Average, by not providing std dev.

That's why I'm sticking with a trusty All-In-Wonder (0)

Anonymous Coward | about 2 years ago | (#42265955)

No need to worry about the latest games or fast framerates... I can watch TV on my PC! Oh, wait, I can't anymore? Damn.

This is a well known problem from last year ... (2)

UnknownSoldier (67820) | about 2 years ago | (#42265975)

The fact that it affects single GPUs in addition to SLI and Crossfire is worrisome.

Micro-Stuttering And GPU Scaling In CrossFire And SLI
http://www.tomshardware.com/reviews/radeon-geforce-stutter-crossfire,2995.html [tomshardware.com]

Re:This is a well known problem from last year ... (0)

Anonymous Coward | about 2 years ago | (#42266413)

The fact that it affects single GPUs in addition to SLI and Crossfire is worrisome.

Micro-Stuttering And GPU Scaling In CrossFire And SLI
http://www.tomshardware.com/reviews/radeon-geforce-stutter-crossfire,2995.html [tomshardware.com]

The "problem" is not new in any way. ANY two-bit video game developer remotely concerned about performance is aware of stuttering, because it's easy for their own code to cause it.

Benchmarks measuring frame timing across the board are newish, but this article is not about the technique, it's about findings regarding Radeon cards. Duh, "Frame Latency Spikes Plague Radeon Graphics Cards"...

CPU Backup (5, Funny)

dohzer (867770) | about 2 years ago | (#42266025)

So half of AMD's business isn't at its peak. Big deal.
They can just rely on their CPU business for a while.

Re:CPU Backup (0)

Anonymous Coward | about 2 years ago | (#42266149)

Considering the benchmarks I've seen on tomshardware, comparing price, performance, and heat, Intel is still the better CPU for the money.

Re:CPU Backup (1)

h4rr4r (612664) | about 2 years ago | (#42266173)

I think most people would agree if the motherboards were comparably priced. Of course that may have changed recently and I did not notice.

Re:CPU Backup (1, Informative)

Revotron (1115029) | about 2 years ago | (#42266307)

*whoosh*

Just wait... (2)

Jintsui (2759005) | about 2 years ago | (#42266089)

I will reserve judgement and see whether other review sites find similar issues, and whether it is a problem with just the 7k series or not.

AMD Haters Unite (0)

Anonymous Coward | about 2 years ago | (#42266113)

This is a fine opportunity for you all to make a bandwagon and drive it off a cliff -- so long!

Meanwhile I'm going to enjoy my butter smooth 7970 -- stay free Internet hate machines.

The sky is falling... ? (5, Informative)

Ratchet (79516) | about 2 years ago | (#42266195)

Anyone who actually knows anything about the GPU industry knows that both AMD and NVIDIA graphics cards suffer from these latency spikes, but not across all their SKUs. NVIDIA's 660 Ti does well in this case, but their 670 and 680 have more latency spikes than the competing AMD cards do. The 7850 demonstrated here is an anomaly for AMD; none of their other cards do this. Look at past reviews from Techreport and you will see what I mean.

Re:The sky is falling... ? (0)

Anonymous Coward | about 2 years ago | (#42267513)

Any speculation as to what the underlying issue is with those specific cards?

Re:The sky is falling... ? (1)

jamesh (87723) | about 2 years ago | (#42269477)

Any speculation as to what the underlying issue is with those specific cards?

Sure. I'd go with bad drivers. For some reason drivers never seem to have quite the same quality control as hardware... I guess because hardware is a bit harder to patch.

Re:The sky is falling... ? (1)

AvitarX (172628) | about 2 years ago | (#42267671)

hah, the last GPU I purchased was a 6600, it's 10 times better (GTX though, I think the Ti is better)

Re:The sky is falling... ? (1)

edxwelch (600979) | about 2 years ago | (#42268277)

> The 7850 demonstrated here is an anomaly for AMD. None of their other cards do this.

Just shows that you haven't RTFA. The article compares the 7950 versus the 660 Ti.

Re:The sky is falling... ? (4, Informative)

enderwig (261458) | about 2 years ago | (#42268717)

However, if you actually read the Tech Report's review of the GTX 670, you will find they say the exact opposite. The GTX 670 has ridiculously low latency compared to the Radeon 6990 and just a bit lower than the 7950 and 7990.

As clearly seen on page 3 [techreport.com] of the 670's review.

Oh bugger... (1)

ctrlshift (2616337) | about 2 years ago | (#42266381)

...and I just ordered a 7850. Well, here's hoping it's overblown or fixable in software.

Re:Oh bugger... (2)

James Thompson (12208) | about 2 years ago | (#42266777)

I wouldn't get too worried. I took advantage of the free games offer to pick up an HD 7970. I'm running Windows 8 with the latest AMD beta drivers, and I have yet to notice this issue.

Re:Oh bugger... (1)

Jetra (2622687) | about 2 years ago | (#42266843)

I got a Sapphire 6450. It's nice and works for what I use it for *coughMinecraftcoughcough*

I broke the rule and read the fine article... (0)

Anonymous Coward | about 2 years ago | (#42266459)

It's a known issue with Windows 8, not the card's hardware (though it could be driver related).

Windows is a message-based OS with no realtime capabilities, so the time between frames is a guess at best. While the average frame rate is high, the frame-to-frame time can vary wildly on the Windows architecture.

This is not a new thing.

Re:I broke the rule and read the fine article... (2)

Anaerin (905998) | about 2 years ago | (#42267705)

If you did, as you say, read the article, then you would have seen that this issue happened in both Windows 8 AND Windows 7. In fact, Windows 8 performance was typically better, with less micro-stuttering, than the Windows 7 performance plots. So, to put it mildly, you're speaking out of your ass.

Frequency scaling (4, Interesting)

fa2k (881632) | about 2 years ago | (#42266497)

The article didn't mention power settings. I'm quite skeptical of all the new tech that overclocks on demand and then clocks down when it gets too hot (or too idle). They should definitely try this test with the clock pinned at the nominal frequency (if there is such a thing at all).

I've switched to Nvidia too (-1)

Anonymous Coward | about 2 years ago | (#42266499)

I chose AMD GPUs when Nvidia's "Fermi" GPUs were new. At the time, Nvidia had to clock the new Fermi chips up to "space heater" thermal outputs to make them competitive.

Since then, however, they've really improved. Later Fermi cards ran cooler, and Kepler took the Fermi architecture and made it even better.

I did like my AMD-based cards, but there were a lot of issues that made me switch to the Kepler-based part I have now.

HDMI TV support - AMD's drivers are just plain retarded when it comes to monitor handling, HDMI TVs in particular. God forbid your device gives back bad EDID information, because then it becomes useless to your computer. Nvidia's drivers let you force a resolution if automatic detection fails.

CrossFireX - More trouble than it's worth. If you play anything but A-list titles that have been out for a few months, forget it. Just disable/remove one of your cards. Too buggy. Too crashy. Many games run SLOWER with CrossFireX mode turned on. The frame jitter/lag is a real and noticeable problem too, especially on a 120Hz display.

Just get the fastest and latest single-GPU nvidia-based card you can afford and be done with it. You'll be happy you went that route.

Nvidia seems to support their chips for longer, too. I'm still using a 7800 GTX in one of my machines, and it's still supported by the latest driver packages. That chip launched in 2005 and it performs wonderfully in Win7 64-bit.

Re:I've switched to Nvidia too (0)

Anonymous Coward | about 2 years ago | (#42268423)

Technically the 7xxx series finally lost support a few months ago, but the point is still valid, seeing as those cards are all six-ish years old.

That's why I bought a (4, Funny)

Spy Handler (822350) | about 2 years ago | (#42266587)

Diamond Stealth 3D card with Cirrus Logic chip. It doesn't suffer from latency spikes.

Re:That's why I bought a (0)

Anonymous Coward | about 2 years ago | (#42266977)

VLB or PCI?

Those VLB cards were a pain to work with but DAMN they were fast for the time.
There are advantages to strapping your graphics processor directly to the CPU bus. :)

Re:That's why I bought a (1)

epyT-R (613989) | about 2 years ago | (#42267765)

PCI isn't the 'CPU bus', but yes, they were relatively fast for 2D stuff at the time.

Re:That's why I bought a (1)

Pentium100 (1240090) | about 2 years ago | (#42269421)

VLB is.

Re:That's why I bought a (0)

Anonymous Coward | about 2 years ago | (#42270031)

Yeah. I had a VLB card that allowed writes to the framebuffer at a screaming 1.5MB/s (better yet, svga modes which gave up vga compatibility and enabled 16-bit transfers even doubled that).

Re:That's why I bought a (1)

gallondr00nk (868673) | about 2 years ago | (#42266981)

Diamond Stealth 3D card with Cirrus Logic chip. It doesn't suffer from latency spikes.

I know what you mean. I was annoyed my car had a flat spot when accelerating, so I ditched it and went and bought a Cutlass with the Olds diesel engine. Problem solved.

Summary is Misleading (4, Informative)

slacka (713188) | about 2 years ago | (#42266601)

"This trait spans multiple games, cards, and operating systems, "
First of all the article only tests 2 cards accross Win7 and Win8. Considering that Win8 is basically just Win7 SP2, it's hardly fair to make that statement. Micro-stuttering an issue that mainly affects multi-GPU cards. Both Nvidia and ATI have had issues with this in their SLI and Crossfire cards. You can read more about it here:
http://hardforum.com/showthread.php?t=1317582 [hardforum.com]

Re:Summary is Misleading (-1)

Anonymous Coward | about 2 years ago | (#42267455)

First of all, the article only tests 2 cards across Win7 and Win8. Considering that Win8 is basically just Win7 SP2, it's hardly fair to make that statement.

How is it unfair to say it spans Win 7 and 8? Because you think 8 doesn't deserve the number 8? Cool story bro, Microsoft doesn't agree, so there you have it.

Re:Summary is Misleading (1)

shutdown -p now (807394) | about 2 years ago | (#42267723)

As far as 3D graphics is concerned, there are noticeable differences between Win7 and Win8 - the former is WDDM 1.1, the latter is WDDM 1.2, which covers quite a bit of new stuff (basically all the new features in Direct3D 11.1).

Re:Summary is Misleading (0)

Anonymous Coward | about 2 years ago | (#42268713)

Am I missing something, or shouldn't you be able to integrate the ms times in all of those graphs and get a total run time for each game/card? It would seem (just looking at Tech Report's charts) that were one to do that, EVERY SINGLE TEST would show the AMD cards running for longer "at the wall" times than the GeForce cards. I'd like that explained. "Sir, your card just ran a 60-second benchmark, and mine ran a 70-second benchmark, yet you, sir, say that I have a higher average FPS." Maybe it's late, but that just doesn't compute for me.
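One likely resolution of the parent's puzzle: averaging per-frame (instantaneous) FPS readings is not the same thing as dividing frames rendered by total wall-clock time. A spiky trace inflates the first metric without changing the second. A quick sketch with made-up frame times (illustrative numbers, not the review's data):

```python
# Two hypothetical frame-time traces in ms per frame (synthetic data).
smooth = [20.0] * 100        # steady 20 ms frames -> 50 FPS
spiky = [10.0, 30.0] * 50    # alternating fast/slow frames, same frame count

def mean_instantaneous_fps(frame_times_ms):
    """Average of the per-frame FPS readings (1000 / frame time)."""
    return sum(1000.0 / t for t in frame_times_ms) / len(frame_times_ms)

def true_fps(frame_times_ms):
    """Frames rendered divided by total wall-clock time."""
    return 1000.0 * len(frame_times_ms) / sum(frame_times_ms)

for name, trace in [("smooth", smooth), ("spiky", spiky)]:
    print(name,
          "mean-of-instantaneous:", round(mean_instantaneous_fps(trace), 1),
          "frames/total-time:", round(true_fps(trace), 1),
          "total:", sum(trace), "ms")
```

Both traces take exactly the same total time (2000 ms for 100 frames), yet the spiky one reports a higher "average FPS" if you average the instantaneous readings, because the fast frames dominate that mean. So a card can post a higher FPS average and still finish the run no sooner.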

I noticed this on 6850 (2 years ago) (1)

Sarusa (104047) | about 2 years ago | (#42266603)

Games on the Radeon 6850 would generally perform fine but seem 'jittery'. Usually I didn't notice it, but sometimes it was quite obvious that you were losing frames here and there, even in non-complex scenes. Of course you wouldn't notice a single one, but when it's happening every 2-3 seconds it starts being noticeable when you're playing for hours. Some games were much worse than others, though I couldn't say it was directly related to how shiny the game looked. It was never a big enough deal that I did anything about it (I had the card for two years).

When the fan on it exploded last month and burned out the 6850 I got a GeForce 660 (no brand loyalty either way) and the issue just went away. Obviously it's a much faster card than a two year old card, but it's also noticeably smoother even with graphics cranked up to about the same FPS. It's the exact same system, that's all that changed.

Just one anecdote, but the article would explain what I was noticing (though not why).

Re:I noticed this on 6850 (2 years ago) (1)

issicus (2031176) | about 2 years ago | (#42268337)

so im not crazy....

Noticed on 6900 only recently (1)

swilly (24960) | about 2 years ago | (#42266901)

A few months ago I decided to do a complete replay of the entire Mass Effect trilogy with my 6900-series card, and I am seeing occasional lag that didn't use to be there. I also revisited Skyrim when Dawnguard came out, and I'm seeing it there too. This machine didn't use to do this, and since I can't find anything else running that could cause the CPU to spike, I have been working on the assumption that some driver update (perhaps as far back as six months ago) is to blame.

It's nice to see that others have been seeing this as well, and I hope something is done about it. The Radeon cards are awesome hardware, but AMD/ATI drivers have never been very good.

Re:Noticed on 6900 only recently (0)

Anonymous Coward | about 2 years ago | (#42268461)

I got a 6850 last year, and all was well. Then AMD threw a driver update in my face and I bit. Then I got bluescreens. Rolled back the drivers and all was well again.

Waited 6 months and tried the newest drivers again. Same BS with the BSODs.

In any case my November 2011 drivers on Windows 7 seem to be stable and never give me problems. I feel lucky I came across a good release from the beginning and had a solid reference point to go back to.

Not surprising (0)

Anonymous Coward | about 2 years ago | (#42267011)

Considering how these cards contain heuristics to detect benchmark programs and load specially crafted routines to produce better results for them, regular usage scenarios suffer as a consequence.

Re:Not surprising (1)

epyT-R (613989) | about 2 years ago | (#42267827)

ATI pulled that shit with Quake 3 and their Radeon 9000 series, and Nvidia pulled it with 3DMark and their GeForce 5/FX line. They were outed both times. This doesn't happen anymore. Now they optimize for specific applications instead, and that is bad because optimizing for corner cases has a history of breaking basic functionality elsewhere.

Latency issues with a Radeon card! (0)

geekoid (135745) | about 2 years ago | (#42267103)

I'm shocked, simply shocked.

Looks like a sync problem to me... (5, Interesting)

dkegel (904729) | about 2 years ago | (#42267211)

Look at the bottom Skyrim graph in
http://techreport.com/review/24022/does-the-radeon-hd-7950-stumble-in-windows-8/8 [techreport.com]

The slow frames always follow extra-fast ones. Looks like some work is being deferred past a frame boundary?!

Re:Looks like a sync problem to me... (0)

Anonymous Coward | about 2 years ago | (#42270269)

Looks like the AMD latency series is 25, 15, 5, which is 10 ms apart.
NV can apparently schedule frames at 15 ms intervals, where AMD must be on 10 ms boundaries?
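The "slow frame immediately follows a fast frame" pattern dkegel describes is easy to test for in a raw frame-time dump. A minimal sketch using synthetic numbers in the spirit of the 25/15/5 series above, not the review's actual data:

```python
# Synthetic frame-time trace in ms per frame (illustrative, not measured).
trace = [15, 15, 5, 25, 15, 15, 5, 25, 15]

mean = sum(trace) / len(trace)
# Spikes: frames well above the mean frame time.
spikes = [i for i, t in enumerate(trace) if t > 1.5 * mean]
# How many spikes were immediately preceded by an unusually fast frame?
deferred = [i for i in spikes if i > 0 and trace[i - 1] < 0.5 * mean]
print(len(deferred), "of", len(spikes), "spikes follow a fast frame")
```

If most spikes are immediately preceded by an unusually short frame, that supports the theory that work is being deferred past a frame boundary; if spikes stand alone, something else (driver stalls, paging, clock throttling) is the more likely culprit.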

DISINFORMATION PUT FORTH BY NVIDIA !! (0)

Anonymous Coward | about 2 years ago | (#42268335)

And suckers have sucked it up !!

Suckers !!

Lag spikes on ATI Cards (1)

tuxrulz (853366) | about 2 years ago | (#42268787)

I noticed that about 2 months ago, after upgrading from a Radeon 5850 to a 7850. Since my CPU is an old Core Duo at 2.4 GHz I didn't expect that much of a performance boost, but I did expect a noticeable change. When comparing Heaven benchmark results with the previous card, the higher frame rate went up as expected (15-25 fps, I don't remember exactly), but the lower frame rate went down too, from 18 fps on the old card to 6 fps on the new one. I tried several driver revisions, 10.10 being the last one tested, all showing the same behavior on the 7850. So I guess it's an architecture glitch (or driver-to-architecture bug), since it didn't affect my old 5850. During games (I usually play SWTOR) I noticed the lag spikes but always blamed SWTOR; now it looks like the problem is somewhere else.

Still alive? (1)

jameshofo (1454841) | about 2 years ago | (#42269769)

Well, it's a good thing they just finished laying off a slew of engineers; now they can afford to absorb the bad press from this one.

nVidia sleaze at it again or just generic AMD FUD? (0)

Anonymous Coward | about 2 years ago | (#42270071)

Lately a number of hit pieces have accompanied AMD's troubled corporate performance, which also means vulture funds are shorting its stock and attacking with FUD in support of their positions, gleefully cheered on by competitors' PR machines.

As a matter of fact, the latency issues exist on both Nvidia and AMD, especially in SLI/CrossFire, and have been publicized for at least a year (which calls into question the timing of this news). In everyday use it isn't actually a problem, except for bragging-rights comparisons and maybe for Olympic-level gamers, who are on identical setups during a competition anyway. There are several wider problems with Nvidia, however:

- Nvidia has shown a distinct lack of corporate ethics: not only did they cheat on 3DMark 2003 (the driver detects the benchmark binary and disables a bunch of quality settings to get better scores than in actual games), but after having been caught, unlike the competition, they went on and did it again with 3DMark Vantage (PhysX) five years later.
- In the meantime, they also mounted a sleaze attack on Intel graphics (http://www.theinquirer.net/inquirer/news/1035036/nvidia-mounts-sleaze-attack-intel).
- Linus deservedly blasted them for their broken Linux policies.
- The realignment of the 6x0 series for gaming puts GPU computing on the back burner (a prime example is the Bitcoin performance drop compared to AMD).
- If this kind of FUD succeeds and AMD disappears, consumer choice and prices will go the wrong way, both for graphics and CPUs. If Linux/choice matters to you (with Valve support it may soon), AMD is the better option, due to their open-source support policy and superior graphics performance compared to Intel (who are ahead on CPUs and may catch up on graphics, but don't have discrete GPU solutions and are going surface-mount-only for CPUs).
