
New GPU Testing Methodology Puts Multi-GPU Solutions In Question

Soulskill posted about a year and a half ago | from the only-765-fps-in-pong dept.


Vigile writes "A big shift in the way graphics cards and gaming performance are tested has been occurring over the last few months, with many review sites now using frame times rather than just average frame rates to compare products. Another unique testing methodology, called Frame Rating, has been started by PC Perspective: it uses video capture equipment capable of recording uncompressed, high-resolution output direct from the graphics card, a colored bar overlay system, and post-processing of that recorded video to evaluate performance as it is seen by the end user. The benefit is that there is literally no software interference between the data points and what the user sees, making it as close to an 'experience metric' as any developed. Interestingly, multi-GPU solutions like SLI and CrossFire give very different results when viewed in this light, with AMD's offering clearly presenting poorer, more stuttery animation."
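For readers curious how the overlay post-processing can work in principle, here is a minimal sketch (not PC Perspective's actual tool), assuming the captured video has already been reduced to one overlay-color index per scanline: counting how many scanlines each color occupies gives the on-screen time of each game frame and exposes dropped or "runt" frames.

    /* Sketch only: assumes the capture has been reduced to one overlay-color
     * index per scanline, in display order (hypothetical input format). */
    #include <stdio.h>

    #define SCANLINES_PER_REFRESH 1080          /* 1080p capture */
    #define REFRESH_MS            (1000.0 / 60) /* 60 Hz panel */

    void tally_frame_times(const int *colors, long n_scanlines)
    {
        long run = 1;
        for (long i = 1; i <= n_scanlines; i++) {
            if (i < n_scanlines && colors[i] == colors[i - 1]) {
                run++;                  /* still inside the same game frame's bar */
                continue;
            }
            /* one game frame occupied 'run' scanlines of screen time */
            double ms = (double)run / SCANLINES_PER_REFRESH * REFRESH_MS;
            if (ms < 1.0)
                printf("runt frame: %.2f ms on screen\n", ms);
            else
                printf("frame shown for %.2f ms\n", ms);
            run = 1;
        }
    }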


112 comments

Marketing claims exaggerated? (1, Redundant)

gstoddart (321705) | about a year and a half ago | (#42982997)

That unpossible. :-P

Re:Marketing claims exaggerated? (-1)

masternerdguy (2468142) | about a year and a half ago | (#42983247)

ATI cards suck? That much is obvious. I made a permanent switch to NVIDIA after I had 3 cards from ATI in a row (top of the line for their era, each) and had serious issues with framerate stability and texture glitches. The final straw was when the official AMD driver for Linux caused a kernel panic. I switched to NVIDIA and have been a happier gamer and a happier Linux enthusiast.

Re:Marketing claims exaggerated? (0)

Anonymous Coward | about a year and a half ago | (#42983461)

after I had 3 cards from ATI in a row

Took a lot of tries before that lesson took, eh? Back in the days of voodoo and matrox I had a machine at work with ATI hardware. That card, the company that made it and the anecdotes one can so easily find about their problems dissuaded me from ever buying an ATI GPU. Since then I have listened to one gamer after another struggling with that stuff. Always some mysterious texture problem, always waiting for another driver update to make the card run right with some new game, always diagnosing mysterious frame rate problems.

The 10% or so price advantage that ATI/AMD seems to have over NVidia has been suckering people for a long time. Please stop being suckered. The only way AMD will make the necessary improvements is if buyers stop allowing themselves to be suckered by the small price advantage.

Re:Marketing claims exaggerated? (1)

Dins (2538550) | about a year and a half ago | (#42984095)

I've never had much of a problem with ATI/AMD cards. It's funny, I'm not an AMD person or an NVidia person. I've actually alternated manufacturers and more or less had good luck with each. But maybe I've been lucky because I've seen this said many places. I have a Radeon 7870 right now and it runs everything I want just fine. But if I'm true to form, that means my next card will probably be an NVidia. I don't plan it that way, it's just worked out that when I go to upgrade it seems that the best price/performance option has switched back and forth for me. I guess we'll see...

Re:Marketing claims exaggerated? (0)

Mike Frett (2811077) | about a year and a half ago | (#42988157)

You're not alone; in fact, AMD cards seem to be the only cards that don't constantly lock up my Linux boxes. It's very odd to me that my experience can be so different from others'. And I've never had a problem with their software since before AMD took over, starting with the 9500 Pro.

I'm not a fanboi of either company, I just go with what I can afford and is good at the time I'm buying parts. This time around, the 6670 was what I needed, in another box here I have a GT240.

Guy down there below me, I actually have several 3dfx and S3 cards lying around that I tinker with sometimes: Voodoo Banshee, Voodoo 3, Elsa TNT2, S3 Virge, etc. I've been trying to see what modern Linux distro will work 'good' on an old P3 box with 192 MB of RAM. So far Absolute Linux takes the prize, but you can't use the default Chrome browser; the guy made a mistake including that for older systems. xxxterm or Midori would have been better choices. I'm rambling now.

Re:Marketing claims exaggerated? (2)

unixisc (2429386) | about a year and a half ago | (#42984195)

It's just tragic that all the other 3D graphics vendors - 3dfx, 3DLabs, Matrox, et al. - went out of business or exited the market. The NVidia-ATI duopoly (or the NVidia-ATI-Intel triopoly) has really trimmed things down.

Re:Marketing claims exaggerated? (0)

Anonymous Coward | about a year and a half ago | (#42985927)

S3 is still around, but their GPUs are even worse than Intel's.

Re:Marketing claims exaggerated? (1)

unixisc (2429386) | about a year and a half ago | (#42988329)

Apparently, they don't do PC products [s3.com] any more. And who can blame them - or Matrox? 3DLABS too has stopped developing graphics chipsets, and now develops ARM-based processors, platforms and software.

Re:Marketing claims exaggerated? (1)

unixisc (2429386) | about a year and a half ago | (#42988343)

Actually, 3DLABS rebranded itself as ZiiLABS, and joined the Personal Digital Entertainment part of Creative Technologies 3 years ago. All one sees on the 3DLABS page is support for their legacy stuff.

Re:Marketing claims exaggerated? (0)

Anonymous Coward | about a year and a half ago | (#42988279)

Matrox is still around, they just don't do consumer gear.

Re:Marketing claims exaggerated? (0)

Anonymous Coward | about a year and a half ago | (#42985211)

That card, the company that made it and the anecdotes one can so easily find about their problems dissuaded me from ever buying an ATI GPU.

That is an awfully long time to hold a grudge in the tech world, where large changes in quality can be seen between generations of products. You do say you kept hearing complaints from other gamers, which is something actually relevant to current-generation products, but experiences with products from the late 90s or early 2000s aren't relevant.

And stuff like that makes it much more difficult to get a decent comparison of products out of some tech people and gamers. I've had people tell me to stay away from a product... that they had not used in 10 years, but insist it can't be better than what they've been using, even if I remind them of all the complaints they've made. It seems to be more of an issue with hard drives than video cards though, as there has been at least some consistency between the two brands there.

Re:Marketing claims exaggerated? (0)

Anonymous Coward | about a year and a half ago | (#42983957)

Chairman Mao and Stalin were both big ATI fans.

Damned commie rubbish!

Re:Marketing claims exaggerated? (2)

K. S. Kyosuke (729550) | about a year and a half ago | (#42983261)

Mind you, the only thing worse than stuttery animation is jerky sound.

Re:Marketing claims exaggerated? (0)

Anonymous Coward | about a year and a half ago | (#42983451)

Mind..... you, the o......only thing w..w....w....w...orse thanananananan stutttttttttttttttttttttttery animation is jerkyerkyerky so........und.

FTFY

Re:Marketing claims exaggerated? (2)

K. S. Kyosuke (729550) | about a year and a half ago | (#42984497)

Mind..... you, the o......only thing w..w....w....w...orse thanananananan stutttttttttttttttttttttttery animation is jerkyerkyerky so........und.

FTFY

--- the pun -->

.....o
..../|\ <-- you
.....|
..../ \

Re:Marketing claims exaggerated? (0)

Anonymous Coward | about a year and a half ago | (#42988571)

Mind..... you, the o......only thing w..w....w....w...orse thanananananan stutttttttttttttttttttttttery animation is jerkyerkyerky so........und

"Never go full retard!" - Robert Downey Jr, Tropic Thunder

Re:Marketing claims exaggerated? (1)

MachDelta (704883) | about a year and a half ago | (#42985151)

SHODAN? Is that you?

Re:Marketing claims exaggerated? (1)

segin (883667) | about a year and a half ago | (#42987617)

You replayed that quote through PulseAudio?!

Re:Marketing claims exaggerated? (1)

zipn00b (868192) | about a year and a half ago | (#42984189)

It's even worse when they're combined although sometimes the porn seems more natural that way.......

You use GPUs for video games? (5, Funny)

amanicdroid (1822516) | about a year and a half ago | (#42983035)

My AMD is cranking out Bitcoin hashes 15 times faster than an equivalently priced Nvidia so I'm okay with the results of this article.

Re:You use GPUs for video games? (3, Interesting)

gstoddart (321705) | about a year and a half ago | (#42983095)

My AMD is cranking out Bitcoin hashes 15 times faster than an equivalently priced Nvidia so I'm okay with the results of this article.

Out of curiosity, what's your break even point?

If you went out now, and bought one of these video cards solely for this ... how long would it take to recoup the cost of the card? Or is this something you'd run for a long time, and get two bucks out of, but still have had to pay for your electricity?

I hear people talking about this, but since I don't follow BitCoin closely enough, I have no idea if it's lucrative, or just geeky.

Re:You use GPUs for video games? (4, Funny)

Anonymous Coward | about a year and a half ago | (#42983271)

People that "mine" bitcoins don't pay for their own electricity. Most people don't have the basement circuits metered separately from the rest of the house.

Re:You use GPUs for video games? (0)

Anonymous Coward | about a year and a half ago | (#42984825)

Then wouldn't stealing pennies from mama's change purse be easier (and more lucrative)?

Re:You use GPUs for video games? (5, Insightful)

lordofthechia (598872) | about a year and a half ago | (#42983323)

Don't forget electrical costs. At $0.10 a kWh you are paying $0.24 a day (24 hours) per 100 watts of continuous average power consumption. This is $7.20 per month per 100W @ $0.10 /kWh or $87.60 a year. Adjust up/down for your cost of electricity and power usage (120W and $0.12/kWh = 1.2 * 1.2 = 1.44x adjustment)

Now add to this the waste heat vented into your house on the months you cool your house + the depreciated costs (and wear and tear) of the computer assets you tied up processing Bitcoins, then you'll have your true cost and you can calculate your break even point based on initial investment + ongoing costs - product (bitcoins) produced.
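For anyone who wants to rerun that arithmetic with their own numbers, a throwaway helper (the figures below just reproduce the parent's examples):

    /* The cost arithmetic from the parent comment; plug in your own
     * wattage and tariff. */
    #include <stdio.h>

    double cost_per_day(double watts, double dollars_per_kwh)
    {
        return watts / 1000.0 * 24.0 * dollars_per_kwh;  /* kWh per day * price */
    }

    int main(void)
    {
        double day = cost_per_day(100.0, 0.10);
        printf("per day:   $%.2f\n", day);          /* $0.24 */
        printf("per month: $%.2f\n", day * 30.0);   /* $7.20 */
        printf("per year:  $%.2f\n", day * 365.0);  /* $87.60 */
        printf("120 W at $0.12/kWh: $%.2f per day\n", cost_per_day(120.0, 0.12));
        return 0;
    }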

Re:You use GPUs for video games? (1)

gstoddart (321705) | about a year and a half ago | (#42983391)

Don't forget electrical costs.

You'll note that I didn't. ;-)

Re:You use GPUs for video games? (2)

amanicdroid (1822516) | about a year and a half ago | (#42983517)

Waste heat? You mean my ATI Radeon 200W Space Heater® that takes away the night chills?

Re:You use GPUs for video games? (1)

h4rr4r (612664) | about a year and a half ago | (#42983847)

That costs a lot more to run than a natural gas furnace.

Re:You use GPUs for video games? (2)

amanicdroid (1822516) | about a year and a half ago | (#42984007)

Mine is a joke? People call it waste heat, I use and enjoy the waste heat so I made a joke about it.

I doubt a natural gas furnace even has a PCIe x16 slot, much less matches even a budget Nvidia's hashing specs.

^See what I did there? Another joke.

Re:You use GPUs for video games? (0)

Anonymous Coward | about a year and a half ago | (#42983897)

The nVidia GeForce 590 makes a much better heater clocking in at 365 watts.

Re:You use GPUs for video games? (1)

mikael (484) | about a year and a half ago | (#42984865)

In every house or apartment that has frosted windows in the doors or skylight windows above the doors, a single laptop screen would light up the entire floor - much to the annoyance of those who went to bed early to sleep vs. those who wanted to read slashdot.

Re:You use GPUs for video games? (3, Interesting)

megamerican (1073936) | about a year and a half ago | (#42983937)

Don't forget electrical costs. At $0.10 a kWh you are paying $0.24 a day (24 hours) per 100 watts of continuous average power consumption. This is $7.20 per month per 100W @ $0.10 /kWh or $87.60 a year. Adjust up/down for your cost of electricity and power usage (120W and $0.12/kWh = 1.2 * 1.2 = 1.44x adjustment)

Believe me, I do not. With electricity costs taken into account I make around $4 per day (from 4 video cards) from Bitcoin or Litecoin on 2 gaming systems I rarely use. When I use my main gaming system it is slightly less.

Now add to this the waste heat vented into your house on the months you cool your house

Living in a colder climate, these costs offset, though I have no hard numbers. The slightly higher electricity costs in the summer months are offset by savings in natural gas costs in the winter months.

+ the depreciated costs (and wear and tear) of the computer assets you tied up processing Bitcoins

The goal is to maximize profits and not necessarily maximize the amount of bitcoins/litecoins I mine, so thanks to the power curve of most cards, it is more profitable to slightly underclock the core and/or memory clock which helps minimize wear and tear on the cards. The cards I've had since 2009 are still running and producing the same MH/s as they always have.

Many people who still mine bitcoins with GPUs are people who don't pay for electricity, thanks to the difficulty rise from FPGAs and ASICs. This pushed out any profitability for me, but I still have profitability from Litecoin, which is a similar cryptocurrency.

Even if there were no profits and I was just breaking even I would still do it because I would like a use for my gaming machines since I rarely game anymore but still want to sit down and play every couple of weeks.

Re:You use GPUs for video games? (1)

amanicdroid (1822516) | about a year and a half ago | (#42984349)

Considering the cards overheat at above 120 C and overclocking doesn't really do anything for hashing, miners in hot climates could put the boxes in a protected area outside and suffer few ill effects.

Re:You use GPUs for video games? (1)

chichilalescu (1647065) | about a year and a half ago | (#42986583)

if you would like a use for your gaming machines, why not BOINC? you can choose where to donate computing power, although I'm not sure how many projects work on the gpu.

Re:You use GPUs for video games? (1)

antdude (79039) | about a year and a half ago | (#42984193)

Or how much heat is added during the hot times. Ugh. It's nice during the winter though!

Re:You use GPUs for video games? (5, Interesting)

amanicdroid (1822516) | about a year and a half ago | (#42983465)

Haha, I'm at less than 1:1 electricity to bitcoin ratio after ~5 months.
Kill-A-Watt says I've used approx $68.23 of electricity at 11.5 cents per kWh. Bitcoins currently trade at 1 to $30 and I've got 2.2 bitcoins. The Radeon 6770 was (and still is) ~$110.

Additional factors to consider:
-The bitcoin machine is also my daily workstation so if it were running headless and otherwise unused it would have probably done better in the electricity used category.
-It makes a damn fine space heater and I've enjoyed it immensely this winter.
-My focus in this project was to learn hands-on about scientific computing applications and it's been great for that.

In conclusion: as a business it would have been a flop, partially because I haven't optimized the setup for that application. As a learning opportunity and 200 watt heater it's been phenomenal.
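Plugging the figures above into a quick sanity check (illustrative only; the exchange rate and difficulty move constantly):

    /* Rough break-even check using the numbers quoted above. */
    #include <stdio.h>

    int main(void)
    {
        double card_cost   = 110.00;  /* Radeon 6770 */
        double electricity =  68.23;  /* Kill-A-Watt total after ~5 months */
        double coins       =   2.2;   /* BTC mined */
        double btc_price   =  30.00;  /* $/BTC at the time */

        double revenue = coins * btc_price;
        printf("revenue:        $%.2f\n", revenue);               /* $66.00  */
        printf("vs electricity: $%.2f\n", revenue - electricity); /* -$2.23  */
        printf("vs all-in cost: $%.2f\n",
               revenue - electricity - card_cost);                /* -$112.23 */
        return 0;
    }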

Re:You use GPUs for video games? (1)

gstoddart (321705) | about a year and a half ago | (#42983513)

In conclusion: as a business it would have been a flop, partially because I haven't optimized the setup for that application. As a learning opportunity and 200 watt heater it's been phenomenal.

Well then, a learning opportunity and a 200 watt heater are fine outcomes then. :-P

Re:You use GPUs for video games? (1)

Synerg1y (2169962) | about a year and a half ago | (#42983827)

That is the most insight I've ever gotten into the Bitcoin economy. I've always passed because the hardware is worth more and can break from such use, especially in the long term. I'm still passing, but it's good to know about the electricity. As for TFA, I don't think the author realizes that there's a ton of video cards with multiple GPUs on board; it's not all CrossFire and SLI, and hasn't been for the last decade. The method they're using seems legit on the surface until you read at the bottom that they're still developing it. I'd also imagine that a game is capable of controlling its frame rendering through code, making the technique even more spotty.

Re:You use GPUs for video games? (1)

GigaplexNZ (1233886) | about a year and a half ago | (#42987577)

Those multi-GPU cards still use SLI/Crossfire under the hood.

Re:You use GPUs for video games? (1)

retep (108840) | about a year and a half ago | (#42984177)

Keep in mind that for Bitcoin the individuals like you running tiny little mining setups that might not be actually profitable as a fun hobby are a very good thing. Bitcoin needs mining power to be as well distributed as possible to make it difficult to co-opt, so the hundreds or maybe even thousands of individuals like you help that goal. However, it's helped best if you actually validate your blocks properly, and that means mining with P2Pool right now.

Bitcoin is lucky that the costs to mine for a small rig, on a $/hash/sec basis, are probably actually less than larger setups, because on a small enough scale you can ignore cooling issues and often ignore power issues too (heating in the winter, or free power). There is overhead of course - you have to set up your mining rig - but that's often written off as a fun hobby.

Re:You use GPUs for video games? (1)

amanicdroid (1822516) | about a year and a half ago | (#42984381)

ADDED BONUS: many already have all the equipment they need to get started. Like you said, it's a fun hobby.

Re:You use GPUs for video games? (1)

The Mighty Buzzard (878441) | about a year and a half ago | (#42983481)

The break-even point for GPU mining doesn't exist anymore if you have to pay for power and it's a very, very long time if you don't. Why? ASICs.

Re:You use GPUs for video games? (3, Insightful)

amanicdroid (1822516) | about a year and a half ago | (#42983875)

Bitcoins stayed around $13 to 1 for months and you're correct, there wasn't a breakeven point for GPU mining. With Bitcoins trading at $30 the breakeven point is available again. For how long, I don't know, and I wouldn't bet a business on it.

Re:You use GPUs for video games? (4, Informative)

pclminion (145572) | about a year and a half ago | (#42984555)

Out of curiosity, what's your break even point?

I don't know where the break even point is, but once you pass it, you can be very profitable. One of my friends built a custom "supercomputer" out of cheap motherboards and graphics cards for about $80k -- along with completely custom software to automatically tune clock speeds and fan rates in real time (all of which was written in bash script). At peak performance, his machine generated about $20k worth of bitcoin every month, which easily paid for the $12k monthly electric bill.

After a couple of difficulty-doublings, and the imminent arrival of the ASIC miners, this lost its profitability, and he went back to being a DBA... The machine is still out at the farm, cranking away. I think he'll disassemble it and part it out for cash in a month or two.

Re:You use GPUs for video games? (1)

TheRealMindChild (743925) | about a year and a half ago | (#42985325)

So one of your friends is using company equipment in a server farm to mine bitcoins? Sounds very illegal

Re:You use GPUs for video games? (1)

pclminion (145572) | about a year and a half ago | (#42985415)

I don't get it. Are you assuming that anybody who spends $80k on something must be using someone else's money? You're a moron. This was his private project, which he managed to live off of for almost two years.

Re:You use GPUs for video games? (1)

TheRealMindChild (743925) | about a year and a half ago | (#42985601)

The machine is still out at the farm

So you are telling me he had the upfront money AND a server farm that rented him space, using a completely custom, unknown type of machine that no one had a problem with? Sounds legit

Or am I an asshole and it is on a real vegetable/animal farm?

Re:You use GPUs for video games? (3, Informative)

pclminion (145572) | about a year and a half ago | (#42985725)

Dude, it's a farm. A fucking farm. 40 acres of red wheat.

He designed the rack system himself, along with custom power supply headers that he had fabbed at a nearby plant. He even tried to reduce equipment costs by hiring a Taiwanese company to produce custom GPU cards for him for $70 a piece (they didn't work very well).

Nobody does that shit anymore. It was like watching Steve Wozniak.

Re:You use GPUs for video games? (1)

greg1104 (461138) | about a year and a half ago | (#42986089)

It makes me sad that someone could run up a $12K monthly electric bill without assigning an environmental cost to where that power was coming from.

Re:You use GPUs for video games? (3, Informative)

pclminion (145572) | about a year and a half ago | (#42986213)

It makes me sad that someone could run up a $12K monthly electric bill without assigning an environmental cost to where that power was coming from.

Making assumptions is bad.

Before the Bitcoin operation got started, my friend's business was making biodiesel out of local rendered chicken fat and other things. He single-handedly supplied most of the farmers in a 5 mile radius with fuel for their farm operations. Prior to the biodiesel years, he ran the largest privately owned solar grid in the county, providing something like 25 kilowatts back to the grid, for a couple of years solid. He is the most environmentally obsessed person I know, and has certainly contributed far more to the local green economy than he has taken out of it.

The ultimate plan, which did not come to fruition (because of the rising difficulty of mining bitcoin, as I stated earlier), was to completely cover the 40 acre property with an array of solar panels, each panel having a custom GPU mining module installed on the underside -- open air cooling of the machines, solar power for the bitcoins, and it would have qualified as the largest solar array in the United States.

To think that he's some kind of forest-destroying, air-blackening capitalist is about as far from the truth as you can get. Check your assumptions.

Re:You use GPUs for video games? (1)

Anonymous Coward | about a year and a half ago | (#42983197)

This is only because nvidia intentionally cripples all consumer grade GPUs, artificially reducing their precision to make them near useless as GPGPU devices.
This is simply to price gouge the high end.
And they get away with it because they have a near monopoly on the market.
AMD is a latecomer and for whatever reason they don't have the platform and support to break in to the GPGPU market. (Nvidia really did take the torch and pioneer the field)

Re:You use GPUs for video games? (5, Informative)

amanicdroid (1822516) | about a year and a half ago | (#42983735)

This is the explanation I've been given for the disparity between Nvidia and AMD:
https://en.bitcoin.it/wiki/Why_a_GPU_mines_faster_than_a_CPU#Why_are_AMD_GPUs_faster_than_Nvidia_GPUs.3F [bitcoin.it]

Specifically:

Secondly, another difference favoring Bitcoin mining on AMD GPUs instead of Nvidia's is that the mining algorithm is based on SHA-256, which makes heavy use of the 32-bit integer right rotate operation. This operation can be implemented as a single hardware instruction on AMD GPUs (BIT_ALIGN_INT), but requires three separate hardware instructions to be emulated on Nvidia GPUs (2 shifts + 1 add). This alone gives AMD another 1.7x performance advantage (~1900 instructions instead of ~3250 to execute the SHA-256 compression function).

For GPU programming I've enjoyed Nvidia's CUDA package greatly over wrangling OpenCL that Radeon relies on.
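For the curious, the rotate the quoted passage is talking about looks like this in C: hardware with a native rotate (or bit-align) instruction does it in one operation, while other hardware has to combine two shifts. This is a generic illustration, not mining code.

    #include <stdint.h>

    /* 32-bit right rotate, the operation SHA-256 leans on heavily.
     * The (32 - n) & 31 keeps the shift count defined for n == 0;
     * compilers map this pattern to ror / BIT_ALIGN_INT when available. */
    static inline uint32_t rotr32(uint32_t x, unsigned n)
    {
        return (x >> n) | (x << ((32 - n) & 31));
    }

    /* Example use: one of SHA-256's sigma functions (message schedule)
     * is built from rotates and a shift. */
    static inline uint32_t sigma0(uint32_t x)
    {
        return rotr32(x, 7) ^ rotr32(x, 18) ^ (x >> 3);
    }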

Re:You use GPUs for video games? (3, Insightful)

tyrione (134248) | about a year and a half ago | (#42986499)

This is the explanation I've been given for the disparity between Nvidia and AMD: https://en.bitcoin.it/wiki/Why_a_GPU_mines_faster_than_a_CPU#Why_are_AMD_GPUs_faster_than_Nvidia_GPUs.3F [bitcoin.it] Specifically:

Secondly, another difference favoring Bitcoin mining on AMD GPUs instead of Nvidia's is that the mining algorithm is based on SHA-256, which makes heavy use of the 32-bit integer right rotate operation. This operation can be implemented as a single hardware instruction on AMD GPUs (BIT_ALIGN_INT), but requires three separate hardware instructions to be emulated on Nvidia GPUs (2 shifts + 1 add). This alone gives AMD another 1.7x performance advantage (~1900 instructions instead of ~3250 to execute the SHA-256 compression function).

For GPU programming I've enjoyed Nvidia's CUDA package greatly over wrangling OpenCL that Radeon relies on.

You're living on borrowed time with CUDA. The entire industry has already moved to OpenCL and it will only expand when all the heavy Engineering and Science vendors are fully on-board. When Ansys 14.5 already moved to OpenCL for its latest release you know such a conservative corporation is one of the last to make the transition.

Re:You use GPUs for video games? (1)

The Master Control P (655590) | about a year and a half ago | (#42987715)

Does OpenCL support device-to-device remote copy over Infiniband?

Honestly asking, because that's an absolute killer feature for HPC applications. PCIe is abhorrently, soul-crushingly slow from the GPU's perspective, and being able to RDMA without ever moving through the host's memory saves half your PCIe bandwidth use.

Re:You use GPUs for video games? (-1)

Anonymous Coward | about a year and a half ago | (#42988111)

You're just wrong - there are no clusters in the HPC world currently being built with any other accelerator than NVidia. In that space you tailor the software to the hardware and everything is being done with CUDA. OpenCL is an Apple technology and will remain irrelevant to HPC.

AMD will dispute (1)

Anonymous Coward | about a year and a half ago | (#42983051)

I'm sure that AMD, the losing party, will dispute the results and come up with its own methodology to counter this.

Then again, everyone knew nVidia high end cards are better, so was this new test really necessary??

Re:AMD will dispute (2)

sexconker (1179573) | about a year and a half ago | (#42983137)

I'm sure that AMD, the losing party, will dispute the results and come up with its own methodology to counter this.

Then again, everyone knew nVidia high end cards are better, so was this new test really necessary??

The point of the "new test" is that framerate is a terrible metric because it averages out what you care about.
When you measure frame times individually you can then quantify how often a game slows down and by how much.
You don't just have an average FPS, or a MAX/AVG/MIN.
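A tiny illustration of the point (made-up numbers): two runs with the same average FPS can feel completely different once you look at individual frame times.

    #include <stdio.h>

    /* Report average FPS and the single worst frame for a run. */
    static void report(const char *name, const double *ms, int n)
    {
        double total = 0, worst = 0;
        for (int i = 0; i < n; i++) {
            total += ms[i];
            if (ms[i] > worst) worst = ms[i];
        }
        printf("%s: avg %.1f fps, worst frame %.1f ms\n",
               name, 1000.0 * n / total, worst);
    }

    int main(void)
    {
        double smooth[6]  = { 20, 20, 20, 20, 20, 20 };  /* steady 50 fps */
        double stutter[6] = { 10, 10, 10, 10, 10, 70 };  /* same average, one big hitch */
        report("smooth ", smooth, 6);
        report("stutter", stutter, 6);
        return 0;
    }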

Regardless... (4, Interesting)

Cinder6 (894572) | about a year and a half ago | (#42983163)

As an owner of a Crossfire setup, it's obviously not a 2x improvement over a single card; however, it's also a marked improvement over a single card. When I first set up this rig (August), I had problems with micro-stutter.* Now, though, after AMD's newer drivers and manually limiting games to 59 FPS, I don't see it anymore; games appear smooth as silk.

At a mathematical level, it may not be a perfect solution, but at a perceptual level, I am perfectly satisfied with my purchase. With that said, buying two mid-line cards instead of one high-end card isn't a good choice. Only buy two (or more) cards if you're going high-end.

*I was initially very disappointed with the Radeons. That's no longer the case, but I will probably still go nVidia the next time I upgrade, which hopefully won't be for years.

Re:Regardless... (1)

O('_')O_Bush (1162487) | about a year and a half ago | (#42983397)

Limiting your framerate to only 59fps on Crossfire is acceptable to you? What resolution are you pushing that 59fps doesn't defeat the purpose of having Crossfire?

Re:Regardless... (1)

Cinder6 (894572) | about a year and a half ago | (#42983565)

Why yes, it's acceptable, because 59 is more than enough for smooth animations--your eyes don't notice the difference, and your monitor probably couldn't even refresh fast enough to show it, anyway. My games never drop below this rate, so it looks smooth throughout.

Re:Regardless... (1)

citizenr (871508) | about a year and a half ago | (#42984145)

Why yes, it's acceptable, because 59 is more than enough for smooth animations--your eyes don't notice the difference ...

proves the point that only suckers buy into SLI/CF scheme

Re:Regardless... (1)

epyT-R (613989) | about a year and a half ago | (#42984329)

People who want and can see the difference between 60hz and 120 aren't suckers for their willingness to pay up, but it is true that SLI doesn't always deliver. We are far from the 3dfx days where a second card gave an automatic 100% performance boost in every application. As someone who can easily see the difference, I always shoot for a single GPU whenever possible.

Re:Regardless... (1)

TheRealMindChild (743925) | about a year and a half ago | (#42985347)

3dfx cards only gave 100% performance gains when your rendering pipeline was two pass... one for models, one for textures. Not every game did it this way. Quake 2 did, however

Re:Regardless... (1)

PickyH3D (680158) | about a year and a half ago | (#42984367)

proves the point that only suckers buy into SLI/CF scheme

SLI/CF decrease the chances that the frame rate will drop below an acceptable level. They're pointlessly rendering if they go beyond what you, and your monitor, can perceive.

The only point proven is that you do not understand FPS, nor do you understand the purpose of SLI/CF.

Re:Regardless... (1)

citizenr (871508) | about a year and a half ago | (#42986171)

They're pointlessly rendering if they go beyond what you, and your monitor, can perceive.

The only point proven is that you do not understand FPS, nor do you understand the purpose of SLI/CF.

I understand fps plenty. More than the OP, because I know you can spot more than 60Hz. You need to be a really big sucker if you:
a) believe going over 60Hz will be unnoticeable, and
b) pay for two cards.

Re:Regardless... (1)

Anonymous Coward | about a year and a half ago | (#42983777)

Limiting your framerate to only 59fps on Crossfire is acceptable to you?

You must be one of those fucking morons who think there's a visual difference between 120fps and 60fps on a 60Hz LCD monitor.

Re:Regardless... (2)

epyT-R (613989) | about a year and a half ago | (#42984425)

You must be one of those fucking morons who:
1. doesn't realize real 120hz panels exist.
2. doesn't realize that even a vsync-disabled 60Hz display still allows for lower input response latency if the graphics cards allow a higher framerate.
3. doesn't realize that 60hz+ isn't the only reason people do multigpu. Having twice the fillrate helps in other areas too.

Re:Regardless... (0)

Anonymous Coward | about a year and a half ago | (#42984765)

Congratulations on confirming that you are in fact, a fucking moron.

1. "on a 60Hz LCD monitor" means exactly what it means. Of course there are 120Hz monitors, but if you own one of those then obviously you'll limit your frame rate to that instead of 59.

2. visual differences have nothing to do with input lag. Your rendering and input loops should be able to run at different speeds without any visual or input artifacts.

3. This is completely irrelevant. Limiting your FPS before output doesn't reduce your fill rate.

Re:Regardless... (0)

Beardydog (716221) | about a year and a half ago | (#42987419)

2. visual differences have nothing to do with input lag. Your rendering and input loops should be able to run at different speeds without any visual or input artifacts.

It has everything to do with input lag. If I move my mouse just under half-a-frame (monitor frame) before the monitor refreshes, then the monitor gives me a frame it started rendering just OVER half-frame before the refresh, I get to wait another full frame before my mouse movement is reflected in-game. If the GPU produces frames significantly faster, the frames you see reflect a much more current game-state.

Re:Regardless... (1)

drinkypoo (153816) | about a year and a half ago | (#42985037)

2. doesn't realize that even a vsync-disabled 60Hz display still allows for lower input response latency if the graphics cards allow a higher framerate.

No, no it doesn't, unless the input system is part of the graphics thread and running on the frame update timer, which is conspicuously not the case these days in any case of competence.

Re:Regardless... (1)

Anonymous Coward | about a year and a half ago | (#42986987)

How much better is your gaming experience with a Monster brand HDMI cable?

Re:Regardless... (1)

zipn00b (868192) | about a year and a half ago | (#42984489)

There is a difference as I have to blink twice as fast at 120fps...............

Re:Regardless... (2)

espiesp (1251084) | about a year and a half ago | (#42985069)

Ever hear of Eyefinity? 5760x1200 is a lot of pixels to push.

Re:Regardless... (0)

Anonymous Coward | about a year and a half ago | (#42986691)

Hear of? Tried it, didn't like the ultra-wide aspect ratio.
Now running 3P 3600x1920. With a single pre-ghz-edition 7970 @ 1025/1375
Can't run heavy games with everything maxed, but... meh.

This trend has been going on a little longer (3, Interesting)

WilliamGeorge (816305) | about a year and a half ago | (#42983177)

It started when people began to look not only at average frame rate, but at *minimum* frame rate during a benchmark run. That shows how low the FPS can dip, which was the beginning of acknowledging that something in the user-experience mattered beyond average frame rate. It has gotten a lot more advanced, as pointed out in the article here, and this sort of information is very helpful for people building or buying gaming computers. I use info like this on an almost daily basis to help my customers get the best system for their needs, and I greatly appreciate the enthusiasts and websites which continue to push the ways we do testing!

Not the only reason to ignore multi-GPU setups. (0)

Anonymous Coward | about a year and a half ago | (#42983227)

I thought having a second GPU would be a good upgrade path. Boy was I ever wrong.
None of the solutions scale gracefully. Both games and drivers have to be tweaked on a case-by-case basis before performance becomes good.
Often new cutting edge games work like shit on launch day.

I decided to ditch the scheme when the only way to make both Max Payne 3 and Skyrim playable on launch day was to pull out one of my Radeons.

Re:Not the only reason to ignore multi-GPU setups. (1)

spire3661 (1038968) | about a year and a half ago | (#42983473)

Agree 100%. I'll help someone build a multi-GPU system, but I won't run one at home unless it's an experimental rig.

Re:Not the only reason to ignore multi-GPU setups. (2)

Guspaz (556486) | about a year and a half ago | (#42984427)

The new methodology shows that nVidia SLI gets excellent results. However, I've owned SLI solutions, both two GTX 285s and a GTX 295. Neither one was satisfactory. The higher failure rate was of particular note. One of the 285s failed, I RMA'd it, got a 295, eventually that failed... On top of that, even when things were working, there were lots of problems. Some games didn't run correctly. Crysis, for example, didn't seem to like SLI; it would get into a bizarre state where the screen would begin flickering rapidly between black and normal frames, and you'd have to restart the game to recover. The problem never occurred in the same machine when I disabled SLI.

After that experience, both with dual-card SLI and single-card SLI, I decided it just wasn't worth it, and that my money was better spent on a good single-GPU solution. I currently have a single GTX 670, which works fine.

Another test I'm seeing more of (4, Interesting)

gman003 (1693318) | about a year and a half ago | (#42983411)

99th percentile frame times. That gives you a realistic minimum framerate, discarding most outliers (many games, particularly those using UE3, tend to have a few very choppy frames right on level load, that don't really affect performance).
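For reference, a minimal nearest-rank version of that calculation might look like this (a sketch, not any review site's actual tooling):

    #include <stdlib.h>

    static int cmp_double(const void *a, const void *b)
    {
        double x = *(const double *)a, y = *(const double *)b;
        return (x > y) - (x < y);
    }

    /* times_ms: per-frame render times; returns the 99th percentile in ms,
     * i.e. a value 99% of frames come in under. Sorts the input in place. */
    double frame_time_p99(double *times_ms, size_t n)
    {
        qsort(times_ms, n, sizeof *times_ms, cmp_double);
        size_t idx = (size_t)(0.99 * (n - 1));  /* nearest-rank style index */
        return times_ms[idx];
    }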

Re:Another test I'm seeing more of (-1)

Anonymous Coward | about a year and a half ago | (#42983479)

Can I stick my dick in your asshole?

Re:Another test I'm seeing more of (0)

Anonymous Coward | about a year and a half ago | (#42983843)

UE has traditionally been very CPU-bound. I haven't messed with it much in the last several years (UE3 especially), but I know that UE1 and UE2 only offloaded video rendering to the GPU. Lighting was mostly pre-baked. When shaders were used, they were slow.

It was always written with compatibility in mind, not necessarily speed. That's why UE1 included a software renderer that pounded the CPU. On systems of the time (Pentium 2 and 3 era), you were lucky to get 10fps with that thing. But it would run (poorly) on anything. UE2 required a GPU, and allowed for some limited shader use (they were a "new" concept at the time, in 2003 or so), but had fallbacks to allow the CPU to get pounded doing those as well. I don't doubt that UE3 has some very compatible, but slow map loading code, and I wouldn't be surprised to find out there's some hardware that you can add to your machine to make it faster. I'd also be surprised if UE4 worked without that hardware.

Re:Another test I'm seeing more of (2)

X0563511 (793323) | about a year and a half ago | (#42984143)

UE3 uses some kind of deferred loading. Notice when you first enter the menu and whatnot that everything looks like garbage for a moment - the hitching you see at the start is because of texture uploads to your VRAM and the like.

Re:Another test I'm seeing more of (1)

gman003 (1693318) | about a year and a half ago | (#42984609)

Yeah, I knew that was why it happened. Many games, even most open-world games do that - they have low-res textures loaded for everything, and dynamically load and unload the higher-res ones depending on what the scene needs. Late UE2.5 and early UE3 titles seem to stick out as the ones that preload *no* high-res textures until the level actually starts. UT3, The Last Remnant, Bioshock, games like that.

Rage is another example of one that is extremely aggressive about unloading textures - you can look at a wall, and just turn 180 degrees and it will be unloaded. Most just take player position and maybe vistrees into account, not the view frustum.

Re:Another test I'm seeing more of (1)

drinkypoo (153816) | about a year and a half ago | (#42985017)

I'm not interested in that, but I'm fine with chopping the first and last 5% of the frames off before calculating the frame times. I want to know what the frame rate is going to be when everything is exploding and a texture is being loaded and my bus is congested etc etc.

Re:Another test I'm seeing more of (0)

Anonymous Coward | about a year and a half ago | (#42986591)

Christ, all this effort over a fucking game. A GAME, people!

Developers (4, Insightful)

LBt1st (709520) | about a year and a half ago | (#42983637)

This is interesting from a developer standpoint as well. It means we are wasting processing time rendering frames that are only displayed for a handful of milliseconds. These frames could be dropped entirely and that processing time could be put to use elsewhere.

Re:Developers (1)

epyT-R (613989) | about a year and a half ago | (#42984501)

I guess.. if you're targeting your game at mouthbreathing harelips.. might as well just produce your shovelware for the consoles then and not worry about multigpu PC at all.

Re:Developers (1)

Dr. Spork (142693) | about a year and a half ago | (#42985097)

But isn't that par for the course? I mean, whenever the frame rate exceeds the refresh rate of the monitor, you're using resources to render literally invisible frames. Yet it's my impression that games/drivers will still render those frames. Isn't that right? I would appreciate games that saved me energy, or used those resources to make better AI decisions, etc.

Re:Developers (3, Informative)

mikael (484) | about a year and a half ago | (#42985301)

Developers still like to have everything on a "main loop" (render static scenery, get user move, render player, get network player moves, render network players, render auxiliary data). Other stuff will be spinning and bobbing up and down on its own based on timers. Some frames might never be rendered, but they help keep the "tempo" or the smoothness of the animation. As each PC screen can have a different resolution, it will have a different refresh rate, anything from 50Hz to 120/240Hz. Every rendered frame is only going to be visible for several milliseconds (50Hz = 20 milliseconds, 100Hz = 10 milliseconds). If a frame is rendered, it will be perceived even if not consciously.

Early home computers allowed the program to synchronize animation updates to the VBI (Vertical Blank Interrupt) and HBI (Horizontal Blank Interrupt). That way, you could do smooth jitter-free physics synchronised to the frame flipping.

16-bit console system programmers would render out lines across the current scan-line to see how much processing they could do in each frame. While the tiles were updated during the VBI, the physics could be updated during the CRT scanning.

These days, I would guess you would need either a vertical blank callback for the CPU or shader for the GPU.
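In sketch form, that main loop looks something like the following, with a vsync'd buffer swap standing in for the old wait-on-VBI. All the hooks are placeholders, not a real engine API.

    /* Placeholder hooks, declared only so the sketch stands alone. */
    int  game_running(void);
    void poll_input(void);
    void update_world(void);
    void render_scene(void);
    void swap_buffers_vsync(void);

    void run_main_loop(void)
    {
        while (game_running()) {
            poll_input();          /* get user move, network player moves */
            update_world();        /* timers, physics, bobbing objects */
            render_scene();        /* static scenery, player, network players */
            swap_buffers_vsync();  /* blocks until the next refresh, the modern
                                      stand-in for waiting on the VBI */
        }
    }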

Re:Developers (1)

phizi0n (1237812) | about a year and a half ago | (#42986533)

Turn on Vsync and then you won't render more frames than the monitor can display. If you want to go a step further, then fix your engine so that everything isn't stuck on the main loop waiting for a frame to be rendered, like many developers still do many years after the proliferation of multi-core CPUs.

Re:Developers (1)

igomaniac (409731) | about a year and a half ago | (#42988143)

It's generally desirable to have the AI and physics run at a fixed time step because it allows you to reproduce results exactly. That way you can record gameplay by just recording the inputs. So usually you will have a 'simulation' thread running at a fixed tick rate and a 'render' thread producing frames as fast as possible. I agree about the Vsync; there is no point whatsoever in making frames faster than the display can display them.

And in fact that's the problem with this frame-time benchmarking: if the workload is such that it's artificially rendering more frames than can be displayed, it doesn't really matter much if they are displayed at a consistent rate. If you want to see how much better a multi-GPU solution is, you need to test a workload that is being rendered at less than the Vsync rate (or at least around that rate).
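A single-threaded version of that idea is the fixed-timestep loop sketched below (placeholder names, not a real engine API); the threaded variant simply splits the two loops across a simulation thread and a render thread. Simulation advances in whole ticks driven by recorded inputs, while rendering interpolates between ticks and is capped by the display.

    /* Placeholder hooks so the sketch stands alone. */
    int    game_running(void);
    double now_seconds(void);
    void   record_and_apply_inputs(void);
    void   simulate_tick(double dt);
    void   render_frame(double alpha);
    void   swap_buffers_vsync(void);

    #define TICK_SECONDS (1.0 / 60.0)   /* fixed simulation step */

    void run_game(void)
    {
        double accumulator = 0.0;
        double previous = now_seconds();

        while (game_running()) {
            double current = now_seconds();
            accumulator += current - previous;
            previous = current;

            /* advance the simulation in whole, fixed-size ticks so a recorded
               input stream replays to exactly the same result */
            while (accumulator >= TICK_SECONDS) {
                record_and_apply_inputs();
                simulate_tick(TICK_SECONDS);
                accumulator -= TICK_SECONDS;
            }

            render_frame(accumulator / TICK_SECONDS);  /* interpolate between ticks */
            swap_buffers_vsync();                      /* no point outrunning the display */
        }
    }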

Game developer support (0)

Anonymous Coward | about a year and a half ago | (#42983651)

nVidia and AMD both have to bribe game developers to get them to consider threaded rendering at all. Coupled with the bugs in the respective companies' implementations you have lost entire generations of hardware purchases to the simple economics of "if I buy a better single-GPU card now, I get more value for my money than multiple lesser GPUs that might not even work."

Multi-GPU solution? (0)

Anonymous Coward | about a year and a half ago | (#42983757)

What is a multi-GPU solution? Liquid Hydrogen?

Solutions are all tech articles talk about these days. Shouldn't electronics be kept safe from liquids?

Not Real World (0)

Anonymous Coward | about a year and a half ago | (#42984281)

If I'm buying a GPU for a video game, I only care about how it benchmarks on video games.

Re:Not Real World (1)

GigaplexNZ (1233886) | about a year and a half ago | (#42987665)

If I'm buying a GPU for a video game, I only care about how it benchmarks on video games.

Uh... No, you don't. You care how it actually PLAYS the games. Benchmarks are just an indication used for purchasing the GPU, and this article talks about a better way of measuring gaming performance to make better decisions.

Methodology? (0)

Anonymous Coward | about a year and a half ago | (#42984595)

Why would the study of methods to ascertain representative GPU speed change multiple GPU solutions?

Do you mean a new testing method perhaps?

Testing Bias (0)

Anonymous Coward | about a year and a half ago | (#42984777)

It seems to me that lots of times when AMD or ATI back in the day made a damn good product, most testers would go out of their way to emphasize Intel/Nvidia's good points, while emphasizing the bad. It's made me suspicious as all hell.
Personally out of all the different products I've used I've been most impressed by AMD and ATI.
Hell, I am running on an i7 and GTX 260 and it stutters while playing Synthesia. It's terrible.
But the Radeon 7570 (I think that's the model) that I put in my buddy's old Core 2 Duo setup runs everything he's thrown at it.

Multi-GPU works somewhat... (1)

Anonymous Coward | about a year and a half ago | (#42984833)

As a long-time GTX 295 owner, I've known for quite a while that my eyes are really good at seeing stuttering. For a few years, my GTX 295 did a splendid job keeping up with games, and as long as I could manage 60 FPS everything seemed pretty smooth. I did have a few moments where I did see micro-stuttering, but I found that either enabling V-sync or enabling frame limiting solved the problem. As you can see in this diagram http://www.pcper.com/files/review/2013-02-22/fr-3.png it's very possible for your GPUs to sync up perfectly, producing two frames at the same time and effectively causing every other frame to never reach the screen. Forcing games to synchronize to a certain FPS, either via V-sync or frame limiting, therefore helps the GPUs render at perfect intervals, assuming your hardware can render at that speed (usually 60 FPS).

And there's my point. SLI and Crossfire work perfectly fine as long as your hardware can pump out 60 FPS so you can sync it to that frame rate. As soon as it drops below that for even just a few seconds, it's easy as hell to spot a drop to 55 FPS that looks like 30-40. Therefore, SLI and Crossfire have tremendous value in their time, but are far from future-proof, since as soon as you go under 60 FPS you basically drop to single-GPU framerates. For me, that wall was hit around when I started playing BFBC2, and I'm forced to play BF3 at low to ensure a minimum 60 FPS at all times.

A multi-GPU solution is a great investment for enthusiasts who upgrade their hardware every year or two, but a horrible mistake for people expecting it to last several years.
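The frame limiting mentioned above can be as simple as sleeping out the remainder of the frame budget; a minimal sketch, assuming now_seconds() and sleep_seconds() wrap whatever timing calls your platform provides:

    /* Placeholder timing primitives. */
    double now_seconds(void);
    void   sleep_seconds(double s);

    #define TARGET_FRAME_SECONDS (1.0 / 60.0)

    /* Call with the timestamp taken at the start of the frame: if the GPUs
     * finished early, wait out the rest of the 1/60 s budget so frames are
     * delivered at even intervals instead of in bursts. */
    void limit_to_60fps(double frame_start)
    {
        double elapsed = now_seconds() - frame_start;
        if (elapsed < TARGET_FRAME_SECONDS)
            sleep_seconds(TARGET_FRAME_SECONDS - elapsed);
    }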

Failed methodology... (0)

Anonymous Coward | about a year and a half ago | (#42985847)

The methodology of using FRAPS to capture the frame rate is flawed at several levels.

To start with, FRAPS will take system resources, IO, CPU and probably on certain configurations even GPU resources.

The "overlay" is sort of interesting, BUT doesn't reveal what is actual frames or not.

They will need to get back with a better solution for measuring real frames per second.

Besides, 60 fps, which is the refresh of the "image", is more than the human eye can distinguish (I will not enter into the actual details of the maximum screen refresh rates; I will leave that to you, to do the math of screen MHz into the max FPS that they can render).

There is also the issue of the actual screen materials that have a max capability of changing luminescence (which may or may not be in sync with the max screen refresh rate).

So, interesting analysis, but I fail to see where the result is (whether you are measuring actual frames, or the actual limitations of any of the mentioned issues).

Re:Failed methodology... (1)

UnknownSoldier (67820) | about a year and a half ago | (#42986265)

> Besides, 60 fps, which is the refresh of the "image" is more then the human eye can distinguish

No it is not. A few of us gamers can tell the difference between 60 Hz and 120 Hz.

The "practical" limit is anywhere from 72 Hz - 96 Hz. Sadly I'm not aware of anyone who has actually a done study what the practical limit is.
