Fastest Graphics Ever, Asus ARES Rips Benchmarks 208
MojoKid writes "Over-the-top, killer graphics cards are always fun to play with, though they may not be all that practical. With a pair of ATI Radeon HD 5870 GPUs on a single PCB and 4GB of GDDR5 graphics memory on board, the recently released Asus ARES is one such card that can currently claim the title of being the fastest single gaming graphics card on the planet. This dual-GPU-infused beast rips through benchmarks, besting even the likes of a Radeon HD 5970 or NVIDIA GeForce GTX 480. You can even run a pair of them in CrossFire mode, if you're hell-bent on the fastest frame rates money can buy currently."
OpenCL? (Score:5, Insightful)
Yeah but what about OpenCL performance?
Some of Anandtech's Fermi benchmarks put it 4x+ behind in GPGPU tests.
Re:OpenCL? (Score:5, Interesting)
Folding seems to indicate the same. nVidia's recent changes to their architecture boosted power consumption, but made double-precision floating point ops about 4x faster. Good for GPGPU, but not so good for games. (which don't really use double-precision floating point)
Re:OpenCL? (Score:5, Interesting)
I seem to recall that the double-precision performance of NVidia's latest graphics cards would've been truly impressive... if they hadn't intentionally crippled it on all of the gaming cards in order to force people to buy compute cards costing several times the price. Works out that double precision runs at 1/8th of the speed of single precision - the same ratio as the previous generation - as opposed to 1/5th on ATI Radeon hardware and 1/2 on NVidia's really expensive professional cards.
Flying fuck. (Score:4, Insightful)
How relevant is that for a gaming card?
Remember, this is a product that comes with a GAMING MOUSE thrown in. It's like asking how much of a load the latest supercar can haul. It's irrelevant, as long as there's no games using OpenCL. Trust me, when OpenCL is a big thing in gaming, these cards will be long forgotten.
Re: (Score:3, Insightful)
My first question is: what is OpenCL? (blah blah blah, I'll Google it.) My second question is: why is it important on a card I'm going to use for gaming, in a market as DX-dominated as this one? Because as it stands, I don't see it.
GPGPU? (Score:2)
Lots of frames is neat, but how fast can it run my BOINC client?
5890 Ultra (Score:3, Informative)
So it's actually an ATI Radeon 5890 Ultra. You'd be better off buying two discrete 5870 cards and running them in CrossFire: thermals will be better, so you'll be able to overclock them further.
Re: (Score:2)
No idea what you just said dude.
Re: (Score:2)
Just the idea of buying the best card today, or linking two cards like the 5770 together, then hunting for games to play.
Re: (Score:3, Interesting)
This article also compares the ARES to a pair of HD 5870s and you are mostly correct:
http://www.pcper.com/article.php?aid=953 [pcper.com]
Keep in mind that with 2GB cards you are actually only saving about $200 by NOT using the ARES.
Am I a cheap bastard? (Score:5, Insightful)
The Asus ARES commands a hefty $1200 MSRP.
What the fuck
Re: (Score:2)
The Asus ARES commands a hefty $1200 MSRP.
What the fuck
I am not really sure what your price expectations are for the fastest videocard ever made, of which the manufacturer makes only 1000 to be sold?
Re: (Score:2)
I am not really sure what your price expectations are for the fastest videocard ever made...
My price expectation is that this thing will cost $150 in 18 months, just like every expensive video card that came before it.
Re: (Score:2)
It doesn't really. It depends on the context and the tense of the verb it's used with. Example: "I wondered if he'd ever seen a computer before" is entirely past-tense in meaning.
Regarding the discussion at hand, "X-est ever" really is more or less equivalent to "X-est to date". If you want to include the future, then you'd say "X-est that will ever be made".
Re: (Score:2)
I would have just pointed out that the sentence also contained the word "made".... It would be hard to have "made" something in the future. The word "made" means it already happened in the past; "make" refers to the future.
So the sentence "the fastest video card ever made" means exactly that: it is the fastest card made yet.
The sentence may not hold true for very long, but on this day it is correct lol
Besides, if the world got blown up this afternoon then it would be true for all time (unless you believe that E.T's a
Re: (Score:2)
The word is rarely ever* used that way in this context.
* "rarely ever" as in practically never
HTH
Re: (Score:3, Insightful)
Don't worry, nobody is forcing you to buy one. Besides, only a thousand will be sold in the US anyway; I am sure this Ferrari of a video card will find its buyers.
Re: (Score:2)
"Big $ and small brain" aren't a prerequisite for wasteful spending. In fact, if you look around, you'll notice that many of those who really should be more frugal (considering their real level of "$"), are actually often first in line for pointless shopping.
Re: (Score:2)
Unless there are some REALLY fuckin' stupid people over there in the states with lots of money
Sir or madam, speaking as an American, I can ASSURE you that there are more than a thousand people here who fit your description. If you desire evidence, I would like to remind you of the kind of candidate Big Money likes to elect in this country, and the judgment (or lack thereof) that illustrates. I don't think they'll have any problem selling out of this particular piece of hardware. It just won't be to working stiffs.
Comment removed (Score:4, Informative)
Re: (Score:3, Insightful)
Don't worry, nobody is forcing you to buy one,
You'll eat those words upon the next release of the Crytek engine...
Re: (Score:2, Informative)
I only spend ~$100 on average on my videocards.
I got a GTS 250 for $100 close to a half-year ago. A friend of mine just got a Radeon 4870 for $100!
Re: (Score:3, Insightful)
I only spend ~$100 on average on my videocards.
I got a GTS 250 for $100 close to a half-year ago. A friend of mine just got a Radeon 4870 for $100!
Great. Now what does this have to do with anything?
Re: (Score:2)
That there are always people that are too stupid to buy such a thing when you can get a great GPU for 100 dollars?
In other words parent explains why this card is totally useless...
So you are saying that the current $200, $300, $500 and $1000 cards offer no value over $100 ones? That's true, I guess, if you are playing games a decade old and/or have a small monitor.
Re: (Score:3, Informative)
Crysis Warhead at 1680*1050 at max setting 'enthusiast' or something gives 30+ fps on Windows XP SP2 with my AMD Phenom 9950 X4, 8GB RAM, HD5770...
So you were saying?
Re: (Score:2)
Crysis Warhead at 1680*1050 at max setting 'enthusiast' or something gives 30+ fps on Windows XP SP2 with my AMD Phenom 9950 X4, 8GB RAM, HD5770...
So you were saying?
I am saying that:
1) 30fps is a joke and not anywhere near a playable framerate
2) The 5770 is a $150-200 videocard.
Re: (Score:3, Informative)
It is perfectly playable, for anyone with human eyes [wikipedia.org]
Re: (Score:2)
Here is a simple test. Try adjusting the refresh rate on your monitor. Start above 75 and go lower. At some point you will start to see it flicker. Note that the point you start to see it flicker is well above 30.
Re: (Score:2)
That depends on the speed of movement in the image. The more things move between two frames, the harder it will be for your eyes to connect the two images together. This is made even worse when the whole image moves, since you aren't getting corrective feedback from your inner ear.
What I'm saying is that 30 fps seems smooth, but 60 fps makes it easier to keep track of what's happening.
Re: (Score:3, Insightful)
I can't believe anyone has seen a spoked wheel [wikipedia.org] in a movie and never wondered why it rotates backwards.
This "debunking" shown in your first link is not showing the difference between 30 fps and 60 fps. Considering the fourfold symmetry of the rotating square, what it's actually demonstrating is that 7.5 fps looks choppier than 15 fps.
There will
Re: (Score:3, Insightful)
Movies do just fine with motion blur and so does Crysis.
But if you are a pro online FPS gamer who makes loads of cash, then the extra frame that half-renders without vsync can make that tiny but important difference between who shoots first in a critical encounter.
But when you are not a pro FPS gamer playing for money on a LAN, then anything higher than 30 FPS minimum / 60 FPS maximum is total bullshit.
Besides, all hardcore gamers play at a slightly lower resolution, and everything but lighting, reflection and model detail is as much downs
Re: (Score:3, Insightful)
For video editing that is crucial, so that you can get overlapping frames inside of that 24fps timeframe.
Games, however, do not work like this. They crank out as many frames per second as your HW can handle, and then without vsync it is all hit and miss. Ask your movie expert and he'll tell you why that is.
Re: (Score:2)
It's trivial to tell the difference between 24fps and 30fps because a quick sideways pan on TV (even interlaced) is just some stuff moving but a quick sideways pan in the theater, even from the back row, makes me wanna puke. And frankly, you can trivially tell the difference between 30fps and 60fps in a big pan on a large HDTV.
Re: (Score:2)
People may have a hard time seeing something occurring i
Re: (Score:2)
"But but but but but..."
That's motion blur. Maybe the TV creates extra frames but that's still motion blur in your head. No matter what way you look at it. Your brain just puts all these frames into one frame. That's how motion blur works in the first place.
And the reason games do not work like that is because that makes your view lag 1-2 frames behind, so all you're seeing is smoother frames, but not more than about 24 frames per second.
Now you can get a monitor that has the perfect response time that is a
Comment removed (Score:4, Funny)
Re: (Score:2)
"there's no practical difference between 24 fps and 60 fps"
Except you are incorrect: in computer terms there _is_ a practical difference. A card that sustains 60fps can render more detail at a decent clip; that is what FPS is really measuring, an abstraction for how much detail can be rendered.
And it's not the average framerate that matters so much as the _minimum_ fps at a level's (or a game's) most taxing points for a video card.
Re: (Score:2)
Look, I trust my eyes, and they can definitely see a CLEAR distinction between 24 fps and 30 fps, and 30 fps and 60 fps, (and when I had a CRT and played Q3 in 640x480 so I could use 120 hz, 60 fps and 120 fps) in video games on my computer monitor.
I can also say that a lot of times at the theater, action movies look choppy as FUCK to me (to the point of distraction in some cases), even with the built-in motion blur (a choppy transition between two blurry pictures still looks choppy to me). And yes, I realize that
Re: (Score:2)
not to mention that lots of people these days play on 1920*1080 or above screens (the ones spending 500 bucks on a VGA card, anyway)
and with Eyefinity out on the loose, some people play these games spanning three monitors
Re: (Score:2)
OK, even if you bought two 200 USD graphics cards every two years, you'd always be able to play the latest games and tech, compared to one 1200 USD card every six years: 200*2 = 400 USD per upgrade, and 400*3 upgrades over six years = 1200 USD.
Not to mention that you stay up to date with the latest OpenGL (currently version 4, if you missed it) and GLSL and DirectX
Re:Am I a cheap bastard? (Score:5, Informative)
1) 30fps is a joke and not anywhere near a playable framerate
FPS is one of those subjective issues where there seems to be a lot more "I don't like X, so you are daft for suggesting someone might" than hard facts.
For a lot of people 30fps is perfectly fine if it is a minimum rate rather than an average. A lot of people talk at cross purposes on this one: the "30 is fine" crowd assumes that people looking for 100+fps (when their monitor probably refreshes at 60Hz) are daft and want 100+fps everywhere, while the "30 is nowhere near enough" crowd assumes the 30fps people would be happy with 30 on average. For games that require decent graphics hardware, the demand on that hardware can vary a lot, so a card that gets 30fps in some areas will drop below 15fps in others; likewise, a card that pushes 100Hz in the lighter scenes may drop below 50 in the really heavy ones.
So any quote of an fps requirement or recommendation is completely useless unless you qualify the figure in more detail.
Another factor that needs to be considered is screen size. An object moving from one side of the screen to the other at the same framerate is going to look smoother on a smaller monitor than it will on a full-wall projector (unless, of course, you are far away from said wall, to the point where it is effectively the same size as the small monitor in terms of how it appears on the back of your eye). How far objects on the display travel between frames is what needs to be measured, not just how many frames there are in a given time. This brings up another point as to why this sort of thing is subjective and difficult to discuss reasonably (without so much supporting detail that you bore people to death): it very much depends on what games you play and how you play them.
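The "how far objects travel between frames" point is easy to put numbers on. A minimal sketch (the function name, the resolutions, and the one-second crossing time are made-up illustrations, not figures from this thread):

```python
def pixels_per_frame(screen_width_px, crossing_time_s, fps):
    """How far an object jumps between consecutive frames when it
    crosses the full screen width in crossing_time_s seconds."""
    return screen_width_px / (crossing_time_s * fps)

# Same motion, same framerate: bigger jumps per frame on a wider display.
small = pixels_per_frame(1680, 1.0, 30)   # 56.0 px per frame
wall  = pixels_per_frame(3840, 1.0, 30)   # 128.0 px per frame
# Doubling the framerate halves the jump, which is why the same fps
# can look fine on a monitor and choppy on a projector.
fast  = pixels_per_frame(3840, 1.0, 60)   # 64.0 px per frame
```
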
Re: (Score:2)
But 15fps is still very playable, especially if it is a constant 15fps. What makes games unplayable (or at least annoying) is having 30fps, then 15fps, then 60fps and then 1fps. I can usually enjoy games as long as they stay in the 15-20fps range; anything slower than that and aiming becomes difficult.
Re: (Score:2)
15fps is often fine if it is constant, though I would usually demand more (and be willing to drop things like texture quality, AA, and so on to compensate) from a fast-paced game.
A drop from ~30 to ~15 in a busy scene, if you are watching intently, can be a bit jarring though (I assume the change in timing is enough to kick off a "something is different, best be alert" reflex in the brain's optical processing). Much more so than the same relative change in a drop from ~60 to ~30 (which I'm not 100% sure I'd
Re: (Score:2)
The problem with "playable vs. non-playable" FPS is that it's personal, because the part of the eye that can see the difference varies from person to person.
Essentially, our focus cannot see the difference. It's physically impossible for pretty much any person to see over 30 pictures per second in the eye's focal area, due to the way focus handles image capture and transfer to the brain.
However, peripheral vision is far greater in this respect, due to its original design purpose: to spot movement. The image seen by peri
Re:Am I a cheap bastard? (Score:4, Informative)
First, 30 FPS is probably too low for an FPS, because you are likely stating what the framerate is when you are motionless and in one particular spot. Now replay a few matches and tell us what your minimum frame rate was; I bet it's in the very low teens or worse. That isn't acceptable, and you are more likely to lag when the action gets thick and you need your FPS the most.
Secondly, Crysis/Crysis Warhead is a 3-year-old engine that's a generation behind. Of course games from 3 years ago play fine on $100 video cards, but those cards would have been the $600+ cards 3 years ago too. Try picking at least a current-gen game.
And lastly, 1680x1050? My LCD's native resolution has been 1920x1200 since I got it 5 years ago. Try getting a decent monitor.
Playing old games on low-resolution monitors and cherry-picking frame rates only proves how wrong you are.
Re: (Score:2)
It's all relative.
Would you pay $150 for a single dinner for one? I do it on a regular basis (i.e., at least bi-weekly). I'm not, by any stretch of the imagination, wealthy but I enjoy the experience. In fact, I don't eat out during the week, drive an older car, live in a small house, buy clothes maybe once a year, and forego many of the other things that co-workers with similar incomes see as normal (video games, movies, smart phone) just so that I can eat out.
You can get a great meal for $10. Hell, you
Re: (Score:3, Insightful)
What price range were you expecting for 'fastest video card'?
Re: (Score:2)
Think yourselves lucky: it's the equivalent of $1800 here in the UK.
http://www.overclockers.co.uk/showproduct.php?prodid=GX-230-AS [overclockers.co.uk]
Re: (Score:2)
Exactly. This thing is going to be desperately looking for a market. Fact is that PC games can be played on quite average hardware these days because they are either ported from, or intended to be ported to consoles.
Couple that with the fact that by the time games can really take advantage of that horsepower in about 12-18 months time, you'll be able to get the same throughput on a $400 card that takes up less space, runs cooler and uses less power.
Re: (Score:2)
I don't get why people are saying that; among the most popular PC games you'll find WoW, soon Starcraft 2 (and, unfortunately not soon, Diablo 3 - but how late it will be only strengthens what I'm trying to show), numerous games based on Source, GalCiv2, Sins of a Solar Empire, soon Elemental (really can't wait for this one); also such nice gems as World of Goo, Aquaria, et al. Many of those supposedly showing "what PC gaming is all about", most of those not touching consoles in any way, also most with quite
Re: (Score:2)
The Asus ARES commands a hefty $1200 MSRP.
What the fuck
And the name is still (barely) an anagram of "arse". :-) [urbandictionary.com]
And the games? (Score:2)
When will we have some new eye candy to make the new cards melt?
Re: (Score:2)
When the next generation of consoles arrive.
limited edition (Score:3, Insightful)
Have other cards been offered as 'limited editions'? I was reading the review and thinking "cool, I'll have that in a year..." but then noticed they're only shipping 1000. Then I thought, no way, it might be _that_ card that's just 1000 units, but I'm pretty sure one almost like it will follow.
Re: (Score:3, Interesting)
Have other cards been offered as 'limited editions'?
From what I've seen there is often at least one for each generation of each major manufacturer's chip. Sometimes there is more than one, as two or more board builders compete to see who can earn the most nerd points by pushing a given generation of chip the furthest (by over-clocking everything, over-speccing other parts, including the required cooling system to keep the out-of-spec setup inside an acceptable thermal profile, and turning the marketing up to 11).
I tend to ignore such limited editions
Why this doesn't matter (Score:4, Funny)
This [techbuket.net] was over the top, totally bonkers, hilariously exaggerated just 10 years ago. With not two but 5 of the hottest graphics processors of the time on one board, it would smoke the competition in any benchmark (particularly Bungholiomark). Now tell me, what good would five times the performance of a ten year old card do in one of today's games? The ASUS ARES is just as ridiculous, but it's real and they expect you to pay real money for it. If you do that, the joke is on you.
Re: (Score:2)
"Just"? Just ten years ago? The only thing I've got left from 10 years ago is the mouse. If you're still using it ten years later, I rather think you've had your money's worth.
Cheers,
Ian
Re: (Score:2)
No - clearly not. But computers are of their time - a ten year-old machine that's still doing good work today represents excellent, and I do mean excellent value. Same for a peripheral. I'm typing now on an...erm...I think 4 year-old MacBook Pro - it's the newest machine I've got and
Re: (Score:2)
Never said you should. Said quite the opposite in fact - said that if you're still using a ten year-old card today, you've had excellent value. Thing is though, I doubt you are and I'll bet there are good, sane reasons for this.
Joining the discussion - that's actually quite possible and not much of a stretch. In fact, one of my most often used PCs has...Matrox G400 16MB; one that will be 10 years old in a month IIRC (and the type is available for 11 years, I think). Most of the components in that machine ar
Re: (Score:2)
dunno...the Chrome and the Volari would've slowed things down...greatly.
Re: (Score:2)
The ASUS ARES is just as ridiculous, but it's real and they expect you to pay real money for it. If you do that, the joke is on you.
No, they don't expect people to pay for it. They just create these cards so they can say that they have the fastest consumer card on the planet. It is hoped that this will raise the prestige of the brand and people will buy the affordable versions.
Re: (Score:2)
It's even older than that. This is just an update of the original photoshop job done on the 3Dfx Voodoo2. Remember that card? ;)
Go Back in time with it (Score:2)
"Look, I have brought a graphics card from the future!"
*Cracks open the case (with added dry ice for the appropriate smoke effects)*
"Compared to your puny Voodoo2 with 8MB of RAM, this has 4GB! It weighs over 2 kg, requires over 200 watts of power, and other fancy numbers. Tremble at its heatsink!"
Anyway, I digress, I wonder if this card is faster than all the Voodoo2s sold put together?
Re: (Score:3, Interesting)
Who knows, but that's not of the essence. Unfortunately, computer games have gone the way of Hollywood movies: all glitter and no substance.
My favorite game genres are adventure games and car simulations. Ten years ago I used to play the Need for Speed: Porsche [wikipedia.org] game, and I have yet to see a similar game that's as much fun for the casual gamer.
Racing games today have much better graphics, the cars look almost like photographs, but they aren
Re: (Score:3, Informative)
It's not that sad. There's still gems here [telltalegames.com] and there [machinarium.net].
Re: (Score:2)
Though "Porsche Unleashed" was already an arcade game... as was pretty much the whole NFS series from its inception. Which isn't strictly a bad thing, really - after all, it essentially means that a game knows it's a game.
(generally, consider you might be also falling a bit into simple nostalgia; and overlooking lots of indy stuff, which gives often quite close experience to "golden days", whichever time period we mean by that in given discussion (~= from your youth); with "mainstream" expanded, and popular stuff bei
Re: (Score:2)
The first FlatOut still has soul. It's quite a blast to play with a steering wheel and a bunch of friends.
Re: (Score:2)
Bringing back two Voodoo 2 cards, found for pennies now, (or some nice Voodoo 3) would be probably a bit more practical.
Re: (Score:2)
You do realize that was only 12 years ago? ('98?) We've come a long way. Then compare it to smartphone development, which has grown light-years faster than the PC side of things. It goes to show what competition can bring to innovation. The PC has become a stale, fixed platform. Sure, we get a faster processor every couple of years, and the amount of RAM in PCs now is astounding, but at its core it's still basically the same machine that existed 15-20 years ago. 64-bit is the only thing relatively new, and to b
Re: (Score:3, Insightful)
I find this need to equate happeningness with innovation tedious.
Cell phones would look pretty pathetic if they weren't embedded in an ecosystem which makes it possible to efficiently produce the software, content and phone designs you implicitly rave about. Just about every great innovation that makes the modern cell phone possible was developed primarily on giant PC workstations.
Just like it's easier to have a lot of spare cash in early adulthood (and the coolness associated with that) if you still live in y
Wow (Score:2)
This just in : spend $1200 on a graphics card and you'll get a card that destroys everything else on benchmarks! (til next month) And you'll have to wait 3 solid years before games are out that really need the card!
Film at 11.
And don't forget (Score:3, Insightful)
Re: (Score:2)
Trust me. If they think they could sell a $2400 card, they certainly would try. When the front of the pack started costing about $500 for the price of admission, it was clear to me that they would just keep increasing the limit higher and higher. $1200 is a lot, but at only 1000 cards, I think they realize this is a very limited market.
what's the point of the briefcase? (Score:4, Interesting)
sure... it's cool... but at the same time... gimmicky...
Once I install the card... it stays in there, not in the briefcase.
And the "gaming mouse"....I'm sorry, I like my G5 (rev 2).
Plus the price makes it un-attractive.
Re: (Score:3, Funny)
Just so if you go on vacation, and decide to take your gaming graphics card, instead of your girlfriend . . . it should be great for getting flagged, when going through airport security.
TSA agent: "Sir, what exactly is this . . . ?"
Gamer: "It's the fastest graphics card in the world, as we know it! And it costs $1200 . . . and came with this great briefcase!"
TSA to colleague: "I don't see any Apple logo on it. Cuff him, and put him on the next flight to Guantanamo. Send the briefcase to the lab in
Re: (Score:2)
that's what my LAN party rig is for...
Re: (Score:3)
I was going to make a snarky remark about how it would be unlikely that you would have a gf if you bought this card, but come to think of it, if you have $1200 to blow on a video card, there is a very good chance that you have a girlfriend.
Re: (Score:2)
getting on /.
Re: (Score:2)
The card costs $1200. Do you think that they would just put it in a static bag and drop it in a cardboard box? The packaging of a high-end retail item is typically one of the most important aspects. Companies spend a lot of time and money to have really interesting packaging.
One page (Score:2, Informative)
Gaming driving video card development (Score:2)
Now all we need is the next iteration of Crysis to suck the life out of the latest and greatest video card so people will be pining away for something offering the performance of 3 of these puppies in crossfire mode @ 1/3 of the price.
Geeks all over the world are going to have to live with mom and dad an extra few months to pay for that indulgence. :)
IO limited? (Score:2)
Assuming the card provides 4.64 TFLOPS and PCIe offers 8GB/s, one should be able to perform about 2320 single-precision floating point operations per float sent. Is this what GPU programmers want, or do you feel that the card is twiddling its thumbs?
I could imagine that the io-to-flops ratio is just as it should be, but I'm curious what people think about it.
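For what it's worth, the 2320 figure falls out of simple division; a quick sketch assuming 4-byte single-precision floats and taking the quoted 4.64 TFLOPS and 8 GB/s at face value:

```python
BYTES_PER_FLOAT = 4          # single-precision IEEE 754
flops = 4.64e12              # quoted peak compute, operations/s
pcie_bw = 8e9                # quoted PCIe bandwidth, bytes/s

# How many floats per second the bus can deliver, and how many
# operations the GPU can do on each one before the bus catches up.
floats_per_second = pcie_bw / BYTES_PER_FLOAT   # 2e9 floats/s
ops_per_float = flops / floats_per_second       # 2320.0
```
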
Re: (Score:3, Informative)
Typically for graphics cards, the only data sent over PCIe is texture data, vertex lists, and commands. The bulk of the work done by the card is running those commands over the vertex lists while bringing in texture data. The commands almost always form a multi-pass pipeline, so each vertex will be used in computations more than once. The result is then pushed to the monitor, not back over PCIe. So yes, in general, a graphics card will have far more FLOPS than I/O bandwidth.
Re:Linux? (Score:5, Funny)
Re:Linux? (Score:4, Informative)
Oh yeah, sorry... I forgot to mention that that engine is actually used to make this commercial game that is coming to Linux: http://www.primalcarnage.com/website/ [primalcarnage.com]
Re: (Score:2)
Re: (Score:3, Informative)