AMD's Dual GPU Monster, The Radeon HD 3870 X2

CmdrTaco posted more than 6 years ago | from the heckuva-lot-of-video-cards dept.


MojoKid writes "AMD officially launched their new high-end flagship graphics card today and this one has a pair of graphics processors on a single PCB. The Radeon HD 3870 X2 was codenamed R680 throughout its development. Although that codename implies the card is powered by a new GPU, it is not. The Radeon HD 3870 X2 is instead powered by a pair of RV670 GPUs linked together on a single PCB by a PCI Express fan-out switch. In essence, the Radeon HD 3870 X2 is "CrossFire on a card" but with a small boost in clock speed for each GPU as well. As the benchmarks and testing show, the Radeon HD 3870 X2 is one of the fastest single cards around right now. NVIDIA is rumored to be readying a dual GPU single card beast as well."

146 comments

But does it run Linux? (3, Interesting)

Ed Avis (5917) | more than 6 years ago | (#22207812)

No mention in the article summary of whether this is covered by ATI's recent decision to release driver source code. If you buy this card, can you use it with free software?

(Extra points if anyone pedantically takes the subject line and suggests targeting gcc to run the Linux kernel on your GPU... but you know what I mean...)

Re:But does it run Linux? (1)

DoctorDyna (828525) | more than 6 years ago | (#22207910)

Was a Linux version of Crysis released that I didn't hear about?

Re:But does it run Linux? (1)

Spokehedz (599285) | more than 6 years ago | (#22208668)

Oh how I wait for this to be reality. My dual 8800GTX cards want to be in Linux all the time, but sadly there is no way to run it without Windows.

Re:But does it run Linux? (4, Informative)

habig (12787) | more than 6 years ago | (#22207986)

No mention in the article summary of whether this is covered by ATI's recent decision to release driver source code. If you buy this card, can you use it with free software?

While AMD has done a good thing and released a lot of documentation for their cards, it has not been source code, and has not yet included the necessary bits for acceleration (either 2D or 3D). That said, I'm watching what I'm typing right now courtesy of the surprisingly functional radeonhd driver [x.org] being developed by the SUSE folks for Xorg from this documentation release. While lacking acceleration, it's already more stable and lacks the numerous show-stopper bugs present in ATI's fglrx binary blob.

Dunno yet if this latest greatest chunk of silicon is supported, but being open source and actively developed, I'm sure that support will arrive sooner rather than later.

Re:But does it run Linux? (2, Informative)

GuidoW (844172) | more than 6 years ago | (#22209208)

Actually, what did they really release? I remember some time ago, there was a lot of excitement right here on /. about ATI releasing the first part of the documentation, which was basically a list of register names and addresses with little or no actual explanation. (Although I guess if you have programmed graphics drivers before, you'd be able to guess a lot from the names...)

The point is, it was said that these particular docs were only barely sufficient to implement basic things like mode-setting, 2D support, and maybe TV-Out, but certainly not 3D acceleration. There was a promise by ATI to release even more documentation in the future to allow these things, but so far, I haven't seen anything. I did some googling to find out if maybe I've missed something, but that turned up very little. Even the X.Org wiki didn't help much.

So, does anyone here know a bit more? What's the real status of the released docs? Is there enough to do a real implementation with all the little things like RandR, dual head support, TV-Out and 3D-support, or is ati just stringing us along, pretending to be one of the good guys?

Re:But does it run Linux? (1)

habig (12787) | more than 6 years ago | (#22209928)

What's the real status of the released docs? Is there enough to do a real implementation with all the little things like RandR, dual head support, TV-Out and 3D-support, or is ati just stringing us along, pretending to be one of the good guys?

RandR and dual head work, based on what's running on my desk right now. Better than fglrx.

No idea about TV-out. Some 2D acceleration is in the works, but the 3D bits were not in the released docs (although rumors of people taking advantage of standardized calls abound, see the sibling of this post).

Well, barely (1)

G3ckoG33k (647276) | more than 6 years ago | (#22208238)

AMD/ATI still has issues delivering drivers on par with nVidia, depending on the application.

But, yes it does run Linux.

Re:But does it run Linux? (0)

Anonymous Coward | more than 6 years ago | (#22208406)

Who cares?

It's not like linux has any decent games.

Re:But does it run Linux? (1)

ronadams (987516) | more than 6 years ago | (#22209066)

What's that? Sorry, I couldn't hear you over the sound of my Unreal Tournament 2004/Nexuiz/Tremulous/Quake4/et al.

Multiprocessing everywhere! (5, Funny)

AceJohnny (253840) | more than 6 years ago | (#22207852)

Can't make it faster? Make more. Another multiprocessing application. Can I haz multiprocessor network card plz?

When can I have a quantum graphics card that displays all possible pictures at the same time ?

Re:Multiprocessing everywhere! (5, Funny)

mwvdlee (775178) | more than 6 years ago | (#22207968)

Here's your "all possible pictures at the same time" (using additive mixing), and it doesn't even require you to buy a new graphics card:














Cool éh?

Re:Multiprocessing everywhere! (2, Funny)

kvezach (1199717) | more than 6 years ago | (#22208102)

Bah, that probability distribution is just wrong! Or you overflowed all your pixels.

Re:Multiprocessing everywhere! (1)

gardyloo (512791) | more than 6 years ago | (#22208186)

It's all those screenshots of the 2010 Windows White Screen of Death that do that. Once you've added those in, nothing survives.

Re:Multiprocessing everywhere! (1)

PopeRatzo (965947) | more than 6 years ago | (#22208234)

Here's your "all possible pictures at the same time"
I just saw my own death! In an infinite number of ways.

Re:Multiprocessing everywhere! (1)

Spokehedz (599285) | more than 6 years ago | (#22210444)

This raises a good question: If you saw your death, in every way possible, wouldn't that make it impossible to die?

Observing changes the outcome. By observing all outcomes, there is nothing left to change into. Ergo, no way to die?

Re:Multiprocessing everywhere! (1)

n3tcat (664243) | more than 6 years ago | (#22208002)

I'm pretty sure that "white" was available on even the first video accelerator cards...

Re:Multiprocessing everywhere! (0)

Anonymous Coward | more than 6 years ago | (#22209860)

Mine could only display red, green or blue.

Re:Multiprocessing everywhere! (1)

afidel (530433) | more than 6 years ago | (#22208026)

Um? Actually, video cards are an inherently parallelizable problem set. You see this in every modern video card, where the difference between the top and bottom of a product line is often simply the number of parallel execution units that passed QC. All they are doing here is combining two of the largest economically producible dies into one superchip. Oh, and I already have multiprocessor network cards; they're called multiport TCP Offload cards =)

Re:Multiprocessing everywhere! (1)

kvezach (1199717) | more than 6 years ago | (#22208068)

When can I have a quantum graphics card that displays all possible pictures at the same time ?

Quantum algorithm for finding properly rendered pictures:
1. Randomly construct a picture, splitting the universe into as many possibilities as exist.
2. Look at the picture.
3. If it's incorrectly rendered, destroy the universe.

But now, with Quantum Graphics, you don't have to destroy the unfit universes - the card will take care of it for you! Buy now!

You have been deselected... (1)

kholburn (625432) | more than 6 years ago | (#22208524)

The reason for the existence of your universe is to not be rendered on someone's computer game. Thanks for playing but your universe will not be needed.

Re:Multiprocessing everywhere! (1)

Yvanhoe (564877) | more than 6 years ago | (#22208582)

I have a quantum graphics card somewhere, standing still, but I can't locate it! Damn you, Heisenberg!

Re:Multiprocessing everywhere! (1)

TheLink (130905) | more than 6 years ago | (#22208734)

"displays all possible pictures at the same time?"

Goatse, hot tub girl and "Can I haz cheeseburger" at the same time? No thanks.

Re:Multiprocessing everywhere! (0)

Anonymous Coward | more than 6 years ago | (#22208920)

Actually network bonding is common, and is a very sensible way of increasing throughput. Networking, Storage, and Graphics all scale very well in parallel.

crossfire capable? (-1, Redundant)

gEvil (beta) (945888) | more than 6 years ago | (#22207884)

Does anyone know if this card is Crossfire capable? I guess I'll head over to TFA to take a look.

Re:crossfire capable? (1)

Chainsaw Karate (869210) | more than 6 years ago | (#22207954)

You can't put two of these in Crossfire yet. ATI is working on Crossfire X drivers that will allow you to put two 3870x2s in crossfire.

Re:crossfire capable? (0)

Anonymous Coward | more than 6 years ago | (#22208530)

How fast will these be if you run them in Crossfire mode? If you can use one card on 16x with two chips on it, each one gets 8x PCIe (or eight lanes). Put two cards in the same system and each card gets 8x - the equivalent of 4 lanes per core on most motherboards.

Sounds like you have to replace your new motherboard if you want to get the potential out of the cards.
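
For what it's worth, the lane math described above works out as below (a back-of-the-envelope Python sketch; the even 16-lane split is an assumption, and in practice the X2's onboard PCI Express switch shares the link dynamically rather than statically):

    # Rough lane-sharing arithmetic for the scenario in the parent post.
    # Assumes the board's 16 graphics lanes are split evenly and ignores
    # the X2's onboard PCIe switch, which multiplexes both GPUs over one link.
    total_lanes = 16        # lanes the board dedicates to graphics
    cards = 2               # two 3870 X2s in CrossFire X
    gpus_per_card = 2

    lanes_per_card = total_lanes // cards            # 8 lanes per card
    lanes_per_gpu = lanes_per_card // gpus_per_card  # ~4 lanes' worth per GPU
    print(lanes_per_card, lanes_per_gpu)             # 8 4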

Re:crossfire capable? (0)

Anonymous Coward | more than 6 years ago | (#22208706)

There are plenty of motherboards out there that can drive two PCIe x16 slots. Only mediocre boards will downgrade the second slot to x8 when a second card goes in.

Re:crossfire capable? (0, Offtopic)

gEvil (beta) (945888) | more than 6 years ago | (#22208370)

Nice. I get modded down to -1 for legitimately asking if you'll be able to run a 4 core setup using two of these cards. Way to go guys!

Re:crossfire capable? (1)

toleraen (831634) | more than 6 years ago | (#22208720)

You admitted that you didn't even RTFA before asking, your question is covered in TFA, and you said you were about to read it. Kinda like asking a mechanic how much oil your car takes while you start to open the car's manual.

R680 (-1, Redundant)

omeomi (675045) | more than 6 years ago | (#22207918)

kind of a boring codename...

Re:R680 (2, Funny)

MightyYar (622222) | more than 6 years ago | (#22208126)

You must be mis-pronouncing it - it's R-Upside-Down-Nine-Vertical-Infinity-Circle. That's how the engineers all refer to it internally.

Pretty cool if you ask me.

Re:R680 (1)

somersault (912633) | more than 6 years ago | (#22208172)

Not really. This codename was created in remembrance of those who gave their lives in the 'Crossfire' Revolution of 680 AD, where the French (or the Gauls, as they were known back then) ambushed the Germans with their Black Widow catapults from opposite sides of a treacherous ravine, and accidentally killed each other in the process. WTF are you expecting from a codename? o_0

Re:R680 (1, Offtopic)

bob.appleyard (1030756) | more than 6 years ago | (#22208804)

In the 7th Century what we know as France today, along with the low countries and some of western Germany, was known as Francia [wikipedia.org] and was ruled, at least in theory, by the Merovingian [wikipedia.org] line of Frankish kings. This century saw the rise of the Carolingian [wikipedia.org] dynasty within Francia, which reached their height in the late 8th and early 9th Centuries with the reign of Charlemagne [wikipedia.org] .

Germany wasn't a single political entity until the 19th Century, and the Franks were Germanic [wikipedia.org], which is more of a group of identities, but that's as close as you're going to get at this point in time.

Francia would form the basis of the Carolingian Empire, which would itself lead on to the establishment of the Holy Roman Empire [wikipedia.org] , one of the most important political entities in Western Christendom throughout the High Middle Ages [wikipedia.org] , when it was more usually thought of as a continuation of the Roman Empire in the West, even though it was nothing of the kind.

I did appreciate the joke, and I'm not being a pedant or anything. I just thought I'd share with you some of the history of the time. After the Völkerwanderungszeit, but before the second wave of barbarian invasions, this is a crucial period in the early formation of Europe.

Re:R680 (-1, Offtopic)

Anonymous Coward | more than 6 years ago | (#22209194)

You actually are being a giant pedantic fuckface.

You'd be better served by keeping your mouth shut, therefore maintaining the illusion that you're not some kind of bookworm-ish faggot who can't understand that sometimes, you're supposed to only laugh at a joke, not dissect it.

Cockgobbler.

This just in: New technology faster than old. (-1, Redundant)

Fross (83754) | more than 6 years ago | (#22207934)

This isn't really news, is it? It's not even a new graphics card, just a new way of packaging existing ones. The X2 card costs about the same as two 3870s anyway, so you may as well run two of them in CrossFire for the same performance...

The benchmarks really are a bit weird though, comparing the card *only* against single-card GeForce systems, and old ones at that (such as the 8800GTX, which is the same generation as the ATI 2900 rather than the 3800 series), and not even the top end, i.e. the Ultra. For comparison, the costs in the UK for the cards are about:

3850: £110
3870: £135
3870 X2: £255
8800GTS: £170 (the new model, out last month, which they didn't even test, and performs between a GT and GTX)

The 3800 series is obviously aimed at the low end 3d gaming market (and I don't mean tux racer :p ), while the 8800GTS gives the best "bang for your buck" at the moment. The X2 is currently the fastest card in a single socket, but why not benchmark it against an 8800 SLi system, if you are going to bother testing against 3870s in crossfire?

Apart of course from the fact it wouldn't be the fastest card out, and having a new flagship card which is second-best isn't a great thing in that industry.

Re:This just in: New technology faster than old. (2, Interesting)

Kamots (321174) | more than 6 years ago | (#22208312)

Interesting thing is what happens when you stop looking at synthetic benchmarks... and start looking at real gameplay.

Take a read through hardocp's review [hardocp.com] for an example.

As to why AMD released? Well, my understanding is that NVidia is looking to release their own 2-GPU card (9800 GX2) in Feb/March. Given the benchmarks of the current cards, I can't see the 3870 X2 holding up well... so... beat 'em to market. Although when you factor price in, I'd imagine it'll still be competitive; just not anywhere near the fastest.

What I'm waiting to see come out from AMD is the R700 cards... especially if it convinces nvidia to finally release their true next-gen cards as well (not merely the continued tweaking/shrinking of the G80 architecture). Then we can all have something to look forward to :)

Re:This just in: New technology faster than old. (1)

jrwr00 (1035020) | more than 6 years ago | (#22208616)

Nah, the trick with these cards is that you can CrossFire two of them (4 GPUs, anyone?)

Sounds wasteful, but isn't (1, Redundant)

mcvos (645701) | more than 6 years ago | (#22207976)

Two GPUs on a single card? Who the hell needs that kind of power? Besides, don't modern graphics cards waste ridiculous amounts of energy even when they're simply drawing your desktop?

For those who haven't been following the recent releases of ATI graphics cards, it's probably interesting to note that the ATI HD 3850 and HD 3870 use only 20 watts when idling (most low-end cards use at least 30W nowadays, and high-end cards are often closer to 100W).

So that should mean that this new card should eat about 40W when idling, making this card not just the most powerful graphics card today, but also less wasteful than nVidia's 8800GT. Not a bad choice if you're in dire need of more graphics power. Although personally I'm planning to buy a simple 3850.

Re:Sounds wasteful, but isn't (1)

mwvdlee (775178) | more than 6 years ago | (#22208028)

Isn't AMD working on a system which switches back to a low-power on-board graphics chip when drawing the OS?

Re:Sounds wasteful, but isn't (1)

nonsequitor (893813) | more than 6 years ago | (#22209774)

Isn't AMD working on a system which switches back to a low-power on-board graphics chip when drawing the OS?
I don't know what AMD/ATI is currently working on, but you cannot draw an Operating System. You can, however, draw a windowing system, for instance XOrg rendering KDE or Gnome. This is Slashdot; us nerds are pedantic.

Perhaps you meant having a low-power chip which can take over for simple 2D graphics. I believe Aero (hopefully I got the name correct) uses 3D graphics now, and it's all the rage in the Linux world to use XGL, a 3D renderer for a windowing system. So it is unlikely that it would be utilized by whatever modern OS you install unless you explicitly select 2D rendering for your desktop.

RTFA (1)

Fross (83754) | more than 6 years ago | (#22208044)

This card is actually the most power-hungry of the lot.

They only give power consumption for the whole system, 214W when idle, 374W when under load (!)

Some basic math on their results says a single 3870 consumes about 50W when idle, with the X2 consuming about 100W when idle and up to a massive 260W under full load.

(System with a 3870 at idle = 164W, with the 3870 X2 at idle = 214W; the 50W difference is one extra GPU's idle draw, hence a single 3870 ≈ 50W.)
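
Spelling out that subtraction (a rough Python sketch using only the whole-system numbers quoted above; the per-card figures are estimates, since the review measured total system draw):

    # Back-of-the-envelope power estimate from whole-system measurements.
    # Assumes each GPU on the X2 idles like a lone 3870, so the 50W system
    # difference is one extra GPU's idle draw; the baseline is inferred.
    idle_3870_system = 164   # W, system idling with a single HD 3870
    idle_x2_system = 214     # W, same system idling with the 3870 X2
    load_x2_system = 374     # W, system under load with the 3870 X2

    one_gpu_idle = idle_x2_system - idle_3870_system    # 50 W
    x2_idle = 2 * one_gpu_idle                           # ~100 W for the X2
    baseline = idle_3870_system - one_gpu_idle           # ~114 W for the rest of the box
    x2_load = load_x2_system - baseline                  # ~260 W under full load
    print(one_gpu_idle, x2_idle, baseline, x2_load)      # 50 100 114 260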

Re:Sounds wasteful, but isn't (1)

IBBoard (1128019) | more than 6 years ago | (#22208058)

Who the hell needs that kind of power?


Who needs it? Probably graphics artists who are rendering amazingly complex scenes. I can imagine it would help some game designers and potentially even CAD architecture-types. Probably not so much with films because I think they're rendered on some uber-servers.

Who wants it? Gamers with more money than sense and a desire to always be as close to the cutting edge as possible, even if it only gains them a couple of frames and costs another £100 or more.

Re:Sounds wasteful, but isn't (1)

Penguinisto (415985) | more than 6 years ago | (#22208250)

Who needs it? Probably graphics artists who are rendering amazingly complex scenes. I can imagine it would help some game designers and potentially even CAD architecture-types. Probably not so much with films because I think they're rendered on some uber-servers.

Not necessarily. Most standard rendering engines eat system CPU a lot more than they ever would the GPU - especially when it comes to things like ray tracing, texture optimization, and the like.

Most (even low-end) rendering packages do have an "OpenGL mode", which uses only the GPU, but the quality is usually nowhere near as good as you get with full-on CPU-based rendering. Things may catch up as graphics cards improve, but for the most part, render engines are hungry for time on that chip on your motherboard, not necessarily the one on your graphics card. Where the graphics card shines is preview rendering - that is, showing you in the workspace what you'll get while you're still assembling the mesh, texture, composition, etc.

...now HD graphics, video editing, and pro-level photography OTOH? Those could certainly use the boost in some cases...

/P

Don't bother (2, Insightful)

BirdDoggy (886894) | more than 6 years ago | (#22207982)

Wait for the nVidia version. Based on their latest offerings, it'll probably be faster and have more stable drivers.

Re:Don't bother (1)

Wicko (977078) | more than 6 years ago | (#22210038)

Or, pick up a pair of 8800GT's for roughly the same price as AMD's X2, and more performance (most likely). This is assuming you have an SLI capable board. An X2 from nvidia is gonna cost an arm and a leg most likely..

Seriously? Yawn. (3, Insightful)

esconsult1 (203878) | more than 6 years ago | (#22208020)

Am I the only one underwhelmed by almost every new graphics card announcement these days?

Graphic cards have long since been really fast for 99.9999% of cases. Even gaming. These companies must be doing this for pissing contests, the few people who do super high end graphics work, or a few crazy pimply faced gamers with monitor tans

Re:Seriously? Yawn. (4, Informative)

Cerberus7 (66071) | more than 6 years ago | (#22208168)

Actually, graphics power isn't fast enough yet, and it will likely never be fast enough. With high-resolution monitors (1920x1200, and such), graphics cards don't yet have the ability to push that kind of resolution at good framerates (~60fps) on modern games. 20-ish FPS on Crysis at 1920x1200 is barely adequate. This tug-of-war that goes on between the software and hardware is going to continue nearly forever.

Me, I'll be waiting for the card that can do Crysis set to 1920x1200, all the goodies on, and 50-60fps. Until then, my 7900GT SLI setup is going to have to be enough.
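
For a sense of scale, the raw pixel throughput being asked for works out as follows (a sketch; this counts only final shaded pixels and ignores overdraw, anti-aliasing, and per-pixel shader cost, which is where the real work goes):

    # Shaded-pixel throughput needed at 1920x1200 for two frame rates.
    w, h = 1920, 1200
    for fps in (20, 60):
        print(fps, "fps ->", round(w * h * fps / 1e6, 1), "Mpixels/s")
    # 20 fps -> ~46 Mpixels/s, 60 fps -> ~138 Mpixels/s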

Re:Seriously? Yawn. (1)

nschubach (922175) | more than 6 years ago | (#22209154)

Me, I'll be waiting for the card that can do Crysis set to 1920x1200, all the goodies on, and 50-60fps. Until then, my 7900GT SLI setup is going to have to be enough.
Which should be just in time for the "next big game" to come out. ;)

Re:Seriously? Yawn. (1)

jcnnghm (538570) | more than 6 years ago | (#22209180)

That's the biggest problem that I see with PC gaming. Last week, I went out and bought an Nvidia 8800 GTS for $300 so that I could play some of the more recent PC games at an acceptable frame rate at my primary monitor's native resolution (1680x1050). My computer is fairly modern, with a 2.66 GHz dual core processor and 2 GB of DDR2 800. The problem is, even with this upgrade, I could only play Crysis at medium settings.

While it was definitely a performance improvement over my 6800 sli setup, the quality just wasn't there for the price. For another $100, I can get a PS3 that includes a Bluray player and I won't need to worry about tweaking settings and overclocking to get acceptable framerates. Granted, I could probably match the performance of my 360 or the PS3 if I upgraded my processor along with the graphics card, but if I were to do that, I'd be looking at a considerably more expensive upgrade. It just doesn't seem worth it any more.

Re:Seriously? Yawn. (2, Insightful)

stewbacca (1033764) | more than 6 years ago | (#22210334)

Play the game and enjoy it for the best settings you can get. I downloaded the Crysis demo last night for my 20" iMac booted into WinXP (2.33GHz C2D, 2GB RAM, 256MB X1600 video card; hardly an ideal gaming platform, eh?). I read that I wouldn't be able to play it on very good settings, so I took the default settings for my native resolution and played through the entire demo level with no slowdowns. It looked great.


The real problem here is people feeling like they are missing out because of the higher settings they can't play. Just play the game! Quit fueling the ridiculous ten-year-old trend of spending more on graphic cards than the computers themselves! If the game were unplayable at the medium setting, then yeah, I'd say the complaint is valid.

Re:Seriously? Yawn. (1)

zrq (794138) | more than 6 years ago | (#22209400)

I'll be waiting for the card that can do Crysis set to 1920x1200, all the goodies on, and 50-60fps

I recently bought a new 24" monitor (PLE2403WS [iiyama.com] ) from Iiyama. Very nice monitor, but a few problems integrating it with my current video card.

The monitor is 1920x1200 at ~60Hz. The manual for my graphics card (GeForce PCX 5300) claims it can handle 1920x1080 and 1920x1440, but not 1920x1200 :-(

Ok, I kind of expected I would need to get a new graphics card, but I am finding it difficult to find out what screen resolutions the available cards will actually handle. Most of the online shops don't really supply any details, and the manufacturers' websites seem to deliberately make it difficult to find out.

The information is almost always buried at the back of the user manual (which you normally don't get until after you have bought the card). You have to go to the manufacturer's website, select each type of card, transfer to their 'download site', select the card again, download the PDF manual wrapped as a zip file, unpack the PDF... only to find it is the 'lite' version of the manual, which doesn't actually give you details of the screen resolutions.

Am I missing something? Does anyone know of a resource on the net where I can find out what screen resolutions graphics cards are capable of handling?

Re:Seriously? Yawn. (1, Insightful)

TemporalBeing (803363) | more than 6 years ago | (#22209450)

Actually, graphics power isn't fast enough yet, and it will likely never be fast enough. With high-resolution monitors (1920x1200, and such), graphics cards don't yet have the ability to push that kind of resolution at good framerates (~60fps) on modern games. 20-ish FPS on Crysis at 1920x1200 is barely adequate. This tug-of-war that goes on between the software and hardware is going to continue nearly forever.

Me, I'll be waiting for the card that can do Crysis set to 1920x1200, all the goodies on, and 50-60fps. Until then, my 7900GT SLI setup is going to have to be enough.
But then you'd just be complaining that resolution Xres+1 x Yres+1 can't be pushed at N+1 FPS. Honestly, you only need 24 to 32 FPS, as that is pretty much where your eyes are at (unless you have managed to time travel and get ultra-cool ocular implants that can decode things faster). It's the never-ending b(#%*-fest of gamers - it's never fast enough - doesn't matter that you're using all the resources of the NCC-1701-J Enterprise to play your game.

Re:Seriously? Yawn. (1)

RzUpAnmsCwrds (262647) | more than 6 years ago | (#22209796)

Honestly, you only need 24 to 32 FPS as that is pretty much where your eyes are at


Honestly, you don't play FPS games if you say that.

Film has such a crappy frame rate (24fps) that most movies avoid fast camera pans.
TV runs at 60 fields (480i60, 1080i60) or 60 frames (480p60, 720p60) per second, not 30 frames per second.

30fps is acceptable for a game like WoW where you have hardware cursor and you aren't using a cursor-controlled viewpoint. It's not as smooth, but it's playable.

30fps isn't acceptable for a FPS, RTS without hardware cursor, or really any game where your mouse rate depends on your framerate. Precise mouse movement (which is essential in many games) depends on having a consistently high number of updates per second.

Not at all (4, Insightful)

Sycraft-fu (314770) | more than 6 years ago | (#22209908)

There are many things you're wrong about there. The first is framerate. If you can't tell the difference between 24 and 60 FPS, well, you probably have something wrong. It is pretty obvious on computer graphics due to the lack of motion blur present in film, and even on a film/video source you can see it. 24 FPS is not the maximum number of frames a person can perceive; rather, it is just an acceptable rate when used with film.

So one goal in graphics is to be able to push a consistently high frame rate, probably somewhere in the 75fps range, as that is the point where people stop being able to perceive flicker. However, while the final output frequency will be fixed to something like that due to how display devices work, it would be useful to have a card that could render much faster. What you'd do is have the card render multiple sub-frames and combine them in an accumulation buffer before outputting them to screen. That would give nice, accurate motion blur and thus improve the fluidity of the image. So in reality we might want a card that can consistently render a few hundred frames per second, even though it doesn't display that many.

There's also latency to consider. If you are rendering at 24fps that means you have a little over 40 milliseconds between frames. So if you see something happen on the screen and react, the computer won't get around to displaying the results of your reaction for 40 msec. Maybe that doesn't sound like a long time, but that has gone past the threshold where delays are perceptible. You notice when something is delayed that long.

In terms of resolution, it is a similar thing. 1920x1200 is nice and all, and is about as high as monitors go these days, but let's not pretend it is all that high rez. For a 24" monitor (which is what you generally get it on) that works out to about 100PPI. Well print media is generally 300DPI or more, so we are still a long way off there. I don't know how high rez monitors need to be numbers wise, but they need to be a lot higher to reach the point of a person not being able to perceive the individual pixels which is the useful limit.

Also, pixel oversampling is useful just like frame oversampling. You render multiple subpixels and combine them into a single final display pixel. It is called anti-aliasing and it is very desirable. Unfortunately, it does take more power to do, since you have to do more rendering work, even when you use tricks to do it (and it really looks the best when done as straight super-sampling, no tricks).

So it isn't just gamers playing the ePenis game; there are real reasons to want a whole lot more graphics power. Until we have displays that are so high rez you can't see individual pixels, and we have cards that can produce high frame rates at full resolution with motion blur and FSAA, well then we haven't gotten to where we need to be. Until you can't tell it apart from reality, there's still room for improvement.
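
Two of the figures above fall straight out of simple arithmetic; a quick sketch (the 24" diagonal is the assumption, matching the monitor size mentioned):

    # Frame-to-frame delay at a given frame rate, and the pixel density of a
    # 24" 1920x1200 panel (diagonal measured in pixels / diagonal in inches).
    import math

    for fps in (24, 60, 75):
        print(fps, "fps ->", round(1000 / fps, 1), "ms between frames")
    # 24 fps -> 41.7 ms, 60 fps -> 16.7 ms, 75 fps -> 13.3 ms

    diag_px = math.hypot(1920, 1200)   # screen diagonal in pixels
    ppi = diag_px / 24                 # pixels per inch along the diagonal
    print(round(ppi), "PPI")           # ~94 PPI, roughly the 100 PPI quoted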

Re:Seriously? Yawn. (2, Insightful)

Man in Spandex (775950) | more than 6 years ago | (#22210298)

Honestly, I doubt you play FPS games, because the difference between the 24-32fps range and the 50-60s is way noticeable. Forget the theoretical technicalities of human eye capabilities for one second; I'm sure that when the FPS of a game reaches the 30s, there are other factors that make it sluggish, and all of that together gives us the perception that the difference between the 30s and the 60s is an important one.

Re:Seriously? Yawn. (1, Informative)

Anonymous Coward | more than 6 years ago | (#22209674)

Judging from the comments here it seems that the market for this card is for Crysis players who want to play at max settings. That is a pretty narrow market.

Re:Seriously? Yawn. (0)

Anonymous Coward | more than 6 years ago | (#22208200)

These cards are morphing into the modern version of supercomputer array processors. SSE is nice and all, but for large vector and matrix operations, you can't beat 128 floating point units connected to a 384 bit memory bus. (8800 GTX there, not sure what ATI has)
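
To put that bus width in perspective, here is a rough peak-bandwidth comparison (a sketch; the 900 MHz GDDR3 clock and the dual-channel DDR2-800 figure are typical values for the era, assumed rather than taken from the article):

    # Rough peak memory bandwidth: 8800 GTX-class GPU vs. a desktop CPU.
    gpu_bus_bits = 384
    gpu_mem_clock_hz = 900e6                             # GDDR3, double data rate
    gpu_bw = gpu_bus_bits / 8 * gpu_mem_clock_hz * 2     # ~86.4 GB/s

    cpu_bus_bits = 128                                   # dual channel, 64 bits each
    cpu_transfers_per_s = 800e6                          # DDR2-800
    cpu_bw = cpu_bus_bits / 8 * cpu_transfers_per_s      # ~12.8 GB/s
    print(round(gpu_bw / 1e9, 1), round(cpu_bw / 1e9, 1))  # 86.4 12.8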

Re:Seriously? Yawn. (0)

Anonymous Coward | more than 6 years ago | (#22208248)

Have you paid attention to the new games? Most of the graphics cards are crying unless you turn settings down.

It can never be too fast. If you believe the nonsense of not being able to notice fps differences above 30 fps, then you're right.

Go try a game at 30 fps, then try it at 130.

Re:Seriously? Yawn. (1)

berwiki (989827) | more than 6 years ago | (#22208600)

Am I the only one underwhelmed by almost every new graphics card announcement these days?
Absolutely not, and the reason these announcements are so 'boring' is the fact the cards are never That Much better than the previous generation.

I expected to see Double the scores and Double the frame-rates from a Dual GPU card! But alas, steady incremental improvements that don't warrant the extreme cost of the device.

Maybe now that I've made that realization, I won't overhype myself from now on.

Re:Seriously? Yawn. (1)

jez9999 (618189) | more than 6 years ago | (#22208770)

Graphic cards have long since been really fast for 99.9999% of cases.

I think they're releasing a new Elder Scrolls soon.

Re:Seriously? Yawn. (0)

Anonymous Coward | more than 6 years ago | (#22208794)

I thought that within the last year or so we've seen some pretty interesting developments in graphics. Shader Model 3.0, High Dynamic Range Imaging, DirectX 10 with OpenGL coming out 'soon.' Plus if you look at the benchmarks for Crysis in the article you'll see that the best any card can do in the market is 40fps. This is still pretty low for a first person shooter, so increased speeds on gpu's are always welcome.

And let's not forget the advent of GPGPU; the number of scientists and engineers buying these top-end graphics cards is likely to rise. Just have a look at the CUDA documentation if you don't believe this AC. You get all the math and Fourier functions a budding imaging analyst would need.

Re:Seriously? Yawn. (1)

MacarooMac (1222684) | more than 6 years ago | (#22208988)

I hear what you're saying, esconsult1, in that the top-of-the-range GPUs are capable of hoovering up the most demanding apps and games at ridiculous resolutions and so product announcements such as this are neither groundbreaking nor exciting.

In terms of the progression of GPU technology as a whole, however, I for one shall be acquiring a new 'multimedia' laptop in about six months and I need a fairly high spec graphics card that will, for example, support game play of the latest titles but (1) will not drain my bat within 30 seconds and (2) won't blow my budget. I'll probably get a 17" 'desktop replacement' style unit since it won't actually be 'mobile' that often. But the majority of mobile users will be after more convenient notebooks and slick looking ultra-portables. It's the emergence of high-spec, low-cost and low-power consumption GPU's capable of running demanding apps, but which also fit into increasingly confined mobile units, that is beginning to allow CAD designers and other graphics power users to use a laptop as a complete substitute for their desktop *beast*, w/out suffering any significant compromise in GPU performance.

The GPU power consumption and power management issues mentioned above may not be very important for the desktop and permanently plugged in user, but for frequently mobile users this is a pretty big issue. What remains is for the designers to improve the internal cooling for CPUs and GPUs, as chip power increases. Finally, with the advent of SSD drive technology (when they begin offering a half decent capacity SSD for under $300), with the reductions in HD power consumption and noise and increase in performance that SSDs promise, *power* mobile computing with 4-5 hour bat life will truly have arrived and I for one intend to be out and about considerably more than I'm able to at present!

price/performance (1)

pablo_max (626328) | more than 6 years ago | (#22208030)

I am really not that impressed. It's not much faster than the 8800 GT which is MUCH MUCH MUCH less expensive. I am sure you can pick up two 8800 GT's for the price of this card. Of course then you have to deal with the noise, but overall it looks to me that the price/performance ratio of this card is not that great.

Still a good product (1)

Mishra100 (841814) | more than 6 years ago | (#22208072)

No matter how well they designed the card, at the end of the day price/performance is what you are looking for in a graphics card. This card delivers performance that hovers around what the 8800 Ultra gives, at a much lower cost, and produces about the same noise and power numbers.

ATI announced that they won't sell cards for over 500 dollars, and I think that gives them a good standing in the marketplace. If you are willing to spend 450 dollars http://www.newegg.com/Product/Product.aspx?Item=N82E16814103052 [newegg.com] and want to upgrade in the future, then this is probably going to be the best buy. I think that speaks well for ATI, which hasn't even been near the market for a while.

That being said, I think if you are going to buy a video card and can wait for Nvidia's product (which is supposed to be in Feb) then I would definitely do that to see what competition they will bring.

Driver dependent performance (2, Insightful)

Overzeetop (214511) | more than 6 years ago | (#22208154)

Ultimately though, the real long-term value of the Radeon HD 3870 X2 will be determined by AMD's driver team.
That doesn't really bode well, given the clusterfuck that the CCC drivers tend to be.

Re:Driver dependent performance (4, Informative)

chromozone (847904) | more than 6 years ago | (#22208262)

ATI/AMD's drivers can make you cry. But their Crossfire already scales much better than Nvidia's SLI which is a comparative disaster. Most games use Nvidia's cards/drivers for development so Nvidia cards hit the ground running more often. As manky as ATI drivers can be, when they say they will be getting better they tell the truth. ATI drivers tend to show substantial improvements after a cards release.

Re:Driver dependent performance (0)

Anonymous Coward | more than 6 years ago | (#22208540)

Yes, but with ATI isn't every direction up with regards to their driver quality?

I'm still waiting for a working driver for my 2600 (1)

Joce640k (829181) | more than 6 years ago | (#22210284)

Six-month-old card, no working driver... and by that I mean a driver which doesn't say "Card not supported" when you try to install it.

This month they released an unsupported "hotfix driver" which installs but puts garbage on screen when you try anything - even with obvious things like 3DMark.

Does it come with... (3, Interesting)

Walles (99143) | more than 6 years ago | (#22208170)

... specs or open source drivers?

I haven't heard anything about any specs for 3d operations being released from AMD. I know they were talking about it, but what happened then? Did they release anything while I wasn't looking?

Re:Does it come with... (2, Insightful)

Andy Dodd (701) | more than 6 years ago | (#22208560)

They probably are pulling a Matrox. Release partial specs, promise to release more, rake in $$$$$$$$$$ from gullible members of the Open Source community, fail to deliver on promises. Great short-term strategy but only works once before said community stops trusting you, especially those who were dumb enough to go for your promises like I was back in 1999.

Ever since I made the mistake of buying a Matrox G200 (Partial specs - more complete than what ATI has released so far as I understand it, and a promise for full specs which were never delivered) I never make buying decisions based on promised specification/driver releases - only what works NOW, whether binary or not. (Hence I've been happily buying NVidia for 6-7 years now.)

Re:Does it come with... (1)

EricR86 (1144023) | more than 6 years ago | (#22208646)

Are you talking about the radeonhd [radeonhd.org] driver?

Currently there's only 2d support, but a handful of developers from Novell seem to be consistently working on it.

As for specs, they just released another batch back in early January [phoronix.com].

Buy Intel (1)

Kludge (13653) | more than 6 years ago | (#22209338)

I keep repeating this: Buy vendors that do offer open source drivers.

Typical Reply: Boo hoo, Intel is too slow, boo hoo.

My reply: Intel's graphic cards won't get faster if no one buys them. Other companies won't open source their drivers if you keep buying them with closed source drivers. Other companies will only open their drivers if they see it works for Intel.

Re:Does it come with... (2, Interesting)

Kjella (173770) | more than 6 years ago | (#22209486)

I haven't heard anything about any specs for 3d operations being released from AMD. I know they were talking about it, but what happened then? Did they release anything while I wasn't looking?
They released another 900 pages of 2D docs around Christmas; 2D/3D acceleration is still coming "soon", but given their current pace it'll take a while to get full 3D acceleration. So far my experience is that the nVidia closed source drivers have been rock stable; I have some funny issues getting the second screen of my dual screen setup working, but it has never crashed on me.

Drivers are something for the here and now, they don't have any sort of long term implications like say what document format you use. The day ATI delivers, I can buy a new card and switch instantly. They can promise to release specs all they want, I'll promise to buy one when they do. ATI may find that promises are cheap both in the giving and the getting. I'm still afraid that all the good stuff will be stuck in the legal department forever.

Price point (0)

eebra82 (907996) | more than 6 years ago | (#22208226)

The summary failed to mention the most important factor: the new AMD card is actually much cheaper than the 8800 Ultra and at the same time a lot faster in many tests. In addition, it seems that the X2 equivalent of nVidia is delayed by one month or more, so AMD does have the lead for at least another month.

Re:Price point (1)

beavis88 (25983) | more than 6 years ago | (#22208296)

AMD would have the lead for another month if they would ship actual product. But they haven't yet, in usual ATI form, and I wouldn't recommend holding your breath...I would not be at all surprised to see Nvidia's competitor, while delayed, in the hands of actual consumers around the same time as the 3870X2.

Re:Price point (1)

beavis88 (25983) | more than 6 years ago | (#22208324)

Well shit, serves me right for not checking again - looks like Newegg has some of these in stock and available for purchase, right now. Go ATI!

And the winner is... (1)

steveaustin1971 (1094329) | more than 6 years ago | (#22208328)

My 8800 Ultra still beats this thing hands down, and I could put another one in SLI for way less than buying one of these ATI cards... And for all those that say why do you need this? You don't NEED it any more than you need a 4X4 SUV when you only drive in the city... or $200 basketball sneakers to walk down the street. I buy it because I want it.

Re:And the winner is... (1)

i.of.the.storm (907783) | more than 6 years ago | (#22210554)

Really? The benchmarks I've seen put it at a fair bit faster than the 8800 Ultra, which makes sense considering it's got two GPUs. And uh, the 8800 Ultra costs $700, this costs $450, so I don't know what crazy inside deals you've got but there is no way you could get another of those for way less than one of these.

Anyone remember . . . (2, Interesting)

TheScorpion420 (760125) | more than 6 years ago | (#22208410)

Anyone remember the ATI Rage Fury MAXX [amd.com]? I've still got one in use. It was a monster in its day: dual Rage 128 Pro GPUs and 64MB of RAM. But for some reason the way they jury-rigged them onto one board didn't work properly in XP, so it only uses one. Oh well, still a nifty conversation piece.

Re:Anyone remember . . . (2, Informative)

TheRaven64 (641858) | more than 6 years ago | (#22208662)

The Rage 128 Pro was never close to the top of the line for a graphics accelerator (and doesn't really qualify as a GPU since it doesn't do transform or lighting calculations in hardware). It was around 50% faster than the Rage 128, which was about as fast as a VooDoo 2 (although it also did 2D). You had been able to buy Obsidian cards with two VooDoo 2 chips for years before the Maxx was released, and the later VooDoo chips were all designed around putting several on a single board.

it's missing stuff (1)

Joe The Dragon (967727) | more than 6 years ago | (#22208414)

Why only PCIe 1.1? A 2.0 switch would better split the bus to the 2 GPUs.

Also, there should be 2 CrossFire connectors, as each GPU has 2 links and 2 of the 4 are used to link the GPUs to each other.

Re:it's missing stuff (1)

Hanners1979 (959741) | more than 6 years ago | (#22208746)

Why only PCIe 1.1? A 2.0 switch would better split the bus to the 2 GPUs.

Because there simply aren't any 3-way PCI Express 2.0 switches available on the market yet; waiting for that would have delayed the product substantially for very little in the way of real-world gains.

No 8800 Ultra? (0)

Anonymous Coward | more than 6 years ago | (#22208548)

They only compared the card to the 8800 GT and GTX from NVIDIA's selection of cards.

What I really want to know is how it stacks up against NVIDIA's top card, the 8800 Ultra. I read the article and I don't think I spotted a single mention of the Ultra being tested alongside all the other cards. So, did they run out of budget or something?

If you can't beat 'em... (1)

JCSoRocks (1142053) | more than 6 years ago | (#22208624)

Just take two of your cards that are getting beaten by NVIDIA and then combine them in the hopes that they'll beat NVIDIA! aaaaand go!

Oblig (0)

Anonymous Coward | more than 6 years ago | (#22208656)

666 million transistors
The card is evil! Radeon is a servant of the devil and creates wares in his name!

Next up... (2, Funny)

jez9999 (618189) | more than 6 years ago | (#22208742)

Work is in the pipeline for a board which can house all your computer's necessary components, including a multiple core CPU that can handle graphics AND processing all-in-one! It will be the mother of all boards.

Offtopic, but... (1)

gaelfx (1111115) | more than 6 years ago | (#22208802)

whatever happened to the physics card that some company released a while back? It seemed like a pretty good idea, and I wonder if it could be modified to fit onto a graphics card as well. I just think that would be a nice coupling because I like the small towers rather than the huge behemoth that I have in my Mom's basement (no, I don't live at home any more, wanna take my geek card back?). It's nice that they are putting an extra chip into their cards, I can definitely imagineer that as being pretty helpful, especially with buffery stuff, but it seems that the physics processing unit (ppu) would be an even more handy addition since I can't think of many things you would do that require a powerful gpu that couldn't also make use of some nice ppu functions. Maybe if it were designed really well, the ppu could even workhorse as a secondary gpu for those applications that I can't think of. Although, I certainly have no concrete evidence for this, nor reason to believe that what I'm saying makes any sense to anyone who does know about these things, I still think it seems like a more "revolutionary" step than this.

Re:Offtopic, but... (0)

Anonymous Coward | more than 6 years ago | (#22209638)

Just like when 3D cards were introduced, it'll be awhile before these catch on. Making a game that requires a PPU isn't going to happen for some time, so games that use it will have to use it in a way that is optional. Which is even harder than with a GPU, because while that only affects the visuals, a PPU is designed to affect things that could have a serious effect on gameplay.

I'm pretty sure UT3 supports a PhysX PPU, even though there hasn't been a new card released in some time.

All the glorious color the author imagined.. (0)

Anonymous Coward | more than 6 years ago | (#22210336)

So when my level 58 Troll gets gang-ganked outside the gates of the Dark Portal by a pack of teamspeak-using 12 year olds, I'll be able to see him melt in the full techno-color the author envisioned. Sweet.