'SLI On A Stick' Reviewed
Bender writes "What would happen if you took NVIDIA's multi-GPU teaming capability, SLI, and stuck it onto a single graphics card? Probably something like the GeForce 7950 GX2, a 'single' video card with dual printed circuit boards, dual graphics processors, dual 512MB memory banks, and nearly twice the performance of any other 'single' video card. Add two of these to a system, and you've got the truly extreme possibility of Quad SLI. We've seen early versions of these things benchmarked before, but the latest revision of this card is smaller, draws less power than a single-GPU Radeon X1900 XTX, and is now selling to the public."
Wow (Score:5, Insightful)
Re:Wow (Score:5, Funny)
Re:Wow (Score:2)
Re:Wow (Score:3, Informative)
While they may be overkill for your average user, for (game) developers these things will be goldmines.
-K
Re:Wow (Score:4, Funny)
Re:Wow (Score:1, Funny)
Re:Wow (Score:1)
Re:Wow (Score:3, Informative)
Re:Wow (Score:5, Insightful)
Tomorrow, who knows? I remember a time when a TNT2 Ultra was considered overkill; now you can get more powerful GPUs in mobile phones.
Re:Wow (Score:2)
Sure, three years down the road there'll be games that look noticeably better with such a setup, but here's the thing: three years down the road you can have this graphics performance for 1/8th the price and power consumption.
It's fine though; those "early adopters" (aka idiots) pay a large fraction of the development cost for the rest of us.
Re:Wow (Score:2)
Re:Wow (Score:1, Redundant)
Re:Wow (Score:2)
http://hardware.slashdot.org/article.pl?sid=06/04
Re:Wow (Score:2, Informative)
Re:Wow (Score:4, Insightful)
As an aside: it doesn't matter how long you've been playing a certain fps, your eyes have not mutated to give you the ability to discern a difference between 400 and 405 frames per second.
Re:Wow (Score:2)
Re:Wow (Score:2)
Which would only matter if FPS were tied to refresh rate, which it doesn't need to be; just turn Vsync off.
Re:Wow (Score:2)
Re:Wow (Score:2)
So you're saying your monitor's refresh rate is the limit of your FPS, but if you turn off vsync you will get more FPS. Yeah, that was what I was saying.
I don't care about tearing, I care about 100+ FPS; I can see flickering and stutter at 75 FPS or
Re:Wow (Score:2)
Now, IIRC Doom 3 is locked at 60fps, even if your monitor has
Re:Wow (Score:2)
So, if you're making a flight simulator to study the way bees fly, you might need 1000 FPS.
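The frame-time arithmetic behind this back-and-forth can be sketched in a few lines. This is just an illustration, assuming frame time is simply 1000 ms divided by FPS (no vsync or frame pacing); `frame_time_ms` is a made-up name for the example.

```python
# Frame-time arithmetic for the FPS debate: how much time does each
# extra frame per second actually buy you?

def frame_time_ms(fps: float) -> float:
    """Milliseconds spent on one frame at a given frame rate."""
    return 1000.0 / fps

# Going from 400 to 405 FPS shaves only a sliver off each frame:
saving = frame_time_ms(400) - frame_time_ms(405)
print(f"400 FPS -> {frame_time_ms(400):.3f} ms/frame")
print(f"405 FPS -> {frame_time_ms(405):.3f} ms/frame")
print(f"difference: {saving * 1000:.1f} microseconds per frame")

# For comparison, a 75 Hz monitor can only show a new image every:
print(f"75 Hz refresh -> {frame_time_ms(75):.2f} ms between refreshes")
```

The 400-to-405 difference works out to roughly 31 microseconds per frame, orders of magnitude below the 13-odd milliseconds between refreshes on a 75 Hz display.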
Re:Wow (Score:4, Informative)
Oblivion at 1920x1200? Good thing I don't have an Apple Cinema Display. Personally I think Oblivion's game engine is a bit overrated. OK, it's pretty, but not *that* much prettier than the other freeform 3D games that don't kill my GFX card. Right now I'm working on a HOMM5 addiction instead...
Re:Wow (Score:2)
What would the temperature be like? (Score:3, Interesting)
Less than one Radeon X1900XT (Score:1, Informative)
A lot... (Score:2)
What about... (Score:2, Insightful)
Re:What about... (Score:1)
OMG (Score:4, Funny)
you know, this place called "Work" (Score:2)
Where they have to put up with crappy "recently purchased corporate Dell, we see no need to replace it for two years" machines that are slugs compared to what a geek assembled in his garage four years earlier out of spares.
On the other side, this thing called "work" comes with a nice thing called a "pay-check" that enables you to buy even more ultimate-leet gear (and also buys girlfriend bait such as "dining in a nice restaurant").
Weight (Score:5, Interesting)
Or maybe, just maybe, old-school lay-down cases will come back in style.
Re:Weight (Score:3, Funny)
Bleugh (Score:2, Interesting)
Who truly honestly needs this much horsepower for personal use? Seems li
Re:Bleugh (Score:5, Insightful)
Re:Bleugh (Score:1)
Re:Bleugh (Score:3, Insightful)
What's wrong with staying way behind the curve? It's the same tech, the same games, the same everything over time, except that those who think there is some important value to being at the leading edge of the curve finance your gaming for you.
Your problem isn't tech, or money . .
Remember, the best ride is on the face of the wave.
KFG
Re:Bleugh (Score:5, Funny)
I'm sorry, you'll have to come up with a car analogy.
Re:Bleugh (Score:2, Insightful)
The best value in a car is a two-year-old used one, third year of the model, but avoid the models favored by teenage street racers. They're innately overpriced for what you get, and no matter how shiny the paint, the internals have had the shit beaten out of them.
KFG
Re:Bleugh (Score:3, Funny)
I'm sorry, you'll have to come up with a car analogy.
The best ride is on the roofrack of the car?
Re:Bleugh (Score:1)
I bought a shiny 6800GT over a year ago for $400. I'll never spend close to that for a video card again. If I can play the same games on a next-gen console, I'll pass on any PC upgrades in the future. It's a shame... the PC industry is only hurting itself for a long-term user base. Hell, you can't even get a decent baseball sim on the PC anymore... it's all going to pot.
Re:Bleugh (Score:2)
As for a baseball sim... you've gotta be kidding. I mean, I can understand going out to a game, the atmosphere, the popcorn and hot dogs, the c
Re:Bleugh (Score:1)
Re:Bleugh (Score:2)
As great as consoles are, they are still specialized machines, which limits their adoption. My PC can do everything consoles can do and much more that consoles cannot. And as long as PCs have that advantage and a widespread adoption rate, there will continue to be a market for PC-based video games.
-Rick
Re:Bleugh (Score:1)
This is actually pretty cool... I'm starting to feel like the computer industry is warming up to the prospect of modular parallelization "at home".
We are reaching a point where quantum tunneling could become a real problem and frankly, I was hoping this would happen sooner... The industry always focused on things getting smaller, but we're running into a barrier in that direction.
Now we're starting to see the opposite: instead of buying a brand new system a
Re:Bleugh (Score:2)
That's often how progress happens. Products are developed where the demand that already exists is a very limited niche; then, once the technology exists, more uses for it are developed, and demand increases.
But then, I don't think that's really the case here; seems to me that polygon-pushing horsepower on GPUs is something that developers have plenty of uses for as much as anyone can make available, and that plenty of
Re:Bleugh (Score:2)
Exclusivity- what's the deal? (Score:5, Insightful)
Before you get all excited about the prospect of dedicating 96 pixel shaders, 2GB of memory, and 1.1 billion transistors to pumping out boatloads of eye candy inside your PC, however, there's some bad news. NVIDIA says Quad SLI will, at least for now, remain the exclusive realm of PC system builders like Alienware, Dell, and Falcon Northwest because of the "complexity" involved.
So they are going to alienate the majority of the market that would spend the money on a Quad SLI setup, keeping it exclusive to system builders for whatever period of time.
Seems like a bad business decision to me, at least until (and if) Nvidia comes to its senses.
Re:Exclusivity- what's the deal? (Score:2, Funny)
FINALLY (Score:5, Funny)
If I upgrade, I might be able to go from 200 frames per second in Doom III to.... 205 frames per second!
I can't wait to get rid of my old setup! It was a piece of shit!
Re:FINALLY (Score:2)
HDCP (Score:1)
Re:HDCP (Score:1)
Dual madness (Score:2, Funny)
Re:Dual madness (Score:2, Funny)
Great! (Score:1)
Re:Great! (Score:2)
Attack of the GFX E-penis argument? (Score:5, Insightful)
Graphics card innovations for the past several months/year with SLI seem to me mostly "I have a dual SLI system!", "Yeah? Well I have a QUAD SLI system!" - so much performance that goes unused it's pointless. Furthermore, for the price of one of these brand-new cards in the article I can build a decent gaming computer or an HDTV MythTV box.
I would rather spend $600 on much more useful things that would see use right now. On Pricewatch the video cards at $100 are: Radeon X1300 256MB AGP, Radeon X1600 Pro 256MB PCI Express, Radeon X800 PCI Express 256MB, GeForce 6600 GT PCI-E 256MB.
Re:Attack of the GFX E-penis argument? (Score:2)
So spend your $600 on more useful things with the rest of us, and let the fanatics keep driving the very high end video card market so that we can all benefit from it when it's in the $100 bin in what, 2 or 3 years.
Re:Attack of the GFX E-penis argument? (Score:2)
Re:Attack of the GFX E-penis argument? (Score:2)
Oblivion.
Three more words:
Unreal Tournament 2007.
I have an athlon 64 3800+ with SLI 7800GT's, XFi etc etc and oblivion still grinds to a halt if I push the settings up much beyond their medium levels. Even FEAR only just runs at a decent rate at full whack on my rig. I don't even want to think about the horsepower UT2007 will need.
You want a game that looks like crap and runs like crap, fine. Buy an X1300 or 6600GT. Those of us who want a better looking, faster responding high-end game can use all
Re:Attack of the GFX E-penis argument? (Score:2)
And just doing a quick back-of-the-envelope, that rig probably cost you well over $3000. That's a heck of a lot of money to spend and still have Oblivion 'grind to a halt' at max settings. Today's gaming market has just gotten ridiculous. Whatever happened to the days when you could get good performance from the latest games on only $1000-$2000 worth of hardware?
Re:Attack of the GFX E-penis argument? (Score:2)
Re:Attack of the GFX E-penis argument? (Score:2)
Re:Attack of the GFX E-penis argument? (Score:2)
XSI on a pair of Quadros is worth the cost to me.
Re:Attack of the GFX E-penis argument? (Score:2)
They sell their cards used; last generation's high-end card performs better than this generation's mid-range...
If they sell them consistently they are paying approximately the same amount, because you can't sell last generation's mid-range card.
Sad but true... Of course I like to have dozens of systems doing nothing, so I need the old hardware, but if you don't run servers it doesn't matter.
I predicted dual video cards was a fad (Score:5, Insightful)
The cost to implement and manufacture multiple video cards is ridiculous. Who honestly would spend $1400 just to have two video cards, and then only get at most a 20% performance improvement?
With the current trend of multiple cores, I figured it would be just a matter of time for the SLI and Crossfire solutions to switch back to a single video card. Either they would dual-core the GPU, or simply put two GPUs on the same card.
It just makes sense to keep a video card as a single card. You don't have to duplicate the production costs and all the other components that are wasted in a dual-card configuration, and you don't have to duplicate the bus technology on the motherboard in order to implement dual video cards. Overall, this will be a much cheaper configuration that will actually bring high-performance video technology into the realm of being practical.
Eventually, 4-way GPU cards will be released, and eventually nVidia and/or ATI will start to dual-core their GPUs; those spending money on their expensive dual or even quad SLI configurations will have wasted a bunch of money.
Re:I predicted dual video cards was a fad (Score:2)
Actually, the G71 processor used in that beast has 32 pixel pipelines already, which in their context are similar to cores on a CPU. (Sure, they form a SIMD architecture unlike CPU cores, but so do SLIed GPUs, sort of, as I have understood it.) When CPUs get more cores, G
Re:I predicted dual video cards was a fad (Score:2)
Re:I predicted dual video cards was a fad (Score:2)
Re:I predicted dual video cards was a fad (Score:2)
Re:I predicted dual video cards was a fad (Score:2)
Re:I predicted dual video cards was a fad (Score:2)
5950 to 7675 (3DMark scores) is over 28%. There were better and worse scores than that, but since that was the overall 3DMark score, I figured it would be good to go with.
Yes, there are individual tests that are lower than 20%, but to say 'at most 20%' when there are no games designed to USE that kind of hardware and the current benchmarks ALREADY show higher results... that's just wrong.
If you'd said 'better than 30%' I'd still have checked my
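The percentage quoted above is easy to check; as a one-liner sketch:

```python
# Sanity-checking the 3DMark improvement quoted above:
# 5950 -> 7675 overall score, expressed as a fraction of the old score.

old_score, new_score = 5950, 7675
improvement = (new_score - old_score) / old_score * 100
print(f"{improvement:.1f}% improvement")  # roughly 29%, so "over 28%" holds
```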
The problem with dual core... (Score:2)
Instruction parallelization was never a problem there, so the cores are inherently as parallel as the die size allows. If you could squeeze twice as many transistors onto a chip, your GPU would have 64 instead of 32 pixel pipelines, for example.
Plus, dual core does nothing for the bandwidth problem... (and no, going to 1024-bit memory or something isn't an option
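The bandwidth point above can be illustrated with a toy model: adding pixel pipelines helps only until memory bandwidth becomes the bottleneck. All numbers below are made up for illustration, not real G71 figures, and `fill_rate` is just a name invented here.

```python
# Toy model: effective fill rate is the lower of compute capacity
# (pipelines x per-pipeline rate) and the memory-bandwidth ceiling.

def fill_rate(pipelines: int, mpixels_per_pipe: float, bandwidth_limit: float) -> float:
    """Effective fill rate in Mpixel/s: compute- or bandwidth-limited."""
    return min(pipelines * mpixels_per_pipe, bandwidth_limit)

# Hypothetical card: each pipeline shades 500 Mpixel/s, but the memory
# bus can only feed 20,000 Mpixel/s of texture/framebuffer traffic.
for n in (16, 32, 64, 128):
    print(n, "pipes ->", fill_rate(n, 500, 20_000), "Mpixel/s")

# Going from 16 to 32 pipes doubles throughput (8,000 -> 16,000), but
# 64 -> 128 gains nothing: the bus, not the pipeline count, is the limit.
```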
Re:I predicted dual video cards was a fad (Score:2)
You are missing the point that GPUs are highly parallel processors. What you call "dual-coring the GPU" has been done for the past 5+ years in the graphics industry. They call it a new product.
Every new generation has had more pixel pipelines. What do you think those are? You can
Re:I predicted dual video cards was a fad (Score:2)
1. I have an SLI motherboard and GPU. I'm only running one card so I can upgrade it down the line (3+ years) when the card hits 50 bucks.
2. Who honestly would spend $1400 just to have two video cards, and then only get at most 20% performance improvement?
Actually, SLI can be had for as little as $300 (mobo not included). Also, you will see much more than a 20% performance boost. Check your numbers next time.
3. Yeah, i saw it merging onto one ca
Re:I predicted dual video cards was a fad (Score:2)
4x4 pawa (Score:2)
Someone shoot me.
In case you're like me (Score:5, Informative)
Re:In case you're like me (Score:2)
Re:In case you're like me (Score:1)
Re:In case you're like me (Score:2)
Re:In case you're like me (Score:2)
NVIDIA Corporation reintroduced the name SLI in 2004 (renamed as Scalable Link Interface) and intends for it to be used in modern computer systems based on the PCI Express bus.
Nice... (Score:2)
New game in mind? (Score:3, Funny)
Imagine... (Score:2, Funny)
to quote a friend (Score:2)
Comment removed (Score:3, Insightful)
And I just bought 'Deus Ex' the other day ... :-) (Score:3, Insightful)
That's always the more fun way to go, IMHO.
Re:And I just bought 'Deus Ex' the other day ... : (Score:2)
I was hoping SLI on a Stick meant USB or Firewire (Score:3, Insightful)
Sigh.
See, if I'd bought the "latest" computer, I'd already be out of date - by choosing to just buy a cheap $500 laptop, I'm just as out of date as I was a month ago.
But
Re:I was hoping SLI on a Stick meant USB or Firewi (Score:2)
Hurry Up and Wait (Score:2)
Given the qualifications and interests of that joint community, I'd expect to see a "PCI network" that parallelizes MP3 encoding on much cheaper MFLOPS GPU HW by now.
Maybe actually playing the games is eating up too much time
Let's move to a better format. (Score:3, Insightful)
Get the Gfx 'card' out of the computer. Add a GPU socket to the motherboard and expandable video-ram slots.
I could spend an hour on why I think this solution would be better but here are a few of my reasons:
1) As fast as PCI-E is, a direct motherboard interface would be faster
2) Directly upgradeable memory allows you to afford the better chips and expand the ram as you have the money instead of 'settling' for a lower card because the higher memory version doubles the price.
3) The ability to use the same memory and JUST upgrade your GPU since many revisions happen to cards while the memory stays the same.
4) You could use standard CPU cooling on the GPU to have a much more efficiently cooled GPU instead of adding more weight to a relatively flimsy PCI-E connector saving the occasional card/mb damage.
5) A forced standard: all chipmakers would have to produce chips under the same interface standard for new boards, and motherboard manufacturers as well as CPU manufacturers would have to be on the ball too. A GFX chip that you could buy one year would still plug into new boards 5 years later, as would the vid-ram, CPU, and system RAM. Also, once any of them are upgraded, the BIOS would need to auto-set to handle the faster speeds... so I want them to predict the speed of the GPU/CPU/RAM 10 years from now and at least try to make motherboards that can support the changing times for a realistic amount of time.
Sure, have boards with dual GPU's or more but it's time to get off the slot and move into a better format.
I know, the motherboards would cost more, because the expectation would be that you could use the same motherboard for 10 years with frequent upgrades to the CPU/GPU/RAM/gfx-RAM, but I'd pay more for a board I didn't have to keep freaking changing while still being able to keep my game on and upgrade only the pieces that need upgrading, as I can AFFORD them.
But that's just my 10 cents.
Pro and cons (Score:3, Informative)
Also recently announced on Slashdot: the development of a standard HyperTransport connector [slashdot.org] (as part of the HT 3.0 revision).
So maybe in a near future you'll see motherboards featuring HyperTransport connectors, in which you could directly plug CPU/DDR board, GPU/GDDR
HDCP - DRM Crippled (Score:2)
Can it run AA + HDR at same time? (Score:2)
Re:Completely off topic (Score:1, Informative)
The more you know.
Re:AMENDMENT (Score:2)
Re:Are they crazy? (Score:2)
Re:Congratulations! (Score:2)
Tell him what he's nearly won!