
'SLI On A Stick' Reviewed

Hemos posted more than 7 years ago | from the that's-one-hot-stick dept.

Bender writes "What would happen if you took NVIDIA's multi-GPU teaming capability, SLI, and stuck it onto a single graphics card? Probably something like the GeForce 7950 GX2, a 'single' video card with dual printed circuit boards, dual graphics processors, dual 512MB memory banks, and nearly twice the performance of any other 'single' video card. Add two of these to a system, and you've got the truly extreme possibility of Quad SLI. We've seen early versions of these things benchmarked before, but the latest revision of this card is smaller, draws less power than a single-GPU Radeon X1900 XTX, and is now selling to the public."

188 comments

Wow (4, Insightful)

McGiraf (196030) | more than 7 years ago | (#15472016)

Which game do you need to run to take advantage of the equivalent of 4 graphics cards?

Re:Wow (3, Informative)

Kutsal (514445) | more than 7 years ago | (#15472063)

Nothing yet, probably. But that doesn't mean there won't ever be any. Also, in most cases, these boards are used by people like John Carmack to come up with proofs of concept for new ideas and technologies, or whatever cool thing he's cooking up...

While they may be overkill for your average user, for (game) developers these things will be gold mines.

-K

Re:Wow (0)

Anonymous Coward | more than 7 years ago | (#15472066)

If you want hyperrealistic Bouncing Boobies in Dead or Alive: Beach Volleyball, this is the stick for you.

Personally, I'm still waiting for somebody to code matrix-like graphics for compiling Linux kernels before I plunk any more money on high end whiz-bang grafix HW.

Re:Wow (4, Funny)

ceeam (39911) | more than 7 years ago | (#15472112)

Windows Vista Aero Glass.

Re:Wow (1, Funny)

Anonymous Coward | more than 7 years ago | (#15472150)

Seriously, it's gonna be great to know that a game of solitaire can bring my computer to its knees...

Re:Wow (-1, Offtopic)

Anonymous Coward | more than 7 years ago | (#15472118)

Wow is right! I can't escape this css /. 4th dimension I've stumbled into. Looks good, but IE7b is getting jacked now, because even this form box I'm typing into right now is spilling over past the white "table" into some "black" bar nether-region over on the right. And don't give me some IE7 www non compliance crap as an excuse. IE7 is here. It's queer. And it's here to stay.

Also, please indent the nested replies at least twice the pixel width you have now. You can barely make out the nested structure and who replied to what parent. Overall, the new fonts are welcome and it looks great.

As far as this topic goes, Nvidia rulez! That is all. Please return to your regularly scheduled programming. END TRANSMISSION.

Re:Wow (0, Offtopic)

SmashPot (721474) | more than 7 years ago | (#15472431)

IE7 blows. As a staunch IE supporter prior to v7, I am totally giving up any support for the new browser and will recommend the same to my clients. MS's biggest mistake in the browser domination game was developing IE7. Download Firefox, my man.

Re:Wow (-1, Offtopic)

MooUK (905450) | more than 7 years ago | (#15472516)

That's an unusual view. Every other person I know either:
  - didn't like IE6 or earlier and still doesn't like IE7;
  - didn't like IE6 or earlier, but does like IE7;
  - loved IE6 and loves IE7 even more.

Out of interest, why do you feel that way?

(Me, I think the first is probably sane, the second is perhaps misguided, and the third is either warped and masochistic, or plain stupid...)

Re:Wow (3, Informative)

Anonymous Coward | more than 7 years ago | (#15472195)

Oblivion. At 1600x1200. Turn up the antialiasing to 4x and turn on all the effects.

Re:Wow (4, Insightful)

Tim C (15259) | more than 7 years ago | (#15472210)

Today, none (unless you want to run something like Doom 3 or Half Life 2 with all the options turned up to max and at an insane resolution).

Tomorrow, who knows? I remember a time when a TNT2 Ultra was considered overkill, now you can get more powerful GPUs in mobile phones.

Re:Wow (1)

Eivind (15695) | more than 7 years ago | (#15472814)

Yes. But the market will consist of idiots. Because, as you say, today there is literally nothing to use such a setup for.

Sure, 3 years down the road there'll be games that look noticeably better with such a setup, but here's the thing: 3 years down the road you'll be able to get this graphics performance for 1/8th the price and power consumption.

It's fine, though; those "early adopters" (aka idiots) pay a large fraction of the development cost for the rest of us.

Re:Wow (2, Informative)

greatguns_17 (955947) | more than 7 years ago | (#15472529)

I thought the same about my 7800 GTX till I tried playing Oblivion with all the settings up... and the FPS I get is mostly below 30. You get the hardware, and getting software that saturates that piece of hardware is not so hard...

Re:Wow (4, Insightful)

matt328 (916281) | more than 7 years ago | (#15472586)

The one where you win by claiming higher frame rate than your peers.

As an aside: it doesn't matter how long you've been playing a certain fps, your eyes have not mutated to give you the ability to discern a difference between 400 and 405 frames per second.
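For scale, a quick back-of-the-envelope check in plain Python, using the numbers from the claim above:

    # frame time at each claimed rate
    for fps in (400, 405):
        print(f"{fps} fps -> {1000 / fps:.3f} ms per frame")
    # 400 fps -> 2.500 ms, 405 fps -> 2.469 ms: a gap of roughly 31 microseconds,
    # far below anything your eyes (or your monitor) can resolve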

Re:Wow (1)

KingMotley (944240) | more than 7 years ago | (#15472757)

It'd help games like:
Sony's EverQuest 2
Tom Clancy's Ghost Recon Advanced Warfighter
Microsoft Rise of Nations 2: Rise of Legends
Oblivion with all the settings on high

Any of those games, trying to run with the eye candy on at my monitor's native resolution (1920x1200), will drop into sub-teen FPS on even high-end graphics cards. Now if they can just make it so that you don't need an SLI chipset, that'd rock. I'll never get quad SLI, but it sure would be nice to be able to drop one of these into a non-SLI machine. So far, I've heard that they only run on SLI-capable motherboards, which is a shame.

What would the temperature be like? (2, Interesting)

ketamine-bp (586203) | more than 7 years ago | (#15472028)

I wonder.

Less than one Radeon X1900XT (1, Informative)

Anonymous Coward | more than 7 years ago | (#15472197)

Less power consumed than the high-end Radeon, and take into consideration that the heat is going to be coming from two GPU cores instead of one. If you're already on an ATI setup, this will surely take your temps down a couple of degrees.

/nVidiot fanboy

What about... (2, Insightful)

exit3219 (946049) | more than 7 years ago | (#15472032)

4 cards with 2 dual-core, double-the-cache, twice-the-speed GPUs each? Is that what the future holds for us?

Re:What about... (1)

Xymor (943922) | more than 7 years ago | (#15472074)

I think in the future we'll see multi-core GPUs, with PhysX and media (maybe IBM's Cell) co-processors and lots and lots of memory for HD textures and HD content. Or maybe I just need some coffee.

Weight (4, Interesting)

Bios_Hakr (68586) | more than 7 years ago | (#15472057)

Don't snap off your PCI slot. Soon, we'll see modder cases with rails to support the front of the cards.

Or maybe, just maybe, old-school lay-down cases will come back in style.

Re:Weight (0)

Anonymous Coward | more than 7 years ago | (#15472180)

Lay-down cases are already back in style, at least amongst the Home Theatre PC crowd. They allow better fits into entertainment centers and shelves, for people who like to hide their components in closets.

Re:Weight (2, Funny)

dkf (304284) | more than 7 years ago | (#15472422)

Or maybe, just maybe, old-school lay-down cases will come back in style.

Bah! It's 19" racks for me! All I need now is a big reel-to-reel tape deck to use as a false front, and everyone will know I've got a proper computer!

Bleugh (2, Interesting)

Anonymous Coward | more than 7 years ago | (#15472062)

I'm not the only one who thinks 'great, just what we need', am I? I only just upgraded my graphics card from a 5900-series to a reasonably priced 7600-series, and since doing so, reviews of CrossFire and SLI keep popping up, and now quad setups are appearing. This time next year, can I expect my graphics card to not even be considered minimum spec for new PC games, while the same games run just fine on the Xbox 360 and PS3?

Who truly honestly needs this much horsepower for personal use? Seems like a case of making the product long before any real demand for it actually exists.

Re:Bleugh (1)

ClamIAm (926466) | more than 7 years ago | (#15472121)

This is the nature of the "hardcore" (or "enthusiast", or whatever they call it these days) PC game market. Unless you spend several hundred dollars every few years, you get way behind the curve. It's really unfortunate, as I'd love to play more PC games, but the total cost of upgrades (versus what you get out of it) is way too much.

Re:Bleugh (3, Insightful)

kfg (145172) | more than 7 years ago | (#15472309)

Unless you spend several hundred dollars every few years, you get way behind the curve.

What's wrong with staying way behind the curve? It's the same tech, the same games, the same everything over time, except that you get those who think there is some important value to being at the leading edge of the curve to finance your gaming for you.

Your problem isn't tech, or money... it's envy.

Remember, the best ride is on the face of the wave.

KFG

Re:Bleugh (5, Funny)

moonbender (547943) | more than 7 years ago | (#15472511)

Remember, the best ride is on the face of the wave.

I'm sorry, you'll have to come up with a car analogy.

Re:Bleugh (2, Insightful)

kfg (145172) | more than 7 years ago | (#15472637)

I'm sorry, you'll have to come up with a car analogy.

The best value in a car is a two-year-old used one, third year of the model, but avoid the models favored by teenage street racers. They're innately overpriced for what you get, and no matter how shiny the paint, the internals have had the shit beat out of them.

KFG

Re:Bleugh (1)

Grizpin (899482) | more than 7 years ago | (#15472142)

/agree

I bought a shiny 6800GT over a year ago for $400. I'll never spend close to that on a video card again. If I can play the same games on a next-gen console, I'll pass on any PC upgrades in the future. It's a shame... the PC industry is only hurting itself for a long-term user base. Hell, you can't even get a decent baseball sim on the PC anymore... it's all going to pot.

Re:Bleugh (1)

RingDev (879105) | more than 7 years ago | (#15472315)

I built my newest PC about a year and a half ago for under $800. It replaced my previous PC, which I had used for about 3-4 years. My year-and-a-half-old PC is still doing fine with most newer games; I've played HL2-based games with most options turned on with no problem. I've been playing a lot of NFS:MW lately, with the graphics cranked up, and it runs smooth as silk.

As for a baseball sim... you've gotta be kidding. I mean, I can understand going out to a game: the atmosphere, the popcorn and hot dogs, the crowd... But of all the boring games to turn into a video game... I put virtual baseball right up there with virtual fishing. What a complete waste of time, unless you are looking for a 'hip' way of having a talk about sex, drugs, or alcohol with your child. Even then, it doesn't have the no-escape mentality of being stuck on a boat miles from shore or in a stadium with no ride home.

-Rick

Re:Bleugh (1)

Grizpin (899482) | more than 7 years ago | (#15472379)

Well Rick, some people still enjoy a good simulation of baseball. The PC used to be the king of baseball sims, but since developers are making more money developing for consoles, they abandoned the PC (thus my reason for mentioning it). My point is that if they can make games like HL2 and Doom 3 on consoles, and they have come pretty damn close (go read the Xbox reviews @ GameSpot), then why put any more money into a PC?

Re:Bleugh (1)

RingDev (879105) | more than 7 years ago | (#15472612)

HL2/Doom 3 are better on newer consoles than their predecessors were on earlier consoles. They are still weak compared to their PC-based rivals. ;)

As great as consoles are, they are still specialized machines, which limits their adoption. My PC can do everything consoles can do, and much more that consoles cannot. And as long as PCs have that advantage and a widespread adoption rate, there will continue to be a market for PC-based video games.

-Rick

Mod Parent -1, 640K is enough memory (0)

Anonymous Coward | more than 7 years ago | (#15472227)

Who truly honestly needs this much horsepower for personal use?

This sounds too much like, "640K should be enough memory for anyone."

Re:Bleugh (1)

spankey51 (804888) | more than 7 years ago | (#15472239)

"640KB ought to be enough for anybody." -Bill Gates

This is actually pretty cool... I'm starting to feel like the computer industry is warming up to the prospect of modular parallelization "at home".
We are reaching a point where quantum tunneling could become a real problem, and frankly, I was hoping this would happen sooner... The industry always focused on things getting smaller, but we're running into a barrier in that direction.
Now we're starting to see the opposite: instead of buying a brand new system as an upgrade, all you need to do is add to the existing one. It makes your computer more like an investment and less like a fruit that will inexorably go bad or turn into poo... like a console.
So now we wait... I really want to see processors with sockets built into their tops so you can stack them as a modular upgrade. Likewise with GPUs, RAM, etc... Sorry, that's the little kid in me. We can dream.

Re:Bleugh (0)

Anonymous Coward | more than 7 years ago | (#15472346)

This time next year can I expect my graphics card to not even be considered minimum-spec to run new games on the PC, yet are going to be on the Xbox360 and PS3 running just fine?

Because my native LCD resolution is 1920x1200, and I'd prefer to play games (on the rare occasions that I do) in that resolution. The Xbox and PS3 are relatively low-res even with their new fancy "HiDef" modes.

But really, most games have "detail levels" you can turn down; you just have to live without the fancy water reflectivity effects, mega-scale textures rendered in high detail at 10,000 feet, etc. Let the Hilton kids spend the money on this ultra tech and don't worry about it; it just drives the price/performance of pedestrian tech higher.

Exclusivity- what's the deal? (4, Insightful)

Coopjust (872796) | more than 7 years ago | (#15472092)

The review states:

Before you get all excited about the prospect of dedicating 96 pixel shaders, 2GB of memory, and 1.1 billion transistors to pumping out boatloads of eye candy inside your PC, however, there's some bad news. NVIDIA says Quad SLI will, at least for now, remain the exclusive realm of PC system builders like Alienware, Dell, and Falcon Northwest because of the "complexity" involved.

So they are going to alienate the majority of the market that would spend the money on a Quad SLI setup to keep it exclusive to system builders for whatever period of time.

Seems like a bad business decision to me, at least until (and unless) Nvidia comes to its senses.

Re:Exclusivity- what's the deal? (2, Funny)

alpinerod (970358) | more than 7 years ago | (#15472321)

The review states:

Before you get all excited about the prospect of dedicating 96 pixel shaders, 2GB of memory, and 1.1 billion transistors to pumping out boatloads of eye candy inside your PC, however, there's some bad news. NVIDIA says Quad SLI will, at least for now, remain the exclusive realm of PC system builders like Alienware, Dell, and Falcon Northwest because of the "complexity" involved.

Did it mention anything about having to have a direct supply of electricity from your local Three Gorges Dam as well?

FINALLY (5, Funny)

Quick Sick Nick (822060) | more than 7 years ago | (#15472096)

*Throws away 4 7900 GTXs running in SLI*

If I upgrade, I might be able to go from 200 frames per second in Doom III to.... 205 frames per second!

I can't wait to get rid of my old setup! It was a piece of shit!

Re:FINALLY (0)

Anonymous Coward | more than 7 years ago | (#15472200)

If you think you will only be getting five extra frames per second, then you are clearly mistaken. Of course, the only way to really find out is to purchase it. I'm not affiliated with Nvidia.

Dual madness (2, Funny)

xming (133344) | more than 7 years ago | (#15472134)

I have a dual-core Core 2 Duo with dual-channel DDR2, a dual-GPU, dual-card (SLI) setup with dual monitors, cooled by dual case fans, powered by dual (redundant) PSUs on 220V. Oh, I forgot about my dual-layer DVD burner and dual-button mouse.

Re:Dual madness (0)

Anonymous Coward | more than 7 years ago | (#15472393)

...But still only one penis.

Attack of the GFX E-penis argument? (5, Insightful)

LordKazan (558383) | more than 7 years ago | (#15472199)

I'm probably going to lose even more karma for posting with that title and subject - but I'm on a karma--; roll lately.

Graphics card innovations over the past several months/year with SLI seem to me to be mostly "I have a dual SLI system!", "Yeah? Well I have a QUAD SLI system!" - so much performance goes unused that it's pointless. Furthermore, for the price of one of these brand-new cards in the article, I can build a decent gaming computer or an HDTV MythTV box.

I would rather spend $600 on much more useful things that would see use right now. On Pricewatch, the video cards at $100 are: Radeon X1300 256MB AGP, Radeon X1600 Pro 256MB PCI Express, Radeon X800 PCI Express 256MB, GeForce 6600 GT PCI-E 256MB.

Re:Attack of the GFX E-penis argument? (1)

skiflyer (716312) | more than 7 years ago | (#15472415)

I would rather spend $600 on much more useful things that would see use right now. On Pricewatch, the video cards at $100 are: Radeon X1300 256MB AGP, Radeon X1600 Pro 256MB PCI Express, Radeon X800 PCI Express 256MB, GeForce 6600 GT PCI-E 256MB.

So spend your $600 on more useful things with the rest of us, and let the fanatics keep driving the very high end video card market so that we can all benefit from it when it's in the $100 bin in what, 2 or 3 years.

Re:Attack of the GFX E-penis argument? (1)

arkhan_jg (618674) | more than 7 years ago | (#15472592)

One word:
Oblivion.
Three more words:
Unreal Tournament 2007.

I have an Athlon 64 3800+ with SLI 7800GTs, an X-Fi, etc., and Oblivion still grinds to a halt if I push the settings up much beyond their medium levels. Even FEAR only just runs at a decent rate at full whack on my rig. I don't even want to think about the horsepower UT2007 will need.

You want a game that looks like crap and runs like crap, fine. Buy an X1300 or 6600GT. Those of us who want a better looking, faster responding high-end game can use all the graphic card horsepower we can buy.

Re:Attack of the GFX E-penis argument? (1, Insightful)

Anonymous Coward | more than 7 years ago | (#15472762)

This card is not about you. It's about a) the rich, for whom $10000 for a system is as much pocket change as a $50 card for us, b) the computation-on-GPU people and most importantly c) developers whose job depends on having 2008's hardware in their development systems right now.

You've also got to admit that if you had the money you'd buy one just to tell the crazed Sony fanboys "my PC kicks the ass of a PS3 before it's even out" :)

I predicted dual video cards was a fad (5, Insightful)

TheSkepticalOptimist (898384) | more than 7 years ago | (#15472235)

Like the original dual Voodoo cards, running multiple video cards is just one of those things that keeps going out of style (but, like old fads, makes its appearance every decade or two).

The cost to implement and manufacture multiple video cards is ridiculous. Who would honestly spend $1400 just to have two video cards, and then only get at most a 20% performance improvement?

With the current trend of multiple cores, I figured it would be just a matter of time for the SLI and Crossfire solutions to switch back to a single video card. Either they would dual-core the GPU, or simply put two GPUs on the same card.

It just makes sense to keep a video card as a single card. You don't have to duplicate the production costs and all the other components that are wasted in a dual-card configuration, and you don't have to duplicate the bus technology on the motherboard in order to implement dual video cards. Overall, this will be a much cheaper configuration that will actually bring high-performance video technology into the realm of being practical.

Eventually, 4-way GPU cards will be released, and eventually nVidia and/or ATI will start to dual-core their GPUs; those who spent money on their expensive dual or even quad SLI configurations will have just wasted a bunch of money.

Re:I predicted dual video cards was a fad (1)

Alef (605149) | more than 7 years ago | (#15472497)

With the current trend of multiple cores, I figured it would be just a matter of time for the SLI and Crossfire solutions to switch back to a single video card. Either they would dual-core the GPU, or simply put two GPUs on the same card.

Actually, the G71 processor used in that beast has 32 pixel pipelines already, which in this context are similar to cores on a CPU. (Sure, they form a SIMD architecture unlike CPU cores, but so do SLIed GPUs, sort of, as I understand it.) When CPUs get more cores, GPUs get more pipelines. I suppose you could put 64 in there instead, but then you run into manufacturing problems because of the chip size.

They could probably put two GPUs on the same board though.
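To make the pipelines-versus-cores point concrete, here is a toy Python sketch (purely illustrative, not how a GPU is actually programmed): every pipeline runs the same per-pixel program in lockstep on different data, whereas CPU cores each run independent instruction streams.

    # one "shader" program, applied to a whole batch of pixels in SIMD fashion
    def shader(pixel):
        r, g, b = pixel
        return (r // 2, g // 2, b // 2)   # the same instructions for every pixel

    batch = [(200, 100, 50), (10, 220, 90)] * 16   # 32 pixels, one per pipeline
    out = [shader(p) for p in batch]               # conceptually one parallel step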

Re:I predicted dual video cards was a fad (0)

Anonymous Coward | more than 7 years ago | (#15472518)

I guess you don't see a real need for more than two monitors then? I rely on at least 2 for development, and gaming with 3 is getting better all the time. While I agree that going dual-core was inevitable, I think the need for multiple cards has been established and will continue to exist, for the simple reason that you just can't drive more than 2 monitors with one card for space reasons. What would really be interesting is if dual CPU-GPU systems could dedicate memory paths/bandwidth between specific processors; then multiple apps could run like they had their own dedicated systems!

Re:I predicted dual video cards was a fad (1)

Aladrin (926209) | more than 7 years ago | (#15472602)

"Did you know that 93% of statistics are made up on the spot?"

5950 to 7675 (3DMark scores) is over 28%. There were better and worse scores than that, but since that was the overall 3DMark score, I figured it would be good to go with.
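Checking that arithmetic in plain Python, with the two scores quoted above:

    old, new = 5950, 7675
    print(f"{(new - old) / old:.1%}")   # prints 29.0%, so "over 28%" holds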

Yes, there are individual tests that are lower than 20%, but to say 'at most 20%' when there are no games designed to USE that kind of hardware and the current benchmarks ALREADY show higher results... That's just wrong.

If you'd said 'better than 30%' I'd still have checked my facts before posting, but I never would have actually posted because for the most part, you'd be right. I'm surprised to see such a low increase when SLI normally gets about 60-80% more. (That's not a hard number, I made it up from my experiences and the benchmarks I've seen in the past.)

I splurged for a 5900 Ultra when they first came out and I was very disappointed. I took my chances one more time and bought the 7800GTX when it came out, and I've been very happy with it. This could be just another situation where everything wasn't optimized correctly and it didn't perform like it should.

Re:I predicted dual video cards was a fad (1)

CrazyBusError (530694) | more than 7 years ago | (#15472672)

"Who honestly would spend $1400 just to have two video cards, and then only get at most 20% performance improvement."

Hi! Welcome to Slashdot - take it you've just discovered this place?

The problem with dual core... (1)

imsabbel (611519) | more than 7 years ago | (#15472802)

is that it doesn't work for GPUs.
Instruction parallelization was never a problem there, so the cores are inherently as parallel as the die size allows. If you could squeeze twice as many transistors onto a chip, your GPU would have 64 instead of 32 pixel pipelines, for example.
Plus, dual core does nothing for the bandwidth problem... (and no, going to 1024-bit memory or something isn't an option.)

4x4 pawa (1)

suv4x4 (956391) | more than 7 years ago | (#15472246)

Now I've got the full package: a 4x4 car, a 4x4 AMD chipset, and a 4x4 SLI video card.
Someone shoot me.

In case you're like me (4, Informative)

szembek (948327) | more than 7 years ago | (#15472250)

SLI stands for Scalable Link Interface.

Re:In case you're like me (0)

Anonymous Coward | more than 7 years ago | (#15472362)

And for those old schoolers (like me) who wanted to correct szembek:

http://en.wikipedia.org/wiki/Scan-Line_Interleave/ [wikipedia.org]

SLI originally meant "Scan Line Interleave" (meaning one GPU would take the even scan lines and the other would take the odd scan lines, "doubling" your power because each card only rendered half of the pixels). In 2004, NVIDIA apparently renamed it to "Scalable Link Interface", so szembek is correct.
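A toy Python sketch of that even/odd split (render_line is a made-up stand-in for a real rasterizer):

    HEIGHT, WIDTH = 8, 16

    def render_line(gpu_id, y):
        # pretend to rasterize scan line y; tag each pixel with the GPU that drew it
        return [gpu_id] * WIDTH

    # GPU 0 takes the even scan lines, GPU 1 the odd ones
    framebuffer = [render_line(y % 2, y) for y in range(HEIGHT)]
    # each card rasterizes only half the lines, roughly halving per-card fill work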

Re:In case you're like me (1)

Forseti (192792) | more than 7 years ago | (#15472423)

My mistake... Man, why couldn't you have posted this before I started composing my post? ;-)

Re:In case you're like me (0)

Anonymous Coward | more than 7 years ago | (#15472465)

I was too busy not checking my urls to make sure they work :)

Re:In case you're like me (1)

gEvil (beta) (945888) | more than 7 years ago | (#15472364)

I thought it was a Super Lickable Interface. At least that sounds like it would make sense to put it on a stick...

Re:In case you're like me (1)

Forseti (192792) | more than 7 years ago | (#15472401)

Actually, in the context of linking multiple graphics cards, I'm pretty sure SLI stands for "Scan-Line Interleave". It's a technology they inherited in the 3Dfx buyout.

http://en.wikipedia.org/wiki/Scan-Line_Interleave [wikipedia.org]

Re:In case you're like me (1)

szembek (948327) | more than 7 years ago | (#15472450)

Apparently this acronym has been used multiple times with different meanings in the graphics card industry. From the Wikipedia article you posted:

NVIDIA Corporation reintroduced the name SLI in 2004 (renamed as Scalable Link Interface) and intends for it to be used in modern computer systems based on the PCI Express bus.

Nice... (1)

GmAz (916505) | more than 7 years ago | (#15472261)

Nice card... err... cards. I would buy one if I had the $$$. But if you look at the price point of the 7900GTX and this new card, the price difference isn't that big. Still though, that's a pretty penny just to make games look better. My 6800GT is still hanging in there.

Completely off topic (-1, Offtopic)

jmichaelg (148257) | more than 7 years ago | (#15472280)

Ow! Ow! Ow! Slashdot just went unreadable on me!

Taco - Love the new sleeker look, but please, please, please... give us control over the display font and font size in the preferences. The font you chose is presbyoptic-phobic.

I'm headed off to mozilla.org to see if there's a "choose your own fontsize so you can read slashdot" extension for Firefox. Hmmm,... do I look under GeezerEyes or GeezerEyesGoodbye?

Re:Completely off topic (1, Informative)

Anonymous Coward | more than 7 years ago | (#15472324)

If you're in Firefox and it's just the text size that's bothering you, press CTRL and + (the Control and plus keys). It will increase the font size but not change the font. (You can use CTRL and - (Control and minus) to change it back.)

The more you know.

Re:Completely off topic (1, Informative)

Anonymous Coward | more than 7 years ago | (#15472424)

Hold CTRL and scroll your mouse wheel. Geez, you'd think these video cards would be able to resize fonts on the fly.

What is the difference from Voodoo disaster days? (0)

Anonymous Coward | more than 7 years ago | (#15472299)

I seem to remember a day and age when a chip manufacturer thought the best way to outperform its competitors was to place two processors on the same graphics card. Then they placed four. I believe they had an eight-processor version in the pipeline when everyone told them to stop selling junk. I believe the reasons were the overhead caused by load-sharing, and that resources invested in a single processor paid off much more in terms of performance.

Could someone point out exactly why this is not likely to be repeating itself?

Imagine... (2, Funny)

enko (802740) | more than 7 years ago | (#15472323)

Imagine a Beowulf cluster of those?

Re:Imagine... (0)

Anonymous Coward | more than 7 years ago | (#15472763)

Recently featured on Slashdot... a library capable of using cards like these for FFTs. I assume this makes their benchmark more relevant.

Are they crazy? (1, Funny)

WhackingDay (967592) | more than 7 years ago | (#15472402)

$600???!! Why, that's way too expensive. I mean, no one would spend $600 for something you can only play games with. The makers of these cards are stupid.

Getting by (2, Insightful)

Neo-Rio-101 (700494) | more than 7 years ago | (#15472420)

I'm still getting by with my ATI RADEON 9700 PRO. Still plays just about anything I can throw at it. Oblivion gives it a hard time, but it's still adequately playable.

I'm going to hold off as long as possible, until the card can't play the latest games, at which point I may get one of these quad SLI setups. By that time, we'll have DDR3 memory and quad-core CPUs too.

Practicality. (1)

pr0digy25 (915443) | more than 7 years ago | (#15472444)

While it's nice to always push the envelope, certain directions of this are only good for the hardcore bleeding-edge types. Products of this caliber definitely appeal only to a limited audience.

But... where is the stick? (1)

jftitan (736933) | more than 7 years ago | (#15472596)

Now, I know about a month ago I read about NVIDIA releasing the quad-card setup to high-end PC manufacturers, for those who wanted "Lambo" price & performance. The reason people (and NVIDIA) limited these quad setups to high-end manufacturers was that the cards took up too much room for DIYers to make an efficient setup that wouldn't light a fire within the case.

Since this article stated this was NVIDIA's way of releasing the quad setup to DIYers, where is the dual-on-a-stick idea? (After reading the article & comments, I think this was the submitter's hype.)

At first I was really hoping that this would be dual graphics chips on one PCB, not two PCBs attached together. Nothing different compared to the ones released a while back to "only manufacturers". I like a clean case. That's why I spend the extra cash on longer cables, and spend at minimum an hour or so getting the cables out of the way of airflow. Seriously, I have a great gamer rig with only half the fans that most gamer rigs require just to keep them cool.

Having an oversized video card (other than for bragging rights) just keeps the airflow from being efficient. Again, this article just spoiled my hopes of buying dual SLI on one PCB - just one PCB.

Wake me when these get IC'ed onto one PCB and then I will be surprised. AND WILLING TO BUY.

AMENDMENT (1)

jftitan (736933) | more than 7 years ago | (#15472670)

I overlooked the logical layout. The concept is there... two cores on one PCB, logically!

I don't doubt that gamers already have rigs large enough for a longer-than-usual video card. I mean, hell, I can fit an old SCSI card into a standard PC case. (Remember those old-ass SVGA cards, so long they actually contacted the front end of the case?)

I am sure they could have lengthened the PCB, placed both cores on the same PCB, and then made an air duct system using fewer parts. After seeing the "logical" diagram, I know we will soon see these duals on a stick; just gotta get the engineers involved.

And I just bought 'Deus Ex' the other day ... :-) (2, Insightful)

Qbertino (265505) | more than 7 years ago | (#15472669)

I just bought the budget edition of 'Deus Ex' the other day. What I really like about it is that I needn't think twice about whether it will run smoothly or not. I have an Athlon XP 2100+ and a GeForce 4 Ti 4-something; I can crank the graphics up to full and needn't worry about lag or anything.
That's always the more fun way to go, IMHO.

GeForce 7950 GX2, More details at HotHardware (0)

Anonymous Coward | more than 7 years ago | (#15472739)

From another post that never made the front page...

NVIDIA has officially taken the wraps off of their new ultra high-end graphics card, dubbed the GeForce 7950 GX2. HotHardware has a full review and showcase [hothardware.com] posted that shows performance with this new single-card design, which employs a pair of GeForce 7900 GPUs on a single card. One of the more interesting aspects of the card [hothardware.com] is its PCI Express switch, which provisions an x8 PCI Express connection to each GPU from a single x16 PCI Express graphics slot. The new card certainly rips up the benchmarks, too.
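Rough numbers for that lane split, assuming first-generation PCI Express at roughly 250 MB/s per lane per direction:

    LANE_BW = 0.25  # GB/s per lane per direction, PCIe 1.x
    print(f"x16 slot total: {16 * LANE_BW} GB/s")  # 4.0 GB/s, shared by both GPUs
    print(f"per GPU at x8:  {8 * LANE_BW} GB/s")   # 2.0 GB/s each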

I was hoping SLI on a Stick meant USB or Firewire (3, Insightful)

WillAffleckUW (858324) | more than 7 years ago | (#15472778)

because for the large numbers of us with laptops, it's really hard to upgrade our video cards, given space constraints, but quite easy to pop in a "stick" video card so we can run the latest graphics apps.

Sigh.

See, if I'd bought the "latest" computer, I'd already be out of date - by choosing to just buy a cheap $500 laptop, I'm just as out of date as I was a month ago.

But ... I will need to be able to play Spore ...

Hurry Up and Wait (1)

Doc Ruby (173196) | more than 7 years ago | (#15472801)

With so much of the highest-level CPU design going into GPUs, and so many of the most wily consumers of the fastest GPUs going to any lengths possible to trick them out, I'm surprised there's not a lot more development of GPGPU [gpgpu.org], harnessing these processors for general purpose computing.

Given the qualifications and interests of that joint community, I'd expect to see a "PCI network" that parallelizes MP3 encoding on much cheaper MFLOPS GPU HW by now.

Maybe actually playing the games is eating up too much time.
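For what it's worth, the core GPGPU trick of that era was to recast a computation as a pure per-element "kernel" (in practice a fragment shader run over a texture). Here is a plain-Python mock of the pattern; the names and the toy kernel are made up for illustration, not a real GPGPU API:

    def kernel(sample):
        # stand-in for the per-element work, e.g. one filter step of an MP3 encoder
        return sample * 0.5

    texture = [float(i) for i in range(1024)]   # data "uploaded" as a texture
    result = [kernel(s) for s in texture]       # a GPU would apply these in parallel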

I won't buy this one, but... (1)

ericdfields (638772) | more than 7 years ago | (#15472829)

... I'm happy that NVIDIA & ATI keep pushing these overly slick, largely under-utilized cards into the gaming market, as they continue to drive down the price of pre-existing, under-utilized cards. The mobo/v-card combo I've been eyeing was ~$400 a month ago. Now it's ~$300. I'll be drooling over HDR Oblivion by the end of the summer ;-)

No more heating bills (0)

Anonymous Coward | more than 7 years ago | (#15472840)

Next winter, I can just set up my quad core GPU in the middle of the house and it should be able to heat everything.