
Nvidia Launches 8800 Series, First of the DirectX 10 Cards

ScuttleMonkey posted more than 7 years ago | from the by-popular-demand dept.


mikemuch writes "The new top-end GeForce 8800 GTX and GTS from Nvidia launched today, and Loyd Case at ExtremeTech has done two articles: an analysis of the new GPU's architecture, and a benchmark article on PNY's 8800 GTX. The GPU uses a unified scalar-based hardware architecture rather than dedicated pixel pipelines, and the card sets the bar higher yet again for PC graphics." Relatedly, an anonymous reader writes "The world and his dog has been reviewing the NVIDIA 8800 series of graphics cards. There is coverage over at bit-tech, which has some really in-depth gameplay evaluations; TrustedReviews, which has a take on the card for the slightly less technical reader; and TechReport, which is insanely detailed on the architecture. The verdict: superfast, but don't bother if you have less than a 24" display."


149 comments


WOW! This is FAST! (3, Insightful)

Salvance (1014001) | more than 7 years ago | (#16775453)

It's actually pretty surprising that the DX10-compatible 8800 runs $450-$600, given it's brand new and has huge performance gains over Nvidia's current cards. I don't understand why someone would say to only buy it if you have a 24" monitor, though ... it seems like buying a single 8800 would be just as good as (and cheaper than) buying a couple of 7800s ...

Re:WOW! This is FAST! (2, Informative)

Ant P. (974313) | more than 7 years ago | (#16775707)

What they're saying is that if you're only ever going to go up to 1600x1200, this card is just going to be wasted drawing more frames than your monitor can ever display. Right now it looks like the only thing that could strain this card is one of those huge Apple LCDs.

Re:WOW! This is FAST! (1)

Kenja (541830) | more than 7 years ago | (#16775847)

7800GTX and dual AMD Opterons.

Modern games still don't run at optimal frame rates at 1280x1024 with max graphics settings. The most recent example is Neverwinter Nights 2, where I get around 20fps, which is enough, but I wouldn't mind it being a bit smoother.

Re:WOW! This is FAST! (2, Informative)

JohnnyBigodes (609498) | more than 7 years ago | (#16777085)

Apparently Neverwinter Nights 2 has some sort of problem; it is *very* slow for some people with reasonably fast PCs. I've tried it, and it also runs almost unbearably slowly with everything set to medium and a couple of settings on low (1024, no AA) on a 7800.

20fps with your 7800GTX in NWN2 is certainly not acceptable :)

Re:WOW! This is FAST! (0)

Anonymous Coward | more than 7 years ago | (#16777261)

Hmmm... I get mostly 30fps (vsynced, occasionally drops to 20 if the area gets intensive) at that resolution on that game with just a plain vanilla 6800 (and single 3GHz P4 HT) and the shadow options turned down to non-insanity. Can't see it makes much difference visually.

However, there doesn't seem to be an option for AA. Also my screen is 1600x1200 native; fortunately it's 20", so running 1280x1024 centred non-scaled works OK.

I want one of these beasties... the cheapest model I could find from a UK retailer was the equivalent of €550. I am sort of in the market for a graphics card update (Oblivion for example I have to run at 1024x768, and that doesn't look so great scaled up to 1600x1200) but my budget is more like €300. I'm not happy with the increase an X1900PRO or a 7950GT would get me (doubtful that the insanity that is Bethesda's creation runs on those at 1600x1200 high settings).

Just need to keep waiting I guess. Tests the patience though; I could afford to even get an 8800 series right now if I was completely and utterly foolish with my money. I suspect I may end up holding off a whole year more. Took that long when I was price/value-watching for a laptop (heh, the thing manages to run NWN2 800x600 on an X1400 Mobility at 20fps - with high textures and the filtering which are the important things for me!).

Re:WOW! This is FAST! (4, Funny)

steveo777 (183629) | more than 7 years ago | (#16776159)

Just play Ultima IX in 1024x768 mode without any of the fixes or patches. I do believe the 8800 will have met its match. (Never met a configuration that could run it over 10fps, except my friend's old 650MHz PIII with some VooDoo card or another, which ran it at 19fps.)

Re:WOW! This is FAST! (2, Funny)

Nimey (114278) | more than 7 years ago | (#16776453)

VooDoo


Stop that.

Re:WOW! This is FAST! (0, Flamebait)

LordMyren (15499) | more than 7 years ago | (#16778175)

|| VooDoo
|
| Stop that.
| --
| Hail Eris, full of mischief...

You just earned yourself one Greyface cursing.

FUCK-ASS. Consider your banal ass cursed.

Re:WOW! This is FAST! (1)

mikael (484) | more than 7 years ago | (#16776351)

But if you are doing general-purpose computing that requires considerable floating-point performance (FFT signal processing, dynamic systems), then you won't be restricted by the refresh rate of the monitor. Both DirectX and OpenGL support floating-point framebuffers. However, for some simulations you may have fewer than four floating-point variables per pixel, so by using only three of the four pixel channels you waste 25% or more of your processing time. Scalar processors would seem to be the way to solve this problem.
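For what it's worth, that is roughly the model the G80's scalar stream processors expose through NVIDIA's new CUDA toolkit: one thread per scalar element, so a simulation with only two state variables per element doesn't strand half of an RGBA texel. A minimal sketch, with made-up kernel and parameter names (the damped-oscillator step is just an example of a dynamic system):

// One damped-oscillator step per element. On a four-channel pixel pipeline
// you would pack x and v into an RGBA float texel and leave two channels
// idle; here each thread simply owns its own pair of scalars.
__global__ void oscillator_step(float* x, float* v, float k, float damp,
                                float dt, int n)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) {
        v[i] += (-k * x[i] - damp * v[i]) * dt;  // update velocity
        x[i] += v[i] * dt;                       // update position
    }
}

// Host side: launch enough 256-thread blocks to cover all n elements.
void step_on_gpu(float* d_x, float* d_v, float k, float damp, float dt, int n)
{
    int threads = 256;
    int blocks = (n + threads - 1) / threads;
    oscillator_step<<<blocks, threads>>>(d_x, d_v, k, damp, dt, n);
}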

Re:WOW! This is FAST! (1)

Barny (103770) | more than 7 years ago | (#16776601)

Hrmm, well, I'm running a pair of 7900GT cards atm, and in Company of Heroes, with all settings maxed at 1920x1200 (Dell 24"), things get a little chunky. So yes, there are screens other than Apple's that need these... mine's pre-ordered, btw :)

Re:WOW! This is FAST! (2, Funny)

Sj0 (472011) | more than 7 years ago | (#16777127)

You know, I remember being impressed that Duke Nukem 3D ran at 640x480. The point where you need 1920x1200 AND anti-aliasing is the point where you're just doing it to make fun of the people without a GeForce 8800.

Re:WOW! This is FAST! (1)

Ruff_ilb (769396) | more than 7 years ago | (#16777993)

Indeed.

640K should be enough for anyone.

Re:WOW! This is FAST! (1)

XL70E3 (574496) | more than 7 years ago | (#16778035)

You know, I was impressed by it too. But then, I was very much impressed when I bought my $400 Voodoo 2 card and played Quake 2 at 800x600! You can't stop this! Somehow, graphics improvements are good in games too. My guess is that you need to upgrade, but you can't.

Re:WOW! This is FAST! (2, Insightful)

Babbster (107076) | more than 7 years ago | (#16777205)

It depends on the game. In the [H]ardOCP review [hardocp.com], this appears to be the first card that can do Oblivion with maxed in-game settings (the grass has been the problem area in the past, even with top-of-the-line cards) at very high resolutions and high AA settings while retaining solid framerates - the settings they considered ideal in their testing were 8x AA at 1600x1200 and 4x AA at 1920x1200. That would be impressive for an SLI setup, let alone a single card.

How worthwhile that is depends, of course, on just how killer a person wants their gaming rig to be (I can't imagine ever buying a $600 graphics card myself). But, given that the performance seems to exceed that of any other graphics card (or any two, for that matter), it's pretty clearly the card to get to ensure maximum gaming PC penis size. :)

Re:WOW! This is FAST! (2, Insightful)

Handpaper (566373) | more than 7 years ago | (#16777427)

Ahem.

Section "Monitor"
Identifier "Sun GDM-5410"
HorizSync 30-122
VertRefresh 48-160
Modeline "2048x1536@72" 472.89 2048 2080 3872 3904 1536 1565 1584 1613
EndSection

'The old that is strong does not wither' :)

Re:WOW! This is FAST! (1)

Trinn (523103) | more than 7 years ago | (#16778785)

Just wait until us Beryl developers get ahold of one ^-^

Re:WOW! This is FAST! (0)

Anonymous Coward | more than 7 years ago | (#16775761)

it seems like buying a single 8800 would be just as good (and cheaper) than buying a couple 7800's ...


And the driving need for a couple of 7800s is? I have a 24" and I'm fairly happy with the 6800GTO I have driving it. The only reason I'd look to upgrade is to reduce the fan noise...

Re:WOW! This is FAST! (1)

Heembo (916647) | more than 7 years ago | (#16777025)

The only reason I'd look to upgrade is to reduce the fan noise...

Just remove the fan and smear heat-absorbing paste all over your video card! Works wonders!

You're forgetting the hidden costs... (0)

Anonymous Coward | more than 7 years ago | (#16776147)

DirectX 10 will require Windows Vista: $150 (basic version)
New processor/motherboard or RAM to deal with Vista requirements: $800
And let's not forget the usual HDCP/Windows EULA: Your soul

Re:You're forgetting the hidden costs... (2, Insightful)

Tyger (126248) | more than 7 years ago | (#16776385)

Just because you have a DirectX 10 capable card doesn't mean you need DirectX 10. Most of those games/benchmarks are against DirectX 9, and the rest are against OpenGL. It will be a few years before most games require DirectX 10.

Re:You're forgetting the hidden costs... (1)

Supergibbs (786716) | more than 7 years ago | (#16777793)

Yes, but Windows Vista will take advantage of it...ducks

Re:You're forgetting the hidden costs... (1)

gripen40k (957933) | more than 7 years ago | (#16778761)

OK, if you are buying an 8800 then you KNOW you have an adequate rig to run Vista... Besides, who pays for software these days?

Re:WOW! This is FAST! (1)

mgemmons (972332) | more than 7 years ago | (#16777665)

Actually, I think the point they were making is that until you max out your resolution above 1920x1200, both the nVidia card and the ATI card it was being compared to are so fast that the bottleneck is always the CPU and not the GPU.

Re:WOW! This is FAST! (1)

saltlick35 (1019660) | more than 7 years ago | (#16778641)

I wonder what the ATi's will be like...

DNF! (4, Funny)

spacemky (236551) | more than 7 years ago | (#16775533)

I heard somewhere that this will be one of the only supported video cards in Duke Nukem Forever.

*ducks*

another review (4, Informative)

brunascle (994197) | more than 7 years ago | (#16775547)

Hot Hardware has another review [hothardware.com]

More In-depth Analysis Here At HotHardware.com (3, Interesting)

MojoKid (1002251) | more than 7 years ago | (#16775551)

NVIDIA has officially launched their new high-end 3D graphics card that has full support for DX10 and Shader Model 4.0. The GeForce 8800 series is fully tested and showcased [hothardware.com] at HotHardware, and its performance is nothing short of impressive. With up to 128 stream processors under its hood, up to 86GB/s of memory bandwidth at its disposal, and a whopping 681 million transistors, it's no wonder the new GPU rips up the benchmarks [hothardware.com] like there's no tomorrow. NVIDIA is also launching a new enthusiast line of motherboard chipsets in support of Intel's Core 2 Duo/Quad and AMD Athlon 64 processors. NVIDIA's nForce 680i SLI and nForce 680a SLI motherboard chipsets [hothardware.com] will also allow a pair of these monster graphics cards to run in tandem for nearly double the performance, and the new chipsets offer a ton of integrated features, like Gigabit Ethernet link teaming.

More In-depth Analysis of the ethernet. (0)

Anonymous Coward | more than 7 years ago | (#16775907)

"NVIDIA is also launching a new enthusiast line of motherboard chipsets in support of Intel's Core 2 Duo/Quad and AMD Athlon 64 processors. NVIDIA's nForce 680i SLI and nForce 680a SLI motherboard chipsets will also allow a pair of these monster graphics cards to run in tandem for nearly double the performance and the new chipset offers a ton of integrated features, like Gigabit Ethernet Link Teaming etc."

Binary drivers all the way. Or do we get documentation for the ethernet this time?

Re:More In-depth Analysis Here At HotHardware.com (0)

Anonymous Coward | more than 7 years ago | (#16777797)

681 million transistors is quite impressive. I think the Xbox 360 GPU has something like 330 million, for comparison.

Does this mean.. (1)

bigattichouse (527527) | more than 7 years ago | (#16775571)

... you can get reasonable framerates with NeverWinter Nights? :)

Yes... (1)

topace3 (962476) | more than 7 years ago | (#16776373)

...the original, that is.

Re:Does this mean.. (1)

Mongoose (8480) | more than 7 years ago | (#16777839)

It'll run fine on a 7800, since that's what was used to make it. ;)

Yay (0)

Anonymous Coward | more than 7 years ago | (#16775611)

enough power to run Windows Vista and DNF at the same time

Yeah, but... (2, Informative)

cp.tar (871488) | more than 7 years ago | (#16775639)

... does it run Linux?

Seriously... when are the Linux drivers expected?

Re:Yeah, but... (-1, Offtopic)

Clever7Devil (985356) | more than 7 years ago | (#16775671)

I wish I had mod points so I could give you +1 funny.

Re:Yeah, but... (1)

rg3 (858575) | more than 7 years ago | (#16775957)

In theory, the so-called NVIDIA binary blob used by the Linux driver is not platform specific and is used across several operating systems. This means that when that blob supports the new cards, any future driver releases should support them. AFAIK NVIDIA has been pretty fast at introducing drivers for new cards, so I would expect the next driver release to support these cards. Whenever that is, I'm not sure. Maybe 15 days?

Re:Yeah, but... (0, Troll)

Constantine XVI (880691) | more than 7 years ago | (#16776017)

Better yet, when will we have a video card that actually runs Linux? I'm sure it's possible, even if we couldn't do much with it.

Re:Yeah, but... (1)

mashade (912744) | more than 7 years ago | (#16776021)

Well, since Nvidia uses a unified single driver for all their cards, the Linux drivers are already out! Unless, of course, this is a special case.

Re:Yeah, but... (2, Insightful)

BRTB (30272) | more than 7 years ago | (#16776031)

Is now [nzone.com] soon enough for you? =]

Sure, they're beta, if you want to be picky about it. Probably works just fine - their last beta drivers did.

Re:Yeah, but... (1, Interesting)

Anonymous Coward | more than 7 years ago | (#16777299)

Sure, they're beta, if you want to be picky about it. Probably works just fine - their last beta drivers did.

Maybe they did for the majority, but they sucked ass for some of us with more esoteric systems.

For example, all the 7xxx and up cards have dual-link DVI transmitters built into the chipset - it is not an option. Yet if the driver had problems parsing the EDID information from the monitor, it assumed the transmitters were single-link and misprogrammed them as such, ensuring that they would not work at all. No amount of configuration options could force the dual-link behaviour if the EDID information was unreadable (for example, if you have a uni-directional DVI-over-fibre extender, or your monitor's EDID voltage level is just a tad below spec).

Their only Linux support - the informal, unofficial kind provided via a forum at nvnews.com - was seriously lacking in ass; it wasn't even half-assed. All they could do was follow a script, and when you got to the end of the script without a solution, they just stopped responding.

ATI drivers are just as closed; I won't be buying either in the future unless they open up. It is too bad that Intel's fully open graphics are motherboard-only.

At least I won't lose out on all the fancy-dancy MPEG4 decode acceleration that the Nvidia card can do - their official linux drivers don't support it either. For me, that makes even the 8800 cards just about on par with the Intel offerings.

Re:Yeah, but... (1, Interesting)

allometry (840925) | more than 7 years ago | (#16776861)

Not to troll, but in using Linux, I've never seen the need for such a card.

Is anyone actually using this card under Linux who can give me a reason? I'm simply curious, that's all...

Re:Yeah, but... (1, Informative)

Anonymous Coward | more than 7 years ago | (#16777155)

People actually use high-end graphics cards on Linux for things other than games. There are Linux systems running scientific visualization software, virtual reality systems (flight simulators, driving simulations) and animation software for the non-game entertainment industry.

Re:Yeah, but... (2, Funny)

Sj0 (472011) | more than 7 years ago | (#16777177)

It makes the other nerds think you've got a HUGE wang.

Re:Yeah, but... (1)

dingDaShan (818817) | more than 7 years ago | (#16777761)

...I just got a DX9 card...

24" monitor? (3, Insightful)

BenFenner (981342) | more than 7 years ago | (#16775669)

So this will benefit my 13' projected monitor running at 1024 x 768 resolution (60 Hz refresh), and not my 20" CRT running at 1600 x 1200 resolution (100 Hz refresh)?

You don't say...

Now we need (1)

wumpus188 (657540) | more than 7 years ago | (#16775685)

Now we need Duke Nukem Forever to really put this baby to work.

Re:Now we need (1)

ningeo (1022283) | more than 7 years ago | (#16777115)

I don't know about the 24" monitor comment, but DirectX 10 capability puts this high up on my wish list... mostly because of Crysis. http://www.crysis-game.com/ [crysis-game.com] You'll probably have more luck with Google; it's not much of a page. Take a look at the movies, though - they can be found - just make sure you see them at high res.

What about the DRM (DX 10 certs require it) (3, Interesting)

plasmacutter (901737) | more than 7 years ago | (#16775697)

Seriously... last I checked, certification for logo testing and DX 10 required DRM... not just DRM, but enough lockdown to get Hollywood to sign off on it.

They kept changing the standards over and over... so the question is exactly what is required in terms of integrated DRM.

Re:What about the DRM (DX 10 certs require it) (1)

Firehed (942385) | more than 7 years ago | (#16777973)

That's the FUDdiest post I've read in a while. Back that up with at least a vague reference to anything.

HDCP support being required wouldn't surprise me, but that's not so much DRM as a stupid way to try and make you buy a new monitor. In either case, it won't affect gaming whatsoever, or legal content for quite some time (the ICT isn't likely to be enabled before 2010-2012). I doubt it's required anyway, just highly recommended.

Finally... (2, Funny)

Anonymous Coward | more than 7 years ago | (#16775725)

...something that can run Vista Aero with 5 stars!!!

MSI's 8800GTX @ Bootdaily (1)

theonecp (1021699) | more than 7 years ago | (#16775883)

The folks at Boot Daily take a peek at MSI's GeForce 8800GTX and run it through quite a few benchmarks and discuss its visual qualities.

Virtualisation Support? (2, Interesting)

TheRaven64 (641858) | more than 7 years ago | (#16775915)

I was under the impression that one of the major advantages of DirectX 10 was that it supported virtualisation. This means the device needs either to be able to save its entire state to main memory (for switching) or, ideally, to present virtual devices that render to a texture rather than the main screen (for picture-in-picture).

TFA didn't seem to mention anything about this though. Can anyone comment?

Re:Virtualisation Support? (1)

rzei (622725) | more than 7 years ago | (#16776767)

Not that I'm a developer or really know anything about the implementations of today's graphics cards, but I think off-screen rendering has been supported for some time.

For example, the game F.E.A.R. [whatisfear.com] made use of, among other things, off-screen rendering (rendering straight to a texture) when multiple surveillance cameras were shown on a monitor in the game world.

The way I see it, the chip itself doesn't have to know much about how many tasks are using it; it's the drivers, or perhaps even higher-level software, that does the scheduling of graphics requests. But then again, this is all speculation :)

Re:Virtualisation Support? (1)

pilkul (667659) | more than 7 years ago | (#16776889)

Indeed. Less obviously, many games render to a texture in order to apply full-screen effects (e.g. your entire vision getting blurry when you are damaged) before sending the result to the screen.
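For the curious, this is roughly what render-to-texture looks like through the GL_EXT_framebuffer_object extension that current drivers expose. A minimal sketch, with an arbitrary 512x512 target and only the bare status check:

#include <GL/glew.h>  // or any loader exposing the EXT_framebuffer_object entry points

static GLuint fbo, colorTex;

// One-time setup: a 512x512 texture attached as the FBO's colour buffer.
void createRenderTarget(void)
{
    glGenTextures(1, &colorTex);
    glBindTexture(GL_TEXTURE_2D, colorTex);
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA8, 512, 512, 0,
                 GL_RGBA, GL_UNSIGNED_BYTE, NULL);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);

    glGenFramebuffersEXT(1, &fbo);
    glBindFramebufferEXT(GL_FRAMEBUFFER_EXT, fbo);
    glFramebufferTexture2DEXT(GL_FRAMEBUFFER_EXT, GL_COLOR_ATTACHMENT0_EXT,
                              GL_TEXTURE_2D, colorTex, 0);
    if (glCheckFramebufferStatusEXT(GL_FRAMEBUFFER_EXT) != GL_FRAMEBUFFER_COMPLETE_EXT) {
        /* handle error */
    }
    glBindFramebufferEXT(GL_FRAMEBUFFER_EXT, 0);
}

// Per frame: draw the off-screen view into the texture, then draw the normal
// scene and sample colorTex on the in-game monitor or in a full-screen blur pass.
void renderOffscreenView(void)
{
    glBindFramebufferEXT(GL_FRAMEBUFFER_EXT, fbo);
    glViewport(0, 0, 512, 512);
    // ... draw the security-camera view here ...
    glBindFramebufferEXT(GL_FRAMEBUFFER_EXT, 0);
}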

More at ocp and toms (1)

llZENll (545605) | more than 7 years ago | (#16775977)

http://enthusiast.hardocp.com/article.html?art=MTIxOCwxLCxoZW50aHVzaWFzdA== [hardocp.com]

http://www.tomshardware.com/2006/11/08/geforce_8800/ [tomshardware.com]
Although the Tom's article is pretty worthless, as most benches are CPU-bound with an FX64 CPU.

My favorite has to be this page: 8800 GTX SLI / 3.80GHz Core 2 Duo SLI
http://www.extremetech.com/article2/0,1697,2053791,00.asp [extremetech.com]

Coincidence? (1)

minvaren (854254) | more than 7 years ago | (#16776039)

Is it any coincidence that they launch the first DX10 card the same day that Vista goes gold [neowin.net]?

Re:Coincidence? (3, Funny)

slughead (592713) | more than 7 years ago | (#16776105)

Any coincidence that they launch the first DX10 card the same day that Vista goes gold?

No. M$ doesn't release its products until they go bismuth (to treat typical symptoms of M$' early adopters), which is still 4 release candidates away.

[H]ard Review (1)

mr_stinky_britches (926212) | more than 7 years ago | (#16776049)

For those of you who are interested in what the [H] has to say about this card... here is the direct link:

BFGTech GeForce 8800 GTX and 8800 GTS [hardocp.com]
Today marks the announcement of NVIDIA's next generation GeForce 8800 series GPU technology code named "G80." We have two new hard-launched video cards from BFG Tech representing the 8800 products. Gameplay experience TWIMTBP?

I found their review to be of typical [H] quality, which I think is pretty decent (when compared to other H/W review sites, that is ;)

-
Wi-Fizzle Research [wi-fizzle.com]

More goodness (-1, Troll)

Anonymous Coward | more than 7 years ago | (#16776069)

Ah goody, slashbots rejoice: with a new top-of-the-line graphics card out, the slashbots have a new opportunity. They can give more blowjobs and sell more crack to soccer moms to raise the money to buy it. Then they can give even more blowjobs to buy a bigger monitor to really push their gfx card to the limit; big monitors are good because they block out the sunlight even more, making for an even better bestiality viewing experience. What more could a slashbot want for Christmas?

SLI? (1)

scombs (1012701) | more than 7 years ago | (#16776319)

Is this SLI capable? Not that I would be able to pay for even one of the cards...

Re:SLI? (2, Informative)

noSignal (997337) | more than 7 years ago | (#16776827)

From nvidia.com:

Q: Do the new GeForce 8800 GTX and GeForce 8800 GTS GPUs support SLI technology?

A: Yes. All GeForce 8800 GPUs support NVIDIA SLI technology.

Re:SLI? (1, Informative)

Anonymous Coward | more than 7 years ago | (#16777505)

From the looks of it, with the card having two SLI connectors on it, a 3-card SLI solution will be introduced soon enough. Scary to think that there are people out there willing to spend that kind of cash to get three top of the line 8800s - not to mention paying top dollar for a board that has three PCI-E 16x slots.

Makes PS3 obsolete before launch (2, Interesting)

ConfusedSelfHating (1000521) | more than 7 years ago | (#16776393)

At least the Xbox 360 was released before it was obsolete. The PS3 graphics processor is similar to the 7800 GTX, if I remember correctly. When the PS3 is released, people won't be saying "Buy the PS3 for the greatest graphical experience"; instead they'll say "Buy the PS3 for the greatest graphical experience, except for the PC you could have bought last week". The PS3 will probably be about as powerful as the 8600 when it's released.

I know I sound very offtopic bringing this up, but many PC gamers also play console games. They will want to compare console graphics to PC graphics.

Re:Makes PS3 obsolete before launch (1)

Wesley Felter (138342) | more than 7 years ago | (#16776791)

It's really not fair to expect a $500 console to have the same graphics as a $2,000 PC. For mainstream gamers, PS3 will probably compare favorably to a PC when it comes out.

Re:Makes PS3 obsolete before launch (1)

Chandon Seldon (43083) | more than 7 years ago | (#16777369)

Isn't the good PS3 package selling for more like $600?

The 8800 GTS is around $450. You can easily get the rest of a solid gaming PC for $450. So you're talking more about a $900 PC than a $2,000 PC. And that's before pricing a decent HDTV vs. a decent gaming monitor.

Re:Makes PS3 obsolete before launch (1)

Elladan (17598) | more than 7 years ago | (#16776841)

Graphics processors aren't really as important for a console system as for a PC, though, since consoles target an output device with a typical resolution of 640x480 at 60 frames per second at most (and more like 30 in most cases). Sure, a few people might have HDTV, but not many.

Plus, the PS3 has a herd of vector coprocessors to assist the video engine. I don't think anyone is going to pass on a PS3 because it fails to meet some artificial benchmark in the lab. They're going to complain that it costs a hell of a lot for the small number of games available.

Re:Makes PS3 obsolete before launch (1)

PitaBred (632671) | more than 7 years ago | (#16776991)

Consoles can tune games a LOT more than PC's, because the hardware is completely standard. They can do tricks and optimizations with rendering and such that you couldn't reliably expect to work on Joe Blow's random PC. The console still isn't out of the game.

Besides, the video card you can buy costs as much as a whole PS3. The PS3 is still better bang for your gaming buck. Either way, I'm ok with my Go7600 in my laptop and I'm gonna get a Wii, so y'all can go do your own thing when posturing about games.

Re:Makes PS3 obsolete before launch (2, Insightful)

Sj0 (472011) | more than 7 years ago | (#16777307)

You CAN, but I've found almost universally that they don't. The game development cycle is too tight, multi-platform compatibility is too important, and codebases are simply too large to justify optimizing the living hell out of the code you've got.

And the new gaming PC I'm building costs less than the PS3, and other than perhaps 100 bucks for the chibi version of this monster when it comes out, I don't expect to have to do much to keep the system I'm building competitive with the PS3 in terms of playing a broad spectrum of recent games for the lifespan of the machine.

Not to be a PS3 fanboy (1)

Inoshiro (71693) | more than 7 years ago | (#16777273)

(Especially as I find Sony a bunch of asshats), but...

An 8800 GTX is how much, exactly? A PS3 is $550 CDN. How many PC games will use DX10? 10 games? AFAIK, Halo 1, Half-Life 2, etc., aren't magically DX10 games, since they were written for previous versions of the DX API.

Will SquareEnix be writing PC versions of Final Fantasy XII? X-2?

Cost-wise, these cards and the PS3 are close. Game-wise, I suspect the PS3 will have more games out than there will be DX10 games. The DX10-Vista lock-in is another disincentive to go and get a raging boner over an 8800.

I find my Nintendo DS to be a very enjoyable game platform, despite the fact that it doesn't require a 450W power supply, or other commensurate upgrades, to get the same picture as my PC.

Re:Makes PS3 obsolete before launch (1)

asuffield (111848) | more than 7 years ago | (#16777775)

I know I sound very offtopic bringing this up, but many PC gamers also play console games. They will want to compare console graphics to PC graphics.


Have you ever played PC games and console games? Console graphics have always SUCKED DONKEY compared to PC graphics. The PS2 had the most advanced graphics processor around when it was released... but the output resolution was 320x200 (because that's about what a TV uses), so it really didn't matter a damn.

Nobody sane has ever expected decent graphics from a console. Consoles are not intended to give decent graphics, except when compared to other consoles. Consoles are intended to be used with TV sets, which is an incredibly limiting constraint.

The current round of consoles has made a lot of noise about HDTV but none of them currently have any intention of implementing it (they'll give HDMI output but the quality of the output is not greatly improved over a regular TV). Console games continue to run at painfully low resolutions, compared to equivalent PC games. Gamers who really care about graphics quality will continue to use PCs, like they have done for the past several years.

oh goodie (1)

zulater (635326) | more than 7 years ago | (#16776447)

now we can finally watch pr0n at over 1000fps!

Re:oh goodie (0)

Anonymous Coward | more than 7 years ago | (#16778859)

As fun as it sounds at first, I think I'll stick with being with my girlfriend at 60 fps.

NVIDIA CUDA, GPGPU initiative (5, Interesting)

Vigile (99919) | more than 7 years ago | (#16776555)

http://www.pcper.com/article.php?type=expert&aid=319 [pcper.com]

This review looks at gaming and such too, but also touches on NVIDIA CUDA (Compute Unified Device Architecture), which NVIDIA is hoping will bring supercomputing into mainstream pricing. What thermal dynamics programmer wouldn't love to have access to 128 1.35 GHz processors for $600?

http://www.pcper.com/article.php?aid=319&type=expert&pid=5 [pcper.com]
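To give a flavour of what that looks like in practice, here is a minimal CUDA sketch of the classic scale-and-add loop spread across the card's stream processors; the sizes and names are illustrative, not taken from the review:

#include <cuda_runtime.h>

// y[i] = a * x[i] + y[i], one array element per thread.
__global__ void saxpy(int n, float a, const float* x, float* y)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) y[i] = a * x[i] + y[i];
}

void saxpy_on_gpu(int n, float a, const float* hx, float* hy)
{
    float *dx, *dy;
    cudaMalloc((void**)&dx, n * sizeof(float));
    cudaMalloc((void**)&dy, n * sizeof(float));
    cudaMemcpy(dx, hx, n * sizeof(float), cudaMemcpyHostToDevice);
    cudaMemcpy(dy, hy, n * sizeof(float), cudaMemcpyHostToDevice);

    int threads = 256;                        // threads per block
    int blocks = (n + threads - 1) / threads; // blocks to cover all n elements
    saxpy<<<blocks, threads>>>(n, a, dx, dy);

    cudaMemcpy(hy, dy, n * sizeof(float), cudaMemcpyDeviceToHost);
    cudaFree(dx);
    cudaFree(dy);
}

Whether that turns a $600 card into a poor man's supercomputer depends mostly on how well the problem maps onto thousands of independent threads.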

"DirectX 10 Cards"? (1, Troll)

Deagol (323173) | more than 7 years ago | (#16776781)

Isn't this the tail wagging the dog? Shouldn't the video card industry have hardware API standards and shouldn't the software vendors be releasing stuff compatible with the hardware?

"DirectX 10 Cards" sounds as silly as saying "Vista compatible PC BIOS". WTF?

Re:"DirectX 10 Cards"? (2, Insightful)

TheRaven64 (641858) | more than 7 years ago | (#16777149)

The biggest difference between DirectX and OpenGL is the extension mechanism. OpenGL specifies a set of features which must be implemented (in hardware or software), and then allows vendors to add extensions. These can be tested for at runtime and used (and the most popular ones then make it into the next version of the spec). DirectX doesn't have a comparable mechanism; the only features it exposes are those that the current version of the API dictates.

In their rush to get a chunk of the big Windows market share, vendors put their weight behind DirectX, without noticing that it was a typical Microsoft attempt to commoditise the market by preventing vendors from differentiating themselves easily. Now GPU vendors just have to try to release the fastest card they can that conforms to the Microsoft API, rather than adding new, innovative, features. I doubt something like S3 texture compression would survive if it were added now; only OpenGL programmers would be able to use it, and they don't seem to make up much of the games market.
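For reference, the runtime test mentioned above is about this simple; a minimal sketch using S3 texture compression as the example (a robust check should match whole space-separated tokens rather than substrings):

#include <GL/gl.h>
#include <string.h>

// Returns nonzero if the current GL context advertises the named extension.
int hasExtension(const char* name)
{
    const char* exts = (const char*)glGetString(GL_EXTENSIONS);
    return exts != NULL && strstr(exts, name) != NULL;
}

void chooseTexturePath(void)
{
    if (hasExtension("GL_EXT_texture_compression_s3tc")) {
        /* upload with glCompressedTexImage2D(..., GL_COMPRESSED_RGBA_S3TC_DXT1_EXT, ...) */
    } else {
        /* fall back to plain glTexImage2D uploads */
    }
}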

"You must be new here." (1)

uhlume (597871) | more than 7 years ago | (#16777161)

Seriously, where have you been for the last 10-15 years, and were you somehow under the impression all this time that OpenGL, DirectX 3-9 and their predecessors were "hardware API standards"? The only difference in this respect between DirectX 10 and earlier versions is that DX10 doesn't attempt to provide backward compatibility for older hardware, so you'll need an explicitly DX10-compatible card in order to take advantage of DX10 rendering paths.

Re:"DirectX 10 Cards"? (1)

drsmithy (35869) | more than 7 years ago | (#16778295)

Isn't this the tail wagging the dog? Shouldn't the video card industry have hardware API standards and shouldn't the software vendors be releasing stuff compatible with the hardware?

Sure, if you want to go back to the bad old days of games only supporting a small number of very specific video cards.

mod j0p (-1, Flamebait)

Anonymous Coward | more than 7 years ago | (#16776945)

Re: don't bother if you have less than a 24" (1)

Kris_J (10111) | more than 7 years ago | (#16777131)

Or a Matrox Triplehead2Go [matrox.com]. A 24" panel is only a little over 2 million pixels. Three 1280x1024 panels are almost 4 million pixels. And you can get a TH2G plus three 17" or 19" panels for significantly less than a 24" panel.

Is anyone testing these video cards in 3840x1024 yet?

DirectX cards (1)

Sloppy (14984) | more than 7 years ago | (#16777197)

Forget the review; what catches my eye here is the term "DirectX 10 Card." The very idea that it's categorized by limited software compatibility, rather than categorized by the type of hardware slot that it uses, is a new idea to me.

I can see a huge upside to it, though. As a time-saver, I would love for the amount of "closedness" to be how hardware gets categorized, so that I could just shop from the "open and compatible with everything" category instead of having to do research along the lines of "is this usable? Is that?"

Re:DirectX cards (1)

Nicolay77 (258497) | more than 7 years ago | (#16777425)

It runs all the DirectX 7, 8 and 9 games with amazing framerates.
It just happens to be the first to be able to run DirectX 10 games too.

... bindone? (1)

eneville (745111) | more than 7 years ago | (#16777627)

Is this what I need to be able to run Vista?

I want to see the rest of the family (1)

Nicolay77 (258497) | more than 7 years ago | (#16777669)

I am planning to buy a GeForce 7600GT, a card that gives me the framerates and resolutions I want at a very small price compared with the high-end cards, and also because a more expensive card would be bottlenecked by my CPU, so it would be a waste.

However, I now want a card with the same price and wattage requirements as the 7600GT but with the G80 chipset (the one inside these beasts), just because of the better anisotropic filtering and antialiasing.

So... how long until we have the mid-range versions of this card???

Backwards Compatible (1)

gravy.jones (969410) | more than 7 years ago | (#16777795)

The article left the impression that backwards compatibility with DX9 should theoretically be possible. Back in reality, I still use a nice high-end AGP card which lets me play my flight simulator online with others. My gaming experience would not yet benefit from this VGA card. According to the article, and my assumptions, I would get more bang for my buck by investing in a multi-core CPU, since my current VGA card demands more from the CPU.

8800GTS? (0)

Anonymous Coward | more than 7 years ago | (#16777885)

Where the heck are the reviews for the 8800GTS? It's like that card doesn't exist for reviewers... ugh

Re:WOW! This is FAST! (1)

Sj0 (472011) | more than 7 years ago | (#16778041)

Anyone using qbasic, yes.

These are just video games. At some point, you're seriously facing diminishing returns for your $1200USD SLI 8800 rig.

Re:... bindone? (1)

SEMW (967629) | more than 7 years ago | (#16778097)

No, you don't need a DX10 card to run Vista. You need a DirectX 9 card with 128 MB of video RAM for Aero Glass, or any old 2D chip for Aero Standard, Aero Basic, Classic, etc.

Re:WOW! This is FAST! (1)

Sj0 (472011) | more than 7 years ago | (#16778201)

Actually, I reserved a new video card this week. I decided on something with a bit less oomph than this, though. Between my GeForce MXes and my super-high-end video cards, I've found that it's better to buy the cheap video card that'll last you maybe a year or two than to go all-out and get maybe a year or two out of it. (Hey, it's not raw power that necessitates upgrades, it's Pixel Shaders 4.1, right around the corner!)

Re:What about the DRM (DX 10 certs require it) (0)

Anonymous Coward | more than 7 years ago | (#16778249)

I don't know about DX10 as a whole, but MS has already said that they will not support HD-DVD/Blu-ray playback on anything other than 64-bit Vista WITH an HDCP-addled video card. Presumably that also means the various intra-chip encryptions, to prevent someone from tapping into a bus, that the DVD-Audio guys required for full-rez DVD-A playback.

ICT has nothing to do with it; it is all about preventing people from figuring out a way to 'rip' the raw digital stream, either post-decode or, preferably, pre-decode but post-decrypt.

Power consumption (3, Insightful)

MobyDisk (75490) | more than 7 years ago | (#16778411)

Dual power connectors, yeeeha! Video card manufacturers really aren't doing much about idle power consumption. 66 watts at idle just to display a static frame buffer. I can't imagine what will happen running Vista w/ Aero glass. I bet people's power consumption numbers will double.

Power consumption - correction (1)

MobyDisk (75490) | more than 7 years ago | (#16778451)

Holy crud! I misread that: It is 220 WATTS AT IDLE! [bit-tech.net] The idle TEMPERATURE in deg C is the 66.

Re:What about the DRM (DX 10 certs require it) (1)

plasmacutter (901737) | more than 7 years ago | (#16778605)

It's been some months since I last saw the relevant articles (they were on the EFF's Trusted Computing repository and in places like freedom to tinker), but I'll try to bring what stuck in my mind here:

AACS copy protection on the new generation of HD video media has invasively strict requirements, such as encryption of the video path within the system itself to prevent "sniffing" attacks, which means either the hardware itself or the drivers constitute a form of DRM. Any way I look at that encrypted-media-path requirement, I wonder exactly how a set of Linux drivers would not be challenged as a "circumvention device", and at the very least the authentication process within the system for this encryption will impose a toll on video performance.

Display Lists (1)

PaloDeQueso (769669) | more than 7 years ago | (#16778715)

Is this the first DirectX that can use display lists? I was reading the DX10 explanation on Tom's Hardware, and it seems that they are either referring to that or to the new geometry shaders. If they are talking about display lists, OpenGL has had those for quite a while.
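For anyone who hasn't used them, OpenGL display lists just record a batch of GL calls once and replay them later; geometry shaders are a different beast, since they generate new primitives on the GPU rather than replaying recorded commands. A minimal sketch, with made-up geometry:

#include <GL/gl.h>

static GLuint quadList;

// Record the calls once, at load time.
void buildQuadList(void)
{
    quadList = glGenLists(1);
    glNewList(quadList, GL_COMPILE);
    glBegin(GL_QUADS);
    glVertex3f(-1.0f, -1.0f, 0.0f);
    glVertex3f( 1.0f, -1.0f, 0.0f);
    glVertex3f( 1.0f,  1.0f, 0.0f);
    glVertex3f(-1.0f,  1.0f, 0.0f);
    glEnd();
    glEndList();
}

// Replay them each frame; the driver is free to keep the recorded
// commands in a GPU-friendly form.
void drawQuad(void)
{
    glCallList(quadList);
}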

In depth G80's architecture analysis (0)

Anonymous Coward | more than 7 years ago | (#16778861)

For anyone who wants to know more about the G80's architecture (not just the same PR material every website publishes again and again), have a look at this in-depth analysis from Beyond3D.com:
http://www.beyond3d.com/reviews/nvidia/g80-arch/ [beyond3d.com]

Re:WOW! This is FAST! (1)

JackieBrown (987087) | more than 7 years ago | (#16779117)

Wow. I haven't seen Modelines since XFree86 X servers. Not that there is a rule against it. Does that tweak it? Do you notice a difference, or is it out of habit?