
GeForce 8800GTX Benchmarked

Zonk posted more than 7 years ago | from the new-shininess dept.


An anonymous reader writes "The card does not launch for another week, but DailyTech already has benchmarks of the new GeForce 8800GTX on its website. The new card is the flagship GPU replacing the GeForce 7900, and according to the benchmarks it has no problem embarrassing the Radeon X1950 XTX either. According to the article, 'The GeForce 8800GTX used for testing is equipped with 768MB of GDDR3 video memory on a 384-bit memory bus as previously reported. Core and memory clocks are set at 575 MHz and 900 MHz respectively.'"
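For a rough sense of scale, those numbers work out to roughly 86 GB/s of memory bandwidth. A minimal sketch of the arithmetic, assuming the quoted 900 MHz is the base GDDR3 clock and that data moves on both clock edges:

    # Back-of-the-envelope memory bandwidth from the quoted specs.
    # Assumption: 900 MHz base clock, doubled to 1800 MHz effective for GDDR3 (DDR).
    bus_width_bits = 384
    effective_transfers_per_s = 900e6 * 2
    bytes_per_transfer = bus_width_bits / 8           # 48 bytes moved per transfer

    bandwidth_gb_per_s = effective_transfers_per_s * bytes_per_transfer / 1e9
    print(f"{bandwidth_gb_per_s:.1f} GB/s")           # -> 86.4 GB/s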


214 comments


CRAZY (0)

Anonymous Coward | more than 7 years ago | (#16712543)

Looks like the card is the bomb.
It also looks like the card sucks more juice than the complete computer I have right now.

Re:CRAZY (2, Insightful)

scott_evil (266713) | more than 7 years ago | (#16712919)

What impressed me most is the fact that a review site didn't feel the need to lay out 20 pages of crap to give an overall idea of how the card ran. Two thumbs up for that alone...

Re:CRAZY (1)

JonTurner (178845) | more than 7 years ago | (#16712943)

Yes, it is crazy. Pac-Man, Donkey Kong, Space Invaders, Galaxian, Tempest, Defender, and PC favorites like MULE, Tetris, etc. all ran on machines with less than 1/10 of 1% of the processing capability of this card.

It feels like it's all been done, because for the most part, that's true. Adding 37% more shiny crap to the same old game doesn't make it better; it just means it has more shiny crap and you are going to spend a fortune on new hardware just to play the same old game concept.

I wish the industry would let go of this obsession for photorealism (at the expense of gameplay dynamics) for a bit and realize that fun isn't something that can be tacked on to a game by adding more processing power. Some of the great games were great because of the focus on challenging gameplay, not visuals. And even on visually splendid games (HL2 comes to mind), the dynamics of the game and the story are foremost... graphics are secondary.

Re:CRAZY (1)

cheater512 (783349) | more than 7 years ago | (#16713505)

Which is *exactly* why I had a nice multiplayer game of Total Annihilation just 2 days ago.
UT GOTYE also still gets a workout.

Re:CRAZY (1)

rgaginol (950787) | more than 7 years ago | (#16713629)

Yep, I agree with this comment completely. Great games come from great ideas implemented well - not the graphics obsession (up to a base level of course). I've played what feels like so many single-player oriented games lately where the story line and plot falls on its bum due to lack of detail and forethought. Whereas games like Galactic Civilizations 2 just keep me coming back despite the slightly rough edges to the game - it's got a killer AI and the games usually develop nicely. Half Life 1 had okay graphics for its time, but it was the attention to detail which really impressed me (though face huggers jumping at me in dark vents did get a bit old after the thousandth time).

Re:CRAZY (0)

Anonymous Coward | more than 7 years ago | (#16713793)

I just started playing Resident Evil on the PlayStation again yesterday (because I was bored and haven't played the _original_ RE for about 4 or 5 years).
I have to say that the GameCube remake is actually the better game, mainly because of the updated graphics, which add a lot to the atmosphere (there are a few gameplay enhancements too). Nevertheless, I'm enjoying the original again.

Re:CRAZY (1)

Kyokugenryu (817869) | more than 7 years ago | (#16714129)

On the contrary, there are games that take new technology like this and do things that simply were never possible before with old games. You make a blanket statement that new games are all gloss and no substance, which is simply not true. Splinter Cell is one of the most amazing games I have ever played, gameplay-wise, and if it weren't for the tech in the generation it came out in, it wouldn't have been possible. Gears of War is another game I'm looking forward to, as it will probably be another one of those "Wow, the $400 I spent on this 360 was worth it, right now" games like Splinter Cell was when I bought it on my Xbox, and like Soul Calibur was when I bought my Dreamcast, and like Mario 64 was on the N64, and so on. Sure there's a lot of games that are all gloss and no substance, but every generation has that "Wow" game that makes you amazed at how far gaming has come, in terms of both gameplay and glitz.

Re:CRAZY (1)

Andrew Kismet (955764) | more than 7 years ago | (#16714315)

Amen to that. At this point in time, you're right - adding 37% more graphical glitz is just that, glitz. I think right now, if this tech is gonna go anywhere, it's offloading physics processing so the CPU can spend more time doing decent AI. HL2 isn't photorealistic, but it's a damn better game than its nearest neighbour, Doom 3, which used all its graphical prowess to render several thousand miles of poorly-lit corridor.
Right now I'm playing Cave Story and Bontago, both freeware games, while waiting on the Wii. My Intel Core Duo with 2 Gigs of RAM and 512MB NVidia graphics card has been bored while I've been entertained. You claim that graphics are secondary - I say that gameplay is primary, story is secondary, graphics are tertiary.

Re:CRAZY (1)

cskrat (921721) | more than 7 years ago | (#16713485)

Actually, the power consumption numbers they gave were for complete systems. They were just there to give an idea of how it stacked up, power-wise, to the competitor's current flagship.

Re:CRAZY (1)

aztracker1 (702135) | more than 7 years ago | (#16713795)

Honestly, I wonder how well they underclock... I would *love* to have something that does as well as a formerly midrange 7?00 card, but without the heat and power requirements. I like my SFF (now going on 3 years old, and getting ready for a current generation setup), but I can't stand the noise; I want decent gaming and a *QUIET* computer.

FP (-1, Troll)

Anonymous Coward | more than 7 years ago | (#16712553)

Penis!

Re:FP (-1, Troll)

Anonymous Coward | more than 7 years ago | (#16712595)

OMG!! PENIS!!!!!

Holy Cow (2, Funny)

0racle (667029) | more than 7 years ago | (#16712581)

equipped with 768MB of GDDR3 video memory on a 384-bit memory bus as previously reported. Core and memory clocks are set at 575 MHz and 900 MHz respectively
Would you like some computer with your video card? Why even have a system at all? Can I just get a backplane to attach a NIC and power to this card and just run everything from it?

Re:Holy Cow (2, Interesting)

msobkow (48369) | more than 7 years ago | (#16712665)

That video card has 50% more memory than my development database server.

Kinda scary, eh?

Re:Holy Cow (1)

ElephanTS (624421) | more than 7 years ago | (#16712861)

. . . but only 3/4 of what I've got in my cellphone . . .

double scary

Re:Holy Cow (0)

Anonymous Coward | more than 7 years ago | (#16712967)

flash != ddr ram

lol ive gots 2tb off rammm in my commpppp wat u mean its my hard disk

Re:Holy Cow (1)

kettch (40676) | more than 7 years ago | (#16712959)

Imagine what you could do with a beowulf cluster of these.

Re:Holy Cow (1)

NickCatal (865805) | more than 7 years ago | (#16713255)

Play Quake3 at 12093109283091823091820938109283091823 fps?

Re:Holy Cow (2, Funny)

Firehed (942385) | more than 7 years ago | (#16713807)

I know one thing it could do... prove that we have an energy crisis. By singlehandedly causing it.

Re:Holy Cow (2, Funny)

Anonymous Coward | more than 7 years ago | (#16712673)

equipped with 768MB
nVidia is preparing for Windows Vista

Re:Holy Cow (1)

BeeBeard (999187) | more than 7 years ago | (#16712811)

I got a chuckle out of this. Incidentally, I am someone who is "lucky" enough to own a last-generation card new enough to run the latest games at acceptable framerates, and yet somehow too dumpy to run Vista in its fully tricked out form.

Re:Holy Cow (1)

darkheart22 (909279) | more than 7 years ago | (#16712981)

And now buying a computer is like buying pizza. You buy one computer and another one comes with it for free...

Re:Holy Cow (1)

mennucc1 (568756) | more than 7 years ago | (#16714023)

Why don't they make it with a CPU-style socket, so we can unplug the CPU and plug this GPU in its place instead? It would be more fair...

wow (3, Funny)

pppppppman (986720) | more than 7 years ago | (#16712587)

Wow... this thing could run like... two Vistas... maybe

Re:wow (1)

Wavicle (181176) | more than 7 years ago | (#16712601)

Now that's just Crazy Talk.

Re:wow (1)

desenz (687520) | more than 7 years ago | (#16712699)

No way. Some guy on a forum told me it was possible. In fact, the guy after him claimed to have done it on a PentiumII.

Re:wow (1)

tubapro12 (896596) | more than 7 years ago | (#16712997)

The way it is
Wow... like two of these things could run Vista... maybe.

Re:wow (1)

dch24 (904899) | more than 7 years ago | (#16713265)

And if you are missing the Trademark ATI Red, don't worry. It's a feature. Once the thing is powered up, that black? It turns RED.

Re:wow (1)

r_jensen11 (598210) | more than 7 years ago | (#16713655)

Nahh, in order to run two Vistas, you'd need to use all three 8800GTX's

Oh your god! (2, Insightful)

Daath (225404) | more than 7 years ago | (#16712593)

Oh your god! 92% more FPS than ATI's current flagship! Both in HL2 and in Quake 4! "Only" 54% better 3Dmark06 score though. This card is crazy ;P I wish I could afford a truck full of these. Or maybe just one. Hmm and a new CPU... And more RAM... And some huge disks in RAID-5... Damn.

Re:Oh your god! (1)

kkwst2 (992504) | more than 7 years ago | (#16712901)

You're daydreaming... about RAID-5??? What excites you more, its partial redundancy or its marginal performance improvement?

Re:Oh your god! (1)

Belial6 (794905) | more than 7 years ago | (#16712999)

For me it's the partial redundancy. Although the marginal performance improvement is nice, the fact that one of my drives can fail without me loosing data is great. Particularly since if you loose 2 drives at the same time in a totally redundant array, you are just as screwed as if you loose two drives at the raid-5. Being able to loose one drive without data loss and only loosing 20% in a 5 disk array to redundancy is pretty damn cool.
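As a quick aside on the 20% figure above: with single parity, one disk's worth of capacity out of n goes to redundancy, so usable space is (n-1)/n. A small sketch of that arithmetic, assuming equal-size disks in a plain RAID-5 array:

    # RAID-5 capacity overhead: one disk's worth of parity spread across the array.
    def raid5_usable_fraction(num_disks: int) -> float:
        if num_disks < 3:
            raise ValueError("RAID-5 needs at least 3 disks")
        return (num_disks - 1) / num_disks

    for n in (3, 4, 5):
        usable = raid5_usable_fraction(n)
        print(f"{n} disks: {usable:.0%} usable, {1 - usable:.0%} parity overhead")
    # 5 disks -> 80% usable / 20% overhead, matching the parent post's numbers.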

Re:Oh your god! (0)

Anonymous Coward | more than 7 years ago | (#16713579)

You loose an arrow. Your girlfriend's ass is loose. You do not loose data from a drive, you lose it. If you reply to this, at least have the decency to call me a loser, not a looser.

But.... (2)

thealsir (927362) | more than 7 years ago | (#16712599)

But will the 8600 GT be in a good price range? The 8200? This will matter to a lot more people.

More power is never worse, though... unless you are trying to reduce power consumption...

Re:But.... (0)

Anonymous Coward | more than 7 years ago | (#16712643)

Exactly. I'll probably upgrade when Supreme Commander comes out, and if the 8600 is out, affordable, and performs well then they have themselves a sale. This 8800 GTX is purely for me to envy.

Re:But.... (0)

Anonymous Coward | more than 7 years ago | (#16712961)

"This 8800 GTX is purely for me to envy."

What for? 3 years from now people will be saying what a POS. This new [insert latest and greatest] rocks! It's a moving target. You will never catch up. Buy 2nd gen and save. Games aren't that important but if they are, it's time to rethink your priorities. But to each his own.

Re:But.... (1)

Karloskar (980435) | more than 7 years ago | (#16713237)

I'm really looking forward to buying myself a new computer and having previous generation graphics hardware in there. It's in the right dollar bracket for me and will be enough for me to play the games I want to play.

It's nearly all about game-play for me, so I'm happy to turn the graphics down until the framerate is high enough to make the game playable.

I have to turn the graphics down a lot to play the new games on my GeForce 2 MX-200...

My guess (1)

Sycraft-fu (314770) | more than 7 years ago | (#16712667)

Is that both the launch cards will be expensive. nVidia's usual form is to launch high end only with a new major numerical generation (this being the GeForce 8 series). The high end one will be $600 or more, the next one down probably $300-400. You'll have to wait a few months on a more midrange card to come out.

Makes sense too; with a new chip and such, yields are likely to be a bit low at first, so you need to drop it in the expensive stuff. After you've done some work, you release some lower cards.

If you want a midrange card, well, I'd probably be eyeing the 7600GT. They are good performers and not badly priced as it is; once this launch is underway I'd expect their price to drop even further.

of course not (1)

Nasarius (593729) | more than 7 years ago | (#16713077)

But it should push down the prices for the 7000 line, which is nice since I don't pay more than $100 for a video card (7600GS is the best at the moment).

At last (0)

Dr. Eggman (932300) | more than 7 years ago | (#16712613)

The final piece of my plan for world domi...

I mean...

At last, the long awaited G80 series! Only two things prevent my upgrade: Vista's final release reqs and the G80 series. Is it DirectX 10 ready as expected? I can't tell from the article. Bah! Even if it isn't, I'm holding off on Vista until well past its release; I could wait for the GeForce 8850GTX, or 8900GTX, or whatever their naming convention is, as well. Impressive stats to say the least.

Re:At last (0)

Anonymous Coward | more than 7 years ago | (#16712721)

Anyone know if it is DirectX 10?

I hope so :)

Re:At last (1)

nbowman (799612) | more than 7 years ago | (#16713337)

Yeah, it's DX10 compatible. Daily Tech overview of G80 [dailytech.com]

Well? (1)

JimXugle (921609) | more than 7 years ago | (#16712621)

When will we see the 8950GTX?

Re:Well? (2, Funny)

crossmr (957846) | more than 7 years ago | (#16713545)

Monday, they're planning on rolling out the 9800 next Thursday.

Re:Well? (2, Funny)

Chandon Seldon (43083) | more than 7 years ago | (#16713879)

Just as long as it's not the 9800 Pro, that's fine.

Effing Cool, however (1)

Private.Tucker (843252) | more than 7 years ago | (#16712663)

I just bought a new system with current-gen components, and it's already outdated now.

Re:Effing Cool, however (1)

DigiShaman (671371) | more than 7 years ago | (#16712733)

You must be new to the world of PCs :p

Kidding aside, your system was outdated even before you placed an order on those parts. It's one of those bitter pills to swallow when building or purchasing a computer. That is to say...you will NEVER have the fastest machine.

Re:Effing Cool, however (1)

idonthack (883680) | more than 7 years ago | (#16712851)

I'll take it off your hands, for free! :)

don't feel so bad (1)

snuf23 (182335) | more than 7 years ago | (#16714031)

My computer is 2 months old with a 7900GT. I was going to get a second one for SLI; now I wonder if I should put that money towards an 8800GTS when they come out.
Or be sane and wait until there is a game I want to play that actually stresses out my computer.... but that... would be... exercising... so much restraint...

that thing looks like it could give blowjobs (0)

Anonymous Coward | more than 7 years ago | (#16712671)

hope it's worth the cash...

Wow... (1)

Mikachu (972457) | more than 7 years ago | (#16712679)

That looks... expensive.

Re:Wow... (1)

DJ Rubbie (621940) | more than 7 years ago | (#16712887)

... in terms of power consumption.

Of course the real question is (4, Insightful)

Sycraft-fu (314770) | more than 7 years ago | (#16712681)

Does it do DirectX 10? If so, how well? I mean, the target market here is the high-end gamer, thus the interest is going to be in having something that supports the latest and greatest. The game development community seems to be going bonkers over DX10, so it's something to consider before you get a card.

I'm planning on getting a high-end graphics card here soon but I'm going to hold off until Vista is out and running for a bit to evaluate and make sure I get one with good DX10 support. No sense in spending money on a new generation of hardware if it doesn't support the new generation of software fully.

Re:Of course the real question is (1)

Rufus211 (221883) | more than 7 years ago | (#16713821)

Does it do DirectX 10? If so, how well?

Umm, of course. The point of G80 and R600 (ATI's next) is that they're the DX10 generation chips. However, how well it does DX10 is somewhat of a pointless question. As you point out, Vista won't be out for "a few months", and no games using DX10 will be out until a bit after that. By the time that DX10 performance actually matters, an incremental spin of the 8800 (psychic, I'm guessing it'll be called the 8850) will be out.

Re:Of course the real question is (3, Informative)

AbRASiON (589899) | more than 7 years ago | (#16713955)

I think you're missing his point.

Damn good point it is too, I forgot that entirely.

Sure, the card might be good at DX9 (this is obvious), but how good is it at DX10?
The ATI offering may be substantially faster, or this thing may only do the basics of DX10 but be unable to do certain DX10 functions in a single pass, where the competition can.

Who knows? I can say that in the past, of the two companies' offerings, one has sometimes been designed slightly differently, which has led to performance hits in certain modes (IIRC ATI's competitor to the GF3 was fairly ho-hum, but don't quote me on that).

So to summarise: it might be a nice DX9 card, but until we see what DX10 demands and what both DX10 cards can do, we can only be sure of its current-gen performance, not next-gen.

DirectX 10 and Vista (1)

GoMMiX (748510) | more than 7 years ago | (#16712685)

I'm no hardware techie, but I do so enjoy playing a good game --- "when I have time" (yeah...).

Every time Microsoft releases a new version of DirectX, it has some sweet new feature that everyone wants, but none of the current cards on the market support it.

Microsoft has also said DirectX 10 and Vista will not be backward compatible with previous versions of DirectX. (Or has this changed? As I recall, Vista wasn't going to support applications built for previous OSes either - seems they changed their tune on that one. Then again, they've really yanked everything from the OS that was originally going to set it apart as a truly new OS, but I digress...)

So, basically, what I'm getting at is: why? Why would I want this (obviously hawt) card, when chances are in 4-6 months (if they don't kick back the release date again, har) DirectX 10 will be out and have some new fancy feature this card won't support?

Of course, I could be missing something and maybe the card does support DX10 - feel free to tell me I'm a toad for even asking.

Re:DirectX 10 and Vista (1)

beavis88 (25983) | more than 7 years ago | (#16712715)

I could be missing something and maybe the card does support DX10

It does indeed support DX10. As the first ever DX10 card, however, it probably will be put to shame by something else in 4-6 months regardless ;)

Re:DirectX 10 and Vista (1)

Shados (741919) | more than 7 years ago | (#16712763)

No way! Remember the FX series of cards? They had -amazingly- DirectX 9 and Pixel Shader 2.0 support even though it was new!

::grumbles at his FX 5900 Ultra that can't play most DX9 games at an acceptable frame rate...::

Re:DirectX 10 and Vista (2, Interesting)

westlake (615356) | more than 7 years ago | (#16712871)

Microsoft has also said DirectX 10 and Vista will not be backward compatible with previous versions of DirectX.

"Windows Vista continues to support the same Direct3D and DirectDraw interfaces as Windows XP, back to version 3 of DirectX (with the exception of Direct3D's Retained Mode, which has been removed). Just as with Windows XP Professional x64 Edition, 64-bit native applications on Windows Vista are limited to Direct3D9, DirectDraw7, or newer interfaces. High-performance applications should make use of Direct3D 9 or later to ensure that they have the closest match to the hardware capabilities." Graphics APIs in Windows Vista [microsoft.com]

Re:DirectX 10 and Vista (1)

GoMMiX (748510) | more than 7 years ago | (#16712971)

Yes, I misspoke, or mis-wrote... whatever-have-you.

What I meant was that DX10 wouldn't be backward compatible. I have read Vista will be backwards compatible but read it was some sort of software emulation.

What I was getting at was that according to the articles I have read DX10 will simply not work on a card not designed for it - and DX10 itself was not going to be backward compatible. Basically, if you don't have a card built for it you simply can't use it at all.

When DX9 came out, my 6800GT didn't support all of its features but I could still use DX9 - when DX10 comes out, as I understand it, my (now very old, I know) 6800GT simply will not be compatible at all with DX10. And thus not really be able to take advantage of the pretty things in Vista.

To me this is all great, as a gamer, because I really don't use Windows for anything but playing games anyway. It just seems like Vista and DX10 are really catering more to the enthusiast/hobby crowd than to their core customer base. And my reference noted/quoted above (which I worded incorrectly, and apologize for) was to focus on whether or not the card would even be usable 'at all' with DX10 - not just missing a few features, but rather not working at all.

I guess that question is answered above as this is evidently the first DX10 compatible card.

Cheers!

WHEN Will DailyTech and Anandtech Get BUSTED???? (-1, Troll)

Anonymous Coward | more than 7 years ago | (#16712687)

Working within this industry, I'm keenly aware of the ins and outs of NDA embargoes and the legal ramifications of leaking stories like this into the wild before it is officially sanctioned. DailyTech continues to hide behind the guise that they are autonomous from Anandtech, but that is complete and utter bullshit. They got this card in from a Taiwanese manufacturer that also works with Anandtech, and that company (AT) is legally bound to an embargo. That embargo lifts a few days from now, yet DailyTech is free to leak because they are not specifically bound to an NDA.

The real story is that SOMEDAY some major player is going to get pissed off enough about a leak that they're going to sue DailyTech and Anandtech both for all they're worth, and then maybe, if we're lucky, they'll be shut down for the lying cheats that they are... Yes, many of us have an axe to grind on this topic, but plain and simple, it's unethical and unprofessional what they are doing.

Moderate this up if you have the moral fiber and intestinal fortitude to show something worthwhile about this pathetic breach of confidentiality.

Thanks

Re:WHEN Will DailyTech and Anandtech Get BUSTED??? (1)

Sterling Christensen (694675) | more than 7 years ago | (#16712767)

I agree that people should respect laws and contracts better than that, and I see how not doing so hurts all the honest ones, but what are those embargoes good for, anyway?

Is it just nVidia's PR department trying to coordinate press coverage for maximum effect? What's in it for the reporters? What does it accomplish?

Re:WHEN Will DailyTech and Anandtech Get BUSTED??? (1)

BoberFett (127537) | more than 7 years ago | (#16712793)

Want a tissue?

Re:WHEN Will DailyTech and Anandtech Get BUSTED??? (0)

Anonymous Coward | more than 7 years ago | (#16712893)

I'll pass, thanks, but hey, I was just posing a question. I couldn't give a rat's ass if they leak this stuff early. If they don't, some puke from across the world will. I'm just pointing out the fact that seemingly professional sites like AT and DT are walking the line and risking getting their asses handed to them by a bloodthirsty lawyer.

Pass the popcorn, I'd love to watch that in action...

Nvidia invades (0, Redundant)

Boorad420 (975814) | more than 7 years ago | (#16712697)

I, for one, welcome our new Nvidia overlords.

Direct X 10? (1)

cojsl (694820) | more than 7 years ago | (#16712705)

The article doesn't say it, but it appears from a quick web search that this is the 1st of the cards that will support DirectX 10.

Re:Direct X 10? (2, Interesting)

William_Lee (834197) | more than 7 years ago | (#16712727)

Here are the full specs on the card... As mentioned, it offers DirectX 10 support and is also HDCP compliant, for those who care.

http://www.dailytech.com/article.aspx?newsid=4441

AMD ATI vs Nvidia (2, Insightful)

Black-Six (989784) | more than 7 years ago | (#16712719)

Now to get things straight, I'm not bashing Nvidia here or criticizing AMD ATI as I own products from both and am very impressed.

Ok, on to the meat of the topic. I read about this card on Tom's Hardware about a month ago and was very impressed. The specs Nvidia gave Tom's for the 8800GTX were 768mb of GDDR4 memory, 128 pixel pipelines, dual 384 bit memory busses (768 bit total), 4 RAMDAC cores at 450mhz and 2 G80 cores at 550 mhz with the memory at 1000mhz (2000mhz for DDR). The card probably won't have an aftermarket cooling solution for some time as the user can only apply one HSF to one G80 core. Also I understand the G80 is a 75nm chip instead of a 90nm chip. This provides reduced power consumption.

Now what I'd like to see happen is AMD get on board with ATI and do their magic on the operations-per-clock view of a VGA and help ATI churn out some killer VGAs that are smaller and cheaper yet rival monsters like the 8800 GTX.

Overall, both companies are kings in their own rights, for now anyway. AMD holds ground in the CPU market like none other and Nvidia churns out next gen products at better prices and performance. Who knows what these guys have in store for us, but one thing is certain: only user demands and time will tell us what the next gen VGA will be.

Re:AMD ATI vs Nvidia (2, Interesting)

tonyray (215820) | more than 7 years ago | (#16713029)

Who knows what these guys have in store for us

From what I've been reading, come late 2008, AMD will have one or more GPUs built into their multi-core processors using a new modular technology which allows them to quickly create application-targeted processors. One processor for games, another for database servers, still another for scientific applications requiring parallel processing, and so on. This is AMD's much-reported "Fusion" technology.

Re:AMD ATI vs Nvidia (1)

Black-Six (989784) | more than 7 years ago | (#16713195)

Yep, that's what I've read also. I was referring to the possibility of AMD working their magic on the operations completed per clock cycle on a stand-alone VGA, such as the upcoming new ATI card that's to kill the 8800gtx. Imagine AMD using a Hammer-style chip as a VGA core; whooo, that's fast.

Re:AMD ATI vs Nvidia (1)

m-wielgo (858054) | more than 7 years ago | (#16713637)

Overall, both companies are kings in their own rights, for now anyway. AMD holds ground in the CPU market like none other and Nvidia churns out next gen products at better prices and performance.

Maybe I'm just old school, but last I heard, Intel was king of the hill with their latest Core 2 Duo and Xeon processors...

Re:AMD ATI vs Nvidia (0)

Anonymous Coward | more than 7 years ago | (#16713673)

Insightful? On what planet?

Re:AMD ATI vs Nvidia (1)

Rufus211 (221883) | more than 7 years ago | (#16713803)

I'm sorry, but does your post have a point? You ramble between Nvidia, ATI, and AMD randomly.

Also check your basic facts. It's not dual core. What on earth is a dual 384-bit bus? 75nm production doesn't exist except for one DRAM (90, 80, 60, and 45 are the current and future logic steps).

Only a few more release cycles... (1)

FSWKU (551325) | more than 7 years ago | (#16712765)

...until they reach the 5 digit numbers. My guess is BFG is already drooling over what's just over the horizon.

BFG 10K, anyone?

Re:Only a few more release cycles... (0)

Anonymous Coward | more than 7 years ago | (#16713715)

Unfortunately, the BFG 9000 won't quite be as powerful as the BFG 6800.

Idiotic comments about power consumption (0)

Anonymous Coward | more than 7 years ago | (#16712791)

They conclude that the nVidia card draws 4% more power than the ATI card because the power consumption under load of the two entire systems differs by 4%. When this issue is raised in the comments, they defend their math:

How do you figure that the difference btw the two cards themselves is more than 4%? We have two identical systems here. The only thing that changes is the video card. The only way what you say could be valid is if the 8800 isn't pumping out as much juice as possible. Other than that, because the difference btw these two systems is 4%, we know that the difference btw the only variable in those systems (the two cards) is also 4%.

Words fail me.

By the way, there's a 45 watt difference between the idle power of the ATI system and the nVidia system. That should be a huge deal, but these gamers don't seem to care.
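To put numbers on that objection: the wattages below are purely hypothetical assumptions (the article only reports whole-system figures), but they show why a 4% system-level gap can hide a much larger card-level gap:

    # Illustration only: all wattages are made-up assumptions, not measurements.
    base_system_w = 200.0                      # everything except the video card
    ati_card_w = 100.0                         # hypothetical card-only draw

    ati_total_w = base_system_w + ati_card_w   # 300 W at the wall
    nvidia_total_w = ati_total_w * 1.04        # "4% more" whole-system draw -> 312 W
    nvidia_card_w = nvidia_total_w - base_system_w

    card_delta_pct = (nvidia_card_w / ati_card_w - 1) * 100
    print(f"system delta: 4%, card-only delta: {card_delta_pct:.0f}%")   # ~12%, not 4%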

Oblig. Star Wars comment (1)

i.of.the.storm (907783) | more than 7 years ago | (#16712823)

That's no moon... Seriously, it looks like a (bad) replica of a star destroyer or something.

[rant] One key rating was not evaluated.... (1)

Admin_Jason (1004461) | more than 7 years ago | (#16712835)

and that would be (drumroll).........cost! Okay, so this is a what percent improvement over the previous generation? Who (besides high end gamers and developers) will be able to even notice a difference? Finally, what possible motivation would they have for purchasing a card that is likely going to be more expensive than their entire current computer (monitor inclusive if you have a CRT)...?

[/rant]

Re:[rant] One key rating was not evaluated.... (1)

Shados (741919) | more than 7 years ago | (#16712847)

Who (besides high end gamers and developers)
If they even WANTED non-high-end gamers and developers to buy this, the lower-end models wouldn't exist. Not only is this thing pointless, it's probably even sold at a loss (or no profit), and is just there to keep a brand name going :)

The results are a bit skewed for nVidia (1)

iOsiris (944032) | more than 7 years ago | (#16712837)

If you compare these results with the 8800GTX/1900XTX benchmarks from other reviewers, it seems the 8800GTX scores a bit higher than average and the 1900XTX a bit lower. Still, this card is a beast.

So what about the r600? (1)

Sir_Sri (199544) | more than 7 years ago | (#16712933)

So then the question is: how does this compare to AMD/ATI's R600, which is due out in some sort of final form somewhere between later this month and early January?

Comparing the 8800 to an x1950 is like comparing a 7800 to an x850 (granted, this demonstrates it will, at least for a brief period, be the fastest card on the market in DX9.0c, as well as the only DX10 card out there). But ATI have had their next gen card in the pipe for a while, so presumably we'll see it fairly soon, and it's likely significantly faster than the x1950 series (I've heard estimates from 2-4 times as fast, including an estimate in that range from a former ATI employee, but I have no idea how likely that is to be accurate). How that would compare to the 8800 I'm not sure, but I bet they'll be fairly stiff competition.

Fortunately this can kickstart some life back into the high end computer business, which at the moment has been, from what I've seen, largely dead waiting on the release of SM4.0 hardware. Sure, CPUs are nice and all, but why would anyone have gone out and bought the fastest card on the market in the last 3 or 4 months knowing full well that there is a whole new *featureset* in the immediate future? As opposed to the constant faster-version-of-the-same-thing, which is unavoidable, radically new features (such as the geometry shader, and the whole new driver model with Vista) only come around every couple of years.

It could almost run my 2D game! (1)

MaXiMiUS (923393) | more than 7 years ago | (#16712941)

Now isn't THAT overkill?

Hint: Look at the markup, and no, this isn't a nVidia bash ;)

Well then. (1)

Overfiend1976 (979710) | more than 7 years ago | (#16712957)

Looks like a gorgeous card with blazing speeds for all you gamers out there. But for those like me who run Maya, I'll be content sticking with my ATI FireGL v7350. Now yes, I, like you all, have seen those reviews out there that say the GeForces can run Maya faster than ATI, but that's bullshit. I've done rendering with both setups on identical systems and pound for pound the ATI annihilates the Nvidia.

Re:Well then. (0)

Anonymous Coward | more than 7 years ago | (#16713915)

Hint: Rendering is done by the CPU.

Somewhat confused (3, Funny)

ET_Fleshy (829048) | more than 7 years ago | (#16712979)

Where are the remaining 27 pages of the article?

And where are the ads?

Did I time travel 4 years in the past? What year is it!

It will be clear soon (1)

zurmikopa (460568) | more than 7 years ago | (#16713873)

This is just the trailer for the actual article. Coming soon to a website near you!

ATI No Hope to Survive? (0)

Anonymous Coward | more than 7 years ago | (#16713021)

I guess this means ATI is sunk. Time to open source those video drivers! I'd go ATI/AMD all the way if they open sourced the fglrx drivers even though they kind of suck. Yep, that's the kind of desperate counter measure that's called for here.

At the very least start supplying specifications again so we can write a nice, clean open source driver ourselves. Seriously, I'd like my next computer to use something better than a Radeon 9250.

Any chance of getting NVIDIA to opensource theirs? (1)

Deimos Foxtrot (1004229) | more than 7 years ago | (#16713575)

Man, if I can just find some talking points, I'm about ready to start writing my own letters. I really do hate to think that the abilities of one of the most decent cards in recent history might be imprisoned within closed code. That would suck so hard it'd catch tachyons.

Oh my... (1)

kevlarcowboy (996973) | more than 7 years ago | (#16713217)

Does that thing have TWO PCI-E power jacks?

Re:Oh my... (0)

Anonymous Coward | more than 7 years ago | (#16713393)

No. However it does have two 12V molex power connectors ;)

Re:Oh my... (1)

FuturePastNow (836765) | more than 7 years ago | (#16713413)

If you're not going to overclock the card, it should run fine with both power inputs connected to a splitter. For extra stability while overclocking, the thing needs to be connected to two of the power supply's 12V rails. The 6800 Ultra had two power connectors for the same reason.
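For rough context on why the extra plug matters, here is the nominal power budget, assuming the two inputs are standard 6-pin PCI Express plugs rated at 75 W each plus 75 W from the x16 slot (the card's actual draw isn't stated in the article):

    # Nominal power budget for a card with two auxiliary plugs (spec limits, not measurements).
    slot_w = 75          # PCI Express x16 slot
    plug_w = 75          # per 6-pin auxiliary connector (assumption about this card's plugs)

    print(f"slot + one plug:  {slot_w + plug_w} W")        # 150 W
    print(f"slot + two plugs: {slot_w + 2 * plug_w} W")    # 225 W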

Is That A Skateboard? (1)

chromozone (847904) | more than 7 years ago | (#16713423)

It's a bit power hungry for my PSU - but I think with some jumper cables I can hook it up to the service panel in the basement. It will add some clutter down there, but at least when winter comes I can put my wet shoes next to the computer instead of the furnace.

Will there be a VESA version? (1)

glrotate (300695) | more than 7 years ago | (#16713447)

Thinking about upgrading my DX2-100.

The geforce 8800 GTX rocks (0)

Anonymous Coward | more than 7 years ago | (#16713647)

I wish the review site had some charts to help show the difference in FPS for those games. I made a quick chart [instacalc.com] in case you want to see.

If one is good... (1)

(Robo_Bro) (1009507) | more than 7 years ago | (#16713857)

The test system configuration is as follows: ...NVIDIA nForce 650i SLI based motherboard
...you know what that means: dual GeForce 8800GTX over 16x SLI. }:D

Support? (1)

n1hilist (997601) | more than 7 years ago | (#16714043)

Why do I get the feeling we're going to one day need to use those plastic support 'hooks' (I can't remember what they're called) to secure our cards again?

Reminds me of my first vesa local bus card.

style (1)

mennucc1 (568756) | more than 7 years ago | (#16714065)

Too bad the cooler looks like my aunt's hair dryer.

Caveat emptor: THIS CARD IS EVIL (1)

GapingHeadwound (985265) | more than 7 years ago | (#16714239)

This card is a poster child of Gluttony, Greed, Envy & Lust. That pretty much constitutes a majority of deadly sins, there. I'd think twice about what exactly it is that you're doing in trying to find out more information about this card.

You definitely don't want to be anywhere near one of these things if the apocalypse starts.

On a personal note, this card outperforms my development server in clock speed, memory and power consumption.

Now I'm not saying I'm humble or meek or anything but seriously... G80 IS THE NUMBER OF THE BEAST!!!!11!!1

Underclocking (1)

Kawahee (901497) | more than 7 years ago | (#16714343)

Even if you underclocked this thing you'd have enough performance to load two copies of Half Life 2 at once and still have enough memory left over to play Solitaire!