
Positive Reviews For Nvidia's GeForce 6800 Ultra

timothy posted more than 10 years ago | from the my-hardware-cringes-in-shame dept.

Graphics 564

Sander Sassen writes "Following months of heated discussion and rumors about the performance of Nvidia's new NV4x architecture, today their new graphics cards based on this architecture got an official introduction. Hardware Analysis posted their first look at the new GeForce 6800 Ultra, taking it for a spin with all of the latest DirectX 9.0 game titles. The results speak for themselves: the GeForce 6800 Ultra is the new king of the hill, beating ATI's fastest by over 100% in almost every benchmark." Reader egarland adds "Revews are up on Firing Squad, Toms Hardware, Anandtech and Hot Hardware." Update: 04/14 16:54 GMT by T: Neophytus writes "HardOCP have their real life gameplay review available."


564 comments


Wonders Never Cease... (5, Funny)

ackthpt (218170) | more than 10 years ago | (#8860647)

It's good to know I can look forward to reading text in a more scintillating black and white, while Flash ads and pop-ups will be more vibrant than ever.

In a word, "Wow."

I mean, who'd have thunk it that the 6800 would still have life? Maybe ATI can counter with a Radeon All-In-Wonder Xtravaganza 6502!

Beats ATI by 100%... (1)

bonch (38532) | more than 10 years ago | (#8860719)

...that is, until ATI releases their next card too.

I wouldn't expect a new card NOT to beat out the current cards. ATI and Nvidia have played this catchup game with each other for years.

Re:Wonders Never Cease... (0)

Anonymous Coward | more than 10 years ago | (#8860805)

6502, ahh what a number :) now if that doesn't bring back fond memories ... i don't know what does :)

fp (-1)

SlashdotMakesMeKool (610077) | more than 10 years ago | (#8860649)

YAWEJAEWRFOEWJAAAAAAAAAAH Can you hear the children cheering?

posiotive reivews for a forsty piss (-1)

Anonymous Coward | more than 10 years ago | (#8860656)

assclowns.

Nvidia Bird (-1, Troll)

Anonymous Coward | more than 10 years ago | (#8860664)


(o>
( )
8=="==D

Re:Nvidia Bird (-1, Offtopic)

Anonymous Coward | more than 10 years ago | (#8860770)

You didn't have the anti-aliasing turned on, did you?

nvidia's back (3, Insightful)

rwiedower (572254) | more than 10 years ago | (#8860671)

These are the guys that managed to crush every single other player into the ground... the fact that nVidia was knocked backwards by ATI was a huge deal, but they didn't become the champ by being slow on their feet. Over the next few months, the continuing battle should be good news for all of us consumers.

Did anyone else notice the size of the die rivals even that of the Pentium 4 EE? This thing is frickin' huge!

Re:nvidia's back (4, Interesting)

ackthpt (218170) | more than 10 years ago | (#8860711)

These are the guys that managed to crush every single other player into the ground..

Is it considered "safe" to buy any of the Nvidia chipset motherboards, or are they still pretty sketchy?

Re:nvidia's back (3, Informative)

scumbucket (680352) | more than 10 years ago | (#8860797)

I've had an MSI K7N2-L motherboard which has the Nforce2 chipset for over a year now. It's rock solid with no problems.

Re:nvidia's back (3, Informative)

Jeff DeMaagd (2015) | more than 10 years ago | (#8860836)

Two people have had some issues with the nVidia IDE drivers, at least one person fixed it by using a generic IDE driver.

Re:nvidia's back (1)

Lord Kano (13027) | more than 10 years ago | (#8860867)

Is it considered "safe" to buy any of the Nvidia chipset motherboards, or are they still pretty sketchy?

I have an ECS [ecs.com.tw] N2U400-A motherboard with an NVidia N Force 2 Ultra chipset. It's fantastic. Rock solid stable and fast.

Don't take my word for it, google up some reviews of motherboards with the chipset. It's good stuff.

LK

Re:nvidia's back (1)

benzapp (464105) | more than 10 years ago | (#8860912)

I have the original nForce motherboard with an Athlon XP 1800+, and it has been rock solid since the day I got it.

Oh, I also have an Audigy Platinum and an All-In-Wonder 9700 Pro. No problems whatsoever.

Re:nvidia's back (2, Insightful)

YanceyAI (192279) | more than 10 years ago | (#8860720)

And this is why healthy competition is GOOD for consumers (*nudges Bill Gates*).

Re:nvidia's back (4, Insightful)

BiggerIsBetter (682164) | more than 10 years ago | (#8860724)

To hell with the die size, check out the power requirements. There's two, TWO! power connectors for that thing. Damn, they've created a monster. I wonder how fast it can run GPGPU [gpgpu.org] apps...

Re:nvidia's back (5, Funny)

ackthpt (218170) | more than 10 years ago | (#8860902)

To hell with the die size, check out the power requirements. There's two, TWO! power connectors for that thing. Damn, they've created a monster.

Nvidia GeForce 6800 Ultra: $600

800 Watt Powersupply: $250

MMORPG: $10/mo.

The look on your face when you get your next powerbill: Priceless

There are some things in life your measly paycheck can cover; for everything else there's Massivecharge.
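A back-of-the-envelope sketch of that punchline in Python - the extra draw, gaming hours, and electricity price are all assumed figures for illustration, not from the reviews:

# Rough monthly cost of feeding this card. All figures assumed:
# ~110 W extra draw under load, 4 hours of gaming a day, $0.10/kWh.
extra_watts = 110.0
hours_per_day = 4
price_per_kwh = 0.10

kwh_per_month = extra_watts / 1000 * hours_per_day * 30   # 13.2 kWh
print(f"${kwh_per_month * price_per_kwh:.2f}/month")      # ~$1.32
# Not quite "Massivecharge" on its own - but add an 800 W PSU's
# inefficiency and the rest of the box, and the bill does creep up.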

Re:nvidia's back (1)

Egekrusher2K (610429) | more than 10 years ago | (#8860731)

Actually, it's over twice as big.

Re:nvidia's back (0)

Anonymous Coward | more than 10 years ago | (#8860822)

Yeah, maybe this time they won't get caught cheating.

Re:nvidia's back (3, Interesting)

LqqkOut (767022) | more than 10 years ago | (#8860824)

The damn thing still won't fit into a Shuttle case... It'd be nice if they said something about noise. [H] [hardocp.com] is /.'ed too, I wonder what they have to say.

I've been a hardcore nVidia follower for years, but after last year I was left with a bad taste in my mouth. I'm glad to see another generation of video cards, and I can't wait to see what ATI's got to offer - it's been a while since nVidia has had to play catch-up.

Yea! More horsepower for Doom ]|[ (only 2 more months!)

ATI may be right there with them (3, Interesting)

egarland (120202) | more than 10 years ago | (#8860890)

ATI is supposed to announce the R420 soon. They've had some time to redesign too. I switched to ATI in the last round of upgrades and was very happy. I'll need a good reason to switch back. So far the 6800 looks like one, but ATI could take it away with a decent new product.

latest vs last-year (5, Informative)

bwindle2 (519558) | more than 10 years ago | (#8860677)

They are comparing the latest nVidia GPU to the 9800XT, which is several months old. When ATI's next-gen chip comes out (two weeks?), only then will we be able to see who holds the GPU Speed crown.

Re:latest vs last-year (3, Interesting)

Seoulstriker (748895) | more than 10 years ago | (#8860732)

They are comparing the latest nVidia GPU to the 9800XT, which is several months old. When ATI's next-gen chip comes out (two weeks?), only then will we be able to see who holds the GPU Speed crown.

I don't think so. The first ATi card to be released will be a 12x1 pipe version, while the first nVidia card will be a 16x1 pipe version. ATi seriously underestimated what nVidia was planning and moved production of their own 16x1 pipe version up by 5 months. ATi was scared s***less, and for good reason, as we found out today.

Fanboyism (4, Insightful)

bonch (38532) | more than 10 years ago | (#8860787)

I think the submitter must be something of an Nvidia fan. :) Most people wouldn't ridiculously compare a new next-gen card to today's months-old cards, without even mentioning that ATI has a new one due out in weeks. But he sure did mention an over-100% speed increase over those old cards, didn't he?

Personally, I don't get the fanboy rivalries--I have a Radeon in my laptop and a GeForce in my desktop, and that's just what I happened to buy at the time; no fanboy allegiance going on.

Re:Fanboyism (3, Interesting)

Nogami_Saeko (466595) | more than 10 years ago | (#8860866)

Well said! The amount of "epenis" bickering that surrounds videocards is legendary, but the fact of the matter for me is that I buy what's fastest with best quality at any given time (assuming relatively stable drivers of course). Of course, price does figure into it as well. I'm not going to pay a huge premium for a card unless it's significantly better than the competition. A few extra percent on a benchmark simply won't open my wallet more.

Had a NVidia GEForce2 when it was at the top of the pile a few years ago, picked up an ATI 9700Pro when it was released. May go back to Nvidia, may stay with ATI (shrug).

In the long run, all of us consumers benefit from some healthy competition. Granted, as a Canuck, I'm happy to see ATI do well - but they also earned it. At the time the 9700Pro was released, ATI blew Nvidia out of the water. Nvidia had grown a tad complacent, and they paid for it.

Now we'll see what happens with Nvidia having a fast new card and ATI about to release their new offering in a few more weeks.

N.

Re:Fanboyism (1, Troll)

Kenja (541830) | more than 10 years ago | (#8860930)

You're right, you should always compare the current nVidia chip to the theoretical, nonexistent ATI chip that your brother's friend's cousin heard about. Only then can you have an unbiased comparison.

As a matter of fact, here are some specs on X800 (2, Informative)

bonch (38532) | more than 10 years ago | (#8860843)

...so it's even sillier that the submitter would say that. But, hey, it's healthy fanboyism I guess.

Here's what the Register says [theregister.co.uk] :

ATI will ship its much-anticipated R420 chip later this month as the Radeon X800 Pro. The part's 26 April debut will be followed a month later by the Radeon X800 XT on 31 May.

So claims Anandtech, citing unnamed vendor sources and a glance at ATI's roadmap.

If the date is accurate, it puts ATI just 13 days behind Nvidia's NV40 launch on 13 April. NV40 will surface as the GeForce 6800 and is likely to form the basis for other series 6000 GeForce parts. Note the lack of the 'FX' branding - Nvidia has dropped it, Anandtech claims.

The X800 Pro will ship with 256MB of GDDR 3 graphics RAM across a 256-bit memory bus, but a revised version with 512MB of memory is expected later this year. The report also forecasts the arrival of an X800 SE, which supports 128MB of vanilla DDR SDRAM.

The R420 is an AGP 8x part - the native PCI Express version, the R423, will launch on 14 June, the report claims. It too will be offered as the Radeon X800. Both versions are expected to clock at around 500MHz with 1GHz memory clock frequencies. They feature eight-stage pipelines with six vertex shaders.

Expect to see Radeon X600 and X300 products in due course, we're told, as the RV380 and RV370 parts come on stream. These represent ATI's first 110nm parts.

Meanwhile, ATI's Radeon 9100 IGP is due for an update, apparently, in a few months' time. The revision, codenamed 'RS350', will support Intel's LGA775 CPU interface.

Further down the line, late in Q3, ATI will offer three new Pentium 4 chipsets, currently dubbed the RS400, RC400 and RU400. The first provides PCI Express graphics and non-graphics add-in card buses, along with a dual-channel memory controller. The other two will offer single-channel memory support, while the latter will not support external graphics cards.

AMD isn't being left out, courtesy of the RS480 and RX480 chipsets, the first with integrated graphics, the second without. ®


Here's a little more info from Rage3d [rage3d.com] :

Only weeks before the release, ATI Technologies decided to boost performance of its next-generation code-named R420 processor by increasing the number of pixel pipelines inside the chip. Industry source told X-bit labs that the story is not about redesign, but about enabling "big guns" that were "hidden" inside the chip from the very beginning.

ATI Technologies' chip known as R420 will be called RADEON X800 PRO and is likely to be launched on the 26th of April, 2004. Higher-speed flavour of the R420 - the RADEON X800 XT - is expected to debut on the 31st of May, 2004, if the assumptions posted earlier this week are correct. PCI Express x16 solution powered by the R423 architecture will see the light of the day on the 14th of June. ATI on Tuesday began marketing campaign on its web-site to support the launch of the new graphics architecture.

BitBoys (0)

Anonymous Coward | more than 10 years ago | (#8860921)

Yeah, but when the BitBoys finish their Glaze3D vidcard, it'll blow away both nVidia and ATI in DNF.

N0w I w1ll R00l j00 @t C0un73Rs7r1k3! (-1, Offtopic)

Anonymous Coward | more than 10 years ago | (#8860684)

My 1337 w@llh@x0rz w1ll Ru|\| @7 60 m1ll10n FPS!

Wait.. (5, Funny)

bdigit (132070) | more than 10 years ago | (#8860685)

It seems they forgot to take the card out of its case. Wait, no, that's just the huge fan/heatsink combo.

Insensitive... (2, Funny)

guile*fr (515485) | more than 10 years ago | (#8860687)

I have a mini-pc you insensitive clod!

Re:Insensitive... (1)

ackthpt (218170) | more than 10 years ago | (#8860803)

I have a mini-pc you insensitive clod!

Maybe there's a daughterboard connector you can hang your mini-pc off of.

Power Requirements (5, Insightful)

Lord_Pall (136066) | more than 10 years ago | (#8860688)

Okay so it's fast.. no question.. Amazing feature set as well..

but it requires a 480 watt power supply

and 2 power connections... And it also has what looks to be a vacuum cleaner tied to it..

I currently use a Shuttle skn41g2 for my main box.. I love the SFF PCs. This won't work in that.. It would make the included power supply very sad.

My HTPC box uses an antec sonata with a fanless radeon 9000, and ultra quiet everything else.. Forget using this in a quiet pc as well

I don't care for nvidia's trend towards hideously loud, bulky, power-hungry video cards.. They might perform well, but for normal use, I'd prefer something smaller and quieter.. and for god's sake, give me an external power supply.. heh

Re:Power Requirements (3, Insightful)

happyfrogcow (708359) | more than 10 years ago | (#8860760)

The sound of the fans should be drowned out by booming speakers you should have to go with your gaming system. games and gamers aren't quite, who cares about fan noise when your kicking someones ass?

Now power consumption... that can be an issue.

Re:Power Requirements (0)

Anonymous Coward | more than 10 years ago | (#8860795)

that's 'quiet' not 'quite'

pehraps you're spleeling has room four improovement?

Re:Power Requirements (2, Funny)

ackthpt (218170) | more than 10 years ago | (#8860766)

but it requires a 480 watt power supply

and 2 power connections... And it also has what looks to be a vacuum cleaner tied to it..

The approaching question is: which is the principal component in your box, the motherboard or the video card? My present video card has more memory and sucks more power than my laptop, and like yours it has a fan, though it's quiet.

FNNNZZOWWWNT! "Wayl, shewt! Thar goes the arc welder! Gessen we cain't play no Medal o' Honor till we gets a new one."

Re:Power Requirements (1)

deathazre (761949) | more than 10 years ago | (#8860772)

Seeing as the stock Shuttle PSU will barely support my system with its 2600+ and a Ti 4200, that thing has to be drawing over 15 amps.
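A rough sanity check on that figure - a Python sketch assuming the ~110 W card draw cited in the reviews, plus made-up CPU and drive numbers:

# Card alone, assuming the ~110 W draw reported for the 6800 Ultra.
card_watts = 110.0
rail_volts = 12.0                     # worst case: everything on the 12 V rail
print(card_watts / rail_volts)        # ~9.2 A for the card by itself

# "Over 15 amps" is plausible for the whole box once you add the rest.
cpu_watts, rest_watts = 70.0, 30.0    # assumed Athlon 2600+, drives and fans
print((card_watts + cpu_watts + rest_watts) / rail_volts)   # ~17.5 A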

Re:Power Requirements (2, Funny)

Anonymous Coward | more than 10 years ago | (#8860820)

but it requires a 480 watt power supply

bah, I won't be impressed until a video card requires 1.21 gigawatts.

Re:Power Requirements (1)

On Lawn (1073) | more than 10 years ago | (#8860840)


You bring up an interesting point. I wonder what it would take to create a whole-house AC/DC converter. Once it's in DC, it's an easy step up or down to the proper voltage for a PC, or any number of other little gadgets that incorporate transformers.

Hmm, I only know electronics from one class in physics, so I couldn't comment on it much. I should look into it though.

I can imagine a 45V supply running through to outlets that support the round jacks of DC/DC converters. Maybe 12V? Most devices that use bulky transformer plugs probably standardize on 9V or less because they are meant to use batteries anyway. PCs run on 5V, no?

With my limited knowledge, it seems that down in the 9V range should be pretty safe.
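The catch with low-voltage distribution is resistive loss: at fixed power, halving the voltage doubles the current, and loss in the wire grows with the square of the current (P = I²R). A sketch with an assumed, purely illustrative wire resistance:

# Why whole-house low-voltage DC is harder than it sounds.
wire_loop_ohms = 0.05     # assumed total out-and-back wire resistance
load_watts = 480.0        # say, one gaming PC at its PSU rating

for volts in (120.0, 48.0, 12.0):
    amps = load_watts / volts
    loss_watts = amps ** 2 * wire_loop_ohms    # P_loss = I^2 * R
    print(f"{volts:5.0f} V: {amps:5.1f} A, {loss_watts:5.1f} W lost in the wire")
# Same wire, same load: ~0.8 W lost at 120 V, ~80 W lost at 12 V.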

Re:Power Requirements (1)

dawurz (326644) | more than 10 years ago | (#8860844)

Actually, no. A 430W supply with a strong 12V rail is OK. One reviewer ran such a setup with no issues, and FiringSquad stated their confidence in using 430W supplies with a strong 12V rail (http://www.firingsquad.com/hardware/nvidia_geforce_6800_ultra/page6.asp). A 480W power supply is recommended by nVidia by way of their guidelines; it's not a requirement.

Re:Power Requirements (0)

Anonymous Coward | more than 10 years ago | (#8860853)

Did you read the articles?

Only the Ultra needs two power connectors.

Only the Ultra needs the bigger heatsink that intrudes on the adjacent PCI slot. And even this is only true if IHVs don't change it from NVIDIA's reference design.

Incredible day for PC gaming! (4, Interesting)

Seoulstriker (748895) | more than 10 years ago | (#8860693)

I am really quite impressed with the performance of the 6800. Across the board, the 6800 delivers nearly twice the performance of the current top-of-the-line cards. Going from 4x2 pipes to 16x1 was definitely worth it for nVidia, as their shading performance is simply astounding! Halo actually runs incredibly well on the 6800, getting 2x-3x current performance.

Now, as DooM 3 is supposedly being released alongside the 6800, can we expect DooM in mid-May? This is truly an incredible day for PC gaming, as we will have cinematic computing in the near future.

I'm giddy. :-)

So... (-1, Troll)

Viceice (462967) | more than 10 years ago | (#8860694)

Will all the tin foil hat wearers care to explain how they faked it this time?

Re:So... (1)

bonch (38532) | more than 10 years ago | (#8860931)

When you're running 480 FUCKING WATTS through your circuitry, you can do anything, boy.

Its HUGE (4, Interesting)

silas_moeckel (234313) | more than 10 years ago | (#8860703)

Ok, this card has great specs, etc., but did you look at the thing? It's taking up at least one PCI slot for the fan and another for its intake. This thing should have just come with water cooling out the back. Granted, its specs look great, but I do have to ask: will it drive that IBM T221 LCD that hits 204 DPI at 22"? That's about the only thing I can think of that really would do the card justice.

Re:Its HUGE (0)

Anonymous Coward | more than 10 years ago | (#8860743)

IBM T221 LCD display that hits 204DPI at 22"

Why on earth would you want to run this FPS monster with an LCD display?

LCD displays don't refresh fast enough for a good game experience. You can't beat a CRT in gaming.

Re:Its HUGE (0)

Anonymous Coward | more than 10 years ago | (#8860815)

Not true. Depends on the monitor. Nobody ever complained about a higher-end DVI-equipped Samsung LCD in gaming, I can tell you that.

If you can afford this... (1)

cnelzie (451984) | more than 10 years ago | (#8860793)

...card when it is first released to the consumer market, I am fairly certain that you can afford the IBM T221 LCD Screen that you mention...

I can imagine this card coming out with a nearly 500 to 600 dollar price tag to start...

You know you are cool when your video card costs more than most entry-level PCs, right?

Re:If you can afford this... (0)

Anonymous Coward | more than 10 years ago | (#8860849)

The monitor he mentions, if I am remembering correctly, costs over $8,500 and supports resolutions of 3800x2400 pixels. Perhaps if you could afford the monitor you could afford the graphics card, but if you were going to get this monitor you should get the professional grade Quadro version of the 6800 instead of the GeForce. Sure it'll probably be around $1,200 instead of $500, but you NEED to spend that much money, right?

We have some of these IBM monitors in the lab here at school and even with the ~4600 Quadro equivalent cards they aren't driven sufficiently well.

Re:If you can afford this... (1)

silas_moeckel (234313) | more than 10 years ago | (#8860938)

The Ultra lists for $500 and is available for as low as $399 per AnandTech. It's not whether you can afford a $7k LCD, it's whether the card can drive it: only a few NVIDIA cards are supported, since the resolution is too high to fit into a single DVI connection - there just isn't enough bandwidth for the pixels. This display uses up to 4 DVI connectors to drive the screen.
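A rough look at why the T221 needs multiple links - single-link DVI tops out at a 165 MHz pixel clock, and the blanking overhead below is a crude assumed figure:

# Why one DVI link can't feed a 3840x2400 panel.
pixels = 3840 * 2400          # IBM T221 native resolution
blanking = 1.25               # crude assumed overhead for blanking intervals
link_pixel_clock = 165e6      # single-link DVI limit, pixels per second

hz_per_link = link_pixel_clock / (pixels * blanking)
for links in (1, 2, 4):
    print(f"{links} link(s): ~{links * hz_per_link:.0f} Hz refresh")
# ~14 Hz on one link, ~57 Hz on four - hence the four DVI connectors.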

Re:Its HUGE (2, Interesting)

afidel (530433) | more than 10 years ago | (#8860905)

I take it you haven't seen some of the games out now with LOTS of eye candy? Silent Storm is absolutely amazing looking even with crappy settings; I turned on all the eye candy for a while just to look at it, but my lowly GF3 can barely do 1 FPS. People with the newest Radeons and GeForces can't run it at high resolution with everything cranked. This is an engine that Nival obviously designed for a seriously long lifespan. Oh yeah, and the AI processing eats my 1.2GHz Athlon for breakfast. I think this game is going to make me finally upgrade my PC =)

Re:Its HUGE (1)

Segfault 11 (201269) | more than 10 years ago | (#8860928)

How many PCI slots do you need, though? Twin DVI ports take care of most people's video needs, and most motherboards ship with excellent networking, audio, Serial ATA, and even FireWire onboard these days. My oldest system, a dual P3-800, uses two PCI slots for an additional ATA/100 controller and a NIC, and if it had sound, that would take an additional slot. My P4 2.4B has only a PCI TV tuner card. The nForce2 based Athlon machine doesn't use *ANY* PCI slots, just AGP for graphics.

And the word of the day @ ATI.... (4, Funny)

Julius X (14690) | more than 10 years ago | (#8860704)

0wn3d!

And the word of the day @ Nvidia in a few months (0)

Anonymous Coward | more than 10 years ago | (#8860869)

We got caught cheating again, and yet again we are very far behind ATI!

Re:And the word of the day @ ATI.... (0)

Anonymous Coward | more than 10 years ago | (#8860933)

The ATI "engineers" must be hanging their heads low about now.

So shameful, and so embarrassing.

No hope is possible at this point.

The future is bleak.

the cards are still all very expensive (2, Insightful)

junkymailbox (731309) | more than 10 years ago | (#8860705)

Man.. there have been many generations of video cards now.. but the prices don't seem to come down that much..

Re:the cards are still all very expensive (2, Insightful)

Moofie (22272) | more than 10 years ago | (#8860748)

I wonder if it has anything to do with the price the market will bear. Hmmm...

Impressive! (3, Interesting)

cK-Gunslinger (443452) | more than 10 years ago | (#8860709)


I must admit, after looking at the benchmarks from Tom's and Anand's earlier this morning, I am *very* impressed by the results of this chipset. I still have concerns about the cooling and power requirements, as well as the image quality, but that may be partly related to my newfound ATI fanboy-dom. ;-)

Speaking of which, I can't wait to see what the boys from Canada have coming next week. 16 pipelines? Mmmm....

This is an excellent development (1, Insightful)

Anonymous Coward | more than 10 years ago | (#8860718)

It's great to see competition in this space -- to see a market with solid competitors duking it out. Now, if standards were a little more solid and stable, we'd get to see even more action and get even more benefit as consumers.

Money (2, Insightful)

tai_Dasher (319541) | more than 10 years ago | (#8860723)

People who can afford to buy this kind of thing should give money to charity.

No, seriously, this thing costs more than a new full-fledged computer.

Power consumption (-1, Redundant)

Mindjiver (71) | more than 10 years ago | (#8860727)

"NVIDIA indicated (in the reviewers guide with which we were supplied) that we should use a 480W power supply in conjunction with the 6800 Ultra. "

This is crazy! I thought it was bad enough when I had to get a 380W power supply for my P4 and Radeon 9800. It's insane to make people first pay a lot of money for a new video card AND then buy a new power supply as well.

Nvidia won't be getting me as a customer, that's for sure.

Re:Power consumption (1)

Egekrusher2K (610429) | more than 10 years ago | (#8860788)

I'm all set. I have an Antec TrueControl 550 Watt PSU.

I like the reviews, but.... (2, Informative)

hawkbug (94280) | more than 10 years ago | (#8860733)

This thing requires a 480 watt power supply, minimum. That's too much. I am currently responsible for a large number of servers, none of which has a power supply larger than 400 watts.

It's not hard to see why the U.S. has to violently defend our oil interests when we have video cards wastefully burning through electricity like there's no tomorrow.

I'm all for advances in processor technology, just not when it comes with a high energy consumption price.

I once heard that leaving a computer with a measly 150 watt power supply (minute by today's standards) on 24 hours a day, like most people do, consumes more energy than the common refrigerator.
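That claim is checkable with some assumed figures - note a PC with a 150 W supply rarely draws the full rating, so this is the most favorable case for the claim:

# All figures assumed: a PC that really draws 150 W around the clock,
# versus a fridge that uses ~500 kWh per year (a typical-ish figure).
pc_watts = 150.0
pc_kwh_per_day = pc_watts * 24 / 1000            # 3.6 kWh/day
fridge_kwh_per_day = 500.0 / 365                 # ~1.4 kWh/day
print(pc_kwh_per_day > fridge_kwh_per_day)       # True, under these assumptions
# In practice an idle PC draws well under its PSU rating,
# so the real-world comparison is much closer.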

Re:I like the reviews, but.... (0)

Anonymous Coward | more than 10 years ago | (#8860801)

I agree that 480 watts is huge and moronic, but comparing a desktop to a workstation is stupid.

Re:I like the reviews, but.... (1)

mr.capaneus (582891) | more than 10 years ago | (#8860816)

Unless you are doing something with the computer, it will not be drawing very much current at all.

I'm astounded (0)

Anonymous Coward | more than 10 years ago | (#8860736)

August will be two years since I got my 9700, and it took this long for Nvidia to kick ass again. But the damn thing is still a two-slot cooling pig: 2 molex connectors and a huge wattage draw. It screams overclocked-to-the-max just to compete, like the NV30. I'm waiting for the R420 (which should have low-k, lower wattage, but no PS3?) to make a comparison, but we finally have some cards a 9700 owner could consider a generational upgrade (the 9800 was barely a refresh if you ask me).

"... by over 100% in almost every benchmark"?? (5, Insightful)

sczimme (603413) | more than 10 years ago | (#8860738)

From the article:

To measure how well both cards perform with actual gameplay we used Unreal Tournament 2003 and 2004, Halo and Far Cry. For both versions of Unreal Tournament we've used the built-in benchmark, which consists of a flyby and a botmatch. We've omitted the flyby scores as they don't tell us much about performance during actual gameplay, just how fast the graphics card is able to render the flyby. With UT2003 the lead the GeForce 6800 Ultra takes over the Radeon 9800 XT is less impressive; at 1024x768 and 1280x1024 resolutions it is only 6% faster. At 1600x1200, however, the GeForce 6800 Ultra pulls away and clocks in 21% faster. With UT2004 the difference is much bigger, starting off at 10% at 1024x768 up to 65% faster at 1600x1200. What is also noteworthy is the fact that the performance of the Radeon 9800 XT drops at higher resolutions whereas that of the GeForce 6800 Ultra stays at about the same level.

I know this is /., but how does this become "beating ATI's fastest by over 100% in almost every benchmark"??
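For what it's worth, "X% faster" is normally computed as a ratio of frame rates. The FPS figures below are hypothetical, just to make the arithmetic concrete:

# How "X% faster" is usually derived from benchmark FPS numbers.
def percent_faster(new_fps, old_fps):
    return (new_fps / old_fps - 1) * 100

print(percent_faster(63, 30))   # 110.0 -> "over 100% faster"
print(percent_faster(74, 70))   # ~5.7  -> like the UT2003 case above
# A card can be over 100% faster in GPU-bound tests and only a few
# percent faster where the CPU is the bottleneck - which is how the
# summary's claim and the UT2003 numbers can both be "true".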

Re:"... by over 100% in almost every benchmark"?? (1)

l33t-gu3lph1t3 (567059) | more than 10 years ago | (#8860870)

Because that's only one benchmark where it doesn't win by 100%? The fact that it has a 3x lead in "Call of Duty" gives them a little license to make generalized claims ;)

Re:"... by over 100% in almost every benchmark"?? (1)

Lehk228 (705449) | more than 10 years ago | (#8860886)

You add up all of the % differences and report that number.

Re:"... by over 100% in almost every benchmark"?? (1)

Gruuk (18480) | more than 10 years ago | (#8860900)

>> I know this is /., but how does this become "beating ATI's fastest by over 100% in almost every benchmark"??

It's called "lying".

What is the Price on this? (1)

RicJohnson (649243) | more than 10 years ago | (#8860747)

This sounds great - but is it worth the price?
At least the other cards will be cheaper now.
Also - doesn't this chipset have new DRM tech?
I am not going to buy it if it is reporting everything I do.

Numbers are incredible (1)

UTAssassin (769210) | more than 10 years ago | (#8860750)

A quick look at the benchmarks will reveal that this card is about twice as fast as the current 5950. Priced at $400, the 6800 should drive down the price of the 5950 cards to a sub-$200 level. This is excellent news for gamers!

5950 is a sucker's bet (0)

Anonymous Coward | more than 10 years ago | (#8860774)

Especially if you want to play Far Cry with pixel shader 2.0.

nVidia vs ati (0)

allen17 (735040) | more than 10 years ago | (#8860751)

The results speak for themselves, the GeForce 6800 Ultra is the new king of the hill, beating ATI's fastest by over 100% in almost every benchmark.

I wonder how long it will take ATi to put out a press release about an upcoming card. They have already talked about enabling extra pipelines in previous video cards.

Looks great but... (3, Funny)

Stevyn (691306) | more than 10 years ago | (#8860752)

But where do I put this thing? That's not a heatsink, that's the kitchen sink!

Re:Looks great but... (0)

Anonymous Coward | more than 10 years ago | (#8860923)

Right next to your local power station. 480W!? Come on, this is plain daft!

How is it the "King of the hill"? (4, Interesting)

Guspaz (556486) | more than 10 years ago | (#8860755)

ATI's next-gen offering is to be launched about the same time as nVidia's GeForce 6800, and we haven't seen reviews from it yet.

I'd wait until the Radeon X800 benchmarks are out before crowning a new king. For all we know ATI's new offering will beat the new GeForce.

Nvidia said it... (0)

Anonymous Coward | more than 10 years ago | (#8860756)

When ATI was better, nVidia said "don't look at benchmark scores," accusing benchmark developers of being biased towards ATI and so on.. how the tone has changed now.. ;)

All because ATI caught them with their pants down. That kind of attitude makes me not want an nVidia chip even if it's much faster!

crazy titles (2, Funny)

silverhalide (584408) | more than 10 years ago | (#8860758)

I heard from a confidential source that the next NVidia card was going to be called the Super GEForce 95050++ Hyper-titanium Happy Extreme-platinum Ultra MK-II Enhanced V2.2 Omicron version. Keep your eyes open.

omg lol lol lol (1)

legomad (596194) | more than 10 years ago | (#8860908)

um, yeah.

But... (1, Funny)

Anonymous Coward | more than 10 years ago | (#8860768)

Will Doom 3 run faster than 10 fps on it?

Addtional Revenue (3, Funny)

netfool (623800) | more than 10 years ago | (#8860773)

nVidia might as well get into the case and CPU fan/heatsink business! Look at that thing!
Hell, with something that big they should just build a freezer around the card.

short review (5, Funny)

Gingko (195226) | more than 10 years ago | (#8860783)

all of the latest DirectX 9.0 game titles

what, both of them? ;)

Thank you ladies and gentlemen, I'm here all week. Available for weddings, bar mitzvahs and light-hearted funerals.

Talk about cornering the market ... (2, Insightful)

hlygrail (700685) | more than 10 years ago | (#8860785)


... but what am I going to have to PAY for this beautiful monster?

It's big (2 slots), it probably runs VERY VERY hot, takes two power connectors... but it seems to trump EVERYTHING else so far, and not by small amounts!

FIX THE TYPO (1)

egarland (120202) | more than 10 years ago | (#8860786)

Reviews! not Revews. (My typo. Sorry)

More info, pcis, and a different view (3, Informative)

Recoil_42 (665710) | more than 10 years ago | (#8860789)


here. [mbnet.fi]

Those benchmarks don't look too impressive to me, and the hugeass heatsink/fan combo is still there! Not to mention that it requires *two* molexes?

Nvidia is really starting to fall behind...

Re:More info, pcis, and a different view (3, Informative)

Seoulstriker (748895) | more than 10 years ago | (#8860884)

Hmmmmm. Let's see. We have about 10 reviews saying that the nVidia is 2x faster than current top-of-the-line cards, and we have one review by [H]ardOCP which mixes different measures in its benchmarks (different resolutions and AA/AF settings in the same graph) and is profoundly anti-nVidia, and we are supposed to take it seriously? Come on...

Re:More info, pcis, and a different view (1)

peeon (743159) | more than 10 years ago | (#8860918)

Umm... check the settings for the game benchmarks: the 6800 card is on 4xAA/8xAF, while the other cards have no AA/AF. Learn to read before you make accusations.

FX 6200? (3, Insightful)

Spleener12 (587422) | more than 10 years ago | (#8860790)

I'm curious as to whether or not this means there will be a new low-end NVIDIA card. Yeah, the 6800 is nice, but I'm more interested in the cards that I can actually afford.

ATI (1)

TheAxeMaster (762000) | more than 10 years ago | (#8860791)

The hottest debate in the gaming world...

Honestly, I have owned both nVidia and ATi cards. I am currently using an ATi 9600XT, upgraded from a GeForce2 Ti, and I'm not that impressed with the ATi cards.

I'm glad nVidia came out with something newer and better and hopefully this new card won't have all that confusion over flip chip versions and whatnot like the 5700 did.

nVidia's driver support is better too. This new card should also be PCX or soon will be.

No more Quake benchmarks?! (2, Insightful)

Cthefuture (665326) | more than 10 years ago | (#8860814)

Ahh! When did Tom's do away with the Q3 benchmarks?

It's still the only game that can push the hardware to its limits reliably. All those other games tend to have bottlenecks that are algorithm/code related rather than hardware related (like the scripting engine in UT).

Too bad; I found Quake3 to be one of the most accurate benchmarks because it ran at such a low level and could really push the hardware. It's not like those other games are using the hardware shaders yet anyway (or are they?).

Re:No more Quake benchmarks?! (1)

superpulpsicle (533373) | more than 10 years ago | (#8860834)

I don't think it's a question of benchmarks. I have owned both nVidia and ATI cards. Every ATI card has a life span of about 1 year, and that includes the new ATI 9800.

I am going back to Nvidia permanently; I don't care what the specs are. ATI hardware has a real quality problem out of the box with cheap default fans, and ATI's drivers are so bad they need a new version every month. Nuff said.

Re:No more Quake benchmarks?! (1)

Cthefuture (665326) | more than 10 years ago | (#8860920)

True, but I'm still interested in seeing where technology is at.

I never got on the ATI boat. I've always maintained that (IMO) their drivers suck too much to be useful. In my experience nVidia works better and is better supported on Linux also, which is where I spend most of my time.

another review @ hexus.net (1)

godofmischief (771271) | more than 10 years ago | (#8860823)

There is another review up at hexus.net. It looks good, but the card takes a pretty big performance nosedive with certain features enabled, and the 9800XT beat it in some cases. And like some others have said, 2 molex connectors for power... don't they think it's getting a bit stupid now?

DoomIII now ready to ship? (5, Funny)

guile*fr (515485) | more than 10 years ago | (#8860826)

In other news, id Software announces that Doom III will run at 30 fps on the new Nvidia 6800.

Holy mother of crap (5, Informative)

l33t-gu3lph1t3 (567059) | more than 10 years ago | (#8860832)

Strong points of new Nvidia card:

-Obscene performance boosts, on a scale I've never seen before
-fancy new effects
-massively improved image quality
-heatsink fan still pretty quiet
-basically free 4xFSAA and 8x ANISO

Weaker points of new Nvidia card:

-Expensive
-it seems that shader precision is still not as pretty as ATI's, though that may be fixed by game patches
-takes up 2 slots with the tall heatsink
-480W recommended PSU
-video processing engine isn't implemented in software yet

I don't really object to the power requirements. This thing is more complicated, bigger, and has more transistors than a P4 Extreme Edition. It consumes about 110W, of which 2/3 is the GPU die's power draw. It is certainly NOT unreasonable to require a big power supply with this thing. It seems as though ATI's solution will have a power supply recommendation as well. Simply put, if you're gonna improve performance by such a margin by means other than smaller manufacturing, you're going to increase power consumption. Get over it.

This thing isn't meant for SFF PCs or laptops, though I'm sure the architecture will be ported to a laptop chip eventually. As for the 2-slot size, well...It consumes 110W! To put this in perspective, it consumes more than any non-overclocked desktop CPU today! Think of how big your Athlon64/P4EE heatsink/fan is, then you'll realise that 2 slots aren't really that big of a problem.

My own personal reason for wanting this thing: it can play any current game at 1600x1200 with 4xFSAA and 8x anisotropic filtering at a good framerate, and it is the only card that can claim that right now :)

Any word on X support? (1)

i.r.id10t (595143) | more than 10 years ago | (#8860858)

What with the license changes for XFree86, the various new X implementations, changing distros, etc. has NVidia come out and said which one their drivers will work with?

Biased Article ? (1)

Tensor (102132) | more than 10 years ago | (#8860861)

From the article:
"What is also noteworthy is the fact that the performance of the Radeon 9800 XT drops at higher resolutions whereas that of the GeForce 6800 Ultra stays at about the same level."

Wouldn't that mean that the limiting factor for fps is NOT the card but some other thing (processor, memory bandwidth) ?

I mean, I know they used this hardware for the test:
"The system we used consists of a Pentium 4 3.2GHz EE processor, EPoX 4PCA3+ i875P chipset motherboard, 1GB of Crucial DDR400 memory and two Western Digital WD740GD Raptors in RAID 0"

Which is no POS system, but still, they should have done some tests with the processor and memory overclocked to check whether other hardware was really limiting the scores.
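One quick way to read the numbers they did publish: if FPS barely drops as resolution climbs, the GPU isn't the limiting factor. A sketch with hypothetical figures:

# Flat FPS across resolutions suggests a CPU/platform bottleneck,
# not a GPU one. All FPS numbers below are made up for illustration.
results = {
    "Radeon 9800 XT":     [80, 62, 45],   # fps at 1024x768, 1280x1024, 1600x1200
    "GeForce 6800 Ultra": [85, 84, 82],
}
for card, fps in results.items():
    drop_pct = (fps[0] - fps[-1]) / fps[0] * 100
    verdict = "likely CPU/platform-bound" if drop_pct < 10 else "GPU-bound"
    print(f"{card}: {drop_pct:.0f}% drop from low to high res -> {verdict}")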

Not too fond of early comparisons... (1, Redundant)

CaptIronfist (457257) | more than 10 years ago | (#8860865)

The card's performance sounds promising; however, comparing a next-gen card with its competitors' outdated models isn't very meaningful. I've always preferred Nvidia cards for their respectable performance and their annoyance-free install under Linux, but they announced not long ago that their first PCI Express cards would use a PCI Express-to-AGP bridge, and this is where I think Nvidia is going to shoot themselves in the foot and come tumbling down that hill.

ATI OTOH will be using a native PCI Express solution with their R423 chip, the PCI Express version of the R420. Until we see ATI's X800 card, and how both companies' PCI Express solutions perform, no one owns that hill. An early fanatical reaction towards any graphics card at this point is foolish and uneducated, IMHO.

I wish .... (5, Insightful)

El Cubano (631386) | more than 10 years ago | (#8860880)

I wish that people that pretend to be computer experts would do the teeniest bit of research.

How about this gem: First introduced in 1995, Microsoft's DirectX application programming interface (API) was designed to make life easier for developers by providing a standard platform for Windows-based PCs. Before the arrival of DirectX, developers had to program their software titles to take advantage of features found in individual hardware components. With the wealth of devices on the market, this could become a tedious, time-consuming process.

I'm glad he cleared that up for us. Because this little-known company called SGI [sgi.com] didn't develop OpenGL [opengl.org] back in 1992 [sgi.com]. In fact, were it not for MS, we would still be in the computer graphics dark ages.

I'm not trying to troll here. I am just pissed that people pretend to be experts when they don't have a clue what they are talking about.

Re:I wish .... (1)

l33t-gu3lph1t3 (567059) | more than 10 years ago | (#8860934)

There are different OpenGL paths for every graphics card or architecture. You can use Architecture Review Board (ARB) standards as well. Those barely existed when DirectX and Direct3D came into being.