
GeForce 7800 GTX 512 Reviewed

ScuttleMonkey posted more than 8 years ago | from the truly-ridiculous-toys dept.


ThinSkin writes "Today Nvidia released its latest combatant in the desktop graphics wars, in the wake of ATI's new X1800 line: the GeForce 7800 GTX 512. The clock rate has been upped, as has the memory, thanks in part to a truly massive cooling solution. ExtremeTech's Jason Cross does all the benchmarking on a board from XFX, which is slightly overclocked and includes VIVO capabilities. At $650 list, it also sets a new price record for a new-generation desktop graphics card."


37 comments


Other Reviews (4, Informative)

Vigile (99919) | more than 8 years ago | (#14025949)

Re:Other Reviews (2, Informative)

plover (150551) | more than 8 years ago | (#14026272)

Bigger question: have any of the reviews discovered if nVidia's cheated [extremetech.com] on this benchmark yet?

It seems that any time ATI or nVidia releases a new card, they've also got some drivers that "optimize" for the 3DMark benchmarking software. So I figure it must just mean that nobody's found out how they're doing it yet.

It's kind of sad to think that when they announce some obviously kick-ass hardware that all I can think of is "how did they cheat this time?"

Re:Other Reviews (2, Informative)

RzUpAnmsCwrds (262647) | more than 8 years ago | (#14026660)

Bigger question: have any of the reviews discovered if nVidia's cheated on this benchmark yet?
It seems that any time ATI or nVidia releases a new card, they've also got some drivers that "optimize" for the 3DMark benchmarking software. So I figure it must just mean that nobody's found out how they're doing it yet.

It's kind of sad to think that when they announce some obviously kick-ass hardware that all I can think of is "how did they cheat this time?"


The GeForce FX scandal was a few years ago, and no major GPU manufacturer is stupid enough to try to pull something like that again.

Image quality has been excellent on the GeForce 6 and 7 series parts so far. Some have reported a "shimmering" issue with AF turned on, but it's so difficult to spot that most people will never notice it. I certainly never have.

Re:Other Reviews (2, Interesting)

plover (150551) | more than 8 years ago | (#14027079)

Because nVidia wasn't the only company cheating. ATI was also found [extremetech.com] to be "optimizing" for benchmarks. Yes, it was a couple of years ago, but you give them too much credit if you think people have stopped being stupid. It won't surprise me at all if it happens again.

Re:Other Reviews (2, Informative)

dimfeld (247690) | more than 8 years ago | (#14027024)

That's just another reason that reviewers are using real games as benchmarks more and more. It also helps that they give a much better impression of the card's capabilities in real-life situations.

Re:Other Reviews (2, Interesting)

plover (150551) | more than 8 years ago | (#14027223)

Don't be fooled just because it's not an official benchmark program. Remember Quack vs Quake? A benchmarker renamed Quake3 to Quack3, and found the ATI card's performance dropped because the drivers were tuned to the specific application named Quake3.exe.

What they never answered was this question: "Was this an optimization for Quake, or was this a deliberate attempt to improve ATI's standing in the benchmark wars?" If you download the ATI drivers, you'll find that every new driver patch contains a list of games whose title-specific driver bugs have been fixed. So we have plenty of evidence that card makers look at specific games, but without mind-reading abilities it's impossible to know their motives.
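For illustration, the kind of application-specific tuning being described boils down to the driver branching on the name of the running executable. Here's a minimal sketch of that idea in C; the profile table and function names are hypothetical, not taken from any actual ATI or Nvidia driver:

    #include <stdio.h>
    #include <string.h>

    /* Hypothetical per-application profile table: lower the texture
     * filtering quality when a known executable name is detected.
     * This is why renaming quake3.exe to quack3.exe changed the
     * measured performance: the name no longer matched a profile. */
    typedef struct {
        const char *exe_name;
        int texture_quality;   /* 0 = lowest .. 3 = highest */
    } app_profile;

    static const app_profile profiles[] = {
        { "quake3.exe", 1 },   /* reduced quality, higher scores */
    };

    static int quality_for_app(const char *exe, int default_quality)
    {
        for (size_t i = 0; i < sizeof(profiles) / sizeof(profiles[0]); i++)
            if (strcmp(exe, profiles[i].exe_name) == 0)
                return profiles[i].texture_quality;
        return default_quality;
    }

    int main(void)
    {
        printf("quake3.exe -> quality %d\n", quality_for_app("quake3.exe", 3));
        printf("quack3.exe -> quality %d\n", quality_for_app("quack3.exe", 3));
        return 0;
    }

Rename the executable and the lookup misses, which is exactly the effect the Quack3 test exposed.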

Re:Other Reviews (1)

HD Webdev (247266) | more than 8 years ago | (#14028332)

What they never identified was an answer to this question: "Was this an optimization for Quake, or was this a deliberate attempt to improve ATI's standings in the benchmark wars?"

It lowered the quality settings a bit only when the executable was specifically named "quake3.exe". So no matter what the "real" intent was, the effect was to improve ATI's standing in Quake3 benchmarks.

One of my favorite not-so-malicious tactics was Nvidia holding back greatly improved drivers until ATI launched a new line. Suddenly, Nvidia released drivers that improved performance by ~20%. ATI had made the mistake of trying to edge out what it thought was the highest performance Nvidia had to offer.

Of course, this isn't limited to video cards. Ford did this a while back by advertising lower horsepower for a new truck, and when competitors based their production on slightly beating that number, they got suckered.

Probably both (1)

dimfeld (247690) | more than 8 years ago | (#14039525)

I'd imagine that the answer to the question is "both." Personally, I don't have a problem with this sort of optimization, since the user will probably get the same benefit when actually playing the game, and that's what really matters.

I'd be unhappy if there were optimizations that only turned on when running a game in demo mode or when FRAPS was loaded (or whatever is being used for FPS measurements nowadays). I haven't heard about anything like that yet though.

Re:Other Reviews (1)

HD Webdev (247266) | more than 8 years ago | (#14027856)

It seems that any time ATI or nVidia releases a new card, they've also got some drivers that "optimize" for the 3DMark benchmarking software. So I figure it must just mean that nobody's found out how they're doing it yet.

Actually, it's not every time; it's much rarer than that. They've learned that not all publicity is good publicity. On top of that, they will always get caught, because there are masses of fanboys of the opposing line of cards watching for that kind of mucking with the drivers.

1 GB in SLI? (1)

DavidV (167283) | more than 8 years ago | (#14025966)

I only partially RTFA, but I can't see any mention of SLI. 1 GB of video memory is a dream.

Re:1 GB in SLI? (0)

Anonymous Coward | more than 8 years ago | (#14030169)

They don't share the video memory in SLI, as far as I know.

Re:1 GB in SLI? (1)

JorDan Clock (664877) | more than 8 years ago | (#14030686)

It's not like it'll have 1GB, per se, but each card will have half the work to do.
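To sketch why the memory doesn't simply add up: in SLI each GPU renders from its own local memory, so textures and geometry have to be duplicated on both cards. A toy model of that arithmetic in C (the numbers are illustrative assumptions, not Nvidia's actual implementation):

    #include <stdio.h>

    /* Toy model of SLI memory: the working set (textures, geometry)
     * is mirrored on every card, so two 512 MB cards give 512 MB of
     * usable asset space -- not 1 GB -- while the rendering work
     * itself is split between the GPUs. */
    int main(void)
    {
        const int num_gpus = 2;
        const int mem_per_gpu_mb = 512;
        const int working_set_mb = 400;   /* assumed asset size */

        printf("Total physical VRAM: %d MB\n", num_gpus * mem_per_gpu_mb);
        printf("Usable for assets:   %d MB (mirrored per GPU)\n", mem_per_gpu_mb);
        printf("Working set fits?    %s\n",
               working_set_mb <= mem_per_gpu_mb ? "yes" : "no");
        return 0;
    }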

Here come the complaints (2, Insightful)

elasticwings (758452) | more than 8 years ago | (#14026105)

And now for the slew of, "WTF? Who pays that much for a video card!" "If game developers knew what they were doing, we could all play Half Life 2 on my Geforce 2." "This is why consoles are cheaper/more appealing."

Re:Here come the complaints (1)

plover (150551) | more than 8 years ago | (#14026359)

How about "WTF? Who wants to look at crappy graphics on a console when you can have creamy goodness from high-end cards like these?" Once you get used to nice graphics, consoles just look like Fisher Price toys - big, safe, chunky, blocky, slow, and mostly made of primary colors. Put a real rendering engine to work for you some time and there is just no way you want to play on crap like an XBox or PS2 ever again. And unless you've got a spectactular television, there's simply no way an NTSC-based display can stack up against a computer monitor, either.

Re:Here come the complaints (1)

elasticwings (758452) | more than 8 years ago | (#14026411)

Dude, you don't have to tell me. I wasn't bashing it, just saying that it will probably happen. The only console I own is a GameCube. I play FPS games, so of course I don't use a console. :P

Re:Here come the complaints (0)

tomstdenis (446163) | more than 8 years ago | (#14026438)

Sure, why not? Someone has to keep these mindless idiots in check.

The problem is that thinking like yours means you'll never have "the right card". Once you save up your salary from Walmart to buy this 7800, the "7800 X-treme e-gamer E-edition" will be out and you'll be like "shit, damn, and I just bought this 350W graphics card for nothing!"

And frankly I don't know why you guys are impressed. My graphics card spends 99.9999% of its time in 2D mode [rendering my glorious X11/Gnome desktop]. I do play games, but they're really not a big part of my daily life.

Why would I spend the money on the card and the power just so that, once in a blue moon when I play games, the crappily modelled M16 my character is holding will look "ultra detailed and super rendered"? All I care about in FPS-type games [where most of this bullshit technology goes] is a decent framerate and the ability to totally own a bot or two.

If I have to use "medium" detail to get that, then so be it. If the game is visually crap I'll not buy the sequel, etc...

Tom

Re:Here come the complaints (2, Insightful)

plover (150551) | more than 8 years ago | (#14027017)

And frankly I don't know why you guys are impressed ... I do play games but really not a big part of my daily life.

Because games ARE a big part of the daily life of many people. (Not ALL people, but many.) Leaving computer graphics aside for a moment, how many sports fans do you know? Football, baseball, hockey, basketball? How many of them covet HDTV sets so they can watch the big game in more detail, or see larger-than-life closeups? And when one of these guys gets a plasma TV, is it coincidence that his friends come over to watch the game at his house?

So then, why is it OK for a football fan to spend $5,400 for an HDTV to watch his games, but not $650 for a World Of Warcraft gamer to want the best graphics for his game? Many of the gamers I know are addicts, spending far more time with their "entertainment gear of choice" than even the football fans with big screen TVs.

Please, feel free to save your money. If you're satisfied with your setup, then hooray for you! You've saved money that other people haven't. But you might want to consider that not everyone has the same spending priorities as you.

Re:Here come the complaints (1)

LoRdTAW (99712) | more than 8 years ago | (#14027311)

Ah, don't worry about him. Sounds more like he's the jealous one, working at Walmart on his bargain-bin Lindows PC. Why else would he cry that he doesn't need that kind of expensive video card? It would be like posting on a car website and ranting about why anyone would spend $250,000 on a Ferrari when all I need is a Honda. Morons. If you have nothing to contribute then shut the hell up.

Re:Here come the complaints (1)

tomstdenis (446163) | more than 8 years ago | (#14027553)

Actually I have my own "nifty" computing devices [among other things, AMD X2 and Pentium D desktops]. But I buy them because I actually do productive work on them [ok, the Pentium was solely for benchmarking...]. When building a project that has 100K lines in it [from scratch, when doing customer drop tests], it's nice to know I can do 30 builds an hour instead of 5 [to fine-tune scripts and such].

Things like these cards make no sense. My FX5200 was capable of playing games like UT2k4 just fine. My 6600 is more than enough for things like Far Cry. I'm not saying people shouldn't be allowed to get faster cards; I'm asking why you're impressed. Engineering is about making the most from the least.

Getting 2x the FPS at a cost of 3x the power isn't really progress now, is it?

People clamour over the Intel EE processors [apparently, given the reviews] because of the super-sized cache and "biggie-sized" power requirements, all while a lower-powered AMD64 [or heck, even a P4] is often just as good, or at least sufficiently good.

What does it matter once your card can get over 60fps at 1024x768? You're not going to see any more detail just because your HUD is smaller, and above 60fps you're a laughing joke anyway [hint: not a lot of monitors can do 1600x1200 at 100Hz...], so really a lot of the frames go to waste and the detail is irrelevant.

It'd be like having a dual-core processor where the 2nd core only mirrors the first [i.e. it doesn't do anything else useful]. Yeah, it's totally cool you have two cores on the die, but you're not extracting any useful benefit from it.

As for your car reference, the same line applies there. How many people drive SUVs and minivans when a 5L/100km car would serve 99% of their tasks properly? It's cheaper for most people to just rent a van when they need it and save on gas the rest of the year.

Sure, you have a right to be wasteful, but is that really something to be proud of?

Tom

Re:Here come the complaints (1)

LoRdTAW (99712) | more than 8 years ago | (#14029167)

You're still not getting the point. It might not make sense to you, but to others it does. These over-the-top cards are for people who want bragging rights as well as top performance. And did I say I was impressed? I have a 6800GT and it runs most of today's cutting-edge games just fine.

I don't mean to be an asshole, but who cares what you do with your computer? If you don't need that card then don't buy it. End of story. Regardless of what you say, people will buy it. Just like the people who buy Ferraris to drive around Manhattan, and SUV owners who put in rims and stereo systems to drive themselves around town.

Re:Here come the complaints (1)

tomstdenis (446163) | more than 8 years ago | (#14029536)

It's called a discussion. I'm raising the question of whether we should encourage this trend, and suggesting that perhaps a better course of action would be looking for new sources of performance.

I'm not saying I'm right or that everyone should bow to my wisdom. I'm just trying to be anti-sheep and suggest that we as customers ought to demand more than power-hungry graphics cards [or processors, for that matter].

Tom

Re:Here come the complaints (1)

plover (150551) | more than 8 years ago | (#14029219)

I'm asking why are you impressed? Engineering is about making the most from the least.

And this card is about making more from less. They're clocking their stock chipset far above their previous 7800 card and massively piling on heat-reduction gear like a rabid overclocker in a copper tube factory. And they're making more money :-)

Keep in mind that it's not about the frame rate. It's about keeping the frame rate high in increasingly complex situations. Obviously, a frame rate higher than your monitor's refresh rate is "excess capacity." (Hell, my card gets 300 FPS when I'm looking at a black screen!) But having that capacity means you can have more details while still maintaining visual smoothness. Most games deal with complexity by fogging stuff in the distance, or by not painting stuff until it's within some range. With a juiced-up graphics card like this, you can have less fog or more distant objects rendered. Better, you can have crowds of people (such as in a raid party) all dynamically rendered, nice and smooth. You don't get the screaming laggies just because a bunch of monsters and fireballs hit your screen at the same time. You get to see it in all its anti-aliased 24-bit color glory. That's why gamers want more power.
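That trade-off can even be made mechanical: engines commonly scale fog or draw distance against the frame-time budget, so headroom becomes visible detail instead of undisplayed frames. A simplified sketch of such a feedback loop in C (a generic illustration, not any particular engine's code):

    #include <stdio.h>

    /* Simplified detail-scaling loop: when frames come in well under
     * the target time (e.g. a 60 Hz display), spend the headroom on
     * a larger draw distance; when they run long, pull the fog in.
     * A faster card holds the same frame rate at a larger distance. */
    int main(void)
    {
        const double target_ms = 1000.0 / 60.0;   /* 60 Hz display */
        double draw_distance = 100.0;             /* world units */
        double frame_ms[] = { 9.0, 10.0, 12.0, 20.0, 14.0, 9.5 };

        for (int i = 0; i < 6; i++) {
            if (frame_ms[i] < target_ms * 0.8)
                draw_distance *= 1.10;    /* headroom: render farther */
            else if (frame_ms[i] > target_ms)
                draw_distance *= 0.90;    /* lagging: fog in sooner */
            printf("frame %d: %5.1f ms -> draw distance %.1f\n",
                   i, frame_ms[i], draw_distance);
        }
        return 0;
    }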

My FX5200 was capable of playing games like UT2k4 just fine. My 6600 is more than enough for things like Far Cry.

And now I don't understand you, because it doesn't seem like you're being self-consistent. You had an FX5200, but you upgraded to a 6600? When the 6600 came out it would have been a $150 investment, which is an awful lot of extra expense for a graphics card, especially when your FX5200 was 'just fine'. Or did you get the 6600 with a new computer, even though most motherboards come with on-board 3D graphics chipsets that are 'just fine' for consumer PCs?

Right now it sounds like you have a pretty arbitrary "line in the sand": it's somehow acceptable to spend $150 for a graphics card, but anyone who spends $650 is an 'idiot'. What's your cutoff line for idiocy? $200? $300? $649.99? $150.01? Or is it more of a sliding scale, ranging from $150==genius to $300==dummy to $650==idiot?

Oh, wait. I get it now. IHBT. Thank you very much.

Re:Here come the complaints (1)

tomstdenis (446163) | more than 8 years ago | (#14029509)

I upgraded to the 6600 because my AMD X2 motherboard is PCI-E. So are the 915G [and 945] boards I use for my Intel gear.

Trust me, I wasn't happy to have to buy the new card. My choice was a 6200, a 6600, or not buying the dual-cores. The 6200 is just a piece of shit. I mean, I'm not into excess, but I DO like some ability to render a 3D scene ;-)

Tom

Re:Here come the complaints (1)

GameMaster (148118) | more than 8 years ago | (#14030404)

"Things like these cards make no sense. My FX5200 was capable of playing games like UT2k4 just fine. My 6600 is more than enough for things like Far Cry. I'm not saying people shouldn't be allowed to get faster cards. I'm asking why are you impressed? Engineering is about making the most from the least."

Yes, making the most from the least is one thing engineering is about but it isn't the only thing. I know engineering types often like to focus on that aspect of engineering. I tend to find it impressive as well. A good example would be the Apollo 13 incident.

However, another perfectly valid aspect of engineering is taking the sum total of human knowledge and experience and using it to advance the cutting edge of technology. Good examples of this would be the original moon landing, the SR-71 spy plane, and any number of the other cutting-edge technological toys designed for the U.S. military. In fact, many people would consider these just as impressive, if not more so, than someone making the most out of limited supplies. To each their own, but please don't try to push your personal preference as the "true ideal" of engineering. It tends to come off as arrogance and/or, as has been said, "sour grapes".

Also, speaking as someone who used to play a great deal of FPS games, the biggest and baddest hardware can make a great deal of difference in game performance. Being able to play the newest FPS games like F.E.A.R. at the highest resolutions/quality while maintaining an FPS higher than the refresh rate of the monitor (so that any dips in FPS still max out the monitor) provides a definite competitive edge. You would have to be pretty good to notice it (better than I ever was), but the difference is there, and people that good do exist. As we all know, these games keep coming out with higher and higher hardware requirements. The general trend has been that the most cutting-edge ones are designed to run significantly slower than 75-80 FPS average on anything but the most cutting-edge video cards (Doom 3, F.E.A.R., etc.). If you absolutely needed some kind of "practical" excuse for people to be wishing for this hardware, then there it is.

-GameMaster

Re:Here come the complaints (1)

tomstdenis (446163) | more than 8 years ago | (#14030592)

Pumping a lot of electricity into a circuit is ***NOT*** "cutting edge".

I'm sure my 3.2GHz P4 could run at 4.2GHz if I nitrogen-cooled it and pumped 16 times the current into it.

Big deal. Show me where they get this performance at an EQUAL or LOWER amount of waste?
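For scale, the standard back-of-the-envelope rule here is that dynamic CMOS power goes roughly as frequency times voltage squared (P ~ C·V²·f), and higher clocks generally need higher voltage, so power grows much faster than performance. A rough illustration in C, with assumed voltages rather than measurements of any real chip:

    #include <stdio.h>

    /* Back-of-the-envelope dynamic power scaling: P ~ C * V^2 * f.
     * The voltages below are assumptions for illustration only. */
    int main(void)
    {
        const double f1 = 3.2, f2 = 4.2;     /* GHz */
        const double v1 = 1.30, v2 = 1.55;   /* volts (assumed) */

        double clock_gain = f2 / f1;                          /* ~1.31x */
        double power_gain = (f2 / f1) * (v2 * v2) / (v1 * v1);

        printf("Clock gain: %.2fx\n", clock_gain);
        printf("Power gain: %.2fx\n", power_gain);            /* ~1.87x */
        return 0;
    }

Even with generous assumptions, a ~31% clock gain costs far more than 31% extra power, which is the waste being complained about here.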

The SR-71 is not exactly a good example. They sacrifice payload and a lot of fuel to fly that high and fast. Compare that to a 757. Last time I did the rough calculation, per person transported, a 757 wasn't that much less efficient than your average SUV [all while travelling much faster and further]. That's progress. That's also 20 years ago.

Show me the SR-71 version 2.0 which uses less fuel or has a higher payload ... etc...

A lot of "really cool things" are possible if we just decrease the efficiency of a device or process. You can make cars quicker if you accept a higher defect rate. You can get places quicker if you speed [and lower your mpg]. You can read books faster if you drop every other sentence, etc.

On a similar but slightly different note games like F.E.A.R. aren't really that much better because of the graphics. Just because it *can* use those cards doesn't mean you need to. And this gets back to the whole point, why are people impressed with tech demos, er, um, ah, games that require such heavy lifting just to look half decent?

Would you be equally impressed with my sloppy game engine? Oh but it's good I mean it requires a 2.8Ghz AMD64 to run!!! That must mean it's good!!!

I just don't find it very compelling that just because something requires a lot of power to run it must be somehow better than before. I've played far cry in both high and medium settings [my PCI-E 6600 can handle both just fine at 1024x768] and frankly I really didn't notice an improvement that really stood out. I mean the draw distance seemed maybe further but that's about it.

Your ability to base-camp, spawn-camp, stalk, and pull other such griefing, er, winning game plans isn't affected by rendering detail. That's a load of shit. You'd do just as well with solid non-textured objects. All you need to do is see the shape and point. The fact that the texture is 512x512 instead of 64x64 won't help you do that.

I mean how much detail do you take away anyways at 80fps in a fast paced battle? When I play UT2k4 [with a shitload of bots] all I see are red and blue and I shoot. They could have "fuck you trebek" written on their fronts and I probably wouldn't notice.

People want the cards for the single fact that they have to have the latest. It's bragging rights. And it's something to brag about because people are ignorant of engineering principles and think all the wrong things are impressive. Also, a lot of these kids don't pay the hydro bill [or for their computers], so it's nothing to them to waste hundreds of watts on an overblown GPU to "frag some chump".

Tom

Re:Here come the complaints (1)

GameMaster (148118) | more than 8 years ago | (#14031052)

"Pumping a lot of electricity into a circuit is ***NOT*** "cutting edge".

I'm sure my 3.2GHz P4 could run at 4.2GHz if I nitrogen-cooled it and pumped 16 times the current into it.

Big deal. Show me where they get this performance at an EQUAL or LOWER amount of waste?

The SR-71 is not exactly a good example. They sacrifice payload and a lot of fuel to fly that high and fast. Compare that to a 757. Last time I did the rough calculation, per person transported, a 757 wasn't that much less efficient than your average SUV [all while travelling much faster and further]. That's progress. That's also 20 years ago.

Show me the SR-71 version 2.0 which uses less fuel or has a higher payload ... etc..."


I'm sorry if you don't like it, but efficiency isn't the be-all and end-all of "cutting edge". In fact, it's almost the opposite. Cutting-edge things very often sacrifice all thought of efficiency in the drive to be the absolute best at whatever they were designed to do. Being the absolute best at its primary function is what defines something as cutting edge.

Efficiency is only considered "cutting edge" for devices designed specifically to be efficient (Volkswagen Lupo, hybrid cars, etc.). If the 7800 line from nvidia happens to be the most refined GPU designed to date, and the only way to make it faster is to design a better cooling system, optimize the PCB layout, and pump more power into it, then yes, that does make it cutting edge. You can't assign your own personal definition of "cutting edge" that ties it directly to the power efficiency of a device and then expect everyone else to agree with you (well, I suppose you can, but they won't).

I think the SR-71 was a perfect example. Who cares that it sacrificed payload and fuel? Was it designed to be a cargo craft? No! It was designed to be the absolutely fastest aircraft ever created. And at the time it was created, it was (and still is, to the best of public knowledge). Furthermore, could you make a 757 go just as fast as an SR-71 by sacrificing all the payload and fuel? No! The 757, as efficient as it may be, lacks the basic materials-science technology and mechanical engine design needed to attain those speeds, no matter how hard you push it.

Yes, the 757 is cutting edge (or was when it was designed), but the point is that they are both cutting edge for the primary function each was designed for (efficiency for the 757 and pure speed for the SR-71). You may value efficiency above any other aspect of a device, but that is your opinion, and it runs counter to the primary design goal of a great many devices designed today. I think, perhaps, you are confusing your opinion with the actual meaning of the term "cutting edge".

On a similar but slightly different note games like F.E.A.R. aren't really that much better because of the graphics. Just because it *can* use those cards doesn't mean you need to. And this gets back to the whole point, why are people impressed with tech demos, er, um, ah, games that require such heavy lifting just to look half decent?

Yet again, this is your opinion. You may have a threshold at which you can't see, or don't care about, a noticeable difference between, say, Quake 3 and F.E.A.R., but most people can (even if they aren't willing to shell out the massive amount of money needed to get one of the high-end video cards needed to play at that quality level). My point wasn't that it "can use those cards" but that it also gets a tangible benefit from those cards in highly competitive play. Your argument smacks of Luddism and, if taken to its logical extreme, would question why we ever stopped using slide rules in favor of calculators (surely the most energy-efficient way to do math problems).

Would you be equally impressed with my sloppy game engine? Oh but it's good I mean it requires a 2.8Ghz AMD64 to run!!! That must mean it's good!!!

If you can't see the difference in visual quality between a poorly designed Quake 3 clone that requires a 2.8 GHz AMD64 to run and a copy of something like F.E.A.R., then perhaps you should consider updating the prescription on your glasses.

I just don't find it very compelling that just because something requires a lot of power to run it must be somehow better than before. I've played far cry in both high and medium settings [my PCI-E 6600 can handle both just fine at 1024x768] and frankly I really didn't notice an improvement that really stood out. I mean the draw distance seemed maybe further but that's about it.

Draw distance alone can be a competitive advantage in FPS games. Other advantages are found with ultra high resolution and FPS when using rail gun/sniper style weapons. Yet again, you obviously don't play these games enough to care but the point is that others do. I won't claim that I'm good enough to notice these things in the heat of combat but I've definitely played against people that do.

Beyond that, there is an unmeasurable factor to a video game that can be referred to as the "prettiness factor". Even for someone like me, who can't necessarily see a competitive advantage in the game, the increase in visual quality is noticeable and enjoyable (when not immediately confronted with a hot-and-heavy deathmatch duel). As you have made clear, you are either physically incapable of seeing the difference or else don't care/play enough to notice, but it is there, and many people can see it.

I'll ignore your last paragraph, because it is nothing but rampant stereotyping (unless, of course, you are claiming to have read the minds of every game player on the planet and to know their true intentions behind buying their hardware) and your personal opinion that nothing cutting edge matters (or is "true engineering") unless it conforms to your personal efficiency ethic. All in all, you are coming off like some crotchety old man complaining about why those "damn whippersnappers" need "those newfangled color televisions and them fancy-schmancy cars with power steering and anti-lock brakes" when "a tube radio and my pappy's Model T suited me just fine". If the gaming technology you have now suits you, then that's fine, but most people still agree that graphics technology hasn't reached the pinnacle of development (which most would agree is real-time photo-realism) and would actually like to see that happen.

-GameMaster

Re:Here come the complaints (1)

tomstdenis (446163) | more than 8 years ago | (#14031117)

Improvements are not always obvious.

Are the new GPUs faster? Yes.

Do they get more detail at same/higher FPS? Probably yes.

Is this the result of some new groundbreaking design? No.

Is this EVEN LESS power efficient than the last series of non-power-efficient cores? Yes.

You're trying to go along the lines of "this is the way it is and that's all there is to it," and I'm trying to say it isn't impressive or worth spending money on. What is their incentive to come up with a new GPU design if people keep buying the power-hungry current designs?

Hint: Intel anyone?

Tom

Re:Here come the complaints (1)

GameMaster (148118) | more than 8 years ago | (#14031333)

The incentive is that at some point silicon turns into a pile of goop when you pump too much power through it. If, and when, we are able to move onto something stronger, like diamond semiconductors, then we will. But, even diamond breaks down at some point. We are hitting the practical limits of how much we can shrink the die size. This leaves us with only three real options for improvement. Those are to change semiconductor material (allowing us to pump more power in), to improve the chip designs, and to make use of multiple processors working in parallel to accomplish the task. Nvidia, ATI, Intel, and AMD are each making use of all of these options (to various extents) to help them compete.

You may not believe this, but nvidia and ATI actually do a great deal of design improvement already. The tight competition between the two companies stops them from resting on their laurels in the way 3DFX did before nvidia hit the scene. They both tend to put out a new product every six months or so. Each new release alternates between an overclock of the previous release and a complete chip redesign. The newest redesign for nvidia was the 7800GTX.

Both of these companies are in a position to avoid some of the legacy garbage that limits Intel's ability to redesign its chips, because they don't have to support a legacy instruction set like x86. They have much more freedom to innovate, and have done so in the last few generations of chip redesigns. Over the last few generations they have added much greater flexibility to the shader engine (the 7800's major contribution was full Shader Model 3.0 support).

And, once again, I'll restate that I don't care how much power the new designs use. Virtually no one interested in the cutting edge of video-game graphics hardware cares about power consumption. Since pumping more power into the chips is one of the primary ways we have left to improve performance, efficiency runs counter to the true goal of video-game graphics hardware, which is graphics quality and/or realism. All that matters in the field of real-time graphics hardware is sheer performance numbers and the quality of the graphics the hardware is capable of producing. Is this solution revolutionary? I would have to say not really, though they have, supposedly, done a radical redesign of the PCB layout as well as the cooling system, so that should count for something.

At some point there is going to be a theoretically most efficient chip design and a theoretically most efficient die size, and the only method we will be left with to improve CPU speed will be materials science and thermodynamics: pumping as much power into each chip as possible without melting it.

My point is that these cards are both impressive and worth spending money on for people who are into video games (and have the excess disposable income to spend on them), because people who are into that sort of thing value image quality (for a number of reasons, as I've mentioned before) and don't care at all about energy efficiency. Perhaps you aren't even close to being the target market for this product, but that doesn't mean the target market doesn't exist or that those people are total fools for buying this stuff. It also doesn't mean this technology isn't "cutting edge", even though it wantonly throws away the isolated aspect of energy efficiency in order to excel at the opposing aspect of raw performance.

-GameMaster

Re:Here come the complaints (1)

HD Webdev (247266) | more than 8 years ago | (#14028369)

It would be like posting on a car website and ranting about why I would spend $250,000 on a Ferrari when all I need is a Honda. Morons. If you have nothing to contribute then shut the hell up.

It's 99% sour grapes that creates those kinds of comments about Ferraris and such.

Those who have thousands of dollars of extra money each month for entertainment aren't the ones complaining. It's the ones up to their ears in debt, or barely living from paycheck to paycheck, who whinge about people (whom they'll never meet) enjoying a better product.

Re:Here come the complaints (1)

Jeff DeMaagd (2015) | more than 8 years ago | (#14027595)

Silly argument, IMO. There are $500 HDTVs now. I think spending $650 on a video card is worse, because the retail value often drops by half in six months and the card would likely need to be replaced every year, while a current HDTV could be in service for ten years.

Re:Here come the complaints (1)

Scudsucker (17617) | more than 8 years ago | (#14031379)

Silly argument, IMO. There are $500 HDTVs now.

Sure, tiny ones. 60-inch HDTVs are a little more than that. A quick look at Best Buy's site reveals a price tag of $3,500 for a 50" plasma or a 62" rear-projection TV.

Re:Here come the complaints (1)

HD Webdev (247266) | more than 8 years ago | (#14028204)

And now for the slew of, "WTF? Who pays that much for a video card!"

Yes, they often miss the fact that if it weren't for people with enough spare cash to pay for those high-priced cards, development would stagnate and the masses wouldn't reap the benefits of buying those same cards a year later at a much lower price. Those who pay high prices subsidise the rapid development of new cards and the resulting lower prices for the previous generation.

Vivo??? (1)

skyshock21 (764958) | more than 8 years ago | (#14026311)

Does anyone else remember when Vivo pr0n movies were the big thing? What are they used for these days?

Re:Vivo??? (2, Informative)

damsa (840364) | more than 8 years ago | (#14030231)

VIVO = video in, video out.

The Vivo you're thinking of was a codec or something like that.

GeForce 6600 DDR2 (4, Interesting)

RzUpAnmsCwrds (262647) | more than 8 years ago | (#14026683)

Slashdot users may be far more interested in the GeForce 6600 DDR2:

http://www.neoseeker.com/Articles/Hardware/Reviews/geforce6600ddr2/ [neoseeker.com]

At $99, it's a lot easier to swallow than the $600 GPUs we're now seeing, and it still offers excellent performance and decent Linux support.

Fanboi to save the day (1)

NVP_Radical_Dreamer (925080) | more than 8 years ago | (#14032380)

Sorry for the fanboi rant, but ATI is going to have to do some SERIOUS work to catch up to nVidia at this point. With Crossfire being pretty much a joke, and their top-of-the-line card still not besting nVidia's now second-string card, I see a good deal of the gaming community losing faith in them quickly. Even as a huge nVidia fan, I see this going one of two ways: either nVidia will crush ATI and become the lone wolf in the graphics industry, bringing innovation to a halt and prices even higher, OR ATI will come back with a vengeance and even the score. I am hoping for the latter, because even though I want nVidia to succeed, it's always bad to have only one real player in a particular market.

Re:Fanboi to save the day (1)

Attaturk (695988) | more than 8 years ago | (#14034787)

My, that is a fanboi rant. The ATi card that this 7800 is being compared with is priced at around $400, vs. nVidia's $650 for a card that marginally beats it in performance. Your words sound a lot like those of the folk who used to proclaim that one day 3dfx would be a "lone wolf in the graphics industry", and we all know how that turned out. Just like AMD and Intel, or any other big component manufacturer, you will always see a performance see-saw between iterations of their products. But when one of them has to charge more than 1.5 times the price of the other, it's simply not a valid comparison.