
Nvidia's DX11 GF100 Graphics Processor Detailed

kdawson posted more than 4 years ago | from the will-do-physics-for-food dept.

Graphics

J. Dzhugashvili writes "While Nvidia has played up the general-purpose computing prowess of its next-gen GPU architecture, it has said little about Fermi's graphics capabilities — to the extent that some accuse Nvidia of turning its back on PC gaming. Not so, says The Tech Report in a detailed architectural overview of the GF100, the first Fermi-based consumer graphics processor. Alongside a wealth of technical information, the article includes enlightening estimates and direct comparisons with AMD's Radeon HD 5870. The GF100 will be up to twice as fast as the GeForce GTX 285, the author reckons, but the gap with the Radeon HD 5870 should be 'a bit more slender.' Still, Nvidia may have the fastest consumer GPU ever on its hands — and far from forsaking games, Fermi has been built as a graphics processor first and foremost."


When's it coming out? (4, Insightful)

Ant P. (974313) | more than 4 years ago | (#30217890)

There's no point bragging about being faster than last month's graphics card if your own is still a quarter of a year from being an actual product.

Re:When's it coming out? (1, Interesting)

roguetrick (1147853) | more than 4 years ago | (#30217906)

I'm more worried about the state of PC gaming. It's been on a long slide recently, and I'm starting to wonder whether this high-end hardware is worth it.

Re:When's it coming out? (2, Informative)

TheRealMindChild (743925) | more than 4 years ago | (#30217940)

No, it isn't. It almost never has been. If you needed a 'high end' graphics card to play a majority of PC games reasonably, they wouldn't be 'high end' anymore... they would be standard.

Re:When's it coming out? (1, Flamebait)

not already in use (972294) | more than 4 years ago | (#30218746)

The real thing that threatens PC gaming is PC gamers themselves. When PC gamers don't get exactly what they want, they "stick it to 'em" by openly admitting they'll pirate the game. Case in point: Modern Warfare 2 and dedicated servers. There is an amazing sense of entitlement for a platform that continues to be less relevant every day, and far and away the biggest pain in the ass.

Re:When's it coming out? (1)

jandrese (485) | more than 4 years ago | (#30219168)

Why would you pirate a game that doesn't have proper network support? Simply not buying it is a lot easier.

Re:When's it coming out? (2, Insightful)

bloodhawk (813939) | more than 4 years ago | (#30219312)

While it definitely isn't worth it now, only 4 or 5 years ago you had to stay close to the cutting edge if you wanted to play games at full resolution as they were released. Now even a mid-range card is adequate for even the most system-taxing games. Graphics cards have outpaced gaming. I just bought a new 5870, but before that I had been sitting on a card two generations old and was still able to play most games at full res; the only real reason for the 5870 is that it's in a new machine that should stand me in good stead for a few years.

Re:When's it coming out? (2, Insightful)

timeOday (582209) | more than 4 years ago | (#30217980)

Which would be a good reason for NVidia to focus on science and media applications rather than games after all.

Re:When's it coming out? (3, Insightful)

Pojut (1027544) | more than 4 years ago | (#30218106)

I'm gonna have to disagree with you there.

Look at the 4850. When it was brand new, it cost $199, and it could run ANY game on the market at full resolution and detail with a smooth, sustained framerate. Flash back to the year 2000. Try to find me a $200 card back then that could do the same. Hell, I challenge you to do the same thing just 5 years ago, back in 2004.

Good luck.

Re:When's it coming out? (2, Insightful)

quercus.aeternam (1174283) | more than 4 years ago | (#30218394)

I'm gonna have to disagree with you there.

Look at the 4850. When it was brand new, it cost $199, and it could run ANY game on the market at full resolution and detail with a smooth, sustained framerate. Flash back to the year 2000. Try to find me a $200 card back then that could do the same. Hell, I challenge you to do the same thing just 5 years ago, back in 2004.

Does this mean that we're hitting a software complexity wall?

It's now the devs' turn to play catch-up... I hope nobody cheats by adding idle loops (looks around menacingly).

Re:When's it coming out? (4, Insightful)

Pojut (1027544) | more than 4 years ago | (#30218462)

As a poster stated earlier in the thread, a big part of it is games that need to work on both consoles and the PC. As an example, considering the 360 has a video card roughly equivalent to a 6600GT, there is only so far they can push ports. Hell, even now, 3-4 years into the current gen, there are STILL framerate problems with a lot of games... games that can now run at an absurdly high FPS on a decent gaming PC.

Re:When's it coming out? (1, Flamebait)

dave562 (969951) | more than 4 years ago | (#30218932)

The problem is the 360. Games look ridiculously good on the PS3. If you compare the new COD on the 360 and the PS3 there is a noticeable difference in graphics quality.

Re:When's it coming out? (2, Insightful)

Pojut (1027544) | more than 4 years ago | (#30219146)

I don't think the problem is the 360, I think the problem is your fanboyism. Multi-platform games look more or less the same between the 360 and the PS3.

Trust me, I know. I have both systems.

Re:When's it coming out? (0, Troll)

dave562 (969951) | more than 4 years ago | (#30219224)

More or less the same is not the same. I have no stake in which system is the better system so your accusation of fanboyism is way off base. I spend less than ten hours a week playing video games between the PS3 and the computer.

Re:When's it coming out? (4, Informative)

Pojut (1027544) | more than 4 years ago | (#30219324)

When the differences are minute to the point where you have to pause a Gametrailers video and lean in close to your monitor, they may as well be the same...you aren't going to see that during actual gameplay, ESPECIALLY not in a frantic shooter like MW2.

That being said, there is one consistent difference between the 360 and the PS3 in terms of image quality: the 360 tends to be a little washed out, and the PS3 tends to be a little dark. Thank goodness for auto-switching color profiles based on the input selected.

Re:When's it coming out? (-1, Troll)

Anonymous Coward | more than 4 years ago | (#30219460)

The problem is the 360. Games look ridiculously good on the PS3. If you compare the new COD on the 360 and the PS3 there is a noticeable difference in graphics quality.

There's a game released for PS3? Holy shit I thought the only thing the PS3 was useful for was completely pointless clusters, running a crapass Linux distro, and pretending to be as elite as an Apple fanboi.

Posting anon because my karma is BAD

Re:When's it coming out? (-1, Offtopic)

quercus.aeternam (1174283) | more than 4 years ago | (#30219676)

Posting anon because my karma is BAD

Though karma doesn't perfectly correlate with post quality, there is some relationship.

If AC's karma is better than yours, that might tell you something. Perhaps you should consider taking a persuasive writing course? Or thinking a little bit before you post - sometimes, at least? Obviously, I can't say for sure as I don't know what your account is.

That said, having some idea of how your idea will be received is an /extremely/ useful skill, as is being able to decide when to post anyway.

Good luck!

Re:When's it coming out? (1)

Rewind (138843) | more than 4 years ago | (#30219710)

I have to agree with Pojut. I have seen both the PS3 and 360 versions on the same 42" 1080p TV. There is very very little difference visually, hardly "noticeable".

Re:When's it coming out? (2, Interesting)

poetmatt (793785) | more than 4 years ago | (#30219140)

Companies don't want games on both console and PC. The reason is that there is a lot less control on the PC. So they shove console requirements onto the PC and you end up with horrible ports like Borderlands and MW2. Thus nobody wants the PC version, and they go "oh, nobody bought the PC version" even though the real reason is that they fucked their own community; that way they don't have to keep making games for the PC.

It's a really shortsighted strategy, but it's basically an attempt at creating a walled garden all over again. Apparently the companies don't realize, or don't have enough foresight about, what's going to happen in the next 1-2 generations of gaming consoles, when they'll easily be powerful enough to be used as home computers (hint: it doesn't mean more consoles are going to sell).

Re:When's it coming out? (0)

Anonymous Coward | more than 4 years ago | (#30218612)

It's not a software wall, it's a console wall. Most games these days are developed for multiple platforms. Most game engines run on the Xbox 360 and PS3 as well as the PC; it's a matter of time, resources, and money. Most developers will design games to look the best they can on the lowest-spec (but best for revenue) platform, the Xbox 360. They then have to work out whether it's worth the extra effort to improve the game for the PS3, for a lower percentage of sales, instead of improving the existing game or using the available resources on other projects. It's even worse for the PC, which vastly exceeds the capabilities of both consoles. Most console games still run in upscaled 720p. It's just not worth the effort to create special or improved content just for the PC.

Re:When's it coming out? (2, Insightful)

Talderas (1212466) | more than 4 years ago | (#30218716)

Which is why Metal Gear Solid 4 still looks better than games coming out now on both the PS3 and the Xbox 360.....

Re:When's it coming out? (1)

Dutch Gun (899105) | more than 4 years ago | (#30219426)

Does this mean that we're hitting a software complexity wall?

From the perspective of a game programmer, I'd posit that it's not as much a software complexity wall as it is a content creation wall. Creating a fully-realized world in a modern videogame is amazingly time consuming. It's now all about how to reduce the dozens of developer-years required to build the environment (world geometry, props, models, effects, etc) and gameplay content (events, missions, etc). One of the problems has been that with each new generation, we not only have to re-build all our content from scratch, we have to do so with much higher fidelity than ever before.

Yeah, the programming challenges are harder, because we're being asked to do more, but in my experience, on average, the ratio of non-programmers to programmers on a typical development team is growing year by year. In my first commercial game, we had three programmers and one artist on the project. On my current team, the artists, writers, designers, audio guys and producers probably outnumber the programmers by around 3 to 1.

Re:When's it coming out? (1)

afidel (530433) | more than 4 years ago | (#30219564)

More like an art wall: the share of development cost that goes to art resources in modern games is far higher than it was back in, say, the '90s. Budgets for AAA titles have ballooned to nearly Hollywood levels.

Re:When's it coming out? (0, Flamebait)

sexconker (1179573) | more than 4 years ago | (#30218498)

Look at the 4850. When it was brand new, it cost $199, and it could run ANY game on the market at full resolution and detail with a smooth, sustained framerate.

Lies. I have a 3840x2880 CRT.

Re:When's it coming out? (1)

Neil Hodges (960909) | more than 4 years ago | (#30218584)

What's the model? It sounds interesting.

Re:When's it coming out? (1)

jo42 (227475) | more than 4 years ago | (#30219166)

He's counting the red, green and blue dots as pixels.
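(Taking the joke literally, the arithmetic does recover a plausible CRT mode: divide each claimed dimension by the three RGB phosphor dots. A throwaway sketch in C, assuming both axes were inflated:)

    #include <stdio.h>

    int main(void) {
        /* The claimed CRT "resolution" */
        int w = 3840, h = 2880;
        /* Counting each red/green/blue phosphor dot as a pixel on both
           axes inflates each dimension by 3; dividing back out gives a
           plausible real mode. (Real subpixel triads only run along one
           axis, but it's a joke.) */
        printf("%dx%d -> %dx%d\n", w, h, w / 3, h / 3); /* 1280x960 */
        return 0;
    }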

Re:When's it coming out? (3, Insightful)

MartinSchou (1360093) | more than 4 years ago | (#30218634)

Look at the 4850. When it was brand new, it cost $199, and it could run ANY game on the market at full resolution and detail with a smooth, sustained framerate

Pull the other one. It has got bells on it.

Define "full resolution".

If I have a very old 1280x1024 monitor, sure.
If I have a new 1920x1200 monitor, not so much.
If I have a dual 2560x1600 monitor setup, not in this life time.

Also, define "full detail". Is that at medium? High? Maximum? What level of anisotropic filtering? Anti aliasing?

But let's have a look at something a bit realistic and look at "any game", in this case Crysis.

From [H]ard|OCP's review of the 4850 from June 25th, 2008 [hardocp.com] :

Highest Playable Resolution:
1600x1200
No AA | 16x AF

Minimum FPS: 16
Maximum FPS: 42
Average FPS: 28.5

Considering that the Radeon 4870 and Geforce GTX 260 have their highest playable at 1920x1200, I'd say you're flat out wrong in your claim.

Now, you may claim that Crysis doesn't count as it's not "ANY game on the market", so let's use Age of Conan instead [hardocp.com] :
Whoops, that one seems to hit its limit at 1600x1200.

That was my rather convoluted way of saying "you're an idiot".

Re:When's it coming out? (3, Informative)

Anonymous Coward | more than 4 years ago | (#30219076)

They have admitted those 2 games were programmed by monkeys.

If you compare a 4850 from then to a 4850 today with the game fully patched and the monkey shit removed, you'd see an increase in frame rates. Or compare it to the sequel, which had even more monkey shit removed, and there would be a further increase in frame rates.

Besides, two games that received crap reviews from everyone except the "oh so pretty" crowd do not represent the market.

You Need A Better Example (1)

Kneo24 (688412) | more than 4 years ago | (#30219212)

If you're going to use any "random" game as an example, why would you purposely choose a game that was not very well optimized? Crysis ran pretty poorly on top-of-the-line hardware at the time too (in comparison to other games that looked just as good and ran better). Age of Conan unfortunately suffered from a lot of the same issues. It wasn't as bad as Crysis, but still pretty bad in that regard.

Re:When's it coming out? (1)

Pojut (1027544) | more than 4 years ago | (#30219290)

I define full resolution as the max resolution of the average big monitor these days... which, unless you have some 27-30 inch monstrosity, caps out at 1920x1200. In your example, they have 16XAA enabled, which makes a MASSIVE difference... which is something I have addressed in my other posts in this thread. That being said, congrats... you're right. I am totally wrong. There actually is a game out there that a then-$200 card couldn't play full bore. Sue me.

By the way, I appreciate you insulting me, considering you don't know me. Nothing makes me feel better about being a part of a community than knowing people like you are a part of it as well.

Re:When's it coming out? (1, Insightful)

Spatial (1235392) | more than 4 years ago | (#30219476)

First you cherry-pick two very rare resolutions, and then you choose two games that are renowned for their exceptionally high system requirements. Pretty intellectually dishonest of you.

Edge cases don't make good refutations of general statements. Besides, he's not totally correct but he isn't far from the truth either. The HD4850 can run most games at fairly high settings, at the highest resolutions most people have available.

(According to the Steam stats, 1920x1200 comprises less than 6% of users' displays, 2560x1600 is in an "other" category of less than 4%. 1280x1024 is the most common, and that or lower comprises 65%)

Re:When's it coming out? (1)

roguetrick (1147853) | more than 4 years ago | (#30218656)

Actually, I don't even mean it from a technical standpoint. I just feel like the influx of console-tailored games, designed to run on local hosts for multiplayer and designed to prevent modification, is really screwing with things. Of course, I have to say my view is skewed in that I'm mainly looking at blockbuster games and not some of the real gems that are PC-centric.

Re:When's it coming out? (0)

Anonymous Coward | more than 4 years ago | (#30219130)

Your claim is pretty weak; the 4850 when it was new was not that good. It couldn't run any game at maximum settings at my native resolution of 2560x1600. It couldn't even run several then-current games at 1920x1200. Hell, dual GTX 285s still can't run Crysis at 1920x1200 max settings smoothly.

To name a few that the 4850 could not run at 1920x1200 or 2560x1600 max at release:
Crysis
UT3
GTA4
Age of Conan
Farcry2

I'm sure there are a few more but those are examples that I ran into.

The 4850 was definitely a great card for the price, but it was not nearly enough to run at max settings, especially when throwing in FSAA and high levels of AF, both of which make a large improvement in the visuals.

I still feel like GPUs are behind the software in performance. I have seen Tri-SLI GTX 285 rigs and dual GTX 295 rigs and they still lag in new games. Some of it, though, is that CPUs are a huge bottleneck. You need at least a 4GHz i7 to reduce the bottleneck to the point where those monster GPU setups are actually worth it.

Check out these charts:
http://www.tomshardware.com/reviews/radeon-hd-5970,2474-8.html

Pay close attention to the 1920x1200 and 2560x1600 results. Also, keep in mind PC gamers are looking for an average of 60 FPS to call it a "smooth" experience. PC game frames are sharp, and you need about twice as many of them compared to video, which uses motion blur to achieve its smoothness at low framerates.
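(The 60 FPS target is easier to see as a per-frame time budget. A quick back-of-the-envelope sketch in C, using common smoothness targets as illustrative figures:)

    #include <stdio.h>

    int main(void) {
        /* Per-frame time budget at common smoothness targets:
           film, TV-ish, and the PC-gaming figure cited above. */
        double fps[] = { 24.0, 30.0, 60.0 };
        for (int i = 0; i < 3; i++)
            printf("%4.0f fps -> %5.2f ms per frame\n",
                   fps[i], 1000.0 / fps[i]);
        return 0;
    }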

Re:When's it coming out? (0)

Anonymous Coward | more than 4 years ago | (#30219386)

Nor at 1680x1050. I have a GTX 275 at the moment and Crysis isn't perfectly smooth at that resolution, although it would be at 1280x1024. Far Cry 2 is, where it wasn't on a 4850. Age of Conan is still not running that great, although it's actually playable with everything maximized in DX10.

Re:When's it coming out? (1)

illaqueate (416118) | more than 4 years ago | (#30219400)

The 5800 series of cards are actually the first I've seen that outpace every game. I bought an 8800 GTX in 2006 when it first came out and there were still games that would chug a bit with AA on, like Neverwinter Nights 2, Rise of Legends, and Supreme Commander.

Re:When's it coming out? (1)

jandrese (485) | more than 4 years ago | (#30219234)

The problem is that most of the new AAA games are console ports now, because consoles are where the money is. But none of the current-generation consoles can hold a candle to even a 2-year-old PC, so pretty much any halfway competent PC can play all modern games that aren't Crysis. You have to work to get a machine that can't support games (like getting one with Intel graphics).

Worse, the games that aren't just console ports are small indie developer efforts with simple graphics that rarely need more than a budget box to play.

About the only way to even begin to stress a modern card is to push your monitor up to resolutions well beyond what most people's monitors can support. If you're not running WQXGA or WQUXGA there is really not much point in getting a 5970.

Re:When's it coming out? (0)

Anonymous Coward | more than 4 years ago | (#30219294)

I agree with you; I think this changed with the 8800GT. The chip it used was the G92, and they are still selling it today. At the time I remember reading articles that pinned the decline of PC gaming on the hardware being too expensive, and I thought the change in how much cash is needed to play might have been down to that. However, it also has much to do with the relative age of the consoles. Most of the games we are playing are made to look good on 4+ year old hardware.

Re:When's it coming out? (2, Interesting)

alen (225700) | more than 4 years ago | (#30218142)

since WoW controls 50% of all pc game revenues, the market as it was a few years ago is over. it's not even fun building a PC anymore since everything is integrated on the motherboard except for a decent graphics card.

i'm personally tired of chasing the latest graphics card every year to play a game. i'll probably buy a PS3 soon and a Mac next year just because its lack of wires makes the wife happy

Re:When's it coming out? (1)

CannonballHead (842625) | more than 4 years ago | (#30218280)

it's not even fun building a PC anymore since everything is integrated on the motherboard except for a decent graphics card.

And the RAM. And sound card if you want to get it off the mobo. And the hdd/optical drive(s)...

Building a PC can be really fun, still. Getting a decent graphics card for cheap is still possible, too, and you don't have to chase the latest graphics card. You don't have to play games on the Ultra High setting, either...

Re:When's it coming out? (1)

Moridineas (213502) | more than 4 years ago | (#30218436)

I think the point is not that building a PC *can* be fun, but rather that it usually isn't anymore. I.e., the time+cost to reward ratio is off!

Building a computer even 10 years ago was a lot different than it is today. Even minute amounts of overclocking could make a huge difference, and small differences in RAM timings were noticeable. Getting a Cyrix, Intel, or AMD CPU gave very different performance profiles. Individual steppings were sought out by overclockers. Certain sound cards could greatly lighten CPU load, etc.

Now, as the GP said, most motherboards have decent sound and network integrated (not fantastic usually, but more than good enough), and the main RAM decision is between 2gb or 4gb (etc). I'm well aware that there are still hardcore overclockers who know which processor stepping is the best, etc, but IMHO (and many others'!), it's just not worth it anymore. The difference between a moderately priced OTS Dell and a high-end souped-up homebuilt computer just really isn't that great anymore. Unless you care about things like stupid glowing tubes in your case, fire decals, etc.

Re:When's it coming out? (0)

Anonymous Coward | more than 4 years ago | (#30219204)

you had me right up until your conclusion that the margin between a custom build and a Dell isn't worth it. there's still a wide gap in the price-performance ratio between commodity units like Dells and a custom build. maybe you (and many others!) should just settle down with your netbooks and flash games, no point in pushing the edge.

true, the overclocking part isn't quite as meaningful as it was 10 years ago, what with the law of diminishing returns taking its toll recently. however, the building part is still fun, and there's still something to seeking out the very best parts for the money. don't kid yourself with beliefs that we've reached any end to the advancement of the hardware or software. the developers are just catering to the consoles for the moment. why put in the extra time when the payoff in $$$ isn't there, right?

Re:When's it coming out? (0)

Anonymous Coward | more than 4 years ago | (#30219572)

As someone who just built a computer, I can say that this is not true at all. The current Intel i7s on the market can be overclocked by 50% without doing anything special (messing with timings, ratios, or anything besides adding an aftermarket fan). It's like those crazy Celeron overclocking days. I've never OCed before--and I've been building my own computers for 12 years now--but it was so easy with these chips that I did this time. Also, I got the performance of a $6000 Alienware system for $2000 including a 32" monitor (I use an HDTV) and HTPC capabilities!

The quality of components these days is so far superior to what it was a few years ago that I think this is the best time ever to build your own system. Also, it's the only way to get exactly what you want if you're really picky like me.

Re:When's it coming out? (1)

mxh83 (1607017) | more than 4 years ago | (#30218496)

Who said we get wiser as we get older? When you spend thousands of dollars buying a computer that will clearly just browse the net and check mail (you don't appear to be a potential user of any Mac-exclusive features), based on your wife's preference for fewer wires, you're just getting stupider.

Re:When's it coming out? (0, Offtopic)

alen (225700) | more than 4 years ago | (#30218602)

if you spec out a Dell comparable to one of the new iMacs, it's the same price or more, considering you can't get the same quality screen on a Dell desktop. MacBook Pros are a rip-off now, but the price was the same as a Dell's when they were last refreshed. the next refresh is coming early next year.

and for whatever reason, Macs hold their value very well. you can buy a new one every year for $300 - $400 out of pocket per year

Re:When's it coming out? (1)

Moridineas (213502) | more than 4 years ago | (#30219088)

Thousands of dollars? The most expensive iMac, with a 27" screen, is $2000. The average model (with 4gb memory, btw) is $1200.

The nice thing about apple "exclusives" (iphoto, iweb, etc) is that they make stuff very accessible to non-power users.

Perhaps there will even be a time in your life when you will be more interested in fewer wires than in a few MHz, or than in a case with a window on the side and glowing wires?

Re:When's it coming out? (1)

dave562 (969951) | more than 4 years ago | (#30218840)

I finally made the switch to the console with COD/MW2. I have a PS3 hooked up to a 37-inch Samsung LCD. My desktop PC is a simple Core2Duo (2.6GHz) with an old GeForce 6800 256MB. I couldn't stomach the cost of upgrading the hardware on the desktop and having to deal with hackers. In all honesty, it's the hackers that really drove me away. It was probably 2/3rds hackers, 1/3rd knowing that I'd get flawless framerate and top-notch graphics on the console. I've been playing LAN/online FPS games since Quake and I've never hacked. I hate people who use aimbots and wallhacks and all that other crap. The only other thing keeping me off the console for FPS games was the superiority of the keyboard/mouse interface. Yet with a little bit of Google fu, I figured out that the Chinese solved that problem. (http://www.xcm.cc/xfps_rateup_adapter_for_ps_3.htm)

Re:When's it coming out? (1)

Dan667 (564390) | more than 4 years ago | (#30219226)

I disagree; it looks like console gaming has run its course, like it did with the Atari 2600. After Microsoft banned all those people, and with all the Xbox and PS3 quality problems, people are beginning to realize consoles are not as "just works" as they thought. I have upgraded my rig once in 4 years and it still runs every game I have bought in that time at a reasonable resolution. That is a lot cheaper than buying every gen of console and all the extra crap with it.

Re:When's it coming out? (2, Insightful)

Anonymous Coward | more than 4 years ago | (#30217920)

Sure there is, because then some people will wait for this new card rather than buying AMD's card, thus providing Nvidia with revenue and profit.

Re:When's it coming out? (3, Interesting)

rrhal (88665) | more than 4 years ago | (#30218056)

By that logic wouldn't those same people then wait for AMD's next offering which will be yet faster? Waiting for the latest and greatest means there will always be something greater in the pipeline to wait for. How long before we saturate the PCI-E bus and need something faster? The current bus structure is about as old as AGP was when it lost favor.

Re:When's it coming out? (0)

Anonymous Coward | more than 4 years ago | (#30218408)

Fanboys [wikipedia.org] will be fanbois [urbandictionary.com]

Re:When's it coming out? (1)

FunkyRider (1128099) | more than 4 years ago | (#30218442)

16x PCI-E, even in its 1.0 flavor, is not too slow. The majority of motherboards offering SLI/CF use dual 8x PCI-E (forget about the 'high-end'), which is still more than adequate to handle all graphics tasks. PCI-E 2.0, released a year or two ago, doubles the bandwidth yet again. So I would say PCI-E is not as old as AGP was when it lost favor.

Re:When's it coming out? (1)

Spatial (1235392) | more than 4 years ago | (#30218572)

How long before we saturate the PCI-E bus and need something faster?

In a way, it has already been replaced. PCIE V2 is the current standard. It's backwards and forwards compatible, and has twice the bandwidth of V1. V3 will double that bandwidth again.

It'll be quite a long time before it becomes obsolete.
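(The doubling per generation can be made concrete with approximate per-lane figures. A sketch in C; the numbers are the commonly quoted ones, rounded, and the Gen3 figure was still prospective at the time:)

    #include <stdio.h>

    int main(void) {
        /* Approximate per-lane, per-direction bandwidth in MB/s.
           Gen1/Gen2 reflect 8b/10b encoding; Gen3 is rounded. */
        double per_lane[] = { 250.0, 500.0, 1000.0 };
        for (int gen = 1; gen <= 3; gen++)
            printf("PCIe %d.0 x16: ~%.0f MB/s per direction\n",
                   gen, 16 * per_lane[gen - 1]);
        return 0;
    }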

Re:When's it coming out? (1)

RAMMS+EIN (578166) | more than 4 years ago | (#30218672)

``By that logic wouldn't those same people then wait for AMD's next offering which will be yet faster?''

Well, some people actually do that. I'm waiting for the budget card that comes out with a fully functional open-source driver available. Until then, my fanless GeForce 6600 is chugging along just fine. I don't even remember what I had before that, but it was something fanless with the open-source r300 driver ... a Radeon 9200 or similar.

But then, I don't buy games unless they run on Linux, either. Which usually means they have to wait until Wine supports them, and, by that time, the hardware I have is usually good enough for them, too.

Re:When's it coming out? (2, Insightful)

DoofusOfDeath (636671) | more than 4 years ago | (#30218906)

There's no point bragging about being faster than last month's graphics card if your own is still a quarter of a year from being an actual product.

You haven't spent much time with Marketing people, have you?

Re:When's it coming out? (1)

poetmatt (793785) | more than 4 years ago | (#30219102)

There's a phrase for it: paper launch [wikipedia.org] or paper tiger [wikipedia.org]. Whether this actually gets released is one thing. I'd like to see benchmarks, not theoreticals.

Re:When's it coming out? (0)

Anonymous Coward | more than 4 years ago | (#30219170)

Maybe I'm wrong, but I'm pretty sure the high-end PC gamer base is shrinking while video game consoles are growing faster, because for the price of one of those graphics cards you can buy a PS3 that in most cases offers better graphics, and you can keep playing it for several years. You don't need to keep spending money whenever a new game simply doesn't run smoothly on your 'old' graphics card, which often means replacing all or most of your hardware. This sort of industry can't survive for long; the model will change. People still play games on computers today, but quite a few less than 10 years ago, and I'm pretty sure in 10 years there will be fewer still.

Feh. (4, Informative)

Pojut (1027544) | more than 4 years ago | (#30218032)

The days of needing the biggest, fastest, most expensive card are pretty much over. You can run just about any game out there at max settings at 1920x1080 silky smooth with a 5870, which goes for less than $300. Hell, even the 4870 is still almost overkill.

Unless you plan on maxing out AA and AF while playing on a 30 inch screen, there is no reason to drop $500-$600 on a video card anymore...

Re:Feh. (4, Interesting)

Knara (9377) | more than 4 years ago | (#30218084)

Almost as if Nvidia were looking at some other market than gamers....

Re:Feh. (1)

Zen-Mind (699854) | more than 4 years ago | (#30218254)

Mmm, you mean like all those people that will want 3D LCD and therefore will require double the current FPS to look as smooth as it does today?

Re:Feh. (5, Insightful)

Kratisto (1080113) | more than 4 years ago | (#30218096)

I think this is largely because consoles set the pace for hardware upgrades. If you want to develop a multi-platform game, then it's going to need to run on XBox 360 hardware from four years ago. I don't even check recommended requirements anymore: I know that if it has a 360 or PS3 port (or the other way around), I can run it.

Re:Feh. (2, Interesting)

Pojut (1027544) | more than 4 years ago | (#30218144)

This is pretty much the case with me. I plan on doing a full system upgrade this Cyber Monday, but I haven't bought any new hardware for my computer other than a new DVD drive in about 2 years...and I STILL haven't needed to turn down visual details in any games that are released.

Re:Feh. (1)

DavidTC (10147) | more than 4 years ago | (#30218374)

Yeah. I paid about $150 for my graphics card, a 9600 GT, I have a nice 1680x1050 monitor I'm not going to upgrade any time soon, and at this point I can't imagine what games would require me to buy a new CPU.

I can run any game whatsoever at full resolution and visual details.

That's always been the joke...if you buy a middling video card, you're buying the same thing that's in a PS3 or whatever the newest console is, because those were created a year ago.

Re:Feh. (1)

Neil Hodges (960909) | more than 4 years ago | (#30218636)

Seriously? I paid $100 for a 9800 GT a while back, and have two 1400x1050 monitors. Your card sounds expensive.

I agree with you though; barring any hardware failures, I won't be upgrading it for a long time either. Heck, I wouldn't have moved up from the old 8800 GS if it weren't for VDPAU requiring a newer card.

Re:Feh. (1)

DavidTC (10147) | more than 4 years ago | (#30219606)

I probably got it before you, I think I've had it a year at this point.

$100 is normally the spot I aim at, but I had some extra cash last time, because the cost of the memory and motherboard suddenly dropped before I bought, so I went about $50 higher than normal.

Re:Feh. (1)

sznupi (719324) | more than 4 years ago | (#30218992)

Not always. Not back when the two kinds of platforms weren't homogenized to such a degree...

Re:Feh. (1)

afidel (530433) | more than 4 years ago | (#30219738)

That's why I'm looking at an HD 5750 with passive cooling when they come down to ~$100: lower power bills and no noise, but it will still play just about anything at 1920x1080.

Re:Feh. (2, Insightful)

sznupi (719324) | more than 4 years ago | (#30218210)

Well, the "problem" is those are not really ports anymore; often practically the same engine.

Which kinda sucks, coming from both worlds, enjoying both kinds of games - now that Microsoft made targeting both platforms from the start of development "sensible", most games are hybrids; not exploiting the strengths of either platform.

Re:Feh. (0)

Anonymous Coward | more than 4 years ago | (#30218216)

I think this is largely because consoles set the pace for hardware upgrades. If you want to develop a multi-platform game, then it's going to need to run on XBox 360 hardware from four years ago. I don't even check recommended requirements anymore: I know that if it has a 360 or PS3 port (or the other way around), I can run it.

Try GTA4 some time ;) It will not run smoothly unless you have a quad core. Seriously.

Re:Feh. (0, Troll)

Spatial (1235392) | more than 4 years ago | (#30218724)

It won't run smooth, period.

The framerate was awful even on the consoles it was designed for. To add insult to injury, the PC port was rushed out in time for Christmas.

Re:Feh. (0)

Anonymous Coward | more than 4 years ago | (#30218578)

And this gentlemen, is what will kill PC gaming, the fact that every game will have only the maximum potential of what a PS3, 360, -insert console system here- can do hardware wise. PC gaming at that point will be meaningless since there will be no incentive to make more hardware intensive games because you will push past what the consoles can do and then you can't touch that demographic anymore. After that, hardware advancement will start to plateau, and we will be back to the point where only corporations/governments/universities that need the fastest hardware will order it. This is how PC gaming dies, and this is why

Re:Feh. (1)

Talderas (1212466) | more than 4 years ago | (#30218796)

Except that PC gaming still has a hold on MMO and RTS games.

Though once again Square Enix is going to try to market an MMO to consoles with FF14.

Re:Feh. (1)

not already in use (972294) | more than 4 years ago | (#30218832)

Yes, for those game developers targeting people willing to spend $1000+ (and that's a conservative number) on a gaming rig, as opposed to $300 on a console. There is a huuuuge market for that.

Re:Feh. (1)

KCWaldo (1555553) | more than 4 years ago | (#30219374)

$1000 is an overblown number for a new-from-scratch upper-level gaming rig, unless you are looking at an i7 965, which again is overkill. $700 will get you an excellent gaming rig. And $300 is BS for a new console; the new consoles started at about $599, which IMO puts the two even, considering the PC can do many other things.

Re:Feh. (1)

Spatial (1235392) | more than 4 years ago | (#30218886)

This is what will kill PC gaming, the fact that every game will have only the maximum potential of what a console can do hardware wise.

In case you hadn't noticed, the consoles are already years old. If that were true, wouldn't you expect it to have happened already?

Yet PC hardware continues to advance, PC games still scale up far beyond the capabilities of their console brethren, and the market hasn't died. [steampowered.com]

Re:Feh. Only GFX matters to you? (1)

sznupi (719324) | more than 4 years ago | (#30219142)

Sure, it might "kill" PC gaming if all that matters for "true PC gamers" is bling...

Though I wonder how that correlates with the fact that most PCs sold have integrated GFX. And that most popular PC games are Solitaire, Minesweeper, Peggle, flash games, etc.

Re:Feh. (1)

werfu (1487909) | more than 4 years ago | (#30218804)

That's why I can still play pretty well with my S939 Opteron 180 machine. I simply upgraded to a 9800GTX+ in the summer of 2008 and my machine is still doing pretty well for its age. But it's maxed out with 3 gigs of RAM and the video card is bottlenecked by the CPU. After 5 years (with upgrades all along) I think this machine has served me very well. I'm looking toward a new build by the end of this year. I'm doing more and more intensive multitasking, and going for a quad core will be a huge boost in productivity.

Re:anymore? There never was a reason for it! (0)

Anonymous Coward | more than 4 years ago | (#30218130)

there is no reason to drop $500-$600 on a video card anymore...

Re:Feh. (1)

DomNF15 (1529309) | more than 4 years ago | (#30218154)

Less than $300 is still a lot for a graphics card. Some higher end CPUs (Intel Core i7 920) go for around that price, and CPUs are much more important than a graphics card in terms of functionality (although GPUs have become more important recently). If you don't have a CPU, your computer doesn't work at all. If you don't have a discrete graphics card, you can still do a great many things aside from playing games/rendering graphics. I want to be able to run just about any game out there at max settings for about $150, not $300.

As is usually the case, if you want to buy bleeding edge, you're going to pay bleeding edge prices, even if there is only a 10-15% improvement over the older stuff.

Re:Feh. (3, Insightful)

Pojut (1027544) | more than 4 years ago | (#30218264)

Mostly agreed, however I will take a low-to-mid range CPU if it means I can afford a top of the line GPU...when it comes to gaming, anyway.

The GPU is a much larger bottleneck in terms of gaming, although the line of importance between the GPU and CPU has been blurring a bit lately.

Re:Feh. (0)

Anonymous Coward | more than 4 years ago | (#30218700)

Then again, you can also just go cheap all-over. I've got a Phenom II 955 (less than $200) and a pair of 4670's (less than $100 combined, including the crossfire connect. Newegg recertified FTW), and I can play pretty much any game I've wanted to at 1680x1050 with pretty much full effects. It'll bench over 15000 in 3DMark06, and P6600 or something in Vantage. Near 4850 scores, for a hell of a lot less. If you poke around for deals, you can still build a screaming machine for not a lot of cash.

Re:Feh. (-1, Redundant)

Anonymous Coward | more than 4 years ago | (#30218338)

I disagree

Well, wait some time. (1)

DrYak (748999) | more than 4 years ago | (#30218248)

You can run just about any game out there at max settings at 1920x1080 silky smooth with a 5870, which goes for less than $300.

Well, that's until Crysis 2 with Stereo 3D + Multi-Head and Windows 8's compositing + DirectX 13 come out.
Then it'll again be a year's wait until the hardware catches up.

Remember the mantra:
What hardware giveth, software taketh...

Also, you assume a discrete GPU.
nVidia and ATI still have some improvements to make before the performance you quote happens on a low-power miniature *embedded* GPU in a laptop (one that doesn't drain the battery flat after 10 minutes).
Thus expect future generations with better performance per watt and performance per dollar.
And once they manage such ratios, why stop at embedded? You can pretty much bet that the same technology will once again be available in a range from embedded GPUs (with the performance you quote above despite a puny footprint) up to as much raw power as you can cram into a $200 GPU - for ludicrous graphics speed.

Therefore it seems pretty logical that nVidia is courting the science market in order to find more buyers interested in those monsters... ...Well, that's until Crysis 3 starts using ray-tracing and crawls at 3fps on those cards.

Re:Well, wait some time. (1)

pwfffff (1517213) | more than 4 years ago | (#30218642)

Honestly, stereo 3D is enough to make me want one of these. I can play L4D2 at max settings at 60 fps, but I turn on the 120Hz stereo and it's instantly halved. I'm coming from a GeForce 9600 and I don't really want a mediocre upgrade, so I'm going to go straight for a DX11 card. Since I use nvidia's shutter glasses and software, I need one of their cards, so the current ATI cards won't do it for me.
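(The halving falls straight out of the frame budget: shutter glasses need a distinct frame per eye, so each eye sees half of what the GPU renders. A trivial sketch in C:)

    #include <stdio.h>

    int main(void) {
        double mono_fps = 60.0; /* what the card manages with one view */
        int views = 2;          /* a distinct frame per eye */
        /* Shutter glasses alternate eyes, so the effective rate per
           eye is half the GPU's raw output. */
        printf("%.0f fps mono -> %.0f fps per eye\n",
               mono_fps, mono_fps / views);
        return 0;
    }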

Re:Well, wait some time. (1)

sznupi (719324) | more than 4 years ago | (#30219040)

Uhmm...aren't current LCD monitors pretty much locked to 60 fps anyway?

Re:Feh. (1)

MeatBag PussRocket (1475317) | more than 4 years ago | (#30218312)

I agree completely, but I think this situation will be a catalyst for the next big step. I think back to when Unreal was released: there was almost no hardware that could run the game smoothly. In a way it was a proof of concept of what gaming could become, but as hardware caught up we saw it give rise to a whole new way to make games. FPS, RTS, RPG, all genres really, have adopted the 3D model, even board games. Now the market is saturated and the pressure is off the hardware vendors to make components that perform.

Now it's time for the software guys to push again. With modern hardware it may soon be possible to produce games that are ray traced. Beyond that, there are other technologies that will require greater processing power, and the GPU is well adapted to handle these loads: things like detailed realtime physics modeling, collision and deformation modeling, and so forth.

I don't think this is the end of the golden age of the GPU, just a small barren land that must be bridged to reach the next level.

Eyefinity (1)

grimJester (890090) | more than 4 years ago | (#30218316)

You can run just about any game out there at max settings at 1920 X 1080 silky smooth with a 5870, which goes for less than $300.

The 5870 still seems to cost more than $400, but your point is of course valid. What might become an issue is multi-monitor gaming like ATI's Eyefinity. Running a triple-screen setup demands a bit more. I don't know if multi-monitor will become mainstream, but it's roughly in the same ballpark price-wise as high-end GPUs.

Re:Eyefinity (1)

Pojut (1027544) | more than 4 years ago | (#30218410)

Augh! Yes indeed, I meant $400. You could also get the 4870 for cheap even when it was new, and for super cheap now that it has some age on it... an extremely capable card that will likely last at least another generation or two of video cards.

http://www.newegg.com/Product/Product.aspx?Item=N82E16814129113 [newegg.com]

$170, awesome stuff

Re:Feh. (1)

jasonwc (939262) | more than 4 years ago | (#30218644)

You probably meant the 5850, which had an initial MSRP of $260 but is now selling at $300-310 due to supply issues. The 5870 is ATI's flagship card and had an MSRP of $380. It's currently selling for $400-420.

Re:Feh. (1)

MobyDisk (75490) | more than 4 years ago | (#30219384)

Wow, I wouldn't spend $300 on a video card. Many people buy whole computers for around that price nowadays.

Fermi-based? (2, Funny)

SnarfQuest (469614) | more than 4 years ago | (#30218244)

I assume they mean the scientist Enrico Fermi. So, did they dig him up, or is this one of those Jesus fingerbone type of thing, where there are more fingerbones than there are chickens? Did they use the whole Fermi, or are there only specific pieces of him that work? Whatever the case, there must be a limited number of cards that can be built, since there is a finite amount of Fermi.

fail (1)

binarylarry (1338699) | more than 4 years ago | (#30218352)

None of the above, you simpleton!

The GF100 is clearly nuclear powered, which would explain the massive heatsink and PCB, which takes up a majority of the standard size PC case and conveniently covers all available SATA and all other types of peripheral interface connectors.

Re:Fermi-based? (1)

megamerican (1073936) | more than 4 years ago | (#30218380)

I assume they mean the scientist Enrico Fermi. So, did they dig him up, or is this one of those Jesus fingerbone type of thing, where there are more fingerbones than there are chickens? Did they use the whole Fermi, or are there only specific pieces of him that work? Whatever the case, there must be a limited number of cards that can be built, since there is a finite amount of Fermi.

They used his skull with the jawbone of an orangutan. [wikipedia.org]

40nm process... (3, Insightful)

Sollord (888521) | more than 4 years ago | (#30218302)

Isn't this going to be built on the same TSMC process as the 5870? The same one that's having yield problems and supply shortages for AMD? And yet the Nvidia chip is even bigger and more complex. I foresee delays.

Alongside a wealth of technical information... (1)

John Hasler (414242) | more than 4 years ago | (#30218326)

Enough to write a Free driver?

What does GF stand for? (0, Troll)

Icegryphon (715550) | more than 4 years ago | (#30218552)

Girl Free, maybe?
Because you probably don't have one if you are wasting your money on it.
Or maybe it is supposed to be a Girl Friend replacement?

Only Question I have (1)

rehtonAesoohC (954490) | more than 4 years ago | (#30218674)

So was it created in the Fermilab?!

Still no "real" benchmarks? (2, Insightful)

ZirbMonkey (999495) | more than 4 years ago | (#30218736)

While the article is very interesting in explaining the chip architecture and technical specifications, I can't believe there isn't a single actual gaming benchmark for these chips yet.

The best they can do is give estimated calculations of what the chips may or may not live up to. They estimate it will be faster at gaming than ATI's already-released 5870.

By the time Nvidia actually releases its Fermi GPUs, ATI's Cypress will have been actively selling for over 3 months. And there's a funny thing about advancement over time: things keep getting faster (aka Moore's Law). Supposing that chips are supposed to double in transistor count every year, the new Fermi chips need to have about 20% more transistors than ATI's 5870 if they release 3 months later... just to keep on the same curve.
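(That ~20% figure checks out under the stated doubling-every-year assumption; 2^(3/12) is about 1.19. A sketch in C:)

    #include <stdio.h>
    #include <math.h>

    int main(void) {
        /* Assume transistor counts double every 12 months. */
        double months = 3.0;
        double factor = pow(2.0, months / 12.0); /* 2^(3/12) ~= 1.19 */
        printf("3 months on the curve: x%.2f (~%.0f%% more transistors)\n",
               factor, (factor - 1.0) * 100.0);
        return 0;
    }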

And there's still no mention of pricing... but that's expected on a product that doesn't actually run games yet. I don't see a lot of optimism on the gaming front, so I hope for Nvidia's sake that the investment into GPGPU is the branch out they need to trump ATI's sales.

With x86 on the Die? (1)

Doc Ruby (173196) | more than 4 years ago | (#30219504)

Does nVidia sell any of these top-end GPU chips with a full x86 core on the same die? A CPU that's maybe as powerful as a single-core P3/2.4GHz, tightly coupled for throughput to the GPU and VRAM, going out to the PC bus (PCI-e) just for the final interface to a video display (or saving to a file, or streaming to a network).

Re:With x86 on the Die? (1)

Devil's BSD (562630) | more than 4 years ago | (#30219696)

x86 is horrible for graphics processing compared to a stream processor. When it comes to performing repetitive calculations on a constant stream of data, a GPU will beat a CPU any day. Yes, even with Hyper-Threading and streaming SIMD.
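(A toy illustration of what "stream" work looks like, as a sketch in C; the function name and shape are made up for illustration. The per-element operation is independent across the array, which is exactly the kind of work a GPU's stream processors parallelize en masse, while a CPU walks it a few elements at a time even with SIMD:)

    #include <stddef.h>

    /* A "stream" workload: the same arithmetic applied to every element,
       with no dependency between iterations. A GPU schedules thousands
       of these element-wise operations concurrently; a CPU loop like
       this processes them a few at a time, even with SSE. */
    void scale_pixels(float *dst, const float *src, size_t n, float gain) {
        for (size_t i = 0; i < n; i++)
            dst[i] = src[i] * gain;
    }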

Re:With x86 on the Die? (1)

Doc Ruby (173196) | more than 4 years ago | (#30219758)

Yes, that is why the GPU is on there. The x86 is there for everything else: OS and application processing, managing devices, etc. A single chip with both CPU and GPU for maximum total performance. An x86 because that's where most of the SW runs.
