
NVIDIA On Their Role in PC Games Development

Zonk posted more than 7 years ago | from the out-on-the-bleeding-edge dept.

PC Games (Games) 92

GamingHobo writes "Bit-Tech has posted an interview with NVIDIA's Roy Taylor, Senior Vice President of Content/Developer Relations, which discusses his team's role in the development of next-gen PC games. He also talks about DirectX 10 performance, Vista drivers and some of the upcoming games he is anticipating the most. From the article: 'Developers wishing to use DX10 have a number of choices to make ... But the biggest is whether to layer over a DX9 title some additional DX10 effects or to decide to design for DX10 from the ground up. Both take work but one is faster to get to market than the other. It's less a question of whether DX10 is working optimally on GeForce 8-series GPUs and more a case of how is DX10 being used. To use it well — and efficiently — requires development time.'"


Just one question (1)

drinkypoo (153816) | more than 7 years ago | (#19653311)

FTFA:

[...]WQUXGA, 3840x2400, or nine million pixels. [...] We asked Roy what size monitors we'd see with this kind of resolution, but he didn't really give any specifics: "I think you can already buy 22" monitors with this resolution, but they're not designed for gaming because the refresh rates are too high. They also cost too much, too." I guess from that, we might see 30" monitors at 3840x2400, or we may see even bigger monitors...

Conjecture aside, what refresh rates are they using now?

I would have assumed that the highest resolutions would be at pretty rational refresh rates...

Re:Just one question (3, Informative)

merreborn (853723) | more than 7 years ago | (#19653433)

but they're not designed for gaming because the refresh rates are too high


http://en.wikipedia.org/wiki/QXGA#WQUXGA [wikipedia.org]

Apparently, the existing monitors at WQUXGA (worst. acronym. ever.) resolution run at 41Hz, max. These days, top of the line game systems will pump out upwards of 100 frames/sec in some cases. A 41Hz refresh rate essentially caps you at 41 FPS, which is enough to turn off any gamer looking at blowing that much on a gaming rig.

Re:Just one question (1, Informative)

Actually, I do RTFA (1058596) | more than 7 years ago | (#19654103)

41 FPS for display purposes. However, physics/AI/etc. are often done "per-frame." A higher FPS will still affect those (more so since any decently threaded game will use fewer resources rendering.) Most display systems cannot handle 100Hz, and most humans cannot tell the difference above 25-30 Hz. It's only games where slow displays lead to slow calculated frames that this will cause a problem. That and arrogant SOB's who claim they can tell the difference without FRAPS.

Plus, at that resolution you are fill-bound anyway.
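
A common way to get the best of both worlds - physics/AI ticking at a high, fixed rate while the display runs at whatever the panel can manage - is a fixed-timestep loop that decouples simulation from rendering. A minimal C++ sketch (the running/update/render callbacks are placeholders, not anything from the article):

    #include <chrono>
    #include <functional>

    // Fixed-timestep loop: simulation (physics/AI) ticks at a constant rate,
    // while rendering happens once per pass and is capped by the display.
    void runLoop(const std::function<bool()>& running,
                 const std::function<void(double)>& updateSimulation,
                 const std::function<void()>& renderFrame)
    {
        using clock = std::chrono::steady_clock;
        const double dt = 1.0 / 60.0;           // simulation step in seconds
        double accumulator = 0.0;
        auto previous = clock::now();

        while (running()) {
            auto now = clock::now();
            accumulator += std::chrono::duration<double>(now - previous).count();
            previous = now;

            while (accumulator >= dt) {         // tick as often as needed
                updateSimulation(dt);
                accumulator -= dt;
            }
            renderFrame();                      // may block on vsync (e.g. a 41Hz panel)
        }
    }

With a loop like this, a 41Hz display only limits how often renderFrame() runs; the simulation rate is whatever dt you choose.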

Re:Just one question (2, Informative)

White Flame (1074973) | more than 7 years ago | (#19655551)

Most display systems cannot handle 100Hz, and most humans cannot tell the difference above 25-30 Hz. It's only games where slow displays lead to slow calculated frames that this will cause a problem. That and arrogant SOB's who claim they can tell the difference without FRAPS.

*sigh* Where do people come up with this garbage? Look at some evidence already instead of making stuff up:

http://mckack.diinoweb.com/files/kimpix-video/ [diinoweb.com]

Re:Just one question (1)

bdjacobson (1094909) | more than 7 years ago | (#19658205)

*sigh* Where do people come up with this garbage?
Digg. No really. You'd be modded down for going against groupthink and providing actual links.

Re:Just one question (1)

Actually, I do RTFA (1058596) | more than 7 years ago | (#19658395)

I've checked various framerates. I cannot tell the difference above around 18-20 Hz (when paying attention; I've played games at 12-15 FPS without noticing anything wrong), but I recognize others can. In movies/animation/etc., 24 (film), 25 (PAL) and 29.97 (NTSC) fps are standard.

Re:Just one question (1)

Endo13 (1000782) | more than 7 years ago | (#19658583)

I don't know, maybe it depends on the type of monitor. But I can easily tell a significant difference between 60hz, 70hz, 75hz, 85hz, and 100hz on any CRT monitor. The lowest refresh rate I can comfortably use on a CRT for extended periods of time is 85hz.

Re:Just one question (1)

AngelofDeath-02 (550129) | more than 7 years ago | (#19660633)

It depends on the game, and the amount of movement.

Star Wars Galaxies? Yeah - that's fine at 15-30... But try playing Burnout at 15 fps and you will notice a difference...

Above 30, I will admit, I don't really notice too much. It's more of a smoothness thing though. If 30 is as low as it goes, I won't complain too much, but if that's as high as it goes, or even the average - it probably dips much lower... and that IS noticeable.

Re:Just one question (3, Funny)

Fiver- (169605) | more than 7 years ago | (#19655145)

WQUXGA (worst. acronym. ever.)

Yeah, but think of the points you could rack up in Scrabble.

Re:Just one question (0)

Anonymous Coward | more than 7 years ago | (#19655701)

Re:Just one question (1)

merreborn (853723) | more than 7 years ago | (#19656635)

The linked monitor is 1920x1200@75hz

That's 1/4 of WQUXGA's 3840x2400

Black bars (1)

dj245 (732906) | more than 7 years ago | (#19662171)

Apparently, the existing monitors at WQUXGA (worst. acronym. ever.) resolution run at 41Hz, max. These days, top of the line game systems will pump out upwards of 100 frames/sec in some cases. A 41Hz refresh rate essentially caps you at 41 FPS, which is enough to turn off any gamer looking at blowing that much on a gaming rig.

What bothers me more is that the screen uses 16:10 aspect ratio. Seems Apple is quite fond of 16:10 for some reason (according to that link). I hate 16:10.

In principle, 16:10 might seem like a compromise between 4:3 and 16:9. However, it means that you always have black bars (or you can have software crop your movie if you really want to) because nobody makes movies in 16:10. You can argue that if you watch video in 4:3 or 2.35:1 you would have black bars anyway on a 16:9 screen, but more and more of the video I have seen seems to be going 16:9. Dr Who, Torchwood, even the latest pr0n, all 16:9.

As someone who had a 16:10 monitor for a while, let me also say that it is less supported natively in video games, leading you to use non-native resolutions or resolution hacks.
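
For a concrete sense of the letterboxing involved: on a 1920x1200 (16:10) panel, 16:9 video scales to 1920 x (1920 x 9/16) = 1920x1080, leaving 1200 - 1080 = 120 unused rows, i.e. a 60-pixel black bar at the top and bottom; 2.35:1 material fills only about 1920x817, leaving bars of roughly 190 pixels each.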

Re:Just one question (0)

Anonymous Coward | more than 7 years ago | (#19664371)

These days, top of the line game systems will pump out upwards of 100 frames/sec in some cases

Not at that resolution. You don't buy a 9MP monitor and play in 1024x768.

Re:Just one question (1)

Machtyn (759119) | more than 7 years ago | (#19654647)

Interesting, but it makes me wonder how they define 'PC gamer'.

That would be people who have purchased an Nvidia card, who happen to be gamers, who happen to have registered their hardware, who happen to have responded to an e-mail from Nvidia requesting they complete a questionnaire.

That's, what... all of 100 people?

(Ok, so I fall into that category... but I was the one who responded as paying for one or two games a year... but I play them every day for a long time.)

Re:Just one question (1)

Puff of Logic (895805) | more than 7 years ago | (#19654789)

(Ok, so I fall into that category... but I was the one who responded as paying for one or two games a year... but I play them every day for a long time.)
I envy you, as I'm the polar opposite. I rarely finish a game (I'm just now getting closer to finishing the original HL2) but can't stop myself from getting excited about--and subsequently buying--the latest "ooh shiny!"

I suspect it probably points to a fairly fundamental personality trait: I enjoy novelty and learning new systems, but get bored very easily with working my way through levels. I also thoroughly enjoy reading game manuals, again because it's new information. Still, at least my foible represents good financial support for the game industry and I have a library of fantastic games just waiting to be finished!

Re:Just one question (1)

dintech (998802) | more than 7 years ago | (#19660545)

This is a simple function of the personal cost of time available vs. perceived reward. I'm betting you no longer have as much spare time as you used to, so games have to be very rewarding to compete with the other things in your life. That's a rare thing after you've seen a lifetime of computer games already. Like a toddler on Christmas Day with a new toy, sometimes the box is more fun. :)

And own up, how many emulated games have you loaded up only to go 'nice, it works' and then move on to the next one...

OpenGL Please (-1, Offtopic)

Anonymous Coward | more than 7 years ago | (#19653315)

FRIST PROST!!!!

Re:OpenGL Please (0, Offtopic)

kerohazel (913211) | more than 7 years ago | (#19653609)

I see someone's still using an old graphics card. How can you expect to post first when your competition is using the latest and greatest pixel-shading, cross-hyper-threading, voxel-throbbing thingamabobs?

Brought to you by KVIDIA GeWhiz graphics.
"Because FIRST POST is the only *real* benchmark."

Heh. (2, Funny)

michaelhood (667393) | more than 7 years ago | (#19653351)

As an early 8800GTX adopter, I'd like to tell NVIDIA where they can shove this $700 paperweight..

Re:Heh. (0, Offtopic)

ardor (673957) | more than 7 years ago | (#19653409)

Then why did you buy it in the first place?
Besides, an 8800 GTX is a very good card. I chose a GTS because of power usage and price. But if you bought a $700 high-end card without actually wanting it, then you are to blame.

Re:Heh. (4, Interesting)

michaelhood (667393) | more than 7 years ago | (#19653583)

You obviously didn't get the idea.. My problem is that the DX10 angle was played up so severely, and that the card's potential would only truly be unlocked in a DX10 environment.

Now NVIDIA is basically advising developers to proceed with caution in DX10 implementations.

Nice.

Re:Heh. (1)

kaleco (801384) | more than 7 years ago | (#19653659)

The DX10 situation was obvious from the beginning. DX10 support would be Windows Vista only, and it would be a while before Vista had the marketshare to justify widespread DX10 game development. Most early adopters found this out in their research, and accepted the 8800 for the benefits it brings to DX9 games, or didn't buy the card at all.

Re:Heh. (1)

LWATCDR (28044) | more than 7 years ago | (#19654045)

Don't blame Nvidia, blame Microsoft.
DX10 is Vista only. You have to look at the market share. There are a lot more XP machines than Vista machines. If you write to DX9, your potential market is, I would guess, about 100 times the size of a Vista-only game's.
Notice that Microsoft's Flight Simulator 10 was written for DX9.
But thanks for buying a bleeding-edge card. In three years, when I pay $200 for my DX10 card, it will probably be faster than your $800 card. Without people like you the rest of us wouldn't get to buy good cards for $200.
There is an old saying.
Pioneers get slaughtered, settlers get rich.

Re:Heh. (1)

GrayCalx (597428) | more than 7 years ago | (#19655133)

Pioneers get slaughtered, settlers get rich.

Ehhh... that's possibly the most idiotic saying I've ever read on /.

Stupid pioneers like Thomas Edison (idiot! I pay $.53 for lightbulbs today), Neil Armstrong (duh, like the moon taught us anything!) or even the British/Spanish explorers (Retards! I was born in America, hahaha, idiots risked their lives to sail boats over here).

But, no no, I'm sure your little saying is applicable somewhere... yeah.

Re:Heh. (1)

LWATCDR (28044) | more than 7 years ago | (#19655753)

The Britsh where settlers not explorers.

Do you use an Altair PC? Fly on planes made by Curtiss-Wright? Use VisiCalc for your spreadsheets?
Armstrong wasn't the pioneer; that would have been Robert Goddard.

Going first always has costs and risks, and more often than not it pays off for the people that go into a land or market second or third.
IBM wasn't the first to produce a computer; they followed Sperry. Apple and IBM were not first with home computers or PCs; they were following Altair and IMSAI.
Buying a $700 DX10 video card before there are any DX10 games means that you will pay a lot for a card with very little return.

Re:Heh. (0)

Anonymous Coward | more than 7 years ago | (#19656427)

"The Brit[i]sh w[-h]ere settlers not explorers."

Care to back that up?

Re:Heh. (1)

LWATCDR (28044) | more than 7 years ago | (#19662603)

The English were settlers.
Who were the first to cross the Atlantic? The Vikings and the Spanish.
Who was the first to cross the Pacific?

The English did do some exploring, but they were not pioneers for the most part. They came and settled the lands that others had "found".
That was their great achievement. Being the first to walk on some hunk of land is nice for getting your name in the history books, but living on it and living well is the real achievement.
The English know about pioneers getting slaughtered all too well. One case that jumps right out at me is the Comet. Look at the history of the pioneering Comet and the "settlers", the 707 and DC-8, that came behind it.
How about Captain Cook vs. the Hudson's Bay Company?
I should have said that the British were very successful settlers. And I should have said pioneers instead of explorers, since they are two different things.

Heh-Telsa. (0)

Anonymous Coward | more than 7 years ago | (#19657941)

"Buying an $700 DX10 Video card before there are any DX10 games means that you will pay a lot for a card that will cost a lot with very little return."

CUDA [nvidia.com]

Re:Heh. (1)

abdulla (523920) | more than 7 years ago | (#19656953)

Don't forget that those features are accessible under OpenGL, which means that they will be available across platforms. DX10 is not your only option.

Re:Heh. (0)

Anonymous Coward | more than 7 years ago | (#19653469)

Have you learned the lesson about early adoption? ;)

Re:Heh. (0)

Anonymous Coward | more than 7 years ago | (#19653533)

Some poor schmuck eventually has to do it...

Re:Heh. (1)

Murrdox (601048) | more than 7 years ago | (#19653575)

Uhhh depends.

Are you running that 8800GTX on Vista or XP?

Because I have to tell you, I'm ALSO an early adopter of the 8800GTX. I run XP (screw Vista) and I couldn't be happier. It was worth every single penny. I haven't had a single problem with it whatsoever.

I run all my games at 1900x1220 resolution at maximum detail levels, and they are all gorgeous. I don't have any performance issues at all.

If you have yourself a $700 paperweight, you've got something else going wrong besides the card itself. Under Vista, I could believe that it might qualify as a paperweight. I have yet to hear a fellow PC gamer enthusiast say something positive about Vista.

Re:Heh. (0)

Anonymous Coward | more than 7 years ago | (#19653625)

I have yet to hear a fellow PC gamer enthusiast say something positive about Vista.

I can say something positive about gaming in Vista... I'll be much better able to deal with problems on relatives' machines after shoe-horning years-old software onto Vista (PunkBuster comes immediately to mind).

Re:Heh. (1)

SparkyFlooner (1090661) | more than 7 years ago | (#19655817)

I'm a fellow PC gamer enthusiast. Me and my 7900GTX are doing just fine with Vista. 1900x1200 C&C3 with max details...runs smooth.

Yay Vista!

(duck)

Re:Heh. (1)

Spikeles (972972) | more than 7 years ago | (#19656303)

I have yet to hear a fellow PC gamer enthusiast say something positive about Vista.
Leaving aside the cost of the O/S, I have nothing against Vista (I use Home Premium) at the moment. I'm using the NVIDIA 158 BETA drivers for my 8800GTS, which are extremely stable with exceptional (compared to the early release drivers) performance. If I had to say anything positive about Vista and gaming, it's that loading times for games and game levels almost halved after I moved from XP to Vista. Your mileage may vary though.

Re:Heh. (1)

sumdumass (711423) | more than 7 years ago | (#19659589)

Are the load times improved because of an increase in memory, or have you run both XP and Vista on the same hardware? I have a brother who swears up and down that XP is ten times faster than Windows 98 since I installed it on his family's computer. I keep telling him it is a combination of going from an older 10 gig hard drive with 128 megs of RAM to a 7200 rpm Western Digital (8 meg cache) 120 gig drive and 1 gig of RAM. He won't consider the hardware upgrade as a possibility for the increase, because the case is the same: the computer is the same, except it has XP now. I'm willing to bet I could change the case out and quite a few people would think they have a new computer.

Re:Heh. (1)

Spikeles (972972) | more than 7 years ago | (#19659793)

It certainly wasn't an increase in memory; I have used 2GB for a long time now. I did install Vista itself to a new SATA drive, but the games still ran from the same IDE drive. Certainly the swap file itself would have been faster, and I won't discount the hardware as having an effect; it could also be because a new install of an operating system (Windows/Linux/OSX) is usually faster due to less clutter. Unfortunately I didn't do extensive benchmarks to determine the root cause of the improvement. I think it's probably a combination of things, including the new drive and Vista's (well, I'm assuming) more efficient drive/memory caching practices.

Actually, now that I think about it, I did have a very negative experience with the ReadyBoost functionality. It's supposed to use your USB drive as extra memory. Whenever it's turned on, iTunes will play all videos with a huge amount of stutter until you turn it off again. It's been widely reported http://vista.blorge.com/2007/04/16/vista-itunes-and-readyboost-do-not-play-well/ [blorge.com] but I have 2GB so I don't really need it anyway.

Re:Heh. (5, Funny)

kaleco (801384) | more than 7 years ago | (#19653607)

Oh, it's not a paperweight, you're using it wrong. If you install it in your PC, it will improve your graphics.

Re:Heh. (1)

Ruprecht the Monkeyb (680597) | more than 7 years ago | (#19654087)

As an early 8800 adopter (January 15th), I've been pretty happy. To be honest, though, I got the board as a Christmas gift. The Vista drivers were a little rough, but I've been around enough to know that brand new hardware + brand new OS is going to cause trouble. Suck it up and deal.

For the first month or so, I dual-booted XP, but since the middle of March I've been running Vista only, and played Vanguard, Company of Heroes, LotRO, Civ4 and a bunch of other stuff with almost no issues. Except for Vanguard, and, well, big shock there. Even then after the March driver rev, I could play that just fine, too. The other day I had to reboot for the latest round of patches, and it had been almost two months since the PC had been shut down.

Re:Heh. (2, Funny)

feepness (543479) | more than 7 years ago | (#19654413)

As an early 8800 adopter (January 15th), I've been pretty happy. To be honest, though, I got the board as a Christmas gift.

You're doing it wrong.

Re:Heh. (1)

Creepy (93888) | more than 7 years ago | (#19661773)

As humorous as that is, there is such a thing as pre-order, so it is possible.

The day I get a $700 Christmas gift (much less a pre-order) is the day my wife wins the lottery. If I get a pre-ordered graphics card from her, I know the aliens have truly infiltrated Earth and replaced my wife with a brain-eating monster. She thinks I play games too much as it is - about 10 hours a week - certainly not the 10 hours in a day I sometimes did in college (I was a binge gamer ;)

Re:Heh. (0)

Anonymous Coward | more than 7 years ago | (#19654259)

That's the price you pay for being an early adopter. You paid $700 for bragging rights and the right to whine on Slashdot. If you're clever, you might even be able to score some advertising money by posting benchmarks on your site or blog.

Sorry, but I don't think the $700 included the right to sodomize a company with a printed circuit board. I'm sure that would be illegal in many states.

Re:Heh. (2, Interesting)

illumin8 (148082) | more than 7 years ago | (#19654933)

As an early 8800GTX adopter, I'd like to tell NVIDIA where they can shove this $700 paperweight..
I too have an 8800GTX and it's been nothing but a great card for me. All of my games play very fast on it, and it's much quieter than my previous 7800GTX. I'm not using Vista yet (sticking with XP SP2), so maybe that's why you don't like yours. I have to say it is the best graphics card I've ever had.

Re:Heh. (1)

fbjon (692006) | more than 7 years ago | (#19661359)

How much heat are these things putting out these days? I'm considering something between a 7600 and a 7900, probably with only a passive heatpipe. Are you saying the 8800 actually has less heat dissipation?

Re:Heh. (1)

illumin8 (148082) | more than 7 years ago | (#19663799)

How much heat are these things putting out these days? I'm considering something between a 7600 and a 7900, probably with only a passive heatpipe. Are you saying the 8800 actually has less heat dissipation?
No, I'm not saying these things are putting out any less heat than the previous model. In fact, based on the power draw requirements (it takes 2 PCI-Express power connectors instead of 1 like most cards), I would guess this thing generates a lot more heat.

What is better about the 8800GTX compared to my previous 7800GTX is the cooling solution. The fan is much improved and is so quiet now that it's not as loud as my case fans. With the 7800GTX whenever I was playing a game the fan on the graphics card would start whining like a banshee. I'm pretty sensitive to fan noise (drives me nuts) so this was a nice improvement.

If you're looking for something with a truly passive cooling solution, I don't think any of the "GTX" cards are what you're looking for. These are high-performance and require a lot of cooling. I heard the 7300GT is a completely passively cooled solution which is suitable for home media center-type applications (not gaming).

Re:Heh. (1)

default luser (529332) | more than 7 years ago | (#19666993)

No, it uses more power (about 2x more), but the 8800 series has a wonderfully engineered heatsink that is better than anything previously offered as standard.

Of course, you'll only find it on the high-end cards, because those are the only cards where they can actually afford a quality cooler. Stock midrange cards use the cheapest coolers manufacturers can find, and you have to pay extra for a good cooler (or passive cooling solution).

Snippets from the article (2, Insightful)

anss123 (985305) | more than 7 years ago | (#19653513)

"As the only manufacturer with DirectX 10 hardware, we had more work to do than any other hardware manufacturer because there were two drivers to develop (one for DX9 and one for DX10). In addition to that, we couldn't just stop developing XP drivers too, meaning that there were three development cycles in flight at the same time."

Didn't ATI kick out some DX10 hardware the other day? I'm sure the ATI HD 2900 series is DX10.

"Our research shows that PC gamers buy five or more games per year, and they're always looking for good games with great content.

Interesting, but it makes me wonder how they define 'PC gamer'.

"Tony and David are right, there are API reductions, massive AA is 'almost free' with DX10. This is why we are able to offer CSAA [up to 16xAA] with new DX10 titles - the same thing with DX9 just isn't practical. Also interesting, but I'm skeptical. Turning on AA is just one API call, how does AA affect overhead?

"So yes we will see big performance jumps in DX10 and Vista as we improve drivers but to keep looking at that area is to really miss the point about DX10. It's not about - and it was never about - running older games at faster frame rates. Wait, rewind. Are he saying my DX7/8/9 games will run faster once Nivida gets their DX10 drivers together? Or is he saying games with DX9 level of graphics will run faster if ported to DX10?

"Five years from now, we want to be able to walk into a forest, set it on fire and for it to then rain (using a decent depth of field effect) and to then show the steam coming off the ashes when the fire is being put out."

No, I can do that in real life. A pyromaniacs-vs-firefighters burn fest, OTOH...

Re:Snippets from the article (1)

TheRaven64 (641858) | more than 7 years ago | (#19654699)

Interesting, but it makes me wonder how they define 'PC gamer'.
Someone who plays games on PCs, and buys at least five new titles a year, obviously...

Re:Snippets from the article (1)

sssssss27 (1117705) | more than 7 years ago | (#19656981)

Also interesting, but I'm skeptical. Turning on AA is just one API call, how does AA affect overhead?

I'm wondering if this has more to do with an architectural change than just a software modification. Maybe DirectX 10 specifications just require the board to have a daughter die similar to what the graphics processor in the 360 has.

Re:Snippets from the article (1)

fractoid (1076465) | more than 7 years ago | (#19658385)

I'm wondering if this has more to do with an architectural change than just a software modification. Maybe DirectX 10 specifications just require the board to have a daughter die similar to what the graphics processor in the 360 has.
Well, according to nVidia [nvidia.com] :

The method of implementing CSAA in DX9 differs from DX10. This is due to a limitation in the DX9 runtime, which prohibits the driver from exposing multisample quality values greater than 7. For this reason, instead of specifying the number of coverage samples with the quality value, we simply set quality to a predetermined value which will be interpreted as a specific CSAA mode by the driver.
So there. It looks like it's just as possible under DX9 but you can't give your devs the warm fuzzy glow of going "set supersampling to 11!"
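
For anyone curious what that "predetermined quality value" trick looks like from the application side, here is a rough D3D9 sketch. Which quality value maps to which CSAA mode is driver-defined, so the value below is purely illustrative, not an official constant:

    #include <d3d9.h>

    // Request a CSAA-capable multisample mode in D3D9 by overloading the
    // "quality" field, as described in the NVIDIA doc quoted above.
    bool requestCsaa(IDirect3D9* d3d, D3DPRESENT_PARAMETERS& pp)
    {
        DWORD qualityLevels = 0;
        if (FAILED(d3d->CheckDeviceMultiSampleType(
                D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, D3DFMT_X8R8G8B8,
                TRUE /* windowed */, D3DMULTISAMPLE_4_SAMPLES, &qualityLevels)))
            return false;

        pp.MultiSampleType = D3DMULTISAMPLE_4_SAMPLES;
        // The DX9 runtime won't expose quality values above 7, so the driver
        // treats these small values as selectors for its CSAA modes rather
        // than as a coverage-sample count. The "2" here is hypothetical.
        pp.MultiSampleQuality = (qualityLevels > 2) ? 2 : 0;
        return true;
    }

The point of the quoted passage is just that DX9 can reach the same hardware modes; only the way they are requested is clumsier.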

Re:Snippets from the article (0)

Anonymous Coward | more than 7 years ago | (#19660639)

"As the only manufacturer with DirectX 10 hardware, we had more work to do than any other hardware manufacturer because there were two drivers to develop (one for DX9 and one for DX10). In addition to that, we couldn't just stop developing XP drivers too, meaning that there were three development cycles in flight at the same time."

In other words, he's saying that their management can't handle anything more than one driver development track without everything falling to pieces. That's pretty piss-weak for a company with a market cap of nearly $15B. Sell your stock now!

Customers are funny... (2, Funny)

Alzheimers (467217) | more than 7 years ago | (#19653545)

"Given how many copies of Vista are in use, a surprisingly small number of people came back to say they were not happy with our Vista drivers when we launched Vista Quality Assurance. Within a month the number of reported problems had been halved."

Customers are funny: if you ignore them long enough, eventually they go away.

Resolution (3, Insightful)

SpeedyGonz (771424) | more than 7 years ago | (#19653639)

I don't want this to sound like the famous "640k should be enough for everyone", but...

WQUXGA, 3840x2400, or nine million pixels.

Sounds like overkill to me. I mean, I'm used to playing my games @ 1280x1024 and I feel this resolution, maybe combined with a wee bit of AA, does the trick.

I'd rather see all that horsepower invested in more frames/sec or cool effects. I know, it's cool to have the capability, but it makes me wonder about what another user posted here regarding the 8800 being a $700 paperweight 'cause of early adoption. You'll have a card capable of a gazillion pixels on a single frame, yet no monitor capable of showing it fully, and when the monitor finally comes out or achieves a good price/value relationship, your card is already obsolete. Null selling point there for moi.

Just my "par de" cents.

Re:Resolution (4, Insightful)

CastrTroy (595695) | more than 7 years ago | (#19653807)

3DFX thought the same of 32 bit graphics. They were still making 16-bit cards when everyone else was doing 32-bit. In reality they got killer performance from doing 16-bit, blowing every other card out of the water in 16-bit performance. Most of the cards that had 32-bit couldn't even run most of the stuff in 32-bit because it ran too slow. 3DFX didn't care that it didn't do 32-bit, because 32-bit was too slow and didn't actually improve the game that much. Now 3DFX is gone. The problem is that a lot of gamers don't want to get the card that only supports 16-bit graphics, or in this case only supports 1900x1280 resolution, because they feel that they aren't getting as good a product, even if they can't tell the difference.

Re:Resolution (1, Interesting)

Anonymous Coward | more than 7 years ago | (#19653955)

There is a huge difference between 16-bit and 32-bit graphics. Rendering textures meant for 32-bit in 16-bit makes the results look like a badly encoded DivX/Xvid video. I'm glad 3DFX died, because if they were still around we wouldn't have made such great progress as we have been. Could you imagine still using their half-assed OpenGL-like graphics API GLIDE today? I sure as hell couldn't.

Re:Resolution (2, Insightful)

TheRaven64 (641858) | more than 7 years ago | (#19654969)

I think you're overplaying the importance of 32-bit colour. I didn't start turning it on until well after 3dfx was dead. The thing that really killed them was the GeForce. They used to own the top end of the gamer market, and kept pushing in this direction. The only difference between their cheap and expensive lines was the number of graphics processors on them, and none of them did transform and lighting. At the time, this meant that a lot of their power (and they used a lot, and generated a lot of noise and heat) was wasted because games were CPU-bound, with the slow CPU (I had a 350MHz K6-2 at the time) handling the geometry set-up. You could, I think, get better performance with a high-end VooDoo card and a beefy CPU, but it cost a huge amount more than a GeForce and a slow CPU, without much benefit.

Missing the boat on transform and lighting was a major problem, but they also made some serious tactical mistakes, like starting to manufacture boards themselves and alienating their OEM partners.

Re:Resolution (1)

CastrTroy (595695) | more than 7 years ago | (#19655189)

I'm saying that although 32 bit colour wasn't all that important, I know a lot of people who thought that 3DFX had terrible cards simply because they didn't support 32 bit. Never mind that it was too slow to even use the feature most of the time; people liked knowing that their card supported 32 bit color, even if they could never use it. It seems to be the same thing here. They're supporting high resolutions just to say they support them, when in reality nobody is using these high resolutions, because they actually don't look as good as the lower resolutions due to reduced refresh rates.

Re:Resolution (3, Insightful)

drinkypoo (153816) | more than 7 years ago | (#19655521)

I'm saying that although 32 bit colour wasn't all that important, I know a lot of people who thought that 3DFX had terrible cards simply because they didn't support 32 bit.

Well, speaking as someone who was living in Austin amongst a bunch of gaming technerds, no one I knew gave one tenth of one shit about 32 bit graphics. In fact, while 3dfx was on top, you could instead get a Permedia-based card which would do 32 bit, and which had far better OpenGL support (as in, it supported more than you needed for Quake) and which was just a hair slower :) I was the only one who had one amongst my friends, and I only got it because I was tired of the problems inherent to the stupid passthrough design.

No, what made the difference was the Hardware T&L of the geforce line. That was THE reason that I and all my friends went with one, and THE reason that nVidia is here today, and 3dfx isn't.

No one has yet adequately explained what the hell ATI is still doing here, but it must have something to do with having been the de facto standard for mobile and onboard video since time immemorial (until Intel decided to get a piece of these markets.) Practically every laptop I've owned with 3D acceleration has, sadly, had an ATI chip inside. And usually they do not behave well, to say the least...

Actually, it's simpler (3, Informative)

Moraelin (679338) | more than 7 years ago | (#19660629)

Actually, you know, it's sorta funny to hear people ranting and raving about how 32 bit killed 3dfx or lack of T&L killed 3dfx, without having even the faintest clue what actually happened to 3dfx.

In a nutshell:

1. 3dfx at one point decided to buy a graphics card manufacturer, just so, you know, they'd make more money by also manufacturing their own cards.

2. They missed a cycle, because whatever software they were using to design their chips had a brain-fart and produced a non-functional chip design. So they spent 6 months rearranging the Voodoo 5 by hand.

The Voodoo 5 wasn't supposed to go head to head with the GeForce 2. It was supposed to, at most, go head to head with the GF256 SDR, not even the DDR flavour. And it would have done well enough there, especially since at the time there was pretty much no software that did T&L anyway.

But a 6 month delay was fatal. For all that time they had nothing better than a Voodoo 3 to compete with the GF256, and, frankly, it was outdated at that time. With or without 32 bit, it was a card that was the same generation as the TNT, so it just couldn't keep up. Worse yet, by the time the Voodoo 5 finally came out, it had to go head to head with the GF2, and it sucked there. It wasn't just the lack of T&L, it could barely keep up in terms of fill rate and lacked some features too. E.g., it couldn't even do trilinear and FSAA at the same time.

Worse yet, see problem #1 I mentioned. The dip in sales meant they suddenly had a shitload of factory space that just sat idle and cost them money. And they just had no plan what to do with that capacity. They had no other cards they could manufacture there. (The tv tuner they tried to make, came too late and sold too little to save them.) Basically while poor sales alone would have just meant less money, this one actually bled them money hand over fist. And that was maybe the most important factor that sunk them.

Add to that such mishaps as:

3. The Voodoo 5 screenshot fuck-up. While the final image did look nice and did have 22 bit precision at 16 bit speeds, each of the 4 samples that went into it was a dithered 16 bit mess. There was no final combined image as such, there were 4 component images and the screen refresh circuitry combined them on the fly. And taking a screenshot in any game would get you the first of the 4 component images, so it looked a lot worse than what you'd see on the screen.

Now it probably was a lot less important than #1 and #2 for sinking 3dfx, but it was a piece of bad press they could have done without. While the big review sites did soon figure out "wtf, there's something wrong with these screenshots", the fucked up images were already in the wild. And people who had never seen the original image were using them all over the place as final "proof" that 3dfx sucks and that 22 bit accuracy is a myth.

Re:Actually, it's simpler (1)

drinkypoo (153816) | more than 7 years ago | (#19664153)

Actually, you know, it's sorta funny to hear people ranting and raving about how 32 bit killed 3dfx or lack of T&L killed 3dfx, without having even the faintest clue what actually happened to 3dfx.

I remember all that you speak of.

1. 3dfx at one point decided to buy a graphics card manufacturer, just so, you know, they'd make more money by also manufacturing their own cards.

It wasn't a horrible idea, but going exclusive was.

2. They missed a cycle, because whatever software they were using to design their chips had a brain-fart and produced a non-functional chip design. So they spent 6 months rearranging the Voodoo 5 by hand.

They should have been doing hardware T&L before then.

at the time there was pretty much no software that did T&L anyway.

Except, you know, anything using Direct3D.

Everything after the voodoo 2 was a sad joke. That's the real reason 3dfx is dead. If they had gotten the HW T&L clue earlier, they might have been able to compete. But they didn't get spun up in time and, in conjunction with their other mistakes, they died.

I honestly think that if they had got on the HW T&L bandwagon in a timely fashion that they would still be here.

Before any of that shit went down I worked for a company that was designing two out of four chips in Microsoft's Talisman graphics project which ultimately went nowhere, predominantly because it was a combination of like umpteen-jillion companies. That company, Silicon Engineering (nee Sequoia Semiconductor) later was purchased by Creative Labs and became "Creative Silicon". This was the company I worked for before I even moved to Texas, which happened before the geforce came out (which is the only reason I mention it.)

3dfx was just behind the curve. They deserve some credit for doing some good things. They deserve our endless ire for ever creating GLIDE, and not just using OpenGL from the beginning, which would probably have avoided the whole Direct3D debacle - it would likely never have existed at all.

Re:Resolution (1)

blahplusplus (757119) | more than 7 years ago | (#19657079)

"The problem is, is that a lot of gamers don't want to get the card that only supports 16bit graphics, or in this case only supports 1900x1280 resolution. Because they feel that they aren't getting as good of a product, even if they can't tell the difference."

Whoa whoa whoa... you should not be comparing 16-bit vs. 32-bit colour to high resolutions. You could easily see the quantization errors with transparency effects like smoke or skies on 3dfx cards; you could easily tell the difference between 16-bit and 32-bit color, and the banding was very apparent in multitextured games. I know - I was the one who stuck with the Voodoo3 while everyone moved over to the Nvidia TNT and GeForce series cards. My first Nvidia card was a GeForce 2.

Many (if not most) people are satisfied with 1024x768 or 1280x1024 w/ AA today. Even Doom 3 looked fantastic at 800x600 when it first came out.

Even with a fast card I rarely run higher than 1280x1024; at some point resolution really ceased to matter to me. Anti-aliasing, decent speed, and getting rid of the blending artifacts that occur with low precision are more important, since the actual art hasn't caught up yet.

Go play Xenosaga Episode 3 on the PS2 (a game released last fall) and compare it to any modern PC game. If that game proves anything, it proves that artists and art direction are much more important than simply having high resolution. High resolution doesn't matter much if your art is not that great, or your game isn't either.

Re:Resolution (1)

ultranova (717540) | more than 7 years ago | (#19661157)

Go play Xenosaga Episode 3 on the PS2 (a game released last fall) and compare it to any modern PC game. If that game proves anything, it proves that artists and art direction are much more important than simply having high resolution. High resolution doesn't matter much if your art is not that great, or your game isn't either.

"Our cards are designed for playing bad and ugly games!" is not a good sales pitch...

Re:Resolution (1)

A_Non_Moose (413034) | more than 7 years ago | (#19659211)

3DFX thought the same of 32 bit graphics.


Along the same lines as what got ATI in the running in the graphics market, pre-9500. (They were "in" the market, yeah, but they did not matter, IMO, until the 9500.)

All of the graphics being put on screen were being drawn, even if you could not see them - say, a house with a fence in the back yard, grass, a lawn chair and other stuff that you can't see because you are standing in front of it.

At the time that was a lot of CPU/GPU power being wasted, until the question "why" was asked by ATI. Something as simple as occlusion in relation to POV. We take it for granted now, but somewhere between the 9500/9700 these things put their NV counterparts to shame, and people noticed.

I liked my TNT2 and GF2, and even liked the PCI GF4 (dual-proc Intel BX2, PCI only, game box... don't laugh, it held up to Max Payne 2 until multiple sounds), but eventually heat, power and money became the issues to deal with.

Now, after owning 9500/9800 and current X800 Pro cards, heat is meh, power a bit (the 9800 to X800 - along with 9 HDs - made the 350W cry) and money... well, the X800 hurt a little more money-wise, buying shortly after release.

Not changing with the times is what made 3dfx (moment of silence) hurt; what killed them was the fab change (in-house/outsource, I forget).

Now the issue is size, and the X800 ain't tiny, but it fit in my case; the 1900 I tried would cut into data cables even if I moved drives, or I'd break/degrade the RAID. Not... gonna... happen.

That said, I still wonder why the graphics market feels like the MHz race?

But, hey, nobody notices from all the shiny objects on the screen... OOooooh.

22 bits (1)

Rastignac (1014569) | more than 7 years ago | (#19661037)

A 3dfx Voodoo3 card was able to compute internally at 22 bits precision. Then the final result was downscaled to 16 bits on the fly. So the card was fast (because 16 bits is faster than 32 bits) and the picture was nice (because 22 bits precision is prettier than 16 bits, and not so ugly compared to 32 bits). That was the trick, and it was fine at that time.

Re:Resolution (1, Interesting)

Anonymous Coward | more than 7 years ago | (#19654139)

There are already games that can't be run at a good (60+) framerate with maximum settings at 1920x1200 by a single GeForce 8800GTX. The person who referred to that card (incorrectly, in my opinion) as a $700 paperweight was likely referring to the problems with its Vista drivers.

Re:Resolution (2, Funny)

feepness (543479) | more than 7 years ago | (#19654315)

I don't want this to sound like the famous "640k should be enough for everyone", but...

WQUXGA, 3840x2400, or nine million pixels.


How about five letter acronyms being enough for anyone?

Re:Resolution (2, Interesting)

llZENll (545605) | more than 7 years ago | (#19654425)

On current displays, yes, it's overkill, but on displays in 10 years or less it will be the standard; it takes a lot of pixels to cover your entire field of view. Some may argue we don't need this much resolution, but until we are approaching real-life resolution and color depth, we will need more.

Display of the future approaching the human eye's capabilities:

60"-80" diameter hemisphere, it will probably be oval shaped, since our field of vision is.
2 GIGApixels (equal to about a 45000 x 45000 pixel image, 1000x the resolution of 1080 HD).
48 bit color (16 bits per channel).
12GB framebuffer size
@60fps = 720GB/s bandwidth

it's only a matter of time...

based on information at
http://www.clarkvision.com/imagedetail/eye-resolution.html [clarkvision.com]
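
Sanity-checking those numbers with plain arithmetic (taking the post's figures at face value): 2 x 10^9 pixels x 48 bits = 2 x 10^9 x 6 bytes = 12 GB per frame, and 12 GB x 60 frames/s = 720 GB/s, so the framebuffer and bandwidth figures are internally consistent.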

Re:Resolution (3, Informative)

drinkypoo (153816) | more than 7 years ago | (#19655641)

Display of the future approaching the human eyes capabilities.

You say this like it means something. It does not. Here's why.

The real world is based on objects of infinite resolution. Our vision is limited by two things; the quality of the lens and other things in front of the retina, and our brain's ability to assemble the data that comes in, in some fashion useful to us that we perceive visually.

A lot of people make the mistake of believing that the finest detail that we can resolve is in some way limited to the sizes or quantities of features on the retina. This is a bunch of bullshit. Here's why: saccades [wikipedia.org]. Your brain will use your eye muscles without your knowledge or consent to move your eye around very rapidly in order to make up for deficiencies in the eye surface and to otherwise gather additional visual data.

Have you ever seen a demo of the high-res cellphone scanning technique? There's software (or so I hear, I saw a video once and that's all) that will let you wave your cameraphone back and forth over a document. It takes multiple images, correlates and interpolates, and spits out a higher-resolution image. (No, I don't know why we haven't seen this technology become widespread, but I suspect it has something to do with processor time and battery life.) Your eye does precisely the same thing! This leads us to the other reason that your statement is disconnected from reality; what you think you are seeing is not, repeat not a perfect image of what is before you. Your eyes are actually not sufficiently advanced to provide you so much detail if that is what it was!

No, what you think you are seeing is actually an internal representation of what is around you, built out of visual data (from the optic nerve, which performs substantial preprocessing of the retinal information) and from memories. Your brain fills in that part of the "image" for which it does not have good information from your own mind. This is why you so commonly think that something looks like something else at first glance - your brain made an error. It does the best it can, but it only has so much time to pick something and stuff it in the hole.

Stop trying to equate vision to a certain number of pixels. It's different for everyone, and it's only partially based on your hardware. Your brain does vastly more processing than you are apparently aware. Some people get to see things that aren't there all the time! (Or maybe it's the rest of us whose visual system has problems? Take that thought to bed with you tonight.)

Re:Resolution (0)

Anonymous Coward | more than 7 years ago | (#19656407)

Ah!

So what you're really saying is not that we should try to make our computer graphics better, but that we should ruin the eyes/brains of our children, so that for the overall "experience" the quality will be the same.

Ooh, now it dawns on me: we're already doing this with TV.

Excellent.

Re:Resolution (1)

llZENll (545605) | more than 7 years ago | (#19659325)

"A lot of people make the mistake of believing that the finest detail that we can resolve is in some way limited to the sizes or quantities of features on the retina."

Of course it is. If it's not limited by the cones or rods, it's limited by the scanning rate, optic nerve rate, or the rate at which saccades happen. IT IS LIMITED; it's just that we don't know the current limits technically. Even if we don't know how to calculate them through biological measurements, they are very easy to measure through subjective means.

There is a point where the resolution of a display can be increased beyond what the eye can resolve; it doesn't matter how the eye composes the image, all that matters is the end result.

"Stop trying to equate vision to a certain number of pixels. It's different for everyone, and it's only partially based on your hardware."

It's not hard to do a test to find out the acuity of your vision, make a display which exceeds this, and you're done. Of course it's different for everyone; how in the hell could it not be?

Stop trying to make vision this magical sequence of events; for display purposes it doesn't matter how it comes to be, all that matters is what you see :o

Re:Resolution (1)

Puff of Logic (895805) | more than 7 years ago | (#19654669)

WQUXGA, 3840x2400, or nine million pixels.

Sounds like overkill to me. I mean, I'm used to play my games @ 1280x1024 and i feel this resolution, maybe combined with a wee bit of AA, does the trick.
I used to feel this way, running at 1280x1024 on a pretty decent 19" CRT. However, about a year ago I finally upgraded to a 22" widescreen LCD with a native resolution of 1600x1050 and the difference it made was astonishing. Games that supported high resolution (Company of Heroes, WoW, Oblivion, etc.) felt incredibly more open. For contrast, I recently reloaded Enemy Territory on my system, which I have to run in a 1280x1024 window because the full-screen standard resolutions look like crap on my wide-screen. I honestly had a difficult time playing at first, because I felt like I was suddenly playing with blinkers on, unable to see anything that wasn't directly in front of me. Now, I played ET incessantly a couple of years ago so I'm well-used to the game, but the drop down from a wide-screen resolution to a standard resolution was both dramatic and constituted a significant drop in visual enjoyment.

A caveat to this is that my poor 7600GT is now unable to keep up with some games due to the high resolution I run (that and my insistence on good quality visual settings too) so I'm probably going to have to upgrade my card in the relatively near future. Despite this, I'd heartily recommend you try gaming at a decent wide-screen resolution if you haven't done so already, as you really owe it to yourself as a gamer. While I'd agree that there's probably a point of diminishing returns in terms of monitor size and pixel count, I would also definitely argue that gaming at 1280x1024 is a poor option for the modern PC gamer.

Speaking of Nvidia Development (3, Interesting)

Anonymous Coward | more than 7 years ago | (#19653837)

Speaking of Nvidia PC game development: why the hell do all the new versions of their useful utilities, like FX Composer 2 (betas I tried to test), now require Windows XP (with SP2), with no more Windows 2000 support? Win2k and WinXP have virtually zero differences in hardware support and driver system architecture. I should know, since I've programmed a few drivers using Microsoft's driver development kit, and according to the docs nothing has changed from Win2k to WinXP for drivers and the majority of the APIs, just additional features.

The thing that pisses me off is that Nvidia seems to have done this for absolutely no reason at all, and Windows 2000 is still a fine operating system for me. I have no reason at all to switch to Windows XP (and hell no to Vista), and I especially don't care for the activation headaches (I like to switch around hardware from time to time to play around with new stuff and go back once I've gotten bored with it if I don't need it, such as borrowing a friend's dual-P4 motherboard).

Anyway, my point/question: why must Nvidia feel the need to force their customers who use their hardware for developing games onto later Windows operating systems like that? Anybody got any tips on how to 'lie' or disable the Windows version check to force, say, FX Composer 2 to install on Windows 2000? It isn't like we're talking about Windows 98 here; Win2k is a fine OS and in my opinion actually the best one Microsoft has ever done.
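
For what it's worth, installers of that era typically gate on the OS version with something like GetVersionEx; Windows 2000 reports itself as 5.0 and XP as 5.1, so a "5.1 or later" check locks Win2k out even when the underlying APIs are the same. A minimal illustrative sketch (a generic check of this kind, not NVIDIA's actual installer code):

    #include <windows.h>
    #include <stdio.h>

    int main()
    {
        OSVERSIONINFO vi = {0};
        vi.dwOSVersionInfoSize = sizeof(vi);
        GetVersionEx(&vi);

        // Windows 2000 = 5.0, Windows XP = 5.1
        BOOL xpOrLater = (vi.dwMajorVersion > 5) ||
                         (vi.dwMajorVersion == 5 && vi.dwMinorVersion >= 1);
        printf("Detected Windows %lu.%lu - %s\n",
               vi.dwMajorVersion, vi.dwMinorVersion,
               xpOrLater ? "install allowed" : "install refused");
        return xpOrLater ? 0 : 1;
    }

Whether a given installer can be talked out of a check like this depends entirely on how it was written, so no promises about FX Composer 2 specifically.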

Re:Speaking of Nvidia Development (1)

0racle (667029) | more than 7 years ago | (#19654321)

Windows 2000 has been end-of-lifed. It is not supported by MS, and eventually others are also going to be dropping support for it.

Re:Speaking of Nvidia Development (1)

afidel (530433) | more than 7 years ago | (#19656505)

Huh? Windows 2000 Professional and Server are in extended support until Jan 31, 2010. There won't be any non-security related hotfixes anymore but how many people still running 2000 want anything BUT security hotfixes?

Re:Speaking of Nvidia Development (1)

archen (447353) | more than 7 years ago | (#19654557)

A lot of nvidia's driver development makes no sense. Apparently according to nvidia, Windows 2003 32bit does not exist. So much for dual monitors on that machine.

Re:Speaking of Nvidia Development (0)

Anonymous Coward | more than 7 years ago | (#19654581)

dual-boot then you can use dual-monitors!

Re:Speaking of Nvidia Development (0)

Anonymous Coward | more than 7 years ago | (#19654611)

I agree that Windows 2000 is the top of the line for MS Operating Systems. I used it until March of this year when I fully switched to Debian Etch. No XP for me and definitely no Vista.

However, in relation to games: I believe there was a comment by a dev working on Galactic Civilizations II about the difference between Win2k and WinXP coding. It was from sometime last year, I believe. I would go look it up, but I am off to work. If I remember after work I will look for it and comment with a link, or an apology for misremembering.

Re:Speaking of Nvidia Development (0)

Anonymous Coward | more than 7 years ago | (#19659667)

I cannot remember where the comment was and my search for it ended in me re-reading the dev journals of AI game tweaking. So I failed in my objective.

Re:Speaking of Nvidia Development (1)

dsyu (203328) | more than 7 years ago | (#19654741)

Viva la Win2K!

Anybody got any tips on how to 'lie' or disable the Windows version check to force, say, FX Composer 2 to install on Windows 2000?

I too would like to know if there's a way to do this.

Re:Speaking of Nvidia Development (1)

Bardez (915334) | more than 7 years ago | (#19655447)

Damn straight.

Windows 2000 was probably the most stable of the user OS's I've seen Microsoft roll out. XP, sure it has a firewall and all, but the only thing I like about XP over 2000 -- the ONLY THING -- is the integration of browsing into a *.zip file. That's it. The install is four times as big and just as stable. I really never saw the need to buy XP, ever. Work environments have been my main source of exposure to it.

Linux? (2, Interesting)

Anonymous Coward | more than 7 years ago | (#19654225)

My question would be how NVidia is helping game developers write for and port to Linux. If popular games were more compatible there, it'd be a lot easier to get converts; and I'd expect the game developers would be happy to see more of my software dollars go to their products rather than to OS upgrades.

Open GL (1)

OzPhIsH (560038) | more than 7 years ago | (#19655271)

While only sort of relating to Linux, I'd be interested to hear any comments about unlocking the potential of hardware via OpenGL. OpenGL runs on multiple platforms, and a good driver should, in theory, allow developers to take advantage of all that fancy new "Designed for DX10" hardware. I was hoping that Microsoft's handling of DirectX 10 would encourage developers to take this kind of route, as it would allow them not only to eventually exceed some of the limitations and capabilities of DX9, but to do it in a way that doesn't sacrifice the largest installed platform.

The state of the Open GL (3, Informative)

S3D (745318) | more than 7 years ago | (#19656295)

While only sort of relating to Linux, I'd be interested to hear any comments about unlocking the potential of hardware via OpenGL.
You can check the OpenGL Pipeline newsletters [opengl.org] . Unified shader support is part of the OpenGL "Mt. Evans" ARB extensions, which are targeted for an October 2007 release. "Mt. Evans" will support geometry (unified) shaders and improvements to buffer objects. Geometry shaders are supported even now as NVIDIA extensions (GL_EXT_gpu_shader4, GL_EXT_geometry_shader4, GL_NV_gpu_program4, GL_NV_geometry_program4, etc.). So it seems all the functionality is already available through OpenGL.
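
For what it's worth, here is a rough sketch of what using one of those extensions looks like today. It assumes GLEW for extension loading, an existing GL context with glewInit() already called, and a program object that already has vertex and fragment shaders attached; none of that comes from the parent post, only the extension names do.

#include <GL/glew.h>

/* Pass-through geometry shader written against EXT_geometry_shader4:
   it re-emits the incoming triangle unchanged. */
static const char *gs_src =
    "#version 120\n"
    "#extension GL_EXT_geometry_shader4 : enable\n"
    "void main() {\n"
    "    for (int i = 0; i < gl_VerticesIn; ++i) {\n"
    "        gl_Position = gl_PositionIn[i];\n"
    "        EmitVertex();\n"
    "    }\n"
    "    EndPrimitive();\n"
    "}\n";

/* Attach a geometry stage to an already-created program object. */
GLuint attach_geometry_shader(GLuint program)
{
    if (!GLEW_EXT_geometry_shader4)   /* driver doesn't expose the extension */
        return 0;

    GLuint gs = glCreateShader(GL_GEOMETRY_SHADER_EXT);
    glShaderSource(gs, 1, &gs_src, NULL);
    glCompileShader(gs);
    glAttachShader(program, gs);

    /* With the EXT interface the geometry layout is program state. */
    glProgramParameteriEXT(program, GL_GEOMETRY_INPUT_TYPE_EXT, GL_TRIANGLES);
    glProgramParameteriEXT(program, GL_GEOMETRY_OUTPUT_TYPE_EXT, GL_TRIANGLE_STRIP);
    glProgramParameteriEXT(program, GL_GEOMETRY_VERTICES_OUT_EXT, 3);

    glLinkProgram(program);
    return gs;
}

(The NV_* extensions listed above are the assembly-level interface; the EXT ones are the GLSL-level interface to the same hardware.)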

So why... (1)

XanC (644172) | more than 7 years ago | (#19658529)

So why are games being written for Direct3D? Why would a developer voluntarily chain himself to a single vendor, any vendor, let alone Microsoft?

What would they be giving up by writing to OpenGL? It runs on Windows, right?

Re:So why... (1)

S3D (745318) | more than 7 years ago | (#19659245)

There are several reasons. Sadly, D3D drivers are usually more mature (more stable, fewer bugs), so there are fewer problems supporting different graphics cards. DirectX is an integrated SDK that includes not only the 3D pipeline but also sound, video, and extensive support for 3D modelling (the X file format). Video memory management was better in D3D up until the latest OpenGL version, which matters for big seamless 3D worlds with run-time block loading. There is a generation of coders who don't know OpenGL, and it's easier to learn one API instead of two. Vista (surprise) introduces new problems: OpenGL in a window with Aero enabled runs 10-15% slower (turning Aero off mitigates the problem), and there are possible compatibility issues, since an OpenGL app written for an old Windows version is less likely to run on Vista than a D3D one. Those are the main reasons why most Windows devs prefer D3D. I won't go into why OpenGL is actually better; that is quite obvious.

Re:So why... (1)

Bottlemaster (449635) | more than 7 years ago | (#19660467)

So why are games being written for Direct3D? Why would a developer voluntarily chain himself to a single vendor, any vendor, let alone Microsoft?

What would they be giving up by writing to OpenGL? It runs on Windows, right?
Most so-called developers wouldn't understand what an open standard is if you slapped them in the face with one. Incompetent teachers taught them that Microsoft is the one true way, and they learned their lesson well. Welcome to our profession.

Re:So why... (0)

Anonymous Coward | more than 7 years ago | (#19677081)

"Why would a developer voluntarily chain himself to a single vendor, any vendor, let alone Microsoft."

Erm, you've heard of games consoles, right? Lots of developers will not only tie themselves to an API but a whole chunk of incompatible hardware, one completely under the vendor's control at every level, and will pay a licence fee for the privilege!

As for why they do it, I should think that's obvious: increased profits.

I'm not dissing consoles here, BTW. I own several, and I think they deserve a place in the gaming world - it's just that that place should IMO be a bit smaller. Although, handhelds really do work best under a closed model (sorry, GP2X fans).

Wow (1)

obeythefist (719316) | more than 7 years ago | (#19657829)

Some of the screenshots and videos for games like Crysis are really amazing. There's a long way to go, but we are definitely on the cusp of the next generation of games.

This is about right: when the Xbox came out, it was roughly on par with PCs at the time, and six months to a year down the track, the top-of-the-line PCs were way ahead. Now, the 360 and PS3 (the latter isn't living up to the hype; most of the graphics on the 360 and PS3 are about the same despite the 360 being a year older) aren't competing with the top-of-the-line PCs.

I think it would be funny to see Crysis ported to Xbox - didn't they port Far Cry to consoles? That would have been sad...

Re:Wow (1)

_2Karl (1121451) | more than 7 years ago | (#19673949)

Yes, Crysis does look amazing. But so what? It's just ANOTHER first-person shooter. I'm sick of them. To be honest, I find that the more realistic games become, the less I enjoy them. I want escapism. If I want to experience realistic physics I'll go outside and stand under an apple tree.

Maybe I'm just jaded from working in the games industry for so long, but I often find myself pining for the "glory days" of the '90s. Point-and-click adventures like Monkey Island, fantastic strategy games like Syndicate... A lot of these games have simply gone the way of the dodo because nobody seems to have any originality (or their publishers do not want to "take a risk"). Back when hardware was more primitive, the games had to rely on quality gameplay rather than graphics. While I'm not saying all modern games are bereft of gameplay, I do feel that the design process has become lazier.

It's comparable in many ways to the movie industry. The vast majority of films that come out of Hollywood really are terrible. Where the games industry differs is that there are no (or at best, few) "art house" software companies creating original concepts via quality titles that lack the horrible gloss and glitz and production-line mentality most games possess. (There are notable exceptions, of course; companies like Introversion and a number of small "bedroom-programmed" open source projects manage to keep the flicker of hope I have alive.) And then there's the whole DirectX vendor lock-in; don't even get me started on that one.

I'm rambling, I know, but the point I'm trying to make is that a quality game is timeless in its appeal and does not necessarily rely on cutting-edge hardware.

Re:Wow (1)

obeythefist (719316) | more than 7 years ago | (#19684361)

Sure, but Crysis, much like Far Cry, was never really about gameplay; it was all about the engine. This is the rare-edition Eagles live-in-concert recording for the hi-fi fanatic. This is a game for gamers who are in it for the shiny hardware. You need something to put that gear through its paces, right? This is a vehicle to show us what's possible. Hopefully the technologies and benchmarks set by this game will be adopted and integrated into future games with great gameplay.

The industry must move forward. If that were not true, Pac-Man would still be the pinnacle of cutting-edge graphics and gameplay today. That, or computer chess (you can't say chess doesn't have good gameplay).

Before we had 3D graphics capability available to the masses, there were no first-person shooters at all. Until we had broadband internet so widely available, MMORPGs just weren't feasible on the scale they are today. Who is to say what new game formats will be discovered unless we keep pushing the envelope?

There are a lot of products that, today, we consider to be "just another this" or "dismal failure that". Like movies, right? Okay. So go back to 1950, look at the films that were released then, and you will pick out a couple of great, classic, gripping movies. And you will find that 90% of the movies released then are truly awful. Is that so different from today? I say this isn't a new trend. I say history is quietly and happily repeating itself all around you. I dare say it's a vital process in the scheme of human literature creation.

Let's wind it back a bit more. The Mona Lisa is a portrait. There were portraits before that, so why is the Mona Lisa so popular? Because da Vinci got it right. A little twist on the old format, maybe, but he was making just another portrait. Would you have stopped him from making the Mona Lisa because we've already got heaps of portraits and they're all rubbish?