
ATI & Nvidia Duke It Out In New Gaming War

Hemos posted more than 13 years ago | from the who's-bringing-the-heavy-artillery dept.

Games 208

geek_on_a_stick writes "I found this PC World article about ATI and Nvidia battling it out over paper specs on their graphics cards. Apparently ATI's next board will support pixel shader 1.4, while Nvidia's GeForce3 will only go up to ps 1.3. The bigger issue is that developers will have to choose which board they want to develop games for, or, write the code twice--one set for each board. Does this mean that future games will be hardware specific?"


linux and xwindows! (-1, Offtopic)

Anonymous Coward | more than 13 years ago | (#10002)

Ok guys, maybe some of you can advise me. I'm sick of my ATI r128 card; the screen corruption and flicker in X is terrible, so I'm looking for something new. I want (in X) good performance, plus TV out with full-screen acceleration for playing emulators and watching DVDs on my TV. X just shows a black screen when my TV lead is connected! So I have to unplug it every time I want to run X, which is a pita. So I come to the conclusion that the ATI Linux drivers suck... but what else is there? Will an nvidia GF3 plus proprietary drivers do what I want? I know the ATI drivers are open source, but really, what's the point? No one can understand them without access to the hardware docs.

Games are already hardware-specific (1, Informative)

Anonymous Coward | more than 13 years ago | (#10648)

Max Payne [maxpayne.com] , for instance, was developed mostly with GeForce cards. This means that by choosing their standard developer hardware setup, the developers are becoming hardware-dependent and are, in effect, saying that these are the cards you should use to play their game.

This is really no news.

"Optimized for Pentium III" is what read on every possible piece of marketing material with the late Battlezone II.

I would conclude that the hardware dependency of games goes far beyond just graphics cards. Use this processor to get better results, use this sound card to hear the sounds more precisely, etc. It seems the game industry has big bucks, and every hardware vendor wants to make sure that when the next big hit comes, everyone needs to buy their product in order to get that +15% FPS out of it.

Re:Games are already hardware-specific (0)

Anonymous Coward | more than 13 years ago | (#12707)

What are you talking about? Blowing smoke as an Anonymous Coward sure is fun, isn't it?

Does there have to be a problem for developers? (1)

qwaszx (8209) | more than 13 years ago | (#12709)

If all these new whiz-bang features are implemented as extensions to the original API, then all the developer has to do (if he/she/they/it chooses to support the new features) is detect whether the feature is available and add the code for it. If not, then just use the standard API.

Eventually, all the other manufacturers will catch up with the new features, and the extension will become integrated into the standard.
An analogy could be (think ye olde days) detection of a sound card, and only enabling sound if one is available.

If even that is too much work for the developer, then just don't support the new extension - the graphics will look just as pretty to the untrained (read: consumer) eye.
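As a rough illustration of that detect-and-fall-back idea (my sketch, not the parent poster's code, and GL_ARB_multitexture is just a stand-in for whatever extension you care about), the OpenGL side looks roughly like this:

    /* Minimal sketch: query the extension string once, then branch. */
    #include <GL/gl.h>
    #include <cstring>

    bool hasExtension(const char* name) {
        const char* exts = reinterpret_cast<const char*>(glGetString(GL_EXTENSIONS));
        if (!exts) return false;
        const size_t len = std::strlen(name);
        for (const char* p = exts; (p = std::strstr(p, name)) != 0; p += len) {
            /* Accept only whole tokens, not prefixes of longer extension names. */
            if ((p == exts || p[-1] == ' ') && (p[len] == ' ' || p[len] == '\0'))
                return true;
        }
        return false;
    }

    void drawObject() {
        if (hasExtension("GL_ARB_multitexture")) {
            /* fancy path: use the extension's entry points */
        } else {
            /* fall back to the plain, standard-API path */
        }
    }

The same shape works for the sound-card analogy: probe once at startup, stash the result, and pick the code path when rendering.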

Re:Does there have to be a problem for developers? (1)

l33t3$t_hax0r (457694) | more than 13 years ago | (#23272)

Eventually, all the other manufacturers will catch up with the new features, and the extension will become integrated into the standard.

Do you work for Microsoft?

Different battle, same war (1)

PoitNarf (160194) | more than 13 years ago | (#1429)

All hardware is never going to work with all software. It's simply too much to ask. But graphic card incompatibility is not new to the gaming world. I remember a few years ago when I had a Riva TNT board, and just about EVERYONE else had some 3dfx board. Some games only first came out with support for the 3dfx stuff, and then later released patches for riva stuff, or riva released a new driver that would work with the game. I'm fairly sure that was all basically software differences, not hardware like this is. But there can't be anything wrong with a little friendly competition. As long as the good new games are released and are able to support ALL of the major boards out there I'm happy. They don't all have to be supported as soon as they hit stores, but delayed support is better than no support.

This sucks (3, Insightful)

levik (52444) | more than 13 years ago | (#1431)

Whatever happened to standards? Remember when things were "100% compatible"? IBM-PC compatible. SoundBlaster compatible. VESA compatible. Compatibility in hardware was nice, because your software would work on any OS with a piece of compatible hardware, no special drivers needed.

Now the hardware industry has moved away from that, instead giving us free drivers for Windows - drivers that are not only crappy in their first release, but also useless on any other platform the vendor decides not to support.

Bring hardware standards back, and MS will lose much of the power it's able to leverage through the high degree of hardware support their system provides. I for one would sacrifice a little technological progress for the ability to have things work together as expected out of the box.

Re:This sucks (1)

Gingko (195226) | more than 13 years ago | (#28340)

But now we have API standards. Even better, I think. Graphics card manufacturers have complete freedom of implementation. It's been a while since games actually didn't work on one of the big cards (and quite a few still work on Voodoo 3s).

I love things working right out of the box, it's one of the reasons I still use Win2k/XP for almost everything. I agree, standardisation of hardware would bring this kind of compatibility within the reach of others. But it would probably lead to a lowest common denominator approach, and then somebody will ignore the specs. It'll all happen again.

Henry

On hardware compatibility... (1)

Gingko (195226) | more than 13 years ago | (#16643)

Stanford have been doing some research on compiling RenderMan (pretty much the holy grail) shaders down to OpenGL. They can do it, in a lot of passes and with a couple of extensions.

I have heard that there is work ahead to do this with Pixel Shaders. Once Pixel Shaders become sufficiently general, all you need to do is re-target the back end of the compiler and you're set.

Henry

Not that big of a problem (2, Interesting)

mfb425 (462506) | more than 13 years ago | (#16644)

The differences in hardware are not that big of a problem for next-generation graphics engines. The number of features and the flexibility available now necessitate using a higher-level shader language as opposed to hardware-specific API features. A well-designed shader language can be compiled to take advantage of whatever driver features you have available, and emulate or ignore the rest.

We are currently able to target both pixel shader versions in DirectX, and hopefully soon in OpenGL. We are currently ignoring features not supported by the hardware that shader code tries to use. So rendering the shader surface on a GeForce1 will look much worse than on a full featured card, but we don't waste time emulating it.

For reference on similar techniques, check out Proudfoot et al., 'A Real-Time Procedural Shading System for Programmable Graphics Hardware'. (Though that's based on NVIDIA hardware, it's extendable to new features as well.)
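A minimal sketch of the caps-driven path selection being described (my illustration, not the poster's engine; names like choosePath are made up, though D3DCAPS8, GetDeviceCaps and D3DPS_VERSION are the DirectX 8 ones):

    #include <d3d8.h>

    enum ShaderPath { PATH_FIXED_FUNCTION, PATH_PS_1_1, PATH_PS_1_4 };

    // Pick the richest pixel shader path the driver reports; anything the
    // hardware can't do is simply ignored rather than emulated, as described
    // above. Error handling omitted for brevity.
    ShaderPath choosePath(IDirect3DDevice8* device) {
        D3DCAPS8 caps;
        device->GetDeviceCaps(&caps);

        if (caps.PixelShaderVersion >= D3DPS_VERSION(1, 4))
            return PATH_PS_1_4;        // e.g. ATI's next-generation part
        if (caps.PixelShaderVersion >= D3DPS_VERSION(1, 1))
            return PATH_PS_1_1;        // e.g. GeForce3-class hardware
        return PATH_FIXED_FUNCTION;    // GeForce1-class: render it plainer
    }

A higher-level shader language then just needs a back end per path; the surface description itself stays the same.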

j00 (-1, Troll)

Anonymous Coward | more than 13 years ago | (#19923)

sux0r

hardware specific games (0)

Anonymous Coward | more than 13 years ago | (#19924)

every game must have hardware specific parts at least in the underlying OS - they are called (drumroll) drivers

Do You Remember Glide? (1)

rsd (194962) | more than 13 years ago | (#20877)

Does this mean that future games will be hardware specific?

Well, not long ago we just had 3dfx (and Glide) as the only option for 3D games.
Even though there was other 3D hardware and other technologies (OpenGL and the rising Direct3D), Glide (and 3dfx) was the default choice.

Ignore the article (0)

thunker (206170) | more than 13 years ago | (#20879)

Write code twice? You obviously are new to gaming and programming. Have you heard of Direct3D, OpenGL, and Glide? These are APIs that programmers use. For the most part the video card drivers will take care of the board's features. I doubt programmers will worry about pixel shader 1.3 or 1.4.

Hardware Specific? (1, Interesting)

Anonymous Coward | more than 13 years ago | (#21731)

"Does this mean that future games will be hardware specific?"

Well, yes actually. Haven't they always been? We've had 3Dfx versus PowerVR, Glide versus OpenGL, Direct3D versus OpenGL...

It goes all the way back to floppy versus CD, Win3.1 versus Win32s, 16 colours versus 256...

Every game has system requirements (even if you're only talking about a scale like processing power), and always has. I still remember the shock when I realised I'd need to get a tape drive to complement the disk drive in my CPC664, just to play some of the games!

The thing people miss here (2)

evanbd (210358) | more than 13 years ago | (#22439)

is the level of compatibility there is. PS 1.4 is really just an extension of 1.3 -- it adds more instructions to the same basic architecture. If you write for GF3, it'll run just fine on the R200 or whatever. If you write for R200, it'll run just fine on nVidia's next part, even though it supports PS1.5. It's all back-compatible, I believe. It's sorta like the deal with two texture units vs three (GF2 vs Radeon) or single-pass quad texturing with two units (GF3). Write for the lowest denominator you care to, and it'll work fine on all the newer stuff.

Re:The thing people miss here (2)

Fishstick (150821) | more than 13 years ago | (#696)

>Write for the lowest denominator

hmm, but it seems like game developers don't do that. There is a segment of gamers that are attracted to the newest hardware _because_ it has the latest features, and they then want to buy a game that uses that feature they just paid a $$$ premium for. Totally wrong priorities, but it seems to happen.

Sure, write your game for the best compatibility across different hardware, but then you run the risk that PC Gamer magazine won't drool all over themselves in their review because the reviewer ran your demo on his rig with a GeForce XXI, but your game didn't have the latest 'cyclops, semi-transparent, half-inverse bump/pixel grinding' feature.

A 14-year old reading pcgamer has no idea what this feature really does for him, but he knows that dad is getting him a GeForce XXI for xmas, so this game isn't going to be on his santa list.

This sucks ass like hell ! (0)

Sepultufart (468385) | more than 13 years ago | (#22573)

I want to be able to try all games regardless of what hardware I have. If such a division occurs we'd all be really pissed off. The developers and the customers! Heck, I'll have to get a custom motherboard that houses two graphics card slots! Plus I'll have to buy both of them! What a pain! What a pain! Somebody kill the whores! Competition's good my ass! If you can't check what the other side has, it's like two fucking monopolies; what the hell do they think they're doing? Don't you remember when sound cards were the pain-in-the-ass issue? Because corporate whores like customer loyalty, they'll chain you up in the store. We really needed that. Like we didn't already get enough shit with John Romero and Massively Multiplayer Offline Pains. We needed to get off to a good start with incompatible video cards. I tell you what! If those fuckers go ahead, I'll buy a PlayStation! Screw them all! They can kiss their dollars goodbye!

flash upgrades? (0)

Anonymous Coward | more than 13 years ago | (#23094)

My BIOS is flash upgradable. My modem is flash upgradable. Is there a reason why my video card can't be flash upgradable? I suppose 1) it would cost too much to put that much flash ROM on the board, or, more likely, 2) then you wouldn't be forced to buy a whole new card that supports some new standard. As for (2) I would say most people inclined to buy a new card now would do it anyway, for faster speed or more memory, which can't be done with such an upgrade. Even if the card has slots for extra memory, there's always a limit. And (1) a quick poll: how many would spend an extra $50 (to pull a number out of the air) on a card that won't go obsolete just because they just released DirectX 9?

first fist (-1, Offtopic)

mod you later (326902) | more than 13 years ago | (#23402)

first fist is when you fist someone fist in quake 3 arena

Re:first fist (-1, Offtopic)

gamorck (151734) | more than 13 years ago | (#23961)

Sorry Trollboy - I actually got a legitimate FP in before you could say something stupid. Thanks for playing!

Gam
"Flame at Will"

Re:first fist (0)

mod you later (326902) | more than 13 years ago | (#26024)

sure, it's legitimate but it's also redundant. take this [conhugeco.org] advice.

History Repeats Itself (2, Interesting)

BigJimSlade (139096) | more than 13 years ago | (#23548)

Looks like we're back to the days of yore, when you (the developer) got to choose to support a specific card (3dfx or the others that didn't survive) because there was no DirectX support... because there was no DirectX. Then you (the consumer) got the shaft if you didn't have the right card, unless the developer later came out with a binary that would support your card's features. But if it wasn't an uber-popular game, this usually didn't happen.

So why are Nvidia and ATI forcing developers to go back to the stone age of accelerated polygons? Oh, that's right... Me likes pretty picture.

direct x is not open, OpenGL is, we should use OGL (1, Insightful)

HelloKitty (71619) | more than 13 years ago | (#21940)

why use something like DirectX when OpenGL is an open standard with source code and specification open to all?

It's scary that so many people are relying on M$'s proprietary graphics technology. At any time they could discontinue it, or change the API in a way that breaks all existing games. I wouldn't put it past them.

subatomic
http://www.mp3.com/subatomicglue [mp3.com]

Re:direct x is not open, OpenGL is, we should use (1, Informative)

bribecka (176328) | more than 13 years ago | (#21700)

OpenGL is an open standard with source code and specification open to all

OpenGL is an open standard, but the source code isn't open--there isn't even any source code! It's just a specification, and each individual vendor must implement it according to that specification. For example, Nvidia makes an OpenGL implementation that is accelerated by their graphics cards, MS makes an implementation that is software only, and 3dfx made a mini-implementation at one point.

I think maybe Mesa is open-source? Not sure. But the actual implementation inside the vendor's API is whatever they want, and is probably closed (see Nvidia). The only requirement is to follow the specification and the rendering pipeline properly (so transforms/shading/etc will be applied the same through any OGL implementation).

Actually, it IS open... (2)

Svartalf (2997) | more than 13 years ago | (#23723)

SGI released the reference implementation under an MPL-variant license back last year, right along with the GLX implementation.

It's Open Source.

Re:direct x is not open, OpenGL is, we should use (1)

Smedrick (466973) | more than 13 years ago | (#21982)

Have you ever used either one? IMO DirectX is a lot less of a pain in the ass to code with. It also supports more than OpenGL. Plus, most game designers aren't too concerned with other platforms seeing as the majority of their market runs Windows for gaming.

Using COM interfaces easier than using C/C++ ones? (2)

Svartalf (2997) | more than 13 years ago | (#33007)

Riiight. You're just used to coding for Windows - it's easier for you because you're inured to that style of coding.

Re:direct x is not open, OpenGL is, we should use (2, Troll)

grazzy (56382) | more than 13 years ago | (#43499)

DirectX has full documentation freely available; also, DX doesn't only support accelerated graphics but the whole range of input and output devices such as joysticks, sound, etc.

OpenGL is written for a UNIX environment, DX is for a Windows environment. And yes, OpenGL is open source and very easy to learn, but it still has a lot of drawbacks, one of them being those dinosaurs that run it.

OpenGL does NOT change very much, which has both good and bad sides. For example, this thread discusses pixel shading, which is a feature OpenGL does not natively support. I do not know how hard this was to implement in DX, but I figure it can't be too bad, since they are at least talking about it and not just dismissing it as some "toy" like the OpenGL board seems to do...

Re:direct x is not open, OpenGL is, we should use (0)

Anonymous Coward | more than 13 years ago | (#20875)

OpenGL is written for a UNIX environment
No it is not. You talk about OpenGL like it is a product, but it's really a standard.
There are many implementations of OpenGL (some official, some not), and to say they are written for UNIX is nonsense. For example, I use Mesa all the time (a non-official implementation). True, it runs on Unix, but it actually started life on an Amiga.

Re:History Repeats Itself (1)

G-funk (22712) | more than 13 years ago | (#34536)

This is a total troll. I can't vouch for ATI, but I damn sure know there's no proprietary interface for Nvidia chips. You use DirectX, or you use OpenGL. There are some Nvidia extensions to OpenGL, but it's not like they'll break your game; if the functions aren't there, the game just doesn't use them. Wow! How dare they! They gave us extra features instead of asking the people who run OpenGL nicely to put things in the standard and waiting 5 years! The bastards!

what is the difference between 1.3 and 1.4? (1)

HelloKitty (71619) | more than 13 years ago | (#23829)

is 1.4 backward compatible? 1.3 and 1.4 are a direct3d thing, what about opengl?

so what if we all just use OpenGL instead? open standard etc... it would definitely be worth pressuring the ARB to extend the spec to cover shaders... NVIDIA's shader extensions would have to be used for now, because the OpenGL ARB is very slow in adopting new standards (like pixel shading)

subatomic
http://www.mp3.com/subatomicglue [mp3.com]

I doubt it... (-1, Offtopic)

Anonymous Coward | more than 13 years ago | (#23960)

I doubt that Bobby is a fool who needs a life.

Oh good. A pissing contest... (5, Informative)

Gingko (195226) | more than 13 years ago | (#23995)

First of all, a direct link [ati.com] to ATI's SmartShader tech introduction.

I have a few disparate thoughts on this subject, but rather than scatter them throughout the messages I'll put 'em all in one place.

ATI are attacking what is possibly the weakest part IMHO of DirectX 8 - the pixel shaders. Pixel shaders operate at the per-fragment level, rather than at the per-vertex level of the vertex shaders, which were actually Quite Good. The problem with Pixel Shaders 1.1 is that, to paraphrase John Carmack, "You can't just do a bunch of math and then an arbitrary texture read" - the instruction set seems tailored towards enabling a few (cool) effects, rather than supplying a generic framework. Again, to quote Carmack, "It's like EMBM writ large". Read a recent .plan of his if you want to read more.

If you read the ATI paper, they don't really tell you what they've done - just a lot of promises, and a couple of "more flexible!", "more better!" bits of lip service. I don't care about reducing the pass count. Hardware is getting faster. True per-pixel Phong shading looks nice, but then all they seem to do extra is allow you to vary some constants across the object via texture addresses. Well, that's great, but texture upload bandwidth can already be a significant bottleneck, so I don't know for sure that artists are gonna be able to create and leverage a separate ka, ks etc. map for each material. (I did enjoy their attempts to make Phong's equation look as difficult as possible.)
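(For reference - this gloss is mine, not from the ATI paper or the parent post - the Phong lighting equation being poked at is just:

    I = ka*Ia + kd*(N.L)*Id + ks*(R.V)^n*Is

where ka/kd/ks are the ambient, diffuse and specular material constants mentioned above, N is the surface normal, L the direction to the light, R the reflected light direction, V the direction to the viewer, and n the shininess exponent. The pitch described above amounts to letting ka/ks vary per-pixel via extra texture maps.)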

True bump-mapping? NVidia [nvidia.com] do a very good looking bump-map. Adding multiple bump-maps is very definitely an obvious evolutionary step, but again, producing the tools for these things is going to be key. Artists won't draw bump-maps.

Their hair model looks like crap. Sorry, but even as a simple anisotropic reflection example (which again NVidia have had papers on for ages) it looks like ass. Procedural textures, though, are cool - these will save on texture uploads if they're done right.

What does worry me is that the whole idea of getting NVidia and Microsoft together to do Pixel Shaders and Vertex Shaders is so that the instruction set would be universally adopted. Unfortunately, ATI seem to have said "Sod that, we'll wait for Pixel Shader 1.4 (or whatever) and support that." I hope that doesn't come back to bite them. DirectX 8.0 games are few and far between at the moment, so when they do come out there'll be a period when only Nvidia's cards will really cut it (I don't think ATI have a PS 1.0 implementation, someone please correct me if I'm wrong) - will skipping a generation hurt ATI, given that they're losing the OEM market share as well?

I dunno, this just seems like a lot of hype, little content.

Henry

Re:Oh good. A pissing contest... (1)

bribecka (176328) | more than 13 years ago | (#57)

I don't know for sure that artists are gonna be able to create and leverage a separate ka, ks etc map for each material

This would be done more through code than by an artist. You only need to either write a shader to do it properly, or just assign a ka/kd/ks to each material, and that isn't exactly difficult. After all, in the real world, most surfaces have a pretty much constant reflectance function across the whole dang thing. Just look around... Yes, things have different *colors* across their surfaces, but the actual reflectance is usually the same.

Artists won't draw bump-maps.

Why not? They draw textures now, I'm sure they would have no problem drawing a bump map if need be. Besides, if there is support for procedural textures, you can just use those to generate a bump map.

Re:Oh good. A pissing contest... (1)

Gingko (195226) | more than 13 years ago | (#23770)

This would be done more through code than by an artist

True, but what you go on to say is that the Phong constants are indeed constant across a surface - then ATI saying 'oh look - you can programmatically change ka and ks' becomes useless, because you won't need to change them. This assumes you are working with a one material : one texture map correspondence. If, like their examples, you have say metal and stone on one map, then varying some constants becomes necessary. But then this requires another map (or even two) at close to the resolution of the source diffuse map. You can do per-material ka/kd/ks now with no trouble at all. Per-fragment is a bit more involved.

Why not? They draw textures now,
I'm not an artist, so I don't know. But I don't know that the tools are there for them to draw bump-maps, and you have to admit that using an RGB channel as a three component normal vector can't be the most intuitive way to draw things. Much better to procedurally generate, like you say.

Henry

Re:Oh good. A pissing contest... (1)

bribecka (176328) | more than 13 years ago | (#20578)

you have say metal and stone on one map then varying some constants becomes necessary

True enough. I'm thinking more in a high-end graphics environment rather than a gaming one, where that situation wouldn't come up very often really--it's just a way of being lazy and putting multiple objects into one--not sure if I'm being clear.

But you're right, you would have to change those constants across a surface, especially in games, where I suppose surfaces might be merged together for optimization's sake.

As far as the RGB/normal channel goes, I think most bump mapping is sufficiently done with just a grayscale-type image... much like a heightmap. Since bump maps inherently give the appearance of micro-geometry, some accuracy that might be achieved through an RGB bump map can be set aside for the sake of ease and speed (even if the speedup is in development!).
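For what it's worth, here is a small sketch (mine, not the poster's tool) of the grayscale-heightmap-to-RGB-normal-map conversion being talked about; the function name and scale factor are arbitrary:

    #include <cmath>
    #include <cstdint>
    #include <vector>

    // height: w*h grayscale samples in [0,255]; returns w*h*3 bytes of
    // packed normals, with each component remapped from [-1,1] to [0,255].
    std::vector<uint8_t> heightToNormalMap(const std::vector<uint8_t>& height,
                                           int w, int h, float bumpScale = 2.0f) {
        std::vector<uint8_t> rgb(static_cast<size_t>(w) * h * 3);
        auto at = [&](int x, int y) {
            x = (x + w) % w;  y = (y + h) % h;   // wrap at the edges
            return height[static_cast<size_t>(y) * w + x] / 255.0f;
        };
        for (int y = 0; y < h; ++y) {
            for (int x = 0; x < w; ++x) {
                // Central differences give the slope of the height field.
                float dx = (at(x + 1, y) - at(x - 1, y)) * bumpScale;
                float dy = (at(x, y + 1) - at(x, y - 1)) * bumpScale;
                float nx = -dx, ny = -dy, nz = 1.0f;
                float len = std::sqrt(nx * nx + ny * ny + nz * nz);
                size_t i = (static_cast<size_t>(y) * w + x) * 3;
                rgb[i + 0] = static_cast<uint8_t>((nx / len * 0.5f + 0.5f) * 255.0f);
                rgb[i + 1] = static_cast<uint8_t>((ny / len * 0.5f + 0.5f) * 255.0f);
                rgb[i + 2] = static_cast<uint8_t>((nz / len * 0.5f + 0.5f) * 255.0f);
            }
        }
        return rgb;
    }

That's the sense in which the artist only has to paint a height field; the tool (or the engine at load time) does the vector packing.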

i wish i could get (-1, Flamebait)

Anonymous Coward | more than 13 years ago | (#24359)

fist prost. everybody wants to be fisted.

I believe (3, Insightful)

RyuuzakiTetsuya (195424) | more than 13 years ago | (#24883)

The Pixel Shader technology will be backwards compatible as far as the DirectX 8.0 API is concerned. Imagine that. Microsoft using an API to bring software developers together across various hardware choices. Now if only they could get Win32 cleaned up and a decent kernel, then I'd THINK about purchasing that OS. I'm not saying that there won't be card-specific code, but as far as pixel shader tech goes, as long as the drivers are DX 8 compatible, there's no problem with code for one card not working on the other. Besides, most systems sold in the last year have 810/810e/815E chipsets and are stuck with those old i740 Starfighter chips.

Re:I believe (1)

vrt3 (62368) | more than 13 years ago | (#17536)

The Pixel Shader technology will be backwards compatible as far as the DirectX 8.0 API is concerned.

But that doesn't necessarily mean that it will run at maximum performance in all conditions. Perhaps it is something like this:

  • if the game uses PixelShader 1.3, the nVidia runs at its maximum speed since it natively supports it. The ATI performs suboptimally.
  • if the game uses PixelShader 1.4, ATI performs optimally. But now the game uses features that the nVidia doesn't support, so DirectX uses software emulation for those features... bye bye high performance.

Not likely ... (1)

Philipv1 (467269) | more than 13 years ago | (#7021)

...since DX is backwards compatible with such features. If DX 8 allows PS 1.4, it most certainly will allow PS 1.3, and the same functions will all be translated within DX 8.

That's pretty much why DX 8 exists. Sure, you won't have all the features of 1.4, but your 1.3 will be zipping along at full speed regardless. It won't simply be tossed aside to software-emulation mode.

Re:I believe (2, Interesting)

Kareena Bhagnani (462993) | more than 13 years ago | (#24554)

The Pixel Shader technology will be backwards compatible as far as the DirectX 8.0 API is concerned. Imagine that. Microsoft using an API to bring software developers together across various hardware choices.

Sadly, the situation is not yet unified in OpenGL, with both Nvidia and ATI providing their own separate extensions for accessing pixel shaders. One can only hope that it's not too long before we get an ARB-approved extension that covers the capabilities of both cards.

Of course, since it will be quite a while before games publishers can rely on people having a GeForce3 or Radeon2, I expect pixel shaders will only be used for optional flash for quite some time. If people are doing bump mapping and phong shading and so on using them, they'll certainly have the option to run in a slightly less attractive mode for those with lamer hardware.

Back to the future (-1, Redundant)

Anonymous Coward | more than 13 years ago | (#24884)

Actually... It seems everything is cyclic; it's just a return to the old days when every program had to have specific drivers for whichever monitor and video adapter you had... Nothing new there.

Re:Back to the future (1)

RyuuzakiTetsuya (195424) | more than 13 years ago | (#26021)

yeah, but it's at the OS level now. Thank god. While ID's a nice company and all, I don't want them writing drivers for my video card.

Ironic (0)

Anonymous Coward | more than 13 years ago | (#30664)

John Carmack wrote the original GLX support for XFree86 3.3 that all Linux GLX support is based on. :)

General8 (1, Interesting)

General8 (470466) | more than 13 years ago | (#25889)

What's this? Obviously people don't have to hand write their code for different cards. That's why we have DirectX and OpenGL and what not. They handle that shit. No worries, people.

As for ATI vs. NVidia: I've always resented ATI, and NVidia has always had better cards. But my opinion has started to change in favor of ATI. NVidia's refusal to release any open specs for creating drivers for their cards is blasphemy. At least ATI is open about their sucky hardware (the Radeon is actually quite nice).

Re:General8 (2, Insightful)

Gingko (195226) | more than 13 years ago | (#36316)

Say what? NVidia's cards have always rocked (except the ZX chipset admittedly), I agree. But NVidia provide a level of community support *far and away* better than ATI. NVidia host conferences for grad students and their professors. They have developer conferences in many different countries. Matt and Cass from NVidia hang out on opengl.org's [opengl.org] discussion forums and help everyone out (newbies, old hands, the lot). The developer documentation is sublime - and everyone can get at it. Plus their drivers *just work* 9 times out of 10.

I could care less about driver specs. The 3dfx ones are around if I want to see how modern-ish graphics cards are set up. And their drivers are such good quality, I can see why they don't want mutations springing up all over the web. I certainly don't have a problem with such a pleasant company to work with wanting to hold on to a few secrets.

Henry

Well (1)

skrowl (100307) | more than 13 years ago | (#25899)

Well, at least they aren't price gouging on Geforce 3 cards. I mean, they're ONLY 300 some dollars. So what if it's more expensive than the motherboard and the CPU? Just buy the ATI card AND the Nvidia card and then swap them out in between games. I mean, if you have to reboot to go into windows to play a game anyhow, you might as well do a hardware swap while you're at it! Now if the motherboard manufacturers would just get up to speed on that hot swap AGP slot technology that I've been looking forward to!

How the hell? (1)

MasterOfDisaster (248401) | more than 13 years ago | (#26022)

Didn't nVidia -DEVELOP- DX8 along with Microsoft? Why, then, do they only support PS 1.3 whereas ATI supports PS 1.4?
Who in their right mind would buy an ATI board anyway? Apart from keeping the Mac platform alive for several years, have they done anything worthwhile? I thought they just played catch-up to nVidia and 3dfx.

Re:How the hell? (1)

dinivin (444905) | more than 13 years ago | (#10502)

Who in their right mind would buy an ATI board anyway? Apart from keeping the Mac platform alive for several years, have they done anything worthwhile? I thought they just played catch-up to nVidia and 3dfx

Apparently you've never seen an ATI Radeon in action... A 64 MB DDR version will easily outperform the fastest 3dfx cards and gives the GeForce2 GTS a run for its money.

Dinivin

Re:How the hell? (0)

Anonymous Coward | more than 13 years ago | (#16180)

I've got an ATI All-In-Wonder 128. I've tried a friend's PC that has a GeForce card, and I liked my ATI card better. I don't know what the FPS rate is, but it flows smoothly enough that I don't give a rat's @$$ what the FPS is. I guess that is doing something worthwhile (plus the fact the card is cheaper than buying an additional video I/O card for converting home movies into AVI or MPEG).

All-In-Wonder boards (1)

unitrcn (472455) | more than 13 years ago | (#18362)

rock, simply put. If you've ever wanted to fool around with video editing, they're a low-cost way to get into it. Plus you have the cable-in for the TV tuner, DVD decoding, and composite and S-video out so you can watch DVDs on a real TV. Have you ever played Quake on a 52" screen?

Their 3D capabilities have always been respectable. Not the TOP, but what's a few FPS between the Geforce and Radeon, especially when there's also a few hundred dollars difference. Heck, my Rage 128 still gets "good-enough-for-me" frame rates in Q3A (typically in the 60s).

Re:How the hell? (1)

Xugumad (39311) | more than 13 years ago | (#23757)

As someone who has Radeons at home and at work, I find that the image quality is better than the Nvidia cards I've used (TNT and GeForce 2 MX, both from Hercules). Also, the All-In-Wonder line of cards is very nice if you're looking for that kind of functionality.

Re:How the hell? (1)

4n0nym0u$ C0w4rd (471100) | more than 13 years ago | (#23758)

The fact that the Radeon 64MB DDR card is only $200 while the GeForce 3 is $400 could have something to do with it. ATI is a good card for people who want a great card but don't want to spend too much money on it; the GeForce 3 is for performance junkies (me) or rich people who want bragging rights (not me, unfortunately). For the money, ATI makes a fine product.

But then again my next card will be a GeForce 3 :)

Re:How the hell? (1, Informative)

Zuchinis (301682) | more than 13 years ago | (#33441)

Here [anandtech.com] is part of a hardware review from Anandtech [anandtech.com] that compares GeForce3 cards for, among other things, 2D image quality. A Radeon DDR, a Matrox G450 eTV, and a GeForce2 MX card were used for reference in the test. Apparently, the Radeon DDR and G450 set a high standard for video card basics like high-quality VGA filtering. If you read the (subjective) scores, you'll find that it is a rare GeForce3 card that can live up to the Radeon in 2D image quality, and no GeForce3 can match the G450.

Isn't it nice to see the non-nVidia brands redeemed?

Re:How the hell? (1)

maligor (100107) | more than 13 years ago | (#6274)

Because DirectX 8.1 (which supports PS 1.4) was developed with ATI. As for why someone might buy an ATI board: as far as I'm concerned, it's because they have working DRI drivers, even though they aren't feature-complete. My next card probably won't be an NVIDIA one.

Another day, another marketing war (2, Insightful)

Earlybird (56426) | more than 13 years ago | (#26026)

Does this mean that future games will be hardware specific?

Well, no. Game developers do prefer the state of the art, but common sense dictates that you target something that exists and is popular.

Comparisons to browser market shares are appropriate here: When Internet Explorer became the norm, web sites tended to take advantage of IE's superior DHTML and DOM support, but developers have mostly strived to make pages backwards-compatible with Netscape and other less capable browsers. After Mozilla caught up, most web sites still aren't targeting it specifically.

Keep in mind that, according to the article, the board does not currently exist. One's desire to write custom code for a nonexistent board is contingent on several factors, such as the manufacturer's present and potential future market share.

Case in point: Developers used to target Glide, 3Dfx' low-level rendering API. Games these days don't bother: 3Dfx has DirectX support, the effort to squeeze a few extra FPS from writing "straight to the metal" usually isn't worth the time and money, and most importantly, 3Dfx is dead. Its user base is dwindling, and there is no incentive to use the (generally) hardware-specific Glide over the generic DirectX.

As for the development effort: As a former game developer and Direct3D user, I agree with the claim that when targeting both shaders, "they'll have to write more code". A few hundred lines, perhaps, for detecting and using the two extra texture shaders per pass. It's not like it's a new, different API.

You mean they *aren't* hardware-specific? (1)

mblase (200735) | more than 13 years ago | (#27705)

Every game on the market says it right on the box: Mac or Windows PC, 300MHz or 500MHz processor, 64MB of RAM or 128MB of RAM, 3D hardware required or not... And then, of course, there are the console games, which run on one, and only one, hardware configuration. Face it, games have been hardware-specific for years. This is merely the next level.

Glue (1)

PHanT0 (148738) | more than 13 years ago | (#28387)

As a senior student at university and someone looking very seriously at heading toward the gaming industry for a career, I'd love to work on a glue project for this and other problems in the video card industry.

Think about it... if you could make a 'glue'-like API for both these standards and many more problems like it, developers would probably be very interested, don't you think?

If you wanted, you could even go as far as supporting hardware-specific routine alternatives...

I know this has to have been attempted before, so can anyone tell me what happened? I mean, DirectX is MS-controlled, OpenGL is controlled by a select group of leading tech companies, and Glide was very much 3dfx-proprietary... all of these only lead consumers and game developers to one thing... a big choice when buying your VC and games/software for your VC/PC combination...

why arent.... (1)

xtermz (234073) | more than 13 years ago | (#28443)

they using a standard like direct3d or glide or (insert favorite library here). Seems that we are reverting back to the days of hardware specific code rather than library specific code.... is there not anything out there that is on par with the performance of these new cards?

Re:why arent.... (2, Informative)

_Neurotic (39687) | more than 13 years ago | (#10647)

The Pixel Shader technology discussed here is in fact a part of DirectX 8 (Direct3D). The issue isn't which API (in the general sense) but rather which version of a subset of an API.

Not being a 3D programmer, I don't know whether the claim of vast differences in code are true. Can anyone shed light on this?

Neurotic

Re:why arent.... (1)

GeckoX (259575) | more than 13 years ago | (#28416)

No, not really if it is structured properly in the first place. Besides, games are never write once run anywhere, there are always code forks to get games working on as many different varying platforms as possible anyways.

Re:why arent.... (1)

geekster (87252) | more than 13 years ago | (#43386)

Glide is only for the voodoo chipsets

ATI stinks (-1, Flamebait)

gamorck (151734) | more than 13 years ago | (#29336)

ATI has always stunk compared to Nvidia. Just ask any non-maclot. They may have 2D performance, but who here spends $300 on a 3D card for anything but the 3D performance?

Nvidia is here to stay. ATI, well.... as long as people keep buying their overpriced crap, they will be around as well.

Gam
"Flame at Will"

Re:ATI stinks (0)

Anonymous Coward | more than 13 years ago | (#12708)

"Nvidia is here to stay. ATI well.... as long as people keep buying their overpriced crap - they will be around as well." Don't you have that the other way around?

Re:ATI stinks (2, Informative)

4n0nym0u$ C0w4rd (471100) | more than 13 years ago | (#24885)

hmmm, first let me say I'm getting a GeForce 3 rather than an ATI Radeon for my new computer, so I'm not really biased..... ATI Radeon DDR 64MB, $200........ GeForce 3 DDR 64MB, $400........ light-speed memory architecture, priceless. ATI isn't overpriced; they are very reasonably priced. If I wasn't a total performance junkie I'd be getting a Radeon instead of a GeForce 3, because the GeForce 3 is definitely overpriced.

Re:ATI stinks (1)

alen (225700) | more than 13 years ago | (#12474)

Check ebay. You can get brand new Geforce3's for about $300-$325.

Re:ATI stinks (1)

4n0nym0u$ C0w4rd (471100) | more than 13 years ago | (#10646)

I was talking about straight from a major store chain, where Joe Sixpack generally gets his card. At pricewatch I could get an ATI Radeon 64MB DDR for under $150. But then I'd probably spend hundreds of dollars on cooling equipment so I could seriously overclock it without frying the damn thing. Anyway since I just realized my lawsuit money all goes into a trust which will take a while to break, I'll probably end up getting a GeForce 10 by the time I get my money :)

Re:ATI stinks (0)

gamorck (151734) | more than 13 years ago | (#29003)

I just can't believe you are arrogant enough to compare a GeForce 3 to a first-gen ATI Radeon. Compare GeForce 2 prices to a Radeon. When the Radeon 2 comes out, feel free to compare it to the GeForce 3.

Gam
"Flame at Will"

Re:ATI stinks (1)

AA0 (458703) | more than 13 years ago | (#25474)

The Radeon is cheaper than GF2s, always has been. The value cards are similar prices, but the crappy MX cards can't compete with a Radeon LE. Your attitude towards technology stinks; you can't even accept the fact that a new (old) company is coming into 3D graphics and is doing it much better than Nvidia. Why spend all that cash on a GF3 card that has the worst-looking picture of any vid card on the market? It isn't hard to get speed when the processor leaves out any and all detail the image had. If ATI wanted to compete with the GF3 they could have released another MAXX card; it would have put the GF3 to shame, and still have been cheaper. The only thing wrong with ATI right now is that their drivers are slower than they should be; they are very stable, but can't compete in Win2k. Nvidia's drivers, on the other hand, have countless revisions and versions, and some work on some computers... great way to make a product.

Re:ATI stinks (1)

4n0nym0u$ C0w4rd (471100) | more than 13 years ago | (#31809)

I'll bet the farm that when the Radeon 2 comes out it will be cheaper than the GeForce 3 was when it came out. Also, the 1st-generation Radeon isn't that far below the GeForce 3; most gamer sites I've seen recommend getting the Radeon over the GeForce 3, or waiting for the price to go down. Which of course is moot since I am getting a GeForce 3; I just think saying ATI cards are overpriced crap is a little extreme and a lot inaccurate.

RSN (0)

gamorck (151734) | more than 13 years ago | (#30033)

Apparently ATI's next board will support pixel shader 1.4, while Nvidia's GeForce3 will only go up to ps 1.3
Yeah, ATI's board will show up real soon now, right? Seriously folks - newer cards embrace newer standards. ATI's card isn't out yet, though. But considering they have produced stuff for the Mac world for so long, I'm sure it will be out RSN. ATI really needs to get a leg up on Nvidia, because the fact is they are losing this war in the video card market. Every time I go to the computer store the Nvidia cards are always sold out, while the ATI cards sit stocked up aplenty on the shelves. This should tell you something about who is winning what war.

The Geforce2 MX blew ATI out of the water when it came down to price points. Not a single card in the Radeon lineup can compete with the low price power that the GF2MX card offers. Period. End of discussion.

Not only that, but ATI has a history of SHODDY driver support - they are slow to embrace new OSes and standards - and their cards ARE overpriced. Comparing a GeForce 3 price to a Radeon price isn't fair, folks - the Radeon gets raped on performance and features. Compare the GF2MX or GF2GTS to the Radeon - at least level the playing field.

Gam
"Flame at Will"

hardware specific code again? (2, Interesting)

Quixadhal (45024) | more than 13 years ago | (#30719)

I think the answer to that question is to rate just how flexible the current APIs can be. The two contenders (and please, let's try not to make MORE!) are OpenGL and DirectX. Nvidia has resurrected the venerable Amiga's idea of a fully programmable graphics processor, and I presume that ATI's post-Radeon chip will be similar.

So, which API allows one to most easily get at the GPU's coding power? How many hooks does the high level api have into the gpu's engine, and can the gpu get data from the api on the fly?

If anyone out there has worked with them, I'd be curious to hear what's present or lacking from the standards, and whether it's feasible to try to write GPU-level code abstractly.

What about OpenGL? (0)

Anonymous Coward | more than 13 years ago | (#3178)

Wasn't the big selling point of OpenGL that it was an open standard that was cross-platform? So much for that idea.

Re:What about OpenGL? (1)

bribecka (176328) | more than 13 years ago | (#30439)

Wasn't the big selling point of OpenGL that it was an open standard that was cross-platform? So much for that idea.

This article is talking about DirectX; it doesn't look like anyone is asking developers to write specifically to one piece of hardware.

In the case of OpenGL, however, it is an open standard and cross-platform, but unfortunately the marketing department at MS likes to push DirectX on any of its partner developers. Really, OpenGL is the way to go in just about every case where software will be going to multiple platforms, but since 99% of games come out for the PC only, it's probably a toss-up between OGL and DirectX. Add in the fact that many developers may be ignorant of anything outside the MS world, and the balance tips toward MS/DirectX.

Re:What about OpenGL? (1)

grazzy (56382) | more than 13 years ago | (#11943)

and why?

Microsoft actively develops nice new features to help developers; OpenGL has stuck to its standard for a looong time.

Re:What about OpenGL? (1)

bribecka (176328) | more than 13 years ago | (#15617)

Microsoft actively develops nice new features to help developers; OpenGL has stuck to its standard for a looong time.

But the "features" that microsoft develops are easily implemented in OGL, or can be added in as extensions by the vendor. That's the good part about OGL, each individual vendor can add its own extensions to suit whatever is needed.

Of course, then you have this situation, where some vendors support some things, and others support other technologies, etc, but the point of a "standard" is that it shouldn't change that often. Hence OpenGL is on v1.2, and DirectX is on 8. Most "standards" don't go too far up the version ladder.

Re:What about OpenGL? (0)

Anonymous Coward | more than 13 years ago | (#20878)

Add in the fact that many of the developers may be ignorant to anything outside of MS world

They aren't ignorant, they just don't see a profit in it. Most companies that develop games for non-Win32 platforms don't last long.

so much for the death of glide (0)

Anonymous Coward | more than 13 years ago | (#32614)

the more things change, the more they stay the same... wasn't this what DirectX and OpenGL were supposed to fix?

Quake 2 supported 3 architectures (0)

Anonymous Coward | more than 13 years ago | (#34540)

If you remember, Quake 2 had support for Glide, OpenGL, and PowerVR. I don't know what a pixel shader is, but if both cards support up to 1.3, you would think they would stick with that.

Re:Quake 2 supported 3 architectures (1)

_Neurotic (39687) | more than 13 years ago | (#934)

Nope, Quake 2 did not support Glide; rather, it supported the 3dfx mini-GL library (an OpenGL subset).

Quite the contrary: Quake 2 (much like Quake 1), in conjunction with the 3dfx mini-GL, did much to boost OpenGL support in games.

Neurotic

one word: x-box (0)

Anonymous Coward | more than 13 years ago | (#35412)

Developers will code for xbox, anything more is wasted effort now.

Writing code twice (1)

bribecka (176328) | more than 13 years ago | (#36340)

Actually, it doesn't say they would have to write code twice, just write more code to support both. Really, it's probably not a big deal at all, as every game/graphics engine should have pieces that are specific to the capabilities of the API--remember multitexturing? When that came out, there needed to be "extra" code to support it.

The fact is this article is talking about using DirectX, and that fact alone means that the codebase should be 98% the same for any graphics card. The difference between the ATI and Nvidia implementation is that some features may be enabled/disabled, or, in the case of the 6 texture in one pass deal (for the ATI), probably just a different ordering of API calls. This is not writing things to be hardware specific--on NT/2000, you can't even access the hardware directly (well, you *can* if you really try, but it just limits your audience and means more work for you)...you need to talk to HAL.

Aren't they now? (3, Interesting)

ajs (35943) | more than 13 years ago | (#3840)

Every time I get a game, there's a short list of graphics devices supported on the box. I always hear about the development of this or that game, in terms of specific card features.

Heck, I even remember Carmack talking on Slashdot [slashdot.org] about things like "Nvidia's OpenGL extensions" and other features of specific cards that he was having to take advantage of.

Yeah, the new whiz-bang game will probably be able to limp along on whatever you've got, but it will likely only be optimized for a few special cards.

The video-card industry has gotten really awful. I hope that someone pulls it back in line and we get back on a standards track where card manufacturers contribute to the standards efforts and then work hard to make the standard interface efficient.

Deja vu. (4, Insightful)

AFCArchvile (221494) | more than 13 years ago | (#40600)

"Does this mean that future games will be hardware specific?"

If so, it won't be the first time; remember the days of 3dfx? Original Unreal would only run on Glide hardware acceleration; if you didn't have a 3dfx card, you were forced to run it in software. Of course, this didn't sit well with the growing NVidia user base who consistently pointed out that Quake 2 and Half-Life both rendered on anything running OpenGL (including 3dfx cards; remember those mini-driver days?), and OpenGL and Direct3D renderers were finally introduced in a patch. That's about when 3dfx started to go down the toilet; delaying product releases and missing features (32-bit color and large texture support being two of the most blatant omissions) eventually tainted the 3dfx brand to the point of extinction.

Since then, 3D gaming has been a less lopsided world. Linux gaming was taken seriously. Standardised APIs that could run on almost anything were the rule; if it wasn't OpenGL, it would at least be Direct3D. Then the GL extensions war heated up, with NVidia developing proprietary extensions that would work only on their cards. But this wasn't a problem; you could still run OpenGL games on anything that could run OpenGL; you'd just be missing out on a few features that would only slightly enhance the scenery.

Leave it to Microsoft to screw it all up with DirectX 8. They suddenly started talking about pixel shaders and other new ideas. John Carmack has already described the shortfalls and antics of DX8 [planetquake.com] . And now 3D programmers will have to program for multiple rendering platforms, but at least you can still run it with anything.

Sure, this entire disagreement between ATI and NVidia is bad for the 3D industry, but things could be worse. A LOT worse.

Aren't they already somewhat h/w specific? (1)

pjdepasq (214609) | more than 13 years ago | (#43388)

I have a 3.5-year-old Gateway G6-300 with an STB (NVidia) graphics card. I can't run many of the new games these days since it does not support the OpenGL software internally. I've tried Quake3, and the updated software driver, but it's no go. As far as I'm concerned, many games are already hardware specific. :-(

Re:Aren't they already somewhat h/w specific? (0)

Anonymous Coward | more than 13 years ago | (#12191)

You should be able to get Q3 going on that setup. It's not pretty, but it should work. I have a K6 300 with an nvidia Riva128 and 64MB RAM. If you need help, drop me a line at tucker_1968@NOSPAM.yahoo.com

uh.. (0)

Anonymous Coward | more than 13 years ago | (#4339)

Intel and AMD, etc...

fp (-1, Redundant)

Anonymous Coward | more than 13 years ago | (#43549)

blurst roast!

This is good for hardware and software (3, Insightful)

Skynet (37427) | more than 13 years ago | (#43550)

This is good for hardware because ATI and NVidia will continue to push the envelope, developing more and more advanced graphics boards. Features will creep from one end to the other, just staggered a generation.

This is good for software because developers will have more choice in the hardware that they develop for. ATI doesn't support super-duper-gooified blob rendering? Ah, NVidia does in their new Geforce5. No worries, ATI will have to support it in their next generation boards.

A bipolar competition is ALWAYS good for the consumer.

Re:This is good for hardware and software (1)

(trb001) (224998) | more than 13 years ago | (#14800)

This is good for hardware because ATI and NVidia will continue to push the envelope, developing more and more advanced graphics boards. Features will creep from one end to the other, just staggered a generation.

Eh, I don't know. I would compare this to the late 80's, when computers were being developed by Amiga, IBM (and clones), Mac, Apple, etc. You had certain games/software that were available on a given platform and not the others; people just couldn't support multiple hardware configurations. As long as there are multiple companies producing competing products, is there really a reason they can't be compatible at the software level? Personally, I'd rather be able to look at a video card's features (memory, fps) than at what games I'm going to be able to play with it.

--trb

But the whole point of DirectX... (3, Insightful)

volpe (58112) | more than 13 years ago | (#15625)

...is that developers shouldn't HAVE to develop for specific hardware. I don't work in the game industry specifically, but I don't see how this is necessarily good for software in general, or graphics software in particular. This doesn't give developers "more choice in the hardware they develop for"; it gives them less choice, because they have to decide how to allocate limited resources on a per-platform basis. When you have a common API, you're not forced to choose in the first place. That's why hardware-specific features and capabilities ought to be abstracted out into a common API. What these guys should do is come up with a dozen or so different kinds of high-level magic (e.g. water waves, flame, smoke, bullet-holes, whatever) that they can do with their pixel and vertex shaders, lobby Microsoft to get that magic incorporated into the DirectX spec, and then supply drivers that meet those specs by sending a few pre-packaged routines to the pixel/vertex shaders, rather than have game developers worry about this stuff directly. Or am I missing something?

Re:But the whole point of DirectX... (1)

MartinG (52587) | more than 13 years ago | (#23093)

No, you're not missing anything. At LAST! Somebody who understands the importance of proper abstraction. The mistake, it seems, is not one by Nvidia or ATI, but by Microsoft. If DirectX was designed soundly, it should be possible for both card manufacturers to support the same API versions.

I was getting worried - the majority of posters so far seem to be nutcases.

Re:This is good for hardware and software (1, Interesting)

CMBurns (38993) | more than 13 years ago | (#23530)

> This is good for software because developers
> will have more choice in the hardware that they
> develop for.

Bullshit. That path leads straight to the darn old days where every game was board-specific.

> ATI doesn't support super-duper-
> gooified blob rendering? Ah, NVidia does in
> their new Geforce5. No worries, ATI will have to
> support it in their next generation boards.

Wrong, this should be: "ATI doesn't support super-duper-gooified blob rendering? Idiot, why did you buy that board in the first place? But no worries, NVidia has the new Geforce5, just spend 300+ bucks and get one. Unfortunately, this will break application Y, as only ATI has the new M.O.R.O.N. rendering which is required by Y. But hey, such is life!"

> A bipolar competition is ALWAYS good for the
> consumer.

This is not "bipolar competition", this is "fragmentation".

C. M. Burns

Re:This is good for hardware and software (3, Insightful)

bribecka (176328) | more than 13 years ago | (#28738)

A bipolar competition is ALWAYS good for the consumer.

You mean like when Netscape and IE were competing? In case you haven't noticed, HTML rendering between the two browsers hasn't exactly meshed.

Re:This is good for hardware and software (2, Insightful)

Xugumad (39311) | more than 13 years ago | (#43340)

How is this meant to be good for developers, or consumers? Developers now have three options:

  • Develop for NVidia based cards, which is slower if you have an ATI card
  • Develop for ATI based cards, completely ignoring the NVidia market
  • Develop for both, significantly adding to development effort

This is also terrible for the consumer. Sorry, but that new card you just spent a small fortune on doesn't support the pixel shader version the game you want uses. Oh well, you'll just have to upgrade to the next card when it comes out; hope that's okay. But don't worry, it will have lots of new features too (which no-one else's card will support).

Simple way to help you choose (5, Funny)

Dr_Cheeks (110261) | more than 13 years ago | (#464)

Just do what I do; utterly fail to save up for that latest bit of kit. Every game I've bought in the past year has supported my TNT2 M64 chipset. Sure, I can't render something like Shrek on my machine, but since I blew the graphics card money on beer I can't tell the difference.

Hell, Half Life and Doom are barely distinguishable from each other if your beer-goggles are thick enough. And it doesn't matter if the frame-rate slows down thru lack of processing power - your reactions are already terrible from the booze.

Yet again, beer is the cause of, and solution to one of life's problems (thanks to Homer for the [slightly paraphrased] quote).

Hmph (0)

Anonymous Coward | more than 13 years ago | (#5076)

Next they will be racing to see whose card will require the biggest HSF combo...

~AC#42

The problem is not usually for the Developers (3, Insightful)

LordZardoz (155141) | more than 13 years ago | (#6273)

It's much like the choice to support AMD's 3DNow or Intel's SIMD instructions. If you use DirectX 8 or OpenGL, the issue is usually dealt with by the graphics library and the card drivers. Some bleeding-edge features are initially only supportable by writing specific code, but that is the exception.

END COMMUNICATION

Re:The problem is not usually for the Developers (1)

Kareena Bhagnani (462993) | more than 13 years ago | (#1430)

It's much like the choice to support AMD's 3DNow or Intel's SIMD instructions.

...which are converging. The Palomino Athlon core now supports Intel's SSE opcodes as well as 3DNow, and it is promised that the Hammer will also support SSE2. One can only hope that Nvidia's and ATI's pixel shaders can also be comfortably converged into a common interface (it sounds like they pretty much will be in DirectX 8.1; hopefully it won't be long until there's a common ARB extension for them in OpenGL too).

Some bleeding-edge features are initially only supportable by writing specific code, but that is the exception.

And in the case of 3D hardware, the bleeding-edge features are sure to be used for extra "flash", not vital functionality. A game might have Phong-shaded, bump-mapped objects on a Radeon2, but it will still run, with slightly less exciting graphics, on your elderly TNT2.
