
NVIDIA's Pixel & Vertex Shading Language

CmdrTaco posted more than 12 years ago | from the i-want-my-graphics-3d dept.

Technology 263

Barkhausen Criterion writes "NVIDIA have announced a high-level Pixel and Vertex Shading language developed in conjunction with Microsoft. According to this initial look, the "Cg Compiler" compiles high level Pixel and Vertex Shader language into low-level DirectX and OpenGL code. While the press releases are going amok, CG Channel (Computer Graphics Channel) has the most comprehensive look at the technology. The article writes, "Putting on my speculative hat, the motivation is to drive hardware sales by increasing the prevalence of Pixel and Vertex Shader-enabled applications and gaming titles. This would be accomplished by creating a forward-compatible tool for developers to fully utilize the advanced features of current GPUs, and future GPUs/VPUs." "
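For context on what "high-level" means here: a language like Cg replaces hand-written, register-level shader assembly with C-like code. A minimal sketch of what a Cg-style vertex program might look like (the structure and names below are illustrative, not taken from NVIDIA's materials):

```cg
// Hypothetical minimal Cg-style vertex program: transform a vertex into
// clip space and pass its color through to the rasterizer.
struct VertexOutput {
    float4 position : POSITION;
    float4 color    : COLOR0;
};

VertexOutput main(float4 position : POSITION,
                  float4 color    : COLOR0,
                  uniform float4x4 modelViewProj)
{
    VertexOutput OUT;
    OUT.position = mul(modelViewProj, position);  // object -> clip space
    OUT.color    = color;                         // pass-through
    return OUT;
}
```

The compiler's job is then to translate a program like this into the low-level DirectX vertex shader or OpenGL vertex program instructions described in the summary.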


263 comments


3dfx/Glide part 2? (2, Interesting)

PackMan97 (244419) | more than 12 years ago | (#3695599)

Hopefully NVidia will be able to avoid the proprietary pitfall that ultimately doomed 3dfx and Glide.

From the story it sounds like NVidia will allow other cards to support Cg, so maybe they can. However, I wonder if ATI will be willing to support a standard which NVidia controls. It's like wrestling with a crocodile, if you ask me!

Re:3dfx/Glide part 2? (1)

Jucius Maximus (229128) | more than 12 years ago | (#3695637)

"Hopefully NVidia will be able to avoid the proprietary pitfall that ultimately doomed 3dfx and Glide."

NVidia is avoiding at least one 3dfx pitfall ... their product is not simply a beefier version of the previous product (which in itself was a beefier version of the previous product (which in itself was a beefier version of the previous product (which in itself was a beefier version ... )))

Re:3dfx/Glide part 2? (2)

BWJones (18351) | more than 12 years ago | (#3695658)

Hopefully NVidia will be able to avoid the proprietary pitfall that ultimately doomed 3dfx and Glide.

Given that it was nVidia that purchased 3dfx and brought along many of the employees, I should hope that there would be some internal discussion of this.

Re:3dfx/Glide part 2? (2)

SirSlud (67381) | more than 12 years ago | (#3695730)

> like NVidia will allow other cards to support Cg so maybe they can

Um, well, Cg gets compiled to DirectX or OpenGL, so it follows that any card that can do DirectX or OpenGL (read: all of them?) will benefit from Cg. I guess different cards offer different levels of support, but if they want this to fly, it'd be in their best interest to generate multi-card-compatible code. Or at least allow you to specify what extensions your generated code will support, to tailor it to specific card feature sets? Correct me if I'm confused, if anyone is really in the know.

I think the idea here is that you could use this language to write new shaders for cards on the market _now_ ... the gain is that it supports two targets (DX and OpenGL) and seems a significantly easier way of incorporating new shaders into games than current methods?

sorta... (2)

Steveftoth (78419) | more than 12 years ago | (#3695793)

There's DirectX, there's DirectX 8.1, and there's DirectX 8.1a.
Remember when the Radeon first came out? They had to release a special DirectX just to support its pixel shaders, as opposed to just NVIDIA's.
So as a game developer you'll probably have to compile your Cg code with the NVIDIA backend and the ATI one just to make it work (better).
This tool will really help those Xbox developers.
Same thing with OpenGL: the spec isn't nailed down yet, and NVIDIA is 'leading the pack' of development. It wouldn't surprise me if they decided not to support any other cards with the OpenGL compiler (which they haven't even released yet).
So hopefully this will NOT turn into a Glide-type issue, since this is actually a level above Glide. Glide was very low-level (the Glide functions mostly mapped directly onto the 3dfx hardware), while this is a little more abstract.

Re:3dfx/Glide part 2? (4, Informative)

Dark Nexus (172808) | more than 12 years ago | (#3695774)

Well, they're quoted in this [com.com] article on ZDNet (the quote is near the bottom) as saying that they're going to release the language base so other chip makers can write their own compilers for their products.

That was the first thing that popped into my head when I read this article, but it sounds like they're going to give open access to the standards, just not to the interface with their chips.

Programming language != API (0)

Anonymous Coward | more than 12 years ago | (#3695781)

Read Slashdot's description: ""Cg Compiler" compiles high level Pixel and Vertex Shader language into low-level DirectX and OpenGL code"

Cg doesn't need ANY support from the video card.

This is no different from other programming languages. If someone created a new programming language and an x86 assembler for it, then code written in that language would run on ANY x86 CPU. Similarly, code written in Cg would run on ANY video card which has Pixel and Vertex Shaders.

There is a great interview [cgshaders.org] with David Kirk (Chief Scientist at NVIDIA) which covers other Cg features, like on-the-fly compilation of Cg programs.

Good metaphor, poorly executed (0)

Anonymous Coward | more than 12 years ago | (#3696003)

Indeed, but let's take the case of x86 code generated by Intel's compiler: of course there is no 3DNow! support (the equivalent of PS 1.4), and of course code generation is optimal for Intel's pipeline, etc.

So yes, code generated by such a compiler can run on anything, but it will run best on whatever the person who controls the compiler wants it to run best on. Which is to say, NVIDIA hardware.

Re:3dfx/Glide part 2? (1)

dextr0us (565556) | more than 12 years ago | (#3695930)

IF you had read the article instead of just posting first, YOU WOULD SEE THAT IT IS CROSS-PLATFORM!

THE FIRST PARAGRAPH!

Graphics giant NVIDIA today announced Cg, an initiative with participation from Microsoft to create a cross-platform, hardware-independent, high-level Pixel and Vertex Shader programming language.

Re:3dfx/Glide part 2? (1)

JebusIsLord (566856) | more than 12 years ago | (#3695949)

Argh, this is such a common misconception. This is NOT A NEW, PROPRIETARY API like Glide was; it is simply a high-level programming language that generates Direct3D and OpenGL code so the programmer doesn't have to worry about it. This is fantastic! It also apparently will work fine with other chip architectures. Everyone wins!

Linux Support (-1, Flamebait)

l33t j03 (222209) | more than 12 years ago | (#3695603)

I heard that they won't be offering any Linux drivers because they are afraid that associating themselves with the word Linux would result in the company's downfall.

Re:Linux Support (3, Informative)

friedmud (512466) | more than 12 years ago | (#3695679)

What are you talking about?? Nvidia makes great linux drivers - and from looking through the pages it looks to me like Cg just outputs regular OpenGL (Well - Nvidia-OpenGL anyway) so I would venture a guess that any of these will run just fine on the nvidia linux drivers.

My only problem is that the toolkit itself is only for windows :-(

Anyone try it with Wine/Winex yet?? I might when I get home.

Derek

In Fact...... (3, Informative)

friedmud (512466) | more than 12 years ago | (#3695765)

From Nvidia's Homepage [nvidia.com] you can check out the press releases and find this:

"NVIDIA's Cg Compiler is also cross platform, supporting programs written for Windows®, OS X, Linux, Mac and Xbox®."

So maybe even though the tools aren't cross platform - the compiler is. I think this is a Great step forward towards OpenGL 2.0 - this is showing that Windows doesn't have to be the only platform to write graphically intensive applications for.

Derek

Re:Linux Support (1)

jra101 (95423) | more than 12 years ago | (#3695782)

There will be a linux compiler.

Re:Linux Support (1)

Mr. McGibby (41471) | more than 12 years ago | (#3695913)

Who the hell cares if the toolkit works in Linux? All I need is a compiler, which I'm sure will be released for Linux.

Re:Linux Support (1)

friedmud (512466) | more than 12 years ago | (#3695987)

Did you see the screenshots of the toolkit??? They were previewing effects IN REAL-TIME! That would save anyone a load of time.

What you said is basically like saying - "I don't need a C++ debugger for linux as long as I have my trusty compiler!"

That may be correct for small/medium projects - but we all know that debuggers (like gdb) save us loads of time.

Derek

Re:Linux Support (0)

Anonymous Coward | more than 12 years ago | (#3695982)

YHBT. YHL. HAND.

News.com beat ya. (2)

aetherspoon (72997) | more than 12 years ago | (#3695604)

News.com [com.com] had this story for a while.

My biggest question: from reading this, it would actually work correctly on competing video cards... so why did nVidia create it?

Re:News.com beat ya. (2)

brejc8 (223089) | more than 12 years ago | (#3695684)

If they didn't ATI would

It supports other cards sub-optimally (0)

Anonymous Coward | more than 12 years ago | (#3695756)

They control what the backend supports, and it supports only the baseline DX shaders ... since that's all their hardware supports, but their competition has a larger feature set and more advanced DX shader versions, you can see where the advantage is for them ...

They control the backend, and the backend will always be optimal for their hardware. It only supports their competitors' hardware because that increases use by developers ... that's a win-win situation: appear to have a cross-platform standard, but at the same time stack the deck in your favour.

Re:It supports other cards sub-optimally (1)

ThrasherTT (87841) | more than 12 years ago | (#3695968)

They are letting other vendors build their own backend. Think gcc for GPUs...

Who says? (0)

Anonymous Coward | more than 12 years ago | (#3696029)

I have seen speculation, but nothing directly attributed to NVIDIA.

Re:News.com beat ya. (1)

Merlin42 (148225) | more than 12 years ago | (#3695872)

'Cause 3Dlabs [3dlabs.com] already created it ... see their OpenGL 2.0 proposal [3dlabs.com]. My fear is this is _only_ a preemptive strike against the "full programmability" of the 3dlabs P10, which will sooner or later have a consumer version. High-level, interoperable programmability is very much needed to access the power of current and more complicated future cards.

NVidia != compatibility? (1)

FueledByRamen (581784) | more than 12 years ago | (#3695608)

Remember 3dfx's GLIDE libraries? This could end up like those... an "industry standard" supported only by one manufacturer's chipsets, used by all major games. At least 3dfx made good, cheap cards before they died, though.

If it doesn't work with my RADEON, it must be evil!

Re:NVidia != compatibility? (1)

neo8750 (566137) | more than 12 years ago | (#3695823)

I remember Glide as being incompatible with anything without a 3dfx chipset. But in return for the incompatibility we got great performance enhancements.

Pixels & Hardware sales (1)

NickRob (575331) | more than 12 years ago | (#3695610)

If we wanted cutting edge Pixels, we'd go back and play Wolfenstein. Man, I remember those days, the people with sharp features and a whole four frames of animation. And we were glad to have it, too.

Hype or innovation? (3, Insightful)

moonbender (547943) | more than 12 years ago | (#3695619)

You've got to wonder: is this yet another load of Nvidia corporate hype (a la "HW TnL will revolutionise gaming"), or is this useful technology? I wouldn't trust any of the current articles to answer that; judging by previous Nvidia hype, it takes a few months until anyone really knows if this is good or bad.

Re:Hype or innovation? (2, Insightful)

array_one (582818) | more than 12 years ago | (#3695652)

Ummm... T&L DID revolutionize the game industry. We are at a point where companies don't have to worry about pushing polygons. Now they are finally moving on to actually improving visual quality, as opposed to geometric complexity. Have you seen what ID has been up to lately?

Re:Hype or innovation? (3, Insightful)

Pulzar (81031) | more than 12 years ago | (#3695871)

Have you seen what ID has been up to lately?

Have you read about how much effort JC has put into pushing polygons in Doom 3? We're hardly at a point where companies don't have to worry about speed issues.

If anything, companies have to put even more effort into producing stunning results, because everybody has been spoiled by recent titles.

Re:Hype or innovation? (1)

MisterBlister (539957) | more than 12 years ago | (#3696086)

He isn't putting very much effort at all into pushing polygons. It's going into handling the multipass texture effects.

HW T&L (or rather HW T, since the L part is of dubious use; who does simple vertex lighting anyway?) *IS* a huge boon to gaming. Programmers don't need to sweat the details of polygon culling as they did before, with elaborate PVS/BSP setups. Once GF3 and above are the norm, in almost all cases you can get away with just a very loose frustum cull, as long as you render most objects front to back (to take advantage of built-in z occlusion and guard-band clipping).

Yes, there's always going to be more work to fill up the savings in work from other areas but as that happens the visual quality of the games is rapidly improving because more and more stuff is being properly solved for general cases.

In a couple more iterations, good enough global illumination and shadowing will be 'solved' as well, and then the programmers will move on to something else as the primary focus.

Of course none of this particularly revolutionizes GAMING, as the game industry is free to keep making the same games with better graphics (and this seems to be their general game plan), but you can't hold NVidia, ATI, etc responsible for that.

Re:Hype or innovation? (3, Interesting)

UnknownSoldier (67820) | more than 12 years ago | (#3695829)

> You've got to wonder, is this yet another load of Nvidia corporate hype (a la "HW TnL will revolutionise gaming")

Have you *even* done *any* 3d graphics programming?? HW TnL *offloads* work from the CPU to the GPU. The CPU is then free to spend on other things, such as AI. Work smarter, not harder.

I'm not sure what type of revolution you were expecting. Maybe you were expecting a much higher poly count with HW TnL, like 10x. The point is, we did get faster processing. Software TnL just isn't going to get the same high poly count that we're starting to see in today's games with HW TnL.

> it takes a few months till anyone really knows if this is good or bad.

If it means you don't have to waste your time writing *two* shaders (one for DX, and other for OpenGL) then that is a GOOD THING.

Re:Hype or innovation? (4, Insightful)

friedmud (512466) | more than 12 years ago | (#3695928)

"If it means you don't have to waste your time writing *two* shaders (one for DX, and other for OpenGL) then that is a GOOD THING."

Even better than that! It means you don't have to waste your time writing *4* shaders:

Nvidia/DirectX
Nvidia/OpenGL
ATI/DirectX
ATI/ OpenGL

That is, of course, pending a compiler for ATI cards - but I don't think it will be long... Unless ATI holds out for OpenGL2 - but between now and when OGL2 comes out there is a lot of time to lose market share to Nvidia, because people are writing all of their shaders in Cg and ATI is getting left out in the rain...

So I would expect ATI to jump on this bandwagon - and quick!

Derek

Re:Hype or innovation? (1)

Paolomania (160098) | more than 12 years ago | (#3695952)

The thing that the original poster didn't *see* is that offloading TnL onto the GPU does not necessarily improve the *visual* quality of the game. Rather, it frees up system resources for the important parts of a game that unfortunately don't go into making good *screenshots*.

More likely .... (0)

Anonymous Coward | more than 12 years ago | (#3695859)

they're trying to push something that gives them a leg up on ATI - perhaps making it easier to use the things their chip does well than the things ATI's does ... these days PC 3D is a VERY cut-throat world

Re:Hype or innovation? (1)

blink3478 (579230) | more than 12 years ago | (#3695936)

This isn't hype, it's a natural progression in computer graphics.

What you've been able to do in 3D animation software for ten years now (bump mapping, specular mapping, displacement, self-illumination, glossiness maps, etc.), you're finally able to do in real time using the latest video cards.

In another few years, cards will probably be able to do real-time shadow effects (Doom 3), normal mapping, and eventually raytracing, for those nice reflection and refraction effects.

In the future, the 'top of the line' rendering effects that 3D Studio Max, Softimage, and Maya are just getting into (global illumination, caustics, hair, deep shadows) will eventually be processed in real time as well... when processing power and video cards catch up.

D
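The per-pixel material effects described above (bump, specular, and gloss maps) are exactly what pixel shaders express. As a hedged illustration, a Cg-style fragment program for a textured surface with per-pixel diffuse and specular lighting might look roughly like this (all names here are hypothetical, and the sketch ignores real shader-profile limits):

```cg
// Hypothetical Cg-style fragment program: diffuse texture modulated by
// per-pixel Lambertian lighting, plus a Blinn-Phong specular highlight.
float4 main(float2 uv       : TEXCOORD0,
            float3 normal   : TEXCOORD1,   // interpolated surface normal
            float3 lightDir : TEXCOORD2,   // direction to the light
            float3 halfVec  : TEXCOORD3,   // half-angle vector for specular
            uniform sampler2D diffuseMap,
            uniform float shininess) : COLOR
{
    float3 N = normalize(normal);
    float diffuse  = max(dot(N, normalize(lightDir)), 0);
    float specular = pow(max(dot(N, normalize(halfVec)), 0), shininess);
    return tex2D(diffuseMap, uv) * diffuse + specular;
}
```

Whether a given card could run something like this would depend on which shader profile the compiler backend targets; early pixel shader versions were far more limited than this sketch assumes.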

Re:Hype or innovation? (1)

ThrasherTT (87841) | more than 12 years ago | (#3696032)

It's neither. This is simply the natural next step in the 3D graphics world. Once a standard high-level GPU programming language is adopted, programmers can spend a LOT more time doing stuff that makes for better games, while keeping the eye candy at a very high level (across any decent piece of hardware and 3D API). When I saw this PR, I wept with joy at the thought of being able to forget all the ASM-level tuning that needs to be done currently to get high performance from this generation of 3D HW. And I don't even have to wait 10 years for OpenGL 2.0!

Proprietary standards? (1)

Lord Fren (189373) | more than 12 years ago | (#3695627)

While this is a great move by NVIDIA to increase the use of Pixel and Vertex Shaders in games, is it wholly proprietary? I mean, wouldn't it be better for ATI to have a hand in it as well, to work out a standard to make it easier for game developers? I just hope this doesn't turn out like 3dfx..

Re:Proprietary standards? (2)

startled (144833) | more than 12 years ago | (#3695722)

"While I this is a great move by NVIDIA to increase the use of Pixel and Vertex Shader in games, is this wholly proprietary?"

Cg compiles down to OpenGL and DirectX statements, which are not proprietary. Some of the statements are recent extensions to support the kind of stuff they want to do. So, yes, other companies can support these as well. However, they might be following a target being moved around at will by Nvidia. "Oh, you don't support DirectX 9.001's new pixel puking extensions?"

It remains to be seen how it's used. Obviously, Nvidia wants to use this to sell their cards. But MS doesn't have to listen to them when designing DirectX, either. It seems to me that at the very least, it'll be faster than writing separate old-school (last week) vertex and pixel shader code for each different brand.

zerg (2)

Lord Omlette (124579) | more than 12 years ago | (#3695630)

The article writes, "Putting on my speculative hat, the motivation is to drive hardware sales by increasing the prevalence of Pixel and Vertex Shader-enabled applications and gaming titles. This would be accomplished by creating a forward-compatible tool for developers to fully utilize the advanced features of current GPUs, and future GPUs/VPUs."
Putting on my speculative hat, the motivation is to, um, make better looking graphics?

The real test will be how well the cross-compiler outputs OpenGL 2 & DX 9 shaders in practice, not in theory.

But let's be serious: cel shading is the only shading anyone really needs. ^^


Good times (1)

Procrasturbator (585082) | more than 12 years ago | (#3695638)

I'm buying one right away, and praying that they become industry standard. The next "Amy men" game will be all the sweeter, along with Pac-man 3D!

Re:Good times (2, Funny)

Score Whore (32328) | more than 12 years ago | (#3696044)

The next "Amy men" game will be all the sweeter...


Yes, I agree! We all like a good cross-dressing game for the early morning hours at lan parties!

Bloated code? (0)

creative_name (459764) | more than 12 years ago | (#3695643)

But will this result in the bloated code so prevalent in other Microsoft applications? Anyone who has ever seen the source of Billy Joe's webpage made with FrontPage can attest that the Redmond crew love to throw in all sorts of extra nonsense. In an area that is already so resource-intensive, can we really afford bloated code? Hopefully it will be a non-issue.


feh (0)

Anonymous Coward | more than 12 years ago | (#3695646)

This means nothing to me until John Carmack gives the seal of approval. Until then it might as well be BitBoys "Oy! Cg!"

More Cross-Platform Games? (0)

Anonymous Coward | more than 12 years ago | (#3695661)

Now, it should be easier to port code to non-MS platforms because the compiler outputs both DirectX and OpenGL.

This could be really good. (3, Informative)

Steveftoth (78419) | more than 12 years ago | (#3695680)

According to the web site, they are working to implement this on top of both OpenGL and DirectX. On linux and Mac as well.
Basically this is a wrapper for the assembly that you would have to write if you were going to write a shader program. It compiles a C-like (as in look a like ) language into either the DirectX shader program or the OpenGL shader program. So you'll need a compiler for each and every API that you want to support. Which means that you'll need a different compiler for OpenGL/Nvidia and OpenGL/ATI until they standardize it.

On a more technical note, the lack of branching in vertex/pixel shaders really needs to be fixed, it's really the only feature that they need to add to them. Which is why the Cg code looks so strange, it's C, but there's no loops.
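To illustrate the "C look-alike" point, here is a sketch of what a minimal Cg vertex program looks like (the identifiers are illustrative, modeled on NVIDIA's published examples, not taken from the article):

```cg
// Minimal diffuse-lighting vertex program (sketch).
// The :POSITION / :COLOR0 annotations are "semantics" that bind
// struct members to hardware registers.
struct appdata {
    float4 position : POSITION;
    float3 normal   : NORMAL;
};

struct vertout {
    float4 hpos : POSITION;
    float4 col0 : COLOR0;
};

vertout main(appdata IN,
             uniform float4x4 modelViewProj,
             uniform float3   lightDir)
{
    vertout OUT;
    OUT.hpos = mul(modelViewProj, IN.position);      // clip-space transform
    float diffuse = max(dot(IN.normal, lightDir), 0.0);
    OUT.col0 = float4(diffuse, diffuse, diffuse, 1.0);
    return OUT;
}
```

Notice there is no loop and no branch anywhere: the whole program is straight-line code, which matches what the underlying shader assembly can express.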

Re:This could be really good. (0)

Anonymous Coward | more than 12 years ago | (#3696041)

Cg does support loops and branching -- check the language spec.

Depending on the hardware profile targeted for a given shader (e.g., requesting the shader be compiled to DX8 vertex shaders), certain limitations may be imposed, such as requiring that all loops be unrollable at compile time -- this makes programming and readability easier without really improving the programmability of the GPUs. Other profiles may have these restrictions removed, so that general loops and branching are fully supported.
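As a sketch of what "unrollable at compile time" means (hypothetical code, not from the spec; `weight` and `boneMatrix` are assumed uniform parameters): a loop whose trip count is a compile-time constant can be expanded into straight-line code, so no branch instruction is ever needed.

```cg
// Hypothetical sketch: blend four weighted matrix transforms
// (e.g. vertex skinning). Because the loop bound is a constant,
// a DX8-class profile can unroll it into four copies of the body...
float4 pos = float4(0, 0, 0, 0);
for (int i = 0; i < 4; i++) {
    pos += weight[i] * mul(boneMatrix[i], IN.position);
}

// ...which is equivalent to the compiler emitting:
// pos += weight[0] * mul(boneMatrix[0], IN.position);
// pos += weight[1] * mul(boneMatrix[1], IN.position);
// pos += weight[2] * mul(boneMatrix[2], IN.position);
// pos += weight[3] * mul(boneMatrix[3], IN.position);
```

A loop whose bound depends on a runtime value could not be unrolled this way, which is why such profiles reject it.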

Assembly vs. High-level? hmmm.... (3, Funny)

rob-fu (564277) | more than 12 years ago | (#3695683)

That's like asking which of the following would I rather do...

a) have a 3-way with two hot chicks
b) clean the floor behind my refrigerator

I wonder.

Options (0)

Anonymous Coward | more than 12 years ago | (#3696046)

Which option goes with which answer...

????

Pixel and Vertex Shading and OpenGL2.0? (0, Offtopic)

Ace Rimmer (179561) | more than 12 years ago | (#3695686)

Does OpenGL 2.0 have a chance now? Is anyone able to compare the two?

Re:Pixel and Vertex Shading and OpenGL2.0? (3, Informative)

alriddoch (197022) | more than 12 years ago | (#3695739)

It seems to me that this is probably an attempt to kill OpenGL 2.0 and secure DirectX as the dominant 3D API. OpenGL 2.0 has, as far as I can tell, been well thought out, and most of the feedback on it has been very positive. The frontend to its shader language is Free Software, and the work seems to have been done with the best of intentions. I am very cynical about an offering from NVIDIA, especially when you consider their behavior towards the rest of the 3D card market, and the fact that Microsoft is involved.

Re:Pixel and Vertex Shading and OpenGL2.0? (2)

afidel (530433) | more than 12 years ago | (#3695846)

Since NVidia sits on the OpenGL 3.0 steering committee and was the first to offer pixel shader extensions in their 2.0 drivers, I think you are being a little reactionary to M$'s presence. For one thing, the high end of NVidia's line, where they make about 8X the margins, is in CAD and the like, which will likely never be ruled by D3D. BTW, the interface to NVidia's pixel shader pipeline exposed by the OpenGL extensions is much cleaner and better thought out than DX8's. See the recent John Carmack .plan update where he ranted about this fact.

Re:Pixel and Vertex Shading and OpenGL2.0? (0)

Anonymous Coward | more than 12 years ago | (#3695855)

>> It seems to me that this is probably an attempt to kill OpenGL 2.0, and secure Direct X as the dominant 3D API.

Yeah..... it would be easier to assume that than read the damn article.

If you hadn't noticed, the Cg compiler can generate shaders for DirectX AND OpenGL.

FUD FUD FUD. Welcome to Linux Land.

Re:Pixel and Vertex Shading and OpenGL2.0? (2)

Viking Coder (102287) | more than 12 years ago | (#3695965)

Cg compiler can generate shaders for OpenGL 1.4. Not 2.0. BIG difference.

It would be easier to read the damned article.

OpenGL 1.4 is a completely different beast than OpenGL 2.0. Cg is a direct competitor (and attempt to kill) OpenGL 2.0, and secure NVidia as the dominant provider of 3D APIs.

Tell me, Anonymous Coward, why you think that NVidia made Cg instead of supporting OpenGL 2.0 on their hardware? Try not to use words like "monopoly", "closed standard", and "platform specific."

Am I Deformed!? (0, Funny)

Metrollica (552191) | more than 12 years ago | (#3695690)

Hi.

I'm twenty five years old, and up until two weeks ago, I was a virgin. Too many years of celibacy had worn my self-esteem down to the point where I was finally willing to pay for sex. I'll spare you the details of the event, as this is not what I am writing about.

After having completed the act, the prostitute whose services I had rented immediately exclaimed that something had felt weird. With no particular ceremony, she grabbed my now-flaccid member and subjected it to an intense examination, while biting her thumbnail in consternation.

After a brief period, she informed me that my penis was deformed, in her professional opinion. I had spent my entire life without ever seeing another man urinate, so I was not aware that the output usually emits from the end of the head, not the underside, where mine does.

I'd like to know if I should seek the advice of a doctor or plastic surgeon? Is this the sort of thing that can, or even should be corrected? I've lived with it for twenty five years, and it hasn't bothered me. Is there really any reason to worry about this?

Re:Am I Deformed!? (-1)

YourMissionForToday (556292) | more than 12 years ago | (#3695837)

Hey Metrollica, I stole your ASCII art, but I am too dumb to get it to post correctly. Could you help me? Thx and plz visit my journal, you are the best!

No compiler is as good as human (0)

Anonymous Coward | more than 12 years ago | (#3695691)

No matter how good the compiler is, it will never be as efficient as a person writing in solid OpenGL and DirectX.

Also, OpenGL and DirectX are most commonly used with C++, a third-generation language.

I sincerely doubt this new language is LISP-like (a higher-level language than C++); it seems this new "language" is little more than some scripting.

Damnit (1)

sheepab (461960) | more than 12 years ago | (#3695697)

I just bought my GeForce 4 TI4600. *sigh* Looks like Ill have to give my other ARM and LEG to pay for the upcoming GeForce 5.

Re:Damnit (2)

SirSlud (67381) | more than 12 years ago | (#3695748)

This technology is compatible with your current card, is it not? My impression is that Cg simply makes it easier to generate the same OpenGL and DirectX code games are feeding your GF4 with now. It's there to ease the work for the programmer and allow folks to concentrate more on the design of the shaders than on their in-code implementation.

Re:Damnit (1)

Phosphor3k (542747) | more than 12 years ago | (#3695769)

Did you expect any different?

And instead of shelling out $300 for your GF4 TI4600, you should have gotten a Gainward GF4 TI4200 for $150 (shipped, check pricewatch.com), especially considering they easily overclock to TI4600 speeds with no problem.

Re:Damnit (1)

sheepab (461960) | more than 12 years ago | (#3695800)

I got the Gainward GF4 TI4600....extremely overclockable *nerd smile face*

Re:Damnit (1)

Phosphor3k (542747) | more than 12 years ago | (#3696081)

.....you have my approval.

Is this happenning because of Xbox on Nvidia? (2)

t0qer (230538) | more than 12 years ago | (#3695713)


One has to wonder if this alliance grew out of the current relationship NVidia and MS have with the Xbox.

Re:Is this happenning because of Xbox on Nvidia? (1)

PissingInTheWind (573929) | more than 12 years ago | (#3695740)

Probably somewhat -- to help developers make better-looking graphics (or at least make them more easily) on the XBox.

The article claims "cross-platform compatibility" on Windows, Linux, Mac and the XBox.

Re:Is this happenning because of Xbox on Nvidia? (2)

brejc8 (223089) | more than 12 years ago | (#3695804)

What will happen is that Nvidia only develops the standard DirectX compilers and leaves it open for anyone else to do the others.

This keeps Micro$oft happy, as they will get the best drivers, while not insulting the OpenGL community.

They do NOT want everyone else to turn their back on Nvidia.

Although most of the customers use M$, most of the people who advise them don't. I have advised tons of people on what they want in their PC, and I base the advice on how "nice" the company is.

good for mac games (1)

paradesign (561561) | more than 12 years ago | (#3695776)

this is good because now it will be easier to create cross platform games. which means more games for linux/mac. that is assuming i read it correctly.

Re:good for mac games (1)

TheTrunkDr. (516695) | more than 12 years ago | (#3696036)

um... you didn't read correctly; this really won't make much difference to Mac gaming. All this does is make a 3D graphics programmer's life a little easier. If he's making Windows or Mac games it's pretty irrelevant -- they both get easier. This isn't any incentive to start making a Mac game instead of a Windows (or whatever) one.

proprietary--scrap it (0)

Anonymous Coward | more than 12 years ago | (#3695788)

it's not GPL so screw it.

somebody write a nice GPL'd compiler for OpenGL quick and end this BS before it gets too far.

what a waste of effort to keep something nice like this closed.

One must wonder (1)

miffo.swe (547642) | more than 12 years ago | (#3695838)

I don't think it would be such a good idea if one vendor controlled the backends to the cards. I do dislike DirectX -- not because it is a worse standard than OpenGL, but because it's not available on every platform. If Microsoft has a finger in the game it sure smells funny; that finger is up someone's butt if you ask me. Sure, it can generate OpenGL, but I would almost presume that the day it gets really widely used, it stops doing that, or does it in a less efficient way.

Microsoft have never done anything without a hidden agenda (Microsoft Bob not included).

Sample Code (1)

JanusFury (452699) | more than 12 years ago | (#3695844)

[BEGIN]
SET_PIXELFORMAT(SHINY)
ADD_BUMP_MAPS
BLEND_REFLECTION
SET_TRANSPARENCY(0.5)
SET_TEXTURE("WALL")
[END]

[BEGIN]
SET_PIXELFORMAT(WET)
ADD_BUMP_MAPS
BLEND_REFLECTION
SET_TRANSPARENCY(0.3)
SET_COLOR(BLUE)
ADD_FISHIES(YELLOW)
[END]

Syntax (0)

Anonymous Coward | more than 12 years ago | (#3696062)

Them _ are sure used sparingly in the second example... expand for us stupid coders...

Just what are shaders? (0)

Anonymous Coward | more than 12 years ago | (#3695845)

I see things on shaders all the time, but I don't really understand the difference between pixel and vertex shading... does OpenGL implement pixel/vertex shading now? Do things really look a lot better using shaders instead of just phong shading and texture mapping? What games out now use pixel and vertex shaders?

Confused... links would be appreciated.

Inefficiencies (5, Interesting)

Have Blue (616) | more than 12 years ago | (#3695856)

One of these days, nVidia will ship a GPU whose functionality is a proper superset of that of a traditional CPU, and then we can ditch the CPU entirely. Just like MMX, but backwards. This is a recognized law of engineering [tuxedo.org] . At that point, Cg will have to become a "real" compiler. Let's hope nVidia is up to the task...

Re:Inefficiencies (1)

Arandir (19206) | more than 12 years ago | (#3695976)

Damn straight! That CPU just gets in the way. CPUs are for people wanting to use spreadsheets, word processors, accounting, and parsing XML data. Real people just want to play games.

When the GPU can compile its own code, the computer will finally be freed from those tyrants who want to use computers for productivity.

Interesting Comparison (2, Funny)

NitsujTPU (19263) | more than 12 years ago | (#3695877)

Did everybody read the comparison between writing in CG and writing hand-optimized assembly code?

Thank GOD they wrote CG, because now I won't have to write all of my programs in assembly anymore.

What is this "compiler" technology that they keep talking about? This might revolutionize computer science!

Not another one of those program-your-gpu lingos (0)

AltaMannen (568693) | more than 12 years ago | (#3695891)

I probably shouldn't be programming 3D videogames, since I hate touching hardware chips and much prefer the solutions of Nintendo ("OpenGL-like is the only rendering interface") to the solutions of Sony and Microsoft. But why actually make this shit high-level? OK for the purpose of writing some sort of optimizer, but not for the rendering stuff, as I will eventually have to suffer through using something like this.

No, I don't think this language will kill off all the other graphics-wannabe languages, but I think it will join them. Why can't consumers start buying games based on the physics, behaviours, AI, collision and all the other things that are fun to work with, instead of basing their purchases purely on graphics?

Ranting aside, what does it actually do apart from setting up the material settings before you do the rendering? And isn't that just stuffing parameters into some registers anyway?

Analogy (1)

Viking Coder (102287) | more than 12 years ago | (#3695893)

Cg is to OpenGL 2.0
as DirectX is to OpenGL

It's a closed ("partially open") standard, for a subset of hardware, which is not as forward looking as a proposed competing standard.

Support OpenGL 2.0!

Re:Analogy (1)

WinterSolstice (223271) | more than 12 years ago | (#3696013)

I couldn't agree more. Did that article actually say "low-level Direct X and OpenGL"?

I've written in OpenGL for a long time (first OpenGL game was about when 3dFX came out...) and it is not hard. Freakin' whiners. First assembler is too hard (though I think they do have a point), then C is too hard, then OpenGL is too hard, now DirectX is too hard?

Here's an idea... if you can't write code that is fast, useable, and maintainable GO GET ANOTHER JOB. Stop writing additional layers of abstraction to hog up the CPU and disk.

-WS

Yeah, but... (2, Insightful)

NetRanger (5584) | more than 12 years ago | (#3695897)

There are some issues that I think nobody seems to be addressing, as in:

* Realistic fog/smoke -- not that 2-D fog which looks like a giant translucent grey pancake. Microsoft comes closer with Flight Sim 2002, but it's not quite there yet.

* Fire/flame -- again, nobody has created more realistic acceleration for this kind of effect. It's very important for many games.

Furthermore I would like to see fractal acceleration techniques for organic-looking trees, shrubs, and other scenery. Right now they look like something from a Lego box. In fact, fractals could probably help with fire/smoke effects as well, to add thicker & thinner areas which take on a "semi-random", but not an obvious pattern, effect.

Perhaps I'm just too picky...

hihgly detailed facial features in Cg (1, Funny)

Anonymous Coward | more than 12 years ago | (#3695909)

Does this mean they'll finally be able to make a decent nose-picking routine for Counter-Strike hostage models?

other hardware shading languages (4, Interesting)

mysticbob (21980) | more than 12 years ago | (#3695920)

there's already lots of other shading stuff out there, nvidia's hardly first. at least two other hardware shading languages exist. these languages allow c-like coding, and convert that into platform-specific stuffs. unfortunately, none of the things being marketed here, or now by nvidia, are really cross-platform. references: of course, the progenitor of all these, conceptually, is renderman's shading language. [pixar.com]

hopefully, opengl2's [3dlabs.com] shading will become standard, and mitigate the cross-platform differences. it's seemingly a much better option than this new thing by nvidia, but we'll have to wait and see what does well in the marketplace, and with developers.

Low level?! (0)

Anonymous Coward | more than 12 years ago | (#3695942)

Since when is DirectX and OpenGL "low-level"?

Re:Low level?! (1)

kyoko21 (198413) | more than 12 years ago | (#3695996)

3Dlabs's cards have OpenGL 1.2 implemented in silicon, in case you were wondering.

Duke Nukem (2)

DeadBugs (546475) | more than 12 years ago | (#3695943)

If this makes it easier to create high end video games maybe it could boost the Duke Nukem release schedule. I did say maybe

READ THE ARTICLE! (1)

dextr0us (565556) | more than 12 years ago | (#3695974)

I would appreciate it if you would stop spewing your nonsensical drivel on the board. Go and read the article. To all of you who have asked the question "Is it cross-platform?": read the first paragraph of the article at CG Channel: "Graphics giant NVIDIA today announced Cg, an initiative with participation from Microsoft to create a cross-platform, hardware-independent, high-level Pixel and Vertex Shader programming language." Can I emphasize CROSS PLATFORM and HARDWARE INDEPENDENT? It ports to DX and (NVIDIA's) OpenGL. I really wouldn't worry about its cross-compatibility. All (relevant) cards have drivers for OpenGL and DirectX.

In conjunction with Microshit? (0)

Anonymous Coward | more than 12 years ago | (#3695984)


So, more proprietary antics from Microshit to better exploit the people?

No loops? (1)

joeblowme (555290) | more than 12 years ago | (#3695991)

I don't code games for a living, but I thought I would check out their toolset and documentation. While reading through the documentation, there is a big note that for and while loops are not supported yet. They are eventually planned to be supported; they just aren't yet. You'd think that when deciding which features to include in a programming language, the ability to do a loop would be essential, but I guess not -- maybe coding games is mostly event-driven, so you don't need loops very often. I just thought this was strange.

Re:No loops? (1)

nat5an (558057) | more than 12 years ago | (#3696040)

As I understand it, this is because the pixel/vertex shading virtual machine does not, in fact, support branching (conditional or otherwise), so you can't actually do a loop in the low-level language, and therefore you can't do it in the high-level language either.

Re:No loops? (0)

Anonymous Coward | more than 12 years ago | (#3696060)

The problem is that current GPUs (GF4, Radeon 8500) don't support loops in Vertex/Pixel-shader programs.
This problem will be resolved in next-generation hardware (NV30, R300)

Re:No loops? (2, Informative)

GameMaster (148118) | more than 12 years ago | (#3696066)

The actual assembly language used by the present generation of shader-supporting video chips has no support for loops and only marginal support for conditional statements (meaning no explicit jmp op). Since this is the code that the Cg compiler compiles down to, they can't add those features to the language. It makes some sense, because shaders are meant to be short and sweet. Even though they are hardware-accelerated, they get run so many times that even a shader using only the max number of ops (currently 128 ops for NVIDIA chips) is considered pretty slow. If loops were added, it would slow the system down even more. When they say those features are "eventually planned to be supported", they mean they'll be supported by a future generation of hardware (most likely the DirectX 9 compatible chips).

-GameMaster
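For a concrete picture of that assembly target (an illustrative fragment in the style of DX8 vertex shader assembly, not taken from the article): even a matrix transform is written out as individual dot products, with no jump or loop instruction anywhere.

```
; Illustrative DX8 vs.1.1-style fragment (sketch): transform the
; input position (v0) by a 4x4 matrix stored in constant
; registers c0-c3. Pure straight-line code -- no branches.
vs.1.1
dp4 oPos.x, v0, c0
dp4 oPos.y, v0, c1
dp4 oPos.z, v0, c2
dp4 oPos.w, v0, c3
```

This is the sort of code Cg emits for you, which is why the language can't offer anything the instruction set can't express.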

Re:No loops? (0)

Anonymous Coward | more than 12 years ago | (#3696075)

This is only for pixel and vertex shaders. It is only used to create things like cool texture effects; it is not used for general game programming. The code you write in Cg is run on the newer graphics cards, and neither ATi nor nVidia allow looping in their pixel and vertex shaders.
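As an illustration of the kind of per-pixel "texture effect" in question (a sketch in Cg-style syntax; the identifiers are made up): a fragment program that modulates a texture lookup by an interpolated color, executed once for every rasterized pixel.

```cg
// Sketch of a simple per-pixel effect: modulate a decal texture
// by the lighting color interpolated from the vertex program.
struct fragin {
    float4 color    : COLOR0;
    float2 texCoord : TEXCOORD0;
};

float4 main(fragin IN,
            uniform sampler2D decal) : COLOR
{
    // One texture fetch and one multiply, run per pixel.
    return tex2D(decal, IN.texCoord) * IN.color;
}
```

Note it is again straight-line code: no loops, no branches, just a short fixed sequence of operations.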
