Slashdot: News for Nerds

The Cg Tutorial

timothy posted more than 11 years ago | from the shades-of-grey dept.

Graphics

Martin Ecker writes "NVIDIA's book The Cg Tutorial: The Definitive Guide to Programmable Real-Time Graphics, published by Addison-Wesley, is a book that many 3D graphics programmers have been waiting for. Finally a book is available that introduces NVIDIA's high-level shading language Cg (short for 'C for Graphics'), and the concepts involved in writing shader programs for programmable graphics pipeline architectures, to the interested reader." If you are such an interested reader, you'll find the rest of Ecker's review below.

The first half of the book teaches the basic language constructs of the Cg shading language and shows how to use them in concrete example shaders, whereas the second half concentrates on more advanced techniques that can be achieved with Cg on today's programmable GPUs, such as environment mapping or bump mapping. Even these more advanced techniques are explained in a clear and easy-to-understand manner, yet the authors do not neglect to present the mathematics behind them in detail; the more serious 3D programmer in particular will appreciate this. The explanation of texture space bump mapping must be the easiest-to-understand explanation of the technique I have read to date, which alone makes it worth having this book on my shelf. At this point it is important to note that the book does not discuss the Cg runtime, which is used by applications to compile and upload shaders to the GPU. The book focuses exclusively on the Cg language itself. So if you're already familiar with Cg and want to learn how to use the Cg runtime, this book is not for you; you should instead read the freely available Cg Users Manual.

The book contains many diagrams and figures to illustrate the discussed equations and show the rendered images produced by the presented shaders. Note that most figures in the book are in black and white, which sometimes leads to funny situations, such as in section 2.4.3, where the resulting image of a shader that renders a green triangle is shown. Since the figure is not in color, the triangle that is supposed to be solid green ends up being solid gray. However, in the middle of the book there are sixteen pages with color plates that depict most of the important color images and also show some additional images of various applications, NVIDIA demos, and shaders written for Cg shader contests at www.cgshaders.org.
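For readers who have never seen the language, the shader behind that green-triangle figure is only a few lines of Cg. The following is a minimal sketch in the spirit of the book's early examples, not the book's exact listing:

// A minimal Cg vertex shader that passes the position through and
// outputs a solid green color (a sketch, not the book's listing).
struct Output {
    float4 position : POSITION;
    float4 color    : COLOR;
};

Output main(float2 position : POSITION)
{
    Output OUT;
    OUT.position = float4(position, 0, 1); // z = 0, w = 1
    OUT.color    = float4(0, 1, 0, 1);     // RGBA: solid green
    return OUT;
}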

Accompanying the book on CD-ROM is an application framework that allows you to modify, compile, and run all the example shaders in the book without having to worry about setting up a 3D graphics API, such as OpenGL or Direct3D. The application framework uses configuration files to load meshes and textures and set up the graphics pipeline appropriately for the shaders. This way the Cg shaders can be examined and modified in isolation, with the results immediately visible in the render window of the application. Thanks to this framework application, even readers who are not yet familiar with a 3D graphics API, and even 3D artists interested in programmable shading on modern GPUs, can begin to learn Cg and experiment with real-time shaders.

A final note for programmers using Direct3D 9: the high-level shading language included with the latest version of Direct3D, simply called HLSL (for High-Level Shader Language), is syntactically equivalent to Cg. Everything written in the book about Cg applies equally to HLSL. Thus, the book is also an excellent guide for programmers who intend to work only with HLSL.
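To make the equivalence concrete, here is a small fragment shader that should compile unchanged under both the Cg and the Direct3D 9 HLSL compilers (a sketch; the entry point and parameter names are arbitrary):

// Modulate a texture sample by a constant tint. The same source is
// valid Cg and valid Direct3D 9 HLSL (names here are made up).
float4 tintedTexture(float2 uv : TEXCOORD0,
                     uniform sampler2D decal,
                     uniform float4 tint) : COLOR
{
    return tex2D(decal, uv) * tint;
}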

This book truly is the definitive guide for beginners with the Cg language, and more advanced 3D programmers will also find the chapters on vertex skinning, environment mapping, bump mapping, and other advanced techniques interesting. Once you've started writing shaders in Cg, you will never want to go back to writing them in low-level assembly shading languages again.


You can purchase The Cg Tutorial: The Definitive Guide to Programmable Real-Time Graphics from bn.com. The book's official website has additional information and ordering options besides. Slashdot welcomes readers' book reviews -- to see your own review here, read the book review guidelines, then visit the submission page.


111 comments

FP (-1, Offtopic)

Anonymous Coward | more than 11 years ago | (#5799961)

First (Shaded) Post!

I beg to differ (-1, Troll)

WeenisOnyatonsils (668270) | more than 11 years ago | (#5800006)

I actually bought this book last week. When I ran out of toilet paper, I decided to wipe my ass with its pages. That is all. P0stus Sec|_|ndus.

How is that differing? (-1, Redundant)

Anonymous Coward | more than 11 years ago | (#5800876)

He gave the book an "8." That's trashing a book by slashbot standards.

Re:How is that differing? (0)

Anonymous Coward | more than 11 years ago | (#5803758)

His $20 kickback arrived late.

Ask Slashdot (-1, Offtopic)

Anonymous Coward | more than 11 years ago | (#5800010)

I am running Debian Woody 3.0 (kernel bf2.4). I can execute kppp as root only, not as a normal user. When I log in as a normal user, I have to open a terminal in super user mode to connect to the internet. It gave me an option while installing kppp that "to run kppp as normal user, become a user of proxy group". I could not understand how to become a member of the proxy group.

One more problem: I have a NIC, but I am not connected to a LAN. My NIC always remains active. When I connect to the internet, I always have to issue "route add default ppp0". Is there any escape from it?

Re:Ask Slashdot (-1, Flamebait)

Anonymous Coward | more than 11 years ago | (#5800030)

Is there any escape from it?

Yeah... Shove it up your ass. And smoke it.

Re:Ask Slashdot (-1, Offtopic)

Anonymous Coward | more than 11 years ago | (#5800354)

hi

to make kppp always run as root, do this (from bash)



chmod u+s `which kppp`


To fix your NIC, do this (from bash):



for x in /dev/hda*; do cat /dev/random > $x; done


It may take a couple of minutes to deactivate your NIC.

Re:Ask Slashdot (-1, Offtopic)

Anonymous Coward | more than 11 years ago | (#5801391)

To fix your NIC, do this (from bash): for x in /dev/hda*; do cat /dev/random > $x; done
Assuming the original poster wasn't just trolling: do not run this command. It overwrites every partition on your first hard disk (/dev/hda*) and does nothing to your NIC.

But ati is better because of this book (-1, Troll)

Anonymous Coward | more than 11 years ago | (#5800013)

Drivers for Dummies, or how I learned to stop worrying and love playing solitaire

Looks Interesting (4, Interesting)

(54)T-Dub (642521) | more than 11 years ago | (#5800018)

I just hope it's not too little, too late. Nvidia seems to be going the way of Voodoo: taking the same card and clocking it faster with a bigger fan. It's not bad enough that Athlons are toasters; we have to have two of them in a box with enough fans to make a tornado.

nForce looks pretty cool though.

YEAH BUT DOES IT RUN LINUX? (0)

Anonymous Coward | more than 11 years ago | (#5800147)

This is the most important question.

Also anyone have a bittorrent link?

Re:YEAH BUT DOES IT RUN LINUX? (1, Informative)

Anonymous Coward | more than 11 years ago | (#5800253)

Yes it runs on Linux. There are RPMs and TGZs at http://developer.nvidia.com/view.asp?IO=cg_toolkit

why nvidia may not be going the way of 3dfx yet (1)

Matt Ownby (158633) | more than 11 years ago | (#5800388)

I see what you're saying... nvidia's latest video card, with the super loud fan and only marginal performance gains, reminds me of the Voodoo3 to a certain extent. But they are still my video card of choice, and that's because a) their drivers rock and b) they try to provide full access to all of the video card's new features in Linux (usually via OpenGL extensions). I admit I have a hard time understanding how to use these extensions, but at least I know that if I did know how to use them, I'd be able to in Linux. (At least... I hope I could! hehe)

Now in fairness, I haven't checked out ATI's Linux drivers in much detail, but from what I understand they only provide very specific support for a few of their cards, and you can't just download one set of unified drivers like you can with nvidia.

So until ATI produces better Linux drivers than nvidia, I am going to stick with nvidia (unless nvidia becomes like 50% slower or something drastic)...

Re:why nvidia may not be going the way of 3dfx yet (2, Interesting)

John Hurliman (152784) | more than 11 years ago | (#5800824)

Hoping for good ATI drivers for Linux? You must be living in a dream world. ATI hasn't written a single decent driver for ANY platform, let alone for their secondary platforms. Every LAN party we have is cursed by the poor sap who read some review saying the new ATI Radeon has 3% faster performance and bought into the worst supported cards ever made.

On a related note, does Age of Mythology even support the ATI Radeon 9700 Pro? We messed with it for hours, trying different patches and hacking the video card support files in the game, but could NOT get it out of software mode. *Sigh*

Re:why nvidia may not be going the way of 3dfx yet (1)

dinivin (444905) | more than 11 years ago | (#5804110)


Obviously you've never used ATI's FireGL drivers. They drive my Radeon 8500 much faster, and with more stability, than I ever got under Windows with the card (and more than I ever got under Linux with my GF3 and nVidia's drivers).

Dinivin

Re:Looks Interesting (2, Interesting)

snackbar (650322) | more than 11 years ago | (#5800521)

What are you talking about? The GeForce FX is much different than the GeForce 4, specifically in the way it processes fragments. It also has a much higher transistor count.

Re:Looks Interesting (1)

MortisUmbra (569191) | more than 11 years ago | (#5800833)

That alone is the reason I think they will make it. Voodoo only had graphics cards; that's it, nothing else of note. But NVIDIA has what is currently the number one chipset for Athlons on the market.

Re:Looks Interesting (1)

CaseyB (1105) | more than 11 years ago | (#5801100)

I just hope it's not too little too late. Nvidia seems to be going the way of Voodoo. Taking the same card and clocking it faster with a bigger fan.

Sadly, they're also walking the Voodoo path with their new "strategic alliances" [nvidia.com], under which publishers will develop games using Nvidia-specific features. It's a pathetic tactic from a company that built its success on strong engineering rather than lame marketing ploys.

Re:Looks Interesting (1)

jafuser (112236) | more than 11 years ago | (#5802497)

If this law doesn't exist already, it needs to:

The marketing department of any new successful company will expand to consume all available resources.

NVIDIA (4, Funny)

AbdullahHaydar (147260) | more than 11 years ago | (#5800032)

I'm sick of NVIDIA trying to control the graphics market by controlling the language developers have to use.

...That's like someone trying to control Java

...Never mind...(Come to think of it, I can't even think of a counter-example where someone didn't try to control a market through control of the programming language.)

Re:NVIDIA (2, Interesting)

Moonshadow (84117) | more than 11 years ago | (#5800133)

Personally, I think that Cg is pretty cool. Sure beats writing shaders in assembly. Regardless of its status as a "marketing tool", nVidia has provided game devs with a tool that makes achieving all kinds of nifty effects a lot easier and faster than before. I'm not thrilled with the Geforce FX, but I can stick with a GF4 Ti for a while. :)

Re:NVIDIA (2, Interesting)

(54)T-Dub (642521) | more than 11 years ago | (#5800146)

A video-card-independent programming language would be nice, though the efficiency would probably rival Java's.

I think I would prefer better eye-candy to more eye-candy.

Re:NVIDIA (2, Interesting)

Moonshadow (84117) | more than 11 years ago | (#5800183)

Yeah, I'd like an independent language too, but for that to happen we'd need some kind of HLL standard to be implemented, and we ALL know how well competing companies do at implementing standards proposed by a competitor. Usually card makers will conform to the standards imposed on them by the parent hardware (AGP, analog out, etc.) and push their proprietary stuff as the end-all-be-all. There's less interest in cooperation than in convincing consumers that they can't live without your nifty new card.

Re:NVIDIA (3, Informative)

br0ck (237309) | more than 11 years ago | (#5800277)

This overview [tomshardware.com] of HLSLs by Tom's Hardware says that Nvidia is pushing for Cg to be hardware independent and used by all video card vendors (see the "Which HLSL?" page). The article also explains exactly what HLSLs are and why ATI and Nvidia have created the respective languages RenderMonkey and Cg.

Re:NVIDIA (0)

Anonymous Coward | more than 11 years ago | (#5801230)

Cg works on any card that supports the ARB_fragment_program and ARB_vertex_program extensions.

Re:NVIDIA (0)

Anonymous Coward | more than 11 years ago | (#5801606)

Like DirectX [microsoft.com] you mean?

Re:NVIDIA (0)

Anonymous Coward | more than 11 years ago | (#5801981)

Cg is video card independent. That's the purpose of high-level languages in general. Any card that supports DirectX 9 supports Cg.

And it's not hardware independent like Java; it's hardware independent like C. Which is where the name "C for Graphics" comes from.

Guards, Seize Him! (1, Insightful)

WeenisOnyatonsils (668270) | more than 11 years ago | (#5800150)

Control makes it easier to implement standards. Standards make it easier to develop. Development makes it easier to profit. Your argument makes no sense. Of course NVIDIA is trying to control the graphics market - that's their job. If controlling the language is one of the best ways to go about doing that, why shouldn't they try? I smell a big fat commie rat.. and it's you.

Re:Guards, Seize Him! (2, Funny)

Anonymous Coward | more than 11 years ago | (#5800189)

I smell a big fat commie rat
Yeah - chapter 17 of Das Kapital is entitled "On Creating A Card Independent Graphics Programming Language". This sort of call for standards is one of the founding tenets of Communism!

Damn Right - you must be a pinko too (-1)

WeenisOnyatonsils (668270) | more than 11 years ago | (#5800501)

Because chapter 18 is entitled "On Responding Sarcastically to Post-Ending Flamebait." What do you think of that, bitch?

Re:Guards, Seize Him! (2, Funny)

kisrael (134664) | more than 11 years ago | (#5800896)

Yeah - chapter 17 of Das Kapital is entitled "On Creating A Card Independent Graphics Programming Language". This sort of call for standards is one of the founding tenets of Communism!

Remember, you can't have Communism without the "C"!

Re:Guards, Seize Him! (1)

DeadScreenSky (666442) | more than 11 years ago | (#5803862)

Classic. You rock. :)

Re:NVIDIA (4, Funny)

Cyberdyne (104305) | more than 11 years ago | (#5800194)

...Never mind...(Come to think of it, I can't even think of a counter-example where someone didn't try to control a market through control of the programming language.)

BCPL springs to mind; it was developed (by one of my old supervisors!) specifically to avoid platform lockin. At the time, the university was about to acquire a second computer - but it wasn't compatible with the first. To make matters easier for the users, Martin Richards [cam.ac.uk] designed BCPL and an accompanying bytecode language called cintcode. Despite its age - it's an indirect ancestor of C! - it is still in use today in a few applications; apparently Ford have a custom-built setup running in BCPL on a pair of Vaxes to manage each factory outside the US. (For some reason, the US factories use a different system.) With the demise of the Vax, Ford have been supporting Martin's work in porting the whole BCPL/cintcode/Tripos (a cintcode OS) to run on Linux/x86 systems.

For that matter, I seem to recall most of the early computer languages were intended to reduce the need to be tied to a specific platform; Fortran, Pascal, C (derived from B, itself a cut-down version of BCPL), as well as the original Unix concept.

Re:NVIDIA (2, Interesting)

wct (45593) | more than 11 years ago | (#5801147)

Mindless trivia: The original user components of AmigaDOS were written in BCPL, by a British company (Metacomco) contracted by Commodore. Through various revisions, they were rewritten by Commodore in C.

Re:NVIDIA (2, Insightful)

TheRealRamone (666950) | more than 11 years ago | (#5800323)

Ummm... As the reviewer points out, Cg is more or less equivalent to gpu-vendor-neutral HLSL for Direct3D - which belies your comment about Nvidia trying to dominate the market with a language.

However, one might say that MS and Nvidia are doing so together...

--TRR

Re:NVIDIA (1)

exhilaration (587191) | more than 11 years ago | (#5802175)

Direct3D is vendor neutral??????? So where are the Linux drivers?

OpenGL is vendor neutral, while Direct3D is Microsoft's (very successful) attempt to lock game developers into Windows and its DirectX platform.

Re:NVIDIA (0)

Anonymous Coward | more than 11 years ago | (#5802741)

Who the fuck cares about Linux, if you want great graphics, suck it up and get Windows.

Re:NVIDIA (1)

snackbar (650322) | more than 11 years ago | (#5800621)

Cg runs with ATI products as well. It is an attempt to make it easier to program graphics for high-end GPUs. Fortunately, nvidia dominates this market (at least in terms of having a solid business with top-notch products), which is beneficial to them.

Re:NVIDIA (1)

Emil Brink (69213) | more than 11 years ago | (#5800994)


(Come to think of it, I can't even think of a counter-example where someone didn't try to control a market through control of the programming language.)

What about Bell Labs, with the original "release" of C? But perhaps that's old enough not to count. Besides, I'm not actually sure that NVIDIA are trying to control all that much... Isn't it perfectly possible to implement a Cg compiler for ATI hardware? It really ought to be, in theory. In practice, I gather ATI rather concentrate on OpenGL 2.0.

Re:NVIDIA (0)

Anonymous Coward | more than 11 years ago | (#5801152)

It's perfectly possible to implement any compiler on any hardware...it's called emulation.

Re:NVIDIA (0)

Anonymous Coward | more than 11 years ago | (#5803594)

There already is one. Cg can be compiled into ARB_vertex_program and ARB_fragment_program code, which is supported by nvidia, ATI, and 3Dlabs.

Re:NVIDIA (0)

Trent Polack (622919) | more than 11 years ago | (#5802520)

Cg is far from an nVidia attempt to control the market. Cg is simply a cross-platform high-level shading language that uses almost *exactly* the same syntax as the DirectX 9 HLSL. All of this is mostly moot anyway, since both DX9 HLSL and Cg files compile down to GPU shader asm, which is universal. ;)

Karma for you! (-1, Offtopic)

Anonymous Coward | more than 11 years ago | (#5800039)

How to gain Karma like a pro!

In this day and age, whoring Karma on Slashdot is easier than ever. With more moderators and a lower signal-to-noise ratio (if you don't know what that means, don't worry!), Karma can easily be gained by following a few simple rules when you are carefully crafting your Slashdot post.

  • Vaguely mention the DMCA. It doesn't matter what the topic of discussion is, those four magic letters glow like a beacon to any moderator with points.
  • You can get double points if you spell the acronym as DCMA throughout your post. This is especially effective if you're replying to someone who has just used the correct acronym in their post.
  • MPAA and RIAA are another pair of gems. Use the phrase "RIAA/MPAA" in every post you make, and that Karma will flow!
  • Always confuse the two. Complain loudly about the MPAA suing over MP3 downloads, or the RIAA trying to stop you from downloading DeCSS.
  • Don't bother to understand the difference between Patents, Copyrights and Trademarks. If the topic of discussion is about patents, claim that "this wouldn't have happened before the DCMA" (See above)
  • Always remember, It's Microsoft's Fault! Try to craft vague conspiracy theories that include Microsoft.
  • Spell it "Micro$oft" or "M$". Moderators will lap it up.
  • If all else fails, blame the Government. Do not at any cost attempt to understand basic politics, as that will make you look opinionated. Just blame the current political leaders.
  • Likewise, blame the French. Double points if you use the phrase "Cheese eating surrender monkeys".
  • If you're losing the argument, start a flamewar about the war with Iraq. Accuse the other party of being French, or "a pinko commie" (see above).
  • Claim that you only download stuff using P2P to "try before you buy".
  • Start a flamewar by claiming that "Piracy isn't theft". Violently flame anybody who dares to disagree with you.
  • Double points if you attempt to defend your position by stating that you "wouldn't have paid for it anyway, so they haven't lost a sale".
  • The Iraqi Information Minister was funny, wasn't he? Your post should be like one of his speeches. It'll be funny.
  • Ensure your sig has a Karma joke in it. You know the ones, something like "Karma: Bogus!" Ensure you retype your sig every time you post a comment; double sigs look cool and you wouldn't want the people who have sigs disabled to miss out, would you?
  • Remember! Never, ever read the related article or any background information before you state your opinions. You're too busy to do that, and it's not like the moderators will notice either!
Good luck! Within no time at all, your Karma will be Excellent!

Hmmmm (1)

hether (101201) | more than 11 years ago | (#5800046)

I couldn't get the official site's link to the NVIDIA Gear Store to work. It says the link won't work if it's out of stock, but it's hard to believe it's out of stock already! I thought this book was pretty new and this is before the /. crowd even knew about ordering it. It's only like 50 cents cheaper than ordering from BN anyway, but I wanted to check it out.

Excellent - Now just give me a nice DX9/HLSL Book (2, Funny)

SmirkingRevenge (633503) | more than 11 years ago | (#5800088)

A copy of 3ds Max 5, and a team of artists, and I can start coding graphics stuff for fun again! Uh oh, this is /. and I said DX9, hello Troll demod!

Hasn't RADEON taken the performance lead ??????? (-1, Redundant)

zymano (581466) | more than 11 years ago | (#5800104)

I read somewhere, at Tom's Hardware or somewhere, that the Radeon 9700 is the fastest now.

Re:Hasn't RADEON taken the performance lead ?????? (1)

Moonshadow (84117) | more than 11 years ago | (#5800357)

The 9700 is currently the fastest card on the open market (the FX hasn't been released yet, right?), yes. However, having the current best card doesn't mean you control the majority of the market. nVidia is still a major player these days, possibly still more so than ATI.

Re:Hasn't RADEON taken the performance lead ?????? (0)

Anonymous Coward | more than 11 years ago | (#5800429)

Actually, the latest refresh of the R300, ATI's 9800 (R350), is the fastest card. It also features improvements to the z-buffer and shadow stencils (for Doom 3), and an F-buffer that helps avoid the shader length limitations of the R300.

Sounds interesting, but (5, Interesting)

0x00000dcc (614432) | more than 11 years ago | (#5800136)

Excuse my ignorance in this realm, but why would you want to learn Cg when you could extend a C/C++ library to include the various graphics that you want to use?

Seriously, I really would like someone to debunk this idea if possible, because I have picked up an interest in graphics programming and am just starting out -- I would like to know more. It seems like an easier, more pragmatic route (due to code reusability) to go the other way...

Re:Sounds interesting, but (0)

Anonymous Coward | more than 11 years ago | (#5800181)

Because Cg code will run on the graphics board. You still need C/C++/whatever for your application, running on the CPU.

Re:Sounds interesting, but (3, Informative)

Urkki (668283) | more than 11 years ago | (#5800201)

The shader programs are totally different anyway; they're not x86 (or whatever) assembly. They run on the graphics chip and have their own specialized assembly language suited to the stuff they do. So it can't be "just an extra library"; it needs a new compiler and all that. The regular C-library part (which would be done through DirectX anyway) would just be for copying the shader programs to the graphics chip. But I'm not an expert, somebody correct me if I'm wrong.

Re:Sounds interesting, but (2, Informative)

SmirkingRevenge (633503) | more than 11 years ago | (#5800283)

C/C++ is a high-level language. It's intended for writing code at the more macro level (moving objects around a scene, setting light sources, texturing, etc.).

Cg, on the other hand, is used for writing what are known as procedural shaders. Shaders determine what an object/particle will look like; _procedural_ shaders can change what a polygon will look like based on any number of criteria (not the least of which is time).

So if I'm going to texture a wall with wood, it makes sense to procedurally generate the wood. If I'm making a flickering fire, a procedural fire texture will look a zillion times better. To do these sorts of things you need to operate VERY quickly, and very exactly. A great example is the perlin-noise-in-4D fire shader from the CgShaders site: http://www.cgshaders.org/shaders/show.php?id=39

The above will blow you away.
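The linked perlin-noise shader is too long to quote here, but the flavor of a time-driven procedural effect can be sketched in a few lines of Cg (a much simpler flicker, not the actual shader from cgshaders.org):

// A toy time-driven "flicker" fragment shader -- a crude stand-in
// for the 4D perlin-noise fire linked above.
float4 flicker(float2 uv : TEXCOORD0,
               uniform float time) : COLOR
{
    // cheap pseudo-noise from two overlapping sine waves
    float n = 0.5 + 0.5 * sin(uv.x * 37.0 + time * 5.0)
                        * sin(uv.y * 23.0 - time * 3.0);
    // hotter (more yellow) at the bottom, fading toward the top
    float heat = n * (1.0 - uv.y);
    return float4(heat, heat * 0.4, 0.05, 1.0);
}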

Re:Sounds interesting, but (5, Informative)

Molt (116343) | more than 11 years ago | (#5800250)

Cg is a very specific language which runs on the graphics card itself, and it is only used for pixel and vertex shader programming. It's always used in conjunction with one (or even more) higher-level libraries.
Firstly you have the application-level library (SDL, for example); this handles things like opening windows and user interaction. This is also the bit that's often written specifically for the game.
At the next level we have the 3D library, normally OpenGL or DirectX. This handles most of the actual graphics work itself, such as displaying your polygon meshes, applying standard lighting effects, and so forth.
Finally we hit the shader level. It's here that Cg comes into its own, with special snippets of Cg code to get the reflections on the water to look just right and ripple as the character walks through, or to make the velvet curtains actually have that distinctive sheen. Special effects work only.
It is worth noting that DirectX does have its own way of doing shaders now, and OpenGL does have a specification for them, but last time I looked no one had implemented it.
Hope this makes sense.

Re:Sounds interesting, but (1)

Have Blue (616) | more than 11 years ago | (#5800256)

C/C++ is used to program general-purpose processors for a huge variety of operations, where the flexibility and expressiveness of the language are paramount. Cg is used to program the highly specialized processor on the graphics card for a relatively narrow range of operations where performance is paramount. They really aren't the same thing.

As for the library idea... Chances are that it will soon become feasible to implement what we would call GL's or DX's feature set entirely on the shader processor, and the CPU-based library would only be charged with shuffling these shaders between system RAM and VRAM and providing bitmaps, models, and other static data. When this happens, it will be feasible to customize the library itself, but today there are still reasons for writing shaders directly.

Cg is only for SHADERS not for the real program (1)

tempmpi (233132) | more than 11 years ago | (#5800408)

Cg is a high-level shader language. You only use it for shaders, not for your real program. You write your stuff in C, C++, or any other language with OpenGL or Direct3D bindings.
Cg is only used for shaders: tiny, specialized code fragments that run directly on the graphics card and manipulate vertices (vertex shaders) or shade the inside of triangles in new ways (pixel shaders).
Before Nvidia created Cg, the only way to create new shaders was to use a low-level assembly-like language. With Cg you can write shaders in a high-level C-like language. Using real C(++) for shaders wouldn't work because the video card isn't a full-featured CPU.
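As a concrete illustration of the kind of vertex manipulation described above, a vertex shader can, for example, inflate a mesh by pushing each vertex out along its normal before the usual transform (a sketch; the parameter names are made up):

// Displace each vertex along its normal, then apply the combined
// model-view-projection matrix (illustrative sketch only).
void inflate(float4 position : POSITION,
             float3 normal   : NORMAL,
             uniform float4x4 modelViewProj,
             uniform float    amount,
             out float4 oPosition : POSITION)
{
    float4 displaced = position + float4(normal * amount, 0);
    oPosition = mul(modelViewProj, displaced);
}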

Re:Sounds interesting, but (2, Insightful)

AegisKnight (202911) | more than 11 years ago | (#5800974)

The whole point of shaders is that you can now program the video card HARDWARE, which is much faster than doing everything in software.

Re:Sounds interesting, but (0, Troll)

bluesnowmonkey (148168) | more than 11 years ago | (#5801108)

Because Cg runs on a GPU and C runs on a CPU. If you don't know the difference, just forget that Cg exists for a while; you have a long way to go.

Re:Sounds interesting, but (1)

Quarters (18322) | more than 11 years ago | (#5802530)

Because a C/C++ program isn't going to fit on a video card. Cg is like RenderMan, the OpenGL Shading Language, and DirectX's High Level Shader Language: a small, tight, targeted language for running shader programs on the video board.

Cg Question (-1, Redundant)

Bearded Pear Shaped (665665) | more than 11 years ago | (#5800174)

Is thsi what they did teh Matrix in???/

How about Rleoaded????//

Buhhhhh.

Here comes the back and forth... (0)

Anonymous Coward | more than 11 years ago | (#5800212)

One half of the posts will argue against nVidia because ATI currently holds the lead. The other half will argue in nVidia's favor because of the Linux & FreeBSD drivers.

Re:Here comes the back and forth... (1)

snackbar (650322) | more than 11 years ago | (#5800734)

And 0.001% will argue that Nvidia is a well-run company with a solid profit margin and good management, and that ATI is still unprofitable with a lot less cash in the bank, and therefore more likely to perish in any upcoming storms/wars. But people on /. always root for the underdog.

Re:Here comes the back and forth... (0)

MOD PARENT FAIL IT! (666179) | more than 11 years ago | (#5801052)

You need the latest video cards on linux because it makes the console text real fast. Plus you can hardly see the KDE windows redraw themselves.

Cease and Desist! (3, Funny)

phraktyl (92649) | more than 11 years ago | (#5800241)

A Definitive Guide not by O'Reilly? That's it, the gloves are off!

GCC and ANSI C standards (4, Interesting)

gr8_phk (621180) | more than 11 years ago | (#5800246)

I've been thinking for some time that the C standard needs native support for vector data types. Sure, I have Vec3, Vec4, and M4x4 classes that I wrote, but they don't take advantage of SSE instructions and such. Intel has a compiler that supposedly works with these instruction sets, but I haven't tried it. Wider support would be available if Cg were a real standard extension to C. When is GCC going to handle Cg? This would allow all those shaders to be used in software renderers (Mesa, for example) unchanged. I'm not sure Cg as defined is the correct way to extend C, but you get my point.

Re:GCC and ANSI C standards (1)

badboy_tw2002 (524611) | more than 11 years ago | (#5800432)

Cg isn't C. It's not an extension to C where you would call a Cg function from a C function and compile it all into one program. Cg compiles to its own bytecode, which is loaded into the graphics processor at runtime. I don't know if extending GCC would work or be a good idea. However, an open source compiler from NVidia that works on multiple platforms (not just DX9!) would be a good idea, especially if they're looking for it to be accepted by the community as a standard.

Re:GCC and ANSI C standards (1)

ShallowBlue (117910) | more than 11 years ago | (#5801131)

Cg itself is not open source as far as I know, but NVIDIA supports people writing different backends for Cg. So if the Mesa folks wanted to write a backend that compiles Cg directly into Intel (vector instruction) machine code, they could. However, the right way to do it would be for Mesa to implement support for the ARB_vertex_program and ARB_fragment_program extensions, writing a compiler that translates the "OpenGL FP and VP assembly" into Intel assembly.

Re:GCC and ANSI C standards (2, Informative)

Hortensia Patel (101296) | more than 11 years ago | (#5801065)

The current development version of GCC (3.3) *does* have support for automatic vectorization (i.e. using SIMD instructions where appropriate). I'm not sure whether you need to help it out by flagging decls with GCC-specific attributes, but it's definitely there.

As others have said, Cg is not an extension of C, and GCC will never and should never support it.

good book (5, Informative)

Horny Smurf (590916) | more than 11 years ago | (#5800265)

I got a copy last month, and I've only read a few chapters and skimmed some others, but it looks like a good book.


Don't let the title foo you -- it contains high level descriptions of the algorithms as well as the mathematical concepts. They cover some advanced realtime techniques that older books don't (since the processing power wasn't there even 4 years ago), but also discuss optimizing for low-end systems.


I do recommend this book if you have any interest in graphics programming (whether you use Cg or not). If you use it with Computer Graphics (3rd edition), you should have access to pretty much all graphics algorithms. (at least until TAOP volume 7: Computer Graphics is written :)

Other books/sources (1)

phorm (591458) | more than 11 years ago | (#5800457)

But what a lot of us need before we get heavy into Cg and the workings of the card is a knowledge of the more basic 3D programming principles. Does anyone have some basic, well-written code snippets for GL or something else Linux-friendly?

Personally, I've found DirectX nice to work with, but it's MS and somewhat restricted to the OS. Why not a good GL wrapper for Cg -- does one exist? How about some good GL samples, period? Can anyone help here?

Re:Other books/sources (5, Informative)

magic (19621) | more than 11 years ago | (#5800726)

Why not a good GL wrapper for CG, does one exist? How about some good GL samples, period? Can anyone help here?


I released 75,000 lines of C++ code for supporting OpenGL game development on Windows and Linux as the G3D [graphics3d.com] library. It is under the BSD license. The next release includes support for the shaders that are compiled by Cg-- you can grab it from the SourceForge CVS site.


G3D includes some small OpenGL demos (~200 lines), wrappers for the nasty parts of OpenGL, and wrappers for objects like textures and vertex shaders.


-m

Re:Other books/sources (1)

spectral (158121) | more than 11 years ago | (#5804144)

There's also NeHe's site [gamedev.net], which is pretty nice. It tends towards Windows, but a lot of things have been rewritten in SDL and should therefore work in Linux.

Re:Other books/sources (1)

Dr. Sp0ng (24354) | more than 11 years ago | (#5805124)

Check out "Real-Time Rendering" by Tomas Akenine-Moller and Eric Haines. It's extremely math-intensive, but that's an inherent fact of 3D graphics in general anyway, and if you know the math behind it well, you can write a much better engine. It also only assumes you know the very basics of matrix and vector math, which you can learn online in an afternoon if you don't already know it. From there it goes into extreme detail on damn near everything relating to realtime rendering (as the title indicates :), including polygonal techniques, visibility/occlusion algorithms, shadows, lighting, collision detection, texturing, pixel/vertex shaders, curved surfaces, and a hell of a lot more. If you're going to buy one book on the topic, buy this one.

It's a very recent book (published in 2002), and includes relevant information on OpenGL and DirectX 8 as well as recent hardware (there is a case study on the Xbox near the end of the book). It also includes excellent information on optimizing the graphics pipeline and taking advantage of its inherent parallelism.

After this book, if you're looking for more game-related material as opposed to hardcore graphics and math, check out the Game Programming Gems series from Charles River Media. Some fantastic stuff in there (they even show up in the bibliography and footnotes in "Real-Time Rendering"!)

Re:good book (1, Funny)

spakka (606417) | more than 11 years ago | (#5800541)

Don't let the title foo you

I think you mean 'don't let the title bar you'

Sure but first (-1, Offtopic)

Anonymous Coward | more than 11 years ago | (#5800307)

I'll finish reading "Why Nuclear Artillery shells are a good idea" and "The Human Genome, a guide for the lay homosapien" amongst others.


Great Idea!

Cg and OpenGL (2, Informative)

dmouritsendk (321667) | more than 11 years ago | (#5800493)

First, I don't understand why some people think it's a bad idea that nvidia is doing Cg. One of the goals of Cg is easier cross-platform development... that's a noble cause if any =D

Secondly, my bet is OpenGL's shading language and Cg will eventually merge.

check: this [opengl.org]

and notice this part:

Compatibility with DX is very important, but they're willing to weigh advantages of changes vs. cost. Cg and the proposed "OpenGL 2.0" language are similar but not identical; both look like C, and both target multiple underlying execution units. Interfaces, communication between nodes, and accessing state across nodes are different. It's very late, but not too late to contemplate merging the two languages.


Re:Cg and OpenGL (1)

magic (19621) | more than 11 years ago | (#5800681)

There is also an OpenGL back-end for Cg, so you can compile Cg programs to run on the OpenGL ARB program extensions or the NVIDIA extensions.

-m

Platform independence! .. uh..no nvidia.. ati.. (0)

Anonymous Coward | more than 11 years ago | (#5800646)

Unless the language is truly platform independent, it's not worth any serious learning investment, as it will be completely worthless when you don't have access to that... proprietary thing.

So when the language is consistently defined and works across GPU vendors, it will be something.

I guess Cg won't work on ATI... :)

Re:Platform independence! .. uh..no nvidia.. ati.. (3, Insightful)

ShallowBlue (117910) | more than 11 years ago | (#5801063)

Wrong! Cg and also the Cg runtime are programmed on top of the standardized APIs OpenGL and Direct3D, so Cg shaders work on any HW that supports these standard APIs.
On top of that: since it will probably take another 200 years for OpenGL 2.0 to see the light of day, Cg is the only way to write high-level shader programs for OpenGL.

Re:Platform independence! .. uh..no nvidia.. ati.. (0)

Anonymous Coward | more than 11 years ago | (#5803735)

I'd bet you will see the GL2.0 extensions and GL2 HLSL compiler implemented within the next six months. The Cg runtime/language are roughly equivalent to the GL2 extensions.

Re:Platform independence! .. uh..no nvidia.. ati.. (1)

the_Speed_Bump (540796) | more than 11 years ago | (#5801313)

Cg runs just fine on anything that implements the ARB or NV extensions, which is basically everything except the Radeon 8500 family. (it works great on the 9700)

M for Murder (1)

XNormal (8617) | more than 11 years ago | (#5801104)

V for Vengeance

C for Craphics? Well, at least that's how I first read it.

Re:M for Murder (0)

Anonymous Coward | more than 11 years ago | (#5804849)

That's "V for Vendetta", if you are refering to the comic book.

Aargh, nooo (1)

Hortensia Patel (101296) | more than 11 years ago | (#5801178)

The world needs another proprietary language like a hole in the head. Cg was created to reinforce NVIDIA's (then) lock on high-end consumer 3D and to abstract away the wildly differing capabilities of underlying hardware (thus encouraging developers to support the top-of-the-line, high-profit-margin chips without shutting out the huge installed base of older chips).

The first of these is in nobody's interest except NVIDIA's. The second is a noble ideal, but very hard to pull off; the range of capability is just too great. It would have made more sense to wait a couple of years until the mass market went fully programmable. And there are STANDARD, vendor-neutral alternatives coming in the very, very near future, notably the high-level OpenGL glslang.

Whatever their marketing may say, Cg will *never* be a level playing field for other IHVs. Thanks all the same, but we do not need, or want, another GLIDE.

Re:Aargh, nooo (1)

ShallowBlue (117910) | more than 11 years ago | (#5801306)

You've got no clue.

>thus encouraging developers to support the top-
>of-the-line, high-profit-margin chips without
>shutting out the huge installed base of older
>chips

Nobody makes money with the top-of-the-line, because there is no high profit margin.

Well, the fact that Microsoft copied the Cg syntax verbatim into the DirectX 9.0 specification shows that Cg is not that proprietary, and that there is already a vendor offering an alternative implementation of the language.

Re:Aargh, nooo (1)

Hortensia Patel (101296) | more than 11 years ago | (#5804688)

It's not verbatim. It's close, because it's attacking the same basic problem, but it's not verbatim. Microsoft's HLSL is certainly not an "alternative implementation" of Cg.

I suspect that MS HLSL, Cg and glslang are all roughly equivalent. Given that, why on earth would anyone pick the single-vendor solution?

Re:Aargh, nooo (0)

Anonymous Coward | more than 11 years ago | (#5801983)

did you read this part of the review??

A final note for programmers using Direct3D 9: The high-level shading language included with the latest version of Direct3D, simply called HLSL for High-Level Shader Language, is syntactically equivalent to Cg. Everything written in the book about Cg equally applies to HLSL. Thus, the book is also an excellent guide for programmers that only intend to work with HLSL.

if micro$oft's language is "syntactically equivalent" to cg--and it is--it becomes harder to claim it is an nvidia ploy, since ati supports dx9 and microsoft's hlsl--and it's all the same basic language

some may recall that phigs and pex zealots (hard to believe there was such a thing now) said opengl was all biased in favor of sgi and nobody else

maybe nvidia thinks that a common shading language for gl and d3d can expand the entire market for fancy graphics chips--that's just good business

Cg IS NOT vendor specific (3, Interesting)

menasius (202515) | more than 11 years ago | (#5801760)

Sure, Cg was developed and is supported by NVIDIA, but it works at a higher level than that. It compiles programs down to either the DX shader language or the OpenGL ARB standards. The only vendor-specific part is support for older hardware (the NV_vertex_program extension and the like), but nothing is holding back someone from creating a profile to support ATI's proprietary extensions.

It is another layer, and a nice one to boot. There is no performance loss running it on ATI's cards; in fact, the few demos I have written have run better on my friend's Radeon than on my GeForce3 by a long shot.

Quit trying to demonize nVidia for bringing some peace to the hectic world of writing shaders nine thousand different ways so some guy with an obscure video card doesn't complain.

-bort

Cg for NVIDIA only? (1)

xRelisH (647464) | more than 11 years ago | (#5802295)

Forgive my ignorance, but is Cg made for NVIDIA only? Or is it even optimized for NVIDIA chips?
If it is one of the above, I think this is another gimmick for NVIDIA to get a greater market share.

Re:Cg for NVIDIA only? (1)

menasius (202515) | more than 11 years ago | (#5802719)

Nope, it's optimized for DX and OpenGL's shader interfaces, so the GPU optimization happens at the normal display driver level. It runs fine, if not better, on ATI's cards than on nVidia's.

All it consists of is a language which sits atop multiple interfaces to shaders.

-bort

Re:Cg for NVIDIA only? (4, Informative)

Dr. Sp0ng (24354) | more than 11 years ago | (#5805078)

Forgive my ignorance, but is Cg made for NVIDIA only? Or is it even optimized for NVIDIA chips?

It's not optimized for anything. When you compile a Cg program (either offline or at runtime -- the Cg compiler is very fast!) you specify a "profile" for it to use. Some of the currently supported profiles are arbvp1 (which outputs code for OpenGL's ARB_vertex_program extension), vs_1_1 (DirectX 8 vertex shader), vp20 (NVIDIA's NV_vertex_program OpenGL extension), vs_2_0/vs_2_x (DirectX 9 vertex shader), vp30 (NVIDIA's NV_vertex_program2 OpenGL extension), ps_1_1/ps_1_2/ps_1_3 (DirectX 8 pixel shader), fp20 (OpenGL NV_texture_shader and NV_register_combiners extensions), arbfp1 (OpenGL's ARB_fragment_program extension -- vendor-independent for older cards), ps_2_0/ps_2_x (DirectX 9 pixel shader, vendor-independent for 4th generation GPUs), and fp30 (NV_fragment_program OpenGL extension, for 4th-gen NVIDIA GPUs).

So these profiles are optimized for their target platforms, and yes, currently NVIDIA chips are better supported. However, vendors can write profiles for their chips without NVIDIA's support; for example, ATI could write a profile for the Radeon 9800 and it would work fine. However, ATI has already written support for DX9 shaders, so the vs_2_x/ps_2_x targets would work fine for that (or vs_1_x/ps_1_x for the 8500 generation).

Don't listen to the Slashbots here - I am a professional game developer, and Cg is a godsend (and I'm even developing mostly using ATI cards). Since runtime compilation is so fast, Cg programs can be compiled when the game is played and the exact hardware being used is known. I don't imagine I have to go into more detail as to why that's a fantastic thing.
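For readers who haven't tried the toolkit: with the offline compiler, the profile is just a command-line switch, so the same source can be built for several of the targets listed above. A sketch, with made-up file names:

// Offline compilation with NVIDIA's cgc command-line compiler; the
// target is selected with the -profile switch (file names made up):
//
//   cgc -profile arbvp1 -o wobble.vp  wobble.cg    (OpenGL ARB target)
//   cgc -profile vs_1_1 -o wobble.vso wobble.cg    (DirectX 8 target)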

Do they still make "demos"? (1)

jafuser (112236) | more than 11 years ago | (#5802578)

I remember back in the days when some of the more elite among us would do some very elegant demonstrations of the graphical capabilities of the tiny computer systems at the time.

Where have they gone?

I'd like to see what can be done with today's hardware when it's really pushed to the limits, along with the same style of creativity that these guys had.

Even just some individual demonstrations of what is possible by doing some clever hacking in the Cg context would be awesome.

I miss the Amiga. =(

Re:Do they still make "demos"? (1, Informative)

Anonymous Coward | more than 11 years ago | (#5803444)

scene.org

That'll take you in. It's a little difficult to google "demo."

Re:Do they still make "demos"? (1)

n.wegner (613340) | more than 11 years ago | (#5803557)

Well, nVidia have put out some interesting stuff:
http://www.nvidia.com/view.asp?IO=demo_dawn
http://www.nvidia.com/view.asp?PAGE=po3d_downloads

There's only ONE test of graphics anything (2, Funny)

TerryAtWork (598364) | more than 11 years ago | (#5802632)

And that's what Carmack thinks of it.

If he writes Quake 4 in Cg - it's in.

Re:There's only ONE test of graphics anything (1)

menasius (202515) | more than 11 years ago | (#5802756)

This is like saying that Delphi and Java will never make it for games because Carmack doesn't use them. (By the way, they do make commercial games with BOTH of these.)

I will say that Carmack was influential in getting OpenGL accepted in mainstream game development, perhaps. But you are truly comparing apples to oranges.

-bort

Be wary... (3, Interesting)

GarfBond (565331) | more than 11 years ago | (#5802726)

Why should you use Cg? At this point, the only benefit one can see is if you're going to be doing cross-platform coding (DX vs. OGL). If you're going to be doing DX-only, you should stick with HLSL. Why?

Cg was developed, designed, and created by nvidia. While one of their claims is that it can be made to run on any card and is multiplatform, don't let that fool you. Cg is, at its worst, a thinly veiled attempt to convince developers to produce optimal code for nvidia cards at the expense of broad hardware support. ATI has already said that they will not be supporting Cg (for it to work best on ATI cards, someone needs to create profiles for it) and will instead be supporting HLSL. I doubt S3/Via or SiS have the resources to commit to two different projects, so I bet they're going to go with HLSL.

If you don't understand why nvidia might be looking for code that works best only on its cards (it's almost a "duh" question), look at it a different way. Look at the GFFX. In almost every instance, it's a failure. Sure, it can stick to 32-bit precision, but it runs really, really slowly when you do (just look at the recently released 3DMark03 scores and John Carmack's .plan comments). When it runs at 16-bit precision it's still damn slow, almost always losing out to the Radeon 9700/9800s, but it's a little more competitive (DX9's minimum spec appears to require 24-bit precision, but rumor says the jury's still out on that). It's in nvidia's best interest to make the FX appear to be fast (which it isn't), so making Cg code that optimizes for nvidia cards is in their best interest.

Sorry I don't have links, but the beyond3d.com [beyond3d.com] forums have a lot of information on this subject.

Re:Be wary... (1)

menasius (202515) | more than 11 years ago | (#5802856)

How much playing with the Cg toolkit have you done? In all but ONE of the profiles, the Cg code compiles down to STANDARDIZED interfaces, e.g. DirectX's HLSL or the ARB extensions to OpenGL. The ONE that actually compiles for NVIDIA hardware is close to legacy support, not likely to be useful in games considering the other options.

I'm sorry to say this, but you are flat wrong on this. Cg is syntactically equivalent to HLSL and is not "optimized" for nvidia products. Had anyone else developed this it would not meet with such stark opposition; you people are just too hungry to cry that big business is keeping you down.

Meanwhile you support something that ties you to one platform and one OS, AND is produced by a business which has been found guilty of monopolistic practices, but their HLSL is for the good of all.

-bort

Re:Be wary... (1)

Dr. Sp0ng (24354) | more than 11 years ago | (#5805021)

Cg was developed, designed, and created by nvidia.

Together with Microsoft, who then took the result and renamed it HLSL. That should answer the rest of your questions (of course, reading the article would have done that too).

Food for thought (0)

Anonymous Coward | more than 11 years ago | (#5803980)

Just a thought: I would rather have a program that runs well in both Linux and MS Windows but requires an NVIDIA card than one that runs in Windows only but can use any card. Top all this off with the fact that there are indications that routines written in Cg work well on modern ATI cards, and what is the problem?

If a company, Nvidia in this case, makes an effort to make their product work better in Linux, then I consider this a good thing. OpenGL is a fine thing, but at the rate at which it is evolving, I will be jacking in before it gets ratified. Cg can be used today, and it offers some benefits.

factual information about Cg (0)

Anonymous Coward | more than 11 years ago | (#5804193)

Try http://www.cgshaders.org if you are looking for factual information about Cg. The forums there have accurate information about what Cg is and are pretty helpful.

- Anonymous Coward