The Cg Tutorial: The Definitive Guide to Programmable Real-Time Graphics
author | Randima Fernando, Mark J. Kilgard
pages | 384
publisher | Addison-Wesley Publishing
rating | 8
reviewer | Martin Ecker
ISBN | 0321194969
summary | An excellent introduction to the high-level shading language Cg (C for Graphics) and its uses in real-time 3D graphics.
The first half of the book teaches the basic language constructs of the Cg shading language and shows how to use them in concrete example shaders, whereas the second half concentrates on more advanced techniques that can be achieved with Cg on today's programmable GPUs, such as environment mapping or bump mapping. Even these more advanced techniques are explained in a clear and easy-to-understand manner, and the authors do not neglect to present the mathematics behind them in detail - something the more serious 3D programmer will especially appreciate. The explanation of texture space bump mapping must be the easiest-to-understand treatment of the technique I have read to date, which alone makes this book worth having on my shelf. It is important to note that the book does not discuss the Cg runtime, which applications use to compile and upload shaders to the GPU; the book focuses exclusively on the Cg language itself. So if you're already familiar with Cg and want to learn how to use the Cg runtime, this book is not for you, and you should instead read the freely available Cg Users Manual.
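To give a flavor of what the first chapters teach, here is a minimal vertex program in the spirit of the book's early examples (my own sketch, not a listing from the book): it passes a 2D position straight through to clip space and outputs a constant green color.

    // Minimal Cg vertex program (sketch): pass the position through
    // and output solid green.
    struct Output {
        float4 position : POSITION;
        float4 color    : COLOR;
    };

    Output green(float2 position : POSITION)
    {
        Output OUT;
        OUT.position = float4(position, 0, 1);  // already in clip space
        OUT.color    = float4(0, 1, 0, 1);      // RGBA green
        return OUT;
    }

Even a program this tiny already shows the semantics (POSITION, COLOR) that bind Cg inputs and outputs to the graphics pipeline.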
The book contains many diagrams and figures to illustrate the equations under discussion and to show the rendered images produced by the presented shaders. Note that most figures in the book are in black and white, which sometimes leads to amusing situations, such as in section 2.4.3, where the resulting image of a shader that renders a green triangle is shown; since the figure is not in color, the triangle that is supposed to be solid green ends up solid gray. However, in the middle of the book there are sixteen pages of color plates that reproduce most of the important color images and also show some additional images from various applications, NVIDIA demos, and shaders written for Cg shader contests at www.cgshaders.org.
Accompanying the book on CD-ROM is an application framework that allows you to modify, compile, and run all the example shaders in the book without having to worry about setting up a 3D graphics API such as OpenGL or Direct3D. The framework uses configuration files to load meshes and textures and to set up the graphics pipeline appropriately for each shader. This way the Cg shaders can be examined and modified in isolation, with the results immediately visible in the application's render window. Thanks to this framework, readers not yet familiar with a 3D graphics API, and even 3D artists interested in programmable shading on modern GPUs, can begin to learn Cg and experiment with real-time shaders.
A final note for programmers using Direct3D 9: the high-level shading language included with the latest version of Direct3D, simply called HLSL for High-Level Shader Language, is syntactically equivalent to Cg, so everything written in the book about Cg applies equally to HLSL. Thus, the book is also an excellent guide for programmers who intend to work only with HLSL.
This book truly is the definitive guide for beginners with the Cg language, and more advanced 3D programmers will also find the chapters on vertex skinning, environment mapping, bump mapping, and other advanced techniques interesting. Once you've started writing shaders in Cg, you will never want to go back to writing them in a low-level assembly shading language.
You can purchase The Cg Tutorial: The Definitive Guide to Programmable Real-Time Graphics from bn.com. The book's official website has additional information and further ordering options. Slashdot welcomes readers' book reviews -- to see your own review here, read the book review guidelines, then visit the submission page.
Looks Interesting (Score:4, Interesting)
nForce looks pretty cool though.
Re:YEAH BUT DOES IT RUN LINUX? (Score:1, Informative)
why nvidia may not be going the way of 3dfx yet (Score:2)
Re:why nvidia may not be going the way of 3dfx yet (Score:2, Interesting)
On a related note, does Age Of Mythology even support the ATI Radeon 9700 Pro? We messed with it for hours trying different patches, hacking the video card
Re:why nvidia may not be going the way of 3dfx yet (Score:2)
Obviously you've never used ATI's FireGL drivers. They drive my Radeon 8500 much faster, and with more stability, than I ever got under Windows with the card (and more than I ever got under Linux with my GF3 and nVidia's drivers).
Dinivin
Re:Looks Interesting (Score:2, Interesting)
Re:Looks Interesting (Score:2)
Look at the numbers and you will see that the "4" and the "FX" are, basically, the same card. I think a "P4 trick" is upon us: they boosted the clock speed but reduced the instructions per cycle.
Why the hell would you retool your factory to produce a card identical to the old one? Seems strange to me.
Re:Looks Interesting (Score:2)
Re:Looks Interesting (Score:1)
Re:Looks Interesting (Score:2)
Sadly, they're also walking the Voodoo path with their new "strategic alliances" [nvidia.com], under which publishers will develop games using Nvidia-specific features. It's a pathetic tactic from a company that built its success on strong engineering rather than lame marketing ploys.
Re:Looks Interesting (Score:2)
The marketing department of any new successful company will expand to consume all available resources.
NVIDIA (Score:4, Funny)
Re:NVIDIA (Score:3, Interesting)
Re:NVIDIA (Score:2, Interesting)
I think I would prefer better eye-candy to more eye-candy.
Re:NVIDIA (Score:3, Interesting)
Re:NVIDIA (Score:3, Informative)
Guards, Seize Him! (Score:1, Insightful)
Re:Guards, Seize Him! (Score:2, Funny)
Re:Guards, Seize Him! (Score:3, Funny)
Remember, you can't have Communism without the "C"!
Re:Guards, Seize Him! (Score:1)
Re:NVIDIA (Score:5, Funny)
BCPL springs to mind; it was developed (by one of my old supervisors!) specifically to avoid platform lock-in. At the time, the university was about to acquire a second computer - but it wasn't compatible with the first. To make matters easier for the users, Martin Richards [cam.ac.uk] designed BCPL and an accompanying bytecode language called cintcode. Despite its age - it's an indirect ancestor of C! - it is still in use today in a few applications; apparently Ford have a custom-built setup running in BCPL on a pair of Vaxes to manage each factory outside the US. (For some reason, the US factories use a different system.) With the demise of the Vax, Ford have been supporting Martin's work in porting the whole BCPL/cintcode/Tripos stack (Tripos is a cintcode OS) to run on Linux/x86 systems.
For that matter, I seem to recall most of the early computer languages were intended to reduce the need to be tied to a specific platform; Fortran, Pascal, C (derived from B, itself a cut-down version of BCPL), as well as the original Unix concept.
Re:NVIDIA (Score:2, Interesting)
Mindless trivia: The original user components of AmigaDOS were written in BCPL, by a British company (Metacomco) contracted by Commodore. Through various revisions, they were rewritten by Commodore in C.
Re:NVIDIA (Score:2, Insightful)
Ummm... As the reviewer points out, Cg is more or less equivalent to the gpu-vendor-neutral HLSL for Direct3D - which belies your comment about Nvidia trying to dominate the market with a language.
However, one might say that MS and Nvidia are doing so together...
--TRR
Re:NVIDIA (Score:2)
OpenGL is vendor neutral, while Direct3D is Microsoft's (very successful) attempt to lock game developers into Windows and its DirectX platform.
Re:NVIDIA (Score:2)
And I hate microsoft as much as the next guy...
Re:NVIDIA (Score:1)
Re:NVIDIA (Score:1)
What about Bell Labs, with the original "release" of C? But perhaps that's old enough not to count. Besides, I'm not actually sure that NVIDIA are trying to control all that much... Isn't it perfectly possible to implement a Cg compiler for ATI hardware? It really ought to be, in theory. In practice, I gather ATI rather concentrate on OpenGL 2.0.
Hmmmm (Score:2)
Excellent - Now just give me a nice DX9/HLSL Book (Score:2, Funny)
Sounds interesting, but (Score:5, Interesting)
Seriously, I really would like someone to debunk this idea if possible, because I have picked up an interest in graphics programming and am just starting out - I would like to know more. It seems like an easier, more pragmatic choice (due to code reusability) to go the other route...
Re:Sounds interesting, but (Score:3, Informative)
Re:Sounds interesting, but (Score:2, Informative)
Cg on the other hand is used for writing what are known as procedural shaders. Shaders determine what an object/particle will look like; _procedural_ shaders can change what a polygon will look like based on any number of criteria (not the least of which is time).
So if I'm going to texture a wall with wood, it makes sense to procedurally generate the grain in the shader rather than store it all in a texture.
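Something along these lines (a made-up sketch - the names and constants are arbitrary, not from any real shader): compute wood-like rings from the object-space position instead of sampling a stored texture.

    // Sketch of a procedural wood-grain fragment shader in Cg.
    // "objPos" is assumed to be the object-space position passed
    // down from the vertex shader.
    float4 wood(float3 objPos : TEXCOORD0) : COLOR
    {
        float dist = length(objPos.xy);        // distance from the "log" axis
        float ring = frac(dist * 8.0);         // repeating ring pattern
        float3 light = float3(0.75, 0.55, 0.35);
        float3 dark  = float3(0.45, 0.28, 0.15);
        return float4(lerp(dark, light, ring), 1);  // blend the ring colors
    }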
Re:Sounds interesting, but (Score:2)
Re:Sounds interesting, but (Score:5, Informative)
Firstly you have the application-level library (SDL, for example); this handles things like opening windows and user interaction. This is also the bit that's often written specifically for the game.
At the next level we have the 3d library, normally OpenGL or DirectX. This handles most of the actual graphics work itself, such as displaying your polygon meshes, applying standard lighting effects, and so forth.
Finally we hit the shader level. It's here that Cg comes into its own, with special snippets of Cg code to get the reflections on the water to look just right and ripple as the character walks through, or to make the velvet curtains actually have that distinctive sheen. Special-effects work only.
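To make that concrete, here is a rough sketch of the sort of snippet that lives at this level (my own example - "time" and "modelViewProj" are assumed to be supplied by the application through the 3d library):

    // Vertex program sketch: a time-based sine ripple, the kind of
    // one-off effect Cg gets used for.
    void ripple(float4 position : POSITION,
                out float4 oPosition : POSITION,
                uniform float time,
                uniform float4x4 modelViewProj)
    {
        float4 p = position;
        p.y += 0.1 * sin(8.0 * p.x + time);  // small vertical ripple
        oPosition = mul(modelViewProj, p);   // then the usual transform
    }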
It is worth noting that DirectX does have its own way of doing shaders now, and OpenGL does have a specification for them, but last time I looked no one had implemented it.
Hope this makes sense.
Re:Sounds interesting, but (Score:2)
As for the library idea... Chances are that it will soon become feasible to implement what we would call GL's or DX's feature set entirely on the shader processor.
Cg is only for SHADERS not for the real program (Score:2)
Cg is only used for shaders: tiny, specialized code fragments that run directly on the graphics card and manipulate vertices (vertex shaders) or shade the interiors of triangles in new ways (pixel shaders).
Before Nvidia created Cg, the only way to create new shaders was to use a low-level assembly-like language. With Cg, you can write them in a high-level C-like language instead.
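To give a feel for the difference, here is a rough comparison (written from memory, so treat it as a sketch): one Cg statement versus roughly the equivalent ARB_vertex_program assembly for the standard clip-space transform.

    // Cg: transform a vertex to clip space in one statement.
    OUT.position = mul(modelViewProj, IN.position);

    // Roughly the same thing in ARB_vertex_program assembly:
    //   DP4 result.position.x, state.matrix.mvp.row[0], vertex.position;
    //   DP4 result.position.y, state.matrix.mvp.row[1], vertex.position;
    //   DP4 result.position.z, state.matrix.mvp.row[2], vertex.position;
    //   DP4 result.position.w, state.matrix.mvp.row[3], vertex.position;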
Re:Sounds interesting, but (Score:2, Insightful)
Re:Sounds interesting, but (Score:2)
Re:Here comes the back and forth... (Score:1)
Cease and Desist! (Score:4, Funny)
GCC and ANSI C standards (Score:5, Interesting)
Re:GCC and ANSI C standards (Score:1)
Re:GCC and ANSI C standards (Score:1)
Re:GCC and ANSI C standards (Score:2, Informative)
As others have said, Cg is not an extension of C, and GCC will never and should never support it.
Re:GCC and ANSI C standards (Score:1)
Re:GCC and ANSI C standards (Score:2)
good book (Score:5, Informative)
Don't let the title foo you -- it contains high-level descriptions of the algorithms as well as the mathematical concepts. They cover some advanced realtime techniques that older books don't (since the processing power wasn't there even 4 years ago), but also discuss optimizing for low-end systems.
I do recommend this book if you have any interest in graphics programming (whether you use Cg or not). If you use it with Computer Graphics (3rd edition), you should have access to pretty much all graphics algorithms (at least until TAOCP volume 7: Computer Graphics is written).
Other books/sources (Score:2)
Personally, I've found DirectX nice to work with, but it's MS and somewhat restricted to the OS. Why isn't there a good GL wrapper for Cg - does one exist? How about some good GL samples, period? Can anyone help here?
Re:Other books/sources (Score:5, Informative)
I released 75,000 lines of C++ code for supporting OpenGL game development on Windows and Linux as the G3D [graphics3d.com] library. It is under the BSD license. The next release includes support for shaders compiled by Cg -- you can grab it from the SourceForge CVS site.
G3D includes some small OpenGL demos (~200 lines), wrappers for the nasty parts of OpenGL, and wrappers for objects like textures and vertex shaders.
-m
Re:Other books/sources (Score:2)
Re:Other books/sources (Score:2)
Re:Other books/sources (Score:1)
(btw, if you want a REALLY math-intensive book, check out "3D Game Engine Design" by David Eberly. That was the first game/graphics programming book I purchased, and it almost scared me away entirely)
Re:good book (Score:2, Funny)
Don't let the title foo you
I think you mean 'don't let the title bar you'
Cg and OpenGL (Score:3, Informative)
Secondly, my bet is that OpenGL's shading language and Cg will eventually merge.
check: this [opengl.org]
and notice this part:
Compatibility with DX is very important, but they're willing to weigh advantages of changes vs. cost. Cg and the proposed "OpenGL 2.0" language are similar but not identical; both look like C, and both target multiple underlying execution units. Interfaces, communication between nodes, and accessing state across nodes are different. It's very late, but not too late to contemplate merging the two languages.
Re:Cg and OpenGL (Score:2)
-m
Re:Platform independence! .. uh..no nvidia.. ati.. (Score:3, Insightful)
On top of that: since it will probably take another 200 years for OpenGL 2.0 to see the light of day, Cg is the only way to write high-level shader programs for OpenGL.
Re:Platform independence! .. uh..no nvidia.. ati.. (Score:1)
M for Murder (Score:2)
C for Craphics? Well, at least that's how I first read it.
Aargh, nooo (Score:1)
The first of these is in nobody's interest except NVIDIA's. The second is a noble ideal, but very hard to pull off; the range of capability is just too wide.
Re:Aargh, nooo (Score:1)
> thus encouraging developers to support the top-of-the-line, high-profit-margin chips without shutting out the huge installed base of older chips
Nobody makes money on the top of the line, because there is no high profit margin there.
Well, the fact that Microsoft copied the Cg syntax verbatim into the DirectX 9.0 specification shows that Cg is not all that proprietary, and that there is already another vendor offering an alternative implementation of the language.
Re:Aargh, nooo (Score:1)
I suspect that MS HLSL, Cg and glslang are all roughly equivalent. Given that, why on earth would anyone pick the single-vendor solution?
Cg IS NOT vendor specific (Score:3, Interesting)
It is another layer, and a nice one to boot. There is no performance loss running it on ATI's cards; in fact, the few demos I have written have run better on my friend's Radeon than on my GeForce3 by a long shot.
Quit trying to demonize nVidia for bringing some peace to the hectic world of writing shaders nine thousand different ways so some guy with an obscure video card doesn't complain.
-bort
Cg for NVIDIA only? (Score:1)
If it is one of the above, I think that this is another gimmick for NVIDIA to get a greater market share.
Re:Cg for NVIDIA only? (Score:1)
All it consists of is a language which sits atop multiple interfaces to shaders.
-bort
Re:Cg for NVIDIA only? (Score:5, Informative)
It's not optimized for anything. When you compile a Cg program (either offline or at runtime - the Cg compiler is very fast!) you specify a "profile" for it to use. Some of the currently-supported profiles:
- arbvp1: outputs code for OpenGL's ARB_vertex_program extension
- vs_1_1: DirectX 8 vertex shader
- vp20: NVIDIA's NV_vertex_program OpenGL extension
- vs_2_0/vs_2_x: DirectX 9 vertex shader
- vp30: NVIDIA's NV_vertex_program2 OpenGL extension
- ps_1_1/ps_1_2/ps_1_3: DirectX 8 pixel shader
- fp20: OpenGL's NV_texture_shader and NV_register_combiners extensions
- arbfp1: OpenGL's ARB_fragment_program extension (vendor-independent, for older cards)
- ps_2_0/ps_2_x: DirectX 9 pixel shader (vendor-independent, for 4th-generation GPUs)
- fp30: NVIDIA's NV_fragment_program OpenGL extension (for 4th-generation NVIDIA GPUs)
So these profiles are optimized for their target platforms, and yes, currently NVIDIA chips are better supported. However, vendors can write profiles for their own chips without NVIDIA's involvement; ATI, for example, could write a profile for the Radeon 9800 and it would work fine. That said, ATI has already implemented DX9 shader support, so the vs_2_x/ps_2_x targets already cover those cards (or vs_1_x/ps_1_x for the 8500 generation).
Don't listen to the Slashbots here - I am a professional game developer, and Cg is a godsend (and I'm even developing mostly using ATI cards). Since runtime compilation is so fast, Cg programs can be compiled when the game is played and the exact hardware being used is known. I don't imagine I have to go into more detail as to why that's a fantastic thing.
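For the curious, targeting a profile is a one-flag affair with the offline compiler (cgc's -profile and -entry options are real; the file name is just an example), and the runtime API lets you make the same choice once the actual hardware is known:

    cgc -profile arbvp1 -entry main shader.cg   # emit OpenGL ARB_vertex_program code
    cgc -profile vs_1_1 -entry main shader.cg   # emit DirectX 8 vertex shader code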
Do they still make "demos"? (Score:2)
Where have they gone?
I'd like to see what can be done with today's hardware when it's really pushed to the limits, along with the same style of creativity that these guys had.
Even just some individual demonstrations of what is possible by doing some clever hacking in the Cg context would be awesome.
I miss the Amiga. =(
Re:Do they still make "demos"? (Score:1, Informative)
That'll take you in. It's a little difficult to google "demo."
Re:Do they still make "demos"? (Score:1)
http://www.nvidia.com/view.asp?IO=demo_da
http://www.nvidia.com/view.asp?PAGE=po3d_downl
Re:Do they still make "demos"? (Score:2, Interesting)
There's only ONE test of graphics anything (Score:3, Funny)
If he writes Quake 4 in Cg - it's in.
Re:There's only ONE test of graphics anything (Score:1)
I will say that Carmack was influential in getting OpenGL accepted in mainstream game development, perhaps. But you are truly comparing apples to oranges.
-bort
Re:There's only ONE test of graphics anything (Score:2)
Be wary... (Score:3, Interesting)
Cg was designed and developed by nvidia. While one of their claims is that it can be made to run on any card and is multiplatform, don't let that fool you. Cg is, at its worst, a thinly veiled attempt to convince developers to produce optimal code for nvidia cards at the expense of broad hardware support. ATI has already said that they will not be supporting Cg (for it to work best on ATI cards, someone needs to create profiles for it) and will instead be supporting HLSL. I doubt S3/Via or SiS have the resources to commit to two different projects, so I bet they're going to go with HLSL.
If you don't understand why nvidia might want code that works best only on its cards (it's almost a "duh" question), look at it a different way. Look at the GFFX. In almost every instance, it's a failure. Sure, it can stick to 32-bit precision, but it runs really, really slow when you do (just look at the recently released 3DMark03 scores and John Carmack's .plan comments). When it runs at 16-bit precision it's still damn slow, almost always losing out to the Radeon 9700/9800s, but it's a little more competitive (DX9's minimum spec appears to require 24-bit precision, but rumor says the jury's still out on that). It's in nvidia's best interest to make the FX appear to be fast (which it isn't), so making Cg generate code optimized for nvidia cards becomes their best interest too.
Sorry I don't have links, but the beyond3d.com [beyond3d.com] forums have a lot of information on this subject.
Re:Be wary... (Score:1)
I'm sorry to say this but you are flat wrong on this. Cg is syntactically equivalent to HLSL and is not "optimized" for nvidia products. Had anyone else developed this it would not be getting this kind of criticism.
Re:Be wary... (Score:2)
Together with Microsoft, who then took the result and renamed it HLSL. That should answer the rest of your questions (of course, reading the article would have done that too).
Re:Hasn't RADEON taken the performance lead ?????? (Score:2)