Cg Browser Effects Explained

Introduction

This document explains how the various shaders in the Cg Browser work.  Some of the samples are designed to demonstrate simple techniques, such as transformation, lighting, texture coordinate generation, and texture coordinate transformation.  Others are intended to show the vast possibilities that vertex and pixel shaders enable.  Some samples have been modified from existing SDK demos to show the advantages of using a programmable vertex shader over the traditional fixed-function T&L pipeline.  The samples are designed to be educational and to provide inspiration for uses of Cg and programmable shaders.

Please note that although some demos might run on GeForce2 and MX-class products, the majority of shaders packaged with the Cg Browser require a GeForce3 or better graphics card to run.

Following is a list of shaders, in the order that they appear in the Cg Browser.  Each link navigates to a description of the shader as well as an accompanying snapshot.


GeForce FX Shaders (NV30+)

Please note that these shaders are not included by default in the Cg Browser.  Instructions for working with them are included in the Readme.txt file that accompanies each shader project.

Improved Skinning
Improved Water
MultiPaint
Melting Paint
Raytraced Refraction
Skin


NV2X Shaders

Matrix Palette Skinning matrix_palette_skinning
Simple Fog simple_fog
Simple Lighting simple_lighting
Bump Dot3 Diffuse Specular bump_dot3_diffuse_specular
Bump Dot3x2 Diffuse Specular bump_dot3x2_diffuse_specular
Bump Horizon Mapping bump_horizon_mapping
Bump Reflection Mapping bump_reflection_mapping
Detail Normal Maps detail_normal_maps
Water Interaction water_interaction
Grass Demo grass_demo
Sine Wave Perturbation sine_wave_perturbation_ogl
Water Demo water_demo
Depth Sprites
Lighting Demo lighting_demo
Dot3 Diffuse Specular dot3_diffuse_specular
Flare flare
Procedural Terrain Demo procedural_terrain_demo
Vertex Noise vertex_noise
Fresnel Reflection Demo fresnel_reflection
Refract Reflect Demo refract_reflect_demo
Refractive Dispersion Demo refractive_dispersion_demo
Hardware Shadow Maps hardware_shadow_maps
Soft Stencil Shadow Volumes soft_stencil_shadows
Stencil Shadow Volumes stencil_shadow_volumes
Anisotropic Lighting anisotropic_lighting
Grass Rendering grass_rendering


NV30 Shaders


Improved Water
This demo gives the appearance that the viewer is surrounded by a large grid of vertices (because of the free rotation).  By switching to wireframe, or by increasing the frustum angle, it becomes apparent that the vertices are a static mesh, with the height, normal, and texture coordinates calculated on the fly based on the direction and height of the viewer.  This technique allows for very GPU-friendly water animations, since the static mesh can be precomputed.  The vertices are displaced using sine waves; in this example, a loop sums five sine waves to achieve realistic effects.
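
A minimal Cg sketch of the displacement idea (the five-wave count matches the description, but all parameter names and the data layout are illustrative assumptions):

    // Sum five sine waves to displace one grid vertex (vertex program).
    // waveDir, waveFreq, waveAmp, waveSpeed, and time would be uniform
    // parameters in the full program; the demo's actual names may differ.
    float waveHeight(float2 xz,
                     float2 waveDir[5], float waveFreq[5],
                     float waveAmp[5], float waveSpeed[5],
                     float time)
    {
        float height = 0;
        for (int i = 0; i < 5; i++) {
            // Phase advances along each wave's direction and with time.
            float phase = dot(waveDir[i], xz) * waveFreq[i]
                        + time * waveSpeed[i];
            height += waveAmp[i] * sin(phase);
        }
        return height;
    }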

Improved Skinning
This shader takes in the set of all the transformation matrices that can affect the model's vertices.  Each vertex also carries a list of the matrices that affect it.  A simple loop then, for each vertex, goes through each bone that affects that vertex and transforms it.  This allows a single Cg program to do the entire skinning for vertices affected by any number of bones, instead of having one program for one bone, another program for two bones, and so on.
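
A sketch in Cg of the variable-bone loop described above (the palette size and the packing of indices, weights, and the bone count are illustrative assumptions):

    // Skin one vertex by however many bones influence it.
    float4 skinVertex(float4 position,
                      float4 boneIndex,   // palette indices for this vertex
                      float4 boneWeight,  // matching blend weights
                      float numBones,     // bones affecting this vertex
                      float4x4 bonePalette[24])  // uniform in the full program
    {
        float4 skinned = float4(0, 0, 0, 0);
        for (int i = 0; i < (int)numBones; i++) {
            // Accumulate each bone's transform, scaled by its weight.
            skinned += boneWeight[i] * mul(bonePalette[(int)boneIndex[i]],
                                           position);
        }
        return skinned;
    }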

MultiPaint
MultiPaint presents a single-pass solution to a common production problem -- mixing multiple kinds of materials on a single polygonal surface. MultiPaint provides a simple BRDF that is still complex enough to represent many common metallic and dielectric surfaces, then controls all key factors of the variable BRDF through texturing. This permits you to create multiple materials without switching shaders, splitting your model, or resorting to multiple passes.

Uses for MultiPaint might include complex armor built of inlaid metals, woods, and stones, all modeled on a single simple poly mesh; buildings composed of multiple types of stone, glass, and metal, expressed as simple cubes; cloth with inlaid metallic threads; or, as in this demo, metal partially covered with peeling paint.

Using multiple BRDFs is common in the offline world, but rarely optimized -- instead two different shaders may be evaluated and their results blended using a mask texture or chained through "if" statements. For maximum realtime performance, MultiPaint instead integrates all of the key parts of the BRDFs as multiple painted textures, so that only one pass through the shader is required to create the mixed appearance. This permits a single-pass shader containing diffuse, specular, and environmental lighting effects in a compact, fast-executing package.
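
A fragment-level sketch of the idea in Cg (the control maps and their packing are illustrative; the actual shader folds in more BRDF factors, including the environmental term):

    // One lighting evaluation driven by per-texel BRDF control textures,
    // so different materials coexist on one mesh in a single pass.
    float3 multiPaint(float2 uv, float3 N, float3 L, float3 H,
                      sampler2D albedoMap,     // per-texel diffuse color
                      sampler2D specColorMap,  // per-texel specular color
                      sampler2D glossMap)      // per-texel exponent control
    {
        float3 albedo = tex2D(albedoMap, uv).rgb;
        float3 specCol = tex2D(specColorMap, uv).rgb;
        float gloss = tex2D(glossMap, uv).x;
        float diff = max(dot(N, L), 0);
        // The gloss map varies the specular exponent per texel.
        float spec = pow(max(dot(N, H), 0), 8 + 120 * gloss);
        return albedo * diff + specCol * spec;
    }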


Melting Paint
This shader uses an environment map with procedurally modified texture lookups to create a melting effect on the surface texture (the NVIDIA logo in this example).  The reflection vector is shifted using a noise function, giving the appearance of a bumpy surface.  The surface texture's coordinates are also shifted in a time-dependent manner, based on a noise texture.
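
A compact Cg sketch of the perturbed lookup (the texture names, the time animation, and the bumpiness control are illustrative assumptions):

    // Bend the reflection vector with signed noise before the
    // environment-map fetch; animating the noise coordinate melts it.
    float3 meltingReflection(float3 P, float3 E, float3 N,
                             sampler3D noiseMap,
                             samplerCUBE envMap,
                             float time, float bumpiness)
    {
        float3 noise = tex3D(noiseMap, P + time).xyz * 2 - 1;  // [-1,1]
        float3 R = reflect(-E, N);
        return texCUBE(envMap, normalize(R + bumpiness * noise)).rgb;
    }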

Raytraced Refraction
This shader presents a method for adding high-quality detail to small objects using a single-bounce ray-traced pass.  In this example, the polygonal surface is sampled and a refraction vector is calculated.  This vector is then intersected with a plane defined as perpendicular to the object's X axis.  The intersection point is calculated and used as texture coordinates for a painted iris.
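
A sketch of the single-bounce step in Cg (the plane placement, the parameter names, and the mapping from the intersection point to iris texture coordinates are illustrative assumptions):

    // Refract the eye ray at the surface, then intersect it with a
    // plane perpendicular to the object's X axis at x = -lensDepth.
    float3 irisColor(float3 P, float3 E, float3 N,
                     sampler2D irisMap,
                     float lensDepth,  // plane position along X
                     float eta)        // ratio of refraction indices
    {
        float3 T = refract(-E, N, eta);       // refracted ray direction
        float t = (-lensDepth - P.x) / T.x;   // ray/plane intersection
        float3 hit = P + t * T;               // point on the iris plane
        return tex2D(irisMap, hit.yz * 0.5 + 0.5).rgb;
    }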

The demo permits varying the index of refraction, as well as the depth and density of the lens.  Note that the choice of geometry is arbitrary -- this sample uses a sphere, but any polygonal model can be used.

Skin
This effect demonstrates some techniques for rendering skin ranging from simple Blinn-Phong Bump-Mapping to more-complex Subsurface Scattering lighting models. It also illustrates the use of "Rim" lighting and simple translucency for capturing some of the more subtle properties of skin resulting from complex, non-local lighting interactions. Finally, it shows how the various techniques can be combined to produce compelling stylized skin.

NV2X Shaders


Matrix Palette Skinning matrix_palette_skinning

This shader performs matrix palette skinning with vertex diffuse lighting. Each vertex has four indices which reference the four bones (input as uniform parameters) which influence this particular vertex, along with the weights of influence. We calculate the vertex position in world space as influenced by each bone (so, four separate positions are generated per vertex) and then we blend these four positions according to the weights to generate the final vertex position in world space. Since, in this particular mesh, the first two bones influencing each vertex are by far the most significant (have the largest weights), we only transform the normals by the first two bones and their respective weights, and then normalize the result. This normal is then used with the uniform parameter for the light vector in world space to calculate diffuse intensity.
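
A Cg sketch of the blend described above (the palette size and parameter names are illustrative assumptions):

    // Blend four bone-transformed positions by their weights; use only
    // the two most significant bones for the normal, then renormalize.
    void skinVertex(float3 pos, float3 normal,
                    float4 indices, float4 weights,
                    float4x4 palette[24],  // uniform in the full program
                    out float3 skinnedPos, out float3 skinnedNormal)
    {
        skinnedPos = float3(0, 0, 0);
        for (int i = 0; i < 4; i++) {
            skinnedPos += weights[i] *
                          mul(palette[(int)indices[i]], float4(pos, 1)).xyz;
        }
        skinnedNormal = normalize(
            weights.x * mul(palette[(int)indices.x], float4(normal, 0)).xyz +
            weights.y * mul(palette[(int)indices.y], float4(normal, 0)).xyz);
    }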

Simple Fog simple_fog

This example renders a fogged landscape with two types of fog.  It shows the interaction of fog calculations with pixel and vertex shader programs.

The first fog factor is a traditional screen-depth-based fog, calculated from the Z component of the vertex after the model-view-projection transform.  This value is written to the fog output oFog.x, which is used in a blending stage after the pixel shader program.

The second fog term is based on the model space landscape height and appears as the white fog in the valleys.  You can change the scaling of this second term with the “<” and “>” keys.  The value is passed into the pixel shader program, where it is added to the base texture color.

The landscape height is generated as the sum of three blurred and scaled random noise fields.  The blurring produces features of varying roughness, and the more blurred components are added in with a greater scale factor.
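
A vertex-program sketch in Cg of the two fog terms described above (the output bindings and scale parameters are illustrative assumptions):

    void fogVS(float4 position : POSITION,
               out float4 hpos : POSITION,
               out float depthFog : FOG,        // blended after the PS
               out float heightFog : TEXCOORD1, // added to color in the PS
               uniform float4x4 modelViewProj,
               uniform float depthFogScale,
               uniform float heightFogScale,
               uniform float fogCeiling)
    {
        hpos = mul(modelViewProj, position);
        // 1) Traditional fog from post-transform screen depth.
        depthFog = hpos.z * depthFogScale;
        // 2) Height fog: thickest where the landscape dips lowest.
        heightFog = heightFogScale * max(fogCeiling - position.y, 0);
    }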

Simple Lighting simple_lighting

This example shows basic diffuse and specular lighting calculations, based on the Phong illumination model.  The diffuse term is calculated using the usual N dot L formulation, and the specular term uses the Blinn formulation of N dot H.  This is also the example that is used in "A Brief Tutorial", which can be found in the Cg User's Manual.
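
A sketch along the same lines in Cg (this parallels, but is not, the manual's exact listing; the light and eye vectors are assumed normalized and in the same space as the normal):

    void lightingVS(float4 position : POSITION,
                    float3 normal : NORMAL,
                    out float4 hpos : POSITION,
                    out float4 color : COLOR,
                    uniform float4x4 modelViewProj,
                    uniform float3 lightVec, uniform float3 eyeVec,
                    uniform float4 diffuseColor,
                    uniform float4 specularColor,
                    uniform float shininess)
    {
        hpos = mul(modelViewProj, position);
        float3 N = normalize(normal);
        float diff = max(dot(N, lightVec), 0);           // N dot L
        float3 H = normalize(lightVec + eyeVec);         // half-angle
        float spec = pow(max(dot(N, H), 0), shininess);  // Blinn N dot H
        color = diff * diffuseColor + spec * specularColor;
    }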

Bump Dot3 Diffuse Specular bump_dot3_diffuse_specular

This effect computes diffuse and specular lighting in two passes using a normal from a normal map. The first pass lays down the diffuse component, by transforming the light vector into tangent space in the vertex program and then using this light vector along with the normal and a decal texture to compute diffuse lighting. In a second pass, the half-angle vector is computed per-vertex, transformed into tangent space, and passed down in a texture coordinate. H dot N is then computed per-pixel, and the result is squared a number of times to achieve a higher specular power, then modulated by a self-shadowing term and added into the framebuffer.
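
A fragment-level sketch of the second pass's specular term (names are illustrative; the real pass works in tangent space with a normal fetched from the normal map):

    float3 specularPass(float3 N, float3 H,
                        float selfShadow, float3 specColor)
    {
        float s = saturate(dot(N, H));
        s *= s;  // (H.N)^2
        s *= s;  // (H.N)^4
        s *= s;  // (H.N)^8 -- repeated squaring raises the power
        return s * selfShadow * specColor;  // added into the framebuffer
    }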

Bump Dot3x2 Diffuse Specular bump_dot3x2_diffuse_specular

This demonstrates diffuse and specular illumination on a bump mapped model.  N · L and N · H are computed for each rendered pixel.  The values of the dot-products are then used as (u,v) coordinates to address an irradiance map.  This is a texture which holds a diffuse color ramp in one axis, and a specular color ramp in the alpha channel of the other axis.  In this way, an arbitrary response to the incident light direction is possible.  You can achieve any specular exponent or arbitrary function by editing the color ramps in a paint program.
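
A sketch of the lookup in Cg (the map packing and names are illustrative assumptions):

    // The two dot products become (u,v) into the irradiance map: its
    // color channels hold the diffuse ramp, its alpha the specular ramp.
    float3 dot3x2Lighting(float3 N, float3 L, float3 H, float3 baseColor,
                          sampler2D irradianceMap)
    {
        float2 uv = float2(saturate(dot(N, L)), saturate(dot(N, H)));
        float4 irr = tex2D(irradianceMap, uv);
        return irr.rgb * baseColor + irr.a;
    }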

The demo includes right-button menu options for loading normal maps, base textures, and irradiance maps, as well as the ability to visualize the incident light vector or each texture map on its own.

Bump Horizon Mapping bump_horizon_mapping

This effect does per-pixel bump mapping and makes every bump cast a shadow using the horizon mapping technique.  Based on the height map, 8 horizon maps are precomputed corresponding to 8 main light directions in the tangent plane.  Every horizon map texel holds the cosine of the angle between the normal and the light direction below which this texel is shadowed by its neighboring texels.  At run-time, we assume that the surface is locally planar enough that the visibility pre-computation holds true.  For every pixel, we compare the actual cosine of the angle between the normal and the current light direction to the limit stored in the horizon map and determine if the pixel is in shadow or not.  The current light direction falls in-between two of the 8 main light directions; the value of the limit is thus computed by linear interpolation between those 2 main directions.
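
The per-pixel test reduces to a single comparison, sketched here in Cg (horizonLimit is assumed to be the value already interpolated from the two nearest of the 8 precomputed maps):

    float horizonShadow(float3 N, float3 L, float horizonLimit)
    {
        float c = dot(N, L);                    // cosine toward the light
        return (c >= horizonLimit) ? 1.0 : 0.0; // lit, or bump-shadowed
    }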

Bump Reflection Mapping bump_reflection_mapping

This shader demonstrates how to perform the ubiquitous "bumpy shiny" effect in Cg, where a per-pixel normal is used to compute a reflection vector that looks up into a cubemap.  The vertex program performs the bumpy-shiny setup by passing a matrix that transforms a vector from tangent space to "cube map space" (usually world space) as texture coordinates, along with an eye vector in world space.  The pixel shader then computes the texCUBE_reflect_eye_dp3x3 function, which transforms the normal fetched from a normal map by the 3x3 interpolated matrix, computes a reflected vector using the eye vector and transformed normal, and uses this reflected vector to look up into the cubemap.
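
A fragment sketch in Cg of the same computation written out by hand (the three matrix rows arrive as interpolated texture coordinates; names are illustrative):

    float3 bumpyShiny(float2 uv, float3 row0, float3 row1, float3 row2,
                      float3 eyeVec,
                      sampler2D normalMap,
                      samplerCUBE envMap)
    {
        float3 n = tex2D(normalMap, uv).xyz * 2 - 1;  // signed normal
        // Tangent space -> cube-map space via the interpolated 3x3 rows.
        float3 N = float3(dot(row0, n), dot(row1, n), dot(row2, n));
        float3 R = reflect(-eyeVec, normalize(N));
        return texCUBE(envMap, R).rgb;
    }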

Detail Normal Maps detail_normal_maps

This demo shows a technique for combining normal maps at runtime. There is a "detail" normal map, which shows fine detail and fades in as you get closer to the surface, and a larger scale primary normal map. The vertex program does standard setup for tangent-space bump-mapping, and passes a light vector in tangent space to the pixel shader. At the pixel level, we combine the two normals together, which results in a non-unit length normal, and then renormalize the normal using a fast single iteration Newton-Raphson approximation. This normal is then used to calculate the diffuse intensity, which is then modulated by the base texture.
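
A sketch of the combine-and-renormalize step in Cg (the fade control and texture names are illustrative assumptions):

    float3 combinedNormal(float2 uv, float detailScale, float fade,
                          sampler2D normalMap,
                          sampler2D detailMap)
    {
        float3 n0 = tex2D(normalMap, uv).xyz * 2 - 1;
        float3 n1 = tex2D(detailMap, uv * detailScale).xyz * 2 - 1;
        float3 n = n0 + fade * n1;  // non-unit length after the sum
        // One Newton-Raphson step toward 1/sqrt(dot(n,n)) -- cheaper
        // than a full normalize() and accurate for near-unit vectors.
        return n * (1.5 - 0.5 * dot(n, n));
    }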

Water Interaction water_interaction

The demo is meant to show two main things:

1) Coupling two dynamic rendered-texture water simulations together so that a tiled texture applied across a large area can transition seamlessly to a local unique detail texture.

2) A method of rendering water reflections similar to Environment Mapped Bump Mapping (EMBM) but using vertex shaders and the pixel shader texm3x2tex operation for added control and realism.

The coupling of two water simulations allows a large area of procedural water to be created while still allowing for unique local features.  The reflection method used performs the same basic calculation as DX6-style EMBM, but in this case the 2x2 rotation matrix and base texture coordinates are calculated in a vertex shader and can vary per-vertex.  The water simulation is performed entirely on the graphics hardware using pixel shaders. Nearest-neighbor differencing and sampling at each texel is accomplished using a combination of vertex and pixel shaders.  


Grass Demo grass_demo

This demo shows the construction and animation of leaves of grass calculated entirely on the GPU.  Each leaf is created from a Bezier curve with a few control points that are procedurally moved around to simulate the wind.  The normals of the leaves are also calculated from this Bezier curve.  The vertices sent to the shader consist of one degenerate quad strip for each leaf, with all the vertices collapsed to the root point of the leaf; each vertex carries a different t parameter which determines where on the Bezier curve the vertex should be located.
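
A sketch of the per-vertex curve evaluation in Cg, assuming a quadratic Bezier spine (the text says only "a few control points", so the degree and the names here are assumptions):

    // p0 is the leaf root, p1 and p2 the wind-animated control points,
    // and t the per-vertex curve parameter described above.
    float3 bezierPoint(float3 p0, float3 p1, float3 p2, float t)
    {
        float u = 1 - t;
        return u * u * p0 + 2 * u * t * p1 + t * t * p2;
    }

    float3 bezierTangent(float3 p0, float3 p1, float3 p2, float t)
    {
        // Derivative of the curve; the leaf normal is built from this.
        return 2 * (1 - t) * (p1 - p0) + 2 * t * (p2 - p1);
    }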

Sine Wave Perturbation sine_wave_perturbation_ogl

A flat grid of polygons is deformed by a sine wave that is evaluated with a Taylor series, using coefficients stored in constant memory.  The shader calculates the first four terms of the series.  The distance from the center of the grid is used to generate a height, which is applied to the vertex coordinate.  A normal is also generated and used to compute a reflection vector from the eye, which serves as a texture coordinate into a cubic environment map.
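
A sketch of the series evaluation in Cg (four terms, as described; the input is assumed to be range-reduced to roughly [-pi, pi] beforehand):

    float taylorSin(float x)
    {
        float x2 = x * x;
        float x3 = x * x2;
        float x5 = x3 * x2;
        float x7 = x5 * x2;
        // sin(x) = x - x^3/3! + x^5/5! - x^7/7! + ...
        return x - x3 / 6.0 + x5 / 120.0 - x7 / 5040.0;
    }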

The input data for this shader is simply two floating-point values.  The shader generates all normal, texture, and color information itself.  A menu option allows rendering in wireframe, and left-clicking allows rotating the mesh.

Water Demo water_demo

This demo gives the appearance that the viewer is surrounded by a large grid of vertices (because of the free rotation).  By switching to wireframe, or by increasing the frustum angle, it becomes apparent that the vertices are a static mesh, with the height, normal, and texture coordinates calculated on the fly based on the direction and height of the viewer.  This technique allows for very GPU-friendly water animations, since the static mesh can be precomputed.

Depth Sprites

This demo illustrates how texture shaders can be used to create sprites with real depth data.  Depth sprites differ from normal sprites in that they can realistically intersect in 3D with other depth sprites or standard 3D objects.  The register combiners are used to normalize the light vector, calculate its reflection off the normal-mapped surface, and evaluate both the diffuse and specular lighting equations.

Lighting Demo lighting_demo

This demo shows how to implement each of the standard OpenGL lighting types in the vertex shader.  It includes examples of spotlights, local lights, and infinite lights.

Dot3 Diffuse Specular dot3_diffuse_specular

This shader performs diffuse and specular lighting in two passes using an interpolated normal, and shows a couple of different ways to perform per-fragment vector normalization.  In the first pass, diffuse lighting is calculated as follows: in the vertex program, a light vector and normal are calculated and passed to the pixel program.  At the pixel level, the interpolated normal is renormalized using a Newton-Raphson approximation, and the light vector is normalized using a normalization cubemap.  The normal, light vector, and decal texture are then used to compute the diffuse lighting term.  In a second pass, the vertex program calculates the half-angle vector and normal, and the pixel program normalizes these two vectors, performs H dot N, and squares the result a number of times to achieve a higher specular power.  This value is then added into the framebuffer.
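
The two normalization methods, sketched in Cg (normCube is an assumed normalization cubemap whose texels encode normalize(direction) in range-compressed form):

    float3 normalizeNR(float3 v)
    {
        // One Newton-Raphson step toward 1/sqrt(dot(v,v)); good for
        // vectors that are already close to unit length.
        return v * (1.5 - 0.5 * dot(v, v));
    }

    float3 normalizeCube(float3 v, samplerCUBE normCube)
    {
        return texCUBE(normCube, v).xyz * 2 - 1;  // expand to [-1,1]
    }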

Flare flare
This glow effect is produced by applying a convolution filter to a low resolution texture which contains bright pixels at sources of glow. First, a "glow source" texture is created by rendering the object to a texture render target. For this rendering, a mask channel, here stored in the alpha channel of the object's base texture, determines which parts of the object contribute bright pixels to the "glow source" texture. The rendered texture is then set as the input texture for the convolution operation. The result of the convolution is accumulated in another texture, the "glow" texture, which will contain the brightness of the blurry glow for everything in the scene as viewed from the camera's point of view. The scene is rendered normally, and the "glow" texture is then rendered to the full screen with additive blending.

To perform the convolution of glow source pixels into bright blurry glow, several rendering passes may be used. On GeForce3 and GeForce4, each pass accumulates four samples of the convolution kernel. Each sample is multiplied by a coefficient to determine the shape of the glow, and this shape can be any arbitrary function. To save passes, a separable convolution is used which blurs first horizontally, and then blurs the horizontal blur vertically. This may or may not be close to a separable Gaussian convolution. If the desired blur size is NxN texels, this approach reduces the number of samples which must be accumulated from N*N to 2*N, a substantial savings. This approach sacrifices some flexibility in determining the shape of the blur.

The effect runs from about 200 to 650 fps in windowed mode. Most of the time for rendering in the windowed mode is spent in blitting from backbuffer to the windowed area. To make this less significant, reduce the size of the window. This blit() time is not required for fullscreen mode, so the effect will run faster in a fullscreen application.


Procedural Terrain Demo

This example uses the vertex-program Perlin noise implementation to generate a two-octave ridged multifractal terrain.  The geometry sent to the hardware is a static flat quadrangle mesh, which is displaced on the fly by the programmable vertex hardware.  A 3D texture is used to color the terrain based on height.  The sliders can be used to modify the frequencies and amplitudes of the two noise octaves, producing different styles of terrain.
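
A sketch of the two-octave ridged combination (n1 and n2 are assumed to be the two Perlin noise samples the vertex program evaluates; modulating the second octave by the first is one common ridged-multifractal choice, not necessarily the demo's exact formula):

    float terrainHeight(float n1, float n2, float amp1, float amp2)
    {
        float r1 = (1 - abs(n1)) * (1 - abs(n1));  // ridge of octave 1
        float r2 = (1 - abs(n2)) * (1 - abs(n2));  // ridge of octave 2
        return amp1 * r1 + amp2 * r2 * r1;         // octave 2 scaled by 1
    }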

Reference: "Texturing and Modeling: A Procedural Approach", Ebert et al.

Vertex Noise vertex_noise

This example demonstrates an implementation of Perlin noise using vertex programs. An animated 3D noise function is used to displace the vertices of a sphere along the vertex normal. The geometry is entirely static, but is displaced on the fly by the vertex program hardware. Perlin noise is implemented for the vertex program profile using recursive lookups into a permutation table stored in constant memory. The size of this table determines the period at which the noise function repeats. 3D noise costs around 65 instructions, 2D noise around 45, 1D noise around 20.

Reference: http://mrl.nyu.edu/~perlin/doc/oscar.html

Fresnel Reflection Demo Fresnel_reflection_dx8

When light strikes a boundary between different media, for example, air and glass, some of the light gets refracted and some gets reflected.  The amount of reflection depends on the ratio of refraction-indices of the two media, the polarization of the light, the wavelength of the light, and the angle of incidence of the light.  Fresnel’s formula provides an accurate description of how much light reflects at the boundary:

(1)       R(θ) = ½ · (sin²(θ − θt) / sin²(θ + θt)) · (1 + cos²(θ + θt) / cos²(θ − θt))

We use two approaches to approximate the Fresnel reflection formula: per-pixel and per-vertex.  Both compute a reflection vector per-vertex and use it to look up a reflection value per-pixel via a cubic environment-map.  The Fresnel reflection value then blends this reflection-value with a material-color.  The per-pixel approximation derives the Fresnel reflection value from a texture look-up.  It computes cos(θ) per-vertex -- cos(θ) = N·E is readily available in the vertex-shader, as it is required in computing the reflection-vector -- and uses the interpolated per-pixel cos(θ) in a 1D texture look-up that encodes R(arccos(x)).  The per-vertex approximation of Fresnel reflection computes the Fresnel term R(θ) per vertex.  This per-vertex term R(θ) is then linearly interpolated and applied per-pixel.  Encoding equation (1) in a vertex-shader is straightforward, but also sub-optimal in terms of the number of vertex-shader instructions and thus performance.  Approximating R(θ) as

(2)       R(θ) ≈ Ra(θ) = R(0) + (1 − R(0)) · (1 − cos θ)⁵

instead yields good results.  See the white-paper "Fresnel Reflection" on http://www.nvidia.com/developer for details.
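
A Cg sketch of approximation (2) (the eye vector E and normal N are assumed normalized; R0 = R(0) is derived from the indices of refraction):

    float fresnelSchlick(float3 N, float3 E, float R0)
    {
        float c = saturate(dot(N, E));       // cos(theta)
        float f = 1 - c;
        float f2 = f * f;
        return R0 + (1 - R0) * f2 * f2 * f;  // (1 - cos(theta))^5
    }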

The menu options allow switching between the per-pixel and per-vertex approximation, as well as switching to wire-frame rendering.  Different indices of refraction are accessible via the menu or the +/- keys.


Refract Reflect Demo refract_reflect_demo

This demo calculates a reflection and a refraction vector based on the normal on the surface of the object and modulates them based on Fresnel properties.  See the white-paper “Fresnel Reflection” on http://www.nvidia.com/developer for more information about Fresnel effects.  

Refractive Dispersion Demo refractive_dispersion_demo
This example attempts to simulate the wavelength dependent nature of light refraction. In lens design this effect is also known as chromatic aberration.

The code calculates three different refraction vectors for the red, green and blue wavelengths of light, each with slightly different indices of refraction. Each of these vectors is used to index into a cube map of the environment, and the resulting colors are modulated by red, green and blue and then summed to produce the rainbow effect. A reflection vector is also calculated, and used to index into the same cube map, making a total of four texture lookups. The reflection is modulated by a Fresnel approximation, which makes surfaces facing perpendicular to the viewer appear more reflective.
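
A Cg sketch of the four lookups (the demo computes the vectors per-vertex; they are gathered into one function here for clarity, and all names are illustrative):

    float3 dispersion(float3 E, float3 N,
                      samplerCUBE envMap,
                      float etaR, float etaG, float etaB,
                      float R0)  // Fresnel reflectance at normal incidence
    {
        float3 I = -E;  // incident ray, from the eye toward the surface
        float3 refr = float3(texCUBE(envMap, refract(I, N, etaR)).r,
                             texCUBE(envMap, refract(I, N, etaG)).g,
                             texCUBE(envMap, refract(I, N, etaB)).b);
        float3 refl = texCUBE(envMap, reflect(I, N)).rgb;
        float fres = R0 + (1 - R0) * pow(1 - saturate(dot(N, E)), 5);
        return lerp(refr, refl, fres);  // Fresnel-weighted mix
    }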

Reference: http://www.botzilla.com/house/RayPS.html

Hardware Shadow Maps hardware_shadow_maps

This example shows the usage of hardware shadow maps. A render-to-texture to a shadow map is done from the light's point-of-view, updating only depth (no color). This shadow map is then set as a texture when rendering the scene normally. The vertex shader computes the necessary texture coordinates to sample the correct texel from the shadow map given the current position, along with the current depth at that point to compare with the value in the shadow map. At the pixel level, a simple fetch from the shadow map will automatically perform a comparison between the value in the shadow map and the (r / q) texture coordinate, returning black when in shadow and white when in the light. This shadow term is then used to modulate the lighting.

Learn more about shadow mapping at http://developer.nvidia.com/view.asp?IO=shadow_mapping.

Soft Stencil Shadow Volumes soft_stencil_shadows

This demo shows a brute force approach to creating soft shadows in graphics hardware. It applies several stencil shadow volume passes to the scene from different light positions to approximate an area light. Each shadow volume pass creates a faint hard edged shadow, and the accumulation of many faint shadows blends into an accurate soft shadow. The technique is well suited to creating realistic soft shadows from area light sources, though the required number of passes may be prohibitive for real-time interactivity. Still, using hardware acceleration of the shadow volume creation and rendering greatly improves the speed at which frames may be rendered. Shadow volumes are created on the GPU by the same technique demonstrated in our "Stencil Shadow Volumes" demo.

To learn more about shadow volumes, check out http://developer.nvidia.com/view.asp?IO=robust_shadow_volumes.

Stencil Shadow Volumes stencil_shadow_volumes
This demo presents a technique for the automatic creation of shadow volume geometry. A vertex shader is used to extrude closed hull 3D models into shadow volumes. The shadow volumes are used for stencil buffer shadow rendering, and the volumes themselves can be viewed by hitting the 'SPACE BAR' in the demo. This technique accurately renders self-shadowing objects and intersecting objects. Without the use of a vertex shader, creation of the shadow volume geometry would require costly CPU processing and additional memory. Using a vertex shader, shadow volumes can be generated directly from 3D objects without stalling the graphics pipeline.
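
A vertex-program sketch of the extrusion idea (this shows the common send-to-infinity variant with w = 0; parameter names are illustrative and the demo's exact formulation may differ):

    float4 extrudeVertex(float3 position, float3 normal,
                         float3 lightPos,          // uniform in the full program
                         float4x4 modelViewProj)   // uniform in the full program
    {
        float3 L = position - lightPos;  // light-to-vertex direction
        if (dot(normal, L) > 0) {
            // Faces away from the light: extrude along L to infinity
            // (w = 0 makes this a direction in homogeneous space).
            return mul(modelViewProj, float4(L, 0));
        }
        return mul(modelViewProj, float4(position, 1));
    }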

In order for the vertex shader to extrude objects with sharp features to the proper shadow volume shape, additional triangles and vertices must be added along sharp edges. The demo code has a simple class to add such triangles and vertices to geometry where needed. This extra geometric data is required only when rendering the extruded shadow volumes. It can be skipped when rendering the objects normally to the screen.

To learn more about shadow volumes, check out http://developer.nvidia.com/view.asp?IO=robust_shadow_volumes.


Anisotropic Lighting anisotropic_lighting

This effect shows how arbitrary ad-hoc lighting models can be encoded in textures.  Here, we have a texture with a bright diagonal area that encodes an anisotropic lighting-type effect.  The vertex program calculates H dot N for one axis of the texture coordinate and L dot N for the other axis.  The end result is a bright stripe in areas where H dot N and L dot N are roughly equal.
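
The texture-coordinate setup, sketched in Cg (names are illustrative; the lighting response itself lives entirely in the 2D texture):

    float2 anisoCoords(float3 N, float3 L, float3 H)
    {
        // The texture's bright diagonal lights texels where the two
        // dot products are roughly equal.
        return float2(dot(H, N), dot(L, N));
    }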

Grass Rendering grass_rendering

This example shows a volume rendering technique similar to the shell rendering technique from "Real-Time Fur Over Arbitrary Surfaces" by Lengyel et al., except that four layers are rendered in a single pass, giving eight total layers in only two passes.  The first pass computes the texture coordinates for the first four steps along the eye vector in a vertex program.  At the pixel level, we use these texture coordinates to look up into four textures representing the first four layers of grass, blend them together based on the density at each pixel, and output this intermediate color to the framebuffer.  In a second pass, the final four steps along the eye ray are computed, the last four layers of grass texture are blended together, and the alpha blender is set up to correctly combine these two passes into the final rendering of eight layers of grass.  An additional tweak caps the z-component of the eye vector in the vertex program, since this has a tendency to grow extremely large and cause aliasing as the surface is viewed edge-on.