With the new Pascal-architecture Simultaneous Multi-Projection technology, we can implement several new techniques that improve your experience on these displays, and in Virtual Reality, improve performance too.
The first of these new Virtual Reality techniques is Lens Matched Shading, which builds upon the Multi-Res Shading technology introduced alongside our previous-generation Maxwell architecture. Lens Matched Shading increases pixel shading performance by rendering an image that more closely matches the warped output the headset's lenses require. Conventional VR rendering shades a full rectangular image and then distorts it for the lenses, discarding or compressing many pixels in the process; Lens Matched Shading avoids shading those pixels in the first place.
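Conceptually, Lens Matched Shading warps clip space so that pixel density falls off toward the edges of the image, roughly matching the lens distortion. The minimal sketch below illustrates the core transform only; it is not the VRWorks API, and the coefficient values and single-quadrant setup are simplified assumptions.

```cpp
// Conceptual sketch of the Lens Matched Shading clip-space transform.
// The coefficients are illustrative; in practice they come from the headset's
// lens profile and are set per viewport through the graphics API.
#include <cstdio>

struct ClipPos { float x, y, z, w; };

// Lens Matched Shading modifies clip-space w as w' = w + A*x + B*y. After the
// perspective divide, this compresses the image toward the edges of each
// quadrant, so fewer pixels are shaded where the lens warp would squeeze or
// discard them anyway.
ClipPos applyLensMatchedW(ClipPos p, float a, float b) {
    p.w += a * p.x + b * p.y;
    return p;
}

int main() {
    const float A = 0.3f, B = 0.3f;          // one quadrant's coefficients (illustrative)
    ClipPos v{0.9f, 0.9f, 0.5f, 1.0f};       // vertex near the top-right edge
    ClipPos warped = applyLensMatchedW(v, A, B);
    std::printf("standard NDC x: %.3f, lens-matched NDC x: %.3f\n",
                v.x / v.w, warped.x / warped.w);
    return 0;
}
```

Running this shows the edge vertex landing noticeably closer to the center after the warp, which is why fewer pixels need to be shaded near the periphery.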
Single Pass Stereo turbocharges geometry performance by allowing the left and right eye views of the head-mounted display to share a single geometry pass. This effectively halves the geometry workload of traditional VR rendering, which requires the GPU to draw the scene twice: once for the left eye and once for the right eye.
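A rough sketch of the idea, under stated assumptions: the per-vertex work is done once and only the final eye projection differs. The matrix and vector types below are illustrative, not the VRWorks API; on real hardware the broadcast of each primitive to both eye viewports is handled by the GPU itself.

```cpp
// Conceptual sketch of Single Pass Stereo: run vertex work once per vertex and
// emit positions for both eyes, instead of submitting the scene twice.
struct Vec4 { float x, y, z, w; };
struct Mat4 { float m[4][4]; };

Vec4 mul(const Mat4& m, const Vec4& v) {
    return {
        m.m[0][0]*v.x + m.m[0][1]*v.y + m.m[0][2]*v.z + m.m[0][3]*v.w,
        m.m[1][0]*v.x + m.m[1][1]*v.y + m.m[1][2]*v.z + m.m[1][3]*v.w,
        m.m[2][0]*v.x + m.m[2][1]*v.y + m.m[2][2]*v.z + m.m[2][3]*v.w,
        m.m[3][0]*v.x + m.m[3][1]*v.y + m.m[3][2]*v.z + m.m[3][3]*v.w,
    };
}

struct StereoOut { Vec4 leftEye, rightEye; };

// Traditional VR rendering performs this transform in two separate passes.
// With a shared geometry pass, the per-vertex work (animation, skinning,
// world-space transform) happens once; only the eye projections differ.
StereoOut transformOnce(const Vec4& objectVertex,
                        const Mat4& worldMatrix,
                        const Mat4& leftViewProj,
                        const Mat4& rightViewProj) {
    Vec4 worldPos = mul(worldMatrix, objectVertex);  // done once, not twice
    return { mul(leftViewProj, worldPos), mul(rightViewProj, worldPos) };
}
```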
This extra geometry headroom is especially important for geometry-heavy scenes and for those featuring significant levels of tessellation, which remains the most effective way of adding real geometric detail to objects and surfaces in VR.
With tessellation, affected game elements can be accurately lit, shadowed and shaded, and can be examined up close in Virtual Reality. With other solutions, such as Bump Mapping or Parallax Occlusion Mapping, the illusion of geometric detail breaks down when the player approaches affected objects or views them from oblique angles, which harms immersion. By increasing geometry and tessellation performance by up to 2x, developers are able to add more detail that players can examine up close, significantly improving the look of the game and the player's sense of presence.
Together, Pascal's improved performance and the new Single Pass Stereo and Lens Matched Shading technologies significantly improve the Virtual Reality experience for GeForce GTX users.
NVIDIA has spent decades working to perfect 3D graphics, but in VR, great graphics demand great audio to create a sense of presence. To this end, NVIDIA has created a game-changing advancement called VRWorks Audio.
Today's VR applications provide positional audio, telling users where a sound comes from within an environment. However, sound in the real world depends on more than just the location of its source; it is also a function of the physical environment. For example, a voice in a small room sounds different from the same voice outdoors because of the reflections and reverb caused by sound bouncing off the walls of the room. Using NVIDIA's OptiX ray tracing engine, VRWorks Audio traces the paths sound takes through an environment in real time, delivering physically accurate audio that reflects the size, shape, and material properties of the virtual world.
Simply put, we're able to simulate physically accurate, highly realistic audio in real time using the power of your graphics card.
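To make the idea concrete, here is a minimal sketch of how tracing sound paths through known geometry yields the delays and attenuations that shape reverb. It is a simplified stand-in for the GPU ray tracing VRWorks Audio performs with OptiX, not its API: the box-shaped room, absorption value, and first-order image-source method below are illustrative assumptions.

```cpp
// Minimal sketch of geometry-aware audio: compute the delay and attenuation of
// the direct path and of first-order wall reflections in a box-shaped room
// using the image-source method. Room size, absorption, and speed of sound
// are illustrative values.
#include <cmath>
#include <cstdio>

struct Vec3 { float x, y, z; };

float distance(const Vec3& a, const Vec3& b) {
    float dx = a.x - b.x, dy = a.y - b.y, dz = a.z - b.z;
    return std::sqrt(dx*dx + dy*dy + dz*dz);
}

int main() {
    const float speedOfSound = 343.0f;      // m/s
    const float absorption   = 0.3f;        // fraction of energy lost per bounce
    const Vec3 room{5.0f, 3.0f, 4.0f};      // axis-aligned room, corner at origin
    const Vec3 source{1.0f, 1.5f, 1.0f};
    const Vec3 listener{4.0f, 1.5f, 3.0f};

    // Direct path: delay grows with distance, level falls off with distance.
    float d = distance(source, listener);
    std::printf("direct: %.1f ms, gain %.3f\n", 1000.0f * d / speedOfSound, 1.0f / d);

    // First-order reflections: mirror the source across each of the six walls.
    Vec3 images[6] = {
        {-source.x, source.y, source.z}, {2*room.x - source.x, source.y, source.z},
        {source.x, -source.y, source.z}, {source.x, 2*room.y - source.y, source.z},
        {source.x, source.y, -source.z}, {source.x, source.y, 2*room.z - source.z},
    };
    for (const Vec3& img : images) {
        float len = distance(img, listener);              // reflected path length
        float delayMs = 1000.0f * len / speedOfSound;     // arrival time
        float gain = (1.0f - absorption) / len;           // distance + wall loss
        std::printf("reflection: %.1f ms, gain %.3f\n", delayMs, gain);
    }
    return 0;
}
```

A full acoustic simulation traces many more bounces against arbitrary scene geometry and materials, which is exactly the kind of massively parallel ray workload a GPU handles well.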