SCHEDULE

Come hear talks featuring innovations from NVIDIA Research as well as the best customer and developer talks from NVIDIA's 2018 GPU Technology Conference.

SUNDAY 8/12

  • 9:30 AM - 11:30 AM
    West Building, Room 221
    Hands-on Training
      Learn how to train a phase-functioned neural network to animate characters using advanced neural-network techniques. Prerequisite: experience with CNNs.
  • 9:30 AM - 11:30 AM
    West Building, Room 222
    Hands-on Training
      Discover how to train a generative adversarial network (GAN) to generate image content in DIGITS. Prerequisite: experience with CNNs.
  • 12:30 PM - 2:30 PM
    West Building, Room 221
    Hands-on Training
      Explore how neural networks with autoencoders can be used to dramatically speed up the removal of noise in ray-traced images. Prerequisite: experience with CNNs.
  • 12:30 PM - 2:30 PM
    West Building, Room 222
    Hands-on Training
      Experiment with OpenAI Gym to play against your best network, and take home code that you can use to train agents in other Atari games. Prerequisite: experience with CNNs.
  • 1:30 PM - 5:30 PM
    Steerable Application-Adaptive Near-Eye Displays
    Kishore Rathinavel, Praneeth Chakravarthula (University of North Carolina and NVIDIA), Kaan Akşit (NVIDIA), Josef Spjut (NVIDIA), Ben Boudaoud (NVIDIA), David Luebke (NVIDIA), Turner Whitted (University of North Carolina and NVIDIA), Henry Fuchs (University of North Carolina)
    West Building, Exhibit Hall A
    Emerging Technologies
      This augmented reality display uses interchangeable 3D-printed optical components to provide content-specific accommodation support and presents high-resolution imagery in a gaze-contingent manner by implementing a lens-actuation-based foveation mechanism.
  • 3:00 PM - 5:00 PM
    West Building, Room 221
    Hands-on Training
      Learn how to transfer the look and feel of one image to another image by extracting distinct visual features using convolutional neural networks (CNNs). Prerequisite: experience with CNNs.
  • 3:00 PM - 5:00 PM
    West Building, Room 222
    Hands-on Training
      Explore how to train a deep reinforcement learning agent to play StarCraft II. Prerequisite: experience with CNNs.

MONDAY 8/13

  • 9:00 AM - 9:50 AM
    Sharing Physically Based Materials Between Renderers with MDL
    Jan Jordan, Software Product Manager MDL, NVIDIA [ View Recording ]
    Lutz Kettner, Director, Rendering Software and Material Definition, NVIDIA
    Sponsored Room 220
    Talk
      We'll discuss the basics of NVIDIA's Material Definition Language (MDL), showing how a single material can be used to define matching appearances between different renderers and rendering techniques. End users will learn how physically based materials are defined, while developers will learn what's entailed in supporting MDL within their own products or renderers.
  • 9:30 AM - 11:30 AM
    West Building, Room 221
    Hands-on Training
      Learn how to train a neural network to detect anomalies within data using variational autoencoders. Prerequisite: experience with CNNs.
  • 9:30 AM - 11:30 AM
    West Building, Room 222
    Hands-on Training
      Leverage the power of a neural network with autoencoders to create high-quality images from low-quality source images. Prerequisite: experience with CNNs.
  • 10:00 AM - 5:30 PM
    Steerable Application-Adaptive Near-Eye Displays
    Kishore Rathinavel, Praneeth Chakravarthula (University of North Carolina and NVIDIA), Kaan Akşit (NVIDIA), Josef Spjut (NVIDIA), Ben Boudaoud (NVIDIA), David Luebke (NVIDIA), Turner Whitted (University of North Carolina and NVIDIA), Henry Fuchs (University of North Carolina)
    West Building, Exhibit Hall A
    Emerging Technologies
      This augmented reality display uses interchangeable 3D-printed optical components to provide content-specific accommodation support and presents high-resolution imagery in a gaze-contingent manner by implementing a lens-actuation-based foveation mechanism.
  • 10:30 AM - 11:20 AM
    RTX on Vulkan
    Nuno Subtil, Senior Software Engineer, NVIDIA [ View Recording ]
    Sponsored Room 220
    Talk
      NVIDIA's RTX leverages 10+ years of research into accelerated ray tracing on GPUs. In this talk, we'll explore our API for exposing RTX through Vulkan. We'll discuss how ray tracing fits in with a low-level rasterization API and cover the details of our Vulkan ray-tracing extension.
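      As a flavor of what opting in to the extension looks like on the application side, here is a hedged sketch of creating a Vulkan device with NV ray tracing enabled. It is an illustration based on the public Vulkan headers, not code from the talk, and error handling is omitted.

      ```cpp
      // Hedged sketch: create a VkDevice with the NV ray-tracing extension.
      // Names follow the public Vulkan headers; error handling omitted.
      #include <vulkan/vulkan.h>

      VkDevice createRayTracingDevice(VkPhysicalDevice gpu,
                                      const VkDeviceQueueCreateInfo* queueInfo) {
          const char* extensions[] = {
              VK_NV_RAY_TRACING_EXTENSION_NAME,                 // acceleration structures, vkCmdTraceRaysNV
              VK_KHR_GET_MEMORY_REQUIREMENTS_2_EXTENSION_NAME,  // required by the NV extension
          };
          VkDeviceCreateInfo info = { VK_STRUCTURE_TYPE_DEVICE_CREATE_INFO };
          info.queueCreateInfoCount    = 1;
          info.pQueueCreateInfos       = queueInfo;
          info.enabledExtensionCount   = 2;
          info.ppEnabledExtensionNames = extensions;

          VkDevice device = VK_NULL_HANDLE;
          vkCreateDevice(gpu, &info, nullptr, &device);
          return device;
      }
      ```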
  • 11:30 AM - 12:20 PM
    Sponsored Room 220
    Talk
      In this talk, we'll cover the use of synthetic data in training deep neural networks for computer vision tasks. We'll explain why this is a critical research area with applications ranging from robotics to autonomous vehicles, and we'll discuss some important techniques for generating synthetic data using domain randomization and 3D graphics engines.
  • 12:30 PM - 1:20 PM
    WW2 Wave Works: Simulating Ocean and Ships in Real Time for Greyhound
    Manuel Kraemer, Senior Developer Technology Engineer, NVIDIA [ View Recording ]
    Tim Cheblokov, Senior Developer Technology Engineer, NVIDIA
    Habib Zargarpour, CCO, Digital Monarch Media
    Sponsored Room 220
    Talk
      We'll present an introduction to the simulation of ocean waves and vessel hydrostatics in real time. With our suite of custom tools, a team of filmmakers was able to leverage GPU compute power to simulate an entire fleet of warships in real time. We'll also show how these tools were integrated into the pre-visualization pipeline and used in the upcoming feature film "Greyhound".
  • 12:30 PM - 2:30 PM
    West Building, Room 221
    Hands-on Training
      Learn how to use TensorRT to accelerate inference performance for neural networks. Prerequisite: basic experience with CNNs and C++.
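      As a rough illustration of the deployment flow this training covers, here is a hedged C++ sketch of loading a serialized TensorRT engine and running inference through the nvinfer1 runtime API (recent API versions assumed). The engine filename is a hypothetical placeholder, and cleanup and error checks are omitted.

      ```cpp
      // Hedged sketch of TensorRT inference; "model.engine" is a hypothetical
      // serialized engine built offline. Cleanup and error checks omitted.
      #include <NvInfer.h>
      #include <cstdio>
      #include <fstream>
      #include <iterator>
      #include <vector>

      class Logger : public nvinfer1::ILogger {
          void log(Severity severity, const char* msg) noexcept override {
              if (severity <= Severity::kWARNING) std::printf("%s\n", msg);
          }
      } gLogger;

      void infer(void* bindings[]) {  // device pointers for network inputs/outputs
          std::ifstream file("model.engine", std::ios::binary);
          std::vector<char> blob((std::istreambuf_iterator<char>(file)),
                                 std::istreambuf_iterator<char>());

          nvinfer1::IRuntime* runtime = nvinfer1::createInferRuntime(gLogger);
          nvinfer1::ICudaEngine* engine =
              runtime->deserializeCudaEngine(blob.data(), blob.size());
          nvinfer1::IExecutionContext* context = engine->createExecutionContext();

          context->executeV2(bindings);  // synchronous inference on the GPU
      }
      ```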
  • 12:30 PM - 2:30 PM
    West Building, Room 222
    Hands-on Training
      Explore how to create analogous images using CycleGAN. Prerequisite: experience with CNNs.
  • 1:30 PM - 2:20 PM
    Taming the Beast: Using NVIDIA Tools to Unlock Hidden GPU Performance
    Aurelio Reis, SWE Director of Graphics Developer Tools, NVIDIA [ View Recording ]
    Yaki Tebeka, Distinguished Engineer, NVIDIA
    Sponsored Room 220
    Talk
      As graphics technology continues to advance, it's becoming ever more critical that developers have the tools they need to build the most performant applications possible. In this session, the NVIDIA Developer Tools team will give an overview of the latest tools for determining whether you're CPU- or GPU-limited, identifying those performance limiters, and optimizing your graphics application to peak performance.
  • 2:00 PM - 3:30 PM
    Fast, High-Precision Ray/Fiber Intersection Using Tight, Disjoint Bounding Volumes
    Nikolaus Binder (NVIDIA) and Alexander Keller (NVIDIA)
    East Building, Ballroom A
    SIGGRAPH Talk
      We improve the performance of subdivision-based ray/fiber intersection for fibers along Bézier curves by pruning with tight, disjoint bounding volumes in ray-centric coordinate systems. The resulting method calculates precise intersections on the surface of a fiber with accurate normals and performs significantly faster for a high number of subdivisions than state-of-the-art methods that prune subregions with axis-aligned bounding boxes.
  • 2:00 PM - 5:15 PM
    Applications of Vision Science to Virtual and Augmented Reality
    Anjul Patney (NVIDIA), Marina Zannoli (Oculus), Joohwan Kim (NVIDIA), Robert Konrad (Stanford), Frank Steinicke (University of Hamburg), Martin S. Banks (UC Berkeley)
    West Building, Room 301-305
    SIGGRAPH Course
      An understanding of vision science is vital in designing technology and applications for future mixed-reality (MR) head-mounted displays (HMDs). Our course provides an overview of the impact of human perception on MR applications, an introduction to human visual perception, and several case studies of using perceptual insights to improve MR experiences.
  • 2:30 PM - 3:20 PM
    Practical Realtime Raytracing with RTX - From Concepts to Implementation
    Pascal Gautron, Senior Developer Technology Engineer, NVIDIA [ View Recording ]
    Martin-Karl Lefrancois, Senior Developer Technology Engineer, NVIDIA
    Sponsored Room 220
    Talk
      Bring real-time ray tracing into your raster-based application using NVIDIA RTX and Microsoft DXR or Vulkan. This session will connect RTX principles with the implementation details needed to add ray tracing from the ground up. You'll learn how to set up acceleration structures, ray-tracing pipelines, and shader binding tables through simple, progressive additions. We'll also cover the characteristics and interactions of the ray-tracing shaders: ray generation, miss, and hit shaders. The talk is complemented by online resources with in-depth explanations and easy-to-integrate source code to make your RTX-based DirectX/Vulkan ray-tracing integration as smooth as possible.
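      To ground those pieces, here is a hedged D3D12 sketch of the final dispatch step, where the shader-binding-table ranges for the ray-generation, miss, and hit-group shaders are handed to DispatchRays. The structure fields follow the public DXR headers; the buffer layout, offsets, and record size are assumptions for illustration.

      ```cpp
      // Hedged sketch following the public DXR headers; 'sbt' is an assumed
      // ID3D12Resource holding raygen, miss, and hit-group shader records.
      #include <d3d12.h>

      void dispatchRays(ID3D12GraphicsCommandList4* cmdList,
                        ID3D12StateObject* rtPipeline, ID3D12Resource* sbt,
                        UINT64 recordSize, UINT64 missOffset, UINT64 hitOffset,
                        UINT width, UINT height) {
          D3D12_DISPATCH_RAYS_DESC desc = {};
          desc.RayGenerationShaderRecord.StartAddress = sbt->GetGPUVirtualAddress();
          desc.RayGenerationShaderRecord.SizeInBytes  = recordSize;
          desc.MissShaderTable.StartAddress  = sbt->GetGPUVirtualAddress() + missOffset;
          desc.MissShaderTable.SizeInBytes   = recordSize;
          desc.MissShaderTable.StrideInBytes = recordSize;
          desc.HitGroupTable.StartAddress  = sbt->GetGPUVirtualAddress() + hitOffset;
          desc.HitGroupTable.SizeInBytes   = recordSize;
          desc.HitGroupTable.StrideInBytes = recordSize;
          desc.Width  = width;   // one ray-generation invocation per pixel
          desc.Height = height;
          desc.Depth  = 1;

          cmdList->SetPipelineState1(rtPipeline);  // ray-tracing pipeline state object
          cmdList->DispatchRays(&desc);
      }
      ```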
  • 4:00 PM - 6:00 PM
    East Building, Hall C
    Talk
      NVIDIA founder and CEO Jensen Huang will take center stage to explore the latest innovations in graphics and how artificial intelligence is changing our world.

TUESDAY 8/14

  • 9:00 AM - 10:30 AM
    Towards Virtual Reality Infinite Walking: Dynamic Saccadic Redirection
    Qi Sun (Stony Brook University), Anjul Patney (NVIDIA), Li-Yi Wei (Adobe), Omer Shapira (NVIDIA), Jingwan Lu (Adobe Research), Paul Asente (Adobe Research), Morgan McGuire (NVIDIA), David Luebke (NVIDIA), Arie Kaufman (Stony Brook University)
    West Building, Room 109-110
    SIGGRAPH Technical Paper
      Redirected walking techniques can enhance the immersive capabilities of virtual reality (VR) navigation while maintaining visual-vestibular consistency. However, they're often limited by the size, shape, and content of the physical environments. We propose a redirected walking system that can apply to small physical rooms with static and dynamic obstacles. Our method uses a VR headset with head- and eye-tracking to dynamically detect saccadic suppression and redirects the user during the resulting temporary blindness. We also use a dynamic path-planning algorithm that adaptively guides the redirection algorithm to avoid walls and obstacles. Our mapping algorithm runs in real time on a GPU and thus can avoid moving obstacles, even other VR users sharing the same physical space. Finally, we propose using subtle gaze direction as a companion to saccadic redirected walking, further enhancing redirection by imperceptibly increasing the frequency of saccades. We demonstrate that using saccades can significantly increase the rotation gains during redirection without visually distorting the virtual scene. This allows our method to apply to large, open virtual spaces and small physical rooms. We evaluate our system via numerical simulations and user studies.
  • 9:00 AM - 12:15 PM
    Introduction to DirectX Raytracing
    Chris Wyman (NVIDIA), Colin Barré-Brisebois (SEED), Shawn Hargreaves (Microsoft), Peter Shirley (NVIDIA)
    East Building, Ballroom BC
    SIGGRAPH Course
      This course is an introduction to Microsoft's DirectX Raytracing API, suitable for students, faculty, rendering engineers, and industry researchers. The first half focuses on ray-tracing basics and incremental, open-source shader tutorials for novices. The second half covers API specifics for developers integrating ray tracing into existing raster-based applications.
  • 9:00 AM - 12:15 PM
    New Rendering Techniques for Real-Time Graphics
    Cem Cebenoyan, Director of Engineering, Game Engines and Core Tech, NVIDIA [ View Recording ]
    Sponsored Room 220
    Course
      Turing uses flexible pixel shader scheduling, offering applications ultimate control over how they spend their shading budget. As a result, applications can focus their shading in the areas where it matters the most, reducing work in other areas where it's less important. We'll talk about various scenarios where this capability can offer substantial performance benefits at no perceptible quality loss.
  • 9:00 AM - 12:15 PM
    Sponsored Room 220
    Course
      Many rendering algorithms require rendering multiple views of the same scene for reflections, shadows, or stereo rendering. Turing fully embraces these multi-view rendering scenarios, providing hardware acceleration for rendering multiple views at once, with full generality in how geometry is processed across views.
  • 9:00 AM - 12:15 PM
    Texture Space Shading – A New Way to Manage Shades
    Rahul Sathe, Senior Development Technology Engineer, NVIDIA
    Sponsored Room 220
    Course
      Non-screen-space shading has been known in the offline rendering industry for quite some time. Texture-space shading is an advanced form of shading-rate control, offering reuse of work not only within a frame but also across frames, by caching the results of view-dependent computation in textures. Turing provides a number of features applications can use to build very efficient texture-space shading systems.
  • 9:00 AM - 12:15 PM
    Sponsored Room 220
    Course
      This talk presents new rendering techniques for geometry-heavy scenes (M&E, CAD, AEC, etc.). Such scenes can comprise many objects with low triangle counts as well as objects with very high triangle counts; both scenarios are addressed. Vulkan and OpenGL API examples are given.
  • 9:30 AM - 9:55 AM
    Augmented Material Creation with Substance Alchemist
    Pierre Maheut, Market Strategy Director for Architecture and Industrial Design, Allegorithmic
    Jérémie Noguer, Market Strategy Director for Entertainment, Allegorithmic
    Booth # 801, West Hall
    Talk
      Allegorithmic will showcase its upcoming product, Substance Alchemist, for the very first time at SIGGRAPH 2018. This new tool will let you manage your material collections and create new materials from pictures, scans, or pre-existing materials. This talk will detail the different facets of Alchemist and how GPU-accelerated AI will revolutionize material creation.
  • 10:00 AM - 5:30 PM
    Steerable Application-Adaptive Near-Eye Displays
    Kishore Rathinavel, Praneeth Chakravarthula (University of North Carolina and NVIDIA), Kaan Akşit (NVIDIA), Josef Spjut (NVIDIA), Ben Boudaoud (NVIDIA), David Luebke (NVIDIA), Turner Whitted (University of North Carolina and NVIDIA), Henry Fuchs (University of North Carolina)
    West Building, Exhibit Hall A
    Emerging Technologies
      This augmented reality display uses interchangeable 3D-printed optical components to provide content-specific accommodation support and presents high-resolution imagery in a gaze-contingent manner by implementing a lens-actuation-based foveation mechanism.
  • 10:00 AM - 10:25 AM
    Booth # 801, West Hall
    Talk
      D3D12 has provided unparalleled access to the GPU by being as close to the metal as possible. One advanced feature this enables is asynchronous compute. In this session, we'll dive into what it takes for your application to take advantage of async compute and what tools are available to improve the process, including NVIDIA's newest tool, GPU Trace.
  • 10:30 AM - 10:55 AM
    WW2 Wave Works: Simulating Ocean and Ships in Real Time for Greyhound
    Habib Zargarpour, CCO, Digital Monarch Media [ View Recording ]
    Manuel Kraemer, Senior Developer Technology Engineer, NVIDIA
    Booth # 801, West Hall
    Talk
      We'll present an introduction to the simulation of ocean waves and vessel hydrostatics in real time. With our suite of custom tools, a team of filmmakers was able to leverage GPU compute power to simulate an entire fleet of warships in real time. We'll also show how these tools were integrated into the pre-visualization pipeline and used in the upcoming feature film "Greyhound".
  • 11:00 AM - 11:25 AM
    NvWebView: A VR Web Browser for the Real World
    Sean Wagstaff, Senior Technical Marketing Manager, NVIDIA [ View Recording ]
    Booth # 801, West Hall
    Talk
      Imagine a world without the internet. Now, imagine a virtual reality (VR) world with the internet. While some VR experiences, such as NVIDIA Holodeck™, offer collaboration with other VR users, most don't allow much interaction with the outside world. With NvWebView, you can invite remote collaborators to join your VR session through Google Hangouts; browse SketchFab content in stereo WebVR before dragging it into your scene; or interact with a remote, web-based application without leaving VR. The latest addition to NVIDIA's VRWorks SDK is a feature-rich, GPU-optimized Chromium web browser with a powerful API that literally opens a window to web-connected VR experiences. In this talk, you'll learn about NvWebView's unique features and capabilities and how you can implement the SDK in your own application.
  • 11:30 AM - 11:55 AM
    Developing with HDR G-SYNC Monitors
    Evan Hart, Developer Technology Engineer, NVIDIA [ View Recording ]
    Booth # 801, West Hall
    Talk
      With the advent of high-dynamic-range (HDR) displays, a whole new level of image quality is open for applications to exploit. Unfortunately, many developers have had to struggle with sub-par solutions in developing and mastering their content; this has been especially tricky for real-time developers such as game developers. Many televisions fail to faithfully represent the content, and the limited set of HDR monitors could often best be described as HDR-compatible rather than true HDR. The recent introduction of G-SYNC HDR monitors brings an important improvement in quality and reliability. These monitors provide true high dynamic range as well as a wide color gamut. While they may be marketed as great gaming monitors, the same properties make them equally good for developing, testing, and evaluating real-time content. This talk will cover practical experiences in the development, analysis, and review of HDR content using the technology behind G-SYNC HDR monitors.
  • 12:00 PM - 12:25 PM
    Booth # 801, West Hall
    Talk
      After a short description of how ANSYS OPTIS software has evolved to become a worldwide reference in predictive rendering, we'll discuss the precision needed in every aspect of simulation to reach "true to life" rendering, accurate enough not just to appreciate the beauty of a product but also to track down its slightest flaws. We'll then describe the gap between realistic and accurate, and define, with practical examples, how predictive simulation can be achieved and compared to reality. We'll conclude with how GPU acceleration is a real game-changer in the process of imagining, designing, and experiencing a product: testing more possibilities and more complex materials, and relying on decisions made over accurate rendering throughout the whole creation process.
  • 12:30 PM - 12:55 PM
    Sharing Physically Based Materials Between Renderers with MDL
    Jan Jordan, Software Product Manager MDL, NVIDIA [ View PDF ]
    Lutz Kettner, Director, Rendering Software and Material Definition, NVIDIA
    Booth # 801, West Hall
    Talk
      We'll discuss the basics of NVIDIA's Material Definition Language (MDL), showing how a single material can be used to define matching appearances between different renderers and rendering techniques. End users will learn how physically based materials are defined, while developers will learn what's entailed in supporting MDL within their own products or renderers.
  • 1:00 PM - 1:25 PM
    Booth # 801, West Hall
    Talk
      The VRWorks Audio SDK was released in 2017 for modeling the acoustics of 3D environments in real time. Over the last year, we've made many improvements to VRWorks Audio to bring it much closer to reality while boosting performance, with a particular focus on architecture, engineering, and construction (AEC) workflows, architectural walk-throughs, and plugin support for UE4 and Unity. Come learn about these improvements and a few other exciting (yet undisclosed) VRWorks Audio features, and learn how you can use VRWorks Audio in your next project.
  • 1:30 PM - 1:55 PM
    Adaptive Temporal Antialiasing
    Adam Marrs, Software Engineer, NVIDIA [ View Recording ] [ View PDF ]
    Rahul Sathe, Senior Development Technology Engineer, NVIDIA
    Booth # 801, West Hall
    Talk
      We'll discuss adaptive temporal antialiasing (ATAA), a new technique that extends traditional temporal antialiasing (TAA) approaches with adaptive ray tracing while conforming to the constraints of a commercial game engine (Unreal Engine 4). Attendees will learn how the algorithm removes the blurring and ghosting artifacts common to standard temporal antialiasing and achieves image quality approaching 8x supersampling of geometry, shading, and materials while staying within a real-time game frame budget.
  • 2:00 PM - 2:25 PM
    SOLIDWORKS + NVIDIA Join to Usher in the Future of Design Visualization
    David Randle, Strategy and Business Development, Dassault Systèmes SolidWorks [ View Recording ]
    Booth # 801, West Hall
    Talk
      Gone are the days of waiting hours for 4K content to render and building expensive physical mockups, all to make important design decisions. SOLIDWORKS Visualize now includes NVIDIA's AI denoiser to decipher and magically eliminate noise from your scene, instantly boosting render speeds by 10x. SOLIDWORKS Visualize is also the authoring tool for designers to bring rich 3D models from 3D design packages into NVIDIA Holodeck™, enabling truly collaborative, photo-quality virtual reality (VR) design review experiences. Understand the workflow and take away new levels of value by combining best-quality content in a fraction of the time with the easiest design workflow for rich, immersive, and collaborative review.
  • 2:00 PM - 5:30 PM
    Sponsored Room 220
    Course
      We'll explore recent developments in GPU-accelerated, high-quality, interactive ray tracing to support the visual quality and scene complexity required for visual effects, animation, and design. Presentations will be given by NVIDIA and by film rendering leaders from Autodesk, Chaos, Isotropix, Pixar, and Weta Digital.
  • An Introduction to NVIDIA OptiX
    Oliver Klehm, Senior Rendering Engineer, NVIDIA [ View Recording ]
    Sponsored Room 220
    Course
      We explore techniques for GPU-accelerated interactive ray tracing with the high visual quality and scene complexity required for visual effects, animation, and design. Presentations by NVIDIA and by film rendering leaders from Autodesk, Isotropix, Pixar, and Weta Digital.
  • Sponsored Room 220
    Course
      Clarisse iFX is a 3D DCC featuring a fully ray-traced CPU 3D viewport, which allows interactive visualization and layout of massive VFX production scenes. In this talk, we'll present how Clarisse iFX can take advantage of NVIDIA's OptiX ray tracing engine to reach real-time display. We'll discuss how we dealt with the trade-offs among memory, interactive edits, and performance to improve Clarisse iFX's unique workflow.
  • Sponsored Room 220
    Course
      We'll present how we added GPU acceleration to Arnold, and our goal of maintaining a single, feature-compatible renderer capable of running on both architectures. We'll describe how we leveraged OptiX and its JIT compiler to share the codebase between the two architectures, and how production rendering features can be mapped to the OptiX programming model. We'll also discuss how well brute-force, physically based rendering techniques perform on the GPU, and showcase recent results.
  • Sponsored Room 220
    Course
      We'll discuss photo-realistic rendering in modern movie production and present the path that led us to leverage GPUs and CPUs in a new scalable rendering architecture. You'll learn about RenderMan XPU, Pixar's next-gen physically based production path tracer, and how we solve the problem of heterogeneous compute using a shared code base. Come hear about our partnership with NVIDIA to create the technology that will enable the art and creativity in future feature animation and live-action visual effects blockbusters.
  • GazeboRT: GPU Ray Tracing at Weta Digital
    Brian Sharpe, Senior Software Engineer, Weta Digital
    Sponsored Room 220
    Course
      Gazebo is Weta Digital's general-purpose real-time rendering engine. Built on OpenGL, it has proved highly successful in many areas of our pipeline, including virtual production, pre-lighting, scene layout, and modelling. Recently we developed GazeboRT, a port of Gazebo to NVIDIA's OptiX, allowing for many new possibilities such as spectral rendering, indirect illumination, and correct reflection and refraction. In this talk, we present the steps taken to extend our OpenGL architecture to OptiX, along with technical details such as our spectral implementation, light hierarchy, and use of hardware-compressed textures. To finish, we'll demonstrate GazeboRT with material from recent Weta Digital productions including Avengers: Infinity War and Maze Runner: The Death Cure.
  • Sponsored Room 220
    Course
      This talk examines Chaos Group's first experiences using DirectX Raytracing to build a real-time ray tracer. We present possible solutions to such challenges as on-demand geometry loading and out-of-core ray tracing, as well as real-time denoising and antialiasing. We examine the performance of DXR and the factors that influence it. Finally, we explore the potential applications of this technology within the current ecosystem of DCCs.
  • 2:30 PM - 2:55 PM
    RTX on Vulkan
    Nuno Subtil, Senior Software Engineer, NVIDIA [ View Recording ]
    Booth # 801, West Hall
    Talk
      NVIDIA's RTX leverages 10+ years of research into accelerated ray tracing on GPUs. In this talk, we'll explore our API for exposing RTX through Vulkan. We'll discuss how ray tracing fits in with a low-level rasterization API and cover the details of our Vulkan ray-tracing extension.
  • 3:00 PM - 3:25 PM
    Machine Learning at the Edge
    Keith Hartfield, Head of Product Management, HP [ View PDF ]
    Booth # 801, West Hall
    Talk
      Edge vs. cloud: why the edge still matters. Learn how machine learning and edge computing are important to your business.
  • 3:30 PM - 3:55 PM
    NVIDIA ProVis VR Update
    Ingo Esser, Senior Developer Technology Engineer, NVIDIA [ View Recording ] [ View PDF ]
    Robert Menzel, Developer Technology Engineer, NVIDIA
    Booth # 801, West Hall
    Talk
      Building on our previous talks, we'll give an update on what's happening in the professional virtual reality (VR) space at NVIDIA. We'll first give an update on OpenGL and Vulkan VR functionality, and then we'll talk about how to drive dual-input head-mounted displays (HMDs) from two GPUs efficiently.
  • 4:00 PM - 4:25 PM
    Tackling the Realities of Virtual Reality
    David Luebke, Vice President of Graphics Research, NVIDIA [ View Recording ]
    Booth # 801, West Hall
    Talk
      David Luebke, NVIDIA's vice president of graphics research, will describe NVIDIA's vision for the future of virtual and augmented reality. Luebke will review some of the "realities of virtual reality": challenges presented by Moore's law, battery technology, optics, and wired and wireless connections. He'll then discuss the implications and opportunities presented by these challenges, such as foveation and specialization, and conclude with a deep dive into how rendering technology, such as ray tracing, can evolve to solve the realities of virtual reality.
  • 5:00 PM - 5:25 PM
    Booth # 801, West Hall
    Talk
      Recent advancements in artificial intelligence are enabling designers to create better products, environments, and buildings. The process of generative design is becoming faster and easier than ever with the introduction of design tools powered by deep learning. This talk provides a brief overview of the milestones and key developments in the field of generative design, from parametric modeling to evolutionary optimization and computational design methods. Highlights include recent advancements in design tools using deep learning in the media and entertainment, manufacturing, and construction industries.
  •   A deep dive into how Octane 4, shipping with OptiX 5.1-based RTX ray tracing acceleration support, is a game changer for the future of GPU VFX and holographic media:

      - Why RTX support in Octane 4 is a major milestone in scaling the RNDR blockchain towards holographic render jobs in 2019 and beyond.

      - Detailed results, comparisons, and implications of OTOY's RNDR SDK and OctaneBench 2019 speed gains across three new RTX-optimized backends: OptiX, DXR, and Vulkan.

      - Octane and RNDR 2019 roadmap, with significant performance increases making affordable holographic rendering per RNDR node a reality.

WEDNESDAY 8/15

  • 9:00 AM - 12:15 PM
    Real-Time Ray Tracing
    Chris Wyman, Principal Research Scientist, NVIDIA [ View Recording ]
    Morgan McGuire, Distinguished Research Scientist, NVIDIA [ View Recording ]
    François Antoine, Director, Embedded Systems and HMI, Epic Games [ View Recording ]
    Jacopo Pantaleoni, Senior Research Scientist, NVIDIA [ View Recording ]
    Colin Barré-Brisebois, Senior Software Engineer, SEED [ View Recording ]
    Sponsored Room 220
    Course
      Researchers and engineers from NVIDIA, joined by leading game studios Epic Games and EA/SEED, will present state-of-the-art techniques for ray tracing, sampling, and reconstruction in real time, including recent advances that promise to dramatically advance the state of ray tracing in games, simulation, and VR applications.
  • 9:00 AM - 12:15 PM
    Booth # 801, West Hall
    Course
      In this session, Edward Liu from NVIDIA will demonstrate how state-of-the-art denoising technologies provided in the GameWorks Ray Tracing module make one-sample-per-pixel ray tracing practical in many real-time rendering scenarios, including area-light shadows, ambient occlusion, glossy reflections, and even indirect diffuse global illumination. Edward will show that one-sample-per-pixel ray tracing with denoising can achieve much-improved realism and fidelity compared with traditional real-time rendering techniques.
  • 10:00 AM - 5:30 PM
    Steerable Application-Adaptive Near-Eye Displays
    Kishore Rathinavel, Praneeth Chakravarthula (University of North Carolina and NVIDIA), Kaan Akşit (NVIDIA), Josef Spjut (NVIDIA), Ben Boudaoud (NVIDIA), David Luebke (NVIDIA), Turner Whitted (University of North Carolina and NVIDIA), Henry Fuchs (University of North Carolina)
    West Building, Exhibit Hall A
    Emerging Technologies
      This augmented reality display uses interchangeable 3D-printed optical components to provide content-specific accommodation support and presents high-resolution imagery in a gaze-contingent manner by implementing a lens-actuation-based foveation mechanism.
  • 10:00 AM - 10:25 AM
    Avengers: Infinity War
    Matt Aitken, Visual Effects Supervisor, Weta Digital
    Booth # 801, West Hall
    Talk
      Visual Effects Supervisor Matt Aitken will present Weta Digital's work on Avengers: Infinity War, featuring Planet Titan and the intense showdown with Thanos that takes place there. Matt will discuss the extensive CG environment build as well as the many special effects elements that went into the meteor shower, the crashed Q-Ship, and the impressive magic battle between Thanos and Dr. Strange. Matt will also demonstrate how they created a compelling performance for Thanos. Weta built their Thanos puppet with hundreds of muscle-based blend-shape controls to give their animators ultimate control over the performance. They also developed a new step in their facial pipeline to bring out even more precise detail from the performance data. Matt will highlight how Weta Digital used GPU technology from NVIDIA to enable tools that gave their artists greater control over the creative process.
  • 10:30 AM - 10:55 AM
    Booth # 801, West Hall
    Talk
      Ever-increasing ray tracing performance will have profound effects on the VFX and feature film industry. In this talk, we describe our ongoing development of RenderMan XPU and peek at the future, constructing an estimate of the impact on overall production processes, asset-packaging methodology, and the software stack.
  • 11:00 AM - 11:25 AM
    Booth # 801, West Hall
    Talk
      Learn how to optimize D3D12 application performance using NVIDIA Nsight™ Systems. See how NVIDIA experts profile with Nsight Systems to significantly improve performance. Nsight Systems transforms the profiling experience from a "black box" to a "white box," letting the developer see what's happening at the system, API, and GPU execution level like never before.
  • 11:30 AM - 11:55 AM
    Booth # 801, West Hall
    Talk
      We'll present how we added GPU acceleration to Arnold and our goal of maintaining a single, feature-compatible renderer capable of running on both architectures. We'll describe how we leveraged NVIDIA® OptiX™ and its JIT compiler to share the codebase between the two architectures and how production rendering features can be mapped to the OptiX programming model. We'll also discuss how well brute-force physically based rendering techniques perform on the GPU and showcase recent results.
  • 12:00 PM - 12:25 PM
    Around the World in Location-Based Entertainment
    Joanna Popper, Global Head of Location-Based Entertainment for Virtual Reality, HP [ View Recording ] [ View PDF ]
    Booth # 801, West Hall
    Talk
      Location-based entertainment is a hot trend in VR right now. What is causing this focus? What are some of the most exciting projects driving the industry? How does the industry vary globally? Join HP's global head of location-based entertainment, Joanna Popper, as she discusses these trends and where the industry is heading.
  • 12:30 PM - 12:55 PM
    Booth # 801, West Hall
    Talk
      From interactive to real time, discover how GPU acceleration boosts ray-tracing performance in Clarisse iFX. Clarisse iFX is the world's first 3D digital content creation (DCC) tool featuring a fully ray-traced CPU 3D viewport, which has enabled CG artists to create amazing visual effects (VFX) on over 50 Hollywood blockbusters. Thanks to the NVIDIA® OptiX™ ray tracing engine, the Clarisse iFX 3D viewport can now benefit directly from GPU power, letting users manipulate lifelike environments in real time while displaying noise-free lighting scenarios.
  • 1:00 PM - 1:25 PM
    Booth # 801, West Hall
    Talk
      Come learn about the latest advances in GPU acceleration for the Academy Award-winning V-Ray renderer and how it's improving artistic workflows and speeding up final-frame rendering.
  • 2:00 PM - 2:25 PM
    Booth # 801, West Hall
    Talk
      Dig deep to get the most out of Unreal Engine on NVIDIA GPUs. Hear from David Witters on how he and his team designed and developed the crowdsourced project Mars Home Planet. You'll learn about the design process: how the team managed a heterogeneous set of 3D models created around the world in a variety of digital content creation tools, and how those models were optimized, developed, brought together, and showcased in a classic environment design, with emphasis on techniques in layout, composition, and lighting that push the boundaries of what is possible in virtual reality.
  • 2:00 PM - 5:00 PM
    Deep Learning for Real-Time Rendering
    Aaron Lefohn, Senior Director, Graphics Research, NVIDIA [ View Recording ]
    Alexandr Kuznetsov, Ph.D. Student, University of California, San Diego [ View Recording ]
    Jaakko Lehtinen, Senior Research Scientist, NVIDIA [ View Recording ]
    Adrian Tsai, Senior Software Engineer, Microsoft [ View Recording ]
    Don Brittain, Principal Engineer, NVIDIA [ View Recording ]
    Sponsored Room 220
    Course
      NVIDIA and partners will delve into the latest research for real-time inference, including the use of cuDNN, NVIDIA® TensorRT™, and Windows ML; enhancing rasterized and ray-traced scenes with deep learning networks; and tightly integrating deep learning into rendering engines.
  • 2:30 PM - 2:55 PM
    Booth # 801, West Hall
    Talk
      Immersive spaces and the use of virtual, mixed, and augmented realities have brought about a wealth of opportunity in the entertainment, retail and consumer, architectural and construction, and design spaces. At conferences like SIGGRAPH, much of the attention focuses on the glamour of Hollywood, the glitz of design studios, and the games that we play at home or in arcades. In this talk, we'll focus on the slightly less alluring but nonetheless challenging environment of workaday extended realities. From the virtual styling studio to the virtual showroom, virtual reality informs the decisions made about how real products are ultimately produced and maintained. Graphics processing technology touches every aspect of your life, even if you've never donned a head-mounted display (HMD) or viewed a stereoscopic 3D movie yourself.
  • 3:00 PM - 3:25 PM
    Booth # 801, West Hall
    Talk
      In this hands-on, live demo, we'll show how NVIDIA Nsight™ Compute can be used to profile applications built with NVIDIA® OptiX™. We'll identify performance bottlenecks in several OptiX applications and identify the key differences between vanilla NVIDIA CUDA® programs and OptiX applications from a profiling perspective. We'll also demonstrate how to customize Nsight Compute to extract and present profiling information in the way that is most suitable for a given OptiX application. This talk will contain almost no slides and will instead focus on live usage of the tools involved.
  • 3:30 PM - 3:55 PM
    Production-Quality, Final-Frame Rendering on a GPU
    Robert Slater, VP of Engineering, Redshift [ View Recording ] [ View PDF ]
    Booth # 801, West Hall
    Talk
      We'll discuss the latest features of Redshift, the GPU-accelerated renderer running on NVIDIA GPUs that is redefining the industry's perception of GPU final-frame rendering. We'll demonstrate a few examples of customer work. This talk will be of interest to industry professionals who want to learn more about GPU-accelerated, production-quality rendering, as well as software developers interested in GPU-accelerated rendering.
  • 4:00 PM - 4:25 PM
    The Future of GPU Crash Debugging
    Jeff Kiel, Senior Manager, Graphics Tools, NVIDIA [ View Recording ]
    Booth # 801, West Hall
    Talk
      GPU crashes and hangs are a major frustration point for developers, who can waste countless hours in deeply frustrating debugging sessions. Until now, few tools have existed to help with this process. Aftermath 2.0 is the next evolution of NVIDIA's post-mortem analysis tools and adds a tremendous amount of information to help you find and resolve those difficult-to-solve crashes and hangs. In this session, we'll dive deep into how this tool works and how you can incorporate it into your own workflows.
  • 4:30 PM - 4:55 PM
    Booth # 801, West Hall
    Talk
      The growth in housing density in cities like London and New York has resulted in higher demand for efficient, smaller apartments. These designs challenge the use of space and function while trying to ensure that occupants perceive a larger space than is provided. Designing these spaces has always been the responsibility of a handful of designers using 2D and 3D static platforms as part of the overall building design and evaluation, typically constrained by a prescriptive program and functional requirements. A combination of human- and AI-based agents creating and testing these spaces through design and virtual immersive environments (NVIDIA Holodeck) will attempt to ensure the final results are efficient and the best fit for human occupancy prior to construction.
  • 5:00 PM - 5:25 PM
    Booth # 801, West Hall
    Talk
      In this session, Edward Liu from NVIDIA will demonstrate how state-of-the-art denoising technologies provided in the GameWorks Ray Tracing module make one-sample-per-pixel ray tracing practical in many real-time rendering scenarios, including area-light shadows, ambient occlusion, glossy reflections, and even indirect diffuse global illumination. Edward will show that one-sample-per-pixel ray tracing with denoising can achieve much-improved realism and fidelity compared with traditional real-time rendering techniques.
  • 5:30 PM - 6:00 PM
    Booth # 801, West Hall
    Talk
      NVIDIA Holodeck™ is NVIDIA's advanced virtual reality (VR) platform. In this tutorial, we'll cover all of the major features of Holodeck, including the latest navigation and file-import capabilities and how to build custom experiences. This session will also include demonstrations from partners who are using Holodeck to accelerate and enhance their workflows.

THURSDAY 8/16

  • 9:00 AM - 10:30 AM
    Slang: Language Mechanisms for Extensible Real-Time Shading Systems
    Yong He (Carnegie Mellon University), Tim Foley (NVIDIA), Kayvon Fatahalian (Stanford University)
    West Building, Room 109-110
    SIGGRAPH Technical Paper
      Designers of real-time rendering engines must balance the conflicting goals of maintaining clear, extensible shading systems and achieving high rendering performance. To achieve these goals, engine architects have typically developed engine-specific code synthesis tools, ranging from preprocessor hacking to domain-specific shading languages. The problem is that proprietary tools add significant complexity to modern engines and create additional challenges for learning and adoption. We argue that the advantages of engine-specific code-generation tools can be achieved using the underlying GPU shading language directly, provided the shading language is extended with a small number of mechanisms from modern general-purpose programming languages. We show that adding generics with interface bounds, associated types, and interface/structure extensions to an existing C-like GPU shading language enables real-time rendering engines to build shading systems that are extensible, maintainable, and execute efficiently on modern GPUs without the need for additional domain-specific tools.
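      To illustrate the kind of mechanism the abstract names, here is a hedged C++20 analogue (not Slang syntax) of "generics with interface bounds": a shading entry point parameterized over any material type satisfying a declared interface, which the compiler can specialize per concrete material, much as a shading compiler would. All names here are invented for illustration.

      ```cpp
      // C++20 analogue of interface-bounded generics; names are illustrative.
      #include <concepts>

      struct SurfaceGeometry { float nx, ny, nz; };

      // The "interface bound": any material must provide evalBRDF().
      template <typename M>
      concept IMaterial = requires(const M& m, const SurfaceGeometry& g) {
          { m.evalBRDF(g) } -> std::convertible_to<float>;
      };

      // Generic shading entry point, bounded by the IMaterial interface;
      // each concrete material gets its own specialized code.
      template <IMaterial M>
      float shade(const M& material, const SurfaceGeometry& g) {
          return material.evalBRDF(g);
      }

      struct Lambert {
          float albedo;
          float evalBRDF(const SurfaceGeometry&) const { return albedo / 3.14159265f; }
      };

      int main() {
          SurfaceGeometry g{0.0f, 0.0f, 1.0f};
          return shade(Lambert{0.5f}, g) > 0.0f ? 0 : 1;
      }
      ```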
  • 9:00 AM - 12:15 PM
    Deep Learning for Content Creation
    Deqing Sun, Senior Research Scientist, NVIDIA [ View Recording ]
    Doug Roble, Director of Software R&D, Digital Domain [ View Recording ]
    Gavriel State, Senior Director, System Software, NVIDIA [ View Recording ]
    Bryan Catanzaro, VP, Applied Deep Learning Research, NVIDIA [ View Recording ]
    Jaakko Lehtinen, Senior Research Scientist, NVIDIA [ View Recording ]
    Sponsored Room 220
    Course
      Join NVIDIA's top researchers, including vice president of applied deep learning research Bryan Catanzaro, for an examination of the novel ways deep learning and machine learning can supercharge content creation. Speakers will cover pipelines and aspects of content creation for films, games, and advertisements.
  • 9:30 AM - 9:55 AM
    Applying Deep Learning to Creative Workflows
    Thomas True, Senior Applied Engineer, Professional Video and Image Processing, NVIDIA [ View Recording ] [ View PDF ]
    Booth # 801, West Hall
    Talk
      Deep learning has the potential to revolutionize the creative process in many ways, whether it's applying a "style" to images and video, drastically improving denoising, producing incredibly realistic in-painting, or increasing resolution beyond what was previously possible. This session will describe how deep learning can be easily integrated into creative applications and demonstrate several real-world examples, such as in-painting, style transfer, and super-resolution.
  • 10:00 AM - 3:30 PM
    Steerable Application-Adaptive Near-Eye Displays
    Kishore Rathinavel, Praneeth Chakravarthula (University of North Carolina and NVIDIA), Kaan Akşit (NVIDIA), Josef Spjut (NVIDIA), Ben Boudaoud (NVIDIA), David Luebke (NVIDIA), Turner Whitted (University of North Carolina and NVIDIA), Henry Fuchs (University of North Carolina)
    West Building, Exhibit Hall A
    Emerging Technologies
      This augmented reality display uses interchangeable 3D-printed optical components to provide content-specific accommodation support and presents high-resolution imagery in a gaze-contingent manner by implementing a lens-actuation-based foveation mechanism.
  • 10:00 AM - 10:25 AM
    GPU-Accelerated OptiX Ray Tracing for Scientific Visualization
    John Stone, Senior Research Programmer, University of Illinois at Urbana-Champaign [ View Recording ] [ View PDF ]
    Booth # 801, West Hall
    Talk
      Advances in NVIDIA OptiX™ and NVIDIA RTX™ ray tracing bring tremendous performance improvements for rendering scenes from technical and scientific visualization workloads. New OptiX acceleration structures enable rendering of scenes with much greater geometric complexity on a single GPU, making it possible for scientists to interactively render high-fidelity visualizations of complex scenes. This talk will describe results from adapting visual molecular dynamics (VMD), a widely used molecular visualization tool with over 100,000 users, to exploit the latest capabilities of OptiX.
  • 10:30 AM - 10:55 AM
    Booth # 801, West Hall
    Talk
      A new feature of the upcoming OptiX SDK leverages AI to determine the quality of an image. The deep neural network was trained on pairs of ray-traced images and their structural similarity index (SSIM) values. The network takes a noisy image as input and outputs the SSIM value for that image, which we use as a quality measure. In this talk, we'll show you the API for this new feature and suggest how it can be used for adaptive rendering.
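      As a sketch of how such a learned quality estimate could drive adaptive rendering, here is a small C++ illustration; the render and predictor callbacks are invented stand-ins, not the shipping OptiX API.

      ```cpp
      // Hypothetical sketch; the callbacks are invented stand-ins, not the OptiX API.
      #include <functional>

      void renderToQuality(const std::function<void()>& renderOnePass,
                           const std::function<float()>& predictSSIM, // learned estimate
                           float targetSSIM, int maxPasses) {
          for (int pass = 0; pass < maxPasses; ++pass) {
              renderOnePass();                  // accumulate one more progressive pass
              if (predictSSIM() >= targetSSIM)  // stop once predicted quality suffices
                  break;
          }
      }
      ```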
  • 10:45 AM - 12:15 PM
    Fast Path Space Filtering by Jittered Spatial Hashing
    Nikolaus Binder (NVIDIA), Sascha Fricke (University of Braunschweig), Alexander Keller (NVIDIA)
    East Building, Ballroom A
    SIGGRAPH Talk
      Restricting path tracing to a small number of paths per pixel for performance reasons rarely achieves satisfactory image quality for scenes of interest. However, path space filtering may dramatically improve visual quality by sharing information across vertices of paths classified as "nearby." While contributions can be filtered in path space and beyond the first intersection, searching for "nearby" paths is more expensive than filtering in screen space. We greatly reduce this performance penalty by storing and looking up the required information in a hash map, using hash keys constructed from jittered and quantized information, such that a single query may replace costly neighborhood searches.
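      To make the key construction concrete, here is a minimal sketch (my illustration, not the authors' code) of hashing a jittered, quantized world-space position so that nearby path vertices map to the same hash-map cell; the prime-multiplier mixing follows a common spatial-hashing scheme.

      ```cpp
      #include <cmath>
      #include <cstdint>

      // Minimal illustration: nearby path vertices land in the same cell after
      // jittered quantization, so one hash lookup replaces a costly
      // neighborhood search.
      uint32_t pathSpaceHashKey(float x, float y, float z,
                                float cellSize,                  // filter radius in world space
                                float jx, float jy, float jz) {  // per-sample jitter
          // Jittering before quantization hides the cell boundaries; the
          // discretization artifacts average out over many samples.
          int32_t ix = (int32_t)std::floor((x + jx) / cellSize);
          int32_t iy = (int32_t)std::floor((y + jy) / cellSize);
          int32_t iz = (int32_t)std::floor((z + jz) / cellSize);
          // Mix the integer cell coordinates with large primes (a common
          // spatial-hashing scheme).
          return ((uint32_t)ix * 73856093u) ^
                 ((uint32_t)iy * 19349663u) ^
                 ((uint32_t)iz * 83492791u);
      }
      ```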
  • 11:00 AM - 11:25 AM
    Booth # 801, West Hall
    Talk
      A new feature of the upcoming OptiX SDK leverages AI to determine the quality of an image automatically and without a reference image. A deep neural network was trained on pairs of ray-traced images and their structural similarity index (SSIM) values. The network takes a noisy image as input, then outputs the SSIM value for that intermediate image. In this talk, we'll show you the API for this new feature and suggest how it could also be used for adaptive rendering.
  • 12:00 PM - 12:15 PM
    How VR is Reinventing Workflows
    Allen Buckner, Director of VR Market Development, HP [ View Recording ] [ View PDF ]
    Booth # 801, West Hall
    Talk
      Learn how VR is transforming professional workflows across architecture, engineering, healthcare, and entertainment.
  • 1:00 PM - 1:25 PM
    Booth # 801, West Hall
    Talk
      Real-time games have an extremely small budget for the computations of each frame. Learn the right way to approach real-time performance with inference workloads, taking advantage of the newest technologies available.
  • 1:30 PM - 1:55 PM
    Mars Home Planet: From Concept to VR or Mars Home Planet: A Year in Review
    Barbara Marshall, WW Segment Manager, Media & Entertainment, HP Z Workstations, HP [ View Recording ]
    Booth # 801, West Hall
    Talk
      HP, NVIDIA, and Technicolor, along with architects, engineers, artists, and students from around the world, united to create an urban area for one million humans on Mars. Learn how a community of 87,205 worked together to conceptualize, model, and render a future city. This session will delve into the three challenges, review a selection of the nearly 1,000 submissions received, and highlight the Rendering Challenge winners announced on Tuesday morning. Be sure to stop by the StudioXperience and take part in the ultimate Mars VR Experience created by Technicolor.
  • 2:00 PM - 5:15 PM
    Machine Learning and Rendering
    Alexander Keller (NVIDIA), Jaroslav Křivánek (Charles University, Prague), Jan Novák (Disney Research, Zürich), Anton Kaplanyan (Oculus Research), Marco Salvi (NVIDIA)
    East Building, Ballroom BC
    SIGGRAPH Course
      Machine learning techniques have recently enabled dramatic improvements in both real-time and offline rendering. In this course, we'll introduce the basic principles of machine learning and review their relationship to rendering. Besides fundamental facts like the mathematical identity of reinforcement learning and the rendering equation, we cover efficient and surprisingly elegant solutions to light transport simulation, participating media, and noise removal.
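      The "mathematical identity" mentioned above can be sketched concretely. Under the standard formulations (my hedged summary, not the course notes), the rendering equation and the fixed point of expected-value Q-learning share the same structure:

      ```latex
      % Rendering equation: radiance leaving x in direction \omega
      L(x,\omega) = L_e(x,\omega)
                  + \int_{\Omega} f_r(\omega_i, x, \omega)\,\cos\theta_i\,
                    L\big(h(x,\omega_i), -\omega_i\big)\, d\omega_i
      % Q-learning fixed point: value of taking action a in state s
      Q(s,a) = r(s,a) + \gamma \int_{A} \pi(a' \mid s')\, Q(s',a')\, da'
      % Identifying position x with state s, direction \omega with action a,
      % emission L_e with reward r, and the BSDF-cosine term with the
      % (discounted) transition kernel makes the two equations coincide.
      ```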
  • 2:00 PM - 2:25 PM
    Booth # 801, West Hall
    Talk
      An existing challenge of many virtual reality (VR) headsets is the cumbersome multi-connector cables and/or adapters that these devices require. This challenge often sets the use of head-mounted displays (HMDs) at odds with consumers' desire for portability and simple connectivity. Just as important, these cables are often incompatible with the needs of the rapidly growing notebook VR gaming market. The implementation of a single-connector USB Type-C interface is an advancement that can increase the use of HMDs by making them more mobile and less complex to set up. This talk describes how VirtualLink, an alternate mode of USB-C optimized for VR, enables the next generation of VR headsets.