NVISION 08
Monday, August 25th, 2008

Buzz Aldrin & Eileen Collins

Astronauts Buzz Aldrin and Eileen Collins were on hand tonight to meet, greet, and take photos with NVISION attendees.

Buzz, as all the world knows, was the second man to walk on the moon. A veteran of the Gemini and Apollo space programs, he was also on hand to introduce a special showing of the film Fly Me to the Moon, the first film created from the outset in stereo 3D. Buzz provides his own voice for the film.

Eileen Collins is another space pioneer, the first woman to pilot and command the space shuttle. Eileen will also be part of tomorrow’s keynote where she will talk about how visual computing is shaping the future of the space program.

3D Stereo Gaming Demo

Just off the GeForce LAN area, NVIDIA is showing off its new 3D stereo gaming technology. Using 3D shutter glasses, a GeForce GPU, and a compatible LCD monitor, gamers can experience true depth perception in their favorite games. Over 350 existing DirectX 9 and 10 games are supported right out of the box, with no special patches or modifications required.
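
For the curious, stereo rendering of this kind generally works by drawing each frame twice, once per eye, with the camera shifted sideways by half the eye separation and the view frustum skewed so that both eyes converge on the screen plane. The sketch below is only a generic illustration of that off-axis setup, not NVIDIA's driver implementation; the function name and parameter values are assumptions made for the example.

    // Minimal sketch of off-axis ("parallel axis, asymmetric frustum") stereo setup.
    // All names and values here are illustrative assumptions, not NVIDIA's API.
    #include <cmath>
    #include <cstdio>

    struct EyeSetup {
        double left, right, bottom, top;  // near-plane frustum bounds (e.g. for glFrustum)
        double lateralOffset;             // shift of the camera along its right vector
    };

    // fovY in radians; convergence = distance to the plane of zero parallax (the screen).
    EyeSetup stereoEye(bool leftEye, double fovY, double aspect,
                       double zNear, double convergence, double eyeSeparation)
    {
        const double top   = zNear * std::tan(fovY / 2.0);
        const double halfW = aspect * std::tan(fovY / 2.0) * convergence; // half screen width at convergence
        const double inner = halfW - eyeSeparation / 2.0;
        const double outer = halfW + eyeSeparation / 2.0;
        const double scale = zNear / convergence;

        EyeSetup e;
        e.bottom = -top;
        e.top    =  top;
        if (leftEye) {
            e.left  = -inner * scale;
            e.right =  outer * scale;
            e.lateralOffset = -eyeSeparation / 2.0;   // camera nudged to the left
        } else {
            e.left  = -outer * scale;
            e.right =  inner * scale;
            e.lateralOffset =  eyeSeparation / 2.0;   // camera nudged to the right
        }
        return e;
    }

    int main() {
        // Render the scene twice per frame, once per eye; the shutter glasses alternate
        // which eye sees which image in sync with the display.
        for (int eye = 0; eye < 2; ++eye) {
            EyeSetup e = stereoEye(eye == 0, 1.0 /* ~57 deg */, 16.0 / 9.0,
                                   0.1 /* near */, 5.0 /* convergence */, 0.065 /* eye sep, m */);
            std::printf("%s eye: frustum L=%.4f R=%.4f B=%.4f T=%.4f, camera offset %+.3f m\n",
                        eye == 0 ? "left " : "right", e.left, e.right, e.bottom, e.top,
                        e.lateralOffset);
        }
        return 0;
    }

In NVIDIA's system the driver handles the equivalent per-eye work transparently, which is how existing titles can be supported without patches.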

Tricia Helfer Meet & Greet

Actress Tricia Helfer, Cylon Number Six on the SciFi Channel's hit show Battlestar Galactica, was on hand at the Center for Performing Arts to meet fans of the show and sign autographs. Over 200 fans queued up to meet the actress.

Opening Keynote

After a welcome by San Jose Mayor Chuck Reed, Jen-Hsun Huang and a number of special guests kicked off NVISION 08 with a series of demos of cutting-edge applications and technology from across the spectrum of visual computing, giving attendees a taste of what to expect from the rest of the show.

Fundamental to the visual computing revolution is the GPU, the basic technology that is driving the entire industry. An example of the extraordinary discontinuity that the GPU is introducing into the world of high-performance computing is Stanford University's Folding@home program. Folding@home is a distributed computing app that uses idle computer cycles to simulate protein folding and misfolding. The Stanford researchers hope that a better understanding of how proteins fold, and misfold, will help us conquer diseases like Alzheimer's and many types of cancer.

Currently Folding@home uses about 2.6 million PCs, comprising 288 teraflops of processing power. In the few months since the CUDA-accelerated version of the program was released for GeForce GPUs, some 24,300 GPUs have been applied to the problem. But these relatively few processors, less than 1% of the total, provide 1.4 petaflops, nearly five times the processing power of all the CPUs in use by Folding@home. The researchers at Stanford are confident that the use of GPUs in the application will significantly shorten the time to discovery of cures for many diseases.
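
Taking the quoted figures at face value, the arithmetic checks out:

    \[
      \frac{1.4\ \text{PFLOPS}}{288\ \text{TFLOPS}}
        = \frac{1400\ \text{TFLOPS}}{288\ \text{TFLOPS}} \approx 4.9,
      \qquad
      \frac{24{,}300\ \text{GPUs}}{2{,}600{,}000\ \text{PCs}} \approx 0.9\%
    \]

so under one percent of the machines are delivering nearly five times the aggregate throughput.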

Switching gears, Jen-Hsun and Peter Stevenson of Realtime Technologies (RTT) showed how visual computing is being applied to automotive design and styling with a demo of real-time raytracing. Stevenson showed a digital prototype of a new Lamborghini model and discussed how the ability to rapidly prototype a design is critical to a responsive design process. This technology debuted in automotive and aerospace design, but is now filtering down to all types of products, from cell phones to sneakers.

From the world of industrial design, Jen-Hsun took us into the virtual worlds of massively multiplayer online (MMO) games. He was joined on stage by Taehoon Kim, co-founder of Nurien Software. Nurien is introducing a new MMO game with stunning graphics, featuring some amazing cloth and hair simulations, and this next-generation game will start to blend MMO gaming with social networking functions. Taehoon also entertained the crowd with a 3D avatar of Jen-Hsun that danced and interacted with other avatars.

From online worlds, the keynote turned to the world of sports, in particular the visual effects that enhance television broadcasts of events like the Beijing Olympics. Marv White of Sportvision gave several demos of technologies that are changing the way we watch sports, from how the line of scrimmage is projected onto a football field to how computational graphics are used to show the effects of drafting in a NASCAR race. For the latter, Sportvision uses the GPU to calculate the fluid dynamics of the air surrounding the race cars.

But visual computing is not just for professional broadcasters. The last five years have seen a revolution in photography as the medium has moved from film to digital. Consumer photo apps are on the verge of being able to combine multiple exposures into a single high dynamic range image, overcoming the limited contrast range of a single shot, and even to literally refocus an image after it has been taken. One such app is Photosynth, recently released by Microsoft Live Labs.
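
The multi-exposure trick is conceptually simple: each bracketed shot is mapped back to relative scene brightness by dividing out its exposure time, and the estimates are blended with weights that favor well-exposed pixels. The toy sketch below shows that merge for a single pixel under assumed simplifications (a linear sensor response and perfectly aligned frames); real consumer apps also calibrate the camera's response curve and align the images first.

    // Toy sketch of merging bracketed exposures into one HDR radiance estimate.
    // Assumes a linear sensor response; names and sample values are illustrative only.
    #include <cmath>
    #include <cstdio>
    #include <vector>

    struct Exposure {
        double pixel;     // observed pixel value in [0, 1] at one pixel location
        double shutter;   // exposure time in seconds
    };

    // Hat-shaped weight: trust mid-tones, distrust pixels near black or white clipping.
    double weight(double v) {
        return 1.0 - std::abs(2.0 * v - 1.0);
    }

    // Weighted average of per-frame radiance estimates (pixel / shutter).
    double mergeHDR(const std::vector<Exposure>& bracket) {
        double num = 0.0, den = 0.0;
        for (const Exposure& e : bracket) {
            double w = weight(e.pixel);
            num += w * (e.pixel / e.shutter);   // radiance estimate from this frame
            den += w;
        }
        return den > 0.0 ? num / den : 0.0;
    }

    int main() {
        // The same scene point captured at 1/200 s (dark), 1/50 s (good), 1/12 s (nearly clipped).
        std::vector<Exposure> bracket = {
            {0.06, 1.0 / 200.0},
            {0.24, 1.0 / 50.0},
            {0.95, 1.0 / 12.0},
        };
        std::printf("estimated relative radiance: %.2f\n", mergeHDR(bracket));
        return 0;
    }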

Joshua Edwards of Microsoft Live Labs gave a demo of Photosynth, which uses multiple photos of a site or object to create a 3D model and then display a 360-degree perspective of it. Edwards showed how series of photographs were combined into interactive views of Stonehenge and of the main display area of the National Archives building in Washington, DC. Photosynth can give the illusion of depth, but true dimensionalization of graphics is also possible in real time. Jen-Hsun showed the latest 3D stereoscopic gaming technology from NVIDIA (which is also on display in the GeForce LAN area). He showed 3D stereo clips from NVIDIA's new Medusa demo as well as an existing game, Age of Empires, that was effortlessly converted to 3D stereo with a GeForce GPU.
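
Photosynth-style reconstruction, incidentally, comes down to triangulation once the camera positions are known: a point seen in two photos is located by intersecting the two viewing rays. The sketch below is only the textbook midpoint method with invented camera poses, not Photosynth's pipeline, which also has to estimate those poses and match thousands of image features automatically.

    // Midpoint triangulation: recover a 3D point from two viewing rays, one per photo.
    // Camera positions and ray directions are made up for illustration; a real
    // structure-from-motion pipeline estimates them from matched image features.
    #include <cstdio>

    struct Vec3 { double x, y, z; };

    Vec3 sub(Vec3 a, Vec3 b) { return {a.x - b.x, a.y - b.y, a.z - b.z}; }
    Vec3 add(Vec3 a, Vec3 b) { return {a.x + b.x, a.y + b.y, a.z + b.z}; }
    Vec3 mul(Vec3 a, double s) { return {a.x * s, a.y * s, a.z * s}; }
    double dot(Vec3 a, Vec3 b) { return a.x * b.x + a.y * b.y + a.z * b.z; }

    // Closest point to the two (possibly skew) rays p1 + t*d1 and p2 + s*d2.
    Vec3 triangulateMidpoint(Vec3 p1, Vec3 d1, Vec3 p2, Vec3 d2) {
        Vec3 r = sub(p1, p2);
        double a = dot(d1, d1), b = dot(d1, d2), c = dot(d2, d2);
        double d = dot(d1, r),  e = dot(d2, r);
        double denom = a * c - b * b;         // approaches 0 if the rays are parallel
        double t = (b * e - c * d) / denom;
        double s = (a * e - b * d) / denom;
        Vec3 q1 = add(p1, mul(d1, t));        // closest point on ray 1
        Vec3 q2 = add(p2, mul(d2, s));        // closest point on ray 2
        return mul(add(q1, q2), 0.5);         // midpoint between them
    }

    int main() {
        // Two cameras a couple of meters apart, both looking toward a point near (0, 0, 10).
        Vec3 c1 = {-1.0, 0.0, 0.0}, dir1 = { 1.0, 0.0, 10.0};
        Vec3 c2 = { 1.0, 0.0, 0.0}, dir2 = {-1.0, 0.0, 10.0};
        Vec3 p = triangulateMidpoint(c1, dir1, c2, dir2);
        std::printf("triangulated point: (%.2f, %.2f, %.2f)\n", p.x, p.y, p.z);
        return 0;
    }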

Jeff Han of Perceptive Pixel was up next, giving a demo of his multitouch user interface technology. Graphics is fundamental to all UI work, but Perceptive Pixel takes this to new heights. Using a 100-inch multitouch display, Jeff gave the audience a taste of what the UI of the future might look like. The current bottleneck is the input device. The mouse and cursor are too limiting, but with multitouch technology truly amazing interactions are possible. Initial applications are likely to be military and high-end design, but the technology should rapidly filter into enterprise and home computing.

Finally, Jen-Hsun was joined on stage by Tricia Helfer, Cylon #6 on TV's Battlestar Galactica. Tricia talked about the challenges of acting with virtual characters and took the crowd through the steps of blocking and filming some of the scenes from the hit Sci-Fi Channel series.

So, from ever more realistic graphics, to the massive computational power of GPUs, to enhancements of our entertainment media, to new ways of interacting both with this world of graphics and with each other, the opening keynote gave attendees a taste of what can be seen during the rest of NVISION.

 

 

Be sure to head on over to the official NVISION Blog, and remember to check back here often as we continue to add to the list of the day's highlights.
