Quadro blog feed, created by mixing existing feeds via RSSMix (http://www.rssmix.com/).

Dell, HP, Lenovo Ship Workstations with Next-Gen Quadro GPUs

Announced just a few weeks ago, our next-gen Quadro GPUs are now available in the latest professional workstations from Dell, HP and Lenovo, the world’s largest workstation suppliers.

It’s rare that both we and our partners come out with new products at more or less the same time. So it’s worth checking out the huge performance increases they can deliver.

We’ve been talking to our customers in industries like manufacturing design, media & entertainment, oil & gas and medical imaging to understand how their users interact, create and collaborate. Consistently, we hear that their data sets are getting larger and more complex and they need to be able to connect to cloud-based resources and mobile devices.

So we ensured our new GPUs – the Quadro K5200, K4200, K2200, K620 and K420, along with our legacy K6000 – let users interact with and view data sets up to twice as large as previous generations could handle, and act as the center of visual computing workflows.

The new Quadro-powered workstations from Dell, HP and Lenovo push the limits of what is possible, so users can produce their most creative and inspired work.

These powerful new systems are described further on our Quadro page.

Each is built using enterprise-grade components and tested and tuned with NVIDIA Quadro graphics to run the most demanding professional applications.

Sandeep Gupte Mon, 06 Oct 2014 21:16:02 +0100
How NVIDIA Quadro Powered Gone Girl’s Pioneering 6K Video

Readers fell in love with the twists in Gillian Flynn’s best-selling novel Gone Girl. David Fincher’s film version, opening this weekend, could be just as popular.

To bring to life this complex tale of a marriage gone terribly wrong, Fincher built an innovative production pipeline powered by NVIDIA’s next-gen Quadro GPUs.

Gone Girl is the first movie shot entirely in 6K – giving it the most advanced technical workflow of any feature to date.

It’s also the first fully edited in Adobe Premiere Pro CC, which extended the filmmakers’ flexibility throughout the post-production process.

The filmmakers used NVIDIA’s Quadro K5200 GPUs for all stages of production. This enabled fast conversion between formats – from 6K acquisition to 2.5K for the creative edit – for a streamlined review process for Gone Girl’s shoot, editorial and visual effects (VFX) teams on custom-designed HP Z820 workstations.

Quadro enabled transcoding of 6K footage to digital picture exchange (DPX), a common film industry format, 25x faster than on the CPU. This was a huge time-saving benefit that accelerated the project workflow for delivery to VFX, allowing for more iterations.
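To make a 25x speedup concrete, here is a back-of-the-envelope sketch in Python. The 100-minute CPU baseline is an assumption for illustration only, not a figure from the production; only the 25x ratio comes from the post.

```python
# Illustrative effect of a 25x GPU transcode speedup on iteration time.
# The CPU-only baseline below is assumed for illustration, not measured.
cpu_minutes_per_reel = 100.0     # hypothetical CPU-only transcode time
speedup = 25.0                   # GPU vs. CPU, per the Gone Girl workflow

gpu_minutes_per_reel = cpu_minutes_per_reel / speedup
print(f"GPU transcode: {gpu_minutes_per_reel:.0f} minutes per reel")

# In the time of a single CPU pass, the team could run this many GPU passes:
iterations = int(cpu_minutes_per_reel // gpu_minutes_per_reel)
print(f"Iterations in one CPU pass: {iterations}")
```

The practical point is the last line: the speedup turns one review cycle into many, which is what "allowing for more iterations" means in workflow terms.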

It also delivered near-zero latency during playback and real-time repositioning and stabilization. This gave the filmmakers room for more experimentation when composing final shots.

Gone Girl NVIDIA Workflow

The film, captured on a RED Dragon camera by DP Jeff Cronenweth, also benefitted from Quadro’s GPU debayering support in Premiere Pro CC. This eliminated the need for extra RED ROCKET hardware.

It also helped the post-production team distribute footage across many VFX workstations rather than one central system. 

To learn more, read our case study.


Greg Estes Fri, 03 Oct 2014 14:30:54 +0100
How Automakers Borrow Ideas from Moviemakers to Build Safer Cars

Automakers have long used GPUs to build better-looking cars. Now, Honda’s using them to create safer ones.

It’s an important problem. Over 30,000 people die each year in U.S. car crashes. 1.2 million die worldwide.

That’s why Honda R&D Americas has started using NVIDIA Quadro GPUs in a new way. They’re creating computer-generated footage of simulated car crashes, giving engineers deeper insights into structural behavior.


To do that, they’re harnessing the same technology used to render computer-generated movie effects.

Of course, it’s not news that carmakers use NVIDIA GPUs just about everywhere.

Designers use Quadro-powered workstations to create and visualize cars. GPUs speed up the 3D computer-aided design (CAD) tools used to design automotive assemblies. And GPUs power infotainment and driver assistance systems.

What’s surprising is how automakers are taking ideas from moviemakers to help build safer cars.

Working with 3DXCITE and LS-DYNA, Honda is using our GPUs to integrate visualization with their crash-simulation software. These photo-realistic videos help designers see and understand energy dissipation behavior sooner.


That’s because as a vehicle reacts to a collision, a wave of energy flows through the vehicle. These deformation patterns are like ripples on water. They’re far easier to visualize with realistic lighting. And if engineers can see them, they know where to focus their efforts with more traditional tools.

See this in action in the video below.

And learn more here.

So, next time you stop to appreciate the curves on the latest cars, know that NVIDIA GPUs are helping to make them not just prettier but safer.

Adam Scraba Fri, 26 Sep 2014 21:00:43 +0100
Next-Gen Quadro on Display at IBC 2014, at the Heart of Modern Media & Entertainment Workflows

Our new line-up of Quadro GPUs – the K5200, K4200, K2200 and K620 – will be on display in Europe for the first time at the IBC conference in Amsterdam, Sept. 12-16. And they’re arriving just in time.

That’s because media professionals are working with a new generation of cameras. Offering higher resolution and more flexibility, this new gear is amazing stuff.

But with better resolution comes the need for more processing power. To work with raw camera footage and retain the highest quality throughout the workflow, it’s necessary to process, or debayer, the files. That already requires a lot of horsepower. Upping the resolution from HD to 4K or even 6K makes the challenge even greater.

Increasing the pixels more than 9x over HD demands all the horsepower and memory you can get. While that used to require specialized hardware, today you can add a new NVIDIA Quadro K5200 and you’re ready to go.
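The pixel math behind that "more than 9x" claim can be checked in a few lines of Python. The 6K resolution here matches the RED Dragon sensor mentioned in the Gone Girl post; the exact frame sizes are representative figures, not taken from this article.

```python
# Rough pixel-count comparison between HD and higher camera resolutions.
# 6K matches the RED Dragon sensor; HD is standard 1920x1080.
resolutions = {
    "HD": (1920, 1080),
    "4K UHD": (3840, 2160),
    "6K": (6144, 3160),
}

hd_pixels = resolutions["HD"][0] * resolutions["HD"][1]
for name, (w, h) in resolutions.items():
    pixels = w * h
    print(f"{name}: {pixels:,} pixels ({pixels / hd_pixels:.1f}x HD)")
```

Since debayering cost scales with pixel count, a 6K frame carries roughly nine times the work of an HD frame, which is why the memory and horsepower of a card like the K5200 matter.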

5 NVIDIA VCAs used to render a raytraced image of NVIDIA Shield using Autodesk Maya and Iray for Maya in mere seconds.

With up to twice the application performance and data-handling capacity, and larger GPU memory, the next-gen Quadro series is at the center of modern workflows.

At IBC, we’ll be showing off a host of cutting-edge workflow technologies including:

  • GPU final-frame rendering with Chaos Group’s V-Ray RT 3.0 and the CG animated short film CONSTRUCT, leveraging the power of NVIDIA GPUs.
  • A prototype of Dolby Laboratories’ Enhanced Dynamic Range display. It supports 20x the brightness of standard displays. This demo features The Foundry’s NUKE STUDIO for creating startling high-range content.
  • A unique virtual camera demonstration of a filmmaker, tablet in hand, walking through a live-action set. The NVIDIA Tegra K1-powered tablet moves through Autodesk Maya scenes rendered by Chaos V-Ray RT.
  • 0x1’s Iray for Maya harnessing the power of a local workstation and a remote NVIDIA Visual Computing Appliance (VCA) to demo performance that scales linearly across appliances.
  • NVIDIA GRID vGPU technology with Adobe Creative Cloud and Autodesk 3D applications. See how you can offload graphics processing from the CPU to the GPU in virtualized environments.

At IBC, we’ll also be featuring the new HP Z840 Workstation series with the Quadro K5200 and the ultra-high-performance CADnetwork Workstation W60 using four Quadro K6000 GPUs.

IBC attendees will also be among the first to see a game-changer for the industry – the newly shipping NVIDIA VCA. It’s a scalable, network-attached GPU rendering appliance. It gets its exceptional performance from eight high-end NVIDIA GPUs. Supporting NVIDIA Iray natively, NVIDIA VCA also enables Autodesk 3ds Max through Chaos Group’s V-Ray RT.

Learn more about the tools and techniques to work smarter and faster from pre-production through post. Visit us at IBC at booth #7.J39.

Jens Neuschäfer Sat, 13 Sep 2014 19:00:47 +0100
How India’s Digital Magic Transformed Ambition Into Achievement With Quadro GPUs

Scooby Doo. The Smurfs. SpongeBob SquarePants. Every generation has its favorite iconic cartoon. We may one day get to interact with them, too, if Digital Magic keeps up its pace.

Digital Magic, of Chennai, India, is the inventor of ToonPet, a real-time 3D cartoon animation system that uses NVIDIA Quadro K5000 GPUs to create amazingly realistic, interactive characters.

BITGEN, a virtual host created by Digital Magic

At a recent industry conference in Bangalore, Digital Magic set up a wall-sized screen that lit up with a well-known mythical figure when people would approach. The figure invited guests onto a stage and then proceeded to greet them personally and engage in light conversation. The figure was so realistic and the interaction so fluid, that many wondered if it was really in the hall with them.

The ToonPet “virtual host” was convincing due to its highly refined texture, lighting, particle reflections and body movements — the wheelhouse of visual computing. The project was born from an effort to make visual production tools easy for artists to use, not exclusive to the tech-savvy. And to wring out the many inefficiencies from the typical 3D animation production pipeline.

ToonPet (which stands for “carToon pupPet”) incorporates motion capture, real-time lip syncing and facial expressions, and high-quality renders fed directly to a telecast in HD quality — in normal or 3D stereoscopic mode.

Enabling the virtual host to understand what people are saying, and replying accordingly, adds another fascinating dimension of realism to the display — and further ups the computing power requirements.

From animation and post-production to visual effects, ToonPet provides artists with real-time feedback so they can be productive in the moment, instead of sitting around waiting for images to render. Add 10 layers of color correction applied with heavy blur, and without the Quadro K5000 it would be almost impossible to sustain playback at 24 frames per second without a dropped frame.

ToonPet also allows playback of completed scenes in real time, which is critical for budget productions, where the idea of “time is money” has never been truer.

NVIDIA Quadro allowed Digital Magic to meet the scale and complexity of its ambition: to push the boundaries of the possible. Saturday mornings may never be the same.

Siva Sankaran L Mon, 25 Aug 2014 20:32:47 +0100
Bring Your Ray Bans: Dolby Demos Latest Display Technology at SIGGRAPH

Go ahead. Take a picture. It won’t do any good. That’s why Dolby is putting its latest technology on display at NVIDIA’s booth at the SIGGRAPH graphics conference this week.

With the help of NVIDIA engineers, Dolby has put Epic’s “Elemental” Unreal Engine 4 demo on its new “extended dynamic range,” or EDR, display. The result is stunning.

Flashes of sunlight are almost wince-inducing. Glowing red magma seems to burn into your retinas. And shapes hidden by dark shadows are brought out as dark details contrast with darker ones.

“It’s hypnotic,” said Braden Evans, CTO at startup Lucidscape Technologies, as he listened to Dolby’s Bill Hofmann walk a crowd through the technical details. “You could get really addicted to this – I may have a problem.”

Dolby's Bill Hofmann explains EDR at this week's SIGGRAPH conference.

It’s a level of realism that can’t be captured by an ordinary camera. But put side-by-side with a conventional high-definition monitor, and Dolby’s proof-of-concept display is like looking through a window into a virtual world.

The story behind the demo: Dolby wants to do for movies and TV what it has long done with audio – create experiences that are more visceral and truer to life.

Compared to conventional displays, EDR comes closer to the broad range of brightness and color that the human eye can see. Our eyes have a visual range of 20,000 nits, or 200 times that of most high-definition television sets.
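The "200 times" figure implies a reference brightness for a standard HDTV, which the post doesn't state. A typical HDTV peak of around 100 nits, an assumption here rather than a figure from the article, makes the arithmetic work out, and also shows where Dolby's 10,000-nit pipeline (mentioned below) sits on that scale:

```python
# Dynamic-range comparison in nits (candelas per square meter).
# The ~100-nit HDTV peak is a common reference figure, assumed here;
# the other two numbers come from the post.
eye_range_nits = 20_000      # visual range of the human eye, per the post
hdtv_peak_nits = 100         # typical HDTV peak brightness (assumed)
dolby_edr_nits = 10_000      # what Dolby's pipeline can encode, per the post

print(f"Eye vs. HDTV: {eye_range_nits // hdtv_peak_nits}x")
print(f"Dolby EDR pipeline vs. HDTV: {dolby_edr_nits // hdtv_peak_nits}x")
```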

To create images that extend across a bigger chunk of that range, Dolby is doing more than just licensing its technology to companies making high-end televisions and displays.

Dolby has built a GPU-accelerated video and visual effects pipeline that can encode up to 10,000 nits. When applied to a videogame demo like “Elemental,” the result is stunning.

So if you’re at SIGGRAPH this week leave your camera behind and step into our booth. And bring some shades.


Brian Caulfield Thu, 14 Aug 2014 17:02:05 +0100
Watch This: Virtual Reality, Without the Glasses

Exquisite craftsmanship. Insanely small tolerances. Beautiful materials. Whoever says art is dead has never worn a luxury watch. The challenge: showing all these details to potential buyers from the inside out.

The answer: a virtual reality experience that lets users play with a watch – inside and out – as if they were holding it in their hands.

That’s what you’ll find on display at NVIDIA’s booth at the SIGGRAPH graphics conference in Vancouver this week.

Step into NVIDIA’s booth and you can see a luxury watch from the inside out, changing the viewing angle or zooming in and out on individual detail for a better look.

Created by NVIDIA, FashionLab by Dassault Systèmes, watch designer François Quentin, and ALIOSCOPY, the glasses-free 3D visualization specialist, the demo is the latest example of the power of Dassault Systèmes’ CATIA 3D modeler and rendering engine. And it runs on NVIDIA GPUs.

Here’s how it works: five NVIDIA Quadro K6000 GPUs render different viewpoints. Each viewpoint is captured and blended on a sixth Quadro K6000 to render in real time the 3D ALIOSCOPY composited image. Finally, a dedicated iOS app allows interaction with the 3D model as well as realistic animations.
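That division of labor, several GPUs each rendering one viewpoint while a dedicated GPU blends the results, can be sketched abstractly. Everything below is illustrative stand-in code, not the actual system: the real demo uses CATIA's renderer and ALIOSCOPY's multiview compositing, and the function names here are invented for the sketch.

```python
from concurrent.futures import ThreadPoolExecutor

NUM_RENDER_GPUS = 5  # one viewpoint per Quadro K6000, per the demo

def render_viewpoint(gpu_id: int, angle_deg: float) -> dict:
    """Stand-in for a per-GPU render of one viewpoint of the watch model."""
    return {"gpu": gpu_id, "angle": angle_deg, "frame": f"frame@{angle_deg:.1f}deg"}

def composite(views: list) -> str:
    """Stand-in for the sixth GPU blending viewpoints into one multiview image."""
    return " | ".join(v["frame"] for v in sorted(views, key=lambda v: v["angle"]))

# Spread the viewpoints across the render GPUs in parallel,
# then blend the captured results on the dedicated compositing GPU.
angles = [i * (180.0 / NUM_RENDER_GPUS) for i in range(NUM_RENDER_GPUS)]
with ThreadPoolExecutor(max_workers=NUM_RENDER_GPUS) as pool:
    frames = list(pool.map(render_viewpoint, range(NUM_RENDER_GPUS), angles))

multiview_image = composite(frames)
print(multiview_image)
```

The design point is the asymmetry: rendering parallelizes cleanly across GPUs because each viewpoint is independent, while compositing is a single fan-in step that needs all the frames, so it gets its own dedicated device.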

Look for more like this soon. Some features of this demo will be integrated into commercial versions of the next releases of Dassault’s CATIA software. The result: companies can tell a deeper story about not just what their products look like, but how they work.

Brian Caulfield Thu, 14 Aug 2014 09:34:13 +0100
Next-Gen Quadro at the Heart of Pro Graphics Workflows

We’re showing how our latest Quadro professional GPU lineup powers new visual computing workflows at this week’s SIGGRAPH computer graphics conference.

Design professionals and scientists are working with larger, more complex models. Creative professionals continue to raise the bar, building more compelling visual effects at resolutions of 4K and above.

This has led to dramatic increases in the computing power required for visualization. And often it’s teams, rather than individuals, working on projects. That makes the ability to collaborate across time zones and among users – many of whom may be mobile – more important than ever.

This requires a new approach to visual computing. One that supports user interaction on a local workstation, remote viewing of large and complex datasets on mobile devices and tight integration with cloud-based services. All with enterprise-grade reliability.

The latest Quadro lineup offers faster processors, of course, plus increased data handling capability and unique visual computing features such as connectivity to the cloud and mobile devices. The bottom line: We’re extending the concept of visual computing from a graphics card in a workstation to a connected environment, with Quadro at its center.

NVIDIA Quadro Kepler family
The new Quadro lineup lets users interact with their designs or data locally on a workstation, remotely on a mobile device or in tandem with cloud-based services.

Companies like Framestore, PSA Peugeot Citroen, Armstrong White and the team working on the new David Fincher film “Gone Girl” have already adopted the latest Quadro GPUs. They’re working with richer scenes and models, harnessing larger data sets, building new cloud-based workflows and improving the ability to collaborate with far-flung teams and clients.

Starting this fall, users can buy the latest GPUs from our workstation OEMs Dell, HP and Lenovo, as well as resellers including PNY Technologies in North America and Europe, ELSA and Ryoyo in Japan, and Leadtek in Asia Pacific.

Be sure to check out our SIGGRAPH landing page for more info and videos from the show floor.

Greg Estes Tue, 12 Aug 2014 14:07:13 +0100
Big Beauty: NVIDIA Powers World’s Largest Ultra-HD 4K Screen at Kentucky Derby

Mint juleps. Outlandish hats. Outsized bets. Amid all the distractions of race day, Kentucky Derby attendees once had a hard time catching the action. No longer.

That’s thanks to the largest ultra-HD 4K video screen in the world. The Big Board, as it’s known, sits 170 feet off the ground and measures 15,224 square feet. That’s the size of three NBA basketball courts.

To prepare for its unveiling, Churchill Downs used a 4K camera to capture pre-recorded content and race day footage. The race organizers re-designed their graphics to work in 4K.

Big day, big screen: with a total resolution of 4160×2176, the Big Board needs a lot of, um, horsepower.

Longtime NVIDIA partner Vizrt designed a custom control interface to manage the 4K content in real time. The screen can even support split-screen views to show live or recorded video, images, and data at the same time.

That’s why Vizrt used our Quadro K5000 GPUs to keep the Big Board running.

Now attendees can experience the race like never before. And sports displays have a new standard.

Greg Estes Mon, 02 Jun 2014 16:00:19 +0100
How Pixar Uses GPUs to Give Their Artists What They Need Most – Time to Play

Nemo. Monsters Inc. The Incredibles.

When we watch Pixar’s entrancing movies, characters and plots seem to fly across the screen at the speed of a small child’s imagination. But until just a few years ago, roughing out the scenes in these stories was a painstaking process, taking hours, even days. Not any longer.

Now, digital animators and lighting artists can push and pull characters in real time, tweaking their expressions and the environment they move through in thousands of subtle ways.

Monsters University close up
Pixar’s Dirk Van Gelder pushed, pulled, tweaked, and manipulated a model of a Monsters University character in real time for a crowd at our annual GPU Technology Conference Wednesday.

“It’s important for us to create an environment that will be playful, where the animator can reach in and make changes in real time, and that’s enabled by the NVIDIA GPUs that we use,” Pixar engineering lead Dirk Van Gelder told a crowd of more than 2,500 at our annual GPU Technology Conference in San Jose, California.

Why is a company that makes movies appearing at a GPU conference? Because that movie company used to be in the GPU business, sort of, Van Gelder explained. Pixar’s first products were computers that helped power digital animation. As graphics technology advanced, Pixar abandoned that business to take up digital storytelling. Along the way, it adopted SGI’s systems, and, then moved to PCs equipped with NVIDIA’s graphics cards.

They haven’t looked back. “Through all of our history we’ve relied on high-performance graphics,” Van Gelder said. “And for the last ten films that we’ve made the answer for that has been NVIDIA.”

Van Gelder and Pixar technical director Danny Nahmias told the tale of how Pixar uses GPUs to create scenes faster, and how that extra time gives them room to be more creative.

That’s because GPUs give Pixar’s lighting and animation teams almost instant visual feedback on their ideas. So they can see when a shot isn’t working, or if a daring idea works.

Van Gelder showed how Presto – Pixar’s proprietary GPU-accelerated animation system – lets artists get real-time feedback during the character animation process. To demonstrate, he showed off a scene from Monsters University, where James P. Sullivan, one of the main characters, leans over another student’s chair in a lecture hall to grab a pencil he used to pick his teeth.

Thousands of attendees packed a room to hear about how Pixar's team uses GPUs.

In Presto, animators can move a camera around the classroom to view Sullivan from any angle. And NVIDIA’s GPUs made it possible to create detailed hair for a wildly hairy character in near real time, so they could fine-tune the way his mass of hair slouched over the classroom’s chair.

“Every part of him is live and posable in the system,” Van Gelder said. “If we didn’t have fast graphics, we wouldn’t be able to make this happen.”

After Van Gelder spoke, Pixar’s Nahmias showed off Pixar’s interactive lighting preview tool, built on NVIDIA’s OptiX framework. It lets artists place and adjust virtual lights to create a mood and tone for each scene, and guide the audience’s attention.

James P. Sullivan -- and Presto -- played a starring role in Wednesday's keynote.

Before, Pixar’s lighting artists relied on thousands of small cheats that meant a scene could only be viewed from a limited number of angles. But by shifting to ray tracing, which models the way light actually bounces around an environment, Pixar’s lighting team freed themselves to explore scenes from a wider variety of angles. And they could instantly change the way a scene was lit, shifting from golden tones to starker colors with a few keystrokes to change its mood.

“Lighting sets the mood and tone,” Nahmias said. “It provides the context for all of our shots in support of the story.”

Brian Caulfield Wed, 26 Mar 2014 23:11:34 +0000