NVIDIA Shows Off the Breadth of its Accelerated Computing Platform at SIGGRAPH

At SIGGRAPH 2024, NVIDIA highlighted the importance of accelerated computing by showcasing a range of innovations that emphasize AI, generative models, and virtual simulations.


This week, SIGGRAPH 2024 is being held in Denver, Colorado. The event, now in its 50th year, started as an industry conference for graphics research but has since expanded into adjacent topics such as AI and simulation. That evolution parallels NVIDIA’s own journey from a maker of graphics cards to the world’s largest accelerated computing vendor. As it does at most major events, NVIDIA announced a wide range of innovations highlighting how its platform supports accelerated computing workloads of all kinds – from the virtual to the physical.

Among those innovations are advanced AI models for creating 3D objects and digital humans, as well as tools for simulating robots. A detailed look at the news is below.

NIM and Omniverse Developments

NVIDIA inference microservices (NIM) is a framework designed to simplify the deployment of generative AI (gen AI). It provides pre-built AI models and containers that can be integrated into apps via application programming interfaces (APIs).
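To make the integration pattern concrete, here is a minimal sketch of how an app might talk to a NIM endpoint. NIM language-model containers expose an OpenAI-compatible HTTP API; the endpoint URL and model name below are placeholders for illustration, not a specific deployment.

```python
import json

# Hypothetical local deployment; a NIM container typically serves an
# OpenAI-compatible API such as /v1/chat/completions on its host port.
NIM_URL = "http://localhost:8000/v1/chat/completions"

def build_nim_request(prompt: str, model: str = "meta/llama3-8b-instruct") -> dict:
    """Assemble an OpenAI-style chat-completion payload for a NIM endpoint."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "max_tokens": 128,
    }

payload = build_nim_request("Summarize OpenUSD in one sentence.")
body = json.dumps(payload)  # this JSON body would be POSTed to NIM_URL
```

Because the interface follows the familiar OpenAI schema, existing client code can often be pointed at a NIM container by changing only the base URL and model name.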

At SIGGRAPH, WPP, a leading marketing and communications services company, announced that The Coca-Cola Company has integrated NIM microservices for Universal Scene Description (OpenUSD) into Prod X, its production studio built with WPP Open X. Using NVIDIA Omniverse and NIM, Coca-Cola can customize and assemble its brand assets to create culturally relevant, personalized 3D ads in more than 100 markets.


Hugging Face’s inference-as-a-service offering is now powered by NIM and runs on NVIDIA DGX Cloud. This gives Hugging Face’s four million developers faster performance and easy access to serverless inference on NVIDIA H100 graphics processing units (GPUs). DGX Cloud, designed in partnership with leading cloud providers, provides a fully optimized environment for these workloads.

“Hugging Face’s inference as a service with NVIDIA NIM provides up to 5x higher throughput than without an NIM and the ability to rapidly experiment with production-level deployment with API stability, security, patching, and enterprise-grade support,” said Kari Briski, vice president of generative AI software product management at NVIDIA, during a SIGGRAPH news briefing.

fVDB Framework for Virtual Representations

NVIDIA introduced fVDB, a new deep-learning framework for creating AI-ready virtual representations of the real world. Built on OpenVDB, fVDB is designed for simulating and rendering volumetric data such as water, fire, smoke, and clouds. It converts raw data from techniques like neural radiance fields (NeRFs) and lidar into virtual environments that AI can use.

fVDB can handle environments four times larger than previous frameworks and operates 3.5 times faster. It’s also interoperable with massive real-world datasets. The framework will soon be available as part of NVIDIA’s NIM inference microservices, simplifying integration for developers. By transforming detailed real-world data into AI-ready virtual environments, fVDB can help train AI in autonomous vehicles, robots, and high-performance 3D deep learning.
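The core idea fVDB inherits from OpenVDB is sparsity: a real-world scene occupies only a tiny fraction of a huge 3D grid, so only occupied cells are stored. The stdlib sketch below illustrates that idea by binning raw 3D points into voxels keyed by integer index; it is an illustration of the concept, not fVDB’s actual GPU API.

```python
# Illustrative only: store just the occupied cells of a conceptually
# enormous 3D grid, keyed by integer voxel index, as sparse volumetric
# frameworks like OpenVDB (and fVDB on top of it) do at scale.
from collections import defaultdict

def voxelize(points, voxel_size=0.5):
    """Map raw 3D points (e.g. from lidar) to point counts per occupied voxel."""
    grid = defaultdict(int)
    for x, y, z in points:
        key = (int(x // voxel_size), int(y // voxel_size), int(z // voxel_size))
        grid[key] += 1
    return grid

cloud = [(0.1, 0.2, 0.0), (0.3, 0.1, 0.1), (5.0, 5.0, 5.0)]
occupied = voxelize(cloud)
# Only 2 cells are stored: (0, 0, 0) holds two points, (10, 10, 10) holds one.
```

Memory then scales with the number of occupied voxels rather than the bounding volume, which is what makes environments of this size tractable.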

New Gen AI Models for OpenUSD

NVIDIA has introduced generative AI to OpenUSD, expanding its use in robotics, industrial design, and engineering. NVIDIA’s generative AI models for OpenUSD are available as NIM microservices. Using the models, developers can integrate AI copilots and agents into USD workflows, expanding the possibilities in 3D worlds.

“We built the world’s first gen AI models that can understand OpenUSD-based language, geometry, materials, physics, and spaces. Three NIMs are now available in preview on the NVIDIA API catalog: USD Code, which can answer OpenUSD knowledge questions and generate OpenUSD Python code; USD Search, which enables developers to search through massive libraries of OpenUSD 3D image data; and USD Validate, which checks the compatibility of uploaded files against OpenUSD release versions,” said Rev Lebaredian, vice president of Omniverse and simulation technology at NVIDIA.
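For readers unfamiliar with the format these services work over, here is the flavor of content involved: a minimal OpenUSD ASCII (`.usda`) layer, assembled as plain text in Python. The prim names and values are invented for illustration; this is simply an example of the kind of snippet a USD copilot could generate or a validator could check.

```python
# Build a minimal OpenUSD ASCII (.usda) layer as a string. Prim names
# ("HeroCube", "Geom") are invented for illustration.
def make_usda(prim_name: str = "HeroCube", size: float = 2.0) -> str:
    return (
        "#usda 1.0\n"
        "\n"
        f'def Xform "{prim_name}"\n'
        "{\n"
        '    def Cube "Geom"\n'
        "    {\n"
        f"        double size = {size}\n"
        "    }\n"
        "}\n"
    )

layer = make_usda()
# Every ASCII USD file begins with the "#usda 1.0" header line.
```

A service like USD Validate would, per the quote above, check a file of this kind against OpenUSD release versions, while USD Code generates equivalent content via the OpenUSD Python API.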

According to Lebaredian, additional NIM microservices are coming soon. They include USD Layout for assembling scenes from text prompts; USD SmartMaterial for applying realistic materials to 3D objects; fVDB Mesh Generation for creating meshes based on point-cloud data; fVDB Physics Super-Res for making high-resolution physics simulations; and fVDB NeRF-XL for generating large-scale NeRFs.

Additionally, NVIDIA is expanding OpenUSD with new connectors for unified robotics description format (URDF) and computational fluid dynamics (CFD) simulations. These advancements will make it easier for non-3D experts to create virtual worlds, broadening OpenUSD and Omniverse capabilities for new industries.

Getty Images and Shutterstock Enhancements

NVIDIA announced the general availability of the Getty Images 4K image generation API and Shutterstock’s 3D asset generation service, powered by NVIDIA’s Edify NIMs. These tools allow content creators to design high-quality 4K images and detailed 3D assets using text or image prompts. Both are built using NVIDIA’s visual AI foundry with the Edify architecture, a multimodal gen AI system.

“The Shutterstock 3D service powered by Edify is entering commercial availability. Many enterprises have been asking for this. These 3D assets can be brought directly into popular digital content creation (DCC) tools, tweaked, and used for prototyping and set dressing,” said Briski. “In addition to generating 3D assets to populate a scene, NVIDIA and Shutterstock are also providing the ability to generate lighting and backgrounds for these scenes with Edify.”

Developer Program for Robotics

NVIDIA rolled out new tools and services to help developers create the next generation of humanoid robots. These include NIM microservices for robot simulation and learning, the OSMO platform for managing complex robotics tasks, and a teleoperation workflow that uses AI and simulation to train robots. Two examples of the NIM microservices for robot simulation are MimicGen and Robocasa. MimicGen trains robots to mimic human movements captured by devices like Apple Vision Pro, whereas Robocasa generates tasks and realistic environments for robots to practice in.

“We’re making these new NIMS teleoperation technologies and OSMO available to humanoid robot developers as part of a new developer program. Companies like 1x, Boston Dynamics, Field AI, Figure, Fourier, Galbot, LimX Dynamics, Mentee, Neura Robotics, RobotEra, and Skild AI are all joining,” said Lebaredian.

Through the program, developers can get early access to new tools and updates, such as the latest versions of Isaac Sim, Isaac Lab, Jetson Thor, and Project GR00T general-purpose humanoid models.

At SIGGRAPH, NVIDIA showcased an AI-enabled teleoperation workflow that uses minimal human data to create synthetic motion. The process involves capturing demonstrations with Apple Vision Pro, simulating them in Isaac Sim, and using the MimicGen NIM microservice to generate synthetic datasets, which are then used to train the Project GR00T humanoid model.

Advancements in Physical AI

“How do we build generative AI for the physical world? We need models that can understand and perform complex tasks in the physical world. Three computing platforms are required: NVIDIA AI and DGX supercomputers, Omniverse and OVX supercomputers, and NVIDIA Jetson robotic computers,” said Lebaredian.

These technologies help robots understand and navigate the physical world. In addition to launching new NIM microservices at SIGGRAPH, NVIDIA introduced a Metropolis reference workflow to assist developers in training robots. Together, the NIM microservices and the Metropolis reference workflow help developers build smart spaces with advanced robotics and AI systems for hospitals, factories, warehouses, and more, enabling robots to perceive, reason about, and navigate their surroundings.

By providing these advanced tools and workflows, NVIDIA is enhancing the capabilities of AI systems and making them more accessible for developers to create real-world applications across different industries.

Summary

At SIGGRAPH 2024, NVIDIA highlighted the importance of accelerated computing by showcasing a range of innovations that emphasize AI, generative models, and virtual simulations. Key developments included the NVIDIA inference microservices (NIM) framework, fVDB for virtual environments, new generative AI models for OpenUSD, and advanced tools for robotics. These technologies demonstrate NVIDIA's commitment to enhancing the capabilities and accessibility of AI and simulation across industries, reinforcing its position as a leader in accelerated computing.

Zeus Kerravala is the founder and principal analyst with ZK Research.


About the Author

Zeus Kerravala, Founder and Principal Analyst with ZK Research

Zeus Kerravala is the founder and principal analyst with ZK Research. He spent 10 years at Yankee Group and prior to that held a number of corporate IT positions. Kerravala is considered one of the top 10 IT analysts in the world by Apollo Research, which evaluated 3,960 technology analysts and their individual press coverage metrics.

