
Nvidia Unveils New Supercomputers and AI Algorithms

The first day of the company's GPU Technology Conference was chock full of self-driving cars, trips to Mars, and more.

By Tom Brant
April 5, 2016
Nvidia CEO Huang at GTC16 Keynote

Describing his company as "all-in" when it comes to artificial intelligence and virtual reality, Nvidia CEO Jen-Hsun Huang today unveiled new GPUs and AI platforms for developers at Nvidia's GPU Technology Conference in San Jose, Calif.

While many of the new products and platforms are intended for data centers, Nvidia designed its new VR rendering tool, called Iray VR, to work with consumer devices. Iray VR's ray-tracing algorithms will let users experience more realistic scenes: they are designed to accurately represent real-world materials and surfaces, so "carbon fiber looks like carbon fiber," Huang said.

There are two versions of Iray VR: one for desktops and data centers, and Iray VR Lite, which can run on a wide range of platforms, from full-scale VR headsets to the inexpensive Google Cardboard.

"Not too many people have supercomputers," Huang said. "We want people to enjoy VR irrespective of the computing platform they have."

Since it provides photorealistic experiences, Huang sees Iray VR as useful for many industries, including building design, where architects can visualize their creations before construction begins.

Nvidia has already partnered with NASA to create a photorealistic VR experience of Mars, complete with a rendering of the Mars rover. At today's conference, Apple co-founder Steve Wozniak, who is an advocate of Mars travel, helped demo the experience, known as Mars 2030. The Iray VR technology also powers the Everest VR experience created by Solfar Studios. 

Nvidia VR Mars

In addition to VR, Nvidia also revealed updates to its computers that power self-driving car technology. The Drive PX system, announced last year, combines sensors and machine learning to allow the car to understand the road around it and communicate observations and warnings to the driver. The result is real-time mapping of the environment outside the vehicle: the car's cameras can capture and process 16,000 points per frame per camera, for a total of 1.8 million points per second, according to Huang.
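Huang's throughput figures imply an aggregate capture rate across the car's cameras. A quick back-of-the-envelope check (the four-camera rig below is an illustrative assumption, not a figure from the keynote):

```python
# Sanity-check the Drive PX throughput numbers quoted above:
# 16,000 points per frame per camera, 1.8 million points per second in total.

POINTS_PER_FRAME_PER_CAMERA = 16_000
TOTAL_POINTS_PER_SECOND = 1_800_000

# Aggregate frame captures per second (across all cameras) implied by the totals:
frames_per_second_all_cameras = TOTAL_POINTS_PER_SECOND / POINTS_PER_FRAME_PER_CAMERA
print(frames_per_second_all_cameras)  # 112.5

# For example, a hypothetical 4-camera rig would need ~28 fps per camera:
cameras = 4
print(frames_per_second_all_cameras / cameras)  # 28.125
```

In other words, the quoted totals are consistent with a handful of cameras each running at ordinary video frame rates.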

Drive PX

"The system will scale from advanced driver assistance to full autonomy," said Nvidia's Senior Automotive Director, Danny Shapiro. "It's a question of logging the miles into the deep-learning systems, basically driving the vehicle through all these different situations."

Once the AI has been exposed to enough driving scenarios, it should handle superhighways in the U.S. and rural, unpaved roads in the developing world equally well, Shapiro said. Drive PX will also power the world's first autonomous race car, part of Roborace, a new support series for the Formula E electric racing championship.

Nvidia race car

To keep up with the ever-increasing data processing demands of machine learning and VR, Nvidia also updated its GPU lineup. The Tesla P100 is its latest GPU for hyperscale data centers, a single processor that Nvidia says can replace hundreds of traditional CPU nodes.

The company also announced its DGX-1 supercomputer, which Huang billed as the world's first deep-learning supercomputer. Using eight Tesla P100 GPUs, it will deliver 170 teraflops of half-precision (FP16) peak performance, which Nvidia says is equivalent to 250 CPU-based servers.
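The DGX-1 numbers are easy to break down per component. The arithmetic below simply divides the quoted totals; the per-GPU and per-server figures are derived, not separately stated by Nvidia:

```python
# Break down the DGX-1 figures quoted above: 170 teraflops of FP16 peak
# from eight Tesla P100s, said to equal 250 CPU-based servers.

DGX1_FP16_TFLOPS = 170
GPUS = 8
EQUIVALENT_CPU_SERVERS = 250

per_gpu = DGX1_FP16_TFLOPS / GPUS                        # 21.25 TFLOPS per P100
per_server = DGX1_FP16_TFLOPS / EQUIVALENT_CPU_SERVERS   # 0.68 TFLOPS per CPU server implied
print(per_gpu, per_server)  # 21.25 0.68
```

The per-GPU figure of 21.25 teraflops matches the FP16 peak Nvidia published for the Tesla P100 itself.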

Nvidia Tesla P100

The DGX-1 will cost $129,000, and Nvidia will ship the first units to universities and hospitals engaged in AI research, including Massachusetts General Hospital.

"[W]e are entering the radiological era of biometric quantification, where our interpretations will be enhanced by algorithms learned from the diagnostic data of vast patient populations," MGH radiologist Dr. Keith J. Dreyer said in a statement. "Without the processing capabilities of GPUs, this would not be possible."

