September 13, 2016 (updated 21 Sep 2016, 4:10pm)

Nvidia makes smart play with AI computer for self-driving cars and chips to rival Intel

Chip giant introduces palm-sized, energy-efficient AI computer for autonomous vehicles.

By CBR Staff Writer

US chipmaker Nvidia unveiled a palm-sized, energy-efficient AI computer to power automated and autonomous vehicles.

The new single-processor configuration of the Drive PX 2 AI computing platform for AutoCruise functions consumes just 10 watts of power. It will also enable vehicles to use deep neural networks to process data from multiple cameras and sensors.

Chinese internet giant Baidu will use the platform as an in-vehicle car computer for its self-driving cloud-to-car system.

Baidu vice president Liu Jun said: “Baidu and NVIDIA are leveraging our AI skills together to create a cloud-to-car system for self-driving.

“The new, small form-factor DRIVE PX 2 will be used in Baidu’s HD map-based self-driving solution for car manufacturers.”

Nvidia said that a car using the DRIVE PX 2 for AutoCruise can assess in real time what is happening around it, exactly locate itself on an HD map and plan a safe path forward.

In addition, the company has unveiled new processors in a bid to rival Intel in the fast-growing artificial intelligence (AI) market.


The new Tesla P4 and P40 GPUs are expected to deliver major improvements in the efficiency and speed of AI services.

The chips are specifically made for inferencing, which uses trained deep neural networks to recognize speech, images or text in response to queries from users and devices.

Based on Nvidia's Pascal architecture, the chips come with specialised inference instructions built around 8-bit operations, delivering four times higher efficiency than previous versions.
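The idea behind 8-bit inference can be illustrated with a minimal sketch (this is not Nvidia's implementation; the scale factors and function names here are hypothetical, for illustration only). Float weights and activations are mapped to small integers with a scale factor, the arithmetic runs in fast integer units, and the result is rescaled at the end:

```python
# Illustrative sketch of 8-bit (int8) quantised inference.
# Not Nvidia's actual implementation; scales chosen for the example.

def quantize(values, scale):
    """Map floats to the int8 range [-127, 127] using a fixed scale."""
    return [max(-127, min(127, round(v / scale))) for v in values]

def int8_dot(weights, activations, w_scale, a_scale):
    """Dot product computed on int8 values, rescaled back to float."""
    qw = quantize(weights, w_scale)
    qa = quantize(activations, a_scale)
    acc = sum(w * a for w, a in zip(qw, qa))  # integer accumulate
    return acc * w_scale * a_scale            # dequantize result

weights = [0.5, -1.2, 0.8]
activations = [1.0, 0.25, -0.5]
exact = sum(w * a for w, a in zip(weights, activations))
approx = int8_dot(weights, activations, w_scale=0.01, a_scale=0.01)
print(exact, approx)  # the int8 result closely tracks the float result
```

The trade-off is a small loss of numeric precision in exchange for narrower data paths, which is what makes throughput and energy-efficiency gains of this kind possible on hardware with dedicated 8-bit instructions.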

The Tesla P4, with its small form-factor and low-power design, can fit into any servers used in data centres, according to the company.

The design makes it up to 40 times more energy efficient than CPUs for inferencing in production workloads, the company says.

Nvidia accelerated computing general manager Ian Buck said: “With the Tesla P100 and now Tesla P4 and P40, NVIDIA offers the only end-to-end deep learning platform for the data centre, unlocking the enormous power of AI for a broad range of industries.

“They slash training time from days to hours. They enable insight to be extracted instantly. And they produce real-time responses for consumers from AI-powered services.”

The company claims that a server with eight Tesla P40 accelerators can match the performance of more than 140 CPU servers.
