German automaker Audi will use U.S. chipmaker Nvidia's artificial intelligence computing platform to develop autonomous vehicles that could hit the roads by 2020. The two companies announced the partnership Wednesday night at CES, the annual consumer electronics show in Las Vegas.
Audi and Nvidia have been working together for nearly a decade, although the early focus was on using Nvidia's computer graphics chips in Audi's virtual cockpit and navigation systems. According to Fortune, Audi is now working with Nvidia's AI car computing platform to handle the complexities of real road conditions and bring autonomous vehicles to market sooner. Audi's goal is to produce vehicles with Level 4 autonomy, a category defined by the Society of Automotive Engineers meaning the car can handle all aspects of driving in most conditions or driving modes.
To showcase their progress, Audi and Nvidia built an Audi Q7 piloted driving concept vehicle that navigates a complex course. It will be demonstrated at CES to show the power of end-to-end deep learning and how such a system could serve as one of many neural networks operating inside an artificial intelligence car. Audi will expand testing of its autonomous vehicles on public roads in California and select other states next year, the company stated.
Nvidia's original architecture for autonomous vehicles involved three components: an AI supercomputer called Drive PX that is powerful enough to process data from the vehicle's cameras and sensors, an AI algorithm-based operating system, and a constantly updated cloud-based high-definition 3D map.
Nvidia introduced Drive PX at CES two years ago. In 2016, Nvidia CEO Jen-Hsun Huang introduced a more powerful next-generation computer called Drive PX 2, along with a suite of software tools and libraries aimed at accelerating the deployment of autonomous vehicles. Volvo is using Drive PX 2 in a fleet of self-driving test cars.
Autoblog reported that Drive PX 2 is already being demonstrated in the autonomous Audi Q7. The crossover can determine its own path and can sense and drive on different types of surfaces, including pavement, dirt, and grass; it can also navigate the cones of a simulated construction zone while reading dynamic detour signs.