Nvidia unveils ‘world’s first’ AI computer chip for fully autonomous vehicles

Visual computing technology company Nvidia has announced a computer chip designed to power the artificial intelligence (AI) systems needed for fully autonomous vehicles.

The company’s CEO, Jensen Huang, unveiled the device – known as Pegasus – at its GTC Europe event on 10 October 2017, along with a new AI software development kit (SDK).

The chip supports so-called “Level 5” vehicle autonomy, meaning the car is fully automated and needs no human assistance. Such cars would have no steering wheel, mirrors or pedals, and the interior could be converted into a regular living space for passengers.
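For reference, the levels come from the SAE J3016 driving-automation scale, which runs from Level 0 (no automation) to Level 5 (full automation). A minimal sketch of that scale in Python – the summaries are abbreviated, and the helper function is purely illustrative:

```python
from enum import IntEnum

class SAELevel(IntEnum):
    """SAE J3016 driving-automation levels, summarised for illustration."""
    NO_AUTOMATION = 0           # human driver does everything
    DRIVER_ASSISTANCE = 1       # a single assist feature, e.g. adaptive cruise
    PARTIAL_AUTOMATION = 2      # steering plus speed, driver must supervise
    CONDITIONAL_AUTOMATION = 3  # car drives itself but may hand back control
    HIGH_AUTOMATION = 4         # no driver needed within a defined domain
    FULL_AUTOMATION = 5         # no driver needed anywhere, in any conditions

def needs_human_fallback(level: SAELevel) -> bool:
    """Hypothetical helper: below Level 4, a human must stay ready to drive."""
    return level < SAELevel.HIGH_AUTOMATION

print(needs_human_fallback(SAELevel.FULL_AUTOMATION))  # False: no human needed
```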

The chip works as part of the Nvidia Drive PX platform, the in-car system that analyses the huge amounts of data a driverless vehicle generates – Nvidia estimates that autonomous vehicles need 50 to 100 times more computing power than current models.

The Drive PX platform uses information from the car’s cameras and sensors to build a 360-degree picture of its surroundings, from which it makes driving decisions. Nvidia said the AI behind the system is built on deep learning and the company’s pre-trained neural networks.
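Nvidia has not published Drive PX internals, but the pipeline described above – run pre-trained networks over each sensor feed, fuse the results into a 360-degree model, then decide – can be sketched in outline. Every name below is hypothetical and stands in for undisclosed components:

```python
from dataclasses import dataclass, field

@dataclass
class SurroundView:
    """Hypothetical fused 360-degree model of the car's surroundings."""
    obstacles: list = field(default_factory=list)

def perception_step(camera_frames, lidar_points, detect):
    """One tick of a sketched perceive-then-decide loop.

    `detect` stands in for the pre-trained deep-learning networks the
    article mentions; every name in this sketch is invented.
    """
    world = SurroundView()
    # 1. Run the (stand-in) neural network over each camera feed.
    for frame in camera_frames:
        world.obstacles.extend(detect(frame))
    # 2. Fuse in non-camera sensor returns to cover the full 360 degrees.
    world.obstacles.extend(lidar_points)
    # 3. A real planner would consume `world`; here we make the simplest
    #    possible decision from the fused picture.
    return "brake" if world.obstacles else "cruise"

# Dummy run: one empty camera frame, no lidar returns -> keep cruising.
print(perception_step(camera_frames=[[]], lidar_points=[], detect=lambda f: f))
```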

Huang described Pegasus as “the world’s first computer designed for production and deployment of robo-taxis that help us realise this vision of the future of transportation”.

Nvidia is working with more than 25 robo-taxi companies, which are not currently using Pegasus but will trial it next year.

Danny Shapiro, senior director of automotive at the chip maker, said: “They [the robo-taxi companies] don’t have Pegasus, they have trunks full of PCs with our GPUs inside, so Pegasus is that path to production for them.” He added that he expects public trials to follow in 2019.

Shapiro also highlighted the potential of automated vehicles: “We see this really transforming transportation in general and ultimately making all of our lives better. Not having to worry about parking, not being stuck in traffic – or if there is traffic, you could be watching a movie or getting work done. Most importantly, it’s reducing the number of accidents and fatalities that are on our roads today.”

Huang also announced a new SDK, known as Drive IX, which will let developers build applications that personalise the passenger experience.

Drive IX lets the car use its interior and exterior sensors to track passengers’ eye and lip movements and to understand spoken words.

“All those type of capabilities, combined with the perception of what the car sees, is going to allow our customers to write applications that are really quite magical,” said Huang.

He also said cars will begin to recognise specific passengers and understand their demands. “You will be able to walk up to the car and the car knows exactly who you are and it already adjusted the seats and opens the car. If you’re a passenger, it knows who you are and adjusts the seats and changes everything according to your desires,” he said.

He added that “this car becomes an AI – your car is an AI. It knows who you are, it understands what you like and it recognises the situation around you.” The SDK will be available before the end of the year.
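Nvidia showed no Drive IX code at the event, but the kind of application Huang describes – recognise an approaching passenger, then apply their stored preferences – can be sketched roughly as follows. All names here are invented for illustration, not part of Nvidia’s SDK:

```python
# Hypothetical sketch of a Drive IX-style personalisation app; none of
# these names come from Nvidia's SDK.
PROFILES = {
    "alice": {"seat_position": 4, "temperature_c": 21},
}

def on_passenger_approach(face_embedding, identify):
    """`identify` stands in for an SDK face-recognition call."""
    name = identify(face_embedding)
    prefs = PROFILES.get(name)
    if prefs is None:
        return "unknown passenger: keep defaults"
    # Apply stored preferences before the passenger reaches the car.
    return (f"unlock doors, seat -> {prefs['seat_position']}, "
            f"cabin -> {prefs['temperature_c']}C")

# Dummy recogniser that always 'recognises' alice.
print(on_passenger_approach(b"embedding", identify=lambda e: "alice"))
```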
