Autonomous vehicle tech hinges upon developments in artificial intelligence, reports Shamik Ghosh.

The Automotive Electronics Roadmap Report, published by IHS Automotive in June 2016, estimated that shipments of AI-based automotive systems will reach 122M units by 2022, up from 7M units in 2015-16. Although the uptake of AI in automotive systems is relatively low today, it is set to become standard in the years to come, thanks to ongoing developments in autonomous vehicle technology.

So why is AI trending, and how is the industry mapping out its timeline? Dr. Nicolaj Stache, a professor of automotive systems engineering at Heilbronn University in Germany, explained how an approach based on AI could be the ‘unique selling point’ for automakers and suppliers alike in the pursuit of their objectives. “AI becomes relevant when we go a little deeper into the design of a machine for the driving task,” he said.

An autonomous car needs to perceive its surroundings using its sensor constellation (radar, LiDAR, cameras). This is necessary for the safe conduct of the vehicle and to increase its overall situational awareness. Yet that is just the beginning. “All the data acquired need to be processed, which means the machine needs to detect, classify and finally understand what is happening in a scene.”

Stache believes that AI can ‘outperform’ conventional methods for the detection and classification of objects using sensor data. He also deemed it relevant to predict ‘what will happen in the next seconds’, so that the vehicle can adjust its behaviour and plan its subsequent manoeuvres accordingly. This is where AI becomes indispensable.
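None of the sources quoted here publish their models, but the ‘predict what will happen in the next seconds’ step can be illustrated with a deliberately simple Python sketch: a constant-velocity motion model. The `Track` class and its fields are hypothetical; production systems use learned, far richer motion models.

```python
from dataclasses import dataclass

@dataclass
class Track:
    """A detected road user's position (m) and velocity (m/s), in the ego vehicle's frame."""
    x: float
    y: float
    vx: float
    vy: float

def predict(track: Track, dt: float) -> Track:
    """Constant-velocity prediction of where the object will be after dt seconds."""
    return Track(track.x + track.vx * dt,
                 track.y + track.vy * dt,
                 track.vx, track.vy)

# A pedestrian 10 m ahead, 2 m to the side, drifting toward the lane at 1 m/s:
ped = Track(x=10.0, y=-2.0, vx=0.0, vy=1.0)
in_two_seconds = predict(ped, 2.0)  # the pedestrian reaches the lane edge (y = 0)
```

On this prediction the planner can decide to slow down before the pedestrian ever enters the lane.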

Making autonomy reliable

Stache, who previously served as the head of Continental’s Artificial Intelligence Centre, described some situations in which the reliability of autonomous vehicles is questionable and AI could be the answer.

A classic example is pedestrian detection − just imagine in how many different poses, clothes, sizes and partially occluded shapes a pedestrian could appear.

How can the autonomous vehicle distinguish between a normal pedestrian and a traffic cop who is signalling the driver to pull over? Can sensors emulate the actual human senses using AI?

The prototypes used in autonomous pilot trials use different sensor setups that complement each other. In contrast, a human can drive just by using the eyes and the brain to interpret things based on knowledge and experience. “It is a research topic to convey this to a machine by using AI and thus improve the efficiency of a sensor system,” added Stache.

There are, indeed, organisations working towards this goal. One example is Baselabs, a German software company that is both a ‘user’ of AI results on the sensor side and a ‘contributor’ to AI usage on the function development side. Baselabs’ chief marketing officer Holger Lobel said: “AI will allow us to extract the maximum of information out of the available sensor data in the vehicle, and thus generating the best environment model possible.” As a data fusion specialist, Baselabs claims to receive a good amount of data from smart sensors. Based on this data, Baselabs creates what is known as the ‘environment model’.

If an AI layer is added on top of the perception systems, the resulting ‘fused’ data can be highly accurate. On the other hand, if AI is used at the function stage, it can augment the environment information that is provided by Baselabs.

“This can not only reduce the computing complexity of the vehicle but also increase the flexibility of automated vehicles to react to unforeseen situations,” assured Lobel.
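As a rough illustration of why fusing two sensors can beat either one alone (this is not Baselabs’ actual method, which is not public), consider inverse-variance weighting of two independent distance measurements. All names and numbers below are invented for the example.

```python
def fuse(z1: float, var1: float, z2: float, var2: float) -> tuple[float, float]:
    """Inverse-variance weighted fusion of two independent measurements.
    The fused estimate leans toward the more certain sensor, and its
    variance is smaller than either input's."""
    w1 = 1.0 / var1
    w2 = 1.0 / var2
    fused = (w1 * z1 + w2 * z2) / (w1 + w2)
    return fused, 1.0 / (w1 + w2)

# Radar says the car ahead is 20.0 m away (variance 0.25 m^2);
# the camera says 20.8 m (variance 1.0 m^2).
est, var = fuse(20.0, 0.25, 20.8, 1.0)
# est is pulled toward the radar (20.16 m), and var (0.2) is below both inputs.
```

This one-line weighted average is the scalar core of Kalman-style fusion; a real environment model fuses full state vectors with covariance matrices over time.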

Car as your personal assistant

Stache believes that AI could help convey the feeling of car ownership while combining it with the advantages of a shared autonomous vehicle. The basic premise is that autonomous vehicles could be ‘shared’ more easily because they can move without people involved. As soon as one trip is completed, the vehicle could be passed on to the next client, and so on; the car could even perform other tasks while its owner does not need it. But that can only happen when these shared cars are handled intelligently, i.e. using AI.

“AI can be used to predict where you want to go at which point of time by considering your habits (and your calendar), it can help determine your favourite path to get there, it could check if it makes sense to pick someone up and so on,” he said. Agreeing with Stache, Lobel also commented: “Based on these learned patterns, AI can predict the consumer’s intentions in order to propose tailored assistance using embedded voice.”
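A toy version of the habit-based prediction Stache describes could look like the following sketch. The `DestinationPredictor` class and its keying on (weekday, hour) are illustrative assumptions, not any vendor’s design; a deployed system would also weigh calendar entries and recency.

```python
from collections import Counter, defaultdict

class DestinationPredictor:
    """Predicts the most likely destination for a (weekday, hour) slot
    from a user's trip history."""
    def __init__(self):
        self.history = defaultdict(Counter)

    def record_trip(self, weekday: str, hour: int, destination: str):
        self.history[(weekday, hour)][destination] += 1

    def predict(self, weekday: str, hour: int):
        slot = self.history.get((weekday, hour))
        return slot.most_common(1)[0][0] if slot else None

p = DestinationPredictor()
p.record_trip("Mon", 8, "office")
p.record_trip("Mon", 8, "office")
p.record_trip("Mon", 8, "gym")
likely = p.predict("Mon", 8)  # "office" — the most frequent Monday-morning trip
```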

However, Lobel believes that some expectations towards AI are ‘too high to be converted into reality’. While he sees AI to be an advantage over today’s non-intelligent systems, he does not reckon AI to be the ‘silver-bullet’ for all connected consumer products.

Towards ‘humanised’ driving

Regulators in the US confirmed earlier in 2016 that the AI technology behind Google’s self-driving car is ‘as good as a human driver’, but the same cannot be said of the wider automotive industry, which has yet to realise the full potential of AI and machine learning.

Aaron Steinfeld, an associate research professor at the Robotics Institute of Carnegie Mellon University said: “There are many questions that are not straightforward to answer regarding vehicle autonomy. It is very difficult to write down every possible scenario and outcome, so machine learning approaches are often more efficient.”

According to Stache, this is addressed by collecting large amounts of driving data and using it as input for AI algorithms that define the driving strategy. “For the interpretation of the data, mathematical models are needed. Artificial intelligence could be used to define these models and derive driving parameters from the data,” concluded Stache.
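One concrete, if heavily simplified, example of ‘deriving driving parameters from data’: fitting the time headway a human driver keeps to the car ahead by least squares over logged speed/gap samples. The function name and all numbers are invented for illustration.

```python
def fit_time_headway(speeds: list[float], gaps: list[float]) -> float:
    """Least-squares fit of gap ≈ T * speed, giving the time headway T
    (seconds) a driver keeps to the vehicle ahead."""
    num = sum(s * g for s, g in zip(speeds, gaps))
    den = sum(s * s for s in speeds)
    return num / den

# Logged speed (m/s) and following-gap (m) samples from human driving:
speeds = [10.0, 20.0, 30.0]
gaps = [18.0, 41.0, 60.0]
T = fit_time_headway(speeds, gaps)  # ~2.0 s headway
```

The fitted `T` could then parameterise an adaptive-cruise-style following behaviour; in practice many such parameters are learned jointly from far larger datasets.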

Steinfeld believes a ‘reconstruction’ of the event is possible using this stored data. “An AI-based approach enables us to see where exactly did the machine perform the decision-making and see why it behaved in a manner it shouldn’t have.” Steinfeld and his team at CMU have been exploring such events since the mid-1990s.
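A minimal sketch of the kind of logging that makes such reconstruction possible (hypothetical, not CMU’s tooling): record each decision with its timestamp and inputs, then replay a time window around the event to see what the machine knew when it decided.

```python
class DecisionLog:
    """Append-only log of perception inputs and the decision taken,
    so an event can be replayed and the faulty decision located."""
    def __init__(self):
        self.records = []

    def log(self, t: float, inputs: dict, decision: str):
        self.records.append({"t": t, "inputs": inputs, "decision": decision})

    def replay(self, t_start: float, t_end: float) -> list[dict]:
        """Return all records whose timestamp falls in [t_start, t_end]."""
        return [r for r in self.records if t_start <= r["t"] <= t_end]

log = DecisionLog()
log.log(0.0, {"gap_m": 40.0}, "keep_lane")
log.log(1.0, {"gap_m": 12.0}, "keep_lane")  # with a 12 m gap it should have braked
window = log.replay(0.5, 1.5)  # isolates the questionable decision for review
```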

Finally, Steinfeld is of the opinion that machine learning techniques can use observations of human drivers to learn human-like behaviours. He calls this ‘imitation learning’, which is crucial for a humanised driving experience.
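Imitation learning in its crudest form can be sketched as behavioural cloning: store human (state, action) pairs and copy the action of the nearest recorded state. This toy nearest-neighbour policy is purely illustrative; real systems train neural-network policies on millions of demonstrations.

```python
import math

class NearestNeighbourPolicy:
    """Behavioural cloning at its simplest: memorise (state, action)
    pairs from human demonstrations and imitate the action taken in
    the closest observed state."""
    def __init__(self):
        self.demos = []  # list of (state_tuple, action)

    def observe(self, state: tuple, action: float):
        self.demos.append((state, action))

    def act(self, state: tuple) -> float:
        return min(self.demos, key=lambda d: math.dist(d[0], state))[1]

policy = NearestNeighbourPolicy()
# state = (speed m/s, gap to car ahead m); action = target acceleration m/s^2
policy.observe((30.0, 15.0), -2.0)   # human braked when the gap closed
policy.observe((30.0, 80.0), 1.0)    # human accelerated on an open road
action = policy.act((29.0, 20.0))    # -2.0: brakes, as the human did nearby
```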
