Sensing and perception for the autonomous vehicle explored by Graham Jarvis.

As sensing technologies continue to increase in resolution, they surpass simple detection and ranging functions and instead take on true “vision” functions in terms of classification and mapping. But what do the experts in the field think about this statement? Well, François E. Guichard, of the Vehicle Regulations and Transport Innovations Section in UNECE’s Transport Division, thinks the focus should be first and foremost on safety.

“While drafting the regulations, the regulator looks at physics and design to test independent values such as stopping distances to avoid collisions, and the impact that they have on the necessary perception distances, because this will have a direct impact on the technology to be installed on vehicles,” he explains, before adding, “… at highway speeds, the perception distance depends on vehicles’ relative speeds and the environment, so we are talking about 100 metres or more.”
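Guichard’s figure is easy to sanity-check. The sketch below (in Python, using an assumed reaction time and braking deceleration rather than any regulatory values) estimates the distance a vehicle needs to perceive ahead as the distance covered while reacting plus the distance needed to brake to a stop:

```python
# Rough perception-distance estimate: distance covered during the reaction time
# plus the braking distance to a standstill.
# Reaction time and deceleration are illustrative assumptions, not regulatory values.

def perception_distance_m(speed_kmh: float,
                          reaction_time_s: float = 1.5,
                          deceleration_ms2: float = 6.0) -> float:
    v = speed_kmh / 3.6                      # km/h -> m/s
    reaction_distance = v * reaction_time_s  # travelled before braking starts
    braking_distance = v ** 2 / (2 * deceleration_ms2)
    return reaction_distance + braking_distance

for speed in (100, 130):
    print(f"{speed} km/h -> ~{perception_distance_m(speed):.0f} m")
# Prints roughly 106 m at 100 km/h and 163 m at 130 km/h, consistent with
# Guichard's "100 metres or more" at highway speeds.
```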

Regulatory challenges

With safety being of paramount importance, he thinks it’s important to realise that the basic ultrasonic sensors used in many cars for parking won’t be enough to meet the regulations related to vehicle automation. “So we believe to date that cameras, radars and LiDAR will be used, as well as mapping technologies, and the derived information will be used to match the information from the sensors to increase the geo-location precision,” he reveals.
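The map-matching idea Guichard alludes to can be sketched in a few lines. In the toy example below, all coordinates, the landmark and the blending weight are invented for illustration: a coarse GNSS fix is refined by comparing where the sensors see a known landmark against where the map says that landmark is.

```python
# Toy map-matching sketch: refine a coarse GNSS position with a landmark that both
# the HD map and the on-board sensors can observe. All numbers are invented.

map_landmark = (250.0, 12.0)     # where the map says a sign gantry sits (local frame, metres)
sensed_offset = (48.7, 3.1)      # where the sensors see that gantry relative to the vehicle
gnss_position = (203.5, 7.4)     # coarse GNSS fix, off by a few metres

# If the sensed offset is trusted, the vehicle sits at the map position minus the offset.
map_based_position = (map_landmark[0] - sensed_offset[0],
                      map_landmark[1] - sensed_offset[1])

# Blend the two estimates, weighting the landmark-derived fix more heavily.
w = 0.8
fused = tuple(w * m + (1 - w) * g for m, g in zip(map_based_position, gnss_position))
print(f"GNSS: {gnss_position}, map-based: {map_based_position}, fused: {fused}")
```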

He stresses that the technology has to be able to deliver data, and subsequently information, to the automation system to permit the use of the function in the normal traffic conditions that exist today without automation: “When Google started working on automated driving, many observers had doubts about the commercial feasibility of the project; the technology was very expensive and the experts said that LiDAR units such as those used by Google were costing about $75,000 (£61,500), but we now observe that the cost of such devices is slowly dropping.”

In essence, sensing and perception require a multitude of technologies: video, LiDAR, radar and mapping, for example. Eventually, Guichard believes, the amount of technology required for sensing and perception in autonomous vehicles will reduce, as these technologies gradually become more integrated over the course of their development.

Blurring the lines 

Maxime Flament, head of department for connectivity and automation at Ertico, says his company is bringing together experts from different sectors, which he expects will blur the lines between the different areas and perhaps even the different technologies. “Information that is coming from the Cloud is starting to be used in terms of fusion within the platform and to feed back cloud data,” he adds. For mapping, his company and its partners use crowdsourcing to correct maps.

He admits that he’s not directly involved with the sensing technologies of radar, LiDAR and cameras. However, he believes that more information is needed from outside of the vehicle to increase the sensing capabilities of connected and eventually of self-driving vehicles. “If you have prior knowledge of specific road furniture, you can easily detect its status. You can then prepare your sensors to detect the relevant messaging,” he explains.

Sensor fusion

Dr. Stephan Appt, partner at Pinsent Masons Germany, offers his perspective on how, through sensor fusion, the combined view of objects from different sensor types enables the vehicle to progress from sensing to perception.

He says: “Perception is the interpretation of sense data so any system that processes sensor data and performs some action based on the analysis already has ‘perception’. As systems become more autonomous, more control is given to the perception system and more demands are made from it. Sensor fusion allows more sophisticated and reliable decisions to be made. More sources of sensor data are very desirable in autonomous systems. And again, from a liability point of view, manufacturers might need to use sensor fusion in order to comply with what can be expected from them in terms of safeguards for avoiding errors in perception causing damage to property or bodily harm to humans.”
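A minimal illustration of why fusing sensors makes decisions more reliable is inverse-variance weighting, sketched below in Python. The ranges and variances assigned to the radar and camera are invented for illustration, not measurements from any real sensor:

```python
# Inverse-variance fusion of two noisy range estimates of the same object.
# The radar and camera figures below are invented for illustration.

def fuse(estimate_a: float, var_a: float,
         estimate_b: float, var_b: float) -> tuple[float, float]:
    """Return the fused estimate and its variance (never larger than either input)."""
    w_a, w_b = 1.0 / var_a, 1.0 / var_b
    fused = (w_a * estimate_a + w_b * estimate_b) / (w_a + w_b)
    return fused, 1.0 / (w_a + w_b)

radar_range, radar_var = 52.0, 0.25   # radar: accurate range (m, m^2)
camera_range, camera_var = 49.0, 4.0  # camera: noisier monocular range estimate

fused_range, fused_var = fuse(radar_range, radar_var, camera_range, camera_var)
print(f"fused range ~{fused_range:.1f} m, variance {fused_var:.2f} m^2")
# The fused variance (~0.24 m^2) is lower than either sensor's alone, which is
# the sense in which fusion supports more sophisticated and reliable decisions.
```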

Advances in resolution

With regard to advances in the resolution of automotive sensors, especially radar, he comments: “Meta-materials and advances in manufacturing are enabling new and improved sensors to be created, and in real-time systems like autonomous vehicles temporal resolution is also an important factor.” He says that many of today’s sensors offer only a low bandwidth, which means that they suffer from a high degree of latency. Automotive manufacturers will, in his opinion, therefore need to ensure that the bandwidth and latency are sufficient for the safe operation of automated vehicles. Only then will they achieve the kinds of approval required to enable them to avoid liability.
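The latency concern is easy to quantify. The short sketch below uses assumed end-to-end latency figures (not measurements of any particular sensor) to show how far a vehicle moves between a sensor capturing data and the system acting on it:

```python
# Distance travelled during end-to-end sensing latency. The latency values are
# illustrative assumptions, not figures for any specific sensor or vehicle.

def distance_during_latency_m(speed_kmh: float, latency_ms: float) -> float:
    return (speed_kmh / 3.6) * (latency_ms / 1000.0)

for latency_ms in (50, 100, 300):
    d = distance_during_latency_m(130, latency_ms)
    print(f"{latency_ms} ms at 130 km/h -> {d:.1f} m covered before the data is acted on")
# 50 ms -> 1.8 m, 100 ms -> 3.6 m, 300 ms -> 10.8 m
```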

Guichard adds: “An excellent vision or perception of the vehicle’s environment is essential for the safe deployment of AD vehicles, which can be achieved through fusion by using different types of sensors.” Each of the sensors has its own strengths and weaknesses, and that’s why a combination of them is needed to achieve functional safety.

“It is not enough to have in-vehicle sensors that are doing all of the sensing, because they have to be organised in a more interactive way, so that the data stored in the cloud can be re-used to train the sensing algorithms,” stresses Flament. He says this is already being done, but in a very limited fashion. However, he adds, “We can do more by intelligently training the fusion algorithms through large-scale data collection.”

Trusting the data

Yet there is a need to understand how the ability of a vehicle to process complex visual data will yield far more accurate detection, classification and localisation. So Guichard asks: “Do they have enough detection systems today?” In response to his own question he says there are enough sensors, but there is a need for vehicle-to-vehicle communication to improve safety by allowing each vehicle to learn about and address complex or hidden situations. Questions nevertheless arise about cyber-security, as vulnerabilities could lead to vehicles being hacked and accidents subsequently being caused.

For this reason he believes that “manufacturers are likely to be more confident with their own data rather than with data that is coming from the cloud or other vehicles”. Trust is also a question of timing: it can only be achieved if the information gleaned from the sensors of each vehicle can be collated, verified and analysed securely in real-time.

“I can imagine that the usual security and safety measures related to connectivity will be in place but the regulators will probably have a view on this, and yet I’m not much aware of the regulator for the automotive sector having a final opinion on the type of connectivity and the security protocols,” he claims.

He concludes that manufacturers and systems suppliers are therefore able to be creative in how they secure sensitive and valuable data, how they guarantee privacy, and how they still generate and use the available information to create safer, environmentally friendlier and more secure mobility. So there are still some questions to be answered and issues to be addressed to improve the sensing and perception capabilities of automated vehicles before they can regularly be part of today’s roads.
