Humans know that teenagers waiting on a corner may cross against the light, while a woman pushing a child in a stroller is more likely to obey the signal. Christie Schweinsberg reports.

Automakers and suppliers working to bring Level 5 autonomous capability to fruition may have a good handle on how their vehicles will perceive and plan, but a third “p” element is proving tougher.

Prediction, that is, anticipating what drivers, pedestrians and others on or near roadways will do, is a current AV-development hang-up, says a top Toyota official.

“It’s the prediction piece that’s still the great unknown,” Gill Pratt, CEO of Toyota Research Institute, tells WardsAuto in an email interview. “Humans are very good at predicting human behaviour on the road. Machines will need to be able to predict and anticipate human behaviour much better.”

Pratt has spoken previously about how humans intuit that teenagers waiting on a corner may cross a street against the light, while a woman pushing a child in a stroller is more likely to obey the signal.

“Automated driving in most cases is actually pretty easy to accomplish, but it’s managing the situational ‘corner cases’ that present the biggest challenge,” he says, adding that Toyota has a “tremendous amount of work” ahead to develop the artificial intelligence necessary to handle the “infinite” scenarios a Level 5 AV will encounter.

Insufficient prediction capability also will become an issue in the controversial Level 3 stage of autonomy, when a vehicle should be able to handle most scenarios but a human driver still needs to be ready to resume control should a tricky situation arise.

This article first appeared in WardsAuto.
