Hermann Hänle
September 9, 2019
Automotive

Autonomous eyes

“Oh, look – there’s an ice-cream man. Ah, look out …” Crash. Bang. There are 5 accidents on German roads every minute – a total of 2.6 million accidents a year – according to Generali’s 2018 Collision Atlas. Usually it’s just the car itself that suffers. Could that also happen with autonomous cars?

Humans: The main risk factor

“Of course not,” is the generally accepted answer. Studies and forecasts attempting to prove autonomous vehicles’ contribution to traffic safety are already piling up. McKinsey, for instance, predicts a 90 percent reduction in all car accidents, which will help to avoid millions of euros’ worth of damage.

Learning to see

However, for this vision to become a reality, vehicles will first have to learn to see properly. A variety of sensor “eyes” are already integrated into most modern cars; typical examples are radar, lidar, and traditional optical cameras. Their “vision” becomes most apparent to me when I’m driving into the garage or into a tight parking bay. Without fail, the beeping starts.

In a Level 5 car, however, these eyes will also have to be alert during travel – at least we can be sure that they won’t be looking for the next ice-cream man. But they must be trained properly. After all, sensor technology is just part of seeing. Just as humans process optical impulses from the eye in their brain, the car needs on-board intelligence to process the captured data and evaluate the current situation.

Good training is half the battle

It sounds easy, but it isn’t: the sensors capture a heap of data during travel and have to keep every nearby object – at least within their field of view – in sight. The car must not only identify these objects but also know their real distances and motion vectors, so it can assess whether a conflict with an object is imminent. Object identification can be trained before travel; during travel, processed radar or lidar data then comes together with this scene recognition.
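As a rough illustration of what “assessing a conflict” from distance and motion vectors could look like – a minimal sketch, not the system any manufacturer actually ships – here is a closest-approach check in Python. The function name, the 2-D relative coordinates, and the conflict radius are all illustrative assumptions:

```python
import math

def time_to_collision(rel_pos, rel_vel, radius=2.0):
    """Estimate the time of closest approach to an object and check
    whether it comes within a conflict radius.

    rel_pos, rel_vel: (x, y) position [m] and velocity [m/s] of the
    object relative to the ego vehicle. radius: illustrative safety
    margin in meters. Returns (time, min_distance) if a conflict is
    predicted, otherwise None."""
    px, py = rel_pos
    vx, vy = rel_vel
    speed_sq = vx * vx + vy * vy
    if speed_sq < 1e-9:
        return None  # no relative motion, distance stays constant
    # Time at which the ego-object distance is minimal (from setting
    # the derivative of |pos + t * vel|^2 to zero).
    t_min = -(px * vx + py * vy) / speed_sq
    if t_min < 0:
        return None  # closest approach lies in the past: object receding
    # Relative position, and hence distance, at that moment.
    dx, dy = px + vx * t_min, py + vy * t_min
    min_dist = math.hypot(dx, dy)
    return (t_min, min_dist) if min_dist < radius else None

# An object 20 m ahead, closing head-on at 10 m/s, reaches the ego
# vehicle after 2 seconds:
print(time_to_collision((0.0, 20.0), (0.0, -10.0)))  # (2.0, 0.0)
```

A real system would of course work with noisy, fused sensor estimates and uncertainty, not with exact vectors – but the underlying geometry is the same.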

The demands placed on such a detection system can, of course, vary. In the Australian Outback with its small volumes of traffic, the key is identifying dingoes, kangaroos, or crocodiles crossing the road. In a German or French town, other motorists play a much more significant role. Not to mention cities like London, Paris, Frankfurt – or my favorite example, Stuttgart. What’s more, it’s apparently the German city-states of Berlin, Hamburg, and Bremen that are the “leaders” in terms of accident density. Around one in six cars has an accident there each year.

It’s relatively easy to evaluate longitudinal traffic – anything moving parallel to my direction of travel. Evaluating traffic that crosses my path, or curving movements, is more difficult. Either way, reliable sensors and situation evaluation are an essential factor in autonomous driving. Automotive manufacturers are working hard on improving their cars’ vision, and the key is artificial intelligence: it can identify “scenes” and assign captured image elements to objects during travel. Training object recognition requires not only heaps of real data, but also powerful computing capacity.
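The distinction between longitudinal and crossing traffic can be made concrete with a little vector geometry. This sketch – names, tolerance angle, and the flat 2-D model are my own simplifying assumptions – classifies an object’s motion by the angle between its velocity and the ego vehicle’s heading:

```python
import math

def classify_motion(obj_vel, ego_heading, tol_deg=30.0):
    """Classify an object's motion relative to the ego vehicle.

    obj_vel: (x, y) velocity of the object [m/s].
    ego_heading: unit (x, y) vector of the ego vehicle's direction.
    Returns 'longitudinal' if the object moves roughly parallel to us
    (same or opposite direction, within tol_deg degrees), 'crossing'
    otherwise, or 'stationary' for negligible speeds."""
    speed = math.hypot(*obj_vel)
    if speed < 0.1:
        return "stationary"
    # Cosine of the angle between the object's velocity and our heading.
    cos_a = (obj_vel[0] * ego_heading[0] + obj_vel[1] * ego_heading[1]) / speed
    angle = math.degrees(math.acos(max(-1.0, min(1.0, cos_a))))
    if angle < tol_deg or angle > 180.0 - tol_deg:
        return "longitudinal"  # parallel or oncoming traffic
    return "crossing"          # cuts across our path

# Ego vehicle heading "north"; a car driving alongside vs. one
# crossing from the left:
print(classify_motion((0.0, 5.0), (0.0, 1.0)))  # longitudinal
print(classify_motion((5.0, 0.0), (0.0, 1.0)))  # crossing
```

Curving movements are harder precisely because this angle changes over time, so a single velocity snapshot is no longer enough – the system has to track and predict the object’s trajectory.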

Da capo – and better, of course 

Once the intelligence has been trained, new tests can be run to optimize driver assistance systems. These don’t necessarily have to take place on the road: if desired, they can be virtual, so that they put neither the car nor other road users at risk. The basis is a platform that can resimulate complex driving situations for driver assistance systems at the touch of a button – like a virtual non-crash test.

At the Daimler EDM CAE Forum, we exhibited such a data analysis and resimulation system. Interested? Then please drop me a note.
