Hermann Hänle
September 9, 2019

Autonomous eyes

“Oh, look – there’s an ice-cream man. Ah, look out …” Crash. Bang. There are five accidents on German roads every minute – a total of 2.6 million accidents a year, according to Generali’s 2018 Collision Atlas. Usually it’s only the car itself that suffers. Could the same happen with autonomous cars?

Humans: The main risk factor

“Of course not,” is the generally accepted answer. Studies and forecasts attempting to prove autonomous vehicles’ contribution to traffic safety are already piling up. McKinsey, for instance, predicts a 90 percent reduction in all car accidents, which would avoid millions of euros’ worth of damage.

Learning to see

However, for this vision to become reality, vehicles will first have to learn to see properly. A variety of sensor “eyes” are already integrated into most modern cars; typical examples are radar, lidar, and traditional optical cameras. Their “vision” becomes most apparent to me when I’m driving into the garage or into a tight parking bay. Without fail, the beeping starts.

In a Level 5 car, however, these eyes will also have to be alert during travel – at least we can be sure that they won’t be looking for the next ice-cream man. But they must be trained properly. After all, sensor technology is just part of seeing. Just as the human brain processes optical impulses from the eye, the car needs on-board intelligence to process the captured data and evaluate the current situation.

Good training is half the battle

It sounds easy, but it isn’t: the sensors capture a heap of data during travel and have to keep every object nearby – at least within their field of view – in sight. The car must therefore identify nearby objects and also know their real distances and motion vectors in order to assess whether it is on a collision course with them. Object identification can be trained before travel; during travel, processed radar or lidar data then comes together with this scene recognition.
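To make the “distances and motion vectors” step concrete, here is a minimal sketch of one common way to assess a conflict: computing the closest point of approach of an object moving with constant relative velocity. The function names, the 2 m safety radius, and the 5 s planning horizon are illustrative assumptions, not part of any production system described in the article.

```python
import math

def closest_approach(rel_pos, rel_vel):
    """Time and distance of closest approach for an object moving
    with constant relative velocity (straight-line assumption)."""
    px, py = rel_pos
    vx, vy = rel_vel
    v2 = vx * vx + vy * vy
    if v2 == 0.0:                      # no relative motion
        return 0.0, math.hypot(px, py)
    # minimise |p + t*v| over t >= 0
    t = max(0.0, -(px * vx + py * vy) / v2)
    return t, math.hypot(px + t * vx, py + t * vy)

def conflict(rel_pos, rel_vel, safety_radius=2.0, horizon=5.0):
    """Flag a conflict if the object comes within safety_radius
    metres of the ego vehicle within the planning horizon (seconds)."""
    t, d = closest_approach(rel_pos, rel_vel)
    return t <= horizon and d < safety_radius

# A pedestrian 20 m ahead and 3 m to the side, drifting towards our
# lane at 1 m/s while we close at 5 m/s: flagged as a conflict.
print(conflict((20.0, 3.0), (-5.0, -1.0)))   # → True
# A parked car 20 m ahead but 5 m to the side, no relative motion.
print(conflict((20.0, 5.0), (0.0, 0.0)))     # → False
```

In a real stack, the relative position and velocity would come from fused radar/lidar tracks rather than hand-typed tuples, and the straight-line assumption would be replaced by predicted trajectories.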

The demands placed on such a detection system can, of course, vary. In the Australian Outback with its small volumes of traffic, the key is identifying dingoes, kangaroos, or crocodiles crossing the road. In a German or French town, other motorists play a much more significant role. Not to mention cities like London, Paris, Frankfurt – or my favorite example, Stuttgart. What’s more, it’s apparently the German city states of Berlin, Hamburg, and Bremen that are the “leaders” in terms of accident density. Around one in six cars has an accident there each year.

It’s relatively easy to evaluate longitudinal traffic – anything moving parallel to my own direction of travel. Evaluating traffic that crosses my path, or follows a curved trajectory, is harder. Either way, reliable sensors and situation evaluation are essential for autonomous driving. Automotive manufacturers are working hard to improve their cars’ vision, and the key is artificial intelligence: it can identify “scenes” and assign captured image elements to objects during travel. Training object recognition requires not only heaps of real data, but also powerful computing capacity.
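The longitudinal-versus-crossing distinction above can be sketched as a simple geometric check on a tracked object’s velocity vector relative to the ego heading. The 30° tolerance, the 0.1 m/s static threshold, and the label names are my own illustrative assumptions.

```python
import math

def classify_motion(ego_heading_deg, obj_velocity, tol_deg=30.0):
    """Label an object's motion relative to the ego heading:
    'longitudinal' if roughly parallel (same direction or oncoming),
    'crossing' otherwise, 'static' if barely moving."""
    vx, vy = obj_velocity
    if math.hypot(vx, vy) < 0.1:       # below ~0.1 m/s: treat as static
        return "static"
    obj_heading = math.degrees(math.atan2(vy, vx))
    # smallest angle between the two headings, in [0, 180]
    diff = abs((obj_heading - ego_heading_deg + 180.0) % 360.0 - 180.0)
    if diff <= tol_deg or diff >= 180.0 - tol_deg:
        return "longitudinal"
    return "crossing"

# Ego heading 0 degrees (east): a car ahead also heading east,
# an oncoming car, and a cyclist cutting across the road.
print(classify_motion(0.0, (13.0, 0.5)))   # → longitudinal
print(classify_motion(0.0, (-12.0, 0.0)))  # → longitudinal (oncoming)
print(classify_motion(0.0, (0.3, -4.0)))   # → crossing
```

Curved movements, which the article calls out as harder, would need the velocity re-evaluated over several track updates rather than a single snapshot.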

Da capo – and better, of course 

Once the intelligence has been trained, new tests can be initiated to optimize driver assistance systems. And these don’t necessarily have to take place on the road. If desired, they can be designed to be virtual, so that they don’t put the car (or other road users) at risk. The basis is a platform that can provide complex driving situations for resimulation for driver assistance systems at the touch of a button – like a virtual non-crash test.

At the Daimler EDM CAE Forum, we exhibited such a data analysis and resimulation system. Interested? Then please drop me a note.
