Hermann Hänle
7 June 2019
Automotive

On-site taming of the data monster

The problem has long been familiar in science: researchers produce more data than can be processed. Famous examples are the big nuclear research centers, where computing resources run around the clock to find the needle in the nuclear-physics haystack. Evaluation projects have to join a virtual queue and wait until they are called up. Almost like at a government office.

Big data in vehicle development

In 2018 the global digital data volume was estimated at 33 zettabytes – that's a lot. If you burnt this amount of data to Blu-ray disks, the pile would be as high as two return trips to the moon. Quite an exercise in patience, and probably a nightmare for the makers of Blu-ray disks.
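For the number-minded, here is a quick back-of-the-envelope check of that pile – assuming single-layer 25 GB disks with a thickness of 1.2 mm, which are our own assumptions rather than figures from the article:
```python
# Rough sanity check of the Blu-ray pile (assumed: 25 GB disks, 1.2 mm thick).
data_volume_zb = 33                        # estimated global data volume, 2018
bytes_total = data_volume_zb * 10**21      # 1 ZB = 10^21 bytes
disk_capacity_bytes = 25 * 10**9           # single-layer Blu-ray
disk_thickness_mm = 1.2

disks = bytes_total / disk_capacity_bytes
stack_km = disks * disk_thickness_mm / 1e6     # mm -> km

moon_distance_km = 384_400                 # mean Earth-Moon distance
return_trips = stack_km / (2 * moon_distance_km)
print(f"{disks:.2e} disks -> stack of {stack_km:,.0f} km "
      f"(~{return_trips:.1f} return trips to the moon)")
```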
The development of automobiles – not just self-driving ones – is doing its bit to drive global data growth: an autonomous car, for example, produces three terabytes of data in an hour's drive. Google's Waymo cars alone, with their 16 million test kilometers on public roads to date, account for nearly a million terabytes – roughly one exabyte. Keyword: Big Data.
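The exabyte figure can be cross-checked just as roughly – the 3 TB per hour comes from the text above, while the average test speed of about 50 km/h is purely an assumption for illustration:
```python
# Rough cross-check of the "nearly one exabyte" figure for 16 million test km.
test_km = 16_000_000
avg_speed_kmh = 50                      # assumed average speed in mixed traffic
data_rate_tb_per_h = 3                  # per-vehicle data rate from the article

hours = test_km / avg_speed_kmh         # ~320,000 hours of driving
data_tb = hours * data_rate_tb_per_h    # ~960,000 TB
print(f"{hours:,.0f} h of driving -> {data_tb:,.0f} TB (~{data_tb / 1e6:.1f} EB)")
```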
The closer a car gets to its market launch, the more urgent physical driving tests become. Ultimately, driving a car is still a physical activity, not a virtual one – even if the car has a digital twin. This applies to "normal" automobiles as well, even though they don't produce quite as many bytes; plenty of data emerges all the same. But collecting data isn't even half the battle.
It's a bit like a children's room: wrecking it takes no time at all, but tidying it up again can take ages. The raw data alone may well be valuable (and confidential), but its true value only becomes apparent once it has been processed with the right models. And that requires huge computing capacity – otherwise the engineer will still be sitting at the computer until the end of time. Huge computing capacity: that cries out for the cloud.

Big data here, the cloud there

But for that to happen, the data must first be brought to the computing capacity in the cloud – a challenge that is generally overlooked. To keep things simple, let's stick with our example of the self-driving car that generates three TB an hour while "driving". If this data had to be transported to the cloud as it is produced, it would mean a sustained transfer rate of around 834 megabytes per second. That is quite a serious figure – especially when testing takes place in out-of-the-way locations to measure how the car performs in extreme heat and cold. Network connections north of the Arctic Circle are not particularly great.
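The 834 MB/s are easy to verify – the short sketch below simply converts three terabytes per hour into a sustained per-second rate (decimal terabytes assumed):
```python
# Required sustained uplink if 3 TB of test data per hour were streamed to the cloud.
data_per_hour_tb = 3
bytes_per_hour = data_per_hour_tb * 10**12
seconds_per_hour = 3600

rate_mb_s = bytes_per_hour / seconds_per_hour / 10**6
rate_gbit_s = rate_mb_s * 8 / 1000
print(f"{rate_mb_s:.0f} MB/s sustained, i.e. roughly {rate_gbit_s:.1f} Gbit/s")
```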

Edge computing leaps into the breach

If the mountain won't come to the prophet, the prophet has to go to the mountain: computing resources and evaluation models travel to the Arctic Circle, where the data is recorded – in a mobile data center. Moving computing capacity to the place where the data is generated in order to reduce latency is called edge computing. In individual cases, especially in many industrial IoT scenarios, edge computing can be a smart addition to cloud computing.
The edge data center takes on the job of processing the data and sends the results to the cloud backend. Later – once the tests are concluded – the test data can also be stored there, so that the OEM has permanent access to it. The benefits of edge computing are obvious: evaluation is dramatically accelerated, results reach the engineers more quickly, and, as a by-product, costs are reduced.
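As a rough illustration of this division of labour – not a real API, all paths, names and endpoints below are invented – the edge cluster could evaluate the raw logs locally and push only the compact results to the cloud backend:
```python
# Sketch of the edge-to-cloud split: process raw sensor logs locally,
# upload only the (much smaller) evaluation results. Names are illustrative.
from pathlib import Path

RAW_LOG_DIR = Path("/mnt/test_vehicle/raw")           # hypothetical local storage
CLOUD_BACKEND = "https://example-oem-cloud/results"   # placeholder endpoint

def evaluate_locally(log_file: Path) -> dict:
    """Run the evaluation models on the edge cluster, return compact results."""
    size_tb = log_file.stat().st_size / 10**12
    # ... real signal-processing models would run here ...
    return {"source": log_file.name, "raw_size_tb": round(size_tb, 3)}

def upload_result(result: dict) -> None:
    """Send only the evaluated result to the cloud backend (stub)."""
    print(f"POST {CLOUD_BACKEND}: {result}")

for log in sorted(RAW_LOG_DIR.glob("*.bin")):
    upload_result(evaluate_locally(log))
```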
At the Daimler EDM CAE Forum, we are presenting a worldwide network of superclusters that enables Big Data-based signal processing. Why not drop by and have a look?
