The big question is how to get there. In their plans to support large-scale programmes such as autonomous driving, the design-automation companies are naturally keen to assist, but they see their contribution as resting on much more work in the virtual space.
“No-one is going to drive 8 to 10 billion miles [in a physical car],” Tony Hemmelgarn, president and CEO of Siemens PLM Software, claimed at the Design Automation Conference (DAC) in Las Vegas in early June. He pointed to the progress made so far in what is meant to be a rapidly advancing space. “One article said Waymo is in the lead because it had done 9 million.”
A big problem for autonomous driving is that there are two sides to the test and verification problem. One is the behaviour of a machine when presented with a certain scenario. The other is generating a big enough set of scenarios to be sure that something important does not fall through the cracks and wind up killing the occupants or other road users. There are also clear risks in testing robotic software in live conditions, as a series of high-profile crashes in the US has demonstrated.
“Really it’s about how you leverage faster design using validated software. We can get to billions of miles by using software to analyse the edge cases that need to be assessed,” says Hemmelgarn.
Researchers such as Philipp Slusallek, scientific director of the German Research Centre for AI (DFKI), argue synthetic scenarios are vital for testing the machine-learning systems that will go into production vehicles. One of the big problems with putting real vehicles through real-world conditions is that, for the most part, those scenarios are pretty uneventful. If you want to test a system’s ability to react to the more dangerous situations, you need to create them. This is more readily done through simulation, with the help of sensor inputs captured from physical tests to ensure that what the AI ‘sees’ is representative of real-world conditions. This kind of approach is how companies such as Waymo are augmenting their verification regimes. By the time it had amassed 8 million miles of driving around the US, Waymo had put thousands of copies of its software through a total of 5 billion simulated miles.
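As a rough illustration of that scenario-led approach, the sketch below biases sampling towards rare, dangerous situations and checks a stand-in planner’s response to each. Everything here, from the scenario fields to the braking model, is a hypothetical placeholder rather than anything a real verification regime would use.

```python
import random
from dataclasses import dataclass

# Hypothetical sketch: stress a planner with synthetic edge-case scenarios.
# Real regimes replay captured sensor logs through the full driving stack.

@dataclass
class Scenario:
    name: str
    obstacle_distance_m: float   # how close the hazard appears
    obstacle_speed_mps: float    # closing speed of the hazard

def plan_brake_command(distance_m: float, closing_speed_mps: float) -> float:
    """Stand-in planner: returns requested deceleration in m/s^2."""
    if closing_speed_mps <= 0:
        return 0.0
    # v^2 / (2d): deceleration needed to stop before reaching the obstacle
    return (closing_speed_mps ** 2) / (2 * max(distance_m, 0.1))

def sample_edge_cases(n: int, seed: int = 42) -> list[Scenario]:
    """Bias sampling towards rare, dangerous cases, not uneventful ones."""
    rng = random.Random(seed)
    return [
        Scenario(
            name=f"cut-in-{i}",
            obstacle_distance_m=rng.uniform(2.0, 15.0),  # deliberately close
            obstacle_speed_mps=rng.uniform(5.0, 30.0),   # deliberately fast
        )
        for i in range(n)
    ]

MAX_BRAKING = 9.0  # m/s^2, roughly the grip limit of a road tyre

for sc in sample_edge_cases(5):
    demand = plan_brake_command(sc.obstacle_distance_m, sc.obstacle_speed_mps)
    verdict = "PASS" if demand <= MAX_BRAKING else "FAIL (unavoidable)"
    print(f"{sc.name}: needs {demand:.1f} m/s^2 -> {verdict}")
```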
The need for more extensive simulation in automotive and similar projects goes far beyond the long-term aim of autonomous vehicles. At the recent CDNLive EMEA conference, Hans Adlkofer, Infineon Technologies’ automotive systems vice president, explained how semiconductors have become the dominant means of differentiation in automotive. “Eighty per cent of the innovation in automotive design is now coming from the semiconductor space,” he argued. The trend towards electrification is helping to push that, as is the growing use of infotainment systems alongside driver assistance.
This increased semiconductor and microcontroller content has a knock-on effect: complexity. Adlkofer says there is a need to improve the way that designs are put together. “Traditionally, the OEM or tier one did the specification for the ECU [electronic control unit] and gave that to the ECU partner to implement,” he explains.
The problem now is that multiple ECUs are expected to cooperate within an overall automotive network. It takes too long to build prototypes to try out different management schemes and generate test profiles that can be passed down the supply chain for implementation. The ECUs, and multiple other subsystems, need to be designed together. This is where the concept of the simulated vehicle, or digital twin, comes in. The vehicle design lives as an interconnected set of models that are updated as changes are made and as hardware-in-the-loop evaluations of prototype subsystems reveal more about the design’s physical behaviour.
“We need a design flow that lives with the digital twin concept,” Adlkofer says. “We can have the car living in a digital form even if the parts are not ready so we can test all the different functions and all the different features in a virtual environment.”
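What that might look like in the abstract: a registry of subsystem models that can be composed into a virtual vehicle and swapped out individually as better data arrives. The sketch below is a minimal, hypothetical rendering of the idea; real digital-twin platforms couple far richer multi-domain models.

```python
from typing import Callable, Dict

# Hypothetical digital-twin registry: each subsystem is represented by the
# best model currently available, and models are swapped as prototype tests
# (e.g. hardware-in-the-loop runs) yield more accurate data.

class VehicleTwin:
    def __init__(self) -> None:
        self._models: Dict[str, Callable[[float], float]] = {}

    def register(self, subsystem: str, model: Callable[[float], float]) -> None:
        """Install or replace the model for a subsystem."""
        self._models[subsystem] = model

    def simulate_step(self, throttle: float) -> float:
        """Chain subsystem models: throttle -> motor torque -> wheel force."""
        torque = self._models["motor"](throttle)
        return self._models["driveline"](torque)

# Early behavioural placeholders, used before any hardware exists.
twin = VehicleTwin()
twin.register("motor", lambda throttle: 250.0 * throttle)       # idealised, Nm
twin.register("driveline", lambda torque: torque * 9.7 / 0.33)  # ratio / radius

print(f"wheel force @ 50% throttle: {twin.simulate_step(0.5):.0f} N")

# Later, a model fitted to hardware-in-the-loop measurements replaces the
# placeholder without the rest of the twin having to change.
twin.register("motor", lambda throttle: 250.0 * throttle * 0.92)  # measured losses
print(f"updated wheel force @ 50% throttle: {twin.simulate_step(0.5):.0f} N")
```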
Hemmelgarn says: “Many companies claim ‘we will reduce complexity in your process’. That’s a mistake. Complexity is not going to go away. What we tell our customers is learn how to use complexity as a competitive advantage: set up a digital representation of what you have so you can make decisions with confidence. That’s where the real value comes from.
“Customers now know they have to use this: ‘I have to be disruptive because other people could be running faster than me’.”
The digital twin
The concept of the digital twin itself has been around since 2003, when Michael Grieves, research professor at the Florida Institute of Technology, described the idea in a paper on the use of virtual factories to test production flows. However, most implementations have focused on material properties and mechanical behaviour. The inclusion of electronic control is a relatively recent phenomenon.
Joe Sawicki, executive vice president at Siemens’ EDA subsidiary Mentor, says the clearest example of this kind of digital-twin concept being put into action is in automotive. “Things that combine mechanical and electrical properties? In the short term that’s massively going to be automotive.”
Sawicki points out that the chip-design sector has embraced the core concepts behind the digital twin for many years, albeit under a different name. The late EDA industry analyst Gary Smith promoted the concept of the silicon virtual prototype before Grieves popularised the term digital twin. Because of the complexity of the systems-on-chip (SoCs) being designed into these products, specialised hardware such as emulators and FPGA prototyping systems is now routinely employed to make sure the simulations can run at reasonable speeds, letting software run much closer to real-world performance than workstation- or server-based simulation allows. “Emphatically we are big believers in digital twin with emulator for simulating large electronics systems,” Sawicki says.
Outside of automotive, the use of a full digital twin is less urgent. Sawicki sees more piecemeal adoption in other sectors, particularly those where there is a strong interaction between SoC behaviour and the physical environment.
Thermal effects, for example, can strongly influence performance and reliability. “Are you going to get thermal warpage?” asks Ansys chief strategist Vic Kulkarni. “What happens to the package and board or the solder joints as the system heats up?”
Kulkarni refers to the usage model as prescriptive analytics. In this, the engineering team develops a set of scenarios and simulates each of them across the electrical and mechanical domains. Ansys has bought a number of companies in recent years to expand the range of domains it can cover.
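In outline, prescriptive analytics amounts to a scenario loop: enumerate the operating cases up front, push each through coupled domain models and flag anything that breaks a limit. The sketch below gives the shape of such a loop using placeholder electrical and thermal models; the figures are invented and nothing here reflects Ansys’s actual tooling.

```python
# Sketch of a prescriptive-analytics loop: predefined scenarios are each run
# across coupled domains and checked against limits. All models and numbers
# are illustrative placeholders, not Ansys APIs.

AMBIENT_C = 25.0
T_JUNCTION_LIMIT_C = 125.0   # typical automotive junction-temperature limit

def electrical_power_w(load_a: float, rail_v: float) -> float:
    """Electrical domain: crude heat dissipation for a given load."""
    return load_a * rail_v * 0.03          # assume ~3% of power lost as heat

def junction_temp_c(power_w: float, ambient_c: float) -> float:
    """Thermal domain: steady-state temperature via a lumped resistance."""
    theta_ja = 1.5                          # degC per watt, placeholder
    return ambient_c + power_w * theta_ja

scenarios = [
    {"name": "cruise",      "load_a": 10.0, "rail_v": 48.0},
    {"name": "hill-climb",  "load_a": 35.0, "rail_v": 48.0},
    {"name": "stall-fault", "load_a": 80.0, "rail_v": 48.0},
]

for sc in scenarios:
    p = electrical_power_w(sc["load_a"], sc["rail_v"])
    tj = junction_temp_c(p, AMBIENT_C)
    verdict = "OK" if tj < T_JUNCTION_LIMIT_C else "OVER LIMIT"
    print(f'{sc["name"]}: {p:.1f} W dissipated, Tj {tj:.0f} degC -> {verdict}')
```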
As an example of that domain expansion, Kulkarni points to the acquisition of Optis. “It’s an optical simulation specialist. The tool will simulate the sun shining into your camera and calculate the different diffraction and refraction effects,” he says.
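A toy version of the degradation test that enables: sweep a glare level over a synthetic camera frame and find the point at which a stand-in lane detector loses the marking. Both the glare model and the ‘detector’ below are illustrative placeholders, not the Optis tool or any real perception system.

```python
# Illustrative degradation sweep: wash out a synthetic camera frame with
# increasing glare and find where a stand-in lane detector loses the lane.

def apply_glare(frame: list[float], glare: float) -> list[float]:
    """Push pixel values towards saturation as glare increases (0..1)."""
    return [min(1.0, px + glare * (1.0 - px) * 0.9 + glare * 0.1) for px in frame]

def lane_contrast(frame: list[float]) -> float:
    """Stand-in detector: contrast between lane marking and road surface."""
    return max(frame) - min(frame)

clean_frame = [0.2, 0.2, 0.9, 0.2, 0.2]   # bright lane marking on dark road
MIN_CONTRAST = 0.25                        # below this, assume detection fails

for step in range(0, 11):
    glare = step / 10.0
    contrast = lane_contrast(apply_glare(clean_frame, glare))
    if contrast < MIN_CONTRAST:
        print(f"detector fails at glare level {glare:.1f} "
              f"(contrast {contrast:.2f} < {MIN_CONTRAST})")
        break
else:
    print("detector survives the full glare sweep")
```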
Once the effects of different types of glare are calculated, they can be incorporated into models that support later work with virtual prototypes. That way, it becomes possible to see how well a control system copes as the input quality degrades, avoiding the need to recreate those conditions in the more hazardous physical world. Whether in the form of full system-level simulations or more targeted analyses that couple multiple tools together, product-design testing is going to call much more on the virtual world.