The integration of data and tools still needs improvement, but faster multi-physics simulation is crucial for optimization and reliability.
Heterogeneous chip integration, advanced packaging, and the increasing digitalization of many industries are pushing digital twins to the forefront of design. The challenge with these complex components is to work out the potential trade-offs between different chiplets and different packaging approaches, and to make decisions quickly enough to hit the market window. This includes the obvious trade-offs of power, performance, and area/cost, as well as the impact of mechanical engineering and of various manufacturing processes on different materials, and the need to consider both the global picture and the fine details when designing different parts.
Cadence's Senior Director of Multi-Physics System Analysis Product Management, Sherry Hess, said: "Think about the transformation that mobile phones have brought to our lives, and the popularity of electric vehicles today. These products are not only masterpieces of electronic engineering, but also of mechanical engineering. Whether it's foldable phones or flying cars, electronic products are taking on new forms. Here, engineering fields must coexist and collaborate to create the best possible final product."
All of the major EDA companies have adopted multi-physics as a methodology that extends beyond the chip, with a focus on large-scale simulation. "What about drop tests, aerodynamics, and aeroacoustic effects? These are largely computational fluid dynamics and mechanical multi-physics phenomena that must also be considered," Hess said. "So, how does a drop test affect electrical performance? The electronic world and its diverse range of end products are pushing us beyond pure electrical engineering, toward heterogeneous products and engineering teams with a broader perspective."
Coupling physical processes and data across the user's ecosystem is known as chaining. Emmanuel Leroy, Chief Product and Technology Officer of Keysight ESI Group, said: "In automotive design, chaining welding to crash simulation means that we can simulate the welding process very finely, vary different parameters, and carry the simulation results into the crash analysis to determine whether the welds will fail in a collision. This lets us adjust and optimize the type of welding, the process parameters, and the number of spot welds we want to use. It breaks down silos within the OEM organization and introduces the concept of concurrent engineering, such as how we combine manufacturing and engineering."

Many industries, such as automotive, have long used digital twins to build mechanical systems. "What has changed is that people are turning all of these into products driven by electronics," said Marc Serughetti, Vice President of Product Management and Application Engineering at Synopsys. "They use electronics because it is more efficient, safer, or upgradable from an energy perspective. All of these products are moving toward electronics and, more importantly, toward software-defined products. That allows them to change functionality and introduces new momentum in a brand-new business model. This is the market trend: software-defined products. Therefore, considering digital twins from this perspective, if all of this is being done with electronics, why not add a digital twin of the electronics as well? How you introduce these things is very important. All of these belong to a control-systems market, in which you are observing something in the physical world and trying to control it. If you want to verify or understand what is happening, that means you must simulate the electronics in a system context."

Digital twins have also opened the door to architectural exploration, where engineering teams can inspect multiple elements simultaneously, understand how they operate in concert, and see what would happen if part of the architecture were changed.
"The last time we used a physical product in EDA was with Rubylith, but since then everything has become virtual and digital," said Neil Hand, Director of Product Marketing at Siemens EDA. "We have evolved from domain-specific digital twins to more inclusive ones, and we have established connections in manufacturing on an ever-larger scale. Now we have chip-level digital twins, followed by 3D-IC, which begins to introduce mechanics and thermals into the mix, and we will move toward product-level twins. Trade-offs are always present. Consider a scenario that is common today: someone wants to make a custom semiconductor for a unique application. You are a systems company, you have all the money in the world, and you are quite patient. The architecture is in place, but you still need to write a specification to hand over to the IC group. Everyone works their way down the 'V', making their best effort. Then you put the pieces back together and hope it works. Once we have more interconnected, fully interoperable digital twins, you can start making trade-offs."
This is a massive shift driven by an enormous amount of high-quality data and significantly enhanced computing power. "It will be an evolution, but also a revolution, because now we must introduce electronics, and electronics suddenly account for 50% of the product," said Serughetti. "You cannot ignore this part, and it is crucial to connect these two worlds in a single system. We have seen this in the automotive industry. Today everyone is talking about cars, but it is also happening in aerospace and defense, as well as in the industrial sector. In electronics, there are three things you do. You are controlling something, you are sending information, or you are conveying information to someone through the UI. These are the basic functions, and they must be done safely. In that case, how do you verify all of these things, especially the electronic components?"
Digital Twins and Shift Left
EDA companies unanimously agree that digital twins are essential for progress in multiple industry sectors. "You must have digital twins," said John Ferguson, Director of Product Management at Siemens EDA. "Without them, you are blindly trying to achieve your goals, and without a shift-left strategy (identifying and fixing errors early in the development process rather than discovering them during post-release testing), you cannot do this. They complement each other. When it comes to system design, especially in 3D, and involving multiple physical domains in 3D, it becomes particularly tricky because there are interdependencies between everything."
The basic concept here is not new. "We have been talking about digital twins for a while, and they used to be divided into aspects that could be handled independently, but now we can no longer do that," said Ferguson. "Thermal affects stress. Stress and heat affect electrical behavior. But electrical behavior, in turn, affects heat. This puts an additional burden on the digital twin; you cannot avoid looking at the world from this perspective, where you need to understand the intricate interactions and account for them in the process. It also means shifting left. It is no longer 'one and done'; you must consider everything together. That is the whole concept we are trying to unravel here."
Cadence's Hess cited the recent examples of high-performance HBM and AI data center expansion that have driven this. "These high-bandwidth memories have increased from a few layers to 12 layers. These layered electronic devices need power, and power generates heat. We need to understand the heat, so we must address the thermal integrity issues found in this process. But the electronic thermal issue is just the first domino in the interdependent chain. What if the power supply for these stacked devices causes thermal stress or warping? How does this lead to mechanical stress or even material fatigue as the operating temperatures of electronic devices continue to rise and fall? This is just one of many examples."
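The interdependence Hess describes, where power generates heat and heat in turn changes electrical behavior, is at its core a coupled fixed-point problem. A minimal illustrative sketch in Python (the thermal-resistance and leakage numbers are made up, not from any vendor's solver) shows why the two domains must be solved together rather than once each:

```python
# Toy electro-thermal feedback loop: leakage power rises with temperature,
# and temperature rises with total power, so neither can be computed alone.

def total_power(temp_c, dynamic_w=5.0, leak0_w=1.0, leak_coeff=0.02):
    """Total power: fixed dynamic part plus temperature-dependent leakage
    (linearized here for simplicity)."""
    return dynamic_w + leak0_w * (1.0 + leak_coeff * (temp_c - 25.0))

def steady_temp(power_w, ambient_c=25.0, theta_ja=8.0):
    """Junction temperature from a simple thermal-resistance model."""
    return ambient_c + theta_ja * power_w

def solve_coupled(tol=1e-6, max_iter=100):
    """Iterate power -> temperature -> power until the loop converges."""
    temp = 25.0
    for _ in range(max_iter):
        p = total_power(temp)
        new_temp = steady_temp(p)
        if abs(new_temp - temp) < tol:
            break
        temp = new_temp
    return temp, p

temp, power = solve_coupled()
print(f"converged junction temp: {temp:.2f} C at {power:.2f} W")
```

A single one-way pass (power at ambient temperature, then temperature from that power) would underestimate the final junction temperature here, which is exactly the kind of error a coupled digital twin is meant to avoid.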
Digital twins are also crucial for the adoption of chiplets, where different chiplets can be swapped to determine the impact on the behavior of a multi-chiplet system.
Alphawave Semi's Chief Technology Officer, Tony Chan Carusone, said: "New multi-chiplet designs are currently being developed for the next generation of AI accelerators, CPUs, and networking chips. These designs integrate the latest CMOS logic with memory and connectivity chiplets, sometimes along with additional peripheral chiplets, all in the same package. They push the limits of thermal dissipation, signal integrity, power integrity, mechanical reliability, and logic performance. Each factor can interact with the others, making design optimization challenging. For example, improving signal integrity may compromise mechanical stability, and reassigning logic between chiplets to improve performance may cause local heating issues that affect reliability."

It is worth noting that digital twins are part of a dynamic process. "This is not a single step," Ansys chief product manager Lang Lin points out. "In chip simulation, we obtain all the data from the simulation. But the manufacturing process is different. First you prepare the substrate and start soldering another substrate onto it, then raise the temperature to about 300 degrees to join the two chips together; after cooling down, the annealing process begins and the temperature drops to about minus 40 degrees, and then more chips are placed on top. A digital twin is a good concept because it can simulate each step one by one. It must capture the state at the end of each step and use it as the initial condition for the next step, which poses a challenge for our traditional simulation tools."
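Lin's step-by-step picture can be sketched as a simple state pipeline, where each process step consumes the previous step's end state as its initial condition. The step names, temperatures, and toy stress model below are illustrative assumptions, not a real assembly flow:

```python
# Sketch of a manufacturing digital twin as a sequence of steps, each
# starting from the captured end state of the previous one.

from dataclasses import dataclass, replace

@dataclass(frozen=True)
class AssemblyState:
    temp_c: float        # current temperature of the stack
    stress_mpa: float    # accumulated thermo-mechanical stress (toy number)
    layers: int          # chips placed so far

def run_step(state, name, target_temp_c, added_layers=0, cte_factor=0.05):
    """One process step: ramp to the target temperature and accumulate
    stress proportional to the temperature swing (a crude stand-in for
    CTE-mismatch effects)."""
    swing = abs(target_temp_c - state.temp_c)
    return replace(
        state,
        temp_c=target_temp_c,
        stress_mpa=state.stress_mpa + cte_factor * swing,
        layers=state.layers + added_layers,
    )

# Each step's initial condition is the previous step's end state.
state = AssemblyState(temp_c=25.0, stress_mpa=0.0, layers=1)
for name, temp, layers in [
    ("solder to substrate", 300.0, 1),   # high-temperature reflow
    ("cool down",            25.0, 0),
    ("anneal",              -40.0, 0),   # low-temperature phase per the quote
    ("stack next chip",     300.0, 1),
    ("final cool",           25.0, 0),
]:
    state = run_step(state, name, temp, layers)

print(state)
```

The point of the sketch is the handoff: dropping any intermediate state (for example, restarting each step from a stress-free condition) would lose the accumulated history that determines warpage and fatigue in the real stack.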
This is very different from simulating traditional mechanical products such as cars or airplanes. "You assemble the mechanical parts, the engine starts to operate, and then you check for warpage or mechanical failure," Lin said. "But that was the past, when we dealt with sizes at the millimeter, meter, or even multi-meter scale. Now we have shrunk to the nanoscale, which requires new mechanical modeling methods and new materials science to build solutions. The most advanced mechanical simulations today can reach the micrometer level. For example, when two chips are connected to each other, the bump pitch may be about 40 micrometers. You model the bumps or microbumps, thousands or even millions of them. By building the entire model at the micrometer level, you can see the connection problems. In the next five years, the problem will move inside the chip, where you need to examine the structure of Metal-1, Metal-2, Via-3, and the small transistors, which are at the nanoscale. We work closely with foundries to achieve this kind of mechanical simulation. This is completely at the forefront. Ultimately, you expect your simulation engine or graphical user interface to show the structural models of vias, through-silicon vias, and wires. You will see these structures."
EDA tools for digital twin/system co-design
To make all of this work, some improvements to EDA tools are needed. How do today's EDA tools achieve the full functionality of digital twin/system co-design?
Serughetti of Synopsys said that in this case, the first thing to consider is what is needed and what problems must be solved. "How great would it be if I had an ultra-fast RTL simulation that could run Android or another software stack and boot Android in 10 seconds? Unfortunately, that is not realistic," he said.
"We have been engaged in simulation work for many years and know that there are two paradigms: abstraction and performance. The two are not very compatible. If it is too accurate, it is not fast enough. If it is fast, it is not accurate enough. The range of technologies runs from RTL simulation, which has its own uses, to virtual ECUs, where the hardware can be abstracted almost down to just receiving CAN messages. That is a representation of the hardware, and because the representation is very abstract, the simulation can run very fast. But for an engineer trying to see whether there are problems with the CAN driver or the CAN interface, this is not good enough, not accurate enough."
The solution is better data integration. "There are a large number of interconnected technologies in this field," Serughetti said. "Therefore, the use case becomes extremely important. You will have people who focus on performance and power verification; they cannot verify on highly abstract models, and they need a simulation platform that can run enough software, fast enough, to verify. Another type of engineer focuses on testing application software in this environment; they do not need to care about the underlying hardware, so the electronics can be represented at a software level of abstraction."
To achieve all of this, various tools also need to work together.
"This is the key, because these things are complex," said Ferguson of Siemens. "We have no way to solve all the problems at once. You have to iterate. We must adopt multiple solutions each time, which is very challenging and very difficult. What should you do? How do you handle all the aspects you need to focus on, consistently and at the same time? It is very daunting."

Ferguson believes the semiconductor design ecosystem does a good job of identifying problems that have yet to be solved, but integration needs improvement. "How do we bring them together? How do we get everyone to agree to bring these things together? If everyone has a solution, but each solution gives a different answer, then we are at a disadvantage. Nevertheless, all EDA providers are integrating their solutions, but it is difficult to determine what the right answer is."
Typically, chip manufacturers compare test results with products that have been on the market for some time. "You've been using it for a while and you haven't seen any serious failures, but that doesn't mean it's accurate," says Ferguson. "Sometimes it's a very tricky situation. You have to determine what the gold standard is. You can measure the silicon in many different ways. You can measure its temperature, you can measure its stress, you can measure its electrical behavior. But all of these have inherent inaccuracies, because your chip's behavior fluctuates across the wafer, and you might end up with a batch of defective products. How do you know this is the chip that is going to fail, and not another chip on the same wafer or from a different wafer lot? We are all in this bewildering new stage, and the answers we get may not be completely different, but there are still differences. How do you decide which one to rely on? I don't know how we solve this kind of problem. In the end, it depends on your manufacturer. They decide what their reference or gold standard is, and you have to trust that they are doing the right job. Use any tool that meets their accuracy requirements and is certified, but still be prepared. We may have missed something along the way, and we may need to go back to the drawing board."
Big Picture Perspective
The chip industry recognizes the value of digital twins, especially in advanced nodes and heterogeneous components related to specific domains. The current challenge is to optimize all tools to work together at a faster pace while maintaining the same or better accuracy as in the past.
Keysight's Leroy said, "What we see from our customers is more collaboration between simulation and artificial intelligence. We often talk about hybrid AI and hybrid digital twins, as well as accelerating simulation, but the democratization of simulation actually comes from the hybrid of AI and simulation. We don't want to start from scratch as in the past, take big data, apply a lot of AI, and then no longer care about the physics. You want intelligent data, so you need the physics, and then you use data and AI to enhance it in order to make the right decisions."
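The hybrid approach Leroy describes, physics first with data and AI layered on top, can be illustrated with a toy example: a cheap physics model plus a small learned correction fitted to measurement data. Everything here (the model, the data, and the linear residual fit) is an illustrative assumption, not Keysight's method:

```python
# Hybrid physics + data sketch: keep the physics model and learn only the
# residual it cannot explain, instead of replacing it with black-box ML.

def physics_model(x):
    """Cheap first-principles estimate (e.g., analytic delay vs. load)."""
    return 2.0 * x + 1.0

# "Measured" data that the pure physics model does not fully explain.
xs = [0.0, 1.0, 2.0, 3.0, 4.0]
ys = [1.5, 3.9, 6.3, 8.7, 11.1]  # underlying trend: 2.4*x + 1.5

# Fit a linear residual correction r(x) = a*x + b by least squares.
residuals = [y - physics_model(x) for x, y in zip(xs, ys)]
n = len(xs)
mean_x = sum(xs) / n
mean_r = sum(residuals) / n
a = sum((x - mean_x) * (r - mean_r) for x, r in zip(xs, residuals)) / \
    sum((x - mean_x) ** 2 for x in xs)
b = mean_r - a * mean_x

def hybrid_model(x):
    """Physics prediction plus the learned data-driven correction."""
    return physics_model(x) + a * x + b
```

Because the correction is learned on top of a physical baseline, it needs far less data than a from-scratch model and degrades gracefully outside the training range, which is the practical argument for hybrid twins over pure ML surrogates.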
All of this is happening at the intersection of electrical, mechanical, and CFD simulation. Alphawave's Chan Carusone believes that once digital twin technology is fully implemented, it will be possible to examine complex chip designs holistically, enabling joint optimization of cost, power, performance, and reliability. But between now and then, a great deal of work remains to reach that goal.