The design industry is finally evolving into its ultimate, ideal form. What will that look like? Manufacturers will have unified sets of tools to design, model, and simulate not only individual parts of a proposed electronic product (an integrated circuit, PCB, 3D-IC, or advanced package subsystem) but also the complete system (electrical, mechanical, and physical elements) working together as a whole.
How do we get there?
As more manufacturers of things that you carry, drive, fly, or float demand more sophisticated design tools, today’s design companies must transcend their electronic/electromechanical roots to support multiphysics. This encompasses the entire spectrum of physical phenomena including signal and power integrity, thermal dynamics, aerodynamics, fluid dynamics, and more.
Evolution of design tools
The evolution of design tools and the evolution of computing are inextricably intertwined, spiralling in tandem like twin strands of DNA. Each advance in computing demands more sophisticated automated design capability; every advance in automated design is utterly dependent upon advances in computational power.
During the first decades of computing, if engineers wanted to automate a design task, they had to do it themselves. While point tools were coming into play, some of the most popular engineering tools were text editors and spreadsheets. By the end of the 1970s, IC, board, and system design had become complex enough that building custom tools was impractical for all but a few companies. Commercial design tools were introduced in the early 1980s, including computer-aided design (CAD) software, computer-aided manufacturing (CAM) packages, and electronic design automation (EDA) tools for chips and boards.
CAD and CAM merged into CAD/CAM pretty early, a portent that a fully integrated design environment might one day be possible. Each of the domain-specific design disciplines was complex enough that they became industries of their own. Engineers contributed in their specialised areas, with the full system divided into components and reintegrated later. Assembling systems was relegated to manufacturing. But to model and simulate every element of a car from start to finish? Or a mobile communication device? Or even a radio-controlled toy airplane? The task was too vast.
The tools required were all built in isolation for their specific application domain. A manufacturer committed to automated design would need at least one toolset for mechanical systems, one for ICs, another for circuit boards, yet another for body panels (or hulls or cabinets or cases), and so on. No manufacturer had the time or resources to ensure these individual tools worked together.
Worse, critical elements of the design process such as simulation, modelling, verification, synthesis, and others were complicated enough to merit the founding of companies specialising in each. The design industry actually bifurcated a bit before it began to coalesce.
The past 20 years have seen that coming together. Market-leading design companies like Cadence have spent that time expanding their tool suites and ensuring that the constituent elements work together.
There is little choice but to evolve: customer requirements keep changing, and converging technologies demand co-design and co-optimisation, so tool vendors must keep pace. Manufacturers face immense pressure to reduce time to market (TTM) while creating reliable, environmentally safe products with ever-more complex and sophisticated components and subsystems.
The only way to face these TTM pressures is with more and better modelling, analysis, and verification, and fully concurrent and integrated design processes. It is no longer feasible to start with a discrete tool for chips and then move to another discrete tool for boards before graduating to a discrete tool for modelling system-level behaviour and still produce competitive products.
Manufacturers must see how a tweak in a chip will affect the board it will reside on, and what that change will mean for the behaviour of the system as a whole - all before any of those things has been physically instantiated. To accomplish this, all of the data required for design, analysis, simulation, and verification must be provided in an efficient, consistent format across every aspect of design.
Cadence, evolution, and multiphysics
Cadence’s history extends almost to the very beginning of the commercial EDA industry, and its growth has been driven by innovation in computational software. Cadence has grown both organically and through acquisition, adding a broader spectrum of design, simulation, analysis, and verification capabilities each time. Advances include significant innovations in computational software and massively parallel architectures to take advantage of the growing available compute power.
Multiphysics simulation has become a critical core competency for the intelligent system design process, where predicting real-world behaviour is paramount. After establishing a strong presence with the Clarity 3D Solver, Celsius Thermal Solver, and Clarity 3D Transient Solver, more recently Cadence acquired three computational fluid dynamics (CFD) companies: Numeca, Pointwise, and Future Facilities.
CFD extends computational software beyond the realm of electronics to encompass important behaviours of electromechanical machines and applications in other areas such as aerospace, automotive, and even molecular biology. Whether it’s airflow in aeronautics or blood flow in pharmaceutical drug exploration, the ability to model and analyse these fluids is a necessity.
These CFD models and analyses must be as accurate - as lifelike - as possible. The flow of air, water, and other fluids is profoundly complex, requiring incredibly sophisticated models and astounding computational power. Blood, for instance, is a non-Newtonian fluid whose characteristics are very different from those of water.
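To make the Newtonian/non-Newtonian distinction concrete, here is a minimal illustrative sketch (not Cadence code) using the standard power-law fluid model, in which apparent viscosity varies with shear rate. The parameter values for blood are assumed, representative fitting constants, not measured data.

```python
# Power-law model of apparent viscosity: mu = K * gamma_dot**(n - 1)
#   n = 1 -> Newtonian: viscosity is independent of shear rate (e.g. water)
#   n < 1 -> shear-thinning: viscosity falls as shear rate rises (e.g. blood)

def apparent_viscosity(shear_rate, K, n):
    """Apparent viscosity in Pa.s for a shear rate in 1/s (power-law model)."""
    return K * shear_rate ** (n - 1)

# Water is Newtonian (n = 1), so its viscosity is the same at any shear rate.
water_slow = apparent_viscosity(10.0, K=1.0e-3, n=1.0)
water_fast = apparent_viscosity(1000.0, K=1.0e-3, n=1.0)

# Blood is shear-thinning (assumed fit: K ~ 0.017, n ~ 0.708), so its
# apparent viscosity drops as shear rate increases.
blood_slow = apparent_viscosity(10.0, K=0.017, n=0.708)
blood_fast = apparent_viscosity(1000.0, K=0.017, n=0.708)
```

This is why a CFD solver cannot simply reuse water-flow settings for haemodynamics: the constitutive model itself changes, not just the constants.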
With Future Facilities, Cadence gained pioneering expertise in digital twins, which involves simulating entire systems so that they behave as they will when finally built in the real world. Thermal analysis of data centres, which requires accurate models of the heat dissipated by racks of servers and of airflow through buildings occupying millions of square feet, is a pressing need in the hyperscale computing era.
At the other end of the spectrum, accurate thermal analyses are also mandatory for the latest trend in semiconductors. Functions once integrated on the largest chips are now instead being distributed among multiple “chiplets”. Because they’re stacked in such a manner that dissipated heat can collect quickly, thermal effects must be understood properly if this trend is to progress.
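The chiplet thermal problem can be illustrated with a first-order series thermal-resistance model, a minimal sketch rather than how any particular tool works; the per-layer resistance values below are assumed for illustration.

```python
# First-order junction-temperature estimate for a series thermal path:
#   T_junction = T_ambient + P * sum(R_th)
# Each layer between the heat source and the heat sink adds thermal
# resistance, so a chiplet buried in a stack runs hotter than a single die
# dissipating the same power.

def junction_temperature(power_w, resistances_c_per_w, ambient_c=25.0):
    """Junction temperature (C) for power (W) through series resistances (C/W)."""
    return ambient_c + power_w * sum(resistances_c_per_w)

# Assumed per-layer thermal resistances (C/W): die, interface, package, sink.
single_die = junction_temperature(5.0, [0.5, 1.0, 2.0])           # 42.5 C
# The same 5 W dissipated under two extra stacked layers runs hotter.
stacked = junction_temperature(5.0, [0.5, 0.8, 0.5, 1.0, 2.0])    # 49.0 C
```

Real 3D-IC thermal analysis is far more involved (lateral spreading, transient behaviour, coupled airflow), which is exactly why dedicated solvers such as Celsius exist, but the series model shows why stacking makes heat harder to extract.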
Cadence’s most recent acquisition is OpenEye Scientific, which created molecular modelling and simulation software used in the pharmaceutical and biotechnology industries for drug discovery.
Molecular modelling is well known in the medical field, but it is also helping materials science solve a wide range of challenges, from creating more efficient batteries to formulating lightweight, radiation-hardened materials for satellites.
A path forward
Cadence remains a leader in EDA, but it has expanded far, far beyond its roots, travelling much further down the evolutionary path than anyone else in the design industry to become a computational software company.
Cadence is the first - and, as of today, the only - company to provide a complete chip-to-system design solution including the ability to simulate and analyse designs at every stage from conception to production.
There is still plenty of room for ongoing innovation in design tools, and integrated tool suites for electromechanical design are imperative. Soon it will be possible to design and simulate all aspects of the most advanced systems - from aircraft to medical systems to consumer electronics - within a unified design environment.
Author details: KT Moore, vice president, Corporate Marketing, Cadence Design Systems