The new economics of verification
With verification dominating leading-edge chip design, what does the EDA industry have to offer?
Manufacturing used to dominate the cost of bringing a chip to market. Now, design and verification outweigh manufacturing costs by almost two to one. Chip design teams spend around three-quarters of total development time verifying the chip, so it is hardly surprising that verification accounts for the lion's share of development costs. Why does verification now dominate the development process?
If you design a chip that is six times more complex than your last, you will probably need around 100 times more simulation cycles to verify it. Verification effort grows far faster than design size because the state space grows exponentially: a design with n state bits can, in the worst case, reach 2^n states, so even a medium-complexity design has far too many states for complete coverage to be achievable. Verification is an unbounded problem, and managing it – achieving incredibly tough verification objectives within limited budgets and schedules – is a critical process that determines the overall economics of any chip development project.
To verify more intelligently, engineers must focus on 'high value' verification tasks: finding bugs as early as possible in the design process, using the fewest possible resources.
Measuring the value of a verification activity is a difficult, but important, step in managing the total cost of verification. Some studies suggest there is a multiplication factor of 10 at work in verification – a bug that costs $10k to fix in RTL will cost $100k to fix if it gets to layout, rising to $1 million if it gets past tapeout. Technologies or activities that help to avoid bugs, or to find bugs earlier in the process, have a high verification value (see figure 1).
Leading chip developers measure verification return on investment. Verification 'investment' is the combination of engineers, compute infrastructure and EDA tools needed to meet schedule and quality targets within the development budget. For most companies, the largest such investment is people: many deploy three verification engineers for every hardware designer, so increasing the productivity of verification engineers in key areas has a large verification value.
The growth in verification complexity has put tremendous strain on compute infrastructure for chip development. Compute farms used for verification often range from thousands to tens of thousands of servers, with annual operating and depreciation expense in the millions of dollars. Our customers' average compute infrastructure costs have grown by 30% per year over the past eight years, a trend that we expect to continue.
In addition to investing in people and hardware, chip companies invest in verification tools, IP and methodologies to increase the efficiency of their teams and IT infrastructure. Approaching verification intelligently means using the appropriate tools for each task.
Constrained random verification with SystemVerilog is now a mainstream technology that harnesses the power of compute farms to generate and run thousands of tests automatically, even tests that an engineer may never have considered. This has improved verification productivity dramatically over the past few years. Synopsys' VCS, for example, enables engineers to create effective testbench environments that use constrained random stimulus, functional coverage and assertions.
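As a flavour of what this looks like in practice, here is a minimal SystemVerilog sketch of the three ingredients such a testbench combines – constrained-random stimulus, functional coverage and an assertion. The class, signal and bin names are invented for illustration and are not drawn from VCS or any particular methodology library.

```systemverilog
// Minimal constrained-random sketch; all names are illustrative.
class bus_txn;
  rand bit [31:0] addr;
  rand bit [7:0]  len;

  // Constraints steer the random generator towards legal, interesting stimulus.
  constraint legal_addr   { addr inside {[32'h0000_0000 : 32'h0000_FFFF]}; }
  constraint short_bursts { len dist { [1:4] := 8, [5:255] := 2 }; }
endclass

module tb;
  bit clk, req, gnt;
  bus_txn txn = new();

  always #5 clk = ~clk;

  // Functional coverage: record which burst lengths were actually exercised.
  covergroup len_cg;
    coverpoint txn.len { bins single = {1}; bins burst[] = {[2:8]}; }
  endgroup
  len_cg cg = new();

  // Assertion: every request must be granted within four clock cycles.
  assert property (@(posedge clk) req |-> ##[1:4] gnt);

  initial begin
    repeat (1000) begin
      if (!txn.randomize()) $error("randomisation failed");
      cg.sample();
      // ... drive txn onto the bus interface here ...
      @(posedge clk);
    end
    $finish;
  end
endmodule
```

Each run of such a testbench explores a different slice of the state space, which is what lets a compute farm generate tests no engineer would have written by hand.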
Synopsys technologies for verifying low power designs that contain multiple voltage domains include VCS with MVSIM, a voltage-aware simulation engine that offers functional verification for all power management designs. Alongside these are MVRC, a voltage-aware static checker, and HSPICE and CustomSim for analysis of leakage power, floating nodes and dynamic IR drop.
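To make the behaviour these tools check more concrete, the hypothetical sketch below expresses two classic multi-voltage rules as SystemVerilog assertions – isolation must be enabled before a domain is powered down, and the isolated output must stay clamped while the domain is off. The module and signal names are invented for illustration; this is not how MVSIM or MVRC are actually driven.

```systemverilog
// Hypothetical multi-voltage checks; all names are illustrative.
module pwr_isolation_check (
  input logic clk,
  input logic core_pwr_on,       // power-switch enable for the 'core' domain
  input logic iso_en,            // isolation-cell enable
  input logic core_out_isolated  // domain output, seen through its isolation cell
);
  // Rule 1: isolation must already be enabled in the cycle power is removed.
  property iso_before_shutoff;
    @(posedge clk) $fell(core_pwr_on) |-> iso_en;
  endproperty

  // Rule 2: while the domain is off, the isolated output stays clamped low.
  property clamp_while_off;
    @(posedge clk) !core_pwr_on |-> (core_out_isolated == 1'b0);
  endproperty

  assert property (iso_before_shutoff) else $error("isolation enabled too late");
  assert property (clamp_while_off)   else $error("output not clamped while powered down");
endmodule
```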
For enhanced performance, Synopsys' VCS multicore technology reduces verification time by running the design, testbench, assertions, coverage and debug in parallel on multicore compute platforms. VCS supports both design-level and application-level parallelism.
FPGA-based rapid prototyping and debug provide many verification teams with higher productivity and lower infrastructure cost for hardware/software verification and 'at-speed' system validation. Synopsys' FPGA-based rapid prototyping accelerates the validation of FPGAs and ASICs. Its HAPS and CHIPit prototyping systems suit IP and SoC design and verification teams that want to find 'corner case' hardware bugs or to start software development and integration before silicon is available.
Synopsys' functional verification tools are integrated technologies that, when applied intelligently, allow designers to find bugs quickly and easily, significantly improving the quality of the most complex designs and enabling first-pass silicon success.
Verification projects often involve millions of lines of code, which means today's verification process is beginning to look more like a major software development effort. Hardware verification teams should examine software industry best practices.
The software industry leads the way in its understanding of process maturity. Because many of its tasks depend on human input, it is standard practice to measure and improve metrics such as productivity per engineer, which has led the industry to focus on writing better code with fewer bugs. This 'correct by design' process is more mature than its counterparts in the hardware domain.
Software teams also spend more time reviewing code than hardware teams do. They optimise their development infrastructure by measuring compile and build times, and they track the time it takes an engineer to check in code and complete regressions. Software teams have metrics in place that let them identify and fix productivity bottlenecks.
During the last few years, Synopsys has collaborated with major chip and system vendors working on leading-edge designs in several industries. A key element of this collaboration is an objective assessment of the expected value of new verification approaches, based on managing the customer's total cost of verification. Typically, 25 to 30 metrics are tracked for each project to build a better understanding of where bottlenecks and opportunities exist. Through this activity, enormous progress has been made in understanding the relationship between different applications, methodologies and costs.
This activity has led Synopsys to work with several key customers to help them manage growing verification costs – both in terms of people and compute infrastructure. By taking some of the best practices from the software world and applying them to improve verification throughput, Synopsys is eliminating productivity bottlenecks. By looking at engineer deployment, processes and technology, Synopsys has helped some organisations save tens of millions of dollars.
In the future, this collaborative style of working will become more commonplace. Closer collaboration between chip developers and verification solution providers can help reveal where verification costs are incurred and help devise strategies for reducing them. EDA companies can then work with chip developers to customise their processes and technologies to maximise the return on their verification investment. While adding new features to point tools will make verification more productive, major cost reduction requires broader thinking across multiple verification domains and, sometimes, even customised solutions targeted at specific design categories.
There are many compelling verification technologies on the horizon. Which one will drive the next great wave in verification productivity? Which ones should EDA vendors and chip developers invest in? Only time will tell, but one thing is certain: the next great advance in verification will be driven by the economics of verification and brought to market through close collaboration between chip developers and their EDA partners.
Author profile:
Manoj Gandhi is senior vice president and general manager of Synopsys' Verification Group.