Models of freedom: How the most widely used EDA tool has developed in its 40-year history
Earlier this year, the Computer History Museum in California held a 40th birthday party for the SPICE analogue modelling simulator – the longest-lived and most widely used electronic design automation (EDA) tool in the world.
Whereas most EDA tools are confined to specific areas, such as printed circuit board (PCB) or integrated circuit (IC) design, just about every electronic engineer can make use of this very accurate analogue modelling environment – and probably has at some stage.
You can attribute some of the prevalence of SPICE to the reasons behind its development at the University of California at Berkeley and its subsequent distribution – factors that reflected the political concerns of the academics who created it.
Technically, the anniversary party came two years early; SPICE was not unveiled until April 1973. The Computer Museum's celebration commemorated instead a presentation made at the International Solid State Circuits Conference (ISSCC) by Professor Ron Rohrer. Building on analogue circuit simulation work he performed at Fairchild Semiconductor, resulting in the FairCirc simulator, Prof Rohrer described a tool with the unfortunate acronym of CANCER.
At the time, most simulation tools were developed in house, either by chipmakers or military contractors – who were the primary target market for ICs – and so included support for the simulation of radiation effects. Thinking of a wider audience for electronics tools, Prof Rohrer's simulator went a different way – the last two letters in the acronym standing for 'excluding radiation'.
Larry Nagel, one of the group who developed SPICE, wrote 25 years later: "The name CANCER was a brash statement that this program would never simulate radiation and was not funded by the defence industry. It was developed at Berkeley in the 1960s, remember!"
Rohrer used CANCER to teach students such as Nagel the fundamentals of circuit simulation. Nagel and his classmates developed a sparse matrix solver that massively increased the size of circuits that could be simulated in a reasonable time on an early-1970s mainframe.
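To give a flavour of what that solver does, the sketch below builds the sparse conductance matrix for a tiny, made-up resistor network and solves it for the node voltages: the linear-algebra kernel at the heart of nodal analysis. It uses Python and scipy purely for convenience; a real SPICE engine also stamps nonlinear device models and iterates with Newton-Raphson, which is not shown here.

```python
# Minimal sketch of the sparse nodal-analysis solve at the heart of a
# SPICE-style simulator. The three-node resistor chain and its element
# values are invented for illustration.
import numpy as np
from scipy.sparse import lil_matrix
from scipy.sparse.linalg import spsolve

num_nodes = 3                            # nodes 0..2, ground excluded
G = lil_matrix((num_nodes, num_nodes))   # conductance matrix
I = np.zeros(num_nodes)                  # current-source vector

def stamp_resistor(G, a, b, r):
    """Stamp a resistor between nodes a and b (None means ground)."""
    g = 1.0 / r
    if a is not None:
        G[a, a] += g
    if b is not None:
        G[b, b] += g
    if a is not None and b is not None:
        G[a, b] -= g
        G[b, a] -= g

stamp_resistor(G, 0, 1, 1e3)     # 1 kohm between node 0 and node 1
stamp_resistor(G, 1, 2, 2e3)     # 2 kohm between node 1 and node 2
stamp_resistor(G, 2, None, 1e3)  # 1 kohm from node 2 to ground
I[0] = 1e-3                      # 1 mA injected into node 0

# For a large circuit most entries of G are zero, which is why a sparse
# factorisation makes far bigger circuits tractable than a dense solve.
v = spsolve(G.tocsr(), I)
print("node voltages:", v)       # expected: [4. 3. 1.]
```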
As the development had effectively been paid for out of public funds, the team decided the best way to distribute the simulator was to release its source code to the public. Having renamed the rewritten CANCER with a less unattractive acronym, they charged only for the magnetic tape on which it was shipped. UC Berkeley later decided to use the BSD open-source licence for the SPICE code.
Although the original SPICE supported only bipolar transistors, the arrival at UC Berkeley of David Hodges from Bell Labs led to the inclusion of a MOSFET model. With that model came problems still seen in SPICE simulations today: failures of DC convergence and timesteps that become too short, making it impossible to find the derivative of a waveform.
The ability to fit custom device models into SPICE has proved one of the defining features of the simulator. The BSIM transistor model – short for Berkeley Short-channel IGFET Model – has been progressively refined to include parasitic effects that have emerged as IC-based devices have shrunk to the nanometre scale.
The decision to use open-source distribution did not stifle commercialisation; instead, it helped make SPICE the basis for analogue simulation in most circuit design tools. Kim Hailey, who cofounded Meta-Software, lent his name to the best-known commercial implementation – Hailey-SPICE, quickly contracted to HSPICE. Meta-Software convinced engineering teams to buy a commercial SPICE simulator by tuning the code to run on a microprocessor, rather than a minicomputer or mainframe. This opened up use of the simulator to a much wider range of users, most of whom did not have the time or inclination to do the port themselves. Other companies built their own tuned versions of SPICE, adding performance optimisations to help overcome the simulator's glacial slowness. However, HSPICE – now owned by Synopsys – remains the best known.
Even on fast multiprocessor workstations, performance remains the Achilles heel of SPICE. Developers have come up with subtle and creative ways to implement sparse-matrix solvers, but it is never going to be a quick process. And the sizes of circuits that engineers want to throw at SPICE have grown more quickly than the capabilities of the hardware. This first became a problem for design teams working on large regular circuits, such as memories. These devices are tuned to operate with very fine tolerances to voltage and process variations, so understanding how small perturbations can corrupt a read or a write is vital. The problem is that it is tough to simulate millions of storage cells feeding into a sense amplifier using conventional SPICE.
A similar problem arose in system-on-chip (SoC) design. Some of the more problematic blocks for an accurate circuit-level simulator are phase-locked loops (PLLs), charge pumps and voltage regulators. However, large numbers of these types of blocks have sprouted on chips targeted at deep submicron processes. Not only that, these blocks need to be simulated with some of the digital environment around them.
At the same time, engineers have come up with a variety of techniques to try to deal with the loss of voltage headroom and other problems. One answer is to simply move more analogue functionality into the digital domain and to do things such as store calibration and offset values in digital memories.
The idea of using digital calibration can be taken further. For example, it's possible not just to provide a single offset value digitally, but also to use a signal processor to linearise the output of an A/D converter dynamically, based on self tests. Or a network of small A/D and D/A converters might be used to bias op amps to keep them in their linear operating region.
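A minimal sketch of that idea follows: a hypothetical converter with offset and gain error is measured against two known reference levels during a self-test, and the resulting correction coefficients are applied digitally to every subsequent sample. The converter model, reference levels and error values are all invented for illustration; a real scheme would store the coefficients in on-chip memory and apply them in a signal processor.

```python
# Sketch of digital offset-and-gain calibration for an A/D converter.
# All numbers are assumptions chosen for illustration only.
import numpy as np

FULL_SCALE = 1.0                 # volts
BITS = 10
LSB = FULL_SCALE / (2 ** BITS)

def imperfect_adc(v, offset=0.012, gain=0.97):
    """Hypothetical converter with offset and gain error."""
    code = np.round((v * gain + offset) / LSB)
    return np.clip(code, 0, 2 ** BITS - 1)

def self_test(adc, v_lo=0.1, v_hi=0.9):
    """Convert two known references and derive correction coefficients."""
    c_lo, c_hi = adc(v_lo), adc(v_hi)
    gain_corr = (v_hi - v_lo) / ((c_hi - c_lo) * LSB)
    offset_corr = v_lo - c_lo * LSB * gain_corr
    return gain_corr, offset_corr

def corrected(adc, v, gain_corr, offset_corr):
    """Apply the stored digital correction to a raw conversion."""
    return adc(v) * LSB * gain_corr + offset_corr

g, o = self_test(imperfect_adc)
for v in (0.25, 0.5, 0.75):
    print(f"in={v:.3f} V  raw={imperfect_adc(v) * LSB:.3f} V  "
          f"corrected={corrected(imperfect_adc, v, g, o):.3f} V")
```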
Power consumption can improve with increasing amounts of digital logic, particularly as supply voltages scale down. Calculations by engineers at TSMC have shown that, as the required signal-to-noise ratio increases, the balance in terms of energy cost tips strongly in favour of digital assistance. However, the shift puts more stress on verification tools because of the need for more extensive mixed-language simulation. In some cases, it is simply easier to put a test chip on a fast-turnaround multiproject wafer and wait a couple of months for silicon to appear than it is to persevere with multi-day simulations that may simply crash after a couple of nights.
To help with the analogue side of the simulation, EDA companies came up with the idea of FastSPICE, perhaps better termed NotQuiteSPICE. These products typically use faster, but less accurate, algorithms to improve runtimes, and give designers options to control the trade-off between speed and accuracy. For example, charge conservation is important to the switched-capacitor circuits used in many on-chip A/D converters, but modelling it adds to the runtime and is not necessary for other circuits, so vendors make it possible to disable such features.
The issue when these tools were first introduced was that they pushed the burden of deciding which parts of the model are important to a given circuit onto the circuit designer. Pick the wrong optimisation and the nominal 'within 1% of SPICE' accuracy, often claimed by FastSPICE suppliers, quickly becomes 10%, 20% or worse. And the only way to find out that the assumption is wrong is to run SPICE or make a test chip, defeating the whole purpose of using FastSPICE. More recently, tools have incorporated techniques to switch optimisations on and off automatically, based on the characteristics of the circuit fed into them.
The development of parallelised algorithms for SPICE has helped to improve the performance of the main analogue simulator, but there is, in chip design at least, a continuing tension between the use of SPICE and FastSPICE.
Another option, particularly at the early stages of a project, is to use behavioural modelling tools such as Matlab or dedicated analogue-simulation languages. The earliest successful commercial behavioural simulator was Spectre, developed by Ken Kundert at UC Berkeley and subsequently at Cadence Design Systems.
Although it has been used on many projects, Spectre faced an uphill struggle as a proprietary language for a methodology that took years to be embraced by a large number of designers. If it is hard to decide which optimisations to use in FastSPICE, developing an accurate behavioural model of a circuit that does not yet exist is even tougher.
In practice, behavioural models are derived either from existing circuit simulations or from other empirical sources. Very often, they do not so much simulate the circuit as the environment in which the circuit is meant to operate. For example, teams have developed substrate noise models to gauge the effect that digital switching noise will have on a block such as an A/D converter before they have access to firm data from a fab. Others have implemented models of communications channels or mechanical systems in these languages to see how well a controller can compensate for interference and other problems.
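The sketch below illustrates that 'model the environment, not the circuit' idea: a behavioural stand-in for substrate noise coupling into a converter input, used to estimate how much signal-to-noise margin survives. The coupling factor, noise waveform and signal levels are all assumptions chosen purely for illustration, and it is written in Python rather than a dedicated modelling language for brevity.

```python
# Behavioural sketch of digital switching noise coupling through the
# substrate into an A/D converter input. Every number here is invented.
import numpy as np

fs = 1e6                          # assumed sample rate, Hz
t = np.arange(4096) / fs

signal = 0.25 * np.sin(2 * np.pi * 10e3 * t)   # wanted 10 kHz tone
digital_noise = 0.4 * np.sign(np.sin(2 * np.pi * 133e3 * t))  # switching
coupling = 0.02                   # assumed substrate coupling factor

seen_by_adc = signal + coupling * digital_noise
snr_db = 10 * np.log10(np.mean(signal ** 2) /
                       np.mean((coupling * digital_noise) ** 2))
print(f"estimated SNR at the converter input: {snr_db:.1f} dB")
```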
Like the digital synthesis and simulation language Verilog, mixed-signal behavioural modelling received a boost from standardisation. Kundert, among others, contributed heavily to the Verilog-A language and the later Verilog-AMS standard, which evolved at roughly the same time as VHDL-AMS.
The most recent addition to the list of standardised modelling languages is SystemC-AMS. One of the reasons for its creation was a desire among some chipmakers to be able to deliver models to customers for use in their own tools without having to license potentially expensive simulators from the EDA suppliers. Although there are commercial simulation environments for use with SystemC, the reference implementation is supplied as open source.
As with SPICE at its introduction, open source is being used to help drive the wider use of behavioural modelling. Free software seems to go hand in hand with mixed-signal design.