As designs become more complex, the logic analyser is moving towards becoming a virtual instrument
The logic analyser had a difficult birth. It only came about as a result of the struggle by Hewlett Packard to keep pace with Tektronix oscilloscopes during the early 1970s. And so dominant was the oscilloscope in design that it took the logic analyser some time to be recognised as a category of instrumentation in its own right.
By 1970, engineers were beginning to understand the ramifications of designing with digital integrated circuits. They were black boxes designed to process bunches of digital signals in parallel. Sticking an oscilloscope on one or two outputs was not going to tell engineers what was going wrong with a design.
In his memoir of how the first of Hewlett Packard's logic analysers was developed, Chuck House, then a programme manager at the company and now a researcher at Stanford University, describes how he discovered the need for something like the instrument on a corporate spying mission for his employer at a 1970s trade fair organised by the IEEE. House and a colleague spent a lot of time on the Tektronix stand and were not summarily ejected as they had originally expected. Tektronix, at the time, was confident in the capabilities and market potential of its Series 7000 scope, which was intended to be the ultimate instrument of its time, with a variety of instruments rolled into one mainframe.
Customers on the Tektronix stand kept asking for the same thing: more channels, without having to gang together a bunch of scopes. It was the only way they could make sense of the multiple changing signals within a design that relied on one or more large scale integration (LSI) devices. House convinced HP to back a plan to build something that might fill that gap, on the basis that going head to head with the Series 7000 was unlikely to work.
HP's flawed, but groundbreaking, first attempt at a logic analyser – although it was yet to acquire the name – was launched at Westcon in 1973. House later recalled: "It seems crazy to imagine now, but this unit represented a paradigm shift that confused many folk. At a division review demonstration, HP's corporate development director uttered in frustration, 'Time always flows left to right; you're telling me that it goes top to bottom?'"
Other early complaints seem eerily familiar. The logic analyser has often struggled with delivering the required channels, memory and speed. House continued: "Computer folk said: 'Twelve channels? All computers have sixteen or more.' The same folk also said: 'No one builds 10MHz systems, they're all 20MHz and up'."
However, the timeline view quickly shifted back to the horizontal axis.
By 1974, HP had a department known as the 'Logic Analyser Lab'. It would take some time before features now seen as essential became incorporated into the hardware and for the mainframe to be separated from the oscilloscope from which it was derived.
As the second half of the 1970s arrived, companies such as Digital Equipment Corporation (DEC) and IBM saw the potential for logic analysis in the design of their faster computers, even though, as House recalled, DEC expressed concerns about buying critical instruments from a company with which it competed in the minicomputer market.
The core features with which we are now familiar quickly settled in as the logic analyser moved from being a tool for high end computers to a widely used logic debugging aid. Typically, the instrument would let the user group the signals from a collection of probes and convert the captured data into timing diagrams – a representation familiar from datasheets.
Users could, in general, choose between two modes. Timing mode made the instrument sample the digital inputs at regular intervals set by an internal clock. More commonly, users would employ state mode, in which at least one incoming signal was defined as a clock source and used to tell the instrument when to sample the other lines. The logic analyser could either run freely, much like an analogue oscilloscope, or be set to capture a series of pulses after a trigger condition was met. Although free running would readily show the target doing something, it would often do little to help zero in on a defect – except where engineers were looking for obvious glitches. By hunting through stored data captured in timing mode, they could spot glitches and use the signals preceding them to set up a trigger condition that would capture the glitch again, less laboriously.
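To make the two modes concrete, here is a minimal sketch in Python; the signal names, clock rates and the four bit counter target are all invented for illustration rather than taken from any particular instrument.

def timing_mode(lines, period, duration):
    # Timing mode: sample every input line at a fixed internal clock interval.
    return [{name: sig(t) for name, sig in lines.items()}
            for t in range(0, duration, period)]

def state_mode(lines, clock, duration):
    # State mode: sample the other lines only on rising edges of the chosen clock.
    samples, prev = [], 0
    for t in range(duration):
        level = clock(t)
        if prev == 0 and level == 1:
            samples.append({name: sig(t) for name, sig in lines.items()})
        prev = level
    return samples

# Hypothetical target: a 4-bit counter that advances every 10 time units.
clk = lambda t: (t // 5) % 2
lines = {f"D{i}": (lambda b: lambda t: ((t // 10) >> b) & 1)(i) for i in range(4)}

print(state_mode(lines, clk, 100))     # one clean sample per rising clock edge
print(timing_mode(lines, 3, 12))       # oversampled view of the same lines

State mode yields one stable sample per clock edge, while timing mode oversamples the lines and so can expose glitches between edges.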
The trigger modes could be very complex, letting users, for example, set the analyser to collect data only once a particular type of data packet or error condition was seen on a bus. These trigger modes also helped to cement a reputation for complexity, compared with the more intuitive oscilloscopes that usually sat alongside the instrument on the bench.
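A trigger of that sort is, in essence, a predicate evaluated over the incoming samples. The sketch below uses an invented 0x7E 'frame delimiter' as a stand-in for the packet condition; real instruments allow far richer sequences of conditions.

def capture_after_trigger(samples, trigger, depth):
    # Scan the stream; once the trigger fires, keep that word and
    # the samples that follow it, up to the configured capture depth.
    for i, word in enumerate(samples):
        if trigger(word):
            return samples[i:i + depth]
    return []  # trigger never fired

stream = [0x00, 0x12, 0x7E, 0xA5, 0x3C, 0x7E, 0xFF]
print(capture_after_trigger(stream, lambda w: w == 0x7E, 3))
# -> [126, 165, 60], i.e. 0x7E, 0xA5, 0x3C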
The state mode would prove vital in making the logic analyser a viable tool for software and system level development. Rather than just showing timing diagrams, the processors that started appearing inside analysers towards the end of the 1970s could be used to decode signals further – showing bus traffic as machine code instructions and data on their way to and from a microprocessor, as Ethernet packets and headers, or as state machine transitions.
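That decoding step amounts to replaying captured bus values through an opcode table. The fragment below sketches the idea with a few 6502 style opcodes; a real inverse assembler models a specific processor's full instruction set and bus cycles.

# Opcode -> (mnemonic, number of operand bytes); 6502-style examples.
OPCODES = {0xA9: ("LDA #", 1), 0x8D: ("STA", 2), 0xEA: ("NOP", 0)}

def decode(bus_samples):
    # Walk the captured bytes, consuming each opcode and its operands.
    out, i = [], 0
    while i < len(bus_samples):
        mnemonic, operand_bytes = OPCODES.get(bus_samples[i], ("DB", 0))
        out.append((mnemonic, bus_samples[i + 1:i + 1 + operand_bytes]))
        i += 1 + operand_bytes
    return out

print(decode([0xA9, 0x42, 0xEA]))  # [('LDA #', [66]), ('NOP', [])]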
More than 20 years after it first appeared as a piggyback unit, HP brought the logic analyser full circle by bringing its functions back into a scope. The mixed signal oscilloscope (MSO) was born from the realisation that a fast growing body of designers was trying to bring up embedded systems that interfaced with a range of real world, analogue signals and which used digital communications in electrically hostile environments.
Automotive designers were making increasing use of serial buses and networks such as the Inter-Integrated Circuit (I2C) bus and the Controller Area Network (CAN). These interconnects, particularly the cable oriented CAN bus, are susceptible to signal integrity problems caused by the electrical interference often found in automobiles, with ignition systems and random system noise potentially injecting errors into communications despite the protocol level protection some of these buses have.
A logic analyser might reveal protocol errors, but without an obvious clue as to whether the error was caused by faulty digital logic – which might be tucked away inside a proprietary core inside a microcontroller – or random signal interference. Naturally, an oscilloscope could be attached to the same system, but cross probing between the two views was not entirely straightforward. The use of two instruments also raised project costs, particularly as the trend in standalone instruments was towards the very high signal rates used in telecom and networking applications.
Recognising that embedded systems used in automotive and industrial applications were mixed signal in nature, HP decided the answer was a 'mixed signal oscilloscope'. The company developed the MSO as a hybrid test instrument that combined the measurement capabilities of a digital storage oscilloscope (DSO) – the form of oscilloscope pioneered by LeCroy before being taken up by both HP and Tektronix – with some of the measurement capabilities of a logic analyser.
With an MSO, users could see multiple time aligned analogue and digital waveforms on the same display. Aimed at a lower cost bracket than DSOs or standalone logic analysers, MSOs typically lack the large number of digital acquisition channels of standalone logic analysers, as well as the more complex protocol analysis provided by dedicated serial protocol analysers, although this has changed in the past five years as the usage model for the MSO has expanded. Originally, MSOs were designed to offer relatively simple operation: very much as oscilloscopes with added digital analysis, rather than as logic analysers with analogue inputs, on the basis that logic analysers had acquired a reputation for being more complex to use.
It took almost a decade for Tektronix to follow HP – whose test and measurement business was spun off as Agilent Technologies at the end of the 1990s – into the market for MSOs. But both companies realised during the past decade that there was a use for the MSO beyond comparatively low clock speed industrial environments – making sense of the complex, pipelined protocols used by high speed memories in computers and communications switches, as well as low voltage interchip serial signalling schemes such as LVDS.
One specific driver was the migration from the DDR2 to the DDR3 protocol used by high speed DRAMs. The clocking scheme became significantly more difficult to debug. Instead of the clock being distributed through a capacitance prone tree structure in which all devices were meant to receive it at the same time, DDR3 moved to a mesochronous arrangement in which each chip along a shared bus can receive the clock at a different time to its neighbours.
With maybe 20mm between chips, the result could be a signal delay of 60ps at each hop as the clock moved down the PCB, against an overall cycle time for memory transactions of 800ps. To deal with those signals, the instrumentation companies upped the speed of their MSOs significantly. Typically, the engineer would use the digital channels to check whether the protocol behaved as expected and then switch to a high resolution analogue channel if the sampled data was incorrect – which could be due to interference or poor signal integrity.
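A back of envelope sketch, using the figures above (60ps per hop against an 800ps cycle) and an assumed eight device daisy chain, shows how quickly the skew eats into the timing budget:

DELAY_PER_CHIP_PS = 60   # clock flight time between neighbouring DRAMs
CYCLE_PS = 800           # memory transaction cycle time

for chip in range(8):    # a hypothetical eight-device chain
    skew = chip * DELAY_PER_CHIP_PS
    print(f"chip {chip}: clock {skew}ps late ({100 * skew / CYCLE_PS:.0f}% of a cycle)")

By the last device, the clock is more than half a cycle adrift, which is why each DRAM must train its own timing relationship rather than rely on simultaneous arrival.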
The ability to analyse the signal quality of serial channels has been pulled back into the standalone logic analysers, with high end devices now offering the ability to switch between analogue and digital views of the signal hitting a probe and to see visualisations such as eye diagrams.
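The eye diagram itself is straightforward to construct: each captured sample is re-plotted at its time modulo one unit interval, so successive bits overlay one another. Below is a minimal sketch; the 800ps unit interval, 50ps sample spacing and noise level are invented for illustration.

import random
random.seed(1)

UI_PS = 800       # one unit interval, in picoseconds
SAMPLE_PS = 50    # spacing between captured samples

def eye_points(samples, sample_period_ps):
    # Fold (time, voltage) pairs into a single unit interval.
    return [((i * sample_period_ps) % UI_PS, v) for i, v in enumerate(samples)]

# A noisy pseudo-random bitstream, sampled every 50ps.
bits = [random.choice((0.0, 1.0)) for _ in range(200)]
wave = [bits[(i * SAMPLE_PS) // UI_PS] + random.gauss(0, 0.05) for i in range(3200)]

for t, v in eye_points(wave, SAMPLE_PS)[:5]:
    print(f"t={t:3d}ps  v={v:+.2f}")

Plotting all of the folded points, rather than the first five, would reveal the familiar eye opening, with noise and jitter narrowing it.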
The wireless revolution has led to a further change in the way logic analysers are used. In his memoir, House recalled it was the problem of shifting away from the frequency oriented view of many analogue focused instruments of the time that slowed the development of the logic analyser – an instrument rooted firmly in the time domain. A link to the vector signal analysers used to assess modulation schemes and other wireless design issues has brought the two worlds back together. Software can convert signal data captured by a logic analyser into a form that can be displayed on a vector signal analyser, making it easier for RF engineers to understand how the digital system is behaving.
A big influence on logic analyser technology over the past 20 years has been chip packaging. One of the biggest problems with using logic analysers has not been with the instrument itself, but the way in which it attaches to the board. Even with large pitch dual in line packages, complex chips would disappear under a forest of brightly coloured signal probes, which were extremely easy to dislodge after hours of careful positioning. The shift to surface mount packaging for most digital devices presented a major obstacle to the use of logic analysers, although the eventual solution made probing easier to accommodate even if it required some additional forethought from board designers.
One possibility was to plan ahead and put probe sockets onto the PCB, although this would lead to differences in electrical behaviour between development and production boards. By the start of the 2000s, so called connectorless probes came into common use. These still required the PCB to be laid out to present bare signal pads, but they were much smaller and so suffered from far less capacitive loading than conventional connectors. The probe head itself clamps onto the PCB, using pressure to make the contact between the probe tip and the PCB pads. Although providing the signal connections takes some preplanning, it is at least able to provide access to signals that pass in and out of ball grid array packages that would otherwise be inaccessible.
Competition with the in circuit emulator for software debug provided a further lease of life for the logic analyser towards the end of the 1990s. Some emulators, such as those made by Lauterbach, added logic analyser inputs to provide the ability to trace I/O lines as well as code execution. But the logic analyser manufacturers worked to supplant the emulator entirely, helped to a large extent by changes in processor architecture.
On chip caches and high clock speeds made it almost impossible to design traditional in circuit emulators that could run at full speed and provide a full trace of program execution. In response, debug logic moved on chip, so the emphasis in hardware assisted debug shifted to providing high speed digital inputs that could capture trace information sent out by the processor – a task for which the logic analyser proved to be ideal. In turn, logic analysers were upgraded with inverse assemblers and inference logic to track events such as branches if a full program execution trace was not available. The emulator functions moved towards providing run control and overlay memory to help speed up the process of patching code and testing it.
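Where only an address trace was available, the inference step could be as simple as flagging any sample where the program counter fails to advance sequentially. A sketch, assuming a fixed 4 byte instruction size and using invented addresses:

INSTR_SIZE = 4  # assumed fixed-length instructions

def infer_branches(pc_trace):
    # Any discontinuity in the program counter implies a taken branch.
    return [(prev, cur) for prev, cur in zip(pc_trace, pc_trace[1:])
            if cur != prev + INSTR_SIZE]

trace = [0x1000, 0x1004, 0x1008, 0x2000, 0x2004, 0x1008]
print(infer_branches(trace))  # [(4104, 8192), (8196, 4104)]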
The move toward highly integrated SoC designs continued the trend as logic analyser functions began to move on chip. The logic analyser has, in effect, entered a world where it's now a half virtual, half physical instrument. The FPGA was one of the first devices to acquire on chip logic analyser functions. Altera and Xilinx added the ability to trace internal logic signals – taking advantage of the reroutability of their architectures – to make it easier to see what was happening inside a prototype.
Logic analyser manufacturers, such as Agilent, extended the concept by using soft IP cores as virtual probes that would feed signals to an external hardware instrument so that internal logic signals could be traced alongside those appearing on PCB traces, and correlated in time. Hardwired SoCs are also sprouting logic analyser-like functions. These trace modules are supported by on chip debug frameworks such as ARM's CoreSight – if the data is to be accessible by customers – or by design for test tools if the main aim is to help the manufacturer diagnose problems with prototype devices.
To make sense of the different stimuli and provide a holistic view of what is happening between all these different virtual and physical instruments, software is likely to become the main push in logic analyser technology, especially as system clock speeds have levelled, reducing the need to push further into the gigahertz domain.