Embedded instrumentation illuminates the future of validation and test
Some of the challenges facing test and measurement may seem daunting, but a few beacons – such as embedded instrumentation – are shining through the darkness.
The advance of technology is always accompanied by challenges. While nobody argues with the need for faster processing and data communications, more compact systems, lower power consumption and greener operations, achieving these goals is far from straightforward. Fortunately, embedded instrumentation is proving to be an effective solution.
To keep abreast of Moore's Law, chip designers are turning to exotic packaging techniques in which several silicon die are stacked on top of one another in the same package. This has wrought havoc with chip validation and test, because the inner die in a 3D stack offer no physical access for a probe and so are problematic to test at best. And poor testing can wreck yields in chip fabrication, which drives up the cost of components.
The same logic applies to circuit boards, which are packing more functionality into smaller spaces. The number of board layers and the density of chips on boards are increasing dramatically, producing boards that test probes cannot physically access. Now, though, embedded instrumentation is shedding some light on this situation.
Embedded instrumentation
Embedded instrumentation has been around since the advent of built-in self test (BIST). However, the concept is now gaining renewed attention because it holds great promise for meeting many of the test and measurement challenges that lie ahead.
Generally, embedded instrumentation involves embedding some form of test and measurement logic into a chip in order to characterise, debug and test that device. In addition, board and system designers are increasingly embedding instrumentation IP into ASICs, FPGAs and other programmable devices.
Designers sometimes develop this IP themselves; alternatively, they may purchase it from a chip vendor or a third party. What is emerging now is the adoption of embedded instrumentation in non-intrusive board and system level validation and test applications.
This has happened due to a change in attitude toward embedded instrumentation. Previously, each embedded instrument was limited to a specific purpose. For example, a chip vendor might embed an instrument to debug a device during development. Once the chip moved into production, the instrument would be rendered inaccessible to the board designer; it had served its purpose and wasn't needed anymore. Similarly, access to an instrument designed into an ASIC for board test might be closed off when the system was shipped, denying field service technicians the advantages of embedded test for troubleshooting.
These limitations are now being lifted for a number of reasons. The first is sheer necessity. Older, intrusive or probe-based technologies – like oscilloscopes and logic analysers for validation applications, and in-circuit test or flying-probe testers for manufacturing board test – are challenged by higher processing and data communications speeds and by today's more complex circuit boards. There is simply no place to put a probe on chips or circuit boards. For example, many reference designs specifically forbid the placement of a test pad for a probe on any high-speed serial I/O bus. It's a signal integrity issue: the pad would have a capacitive effect on the underlying bus, disrupting its signalling.
Moreover, it is more cost-efficient for validation, test and debug routines based on embedded instrumentation to migrate with the chips, so they can be reused in circuit board validation and test. Later still, these same routines might be deployed again in field service. The alternative, which has been the practice for years, has been to incur the cost of developing new validation and test routines during each phase of a product's life cycle. The portability of validation and test routines based on embedded instrumentation delivers cost benefits that continue to accrue over a system's life cycle.
Tooling up
The concept of a tools platform, or unified tool environment, is particularly appropriate for embedded instrumentation, since different sets of tools may be needed at any given point in the life cycle of a chip, circuit board or system. For example, a platform supporting tools that use Interconnect Built-In Self Test (IBIST) could extend the application of IBIST beyond the realm of chip characterisation into board validation and test. Validating the signal integrity of a circuit board could involve several tools, such as pattern generation and checking, bit error rate testing (BERT) and/or margining tests (see Fig 1).
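As a rough, hedged illustration of the checking side of such a tool chain, the C sketch below regenerates a PRBS-7 reference sequence and compares it against a captured bit stream to compute a bit error rate. The prbs7_next helper, the seed value and the simulated received data are assumptions made for the sketch, not part of any particular IBIST implementation, where the pattern generator and checker sit in silicon at either end of the link.

```c
#include <stdio.h>

/* Reference PRBS-7 generator (polynomial x^7 + x^6 + 1), the kind of
   pseudo-random pattern an embedded pattern generator might drive
   across a serial link. */
static unsigned prbs7_next(unsigned *state)
{
    unsigned bit = ((*state >> 6) ^ (*state >> 5)) & 1u;
    *state = ((*state << 1) | bit) & 0x7Fu;
    return bit;
}

/* Compare nbits received bits against the expected PRBS sequence and
   return the measured bit error rate. */
static double bert_check(const unsigned char *rx_bits, size_t nbits,
                         unsigned seed)
{
    unsigned state = seed & 0x7Fu;
    size_t errors = 0;

    for (size_t i = 0; i < nbits; i++)
        if (rx_bits[i] != prbs7_next(&state))
            errors++;

    return (double)errors / (double)nbits;
}

int main(void)
{
    /* Illustrative received data: regenerate the PRBS locally and
       inject two bit errors to simulate a noisy channel. */
    enum { NBITS = 1000 };
    unsigned char rx[NBITS];
    unsigned state = 0x5Au;

    for (size_t i = 0; i < NBITS; i++)
        rx[i] = (unsigned char)prbs7_next(&state);
    rx[100] ^= 1u;
    rx[750] ^= 1u;

    printf("BER = %.2e\n", bert_check(rx, NBITS, 0x5Au));
    return 0;
}
```

In an embedded deployment, the comparison and error counting typically happen in hardware; the host-side tool starts the test, reads back the error counters and reports the rate.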
A second example of a non-intrusive board test technology based on embedded instrumentation is boundary scan (IEEE 1149.1), also known as JTAG. Boundary scan makes use of an instrumentation infrastructure embedded in chips and on circuit boards. It is critical in its own right insofar as it provides embedded structural test capabilities for circuit boards, but it can also function as an access mechanism for embedded instruments.
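To make the 'access mechanism' idea concrete, here is a minimal, hypothetical sketch of a bit-banged IEEE 1149.1 test access port (TAP) being walked from Test-Logic-Reset into Shift-IR so that an instruction can be shifted into a device. The gpio_write and gpio_read helpers and the pin identifiers are placeholders for whatever pin-access interface a given controller provides; real instruction opcodes come from the device's BSDL file.

```c
#include <stdint.h>
#include <stdio.h>

/* Hypothetical GPIO helpers for a bit-banged JTAG port. These stubs do
   nothing; on real hardware they would map to the controller's
   pin-access API. */
enum { TCK, TMS, TDI, TDO };                      /* illustrative pins */
static void gpio_write(int pin, int level) { (void)pin; (void)level; }
static int  gpio_read(int pin)             { (void)pin; return 0; }

/* One TCK cycle: present TMS and TDI, pulse the clock and sample TDO,
   which the target updates on the falling edge of TCK. */
static int jtag_clock(int tms, int tdi)
{
    gpio_write(TMS, tms);
    gpio_write(TDI, tdi);
    gpio_write(TCK, 1);
    int tdo = gpio_read(TDO);
    gpio_write(TCK, 0);
    return tdo;
}

/* Reset the TAP, walk it to Shift-IR and shift in an instruction,
   least significant bit first. */
static void jtag_shift_ir(uint32_t instruction, int length)
{
    for (int i = 0; i < 5; i++)      /* five TMS=1 clocks guarantee   */
        jtag_clock(1, 0);            /* the Test-Logic-Reset state    */

    jtag_clock(0, 0);                /* Run-Test/Idle                 */
    jtag_clock(1, 0);                /* Select-DR-Scan                */
    jtag_clock(1, 0);                /* Select-IR-Scan                */
    jtag_clock(0, 0);                /* Capture-IR                    */
    jtag_clock(0, 0);                /* Shift-IR                      */

    for (int i = 0; i < length; i++) /* TMS rises on the last bit to  */
        jtag_clock(i == length - 1,  /* leave Shift-IR for Exit1-IR   */
                   (int)((instruction >> i) & 1u));

    jtag_clock(1, 0);                /* Update-IR                     */
    jtag_clock(0, 0);                /* back to Run-Test/Idle         */
}

int main(void)
{
    jtag_shift_ir(0x2u, 4);          /* illustrative 4-bit opcode     */
    printf("instruction shifted\n");
    return 0;
}
```

In practice, boundary scan tools generate these sequences automatically from the board netlist and the devices' BSDL descriptions rather than by hand.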
Another instance of an embedded instrumentation technology is processor controlled test (PCT). In a board test application, PCT takes advantage of the most powerful instrument on most circuit boards – the processor. PCT routines are executed when control of the processor is temporarily handed over to the PCT system. The board's CPU then reads from and writes to memory and to the I/O registers of the board's addressable devices. In this way, PCT exercises the functionality of the board and, as a by-product, detects and diagnoses structural faults. PCT is an 'at-speed' test technology, which means it will detect faults that only manifest themselves when the board is running at operational speed. In contrast, boundary scan is a static test technology. Taken together, these two embedded instrumentation technologies provide test coverage for a broad spectrum of structural and electrical faults.
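As a minimal sketch of the PCT idea, assuming a hypothetical memory-mapped register address, the fragment below has the board's own CPU walk a single set bit across a writable device register and read each pattern back; a mismatch points to a stuck-at or bridged data line between the processor and the device.

```c
#include <stdint.h>

/* Hypothetical address of a writable register on an addressable
   device; in a real PCT the address comes from the board's memory
   map, and the routine runs on the target CPU itself. */
#define TEST_REG_ADDR 0x40000000u

/* Walking-ones data bus test, run by the board's own processor at
   operational speed. Returns the first failing pattern, or 0 if
   every bit line reads back correctly. */
static uint32_t pct_data_bus_test(volatile uint32_t *reg)
{
    for (uint32_t pattern = 1u; pattern != 0u; pattern <<= 1) {
        *reg = pattern;                 /* drive one data line high   */
        if (*reg != pattern)
            return pattern;             /* stuck-at or bridged line   */
    }
    return 0u;
}

int main(void)
{
    volatile uint32_t *reg = (volatile uint32_t *)TEST_REG_ADDR;
    return pct_data_bus_test(reg) == 0u ? 0 : 1;
}
```

Because the processor performs the accesses itself, the test runs at the board's normal bus speed, which is what allows PCT to catch faults that only appear at operational speed.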
Just over the horizon
Several other embedded instrumentation technologies are emerging, including IEEE 1149.7, an addition to the IEEE 1149.x boundary scan family. Building on the original boundary scan specification, IEEE 1149.7 reduces the boundary scan footprint and adds architectural features for testing SoCs and other new device packages, such as multi-die 3D chips with their vertical through-silicon vias (TSVs).
In addition, the IEEE P1687 Internal JTAG (IJTAG) standard specifies an open instrument interface and access mechanism that will encourage third-party tools for embedded instruments. Fig 2 illustrates how IEEE 1149.7 and IEEE P1687 might function in tandem to validate, test and debug a 3D stacked SoC.
The yellow dots represent the vertical TSVs on each die that carry the 1149.7 connections, while IJTAG provides a standard method for controlling and automating all of the on-chip instruments.
Reaping what has been sown
The solutions for many emerging test and measurement challenges have already been sown in the instrumentation being embedded in chips and on circuit boards. Harvesting them awaits the ingenuity of tool providers and their ability to assemble cohesive test environments that transform the potential of embedded instrumentation into workable, cost-effective solutions.
Author profile:
Reg Waller is European director for ASSET InterTech.