New Electronics Roundtable: Getting to grips with the challenges of designing pcbs featuring fpgas
Designing pcbs was at one time a sequential process. But the growth in popularity of fpgas has seen that process become more of a parallel operation.
Where fpgas once provided 'glue logic' and similar functionality, the devices are now central to the operation of many board level products. Because of this, the fpgas being developed are complex, high pin count devices. This level of complexity brings design iterations, and each iteration may require the board on which the fpga sits to be redesigned; seemingly small changes to an fpga's pin out can have dramatic effects on the pcb's layout, even on the number of layers required.
In a reader survey, New Electronics discovered that 47% of readers had recently designed a board level product in which an fpga provided the main functionality. Only 20% of those readers said it was the first time they had done so. Of those who had designed an fpga based board before, 58% said their latest design was more complex, as was the design process itself, because of the need to get both the fpga and the pcb right. More than 50% of respondents found fpga design hard and 40% struggled with board design.
Looking to address these issues, New Electronics held a roundtable to discuss fpga centric design. Helping to explore the questions were representatives from Lattice Semiconductor, Linear Technology, Mentor Graphics and Rohde & Schwarz, while Arrow provided a distribution perspective.
Adam Clarkson from Lattice pointed out that fpgas cover a wide spectrum. "They're not just 28G beasts," he said. "There is integration and complexity, even at the low end. But alongside the fpga, designers need to manage issues such as power and sequencing."
Clarkson said power is critical. "FPGAs need multiple supplies and they have to be sequenced correctly. And you can't have 100mV overshoot on a 0.9V supply, so there is the need for accuracy."
Arrow's Steve Clark said: "High end mcus and fpgas require more complex voltage rails, with specific start up sequences."
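Clarkson's and Clark's points about sequencing lend themselves to a concrete illustration. Below is a minimal sketch of rail sequencing from a board management microcontroller; the rail voltages, pin numbers and the gpio_set()/gpio_get()/delay_ms() helpers are all hypothetical, and a real design would follow the fpga datasheet's ramp and timing requirements.

```c
/* Minimal power-up sequencing sketch for a multi-rail fpga supply.
 * Assumes a board-management mcu driving regulator enable pins and
 * reading power-good (PG) signals via GPIO. All pin numbers, rail
 * voltages and helper functions here are hypothetical. */

#include <stdbool.h>
#include <stddef.h>
#include <stdint.h>

typedef struct {
    int enable_pin;      /* GPIO driving the regulator's enable input   */
    int pgood_pin;       /* GPIO reading the regulator's power-good     */
    uint32_t settle_ms;  /* maximum time allowed for the rail to settle */
} rail_t;

/* Hypothetical board-support functions. */
extern void gpio_set(int pin, bool level);
extern bool gpio_get(int pin);
extern void delay_ms(uint32_t ms);

/* Rails listed in the order the fpga datasheet demands:
 * core first, then auxiliary, then I/O banks. */
static const rail_t rails[] = {
    { .enable_pin = 10, .pgood_pin = 20, .settle_ms = 10 }, /* 0.9V core */
    { .enable_pin = 11, .pgood_pin = 21, .settle_ms = 10 }, /* 1.8V aux  */
    { .enable_pin = 12, .pgood_pin = 22, .settle_ms = 10 }, /* 3.3V I/O  */
};

/* Enable each rail in turn and wait for its power-good signal;
 * abort the sequence if a rail fails to come up in time. */
bool sequence_power_up(void)
{
    for (size_t i = 0; i < sizeof(rails) / sizeof(rails[0]); i++) {
        gpio_set(rails[i].enable_pin, true);

        uint32_t waited = 0;
        while (!gpio_get(rails[i].pgood_pin)) {
            if (waited++ >= rails[i].settle_ms)
                return false;  /* sequencing fault: caller shuts rails down */
            delay_ms(1);
        }
    }
    return true;
}
```

Devices such as Lattice's Platform Manager fold this kind of housekeeping into programmable hardware, but the ordering and monitoring logic is the same.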
But does Lattice see a trend away from microcontrollers to fpgas? Michael Buckley thinks fpgas and mcus should be kept apart. "Once you integrate, you're stuck. Separation means more flexibility."
Rather than replacing mcus, fpgas are more likely to be where peripherals are integrated. Clarkson explained: "It's bringing programmability to the rest of the board. It provides designers with a quick way of getting a board out because they have control over such factors as sequencing." A particular example is Lattice's Platform Manager, in which programmable analogue and logic are used to support common functions, such as power management, digital housekeeping and glue logic. "If you have errors, you can reconfigure this device to change the performance; taking this approach can solve many control issues."
Clarkson continued: "High end fpgas will be more of a processor platform. At the low end, micropower fpgas will be used for such functions as offloading sensor management from processors."
Providing the pcb design perspective, Mentor Graphics' Rakesh Jain said: "We see designs spanning all ranges; from big fpgas with lots of functionality to other boards, where fpgas still act as glue logic. But one of the restrictions is volume; a low volume design is unlikely to be able to afford integrating an mcu and lots of I/O into a big fpga."
Jain sees a change in how pcb design is being undertaken; particularly when fpgas are involved. "Once, it was done by different people or teams who didn't talk to each other. They had conflicting objectives and often found out late in the design cycle that they had been working towards different goals."
The problem was that if the board design had been mostly completed, it meant the fpga design had to be changed or a different package selected. "Then they would run into timing issues and the product would be late to market," Jain continued.
So what can go wrong? "FPGA designers allocate pins and hand off to the pcb designers. They change the pin out and throw the changes back over the wall; there are too many iterations," Jain contended. "Both teams must work with each other or use tools which help them to communicate. If you can simplify the pcb/fpga flow, design can happen simultaneously."
Mentor has developed I/O Designer for this task. The package supports the latest fpgas and can quickly convert an fpga design into a pcb schematic, ready for layout. It spans the design cycle from HDL descriptions to pcb level symbols, as well as to the physical pin information necessary for fpga place and route tools. "FPGA designers aren't always aware of design rules," Jain continued. "They often leave it to the place and route tool to work out the best placement. I/O Designer will check whether pin outs meet device requirements, providing correct by construction pin assignments."
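As an illustration of the kind of rule such a tool enforces, the sketch below checks one common constraint: every signal placed in an I/O bank must use an I/O standard whose supply matches the VCCIO wired to that bank. This is not how I/O Designer is implemented; the device data, structures and names are invented for the example.

```c
/* Sketch of one 'correct by construction' pin assignment rule: every
 * signal placed in an I/O bank must use an I/O standard whose supply
 * matches the VCCIO wired to that bank on the pcb. The device data
 * and names below are invented for illustration. */

#include <stddef.h>
#include <stdio.h>

typedef struct {
    const char *signal;
    int bank;       /* I/O bank the pin belongs to             */
    int iostd_mv;   /* supply the chosen I/O standard requires */
} pin_assignment_t;

typedef struct {
    int bank;
    int vccio_mv;   /* voltage wired to the bank on the pcb */
} bank_supply_t;

static const bank_supply_t banks[] = {
    { .bank = 0, .vccio_mv = 3300 },
    { .bank = 1, .vccio_mv = 1800 },
};

static int bank_voltage(int bank)
{
    for (size_t i = 0; i < sizeof(banks) / sizeof(banks[0]); i++)
        if (banks[i].bank == bank)
            return banks[i].vccio_mv;
    return -1;  /* unknown bank */
}

/* Report every pin whose I/O standard conflicts with its bank supply. */
static int check_assignments(const pin_assignment_t *pins, size_t n)
{
    int errors = 0;
    for (size_t i = 0; i < n; i++) {
        int vccio = bank_voltage(pins[i].bank);
        if (vccio != pins[i].iostd_mv) {
            printf("ERROR: %s needs %dmV but bank %d supplies %dmV\n",
                   pins[i].signal, pins[i].iostd_mv, pins[i].bank, vccio);
            errors++;
        }
    }
    return errors;
}

int main(void)
{
    const pin_assignment_t pins[] = {
        { "ddr_dq0", 1, 1800 },  /* ok: 1.8V standard in a 1.8V bank    */
        { "led0",    1, 3300 },  /* fails: 3.3V standard in a 1.8V bank */
    };
    return check_assignments(pins, 2) ? 1 : 0;
}
```

Caught at assignment time, a conflict like this costs a pin swap; caught after layout, it can cost a board respin.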
But I/O Designer isn't a design tool. "It can't design an fpga or a pcb," Jain reflected, "but it certainly helps you do both."
Another part of the pcb design jigsaw is constructing the signal chain.
"Analogue is the largest technology segment in the UK and power management, which represents 55% of that, is the fastest growing sector. So a third of Arrow's UK FAE expertise is focused on this important growth area," said Steve Clark.
Simon Bramble from Linear noted: "Everyone wants smaller parts with more functionality. That means more transistors and lower voltages. In response, signal chain developers are being forced to create smaller dc/dc converters with higher switching frequencies in order to get more into the package."
Lower voltages, he contended, are 'a nightmare' when it comes to power management. "It also means you need sensing at the fpga's power supply pin." Another consequence of low voltages is tighter tolerances. "Designers could live with 100mV ripple in the past, but that could now take you outside of tolerance, particularly with a 1V supply."
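To put rough numbers on that point (the 5V comparison and the tolerance figure are illustrative, not from the roundtable): the same 100mV of ripple that consumed 2% of an older 5V rail takes 10% of a 1V rail,

\[
\frac{100\,\mathrm{mV}}{5\,\mathrm{V}} = 2\%
\qquad\text{versus}\qquad
\frac{100\,\mathrm{mV}}{1\,\mathrm{V}} = 10\%
\]

comfortably inside a typical ±5% supply window in the first case and well outside it in the second.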
Lower voltages also mean higher currents. "And that brings thermal management problems," he continued. "One way around this is parallel phasing. If you need 20A, then use two 10A supplies in parallel; it divides the current by two and the losses by a factor of four."
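The factor of four follows from conduction loss scaling with the square of current. Assuming the two converters share the load equally and have the same effective resistance R:

\[
P_{\mathrm{single}} = I^{2}R,
\qquad
P_{\mathrm{per\,converter}} = \left(\frac{I}{2}\right)^{2}R = \frac{I^{2}R}{4}
\]

Each converter therefore dissipates a quarter of the original loss; the total across the pair halves, and the heat is spread over twice the board area.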
In Clark's opinion, system architects and hardware designers often do not have the skills to design complex power management circuitry. "With fpgas, the power requirements often cannot be defined until the logic design is completed. This means flexible off the shelf solutions are imperative to meet challenging time to market objectives and to allow hardware engineers to focus on system functionality, while finalising power management late in the design cycle."
Linear has developed a range of parts called µModules; complete power management solutions with integrated dc/dc controllers, power transistors, capacitors and compensation components. "These bring the simplicity of an ldo with the efficiency of a dc/dc converter. And, because they integrate a number of components, they make the board look better."
Avoiding noise and distortion in the signal chain is important. "Noise is the main problem," Bramble contended. "Sampling noise and digital noise will corrupt analogue signals." Another issue is the so called 'jaws of death' – the input to an a/d converter. "It's a nasty environment," Bramble pointed out. "Noise on the reference will look the same as noise on the signal, so use a lower noise reference."
Other areas to consider include digital noise and power supply noise. "Lots of code transitions will map back and corrupt the data, so randomise the data. And don't ignore the power supply rejection ratio; it will help to remove noise at the input."
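One common way to 'randomise the data' is to scramble it with a linear-feedback shift register, so repetitive bus patterns stop producing correlated switching noise. The sketch below shows the idea in miniature; the polynomial and seed are conventional example values rather than anything mandated by a particular converter.

```c
/* Sketch of data randomisation: XOR the payload with a pseudo-random
 * sequence from a Galois LFSR so repetitive code patterns no longer
 * produce correlated switching activity on the bus. The polynomial
 * (x^16 + x^14 + x^13 + x^11 + 1) and seed are common example values.
 * The receiver runs the same LFSR from the same seed to descramble. */

#include <stddef.h>
#include <stdint.h>
#include <stdio.h>

static uint16_t lfsr_step(uint16_t *state)
{
    uint16_t lsb = *state & 1u;
    *state >>= 1;
    if (lsb)
        *state ^= 0xB400u;  /* taps for x^16 + x^14 + x^13 + x^11 + 1 */
    return lsb;
}

static uint8_t scramble_byte(uint8_t in, uint16_t *state)
{
    uint8_t mask = 0;
    for (int i = 0; i < 8; i++)
        mask |= (uint8_t)(lfsr_step(state) << i);
    return in ^ mask;  /* XOR is its own inverse: the same routine descrambles */
}

int main(void)
{
    uint16_t tx_state = 0xACE1u, rx_state = 0xACE1u;   /* shared seed        */
    const uint8_t data[] = { 0x00, 0x00, 0x00, 0xFF }; /* repetitive pattern */

    for (size_t i = 0; i < sizeof(data); i++) {
        uint8_t wire = scramble_byte(data[i], &tx_state);
        uint8_t back = scramble_byte(wire, &rx_state);
        printf("in %02X -> wire %02X -> out %02X\n", data[i], wire, back);
    }
    return 0;
}
```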
Bramble said the signal chain is only as strong as the weakest link. "The entire chain is important and if you're not sending a good signal, the a/d converter will only deal with what it's given."
Finally, don't forget the need for good pcb layout. "PCBs featuring fpgas tend to have a lot of layers and this helps to keep signal layers separate from ground planes and so on," he said.
With the board, fpga, power supply and signal chain designed, all that's left is to test things. Richard Bloor from Rohde & Schwarz said test companies see all elements. "Engineers will do simulations, a boundary scan will see a device failing and we'll get a call saying there's a problem."
One of the reasons for problems is complexity. "There are more devices in less space and more things that can go wrong. All of these can be difficult to solve."
A common problem, said Bloor, is signal integrity. "There's a lot of talk about this, but what does it mean? From a digital perspective, the signal should be clean, low noise and with no ringing; you get 1s when you should get 1s."
Other faults which need to be investigated include analogue deviations and timing errors. "A lot of these can be random and infrequent," Bloor said. "Engineers still like to use analogue scopes to investigate these effects because the only delay is the flyback time."
In a move to provide engineers with the best possible view of what's going on in their designs, Rohde & Schwarz has developed an acquisition system which is faster than the best analogue scopes. "The RTO offers high signal fidelity, precise triggering, high acquisition rates and serial decoding," Bloor noted.
He added that digital problems are often easier to find using analogue representations. "If you can't see the true analogue signal, you won't see things like amplitude problems and ringing."
Other problems with pcb designs include crosstalk. "High density designs with multiple layers and high speed data mean the problems can be acute, and the errors are often down to layout."
Jithu Abraham added: "Where you have multiple clock signals, there are tiny mismatches and, if you can't see an analogue representation, it's hard to know what's going on."
"Engineers need to see analogue signals," Bloor concluded, "but digital designers are only interested in protocols."