New Electronics roundtable: The challenges of targeting fpgas at embedded applications
Embedded systems have, in the main, been designed around microcontrollers. But growing demand for increased performance, more capable signal processing and parallel processing means fpgas are moving ever closer to the heart of embedded systems.
In a recent survey, New Electronics asked readers whether they had recently designed a board level product in which an fpga provided the main functionality. More than a third of respondents said yes. Asked why they specified an fpga, readers cited parameters such as fast I/O response, interface flexibility, parallel processing potential and the ability to integrate specific functionality.
Most respondents said the fpga had been designed in house, but 32% said the project took longer than expected due to fpga design problems. Asked whether they had used an fpga/pcb codesign package, 89% of respondents said no.
Moving to an fpga based embedded system appears to bring with it a range of issues. Do the benefits outweigh the downsides? How steep is the learning curve? And what else should you look out for?
Looking to discuss these issues, New Electronics convened a roundtable to address the challenges of implementing fpgas in demanding applications.
Venkatesh Narayanan, director of software and systems engineering with Microsemi's SoC division, said demanding applications had many vectors. "More recently, power consumption, security and reliability have become critical. Power consumption is certainly important when it comes to designing portable products or wireless equipment, but these systems also need high reliability and security."
Other areas where Narayanan sees fpgas playing a greater role include the electrification of aircraft and automotive safety systems.
But what do these apps require of the fpga? Narayanan explained some of the factors. "Communications infrastructure apps demand 'five nines' uptime and the ability to perform reliable and secure updates; you can't allow them to bring down the network. Industrial automation needs reliability and secure communications, while military comms systems demand low power and the best security." Five nines equates to 99.999% availability – little more than five minutes of downtime a year. He added that fpgas deployed in wireless networks had to survive a wide range of temperatures while consuming low levels of power. And, for any demanding app, fpgas need to have extremely low failure rates, he noted.
Tristan Jones, regional marketing engineer with National Instruments UK and Ireland, pointed out: "National Instruments addresses the industrial market and definitely sees fpgas as being capable of solving demanding application problems in that area."
Narayanan noted the safety requirements of demanding applications. "IEC 61508 mandates safety," he said, "and it's the baseline for most systems. But there's also the need for secure updates because fpgas are configurable in system."
One way to assess safety is through the Design Assurance Level (DAL), codified in DO-254 and DO-178 and mainly used in the aviation industry.
DAL A applies to a system whose failure may cause a crash and whose impact would be 'catastrophic'. DAL A calls for redundancy and for dissimilar technologies to be used in, for example, primary flight controls. DAL B events are seen as 'hazardous' and redundancy is again called for. "In these systems," Narayanan said, "you can't tolerate firm errors. Processors need to boot securely because systems establish the 'circle of trust' from this action. You can't have configuration upsets, you can't bring systems down – they have to operate reliably."
So how do fpgas meet these needs? "There are two or three aspects," Narayanan offered. "When vendors implement fpgas, they do the optimisation. Microsemi, for example, designs configuration memories so they have increased firm error immunity. But there are other techniques, including triple modular redundancy (TMR) and SECDED (single error correction/double error detection), as well as solutions to mitigate memory errors during remote or in system updates."
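Of those techniques, TMR is the simplest to illustrate: the logic is replicated three times and a voter passes through whatever value at least two replicas agree on, so a single upset is outvoted. A minimal sketch of that two-out-of-three vote in C – purely illustrative, since in a real fpga the replication and voting live in the fabric, not in software:

#include <stdint.h>
#include <stdio.h>

/* Bitwise two-out-of-three majority vote, the voting core of triple
   modular redundancy: each argument is the output of one replica of
   the same logic, and a single corrupted replica is outvoted. */
static uint32_t tmr_vote(uint32_t a, uint32_t b, uint32_t c)
{
    return (a & b) | (a & c) | (b & c);
}

int main(void)
{
    uint32_t good  = 0xCAFEBABEu;
    uint32_t upset = good ^ (1u << 7);   /* one replica takes a single bit flip */

    /* The vote masks the upset; the output matches the two healthy replicas. */
    printf("voted = 0x%08X\n", (unsigned)tmr_vote(good, upset, good));
    return 0;
}

The same identity – each output bit is 1 exactly when at least two input bits are 1 – is what a hardware TMR voter synthesises to.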
Other features include dual PCIe ports for redundant communications and design measures that minimise leakage and static power.
Design optimisation techniques are also available to engineers, allowing them to further reduce power consumption through such approaches as clock gating. Hemant Shah, product management director, Cadence Design Systems, said: "These are 'tweaks' users can make, but changes still need to be validated at the system level."
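Clock gating itself is an RTL construct – the clock to an idle block is qualified with an enable so its registers stop toggling – but the effect is easy to model. In this toy C sketch (the workload and 10% duty cycle are invented for illustration), register bit toggles stand in as a rough proxy for dynamic power:

#include <stdint.h>
#include <stdio.h>
#include <stdlib.h>

/* Bits that differ between consecutive register values; each toggled
   node burns dynamic energy, so toggle counts are a crude proxy for
   dynamic power. */
static int toggles(uint32_t prev, uint32_t next)
{
    uint32_t x = prev ^ next;
    int n = 0;
    while (x) { n += (int)(x & 1u); x >>= 1; }
    return n;
}

int main(void)
{
    uint32_t free_running = 0, gated = 0;
    long cost_free = 0, cost_gated = 0;

    for (int cycle = 0; cycle < 100000; ++cycle) {
        uint32_t next = (uint32_t)rand();      /* datapath output this cycle */
        int enable = (cycle % 10) == 0;        /* block has real work 10% of the time */

        cost_free += toggles(free_running, next);   /* ungated register clocks every cycle */
        free_running = next;

        if (enable) {                               /* gated register toggles only when enabled */
            cost_gated += toggles(gated, next);
            gated = next;
        }
    }

    printf("toggles, free running: %ld\n", cost_free);
    printf("toggles, clock gated:  %ld\n", cost_gated);
    return 0;
}

At a 10% duty cycle, the gated register accumulates roughly a tenth of the toggles – the saving the 'tweak' is chasing, and also why it needs system level validation: a gated block must never miss data it was supposed to capture.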
Narayanan said the fpga industry is responding to the needs of demanding applications. "We are using process technology to provide a high level of reliability, extended qualification and the design tools that let engineers take advantage of these features."
Shah turned the discussion towards the need for fpga/pcb codesign and the tendency for projects to take longer than planned. "Whose problem is it?" he asked. He believes the traditional approach involves three people: fpga designer; board designer; and layout designer. "The fpga designer wants to make the device work as it should – functionality is critical and timing is king. The board designer, meanwhile, is keen to ensure the power delivery network is correct. Then there's the layout designer, who's doing place and route. Their problem is the growing number of pins and the shrinking pin pitch. Layout designers need to get things done quickly, but they also need to use the minimum number of layers."
The problem, as Shah sees it, is that the design process is serial and iterative. "Problems increase with complexity. When the layout designer gets the board design, they struggle and have to send it back for negotiation. While some companies can live with more layers, others see this as so important that they keep the design process going around.
"But while they don't want too many pcb layers, neither do they want the design cycle to be too long."
One obvious solution, Shah said, is to have all three engineers sitting alongside each other – but "that's not efficient," he commented. "The problem can be simplified if they use the tools that are available." And, looking at the New Electronics survey, that's not happening: 89% of respondents don't use fpga/pcb codesign tools.
Pin assignment is one of the main challenges, Shah observed. Jones asked about the timeframe for pin assignment. Shah said: "Maybe a day, but what's important is the consequence of doing assignment without allowing for placement of components."
Narayanan followed up. "Does the system designer do component placement, then create pin assignments, or is it the other way round?"
Shah said most designers have a diagram 'in their head'. "They will say 'it will look like this', but it's a reference; they have to wait for pcb designers. But, by then, the design is almost complete. Exploration has to take place during the creation phase and it shouldn't be a manual process."
Whilst Narayanan and Shah addressed fpga level issues, Jones wanted to move what he said had been an interesting discussion towards an applications focus.
"There's a long tail of embedded applications," he said, "and fpgas are moving down this tail and becoming attractive for low volumes. They are being used to solve complex challenges and for the deployment of complex algorithms."
But those working towards the end of this 'long tail' are not always familiar with fpga technology. "These are the domain experts," Jones pointed out. "They understand the applications, but they have specific challenges to address."
According to Jones, fpgas solve a number of these challenges, including reliability, high speed dsp, and precise control and synchronisation of I/O and timing. "But these people need to develop software, hardware, fpgas and pcbs. They could be individuals or small teams who have to do everything."
Another approach, he said, is to think 'build vs buy' rather than designing from the ground up. "They should consider a more integrated system approach that brings the benefits of fpgas, but which also brings a more rigid approach to development through the use of 'off the shelf' tools. LabVIEW RIO brings together these various elements to help realise systems."
In this way, he continued, some of the load is removed from the domain expert. "They don't have to understand everything. This approach can accelerate development, but allows them to retain ownership of the system that is created."
He pointed out that NI customers using this approach tend to have smaller design teams, shorter design cycles and more of them are ahead of schedule – 58%, compared with the industry average of 42%.
Narayanan asked whether, when systems require multicore processing, designers are more interested in redundancy or in cores that run in lockstep.
Jones responded: "Typically, engineers are drawing on more processing power and the ability to have more power on the mcu side allows the system to undertake more. The app can be partitioned between mcu and fpga so deterministic tasks are handled by the fpga and others by the mcu.
"It gives headroom for performance, along with deterministic and parallel processing. But safety is important and has to be a consideration," he concluded.
Participating in the Demanding Applications roundtable were:
Venkatesh Narayanan, director of software and systems engineering, Microsemi SoC division
Hemant Shah, product management director, Cadence Design Systems
Tristan Jones, regional marketing engineer, industrial and embedded systems, National Instruments UK and Ireland
Graham Pitcher, group editor, Findlay Media