The latest version of Neurala VIA offers greater flexibility, provides advanced edge AI capabilities, and maximises available compute power, enabling developers to create and deploy AI models at the edge more efficiently.
According to Neurala CEO and Co-Founder, Dr. Max Versace, “Running AI at the edge is increasingly important to reduce power consumption and latency, minimise bandwidth, and increase data privacy. We have many years of experience making it easy for our clients to create, deploy and customise AI at the edge on a variety of hardware, from PCs to smartphones, all the way to specialised processors and imaging sensors.
“Leveraging Lattice sensAI, which is designed to speed customer development and deployment of always-on, on-device AI across a wide range of edge applications, coupled with this new version of Neurala VIA, makes deploying AI at the edge and maximising the use of the available compute easier than ever before.”
“As AI rapidly transforms various markets and applications, improved efficiency in edge AI computing is essential,” explained Matt Dobrodziej, VP of Segment Marketing at Lattice Semiconductor. “This collaboration with Neurala is a great example of how Lattice’s low power FPGA technology and sensAI solution stack can help accelerate development cycles and enable designers to build and deploy scalable edge AI applications.”
The collaboration between Neurala and Lattice – bringing together Lattice's FPGA-based machine learning solutions and Neurala's VIA platform – gives developers a powerful and flexible toolset for efficiently deploying AI models at the edge, and enables them to quickly develop and test sophisticated deep learning applications directly on edge hardware.