This collaboration aims to leverage the strengths of both companies to develop chiplet-based solutions that enhance performance while optimising power consumption. Movellus' Aeonic Digital IP enables Tenstorrent to apply advanced clocking techniques that reduce overall energy consumption.
“By collaborating with Movellus and integrating their technology, we are optimising power efficiency in our processors and continuing to drive Tenstorrent’s leadership in scalable AI chiplets,” said Keith Witek, Chief Operating Officer of Tenstorrent.
Movellus’ Aeonic portfolio offers products designed to tackle critical infrastructure challenges in modern computing, such as on-die sensing, digital clocking, and power delivery. Notably, the digital adaptive clocking family facilitates architectural advancements such as per-core distributed clocking and fine-grained dynamic frequency scaling (DFS).
Additionally, when coupled with droop detectors, these products provide advanced clock management capabilities, allowing for Vmin reduction by effectively mitigating droop while minimising any performance impact.
With these products, it is possible to combine localised clocking, DFS, DVFS, and droop mitigation in a unified solution, simplifying design and easing integration.
“Movellus’ digital clocking technology enables us to distribute digital PLLs across our chip to provide localised, fine-grained clocking, something that is not possible with traditional analogue PLLs,” said Michael Smith, Senior Director of SoC Hardware Engineering at Tenstorrent. “In addition, their process-agnostic digital architecture provides a cohesive software interface across multiple devices.”
“Tenstorrent is pioneering AI and HPC compute with their novel and scalable hardware architectures and software stack,” said Mo Faisal, CEO of Movellus. “We are grateful to be a part of this ground-up approach that is foundational to the advancement of AI and HPC compute with an intention of maximising energy efficiency – a much-needed development in the age of AI.”