Novel technologies needed to handle SKA's data deluge
Since humanity's earliest days, we have looked at the stars and wondered how it all works. Slowly, the workings of the universe are beginning to make sense, and physicists are edging closer to a unified theory – one equation that explains how everything works.
But the closer scientists get to that unified equation, the harder they have to look for the final elusive elements. Often, that involves staring further into space – and therefore further back in time – to find the answers. Optical telescopes have grown to the point where building still larger mirrors is becoming impractical, and radio telescopes have taken over. The great advantage of a radio telescope is that it is not limited to a single structure: the signals from many separate receptors can be combined, through interferometry, to act as one instrument of effectively arbitrary size.
And that's the approach of the Square Kilometre Array (SKA). The name is slightly misleading: the total collecting area will indeed be one square kilometre, but it will be distributed across an array of thousands of receptors extending up to 3,000 km from a central core region.
It's a huge project: construction alone is expected to cost €1.5 billion. Handling the data it collects will be an equally daunting task. The SKA is expected to produce a few exabytes of raw data per day for a single beam per square kilometre; after processing, up to 1,500 petabytes of data will need to be stored per year. To put that into context, the Large Hadron Collider at CERN generates approximately 15 petabytes of data per year, roughly one hundredth of the SKA's expected archive rate.
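A rough back-of-the-envelope check, using only the figures quoted above (the "few exabytes per day" is assumed to be about 2 EB purely for illustration), makes the scale of the comparison clear:

```python
# Rough comparison of the data volumes quoted in the article.
# These are the article's estimates, not official SKA specifications.

PETABYTE = 1e15  # bytes
EXABYTE = 1e18   # bytes

ska_raw_per_day = 2 * EXABYTE            # "a few exabytes per day" (assumed ~2 EB)
ska_stored_per_year = 1500 * PETABYTE    # up to 1,500 PB archived per year
lhc_per_year = 15 * PETABYTE             # LHC at CERN, roughly 15 PB per year

print(f"SKA raw data per year:  {ska_raw_per_day * 365 / EXABYTE:,.0f} EB")
print(f"SKA archive per year:   {ska_stored_per_year / PETABYTE:,.0f} PB")
print(f"LHC data per year:      {lhc_per_year / PETABYTE:,.0f} PB")
print(f"SKA archive vs LHC:     {ska_stored_per_year / lhc_per_year:.0f}x")
```

Even the archived fraction, a tiny slice of the raw data stream, dwarfs anything handled by today's largest scientific instruments.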
Addressing this issue is the DOME project, in which IBM is working with the Netherlands Institute for Radio Astronomy (ASTRON) to develop the necessary technology. The list of potential technologies is impressive:
advanced accelerators, 3D stacked chips, novel optical interconnects, nanophotonics, high-performance storage systems and phase-change memory.
While these developments will undoubtedly help the SKA, what other applications might such technologies enable?