In memory of Feynman, the US-based Foresight Institute offered, in 1996, a $250,000 prize to the first individual or group to design and construct both a functional nanometre-scale robotic arm and a functional nanometre-scale computing device that can add two 8-bit binary numbers. The prize has yet to be won.
But work by a team at the University of California, Santa Barbara is said to be pushing towards the second element. The team claims to have developed a design in which a functional computer could be realised in a 50nm cube. All the team has to do to win the prize is to make it – and 31 similar devices. According to team member Dmitri Strukov, the challenge could be ‘easily met’.
Part of the UCSB approach is based on material implication logic. At first sight, this appears to be something along the lines of in-memory processing. The question is whether the idea will find application in what we might term ‘regular’ computing devices, or whether it is merely an academic exercise.
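For readers unfamiliar with the primitive, material implication – p IMPLY q, which is false only when p is true and q is false – is, together with a constant FALSE, enough to build any Boolean function. The sketch below is a generic truth-table illustration of that completeness, not the UCSB memristor circuit itself; the function names are ours.

```python
# Material implication (IMPLY) as a logic primitive.
# With IMPLY plus a constant FALSE, NOT, OR and AND can all be derived,
# which is why an IMPLY-capable device can, in principle, compute anything.

def imply(p, q):
    """p IMPLY q: false only when p is true and q is false."""
    return (not p) or q

def not_(p):
    return imply(p, False)          # p -> FALSE  ==  NOT p

def or_(p, q):
    return imply(not_(p), q)        # (NOT p) -> q  ==  p OR q

def and_(p, q):
    return not_(imply(p, not_(q)))  # NOT (p -> NOT q)  ==  p AND q
```

In a memristive implementation, the appeal is that the IMPLY operation is performed by the memory cells themselves, so data need not shuttle to a separate logic unit.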
In-memory processing is a ‘big data’ technique. The idea is that it is quicker to keep all the data you want to process in RAM than to move it back and forth from disk. And if memory and processing elements can be integrated into one device, analysis can be performed more quickly still.
The UCSB design also features memristors, first demonstrated in 2008 by a team at HP Labs that included Strukov. Their development followed up on theoretical work from 1971 by Professor Leon Chua, who argued that the memristor was the ‘fourth basic circuit element’, alongside the resistor, capacitor and inductor.
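The defining property Chua described is that a memristor's resistance depends on the history of charge that has flowed through it. The toy model below illustrates that behaviour only; the linear-drift form, resistance values and switching charge are illustrative assumptions, not the physics of HP's actual device.

```python
# Toy memristor: a hypothetical linear-drift model (illustration only).
# Resistance moves between R_OFF and R_ON as charge accumulates, so the
# device 'remembers' how much current has passed through it.

R_ON, R_OFF = 100.0, 16000.0   # limiting resistances (assumed values)
Q_D = 1e-4                      # charge for a full switch (assumed)

def memristance(q):
    """Resistance as a function of total charge passed, clipped to limits."""
    frac = min(max(q / Q_D, 0.0), 1.0)
    return R_OFF - (R_OFF - R_ON) * frac

def drive(voltages, dt):
    """Apply a voltage sequence; return the resistance after each step."""
    q, history = 0.0, []
    for v in voltages:
        i = v / memristance(q)   # Ohm's law at the current state
        q += i * dt              # accumulated charge moves the state
        history.append(memristance(q))
    return history

# A sustained positive bias steadily lowers the resistance until it
# saturates at R_ON -- the switching behaviour a memory cell exploits.
trace = drive([1.0] * 1000, dt=1e-3)
```

Cut the voltage and the resistance stays where it was driven, which is what makes the device attractive as non-volatile memory.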
The thing about memristors is that the smaller they are, the better they are expected to perform. But while there has been a lot of talk about memristors, applications for the devices have been less obvious. Could it be the technology has finally found a role?