Everything, everywhere: Soon humans will be marooned in a sea of machines which talk to other machines
Computing is now well on its way to becoming invisible, a prospect that the late Mark Weiser, former PARC chief technology officer, saw more than 20 years ago.
In a 1991 essay for Scientific American, he wrote: "The most profound technologies are those that disappear. They weave themselves into the fabric of everyday life until they are indistinguishable from it."
When Weiser wrote his essay, some 50 million personal computers had been sold. Most of those could only connect to each other using a modem; the internet – a preserve of academia and the military – had yet to be opened to the wider public.
The GSM Association, which represents mobile operators, claimed last year that, by 2020, the number of computing devices that can connect to a network will total 25 billion, with PCs and laptops likely to represent 10% of the total (see fig 1). Half of those billions of connected devices will be machines relaying data to other machines (see fig 2). Things that will only ever talk to other things will outnumber people by two to one. If the prediction holds up, the Internet of Things will have arrived – and its devices are already disappearing into the background.
At the end of the 1990s, University of California, Berkeley researcher Kris Pister proposed to the US Defense Advanced Research Projects Agency (DARPA) the idea of aircraft scattering lightweight sensors in much the same way as they had dropped propaganda leaflets in the past. These tiny sensors, each powered by a hearing aid battery, would listen out for vehicles as they passed in the night, then relay that information to military commanders.
Pister envisaged that the sensors could even be dropped onto vehicles by small self guiding or remote controlled drones, then report on those vehicles' movements. He went on to found Dust Networks to commercialise the 'smart dust' idea; Linear Technology bought the company in 2011.
In Europe, Philips Electronics' HomeLab developed the concept of ambient intelligence. It was devised initially as the cornerstone of home automation, in which sensors spread around the house would react to the movements of its occupants. Residents did not have to carry keys: a camera would recognise them as they walked up the path. Once inside, the lights would turn on automatically and, because their mobile phone had reported their proximity to the house, the heating would already be switched on.
The automatic home is still more science fiction than reality but there are indications that, with the right incentives, it could become a reality. Andy Stanford-Clark, CTO of smart energy technologies at IBM, has been pioneering an effort around his home on the Isle of Wight that uses networked embedded devices to save energy. Stanford-Clark's home is fitted with sensors that report on energy use and other activities. The loft, for example, has electronic rodent traps that generate alerts when they catch something – 'mouse events', in a pithy nod to programming jargon.
The idea has been extended to the island's Chale project, which is meant to bring households out of fuel poverty by making smarter use of renewable energy. People enrolled in the project have access to free energy generated by solar panels and wind turbines, among other sources, only paying if their consumption exceeds the capacity of the renewables. The idea is that a washing machine and dishwasher would only be used at the same time if there is enough renewable energy capacity – if not, one is switched off in favour of the other. Early indications are that, on average, energy bills are being cut by as much as half.
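That load-shifting rule is simple enough to sketch. The snippet below is purely illustrative – the capacity figure, appliance wattages and function names are invented, not taken from the Chale project's actual control system:

```python
# Illustrative load-shifting rule in the spirit of the Chale project.
# All figures and names are assumptions, not the project's real software.

RENEWABLE_CAPACITY_W = 2500   # assumed solar/wind output available right now

APPLIANCE_DEMAND_W = {
    "washing_machine": 2000,  # assumed power draw in watts
    "dishwasher": 1800,
}

def schedule(requested, capacity_w):
    """Let appliances run together only while their combined demand fits
    within the renewable capacity; otherwise defer the later request."""
    running, load = [], 0
    for name in requested:
        demand = APPLIANCE_DEMAND_W[name]
        if load + demand <= capacity_w:
            running.append(name)
            load += demand
        else:
            print(f"Deferring {name}: only {capacity_w - load}W of renewable capacity left")
    return running

print(schedule(["washing_machine", "dishwasher"], RENEWABLE_CAPACITY_W))
# With these numbers only the washing machine runs; the dishwasher waits its turn.
```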
The concepts behind ambient intelligence have already spread into the commercial world. Offices are routinely being built with sensors that track the movements of occupants so that empty rooms are not kept lit. By sending commands to networked light fittings, an open plan office can be split into separately lit segments: the lamps are simply assigned to different groups in control software and a few partitions are moved around to create the walls.
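The grouping itself is little more than bookkeeping. A sketch, with invented fitting identifiers and zone names rather than any particular lighting controller's API:

```python
# Hypothetical lighting groups: reassigning a fitting to another group
# effectively moves it to a different 'room' without any rewiring.
groups = {
    "zone_a": {"lamp-01", "lamp-02", "lamp-03"},
    "zone_b": {"lamp-04", "lamp-05"},
}

def switch_zone(groups, zone, on):
    """Issue an on/off command to every fitting assigned to a zone."""
    for lamp in sorted(groups[zone]):
        print(f"{lamp}: {'ON' if on else 'OFF'}")   # stand-in for a real network command

# An occupancy sensor reports that zone_b is empty, so its lamps go off
switch_zone(groups, "zone_b", on=False)
```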
More advanced uses of sensors are springing up. One example is using vibrations to detect when a window has broken. When glass shatters, it generally produces a strong frequency peak at 13kHz, whereas an object simply hitting the window and bouncing off produces a much broader spectrum. So, with the help of a Fourier transform to analyse the spectrum of sudden strong vibrations, the sensor can alert security to a potential break-in. In the wider world, a similar technique is being used to alert emergency services to avalanches on ski runs: the approaching wall of snow has its own characteristic acoustic signature.
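That detection step can be sketched in a few lines. Here numpy stands in for whatever firmware such sensors actually run, and the sampling rate, band width and threshold are assumptions:

```python
import numpy as np

SAMPLE_RATE = 48_000      # assumed sampling rate, Hz
GLASS_PEAK_HZ = 13_000    # characteristic peak of shattering glass
BAND_HZ = 1_000           # assumed half-width of the band searched
THRESHOLD = 5.0           # assumed ratio of in-band peak to mean spectral energy

def looks_like_breaking_glass(samples: np.ndarray) -> bool:
    """Return True if a burst of vibration shows a strong, narrow peak near
    13kHz rather than the broad spectrum of a simple impact."""
    spectrum = np.abs(np.fft.rfft(samples))
    freqs = np.fft.rfftfreq(len(samples), d=1.0 / SAMPLE_RATE)
    band = (freqs > GLASS_PEAK_HZ - BAND_HZ) & (freqs < GLASS_PEAK_HZ + BAND_HZ)
    return bool(spectrum[band].max() > THRESHOLD * spectrum.mean())
```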
The connection between weather and energy has provided the impetus for the Solar Energy Research Institute of Singapore (SERIS) to deploy 70 embedded computers across the island. They will provide accurate data on cloud cover in real time to energy companies.
At a large scale, Doppler radar can capture clouds as they head towards the island. Using that data, a model can calculate how solar yields will fall as the clouds pass over different photovoltaic installations, but the predictions can be made more accurate with the ground level sensors, which report their readings over the 3G wireless network.
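As a purely illustrative sketch of that correction step – the weighting, loss factor and numbers below are all assumptions, not SERIS's model – the forecast yield can be scaled by the cloud cover the ground sensors actually observe:

```python
def corrected_yield_kw(clear_sky_kw, radar_cloud_fraction, sensor_cloud_fraction):
    """Blend the radar-based cloud forecast with the ground sensors' local
    measurement (assumed 70/30 weighting) and derate the clear-sky yield."""
    cloud = 0.7 * sensor_cloud_fraction + 0.3 * radar_cloud_fraction
    return clear_sky_kw * (1.0 - 0.75 * cloud)   # assume 75% loss under full cloud

# Example: a 5kW installation; radar forecasts 40% cloud, the local sensor sees 60%
print(round(corrected_yield_kw(5.0, 0.4, 0.6), 2))   # -> 2.98
```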
Traffic management provides a large opportunity for the Internet of Things and it may not need many new sensors to be installed. Once connected to the internet over a wireless network, satnavs become useful sources of congestion data. Even if only a fraction of cars on a motorway have a GPS, the fact that a group of them along a stretch reports a speed of 10mph quickly signals that something is wrong.
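A toy version of that inference – assuming each connected satnav periodically reports its road segment and speed, with made-up thresholds – might look like this:

```python
from collections import defaultdict
from statistics import median

CONGESTION_MPH = 20   # assumed speed below which a motorway counts as congested
MIN_REPORTS = 5       # assumed minimum number of probe vehicles before alerting

def congested_segments(reports):
    """reports: iterable of (segment_id, speed_mph) tuples from probe vehicles.
    Returns the segments whose median reported speed is suspiciously low."""
    by_segment = defaultdict(list)
    for segment, speed in reports:
        by_segment[segment].append(speed)
    return [seg for seg, speeds in by_segment.items()
            if len(speeds) >= MIN_REPORTS and median(speeds) < CONGESTION_MPH]

# A handful of cars crawling at around 10mph on one stretch is enough
sample = [("M3-J4", 10), ("M3-J4", 12), ("M3-J4", 9), ("M3-J4", 11),
          ("M3-J4", 10), ("M25-J15", 68), ("M25-J15", 70)]
print(congested_segments(sample))   # -> ['M3-J4']
```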
Some applications will need fixed sensors. The SmartSantander project, for example, has started a trial of parking sensors in the Spanish city. Sensors packed into a can and buried in the tarmac use changes in a magnetic field caused by a metal car chassis to detect whether each designated parking space is occupied or not. When a car pulls in or moves away, the sensors use the Zigbee low power wireless standard to send a message to relays mounted on streetlamps, which then pass the data to a central computer (see fig 3). In principle, drivers could be guided to available parking spaces, rather than driving around slowly looking for them – a major source of urban congestion.
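In outline – with invented field values, thresholds and message fields rather than SmartSantander's real protocol – each sensor's job reduces to something like this:

```python
import json

BASELINE_uT = 48.0        # assumed background magnetic field, microtesla
OCCUPIED_DELTA_uT = 6.0   # assumed change caused by a car chassis overhead

def parking_update(sensor_id, field_uT, was_occupied):
    """Compare the magnetometer reading with the baseline and emit a message
    for the streetlamp relay only when the occupancy state changes."""
    occupied = abs(field_uT - BASELINE_uT) > OCCUPIED_DELTA_uT
    if occupied != was_occupied:
        # In the real deployment this would travel over Zigbee to a relay on a
        # streetlamp; here we simply serialise the payload.
        return occupied, json.dumps({"sensor": sensor_id, "occupied": occupied})
    return occupied, None

state, msg = parking_update("bay-17", 57.2, was_occupied=False)
print(msg)   # {"sensor": "bay-17", "occupied": true}
```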
The sheer volume of devices that could join the internet in the medium term will reshape the way in which high performance computing is performed, putting much more emphasis on real time data mining.
An example of the way in which data picked up electronically in real time can be applied on a massive scale lies in Tesco's supply chain – although this data is captured from barcodes, rather than active sensors. At 4am almost every morning, trucks roll out from the supermarket chain's huge warehouses around the UK. Their contents are determined by the output of a statistical model fed not just by the barcode data collected at thousands of checkouts, but also by the local weather forecast and other seasonal data.
Tesco runs the software across two IBM mainframes and a Teradata data warehouse system that stores billions of transactions from the chain's recent trading history. Not surprisingly, hot weather leads to big increases in the sales of barbecue meat, but hidden within the data are more subtle effects, such as the 'first hot weekend effect': across two consecutive hot weekends, salad sales will be higher on the first. To analyse the data every day, Tesco's software engineers built a regression model in Matlab that takes in weather and sales data. At the moment, Tesco has to extract the information from the Teradata machine and feed it to computers that can run Matlab, but the supermarket's engineers want to get to the stage where they can run Matlab inside the Teradata box in real time.
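Tesco's production model runs in Matlab against the Teradata warehouse; purely to illustrate the shape of the problem, a simple weather-to-sales regression can be sketched in Python with invented figures:

```python
import numpy as np

# Invented history for one store: peak temperature (C), a flag for the first
# hot weekend of a spell, and salad units sold. Real inputs would come from
# the checkout data warehouse and the local forecast.
temperature   = np.array([14, 17, 22, 26, 25, 19, 27, 28])
first_hot_wkd = np.array([ 0,  0,  1,  1,  0,  0,  1,  0])
salad_sales   = np.array([310, 340, 520, 640, 560, 390, 680, 600])

# Ordinary least squares: sales ~ b0 + b1*temperature + b2*first_hot_weekend
X = np.column_stack([np.ones_like(temperature), temperature, first_hot_wkd])
coeffs, *_ = np.linalg.lstsq(X, salad_sales, rcond=None)

print("coefficients:", np.round(coeffs, 1))
print("forecast for a 27C first hot weekend:",
      round(coeffs[0] + coeffs[1] * 27 + coeffs[2]))
```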
Bin Guo and colleagues from the Northwestern Polytechnical University of China call this kind of analysis of patterns from many different sensors and locations 'embedded intelligence'. They see artificial intelligence and data mining techniques used to process 'digital traces that can be compiled into comprehensive pictures of human daily facets, with the potential to transform our understanding of our lives, organisations as well as societies'.
The compute power needed to analyse entire transport networks to advise drivers and public transport commuters on which route to take is likely to be immense, which is potentially bad news for energy efficiency. Data centres already consume 1.5% of all electricity generated. The data deluge from smart sensors could demand even more power. The European Commission and research groups around the continent see the combination of cloud computing and ambient intelligence providing an opportunity for manufacturers in the region to claw their way back into mainstream computing. They are working on the principle that tomorrow's servers will demand large numbers of low power microprocessors similar to those used in mobile handsets.
Much of the compute power may not even sit in data centres. Luigi Grasso, a research fellow within the pan European CATRENE project, expects cellular basestations to provide a first layer of filtering and processing for much of the data coming from sensor networks. They will, in effect, become embedded supercomputers. For this kind of system, the Technical University of Dresden, working with commercial partners such as AMD and IBM, is putting together plans for a billion core machine that can fit into a 10 x 10 x 10cm box. As well as demanding more powerful multicore processors – combining hundreds or even thousands of 32 or 64bit engines – the system architecture calls for short range wireless communications between boards inside the cube to reduce the power and physical interconnect that would be needed to support efficient multiprocessing.
A number of potential obstacles stand in the way of the rise of the Internet of Things. One is simply a matter of address space. Potentially, there could be trillions of nodes accessible using the Internet Protocol (IP) if objects as simple as light bulbs and pallets carrying active RFID tags join the network. The version of IP in widespread use today is almost out of addresses just counting the servers and internet gateways already in use around the world. Its 32bit address space only allows for 4 billion individual nodes, and not all of that space is realistically accessible: large blocks are dedicated to the US military and some large corporates.
IPv6 expands the address field to 128bit, clearing away any concerns over available addresses: the supply is, for practical purposes, inexhaustible, with the field allowing for some 300 undecillion (3 x 10³⁸) IDs. The shift to IPv6 on the global internet has been extremely slow, which might look as though it will slow down the development of the Internet of Things. In practice, concerns over security may mean IPv4 remains suitable for long enough for the Internet of Things to become established.
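The arithmetic behind the two address spaces is straightforward:

```python
ipv4_addresses = 2 ** 32    # 32bit address field
ipv6_addresses = 2 ** 128   # 128bit address field

print(f"IPv4: {ipv4_addresses:,}")      # 4,294,967,296 – about 4 billion
print(f"IPv6: {ipv6_addresses:.3e}")    # ~3.403e+38 – roughly 300 undecillion
print(f"ratio: {ipv6_addresses // ipv4_addresses:.2e}")   # ~7.92e+28 times larger
```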
Network address translation (NAT) has become a mainstay of the internet. Many organisations have hundreds or thousands of machines sitting behind a NAT server, so the traffic, as seen by routers on the internet, appears to come from just one machine. The NAT server maintains a table mapping internal addresses to the external addresses and ports it presents to the internet, so it can direct incoming packets to the right machines on the local side of the network. This has a spin-off benefit in that the NAT server can also act as a simple firewall. Without its cooperation, no one outside the network can find out the internal address of a target machine. If they cannot find it, they cannot obtain data from it or send it commands.
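A minimal model of that translation table – a sketch of the mechanism, not any particular NAT implementation – shows why unsolicited traffic goes nowhere:

```python
import itertools

class NatTable:
    """Toy port-based NAT: many private hosts share one public address."""

    def __init__(self, public_ip):
        self.public_ip = public_ip
        self.next_port = itertools.count(40_000)   # assumed external port range
        self.mappings = {}                         # external port -> (private ip, port)

    def outbound(self, private_ip, private_port):
        """Rewrite an outgoing connection so it appears to come from the NAT box."""
        ext_port = next(self.next_port)
        self.mappings[ext_port] = (private_ip, private_port)
        return self.public_ip, ext_port

    def inbound(self, ext_port):
        """Only traffic matching an existing mapping reaches a private host."""
        return self.mappings.get(ext_port)   # None means dropped, firewall-style

nat = NatTable("203.0.113.7")
print(nat.outbound("192.168.1.20", 51234))   # ('203.0.113.7', 40000)
print(nat.inbound(40000))                    # ('192.168.1.20', 51234)
print(nat.inbound(40001))                    # None – unsolicited packet is dropped
```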
NAT is far from perfect: in order to communicate with the outside world, machines on the local side have to punch holes in the firewall, and those holes can provide a way in for a hacker. The degree to which a hacker can cause problems on a network of embedded devices varies from the inconvenient – a remote vandal turning electrical appliances on and off – to the dangerous. An attacker might, for example, override alarms from a roadside network that warn drivers and vehicles of obstacles or foggy conditions ahead.
Companies involved in the smart grid expect to build a high degree of security into the energy meters and controllers that connect to their networks to try to prevent hackers reprogramming those devices, and any sensors sitting behind them. Ultimately, the embedded devices themselves are likely to acquire more security features to reduce dependence on firewalls that hackers may work their way around.
The final obstacle is social. There is a question over how comfortable people will be if they are aware they are being tracked in real time every minute of the day, even if the aim is to help them reduce their energy consumption or commute time. This is one reason why the European Commission launched a consultation to work out how to update European data protection legislation.
The first step is a survey to try to work out how much privacy people will give up to support aims such as greater energy efficiency and what safeguards might be needed. If they do accept the greater intrusiveness of a constantly monitored environment, the Internet of Things will slide quietly into the background, even as its node count reaches into the trillions.