Home help: How robots could provide a solution to our ageing population
No-one is getting any younger, and far more of us are living long enough to grow old. By 2060, the proportion of people in the European Union aged 65 and over will almost double, to 29.5% of the population, according to Eurostat. The proportion aged 80 and above is expected to triple.
Unless we see breakthroughs in the treatment of neurodegenerative diseases, such as Alzheimer's, and a reduction in the incidence of strokes, the rise in the number of older people will be accompanied by a rise in conditions that need long-term care and assistance.
The cost of care – a labour-intensive process – is rising fast across European nations and threatens to outstrip taxpayers' ability to fund it. A number of research programmes are underway to find technological ways of providing some of that care which, even if they cannot entirely substitute for traditional assistance, can at least reduce the burden. But technology faces its own challenges.
Assisted living poses a dual problem: it needs systems that are inexpensive, yet which stretch the abilities of existing technologies to breaking point. The systems need to be so easy to use that they effectively set themselves up, and to offer a level of responsiveness to users far better than most user interfaces in use today.
These requirements contrast sharply with technological reality. In recent years, robotics researchers have taken a step back and reassessed what their creations are capable of. At the 2012 Turing Centenary Conference in Manchester, leading robotics scientist Professor Manuela Veloso of Carnegie Mellon University said robots are likely to suffer from 'tremendous limitations' for years to come.
Prof Veloso pointed to the problem of achieving what are simple functions for most humans – turning door handles and climbing stairs. Showing a picture of the many different types of door handle that exist, she asked: "What kind of machine learning will solve this?"
Leila Takayama, who recently moved from robotics lab and business incubator Willow Garage to Google's X research group, knows how hard it is for a robot to open a door. "When a robot tries to figure out how to open a door, it just sits there thinking about it for a long time," she says, adding that the robot does not like to be disturbed while it is sitting and thinking.
Someone opening the door themselves would disrupt the programming, forcing the robot to start thinking about the problem all over again before it could finally make a move and try to turn the handle. Until robots learn to do these everyday tasks – even standing up unaided is a major achievement for a legged robot – the answer seems to be for them to ask for help.
At Carnegie Mellon, for example, Prof Veloso's robots are programmed to seek assistance. They will stand near a lift or a door and ask passers-by to press a button or open the door for them so they can move through. Takayama believes in giving robots human- or animal-like gestures so they can show disappointment, or even shame, when they get something wrong. These simple gestures would also help show when a robot is genuinely trying to figure something out, such as a door handle, or is simply dormant. She recalls how, when robots were being trained to open doors, it was impossible to tell whether the robot was processing or not.
As a result, practical care robots may have a mutual dependency on the people they help to care for. They may perform some tasks well but, like a pet cat, call for help when they need to get to the other side of a door.
In some experiments, soft and cuddly 'pet bots' have proven to be popular. Paro, the robotic baby seal developed by Takanori Shibata at the National Institute of Advanced Industrial Science and Technology (AIST), demonstrated how a relatively simple processor and program – at least compared with the multicore engines inside fully mobile robots such as those made by Willow Garage – could elicit an emotional bond in people similar to the one they form with a living pet.
A pilot study, led by Professor Wendy Moyle of Griffith University in Australia and involving Northumbria University's Professor Glenda Cook and researchers from institutions in Germany, analysed the difference between interacting with Paro and taking part in a reading group.
Research has already shown that interaction with animals can have a positive effect on older adults, making them more sociable, more talkative and reducing their feelings of loneliness. However, putting animals into residential care homes increases the risk of infection and places a greater burden on the nursing staff.
This study by Prof Moyle and her colleagues suggested that Paro acted as a useful surrogate for a real pet and helped to reduce some of the symptoms of dementia, such as agitation, aggression, isolation and loneliness. Prof Cook says the presence of the robopet in the care home also helped start conversations between residents that would not otherwise have happened.
The choice of a seal pup in Shibata's original design was deliberate. Few people have direct contact with real baby seals in the wild, so they are less likely to have an Uncanny Valley reaction to a robotic analogue than they would to an electromechanical dog or cat.
The Uncanny Valley describes the revulsion people feel when a machine looks and moves almost, but not quite, like a natural human being.
In other experiments in assisted living, robots are being used more as mobile reminders than as active carers. One example is the Kompaï, made by Robosoft and used in the EU-funded Mobiserv research project. The conical robot sports a touchscreen panel on its front and moves around the home to offer advice, such as when to take pills, or to ask questions meant to stimulate activity. For example, if the user has not moved from their chair for a while, it can ask whether they are hungry or thirsty, prompting them to get up and make a sandwich or a coffee.
In the Mobiserv work, the robot did not rely on its own sensors alone. The project explored the use of sensors embedded in fabrics to help determine the user's state. For example, sensors could monitor heart rate and blood pressure to check they stay within safe limits. Accelerometers and pressure sensors extend a robot's ability to sense the user's condition – detecting whether someone has moved from their chair or bed, struggled to get up when they wanted to or, worse, suffered a fall after they stood up. At points like this, the robot can turn into an avatar for other people.
If someone has repeatedly fallen back when trying to leave their chair, the robot can put them in touch with a local doctor, nurse or care assistant to see whether they need help to get around, or treatment.
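Mobiserv has not published its detection rules, but the kind of monitoring described above amounts to a handful of simple checks over the sensor stream. As a rough Python sketch – the thresholds and the 'alert_carer'/'prompt_user' actions are illustrative assumptions, not project values – a wearable accelerometer can drive both the inactivity prompt and the fall alert:

```python
import math
import time

# Illustrative thresholds -- assumptions for this sketch, not Mobiserv values;
# a real system would tune these per user.
INACTIVITY_LIMIT_S = 2 * 60 * 60   # prompt after two hours without movement
FALL_SPIKE_G = 2.5                 # acceleration spike suggesting an impact
STILLNESS_G = 0.2                  # deviation from 1 g that counts as "still"

def magnitude(ax, ay, az):
    """Overall acceleration in g, independent of sensor orientation."""
    return math.sqrt(ax * ax + ay * ay + az * az)

def check_sample(sample, state):
    """Apply simple care rules to one wearable accelerometer reading.

    `sample` is a dict with keys ax, ay, az (in g) and t (seconds);
    `state` tracks the last movement and any suspected fall.
    """
    g = magnitude(sample["ax"], sample["ay"], sample["az"])

    if g > FALL_SPIKE_G:
        # A sharp spike is the first half of the classic fall signature.
        state["possible_fall_at"] = sample["t"]
        state["last_movement"] = sample["t"]
    elif state["possible_fall_at"] is not None and abs(g - 1.0) < STILLNESS_G:
        # Spike followed by stillness: escalate to a carer or doctor.
        return "alert_carer"
    elif abs(g - 1.0) > STILLNESS_G:
        # Ordinary movement: reset the inactivity clock, cancel suspicion.
        state["last_movement"] = sample["t"]
        state["possible_fall_at"] = None

    if sample["t"] - state["last_movement"] > INACTIVITY_LIMIT_S:
        return "prompt_user"  # e.g. ask whether they are hungry or thirsty
    return None

# Example: initialise state and feed in one reading (sitting quietly).
state = {"last_movement": time.time(), "possible_fall_at": None}
reading = {"ax": 0.0, "ay": 1.0, "az": 0.0, "t": time.time()}
print(check_sample(reading, state))  # -> None
```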
The robot as avatar need not be confined to medical calls. For users who find it difficult to move around, telepresence provides a degree of social stimulation that is important for slowing the advance of some cognitive diseases and for preventing older people from becoming isolated within their own homes.
Some of the Willow Garage projects have used robots as mobile telepresence machines that roll around under the control of a remote user. According to Takayama, although early implementations of this kind of mobile robot avatar felt 'real geeky', a researcher who worked from home miles away, and who had previously been available only through voice conferencing on a meeting-room table, became 'a real person, a part of the team'.
In the Mobiserv project, the Kompaï robot can ask whether the user would like to call a friend if they have not done anything social for a while. A natural extension would be to use its chest-mounted screen as the display for the friend at the other end of the videoconferencing link.
Open-source hardware and software, or proprietary hardware with extensive APIs, have made it possible not just for researchers but for users themselves to experiment with assistive technology and, in turn, to demonstrate what they really want from it.
Microsoft's Kinect motion-sensing system, developed for the Xbox, has become a favoured platform for experimenting with assistive technology. Although not open source, it is supported by a software development kit, and university groups have developed toolkits that work with prototyping software such as National Instruments' LabVIEW.
Willow Garage developed a series of robots, such as the PR2, designed to be modified by users in their own robotics experiments. This approach allowed Henry Evans, left paralysed after a brainstem stroke, to program a PR2 to do things no-one expected.
Takayama says: "We guessed wrong about what Henry wanted to do. The first thing he did was teach the robot to scratch his face."
Because able-bodied people deal with such problems almost unconsciously, they do not realise how often the face itches. Calling someone in to scratch your face every few minutes is not realistic, but a robot can be there to do it whenever it is needed – activated, in Evans' case, by movements of his head.
Other severely disabled users are trialling more invasive ways to control robotic manipulators. Some patients have had implants inserted into their brains that, after retraining of that part of the brain, let them control and manipulate a robot arm. Implants are intrusive and pose a risk of infection, but the hope is that wireless sensors in skull caps may become sensitive enough to pick up the electrical signals of multiple neurons at the surface of the brain, which could then be trained to control external robot limbs.
Another way to use the technology is to assist with recovery, rather than trying to replace lost functions. A number of robotics researchers believe this approach is likely to be more fruitful in the short to medium term, not least because it sidesteps many of the user-interface issues associated with assistive robots. Some robotic systems were originally devised as prosthetics, but researchers found there was likely to be greater value in using them for rehabilitation.
The leading cause of disability in industrialised countries is stroke, and many sufferers find they cannot control their limbs well after the event. For those who have lost motor skills, robots can provide a way of rehabilitating them by guiding their limbs until the brain has recovered enough function to control the movements by itself.
Robots are not always necessary in rehabilitation. According to the Stroke Association, almost 70% of stroke sufferers exhibit apraxia and action disorganisation syndrome (AADS), leaving them unable to perform apparently simple tasks, such as making a cup of tea, even though they do not have the motor-skill problems that can also result from a stroke. These problems make it much harder for sufferers to live independently in their own homes.
Cogwatch, an EU-funded project coordinated by the University of Birmingham, focuses on patients with AADS. At this stage, sensors are being fitted to everyday objects, such as cutlery, kettles and toothbrushes, to measure grip strength, motion and orientation. The devices transmit the sensor readings to a central server that records how the objects are being used.
The aim is to close the feedback loop with training programmes that help people recover the skills they once had, using the sensor readings to spot where they are going wrong and to personalise the guidance.
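Cogwatch has not published its task models, but the principle of that feedback loop is easy to sketch: compare the stream of events from the instrumented objects against a reference sequence for the task and flag the first deviation, which can then drive a personalised cue. A minimal Python illustration, with made-up step names rather than the project's own:

```python
# Reference sequence for one everyday task. Both the steps and the event
# names are illustrative assumptions, not Cogwatch's actual task models.
REFERENCE_TEA_TASK = [
    "kettle_filled",
    "kettle_switched_on",
    "teabag_in_cup",
    "water_poured",
    "milk_added",
]

def first_deviation(observed, reference=REFERENCE_TEA_TASK):
    """Return (step, expected, seen) for the first error, or None if clean."""
    for i, expected in enumerate(reference):
        seen = observed[i] if i < len(observed) else None
        if seen != expected:
            return (i, expected, seen)
    return None

# Example: the user poured water without switching the kettle on first.
observed = ["kettle_filled", "water_poured"]
error = first_deviation(observed)
if error:
    step, expected, seen = error
    print(f"Step {step}: expected '{expected}' but saw '{seen}' -- "
          "cue the user back towards the missed step.")
```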
In similar work, but without embedded sensors, researchers from the University of Sheffield used their Kinethesia toolkit, which links LabVIEW to the Microsoft Kinect, to build a prototype system designed to make it easier to diagnose motor-skill problems and to build physiotherapy programmes that the computer can help guide.
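Kinethesia itself is a LabVIEW toolkit, so the following Python fragment is only a schematic of the underlying idea: given per-frame joint positions from a skeleton-tracking camera such as the Kinect, compute a joint angle and summarise the range of motion – the kind of measure a guided physiotherapy programme could track over time. The joint names and data layout here are assumptions for illustration:

```python
import math

def joint_angle(a, b, c):
    """Angle in degrees at joint b formed by 3-D points a-b-c,
    e.g. shoulder-elbow-wrist for elbow flexion."""
    v1 = [a[i] - b[i] for i in range(3)]
    v2 = [c[i] - b[i] for i in range(3)]
    dot = sum(x * y for x, y in zip(v1, v2))
    norm = math.dist(a, b) * math.dist(c, b)
    return math.degrees(math.acos(max(-1.0, min(1.0, dot / norm))))

def range_of_motion(frames):
    """frames: per-frame dicts mapping joint names to (x, y, z) positions,
    as a skeleton-tracking camera would supply them (assumed format)."""
    angles = [joint_angle(f["shoulder"], f["elbow"], f["wrist"])
              for f in frames]
    return min(angles), max(angles)

# Example: two frames of a made-up arm-extension exercise.
frames = [
    {"shoulder": (0, 1.4, 0), "elbow": (0, 1.1, 0.1), "wrist": (0, 1.2, 0.35)},
    {"shoulder": (0, 1.4, 0), "elbow": (0, 1.1, 0.1), "wrist": (0, 0.85, 0.3)},
]
low, high = range_of_motion(frames)
print(f"Elbow moved through {high - low:.0f} degrees")
```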
Kinect is also helping with the senses. As people age, they are not only more likely to lose cognitive abilities; they are also likely to suffer greater problems with eyesight and hearing, making it far harder for them to engage with the world.
Stephen Hicks and colleagues at the University of Oxford found inspiration in Kinect's ability to track objects when they developed the Smart Specs prototype. The idea behind Smart Specs is to provide users who have very limited vision – perhaps reduced to the ability to sense broad colour changes or to recognise the outlines of large objects – with information about their surroundings so they can move around more easily.
Colours and different lighting patterns generated by LEDs built into the specs give the users visual clues about the object at which they are looking if they cannot guess what it is from shape alone.
Another strategy might be to combine senses. Using augmented reality as a form of artificial synaesthesia could help those who cannot see or hear well to navigate more easily. The prototype Smart Specs can relay spoken words through an earpiece, potentially decoding menus and signs using the built-in camera and processing.
The In Situ Audio Services project at McGill University uses 'earcons', the aural equivalent of visual icons, to provide a different type of audio cue. Evocative sounds played to users as they walk along a street can help them identify shops and restaurants, as well as bus stops and, potentially, approaching buses. A similar project, Timbremap, from the University of Toronto, uses changes in sound to guide a user's finger around objects in a photographic scene or a map, so older users need not lose memories captured in pictures from long ago.
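At heart, the earcon idea reduces to associating each point of interest with a short characteristic sound and playing it when the user comes within range. A minimal Python sketch – the coordinates, radius and sound files below are placeholders, not anything from the McGill project:

```python
import math

# Hypothetical points of interest: (latitude, longitude, earcon sound file).
POINTS_OF_INTEREST = [
    (45.5048, -73.5772, "cafe_cup_clink.wav"),
    (45.5052, -73.5760, "bus_stop_chime.wav"),
]

TRIGGER_RADIUS_M = 30.0  # play the cue when within 30 metres

def distance_m(lat1, lon1, lat2, lon2):
    """Rough equirectangular distance in metres; fine at street scale."""
    m_per_deg = 111_320.0  # metres per degree of latitude
    dx = (lon2 - lon1) * m_per_deg * math.cos(math.radians(lat1))
    dy = (lat2 - lat1) * m_per_deg
    return math.hypot(dx, dy)

def earcons_near(lat, lon):
    """Return the sounds the earpiece should play at this position."""
    return [sound for plat, plon, sound in POINTS_OF_INTEREST
            if distance_m(lat, lon, plat, plon) <= TRIGGER_RADIUS_M]

# Example: walking past the cafe triggers its earcon.
print(earcons_near(45.5049, -73.5771))  # -> ['cafe_cup_clink.wav']
```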
Although work continues on many fronts, some of the technologies that can help reduce isolation and provide a level of health protection already exist. As much of the focus in assisted living is on telepresence and on providing help and advice remotely, devices such as smart TVs and digital picture frames could serve as low-cost alternatives to dedicated robots such as the Kompaï.
Robots have plenty of development ahead of them, but the most prevalent device for assisted living may turn out to be the telly.