Electronics interfaces help restore communication and mobility to the paralysed
Using the mind to control objects – telekinesis – is a favourite of believers in psychic powers, but hardly a proper target for science. Even so, we are getting very close to something like it. A combination of sophisticated electronic interfaces and pioneering neurophysiological work is producing remarkable advances that may soon enable totally paralysed people who cannot talk to communicate effectively again, and allow others fitted with prosthetic limbs to control them by thought alone.
There is nothing supernatural about this – by using electronics attached to the surface of the brain, or implanted inside, effective brain computer interfaces (BCIs) are being created.
One of the latest advances came in April, when a US team successfully used a relatively new technique – electrocorticography (ECoG) – to create an interface to the speech centre of patients' brains. Using it, patients learned to control a computer cursor, achieving up to 90% accuracy after just 15 minutes' training. This is the first time brain areas related to speech have been used in BCIs; previously, brain signals for BCIs had been taken only from the sensorimotor cortex.
ECoG involves placing electrodes directly onto a patient's brain to record the electrical activity generated by the firing of neurons. It has previously been used to identify regions of the brain that cause epilepsy and has led to effective treatments, but it is now being seen as an important tool for creating BCIs.
"ECoG has emerged as a new signal platform for BCI systems," says Dr Eric Leuthardt, leader of the speech centre project at Washington University in St Louis. "Compared with scalp recorded electroencephalographic (EEG) signals, ECoG has much larger signal magnitude, increased spatial resolution and higher frequency bandwidth (0 to 500Hz versus 0 to 40Hz for EEG)."
This is significant, because amplitudes in frequencies higher than 40Hz carry information that appears to be particularly amenable to BCI operation.
"These signals, which are challenging to detect with EEG, are thought to be produced by smaller cortical assemblies and show stronger correlations with neuronal action potential firings than classic lower frequency rhythms. Furthermore, these high frequency changes have also been associated with numerous aspects of speech and motor function in humans. In summary, there is substantial evidence that ECoG should have critical advantages for BCI operation," he says.
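The advantage Leuthardt describes – that ECoG resolves frequencies well above the roughly 40Hz ceiling of scalp EEG – comes down to being able to measure spectral power in those higher bands. A minimal sketch of the idea, using the Goertzel algorithm to estimate power at single frequencies in a synthetic trace (the signal, frequencies and amplitudes here are invented for illustration):

```python
import math

def goertzel_power(samples, sample_rate, freq):
    """Estimate signal power at one frequency via the Goertzel algorithm."""
    n = len(samples)
    k = round(freq * n / sample_rate)      # nearest DFT bin
    coeff = 2 * math.cos(2 * math.pi * k / n)
    s_prev = s_prev2 = 0.0
    for x in samples:
        s = x + coeff * s_prev - s_prev2
        s_prev2, s_prev = s_prev, s
    return s_prev2 ** 2 + s_prev ** 2 - coeff * s_prev * s_prev2

# Synthetic 1-second trace: a strong 10Hz rhythm (visible to EEG) plus a
# weaker 80Hz "high-gamma" component (above the EEG bandwidth).
rate = 1000
signal = [math.sin(2 * math.pi * 10 * i / rate)
          + 0.3 * math.sin(2 * math.pi * 80 * i / rate)
          for i in range(rate)]

low = goertzel_power(signal, rate, 10)
high = goertzel_power(signal, rate, 80)
print(low > high > 0)  # True: the 80Hz power is real but much weaker
```

The point of the sketch is that the 80Hz component is present and measurable when the recording bandwidth allows it; a recording chain limited to 0 to 40Hz would discard it entirely.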
In his project, four patients tried to move a cursor on a monitor toward a target using predefined words associated with specific directions. For instance, saying or thinking of the word 'AH' would move the cursor right.
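Once a word has been decoded from the brain signal, the control scheme itself is simple. A sketch of the decode-and-move loop – note that only the 'AH'-means-right mapping comes from the study; the other tokens and the step size are invented for illustration:

```python
# Token-to-direction map. 'AH' -> right is from the study's example;
# the remaining tokens are hypothetical placeholders.
DIRECTIONS = {
    "AH": (1, 0),    # right (as in the study)
    "OO": (-1, 0),   # left  (hypothetical)
    "EE": (0, 1),    # up    (hypothetical)
    "EH": (0, -1),   # down  (hypothetical)
}

def move_cursor(position, decoded_tokens, step=10):
    """Apply each decoded token as a cursor displacement in pixels."""
    x, y = position
    for token in decoded_tokens:
        dx, dy = DIRECTIONS.get(token, (0, 0))  # ignore unknown tokens
        x += dx * step
        y += dy * step
    return (x, y)

print(move_cursor((0, 0), ["AH", "AH", "EE"]))  # (20, 10)
```

The hard part of the system, of course, is not this loop but reliably classifying the spoken or imagined word from the ECoG signal in the first place.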
Clearly, successfully acquiring brain signals is a critical part of any ECoG-based BCI. To do this, electrode arrays were placed on the surface of the patients' brains. These consisted of 64 electrodes spaced 10mm apart; one patient also had an experimental microarray of 16 microwires, 75µm in diameter, spaced 1mm apart. The electrocortical signals were acquired using biosignal amplifiers made by Austrian company g.tec, with all signals internally sampled at 38.4kHz through a 6.6kHz low pass filter.
An important aspect of Dr Leuthardt's study is that it has shown ECoG can use brain areas not directly involved with motor movements, but with cortical areas associated with speech.
"This system may provide distinct advantages for neuroprosthetic operation," he says. "Both real and imagined speech are commonly utilised in day-to-day life – we are often talking to others overtly and ourselves covertly. Thus, using this cognitive operation may offer the opportunity for a more intuitive and easily operable system. It is encouraging that two of our four patients immediately had greater than 90% accuracy without any prior training, and all achieved high levels of control within minutes."
Dr Leuthardt's area of study is very much work in progress – several academic papers will appear in the next few months, concerning higher levels of speech decoding and the effects of age on brain computer interfaces.
"From the electronic hardware point of view, I think the key here is making small high density electrode arrays that have an integrated ability to amplify, process, and transmit signals wirelessly. I think the technology is present to implement that, but it has not yet been fully integrated. With non invasive approaches, we need to develop techniques for achieving better signal to noise ratios and access to higher frequency rhythms in the brain. For invasive approaches, making the prospective implants small and minimally invasive is key. And for applications, in the near future, we will likely see small implants for people with severe motor disabilities that will facilitate their ability to communicate."
Speech is also at the heart of another BCI developed by US researchers, led by Frank Guenther at Boston University. Here, the aim was to help people with so called locked in syndrome – a dreadful condition in which a person's cognitive capacity is unaffected but they are almost completely physically paralysed, perhaps able to move an eye at most. But even though they cannot talk, the firing of neurons in the brain area responsible for speech motor planning is retained.
Guenther calls the system a brain machine – as opposed to computer – interface (BMI) and he says it is radically different from previous versions that have tried to help such people.
"Current BMIs for restoring communication can provide important capabilities via a typing process, but unfortunately they are only capable of slow communication rates. We use a novel approach to speech restoration in which we decode continuous auditory parameters for a real time speech synthesiser from neuronal activity in motor cortex during attempted speech."
Years ago, a special electrode was implanted in the patient's brain near the boundary between the speech related premotor and primary motor cortex.
"The electrode used in the current study was a Neurotrophic Electrode designed for permanent human implantation," Guenther says. "Neurites (projections from neurons) grow into the electrode cone, resulting in signalling patterns on the electrode wires within 3 to 4 months of implantation that are maintained indefinitely. Neural signals detected by this electrode were used to drive continuous 'movements' of a speech synthesiser that provided audio output to the user in real time. Thus the user received immediate auditory feedback of his ongoing speech that allowed him to improve his utterances with practice."
The system is 'telemetric' – it requires no wires or connectors passing through the skin, eliminating the risk of infection.
"Signals from the two channel Neurotrophic Electrode are amplified and converted into FM radio signals for wireless transmission across the scalp," Guenther says. "During data collection, the subcutaneous electronics are powered by an induction power supply via a power coil temporarily attached to the subject's head using a water soluble paste. Two additional coils act as receiving antennae for the FM signals."
The signals are then routed to a Cheetah electrophysiological recording system, made by US company Neuralynx, which specialises in making recording systems for neuroscience. This digitises and processes the signals, then sends them to a neural decoder, whose output constitutes the input to a speech synthesiser that provides the subject with audio feedback. The neural decoder and speech synthesiser are implemented on a desktop computer.
"The delay from neural firing to corresponding audio output is 30 to 70ms, with an estimated average delay of 50ms, approximating the delay from motor cortical activity to corresponding speech sound output in a neurologically intact human," Guenther says.
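The decode step in this pipeline maps neural firing rates to continuous synthesiser parameters, as Guenther describes. A minimal sketch of that mapping as a linear decoder producing two formant-like frequencies – the weights, baselines and firing rates here are entirely invented, whereas the study's actual decoder was fitted to the patient's own neural data:

```python
# Illustrative decode step: a linear map from per-unit firing rates to two
# continuous synthesiser parameters (formant-like frequencies, in Hz).
# All numbers below are assumptions for demonstration only.
def decode_formants(firing_rates, weights, baseline):
    """Map a vector of firing rates to (F1, F2) synthesiser targets."""
    f1 = baseline[0] + sum(w * r for w, r in zip(weights[0], firing_rates))
    f2 = baseline[1] + sum(w * r for w, r in zip(weights[1], firing_rates))
    return (f1, f2)

weights = [[2.0, -1.0], [5.0, 3.0]]   # illustrative per-unit gains
baseline = (500.0, 1500.0)            # roughly neutral-vowel formants

print(decode_formants([10.0, 4.0], weights, baseline))  # (516.0, 1562.0)
```

In the real system this mapping runs continuously, so each new window of neural activity nudges the synthesiser's output within the 30 to 70ms window Guenther describes, closing the auditory feedback loop.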
The Neurotrophic Electrode was invented by Dr Philip Kennedy, a pioneer of mind control whose work goes back to the late 1980s. He founded Neural Signals, which has developed two basic types of communicator for paralysed patients: the Brain Communicator, which is specifically for brain stroke or traumatic injury patients who are locked in or nearly locked in, but still retain normal cognitive function; and the Muscle Communicator, which is for moderately to severely paralysed patients who can intentionally control muscle movement in the jaw, eye or any part of the body.
The Brain Communicator uses either an implanted neurotrophic electrode or patented conductive screws implanted in the skull without entering the brain. The conductive screws detect cortical signals more reliably than conventional scalp electrodes can. The Muscle Communicator is available in three versions: EMG, which uses a removable skin surface electrode to detect any slight muscle movement and uses the signals to activate a switch; EOG, which uses a removable surface electrode to detect movement of eye muscles and uses the signals to activate a switch or move a cursor on a computer screen; and a piezoelectric switch that can be turned on or off by slight movement of jaw, hand or foot muscle.
On the same lines is the Thought Translation Device developed by German researchers at the University of Tübingen. This BCI has enabled totally paralysed patients to communicate by self regulating their EEG signals, using feedback. Led by Niels Birbaumer, the team is now looking to develop non invasive, wireless versions. But he says the main challenge today is not principally technological.
"We need to get industry more interested in building and selling such systems, and get family doctors more interested, so they know what can now be done for paralysed patients."
Feedback has also been used by neuroscientists at the Mayo Clinic in Florida, but instead of using EEG signals, they exploited ECoG, as their patients already had electrodes fitted. Patients looked at a screen showing a single alphanumeric character and, each time a certain letter flashed, a computer recorded the brain's response. The patients were then asked to think of specific letters, and the computer recorded the information and calibrated the system to each patient's specific brain-wave responses. When the patient then thought of a letter, it appeared on the screen.
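The calibrate-then-predict procedure the Mayo team used can be sketched as template matching: average each letter's recorded responses into a template during calibration, then classify a new response by its nearest template. The toy response vectors below stand in for real ECoG features, and the nearest-template classifier is an illustrative simplification, not necessarily the team's actual method:

```python
# Toy sketch of calibrate-then-predict letter selection.
# "Responses" are short feature vectors standing in for real ECoG data.
def calibrate(trials):
    """Average each letter's recorded responses into a template."""
    templates = {}
    for letter, responses in trials.items():
        n = len(responses)
        templates[letter] = [sum(vals) / n for vals in zip(*responses)]
    return templates

def predict(templates, response):
    """Return the letter whose template is closest (squared distance)."""
    def dist(template):
        return sum((a - b) ** 2 for a, b in zip(template, response))
    return min(templates, key=lambda letter: dist(templates[letter]))

trials = {
    "A": [[1.0, 0.1], [0.9, 0.2]],   # two calibration trials for 'A'
    "B": [[0.1, 1.0], [0.2, 0.8]],   # two calibration trials for 'B'
}
templates = calibrate(trials)
print(predict(templates, [0.95, 0.15]))  # A
```

Calibrating per patient is what lets the system cope with the fact that no two brains produce identical responses to the same flashed letter.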
"We were able to consistently predict the desired letters for our patients at or near 100% accuracy," says lead investigator Dr Jerry Shih. "While this is comparable to other researchers' results with EEGs, this approach is more localised and can potentially provide a faster communication rate."
Finally, at the University of Pittsburgh, Andrew Schwartz and fellow researchers are aiming to extend mind control so brain signals can control prosthetic limbs. Three years ago, Schwartz showed that monkeys could control a simple robotic limb via electrodes implanted in the motor cortex. By working out the electrical language the cortex uses to guide arm movement, Schwartz converted those signals into instructions for a crude robotic limb.
Now, the aim is to enable them to use a more sophisticated limb that nearly matches the human arm and hand, known as the Modular Prosthetic Limb (MPL). Schwartz is aiming to teach the monkeys to use the five fingered MPL and perform many everyday tasks, controlling it from the brain. If it works, he plans to give people with spinal cord injuries, paralysed from the neck down, a chance to try the MPL.
Ultimately, mind control could spell the end of paralysis for many sufferers.