Tom Carter, Ultrahaptics' chief technology officer, explained the start-up's history. "The company's origins were in a research project during my degree course at Bristol University. I jumped at the chance to get involved and, when I completed that course, I thought it would make a good basis for a PhD thesis, so I developed the technology further.
"Once the technology was stable, we started to get interest from potential customers and span the company out of the university in 2013."
While much haptics technology involves people still needing to touch whatever it is they want to interact with, Ultrahaptics is looking to take the technology into what could be seen as a new dimension. Its approach is to use ultrasound to allow the user to 'feel' something that isn't there. It's what Carter calls 'non-contact contact'.
Carter noted: "Input technology has moved a lot more quickly than haptics technology has. Once the touchscreen was developed, it offered a more flexible user interface than simple buttons; you don't have to remember to press *#3 on your phone's keyboard to call up a menu, for example. But touchscreens offer less tactile feedback. Although you can get vibrational feedback on a touchscreen, the sensation is produced right across the screen."
One step beyond physical contact is gesture recognition, a method which is beginning to gain momentum, largely because of Microsoft's Kinect system. Kinect is a bar-shaped device which sits either above or below a display. It includes a camera, sensors and microphones, with software interpreting the captured data.
"The problem with systems like Kinect," said Carter, "is that while people are interacting with devices, they aren't touching them. The lack of feedback is, in my view, a massive loss.
"If you can solve that issue, then it will be a useful thing to have."
And that, essentially, is what Ultrahaptics has done. "The technology is all our own," Carter asserted. "We've worked out how to do haptics better."
He described the approach as similar to using a touchscreen. "But as you use our technology, the interface can change; our technology allows what you feel to change. The physical properties can change in a fluid manner and the more we explore the technology, the more benefits we are finding."
An example of the benefits, said Carter, is the fact that you don't have to hold your hand in exactly the right position. "With a physical system, if your hand is in the wrong place, you push the wrong button. With our haptic system, your hand can be roughly in the right place."
Haptics technology allows the user to 'feel the force'
Ultrahaptics is currently experiencing the growing pains of a start-up. It has received seed funding from IP Group and has started to expand the management team with experienced executives who can take the company to the next level.
One of the strategic appointments is that of Steve Cliffe as chief executive. He joined Ultrahaptics six months ago from Plessey Semiconductors, where he was business development director; before that, he held CEO roles.
"I'm staggered how fast we've moved, even in six months. We're ahead of the business plan in terms of customer engagements and are selling evaluation kits."
Ultrahaptics' evaluation kit consists of a 16 x 16cm device featuring 256 speakers. "It's that size because it's designed to be flexible," Carter pointed out. "It has to suit a range of customers; from those looking to implement systems with discrete buttons to virtual reality developers, who are looking at big systems."
The 256 speakers project enough ultrasound energy for users to 'feel' something at up to 1m from the evaluation kit. "While the kit has a small grid of speakers," Carter continued, "a production system doesn't have to be in that format; users would interact with that system simply by holding their hand above a surface into which the technology is integrated."
However, the smaller the system in an end product, the shorter the interaction range. "A range of around 1m is suitable for use in applications such as car dashboards," Carter suggested, "but a lot of applications will require a range of 20 to 40cm and our technology can scale down to that. When a product is small, the speaker grid and driver chip will be embedded."
The 256 speakers in the Ultrahaptics evaluation kit allow users to 'feel' an object at a distance of up to 1m
The 256 speakers, combined with the ability to focus them at particular points, allow high accuracy. "If you move your hand by even 5mm," Carter claimed, "the intensity will be much smaller; it's a discrete sensation."
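The focusing itself relies on standard phased-array physics. As a rough illustration only, and not Ultrahaptics' proprietary algorithm, the sketch below computes the phase offset each speaker in a 16 x 16 grid would need so that its wave arrives at a chosen focal point in step with all the others. The 40kHz frequency and 343m/s speed of sound are assumptions typical of airborne ultrasound, not figures from the company.

```python
import numpy as np

SPEED_OF_SOUND = 343.0   # m/s in air at room temperature (assumption)
FREQUENCY = 40_000.0     # Hz; typical airborne ultrasound transducer (assumption)
WAVELENGTH = SPEED_OF_SOUND / FREQUENCY  # roughly 8.6mm

# 16 x 16 grid of speakers on a 16cm x 16cm board, centred on the origin
PITCH = 0.16 / 16
coords = (np.arange(16) - 7.5) * PITCH
grid_x, grid_y = np.meshgrid(coords, coords)
speakers = np.stack([grid_x.ravel(), grid_y.ravel(), np.zeros(256)], axis=1)

def focus_phases(focal_point):
    """Phase offset (radians) for each of the 256 speakers so that all
    waves arrive at `focal_point` in phase, creating a localised
    pressure maximum the hand can feel."""
    distances = np.linalg.norm(speakers - np.asarray(focal_point), axis=1)
    k = 2.0 * np.pi / WAVELENGTH  # wavenumber
    # Advance each speaker by the phase its wave loses in flight
    return (k * distances) % (2.0 * np.pi)

# Example: focal point 20cm above the centre of the board
phases = focus_phases([0.0, 0.0, 0.20])
```

At 40kHz the wavelength is around 8.6mm, so moving a hand 5mm covers most of a focal spot's width, which is consistent with Carter's point about the intensity falling away sharply off-focus.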
Cliffe said interest is coming from what he calls 'PC-type companies'. "These have seen the development of 3D images, but now want to offer users the ability to 'touch' the images that come out of the screen. They're also interested in creating keyboards in free space.
"But we're also talking with the automotive sector, which has been working on gesture control for some time. But these companies have realised that if you create a gesture control system without any feedback, it means drivers have to take their eyes off the road for longer than they would with a button based interface. Our technology can give feedback in mid air, so there's a lot of interest.
"Because you can create different 'feels', the driver knows what the control is. Once they've used it a couple of times, they can do it 'with their eyes closed', which means they don't take their eyes off the road."
Cliffe also noted interest from the consumer electronics world. "One short-term opportunity is with audio systems, where users can turn the volume up and down or move to the next track. There are a lot of simple things that can be done using this approach."
The Ultrahaptics approach is 'input-agnostic', according to Carter. "It will work with any sensing technology. Developers could specify proximity detection, infrared, cameras or even projected capacitive systems.
"One thing we have found is that companies have been working on gesture recognition systems for some year and don't want to part with something they have spent a lot of money on. So we need to let them integrate that technology.
"The output will depend on the particular application. The sensing technology will detect where the hand is and what it's doing; the application software will then know what the interaction requires in terms of haptic feedback.
"The algorithms run quickly and the speakers turn on at the right time and with the right power."
The Ultrahaptics evaluation system is currently hosted by a PC; Carter said that's the easiest way to provide software updates and the like. "But it is ready to be embedded on an MCU," he continued. However, he noted that, because potential customers will have their favourite platform, Ultrahaptics hasn't selected a particular device.
Carter noted that technology development is also taking advantage of the jump in available processing power. "When we first started the research, the button algorithm took 20 minutes to render one frame, but the evaluation software now runs at 200 frames/s. A single thread on an embedded graphics chip can run at 120,000 frames/s, so the algorithms should run on most MCUs."
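To put those figures in perspective: 20 minutes per frame is 1,200 seconds, while 200 frames/s is 5ms per frame, a speed-up of roughly 240,000 times.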
The team is also working on shrinking the technology. "There will still be the need for reasonable power," Carter said, "but we have new materials in mind for these applications and are looking at developing technology suited for use in mobile phones."
Cliffe is charged with growing the business and is confident this can be achieved. "We have to grow the team here," he said, "and we need to develop a platform that allows companies to create their own systems, maybe embedded as firmware in a customer's MCU.
"We'll be developing the company using an IP licensing model and this is getting a good reaction from the people we've talked to so far. There are some big markets out there and getting even a small percentage of the business will be good," he concluded.