In most modern control room systems, for example, displays have to be designed so that operators can take in information at a glance, and visual clarity becomes vital when multiple displays are competing for an operator’s attention.
Since the launch of the iPhone in 2007, HMI design has been driven by the explosion in personal computing and by the portable devices we carry around with us.
Industrial customers are also consumers and, as such, expect that the technology and functionality they take for granted in their personal devices – such as graphics quality, tactile feedback and speed – will be made available to them.
As a result of the pervasive nature of handheld mobile technology, most people can, with no training, access a broad array of functions. Even as device capabilities have increased exponentially, designers of hardware and software interfaces have paid close attention to the way in which people interact with them.
HMI solutions tend to be customisable in order to meet various size, aspect ratio, performance and integration requirements. They also need to be easy to integrate, requiring few additional discrete components, and to be accurate and responsive. Advanced algorithms are delivering on this, while providing sensing performance across the entire touchscreen surface.
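As a rough illustration of what that sensing involves, the sketch below shows a simplified weighted-centroid calculation of the kind a projected-capacitive touch controller might use to resolve a finger position anywhere on the panel from a grid of sense-node readings; the node counts, noise threshold and fixed-point scaling are arbitrary example values rather than any vendor’s implementation.

```c
/* Illustrative sketch only: a simplified weighted-centroid calculation, as a
 * projected-capacitive touch controller might use to resolve a finger
 * position from a grid of sense-node readings. Node counts, the noise
 * threshold and the fixed-point scaling are arbitrary example values. */
#include <stdint.h>

#define TX_NODES 16            /* drive (row) electrodes                    */
#define RX_NODES 28            /* sense (column) electrodes                 */
#define NOISE_FLOOR 40         /* readings below this are treated as noise  */
#define SUBPIXEL_SCALE 64      /* report position in 1/64ths of a node pitch */

/* Estimate the touch coordinate from one frame of mutual-capacitance deltas.
 * Returns 0 if no touch is detected, 1 otherwise. */
int estimate_touch(const uint16_t delta[TX_NODES][RX_NODES],
                   uint32_t *x_out, uint32_t *y_out)
{
    uint64_t sum = 0, wx = 0, wy = 0;

    for (int tx = 0; tx < TX_NODES; tx++) {
        for (int rx = 0; rx < RX_NODES; rx++) {
            uint16_t d = delta[tx][rx];
            if (d < NOISE_FLOOR)
                continue;                    /* ignore noise-level nodes     */
            sum += d;
            wx  += (uint64_t)d * rx;         /* signal-weighted column index */
            wy  += (uint64_t)d * tx;         /* signal-weighted row index    */
        }
    }

    if (sum == 0)
        return 0;                            /* nothing above the noise floor */

    /* The weighted centroid gives sub-node resolution anywhere on the panel. */
    *x_out = (uint32_t)((wx * SUBPIXEL_SCALE) / sum);
    *y_out = (uint32_t)((wy * SUBPIXEL_SCALE) / sum);
    return 1;
}
```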
“The pervasiveness of touchscreen technology in modern mobile consumer devices has been crucial in its adoption across a variety of market sectors,” suggests Andrew Hsu, Synaptics’ director of concept prototyping.
Hsu, who leads the company’s efforts in developing high fidelity prototypes for demonstrating and investigating the impact of user interface technologies, believes people are ‘comfortable using touchscreen technology’.
“Apple’s introduction of the iPhone provided that little bit of magic; it was truly revolutionary given its size and abilities, display quality and responsiveness as well as its ease of use. It’s just very intuitive to use.”
According to Hsu: “Developers of HMI solutions are working at a time when people expect to be able to monitor, control and manipulate data easily from a control system. The biggest challenge in the automotive and industrial spaces is trying to manage and align designs to mirror those found in the consumer market and in much reduced timescales.”
Recently, a complete HMI for a smart in-vehicle infotainment system was created in just nine weeks using the Qt development framework. Built for Intel, the system was designed to demonstrate how highly graphical and interactive interfaces could be integrated into next generation vehicles.
Hsu notes that a growing trend is greater standardisation of the HMI interface. “Customisation today arises from developments in the software and middleware,” he explains. “We need to speed up development cycles, while addressing calls for greater reliability and robustness of the technology, but not compromising on safety standards.”
With multimedia, communication, air conditioning, telematics and navigation features being built into the average car, Hsu believes improvements in technology have allowed manufacturers to place far greater importance on intuitive HMIs. These need to address the expanding scope of functionality that drivers have to deal with, while ensuring they can still concentrate on the road ahead.
“The focus tends to be on HMIs that are capable of supporting touch-less operation,” he suggests.
Industry analyst TechNavio has predicted that the automotive HMI market will rise at a compound annual growth rate of 7.7% over the next four years, and this level of growth is indicative of the HMI market in general.
So what is required for an automotive HMI to be effective? “It needs to allow for tasks that can be undertaken in a straightforward and prompt manner – it should not distract the user or impact their ability to drive safely,” Hsu notes. “The HMI for a smartphone has been designed to be immersive – you focus entirely on the screen. That isn’t possible in automotive.”
Newer HMI technologies offering a handset-type experience are being developed using touch sensing hardware, while more sophisticated chips are being deployed to drive displays, removing the need for a separate touchscreen assembly.
For example, the third-generation R-Car platform from Renesas is an automotive computing solution capable of the complex processing required by applications such as driver safety support and in-vehicle infotainment, and it can also handle HMI computing – processing large amounts of data and driving enhanced graphics displays.
Connectivity between systems and services such as smartphones and the cloud is growing, leading to a significant increase in the volume of data being transmitted from outside the system. As a result, there is growing demand for HMI computing that can process this data accurately and in real time.
“In the automotive environment, the driver cannot spend time navigating through a touchscreen.
“From a design perspective, we have to reduce the cognitive loading so force and haptic technology start to play more of a role. With handsets, you know what you are touching; you do not have the luxury to stare at a handset screen when driving or operating machinery,” says Hsu. “Force sensing enables more precise control and a more intuitive experience.”
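To make that concrete, the sketch below shows one simple way force readings could be mapped to tiered haptic feedback so a press can be confirmed by feel rather than by sight; the thresholds, the raw-count range and the haptic_play() driver call are hypothetical placeholders, not any particular vendor’s API.

```c
/* Illustrative sketch only: mapping a force reading to tiered haptic feedback
 * so a press can be confirmed by feel rather than by looking at the screen.
 * The thresholds, the raw-count range and the haptic driver call are
 * hypothetical placeholders. */
#include <stdint.h>
#include <stdio.h>

enum haptic_effect { HAPTIC_NONE, HAPTIC_TICK, HAPTIC_CLICK };

/* Stand-in for a platform haptic driver call. */
static void haptic_play(enum haptic_effect effect)
{
    printf("haptic effect %d\n", (int)effect);
}

#define LIGHT_PRESS_THRESHOLD  300   /* raw force counts: finger resting     */
#define FIRM_PRESS_THRESHOLD   900   /* raw force counts: deliberate press   */

/* Call once per force sample; feedback fires only when the force rises into
 * a higher band, so the user feels one distinct "tick" and one distinct
 * "click" per press. */
void force_update(uint16_t force_counts)
{
    static enum haptic_effect last = HAPTIC_NONE;
    enum haptic_effect level = HAPTIC_NONE;

    if (force_counts >= FIRM_PRESS_THRESHOLD)
        level = HAPTIC_CLICK;            /* confirm: action triggered        */
    else if (force_counts >= LIGHT_PRESS_THRESHOLD)
        level = HAPTIC_TICK;             /* preview: control located by feel */

    if (level > last)
        haptic_play(level);

    last = level;
}
```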
Optoelectronics offers the possibility of implementing multi-mode HMI systems, in which the detection of light can be used to supplement the touch control element, although there are technical challenges associated with this approach that need to be overcome.
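One hypothetical way such a multi-mode arrangement could work is sketched below, with an optical proximity reading used to wake the display and enlarge touch targets as a hand approaches; the sensor interface, ranges and thresholds are invented for the example.

```c
/* Illustrative sketch only: a light-based proximity measurement supplementing
 * touch in a multi-mode HMI, waking the display and enlarging touch targets
 * as a hand approaches. Ranges and thresholds are invented for the example. */
#include <stdbool.h>
#include <stdint.h>

#define APPROACH_MM   120   /* hand considered "approaching" inside this range */
#define HYSTERESIS_MM  20   /* avoid flicker around the threshold              */

struct hmi_state {
    bool approach_mode;     /* true: screen awake, targets enlarged */
};

/* proximity_mm: distance reported by an IR or time-of-flight sensor.
 * touch_active: true while the touch controller reports contact.      */
void hmi_update(struct hmi_state *s, uint16_t proximity_mm, bool touch_active)
{
    if (touch_active) {
        s->approach_mode = true;       /* contact always wins            */
        return;
    }
    if (proximity_mm < APPROACH_MM)
        s->approach_mode = true;       /* hand nearby: prepare the UI    */
    else if (proximity_mm > APPROACH_MM + HYSTERESIS_MM)
        s->approach_mode = false;      /* hand withdrawn: dim again      */
    /* Within the hysteresis band, keep the previous state. */
}
```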
“Many companies are looking at employing a smartphone as the HMI,” according to Hsu, “but there is an ongoing debate as to how these devices can effectively turn your car into a smartphone accessory.”
A growing number of apps are being developed for use in vehicles, many of which allow the user to control a car’s features from virtually anywhere: securing the vehicle; checking fuel levels remotely; talking to an advisor; locating your car on a map; and even starting it remotely.
“While it is realistic to see the smartphone aggregating the vehicle’s HMI, the challenge will be connectivity – even if the electronics within the smartphone are capable of turning it into a ‘kind of brain’. The communications interface will remain a design challenge and the designer needs to understand the design constraints – what is the driver capable of and comfortable with doing? Cognitive loading must be factored into any design and the trend of ratcheting up exponential knowledge will certainly be a key challenge going forward,” Hsu continues.
Hsu questions those looking to augmented reality (AR) to provide drivers with additional information – even if, by adding information that might otherwise be hidden from the driver, their intention is to make driving safer.
AR adds extra data to a live view of the world and is being taken up by a number of industries. In the automotive industry, the focus is on head-up display technology that projects directions or speed onto the windscreen in the driver’s line of sight.
“We need to be mindful of the driver environment,” Hsu concludes. “Driving is the primary task and augmented reality, like many technologies, may need to be dialled back.”