These include:
• the proliferation of Advanced Driver Assistance Systems (ADAS) technologies;
• electrification;
• mobility-as-a-service.
The importance of these trends is reflected in a comparison of the market capitalisations of Tesla, which makes fewer than 400,000 vehicles a year, and Ford. Tesla bases its strategy on a series of innovations in battery-powered traction, autonomous driving and robotaxi capabilities to support Tesla-branded ride-sharing services.
Ford, by contrast, makes much of its money from traditional American pick-up trucks featuring high-powered internal combustion engines. Despite a 2017 production volume of more than six million units, Ford had a market capitalisation of just $37bn in late 2019, while Tesla was worth $44bn.
While adoption of mobility-as-a-service has been driven by business model and software innovation pioneered by the likes of Uber, and increasing electrification depends on production innovations such as Tesla’s battery ‘gigafactory’, the focus of innovation in driver assistance is on hardware and software technology – a combination of sophisticated sensor systems and artificial intelligence.
All assisted driving systems rely in part on multiple forms of ‘perception technology’: in fully autonomous vehicles, optical technologies such as LiDAR (light detection and ranging) and visual cameras will work alongside electromagnetic motion sensors and RF/microwave systems – RADAR and satellite positioning.
It might seem surprising that RADAR should be playing a part in the most exciting developments in automotive technology. In fact, many 24GHz RADAR sensors are mounted in the bumpers of vehicles on the road today – Analog Devices alone has to date supplied some 300 million units to automotive manufacturers for use in applications such as blind-spot detection, automated lane-changing, and Autonomous Emergency Braking (AEB).
But demand for ever higher levels of driver assistance, supported by the evolution of functions such as AEB and Adaptive Cruise Control (ACC) in new ADAS implementations, is driving suppliers to develop new RADAR systems which offer higher precision, longer range, faster detection and a more complete picture of the world around the vehicle.
RADAR as the ‘eyes’
Car manufacturers are piling investment into the refinement of driver assistance technology for two reasons: for safety, and for comfort. Driver assistance systems such as AEB and ACC save lives and prevent accidents. Cars which feature these systems are rewarded with a higher official NCAP safety score, a mark which lifts the value and consumer appeal of new cars.
Both AEB and autonomous emergency steering systems continue to evolve in scope and complexity to serve the growing market for vehicles in the Level 2 or Level 3 (L2/L3) categories of driver assistance technology. New NCAP specifications, for instance, call for better detection of pedestrians – ‘vulnerable road users’, in NCAP’s terminology. Future AEB systems will need to operate reliably in more complex scenarios than those for which they are typically specified today, controlling the braking function at higher vehicle speeds in both urban and highway settings.
The market is also responding to signals from car buyers who want technology to reduce the effort involved in driving, particularly on the motorway. Premium cars such as the Mercedes S-Class already offer limited highway auto-pilot capabilities, such as adaptive control of distance to the car in front, and active steering assistance to keep the car in its lane. Automotive suppliers are continually implementing enhancements to these features so that they can be used in a wider range of more complex situations. This intensifies the need for RADAR sensors that offer superior performance.
The move towards higher L4 and L5 autonomy, which isolates the driver entirely from direct control of the vehicle, will require the development of sensing systems which have a 360° view around the car in real time. The control systems for these ‘robotaxis’ will be incredibly complex, and will need redundancy to mitigate the risk of false or missed detections, combining the inputs from separate sensor types such as RADAR, cameras and LiDAR.
Visual cameras can be used to assist the recognition of objects such as human beings, animals and road signs. LiDAR technology creates rich point clouds, instantaneously measuring the vehicle’s distance from objects in the outside world, and their size, to produce a high-resolution 3D map of the scene.
But a RADAR sensor’s unique capabilities, which are continually being extended, make it a crucial complement to these other sensor types in L4 and L5 systems. In L2 and L3 use cases, RADAR is actually the dominant sensor type because it offers the best combination of size, cost and performance attributes.
Crucially, RADAR performs ‘4D’ sensing: with a single shot, it can measure the range, velocity, angle and elevation of an object from which its millimetre-wave pulse is reflected. A RADAR sensor also operates in conditions, such as rain, fog and snow, which impair or disable the operation of LiDAR sensors and visual cameras.
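The textbook FMCW (frequency-modulated continuous-wave) relationships behind the first two of these ‘4D’ measurements can be sketched in a few lines. This is an illustrative calculation only, not Analog Devices’ actual signal chain, and the chirp parameters are hypothetical examples; angle and elevation, the other two dimensions, come from phase differences across an antenna array and are omitted here.

```python
# Illustrative FMCW radar arithmetic; parameter values are hypothetical.

C = 3.0e8  # speed of light in m/s (rounded)

def range_from_beat(f_beat_hz: float, chirp_slope_hz_per_s: float) -> float:
    """Target range from the beat frequency of a linear FMCW chirp: R = c*f_b / (2*S)."""
    return C * f_beat_hz / (2.0 * chirp_slope_hz_per_s)

def velocity_from_doppler(f_doppler_hz: float, carrier_hz: float) -> float:
    """Radial velocity from the Doppler shift of the echo: v = c*f_d / (2*f_c)."""
    return C * f_doppler_hz / (2.0 * carrier_hz)

# Example: a 1GHz chirp swept in 50us gives a slope S = 2e13 Hz/s.
slope = 1.0e9 / 50e-6
# A target 100m away then produces a beat frequency of roughly 13.3MHz:
r = range_from_beat(13.33e6, slope)        # ~100m
# At a 77GHz carrier, a 50m/s (180km/h) closing speed shifts the echo by ~25.7kHz:
v = velocity_from_doppler(25.67e3, 77e9)   # ~50m/s
```

Because both quantities fall out of the same reflected chirp, range and velocity really are captured in a single shot, which is the property the paragraph above describes.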
Greater integration
Automotive RADAR systems under development will in time make today’s RADAR technology appear blunt and limited in comparison. Today, a RADAR sensor mounted in the front bumper does an excellent job of measuring the distance to a single vehicle in front, and its speed.
A full highway auto-pilot system, however, will need to be able to operate safely on the autobahn in Germany, where a motorbike, for instance – smaller, and so harder to detect than a passenger car – can approach on the outside lane at speeds higher than 180km/h. To provide early and accurate detection of such a hazard, an auto-pilot’s RADAR system therefore needs to sense with greater precision, faster, and at longer range.
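A back-of-the-envelope calculation makes the autobahn example concrete. The figures below are illustrative assumptions, not published specifications: the point is simply that, at a fixed closing speed, detection range translates directly into the time the system has to respond.

```python
def reaction_time_s(detection_range_m: float, closing_speed_kmh: float) -> float:
    """Seconds between first detection and the gap closing to zero,
    assuming a constant closing speed (kmh / 3.6 converts to m/s)."""
    return detection_range_m / (closing_speed_kmh / 3.6)

# Hypothetical scenario: ego car at 120km/h, motorbike approaching at 180km/h,
# giving a 60km/h closing speed. A 100m detection range leaves 6 seconds to
# react before the gap closes; halving the range halves the time budget.
t_long = reaction_time_s(100.0, 60.0)   # 6.0s
t_short = reaction_time_s(50.0, 60.0)   # 3.0s
```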
Developing these capabilities while staying within the automotive industry’s tight constraints on size and cost calls for innovation in semiconductor technology, RF system operation and signal processing – fields in which Analog Devices excels.
At Analog Devices, a new generation of RADAR components, including 76-81GHz Monolithic Microwave IC (MMIC) transmitters and receivers, is based on the new Drive360 28nm CMOS technology platform. Marking a departure from the industry’s conventional use of SiGe semiconductor technology for RADAR, the Drive360 platform provides valuable advantages including:
- High output power and low return noise for detection of objects at a longer range
- Low phase noise and high intermediate frequency (IF) bandwidth, giving ultra-high precision for the detection of small objects such as motorbikes and infant pedestrians, which would previously have been hard for a RADAR sensor to see
- High-performance phase modulation, enabling the RADAR sensor to discriminate more effectively between multiple objects in a scene
- Ultra-fast pulse transmission, giving a faster response to fast-moving objects such as the motorbike advancing at 180km/h
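The link between bandwidth and precision in the list above follows from the standard FMCW range-resolution relation, ΔR = c/2B. A rough sketch, using illustrative bandwidth figures rather than Drive360 specifications, shows why the wide 76-81GHz band helps with small, closely spaced objects:

```python
C = 3.0e8  # speed of light in m/s (rounded)

def range_resolution_m(chirp_bandwidth_hz: float) -> float:
    """Smallest separation at which two targets can be resolved
    in range by an FMCW radar: dR = c / (2*B)."""
    return C / (2.0 * chirp_bandwidth_hz)

# The 76-81GHz band permits wide chirps: an illustrative 1GHz sweep resolves
# targets 15cm apart, versus ~75cm for the ~200MHz typically available to
# narrowband 24GHz sensors.
hi_res = range_resolution_m(1.0e9)   # 0.15m
lo_res = range_resolution_m(2.0e8)   # 0.75m
```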
The use of CMOS technology also supports a high level of integration of digital functions in RADAR devices, helping to reduce the cost and size of advanced RADAR systems. Core Analog Devices intellectual property in functions such as oversampled analogue-to-digital converters, and ultra-low noise digital PLL clocks, also helps to increase the speed of operation, resolution and stability of next-generation 77GHz RADAR sensors.
A combination of advanced semiconductor technology, analogue expertise and system software capability will enable RADAR technology to extend the capabilities of ADAS deployed in the next generation of vehicles. And Analog Devices will remain at the heart of the development of RADAR now and into the next decade.
Author details:
Donal McCarthy, Director of Automotive Radar Product Line, Analog Devices