The collaboration will focus on improving the fidelity of simulation solutions to significantly reduce the need to collect physical data during the sensor development cycle. It will enable perception systems for ADAS and autonomous vehicles to be developed, trained and tested in a virtual environment, rapidly accelerating development.
“Sony Semiconductor Solutions has been a crucial collaborator in the development of our recently launched ray tracing technology and rFpro’s Multi-Exposure Camera technology, which accurately replicates what cameras ‘see’ for the first time,” said Matt Daley, rFpro Operations Director. “By working closely with Sony and integrating its sensor models into our technology we have been able to achieve a high level of correlation in the simulation. As a result, together, we are helping the industry reduce its reliance on collecting real-world data, which is expensive and time-intensive.”
“The collaboration will provide an automotive-grade End2End simulation pipeline to the ADAS perception system developers,” said Kenji Onishi, Deputy Senior General Manager, Automotive Business Division, Sony Semiconductor Solutions. “Sony has prepared a sensor model based on the internal architecture of the image sensors used in camera systems to achieve automotive-grade fidelity.”
Signal processing, LED flicker mitigation, spectral effects and image sensor controls such as motion blur and rolling shutter are some examples of the phenomena replicated by Sony’s sensor model. Sony’s interface between the sensor model and the rendering system is highly efficient, allowing high-fidelity simulations to run quickly. The interface is also common across all of Sony’s automotive image sensors, enabling users to quickly transition to new generations of sensor models as they become available.
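A common interface across sensor generations can be pictured as an abstract contract that every sensor model fulfils. The sketch below is purely illustrative, assuming rendered frames are exchanged as lists of pixel radiance values; the class and method names are hypothetical and do not reflect Sony’s actual API.

```python
from abc import ABC, abstractmethod

# Hypothetical common interface between the renderer and any sensor model.
# Names and signatures are illustrative, not Sony's actual API.
class SensorModel(ABC):
    @abstractmethod
    def process_frame(self, radiance: list[float]) -> list[int]:
        """Convert rendered scene radiance into digital sensor output."""

class ExampleSensor(SensorModel):
    def __init__(self, gain: float = 100.0, bit_depth: int = 12):
        self.gain = gain
        self.max_code = (1 << bit_depth) - 1  # e.g. 4095 for 12-bit output

    def process_frame(self, radiance: list[float]) -> list[int]:
        # Apply gain and clip to the sensor's digital range.
        return [min(int(r * self.gain), self.max_code) for r in radiance]

# Because every model implements the same interface, the rendering
# pipeline can swap in a new sensor generation without code changes.
sensor: SensorModel = ExampleSensor()
print(sensor.process_frame([0.1, 0.5, 100.0]))  # bright pixel clips to 4095
```

Swapping to a new sensor generation then only means instantiating a different `SensorModel` subclass; the rendering side is untouched.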
Simulation enables vehicle systems to be subjected to a limitless array of scenarios. The weather, time of day, and the amount of traffic and pedestrians can all be controlled and varied independently and automatically. In rFpro, everything in the scene is physically modelled with accurate material characteristics, and the road surface is recreated to within a height accuracy of 1 mm.
“Vehicles can drive thousands of high-value, high-activity virtual miles every day in simulation,” said Daley. “Edge cases can be identified and new iterations generated quickly to thoroughly exercise sensor systems. It removes the need to wait for exposure in the real world, where the majority of miles driven are relatively uneventful.”
Sony is the first partner to collaborate with rFpro on its recently launched ray tracing technology, the company’s software-in-the-loop (SIL) solution aimed at generating synthetic training data. It uses multiple light rays to accurately capture all the nuances of the real world. As a multi-path technique, it can reliably simulate the huge number of reflections that occur around a camera, which is critical for accurately portraying reflections and shadows in low-light scenarios and environments with multiple light sources.
rFpro’s ray tracing is applied to every element in a simulated scene, each physically modelled with accurate material properties, to create the highest-fidelity images. Because this is computationally demanding, rendering can be decoupled from real time, with the frame rate adjusted to suit the level of detail required.
Modern HDR (High Dynamic Range) cameras used in the automotive industry capture multiple exposures of varying lengths per frame, for example a short, a medium and a long exposure. To simulate this accurately, rFpro has introduced its Multi-Exposure Camera API, which integrates the sensor manufacturer’s sensor model with rFpro’s own API. This enables the sensor model to sample the virtual world with the same exposure periods as its physical counterpart.
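The multi-exposure idea can be sketched as sampling the virtual scene over several back-to-back exposure windows and merging the results. This is a minimal toy model, assuming the simulator can report scene radiance at any instant; the exposure durations and function names are made up for illustration.

```python
# Toy model of multi-exposure HDR capture. Assumes the renderer exposes
# an instantaneous radiance query; all names and values are illustrative.

def scene_radiance(t: float) -> float:
    """Stand-in for the renderer: constant radiance here for simplicity."""
    return 2.0

def integrate_exposure(start: float, duration: float, steps: int = 100) -> float:
    """Integrate radiance over one exposure window (simple Riemann sum)."""
    dt = duration / steps
    return sum(scene_radiance(start + i * dt) * dt for i in range(steps))

# Three exposures per frame (long / medium / short), sampled with the
# same periods a physical sensor would use (durations are made up).
exposures = {"long": 8e-3, "medium": 1e-3, "short": 0.1e-3}
t = 0.0
captures = {}
for name, duration in exposures.items():
    captures[name] = integrate_exposure(t, duration)
    t += duration  # exposures are taken back-to-back within one frame

# A simple HDR merge: normalise each capture by its exposure time and average.
hdr = sum(v / exposures[k] for k, v in captures.items()) / len(captures)
print(hdr)  # recovers the constant scene radiance of 2.0
```

In a real pipeline the radiance query would come from the renderer and the merge would use the sensor model’s actual response curves, but the principle of matching the physical sensor’s exposure periods is the same.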
According to Daley, “With the rFpro rendering and Sony’s sensor model pipeline, motion blur, rolling shutter characteristics, colour filter characteristics and signal processing of the physical sensors exist in the simulated images. It also allows the LED flicker mitigation feature in the image sensor to be replicated on the sensor model and to be fully exercised in simulation. It enables the sensor model to interact with fast-flickering LED light sources, such as traffic lights and vehicle brake lights, simulated on rFpro’s high-fidelity assets and rendering system. Accurately replicating these phenomena is critical to achieving high levels of correlation with the real world.”
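The flicker interaction Daley describes can be illustrated with a toy model: a PWM-driven LED may fall entirely in its off phase during a short exposure yet be captured by a longer one, which is exactly the artefact LED flicker mitigation addresses. All parameters below (90 Hz, 10% duty cycle) are invented for illustration and are not drawn from Sony’s or rFpro’s specifications.

```python
# Toy model of LED flicker vs. exposure time. PWM frequency and duty
# cycle are made-up illustrative values, not real sensor parameters.

def led_on(t: float, freq: float = 90.0, duty: float = 0.1) -> bool:
    """True while the PWM-driven LED is emitting at time t."""
    phase = (t * freq) % 1.0
    return phase < duty

def exposure_sees_led(start: float, duration: float, steps: int = 1000) -> bool:
    """Does any sample within the exposure window catch the LED on?"""
    dt = duration / steps
    return any(led_on(start + i * dt) for i in range(steps))

period = 1.0 / 90.0       # one full PWM cycle (~11.1 ms)
off_start = 0.5 * period  # exposure begins during the LED's off phase

short = exposure_sees_led(off_start, 0.1e-3)  # 0.1 ms exposure misses the LED
long = exposure_sees_led(off_start, period)   # full-period exposure sees it
print(short, long)  # → False True
```

A camera without flicker mitigation would render the brake light as dark in the short-exposure frame; simulating the LED’s actual duty cycle, as the rFpro assets do, lets the sensor model’s mitigation logic be exercised against this effect.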