OTSL adds three new simulators for autonomous driving
OTSL Inc. has expanded COSMOsim, its 3D real-time sensor simulator product lineup for autonomous driving, with three new types of simulators: infrared/uBolometer (microbolometer), camera, and ultrasonic sensors. Last year, OTSL entered the real-time simulator business for autonomous driving with the release of the Advanced Millimeter Wave Radar (AMMWR) simulator and the Advanced Laser Radar (ALR)/LiDAR simulator. The newly added Advanced Infrared/uBolometer simulator (AIRB simulator), Advanced Camera Image Sensor simulator (ACIS simulator), and Advanced Ultrasonic simulator (AUS simulator) bring the lineup to five simulators, covering all sensor systems used in autonomous driving.
"Expectations toward autonomous driving are growing worldwide, creating higher demand for high-performance simulators that can verify safety and accuracy in environments that closely match real driving conditions," said Shoji Hatano, CEO, OTSL. "OTSL's simulator product lineup for autonomous driving provides an integrated platform that allows you to operate five types of simulators at the same time in real time on a single screen. OTSL aims to eliminate almost all blind spots for autonomous driving, which are hard to identify in a real driving test, by combining simulations using multiple sensors with different features."
The AIRB simulator superimposes far-infrared (temperature) data for all road objects on a 3D map and visualizes them in real time while moving them freely in any direction and at any speed. Because infrared/uBolometer sensors can recognize objects at night and are largely unaffected by bad weather such as rain and snow, they are particularly valuable where lighting infrastructure, for example on expressways, is underdeveloped. The AIRB simulator derives the material properties of objects on the 3D map from their molecular structures and performs sophisticated real-time calculations, covering absorption spectra, thermal radiation based on the blackbody radiation model, and temperature rises caused by the radiation energy of sunlight, to bring the simulated far-infrared (temperature) data as close to real data as possible, enabling dynamic real-time simulation for autonomous driving.
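To illustrate the blackbody radiation model that such thermal calculations build on, here is a minimal Python sketch of Planck's law and the Stefan-Boltzmann law. The constants and formulas are standard physics; the function names are illustrative and not part of OTSL's product.

```python
import math

# Physical constants (SI units)
H = 6.62607015e-34      # Planck constant, J*s
C = 2.99792458e8        # speed of light, m/s
K_B = 1.380649e-23      # Boltzmann constant, J/K
SIGMA = 5.670374419e-8  # Stefan-Boltzmann constant, W/(m^2*K^4)

def spectral_radiance(wavelength_m: float, temperature_k: float) -> float:
    """Planck's law: blackbody spectral radiance in W / (m^2 * sr * m)."""
    a = 2.0 * H * C ** 2 / wavelength_m ** 5
    b = H * C / (wavelength_m * K_B * temperature_k)
    return a / (math.exp(b) - 1.0)

def radiated_power(temperature_k: float, emissivity: float = 1.0) -> float:
    """Stefan-Boltzmann law: total emitted power per unit area, W/m^2."""
    return emissivity * SIGMA * temperature_k ** 4

# In the far-IR band a microbolometer observes (roughly 8-14 um), a warm
# body such as a pedestrian (~310 K) radiates more strongly than cooler
# surroundings such as asphalt on a cold night (~280 K).
print(spectral_radiance(10e-6, 310.0) > spectral_radiance(10e-6, 280.0))  # True
```

This temperature contrast is why far-infrared sensing works without any external illumination, which matches the article's point about night-time recognition.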
The ACIS simulator enables simulation for autonomous driving not only of the camera itself but also of its lens configuration. Since camera-based autonomous driving technology can be implemented at low cost, the company says it is an effective solution in Japan and other regions where road conditions are relatively good thanks to extensive night lighting infrastructure. However, common camera simulators for autonomous driving do not allow detailed configuration of optical characteristics that differ from lens to lens, such as ghosting, where light reflected at the lens surface appears in the image, or distortion, where the image is deformed by light refraction in the lens. As a result, simulated images differ from real-world images. The ACIS simulator allows per-lens configuration of optical characteristics such as aberration and depth of field, as well as lens settings such as enabling or disabling the anti-reflection coating, enabling a more accurate real-time simulation to be implemented.
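The lens distortion described above is commonly modeled with radial distortion coefficients, as in the Brown-Conrady model. The sketch below is a generic illustration of that technique, not OTSL's implementation; the coefficients k1 and k2 stand in for the per-lens parameters a camera simulator would expose.

```python
def distort_point(x: float, y: float, k1: float, k2: float) -> tuple[float, float]:
    """Apply Brown-Conrady radial distortion to a normalized image point.

    (x, y) are normalized coordinates (unit focal length); k1 and k2 are
    radial distortion coefficients. Negative k1 produces barrel
    distortion (edge points pulled inward), positive k1 pincushion.
    """
    r2 = x * x + y * y
    scale = 1.0 + k1 * r2 + k2 * r2 * r2
    return x * scale, y * scale

# The image center is unaffected; a point near the edge moves inward
# under barrel distortion (k1 < 0).
print(distort_point(0.0, 0.0, -0.1, 0.0))  # (0.0, 0.0)
print(distort_point(1.0, 0.0, -0.1, 0.0))  # (0.9, 0.0)
```

Rendering through such a per-lens model, rather than an ideal pinhole camera, is what narrows the gap between simulated and real images.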
The AUS simulator is designed for ultrasonic sensors, which are mainly used for autonomous parking and parking-assist technology. Users can freely set the number of sensors; their mounting height and angle; the detection capability, which depends on the material of the mounting location and the mounting method; and the ultrasonic irradiation angle and intensity, enabling real-time simulation of a wide range of situations covering not only parking but also autonomous driving.
COSMOsim enables the operation of five types of simulators—millimeter-wave radar, LiDAR, camera, infrared/uBolometer, and ultrasonic sensors—simultaneously in real time on a single screen. Automotive manufacturers can simulate a driving situation through sensor-based modeling, check the recognition and control of autonomous driving, and efficiently verify sensor mounting positions on vehicles, eliminating the need for test drives with real vehicles. System component suppliers (vehicle sensor manufacturers) can review design parameters and check reach distance and sensing area more efficiently by visualizing the behavior of vehicle sensors. Semiconductor manufacturers developing sensor devices can model and simulate a device under development and verify it at high speed.