The World Health Organisation (WHO) reports that 1.35 million people die each year as a result of road traffic accidents, more than half of them vulnerable road users such as pedestrians, cyclists and motorcyclists, writes Chris Posch, Engineering Director Automotive, FLIR Systems.
Safety will be both the biggest enabler and the biggest barrier to the adoption of autonomous driving, a future that relies on making today’s Advanced Driver Assistance Systems (ADAS), Autonomous Vehicle (AV) systems and Automatic Emergency Braking (AEB) safer for drivers, passengers and pedestrians alike.
With every passing year, driver-assist systems are becoming standard in consumer car models, and while they are already safety approved, they remain a work in progress. Implementing systems that are safe in all types of conditions continues to pose the biggest technology challenge to the industry, specifically the ability to spot pedestrians, roadside objects and animals in inclement weather and darkness.
Enhancing AV Safety
Once automakers can demonstrate functional safety through repeated testing of ADAS and AV systems, human drivers can begin to transition their trust from human to machine. Technology development and systems testing are at a critical point: the industry still needs to refine conditional automation in order to improve safety in fully autonomous vehicles.
A recent study from the American Automobile Association (AAA) found that cars from Toyota, General Motors, Tesla and Honda with current AEB systems struck test dummies 60% of the time in daylight and failed 100% of the time in nighttime testing, underlining the need for enhanced nighttime vision and better AEB systems overall.
How can automakers improve these systems? By expanding the ability of the car to “see” via additional sensor integration that can detect and interpret data around the vehicle to improve situational awareness. One such technology is the thermal camera, which detects and measures the infrared, or heat, energy that is emitted and reflected by everything on Earth.
Through testing of thermal cameras in both today’s ADAS and future AV systems, we know that these vehicles can “see” four times further than traditional headlights illuminate, making thermal cameras particularly effective during nighttime driving. The adoption of thermal sensors, with their unique ability to see, detect and classify humans and other living things in our cluttered driving environments via machine vision, provides an additional layer of data crucial to improving safety.
The ability to see clearly through darkness, smoke, most fog and glare from the sun or oncoming headlights provides a safety advantage in practically all driving environments. Furthermore, this technology adds a redundant sensor that will be required for high-level ADAS and full AV to become functionally safe, providing confirmatory data for the visible, radar and LIDAR sensors most commonly found on today’s vehicles.
However, thermal imaging isn’t just for the future; it can be implemented in cars today. In response to the AAA study, FLIR recently announced proof-of-concept tests, conducted in conjunction with VSI Labs, that examine the capability of thermal cameras to enhance AEB systems. In this study, the use of thermal camera and radar data stopped the vehicle under test 100% of the time across six test scenarios in both daylight and darkness.
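To make the idea concrete, the sketch below shows how a simplified AEB decision might fuse a thermal-camera pedestrian classification with radar range and closing speed to compute a time-to-collision. It is a minimal illustration only, not the logic used in the FLIR and VSI Labs tests; the class names and thresholds are assumptions for the example.

```python
from dataclasses import dataclass

# Illustrative thresholds only; real AEB calibration is vehicle-specific.
TTC_BRAKE_THRESHOLD_S = 1.5   # brake if predicted impact is under 1.5 s
MIN_CONFIDENCE = 0.6          # minimum detection confidence to act on

@dataclass
class ThermalDetection:
    label: str          # e.g. "pedestrian"
    confidence: float   # classifier confidence, 0..1

@dataclass
class RadarTrack:
    range_m: float            # distance to object, metres
    closing_speed_mps: float  # positive when closing on the object

def should_brake(detection: ThermalDetection, track: RadarTrack) -> bool:
    """Trigger emergency braking when a confidently classified pedestrian
    is on a collision course within the time-to-collision threshold."""
    if detection.label != "pedestrian" or detection.confidence < MIN_CONFIDENCE:
        return False
    if track.closing_speed_mps <= 0:   # not closing, no imminent collision
        return False
    ttc = track.range_m / track.closing_speed_mps
    return ttc < TTC_BRAKE_THRESHOLD_S

# Example: pedestrian confirmed by the thermal camera at night, while radar
# reports a 20 m range closing at 15 m/s (roughly 54 km/h).
print(should_brake(ThermalDetection("pedestrian", 0.92),
                   RadarTrack(range_m=20.0, closing_speed_mps=15.0)))  # True
```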
With AEB regulations set to tighten, the number of companies utilising thermal technology for ADAS and AVs is due to rise; the European Union has backed the United Nations Economic Commission for Europe (UNECE) regulation for advanced emergency braking systems (AEBS). Standardisation of AEB is a good example of what is needed for autonomous driving to be successful. With standard definitions, designs can be homogeneous, reducing manufacturing cost and allowing testing and rating agencies such as Euro NCAP to develop tests that provide manufacturers and consumers with safety information.
Utilising Thermal for Faster Rollout and Adoption
Major global auto brands including Volkswagen, Audi, BMW and General Motors (GM) already offer thermal imaging sensing to car buyers, but today these sensors serve as early warning systems and do not inform automated vehicle decision making. Veoneer, a Swedish provider of automotive technology, has already equipped more than 700,000 vehicles now on the road with thermal-enhanced systems.
The next step is integrating the thermal sensor within the existing suite of complementary and orthogonal sensors to help optimise driving performance across all conditions by providing critical information and redundancy to ensure safety. Combined with visible-light and LIDAR data and paired with machine learning, thermal data can create a more comprehensive vision system for identifying and classifying roadway objects, especially pedestrians and other living things, in assisted and fully autonomous driving modes.
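As a rough illustration of what such multi-sensor redundancy might look like in software, the sketch below merges per-sensor detections and confirms an object only when at least two independent sensors agree. It is a simplified, hypothetical example, not any production fusion stack, and it assumes an upstream step has already associated detections to a shared object_id.

```python
from collections import defaultdict

def fuse_detections(detections):
    """Late fusion over per-sensor detections.

    `detections` is a list of dicts like
    {"sensor": "thermal", "object_id": 1, "label": "pedestrian", "confidence": 0.9},
    where object_id is assumed to come from an upstream association step.
    An object is confirmed when at least two independent sensors report it.
    """
    by_object = defaultdict(list)
    for det in detections:
        by_object[det["object_id"]].append(det)

    confirmed = []
    for object_id, dets in by_object.items():
        sensors = {d["sensor"] for d in dets}
        if len(sensors) >= 2:                      # redundancy requirement
            best = max(dets, key=lambda d: d["confidence"])
            confirmed.append({"object_id": object_id,
                              "label": best["label"],
                              "sensors": sorted(sensors),
                              "confidence": best["confidence"]})
    return confirmed

# A pedestrian seen by both thermal and LIDAR is confirmed; a glare artefact
# reported only by the visible camera is not.
print(fuse_detections([
    {"sensor": "thermal", "object_id": 1, "label": "pedestrian", "confidence": 0.93},
    {"sensor": "lidar",   "object_id": 1, "label": "pedestrian", "confidence": 0.71},
    {"sensor": "visible", "object_id": 2, "label": "unknown",    "confidence": 0.40},
]))
```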
To accelerate technology development and support innovative automakers collecting machine learning data to build robust ADAS/AV AI systems, FLIR has released thermal datasets to further enable developers. Captured in major cities, these datasets let developers start building AI systems with a cost-effective, weatherproof thermal camera. They empower the automotive industry to quickly evaluate thermal sensors on next-generation algorithms and provide a more comprehensive, redundant and safer system in cars today that works in all driving conditions.
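For developers getting started, a first step with any annotated thermal dataset is simply to inspect what it contains before training. The snippet below is a minimal sketch that assumes annotations in the common COCO JSON format; the file name is hypothetical, so check the layout of the dataset you actually download.

```python
import json
from collections import Counter

# Inspect an annotated thermal dataset before training a detector.
# Assumes COCO-style JSON annotations (a common convention, not a guarantee).
with open("thermal_annotations.json") as f:   # hypothetical file name
    coco = json.load(f)

categories = {c["id"]: c["name"] for c in coco["categories"]}
label_counts = Counter(categories[a["category_id"]] for a in coco["annotations"])

print(f"{len(coco['images'])} thermal frames")
for label, count in label_counts.most_common():
    print(f"{label}: {count} annotated boxes")
```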
While there is a lot of hype, the reality is that technology innovation in this space requires long testing periods for safety purposes, and as such, fully autonomous cars on our roadways are still years away. Despite that, innovation and change within autonomous vehicles are happening right now, and it is thermal technology that will bring significant change to today’s ADAS and fill the performance gap in tomorrow’s AVs.