
Industry-standard camera-LiDAR sensors on self-driving vehicles can be tricked

Researchers from Duke University have demonstrated the first attack strategy that can fool industry-standard autonomous vehicle sensors into believing that nearby objects are closer (or farther away) than they actually are, without the attack being detected.

The researchers showed that a popular method for securing LiDAR sensors against “naive attacks” works only at short distances and remains vulnerable at longer ones. Here, a LiDAR system is tricked into thinking a car is somewhere else until it is too late to avoid a sudden and drastic course correction. Image credit: Duke University.

The study suggests that adding three-dimensional (3D) optical capabilities or the ability to share data with nearby cars may be necessary to fully protect self-driving cars from attack. The results will be presented at the 2022 USENIX Security Symposium, a premier venue in the field, held between August 10 and 12.

One of the biggest challenges faced by scientists designing autonomous driving systems is guarding against attacks. A common strategy for ensuring security is to cross-check measurements from separate sensing systems against one another to confirm that they make sense together.

The most common tracking technology used by today’s self-driving car manufacturers combines 3D data from LiDAR, which is essentially a laser-based radar, with two-dimensional (2D) data from cameras. This combination has proven very strong against a wide variety of attacks that attempt to trick the visual system into perceiving the world inaccurately.
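As a rough illustration of that cross-check, the sketch below (Python with NumPy; the pinhole-camera intrinsics, function names, and pixel margin are illustrative assumptions, not the actual fusion software used on any vehicle) projects a LiDAR detection into the camera image and asks whether it lands inside the camera’s 2D bounding box:

```python
import numpy as np

def project_to_image(point_3d, fx=1000.0, fy=1000.0, cx=640.0, cy=360.0):
    """Project a 3D point (camera frame, z forward) onto the image plane
    using a simple pinhole model with assumed intrinsics."""
    x, y, z = point_3d
    if z <= 0:
        return None  # behind the camera
    return np.array([fx * x / z + cx, fy * y / z + cy])

def detections_consistent(lidar_center_3d, camera_box_2d, margin=20.0):
    """Return True if the LiDAR detection's center projects inside the camera's
    2D bounding box (x_min, y_min, x_max, y_max), within a pixel margin."""
    pixel = project_to_image(lidar_center_3d)
    if pixel is None:
        return False
    x_min, y_min, x_max, y_max = camera_box_2d
    return (x_min - margin <= pixel[0] <= x_max + margin and
            y_min - margin <= pixel[1] <= y_max + margin)

# Example: a car detected 20 m ahead by LiDAR and boxed by the camera.
print(detections_consistent(np.array([0.5, 0.2, 20.0]), (600, 330, 700, 400)))
```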

The new work, however, shows that this confidence may be misplaced.

Our goal is to understand the limitations of existing systems so that we can protect ourselves against attacks. This research shows how adding a few data points to the 3D point cloud in front of or behind where an object actually is can cause these systems to make dangerous decisions.

Miroslav Pajic, Dickinson Family Associate Professor of Electrical and Computer Engineering, Duke University
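To make the quoted idea concrete, here is a minimal sketch, in Python with NumPy, of what injecting a handful of spoofed returns near a real object might look like; the function name, point count, and noise scale are illustrative assumptions rather than the researchers’ actual attack code:

```python
import numpy as np

def inject_spoofed_points(point_cloud, object_center, shift=5.0, n_points=10, rng=None):
    """Append a small cluster of fake LiDAR returns shifted along the
    sensor-to-object ray (positive shift = farther away, negative = closer).
    Illustrative only; real spoofing is constrained by the laser hardware."""
    rng = np.random.default_rng() if rng is None else rng
    direction = object_center / np.linalg.norm(object_center)  # ray from sensor at origin
    fake_center = object_center + shift * direction
    # Scatter the fake points slightly so they resemble a real surface.
    fake_points = fake_center + rng.normal(scale=0.1, size=(n_points, 3))
    return np.vstack([point_cloud, fake_points])

# Example: a true object 20 m ahead, with spoofed returns placed 5 m behind it.
cloud = np.random.default_rng(0).uniform(-30, 30, size=(1000, 3))
attacked = inject_spoofed_points(cloud, np.array([0.0, 0.0, 20.0]), shift=5.0)
print(attacked.shape)  # (1010, 3)
```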

The new attack strategy works by firing a laser into a car’s LiDAR sensor to inject fake data points into its perception. If those data points conflict with what the car’s camera sees, previous research has shown that the system can identify the attack.

But the new study by Pajic and his colleagues demonstrates that 3D LiDAR data points carefully placed within a specific area of a camera’s 2D field of view can fool the system.

This vulnerable area extends in front of the camera’s lens in the shape of a frustum, a 3D pyramid with its tip sliced off. For a forward-facing camera mounted on a car, this means that a few data points placed in front of or behind another nearby car can shift the system’s perception of that car by several meters.
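A small sketch of why that frustum matters, under the same assumed pinhole-camera parameters as above (all values are illustrative): two points on the same camera ray but meters apart in depth project to the same pixel, so a purely 2D consistency check cannot tell them apart.

```python
import numpy as np

def in_camera_frustum(point_3d, box_2d, fx=1000.0, fy=1000.0, cx=640.0, cy=360.0):
    """Check whether a 3D point (camera frame, z forward) projects inside a
    2D bounding box (x_min, y_min, x_max, y_max) - i.e. lies in the frustum
    that the box sweeps out in front of the lens."""
    x, y, z = point_3d
    if z <= 0:
        return False
    u, v = fx * x / z + cx, fy * y / z + cy
    x_min, y_min, x_max, y_max = box_2d
    return x_min <= u <= x_max and y_min <= v <= y_max

box = (600, 330, 700, 400)                # camera's box around a car
true_point = np.array([0.5, 0.2, 20.0])   # real return at 20 m
spoof_point = true_point * (26.0 / 20.0)  # same ray, pushed out to 26 m
print(in_camera_frustum(true_point, box), in_camera_frustum(spoof_point, box))
# Both True: the depth changed by 6 m, yet the 2D check sees nothing unusual.
```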

This so-called frustum attack can trick the adaptive cruise control into thinking that a vehicle is slowing down or speeding up. And by the time the system can figure out there is a problem, there will be no way to avoid hitting the car without aggressive maneuvers that could create even more problems.

Miroslav Pajic, Dickinson Family Associate Professor of Electrical and Computer Engineering, Duke University

Pajic explains that there is little risk of someone taking the time to set up lasers on a car or roadside object to fool individual vehicles traveling down the highway. The risk rises considerably, however, in military scenarios where single vehicles can be very high-value targets.

And if hackers discovered a way to create these fake data points virtually, rather than requiring physical lasers, many vehicles could be attacked simultaneously.

Pajic says the way to protect against these attacks is added redundancy. For example, if cars had “stereo cameras” with overlapping fields of view, they could better estimate distances and flag LiDAR data that does not match their perception.

Stereo cameras are more likely to be a reliable consistency check, although no software has been validated well enough to determine whether LiDAR and stereo-camera data are consistent, or what to do if they turn out not to be. Additionally, perfectly securing the entire vehicle would require multiple sets of stereo cameras around its whole body to provide 100% coverage.

Spencer Hallyburton, PhD candidate and lead study author, Cyber-Physical Systems Lab, Duke University
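A minimal sketch of the kind of stereo-versus-LiDAR consistency check being described, assuming a rectified stereo pair with a known focal length and baseline; the parameter values, function names, and tolerance are illustrative, not from the study:

```python
import numpy as np

def stereo_depth(disparity_px, focal_px=1000.0, baseline_m=0.5):
    """Depth from a rectified stereo pair: z = f * B / disparity.
    focal_px and baseline_m are assumed camera parameters."""
    if disparity_px <= 0:
        return np.inf
    return focal_px * baseline_m / disparity_px

def lidar_matches_stereo(lidar_range_m, disparity_px, tolerance_m=2.0):
    """Flag LiDAR returns whose range disagrees with the stereo estimate
    by more than a tolerance - a possible sign of spoofed points."""
    return abs(lidar_range_m - stereo_depth(disparity_px)) <= tolerance_m

# A car at roughly 20 m yields a disparity of about 25 px with these parameters.
print(lidar_matches_stereo(20.0, 25.0))   # True: consistent
print(lidar_matches_stereo(26.0, 25.0))   # False: the spoofed depth stands out
```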

Another option, Pajic suggests, is to develop systems in which cars in close proximity share some of their data. Physical attacks are unlikely to affect multiple cars simultaneously, and because different brands of cars run different operating systems, a cyberattack is unlikely to reach all of them at once.
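One way such shared data might be used is a simple cross-vehicle agreement check, as in the hypothetical sketch below; the common map frame, tolerance, and majority rule are all assumptions made for illustration:

```python
import numpy as np

def majority_agrees(own_estimate, shared_estimates, tolerance_m=2.0):
    """Cross-check one vehicle's estimate of a target's position against
    estimates shared by nearby vehicles; trust it only if most agree.
    Assumes all estimates are already expressed in a common map frame."""
    own = np.asarray(own_estimate)
    votes = [np.linalg.norm(own - np.asarray(e)) <= tolerance_m
             for e in shared_estimates]
    return sum(votes) > len(votes) / 2

# Two neighbours see the car near (0, 20); a spoofed reading at (0, 26) is rejected.
neighbours = [(0.2, 19.8), (-0.1, 20.3)]
print(majority_agrees((0.0, 26.0), neighbours))   # False
print(majority_agrees((0.0, 20.1), neighbours))   # True
```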

With all the work going on in this area, we will be able to build systems that you can trust with your life. It could take more than 10 years, but I’m confident we’ll get there.

Miroslav Pajic, Dickinson Family Associate Professor of Electrical and Computer Engineering, Duke University

This study received support from the Office of Naval Research (N00014-20-1-2745), the National Science Foundation (CNS-1652544, CNS-2112562), and the Air Force Office of Scientific Research (FA9550-19-1-0169).

Source: https://duke.edu/