Self-Driving Cars Can Be Tricked Using Fake Signals

Apart from remote hacks, self-driving cars are also vulnerable to attacks launched from a short distance away. Mounting such an attack does not require expensive tools; it can be done with cheap, off-the-shelf electronics.

Jonathan Petit, a security researcher, has demonstrated that you can trick the laser ranging systems (LIDAR) common on autonomous vehicles by sending carefully timed laser pulses, making the car believe that another car or object is nearby when in reality no such object exists. All that is needed is a low-power laser, a basic computing device such as an Arduino or Raspberry Pi, and the right timing.

In his proof-of-concept attack, Petit showed that he could create imaginary objects as far as 330 feet away, generate multiple copies of them, and make them appear to move. Because a self-driving car is designed to avoid threats, these phantom objects can force it to stop or swerve.
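The attack described above hinges on timing: a LIDAR unit infers distance from a pulse's round-trip time, d = c·t/2, so an attacker who replies with a pulse after a chosen delay makes the sensor "see" an object at the corresponding distance. The sketch below illustrates that arithmetic; it is an illustrative model, not Petit's actual code, and all names and numbers are assumptions.

```python
# Hedged sketch (not Petit's code): how a spoofed echo's timing maps to a
# phantom distance. A LIDAR estimates range from round-trip time, d = c*t/2,
# so a pulse fired back after a chosen delay fakes an object at distance d.

C = 299_792_458.0  # speed of light in m/s

def echo_delay_for_distance(d_m):
    """Delay (s) after the sensor's own pulse that fakes an object d_m metres away."""
    return 2.0 * d_m / C

def phantom_distance(delay_s):
    """Distance (m) the sensor infers from an echo arriving delay_s later."""
    return C * delay_s / 2.0

# A phantom at ~330 ft (about 100 m) needs an echo roughly 0.67 microseconds
# after the sensor's pulse.
delay_100m = echo_delay_for_distance(100.0)

# "Moving" phantoms: shrink the delay slightly on each successive scan and
# the object appears to close in on the car (here, roughly 3 m per scan).
approaching = [phantom_distance(delay_100m - scan * 2e-8) for scan in range(4)]
```

Shrinking the replay delay scan after scan is what makes a stationary attacker look like an approaching obstacle, which is exactly the kind of threat a collision-avoidance system is built to react to.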

Petit’s method only works because the LIDAR unit’s pulses are not encrypted, so this type of attack could be prevented if production-ready self-driving cars protected their outgoing LIDAR pulses. It remains unclear, however, whether this will become a serious, widespread problem for self-driving cars.
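One way to think about protecting the pulses, in the spirit of the "encrypted pulses" fix mentioned above, is to make each outgoing pulse unpredictable so that an injected echo rarely lines up with anything the sensor expects. The sketch below uses secret random timing jitter plus a round-trip plausibility check; every name, parameter, and number here is a hypothetical illustration, not a real LIDAR interface.

```python
# Hedged sketch of one possible countermeasure: emit pulses with secret
# random timing jitter and reject echoes that do not fall inside the
# round-trip window of some recent pulse. Illustrative assumptions only.
import random

C = 299_792_458.0  # speed of light in m/s

def emit_schedule(n_pulses, base_period_s, seed):
    """Emission times with per-pulse jitter known only to the sensor."""
    rng = random.Random(seed)
    return [i * base_period_s + rng.uniform(0.0, base_period_s * 0.1)
            for i in range(n_pulses)]

def is_plausible_echo(echo_time_s, emit_times_s, max_range_m=200.0):
    """Accept an echo only if it fits the round-trip window of some pulse."""
    max_rtt = 2.0 * max_range_m / C  # longest believable round trip
    return any(0.0 <= echo_time_s - t <= max_rtt for t in emit_times_s)

schedule = emit_schedule(10, 1e-3, seed=42)
# A genuine echo from ~50 m after the first pulse fits a valid window...
genuine = schedule[0] + 2.0 * 50.0 / C
# ...while a pulse injected blindly between emissions does not.
injected = schedule[0] + 5e-4
```

Because the jitter is secret, an attacker cannot precompute when a fake echo will be believed; the trade-off, as the article notes, is that such protections would have to be designed into production LIDAR units rather than bolted on later.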

