Security researcher disabled self-driving car with a laser and a Raspberry Pi

Saw this in the news today. Autopilot in the Tesla shouldn't be affected by this specifically, since it uses cameras instead of these lidar sensors, but it's still worth noting when thinking of all the things Tesla needs to consider before releasing Autopilot to the world.

Self-driving cars can be fooled by fake cars, pedestrians and other bogus signals

Google, Uber and even Apple’s potential self-driving car can all be foiled by little more than a homebrew laser pointer thanks to their Lidar sensor systems.

Jonathan Petit, principal scientist at software security company Security Innovation, has unearthed a gaping security vulnerability in Lidar sensors – essentially the eyes of any self-driving car. In a research paper due to be presented at the Black Hat Europe security conference in November, Petit outlines how a low-power laser and pulse generator can fool a car into believing that other cars and pedestrians are around it.

The vulnerability means that self-driving cars could be halted in the middle of the road when they believe another car or person has appeared in view suddenly.
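
For anyone wondering how the trick works: a lidar computes range from pulse time-of-flight, so an attacker who fires a matching pulse with a chosen delay can make a phantom object appear at whatever distance they like. A toy Python sketch (my own illustration, not Petit's actual setup):

# Toy simulation of the lidar-spoofing idea from the article (not
# Petit's code): range comes from round-trip pulse timing, so a
# replayed pulse with a chosen delay reads as a real obstacle.

C = 299_792_458.0  # speed of light, m/s

def range_from_echo(t_emit: float, t_return: float) -> float:
    """Range the lidar infers from a round-trip time, in metres."""
    return (t_return - t_emit) * C / 2.0

t_emit = 0.0

# Legitimate echo: a wall 60 m away.
t_wall = 2 * 60.0 / C
print(f"real wall: {range_from_echo(t_emit, t_wall):.1f} m")

# Spoofed echo: attacker times a pulse to arrive as if it bounced
# off an object only 5 m ahead.
t_spoof = 2 * 5.0 / C
print(f"phantom obstacle: {range_from_echo(t_emit, t_spoof):.1f} m")
# A naive receiver that trusts the first strong return now "sees"
# an object at 5 m and may brake for it.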
 
I believe in most places in the US, it is illegal to shine visible lasers in the direction of a person's face, as it can cause temporary blindness and cause an accident. Shining a visible laser at an aircraft is a felony, and there was recently a push to enforce it after an aircraft (possibly in NY) was forced to abort a landing due to laser light entering the cockpit from a ground-based laser pointer. Although I expect a filter to be put on future lidar systems, it's interesting that the sensors are sensitive to visible light.
 
I believe in most places in the US, it is illegal to shine visible lasers in the direction of a person's face, as it can cause temporary blindness and cause an accident.

Or permanent.

There's an IT networking saying for those people who deal with long-haul fiber optic cables... "Do not look into laser with remaining good eye."
 
IMHO this is a real vulnerability that should get fixed before release. In answer to the above comments:

a) I didn't see anything that talked about visible lasers; most likely he used the same frequency as the lidar.
b) The restrictions on lasers have to do with power output; low-powered lasers, such as the pointers used in classrooms and likely the ones used here, aren't restricted.
c) This is nothing like holding up a cutout. Lasers are very small and easily concealed, so a hidden hacker, or one in another moving vehicle, could spoof the lidar.

I suppose the fix would be putting a code into the lidar pulse and only accepting pulses with the same code.
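
Something like the following, perhaps. A rough Python sketch of that pulse-coding idea (my own illustration, not a real lidar implementation): stamp each shot with a fresh pseudorandom chip sequence and reject returns that don't correlate with it.

# Sketch of the pulse-coding defence: the lidar transmits a fresh
# pseudorandom +/-1 code on each shot and only accepts echoes that
# correlate strongly with the code it just sent.
import numpy as np

rng = np.random.default_rng()

def make_code(n_chips: int = 64) -> np.ndarray:
    """Fresh +/-1 pseudorandom code for this shot."""
    return rng.choice([-1.0, 1.0], size=n_chips)

def accept_return(code: np.ndarray, echo: np.ndarray,
                  threshold: float = 0.8) -> bool:
    """Keep the echo only if it matches this shot's code."""
    score = np.dot(code, echo) / len(code)
    return score >= threshold

code = make_code()

legit_echo = code + rng.normal(0, 0.2, size=code.size)  # noisy reflection
spoof_echo = make_code()                                # attacker's guess

print(accept_return(code, legit_echo))  # True  - correlates with our code
print(accept_return(code, spoof_echo))  # False - wrong code, rejected

An attacker who can't observe and replay the current shot's code in time can't forge echoes that pass the correlation check.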

Then there's the question of existing radar-equipped cars such as Teslas. While accidental interference doesn't seem to be a major issue, I wonder if a malevolent person could cause a car to suddenly brake on the highway by spoofing the radar?
 
The principal discussion here is: do self-driving cars have to be perfect, non-manipulable, and foolproof, or do they only need to be (far) better and safer than human drivers? Shining a laser or a bright light in the eyes of a human driver has the potential to cause accidents. This is basically the same, regardless of the frequency of the light. However, it's clearly malevolent behaviour. Here, let me come up with some other things that would impede self-driving cars and could cause accidents with them: strong electromagnetic pulses that disable all the on-board electronics, projecting fake imagery on the road, spray-painting over speed signs and other traffic signs, dressing children or animals up in camouflage of some kind, etc.

Of course these systems will be prone to manipulation, sometimes in the same forms that could be used against a human, sometimes in new ones. So what?
 
+1 to Johan. Self driving cars are not perfect. Of course any system can be fooled. Any security camera can be disabled by putting a plastic bag over it. A lock can be broken, a window glass can be smashed, a tire can be deflated with a nail, a driver can be blinded with a strong light or laser, a traffic light can be disabled, ...
I hate those stupid articles that try to make it sound like this was an issue.