Researchers hacked a Tesla’s autopilot using three stickers on the road

WHY THIS MATTERS IN BRIEF

Neural networks see the world differently from humans, which is why they can be fooled by relatively simple trickery.

 


A prolific cyber security research firm has announced that it managed to make Tesla's self-driving feature veer off course, into theoretical oncoming traffic, just by sticking three small stickers onto the road surface. It's not the first time researchers have tricked self-driving vehicles into driving dangerously: elsewhere, researchers stuck small stickers onto road signs that fooled cars into misreading speed signs saying 30 mph as 100 mph. All of this stems from the way the Artificial Intelligence (AI) neural networks that control the cars see the world, and as researchers cotton on to the fact that this is a potentially major security concern, if for no other reason than that these attacks are cheap and low-tech, they're already becoming a worrisome occurrence.
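The underlying weakness is the well-documented adversarial example effect: because a network's decision boundary is locally close to linear, a small, deliberately chosen perturbation can flip its output. Here is a minimal toy sketch of the idea using a linear classifier in plain NumPy; it illustrates the principle behind the fast gradient sign method and is in no way a model of Tesla's actual system.

```python
# Toy illustration (NOT Tesla's system): a tiny linear classifier is
# flipped by a small, targeted perturbation -- the same principle
# behind physical sticker attacks on vision models.
import numpy as np

w = np.linspace(-1, 1, 16)       # fixed "trained" weights of a linear model
x = w / np.linalg.norm(w)        # an input the model classifies confidently

def predict(v):
    return 1 if w @ v > 0 else 0

# For a linear model the gradient of the score w.r.t. the input is just w,
# so nudging every input dimension slightly against sign(w) flips the output.
eps = 0.5
x_adv = x - eps * np.sign(w)

print(predict(x), predict(x_adv))  # prints: 1 0
```

The perturbation is small per dimension, yet because every dimension is pushed in the worst-case direction at once, the combined effect overwhelms the model's confidence.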

Tencent’s Keen Lab, a two-time honoree of Tesla’s Bug Bounty hall of fame program, said in a research paper that it found two ways to trick Autopilot’s lane recognition by changing the physical road surface in front of it.

 


 

The lab’s first attempt to confuse Autopilot used blurred patches on the left lane line, an approach the team said would be too difficult for someone to actually deploy in the real world and too easy for Tesla’s computer to recognise.

 

[Embedded media: the explainer (in Chinese)]

“It is difficult for an attacker to deploy some unobtrusive markings in the physical world to disable the lane recognition function of a moving Tesla vehicle,” Keen said.

 

[Image: the white dot is clearly visible]

 

The researchers said they suspected that Tesla also handled this situation well because it’s already added many “abnormal lanes” in its training set of Autopilot miles. This gives Tesla vehicles a good sense of lane direction even without good lighting, or in inclement weather, they said.

Not deterred by the low plausibility of the first idea, Keen then set out to make Tesla’s Autopilot mistakenly think there was a traffic lane when one wasn’t actually present.

The researchers painted three tiny squares, which you can see in the picture, in the traffic lane to mimic merge striping and cause the car to veer into oncoming traffic in the left lane.
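To see why just a few marks can redirect a car, consider a deliberately simplified sketch: a hypothetical lane follower that least-squares-fits a line through detected lane-marking points and steers along its slope. Three fake dots placed ahead of the car bend the fitted heading. Every name and number below is illustrative; this is not Keen Lab's method or Tesla's pipeline.

```python
# Hypothetical sketch of why a few well-placed marks matter: a naive
# lane follower that fits a straight line through detected lane-marking
# points is steered sideways by just three fake dots.
import numpy as np

def lane_heading(points):
    """Slope (lateral drift per metre travelled) of the fitted lane line."""
    y, x = points[:, 0], points[:, 1]  # y = distance ahead, x = lateral offset
    return np.polyfit(y, x, 1)[0]      # degree-1 least-squares fit

real = np.array([[d, 0.0] for d in range(1, 11)])            # straight lane ahead
fake = np.array([[11.0, -0.4], [12.0, -0.8], [13.0, -1.2]])  # three "stickers"

print(lane_heading(real))                     # ~0: drive straight
print(lane_heading(np.vstack([real, fake])))  # < 0: drift toward the left lane
```

Because the fit averages over all detected points, a handful of outliers placed at the far end of the detection range gets outsized leverage over the heading, which is roughly what the mock merge striping exploits.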

 


 

“Misleading the autopilot vehicle to the wrong direction [of traffic] with some patches made by a malicious attacker is sometimes more dangerous than making it fail to recognise the lane,” Keen said.

“If the vehicle knows that the fake lane is pointing to the reverse lane, it should ignore this fake lane and then it could avoid a traffic accident,” they said.

In response to Keen’s findings, Tesla said the issues “didn’t represent real world problems and no drivers had encountered any of the report’s identified problems.”

“In this demonstration the researchers adjusted the physical environment, by placing tape on the road and altering lane lines, around the vehicle to make the car behave differently when Autopilot is in use,” the company said.

“This is not a real-world concern given that a driver can easily override Autopilot at any time by using the steering wheel or brakes and should be prepared to do so at all times.”

 


 

A Tesla representative told Business Insider that while Keen’s findings weren’t eligible for the company’s bug bounty program, the company held the researchers’ insights in high regard.

“We know it took an extraordinary amount of time, effort, and skill, and we look forward to reviewing future reports from this group,” a representative said.

However, there is a catch. Today, in a pre fully autonomous car world, Tesla mandates that drivers keep an eye on the road and “be ready in an instant to take over from the car’s Autopilot system in the event of an emergency.” Tomorrow, when that driver is in the back of the car having a massage in a massage seat, something that’s being touted by Tesla competitor Toyota, this kind of hack could quickly become a very real real-world concern. So whether Tesla recognises it as a problem or not, I for one suggest they look into ways of fixing it now, before someone doesn’t come home for dinner…

Source: Business Insider
