- Researchers at McAfee were able to trick two Teslas into autonomously speeding up by 50 mph.
- The researchers stuck a 2-inch strip of tape on a 35-mph speed sign, and the car’s system misread it as 85 mph and adjusted its speed accordingly.
- The safety of Tesla’s autopilot features has come under close scrutiny, but CEO Elon Musk has predicted the company will have “feature-complete full self-driving” this year.
It turns out all it takes to fool a Tesla’s camera system is a little tape.
Two security researchers managed to trick two Teslas into accelerating well past the speed limit – by fooling their camera systems into misreading a speed sign. We first saw the news via MIT Technology Review.
McAfee researchers Steve Povolny and Shivangee Trivedi stuck two inches of black tape on a 35-mph speed sign, slightly elongating the middle line in the “3”.
They then drove a 2016 Tesla Model X towards the sign with cruise control enabled.
Cruise control is a feature of Tesla’s autopilot that controls the car’s speed and keeps it a safe distance behind the car in front of it.
As the car approached the altered sign, it misread the limit as 35 mph as 85 mph and began accelerating to match, a 50-mph increase.
The same happened in a 2016 Model S.
McAfee disclosed the research last year to Tesla and to Mobileye, the company that supplies the EyeQ3 camera systems used in Tesla’s 2016 models. Tesla declined to comment to MIT Tech Review, but said it would not be fixing hardware problems on that generation of vehicles.
Mobileye dismissed the research.
A spokesperson told MIT Tech Review the modified sign could have been misread by a human, and said the camera hadn’t been designed specifically for fully autonomous driving, which they said will use a variety of technologies, including “crowdsourced mapping,” to support cameras.
Tesla’s newer models use proprietary cameras, and Mobileye has since released newer versions of its cameras, which were not fooled by the modified sign when McAfee tested them.
McAfee researcher Steve Povolny told MIT Tech Review that the findings are still concerning, as plenty of 2016 Teslas are still on the roads. “We are not trying to spread fear and say that if you drive this car, it will accelerate through a barrier, or to sensationalize it. The reason we are doing this research is we’re really trying to raise awareness for both consumers and vendors of the types of flaws that are possible,” he said.
The safety of Tesla’s autopilot systems is under close scrutiny. Last year the National Highway Traffic Safety Administration launched a federal probe into two fatal Tesla crashes in which it determined the autopilot had been on.
The company is also fighting a lawsuit brought by the wife of Walter Huang, an Apple engineer who died after his Tesla crashed into a motorway barrier while on autopilot.
According to Huang’s wife, he had complained about the car’s autopilot veering towards that same barrier multiple times before. Data released by the National Transportation Safety Board last week confirmed the claims, per Ars Technica.
Currently, Tesla emphasizes that its autopilot tools are not meant to make the car fully autonomous, and drivers must always keep their hands on the wheel. But CEO Elon Musk insists that he intends to make Teslas fully self-driving in the near future.
Last year the tech billionaire claimed the company would have a “feature-complete full self-driving” vehicle by the end of 2019. He was forced to walk back that prediction during Tesla’s Q4 earnings call at the beginning of 2020 but still suggested full self-driving is just on the horizon. “It’s looking like we might be feature-complete in a few months,” Musk said.