Hackers Made Tesla Cars Autonomously Accelerate Up To 85 In A 35 Zone

Are you part of the automotive revolution? The move towards electric vehicles is inevitable, given the environmental factors driving the change away from fossil fuels. For many people who have already bought into the EV lifestyle, Tesla is the vehicle of choice. The attraction goes way beyond just the whole "it's electric" thing; Tesla is pushing the envelope in so many areas of automotive technology, from battery development and vehicle security to autonomous driving systems. It was the latter that researchers from the McAfee advanced threat research team decided to explore. During an 18-month research project, they demonstrated a replicable methodology whereby a 2016 Tesla Model X and a 2016 Tesla Model S, both fitted with Tesla hardware pack 1, were made to autonomously accelerate to 50 miles per hour above the posted speed limit. Here's how they did it.

Did security researchers hack not one, but two, Tesla cars?

Tesla has a deserved reputation as the company pushing the electric vehicle envelope the most. Things don't always go as planned, such as when the windows failed to withstand a steel ball thrown at them during the much-hyped debut of the Tesla Cybertruck. There were also some security issues with Model S key fobs before June 2018, but that issue was dealt with, and Tesla has since maintained a good track record as far as security is concerned. As, indeed, it should: Tesla works hard at getting security right. The company has even put up a half-million-dollar bounty if elite hackers at the forthcoming Pwn2Own contest can evade the multiple layers of security in a Tesla Model 3 and execute a full vehicle compromise. This made me raise my eyebrows when I was told that security researchers had managed to hack not one, but two, Tesla vehicles and manipulate the autonomous speed detection system. Unsurprisingly, it turned out not to be quite that straightforward. The Tesla vehicles themselves, a Model X and a Model S, both from 2016 and both with "hardware pack 1" installed, were not hacked. The actual target of the security research was the Mobileye camera, developed by an Intel subsidiary that is a global leader in autonomous driving and real-time collision avoidance systems.

The Mobileye EyeQ 3 camera exploit

According to the Mobileye web page, more than 40 million cars currently have Mobileye technology installed. These cameras use machine vision algorithms to deliver advanced driver assistance, collision avoidance, and autonomous driving systems, with a primary focus on road safety. Steve Povolny, head of the advanced threat research team at McAfee, along with researcher Shivangee Trivedi, set out to determine how easy it would be to use an "adversarial machine learning" methodology, also known as model hacking, to manipulate the Mobileye EyeQ 3 system in those Tesla vehicles fitted with hardware pack 1. It should be pointed out that the Tesla vehicles did not rely entirely upon the data received from these cameras, such as readings of roadside speed signs, for autonomous driving functionality such as automatic cruise control. However, to cut a very long story short, one involving numerous, and very obvious, stickers placed over stop signs and speed signs, the research team finally honed their exploit down to a single 2-inch piece of electrical tape.

Remarkably, by extending the middle bar of the "3" on a 35 miles per hour speed limit sign with just that 2-inch piece of tape, the researchers produced a repeatable misclassification in the Tesla Model X and Model S test vehicles. Both of these 2016 models were enabled with Speed Assist (SA) and Tesla Automatic Cruise Control (TACC), and both were susceptible to the model hacking exploit.

The alteration of the roadside speed sign was subtle enough that most people would likely not notice it with the naked eye, especially when driving past. However, that subtle extension was enough to fool the Mobileye EyeQ 3 system into reading the 35 miles per hour limit as 85 miles per hour. Testing of both vehicles, owned by McAfee employees, demonstrated that the cars would autonomously accelerate toward the spoofed 85 miles per hour limit. Stickering road signs isn't a new approach, but the McAfee research is perhaps the most refined demonstration of the exploit methodology to date.
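To make the mechanics concrete, here is a minimal Python sketch of the idea. Everything in it is hypothetical: a toy 16x16 "sign" flattened to 256 pixel features and a simple logistic regression classifier stand in for the real camera pipeline; this is not the Mobileye EyeQ 3 model or McAfee's code. It illustrates only the principle the researchers exploited: a perturbation confined to a tiny region of the input, like a strip of tape over one stroke of a digit, can be enough to flip a model's reading from "35" to "85".

```python
# Toy illustration of "model hacking" -- not the Mobileye EyeQ 3 or
# McAfee's methodology, just the underlying principle. A linear
# classifier learns to tell "35" signs from "85" signs, then a
# perturbation restricted to 20 of 256 pixels (the "tape") flips its
# answer. For a linear model the input gradient is simply the weight
# vector, so this amounts to a masked, FGSM-style attack.
import numpy as np

rng = np.random.default_rng(0)
STROKE = slice(100, 120)  # hypothetical pixels of the digit's middle stroke

def make_sign(cls, n):
    """Hypothetical flattened 16x16 sign images: the middle stroke is
    dark (-1) on a "35" sign (cls=0) and bright (+1) on an "85" (cls=1)."""
    img = rng.normal(0.0, 0.1, size=(n, 256))
    img[:, STROKE] += 1.0 if cls else -1.0
    return img

X = np.vstack([make_sign(0, 200), make_sign(1, 200)])
y = np.array([0] * 200 + [1] * 200)

# Train logistic regression with plain gradient descent.
w, b = np.zeros(256), 0.0
for _ in range(500):
    p = 1.0 / (1.0 + np.exp(-(X @ w + b)))
    grad = p - y
    w -= 0.1 * (X.T @ grad) / len(y)
    b -= 0.1 * grad.mean()

sign_img = make_sign(0, 1)[0]  # a clean "35" sign
read = lambda x: "85" if x @ w + b > 0 else "35"
print("clean sign reads as:", read(sign_img))         # -> 35

# The "tape": overwrite only the stroke pixels, pushed in the direction
# that raises the "85" score. Tiny in area, decisive in effect.
tape = np.zeros(256)
tape[STROKE] = 2.0 * np.sign(w[STROKE])
print("taped sign reads as:", read(sign_img + tape))  # -> 85
```

In the real attack the perturbation was physical rather than digital, which is what makes the result notable: no access to the model or the vehicle was needed, only to the sign.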

The real-world risk of such model hacking

The researchers readily admit that you have to step back and take a reality check with this type of exploit. "Is there a feasible scenario where an adversary could leverage this type of an attack to cause harm?" the report asked. The researchers say yes, but also state that "this work is highly academic at this time."

Bear in mind that there are numerous "only if" mitigations attached to a successful attack. It works only if the vehicle is an earlier model combining that Tesla hardware pack 1 with the Mobileye EyeQ 3 camera; the latest Tesla models do not implement Mobileye technology. Only if the first implementation of TACC is in use, engaged when the driver double-taps the lever. And only if an attacker has managed to sticker signs along a stretch of road, unnoticed, and a susceptible vehicle with the relevant technology and TACC activated then drives by. It should also be noted that the researchers found that a 2020 vehicle implementing the latest version of the Mobileye camera did not appear to be susceptible to this attack.

The Tesla and Mobileye response

Tesla didn't respond to my request for a statement. However, McAfee researchers say that the findings were disclosed to Tesla on September 27, 2019, and that the company acknowledged the research in return. Tesla has not, according to the McAfee report, "expressed any current plans to address the issue on the existing platform."

A Mobileye spokesperson says that "the modifications to the traffic signs introduced in this research can confuse a human eye and therefore we do not consider this an adversarial attack. Traffic sign fonts are determined by regulators, and so advanced driver assistance systems (ADAS) are primarily focused on other more challenging use cases, and this system in particular was designed to support human drivers – not autonomous driving. Autonomous vehicle technology will not rely on sensing alone, but will also be supported by various other technologies and data, such as crowdsourced mapping, to ensure the reliability of the information received from the camera sensor and offer more robust redundancies and safety."

You can see the exploit in action in this McAfee advanced threat research video. The footage shows the vehicle's speed spiking, but for reasons of safety the researchers applied the brakes well before the 85 miles per hour target was reached.
