
Tesla Motors, known for wrapping their innovative long-range battery technology in an attractive future-car shell, has recently become the victim/beneficiary of some white-hat hackers who presented their research at the Def Con hacker conference in early August. The researchers detailed vulnerabilities that remain in the sophisticated sensors Tesla’s vehicles rely on for their autonomous driving capabilities. Tesla drew negative press last May when Joshua Brown was killed in Florida after the autonomous driving features of his Model S failed to avoid a collision with a tractor trailer.

Researchers at the University of South Carolina and Zhejiang University in China, working in collaboration with the Chinese security firm Qihoo 360, have discovered several possible methods for confusing and disrupting the Tesla Model S’s self-driving capabilities.

Tesla’s autonomous driving systems detect the vehicle’s surroundings through three different methods: visual cameras, radar and short-range ultrasonic sensors.

The researchers discovered that with a couple of expensive high-tech instruments, a radio signal generator and a frequency multiplier made by Keysight Technologies and Virginia Diodes Inc., they could precisely jam the radar sensors behind the car’s front grille. Because the Tesla Model S uses these radar sensors to determine the locations of nearby objects, researcher and University of South Carolina professor Wenyuan Xu believes the result illustrates how determined hackers could potentially cause high-speed collisions. These “hacks” were performed under controlled conditions, however, and would be significantly more difficult to reproduce against a potential victim in the real world.
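To make the jamming idea concrete, here is a minimal, purely illustrative C++ sketch. It is not the researchers’ tooling, and every power figure in it is an assumption; it simply shows how added noise arriving at a radar receiver can push a real echo below a simple detection threshold, so the object effectively vanishes from the sensor’s view:

```cpp
// Toy model of noise jamming against a threshold-based radar detector.
// All numbers are assumed for illustration; real automotive radar is far more complex.
#include <cmath>
#include <iostream>

int main() {
    const double echo_power_dbm  = -70.0;  // assumed power of the echo from a real obstacle
    const double noise_floor_dbm = -95.0;  // assumed receiver noise floor with no jamming
    const double detect_snr_db   = 15.0;   // assumed SNR needed to declare "object present"

    // Sweep the jamming power reaching the radar's receiver.
    for (double jam_dbm = -120.0; jam_dbm <= -60.0; jam_dbm += 10.0) {
        // Combine thermal noise and jamming power (dBm -> mW, sum, back to dBm).
        double interference_mw = std::pow(10.0, noise_floor_dbm / 10.0) +
                                 std::pow(10.0, jam_dbm / 10.0);
        double interference_dbm = 10.0 * std::log10(interference_mw);

        double snr_db = echo_power_dbm - interference_dbm;
        bool detected = snr_db >= detect_snr_db;

        std::cout << "jammer " << jam_dbm << " dBm -> SNR " << snr_db << " dB -> "
                  << (detected ? "object detected" : "object masked") << "\n";
    }
    return 0;
}
```

With no jamming, the echo sits comfortably above the threshold; once the jamming power approaches the echo power, the object is masked, which is the effect the researchers say could hide an obstacle from the car.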

Even Tesla’s Batmobile-esque “summon” feature, which relies on the same short-range ultrasonic sensors that control its self-parking feature, can be manipulated with a cheaper array of technologies. To fool the sound-based sensors, the researchers used a small Arduino computer to generate electrical signals that were then passed through an ultrasonic transducer, converting those voltages into sound waves. Using those sound waves, the researchers were able to trick the vehicle into recognizing a phantom object in its path, or to cause it to fail to recognize a legitimate obstacle. They also showed that the sensors could sometimes be fooled simply by wrapping objects in cheap, low-tech acoustic dampening foam.
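The ultrasonic trick is easier to picture with a short example. The sketch below is a hedged illustration in Arduino-flavored C++, not the researchers’ code; the pin wiring, the 40 kHz tone, and the burst timing are assumptions. It shows the general idea described above: a microcontroller driving an ultrasonic transducer to emit bursts in the band parking sensors listen to, the kind of signal a car could mistake for an echo from a phantom obstacle:

```cpp
// Illustrative Arduino sketch only -- not the researchers' code.
// Assumes a transducer driver wired to pin 9 and a sensor band near 40 kHz.

const int TRANSDUCER_PIN = 9;              // assumed output pin driving the transducer
const unsigned int ULTRASOUND_HZ = 40000;  // typical parking-sensor frequency band

void setup() {
  pinMode(TRANSDUCER_PIN, OUTPUT);
}

void loop() {
  // Emit a short ultrasonic burst, then go quiet, mimicking the echo pattern
  // a parking sensor would expect from a nearby object.
  tone(TRANSDUCER_PIN, ULTRASOUND_HZ);
  delayMicroseconds(500);                  // ~0.5 ms burst
  noTone(TRANSDUCER_PIN);
  delay(20);                               // repeat roughly every 20 ms
}
```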

On the upside, the visual cameras proved largely resistant to tampering; the researchers had little success manipulating them with various jamming and spoofing techniques. Though much of the Def Con presentation focused on Tesla, the researchers also successfully jammed sensors on vehicles made by Ford, Audi and Volkswagen.

A Tesla spokesperson responded to the research: “We appreciate the work Wenyuan and team put into researching potential attacks on sensors used in the Autopilot system. We have reviewed these results with Wenyuan’s team and have thus far not been able to reproduce any real-world cases that pose risk to Tesla drivers.”

White-hat hackers and independent researchers are a boon for technology businesses, playing a crucial role in uncovering problems and vulnerabilities before mass deployment. Tesla CEO and wunderkind Elon Musk took to Twitter to echo this belief earlier this year in response to a Tesla owner’s complaints about system updates, proclaiming “good hacking is a gift.”