Researchers and cybersecurity specialists sometimes don’t detect security flaws in new technology until it reaches the market. In the worst cases, threat actors are the first to find these loopholes, so the work of researchers and security firms is vital to correct such flaws before they are exploited.
Recently, a team of researchers from Ben-Gurion University in Israel published a report detailing a security loophole in the autopilot systems used in various smart cars, including the Tesla Model X. According to the researchers, it is possible to use a drone and a mini projector to deceive these navigation systems by projecting fake images onto the road or onto surrounding billboards.
Exploiting these flaws can cause unexpected braking or, in other cases, sudden changes in the car’s course, compromising the physical safety of the occupants. For testing, the researchers used a Tesla Model X as well as an advanced driver assistance system (ADAS) from the manufacturer Mobileye.
HACKING AUTOMATIC DRIVING SYSTEMS
Look carefully at the following image; do you think what you’re seeing is real?
Well, both the Tesla Model X and the Mobileye 630 PRO system identified the projected images as real, altering the car’s behavior on the road.
The researchers refer to these images as “phantoms”: depthless objects that ADAS and autopilot systems perceive and treat as real physical objects. The projections can vary, including images of people, other cars, road signs, or even lane markings on the asphalt.
During the experiment, the researchers used a projector mounted on a mini drone. Projecting a phantom for just 125 milliseconds was enough to cause the Tesla Model X to deviate from its original course.
These projections can also force the Tesla Model X to brake suddenly. In one test, the car slowed significantly after the automated driving system detected one of these phantoms and identified it as a real person.
While exploiting these loopholes is complex, some security measures should be considered. According to the researchers, one way to mitigate the risk is to configure ADAS systems to take into account factors such as the light reflected off objects and surfaces, allowing them to better distinguish real objects from depthless projections.
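The mitigation described above can be pictured as a committee of independent plausibility checks whose scores are combined before trusting a detection. The sketch below is purely illustrative: the factor names (light reflection, surface texture, depth consistency), the weights, and the threshold are assumptions for demonstration, not the researchers' actual model.

```python
# Hypothetical sketch of a multi-factor phantom filter: several
# independent checks each score how physically plausible a detected
# object is, and the combined score decides whether to trust it.
# All names, weights, and thresholds here are illustrative assumptions.

def plausibility(light_reflection, surface_texture, depth_consistency,
                 weights=(0.4, 0.3, 0.3)):
    """Each input is a plausibility score in [0, 1] from a dedicated
    check (e.g. does the object reflect light like a solid surface?).
    Returns the weighted combined plausibility."""
    scores = (light_reflection, surface_texture, depth_consistency)
    return sum(w * s for w, s in zip(weights, scores))

def is_phantom(light_reflection, surface_texture, depth_consistency,
               threshold=0.5):
    """Flag the detection as a depthless projection when the combined
    plausibility falls below the threshold."""
    return plausibility(light_reflection, surface_texture,
                        depth_consistency) < threshold

# A projected image may look convincing to the camera yet fail the
# depth and reflection checks:
print(is_phantom(0.2, 0.6, 0.1))   # → True (likely a phantom)
print(is_phantom(0.9, 0.8, 0.95))  # → False (likely a real object)
```

The point of combining several weak signals is that a projection may fool any single check, but is unlikely to pass all of them at once.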
This is not the first time security failures have been detected in a connected car. A couple of years ago, the International Institute of Cyber Security (IICS) reported a security hole in a Jeep Cherokee that allowed attackers to take control of the car’s critical systems, prompting a recall of multiple models.