Hacked billboards could trick self-driving cars into suddenly stopping
‘A Tesla will apply the brakes or possibly swerve, and that’s dangerous,’ security researcher warns
Self-driving cars could be tricked into suddenly stopping or dangerously swerving if presented with a “phantom object” on a billboard or other digital display, researchers have warned.
A demonstration by security researchers at Israel’s Ben Gurion University of the Negev showed that a hijacked billboard showing an image of a stop sign for just a fraction of a second would be enough to trigger the automatic brakes of an autonomous car.
Automated driving systems could also be confused by an image of a pedestrian flashing in front of the car.
“The attacker just shines an image of something on the road or injects a few frames into a digital billboard, and the car will apply the brakes or possibly swerve, and that’s dangerous,” researcher Yisroel Mirsky told Wired, which first reported the vulnerability.
“The driver won’t even notice at all. So somebody’s car will just react, and they won’t understand why.”
The researchers created a scenario where attackers inject fake road signs into an internet-enabled billboard by hacking it – a technique that has been used by hackers in the past. An incident in 2017, for example, saw a road sign in California display the message “Trump has herpes.”
The phantom objects inserted by the researchers were able to fool a Tesla Model X running the most recent version of Tesla’s Autopilot software.
They used a McDonald’s advert playing on a television-sized billboard for the test, adding a stop sign over the ad for just 0.43 seconds.
A research paper detailing the attack method revealed that it also worked on the Mobileye 630 Pro self-driving system.
The researchers proposed that countermeasures could be taken, such as introducing software that recognises when a flashed road sign is a phantom object.
Tesla’s Autopilot system is known to occasionally misinterpret speed signs placed on the back of school buses. Earlier this week, a Tesla owner alerted the company’s CEO Elon Musk to the issue on Twitter.
“This is getting annoying now,” they tweeted, together with an image of their Tesla identifying the bus as a speed sign.
Mr Musk replied: “We face a tough dichotomy of applying resources to the old architecture or applying them to the new. It’s not a question of money. If there was a ‘great engineer’ factory, we would place a large order! Unfortunately, great engineers are very rare.”
The researchers of the latest study said they presented their findings to Tesla and Mobileye.
The Independent has reached out to Mobileye for comment. A spokesperson for Tesla was not available.