Tesla autopilot crash: Fatal collision was tragic but self-driving technology should still continue, say experts

Scientists and engineers say that automated driving technology could make our roads safer in the long run

Emma Boyle
Friday 01 July 2016 13:18 BST
The interior of a Tesla Model S is shown in autopilot mode (Reuters)

Experts have come out in defence of automated driving technology after a driver was killed while using his Tesla's autopilot feature. Specialists in the fields of artificial intelligence, engineering, and transport have said that while the death was tragic, it should not prevent the software from being developed.

Joshua Brown, 40, died when his Tesla Model S went underneath the trailer of a lorry that had turned left in front of him on a Florida road in May, prompting an urgent investigation by Tesla itself and the US authorities.

In a statement on its blog, Tesla explained that the technology is still under development and that, as an assist feature, drivers “need to maintain control and responsibility for [their] vehicle” while using it. The company went on to say that the autopilot mode does still result “in a statistically significant improvement in safety.”

Experts have responded to the news by saying that Tesla should not let the accident deter the continued development of the technology. According to Nello Cristianini, Professor of Artificial Intelligence at the University of Bristol, it has the potential to greatly improve road safety. “Tesla reports less fatalities in autonomous cars than in the general driving population (1 fatality every 130 million miles against 1 fatality in 94 million miles for general drivers),” he said.

Meanwhile, Prof William Harwin, Professor of Cybernetics at the University of Reading, has said that “Unfortunately, fatal accidents will always happen with new engineering systems. There are any number of examples from the past” and that though “Tesla should recall all relevant products to at least disable the lane changing feature until this accident can be fully investigated [...] the bottom line is that cars are likely to be safer with these automatic features, and ultimately with vehicles that can drive autonomously.”

Harwin’s opinion is shared by Prof Duc Pham FREng, of the School of Engineering at the University of Birmingham, and Prof Slawomir Nasuto, Professor of Cybernetics at the University of Reading, who both say that accidents such as this are an “inevitable” part of developing new technology.

Though driverless technology is still very much “a work in progress”, according to Prof Alan Winfield, of the Bristol Robotics Laboratory and Director of the Science Communication Unit at the University of the West of England, Tesla’s partial automation approach is the wrong one: “An autopilot that requires that the driver is paying attention and ready to take over in a split second is the wrong approach. It is inevitable that a driver's attention will wane if they have nothing to do. I also believe it irresponsible of manufacturers to make unregulated autopilot software available for drivers to try out on public roads.”

Professor Nasuto, however, points out that “it is much, much harder to design artificial intelligence to replace the role of a driver, operating alongside other human drivers, pedestrians and cyclists, than to replace the whole road system”. Sahar Danesh, IET Principal Policy Advisor for Transport, adds: “we are unlikely to see fully autonomous vehicles in the very near future but what we will see is increased levels of automation, such as speed and lane control, rather than completely driverless cars.”

Whether or not the degree of automation Tesla allows is the right one, all of the experts agree that the results of Tesla’s investigation into the crash will lead to improvements in self-driving technology that should help avoid similar incidents in the future.

Sahar Danesh emphasised that “It is important to remember that driverless vehicles have huge potential to transform the UK’s transport network. In the long term, autonomous cars could improve road safety, reduce congestion and lower emissions” but that “public acceptance and trust are crucial.”

The experts avoid laying blame on any particular party involved in the crash, and investigations are still ongoing. The general consensus, however, is that while the accident is a tragedy and driverless technology has a long way to go before it can be considered completely safe, it has the potential to improve safety on our roads and its development should continue.
