Google's self-driving cars are now tackling street driving, but are they too wimpy?
Google has released the latest update to its self-driving car project, showing how the vehicles have been busy learning to navigate not just motorways, but chaotic city streets.
The video below shows the company’s driverless car navigating a range of different hazards, including construction works, blocked lanes and railway crossings.
“As it turns out, what looks chaotic and random on a city street to the human eye is actually fairly predictable to a computer,” wrote Chris Urmson, the director of the project, in a blog post on Monday.
“As we’ve encountered thousands of different situations, we’ve built software models of what to expect, from the likely (a car stopping at a red light) to the unlikely (blowing through it).”
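To make that idea concrete, here is a deliberately tiny Python sketch of the same kind of probabilistic expectation, assuming a toy model in which another car is given probabilities for stopping at a red light or blowing through it, and the planner yields while the risky case is still plausible. The class, the numbers and the threshold are all invented for illustration; they are not a description of Google's actual software.

from dataclasses import dataclass

@dataclass
class BehaviourModel:
    # Hypothetical probabilities for what a car approaching a red light might do.
    stop_at_red: float = 0.98   # the likely case
    run_the_red: float = 0.02   # the unlikely case

def should_yield(model: BehaviourModel, risk_threshold: float = 0.01) -> bool:
    # Yield if the chance of the dangerous behaviour is still non-negligible.
    return model.run_the_red > risk_threshold

other_car = BehaviourModel()
print("Yield to the other car?", should_yield(other_car))   # True: 0.02 > 0.01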
The technology giant says that its vehicles have now logged nearly 700,000 autonomous miles since 2009, with the only two recorded accidents directly caused by humans (in one, a driverless car was rear-ended at a stop light; in the other, a human had taken control of the vehicle).
However, the latest video by Google highlights another potential problem: what if self-driving cars are too timid?
In the city driving demonstration, Google’s vehicle is cautious to a fault: in one scenario it detects when a cyclist makes a hand signal to change lanes (a big step forward) and continues to yield “even when [the cyclist] changes his mind multiple times”.
Now, obviously a human driver would do the same – not knowing whether the cyclist was having some sort of trouble or just being an idiot – but it does suggest that self-driving cars might lack some of the intuition necessary for realistic driving.
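As a purely illustrative aside, the kind of conservative policy the demo hints at can be sketched in a few lines of Python: keep yielding for as long as the other road user's intent remains ambiguous. Everything below (the function names, the three-observation window, the signal labels) is an assumption invented for this example, not Google's method.

def intent_is_clear(recent_signals: list[str]) -> bool:
    # Treat intent as clear only once the last three observations agree.
    window = recent_signals[-3:]
    return len(window) == 3 and len(set(window)) == 1

def should_keep_yielding(recent_signals: list[str]) -> bool:
    # Stay cautious whenever the other road user's intent is still ambiguous.
    return not intent_is_clear(recent_signals)

# A cyclist signals a lane change, then wavers back and forth.
observations = ["change_lane", "stay", "change_lane", "stay"]
print(should_keep_yielding(observations))   # True: the car keeps yielding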
Being extremely careful is certainly a point in self-driving cars’ favour, but it could cause problems – from bored kids playing chicken with robot drivers to humans taking advantage of the cars’ deferential treatment when, say, pulling out in traffic.
There’ll always be a human in the car to take the wheel (or beep the horn), but if Google really wants to build, as it says, “a vehicle that operates fully without human intervention”, will it need to give its algorithms a bit of aggression as well as prudence?