Police in Arizona have confirmed the death of a woman struck by an Uber self-driving vehicle, in what appears to be the first fatal crash involving an autonomous vehicle and a pedestrian.
Tempe police said the vehicle hit a woman, who later died of her injuries at a hospital, while it was in autonomous mode with an operator behind the wheel.
Uber has suspended its self-driving vehicles in Phoenix and is said to be working with the police on the investigation. A statement from the company's Twitter account said: “Some incredibly sad news out of Arizona. Our hearts go out to the victim’s family. We are fully cooperating with local authorities in their investigation of this incident.” An Uber spokesman declined to comment further on the crash.
Uber's self-driving vehicles have been in testing for many months in Phoenix, San Francisco, Pittsburgh, and Toronto, but the service has been suspended following this accident.
Self-driving cars are built on the premise that computers can operate vehicles better than humans. But that premise is now in question, as accidents involving autonomous driving attract far more alarm than those involving human operators.
According to The Guardian, automobile accidents kill about 33,000 people every year in the US, or roughly 90 people every day. Taking a single day as a reference: on May 7, 2016, 89 people died in US auto crashes, including Joshua Brown, who was killed while his Tesla was in Autopilot mode. In other words, human drivers killed approximately 88 people in auto crashes that day.
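The per-day figure quoted above is simple division; a quick back-of-the-envelope check (the annual total is the article's cited figure, not an independent statistic):

```python
# Back-of-the-envelope check of the article's figures (illustrative only).
ANNUAL_US_ROAD_DEATHS = 33_000  # yearly US road deaths, as cited from The Guardian

deaths_per_day = ANNUAL_US_ROAD_DEATHS / 365
print(f"Average US road deaths per day: {deaths_per_day:.1f}")  # about 90
```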
A report from the US National Motor Vehicle Crash Causation Survey found that "94% of all accidents in the US are caused by human error (i.e. drivers)."
These statistics may not be damning enough to justify stricter regulation of autonomous driving, but public concern will force more scrutiny of the technology to further reduce the odds of accidents.
However, autonomous driving cannot yet be declared safer, or otherwise. A report released by the RAND Corporation argued that establishing the safety of self-driving cars in terms of fatalities and injuries would take decades, or even centuries, since it requires driving the technology for a vast number of miles.
"Fully autonomous vehicles would have to be driven hundreds of millions of miles and sometimes hundreds of billions of miles to demonstrate their safety in terms of fatalities and injuries," RAND said.
It is still too early to say whether self-driving technology is leading us the wrong way. Perhaps we are looking at a technology that could help avoid some 330,000 US deaths over the next decade. In time, the question of automobile safety will come down to society's choice, and the media will play a significant role in how accidents involving autonomous vehicles are reported.
"In parallel to developing new testing methods, it is imperative to develop adaptive regulations that are designed from the outset to evolve with the technology so that society can better harness the benefits and manage the risks of these rapidly evolving and potentially transformative technologies," RAND advised.