The problem with self-driving cars is that there’s really no such thing. At least not yet.
In Britain, carmakers have been told to stop claiming that their vehicles are “self-driving,” because drivers have become over-reliant on a technology that still has serious flaws. An industry report says carmakers are giving this technology misleading names, such as Autopilot.
The flaws in self-driving tech came to an ugly head on March 18 in Tempe, Arizona, when an autonomous Uber car with a backup driver struck and killed a pedestrian pushing a bicycle. This past Friday, a comprehensive police report was made public. It concludes that the pedestrian’s death would not have occurred if the safety driver had been paying attention. But the driver was heavily distracted while supposedly monitoring the car, busy watching “The Voice” streaming to her phone over Hulu.
The Tempe Police Department called the crash “entirely avoidable.” Instead, a pedestrian died, and the driver could face a charge of vehicular manslaughter, which can carry a jail term of up to 10 years.
The Uber car was in autonomous mode at the time of the crash. But Uber’s system, like other self-driving technology, requires a backup driver inside the car to intervene when the system fails or a tricky situation arises. Instead, the distracted backup driver had been looking down for nearly one-third of the 22 minutes before the crash. She looked up only half a second before impact.
The driver was clearly lured into a false sense of security. And Uber trusted the tech enough to disable the emergency braking system in the Volvo involved in the crash. Uber, Waymo and Tesla have a moral obligation to tell the world the truth about this technology’s limits, regardless of what it may do to their stock prices. This technology will come.
Eventually. For now, it’s simply not ready.