The pictures are dramatic: an autonomous Waymo taxi bursting into flames on a San Francisco street. It was the work of vandals and Lunar New Year celebrants. Some observers, however, have suggested the burning vehicle is a metaphor for the development, and acceptance, of self-driving vehicles.
There has been a series of setbacks for the companies building and testing autonomous vehicles, but Anirudh Koul, who captured this video of the burning autonomous taxi, tells me this was not a political statement. He says antisocial elements were conducting "a series of escalating dares" that resulted in someone "putting a powerful firecracker [in the car] and blowing it up."
Still, it has been a difficult few months for those working on autonomous vehicles.
In December, Tesla agreed to recall nearly every car it has made to update its self-driving software. Cruise, GM’s autonomous vehicle company, has parked much of its fleet, or as it says, “proactively pause[d] driverless operations across all of our fleets.”
Cruise’s action came after a pedestrian was pinned under one of its vehicles. The company is accused of under-reporting the severity of the accident. It lost its California permit and has seen the forced exodus of several executives. Cruise recently offered a settlement to resolve part of its problems with regulators.
THE GOAL
The goal of a fully autonomous fleet is laudable. Humans often get distracted, or drive impaired, behind the wheel. The promise of autonomy is to dramatically reduce the nearly 40,000 U.S. traffic deaths each year. But the many timelines and promises of autonomy have not been met.
“Ultimately full autonomy, autonomous vehicles, will save lives. But until then, it's the wild wild west.” That is Robert Sumwalt, the former Chairman of the National Transportation Safety Board (NTSB). He’s a former pilot who has used autonomy in aircraft and dealt with some of the early promises and accidents involving self-driving cars.
“We should not have people out driving cars and having other road users being the guinea pigs to see how well the system will operate,” Sumwalt told me recently.
HE HUNG UP!
While he was heading the NTSB, a Tesla driver was killed in 2018 after the car drove into a highway barrier (this is the accident that ultimately led to the December 2023 Tesla recall). Tesla was a party to the NTSB investigation, and no party other than the NTSB itself is allowed to speak to the media. But Elon Musk announced that the driver, not the vehicle, was responsible for the crash.
Sumwalt called Musk to remove Tesla as a party to the continuing investigation. “He threatened to sue the NTSB. And when he said, ‘I want to do whatever we can to prevent being kicked off of the investigation,’ I said, ‘Sorry, but we're past that point. Tesla has been removed from the investigation.’ He basically hung up on me.”
Musk continues to tout what his company calls Full Self-Driving. But in the fine print, even Tesla admits that drivers need to stay engaged and attentive, with eyes on the road and hands on the steering wheel.
“You just can't get turned loose. It's designed to be a driver assistance system. But it's also with the caveat that the driver has to maintain his or her eyes on the road, his or her mind on the road, and his or her hands on the steering wheel,” Sumwalt says.
An interview/podcast with Robert Sumwalt can also be viewed on YouTube.
A SLOW DOWN?
Research and development continue at Waymo, Tesla, and others. Despite all the promises, the road to autonomy was never going to be smooth or fast. I asked Sumwalt when we will see fully autonomous vehicles on the road. “I have no clue. I don't think it will be in my lifetime, and I'm 67 years old at this point.”
Finally, here are two side-by-side photos of the same San Francisco street: the promotional promise and, if you believe some observers, the peril still ahead for self-driving vehicles.
I think this is still many, many years away from happening. The carmakers place many caveats on operating these vehicles safely, such as remaining fully engaged in the driving process with hands on the wheel and/or eyes on the road. Not that I disagree with these caveats. I don't. The problem is that too many drivers will NOT respect or follow the "rules" (keep their hands on the wheel, keep an eye on the road, don't drink and drive, etc.) needed to react to unexpected road hazards. To quote comedian Ron White, "You can't fix stupid."
If every vehicle on the road were fully automated and could "communicate" with every other vehicle within its local driving sphere, I can see it being more acceptable to drivers as well as the NTSB and other government regulators.
Having said all that, if autonomous vehicles can reduce accidents and deaths by large numbers, even with their current problems and limitations, then perhaps they can be seen as "acceptable." There are over 40,000 vehicle accident deaths every year in the U.S., not to mention millions upon millions of injuries and damaged vehicles, virtually all caused by drivers. If autonomous vehicles can substantially reduce these numbers, regulators and the public may decide they are a good way to go. The two big problems are determining whether these reductions are realistic, and then getting enough of them on the road with "guinea pig" drivers and passengers to prove it.
We'll see.