Uber Crash Shows Need for Collaboration in Self-Driving Cars

(Photo: An engineer points to a screen inside a Hyundai Motor Co. Nexo autonomous fuel cell electric vehicle during a test drive in Pyeongchang, Gangwon Province, South Korea. Photographer: SeongJoon Cho/Bloomberg)

(Bloomberg Gadfly) -- Here's a vision of the future for you: A fleet of 25 million robots roams the streets of America, killing someone every day.

The owners of the robots skew toward the wealthiest 10 percent of the country, who use the technology to get more time sleeping, posting on Instagram, getting over drinking binges and watching cat videos. The victims skew toward the poor and underprivileged.

Believe it or not, that might be a slightly rose-tinted view of the near future of autonomous vehicles: a situation in which self-driving cars make up 10 percent of the U.S. vehicle fleet and cause 90 percent fewer fatal accidents than human drivers do. If the autonomous fleet is larger, or less successful at averting accidents, the toll will be higher.
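
How does that square with the "killing someone every day" opener? A back-of-envelope check, assuming roughly 37,000 annual U.S. road deaths and a fleet of about 250 million registered vehicles (approximate figures not cited in the column), is sketched below.

    # Back-of-envelope check of the "one fatality a day" figure.
    # Assumed inputs (not from the column): ~37,000 annual U.S. road deaths
    # and a fleet of ~250 million registered vehicles.
    ANNUAL_US_ROAD_DEATHS = 37_000
    US_VEHICLE_FLEET = 250_000_000

    autonomous_share = 0.10      # self-driving cars are 10% of the fleet
    fatality_reduction = 0.90    # and cause 90% fewer fatal accidents

    autonomous_fleet = US_VEHICLE_FLEET * autonomous_share
    deaths_per_year = ANNUAL_US_ROAD_DEATHS * autonomous_share * (1 - fatality_reduction)

    print(f"{autonomous_fleet:,.0f} autonomous vehicles")   # 25,000,000
    print(f"{deaths_per_year:.0f} deaths a year")           # ~370
    print(f"{deaths_per_year / 365:.1f} deaths a day")      # ~1.0

On those assumptions, a 10 percent autonomous fleet is about 25 million vehicles, and even a 90 percent cut in fatal accidents leaves roughly 370 deaths a year -- about one a day.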

That's the context in which to think about Uber Technologies Inc.'s decision to suspend road tests of its self-driving cars Monday, after one of them struck and killed a woman in Arizona.

The incident shouldn't come as a surprise. Following the lead of many experts in automobile and aviation safety, Gadfly has argued for years that the industry needs to be more honest with itself about the risks inherent in automated driving. It's well past time for the technology's boosters to heed this wake-up call.

This isn't Luddism. The lesson from other transportation technologies is that autonomous cars should eventually become very safe. But the same history indicates the most likely route to that destination involves learning the grim lessons of fatalities as accidents start to spike in line with adoption of the technology. Transportation safety advances the same way Max Planck saw science progressing: One funeral at a time.

What practical lessons should be drawn from this?

  1. Be realistic about expectations. Outside of some routine, specialized uses in gynecology, robotic surgery shows no clear pattern of outperforming human surgeons on safety, even though it has been in widespread use for two decades and operates only in the controlled environment of an operating theater. Similarly, in its early years the safety benefit of autonomous-vehicle technology is likely to be little better than what could be gained from, say, stricter enforcement of seat-belt laws or speed limits. It would be wise to acknowledge up front that the early gains may look modest.
  2. Be honest about what we know. The relatively light toll from autonomous vehicles to date needs to be considered in light of how little they're used. The fatality rate on U.S. roads is about 1.18 per 100 million vehicle miles traveled; as of December, Uber had completed just two million autonomous miles, and even Alphabet Inc.'s Waymo had only reached four million. There's simply not enough data to make bold claims about safety yet, and it may take decades to assemble such information (see the rough calculation after this list). Rolling out this technology is a calculated leap in the dark, and should be acknowledged as such.
  3. Don't try to reason with fear. Even once we have autonomous cars that are demonstrably safer, people may not feel safer around them, because we tend to grade risks by how much control we think we have. Airlines and aerospace companies have responded to that by making air travel extraordinarily safe, yet people remain more fearful of flying than of driving. The fact that we may feel more at risk around cars whose accident responses are governed by algorithms is a problem the industry needs to overcome, rather than wish away by pointing to theoretical safety gains.
  4. Sharing is caring. The best way to find out what causes crashes, and to avert future ones, is to share information about them. That's largely the approach taken by the aviation industry, but many in the auto industry are reluctant to embrace it, tending to treat such data as proprietary information that companies need in order to maintain their competitive advantage. The larger the industry's shared safety database, the faster accident rates are likely to fall.
  5. It's more important to be right than first. It's natural in a technology gold rush to want to be first to the mother lode -- but the biggest risk to the deployment of autonomous cars may be that accidents caused by an over-eager rollout trigger a regulatory backlash. Despite attempts to harmonize safety rules at the federal level in the U.S., the relatively permissive environment for driverless cars could still be taken away by state-level regulation. The players to emulate are probably those that are most circumspect, not those that are most bullish.
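
To put the second point in rough numbers: at the human-driver fatality rate cited above, a few million test miles would not be expected to produce even one fatal crash, so a single incident (or its absence) says very little about relative safety. A simple illustration, assuming for the sake of argument that fatalities arrive at a constant rate per mile, follows.

    # Expected fatalities over the cited test mileages, if autonomous cars
    # were exactly as dangerous as human drivers (a simplifying assumption).
    HUMAN_FATALITY_RATE = 1.18 / 100_000_000   # deaths per vehicle mile

    for fleet, miles in [("Uber", 2_000_000), ("Waymo", 4_000_000)]:
        expected = HUMAN_FATALITY_RATE * miles
        print(f"{fleet}: {miles:,} miles -> {expected:.3f} expected fatalities")

    # Uber: ~0.024, Waymo: ~0.047 -- well under one fatality either way, so
    # these mileages cannot yet distinguish "safer than humans" from "worse".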

Silicon Valley has proved adept over the decades at developing new technologies at lightning speed -- but as the fallout from the Theranos Inc. scandal has shown, a beta-testing, shoot-for-the-stars approach that might work for selfie apps can turn scandalous when people's lives are at stake.

That's the lesson for the auto industry. "Move fast and break things" is all very well, unless the things that end up broken are human bodies.

This column does not necessarily reflect the opinion of Bloomberg LP and its owners.

David Fickling is a Bloomberg Gadfly columnist covering commodities, as well as industrial and consumer companies. He has been a reporter for Bloomberg News, Dow Jones, the Wall Street Journal, the Financial Times and the Guardian.

To contact the author of this story: David Fickling in Sydney at dfickling@bloomberg.net.

To contact the editor responsible for this story: Paul Sillitoe at psillitoe@bloomberg.net.

©2018 Bloomberg L.P.