Conference workers speak in front of a demo booth at Facebook’s annual F8 developer conference. (Photographer: Noah Berger/AP Photo)

Facebook Failed to Hit Pause on Opening Portal to Wicked

(Bloomberg Gadfly) -- We can all agree that companies should make products that don't hurt people, whether it's cars or smartphones or snack foods.

But do software companies consider the potential harm that might arise from their products?

This week, responding to a murderer who promoted his killing on Facebook Live, CEO Mark Zuckerberg said the company would do more to limit such horrors. "We have a lot of work, and we will keep doing all we can to prevent tragedies like this from happening," he told a concert hall full of developers in San Jose, Calif.


But this isn't the first time Facebook has had to confront unspeakable acts that were streamed live. More than 60 shocking events have been broadcast in real time on Facebook, according to a Wall Street Journal tally, including the beating of a mentally disabled man in Chicago and the suicide of a Florida teenager who streamed her own death.

Facebook doesn't want this to happen with its live video technology. And companies can't be responsible for all the evils carried out on their digital properties.

But this week I wondered whether Facebook, instead of now deploying more people and technology to screen for horrific abuses streamed live, should have thought harder about the wisdom of letting the technology loose on the world in the first place.

I was surprised when I read about the background of how Facebook threw itself behind live video last year. Zuckerberg decided in a meeting to forge ahead with live video technology and ordered 100 people to work on it nonstop over two months. During the rapid development, it seemed no one at Facebook anticipated how people might abuse live video. (Facebook in a statement said it thought "long and hard" about the potential abuses of live video.) 


Sure, nearly everyone would use this technology for innocuous activities like broadcasting themselves with Chewbacca masks. But Facebook has nearly 2 billion users, and inevitably some of these people will use the technology to distribute terrible acts, without a filter, to the world. Product development is the moment to think about such potential worst-case scenarios -- not to simply clean up the messes later.

Of course, there is no silver bullet for preventing violence on or off Facebook. Most YouTube videos aren't live broadcasts, and plenty of terrors still crop up there. Live video may hold unique challenges, however. Some academics have suggested that live video technology emboldens people who want attention for violence.

It's interesting to contrast Facebook's rush into live video with the methodical way the company is testing commercials in the middle of live or prerecorded videos in its news feed.

Facebook first started letting celebrities use live video in 2015 and then opened it up to everyone within a few months. With ads that are inserted in the middle of videos, however, Facebook started last year with a handful of select companies, and earlier this year it expanded the test to prerecorded videos.

This week at Facebook's developer conference, executives said such ads were still in "early testing." And the worst-case scenario if these video ads go awry is some Facebook users or marketers get annoyed. Unlike with live video, this is not life or death.

A version of this column originally appeared in Bloomberg's Fully Charged technology newsletter.

This column does not necessarily reflect the opinion of Bloomberg LP and its owners.

Shira Ovide is a Bloomberg Gadfly columnist covering technology. She previously was a reporter for the Wall Street Journal.
