Mark Zuckerberg Is Totally Out of His Depth
(Bloomberg Opinion) -- I might be the only person on Earth feeling sorry for the big boys of technology. Jack Dorsey from Twitter, Mark Zuckerberg from Facebook, all those Google nerds: They’re monumentally screwed, because they have no idea how to tame the monsters they have created.
The way I see it, these guys -- and they are mostly guys -- were arbitrarily chosen. They started with some good ideas, some luck, great timing, got lots of people to believe in their rosy vision, and they won the unicorn lottery. Little did they know or care what problems they were creating. And now, they’re being asked to solve -- or acknowledge, or something -- some really big issues, such as what to do about people who use their platforms to meddle in elections or spread lies, paranoia, bigotry and straight-up hate.
The world expects great things of them, because they’re supposed to be geniuses. Problem is, they’re not. There’s nothing they can do except apologize, turn off their big machines and walk away. I doubt they’ll do that. Instead, they're manufacturing baloney explanations about how they’ll use more technology, or maybe more people, to handle the civic duties they had hoped to avoid.
Take Zuckerberg. He made big promises to Congress about the capacity of artificial intelligence to root out toxic content. Sadly, those promises were wildly premature -- something that Dorsey, at least, had the decency to admit:
Algorithms can’t comprehend truth. They just repeat the past. If we tell them to delete tweets that contain a specific word, that can certainly be automated. But the malevolent actors who want to blow through an automated censoring algorithm will do so easily. Until we have a model of truth that is much better than what we have now, there’s simply nothing else to say about it.
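To make that point concrete, here is a minimal sketch -- my own illustration, with a hypothetical blocklist and function name, not any platform's actual system -- of a word-based deletion filter and how trivially it is evaded:

```python
# Hypothetical blocklist; real platforms' lists are far larger,
# but the weakness is the same.
BANNED_WORDS = {"badword"}

def should_delete(tweet: str) -> bool:
    """Flag a tweet if it contains a banned word, spelled exactly."""
    words = tweet.lower().split()
    return any(word in BANNED_WORDS for word in words)

# The filter catches the literal word...
print(should_delete("this is a badword example"))   # True

# ...but trivial obfuscation blows right through it:
print(should_delete("this is a b4dword example"))   # False
print(should_delete("this is a bad-word example"))  # False
```

The filter matches only exact strings, so anyone motivated to evade it needs nothing more than a substituted character or an inserted hyphen -- which is the gap between automating a rule and comprehending truth.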
That leaves hiring humans to filter everything that emerges from the firehose of meaningless updates, cat pictures and lies, possibly with an automatically generated list of ranked things to worry about (which, to be clear, is not AI, it’s just an automatically generated list of things to worry about). Yet there are major problems here, too. For instance:
- If you use people, you’re admitting that you have a policy of censorship, and users don’t like that, especially when they get censored. (Of course you already had censorship-by-algorithm before, but it was nice -- and very inexpensive! -- to imagine that it wasn’t censorship, when it totally was.)
- You have to create a censorship policy, but you’re a rich nerd boss and you have no idea how to do that, which doesn’t feel great.
- It’s expensive to hire people to do censorship. And if you try to save money by hiring people in other countries who can’t possibly understand the context, you're asking for trouble.
- When you make your inconsistent policies public, it gets much harder to dodge responsibility for stuff like fomenting genocide. Also, those problems suddenly seem much closer and more real.
- Your investors, who had pretty high hopes for your future profits, will be disappointed. Maybe it’s time to pair up with a bank?
So, back to my sympathy. These boys are all super rich, so it's limited. But I’m imagining being their mom, feeling for them. They all started out wanting to make the world a better place using cool technology, and here they are, dealing with all of this democracy and public responsibility stuff, which they never signed up for and honestly don’t have the chops to handle.
As their fictional mom, I’d like to offer some advice: retire, step aside, maybe find a new hobby. Ask someone smarter, more educated, more thoughtful and more civic-minded to decide the future of your companies.
This column does not necessarily reflect the opinion of the editorial board or Bloomberg LP and its owners.
Cathy O’Neil is a Bloomberg Opinion columnist. She is a mathematician who has worked as a professor, hedge-fund analyst and data scientist. She founded ORCAA, an algorithmic auditing company, and is the author of “Weapons of Math Destruction.”
©2018 Bloomberg L.P.