
How to Provide Real Oversight of Social Media

As you watch yet another congressional hearing where social media CEOs awkwardly put on suits and ties to defend the indefensible to the uncomprehending, you couldn’t be blamed for feeling hopeless. Our long-standing policies for regulating traditional media have collapsed in the face of user-generated content, with networks of tens of millions of people creating it in real time at no cost. And it’s not just members of Congress who are ineffectually shaking their fists at the spread of misinformation and extremism on social media platforms; the executives of these companies themselves also seem powerless to do much more than keep abuse to a dull roar, offending defenders of free speech and defenders of civil discourse in equal measure.

Most policy proposals to keep abuses of social media in check range from bad to worse. Breaking up Big Tech may appeal to antitrust lawyers, but a YouTube that's no longer part of Google isn't any less likely to host anti-vaccination propaganda. Revoking Section 230 protections, which shield companies from legal liability for content their users post, would essentially kill the platforms that host that content.

The usual Silicon Valley response to regulation is to offer self-regulation instead: Only tech companies, the argument goes, have the skills and speed to fix what they broke. They're not wrong, but even Facebook Inc., Twitter Inc. and Google have struggled to find techniques that are both good at dampening the worst abuses at scale and still trusted enough to keep Congress off their backs.

We have a solution that resurrects a concept from the golden age of newspapers, back when they were the vaunted Fourth Estate that provided the necessary information, accountability and counterbalance to government that makes for a well-functioning democracy: the ombudsman.

Newspapers typically employed a quasi-independent ombudsman who served the broader interests of readers — of society, if you will. Inspired by similar roles in government, a newspaper ombudsman had two main functions: a reactive one (handling complaints from the public) and a proactive one (watching the organization from both inside and outside and flagging bad behavior). 

Today, ombudsmen are an endangered species in the media world as newspapers lose their central role in the public discourse. The New York Times retired the role, known as the public editor, in 2017, and it was one of the last to do so. But it’s ripe for reinvention in the digital economy. If social media companies are the “networked Fourth Estate,” can they accept the independent oversight responsibilities that come with that status?

Facebook has nodded at this with its Oversight Board, whose globally diverse members include lawyers, activists and academics (not required for membership: having ever used Facebook). It is now more than two years old and has made decisions on just seven cases out of the more than 220,000 complaints referred to it. Putting aside whether those were the right decisions (its members say they're already frustrated by the binary "leave it up/take it down" nature of their charter), it's clear that this board is too slow, too reactive, too opaque and too removed from the company's operations to make an appreciable dent in Facebook's acceptable-speech problem.

Independent ombudsmen would be much better. First, they could be proactive, not merely reactive to complaints. Second, with access to a company's engineers and technology, they could address algorithmic root causes rather than just consequences. Finally, as full-time positions with budgets for full-time staff, rather than a committee that meets quarterly, they could move fast and spot problematic patterns across hundreds of thousands of complaints. That beats counting on existing internal ethics and policy teams, which, lacking independence, risk conflicts of interest and operate with no requirement of transparency and no assurance of action.

Here’s how this could work:

  1. Once a social media company hits 25 million monthly active users in the U.S., congrats! Your success means that you’ve triggered the regulatory “public interest” threshold.
  2. Within 90 days, you must hire an independent board member who will serve as the people’s representative, aka the ombudsman. Call it “director of social governance.”
  3. The ombudsman, hired for a set term (say, two years) during which he or she cannot be fired, must be free of financial conflicts of interest.
  4. The ombudsman will have full access to the company and its technology, as any board member does. Unlike the other board members, this one issues public reports and recommendations, similar to the roles of inspectors general and the Government Accountability Office in government.
  5. Like traditional ombudsmen, this representative performs two key roles: (1) reviewing public complaints and (2) initiating investigations of structural root-cause problems in the company's algorithmic designs and business model.
  6. Once the representative issues a public recommendation, the company must respond within 48 hours if the problem can be fixed with existing software, or within 30 business days if it requires further software development or algorithmic testing.

Let’s stop kidding ourselves that the old-world, hierarchical approaches to regulating media outlets will work with social media companies. These are networks; they require network solutions. Let’s augment them with network-style pieces so that they function better. These networks are important, and they are going to be with us in one form or another forever. We have to help them learn to self-manage, the way the old Fourth Estate used to.

In one early case, the Oversight Board's members were confounded by the subtle racial politics of ethnic minorities in Myanmar, and the discussions at issue were in a language that none of the panel members speaks.

This column does not necessarily reflect the opinion of the editorial board or Bloomberg LP and its owners.

Chris Anderson is the chief executive officer of 3D Robotics. He is also the founder of the Linux Foundation's Dronecode Project, as well as the DIY Drones and DIY Robocars communities. He is a former editor in chief of Wired magazine and author of "The Long Tail" and "Free."

James Currier is a general partner at venture capital firm NFX.

©2021 Bloomberg L.P.