
YouTube’s Plan To Clean Up The Mess That Made It Rich

Extremist propaganda, dangerous hoaxes, tasered rats—YouTube is having its worst year ever. Except financially.  

Source: Bloomberg Businessweek

Susan Wojcicki, the chief executive officer of YouTube, was in a meeting on the second floor of her company’s headquarters in San Bruno, Calif., when she heard the first gunshot. It came from outside; more followed. Some of her employees ran for the exits; others barricaded themselves in conference rooms. Those eating lunch on the outdoor patio hid under the tables.

At a press conference the following day, April 4, the San Bruno police confirmed that the suspected shooter was Nasim Aghdam, 39, an enigmatic social media personality from San Diego. She’d acted alone, wounding a handful of passersby, then taking her own life. Prior to the rampage, Aghdam posted hundreds of videos on YouTube, holding forth on subjects such as veganism, bodybuilding, and animal rights. According to police, she’d grown enraged with YouTube, which she said was intentionally limiting the reach of her work and her ability to profit from it. “We will come together and heal as a family,” Wojcicki wrote on Twitter.

The incident was tragic and awful in all the ways that have become a depressing American routine. It also put an exclamation point on a dreadful stretch for YouTube, which has lurched from crisis to crisis over the past year. While Donald Trump has in one way or another kept Facebook and Twitter in the headlines—and not the good kind—YouTube has struggled to contain the fallout from the darker impulses of its vast, decentralized creative community.

Susan Wojcicki, chief executive officer of YouTube Inc. (Photographer: David Paul Morris/Bloomberg)

Somehow, things keep getting worse. PewDiePie, a Swedish comedian and top YouTube personality, made an off-color joke about Nazis. The Times of London reported that many mainstream brands were unwittingly funding white supremacists and Islamic extremists by advertising alongside their videos. The Manchester bomber reportedly used a how-to clip from YouTube to make his deadly explosive. A series of investigations showed how the company’s algorithms were serving up bizarre, grotesque videos to young children. After several mass shootings, conspiracy theorists flooded YouTube with videos falsely claiming that gun control activists had staged the massacres. Parents of children killed at Sandy Hook Elementary School filed a defamation lawsuit against Alex Jones, whose show, InfoWars, is a major presence on YouTube, with 2.3 million subscribers.

And of course there’s the case of Logan Paul, a popular YouTube personality with a penchant for shock humor. In December, while visiting Japan, Paul posted a video showing a dead body he and his buddies discovered in a forest known for suicides. During the resulting backlash, Paul apologized for his insensitivity. A few days later he posted a video in which he tasered dead rats. “No rat comes into my house without getting tased,” he said. “I hate rats.”

For years, YouTube has bragged to marketers that its laissez-faire attitude toward video creators was a feature, not a bug. The company was pioneering a form of mass entertainment more democratic, diverse, and authentic than traditional TV, its argument went, because it was unfettered by producers, network executives, or regulators. Its legions of creators fly around the internet with minimal guidance or oversight: Here are the keys to the jumbo jet, kid—knock yourself out. The ensuing string of crashes has grown difficult for its 1.5 billion monthly users to ignore.

“It’s like a Las Vegas casino saying, ‘Wow, we can’t believe people are spending 36 hours in a casino.’ It’s designed like that”

Creators and advertisers may grumble about YouTube’s imperfect editorial policies and woeful communication, but few comparable venues reach such a massive audience of youngsters. As a result, YouTube’s no-good, messy, horrifying year has also likely been massively lucrative for its parent company, Google.

YouTube, so far, has been better able to avoid political fallout than its fellow internet titans partly because politicians and media scolds don’t watch it as closely. The site skews younger than Facebook, the social network that actively helped the Trump campaign target ads, or Twitter, the president’s biggest platform. And because YouTube doesn’t look like social media, it’s tougher to recognize how its most horrifying videos spread. (You probably heard about them on Facebook or Twitter.) In the fall, when Facebook, Twitter, and Google sent lawyers instead of executives to testify before Congress about Russian meddling in the presidential election, Team Google repeatedly stressed that YouTube and its other properties aren’t really social networks and therefore can’t fall prey to the worst of the internet’s trolls, bots, or propagandists.

Much like Facebook and Twitter, however, YouTube has long prioritized growth over safety. Hany Farid, senior adviser to the Counter Extremism Project, which works with internet companies to stamp out child pornography and terrorist messaging, says that of the companies he works with, “Google is the least receptive.” With each safety mishap, he says, YouTube acts freshly shocked. “It’s like a Las Vegas casino saying, ‘Wow, we can’t believe people are spending 36 hours in a casino.’ It’s designed like that.”

That’s not how Google or YouTube see things. Over the past year, YouTube has made the most sweeping changes since its early days, removing videos it deemed inappropriate and stripping the advertising from others. But to date, both the video-sharing service and its corporate parent have struggled to articulate how their plan will make things better. Only recently, as Washington has edged closer to training its regulatory eye on Silicon Valley, did YouTube executives agree to walk Bloomberg Businessweek through the proposed fixes and explain how the site got to this point. Conversations with more than a dozen people at YouTube, some of whom asked not to be identified while discussing sensitive internal matters, reveal a company still grappling with how to balance contributors’ freedom of expression against society’s need to protect itself.

“The whole world has become a lot less stable and more polarized,” says Robert Kyncl, YouTube’s chief business officer. “Because of that, our responsibility is that much greater.”

Robert Kyncl, chief business officer for YouTube Inc. (Photographer: David Paul Morris/Bloomberg)

In interviews at the San Bruno complex, YouTube executives often resorted to a civic metaphor: YouTube is like a small town that’s grown so large, so fast, that its municipal systems—its zoning laws, courts, and sanitation crews, if you will—have failed to keep pace. “We’ve gone from being a small village to being a city that requires proper infrastructure,” Kyncl says. “That’s what we’ve been building.”

But minimal infrastructure was a conscious choice, according to Hunter Walk, who ran YouTube’s product team from 2007 to 2011. When the markets tanked in 2008, Google tightened YouTube’s budgets and took staffers off community safety efforts—such as patrolling YouTube’s notorious comments section—in favor of projects with better revenue potential. “For me, that’s YouTube’s original sin,” Walk says. (In an emailed statement, YouTube said otherwise: “Trust and safety has always been a top priority. This was true 10 years ago and it remains true today.”)

As oversight dwindled, the amount of material posted on YouTube doubled in two years. By 2010, 24 hours of video were being uploaded every minute. (Today, it’s more like 450 hours per minute.) Suddenly, YouTube needed a better system to help viewers navigate the deluge, something that would keep them from feeling overwhelmed and wandering back to the comfort of their TVs.

In 2010, YouTube hired French programmer Guillaume Chaslot, who soon began developing algorithms that could better match viewers with videos that would keep them watching. Eventually, YouTube engineers found a simple, winning formula: When a viewer finished a video, the site immediately recommended another on a similar topic with a comparable sensibility. Chaslot says the team learned it could increase engagement, and hit ad goals, by bumping up videos with a proven record of keeping viewers watching.
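Stripped of specifics, the formula Chaslot describes is a ranking problem: among related videos, promote the one with the best track record of keeping viewers on the site. A minimal sketch of that logic in Python, with field names and weights that are purely illustrative rather than drawn from YouTube’s systems:

    # Hypothetical sketch of engagement-weighted recommendation; not YouTube's actual code.
    from dataclasses import dataclass

    @dataclass
    class Candidate:
        video_id: str
        topic_similarity: float    # 0-1, how closely the topic matches the video just watched
        avg_watch_fraction: float  # 0-1, share of the video that viewers typically finish
        followon_minutes: float    # minutes of further viewing the video historically leads to

    def score(c: Candidate) -> float:
        # Relevance matters, but proven watch time dominates the ranking.
        return (0.2 * c.topic_similarity
                + 0.4 * c.avg_watch_fraction
                + 0.4 * min(c.followon_minutes / 60.0, 1.0))

    def recommend_next(candidates: list[Candidate]) -> Candidate:
        # The "up next" slot goes to whichever related video best keeps people watching.
        return max(candidates, key=score)

The feedback loop Chaslot describes follows directly from that objective: whatever already holds attention gets recommended more often, which in turn earns it still more watch time.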

“We hear feedback on both sides: people criticizing us for not doing enough, and others feeling like we’re going too far”

Over time, Chaslot saw adverse effects. Garbage often floated to the top—rants by both flat-Earthers and Holocaust deniers did well. If you watched one video starring a wild-eyed conspiracy theorist, the algorithm would feed you another, and another, and another—and on it went down a rabbit hole of untruth. “You come into that filter bubble, but you have no way out,” says Chaslot, who left the company in 2013 and now runs a project called AlgoTransparency. “There’s no interest for YouTube to find one.”

Despite YouTube’s rapid growth, advertisers stayed wary of the website, and it didn’t generate much revenue until 2010, when Kyncl arrived from Netflix Inc. Over the next several years, he turned YouTube’s amateur creator base from a weakness to a strength. Netflix had to spend hundreds of millions of dollars to license TV shows and movies, but YouTube’s users offered a bottomless reservoir of content for free. Kyncl set about identifying the most popular personalities and turning them into marketable stars.

In 2011, YouTube tweaked the rules so more creators could make money from ads that its algorithm automatically packaged with their videos. A year later it opened a studio in Los Angeles so some of the best amateurs could use high-end production equipment. YouTube also did some advertising of its own, plastering the faces of its rising stars on billboards in major cities. More and more, YouTube was starting to convince advertisers it had become the new TV. Kyncl said as much onstage at Madison Square Garden in 2015 during the company’s annual “brandcast,” at which executives showcase new YouTube programming in front of the world’s top advertisers. Google doesn’t release financial numbers for YouTube, but analysts at Morgan Stanley estimate that the service’s revenue will top $22 billion in 2019.

Unlike with traditional TV, where very little goes on the air unlawyered, top creators can achieve cultural sway without telling anyone at YouTube where they’re going, who they’re filming, or what they might be tasing. All of which, under U.S. law, protected YouTube from liability for damages stemming from the videos it distributed. It also left YouTube in a reactive position: Whenever controversies ignited, executives could do little but try to douse the flames of umbrage long after they had spread. And that bare-bones bucket brigade stood no chance of meeting the challenges of the Trump era.

An employee holding recording equipment walks past Google Inc.’s YouTube logo displayed at the company’s YouTube Space studio in Tokyo, Japan (Photographer: Kiyoshi Ota/Bloomberg)

Through the long string of YouTube fiascoes, one thing has been constant: The company has struggled to explain its decisions. It took several days, for example, to come up with a punishment for Logan Paul’s Japanese forest video. Various YouTube contributors lambasted the company for taking too long to respond. “When the nature of the content is that sensitive, and the video is trending, you expect YouTube to be more on top of their game,” says Aditi Rajvanshi, a former YouTube employee who now consults for YouTube stars. Kyncl says avoiding a snap judgment should be seen as evidence of YouTube’s attentiveness to editorial concerns, not as dereliction of duty. The deliberations over how to handle Paul were complex and protracted, he says, because the company didn’t consider the video star beyond redemption. Paul had included information about a suicide prevention hotline in the video, for instance, and had chosen not to run ads on it. “We hear feedback on both sides: people criticizing us for not doing enough, and others feeling like we’re going too far,” said director of public policy Juniper Downs.

Behind the scenes, YouTube executives acknowledged that their infrastructure-challenged megacity needed a massive police presence. They stopped Holocaust-denial videos from popping up in the recommendation feature. In March, YouTube reached out to BuzzFeed CEO Jonah Peretti for ideas, according to a person with knowledge of the exchange. (Peretti declined to comment for this article.)

Meanwhile, Wojcicki went on a listening tour, reassuring nervous advertisers. She met with top buyers at a marketing conference in late 2017 and at the Consumer Electronics Show in January. At some client meetings, YouTube brought along staffers from its content and software teams to ask how traditional TV standards departments work, says Susan Schiekofer, head of digital trading for media buyer GroupM.

“We want to be on the right side of history”

In December, Wojcicki said in a company blog post that YouTube and Google would appoint as many as 10,000 people to help cut the spread of misinformation and abusive content. (The hires represented a 25 percent hike in moderator staffing.) YouTube also pledged that a human moderator would review every video in its Google Preferred program for advertisers before any ad was attached. The goal was to reduce the risk of, for example, a cereal ad running alongside a beheading video. The problem has persisted in the months since then, however, in part because of an institutional disconnect: The staff that monitors YouTube’s content is separate from the Google team that oversees the ads.

In recent months, Marc Pritchard, the top marketer at Procter & Gamble Co. and a prominent Google critic, has met with Wojcicki multiple times. “You went to a large galaxy that was beyond what anyone had ever seen,” Pritchard recalls telling her at one point. “And I don’t think you’ve realized the impact you’ve had.”

“We want to be on the right side of history,” Wojcicki assured him.

YouTube’s growing ranks of moderators now scan the worst videos online: torture, bombs, porn. Moderators are discouraged from working for more than two hours at a time. A psychologist is on call, and group therapy sessions are available. Some moderators are contract workers in the Philippines. Some, like those on Raul Gomez’s team, work out of San Bruno.

Weeks before the Aghdam shooting, Gomez walked Bloomberg Businessweek reporters through a scenario involving anger directed at his co-workers. Gomez isn’t his real name—he insisted on a pseudonym because people in his line of work often face death threats. Nobody likes having a video taken down, and the psychologically disturbed like it even less.

Inside an office named “Gangnam Style,” Gomez projected a video on the wall. The clip was first posted last summer, soon after a Google engineer was fired for writing an inflammatory internal memo about gender imbalances in tech. Afterward, apoplectic right-wing commentators accused Google of discriminating against conservatives. The clip, produced by a popular right-wing media outlet, opened with a jaunty young woman directing viewers to her laptop, where she pulled up the Twitter pages of several Google employees. She scrolled through their posts, reading aloud certain passages while ridiculing the named individuals as dimwitted liberals.

With a grimace, Gomez said that while it may be unpleasant to see colleagues singled out for public mockery, the video didn’t violate YouTube’s harassment policy. The performer didn’t exhort her viewers to violence. And if the video were taken down, YouTube would likely face another round of censorship allegations at Breitbart News or the Drudge Report.

Not unlike their Silicon Valley peers, YouTube executives remain convinced that the long-term solution isn’t old-timey Homo sapiens but technology. During the service’s early days, it was rife with pirated videos uploaded and shared without copyright owners’ consent. Eventually, YouTube built an automated system to weed out copyright violations. The idea is that someday humans will be able to train the machines, in a similar manner, to sniff out misinformation, smut, and abuse. They’ve already made some progress. After October’s mass shooting in Las Vegas, YouTube engineers adjusted the algorithms so they recommended more videos from sources the company deemed authoritative.
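The Las Vegas adjustment, as described, amounts to re-ranking: when a major news event is unfolding, weight vetted outlets above raw engagement. A hedged sketch of that idea, with a hypothetical allowlist and field names that are assumptions, not YouTube’s implementation:

    # Illustrative only: prefer channels the platform deems authoritative during breaking news.
    AUTHORITATIVE_CHANNELS = {"wire-service", "national-broadcaster"}  # placeholder allowlist

    def rerank(candidates: list[dict], breaking_news: bool) -> list[dict]:
        def key(video: dict):
            authority = 1 if video["channel"] in AUTHORITATIVE_CHANNELS else 0
            # Normally engagement decides the order; during a breaking-news event,
            # authority is compared first and engagement only breaks ties.
            if breaking_news:
                return (authority, video["engagement_score"])
            return (video["engagement_score"],)
        return sorted(candidates, key=key, reverse=True)

The design trade-off is the one the article describes throughout: the more the ranking leans on a curated list, the more the company opens itself to accusations of editorial favoritism.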

Finding the right people to help this refinement process is proving to be a challenge. For months, YouTube has been trying to hire someone who can more clearly define its internal policies and messaging about what makes a video publishable. As of mid-April, the position remains vacant.