Is This Facebook’s ‘Big Tobacco’ Moment?

Critics say Big Tobacco once used the same playbook, and it’s fueling a whole new level of outrage against the social media giant.

Mark Zuckerberg, Facebook founder, speaks during an event. (Guillermo Gutierrez/Bloomberg)

Facebook Inc. executives have long boasted that its platforms are safe, even as they invested in ways to keep teenagers hooked and hid what they knew about the side effects. Sound familiar? Critics say Big Tobacco once used the same playbook, and it’s fueling a whole new level of outrage against the social media giant.

Facebook consistently played down its own research showing how its photo-sharing app Instagram can harm the mental well-being of its youngest users, according to a report in the Wall Street Journal. Almost a third of young teen girls told Facebook they feel worse about their bodies after scrolling through the app, documents reviewed by the newspaper showed. Despite that knowledge, Facebook is dedicating more resources to reaching even younger consumers, including developing a children’s version of Instagram.

The revelations are prompting some lawmakers to compare Facebook’s actions to a decades-long campaign by the country’s biggest tobacco companies to mislead the public about the cancerous and habit-forming effects of cigarettes. “Its executives knew about the addictive chemicals in tobacco and yet they did nothing to try and keep the product out of the hands of children,” says Representative Bill Johnson, an Ohio Republican. “They knew that if they could get children addicted early, they’d have a customer for life. It’s very much the same way—children, young people, are addicted to these platforms, and you can see report after report on the damage that’s being done.”

The long-term effects of social media are exactly what’s driving concerns about Facebook’s plan to build an Instagram for kids. The service, sometimes called Instagram Youth internally, is intended to give preteens an on-ramp to social media until they turn 13 and are allowed to join the main app. Facebook argues that kids are lying about their age to get on Instagram anyway, so a youth-oriented product—with parental controls—would be a safer alternative.

More than three dozen state attorneys general have already urged Facebook Chief Executive Officer Mark Zuckerberg to drop the project, arguing that Instagram Youth could contribute to conditions such as depression, loneliness, and anxiety. So have U.S. lawmakers and a coalition of privacy and child welfare advocates. “If Facebook goes ahead with Instagram Youth, then really what we’re saying is they’re accountable to no one,” says Josh Golin, executive director of Fairplay, a nonprofit dedicated to ending marketing aimed at children.

To understand how children’s mental well-being is affected by Instagram, Facebook surveyed tens of thousands of users and mined its own data over the past three years, according to the Journal, which based its reporting on internal Facebook research that the publication obtained. The review found that users felt pressured to present an idealized version of themselves on Instagram, and that it often led them to make negative comparisons of themselves with others. Internal researchers warned that Instagram’s design led young people toward potentially harmful content on the platform.

During a March 2021 congressional hearing, Zuckerberg wasn’t as forthcoming about the evidence around the effects of social media on mental health, boasting that online connections can help people feel less lonely. When asked whether Facebook had internal research on the impact of its platforms on kids, Zuckerberg said it was something they “try to study” before adding, “I believe the answer is yes.”

The research on social media and mental health can be ambiguous. Some studies show a link between heavy use and childhood depression, lower self-esteem, and suicidal tendencies. Other academics argue the correlation between social media use and poor mental health outcomes is weak and that other factors could be at work. Experts in both camps agree that Facebook is best positioned to conduct the highest-quality research, because it knows exactly what its users are doing on Instagram and for how long.

On Sept. 14, Karina Newton, Instagram’s head of public policy, wrote a blog post highlighting the similarly inconclusive nature of the company’s research. “Social media isn’t inherently good or bad for people. Many find it helpful one day, and problematic the next,” she wrote. Instagram is looking into ways to steer vulnerable users away from certain types of posts and “towards content that inspires and uplifts them,” she added. A Facebook representative declined to comment beyond the contents of the blog.

Big Tobacco’s strategy was, for decades, to cast doubt on public-health research. A full-page advertisement published nationwide in newspapers in January 1954 established the industry’s public messaging for the next 50 years: Smoking wasn’t a proven cause of lung cancer, and more research on cigarettes and health was needed. While the tobacco companies questioned and distorted scientific data, their own research recognized the risks. They also understood that nicotine was addictive, even as they publicly denied its effects to avoid regulation and thwart legal liability from smokers. It was a whistleblower—an executive from the Brown & Williamson Tobacco Corp.—who helped expose the industry’s secrets.

Facebook’s strategy is to make its platforms more addictive, just as cigarette companies did with additives, the company’s former director of monetization, Tim Kendall, told Congress last year. Facebook relies not just on likes and updates to keep users hooked, but also on misinformation and conspiracy theories that provoke a strong reaction, he said. “These services are making us sick.”

It took decades for the government to hold Big Tobacco to account. In 1998 a coalition of states reached a $246 billion settlement with the industry that required companies to make annual payments to the states and limited the visibility of cigarette advertising. A year later the U.S. Department of Justice sued the tobacco companies, accusing them of a racketeering conspiracy to defraud the public. In 2006, after a nine-month trial, a federal judge in Washington agreed, saying that the companies “marketed and sold their lethal product with zeal, with deception, with a single-minded focus on their financial success, and without regard for the human tragedy or social costs that success exacted.” Dozens of states passed laws banning smoking at restaurants, bars, and workplaces.

The percentage of high school students who smoked frequently plummeted to 2.6% in 2017 from 16.7% in 1997, a report from the Centers for Disease Control and Prevention showed. Still, about 34 million adults in the U.S. remain smokers, according to the CDC.

Fixing social media’s ills will require a similar shift in public awareness, says Asha Rangappa, who teaches a course on social media and information warfare at Yale University’s Jackson Institute for Public Affairs. “The idea that information can actually cause harm is not something that we as Americans can get our head around,” she says.

On Capitol Hill, Representative Johnson is working on a bill instructing the National Institutes of Health to study the mental health risks of social media and whether to apply a warning label to the tech platforms. Senators Richard Blumenthal (D-Conn.) and Marsha Blackburn (R-Tenn.) said they were in touch with a Facebook whistleblower and “will use every resource at our disposal to investigate what Facebook knew and when they knew it.”

In recent days, Facebook executives have defended the company’s actions but have yet to publicly release more internal studies. Nick Clegg, Facebook’s vice president of global affairs and communications, pledged that the company would continue to invest in research on complex issues and “improve our products and services as a result.”

Blumenthal, chair of the Senate Subcommittee on Consumer Protection, Product Safety, and Data Security, plans to hold hearings on Facebook’s knowledge of its harmful effects. “We’re at a turning point, because the analogy to Big Tobacco is very apt,” says Blumenthal. “It’s not just that they were doing harm, but they knew it and they concealed it, which is what makes it all the more hideous because people became addicted and the harm was compounded.”

©2021 Bloomberg L.P.