
Facebook’s Instagram Research Isn’t Anything Like Science

The inadequacy of the social media giant’s efforts to establish the truth should be a scandal, too.


As Facebook weathers yet another scandal, this time fueled by its internal research on the effects of Instagram, I’d like to focus on something slightly different that should be a scandal, too: the quality of that internal research.

Facebook has been pushing back against a story in the Wall Street Journal, which cited a leaked internal report suggesting that Instagram harms teenagers by fostering insecurities around “social comparison” and sometimes even suicidal thoughts. In its public-facing blog, the company featured an article titled “What Our Research Really Says About Teen Well-Being and Instagram.” It pointed out ways in which the app was found to be benign or mildly positive, and also sought to downplay the significance of the research, noting that it “did not measure causal relationships between Instagram and real-world issues,” and sometimes “relied on input from only 40 teens.”

Facebook is right on one point: Its internal research doesn’t demonstrate much of anything. This just isn’t how science is done. One never relies on a single study to establish a relationship, in part because any single experiment entails too many choices that limit its applicability. Do you study teenagers or young adults? Men or women? How do you reach them? What questions do you ask? Do you follow up after six months or 12? And those are just the design choices, before any analysis begins. Only when multiple studies with different approaches converge on the same answer can one start to draw strong conclusions.

That said, one can reach a pretty strong conclusion by observing the way Facebook has done research over the years: It’s afraid to know the truth. After all, why not do more studies? If it’s possible that your product is leading young women to kill themselves, wouldn’t you want to explore further, at least to clear your name? Why not let outside researchers use your data, to get a better answer faster? Instead, Facebook allows only tiny internal studies and tries to keep them under lock and key. Even if they leak, the company maintains deniability: The results are far from conclusive.

Facebook is not alone in its aversion to self-knowledge. Something similar happened at Google not so long ago, when internal researchers had the audacity to think they were able to do critical work, writing a paper on how large language models like those used at the company can be environmentally damaging and even racist. In that case, Google fired two of the founders of its Ethical Artificial Intelligence team, Timnit Gebru and Meg Mitchell.

I’m not so naïve as to think that public embarrassment will compel the big tech companies to allow access for real science on the impact of their products. Making that happen is a job for Congress or the Federal Trade Commission. In the meantime, as the likes of Facebook and Google sweep through academic departments with job offers for newly minted Ph.D.s and even seasoned professors, those who choose to take the money should recognize that what they’re getting into isn’t scientific inquiry — and still leak whatever they can.

This column does not necessarily reflect the opinion of the editorial board or Bloomberg LP and its owners.

Cathy O’Neil is a Bloomberg Opinion columnist. She is a mathematician who has worked as a professor, hedge-fund analyst and data scientist. She founded ORCAA, an algorithmic auditing company, and is the author of “Weapons of Math Destruction.”

©2021 Bloomberg L.P.