Meta’s Nudity-Blurring Feature To Deter Teen Sextortion Draws Fire

A new feature designed to blur naked photos sent to teens in private Instagram messages is coming under attack even before it’s rolled out.

Meta said it would deploy machine-learning technology to blur any nude image initially received by a user under 18.

The feature, which Meta Platforms Inc. plans to begin testing next month, is intended to deter online sextortion crimes, which have led to more than two dozen teen suicides over the past two years. But users can override it: One tap on the blurred image, and everything is revealed.

“This is too little too late and, quite frankly, doesn’t do anything to protect kids,” said Lina Nealon, vice president and director of corporate strategic initiatives at the National Center on Sexual Exploitation, an anti-pornography organization. That nude image remains “just a mere click away,” she said.

Arturo Bejar, who spent six years working on child safety tools at Meta, called the new feature “woefully insufficient.” Bejar left in 2021 after flagging to executives that teenagers, including his own daughter, were receiving too many unwanted sexual advances on the platform. He testified at a Senate hearing in November that the company wasn’t doing enough to prevent harm against its youngest users. 

“What teen boy isn’t going to unblur a naked photo sent by a girl?” Bejar said in an interview. “How many kids need to die before there are measures that effectively prevent sextortion? What’s at stake here is teenagers’ lives.”

Meta’s move to fight sextortion was announced earlier this month after Bloomberg News made inquiries about the platform’s efforts to combat the crime, in which scammers posing as teenage girls on Instagram or Snapchat get their targets to send nude images. The scammers then blackmail them. The announcement came just days before Bloomberg published a story about the issue, focusing on the death of Jordan DeMay, a 17-year-old football and basketball star from Marquette, Michigan.

The company said it would start testing nudity-protection features for teens, including encouraging them to think twice before sending or forwarding explicit images, and warning them to scrutinize profiles because people might not be who they say they are. Meta also said it would deploy machine-learning technology to blur any nude image initially received by a user under 18.

Gail Kent, a policy director for Messenger and Instagram Direct Message at Meta, defended the company’s position on allowing teens to view the nude images if they choose to. She said in an interview that users have an expectation of privacy when sending messages, and that the technology isn’t good enough to detect the context in which an image has been shared or to determine the age of the person in a photo.

It is against Meta’s policies for users of any age to share naked images in public spaces, such as Facebook or Instagram feeds, but private messages are harder to police, Kent said. 

“There is an expectation that people have when sending an SMS or a telephone call that the company facilitating that is not proactively reading or listening to these conversations or making judgment calls on what is and isn’t being said,” said Kent, who spent two decades working in law enforcement before joining Meta five years ago.

Meta CEO Mark Zuckerberg at a congressional hearing in Washington in January. Photographer: Kent Nishimura/Bloomberg

The new features are designed to add a layer of friction when sending or receiving explicit images, Kent said. “This is all about getting people to stop and think,” she said. “There are no silver bullets in this. We are just going to have to continue to evolve.”

The company plans to roll out the new feature globally for all Instagram users under 18 in the coming months and to explore similar options for Facebook Messenger.

New Mexico Attorney General Raúl Torrez, who has filed a lawsuit accusing Meta of facilitating child sexual abuse, said the new measures “fall significantly short of what is needed.” He said law enforcement has pleaded with Meta for years to address sextortion schemes on its platforms. 

“It is deeply disturbing that the company will only lift a finger once its executives become concerned about bad publicity,” Torrez said. “Meta’s superficial efforts are indicative of a broader reluctance to implement the safeguards necessary to truly protect young individuals on Instagram, Facebook and WhatsApp.”

Meta is the first social media platform to offer a nudity-blurring feature for minors. Snap Inc. declined to share any details about whether it is working on a similar feature, and TikTok didn’t respond to requests for comment.

©2024 Bloomberg L.P.