Social media

Meta unveils features to combat teen ‘sextortion’

Meta’s head of global safety said “sextortion” has become more common across social media platforms, including Instagram

In this photo illustration, the Meta logo is displayed on the screen of an iPhone in front of a Facebook logo on October 29, 2021 in Paris, France.
Illustration by Chesnot/Getty Images

Meta is ramping up its efforts to fight sextortion on Instagram with the launch of new features that are designed to help prevent young users from getting blackmailed into sharing nude photos.

Starting Thursday, Instagram will automatically block follow requests sent to teenage users from accounts that display certain “scammy behaviors,” according to Meta. It will hide users’ following and follower lists from those accounts with the aim of preventing scammers from using these lists to blackmail their targets.

The social media platform is also launching a global rollout of its “nudity protection” feature in Instagram direct messages, which blurs out any images that the platform detects as containing nudity. Users sending or forwarding images with nudity detected will also be prompted to think twice. This feature will be automatically on for users under 18, according to Meta’s latest announcement. The company said it is also removing the ability to take screenshots or screen recordings of disappearing photos and videos sent through Instagram direct messages or Facebook Messenger.

Antigone Davis, Meta’s head of global safety, said sextortion — which is defined by the FBI as “a crime that involves adults coercing kids and teens into sending explicit images online” — has become more common across social media platforms.

“So many of the people committing this crime are motivated by money, which means we’re finding more people are engaged in this type of crime, and we’ve seen an increase in it,” she told NBC News.

The announcement marks Meta’s latest effort to combat child safety issues across its platforms amid continued scrutiny from lawmakers and parents, who worry social media platforms do not have enough protections in place to keep their kids safe from harm.

At a Senate online child safety hearing in January, Meta CEO Mark Zuckerberg apologized to parents who said Instagram contributed to their children’s suicides or exploitation.

The social media giant has been criticized for not endorsing the Kids Online Safety Act (KOSA) and the Children and Teens’ Online Privacy Protection Act (COPPA 2.0), which were passed by the Senate in July.

The legislation, which has not yet passed the House, is among a flurry of proposed bills that lawmakers have considered in recent years that center on American kids’ privacy and safety online.

A spokesperson for Meta previously told NBC News that the tech giant supports “the development of age-appropriate standards for teens online” but believes “federal legislation should require app stores to get parents’ approval whenever their teens under 16 download apps.”

The new features are also the latest safeguards put into place for younger users on the app. Last month, Meta announced it will begin automatically placing all users under 18 into “teen accounts” with stricter privacy settings.

Haley Hinkle, policy counsel at Fairplay, a nonprofit working with families who have lost children to harms related to social media platforms, said features like those unveiled Thursday are Band-Aids for a systemic problem that requires more action from lawmakers.

“I don’t think the safety features do enough when ultimately what families and advocates are calling for is federal regulation that will require the platforms to actually, at every stage, implement safe design for young people,” Hinkle said.

Although platforms have repeatedly taken “incremental steps in order to improve public relations and try to appease lawmakers,” Hinkle said, she questions the efficacy of leaving it up to private companies to come up with solutions on their own timeline.

“Blocking screenshots on a platform is fine, but a determined, organized criminal could just use a workaround,” Hinkle said. “They could use another device and take a photo of the photo that they’re then going to use to extort a child.”

Meanwhile, Instagram will display an in-feed sextortion education PSA to users across the United States, Canada, the United Kingdom and Australia.

The video, according to Davis, will tell viewers what sextortion is, how to spot it and how to seek support if they are targeted. It’s common for shame to prevent teens from asking for help if they’re facing sextortion, she said, so the campaign aims to reassure victims that they are not at fault.

Hinkle said she would like to see platforms focus on preventing the crime without putting the onus on kids or their parents. But it’s hard to discern whether Meta is doing enough, she said, when the inner workings of the platform remain a “black box” to the public.

“If they really want to promote meaningful change and online safety for kids and teens, they can stop opposing federal solutions like the Kids Online Safety Act,” Hinkle said. “We need transparency and auditing requirements like those in the bill in order to actually assess for ourselves as a society, as advocates and as families.”

This story first appeared on NBCNews.com.

Copyright NBC News