Meta deploys tools to combat sextortion on Instagram

By Staff Writer, ITWeb
Johannesburg, 12 Apr 2024
Meta unveils AI tools to combat sextortion on Instagram.

Facebook parent company Meta is rolling out a set of new features and tools aimed at protecting users, especially teens, from sextortion on Instagram.

The efforts include using machine learning (ML) to detect nudity in direct messages (DMs), restrictions on messaging teens, educational resources, and industry collaboration to stop sextortion across platforms.

Sexual extortion, or sextortion, involves persuading a person to send explicit photos online and then threatening to make the images public unless the victim pays money or performs sexual favours.

Meta highlights how sextortion scammers may use DMs to share or request intimate images, which they then use for extortion.

It notes that the new tools aim to prevent unwanted exposure and educate people on the risks involved.

According to the statement, at the core of Meta's approach is a "nudity protection" feature for Instagram DMs that uses on-device ML to automatically blur images containing nudity. The feature will be on by default for users under 18 globally, while adults will see a prompt encouraging them to turn it on.

When nudity is detected, warnings are issued to both senders and recipients, and anyone forwarding a nude image is prompted to reconsider. Recipients see a blurred image with options to block or report the sender, alongside safety tips on the risks of sharing intimate images. The nudity protection feature will "soon start testing" in Instagram DMs, the social media company explains.
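The flow described above can be pictured as a simple decision rule: an on-device model scores an incoming image, and if protection is active and the score crosses a threshold, the image is shown blurred with safety options. The sketch below is purely illustrative and assumed; the function names, the `nudity_score` field, and the threshold value are hypothetical and do not reflect Meta's actual implementation.

```python
# Illustrative sketch only: names, threshold and data shapes are assumptions,
# not Meta's real on-device pipeline.
from dataclasses import dataclass

NUDITY_THRESHOLD = 0.8  # assumed confidence cutoff for the on-device model


@dataclass
class IncomingImage:
    pixels: bytes
    nudity_score: float  # confidence produced by an on-device ML classifier


def protection_enabled(age: int, opted_in: bool) -> bool:
    # Per the article: on by default for under-18s; adults must opt in.
    return age < 18 or opted_in


def handle_image(img: IncomingImage, age: int, opted_in: bool) -> dict:
    # Blur flagged images and surface block/report options, as described.
    if protection_enabled(age, opted_in) and img.nudity_score >= NUDITY_THRESHOLD:
        return {
            "display": "blurred",
            "warning": "This image may contain nudity.",
            "options": ["view", "block", "report"],
        }
    return {"display": "normal", "warning": None, "options": []}


# A flagged image sent to a teen account is blurred with safety options.
print(handle_image(IncomingImage(b"...", 0.93), age=16, opted_in=False)["display"])
```

The key design point the article emphasises is that the classification happens on the device itself, so the image contents need not be sent to a server for analysis.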

"We've spent years working closely with experts to understand the tactics scammers use to find and extort victims online, so we can develop effective ways to help stop them," Meta says.

Meta is also restricting how adults can message or interact with teen accounts, even when they are already connected.

It may also hide teen accounts from accounts flagged for potential sextortion behaviour, for example in follower lists and search results. The company is testing notifications for people who may have interacted with removed sextortion accounts, and is connecting teens who report issues to child safety helplines.

The tech company says it is also developing technology to identify potential sextortion accounts based on certain signals and take "precautionary steps" to limit their ability to find and interact with teens.