Hundreds of thousands of women and children – society’s most vulnerable – are being trafficked, lured into financial extortion and sextortion by criminals who are increasingly weaponising technology to find, trap and silence them.
Artificial intelligence (AI), the internet and digital platforms have become the trafficker’s most powerful tools, enabling abuse at a scale previously impossible and making detection harder for authorities and platforms alike. However, governments are fighting back using the very same tools.
While the scale of the crime is hard to pin down because of its underground nature, the United Nations Office on Drugs and Crime’s (UNODC’s) Global Report on Trafficking in Persons 2024 records at least 162 nationalities trafficked to 128 countries.
Of these, 32% involved African citizens, underscoring the continent’s role as the most geographically-dispersed origin of trafficking victims. This is despite more than 180 nations having ratified or acceded to the UN Protocol to Prevent, Suppress and Punish Trafficking in Persons.
The Exodus Road, a global non-profit with offices in five countries, says that last year alone it participated in securing freedom for 679 survivors, the arrest of 262 perpetrators, and prevention and education for 8 120 citizens and law enforcement officers, across 89 criminal human trafficking cases.
Young and vulnerable
In the preface to UNODC’s report, executive director Ghada Waly notes that in 2022, women and girls represented 61% of detected trafficking victims globally, with most cases linked to sexual exploitation.
The number of children identified as victims is growing sharply, up by a third in three years, the report states. In multiple regions, children now account for the majority of identified victims.
The number of girls identified as victims has surged by 38%, Waly says, noting that “human trafficking continues to target the vulnerable, and we see this in persistent as well as emerging trends”.
The National Centre for Missing & Exploited Children (NCMEC) tracks crimes against children as far as the clandestine nature of exploitation allows.
In a recent blog, Patricia Davis, its executive director of communications, details the devastating scale of children being enticed into sadistic self-harm or financial sextortion.
Total reports to the NCMEC CyberTipline jumped 77% in the first half of 2024 compared with the same period the year before.
Reports of financial sextortion – still a relatively new crime type – spiked 70% to 23 593 over the same timeframe.

Unlike sexually-motivated offenders, financial sextortion perpetrators are driven purely by money.
“These alarming increases are a wake-up call,” says John Shehan, senior vice-president overseeing NCMEC’s Exploited Children Division. “We need parents, caregivers, educators and communities to stay alert and talk openly with children about online risks.”
Sadistic online exploitation has become one of the most disturbing crime types NCMEC tracks.

Violent online groups are targeting children on publicly-available messaging platforms, including Discord, Roblox and gaming sites, Davis writes.
Offenders initially befriend children before forcing them to record or live-stream acts of harm against themselves or others.
“One mother told NCMEC an online predator made her daughter cut his screen name into her arm with a razor blade, then told her she was a good girl and that they loved her,” Davis writes. The child reportedly responded: “I love you, too!”
“These guys are very scary,” the mother told NCMEC. “Just the power they have over my daughter is mind-blowing.”
Never-ending stalking ground
The US Department of State’s 2025 Trafficking in Persons Report warns that traffickers are exploiting technological innovation and are now using digital communication to drive online commercial sexual exploitation and sex trafficking, including of children.
“They have also significantly expanded online scam operations that run on forcing trafficking victims to carry out scams that defraud individuals around the world,” it states.
AI is enabling an “unprecedented” level of social engineering and tailored exploitation, says the department. “This represents a dangerous advancement where traffickers use automated systems to help identify and trap victims at an unprecedented scale.”
Offenders are also using generative AI (GenAI) to create explicit images using a child’s face harvested from public social media or school postings, then using those images to blackmail the child, NCMEC says.
Reports of GenAI-related child sexual exploitation soared from 6 835 in the first half of 2024 to 440 419 in the same period of 2025, says NCMEC.
The UNODC report highlights that victims of sexual exploitation are also being manipulated online, with more than 50 such cases recorded in 2022. Several of these involved children under the age of 14.
“Offenders often use fake social media accounts to convince their targets, mostly teenage boys, to send them sexually-explicit images, then immediately begin demanding money,” writes Davis. “We’re aware of more than three dozen teens who’ve taken their lives as a result of being victimised by this crime.”
NCMEC began tracking GenAI in 2023, and the growth has been staggering, says Shehan. “It’s important that we stay on top of these emerging threats to warn the public and adjust our strategies for protecting children.”
Despite trafficking for forced criminality – including online scams – being the third most common form of detected victimisation, criminal justice systems continue to focus predominantly on prosecuting sexual exploitation rather than this emerging crime category, says UNODC.
Beyond direct exploitation, criminal networks are also recruiting young professionals for their digital skills to run sophisticated scams.
Infrastructure used to launder money through online gaming has also been converted for financial fraud, including crypto-currency scams, investment schemes and romance-investment fraud, UNODC notes.
Fighting back
Many governments are now leveraging AI to deepen cross-border partnerships and tackle online scams, says the US Department of State, which identifies several ways AI can serve as a prevention and intervention tool.
Social media protection: AI tools can detect and flag harmful or inappropriate content, identify unlawful activities, and flag suspicious conversations and job advertisements in real time.
Victim awareness: AI can drive targeted multilingual awareness campaigns, adapt messaging to local contexts, and use content provenance technology to track the origin of digital material, building a more informed and resilient public.
Operational intelligence: AI technologies can help identify and assist trafficking victims by analysing online commercial sex advertisements, extracting language patterns, processing digital evidence, and enabling law enforcement to make data-driven operational decisions.
* The NCMEC’s CyberTipline accepts reports online, while Childline’s number is 080 005 5555. People Opposed to Women Abuse also offers support, and if you or someone you know has been raped or abused, you can contact the TEARS Foundation.