Digital identity and know your customer (KYC) technologies are facing heightened scrutiny as human trafficking continues to rise, generating over $150 billion annually and flooding global financial systems with illicit funds, according to LexisNexis Risk Solutions.
A LexisNexis Risk Solutions study found 60% of South African organisations have seen an increase in AI-facilitated financial crime – above the 56% global average.
The data analytics and technology firm estimates more than 27 million people worldwide are victims of human trafficking, including individuals coerced into acting as "money mules" – people who transfer or move illegally obtained funds on behalf of criminals or organised crime networks.
“Criminals launder the proceeds of their crimes by using money mules,” said Jason Lane-Sellers, director of identity and fraud, EMEA, at LexisNexis Risk Solutions. “Traffickers often target vulnerable individuals, promising legitimate work that could alleviate their financial difficulties or help them support their families. They manipulate these people into creating accounts to filter funds obtained through fraudulent schemes, such as scams.”
According to LexisNexis Risk Solutions, money mules play a critical role in networked fraud schemes and enable the laundering of funds across borders, affecting financial institutions and consumers worldwide, including in SA.
Despite the availability of advanced detection tools, many organisations have yet to fully implement technology capable of identifying mule accounts.
Lane-Sellers noted that digital identity attributes – such as device and e-mail intelligence – can help detect attempts to open multiple accounts across banks. Behavioural analytics can also uncover signs of coaching or manipulation during the account creation process.
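The device and e-mail intelligence described above can be illustrated with a minimal sketch: if the same device fingerprint or e-mail address appears across multiple account applications, those applications warrant review. All field names here (`device_id`, `email`, `account_id`) are hypothetical; production digital identity networks draw on far richer, cross-institution signals.

```python
from collections import defaultdict

def flag_shared_attributes(accounts):
    """Group account applications by device fingerprint and e-mail address,
    then flag any attribute reused across more than one application.
    Hypothetical illustration of device/e-mail intelligence."""
    by_attribute = defaultdict(set)
    for acct in accounts:
        by_attribute[("device", acct["device_id"])].add(acct["account_id"])
        by_attribute[("email", acct["email"])].add(acct["account_id"])
    # Any device or e-mail linked to multiple applications is suspicious.
    return {attr: ids for attr, ids in by_attribute.items() if len(ids) > 1}

applications = [
    {"account_id": "A1", "device_id": "dev-9", "email": "x@example.com"},
    {"account_id": "A2", "device_id": "dev-9", "email": "y@example.com"},
    {"account_id": "A3", "device_id": "dev-4", "email": "y@example.com"},
]
print(flag_shared_attributes(applications))
```

Here the shared device `dev-9` links applications A1 and A2, and the shared e-mail links A2 and A3, chaining all three into one suspicious cluster.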
Transaction risk management systems powered by machine learning and advanced analytics offer another layer of protection, capable of identifying suspicious patterns in account activity and transactions in real time.
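As a toy stand-in for the machine learning models such systems use, a statistical baseline can already show the principle: score each new transaction by how far it deviates from the account's historical amounts, and route high scores for review. This is a sketch under that simplifying assumption, not the vendor's actual method.

```python
import statistics

def transaction_risk_score(history, amount):
    """Score a transaction by its distance (in standard deviations)
    from the account's historical transaction amounts."""
    mean = statistics.fmean(history)
    stdev = statistics.pstdev(history) or 1.0  # guard against zero variance
    return abs(amount - mean) / stdev

history = [120.0, 95.0, 110.0, 105.0, 130.0]
print(transaction_risk_score(history, 115.0))   # typical amount: low score
print(transaction_risk_score(history, 2500.0))  # sudden large transfer: high score
```

A real system would combine many such features (velocity, counterparties, geography) inside a trained model rather than a single z-score, but the review threshold logic is the same.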
AI – a tool and a threat
Artificial intelligence (AI) and machine learning can process vast amounts of data flowing through the financial system, detecting hidden patterns and connections invisible to human analysts. This allows for the identification of fraudulent activity and mule accounts before illicit transactions are completed.
However, criminals are also leveraging AI. According to Lane-Sellers, they use social and digital media to post fraudulent job ads, lure victims and create convincing fake interfaces, AI-generated content and reviews that mimic legitimate operations.
“Organisations and individuals must exercise caution and independently verify any unexpected opportunities. Remember to avoid any links provided by potential fraudsters and instead type the relevant information into a search engine yourself to validate it.”
Scam centres
Lane-Sellers warned that scam operations have become increasingly professional and specialised. “Scam centres are increasingly specialising in certain fields such as customer manipulation, identity creation, establishing mule accounts or executing romance scams. These scam centres can operate internationally and are sadly often staffed by human trafficked individuals.”
To counter these evolving threats, LexisNexis Risk Solutions advises organisations to adopt a layered defence strategy: leveraging digital identity verification at onboarding, employing behavioural intelligence during user interactions and applying machine learning to monitor transactions in real time.
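The three layers above can be sketched as a single decision function: an onboarding check, a behavioural check and a transaction check, each contributing reasons for escalation. Every field name here is a hypothetical placeholder for the richer signals a real deployment would use.

```python
def layered_risk_check(applicant, session, transaction):
    """Minimal sketch of a layered defence: onboarding identity signals,
    in-session behavioural signals and transaction monitoring.
    All input fields are hypothetical illustrations."""
    reasons = []
    # Layer 1: digital identity verification at onboarding.
    if applicant.get("device_seen_elsewhere"):
        reasons.append("device linked to prior applications")
    # Layer 2: behavioural intelligence during the session.
    if session.get("pasted_credentials") or session.get("remote_access_tool"):
        reasons.append("behavioural signs of coaching or remote control")
    # Layer 3: transaction monitoring against the account's typical activity.
    if transaction.get("amount", 0) > applicant.get("typical_amount", 0) * 10:
        reasons.append("transaction far above typical amount")
    return ("review" if reasons else "allow", reasons)

decision, why = layered_risk_check(
    {"device_seen_elsewhere": True, "typical_amount": 100},
    {"pasted_credentials": False, "remote_access_tool": True},
    {"amount": 5000},
)
print(decision, why)
```

Because each layer contributes independently, an account that slips past onboarding can still be caught by behavioural or transaction signals later in the customer journey.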
“By integrating these analytical methods at each stage of the customer journey, organisations can effectively combat and mitigate risks posed by criminal operations,” said Lane-Sellers.
Only by embedding advanced analytics and AI across the customer life cycle can organisations detect, disrupt and ultimately reduce the impact of networked criminal operations, he added.