Money mules are an important part of the criminal money laundering pipeline. They help channel the proceeds from fraud and other criminal activities to the criminals themselves while obfuscating the process. The UK’s Financial Conduct Authority has estimated that more than $40 billion is laundered every week, with only 1% intercepted and seized.
A new report (PDF) from the P20 group (a collaborative thought leadership ‘sandbox’ seeking cooperation and joint action in the non-competitive areas of the global payments industry) sets out recommendations on how to tackle the money mule aspect of money laundering.
Money mules are often dismissed as a rather benign part of criminal activity – but they are an important part of channeling illicit funds from the source of crime to the ultimate destination – whether that is criminal gangs, terrorists or even adversarial nation states. P20’s argument is fundamentally twofold. Firstly, if money mules can be disrupted, the movement of illegal money can be disturbed; and secondly, law enforcement and banks can follow the money back to the criminal source and forward to the criminal destination.
Key to achieving this is a greater use of AI-based mule detection systems. At a high level, this is analogous to modern cybersecurity thinking: assume you have already been breached and concentrate on response to mitigate effect. Finding, tracking and disrupting mules recognizes that the crime has been committed and then responds to the movement of the money to mitigate the amount received by the criminal organizers.
“The widespread reliance on money mules for money laundering gives banks and other payment service providers an opportunity to identify a variety of financial crimes. Finding the money mules and following the money can help fight fraud, identity theft and cybercrime, while preventing stolen money ending up in criminals’ hands,” explains Duncan Sandys, CEO at P20.
There are two areas where AI-based systems can help financial institutions: application fraud and payment fraud. The first uses Know Your Customer (KYC) principles to detect attempts to open accounts for fraudulent purposes. The second uses behavioral monitoring to detect accounts that have drifted into fraudulent use.
The report notes that there are three types of mules: complicit, witting, and unwitting. Complicit mules know what they are doing and may open multiple accounts to scale their operation. This is where KYC principles can prevent the development of the mule channel by declining the account.
Witting mules may suspect something is wrong but ignore their instincts. Unwitting mules may genuinely be scammed into believing they are doing something legal, and proceed simply to earn a little ‘pocket money’. In both cases, behavioral analysis on the account can be used to highlight suspicious activity.
“If a person has a job that could justify weekly deposits of, say $1,000, and that suddenly jumps to $3,000 or $4,000,” P20’s president Peter Radcliffe told SecurityWeek, “that should throw up a warning sign that something has changed – maybe they’re laundering money.” AI-based behavioral tracking on accounts can help identify pattern changes that may be suspicious.
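The kind of pattern change Radcliffe describes can be sketched as a simple baseline comparison. This is an illustrative assumption about how such a check might work, not P20's actual method; the function name, window size and ratio threshold are invented for the example.

```python
# Hypothetical sketch: flag an account when its latest weekly deposit
# jumps well above its historical baseline. Thresholds are illustrative.

def flag_deposit_spike(weekly_deposits, baseline_weeks=8, spike_ratio=2.5):
    """Return True if the latest weekly deposit far exceeds the account's norm."""
    if len(weekly_deposits) <= baseline_weeks:
        return False  # not enough history to establish a baseline
    # Average the preceding baseline_weeks deposits, excluding the latest week
    baseline = sum(weekly_deposits[-baseline_weeks - 1:-1]) / baseline_weeks
    latest = weekly_deposits[-1]
    return baseline > 0 and latest / baseline >= spike_ratio

# An account averaging ~$1,000/week that suddenly receives $3,500:
history = [1000, 950, 1100, 1020, 980, 1050, 990, 1010, 3500]
print(flag_deposit_spike(history))  # the jump well above baseline is flagged
```

A production system would of course use richer features (counterparties, velocity, geography) and a trained model rather than a fixed ratio, but the principle is the same: learn an account's normal behavior and surface deviations for human review.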
“Application of machine learning to anti-money laundering (AML) is a relatively new approach,” notes the report. “Most AML products on the market are built on a combination of rulesets and list-based screening. So, while more advanced techniques have been applied widely to combatting fraud, AML is still in the early stages of utilizing machine learning to improve prevention and detection efforts.”
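The ruleset and list-based screening the report describes as typical of current AML products can be illustrated with a minimal sketch. The watchlist entries, thresholds and rule wording here are invented assumptions, not any vendor's actual logic.

```python
# Minimal sketch of rule/list-based AML screening: check a transaction's
# counterparty against a watchlist and apply a couple of fixed rules.
# All names and thresholds are illustrative.

WATCHLIST = {"ACME SHELL LTD", "XYZ EXCHANGE"}

def screen_transaction(tx):
    """Apply simple rules and list screening; return a list of alert reasons."""
    alerts = []
    if tx["counterparty"].upper() in WATCHLIST:
        alerts.append("counterparty on watchlist")
    if tx["amount"] >= 10_000:
        alerts.append("amount at or above reporting threshold")
    elif tx["amount"] % 1_000 == 0 and tx["amount"] >= 9_000:
        alerts.append("round amount just below reporting threshold")
    return alerts

print(screen_transaction({"counterparty": "Acme Shell Ltd", "amount": 9000}))
```

The limitation the report points to is visible even in this toy version: static rules only catch patterns someone thought to encode, whereas machine learning can surface behavioral anomalies no rule anticipated.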
Radcliffe is not overly worried about AI’s traditional weakness – false positives. The behavioral monitoring doesn’t trigger any automatic action; it simply flags situations that may require further investigation. The problem is that this process is not yet applied consistently across all financial institutions.
The report also notes two reasons mule detection may be more consistent in the UK than in the US. Firstly, the UK has fewer, larger banks that can afford the resources necessary to put such systems in place. Secondly, UK banks have adopted a Contingent Reimbursement Model. “Firms,” notes the report, “must take reasonable steps to detect accounts which may be, or are being used, to receive Authorized Push Payment (APP) scam funds. This means that if banks don’t combat mules, then they may end up being liable for the fraud loss rather than the sending bank.”
Apart from using AI to detect money mules internally, P20 also urges greater mutual collaboration and information sharing between the banks. “Some of these mules may have 50 or more accounts paying in smaller amounts across 50 organizations,” said Radcliffe. “Why does anybody need 50 bank accounts?” Greater information sharing could detect issues like this and lead to better overall anti-money laundering results.
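Radcliffe's 50-accounts scenario shows why no single bank can see the whole picture: each institution observes only one account with modest activity. A hedged sketch of the pooled view such information sharing could enable, assuming institutions shared some common account-holder identifier (in practice this would likely be hashed or otherwise privacy-preserving; the names and threshold here are invented):

```python
# Hypothetical illustration of cross-bank information sharing: pooling
# (holder_id, institution) records reveals a holder with accounts at many
# institutions, invisible to any single bank. Identifiers are invented.

from collections import Counter

def accounts_per_holder(shared_records):
    """Count distinct institutions per account-holder identifier."""
    counts = Counter()
    for holder_id, institution in set(shared_records):  # dedupe pairs
        counts[holder_id] += 1
    return counts

records = [("h1", "Bank A"), ("h1", "Bank B"), ("h1", "Bank C"),
           ("h2", "Bank A")]
flagged = [h for h, n in accounts_per_holder(records).items() if n >= 3]
print(flagged)  # holder "h1" stands out with accounts at 3 institutions
```

Only the aggregate count needs to be shared for this signal to emerge, which is why privacy-preserving collaboration schemes are a recurring theme in financial-crime information sharing.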
Historically, the tools developed to address financial crime have been deployed in silos, limiting the ability to harness the full potential of the enterprise, the industry and public/private partnership. “A focused, collaborative approach to money mules could not only address this crucial link in crime networks but could serve as a model for broader cross-discipline collaboration to fight financial crime,” suggests the report.
P20 was conceived in 2016 as a joint UK/US group to focus on payments globally. It was officially launched in 2017 with the help of Jack Lew, the former US Treasury Secretary, and the UK City Minister, Steve Barclay. It has offices in London, UK (the financial services capital of the world), and Atlanta, USA (the payments capital of the U.S., processing 75% of the $7.4 trillion in annual payments).