Is the use of AI by banks in the fight against money laundering a blessing or a curse for the bank customer?

The fight against money laundering and terrorist financing has gained momentum in recent years. Financial institutions, such as banks, play a crucial role in this as "gatekeepers" of our financial system. They are legally obliged to monitor their clients' transactions and report unusual transactions to the Belgian Financial Intelligence Processing Unit (CTIF-CFI). To analyze the immense flow of transactions, banks are increasingly using advanced technology, particularly artificial intelligence (AI). But what does this evolution mean for you as a bank customer? And what guarantees are there that these smart systems do not violate your rights?

The legal reporting obligations of banks

Banks are, by virtue of anti-money laundering legislation, required to conduct ongoing customer due diligence. This means that they must identify and verify their clients and establish a risk profile. They must then review their clients' transactions against this profile. When a transaction does not fit the expected pattern, or when the bank otherwise suspects that the funds have an illegal origin or are related to terrorism, it must report this to CTIF-CFI.

At the heart of the matter is the assessment of whether a transaction is "suspicious". The legislature has largely left this assessment to the subjective judgment of the bank itself. It is the bank that must determine whether there is a suspicion. If that suspicion exists, the bank is obliged to report; if it does not, the bank is not permitted to report.

Artificial intelligence as the new sniffer dog

To support and automate this subjective assessment, banks are increasingly turning to AI. Broadly speaking, two types of systems can be distinguished:

  1. Rule-based systems (expert systems): These systems, which can be considered a rudimentary form of AI, operate on the basis of predefined rules and decision trees ("if x, then y"). For example, consider a rule that generates an alert for every cash deposit over a certain amount.
  2. Artificial Neural Networks (ANNs): This is the more advanced form of AI, mimicking the workings of the human brain. These systems, which also underlie technologies such as ChatGPT, are trained on huge amounts of transaction data. Using "deep learning", the algorithms learn to recognize, on their own, patterns that may indicate money laundering, without a programmer having to write out every rule. The system can thus calculate the probability that a transaction is suspicious and classify it accordingly (the sketch after this list contrasts the two approaches).
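
The contrast between the two approaches can be made concrete with a minimal sketch in Python. Everything in it is hypothetical: the EUR 10,000 threshold, the field names and the helper extract_features are invented for illustration, and the trained model is merely assumed to expose a scikit-learn-style predict_proba method.

  # Hypothetical feature extraction; a real system would use far more signals.
  def extract_features(transaction):
      return [transaction["amount"]]

  # Rule-based check: a fixed, human-written rule ("if x, then y").
  # The EUR 10,000 threshold and field names are purely illustrative.
  def rule_based_alert(transaction):
      return (transaction["type"] == "cash_deposit"
              and transaction["amount"] > 10_000)

  # Model-based score: a trained classifier estimates the probability that
  # the transaction is suspicious; no rule is written out by hand. The model
  # is assumed to follow scikit-learn's predict_proba convention.
  def model_based_score(transaction, model):
      return model.predict_proba([extract_features(transaction)])[0][1]

  # Example: the rule fires deterministically on the same input every time.
  print(rule_based_alert({"type": "cash_deposit", "amount": 15_000}))  # True

The rule can be explained line by line; the trained model returns only a score, which is precisely where the transparency problem discussed below begins.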

The promise of AI is great: more efficient and effective detection of suspicious transactions, benefiting society as a whole. The systems can detect deviations from normal patterns (anomalies) that a human analyst might overlook.
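
As a hedged illustration of such anomaly detection (a minimal sketch, not any bank's actual system), the following Python fragment uses scikit-learn's IsolationForest to flag a transaction amount that deviates from a customer's usual pattern; all figures are invented.

  import numpy as np
  from sklearn.ensemble import IsolationForest

  # Invented history of a customer's usual transaction amounts (EUR).
  history = np.array([[120.0], [80.0], [95.0], [110.0], [105.0], [90.0]])

  # Fit an isolation forest on the customer's normal pattern.
  detector = IsolationForest(contamination=0.1, random_state=0).fit(history)

  # Score two new transactions: 1 means "normal", -1 means "anomaly".
  new_transactions = np.array([[100.0], [9500.0]])
  print(detector.predict(new_transactions))  # e.g. [ 1 -1]

Note that the model only reports that the EUR 9,500 transaction is unusual for this customer; it says nothing about why, which is exactly the transparency problem addressed in the next section.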

The legal risks: "automation bias" and lack of transparency

However, the deployment of AI is not without its dangers. One of the biggest risks is the lack of transparency of deep learning models. Because the algorithms are self-learning, it becomes extremely difficult, if not impossible, to figure out exactly why the system flagged a particular transaction as suspicious. We refer to this as the "black box" effect.

This raises fundamental legal questions. An AI system can make statistical connections, but it cannot reason legally. It cannot understand the context of a transaction or weigh what is reasonable and fair in a specific case.

When a bank blindly relies on the output of an AI system - a phenomenon known as "automation bias" - the consequences for the client can be dramatic:

  • Unjustified blocking of accounts: A "false positive" (a transaction wrongly labeled as suspicious by the AI system) can lead to your accounts being frozen.
  • Termination of banking relationship: Banks may decide to end the relationship with a client based on an opaque AI score.
  • Violation of privacy: Your transaction data is analyzed by complex algorithms, which constitutes a significant interference with your privacy and with the protection of your personal data.
  • Discrimination: If the data used to train the AI model is biased, the system may unfairly discriminate against certain groups of people.

The AI Act: a shield for citizens?

In 2024, the European Union passed the AI Act, a regulation imposing rules on the development and use of artificial intelligence. The AI Act takes a risk-based approach. Systems that pose a "high risk" to the fundamental rights of citizens are subject to strict requirements, including mandatory human oversight and stringent standards for data quality and transparency.

Remarkably, the AI systems that banks use to combat money laundering do not appear to qualify as "high risk" under the current text of the AI Act. This is a troubling finding, given the potentially serious consequences for clients of a wrong decision. In our view, the legislature has misjudged this.

Although the AI Act prohibits systems that estimate the risk that a person will commit a criminal offense based on profiling, it makes an exception for systems that serve "to support human judgment." However, the line between a support system and a de facto decision making system is paper-thin and not clearly delineated in the law.

Conclusion: AI as a tool, not a judge

Artificial intelligence can undoubtedly be a useful tool in the fight against money laundering. However, it should never be more than that. The final decision on whether to report a transaction as suspicious, and any consequences attached to it, must always rest with a human being. A human analyst must critically evaluate the system's output, carry out additional investigation and properly substantiate the decision.

Banks deploying AI would do well to voluntarily apply the AI Act's stringent requirements for high-risk systems. This includes ensuring robust human oversight and avoiding blind reliance on technology.

Are you as a business owner or individual facing an incomprehensible decision from your bank? Is your account blocked or your banking relationship terminated without clear justification? It is possible that an opaque algorithm is at the root of your problem. Our law firm has the expertise to assist you. Our lawyers analyze your file, seek clarification from the financial institution and defend your rights both in and out of court.


Joris Deene

Attorney-partner at Everest Attorneys

Contact

Questions? Need advice?
Contact Attorney Joris Deene.

Phone: 09/280.20.68
E-mail: joris.deene@everest-law.be
