
22 April 2020, 14:27





As a tech guy who has spent almost a year at the RegTech startup DX Compliance, I felt like writing about the technology involved, or which can be involved, in this sector. This is my first post and I hope to write more going forward. Innovation and technology are empowering forces for effective startups, and with this as a driving force we always strive to be innovative when it comes to tech and its implementation. There are many technologies that can be leveraged for AML compliance. With a Master’s in Data Analytics, I am highly aware of the power of data-driven decisions. Initially, when we started building the rule-based element of our systems, I questioned its capabilities, but the reality is that this is what clients want. Below are a few of the technologies we are using to provide compliance as a competitive advantage.

  1. Graph Analytics
  2. Elastic Stack
  3. Cloud Services
  4. Artificial Intelligence

Graph Analytics

As of today, most transaction monitoring systems are rule-based: they fetch data from a relational database and run it against a set of rules. Relational databases have certain limitations when it comes to handling large volumes of data and extracting real insights from it. We are using a unique approach to handling false positives and increasing accuracy using graph analytics. Transaction behavioural patterns are also observed with some powerful graph algorithms. An MLRO should be 100% confident when making decisions, with supporting reasons for generating a SAR or discounting a hit. These extracted graph features can also be used to create rules.
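To make this concrete, here is a minimal sketch (plain Python; the account names and amounts are hypothetical) of extracting simple graph features such as fan-in, fan-out and total inflow from a transaction edge list. Features like these can feed both rules and models:

```python
from collections import defaultdict

# Hypothetical transaction edge list: (sender, receiver, amount).
transactions = [
    ("acc_A", "acc_M", 900.0),
    ("acc_B", "acc_M", 950.0),
    ("acc_C", "acc_M", 870.0),
    ("acc_M", "acc_X", 2600.0),
]

def graph_features(edges):
    """Compute per-account fan-in, fan-out and total inflow/outflow."""
    feats = defaultdict(lambda: {"fan_in": 0, "fan_out": 0,
                                 "inflow": 0.0, "outflow": 0.0})
    for sender, receiver, amount in edges:
        feats[sender]["fan_out"] += 1
        feats[sender]["outflow"] += amount
        feats[receiver]["fan_in"] += 1
        feats[receiver]["inflow"] += amount
    return dict(feats)

features = graph_features(transactions)
# acc_M receives from many counterparties and forwards almost everything
# on -- a fan-in/fan-out pattern that a flat relational query can miss.
print(features["acc_M"])
```

A production system would compute richer features (e.g. PageRank or community membership) with a dedicated graph engine, but the principle is the same: node-level features derived from transaction structure.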

Elastic Stack

We had a use case for global search functionality in our application. Though it seems simple, there is a lot of querying happening behind the scenes, and as the complexity of these queries increases, execution time grows and the user experience suffers. That is when we realised the importance of integrating Elasticsearch with our application. It is a NoSQL database which provides APIs for full-text search and renders JSON responses. It also offers advanced queries for detailed analysis, allowing many types of searches: structured, unstructured, geo, metric and so on. Apart from quick search, the tool offers complex analytics and many advanced features, with horizontal scalability, reliability and multi-tenant capability for real-time search. Along with the search capabilities, it also offers a wide range of analytical abilities over the Elasticsearch data.
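As an illustration, a global search like the one described can be expressed with Elasticsearch's query DSL. The sketch below builds a `bool` query combining a full-text `multi_match` with a structured `term` filter; the index, field and status names are hypothetical:

```python
import json

def build_global_search(text, status=None,
                        fields=("name", "description", "reference")):
    """Build an Elasticsearch bool query: full-text match plus optional filter."""
    query = {
        "query": {
            "bool": {
                "must": [
                    {"multi_match": {"query": text, "fields": list(fields)}}
                ]
            }
        },
        "size": 20,
    }
    if status is not None:
        # Filter clauses don't affect relevance scoring and are cacheable.
        query["query"]["bool"]["filter"] = [{"term": {"status": status}}]
    return query

body = build_global_search("suspicious wire transfer", status="open")
# This body would be POSTed to /<index>/_search, e.g. with the official
# Python client: es.search(index="alerts", body=body)
print(json.dumps(body, indent=2))
```

Pushing the full-text matching and filtering down into Elasticsearch is what keeps the response times flat as the query complexity grows.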

Kibana is a data visualisation tool which comes with the Elastic Stack: a powerful, real-time front-end dashboard offering histograms, line graphs, pie charts and more. Plus, you can use the Vega grammar to design your own visualisations, and these charts are easily configurable. You can perform advanced time-series analysis on your Elasticsearch data with curated time-series UIs, describing queries, transformations and visualisations with powerful, easy-to-learn expressions. It also helps to detect the anomalies hiding in your Elasticsearch data and to explore the properties that significantly influence them, using unsupervised machine learning features. This enhances the reporting abilities of the compliance team.


Cloud Services (Amazon Kinesis)

For early-stage companies, striking a balance between ongoing maintenance costs and performance is essential. Thankfully, a wide variety of cloud services ensure that this is no longer as challenging as it once was. From maintaining code repositories to more sophisticated solutions such as data streaming and machine learning, there are fantastic services to make use of. Thanks are due to the well-known AWS Activate programme for supporting startups by issuing credits. Setting up a service on the cloud is just a matter of seconds. There is a special reason for mentioning cloud services here, and you may wonder how they can be used for compliance.

Compliance should happen in real time, and decision making should be aligned with the real-time data in your systems. When I was brainstorming around this scenario, I came across Apache Kafka, which helps to maintain real-time data pipelines. Banks and payment service providers mostly use transactional databases. We are using Amazon Kinesis, which is slightly easier to maintain than Apache Kafka. In our case, the PostgreSQL WAL (write-ahead log) is pushed to a Kinesis stream using wal2json and then on to an S3 bucket, from which it can be consumed by other applications such as analytical databases. This enables decision making with real-time data while also allowing online learning to train machine learning models.
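To sketch one step of that pipeline: each change record emitted by wal2json can be turned into a Kinesis record, with the table name as partition key so changes to one table stay ordered. The stream name, table and columns below are hypothetical, and the boto3 call is shown but not executed:

```python
import json

# Example wal2json output for one committed transaction (abridged).
wal_message = json.dumps({
    "change": [{
        "kind": "insert",
        "schema": "public",
        "table": "transactions",
        "columnnames": ["id", "account", "amount"],
        "columnvalues": [42, "acc_M", 900.0],
    }]
})

def to_kinesis_records(raw):
    """Convert a wal2json message into Kinesis PutRecords entries."""
    records = []
    for change in json.loads(raw)["change"]:
        row = dict(zip(change["columnnames"], change["columnvalues"]))
        payload = {"op": change["kind"], "table": change["table"], "row": row}
        records.append({
            "Data": json.dumps(payload).encode("utf-8"),
            # Partition by table so per-table ordering is preserved.
            "PartitionKey": change["table"],
        })
    return records

records = to_kinesis_records(wal_message)
# With boto3 this would be shipped as:
#   boto3.client("kinesis").put_records(StreamName="wal-stream",
#                                       Records=records)
```

Downstream consumers (the S3 sink, analytical databases, model training) then read the same stream independently, which is what makes the real-time decisioning possible.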

This is one of the use cases where cloud services come in handy. There is a lot to extract from these services which brings down the cost of maintaining the compliance teams. Mainly, the maintenance costs of the complicated systems have been reduced dramatically. These resources are instead better used in decision making and risk mitigation within an organisation.

Artificial Intelligence

There is a buzz around the use of artificial intelligence in transaction monitoring. One thing we should keep in mind is that artificial intelligence doesn’t simply mean machine learning; machine learning is a subset of artificial intelligence. Recently I have read many articles in which some people support and others oppose the use of machine learning in RegTech. Striking the right chord at the right time is the key to using any tool in RegTech. There are many ways in which we can implement artificial intelligence in transaction monitoring.

Diagram of tree nodes in Artificial Intelligence used for Transaction Monitoring

After brainstorming on this topic, we came up with a unique approach: feature engineering of the dataset on which we train our machine learning models. As I mentioned earlier, we use graph analytics to extract graph features for nodes (i.e. accounts). This sets the context for the machine learning models. The data is updated in real time, and we have developed a training mechanism so that the models respond to every change in the incoming data and can learn from it.
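A training mechanism like that can be sketched as online learning: the model is updated example by example as new graph-derived features arrive, rather than being retrained in batch. Below is a minimal, library-free logistic-regression sketch with made-up features and labels; a production system would of course use a proper ML library:

```python
import math

class OnlineLogReg:
    """Tiny online logistic regression updated one example at a time (SGD)."""

    def __init__(self, n_features, lr=0.5):
        self.w = [0.0] * n_features
        self.b = 0.0
        self.lr = lr

    def predict_proba(self, x):
        z = self.b + sum(wi * xi for wi, xi in zip(self.w, x))
        return 1.0 / (1.0 + math.exp(-z))

    def update(self, x, y):
        """One SGD step on a single labelled example (y in {0, 1})."""
        err = self.predict_proba(x) - y
        self.w = [wi - self.lr * err * xi for wi, xi in zip(self.w, x)]
        self.b -= self.lr * err

# Hypothetical per-account features: (normalised fan-in, normalised inflow),
# with label 1 meaning a confirmed suspicious pattern.
stream = [
    ((0.9, 0.8), 1), ((0.1, 0.2), 0), ((0.8, 0.9), 1), ((0.2, 0.1), 0),
] * 50  # replay the stream to mimic many incoming events

model = OnlineLogReg(n_features=2)
for x, y in stream:
    model.update(x, y)  # the model reacts to every change in incoming data
```

The point of the sketch is the update loop: each change event from the data pipeline can trigger a single incremental model update.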

Setting the context for machine learning in this way is hugely important. Once this is done, there are multiple powerful algorithms, such as gradient boosting and AdaBoost, which can be used to build the classifiers. These algorithms deserve a special mention because of the accuracy and precision that can be achieved with them; moreover, they need less training data than many other approaches. They are based on the idea known as the wisdom of the crowd. Such a machine learning model should be stacked alongside other systems, such as rule-based monitoring, which helps to reduce the false positive rate significantly.
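One way the stacking with rule-based monitoring might look (the rule, thresholds and score source here are purely illustrative; the score would come from a trained classifier such as a gradient-boosted model): a rule fires a hit, and the model score is then used to confirm or discount it, cutting false positives while keeping an auditable reason for the MLRO.

```python
def rule_hit(txn):
    """Illustrative rule: flag large transfers above a fixed threshold."""
    return txn["amount"] >= 10_000

def triage(txn, model_score, discount_below=0.2):
    """Combine rule output with a model score into an auditable decision."""
    if not rule_hit(txn):
        return {"alert": False, "reason": "no rule hit"}
    if model_score < discount_below:
        # Rule fired, but the model finds the behaviour unremarkable.
        return {"alert": False,
                "reason": f"rule hit discounted (score={model_score:.2f})"}
    return {"alert": True,
            "reason": f"rule hit confirmed (score={model_score:.2f})"}

# A hit the model considers benign is discounted instead of raised:
print(triage({"amount": 12_000}, model_score=0.05))
# A hit the model agrees with is escalated for review:
print(triage({"amount": 12_000}, model_score=0.85))
```

The rules stay authoritative (nothing is alerted without a rule hit), while the model prunes the noise, which is where the false-positive reduction comes from.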


There is plenty of technology around that can be used to enhance the power of your teams and mitigate risk. This has been a small attempt to address some of these technologies, which are powerful and quite useful. Finally, there is more that needs to be done in RegTech, and so much scope for helping companies.
