Before you start, you will need the following:
You need a template; for this example, let's name it AML_test.
You need to create API keys in the console (later referred to as apiKey:apiSecret).
You need a test file.
How to create a template
Click Create template, select the one-time type, and give it a name.
Template
The template is built using only native sensors (executed within the rules engine).
You can upload this file into your environment; no further sensors are needed, as they ship with the engine.
The template also comes with a set of default variables, so you can test it directly in the debugger.
We can test the template by running it in debug mode, say once every second (using the periodic setting). Even though the template is configured to run once, we can change that setting while debugging a task:
This template will later be part of a subflow (policy rule), which can then be combined with other subflows.
So, let's create a parent template which will call this one:
Note that the Subflow node consumes a stream of data (the variables setting points to stream data), and that the template this sensor calls is AML_test:
Note that the conditional evaluator node lets you test whether the calling subflow (policy rule) has evaluated the data as RISK, like this:
${nodes.AML_FundTransfer.rawData.Risk} == "True"
This can later be enhanced by sending results to a Kinesis stream, calling other functions, or storing the input data and results in another database.
Here is the file you can upload into your environment:
In case you have a bulk file that you would like to use for tests or in production, it first needs to be transformed into JSON format.
For instance, this is a bulk file with only one data entry:
{
  "data": [
    [
      {
        "resource": "AML_Resource",
        "Txns": 35,
        "Field": 100,
        "Transaction_Value": 0.1,
        "Instrument": "Wire",
        "Transaction_Type": "Transfer",
        "Monthly_Revenue": 0.04,
        "Transaction_Date": "2022-11-10 00:00:00",
        "AvgMonthlyTransactionValue": 17142.8571428571,
        "Avg12MonthTransactionValue": 271428.5714285714,
        "ClientGroup": "Platinum",
        "PeerGroup": 17142.8571428571,
        "TransactionFlow": "Dr",
        "TransactionCounterparty": "Beneficiary",
        "CounterpartyLocation": "HK",
        "TaxHaven": "Yes",
        "BeneficialOwnerResidence": "HK",
        "CustomerStatus": "A",
        "Transaction_Value_Monthly": 1,
        "Transaction_Volume": 1,
        "Avg_Monthly_Transaction_volume": 0.75,
        "Last_two_days_trn_value": 922000.0,
        "DrDate": "2022-11-10",
        "CrDate": "2022-11-10"
      }
    ]
  ]
}
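The exact transformation depends on what your bulk file looks like. As an illustration only, assuming the bulk data is a CSV whose column names match the fields above (the sample CSV text and the `bulk_to_payload` helper below are hypothetical, not part of the platform), a minimal Python sketch could wrap each row into the expected `{"data": [[...]]}` structure:

```python
import csv
import io
import json

# Hypothetical sample with a few of the columns from the entry above;
# a real bulk file would carry all of them.
bulk_csv = """resource,Txns,Transaction_Value,Instrument
AML_Resource,35,0.1,Wire
"""

def cast(value):
    """Best-effort numeric cast; non-numeric fields stay strings."""
    for conv in (int, float):
        try:
            return conv(value)
        except ValueError:
            pass
    return value

def bulk_to_payload(csv_text):
    """Wrap every CSV row as a JSON object inside {"data": [[...]]}."""
    rows = [{key: cast(val) for key, val in row.items()}
            for row in csv.DictReader(io.StringIO(csv_text))]
    return {"data": [rows]}

payload = bulk_to_payload(bulk_csv)
print(json.dumps(payload, indent=2))
```

Writing the result to a file with `json.dump` then gives you the input file used in the curl test below.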
Imagine that this file is stored on your laptop under the name dataInput.json.
After that, you can test the template with this curl command:
curl --user apiKey:apiSecret -H "Content-Type: application/json" -X POST -d @./dataInput.json https://<yourdomain>/rules/v1/templates/AML_Parent/run
The outcome of the test run can be seen in the task logs (note that the last line says that we are in RISK).
So, that's it. If you have an ETL process running, you can simply extend it with this call and let the rest happen in our system. In other blogs, you will see how we can first send data to a pub/sub (for instance Kinesis), consume messages from there, and forward them to our platform.
In some of the following posts, we shall see how to extend this use case with case management and analytics reports (see the figure below), so stay tuned and have fun!
A full video with an end-to-end test is below: