My Experience as a Data Tester for Banking Services

The Fun Called Data Testing

As a Data Tester, you get to work on interesting data sets. At the same time, the job goes well beyond data testing, and a tester can contribute to many other areas. The data testing work I have been doing for one of the Big Four consulting firms over the last year has been fun and interesting, and that is what this article is about.

Ellicium Solutions' client, a Big Four consulting firm, helps banks in the US stay compliant with the federal government's regulations. To remain compliant, banks must submit reports of specific data to the government in a FED-prescribed format, and our client supports them through this report submission.

For this audit firm, data testing was the most important task, and that is where my role began: I led a testing team through an entire report submission period.

Data Testing Approach for Banking Services

Based on the rules and instructions published by the FED, we needed to validate the correctness of the data submitted by the bank. The validations we generally perform check figures to 1 or at most 2 decimal places, but in the banking industry you must validate precision up to 5 decimal places.

Hence, testing becomes the most crucial part of any banking project. We had to be very careful, as a mismatch even at the 5th decimal place could lead to severe repercussions. Precision is just one simple example; we had to perform many such "not to be missed" validations, because correctness was the most important aspect of the work.
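To illustrate why the 5th decimal place matters, here is a minimal Python sketch (not the project's actual code, and the figures are made up) showing how two values can agree at 2 decimal places yet fail the stricter banking-grade check:

    from decimal import Decimal

    def matches(expected: Decimal, actual: Decimal, places: int) -> bool:
        """Return True if the two values agree when rounded to `places` decimals."""
        quantum = Decimal(1).scaleb(-places)          # e.g. 0.00001 for places=5
        return expected.quantize(quantum) == actual.quantize(quantum)

    # Hypothetical figures: identical at 2 decimals, different at 5.
    reported  = Decimal("3.14159")
    submitted = Decimal("3.14162")

    print(matches(reported, submitted, places=2))   # True  -- would pass a typical check
    print(matches(reported, submitted, places=5))   # False -- fails the banking-grade check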

On the other hand, the rules were very numerous, so we had to automate the process. The rules, as well as the entire data set, were stored in Oracle. We therefore wrote SQL procedures that would read each rule individually, perform its calculations, and validate the output against the submitted data. Writing these procedures was the most challenging and exciting part of this project!
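Our checks actually lived in the database as SQL procedures, but the control flow is easy to sketch. The snippet below is a simplified Python analogue of that loop, with an assumed table layout, column names, and connection details that exist only for illustration:

    import oracledb  # assuming the python-oracledb driver; the real checks were PL/SQL procedures

    # Hypothetical connection details and table/column names, for illustration only.
    conn = oracledb.connect(user="tester", password="***", dsn="bankdb")
    cur = conn.cursor()

    # Each rule row is assumed to hold an id, a SQL expression that recomputes the
    # figure from the source data, and the value the bank reported for it.
    cur.execute("SELECT rule_id, calc_sql, reported_value FROM report_rules")
    failures = []

    for rule_id, calc_sql, reported_value in cur.fetchall():
        cur.execute(calc_sql)                 # recompute the figure from the raw data
        (calculated,) = cur.fetchone()
        if round(calculated, 5) != round(reported_value, 5):   # 5-decimal precision check
            failures.append((rule_id, calculated, reported_value))

    print(f"{len(failures)} rules did not match the reported figures")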

Data Testing Dashboard

To get an overview of all this, we created a simple dashboard that showed the correctness and quality of the data against the rules. We would cross-check the figures on the dashboard with those in the database, and the failure % was the key measure that had to be validated correctly. Since the rules were inter-related and had already been tested thoroughly, this became an easy job for us.
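A hedged sketch of that cross-check: recompute the failure percentage from the rule results in the database and compare it with the figure shown on the dashboard. The variable names and the dashboard value here are placeholders, not project data.

    # Hypothetical rule results pulled from the database: True = rule passed.
    rule_results = [True, True, False, True, False, True, True, True, True, True]

    failed = sum(1 for passed in rule_results if not passed)
    failure_pct = round(100 * failed / len(rule_results), 2)

    dashboard_failure_pct = 20.00   # value read off the dashboard (placeholder)

    # The dashboard figure is accepted only if it matches the recomputed one.
    assert failure_pct == dashboard_failure_pct, (
        f"Dashboard shows {dashboard_failure_pct}%, database says {failure_pct}%"
    )
    print(f"Failure % verified: {failure_pct}%")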

As mentioned earlier, banks submit all this data to the FED as XML files, so testing and validating the correctness of these XML files was vital. To automate the entire process, we created Python scripts that tested the XML files against the database and reported the mismatches in a spreadsheet.
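The script below is a stripped-down sketch of that idea rather than the actual project code: parse the submission XML, look up each field's expected value in a dictionary standing in for the database query results, and write any mismatch to a spreadsheet-friendly CSV file. The XML structure, field names, and file paths are all assumptions.

    import csv
    import xml.etree.ElementTree as ET

    # Stand-in for values fetched from the Oracle database, keyed by field name.
    db_values = {"TOTAL_ASSETS": "1523467.81245", "TIER1_CAPITAL": "98213.00412"}

    # Hypothetical submission file with <Field name="..."><Value>...</Value></Field> elements.
    tree = ET.parse("submission.xml")
    mismatches = []

    for field in tree.getroot().iter("Field"):
        name = field.get("name")
        xml_value = field.findtext("Value")
        db_value = db_values.get(name)
        if xml_value != db_value:
            mismatches.append({"field": name, "xml": xml_value, "database": db_value})

    # Mismatches are reported in a simple spreadsheet-friendly CSV file.
    with open("mismatches.csv", "w", newline="") as out:
        writer = csv.DictWriter(out, fieldnames=["field", "xml", "database"])
        writer.writeheader()
        writer.writerows(mismatches)

    print(f"{len(mismatches)} mismatches written to mismatches.csv")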

What is next?

The common perception of data testing is that testers do not develop anything; they just validate. Based on my experience, this perception does the tester's work an injustice. You need hands-on development skills if you want to test bulk data or speed up your testing process.

Being hands-on in SQL has helped me a lot in most of my projects. Coming back to the project discussed in this article, after the first report submission I realised one important thing: for every report, the same set of specified tasks must be performed repetitively, and only the calculations or rules change from report to report. So I am currently working on a generalized framework for testing various reports. I will complete it soon and hopefully write another article with more details about it. Stay tuned.
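The framework is still work in progress, but the rough idea can be sketched in a few lines of Python: keep the repetitive steps (loading data, running checks, reporting) fixed, and plug in the report-specific calculations through configuration. The names and data below are illustrative, not the real design.

    from typing import Callable, Dict

    # Report-specific calculations are registered against a report name; everything
    # else (loading, looping, reporting) stays the same from report to report.
    RULES: Dict[str, Dict[str, Callable[[dict], float]]] = {
        "SAMPLE_REPORT": {
            "total_assets": lambda row: row["cash"] + row["loans"] + row["securities"],
        },
    }

    def run_report(report_name: str, rows: list, reported: Dict[str, float]) -> list:
        """Apply every rule for the report and return the names of the ones that fail."""
        failures = []
        for rule_name, calc in RULES[report_name].items():
            calculated = sum(calc(row) for row in rows)
            if round(calculated, 5) != round(reported[rule_name], 5):
                failures.append(rule_name)
        return failures

    # Toy data: one balance row and the value the bank reported for the same figure.
    rows = [{"cash": 100.0, "loans": 250.5, "securities": 149.5}]
    print(run_report("SAMPLE_REPORT", rows, {"total_assets": 500.0}))   # [] -> everything matches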