My Experience as a Data Tester for Banking Services
The Fun Called Data Testing
As a data tester, you get to work on some interesting data sets. At the same time, a data tester's job goes beyond just testing data; testers can contribute in many other areas. The data testing work I have been doing for one of the Big Four consulting firms over the last year has been both fun and interesting. In this article, I talk about that work.
Ellicium Solutions' client, one of the Big Four consulting firms, helps banks in the USA stay compliant with the federal government's rules. To stay compliant, banks must submit reports of specific data to the government, in the format specified by the FED. Our client's work helps banks with these report submissions.
For this audit firm, data testing was the most important task, and that is where my role began: I led a testing team for an entire report submission period.
Data Testing Approach For Banking Services
Based on the rules and instructions published by the FED, we needed to validate the correctness of the data submitted by the bank. In most domains, numeric validations only need to match to one or at most two decimal places. In banking, however, values must match to five decimal places, which makes testing the most crucial part of any banking project. We had to be very careful, as a mismatch even at the fifth decimal place could lead to severe repercussions. This is just one simple example; there were many such "NOT TO BE MISSED" validations that we had to perform, because correctness was the most important aspect.
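To make the precision point concrete, here is a minimal sketch of a five-decimal-place comparison. It is illustrative only (the function name and rounding mode are my own assumptions, not the project's actual code), but it shows why naive float comparison is not enough at this precision:

```python
from decimal import Decimal, ROUND_HALF_UP

def matches_at_precision(expected, actual, places=5):
    """Compare two values after rounding both to `places` decimal places.
    Values are parsed via str() to avoid binary float artefacts."""
    q = Decimal(1).scaleb(-places)  # e.g. Decimal('0.00001') for places=5
    return (Decimal(str(expected)).quantize(q, rounding=ROUND_HALF_UP)
            == Decimal(str(actual)).quantize(q, rounding=ROUND_HALF_UP))

# A difference at the 5th decimal place is flagged as a mismatch...
print(matches_at_precision("1.000010", "1.000020"))  # False
# ...while a difference beyond the 5th place still passes.
print(matches_at_precision("1.000010", "1.000012"))  # True
```

Using `Decimal` rather than `float` keeps the comparison exact at the required precision, which matters when a fifth-decimal mismatch has regulatory consequences.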
On the other hand, there were a large number of rules, so we had to automate the process. Both the rules and the entire data set were stored in Oracle, so we wrote SQL procedures that read each rule individually, performed the calculations, and validated the output against the data. Writing these procedures was the most challenging and interesting part of the project!
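The actual validation logic lived in Oracle SQL procedures, but the rule-driven loop they implement can be sketched in Python. Everything here is hypothetical: the rule IDs, column names, and sample row are illustrative, not the real FED rules or schema:

```python
# Hypothetical rule-driven validation loop. The real implementation was an
# Oracle SQL procedure; rule IDs and column names here are illustrative only.
rules = [
    # (rule_id, calculation over the row, column holding the reported value)
    ("R1", lambda row: row["assets"] - row["liabilities"], "net_position"),
    ("R2", lambda row: round(row["interest"] / row["principal"], 5), "rate"),
]

def validate(rows):
    """Apply every rule to every row; return (rule_id, row_id) for failures."""
    failures = []
    for row in rows:
        for rule_id, calc, expected_col in rules:
            if calc(row) != row[expected_col]:
                failures.append((rule_id, row["row_id"]))
    return failures

sample = [{"row_id": 1, "assets": 100.0, "liabilities": 40.0,
           "net_position": 60.0, "interest": 5.0, "principal": 100.0,
           "rate": 0.05}]
print(validate(sample))  # [] -- all rules pass for this row
```

The key idea carried over from the SQL procedures is that each rule is data, not code: adding a new FED rule means adding an entry, not rewriting the loop.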
Data Testing Dashboard
To get an overview of all this, we created a simple dashboard. It showed the correctness and quality of the data against the rules, and we would cross-check the figures on the dashboard with those in the database. The failure percentage was the key measure that had to be validated correctly. Since the rules were tested thoroughly and everything was inter-related, this became an easy job for us.
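The failure percentage itself is a simple ratio, but it is worth pinning down exactly what is being divided by what when you cross-check a dashboard against the database. A minimal sketch, assuming the metric is failed checks over total checks (my assumption, not a documented formula from the project):

```python
def failure_rate(total_checks, failed_checks):
    """Failure % as shown on a dashboard: failed checks over total checks,
    rounded to two decimal places. Guard against an empty check run."""
    if total_checks == 0:
        return 0.0
    return round(100.0 * failed_checks / total_checks, 2)

# 37 failed validations out of 2000 executed checks
print(failure_rate(2000, 37))  # 1.85
```

Recomputing this independently from the raw database counts is how you verify the dashboard figure rather than trusting it.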
As mentioned earlier, banks submit all this data to the FED as XML files, so it was vital to validate the correctness of those files. To automate this process, we created Python scripts that tested the XML files against the database and reported the mismatches in a spreadsheet.
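A stripped-down sketch of that XML-vs-database reconciliation, using only the standard library. The element names, attribute, and sample data are hypothetical placeholders, not the actual FED report layout:

```python
# Illustrative XML-vs-database reconciliation. Element names and the
# db_rows shape are hypothetical, not the real FED report schema.
import csv
import xml.etree.ElementTree as ET

def compare_report(xml_text, db_rows, out_path):
    """Compare each <item id="...">value</item> against db_rows {id: value};
    write any mismatches to a CSV spreadsheet and return them."""
    root = ET.fromstring(xml_text)
    mismatches = []
    for item in root.iter("item"):
        item_id = item.get("id")
        xml_value = (item.text or "").strip()
        db_value = db_rows.get(item_id)
        if db_value != xml_value:
            mismatches.append((item_id, xml_value, db_value))
    with open(out_path, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["item_id", "xml_value", "db_value"])
        writer.writerows(mismatches)
    return mismatches

xml_text = ('<report><item id="A1">10.00000</item>'
            '<item id="A2">20.00000</item></report>')
db = {"A1": "10.00000", "A2": "20.00001"}
print(compare_report(xml_text, db, "mismatches.csv"))
```

Writing the mismatches to CSV mirrors the project's approach of handing testers a spreadsheet they can review and attach to the test report.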
What is next?
The basic perception people hold of data testing is that testers do not develop anything; they just validate. Based on my experience, this perception does an injustice to the tester's work. If you want to test bulk data or speed up your testing process, you need some hands-on development skills. For me, being hands-on in SQL has helped a lot in most of my projects.
Coming back to the project discussed in this article: after the first report submission, I realised one important thing. For every report, the same set of tasks has to be performed repeatedly; only the calculations or rules change from report to report. So I am currently developing a generalised framework for testing various reports. I will complete it soon and hopefully write another article with more details. Stay tuned…
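One possible shape for such a framework, purely as an illustration of the idea (the real framework is still in progress, and the function, rule names, and sample data below are my own assumptions): the repetitive workflow stays fixed, and each report plugs in only its own rule set.

```python
# Illustrative sketch of a generalised report-testing framework: the
# load/validate/summarise workflow is shared, only the rules vary per report.
def run_report_tests(report_name, rows, rules):
    """Apply a report-specific rule set to the data rows and summarise
    the outcome; the surrounding workflow is identical for every report."""
    failed = [(rule_id, row["row_id"])
              for row in rows
              for rule_id, check in rules
              if not check(row)]
    total = len(rows) * len(rules)
    return {"report": report_name, "checks": total,
            "failed": len(failed), "failures": failed}

# Each report supplies only its own rules; everything else is reused.
sample_rules = [("positive_balance", lambda r: r["balance"] >= 0)]
result = run_report_tests("sample report", [{"row_id": 1, "balance": 5.0}],
                          sample_rules)
print(result["failed"])  # 0
```

The payoff is that onboarding a new report becomes a configuration task (defining its rules) rather than a development task.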
Want to connect with me? Head here: http://bit.ly/2iTzSdh
Read my other article: ETL Testing Automation: An Effective & Efficient Approach
To know more about Ellicium’s Analytics work, please visit: www.ellicium.com