Financial and management reporting systems testing

28.04.17 Mikko Hakala

Testing is a must-do activity in any IT system and digital product development. Testing disciplines have evolved over the years and are widely practiced across different functions and IT systems. In addition to applying general testing methodologies, one must also understand the special requirements of the functional area and the system under test. This blog takes a deep dive into testing financial and management reporting systems. It also describes how to save time and effort without compromising quality.

Big corporations typically have three core platforms for financial and management reporting: the ERP, a management reporting cube and a financial reporting cube. The ERP – such as SAP – processes most of the financial transactions of the legal entities. Data is transferred from the ERP to the reporting cubes via ETL, which aggregates and converts the transactions to reporting dimensions. Each data transfer is carefully reconciled and documented for audits and SOX controls. Several processes take place after the data load: common cost allocations, currency conversions, journal postings and management adjustments.
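The ETL aggregation step can be sketched roughly as below. This is a simplified illustration, not an actual ETL implementation; the dimension names, entity codes and account numbers are hypothetical.

```python
from collections import defaultdict

def aggregate_to_dimensions(transactions, dims=("entity", "account")):
    """Sum ERP line items into totals keyed by the chosen reporting dimensions."""
    totals = defaultdict(float)
    for tx in transactions:
        totals[tuple(tx[d] for d in dims)] += tx["amount"]
    return dict(totals)

# Hypothetical ERP line items for one legal entity
erp_lines = [
    {"entity": "FI01", "account": "4000", "amount": 120.0},
    {"entity": "FI01", "account": "4000", "amount": -20.0},
    {"entity": "FI01", "account": "5000", "amount": 50.0},
]
cube_totals = aggregate_to_dimensions(erp_lines)
```

The grand total of the aggregated cube must equal the grand total of the source lines — this invariant is exactly what the reconciliation and SOX documentation later rely on.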

Management and financial reporting platforms are typically separated because management reporting must accommodate changes in reporting dimensions (business units, products, customers and markets) immediately, and apply them retroactively to past periods as well. Published financial reports, by contrast, are rarely restated and must not be affected by changes in reporting dimensions.

Testing management and financial reporting cubes involves the same phases as any testing effort: Unit Test, Integration Test, User Acceptance Test, Regression Test and Migrated Data Validation. However, a couple of aspects require special focus and methodology.

Only unit testing should use fabricated, yet simple and representative, data to exercise new functionality and special cases such as minimum and maximum values, negative and positive values, and all fields and paths of the logic.
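A unit test along these lines might look as follows. The function under test is a hypothetical currency-conversion helper invented for illustration; the point is the coverage of boundary values, sign changes and the error path with small fabricated inputs.

```python
import unittest

def convert_amount(amount, rate):
    """Hypothetical unit under test: convert a journal line to group currency."""
    if rate <= 0:
        raise ValueError("exchange rate must be positive")
    return round(amount * rate, 2)

class ConvertAmountTest(unittest.TestCase):
    def test_positive_and_negative_amounts(self):
        self.assertEqual(convert_amount(100.0, 1.1), 110.0)
        self.assertEqual(convert_amount(-100.0, 1.1), -110.0)

    def test_zero_and_extreme_values(self):
        self.assertEqual(convert_amount(0.0, 1.1), 0.0)
        self.assertEqual(convert_amount(1e9, 0.0001), 100000.0)

    def test_invalid_rate_path(self):
        with self.assertRaises(ValueError):
            convert_amount(100.0, 0)

result = unittest.TextTestRunner(verbosity=0).run(
    unittest.TestLoader().loadTestsFromTestCase(ConvertAmountTest)
)
```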

Integration Test (INT) requires an end-to-end test environment, where all the IT build changes are brought together with production data for the first time. Integration testing focuses on the technical data transfer from the ERP to the reporting cubes. The transferred data is reconciled across the different reporting dimensions. Production data from an already published quarter should be used: this guarantees that all combinations of data are covered and that there is a reference point for reconciliation. In reconciliation, the data in the cube is compared with the source data along all possible reporting dimensions. Integration testing is done by key users and the development team, because it is highly technical, iterative and time-consuming.
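The core of that reconciliation is a dimension-by-dimension comparison of source and cube totals. A minimal sketch, assuming totals have already been aggregated per dimension key (the entity and account codes are hypothetical):

```python
def reconcile(source_totals, cube_totals, tolerance=0.01):
    """Return dimension keys where source and cube totals disagree beyond tolerance."""
    mismatches = {}
    for key in set(source_totals) | set(cube_totals):
        src = source_totals.get(key, 0.0)
        cube = cube_totals.get(key, 0.0)
        if abs(src - cube) > tolerance:
            mismatches[key] = {"source": src, "cube": cube, "delta": cube - src}
    return mismatches

# Hypothetical totals per (entity, account) after an INT data load
source = {("FI01", "4000"): 100.0, ("FI01", "5000"): 50.0}
cube = {("FI01", "4000"): 100.0}  # account 5000 did not arrive in the cube
issues = reconcile(source, cube)
```

In practice the same comparison is repeated for each reporting dimension (entity, account, business unit, market), and every non-empty mismatch report becomes a defect to iterate on with the development team.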

User Acceptance Test (UAT) focuses on validating the data from a content perspective. Financial accountants and analysts validate the data in the cube by checking the plausibility of the figures, comparing the cube against already published data, and comparing the result of any calculation with a manual (Excel-based) calculation on the same data. User acceptance testing should be based on already reported data that is familiar to the testers.
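The shadow-calculation idea can be illustrated with a common cost allocation, one of the post-load processes mentioned earlier. The allocation rule (proportional to revenue) and the unit names are assumptions for the sake of the example; the tester's manual calculation plays the role of the Excel check.

```python
def allocate_common_cost(common_cost, revenue_by_unit):
    """Allocate a common cost to business units in proportion to their revenue."""
    total_revenue = sum(revenue_by_unit.values())
    return {
        unit: round(common_cost * revenue / total_revenue, 2)
        for unit, revenue in revenue_by_unit.items()
    }

# Figures the tester would verify against the published quarter
revenue = {"Unit A": 600.0, "Unit B": 400.0}
cube_allocation = allocate_common_cost(1000.0, revenue)

# Manual (spreadsheet-style) shadow calculation on the same data
manual = {"Unit A": 1000.0 * 0.6, "Unit B": 1000.0 * 0.4}
```

If the cube's allocation and the manual recomputation diverge, either the allocation logic or the tester's understanding of the business rule is wrong — both findings are valuable in UAT.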

Test management needs to be rigorous. Test plans, test scripts, test evidence and defects need to be documented for internal and external audit. Big corporations use test management systems such as HP ALM/Quality Center to document test scripts and evidence. The tool also provides the access-rights control that is required when storing test results containing real reporting data.

For example, new features in a reporting system can change history data. This happens when the management reporting structure needs to be changed. History data migration requires reloading the data with new ETL mappings. The migrated data is compared with selected unchanged levels of the reporting hierarchy to make sure that all data was migrated and adds up at the right levels.
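That migration check can be sketched as follows: totals are computed at hierarchy levels the restructuring did not touch (here legal entity, while business units were remapped) and compared before and after the reload. All row contents are hypothetical.

```python
from collections import defaultdict

def totals_by_level(rows, level):
    """Sum amounts at one level of the reporting hierarchy."""
    totals = defaultdict(float)
    for row in rows:
        totals[row[level]] += row["amount"]
    return dict(totals)

def validate_migration(old_rows, new_rows, stable_levels):
    """Check that totals agree at hierarchy levels the restructuring left unchanged."""
    return {
        level: totals_by_level(old_rows, level) == totals_by_level(new_rows, level)
        for level in stable_levels
    }

# Hypothetical rows before and after remapping business units BU1/BU2 -> BU3/BU4
old_rows = [{"entity": "FI01", "unit": "BU1", "amount": 70.0},
            {"entity": "FI01", "unit": "BU2", "amount": 30.0}]
new_rows = [{"entity": "FI01", "unit": "BU3", "amount": 70.0},
            {"entity": "FI01", "unit": "BU4", "amount": 30.0}]
checks = validate_migration(old_rows, new_rows, stable_levels=["entity"])
```

The unit-level totals legitimately differ after the remapping, which is exactly why only the stable levels are asserted.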

Management and financial reporting cubes carry high expectations for data quality and the correctness of the reported figures. That is why development projects need to invest significant time and effort in testing and treat testing as a core function of the project.

Testing effort and time can nevertheless be optimized in several ways. If development follows Scrum, the accountants can be involved already in unit testing to validate the user stories of the sprint. UAT becomes faster when it reuses the data set and validated outcome of INT as its basis, because data loads do not need to be repeated and integration test failures have already been fixed. Actual testing time can also be shortened by good preparation. A good practice is to check readiness to start INT/UAT in an entry-check meeting, where business stakeholders, developers and testers together confirm that there are no open issues in functional design, processes, roles and responsibilities, or test preparation: data, environment, scripts and test resourcing. Test completeness is checked with the same group in an INT/UAT exit meeting to ensure that all test scripts have been run and all issues resolved.

Quality assurance is an investment. Depending on the functional area and the type of application, up to 40-50% of the calendar time can go to testing. The main question is how to optimize the time and money spent on it. Poor quality affects the business, and fixing errors afterwards is costly. Therefore, quality assurance activities must start early in a project.

Financial and management reporting testing requires involving specialists such as accountants and key users, who also have daily operational work. By being systematic, as described in this blog, one can achieve the best results while minimizing the burden on the organization.