PL/SQL Unit Testing with DBUnit

My colleague Slavomír Jeleň and I are currently working on a logistics management application for an international food retailer. It’s a data-oriented application that performs pre-calculation steps on billions of rows with PL/SQL stored procedures. In order to ensure the correctness of these calculations, we devised a solution for unit testing the stored procedures in Oracle based on DBUnit.

The logistics management application enables the international food retailer to administer the flow of goods between stores and warehouses. The number of users is relatively low at around 200, but the volume of data is massive with many tables containing hundreds of millions of rows and some containing over one billion. An operating day begins with an import of the latest data from the customer’s central data warehouse. From 7am on, users analyze tables of results and trends derived from this imported historical data, make decisions based on them and the application then recalculates future trends based on these decisions.

In order to reduce the volume of data to be analyzed in real-time when recalculating future trends, the historical data is pre-calculated into a smaller set of aggregated values. For example, sales data describing how many pieces of various items are sold in which store on which day is imported every morning, but one part of the system only needs data per item and per region for a month and so it makes sense to perform this aggregation at the beginning of the day and cache the results in a separate database table. These pre-calculations take the form of PL/SQL stored procedures which are run directly after the import completes every morning.
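The aggregation rule itself is easy to picture outside the database. As a minimal sketch (an in-memory stand-in using plain Java collections, not the actual PL/SQL; all names here are illustrative), grouping daily sales into monthly totals per item and store looks like this:

```java
import java.time.LocalDate;
import java.time.YearMonth;
import java.util.List;
import java.util.Map;
import java.util.TreeMap;
import java.util.stream.Collectors;

// In-memory stand-in for the sales table (names are illustrative).
class MonthlyAggregationSketch {
    record Sale(int itemId, LocalDate date, int storeId, int quantity) {}

    // Sum daily quantities per (item, month, store) - the same kind of rule
    // the PL/SQL pre-calculation applies to the real sales table.
    static Map<String, Integer> aggregate(List<Sale> sales) {
        return sales.stream().collect(Collectors.groupingBy(
                s -> s.itemId() + "|" + YearMonth.from(s.date()) + "|" + s.storeId(),
                TreeMap::new,
                Collectors.summingInt(Sale::quantity)));
    }

    public static void main(String[] args) {
        List<Sale> sales = List.of(
                new Sale(1, LocalDate.of(2012, 5, 31), 1, 1),
                new Sale(1, LocalDate.of(2012, 6, 1), 1, 2),
                new Sale(1, LocalDate.of(2012, 6, 30), 1, 4),
                new Sale(1, LocalDate.of(2012, 7, 1), 1, 8));
        // June (2012-06) sums to 2 + 4 = 6; May and July stay separate.
        aggregate(sales).forEach((k, v) -> System.out.println(k + " -> " + v));
    }
}
```

The real system does this inside the database, of course; the point of the sketch is only that each output row is a sum over one (item, month, store) group of input rows.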

[Figure: Overview of how the historical data is pre-calculated into a smaller set of aggregated values.]

We had some issues with quality assurance, which generally fell into three groups:

  • Unusual circumstances would create data that was not properly aggregated,
  • Changes to aggregation rules needed to be verified as correctly implemented, and
  • Changes had to be shown not to adversely affect other aggregation rules.

The pre-calculation is split into several stored procedures which together comprise over 2,000 lines of PL/SQL. Ensuring that all this logic performs as it should demands special attention. What was required was a system able to test the stored procedures against their specification – unit testing for PL/SQL stored procedures.

Proposed solution

The goal we had in mind was a system which could insert the bare minimum amount of data required for one specific test, run the pre-calculation and then compare the actual result with the expected result. The overall pre-calculation is very complicated and so we split the tests up so that there would be one test for each rule of the aggregation. Using the example mentioned earlier, one test might be to specify sales data on the first day of a month and the last day of the previous month for one given item in a given store and to ensure that only the data in the current month is calculated.

Another test for this aggregation might be to have sales data in a month for two items and to ensure that the data is aggregated separately for each item. Importantly, though, in each case we specify only the absolute minimum amount of data required for the scenario: for example, there might be just one country, one store, one item and perhaps just two rows in the sales table, plus the auxiliary data needed for foreign key integrity.


After a few prototypes, we settled on DBUnit, which is able to do exactly what we want. DBUnit is a testing framework for databases which can run stored procedures and fetch, manipulate and compare data. It provides a simple, usable interface to the database on top of JDBC, including the ability to compare datasets fetched from the database against datasets specified in file-based formats such as XML and CSV. This means that queries, views and full tables can all be compared.

Initial data and the expected output state of the tables containing the aggregated values are specified separately in two simple XML files. These are then referenced in a test method in a Java class along with the command to run the pre-calculation stored procedure.

[Figure: Relevant parts of our testing solution and how they play together.]

An Example

Using the example mentioned earlier, sales data would be prepared like this:

item_id   date       store_id   quantity
1         31/05/12   1          1
1         01/06/12   1          2
1         30/06/12   1          4
1         01/07/12   1          8
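
In DBUnit's flat XML dataset format, each row becomes one element named after its table, with one attribute per column. The input above could therefore be described in a file like the following (the table and column names here are illustrative, not taken from the real schema):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<dataset>
  <sales item_id="1" sales_date="2012-05-31" store_id="1" quantity="1"/>
  <sales item_id="1" sales_date="2012-06-01" store_id="1" quantity="2"/>
  <sales item_id="1" sales_date="2012-06-30" store_id="1" quantity="4"/>
  <sales item_id="1" sales_date="2012-07-01" store_id="1" quantity="8"/>
</dataset>
```

The auxiliary rows for stores, countries and so on would sit in the same file as further elements, one per table row.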

In addition to this, auxiliary data would have to be prepared for entities such as the store, the store’s country, the country’s language, etc. Most of this auxiliary data is common and so can be easily duplicated when making new tests, making small changes where necessary.

In this example we want to make sure that sales for June are correctly aggregated and that values from May and July are excluded from the June aggregation, so the expected values in the table containing the monthly aggregates would look like this:

item_id   month      store_id   quantity
1         01/05/12   1          1
1         01/06/12   1          6
1         01/07/12   1          8
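
The expected output is a second flat XML file of the same shape, this time describing the table of monthly aggregates (again with illustrative names):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<dataset>
  <monthly_sales item_id="1" month="2012-05-01" store_id="1" quantity="1"/>
  <monthly_sales item_id="1" month="2012-06-01" store_id="1" quantity="6"/>
  <monthly_sales item_id="1" month="2012-07-01" store_id="1" quantity="8"/>
</dataset>
```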

Once the two XML files have been created, the next step is to create the Java test method:

public class PrecalcMonthlyDbTestCase extends ProjectDBTestCase {

    @Override
    protected IDataSet getDataSet() throws Exception {
        // Initial data loaded before the test (file name illustrative)
        return loadXmlDataSet("precalc_monthly_input.xml");
    }

    public void test() throws Exception {
        // Helper methods come from ProjectDBTestCase; names are illustrative
        runSqlStatement("{call p_recalc_monthly_sales(1, DATE '2012-07-02')}");
        assertDataSetEquals(loadXmlDataSet("precalc_monthly_expected.xml"));
    }

    // ...more tests
}

The class PrecalcMonthlyDbTestCase covers just one test. It extends ProjectDBTestCase, our implementation of DBUnit's base class for test cases, which sets common configuration options and provides additional functionality such as loading the source data from a specified location on the classpath before each test and cleaning it up afterwards. The test itself is a simple affair: it runs the pre-calculation for a given date and then compares the expected data defined in an XML file with the actual data in the corresponding database tables holding the newly aggregated values.

Test-driven Development

We soon discovered that this approach lets us ensure that changes to the aggregation are implemented correctly and that all conceivable corner cases are handled. It also accommodates test-driven development: the first step of an implementation is to create a series of tests based on the requirement. Initially these tests must fail, as they specify new behavior; the implementation then begins and is verified once the tests pass. Similarly, if a bug is found in the pre-calculation, the first step is to reproduce the situation with a failing test case.


Whilst the testing of PL/SQL stored procedures was the initial focus of the work, we are now planning to broaden the scope and use DBUnit to test the large SQL statements in the project. Many of these statements make extensive use of common table expressions and many exceed 200 lines, making them prone to the same kinds of errors as the pre-calculation stored procedures.

The end result is several suites of tests covering all of the various pre-calculation procedures, run as part of the nightly build and also before every release. We don't have numbers to quantify the increase in accuracy and stability of the results, but we are now sure that once a problem surfaces and is captured in a test, it will not resurface again.