For this project we were faced with a large, complex data source containing a mix of static data, formula-calculated fields, blanks, and special characters. These were breaking other calculations throughout the database, making it impossible to extract the data we needed.
On top of this, the data source had two key issues:
- The data source was sent manually by a separate team each week, so a one-off clean would be insufficient; we needed a solution that would programmatically ‘clean’ the data each time the process was run.
- The format of the data we received each week was likely to change, so we had to build a dynamic solution that would detect whether the data contained the required key fields and extract those before any business logic was run.
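The source doesn't name the tooling or the actual field names, so the sketch below is illustrative only: a minimal Python routine that fails fast if a weekly file is missing any required field, strips the special characters that were breaking downstream calculations, and maps blank cells to `None`. `REQUIRED_FIELDS` and the character set are hypothetical stand-ins.

```python
import re

# Hypothetical required fields -- the real names are not given in the source.
REQUIRED_FIELDS = {"policy_id", "premium", "start_date"}

# Illustrative set of characters that could break downstream calculations:
# keep word characters, whitespace, and . - / ; strip everything else.
SPECIAL_CHARS = re.compile(r"[^\w\s.\-/]")

def validate_headers(headers):
    """Fail fast if the weekly file is missing any field the MI needs."""
    missing = REQUIRED_FIELDS - set(headers)
    if missing:
        raise ValueError(f"Weekly extract missing required fields: {sorted(missing)}")

def clean_row(row):
    """Normalise one record: strip special characters, map blanks to None."""
    cleaned = {}
    for key, value in row.items():
        value = SPECIAL_CHARS.sub("", str(value)).strip()
        cleaned[key] = value or None  # blank cells become None, not ""
    return cleaned

def clean_extract(headers, rows):
    """Validate the shape of the file first, then clean every record."""
    validate_headers(headers)
    return [clean_row(r) for r in rows]
```

Because the header check runs before any cleaning, a file in an unexpected format is rejected immediately rather than silently producing bad output, which is what makes the weekly hand-off safe to automate.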
This presented us with significant challenges. We adopted the following five-step approach:
1. We presented ‘illustration only’ MI (metrics and dashboards) to senior management to agree a presentation format.
2. Once the presentation format was agreed, we knew which data points and fields we needed to extract from the database.
3. We created a robust, dynamic data structure that confirmed the existence of the required data in the database.
4. Once existence was confirmed, we extracted the data in the specific format that our MI required.
5. Only then was business logic run against the database and our MI produced.
We transformed a process that previously took an entire day into one that took less than 10 minutes to run, whilst increasing data quality, accuracy, and process fidelity.
| | Previous Process | Redesigned EDI Process |
|---|---|---|
| Process heavily reliant on user? | Yes | No |
| Simple BAU process? | No | Yes |
| Scalable BAU process? | No | Yes |
| Embedded data validation? | No | Yes |
| Embedded reconciliation checks? | No | Yes |
| Business logic applied programmatically? | No | Yes |
| Consistent data output? | No | Yes |