Transformation

The importance of data quality is often underestimated in IT projects. It does not only become apparent during the initial migration; it can also require attention at a structural, ongoing level.

Consider, for instance, mapping the data between two separate systems, or receiving invalid data from legacy systems with limited validation capability. Failing to improve data quality ‘up front’ means invalid data may be carried throughout the entire process, potentially leading to invoicing errors and other problems.

With this in mind, we have developed the Abillity® Transformation module: a data pipeline where incoming data is monitored for quality (and improved if necessary), and where raw data can be ‘enriched’ in readiness for subsequent stages in the order-to-cash process.

Why use FIQAS for transformation?

Guaranteed quality

Data quality is optimised as early as possible in the process.

Rapid workflow

Administrators can track exceptions in a graphical overview and resolve them (en masse) quickly and easily.

Process optimisation

Because exceptions are clearly visible, lessons can be learned and the process optimised on a continuous basis.

Some of our data transformation clients

KPN

FIQAS processes tens of millions of transactions on a daily basis for KPN Wholesale using Abillity®. These transactions arise from the use of the fixed telephone network, both by their own customers and those of other telecom providers.

More about KPN

PostNL

Now that PostNL Transport has also started using Abillity®, they are one step closer to their ambition of becoming the biggest provider of logistics solutions and postal services in, to and from the Benelux.

More about PostNL

Transformation using Abillity®

The Transformation module sits at the front end of the order-to-cash chain, where it can parse any type of data and convert it for downstream processing. The module is sold separately as an optional add-on to the Abillity® platform. Its design is generic: it provides a framework to which we can easily add the business logic and validation specific to your transformation process.

Data processing

The Transformation module can be fed via APIs or by supplying files. Processing incoming data properly requires the following steps, illustrated with a short sketch after the list:

  • Validation
    The data supplied is checked for formatting (structure), mandatory fields and specific entries, determined by business logic where appropriate. If the data received does not meet specified requirements, an exception is raised.
  • Transformation
    The data is transformed to make it suitable for generic processing. We have settled on a structure which only contains fields relevant to the intended purpose.
  • Enriching
    If required, raw data can be enriched. Consider for instance mapping received IDs to master data within Abillity®, so descriptions or tags can be added. If the desired enrichment proves not feasible (perhaps because mapping is not possible), an exception is raised for the data received.
  • Combining
    Several different types of raw data can be combined to create transactions for processing.
  • Splitting
    This is the opposite of combining; sometimes, a transaction received needs to be split, perhaps because it needs to be invoiced to several parties.
  • Grouping
    Particularly with high-volume processing, it may be desirable or necessary to group data by predetermined fields, so that subsequent processes operate with aggregate data, significantly reducing the volume of data in those processes.
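
To make these steps concrete, here is a minimal sketch in Python of how one pass through such a pipeline could look. The record fields, the master-data lookup and the grouping key are hypothetical examples chosen for illustration; they are not the actual Abillity® data model or API. Combining and splitting follow the same pattern and are omitted for brevity.

    from collections import defaultdict

    class ValidationException(Exception):
        """Raised when an incoming record does not meet the specified requirements."""

    # Hypothetical mandatory fields for an incoming usage record.
    REQUIRED_FIELDS = {"external_id", "service_code", "quantity"}

    def validate(raw):
        # Validation: check structure, mandatory fields and specific entries.
        missing = REQUIRED_FIELDS - raw.keys()
        if missing:
            raise ValidationException(f"missing fields: {sorted(missing)}")
        if raw["quantity"] <= 0:
            raise ValidationException("quantity must be positive")
        return raw

    def transform(raw):
        # Transformation: keep only the fields relevant to the intended purpose.
        return {k: raw[k] for k in REQUIRED_FIELDS}

    def enrich(record, master_data):
        # Enriching: map the received ID to master data so a description can be added.
        if record["service_code"] not in master_data:
            raise ValidationException(f"unknown service code: {record['service_code']}")
        record["description"] = master_data[record["service_code"]]
        return record

    def group(records):
        # Grouping: aggregate by a predetermined field to reduce downstream volume.
        totals = defaultdict(int)
        for r in records:
            totals[r["service_code"]] += r["quantity"]
        return [{"service_code": code, "quantity": qty} for code, qty in totals.items()]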

Exception handling

A clear web-based environment, complete with search functionality, gives users the details of each exception so it can be handled.

The Abillity® transformation module offers users various options for processing incorrect or incomplete data efficiently where necessary.

  • Deleting and resubmitting data
    If the source system can correct the data and resubmit it, this is the preferred route. Users simply delete the incorrect data within the Transformation module and await the corrected resubmission.
  • Editing and reprocessing data
    If the source system does not offer the ability to resubmit data, or this is not fast enough, it is possible to edit the data en masse using the transformation module. A full audit trail is maintained of who edited what and when.
  • Editing master data
    When data is enriched based on master data, some master data may be missing or incorrect. It is easy to edit master data in Abillity® to resolve this problem. The data can then still be processed successfully.
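
As an illustration of these options, the sketch below shows how an exception could be recorded and then resolved: deleted so the source system can resubmit corrected data, or edited in place with a full audit trail. The class and method names are assumptions made for this example, not the Abillity® implementation; correcting master data (the third option) happens outside the exception record, after which the record can simply be reprocessed.

    from dataclasses import dataclass, field
    from datetime import datetime, timezone

    @dataclass
    class PendingException:
        # An incoming record that failed validation or enrichment (illustrative only).
        raw: dict
        reason: str
        status: str = "open"
        audit_trail: list = field(default_factory=list)

        def delete(self, user):
            # Option 1: delete the data here and let the source system resubmit it.
            self.status = "deleted"
            self._log(user, "deleted, awaiting resubmission from source system")

        def edit(self, user, changes):
            # Option 2: edit the data (possibly en masse) and mark it for reprocessing.
            self.raw.update(changes)
            self.status = "ready_for_reprocessing"
            self._log(user, f"edited fields {sorted(changes)}")

        def _log(self, user, action):
            # Every change records who did what and when.
            self.audit_trail.append((datetime.now(timezone.utc), user, action))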

Automated pipeline

Data pipelines can be configured in Abillity® to run automatically. Handling exceptions will usually require human input, but the ‘happy path’ can in theory run 24/7 with no human intervention. This means input can be processed continuously and (provided the quality is right) output can flow continuously to the next stage in the order-to-cash process.
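
Conceptually, the automated pipeline is simply the steps sketched earlier applied continuously: records on the happy path flow straight through, while anything that raises an exception is parked for human handling rather than blocking the flow. A simplified sketch, reusing the hypothetical functions from the earlier example:

    def run_pipeline(incoming, master_data):
        # Reuses validate/transform/enrich/group and ValidationException from the sketch above.
        processed, exceptions = [], []
        for raw in incoming:
            try:
                processed.append(enrich(transform(validate(raw)), master_data))
            except ValidationException as exc:
                # The record is not lost: it is parked for exception handling.
                exceptions.append({"raw": raw, "reason": str(exc)})
        return group(processed), exceptions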

Abillity® pipelines can scale horizontally and can process hundreds of millions of transactions every day.

Improved data quality

With the Abillity® transformation module we are breaking the IT adage of ‘garbage in, garbage out’. We prevent ‘garbage’ ending up in the next process stage and only allow usable data to pass through (whether recycled or not 😊).
The output from the transformation process consists of validated data – enriched using business rules and perhaps even already aggregated – which serves as input for the next stage in the process: rating.

More information?

Keen to find out if our platform is right for your organisation? Would you like to find out more about our transformation module? Any specific questions? Then get in touch. We’d love to talk!

Willem Lemmers

Senior Consultant

+31 297 382323