A large retail customer was struggling to analyze spend data in their SAP system because of the sheer data volume (a 5 TB SAP HANA database). They wanted to track spending, predict future spend, and detect fraudulent activity, but system limitations, along with performance and cost constraints, prevented them from achieving these goals. Kochasoft proposed building a Spend Analytics solution on a BigQuery data warehouse in Google Cloud Platform (GCP).

Here is a high-level architecture of the solution:

We used our proprietary, SAP-certified Kochasoft GCP Integration Tool to integrate SAP with BigQuery and build a procurement data warehouse.

The Kochasoft GCP Integration Tool enabled us to source data from SAP Tables, Extractors, CDS Views, InfoSets, and BW Cubes/ADSOs. It has robust support for delta management, including initialize, initialize without data transfer, full load, repetition of a failed delta, and pseudo delta. Additionally, it supports live streaming of data and Change Data Capture (CDC) for SAP tables.
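The pseudo-delta mode mentioned above can be illustrated with a minimal sketch: only rows whose change timestamp is newer than the last successfully loaded watermark are extracted. The function name, schema, and `changed_at` column here are illustrative assumptions, not the tool's actual API (many SAP tables expose a change date such as AEDAT that plays this role).

```python
from datetime import datetime

def pseudo_delta(rows, last_watermark):
    """Return rows changed after last_watermark, plus the new watermark.

    Illustrative only: a pseudo delta filters source rows by a change
    timestamp instead of relying on a true change log.
    """
    delta = [r for r in rows if r["changed_at"] > last_watermark]
    new_watermark = max((r["changed_at"] for r in delta), default=last_watermark)
    return delta, new_watermark

# Simulated source rows from an SAP table.
rows = [
    {"id": 1, "changed_at": datetime(2023, 1, 1)},
    {"id": 2, "changed_at": datetime(2023, 2, 1)},
    {"id": 3, "changed_at": datetime(2023, 3, 1)},
]

# Only rows changed after the last load are picked up, and the
# watermark advances so the next run starts where this one ended.
delta, wm = pseudo_delta(rows, datetime(2023, 1, 15))
```

On a repeated failed-delta run, the same watermark is simply reused, which re-extracts the same window of changes.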

A special feature of the tool is table grouping (importing multiple tables into BigQuery at once), which simplifies the replication of related sets of tables.
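Conceptually, table grouping replaces per-table configuration with one operation over a named set of tables. The sketch below is a hypothetical illustration (the group name, SAP table names, and loader callback are assumptions for the example, not the tool's interface):

```python
# Hypothetical table group: SAP purchasing tables replicated together
# instead of being configured one at a time.
TABLE_GROUPS = {
    "procurement": ["EKKO", "EKPO", "EKBE"],  # PO headers, items, history
}

def replicate_group(group, load_table):
    """Replicate every table in a group; returns per-table row counts."""
    return {table: load_table(table) for table in TABLE_GROUPS[group]}

# Stand-in for an actual extract-and-load call; here it just reports
# how many rows each simulated source table holds.
fake_source = {"EKKO": 100, "EKPO": 450, "EKBE": 900}
counts = replicate_group("procurement", lambda t: fake_source[t])
```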

The process starts by staging data in BigQuery and then flattening/denormalizing it into the final reporting tables. We used Google Data Studio for reports and dashboards on top of BigQuery, and BQML and AutoML to predict future spend and detect fraud. This approach significantly reduced implementation time and cost, ultimately lowering TCO for our customer.
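The flattening step above can be sketched as a join of staged header and item records into one wide reporting row per item. The schema here is a simplified assumption for illustration, not the actual customer data model:

```python
# Staged purchase-order headers, keyed by PO number.
headers = {"4500000001": {"vendor": "ACME", "currency": "USD"}}

# Staged purchase-order line items referencing their header.
items = [
    {"po": "4500000001", "item": "10", "amount": 250.0},
    {"po": "4500000001", "item": "20", "amount": 100.0},
]

# Denormalize: each reporting row carries vendor/currency alongside
# the item-level amount, so dashboards and ML features need no joins.
flat = [{**item, **headers[item["po"]]} for item in items]
```

In the actual warehouse this step is a SQL transformation in BigQuery, but the shape of the result is the same: one self-contained row per reportable fact.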