One of the challenges many businesses face when analyzing large data sets in their SAP system is being forced to work with only a small subset of the data because of high data volumes. These restrictions in SAP environments can limit business predictions and prevent the use of machine learning for business use cases.

It is crucial to have the right tools in place to use data effectively and quickly. Tools such as Google BigQuery can help, but how? Let’s take a look at our Telecom Customer case study to show you.

Our Telecom Customer approached us because they required a solution for their Spend Analytics. At a high level, they wanted to build a modern data platform that could handle large amounts of procurement data while also supporting comprehensive, customizable reports on modern analytical tools. With a new data platform, our Telecom customer would be able to optimize their data structures and extend the platform into machine learning use cases.

Additionally, our Telecom customer was looking to solve core data challenges in their SAP landscape by adopting a cloud-based data warehouse (DWH). Before deciding to work with our team, they evaluated SAP BW, but it wasn’t viable due to data size, cost, flexibility, and agility. Our proposal was designed to take them from their current state to one that resolved the data issues stemming from the inability to execute queries on their huge spend data.

Let’s dive into our solution…

Above is a generic architecture diagram detailing the tools we can use at each stage of the transition to a cloud-based DWH. We used Cloud Storage and BigQuery for the data store, and Dataprep and BigQuery for data preparation and processing.

In this project, our Telecom customer required historical data to be transferred from SAP in time-sliced batches using flat files. These files were loaded into Cloud Storage and then into BigQuery staging tables. All of the data was flattened during extraction to avoid unnecessary joins in BigQuery. The delta data from the staging datasets was merged into golden BigQuery datasets, which are used for reporting and machine learning.
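To illustrate the staging-to-golden step described above, the pattern is essentially an upsert, which BigQuery expresses as a single MERGE statement. The sketch below simulates that upsert semantics in plain Python and shows a hypothetical MERGE statement; the project, dataset, table, and column names are all assumptions, not the customer's actual schema.

```python
# Minimal sketch of merging delta (staging) rows into a golden dataset,
# keyed by document id. In BigQuery this is one MERGE statement; here the
# same upsert semantics are simulated with a dict for illustration.

def merge_delta(golden: dict, delta: list) -> dict:
    """Upsert delta rows into the golden dataset, keyed on doc_id."""
    for row in delta:
        golden[row["doc_id"]] = row  # update if the key exists, insert if new
    return golden

# Hypothetical spend records: one existing golden row, one updated, one new.
golden = {"PO-1": {"doc_id": "PO-1", "amount": 100}}
delta = [
    {"doc_id": "PO-1", "amount": 120},  # revised spend for an existing order
    {"doc_id": "PO-2", "amount": 80},   # new order from the latest batch
]
merged = merge_delta(golden, delta)

# The equivalent (hypothetical) BigQuery MERGE over the staging delta table:
MERGE_SQL = """
MERGE `project.spend_golden.purchase_orders` AS g
USING `project.spend_staging.purchase_orders_delta` AS d
ON g.doc_id = d.doc_id
WHEN MATCHED THEN UPDATE SET amount = d.amount
WHEN NOT MATCHED THEN INSERT (doc_id, amount) VALUES (d.doc_id, d.amount)
"""
```

Running the MERGE after each batch load keeps the golden datasets current without reloading full history.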

We used BigQuery Machine Learning (BQML) for spend prediction, while fraud detection models were built with TensorFlow. For reporting, we used Google Data Studio and Looker.
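As a hedged sketch of what the BQML spend-prediction step can look like: models in BQML are created and queried entirely in SQL. The project, dataset, and column names below are hypothetical, and the source does not state which model type was used, so a linear regression is assumed here for illustration.

```python
# Hypothetical BQML statements for a spend-prediction model. These would be
# executed in BigQuery (e.g. via the console or a client library); they are
# held as strings here since the real schema and model type are assumptions.

TRAIN_SQL = """
CREATE OR REPLACE MODEL `project.spend_ml.spend_forecast`
OPTIONS (model_type = 'linear_reg', input_label_cols = ['spend_amount']) AS
SELECT vendor_id, material_group, order_month, spend_amount
FROM `project.spend_golden.purchase_orders`
"""

PREDICT_SQL = """
SELECT *
FROM ML.PREDICT(MODEL `project.spend_ml.spend_forecast`,
                (SELECT vendor_id, material_group, order_month
                 FROM `project.spend_golden.new_orders`))
"""
```

Because the model trains directly on the golden datasets, predictions stay in BigQuery and can feed the same Data Studio and Looker reports.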

This project was a success for our Telecom Customer because they can now view their Spend Analytics in greater detail, which saves their Procurement Team time and stress. We showed our Telecom Customer the benefit of using Google BigQuery for their data. From in-depth customer interviews and analysis to a visual roadmap of the entire process, they knew their requirements would be met.

If you’re interested in learning more about the benefits of using a Google BigQuery data warehouse with your SAP systems, we invite you to watch our on-demand webinar. Click here to access the Integrating SAP Analytics on BigQuery on-demand webinar!