Archive your SAP Data to Google’s Cloud Storage

Integrating SAP ArchiveLink with Google Cloud Storage for Seamless Data Archival

Satish Inamdar
Google Cloud - Community
9 min read · Jun 23, 2024


Sample Solution — Archival to Google Cloud Storage Architecture

Introduction

Recently, an innovative sample solution for archiving SAP data to Cloud Storage on Google Cloud was made available on GitHub. This solution provides a powerful and efficient way for organizations to securely store and manage their SAP data in the cloud. With this sample solution, organizations can leverage the scalability, reliability, and cost-effectiveness of Google Cloud to optimize their data archiving processes.

The solution is designed to be easily accessible and customizable. It can be directly pulled from GitHub, allowing organizations to quickly integrate it into their existing SAP environments. Alternatively, the transport request (TR) can be downloaded from the release section of GitHub and then imported into SAP Systems, providing greater flexibility for organizations with specific requirements.

This solution is built on top of the ABAP SDK for Google Cloud and leverages the underlying framework included in the SDK, such as authentication and HTTP request and response handling.

Purpose of this blog

The primary purpose of this blog is to give you a good understanding of the archival sample solution. We will walk through a step-by-step example of an archival process, covering the different stages and activities involved.

Pre-requisites

Create a bucket in Cloud Storage in your Google Cloud project with a globally unique name and the storage class Nearline, as it is well suited for data archival scenarios. However, you can evaluate the different regions and storage classes offered by Google Cloud and choose the ones most suitable for your business purposes.

Bucket on Google Cloud

To learn more about storage classes and bucket locations, you can follow the links here: https://cloud.google.com/storage/docs/storage-classes and https://cloud.google.com/storage/docs/locations respectively.
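If you would rather script this prerequisite than use the Cloud Console, below is a minimal sketch using the google-cloud-storage Python client (this is GCP-side setup, not part of the sample solution itself); the project ID, bucket name, and location are placeholders to replace with your own:

```python
# Minimal sketch: create a Nearline bucket for SAP data archival.
# "my-gcp-project" and "sap-archive-demo-bucket" are placeholders.
from google.cloud import storage

client = storage.Client(project="my-gcp-project")

bucket = storage.Bucket(client, name="sap-archive-demo-bucket")  # must be globally unique
bucket.storage_class = "NEARLINE"  # suited to infrequently accessed archive data

# Choose the location that best fits your latency and compliance needs.
client.create_bucket(bucket, location="us-central1")
```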

Configuration

As highlighted in the section above, this solution is built on top of the ABAP SDK for Google Cloud. For the purposes of this blog, we will assume that you are conversant with the ABAP SDK and that your SAP system already has the latest version of the SDK (V1.7) installed.

In case you’re new to ABAP SDK for Google Cloud, don’t worry. I’ll share some links to blog posts at the end of this blog that will help you get up to speed.

Without further ado, let’s begin!

Client Key Settings:

This setting is related to the ABAP SDK for Google Cloud and involves maintaining configuration data in the client key table /GOOG/CLIENT_KEY. You can access this table maintenance via transaction SM30; alternatively, go to transaction SPRO and follow the path: ABAP SDK for Google Cloud > Configurations > Client Key Settings

An example configuration, which will also be used for this demo, is shown in the screenshot below:

Client Key Configuration — Table: /GOOG/CLIENT_KEY

The configured service account sap-data-archival-sa@xxxxxxx.iam.gserviceaccount.com should have the IAM roles Storage Object Creator (roles/storage.objectCreator) and Storage Admin (roles/storage.admin) assigned to it. Refer to the following link for details about the permissions contained in these roles: https://cloud.google.com/storage/docs/access-control/iam-roles
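If you prefer to script the role assignment at bucket level, here is a minimal sketch using the Python client; you can equally grant the roles through the Cloud Console or gcloud. The project, bucket, and service account names are placeholders:

```python
# Minimal sketch: grant the archival service account the two roles
# on the bucket. All names below are placeholders.
from google.cloud import storage

client = storage.Client(project="my-gcp-project")
bucket = client.bucket("sap-archive-demo-bucket")

member = ("serviceAccount:"
          "sap-data-archival-sa@my-gcp-project.iam.gserviceaccount.com")

policy = bucket.get_iam_policy(requested_policy_version=3)
for role in ("roles/storage.objectCreator", "roles/storage.admin"):
    policy.bindings.append({"role": role, "members": {member}})
bucket.set_iam_policy(policy)
```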

Archival Sample Solution Configuration Settings:

This setting is related to the sample archival solution and involves maintaining configuration data in the table ZGOOG_CONT_REPO. You can access this table maintenance via transaction SM30. An example configuration is shown below:

Content Repository Solution Configuration — Table: ZGOOG_CONT_REPO

Let’s understand the meaning and content of the table fields:

Content Repository: the name of the content repository (CS in this example)

Google Cloud Key Name: the client key used to establish the connection with Google Cloud (ARCHIVE_TO_GCS here); this value is the primary key of table /GOOG/CLIENT_KEY

Bucket Name: the name of the Google Cloud Storage bucket to which archive files are written

Steps to Archive

We will archive SAP table data to Google Cloud Storage. Now that all prerequisites are satisfied, it is time to begin. For the purpose of this blog, let’s focus on the archive object BC_SBAL. This object is used for archiving old or expired application logs.

Navigate to transaction: SARA

In the input field Archiving Object, provide the value BC_SBAL. Click the Customizing button in the application menu bar to maintain the technical settings.

SARA Transaction — Initial Screen

Select Technical Settings under the Archiving Object-Specific Customizing submenu.

Technical Settings in the Customizing Options

Here you can maintain various parameters; however, for the purposes of this demo we will maintain just the content repository information, as shown in the screenshot below. Save the configuration; this will prompt you to save the customizing data in a transport request. Choose an appropriate transport request in the pop-up window and continue.

Maintain the Content Repository Name in the Archiving Object Technical Customizing Configuration

From the initial screen of the SARA transaction, choose the Write button.

Step 1: Write Action

On the new screen for the Write action, provide a variant name and maintain the required parameters for the variant; these determine which data is selected for archival. For the demo, we will consider the data from last year (2023), as shown in the screenshot below:

Once you save the variant, you will be back on the previous screen. From here, click the Start Date button and choose a schedule for executing the write job in the background. For this demo, we will choose immediate execution.

Maintain Start Date related setting for Data Archiving Step: Write

Provide output device information for Spool Parameters and save. An example is shown below:

Once the required settings are maintained, you should see green traffic light icons next to each of the buttons. You can now proceed and click the Execute button.

Required settings maintained for Archival Step: Write

On successful execution, you will see a success message at the bottom left of your screen.

Success message — Archiving job scheduled

You can click the Job button to view the scheduled background jobs and take a look at the job output as well.

Job logs for Archival Step: Write
Job Output displayed on Spool by the background job

The next action to be performed is Delete. This will delete the data from the tables contained in the archiving object BC_SBAL and write the archive file consisting of the deleted data to the content repository CS.

Click the Archival Selection button on the next screen. You should see a new entry based on the previous step (Write). Select the entry as shown in the screenshot below and save.

For the Start Date and Spool Parameters buttons, perform the same actions as previously done during the Write step.

Once all the settings are maintained for each of the buttons, you will see a green traffic light icon next to each of them. You can go ahead and click Execute.

All required steps performed for Archival Step: Delete

On successful execution, you will see a success message at the bottom left of your screen, as shown below.

You can again click the Job button in the application menu bar and take a look at the jobs and the spool that is generated.

Jobs scheduled for Archival Step: Delete
Spool Output for the background job to delete the data

From the initial screen of the SARA transaction, click the Management button.

You can expand the tree to find the required archival session and view the details of the archival file.

You can also change the layout to view additional details as shown below:

Document ID for the Archiving File

Within Google Cloud Storage, a folder is created with the Doc ID as its name; it serves as the repository for the files related to a particular archival file/session. One of these files is the primary data file, accompanied by a corresponding metadata file.

A folder created within the bucket with the same name as the Doc ID
Actual data file for the archival object stored in Cloud Storage
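If you want to double-check from outside SAP that the archive landed in the bucket, a quick way is to list the objects under the Doc ID prefix with the Python client. The bucket name and Doc ID below are placeholders; the actual Doc ID comes from the Management view in SARA, as shown above:

```python
# Minimal sketch: list the data and metadata files stored under the
# Doc ID "folder" (an object-name prefix). Names are placeholders.
from google.cloud import storage

client = storage.Client(project="my-gcp-project")

doc_id = "YOUR_DOC_ID"  # taken from the Management view in SARA
for blob in client.list_blobs("sap-archive-demo-bucket", prefix=doc_id + "/"):
    print(blob.name, blob.size, blob.updated)
```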

A Note on Pricing

Cloud Storage is a cost-effective way to store your data. Here are some simplified figures for a quick view:

Storage — $20 / TB / Month

Write — $1 / 100K inserts

Read — $0.4 / Million reads

You can learn more about the pricing of Cloud Storage here: https://cloud.google.com/storage/pricing

In fact, it’s comparable to the price of a Starbucks coffee in major cities like New York or San Francisco, which typically costs around $3.50 (I am not even considering the Grande and Venti options). For that price, you can get more than 100 GB of cloud storage for a month.
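A quick back-of-the-envelope check of that claim, using only the simplified figures above (not an official rate card):

```python
# Back-of-the-envelope: how much storage does one coffee buy per month
# at the simplified rate of $20 / TB / month quoted above?
STORAGE_PER_TB_MONTH = 20.0  # $ / TB / month (simplified figure)
COFFEE_PRICE = 3.50          # $ for a regular Starbucks coffee

per_gb_month = STORAGE_PER_TB_MONTH / 1000   # $0.02 per GB per month
gb_per_coffee = COFFEE_PRICE / per_gb_month  # 175 GB for a month

print(f"One coffee (${COFFEE_PRICE}) buys about {gb_per_coffee:.0f} GB-months of storage")
```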

And here’s the best part: You only pay for the storage you use. So, if you don’t need a lot of storage, you won’t have to pay for it. This flexibility makes cloud storage a cost-effective option for businesses and individuals alike.

Conclusion

This sample solution offers numerous benefits for organizations looking to archive their SAP data to Cloud Storage on Google Cloud. It streamlines the archiving process, reducing the time and effort required to manage and maintain SAP data. Additionally, it enhances data security by leveraging the robust encryption features of Google Cloud, ensuring the protection of sensitive SAP data. By leveraging Google Cloud’s scalable infrastructure, organizations can confidently store and manage their SAP data at any scale.

Furthermore, the sample solution is cost-effective, offering a compelling alternative to traditional on-premises archiving solutions. Organizations can optimize their IT budgets by leveraging the pay-as-you-go pricing model of Google Cloud, only paying for the storage resources they consume.

Other useful and related reading material

As promised earlier, here is some reading material and a few blog links for ABAP SDK for Google Cloud:

Bookmark What’s new with the ABAP SDK for Google Cloud for the latest announcements and follow installation and configuration instructions.

Check out these blog posts to get started with ABAP SDK for Google Cloud

  • This blog explains how you can evaluate ABAP SDK for Google Cloud using ABAP Platform Trial 1909 on Google Cloud Platform.
  • Read this blog post for a sneak peek at how a business process such as sales order entry in SAP can be automated using ABAP SDK for Google Cloud.
  • This blog is an excellent starting point for understanding how BigQuery ML, a powerful machine learning service that lets you build and deploy models using SQL queries, can now be accessed with ABAP SDK for Google Cloud.
  • This blog provides details on Translation API Advanced (V3) and translating with glossaries using ABAP SDK for Google Cloud.
  • This blog provides details on setting up JWT-based authentication for ABAP SDK for Google Cloud.

You can also join the Google Cloud Community

Check out the YouTube channel!

Subscribe to the YouTube channel below, where you will find a quick five-minute overview covering the design principles and capabilities of the ABAP SDK, reference architectures, and art-of-the-possible SAP solutions based on Google’s AI services, Google Workspace APIs, and Google Maps Platform APIs, along with many more insightful references.

So, this brings us to the conclusion. Thank you all for reading!
