Beyond Read and Write: The ABAP Way to Cloud Storage

Satish Inamdar
Google Cloud - Community
9 min read · Dec 12, 2023

ABAP SDK for Google Cloud V1.5 supports a wide range of Google APIs, spanning Enterprise APIs, Workspace APIs, and Maps Platform APIs. The SDK provides a simple and direct way to communicate with Google APIs natively in the ABAP language.

In this blog, we will take an initial look at Google Cloud’s Storage JSON API. Before we begin our exploration, let’s quickly understand what Google Cloud Storage is.

Google Cloud Storage is a highly scalable, reliable, and secure cloud-based object storage service. It offers a broad range of features, including:

Scalability: Google Cloud Storage can store and manage exabytes of data.

Reliability: Google Cloud Storage is designed to be highly available, with a 99.99% uptime SLA.

Security: Google Cloud Storage uses a variety of security features to protect your data, including encryption at rest and in transit.

Google Cloud Storage is a great option for storing and managing large amounts of data. It is used by a wide range of businesses, enterprises and applications.

Some of the benefits of using Google Cloud Storage:

Cost-effectiveness: Google Cloud Storage is a cost-effective way to store and manage large amounts of data.

Simplicity: Google Cloud Storage is easy to use, with a simple and intuitive API.

Flexibility: Google Cloud Storage can be used with a variety of applications and platforms.

The Google Cloud Storage JSON API offers a range of capabilities, including:

Create, list, and delete buckets

Upload, download, and delete objects

Set and get object metadata

Manage object access permissions

Encrypt objects at rest and in transit

The Google Cloud Storage JSON API is a powerful tool for managing and accessing your data in Google Cloud Storage. It is easy to use and can be integrated with a variety of applications and platforms.

In the ABAP SDK for Google Cloud, the Storage JSON API is available as an ABAP class, or, as we like to call it, an API client stub. The class name is /GOOG/CL_STORAGE_V1. This class serves as a single window of interaction from ABAP to Google Cloud Storage and contains all the methods and data types required to interact with the Storage JSON API.
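Every API client stub in the SDK follows the same lifecycle: instantiate the class with a configured client key, call the API methods, check the return code, and close the HTTP connection. The following is a minimal sketch of that pattern, assuming a client key named DEMO_STORAGE (the one configured in the next section):

DATA lo_storage TYPE REF TO /goog/cl_storage_v1.
DATA lo_exception TYPE REF TO /goog/cx_sdk.

TRY.
" Instantiate the API client stub using the configured client key
lo_storage = NEW #( iv_key_name = 'DEMO_STORAGE' ).

" ... call Storage JSON API methods here and
" check ev_ret_code with lo_storage->is_success( ) ...

" Close the HTTP connection once done
lo_storage->close( ).
CATCH /goog/cx_sdk INTO lo_exception.
MESSAGE lo_exception->get_text( ) TYPE 'S' DISPLAY LIKE 'E'.
ENDTRY.

The scenarios below follow this same pattern end to end.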

Configuration

Before we begin coding with the ABAP SDK and consuming the Storage JSON API, certain prerequisite configurations need to be maintained. The section below explains them:

Step: Configure a Client Key required for connectivity.

  • Log in to the SAP system where ABAP SDK for Google Cloud V1.5 is installed, go to transaction SPRO > ABAP SDK for Google Cloud > Basic Settings > Configure Client Key, and add the following new entry. (Replace the service account and project ID in the entry below as per your environment.)
  • Ensure that the service account is assigned the “Storage Admin” and “Storage Object Creator” roles.

Google Cloud Key Name: DEMO_STORAGE

Google Cloud Service Account Name: abap-sdk-dev@gcp-project.iam.gserviceaccount.com

Google Cloud Scope: https://www.googleapis.com/auth/cloud-platform

Google Cloud Project Identifier: gcp-project

Authorization Class: /GOOG/CL_AUTH_GOOGLE

Leave the other fields blank

Let’s get started with coding.

Scenario 1: Create a bucket on Cloud Storage

Step 1: Go to transaction code SE38 and create a report program with name ZDEMO_CREATE_BUCKET

Step 2: Paste the below code and activate the program

REPORT zdemo_create_bucket.

DATA ls_input TYPE /goog/cl_storage_v1=>ty_001.
DATA ls_output TYPE /goog/cl_storage_v1=>ty_001.
DATA lv_project_id TYPE string.
DATA lv_json_response TYPE string.
DATA lv_ret_code TYPE i.
DATA lv_err_text TYPE string.
DATA ls_err_resp TYPE /goog/err_resp.
DATA lv_msg TYPE string.
DATA lo_gcs TYPE REF TO /goog/cl_storage_v1.
DATA lo_exception TYPE REF TO /goog/cx_sdk.

TRY.

lo_gcs = NEW #( iv_key_name = 'DEMO_STORAGE' ).
lv_project_id = lo_gcs->gv_project_id.

"Bucket Name is globally unique & permanent"
ls_input-name = 'newtest_bucket_abapsdk_gcloud01'.

lo_gcs->insert_buckets( EXPORTING iv_q_project = lv_project_id
is_input = ls_input
IMPORTING es_raw = lv_json_response
es_output = ls_output
ev_ret_code = lv_ret_code
ev_err_text = lv_err_text
es_err_resp = ls_err_resp ).

IF lo_gcs->is_success( lv_ret_code ) = abap_true.
cl_demo_output=>new(
)->begin_section( 'Result:'
)->write_text( 'Bucket was created:'
)->next_section( 'JSON Response:'
)->write_json( lv_json_response
)->display( ).

ELSE.
lv_msg = lv_ret_code && ':' && lv_err_text.
cl_demo_output=>new(
)->begin_section( 'Result:'
)->write_text( 'Bucket creation failed:'
)->next_section( 'Error:'
)->write_json( lv_msg
)->display( ).
ENDIF.

CATCH /goog/cx_sdk INTO lo_exception.
lv_msg = lo_exception->get_text( ).
MESSAGE lv_msg TYPE 'S' DISPLAY LIKE 'E'.
ENDTRY.

IF lo_gcs IS BOUND.
lo_gcs->close( ).
ENDIF.

On successful execution, you will see the output shown in the screenshot below:

Success Response

You can also verify, by logging into the Google Cloud console, that the bucket has been created in Cloud Storage.

Bucket details shown in Google Cloud Console
Bucket Created

Scenario 2: List the available buckets

Step 1: Go to transaction code SE38 and create a report program with name ZDEMO_LIST_BUCKET

Step 2: Paste the below code and activate the program

REPORT zdemo_list_bucket.

DATA ls_output TYPE /goog/cl_storage_v1=>ty_004.
DATA lv_project_id TYPE string.
DATA lv_maxresults TYPE string.
DATA lv_pagetoken TYPE string.
DATA lv_prefix TYPE string.
DATA lv_ret_code TYPE i.
DATA lv_err_text TYPE string.
DATA ls_err_resp TYPE /goog/err_resp.
DATA lv_msg TYPE string.
DATA lo_exception TYPE REF TO /goog/cx_sdk.
DATA lo_storage TYPE REF TO /goog/cl_storage_v1.
DATA lt_buckets TYPE TABLE OF string.

FIELD-SYMBOLS <ls_item> TYPE /goog/cl_storage_v1=>ty_001.

TRY.

lo_storage = NEW #( iv_key_name = 'DEMO_STORAGE' ).
lv_project_id = lo_storage->gv_project_id.

"Read maximum 25 buckets
lv_maxresults = 25.
CONDENSE lv_maxresults.
lv_prefix = 'abap-sdk-'.

WHILE sy-index = 1 OR lv_pagetoken IS NOT INITIAL.


lo_storage->list_buckets( EXPORTING iv_q_project = lv_project_id
iv_q_maxresults = lv_maxresults
iv_q_pagetoken = lv_pagetoken
iv_q_prefix = lv_prefix
IMPORTING es_output = ls_output
ev_ret_code = lv_ret_code
ev_err_text = lv_err_text
es_err_resp = ls_err_resp ).

IF lo_storage->is_success( lv_ret_code ) = abap_true.
lt_buckets = VALUE #( BASE lt_buckets
FOR items IN ls_output-items
( items-name ) ).
ENDIF.

" Set the token to fetch the next page; the loop ends when no token is returned
lv_pagetoken = ls_output-next_page_token.

ENDWHILE.

IF lo_storage->is_success( lv_ret_code ) = abap_true.
cl_demo_output=>new(
)->begin_section( 'List of Buckets'
)->write_data( lt_buckets
)->display( ).
ELSE.
lv_msg = lv_ret_code && ':' && lv_err_text.
cl_demo_output=>new(
)->begin_section( 'Error:'
)->write_text( lv_msg
)->display( ).
ENDIF.

CATCH /goog/cx_sdk INTO lo_exception.
lv_msg = lo_exception->get_text( ).
MESSAGE lv_msg TYPE 'S' DISPLAY LIKE 'E'.
ENDTRY.

lo_storage->close( ).

On successful execution, you will see the output shown in the screenshot below:

Program Output: List of buckets
List of buckets displayed as output

Scenario 3: Upload a file to a bucket

In this scenario, we upload a markdown file stored on the application server, so replace the file path and name as per your environment.

Step 1: Go to transaction code SE38 and create a report program with name ZDEMO_UPLOAD_FILE

Step 2: Paste the below code and activate the program

REPORT zdemo_upload_file.

DATA lv_file_length TYPE i.
DATA lv_ret_code TYPE i.
DATA lv_err_text TYPE string.
DATA lv_msg TYPE string.
DATA lv_dset TYPE string.
DATA lv_data TYPE string.
DATA ls_data TYPE xstring.
DATA ls_output TYPE /goog/cl_storage_v1=>ty_013.
DATA ls_err_resp TYPE /goog/err_resp.
DATA lo_exception TYPE REF TO /goog/cx_sdk.
DATA lo_storage TYPE REF TO /goog/cl_storage_v1.

" Read file data from the application server
DATA(dset) = '/tmp/googcl_addrvaldn_v1.md'.
OPEN DATASET dset FOR INPUT IN TEXT MODE ENCODING DEFAULT.
IF sy-subrc <> 0.
MESSAGE 'Cannot open dataset for reading' TYPE 'E'.
ENDIF.
DO.
READ DATASET dset INTO lv_dset.
IF sy-subrc <> 0.
EXIT.
ENDIF.

CONCATENATE lv_data lv_dset INTO lv_data SEPARATED BY cl_abap_char_utilities=>newline.
CLEAR lv_dset.

ENDDO.

CLOSE DATASET dset.

CALL FUNCTION 'SCMS_STRING_TO_XSTRING'
EXPORTING text = lv_data
IMPORTING buffer = ls_data
EXCEPTIONS failed = 1
OTHERS = 2.
IF sy-subrc <> 0.
MESSAGE 'Conversion from string to xstring failed' TYPE 'E'.
ENDIF.

TRY.

lo_storage = NEW #( iv_key_name = 'DEMO_STORAGE' ).

lo_storage->insert_objects( EXPORTING iv_q_name = 'example_file.md'
iv_p_bucket = 'newtest_bucket_abapsdk_gcloud01'
is_data = ls_data
iv_content_type = 'text/markdown'
IMPORTING es_output = ls_output
ev_ret_code = lv_ret_code
ev_err_text = lv_err_text
es_err_resp = ls_err_resp ).

IF lo_storage->is_success( lv_ret_code ) = abap_true.
cl_demo_output=>new(
)->begin_section( 'Result:'
)->write_text( 'Object was uploaded successfully'
)->write_text( 'Object Self Link:'
)->write_text( ls_output-self_link
)->display( ).
ELSE.
lv_msg = lv_ret_code && ':' && lv_err_text.
cl_demo_output=>new(
)->begin_section( 'Error:'
)->write_text( lv_msg
)->display( ).
ENDIF.
CATCH /goog/cx_sdk INTO lo_exception.
lv_msg = lo_exception->get_text( ).
MESSAGE lv_msg TYPE 'S' DISPLAY LIKE 'E'.
RETURN.
ENDTRY.

lo_storage->close( ).

On successful execution, you should see the output below:

Program Output

We can verify the upload by logging into the Google Cloud console:

File Uploaded to Cloud Storage

A portion of the file content, viewed directly in Cloud Storage, is shown in the screenshot below:

Portion of uploaded file data

Scenario 4: Download a file from a bucket

In this scenario, a text file stored in a Cloud Storage bucket is downloaded to SAP’s application server. Note that we are using the ‘/tmp/’ directory; replace this directory path as required.

Step 1: Go to transaction code SE38 and create a report program with name ZDEMO_DOWNLOAD_FILE

Step 2: Paste the below code and activate the program

REPORT zdemo_download_file.

DATA ls_data TYPE xstring.
DATA ls_output TYPE /goog/cl_storage_v1=>ty_013.
DATA lv_ret_code TYPE i.
DATA lv_err_text TYPE string.
DATA lv_data_str TYPE string.
DATA lv_length TYPE i.
DATA ls_err_resp TYPE /goog/err_resp.
DATA lv_msg TYPE string.
DATA lt_bin_tab TYPE TABLE OF char1024.
DATA lo_exception TYPE REF TO /goog/cx_sdk.
DATA lo_storage TYPE REF TO /goog/cl_storage_v1.

TRY.
lo_storage = NEW #( iv_key_name = 'DEMO_STORAGE' ).

" Set the Common Query Parameter 'alt' to 'media' to download the object data
" If the parameter is not set only object metadata is downloaded
lo_storage->add_common_qparam( iv_name = 'alt'
iv_value = 'media' ).

lo_storage->get_objects( EXPORTING iv_p_bucket = 'newtest_bucket_abapsdk_gcloud01'
iv_p_object = 'sample_file.txt'
IMPORTING es_output = ls_output
ev_ret_code = lv_ret_code
ev_err_text = lv_err_text
es_err_resp = ls_err_resp
es_raw = ls_data ).

IF lo_storage->is_success( lv_ret_code ) = abap_true.
CALL FUNCTION 'SCMS_XSTRING_TO_BINARY'
EXPORTING buffer = ls_data
IMPORTING output_length = lv_length
TABLES binary_tab = lt_bin_tab.

CALL FUNCTION 'SCMS_BINARY_TO_STRING'
EXPORTING input_length = lv_length
IMPORTING text_buffer = lv_data_str
TABLES binary_tab = lt_bin_tab
EXCEPTIONS failed = 1
OTHERS = 2.
IF sy-subrc <> 0.
MESSAGE 'Error converting Binary to String' TYPE 'E'.
ENDIF.

DATA(dset) = '/tmp/sample_file.txt'.
OPEN DATASET dset FOR OUTPUT IN TEXT MODE ENCODING DEFAULT.
IF sy-subrc <> 0.
MESSAGE 'Error opening the file' TYPE 'E'.
ENDIF.
" TRANSFER raises an exception on failure; it does not set sy-subrc
TRANSFER lv_data_str TO dset.
CLOSE DATASET dset.
cl_demo_output=>new(
)->begin_section( 'Result:'
)->write_text( 'Object was downloaded successfully'
)->display( ).
ELSE.
lv_msg = lv_ret_code && ':' && lv_err_text.
cl_demo_output=>new(
)->begin_section( 'Error:'
)->write_text( lv_msg
)->display( ).
ENDIF.

CATCH /goog/cx_sdk INTO lo_exception.
lv_msg = lo_exception->get_text( ).
MESSAGE lv_msg TYPE 'S' DISPLAY LIKE 'E'.
RETURN.
ENDTRY.

lo_storage->close( ).

On successful execution, you should see the program output below:

Program Output

File data as viewed in the transaction AL11

Downloaded File data

What’s Next?

In this blog post, we have highlighted some basic and useful scenarios for interacting with Cloud Storage using the ABAP SDK for Google Cloud. These scenarios are designed to get you started. In future blog posts, we will present examples that cover increasingly complex scenarios and explore further capabilities of the Cloud Storage JSON API.

We hope that you find these blog posts helpful in learning how to use the ABAP SDK for Google Cloud to interact with Cloud Storage.

Ready to start using ABAP SDK for Google Cloud?

Bookmark What’s new with the ABAP SDK for Google Cloud for the latest announcements and follow installation and configuration instructions.

Join the community today!

The ABAP SDK for Google Cloud Community is now open! This is a place for you to ask questions, share knowledge, and collaborate with other ABAP developers who are using Google Cloud.

We encourage you to get involved in the community and help us make the ABAP SDK for Google Cloud even better. We have a lot of exciting things planned for the future, and we want you to be a part of it.
