Migrating Sybase Tables to Google BigQuery using Cloud Data Fusion

Smithson Sylvester Dmello
Google Cloud - Community
4 min read · Dec 1, 2022

Sybase, now known as SAP Adaptive Server Enterprise (ASE), was considered one of the most popular and forward-thinking relational database management systems. It was appreciated mainly for its transaction processing speed, along with other features like high availability and data security.

After SAP acquired Sybase, its popularity appeared to decline, as SAP seemed to push its favored HANA database instead, leaving Sybase users in a fix.

Looking at the information published by SAP, it appears that Sybase is no longer positioned as a key component of SAP’s future product roadmap.

By migrating Sybase tables to GCP BigQuery with Cloud Data Fusion, enterprises can easily gain the benefits that come with BigQuery, such as a serverless architecture and scalability. Data insights can then be delivered using popular Business Intelligence tools such as Looker, Tableau, and Data Studio.

Context:

Today, we will look at how to migrate a Sybase table to GCP BigQuery using Cloud Data Fusion, the data integration service provided by Google Cloud.

Prerequisite:

We assume here that a Sybase database server is already installed on a GCP Compute Engine Linux instance.

Go to the Sybase install directory and check the server name.
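
If you are not sure what the server was named, one way to check (assuming the default install layout under /opt/sap, as in this setup) is to list the RUN files generated during installation:

# List the RUN_<servername> files created during installation
# (the install root /opt/sap is an assumption; adjust to your setup)
ls /opt/sap/ASE-16_0/install/RUN_*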

Set the environment by running the following command:

. ./SYBASE.sh

Now, switch to the install directory (in my case, /opt/sap/ASE-16_0/install is the path containing the RUN files) and run the following command:

startserver -f RUN_TEST

Note: In this case, during installation I gave the server names as TEST for the main server and TEST_BS for the backup server.

Similarly, start the backup server with the following command:

startserver -f RUN_TEST_BS

After this, run the following two commands to connect to the Sybase server:

unset LANG

isql64 -U sa -STEST

This will now prompt for the password we set up during the Sybase installation. Enter the password to connect to the server.
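
To quickly verify that we are connected to the right server, we can run a simple query at the isql prompt; the output should show the server name (TEST in this case):

select @@servername
go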

I have an Employee table that was created earlier, which we can view below:

Employee Table description
Sample Employee database table records
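
For illustration, a table along these lines could have been created from the isql prompt as follows. The column names and datatypes here are assumptions, since the actual schema is the one shown in the screenshot above:

-- Hypothetical Employee table; the real schema is in the screenshot
create table Employee (
    emp_id     int           not null,
    emp_name   varchar(50)   not null,
    emp_dept   varchar(30)   null,
    emp_salary numeric(10,2) null
)
go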

Create a corresponding BigQuery table schema, with proper datatype mappings in the BigQuery schema definition.

BigQuery Table Schema creation
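
As a sketch, the dataset and table could also be created with the bq command-line tool. The project, dataset, table, and column names below are illustrative, following common type mappings (Sybase int to INTEGER, varchar to STRING, numeric to NUMERIC):

# Create the target dataset and table (all names are illustrative)
bq mk --dataset my_project:sybase_migration
bq mk --table my_project:sybase_migration.employee \
    emp_id:INTEGER,emp_name:STRING,emp_dept:STRING,emp_salary:NUMERIC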

As the next step in creating a Data Fusion pipeline, we will add a Sybase JDBC driver to Data Fusion (assuming that we already have a GCP Data Fusion instance created).

Sybase Driver upload
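
When registering the driver, the class name has to match the uploaded JAR; for SAP’s jConnect 4 driver (jconn4.jar), the driver class is:

com.sybase.jdbc4.jdbc.SybDriver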

Database plugin: To establish connectivity, we use the generic Database plugin as the source.

Database plugin properties: We now need to provide the connection settings in the Database plugin: the JDBC connection string, the Sybase server credentials, and the import query for the table we want to migrate through this pipeline.

Sybase database plugin properties
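
For reference, jConnect connection strings follow the format jdbc:sybase:Tds:<host>:<port>/<database>. The concrete values below are assumptions for this setup (5000 is the default ASE port):

# Illustrative Database plugin settings (host, database, and query are assumptions)
Connection string: jdbc:sybase:Tds:10.128.0.5:5000/testdb
Import query:      SELECT * FROM dbo.Employee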

BigQuery Plugin: As our destination for the Sybase table migration is the BigQuery warehouse, we select the BigQuery plugin from the Sink menu.

BigQuery plugin

BigQuery Plugin Properties: In the BigQuery plugin properties, we need to specify the Project ID and the dataset and table into which we want to migrate the data.

BigQuery plugin properties

The final pipeline, from Sybase to BigQuery via Data Fusion, can be seen below:

Pipeline after successful run
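
Once the run succeeds, a quick way to verify the migrated rows is a count query against the destination table (using the illustrative project, dataset, and table names from above):

# Count the rows that landed in BigQuery (table name is illustrative)
bq query --use_legacy_sql=false \
    'SELECT COUNT(*) AS row_count FROM `my_project.sybase_migration.employee`'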

In this article, we had a brief walkthrough of how a low-code pipeline can be set up to migrate a Sybase database table to BigQuery, GCP’s serverless and cost-effective data warehouse.
