Import Millions of Records into Autonomous Database (ADB) using DBMS_CLOUD

Biman Dey Sarkar
Nov 29, 2022


To import a huge amount of data into the database, you first need to create an Auth Token so that ADB can read files in Object Storage. You can choose either option: create API Keys or an Auth Token. In the case of Auth Tokens, don't forget to keep the generated password handy, as it is needed to create the credential.

Sample Auth Token created

Now log in as your Admin database user and make sure the grants below are given to the user. You can use either SQL Developer Web from the cloud console or the SQL Developer client.

grant dwrole, oml_developer, create table, create view to <user>;
grant read, write on directory data_pump_dir to <user>;
grant execute on dbms_cloud to <user>;
alter user <user> quota unlimited on data;

You can execute either of the following SQL blocks (with the required details filled in) to establish a credential with Object Storage:

-- Option 1: credential based on the Auth Token generated above
BEGIN
  DBMS_CLOUD.CREATE_CREDENTIAL(
    credential_name => 'api_token',
    username        => '<cloud userid>',
    password        => '<generated auth token password>'
  );
END;
/

-- Option 2: credential based on API signing keys
BEGIN
  DBMS_CLOUD.CREATE_CREDENTIAL(
    credential_name => 'api_token',
    user_ocid       => 'user_ocid',
    tenancy_ocid    => 'tenancy_ocid',
    private_key     => 'private_key',
    fingerprint     => 'fingerprint'
  );
END;
/
Sample connection created using SQL Developer

If you want to check whether the connection was established correctly, you can execute DBMS_CLOUD.LIST_OBJECTS to list all the files available in the required bucket.

DBMS_CLOUD.LIST_OBJECTS details
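A minimal query for this check could look like the one below, assuming the credential created above (api_token) and with the region, namespace, and bucket placeholders replaced by your own values:

-- List the objects visible to the credential in the given bucket
SELECT object_name, bytes
FROM   DBMS_CLOUD.LIST_OBJECTS(
         credential_name => 'api_token',
         location_uri    => 'https://objectstorage.<region>.oraclecloud.com/n/<namespace>/b/<bucket>/o/');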

Now let's create a file employee_$Date.csv with 100,000 dummy records and store it in Object Storage. We will try to load the entire file into the database.
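The exact file content is not shown here; it is assumed to match the column list used below (USERID, FIRSTNAME, LASTNAME, EMAILID, AGE), with a header row and quoted values, for example:

USERID,FIRSTNAME,LASTNAME,EMAILID,AGE
"U00001","John","Doe","john.doe@example.com","34"
"U00002","Jane","Smith","jane.smith@example.com","29"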

Now you just need to execute the DBMS_CLOUD.CREATE_EXTERNAL_TABLE call below, which reads the data from this file and makes it available in a stage table within a second.

BEGIN
  DBMS_CLOUD.CREATE_EXTERNAL_TABLE(
    table_name      => 'Temp_Employee_Stg',
    credential_name => 'api_token',
    file_uri_list   => '<object storage file URI>',
    format          => json_object('delimiter' value ',', 'removequotes' value 'true', 'ignoremissingcolumns' value 'true', 'blankasnull' value 'true', 'skipheaders' value '1'),
    column_list     => 'USERID VARCHAR2(200), FIRSTNAME VARCHAR2(200), LASTNAME VARCHAR2(200), EMAILID VARCHAR2(200), AGE NUMBER');
END;
/
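Once the external table is created, a quick sanity check (using the table and credential names above) is to count the rows and validate the definition, for example:

-- Verify that the external table can read the file
SELECT COUNT(*) FROM Temp_Employee_Stg;

-- Optionally validate the external table definition against the file
BEGIN
  DBMS_CLOUD.VALIDATE_EXTERNAL_TABLE(table_name => 'Temp_Employee_Stg');
END;
/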

The block below executed in milliseconds to load the entire data set into the stage table.

Took 0.096 sec to insert all the records into the stage table
All data loaded successfully

Sometimes you will run into limitations in OIC when processing large amounts of data into ADB. In those scenarios, you can create a custom PL/SQL package around the different features of DBMS_CLOUD and execute it from an OIC integration. Please find below a sample OIC integration:

  • The integration lists the files available in Object Storage.
  • It then gets the filename of the file to load into the ATP DB.
  • After that, OIC passes the filename to a procedure that loads the data into a stage table (a sketch of such a procedure is shown below).
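The loader procedure itself is not shown in this post; below is a minimal sketch of one way it could look, assuming a pre-created stage table Temp_Employee_Stg, the api_token credential from earlier, and a hypothetical parameter p_file_name carrying the filename passed in from OIC. DBMS_CLOUD.COPY_DATA is used here so the rows are physically loaded into the stage table:

CREATE OR REPLACE PROCEDURE load_employee_file (p_file_name IN VARCHAR2)
AS
BEGIN
  -- Load the given Object Storage file into the existing stage table
  DBMS_CLOUD.COPY_DATA(
    table_name      => 'TEMP_EMPLOYEE_STG',
    credential_name => 'api_token',
    file_uri_list   => 'https://objectstorage.<region>.oraclecloud.com/n/<namespace>/b/<bucket>/o/' || p_file_name,
    format          => json_object('delimiter' value ',', 'removequotes' value 'true', 'blankasnull' value 'true', 'skipheaders' value '1')
  );
END load_employee_file;
/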

I have written a separate blog on creating a procedure to merge the new records with the existing records.

Sample Integration to load bulk data from Object Storage to ADB

References:

https://docs.oracle.com/en/cloud/paas/autonomous-database/adbsa/dbms-cloud-subprograms.html

https://cloud-code-innovate.github.io/dbms_cloud/

Thanks for going through the detailed implementation.


Biman Dey Sarkar

Around 15 years of experience in Oracle Integration. I have worked on cloud migration projects with several clients from different regions.