Procedure to Load updated records into Employee table

Biman Dey Sarkar
Nov 30, 2022

In another blog, you might have seen how to create credentials and an external table. Now let's create a dummy employee table holding the existing employee records.

Script to create a sample Employee table:

CREATE TABLE EMPLOYEE_TBL
 (USERID    VARCHAR2(100),
  FIRSTNAME VARCHAR2(200),
  LASTNAME  VARCHAR2(200),
  EMAILID   VARCHAR2(300),
  AGE       NUMBER(3));
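For reference, the CSV file dropped into Object Storage would look something like the sketch below — the filename and values are hypothetical; what matters is that the header row and column order match the external table's column list, since the loader skips the first line (skipheaders = 1):

```csv
USERID,FIRSTNAME,LASTNAME,EMAILID,AGE
U1001,Asha,Rao,asha.rao@example.com,34
U1002,Ravi,Menon,ravi.menon@example.com,41
U1003,Priya,Nair,priya.nair@example.com,29
```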

Next is the script to create a package to which the integration needs to pass the name of the file stored in Object Storage. The CREATE_EMP_STG procedure creates the stage table and is called from the PROCESS_EMP_DB procedure, which processes all delta and updated records in the database.

You need to create the credential (referenced here as 'Auth_Token') in the database in advance. I have a separate blog demonstrating how to create it.
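As a quick sketch, creating that credential with DBMS_CLOUD.CREATE_CREDENTIAL looks roughly like this — the username and token below are placeholders; for OCI Object Storage the username is your cloud user and the password is an auth token generated in the OCI console:

```sql
-- Sketch: run once as ADMIN before using the package (values are placeholders).
BEGIN
  DBMS_CLOUD.CREATE_CREDENTIAL(
    credential_name => 'Auth_Token',
    username        => 'oracleidentitycloudservice/your.user@example.com',
    password        => '<your-auth-token>');
END;
/
```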

CREATE OR REPLACE PACKAGE LOAD_DATA_INTO_EMP AS

  c_object_storage_bucket VARCHAR2(200) := 'https://objectstorage.ap-hyderabad-1.oraclecloud.com/n/############/b/bucket-############/o/';
  c_integration_id        NUMBER;
  c_credential_name       VARCHAR2(200) := 'Auth_Token';

  table_not_exists EXCEPTION;
  PRAGMA EXCEPTION_INIT(table_not_exists, -942);

  PROCEDURE CREATE_EMP_STG(p_bucket_uri      IN VARCHAR2,
                           p_filename        IN VARCHAR2,
                           p_credential_name IN VARCHAR2);

  PROCEDURE PROCESS_EMP_DB(p_filename IN VARCHAR2);

END LOAD_DATA_INTO_EMP;

Now let's create the package body, which performs all of the operations below:

  • The CREATE_EMP_STG procedure first checks whether the stage table exists; if it does, the table is dropped.
  • CREATE_EMP_STG then creates the external table, which is loaded with fresh data from Object Storage.
  • PROCESS_EMP_DB is the main procedure; it calls CREATE_EMP_STG to create the stage table.
  • PROCESS_EMP_DB then compares the new records in the stage table against the existing data in the employee table and merges them as required.

Please find below the code snippet to create the package body.

CREATE OR REPLACE PACKAGE BODY LOAD_DATA_INTO_EMP IS

  PROCEDURE CREATE_EMP_STG(p_bucket_uri      IN VARCHAR2,
                           p_filename        IN VARCHAR2,
                           p_credential_name IN VARCHAR2) IS

    l_stage_table_name VARCHAR2(100) := 'TEMP_EMP_STAGE_TBL';
    table_not_exist    EXCEPTION;
    PRAGMA EXCEPTION_INIT(table_not_exist, -942);

  BEGIN
    -- Drop the external table if it already exists
    BEGIN
      EXECUTE IMMEDIATE 'DROP TABLE ' || l_stage_table_name;
    EXCEPTION
      WHEN table_not_exist THEN
        NULL;
      WHEN OTHERS THEN
        RAISE;
    END;

    -- Create the external table over the file in Object Storage
    DBMS_CLOUD.CREATE_EXTERNAL_TABLE(
      table_name      => l_stage_table_name,
      credential_name => p_credential_name,
      file_uri_list   => p_bucket_uri || p_filename,
      column_list     => 'USERID VARCHAR2(200),FIRSTNAME VARCHAR2(200),LASTNAME VARCHAR2(200),EMAILID VARCHAR2(200),AGE NUMBER',
      format          => JSON_OBJECT('delimiter' VALUE ',',
                                     'type' VALUE 'csv',
                                     'skipheaders' VALUE '1',
                                     'ignoremissingcolumns' VALUE 'true'));

  END CREATE_EMP_STG;


  PROCEDURE PROCESS_EMP_DB(p_filename IN VARCHAR2) IS

    l_status           VARCHAR2(10);
    l_error_message    VARCHAR2(1000);
    l_stage_table_name VARCHAR2(100) := 'TEMP_EMP_STAGE_TBL';

  BEGIN

    -- Call CREATE_EMP_STG to (re)create the external stage table
    CREATE_EMP_STG(p_bucket_uri      => c_object_storage_bucket,
                   p_filename        => p_filename,
                   p_credential_name => c_credential_name);

    -- Merge the staged data into the main table
    EXECUTE IMMEDIATE 'MERGE INTO "ADMIN"."EMPLOYEE_TBL" E
      USING (SELECT DISTINCT * FROM TEMP_EMP_STAGE_TBL) S
      ON (S."USERID" = E."USERID")
      WHEN MATCHED THEN
        UPDATE SET
          E."FIRSTNAME" = S."FIRSTNAME",
          E."LASTNAME"  = S."LASTNAME",
          E."EMAILID"   = S."EMAILID",
          E."AGE"       = S."AGE"
      WHEN NOT MATCHED THEN
        INSERT (E."USERID", E."FIRSTNAME", E."LASTNAME", E."EMAILID", E."AGE")
        VALUES (S."USERID", S."FIRSTNAME", S."LASTNAME", S."EMAILID", S."AGE")';
    COMMIT;

  END PROCESS_EMP_DB;


END LOAD_DATA_INTO_EMP;
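Once the package compiles, a run might look like the sketch below. The filename is hypothetical; in practice the integration passes whatever object name was uploaded to the bucket:

```sql
-- Hypothetical invocation: process a delta file already present in the bucket.
BEGIN
  LOAD_DATA_INTO_EMP.PROCESS_EMP_DB(p_filename => 'emp_delta_20221130.csv');
END;
/

-- Optional sanity check after the merge:
SELECT COUNT(*) FROM EMPLOYEE_TBL;
```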

I hope this helps you load millions of records directly into the database in very little time. One of the biggest tables we had contained around 300 million records; we split those records across multiple files and migrated them within 3 hours.


Biman Dey Sarkar

Around 15 years of experience in Oracle Integration. I have worked on cloud migration projects with several clients across different regions.