Storing OIC JSON input payload in ATP for Audit
Introduction:
When an integration is activated in OIC Gen 3, three levels of tracking are available: Production, Audit, and Debug. However, it is not advisable to use the Audit and Debug tracing levels in a production environment, as they can slow the system down and pose a security risk because sensitive payload information gets logged. Yet for troubleshooting, this data can be essential to operations. In such situations, it is advisable to keep this data in a database for audit.
Main Content:
As a best practice, we create a common integration as an auditing framework to store the input payload along with the instance ID, integration name, tracking variables, instance dateTime, and so on. This information in the database can be useful for troubleshooting even when tracking is set to Production and no logs are available in the Audit Trail. However, OIC internally processes all payloads as XML by default; hence, even when we receive a JSON payload, it gets converted to XML when we try to store it in the database.
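For illustration, a minimal audit table in ATP could look like the sketch below; the table and column names are assumptions for this post, not a fixed naming convention:

-- Hypothetical audit table; names and sizes are illustrative only
CREATE TABLE oic_audit_log (
    instance_id      VARCHAR2(100),
    integration_name VARCHAR2(200),
    tracking_var_1   VARCHAR2(400),
    payload          CLOB,
    created_on       TIMESTAMP DEFAULT SYSTIMESTAMP
);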
For example, we used the function below in the OIC mapper to store the entire input payload as a string. Although the input payload was JSON, you can see that the database has it stored as XML.
oraext:get-content-as-string(/nssrcmpr:execute/ns19:request-wrapper)
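To illustrate the conversion, suppose the trigger receives a simple JSON request like this (a hypothetical sample):

{ "empId": "101", "empName": "John Smith" }

What lands in the database is its XML representation, roughly of the shape below (the element and namespace names depend on your REST adapter configuration):

<request-wrapper>
   <empId>101</empId>
   <empName>John Smith</empName>
</request-wrapper>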
However, if a client wants to replicate a scenario from the beginning, this XML is of no use to them. So we have to store the as-is JSON coming in the input, the same form that is visible in the dequeued Wire Messages:
Step-by-Step Guides/Tutorials:
To meet our requirement, we use the Stage File activity to write the JSON payload, read the complete payload back as an opaque element, and then store it in the database. Let's start building the integration.
Assume our integration uses a REST adapter trigger that receives a request with a sample employee JSON payload. We then use the same sample JSON payload (saved as a file) in the Stage File write activity. Select the “Write File” operation and provide a dummy FileName/Directory where the file will be written in the stage location.
While specifying the schema, select Sample JSON document and click Continue, then select the sample JSON payload you used in the trigger connection and click Finish. In other words, we write the entire employee JSON input payload as-is to the stage location.
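For reference, a hypothetical employee sample payload like the one below can serve both the trigger and the Stage File schema; the field names are illustrative only:

{
  "empId": "101",
  "empName": "John Smith",
  "department": "Finance",
  "email": "john.smith@example.com"
}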
Once the Stage File write configuration is complete, map each node of the JSON input payload in the OIC mapper. The completed mapper looks like this:
Now we have to read the file back as binary content, for which we use opaque.xsd. Configure another Stage File activity, this time selecting the Read Entire File operation. You can either use the file reference from the Write File response or manually enter the FileName/Directory specified in the write activity.
This time, instead of a JSON sample, choose XML Schema (XSD) Document to describe the file, and select an XSD containing the opaque schema. With this, the payload written in the previous step is read as base64-encoded binary into the opaqueElement. The sample opaque.xsd is shown below:
<?xml version="1.0" encoding="UTF-8"?>
<schema targetNamespace="http://xmlns.oracle.com/pcbpel/adapter/opaque/"
        xmlns="http://www.w3.org/2001/XMLSchema">
  <element name="opaqueElement" type="base64Binary"/>
</schema>
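After this read, the Stage File response carries the file content base64-encoded inside opaqueElement, conceptually like the snippet below (the wrapper element names vary by integration, and the encoded value shown is truncated and illustrative):

<ReadResponse>
  <opaqueElement>eyJlbXBJZCI6...</opaqueElement>
</ReadResponse>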
After that, configure the DB adapter to insert the payload into the database. You can choose any of the following options to load the data:
- Perform an Operation On a Table
- Run a SQL Statement
- Invoke a Stored Procedure
I personally prefer writing a stored procedure for such actions, as it gives more control over error handling; that said, the developer has complete control over the strategy they choose to employ. Once the DB adapter has been configured, map the required details. As part of my POC, I copy just three parameters into the database (a sketch of such a procedure follows the list below):
- Instance ID
- Integration Name
- Payload
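As a minimal sketch, assuming the hypothetical oic_audit_log table shown earlier, such a stored procedure could look like the following (the names and error handling are illustrative, not a standard):

-- Hypothetical procedure invoked from the OIC DB adapter
CREATE OR REPLACE PROCEDURE store_oic_audit (
    p_instance_id      IN VARCHAR2,
    p_integration_name IN VARCHAR2,
    p_payload          IN CLOB
) AS
BEGIN
    INSERT INTO oic_audit_log (instance_id, integration_name, payload)
    VALUES (p_instance_id, p_integration_name, p_payload);
    COMMIT;
EXCEPTION
    WHEN OTHERS THEN
        ROLLBACK;
        RAISE;
END store_oic_audit;
/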
Because we read the payload using the opaque schema, it arrives in base64 format. Hence we use the oraext:decodeBase64 function to decode it before storing it in the database.
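For example, the mapping to the payload parameter could look like the expression below (the namespace prefixes and the p_payload target come from my hypothetical setup and will differ in your integration):

oraext:decodeBase64(/nsmpr2:ReadResponse/ns21:opaqueElement)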
Execution:
Now we are ready to test the integration; the overall flow appears as follows. Assign the tracking variables, then save and activate the integration.
We execute the integration with a valid employee JSON payload and expect the exact same JSON to be stored in the database.
You can now see that the exact same payload has been stored in the database, which the operations team can use to replicate any error scenario.
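To verify, a quick query against the hypothetical audit table returns the stored JSON as-is:

SELECT instance_id, integration_name, payload
  FROM oic_audit_log
 ORDER BY created_on DESC;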
In Conclusion:
Auditing is crucial in any type of integration, but we also need to make sure it does not impact performance. For async integrations, you can use the approach outlined in this blog post as-is. For sync integrations, however, the customer is waiting for a response from the OIC endpoint; in those scenarios, I advise implementing any auditing in a common child integration that is called asynchronously from the main integration. This lets the actual flow proceed in parallel with the auditing.
I hope anyone attempting to store a JSON input payload as-is finds this blog useful.
➡️ Please follow me on LinkedIn
I am an accomplished professional with 15 years of experience in Oracle Integration. Throughout my career, I have been deeply involved in architecting and implementing integration solutions for various organizations, enabling them to seamlessly connect systems and optimize their business requirements and cloud migrations.