JSON Logging in Mule 4: Getting the Most Out of Your Logs

Edgar Moran
Another Integration Blog
Sep 6, 2023 · 10 min read


JSON, or JavaScript Object Notation, is a lightweight data interchange format that is easy for both humans and machines to read and write. JSON logging means logging application data in JSON format instead of traditional formats like plain text or CSV.

Mule 4 provides out-of-the-box logging capabilities that can be configured to log in JSON format. This is achieved by configuring the Mule runtime’s log4j2.xml file to use a JSON layout. The JSON layout converts the log messages into JSON format, making them easier to parse and analyze.
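
To give a sense of what this looks like, here is a minimal sketch of a log4j2 appender using log4j2's built-in JsonLayout (this is illustrative, not the JSON logger configuration itself; JsonLayout also requires Jackson on the classpath):

<!-- Sketch: a console appender that emits each log event as one JSON line -->
<Appenders>
    <Console name="jsonConsole" target="SYSTEM_OUT">
        <JsonLayout compact="true" eventEol="true" properties="true" />
    </Console>
</Appenders>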

About the SDK

Use the Mule SDK for Java or XML to extend the Mule 4 Runtime by creating new modules that you can install in Mule apps. Examples of modules include connectors, such as HTTP, or modules with custom functionality, such as the Validations Module.

The SDK provides an API that:

  • Facilitates faster development of Mule modules than its predecessor, DevKit
  • Decouples modules from the Mule Runtime internals
  • Assures consistency across Mule components and other modules, such as cloud connectors

You can use either the advanced, feature-rich Java Mule SDK or the simpler XML SDK, which provides only outbound operations and doesn't support recursive calls. More information is available in the official MuleSoft SDK documentation.

Installing JSON logger

JSON logger is a MuleSoft extension that can be found in this GitHub repository. The extension has multiple features that enhance your logging experience and help you get the most out of your logs. Let's talk about some of them:

  • Content field data parsing: To parse data in content fields as part of the output JSON structure (instead of the current “stringify” behavior)
  • DataWeave functions: To accommodate desired log formatting
  • External destinations: To send logs to external systems
  • Data masking: To obfuscate sensitive content
  • Scope Logger: For in-scope calculation of elapsed times

To get the JSON logger installed and available on Anypoint Studio, follow these steps:

  • Pull the JSON Logger code from the GitHub repository to your local environment.
  • Open the folder in a text editor like VSCode and locate the json-logger folder.
  • Locate the pom.xml file and replace the groupId tag with your organization ID (Access Management > Organization > select your organization's name). For this sample, 73d212e2-80dd-4f41-9347-8dbb793c3b24 would be the value (see the pom.xml sketch after these steps).
  • From the command line (terminal), execute the deploy-to-exchange.sh script, which deploys the connector into our Exchange instance:
./deploy-to-exchange.sh <YOUR-ORG-ID>
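
For reference, here is a sketch of the pom.xml change (the groupId value is the sample organization ID from the step above; the artifactId shown is illustrative and may differ in the repository):

<!-- json-logger/pom.xml (sketch): groupId must be your Anypoint organization ID -->
<groupId>73d212e2-80dd-4f41-9347-8dbb793c3b24</groupId>
<artifactId>json-logger</artifactId>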

At this point, we assume your Maven settings.xml file is set up with credentials for the Exchange repository, which allows you to connect to your instance; otherwise, you'll see a 401 Unauthorized message in your terminal.

Another quick way to get it deployed is to create a settings.xml file in the json-logger folder and specify a new server with the id Exchange2 inside the servers tag. We can get a token by installing the unofficial Anypoint Chrome Extension and copying the token it displays.

Copy and paste that token value into the password tag, as in the sketch below.
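
A minimal sketch of that settings.xml, assuming the server id Exchange2 and MuleSoft's ~~~Token~~~ username convention for token-based authentication (replace the password with your own token):

<?xml version="1.0" encoding="UTF-8"?>
<settings>
    <servers>
        <server>
            <id>Exchange2</id>
            <!-- ~~~Token~~~ tells Exchange the password is a bearer token -->
            <username>~~~Token~~~</username>
            <password>YOUR-ANYPOINT-TOKEN</password>
        </server>
    </servers>
</settings>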

One last step is to open the deploy-to-exchange.sh file and, at line 27, add the -s json-logger/settings.xml flag so the command picks up our recently created file.
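
As a sketch, the Maven invocation inside the script would then look something like this (the actual goals and flags in deploy-to-exchange.sh may differ):

mvn clean deploy -DskipTests -s json-logger/settings.xml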

Finally, let's run the command one more time, and the extension should deploy.

You should now be able to see the connector in Exchange:

Using JSON logger in a project, and how to get the most out of it with Splunk

MuleSoft and Splunk are two powerful tools that can be used together to enhance your organization’s data integration and analysis capabilities. MuleSoft is an integration platform that allows you to connect different systems, applications, and data sources to exchange data seamlessly. Splunk, on the other hand, is a data analysis platform that helps you collect, index, and analyze machine-generated data.

By integrating MuleSoft and Splunk, you can achieve better visibility into your data and gain insights that were previously difficult to obtain. Here are some of the benefits of using MuleSoft and Splunk together:

  • Real-time data integration: MuleSoft allows you to connect to various data sources and systems in real time. With Splunk, you can collect and analyze this data in real time, allowing you to monitor and react to events as they occur.
  • Data analysis: Splunk has powerful data analysis capabilities, allowing you to identify patterns, trends, and anomalies in your data. By integrating MuleSoft with Splunk, you can analyze data from multiple sources to get a complete view of your organization’s data.
  • Streamline troubleshooting: MuleSoft and Splunk integration can help streamline the troubleshooting process by providing more visibility into your systems and applications. You can use Splunk to monitor logs and metrics from MuleSoft to quickly identify and resolve issues.
  • Better decision-making: By analyzing data from multiple sources, you can make more informed decisions. With MuleSoft and Splunk, you can get a complete view of your organization's data and use this information to make data-driven decisions.

If we think about API-led connectivity, there is powerful added value in keeping track of the logs for the transactions we care about: using a single common attribute such as X-Request-ID, we can trace a single request across multiple applications in the different layers.
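
As a sketch of that idea, a flow calling a downstream API could propagate the header like this (HTTP_Request_config and the /notifications path are hypothetical names, and we assume the incoming value was stored in a requestId variable, as shown later):

<!-- Sketch: forward X-Request-ID to the downstream API so the transaction
     can be traced across layers -->
<http:request method="POST" config-ref="HTTP_Request_config" path="/notifications">
    <http:headers><![CDATA[#[{'X-Request-ID': vars.requestId}]]]></http:headers>
</http:request>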

For this demo, let's create a simple system API that sends email notifications:

  • notification-system-api

The first step is to add the extension to our project. We can do that by selecting the Exchange button in the palette, searching for JSON Logger, and adding it. Some dependencies are pulled in, and the connector operations appear in the palette.

Next, let's add the connector and fill in the required information. Here we can specify values like the application name, version, and environment, as well as default options to disable fields or mask values.

Every system API will contain a flow variable called requestId, which holds the value of the X-Request-ID header. This allows the transaction ID to persist across the implemented flows.
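
A minimal sketch of how that variable could be set right after the HTTP listener (falling back to Mule's correlationId when the header is absent is our own assumption, not a requirement):

<!-- Sketch: capture X-Request-ID into a flow variable; default to the Mule
     correlation ID if the caller did not send the header (assumption) -->
<set-variable variableName="requestId"
              value="#[attributes.headers['X-Request-ID'] default correlationId]"
              doc:name="Set requestId" />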

The logger can be placed at the beginning and end of the flow, so you can also track how long the flow takes from start to end, as sketched below.
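
As a sketch, the two logger entries could look like this (the config-ref name is hypothetical; START and END are among the trace points the connector exposes):

<json-logger:logger doc:name="Log START" config-ref="JSON_Logger_Config"
                    message="notification-system-api: request received"
                    tracePoint="START" />
<!-- ... flow logic ... -->
<json-logger:logger doc:name="Log END" config-ref="JSON_Logger_Config"
                    message="notification-system-api: response sent"
                    tracePoint="END" />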

Now, before continuing and showing how the logs are getting inserted into Splunk, we need to mention what dependencies your application will need.

  • You need a Splunk subscription, either cloud or on-premises. For this demo, we created a Splunk Cloud trial account.
  • A Splunk token to use in our log4j2 file for authorization.
  • A repository and a couple of dependencies in our pom.xml file.

Repository:

<repository>
    <id>splunk-artifactory</id>
    <name>Splunk Releases</name>
    <url>https://splunk.jfrog.io/splunk/ext-releases-local</url>
</repository>

Dependencies:

<dependency>
    <groupId>com.splunk.logging</groupId>
    <artifactId>splunk-library-javalogging</artifactId>
    <version>1.11.4</version>
</dependency>
<dependency>
    <groupId>org.apache.logging.log4j</groupId>
    <artifactId>log4j-core</artifactId>
    <version>2.10.0</version>
</dependency>
<dependency>
    <groupId>log4j</groupId>
    <artifactId>log4j</artifactId>
    <version>1.2.17</version>
</dependency>
<dependency>
    <groupId>org.apache.logging.log4j</groupId>
    <artifactId>log4j-api</artifactId>
    <version>2.10.0</version>
</dependency>
  • The last step is to set up our log4j2.xml file (located under the src/main/resources folder):
<?xml version="1.0" encoding="UTF-8"?>
<Configuration monitorInterval="60" packages="com.splunk.logging,org.apache.logging.log4j">
    <Appenders>
        <RollingFile name="file" fileName="${sys:mule.home}/logs/${project.artifactId}.log"
                     filePattern="${sys:mule.home}/logs/archive/${project.artifactId}/${project.artifactId}-%d{yyyy-MM-dd}.log.gz">
            <PatternLayout pattern="%-5p %d [%t] %c: %m%n" />
            <Policies>
                <TimeBasedTriggeringPolicy />
            </Policies>
        </RollingFile>
        <SplunkHttp name="Splunk" url="https://prd-p-1lsw8.splunkcloud.com:8088"
                    token="c8e9a29c-d4d5-47a8-832f-cac0d2bf7911" index="mulesoft"
                    disableCertificateValidation="true">
            <PatternLayout pattern="%-5p %d [%t] [event: %X{correlationId}] %c: %m%n" />
        </SplunkHttp>
    </Appenders>
    <Loggers>
        <AsyncLogger name="com.mulesoft.mule.transport.jdbc" level="WARN" />
        <AsyncLogger name="org.apache.cxf" level="WARN" />
        <AsyncLogger name="org.apache" level="WARN" />
        <AsyncLogger name="org.springframework.beans.factory" level="WARN" />
        <AsyncLogger name="org.mule" level="INFO" />
        <AsyncLogger name="com.mulesoft" level="INFO" />
        <AsyncLogger name="org.jetel" level="WARN" />
        <AsyncLogger name="Tracking" level="WARN" />
        <AsyncRoot level="INFO">
            <AppenderRef ref="file" />
            <AppenderRef ref="Splunk" />
        </AsyncRoot>
    </Loggers>
</Configuration>

One thing to mention: in the example log4j2 file, the values are hardcoded. These values can be passed in during deployment by a CI pipeline. Once we deploy our application, it will start sending logs to Splunk, and we can begin creating reports.
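
One hedged way to avoid the hardcoding is log4j2's lookup syntax, so the values come from JVM system properties that a CI pipeline (or Runtime Manager properties) can set; the property names below are our own choice, not a convention:

<!-- Sketch: read Splunk settings from system properties instead of hardcoding
     (splunk.url, splunk.token, and splunk.index are hypothetical names) -->
<SplunkHttp name="Splunk"
            url="${sys:splunk.url}"
            token="${sys:splunk.token}"
            index="${sys:splunk.index}"
            disableCertificateValidation="true">
    <PatternLayout pattern="%-5p %d [%t] [event: %X{correlationId}] %c: %m%n" />
</SplunkHttp>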

How to customize JSON logging in Mule 4

As with its predecessor in Mule 3, the whole rationale behind JSON logger is to be a component you can customize through metadata changes in the provided JSON schemas, without having to know much about the SDK itself.

To customize the JSON output data structure, we follow much the same concepts described in the JSON logger documentation, using the annotations described below. A big change introduced in this version is that, for global expressions, we no longer need to define the field in both loggerConfig.json and loggerProcessor.json. Instead, everything defined at the config level that we want printed in the output logs needs to be part of the nested globalSettings object inside loggerConfig.json.

If you define expressions inside the global config, make sure their results are fairly static throughout the lifecycle of the Mule app at runtime (e.g., appName, appVersion, environment). If you define something dynamic like correlationId (which, in theory, changes per request), the SDK will create a new instance of the global config for every new value, which will lead to memory leaks.
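
To make that concrete, here is a hedged sketch inside loggerConfig.json: the first default resolves once from a deployment property and is safe, while the second changes per request and should live in loggerProcessor.json instead:

"globalSettings": {
    "type": "object",
    "properties": {
        "applicationName": {
            "type": "string",
            "sdk": { "default": "${json.logger.application.name}" }
        },
        "correlationId": {
            "type": "string",
            "sdk": { "default": "#[correlationId]" }
        }
    }
}

The applicationName default is resolved once per deployment; the correlationId default would force the SDK to spawn a new config instance on every request, which is exactly the leak described above.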

Supported configurations

In order to tell the JSON logger how each field should be treated, we need to use the “sdk” object. Inside this object we can use the following attributes:

  • default → As the name implies, it allows defining a default value. It also implicitly makes the field optional, so it doesn’t require the user to input a value.

Code:

"priority": {
"type": "string",
"javaType": "org.mule.extension.jsonlogger.api.pojos.Priority",
"enum": [
"DEBUG",
"TRACE",
"INFO",
"WARN",
"ERROR"
],
"sdk": {
"default": "INFO",
"summary": "Logger priority"
}
}

Preview:

  • required + example → Unless specified otherwise, all fields are considered mandatory. You can also explicitly mark a field as required == true. When a field is required, it's very helpful to provide an example text that points developers to the required data.

Code:

"message": {
"type": "string",
"sdk": {
"example": "Add a log entry",
"required": true,
"summary": "Message to be logged"
}
}

Preview:

  • displayName → Specifies the name to be displayed in the Studio/flow designer UI
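
Code (a hypothetical sketch to illustrate the attribute; the field and label names are our own):

"priority": {
    "type": "string",
    "sdk": {
        "displayName": "Log Priority",
        "default": "INFO"
    }
}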

Preview:

  • summary → Provides information about how to populate/use a specific field

Code:

"message": {
"type": "string",
"sdk": {
"example": "Add a log entry",
"required": true,
"summary": "Message to be logged"
}
}

Preview:

  • isContent → For scenarios where we need to log more complex data (e.g., based on expressions, payloads, attributes, etc.), we can set the isContent attribute to indicate to the SDK that the input will be a full-fledged DataWeave expression
  • Finally, the JSON logger will internally attempt to "stringify" the result of the expression and log it as part of the JSON object

Code:

"content": {
"type": "string",
"javaType": "org.mule.runtime.extension.api.runtime.parameter.ParameterResolver<org.mule.runtime.api.metadata.TypedValue<Object>>",
"sdk": {
"default": "#[output application/json - -n{n tpayload: payload,n tattributes: attributesn}]",
"summary": "Remove if no content should be logged",
"isContent": true
}
}

Preview:

  • isPrimaryContent → This option exists only for scenarios where you need more than one content field, since the SDK needs to be told which field is the primary one
  • expressionSupport (NOT_SUPPORTED / SUPPORTED / REQUIRED) → This field controls how the UI is generated for a specific field or object; e.g., if we want fields to be shown explicitly in the UI, we need to set the value to NOT_SUPPORTED.

Code:

"globalSettings": {
"type": "object",
"properties": {
"applicationName": {
"type": "string",
"sdk": {
"default": "${json.logger.application.name}"
}
},
(…),
"sdk": {
"parameterGroup": "Global Settings",
"expressionSupport": "NOT_SUPPORTED",
"placement": {
"order": 1
}
}
}

Preview:

  • placement → In the SDK, there are two ways to organize where things are displayed
  • order → Indicates the order in which fields will be displayed

Code:

"globalSettings": {
"type": "object",
"properties": {
"applicationName": {
"type": "string",
"sdk": {
"default": "${json.logger.application.name}"
}
},
(…),
"sdk": {
"parameterGroup": "Global Settings",
"expressionSupport": "NOT_SUPPORTED",
"placement": {
"order": 1
}
}
}

Preview:

  • tab → Allows displaying fields in different tabs

Code:

"correlationId": {
    "type": "string",
    "sdk": {
        "default": "#[correlationId]",
        "placement": {
            "tab": "Advanced"
        }
    }
}

Preview:

  • parameterGroup → Allows visually grouping a set of related fields

Code:

"globalSettings": {
"type": "object",
"properties": {
"applicationName": {
"type": "string",
"sdk": {
"default": "${json.logger.application.name}"
}
},
(…),
"sdk": {
"parameterGroup": "Global Settings",
"expressionSupport": "NOT_SUPPORTED",
"placement": {
"order": 1
}
}
}

Preview:

Hope this helps you in your logging journey!
