How did we implement a custom logging mechanism using Elasticsearch?

Our way of building a custom audit mechanism

Fatih YILMAZ
Trendyol Tech
5 min read · Dec 26, 2022



As the Homepage & Recommendation Team, we have lately been working on an audit logging mechanism: we run a back-office portal and wanted to keep a record of the people making changes through it.

We wanted to share our story and tell you how we did it.

For this, we implemented a custom logging mechanism using Elasticsearch. We already have a centralized logging mechanism using Fluent Bit and Elasticsearch with Kibana, but we wanted our custom logging mechanism to store only our audit logs and nothing else.

If you are looking for how to create centralized logging using Elasticsearch, you can refer to this story by Oğuzhan Demir.

Initial thoughts

First, we thought about just creating an ElasticSearchClient with a write method that would help us write a custom logging message such as:

AUDIT: User fatih.yilmaz has changed this widget with Id: 123, data: {...}

and use it in code like this:

private WidgetDTO dataManipulativeMethod(WidgetRequest request) {
    // business logic

    if (request.isWidgetChangeRequest()) {
        jestClient.execute(new Update.Builder(changesThatFatihMadeObject)
                .index(auditIndex)
                .id(request.getId())
                .build());
    }

    // business logic
}

Of course, we could've written a simple wrapper for this, but the business code would still have ended up tightly coupled with infrastructure code, which we didn't want.

So we looked for a more convenient alternative.

Implementation

Before Trendyol switched to centralized Elasticsearch logging, we were using Graylog, so we were already familiar with Logback appenders. That is why the idea of using an appender came to mind, and we started researching libraries to achieve this behavior.

We found this library from internetitem; you can also check the available configs here.

Here is our config:

<include resource="org/springframework/boot/logging/logback/defaults.xml"/>

<appender name="CONSOLE" class="ch.qos.logback.core.ConsoleAppender">
    <filter class="com.trendyol.loremipsumapi.configuration.logger.AuditExcludedLogFilter"/> <!-- important -->
</appender>
<appender name="ELASTIC" class="com.internetitem.logback.elasticsearch.ElasticsearchAppender">
    <url>http://lorem-ipsum-dolor-elastic.trendyol.com/_bulk</url>
    <index>widget-audit-%date{yyyy-MM-dd}</index>
    <errorLoggerName>widget-audit-error-logger</errorLoggerName>
    <properties>
        <property>
            <name>level</name>
            <value>%level</value>
        </property>
        <property>
            <name>logger-name</name>
            <value>%logger</value>
        </property>
        <property>
            <name>user-email</name>
            <value>%X{useremail}</value>
        </property>
    </properties>
    <headers>
        <header>
            <name>Content-Type</name>
            <value>application/json</value>
        </header>
    </headers>
    <filter class="com.trendyol.loremipsumapi.configuration.logger.AuditLogFilter"/> <!-- important -->
</appender>
<root level="INFO">
    <appender-ref ref="CONSOLE"/>
    <appender-ref ref="ELASTIC"/>
</root>

To feed the user email into our appender, we captured it in an interceptor like this:

public class HttpRequestInterceptor implements AsyncHandlerInterceptor {

    private static final String USER_EMAIL = "useremail";

    @Override
    public boolean preHandle(HttpServletRequest request, HttpServletResponse response, Object handler) {
        // application-specific way of resolving the authenticated user
        User currentUser = request.getUser();
        MDC.put(USER_EMAIL, currentUser.getEmail());

        return true;
    }

    @Override
    public void afterCompletion(HttpServletRequest request, HttpServletResponse response, Object handler, Exception ex) {
        // clean up so the value does not leak to pooled threads serving other requests
        MDC.remove(USER_EMAIL);
    }
}

Our log appender reads the user email from its ThreadLocal; to make the value available there, we needed to put it into the MDC.

For more detail about ThreadLocal and MDC, you can check this article.
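In short, the MDC is backed by a per-thread map, which is why a value put in preHandle is visible to the appender on the same request thread but not on others. A minimal sketch of that semantics using plain java.lang.ThreadLocal (names and the email value are illustrative):

```java
public class ThreadLocalDemo {

    // Each thread sees its own copy of this value, just like MDC entries.
    private static final ThreadLocal<String> USER_EMAIL = new ThreadLocal<>();

    // Reads the value as seen from a freshly started thread.
    static String readFromNewThread() throws InterruptedException {
        String[] seen = new String[1];
        Thread other = new Thread(() -> seen[0] = USER_EMAIL.get());
        other.start();
        other.join();
        return seen[0];
    }

    public static void main(String[] args) throws InterruptedException {
        USER_EMAIL.set("fatih.yilmaz@example.com");
        System.out.println("this thread sees: " + USER_EMAIL.get());
        System.out.println("a new thread sees: " + readFromNewThread());
    }
}
```

Here the spawning thread sees the stored email, while the new thread sees null; this is also why the interceptor should clear the MDC entry once the request is done.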

Separation of concerns
We marked AuditExcludedLogFilter and AuditLogFilter as important in the appender config above because we wanted to keep writing our application logs as before, with only one minor change: audit logs are discarded from our central logging system and go only into our custom Elastic.

public class AuditExcludedLogFilter extends AbstractMatcherFilter<ILoggingEvent> {

    @Override
    public FilterReply decide(ILoggingEvent event) {
        if (!isStarted() || StringUtils.isBlank(event.getMessage())) {
            return FilterReply.NEUTRAL;
        }

        // AUDIT_LOG_PREFIX is "AUDIT:", the prefix of our audit messages
        return event.getMessage().startsWith(AUDIT_LOG_PREFIX) ?
                FilterReply.DENY : FilterReply.ACCEPT;
    }
}

public class AuditLogFilter extends AbstractMatcherFilter<ILoggingEvent> {

    @Override
    public FilterReply decide(ILoggingEvent event) {
        if (!isStarted() || StringUtils.isBlank(event.getMessage()) ||
                !event.getMessage().startsWith(AUDIT_LOG_PREFIX)) {
            return FilterReply.DENY;
        }

        return FilterReply.ACCEPT;
    }
}

To be able to customize our filtering, we needed to extend AbstractMatcherFilter as seen above.
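Put together, the two filters implement one routing rule: a line starting with the audit prefix goes only to the ELASTIC appender, and everything else only to CONSOLE. A plain-Java distillation of that rule (the method names are our illustration, not the exact production code):

```java
public class AuditRouting {

    // The prefix of our audit messages, e.g.
    // "AUDIT: User fatih.yilmaz has changed this widget with Id: 123, data: {...}"
    static final String AUDIT_LOG_PREFIX = "AUDIT:";

    // Mirrors AuditLogFilter: only audit lines reach the Elastic appender.
    static boolean goesToElastic(String message) {
        return message != null && message.startsWith(AUDIT_LOG_PREFIX);
    }

    // Mirrors AuditExcludedLogFilter: audit lines are kept out of the central logs.
    static boolean goesToConsole(String message) {
        return message != null && !message.startsWith(AUDIT_LOG_PREFIX);
    }

    public static void main(String[] args) {
        String audit = "AUDIT: User fatih.yilmaz has changed this widget with Id: 123";
        String regular = "Widget cache refreshed";

        System.out.println("audit -> elastic: " + goesToElastic(audit));     // true
        System.out.println("audit -> console: " + goesToConsole(audit));     // false
        System.out.println("regular -> elastic: " + goesToElastic(regular)); // false
        System.out.println("regular -> console: " + goesToConsole(regular)); // true
    }
}
```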

Elasticsearch configurations
We wanted a retention policy of 60 days; to achieve this, we set the current date as a suffix in our appender's index name.

To do it manually:

PUT _ilm/policy/widget-audit-retention-policy
{
  "policy": {
    "phases": {
      "hot": {
        "min_age": "0ms",
        "actions": {
          "set_priority": {
            "priority": 100
          }
        }
      },
      "delete": {
        "min_age": "60d",
        "actions": {
          "delete": {
            "delete_searchable_snapshot": true
          }
        }
      }
    }
  }
}

We needed to create an index template mainly because of these two reasons:

  1. Our indexes are logically pieces of a whole. The suffix contains the date (for search purposes), but overall they are parts of “widget-audit”.
  2. Retention policies can only be applied to index templates.

Index template’s preview:

{
  "template": {
    "settings": {
      "index": {
        "lifecycle": {
          "name": "widget-audit-retention-policy",
          "rollover_alias": "widget-audit"
        }
      }
    },
    "aliases": {
      "widget-audit": {}
    },
    "mappings": {}
  }
}
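For reference, a template matching the preview above could be created manually with a request along these lines (the template name and index pattern here are our assumptions, not values from production):

```
PUT _index_template/widget-audit-template
{
  "index_patterns": ["widget-audit-*"],
  "template": {
    "settings": {
      "index": {
        "lifecycle": {
          "name": "widget-audit-retention-policy",
          "rollover_alias": "widget-audit"
        }
      }
    },
    "aliases": {
      "widget-audit": {}
    }
  }
}
```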

You can create an index pattern to be able to search your logs as you like.

Because we attached our alias in our index template, our index pattern is just a simple definition.

We would've needed to put a wildcard at the end of our index pattern if we hadn't connected our alias to the index template.

Search in action


Emre Tanriverdi and I wrote this story to lend a helping hand to those who want to do something similar in their projects.

We hope it was helpful. :)

Thank you for reading! ❤️

Thanks to all our colleagues in the Homepage & Recommendation Team. 🤟
