Collect server log files with Elasticsearch

This page shows how to collect request/response logs from servers and load them into Elasticsearch.

Logstash

Input data format:

2018-08-30 12:33:11.767 INFO  [trnId:11,corrId:insomnia-1535625191712] c.c.m.p.p.l.r.ConsoleMessageLogger - INBOUND REQUEST: {"header":{"METHOD":"POST","URI":"/my/payments?bank=0800","REQUEST_HEADERS":{"User-Agent":"insomnia/6.0.2","Content-Type":"application/json","Accept":"*/*","Content-Length":"335"}},"payload":{"amount": {"instructedAmount": {"currency": "CZK","value": 1}},"debtorAccount": {"identification": {"iban": "CZ0000000000001"}},"creditorAccount": {"identification": {"iban": "CZ0000000000002"}},"requestedExecutionDate" : "2018-08-30"}}
2018-08-30 12:33:11.939 INFO  [trnId:11,corrId:insomnia-1535625191712] c.c.m.p.p.l.r.ConsoleMessageLogger - OUTBOUND RESPONSE: {"header":{"STATUS":400,"RESPONSE_HEADERS":{"Date":"Thu, 30 Aug 2018 10:33:11 GMT","Content-Type":"application/json;charset=UTF-8"}},"payload":{"errors":[{"error":"FIELD_MISSING","scope":"paymentTypeInformation","dev-message":"Vyplnte prosím tento povinný údaj."}],"cz-transactionId":"100000803109966"}}
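Before wiring up Logstash, each line can be sanity-checked against a plain regular expression that approximates the grok pattern used below (a sketch only; grok's named patterns such as TIMESTAMP_ISO8601 are replaced with generic groups):

```kotlin
// Regex that approximates the grok pattern used in the Logstash filter below.
val logLine = Regex(
    """^(\d{4}-\d{2}-\d{2} \d{2}:\d{2}:\d{2}\.\d{3}) (\w+)\s+\[trnId:(\d+),corrId:([^\]]+)] (\S+) - (INBOUND|OUTBOUND) (REQUEST|RESPONSE): (.+)$"""
)

val sample = "2018-08-30 12:33:11.767 INFO  [trnId:11,corrId:insomnia-1535625191712] " +
        "c.c.m.p.p.l.r.ConsoleMessageLogger - INBOUND REQUEST: {\"header\":{\"METHOD\":\"POST\"}}"

// groupValues: [full match, timestamp, level, transactionId, correlationId,
//               class, direction, type, JSON body]
val groups = logLine.find(sample)!!.groupValues
```

If the regex matches every sample line, the grok pattern in the filter below should match them as well.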

Logstash configuration:

input {
  file {
    path => "C:/logs/*.log"
    start_position => "beginning"
  }
}

filter {
  grok {
    match => { "message" =>
      "^%{TIMESTAMP_ISO8601:timestamp} %{LOGLEVEL:level}\s+\[trnId:%{NUMBER:transactionId},corrId:%{DATA:correlationId}] %{JAVAFILE:class} - (?<operDirection>(INBOUND|OUTBOUND)) (?<operType>(REQUEST|RESPONSE)): %{GREEDYDATA:request}$"
    }
    remove_field => "message"
  }

  if "_grokparsefailure" in [tags] {
    drop { }
  }

  json {
    source => "request"
    target => "parsedJson"
  }

  if "_jsonparsefailure" in [tags] {
    mutate {
      add_field => {
        "jsonFailed" => "%{[request]}"
      }
    }
  }

  if [operType] == "REQUEST" and [parsedJson][header] {
    mutate {
      add_field => {
        "method" => "%{[parsedJson][header][METHOD]}"
        "uri" => "%{[parsedJson][header][URI]}"
        "headers" => "%{[parsedJson][header][REQUEST_HEADERS]}"
      }
    }
  }
  if [operType] == "REQUEST" and [parsedJson][payload] {
    mutate {
      add_field => {
        "payload" => "%{[parsedJson][payload]}"
      }
    }
  }
  if [operType] == "RESPONSE" and [parsedJson][header] {
    mutate {
      add_field => {
        "status" => "%{[parsedJson][header][STATUS]}"
      }
    }
  }
  if [operType] == "RESPONSE" and [parsedJson][header][RESPONSE_HEADERS] {
    mutate {
      add_field => {
        "headers" => "%{[parsedJson][header][RESPONSE_HEADERS]}"
      }
    }
  }
  if [operType] == "RESPONSE" and [parsedJson][payload] {
    mutate {
      add_field => {
        "payload" => "%{[parsedJson][payload]}"
      }
    }
  }

  mutate {
    remove_field => ["parsedJson", "request", "class", "host"]
  }

  date {
    match => [ "timestamp", "yyyy-MM-dd HH:mm:ss.SSS" ]
    timezone => "Europe/Prague"
    remove_field => ["timestamp"]
  }
}

output {
  elasticsearch {
    hosts => "localhost:9200"
    index => "my_api"
    document_type => "mytype"
  }
  stdout { codec => rubydebug }
}
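The date filter parses the local timestamp in the Europe/Prague timezone and stores it as a UTC @timestamp. The same conversion with java.time (a sketch using the pattern and timezone from the config; the result matches the Date response header in the sample, Thu, 30 Aug 2018 10:33:11 GMT):

```kotlin
import java.time.LocalDateTime
import java.time.ZoneId
import java.time.format.DateTimeFormatter

// Same pattern and timezone as the date filter in the Logstash config.
val fmt = DateTimeFormatter.ofPattern("yyyy-MM-dd HH:mm:ss.SSS")
val local = LocalDateTime.parse("2018-08-30 12:33:11.767", fmt)
val utc = local.atZone(ZoneId.of("Europe/Prague")).toInstant()

println(utc) // 2018-08-30T10:33:11.767Z (CEST is UTC+2 in August)
```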

Create logstashStart.bat:

logstash -f logstashConfig.conf

Downloading log files from server

1. Install rsync and add it to the Git installation folder, so it can be run from Git Bash

rsync is able to transfer only the differences in a file, so repeated downloads of a growing log are fast

2. Create bash script

#!/bin/sh
rsync -avP --inplace "$1"@server.cz:/log/log-s1.log ./logs

--inplace is required to keep the downloaded file's inode unchanged, which prevents Logstash from re-reading the whole file and duplicating records in Elasticsearch

3. Create process in Kotlin

import java.io.BufferedReader
import java.io.InputStreamReader

fun download() {
    val process = ProcessBuilder("bash", "C:\\logDownloader.sh", "username").start()

    val reader = BufferedReader(InputStreamReader(process.inputStream))
    val builder = StringBuilder()
    var line: String? = reader.readLine()

    while (line != null) {
        builder.append(line)
        builder.append(System.lineSeparator())
        line = reader.readLine()
    }

    process.waitFor()

    val result = builder.toString()
    println(result)
}
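The stream-reading loop is easy to get wrong (dropping the first line, or appending a trailing null), so it can be exercised without spawning a process by backing it with a StringReader (a self-contained sketch):

```kotlin
import java.io.BufferedReader
import java.io.StringReader

// Reads every line from the reader, including the first, appending a
// line separator after each one and stopping before appending null.
fun readAll(reader: BufferedReader): String {
    val builder = StringBuilder()
    var line: String? = reader.readLine()
    while (line != null) {
        builder.append(line)
        builder.append(System.lineSeparator())
        line = reader.readLine()
    }
    return builder.toString()
}

val text = readAll(BufferedReader(StringReader("first\nsecond")))
```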

4. Run in scheduler

@Scheduled(cron = "0 */5 7-19 * * MON-FRI")
fun reportCurrentTime() {
    downloaderService.download()
}

Create run.bat:

cd C:\elastic\elasticsearch-5.4.1\bin\
start elasticsearch.bat
cd C:\elastic\logstash-5.4.1\bin\
start logstashStart.bat
cd C:\elastic\kibana-5.4.1-windows-x86\bin\
start kibana.bat
cd C:\logdownload\
start mvn spring-boot:run
pause

Configure index my_api in Kibana with time field @timestamp.

Result for a request in Kibana: