Laravel Log Management using Filebeat + ELK (Elasticsearch, Logstash and Kibana)

Mehraien Arash
5 min read · Feb 22, 2024

Log management is one of the most important parts of your application, especially as it grows. In a mature application under heavy load, it is crucial to track the events that occur in the system to avoid disaster and system collapse. The solution is log management, which involves storing, rotating, filtering, visualizing, and analyzing logs.

One of the most popular and robust approaches to log management is ELK (Elasticsearch, Logstash, and Kibana). In the following, I would like to integrate ELK with a Laravel application, so let's get started.

How it works [Image from https://logz.io/blog/filebeat-vs-logstash/]

Step 1: Create a Laravel project and configuration

composer create-project laravel/laravel elk-log-management
cd elk-log-management

Now you have to make some changes in the application.

First, you should create a custom log formatter that converts log records into a format suitable for storing in Elasticsearch. Create a class at app/Services/CustomFormatter.php and write this in the file:

<?php

namespace App\Services;

use Monolog\Formatter\NormalizerFormatter;
use Monolog\LogRecord;

class CustomFormatter extends NormalizerFormatter
{
    public function format(LogRecord $record)
    {
        $result = parent::format($record);

        // Use config() rather than env(): env() returns null at runtime
        // when the configuration is cached
        $result['app_name'] = config('app.name');
        $result['@timestamp'] = $this->normalize($record->datetime); // Needed for Kibana

        /**
         * You can add any other property that you need
         */
        return $this->toJson($result) . "\n";
    }
}
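To make the result concrete, here is a hand-written illustration (not captured from a real run; the field values are assumptions) of the kind of JSON line this formatter appends to the log file:

```shell
# Illustrative JSON log line in the shape CustomFormatter produces;
# the field values here are made up for demonstration.
line='{"message":"User logged in","context":[],"level":200,"level_name":"INFO","channel":"daily","app_name":"ELK-log-manager","@timestamp":"2024-02-22T10:15:30.000000+00:00"}'

# Each record is a single JSON line, which lets Filebeat ship it line by line.
echo "$line"
```

The one-record-per-line shape matters: Filebeat reads the file line by line, so a multi-line pretty-printed record would be split into several events.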

Now, open config/logging.php and change your desired channel

  • I use ‘daily’ channel most of the time.
use App\Services\CustomFormatter;
.
.
.
'daily' => [
    'driver' => 'daily',
    'path' => storage_path('logs/laravel.log'),
    'level' => env('LOG_LEVEL', 'debug'),
    'days' => 14,
    'replace_placeholders' => true,
    'formatter' => CustomFormatter::class, // The only change needed here
],

Next, change .env file

APP_NAME=ELK-log-manager
LOG_CHANNEL=daily

Step 2: Set up the ELK infrastructure and configuration

Now, you must configure and run the ELK infrastructure.

  • I used Docker to configure the ELK infrastructure

So create an elk directory in the root of your Laravel application, and inside the elk directory create a docker-compose.yml file and write this in the file:

version: "3.8"
name: laravel-log-manager

networks:
  elk-network:
    driver: bridge

volumes:
  elastic-data-vl:

services:
  elasticsearch:
    image: elasticsearch:8.11.1
    container_name: elasticsearch
    restart: always
    volumes:
      - elastic-data-vl:/usr/share/elasticsearch/data/
    environment:
      ES_JAVA_OPTS: "-Xmx256m -Xms256m"
      bootstrap.memory_lock: "true"
      discovery.type: single-node
      xpack.license.self_generated.type: basic
      xpack.security.enabled: "false"
    ports:
      - "9200:9200"
      - "9300:9300"
    ulimits:
      memlock:
        soft: -1
        hard: -1
    networks:
      - elk-network

  logstash:
    image: logstash:8.11.1
    container_name: logstash
    restart: always
    volumes:
      - ./logstash/:/logstash_dir
    command: logstash -f /logstash_dir/logstash.conf
    depends_on:
      - elasticsearch
    ports:
      - "5044:5044"
      - "9600:9600"
    environment:
      LS_JAVA_OPTS: "-Xmx256m -Xms256m"
    networks:
      - elk-network

  kibana:
    image: kibana:8.11.1
    container_name: kibana
    restart: always
    ports:
      - "5601:5601"
    environment:
      # Kibana 8.x uses ELASTICSEARCH_HOSTS (ELASTICSEARCH_URL is from the 6.x era)
      - ELASTICSEARCH_HOSTS=http://elasticsearch:9200
    depends_on:
      - elasticsearch
    networks:
      - elk-network

  filebeat:
    image: elastic/filebeat:8.11.1
    container_name: filebeat
    user: root
    platform: linux/amd64
    volumes:
      - ./filebeat/filebeat.yml:/usr/share/filebeat/filebeat.yml
      # Be careful with this line: if you put this file in another path, it may
      # need to change. In my case 'docker-compose.yml' is in a directory named
      # 'elk' placed in the root of the Laravel app, so '../storage/logs'
      # reaches the log files.
      - ../storage/logs:/var/log/ELK-log-manager
    environment:
      - monitoring.enabled=true
    depends_on:
      - logstash
      - elasticsearch
    command: ["--strict.perms=false"]
    ulimits:
      memlock:
        soft: -1
        hard: -1
    stdin_open: true
    tty: true
    deploy:
      mode: global
    logging:
      driver: "json-file"
      options:
        max-size: "12m"
        max-file: "100"
    networks:
      - elk-network

Now, you must configure Filebeat and Logstash to work with each other.

Inside the elk directory create a filebeat directory, and inside filebeat create filebeat.yml and write this in the file:

filebeat.inputs:
  - type: log
    enabled: true
    paths:
      # Attention: '/var/log/ELK-log-manager' must be exactly the path that you
      # defined in the filebeat volume in docker-compose.yml
      - /var/log/ELK-log-manager/*.log

output.logstash:
  hosts: ["logstash:5044"]

logging.json: true
logging.metrics.enabled: false

logging:
  files:
    rotateeverybytes: 12582912

Inside the elk directory create a logstash directory, and inside logstash create logstash.conf and write this in the file:

input {
  beats {
    port => 5044
  }
}

filter {
  # CustomFormatter writes each record as a JSON line, so parse the JSON
  # payload rather than grokking it as plain text
  json {
    source => "message"
  }
}

output {
  elasticsearch {
    hosts => ["http://elasticsearch:9200"]
    index => "laravel-log-%{+YYYY.MM.dd}"
  }
}
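For reference, the %{+YYYY.MM.dd} in the index option is expanded from each event's @timestamp, producing one index per day. A quick sketch of the resulting name, using GNU date only to stand in for Logstash's expansion:

```shell
# For an event timestamped 2024-02-22, Logstash expands
# laravel-log-%{+YYYY.MM.dd} into a daily index name:
date -u -d '2024-02-22' '+laravel-log-%Y.%m.%d'
# prints laravel-log-2024.02.22
```

Daily indices are what make the laravel-* index pattern in Kibana match every day's data later on.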

Finally, add some log calls or throw some exceptions in any part of the Laravel application that you want.
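If you want to exercise the pipeline before wiring up any application code, one option is to append a hand-written JSON line (same shape as CustomFormatter's output) directly to the directory Filebeat watches. This is a sketch: the file name assumes the daily channel's date suffix, and the field values are made up.

```shell
# Append a hand-made JSON record to the watched log directory;
# run from the Laravel project root.
mkdir -p storage/logs
printf '%s\n' '{"message":"pipeline smoke test","level_name":"INFO","app_name":"ELK-log-manager","@timestamp":"2024-02-22T10:15:30+00:00"}' \
  >> storage/logs/laravel-2024-02-22.log

# Show the line Filebeat will pick up
tail -n 1 storage/logs/laravel-2024-02-22.log
```

Once the containers are running, this record should flow through Filebeat and Logstash and become searchable in Kibana like any real application log.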

Additional: You could customize and generalize exceptions; it definitely helps you track events more effectively (for more examples and configuration, visit my GitHub repository for this article [GitHub Repo]).

Now everything is ready. You can run the ELK infrastructure with:

cd elk  
docker compose up -d
Docker containers
  • After all the containers start, it may take 1–2 minutes for the ELK integration to be ready to use, so be patient
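While you wait, a quick way to check that each service answers on its mapped port is to probe the HTTP endpoints (a sketch assuming the default ports from docker-compose.yml; each line prints DOWN until that service is ready):

```shell
# Probe each service's HTTP endpoint; prints OK or DOWN per URL.
check() { curl -sf "$1" > /dev/null && echo "OK   $1" || echo "DOWN $1"; }

check http://localhost:9200/_cluster/health   # Elasticsearch
check http://localhost:9600                   # Logstash monitoring API
check http://localhost:5601/api/status        # Kibana
```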

Now you can start your Laravel application:

php artisan serve

Send some requests to your Laravel app to trigger the log calls or exceptions that you added to it.

Step 3: Discover logs in Kibana

Open http://localhost:5601/app/kibana#/management/kibana/index_patterns and you should reach this page:

Kibana Index Management

Now you should create an index pattern based on the index defined in logstash.conf.

Put laravel-* (if you changed the index name in logstash.conf, you must use that name) in the index pattern textbox, select @timestamp from the dropdown list, and save.

Create Index Pattern

After that, choose “Discover” from the menu and you should find the logs on this page.

Discover

Now you can filter or search them.

I hope you enjoyed this article. Do you have any questions? Please add them in the comments.

And for more example and configuration visit my GitHub repository for this article [GitHub Repo]
