
How to Set Up Fluent Bit, Elasticsearch, and Kibana for Log Management
In this blog post, I’ll walk you through the steps to set up Fluent Bit, Elasticsearch, and Kibana for centralized log management. This setup is particularly useful for collecting, parsing, and visualizing logs from applications like Couchbase. By the end of this guide, you’ll have a fully functional log management system.
1. Update Your System and Install Fluent Bit
First, ensure your system is up to date and install Fluent Bit, a lightweight log processor and forwarder.
sudo dnf update -y
sudo dnf install -y https://packages.fluentbit.io/centos/8/x86_64/fluent-bit-2.2.2-1.x86_64.rpm
sudo systemctl enable --now fluent-bit
systemctl status fluent-bit
This installs Fluent Bit and starts the service. Verify that it is running with systemctl status fluent-bit.
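If the service does not come up, the system journal usually shows why. A quick check (assuming the unit name fluent-bit, as installed by the official package):
# Show the most recent Fluent Bit service logs
sudo journalctl -u fluent-bit --no-pager -n 50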
2. Configure Fluent Bit
Next, configure Fluent Bit to collect logs from Couchbase and forward them to Elasticsearch.
Edit the Fluent Bit configuration file:
sudo vi /etc/fluent-bit/fluent-bit.conf
[SERVICE]
    Log_Level     info
    Parsers_File  /etc/fluent-bit/parsers.conf

[INPUT]
    Name              tail
    Path              /opt/couchbase/var/lib/couchbase/logs/*.log
    Tag               couchbase_logs
    Parser            couchbase_log_parser
    Refresh_Interval  5

[FILTER]
    Name    modify
    Match   couchbase_logs
    Rename  log message

[OUTPUT]
    Name                es
    Match               couchbase_logs
    Host                10.0.0.140
    Port                9200
    Index               couchbase-logs
    Time_Key            timestamp
    Time_Key_Format     %Y-%m-%dT%H:%M:%S.%L%z
    Suppress_Type_Name  On

[OUTPUT]
    Name   stdout
    Match  *
This configuration:
- Collects logs from Couchbase log files.
- Parses the logs using a custom parser.
- Forwards the logs to Elasticsearch and also outputs them to the console for debugging.
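Before relying on the new configuration, you can ask Fluent Bit to validate it without starting the pipeline (using the --dry-run flag; the binary path below assumes the layout used by the official RPM), then restart the service to apply it:
# Parse the configuration and exit immediately if it is valid
sudo /opt/fluent-bit/bin/fluent-bit -c /etc/fluent-bit/fluent-bit.conf --dry-run
# Apply the new configuration
sudo systemctl restart fluent-bit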
3. Create a Custom Parser for Couchbase Logs
Fluent Bit needs a parser to extract structured data from Couchbase logs. Create a parser configuration file:
sudo vi /etc/fluent-bit/parsers.conf
[PARSER]
    Name         couchbase_log_parser
    Format       regex
    Regex        ^(?<timestamp>\d{4}-\d{2}-\d{2}T\d{2}:\d{2}:\d{2}\.\d{3}\+\d{2}:\d{2})\s+\[(?<log_level>\w+)\]\s+(?<message>.*)$
    Time_Key     timestamp
    Time_Format  %Y-%m-%dT%H:%M:%S.%L%z
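To sanity-check the regular expression outside Fluent Bit, you can run the same pattern against a sample line with grep -P. The line below is a hypothetical example of the shape this parser expects (millisecond timestamp with a numeric timezone offset, a bracketed log level, then the message); adjust it to your actual Couchbase log format:
# Prints the line only if the pattern matches
echo '2025-03-20T02:27:40.850+03:00 [error] example supervisor report' | \
  grep -P '^\d{4}-\d{2}-\d{2}T\d{2}:\d{2}:\d{2}\.\d{3}\+\d{2}:\d{2}\s+\[\w+\]\s+.*$'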
4. Install and Configure Elasticsearch
Now, let’s install Elasticsearch to store and index the logs.
Add the Elasticsearch repository and install it:
sudo rpm --import https://artifacts.elastic.co/GPG-KEY-elasticsearch
cat <<EOF | sudo tee /etc/yum.repos.d/elasticsearch.repo
[elasticsearch]
name=Elasticsearch repository for 8.x packages
baseurl=https://artifacts.elastic.co/packages/8.x/yum
gpgcheck=1
gpgkey=https://artifacts.elastic.co/GPG-KEY-elasticsearch
enabled=1
autorefresh=1
type=rpm-md
EOF
sudo dnf install -y elasticsearch
sudo systemctl enable elasticsearch
Edit the Elasticsearch configuration file and add the settings below (disabling X-Pack security keeps this test setup simple; leave it enabled in production):
sudo vi /etc/elasticsearch/elasticsearch.yml
network.host: 0.0.0.0
xpack.security.enabled: false
Then start Elasticsearch and check that it is running:
sudo systemctl start elasticsearch
sudo systemctl status elasticsearch
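Once the service is up, Elasticsearch should answer on port 9200 (replace 10.0.0.140 with your own Elasticsearch host):
# Basic reachability check; returns node and version info as JSON
curl -s http://10.0.0.140:9200
# Cluster health summary (green/yellow/red)
curl -s http://10.0.0.140:9200/_cluster/health?pretty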
5. (Optional) Install Kibana for Log Visualization
If you want to visualize your logs, install Kibana:
sudo dnf install -y kibana
sudo systemctl enable --now kibana
Kibana will be available at http://<your-server-ip>:5601. You can create dashboards and visualizations to explore your logs.
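Note that Kibana listens only on localhost by default, so to reach it from another machine you will likely need to set server.host in /etc/kibana/kibana.yml (path as used by the RPM package) and point it at your Elasticsearch node, then restart the service:
# /etc/kibana/kibana.yml
server.host: "0.0.0.0"
elasticsearch.hosts: ["http://10.0.0.140:9200"]
Then restart Kibana:
sudo systemctl restart kibana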
Testing
To make sure the fields are typed correctly, delete any existing index, recreate it with an explicit mapping (timestamp as a date, log_level as a keyword, message as full text), and then query it:
curl -X DELETE "10.0.0.140:9200/couchbase-logs"
curl -X PUT "10.0.0.140:9200/couchbase-logs" -H 'Content-Type: application/json' -d'
{
  "mappings": {
    "properties": {
      "timestamp": {
        "type": "date",
        "format": "strict_date_optional_time||epoch_millis"
      },
      "log_level": {
        "type": "keyword"
      },
      "message": {
        "type": "text"
      }
    }
  }
}'
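You can confirm the mapping was applied before searching:
# Show the field mappings for the couchbase-logs index
curl -s "10.0.0.140:9200/couchbase-logs/_mapping?pretty"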
curl -X GET "10.0.0.140:9200/couchbase-logs/_search?pretty"
Result >>
{
"took" : 0,
"timed_out" : false,
"_shards" : {
"total" : 1,
"successful" : 1,
"skipped" : 0,
"failed" : 0
},
"hits" : {
"total" : {
"value" : 2091,
"relation" : "eq"
},
"max_score" : 1.0,
"hits" : [
{
"_index" : "couchbase-logs",
"_id" : "WEq7sJUBUh7VXPpY9IHx",
"_score" : 1.0,
"_source" : {
"timestamp" : "2025-03-19T23:27:41.%L+0000.851Z",
"message" : "[error_logger:error,2025-03-20T02:27:40.850+03:00,babysitter_of_ns_1@cb.local:ns_child_ports_sup<0.134.0>:ale_error_logger_handler:do_log:101]"
}
},
{
"_index" : "couchbase-logs",
"_id" : "WUq7sJUBUh7VXPpY9IHx",
"_score" : 1.0,
"_source" : {
"timestamp" : "2025-03-19T23:27:41.%L+0000.851Z",
"message" : "=========================SUPERVISOR REPORT========================="
}
},
{
"_index" : "couchbase-logs",
"_id" : "Wkq7sJUBUh7VXPpY9IHx",
"_score" : 1.0,
"_source" : {
"timestamp" : "2025-03-19T23:27:41.%L+0000.851Z",
"message" : " supervisor: {local,ns_child_ports_sup}"
}
},
{
"_index" : "couchbase-logs",
"_id" : "W0q7sJUBUh7VXPpY9IHx",
"_score" : 1.0,
"_source" : {
"timestamp" : "2025-03-19T23:27:41.%L+0000.851Z",
"message" : " errorContext: child_terminated"
}
},
{
"_index" : "couchbase-logs",
"_id" : "XEq7sJUBUh7VXPpY9IHx",
"_score" : 1.0,
"_source" : {
"timestamp" : "2025-03-19T23:27:41.%L+0000.851Z",
"message" : " reason: normal"
}
},
{
"_index" : "couchbase-logs",
"_id" : "XUq7sJUBUh7VXPpY9IHx",
"_score" : 1.0,
"_source" : {
"timestamp" : "2025-03-19T23:27:41.%L+0000.851Z",
"message" : " offender: [{pid,<0.19798.2>},"
}
},
{
"_index" : "couchbase-logs",
"_id" : "Xkq7sJUBUh7VXPpY9IHx",
"_score" : 1.0,
"_source" : {
"timestamp" : "2025-03-19T23:27:41.%L+0000.851Z",
"message" : " {id,{index,\"/opt/couchbase/bin/indexer\","
}
},
{
"_index" : "couchbase-logs",
"_id" : "X0q7sJUBUh7VXPpY9IHx",
"_score" : 1.0,
"_source" : {
"timestamp" : "2025-03-19T23:27:41.%L+0000.851Z",
"message" : " [\"-adminPort=9100\",\"-scanPort=9101\",\"-httpPort=9102\","
}
},
{
"_index" : "couchbase-logs",
"_id" : "YEq7sJUBUh7VXPpY9IHx",
"_score" : 1.0,
"_source" : {
"timestamp" : "2025-03-19T23:27:41.%L+0000.851Z",
"message" : " \"-streamInitPort=9103\",\"-streamCatchupPort=9104\","
}
},
{
"_index" : "couchbase-logs",
"_id" : "YUq7sJUBUh7VXPpY9IHx",
"_score" : 1.0,
"_source" : {
"timestamp" : "2025-03-19T23:27:41.%L+0000.851Z",
"message" : " \"-streamMaintPort=9105\",\"--httpsPort=19102\","
}
}
]
}
}
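Because log_level is mapped as a keyword, you can also filter on it directly once the parser populates that field, for example:
# Return only documents whose log_level is exactly "error"
curl -X GET "10.0.0.140:9200/couchbase-logs/_search?pretty" -H 'Content-Type: application/json' -d'
{
  "query": {
    "term": {
      "log_level": "error"
    }
  }
}'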
6. Verify the Setup
- Check Fluent Bit logs to ensure logs are being collected and forwarded.
- Use Elasticsearch’s REST API to query the couchbase-logs index.
- If Kibana is installed, create an index pattern and start exploring your logs.
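A quick end-to-end check is to look at the document count of the index (again assuming Elasticsearch at 10.0.0.140):
# Lists the couchbase-logs index with its health, document count, and size
curl -s "10.0.0.140:9200/_cat/indices/couchbase-logs?v"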