Unified Logging Solution using ELK Stack

Background

There is a need for a unified logging solution: applications typically run across multiple nodes, and developers or management may not have SSH access to those nodes to view the logs. Even with access, it is impractical to SSH into each node just to find which one holds the logs for a particular claim of interest.

There is also a need to maintain and check build logs as well as integration and performance test logs. This post explains how to set up and configure the ELK stack for log monitoring.

What is ELK?

Elasticsearch, Logstash, Kibana

Elasticsearch lets you search any kind and any volume of data that flows into your system in near real time. A distributed, highly available Elasticsearch cluster can be set up for horizontal scalability. Data can be stored across multiple indices, which makes querying easier. The full-text search capability is provided by Apache Lucene.
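As a quick illustration of the REST API (a hypothetical index and document; runnable on the server once the Elasticsearch instance installed below is up on localhost:9200):

curl -XPUT 'http://localhost:9200/myindex/mytype/1' -d '{"message": "hello elk"}'
curl 'http://localhost:9200/myindex/_search?q=message:hello'

The first call indexes a JSON document; the second returns it via a full-text query.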

Logstash is a tool for managing events and logs: it collects logs, parses them, and stores them (in this setup, in Elasticsearch).

Kibana is a web front end where you can view and interact with your data.

Logstash Forwarder (Lumberjack)

This is a lightweight agent that collects logs locally on a node and forwards them to Logstash.

Setting up ELK Stack (on Ubuntu)

The setup covers the installation of four components:

Elasticsearch

Logstash

Kibana

Logstash Forwarder

The setup below was done on an AWS machine, instance type t2.medium (2 vCPUs, 4 GB memory).

Install Java

Elasticsearch uses Apache Lucene, which is written in Java, so Java must be installed on the machine.

sudo add-apt-repository -y ppa:webupd8team/java

sudo apt-get update

sudo apt-get -y install oracle-java7-installer
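You can verify the installation afterwards (the exact output depends on the installed update level):

java -version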

Install Elasticsearch

wget -O - http://packages.elasticsearch.org/GPG-KEY-elasticsearch | sudo apt-key add -

echo 'deb http://packages.elasticsearch.org/elasticsearch/1.1/debian stable main' | sudo tee /etc/apt/sources.list.d/elasticsearch.list

sudo apt-get update

sudo apt-get -y install elasticsearch=1.1.1

Update the config file:

sudo vi /etc/elasticsearch/elasticsearch.yml

Two edits are needed here: disable dynamic scripting (to prevent arbitrary code execution through the API) and bind Elasticsearch to localhost (so it is not reachable from outside):

script.disable_dynamic: true
network.host: localhost

Start elasticsearch:

sudo service elasticsearch restart
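To confirm Elasticsearch is up, query it locally; it should answer with a JSON document that includes the version number (1.1.1 here):

curl http://localhost:9200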

Install Kibana

This guide sets up Logstash 1.4.2, which recommends Kibana 3.0.1.

cd ~; wget https://download.elasticsearch.org/kibana/kibana/kibana-3.0.1.tar.gz

tar xvf kibana-3.0.1.tar.gz

Edit the configuration file:

sudo vi ~/kibana-3.0.1/config.js

In the configuration, change the Elasticsearch port from the default 9200 to 80:

elasticsearch: "http://"+window.location.hostname+":80"

Install Nginx to serve Kibana

sudo mkdir -p /var/www/kibana3

sudo cp -R ~/kibana-3.0.1/* /var/www/kibana3/

sudo apt-get install nginx

User <== Port 80 ==> Kibana/Nginx <== Port 9200 ==> Elasticsearch

For this port routing, some configuration changes need to be done. Kibana’s sample Nginx configuration can be used.
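In essence, the downloaded configuration does something like the following (a simplified sketch; the real file additionally restricts HTTP methods and wires in the htpasswd protection set up below):

server {
  listen 80;
  server_name kibana.myhost.org;   # assumed name; replace with your FQDN
  root /var/www/kibana3;           # the static Kibana files

  # Proxy selected Elasticsearch endpoints so the browser only ever talks to port 80
  location ~ ^/(_aliases|_nodes|.+/_search|.+/_mapping)$ {
    proxy_pass http://127.0.0.1:9200;
    proxy_read_timeout 90;
  }
}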

cd ~; wget https://gist.githubusercontent.com/thisismitch/2205786838a6a5d61f55/raw/f91e06198a7c455925f6e3099e3ea7c186d0b263/nginx.conf

Edit configuration:

vi nginx.conf

Make the changes below, replacing FQDN with the fully qualified domain name (or public IP) of your server:

server_name FQDN;

root /var/www/kibana3;

sudo mv nginx.conf /etc/nginx/sites-available/default
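Optionally check the configuration syntax before restarting Nginx:

sudo nginx -t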

Secure using htpasswd

sudo apt-get install apache2-utils

sudo htpasswd -c /etc/nginx/conf.d/kibana.myhost.org.htpasswd user
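The -c flag creates the password file and should only be used for the first user; to add further users (anotheruser is a placeholder), omit it:

sudo htpasswd /etc/nginx/conf.d/kibana.myhost.org.htpasswd anotheruser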

Sanity Test

Start nginx:

sudo service nginx restart

Check that you can access Kibana at: http://logstash_server_public_ip/

Install Logstash

echo 'deb http://packages.elasticsearch.org/logstash/1.4/debian stable main' | sudo tee /etc/apt/sources.list.d/logstash.list

sudo apt-get update

sudo apt-get install logstash=1.4.2-1-2c0f5a1

Generate SSL certificates

sudo mkdir -p /etc/pki/tls/certs
sudo mkdir /etc/pki/tls/private

cd /etc/pki/tls;

cat > logstash.cnf << BLOCK1
[req]
distinguished_name = req_distinguished_name
x509_extensions = v3_req
prompt = no

[req_distinguished_name]
C = TG
ST = Togo
L = Lome
O = Private company
CN = *

[v3_req]
subjectKeyIdentifier = hash
authorityKeyIdentifier = keyid,issuer
basicConstraints = CA:TRUE
subjectAltName = @alt_names

[alt_names]
DNS.1 = *
DNS.2 = *.*
DNS.3 = *.*.*
DNS.4 = *.*.*.*
DNS.5 = *.*.*.*.*
DNS.6 = *.*.*.*.*.*
DNS.7 = *.*.*.*.*.*.*
IP.1 = <IP-address-of-logstash-server>
BLOCK1

Replace <IP-address-of-logstash-server> with the IP address of the Logstash server machine.

Generate keys:

sudo openssl req -x509 -batch -nodes -newkey rsa:2048 -keyout private/logstash-forwarder.key -out certs/logstash-forwarder.crt -config logstash.cnf -days 1825
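You can optionally inspect the generated certificate to confirm the subject and the validity period:

openssl x509 -in certs/logstash-forwarder.crt -noout -subject -dates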

Configure Logstash

There is a lot that can be done to configure Logstash. A basic configuration is shown below:

cat /etc/logstash/conf.d/logstash-default.conf

input {
  # Receive events from logstash-forwarder over the lumberjack protocol
  lumberjack {
    port => 5000
    type => "logs"
    ssl_certificate => "/etc/pki/tls/certs/logstash-forwarder.crt"
    ssl_key => "/etc/pki/tls/private/logstash-forwarder.key"
  }
}
filter {
  # Capture each whole line into a "logline" field for events of type "myapplog"
  grok {
    type => "myapplog"
    pattern => "%{GREEDYDATA:logline}"
  }
}
output {
  # Index into the local Elasticsearch, and print to stdout for debugging
  elasticsearch { host => "localhost" }
  stdout { codec => rubydebug }
}
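Before starting, the configuration file can be validated without running the pipeline (Logstash 1.4 supports a --configtest flag):

sudo /opt/logstash/bin/logstash --config /etc/logstash/conf.d/logstash-default.conf --configtest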

Start Logstash

sudo /opt/logstash/bin/logstash --config /etc/logstash/conf.d/logstash-default.conf &

Logstash has started successfully when you see a message like the one below:

Using milestone 1 input plugin 'lumberjack'. This plugin should work, but would benefit from use by folks like you. Please let us know if you find bugs or have suggestions on how to improve this plugin. For more information on plugin milestones, see http://logstash.net/docs/1.4.2-modified/plugin-milestones {:level=>:warn}

Setup Logstash Forwarder

Logstash Forwarder, formerly known as Lumberjack, is set up on the machine whose logs we want to capture.

Copy the SSL certificate from the Logstash server to the node on which your application will run (assume it has been copied to /tmp/logstash-forwarder.crt):

sudo mkdir -p /etc/pki/tls/certs
sudo cp /tmp/logstash-forwarder.crt /etc/pki/tls/certs/

Install Logstash-forwarder

echo 'deb http://packages.elasticsearch.org/logstashforwarder/debian stable main' | sudo tee /etc/apt/sources.list.d/logstashforwarder.list
sudo apt-get update
sudo apt-get --yes --force-yes install logstash-forwarder
cd /etc/init.d/
sudo wget https://raw.github.com/elasticsearch/logstash-forwarder/master/logstash-forwarder.init -O logstash-forwarder
sudo chmod +x logstash-forwarder
sudo update-rc.d logstash-forwarder defaults

Configure Logstash forwarder

There is a lot that can be configured and forwarded. A basic example for the configuration is as below:

cat > /etc/logstash-forwarder.conf << BLOCK
{
  "network": {
    "servers": [ "$LOGSTASH_IP" ],
    "timeout": 15,
    "ssl ca": "/etc/pki/tls/certs/logstash-forwarder.crt"
  },
  "files": [
    {
      "paths": [
        "$PWD/myapplog.log"
      ],
      "fields": { "type": "myapplog" }
    }
  ]
}
BLOCK

where LOGSTASH_IP is set to logstash-server-ip:port (in this case the port is 5000) and PWD is the path of the directory containing the log file.

Start logstash forwarder

Since the init script was installed and registered above, the forwarder can be started as a service:

sudo service logstash-forwarder start
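To test the pipeline end to end, append a line to the monitored log file (<path-of-log-dir> is a placeholder; it must match the "paths" entry in the forwarder configuration above):

echo "test log entry $(date)" >> <path-of-log-dir>/myapplog.log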

Done

The ELK stack, along with Logstash Forwarder, is now set up. When your application runs and writes to the log file, Logstash Forwarder sends the entries to the Logstash server, which parses, indexes, and stores them. When you open Kibana and view the logs, you can filter the data as needed.
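As a final check on the Logstash server, you can list the indices; once events arrive you should see daily logstash-YYYY.MM.DD indices (Logstash's default index naming):

curl 'http://localhost:9200/_cat/indices?v'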
