Adding SLF4J Logs to Akka

You might want to consider adding the SLF4J plugin to your default Akka logging. It helps you standardize your logs and makes them easier to analyze.

Akka's documentation explains how to plug SLF4J into an Akka application. I implemented the suggested approach and have listed the simplified steps below for quick reference.

Add dependency

Add akka-slf4j plugin and logback-classic dependency to your build.

For example, add the following to your pom.xml:
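A sketch of the corresponding Maven dependencies is below. The versions mirror the SBT example in this post; note that akka-slf4j is a Scala library, so the `_2.11` suffix on the artifact name is an assumption and must match your Scala version:

```xml
<!-- SLF4J adapter for Akka (artifact suffix must match your Scala version) -->
<dependency>
  <groupId>com.typesafe.akka</groupId>
  <artifactId>akka-slf4j_2.11</artifactId>
  <version>2.3.9</version>
</dependency>
<!-- Logback as the SLF4J backend -->
<dependency>
  <groupId>ch.qos.logback</groupId>
  <artifactId>logback-classic</artifactId>
  <version>1.1.2</version>
</dependency>
```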


If you are using SBT, your build.sbt would look like (note the %% operator, which appends the Scala version to the akka-slf4j artifact name):

libraryDependencies += "com.typesafe.akka" %% "akka-slf4j" % "2.3.9"
libraryDependencies += "ch.qos.logback" % "logback-classic" % "1.1.2"

Enable the SLF4J Logger for Akka

Update your application.conf (in src/main/resources) with the following. For Akka 2.3.x the setting is loggers with akka.event.slf4j.Slf4jLogger; the older event-handlers key and Slf4jEventHandler class are deprecated.

akka {
  loggers = ["akka.event.slf4j.Slf4jLogger"]
  loglevel = "INFO"
}

Add logback.xml

Add logback.xml to your classpath; that is, create a logback.xml at src/main/resources. The following is an example of logback.xml. The Logback documentation describes the various patterns you can use to configure your logs.

<?xml version="1.0" encoding="UTF-8"?>
<configuration>
  <appender name="CONSOLE" class="ch.qos.logback.core.ConsoleAppender">
    <encoder>
      <pattern>%d %X{akkaTimestamp} %-4r %-5level [%thread] %logger{0} %class{0} - %-5msg%n</pattern>
    </encoder>
  </appender>
  <appender name="FILE" class="ch.qos.logback.core.FileAppender">
    <!-- file is required by FileAppender; adjust the path as needed -->
    <file>application.log</file>
    <encoder>
      <pattern>%d %X{akkaTimestamp} %-4r %-5level [%thread] %logger{0} %class{0} - %-5msg%n</pattern>
    </encoder>
  </appender>
  <logger name="akka" level="DEBUG" />
  <root level="INFO">
    <appender-ref ref="CONSOLE"/>
    <appender-ref ref="FILE"/>
  </root>
</configuration>

Add Logs

Now you can add logs to your code. For example, add the following to your Scala code:

import akka.event.Logging

val log = Logging(context.system, this.getClass.getName)"Info message")

log.warning("Warning message")

log.debug("Debug message")

log.error("Error message")
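Putting the pieces together, a minimal actor sketch might look like the following (MyActor is a hypothetical name; passing `this` as the log source lets Akka derive the logger name from the actor class):

```scala
import akka.actor.Actor
import akka.event.Logging

class MyActor extends Actor {
  // The log source is derived from the actor instance itself
  val log = Logging(context.system, this)

  def receive = {
    case msg =>"Received message: {}", msg)
  }
}
```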

Unified Logging Solution Using the ELK Stack

There is a need for a unified logging solution: applications typically run across multiple nodes, and developers or management may not have SSH access to view the logs. Even with access, it is impractical to SSH into each node to find which one holds the log entries for a particular claim we are interested in.

There is also a need to maintain and check build logs as well as integration and performance test logs. This post explains how to set up and configure the ELK stack for log monitoring.

What is ELK?

Elasticsearch, Logstash, and Kibana.

Elasticsearch lets you search, in near real time, data of any kind and volume that flows into your system. A distributed, highly available Elasticsearch cluster can be set up to enable horizontal scalability. Data can be stored under multiple indices, which makes querying easier. The full-text search capability is provided by Lucene.

Logstash is a tool to manage events and logs: it collects logs, parses them, and stores them.

Kibana is a front end where you can view and interact with your data.

Logstash Forwarder (Lumberjack)

This is a tool that collects logs locally on a node and forwards them to Logstash.

Setting up ELK Stack (on ubuntu)

The setup covers the installation of four components:

Elasticsearch

Logstash

Kibana

Logstash Forwarder

The setup can be done on an AWS machine, instance type t2.medium (vCPU=2, memory=4 GB).

Install Java

Elasticsearch uses Apache Lucene, which is written in Java, so Java must be installed on the machine.

sudo add-apt-repository -y ppa:webupd8team/java

sudo apt-get update

sudo apt-get -y install oracle-java7-installer

Install Elasticsearch

wget -O - | sudo apt-key add -

echo 'deb stable main' | sudo tee /etc/apt/sources.list.d/elasticsearch.list

sudo apt-get update

sudo apt-get -y install elasticsearch=1.1.1

Update the config file:

sudo vi /etc/elasticsearch/elasticsearch.yml

Two edits are needed here: disable dynamic scripting (a security precaution) and restrict access to localhost:

script.disable_dynamic: true localhost

Start elasticsearch:

sudo service elasticsearch restart

Install Kibana

I have set up Logstash 1.4.2, which recommends Kibana 3.0.1:

cd ~; wget

tar xvf kibana-3.0.1.tar.gz

Edit the configuration file:

sudo vi ~/kibana-3.0.1/config.js

In the configuration, change the port from the default 9200 to 80:

elasticsearch: "http://"+window.location.hostname+":80"

Install Nginx to serve kibana

sudo mkdir -p /var/www/kibana3

sudo cp -R ~/kibana-3.0.1/* /var/www/kibana3/

sudo apt-get install nginx

User <== Port 80 ==> Kibana/Nginx <== Port 9200 ==> Elasticsearch

For this port routing, some configuration changes are needed. Kibana's sample Nginx configuration can be used as a starting point.

cd ~; wget

Edit configuration:

vi nginx.conf

Make changes as below:

server_name FQDN;

root /var/www/kibana3;

sudo mv nginx.conf /etc/nginx/sites-available/default
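For reference, the relevant parts of the edited nginx.conf might look like the sketch below. This is an assumption-based outline modeled on Kibana 3's sample configuration (the proxied location and htpasswd line are illustrative; your file may differ):

```nginx
server {
  listen 80;
  server_name;            # replace with your FQDN
  root /var/www/kibana3;             # serve the Kibana 3 static files

  # Proxy Elasticsearch so the browser only talks to port 80
  location ~ ^/_aliases$ {
    proxy_pass http://localhost:9200;
    proxy_read_timeout 90;
  }
}
```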

Secure using htpasswd

sudo apt-get install apache2-utils

sudo htpasswd -c /etc/nginx/conf.d/ user

Sanity Test

Start nginx:

sudo service nginx restart

Check that you can access Kibana at: http://logstash_server_public_ip/

Install Logstash

echo 'deb stable main' | sudo tee /etc/apt/sources.list.d/logstash.list

sudo apt-get update

sudo apt-get install logstash=1.4.2-1-2c0f5a1

Generate SSL certificates

sudo mkdir -p /etc/pki/tls/certs
sudo mkdir /etc/pki/tls/private

cd /etc/pki/tls;

cat > logstash.cnf << BLOCK1
[req]
distinguished_name = req_distinguished_name
x509_extensions = v3_req
prompt = no

[req_distinguished_name]
C = TG
ST = Togo
L = Lome
O = Private company
CN = *

[v3_req]
subjectKeyIdentifier = hash
authorityKeyIdentifier = keyid,issuer
basicConstraints = CA:TRUE
subjectAltName = @alt_names

[alt_names]
DNS.1 = *
DNS.2 = *.*
DNS.3 = *.*.*
DNS.4 = *.*.*.*
DNS.5 = *.*.*.*.*
DNS.6 = *.*.*.*.*.*
DNS.7 = *.*.*.*.*.*.*
IP.1 = <IP-address-of-logstash-server>
BLOCK1

Put the IP address of the Logstash server machine in place of <IP-address-of-logstash-server>.

Generate keys:

 sudo openssl req -x509 -batch -nodes -newkey rsa:2048 -keyout private/logstash-forwarder.key -out certs/logstash-forwarder.crt -config logstash.cnf -days 1825

Configure Logstash

There is a lot that can be done to configure Logstash; a basic configuration is shown below:

cat /etc/logstash/conf.d/logstash-default.conf

input {
  lumberjack {
    port => 5000
    type => "logs"
    ssl_certificate => "/etc/pki/tls/certs/logstash-forwarder.crt"
    ssl_key => "/etc/pki/tls/private/logstash-forwarder.key"
  }

filter {
  grok {
    type => "myapplog"
    pattern => "%{GREEDYDATA:logline}"
  }

output {
  elasticsearch { host => localhost }
  stdout { codec => rubydebug }

Start Logstash

sudo /opt/logstash/bin/logstash --config /etc/logstash/conf.d/logstash-default.conf &

You will know the Logstash server started successfully when you see something like the following:

Using milestone 1 input plugin 'lumberjack'. This plugin should work, but would benefit from use by folks like you. Please let us know if you find bugs or have suggestions on how to improve this plugin. For more information on plugin milestones, see {:level=>:warn}

Setup Logstash Forwarder

The Logstash Forwarder, formerly known as Lumberjack, is set up on the machine whose logs we want to capture.

Copy the SSL certificate from the Logstash server to the node on which your application will run. (Assume it has been copied to /tmp/logstash-forwarder.crt.)

sudo mkdir -p /etc/pki/tls/certs
sudo cp /tmp/logstash-forwarder.crt /etc/pki/tls/certs/

Install Logstash-forwarder

echo 'deb stable main' | sudo tee /etc/apt/sources.list.d/logstashforwarder.list
sudo apt-get update
sudo apt-get --yes --force-yes install logstash-forwarder
cd /etc/init.d/
sudo wget -O logstash-forwarder
sudo chmod +x logstash-forwarder
sudo update-rc.d logstash-forwarder defaults

Configure Logstash forwarder

There is a lot that can be configured and forwarded; a basic configuration is as below:

cat > /etc/logstash-forwarder.conf << BLOCK
{
  "network": {
    "servers": [ "$LOGSTASH_IP" ],
    "timeout": 15,
    "ssl ca": "/etc/pki/tls/certs/logstash-forwarder.crt"
  },
  "files": [
    {
      "paths": [ "$PWD" ],
      "fields": { "type": "myapplog" }
    }
  ]
}
BLOCK

where LOGSTASH_IP=logstash-server:port (in this case port=5000) and PWD="path of the log file".

Start logstash forwarder

sudo ./logstash-forwarder -config  /etc/logstash-forwarder.conf &


The ELK stack, along with the Logstash Forwarder, has now been set up. When your application runs and writes to its log file, the Logstash Forwarder sends the entries to the Logstash server, which parses, indexes, and stores them. When you open Kibana to view the logs, you can filter the data as needed.