ELK Stack

Elasticsearch, Logstash, Kibana

A nice article on the Logstash grok filter plugin: https://www.elastic.co/guide/en/logstash/current/plugins-filters-grok.html

Java app logs to ELK

Here is a good tutorial: https://balamaci.ro/java-app-monitoring-with-elk-logstash/

Java Side

SLF4J is used as the logging facade; Logback is the logging engine.
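A minimal sketch of what that looks like in application code (the class and method names are made up for the example): the code only ever talks to the SLF4J API, and Logback, being the engine on the classpath, processes the events.

import org.slf4j.Logger;
import org.slf4j.LoggerFactory;

public class OrderService {

    // the application only sees the SLF4J interface, never Logback directly
    private static final Logger log = LoggerFactory.getLogger(OrderService.class);

    public void placeOrder(String orderId) {
        // {} placeholders are only formatted if the level is enabled
        log.info("Placing order {}", orderId);
        try {
            // ... business logic ...
        } catch (RuntimeException e) {
            // the throwable travels with the log event, stack trace included
            log.error("Order {} failed", orderId, e);
        }
    }
}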

build.gradle

Gradle config for Logback and the Logstash encoder (note: newer Gradle versions use implementation instead of compile).

dependencies {

    // Logging facade. Application code only uses the SLF4J interface.
    // https://mvnrepository.com/artifact/org.slf4j/slf4j-api
    compile group: 'org.slf4j', name: 'slf4j-api', version: '1.7.25'

    // Logging engine. With this on the classpath the log events are actually
    // processed and stored in the default location.
    compile 'ch.qos.logback:logback-core:1.1.3'

    // required for logging LoggingEvents
    compile 'ch.qos.logback:logback-classic:1.1.3'

    // the Logstash encoder, used by the STASH appender in logback.xml
    compile 'net.logstash.logback:logstash-logback-encoder:4.6'
    
    ...
}
logback.xml

Configure an appender that ships the log events to Logstash over TCP.

<configuration>
    <appender name="STASH" class="net.logstash.logback.appender.LogstashTcpSocketAppender">
        <destination>localhost:5044</destination>

        <encoder class="net.logstash.logback.encoder.LoggingEventCompositeJsonEncoder">
            <providers>
                <mdc/> <!-- MDC variables on the Thread will be written as JSON fields-->
                <context/> <!--Outputs entries from logback's context -->
                <version/> <!-- Logstash json format version, the @version field in the output-->
                <logLevel/>
                <loggerName/>

                <pattern>
                    <!-- Custom fields added to every log entry; they make
                         filtering in Logstash and searching in Kibana easier. -->
                    <pattern>
                        {
                        "appName": "elk-testdata",
                        "appVersion": "1.0"
                        }
                    </pattern>
                </pattern>

                <threadName/>
                <message/>

                <logstashMarkers/> <!-- extra fields for specific log lines, attached as Markers -->
                <arguments/> <!-- or via StructuredArguments; see the Java sketch after this config -->

                <stackTrace/>
            </providers>
        </encoder>
    </appender>

    <root level="info">
        <appender-ref ref="STASH"/>
    </root>
</configuration>
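A hedged sketch of how application code can feed the mdc, logstashMarkers and arguments providers configured above. The class and the field names (userId, orderId, audit) are invented for the example; it uses the StructuredArguments and Markers helpers shipped with logstash-logback-encoder.

import net.logstash.logback.argument.StructuredArguments;
import net.logstash.logback.marker.Markers;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.slf4j.MDC;

public class CheckoutService {

    private static final Logger log = LoggerFactory.getLogger(CheckoutService.class);

    public void checkout(String userId, String orderId) {
        // <mdc/>: every entry put on the thread's MDC becomes a JSON field
        MDC.put("userId", userId);
        try {
            // <arguments/>: a StructuredArgument appears in the message text
            // ("orderId=...") and as a separate JSON field
            log.info("Checkout started for {}", StructuredArguments.keyValue("orderId", orderId));

            // <logstashMarkers/>: markers attach extra fields to a single log line
            log.info(Markers.append("audit", true), "Checkout finished");
        } finally {
            MDC.remove("userId");
        }
    }
}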

ELK Side

Configure Logstash to receive the data from the TCP appender.

Modify the input file: /etc/logstash/conf.d/02-beats-input.conf

https://github.com/logstash/logstash-logback-encoder#tcp-appenders

input {
  tcp {
    port => 5044
    codec => json_lines
  }
}

Note: Logstash creates its own Elasticsearch index, so there is no need to create one manually.

Docker

For the demo, ELK runs in Docker, using the sebp/elk image.

# 5601 = Kibana UI, 9200 = Elasticsearch REST API, 5044 = Logstash TCP input
sudo docker run \
  -v /home/vagrant/vagrant-home/02-beats-input.conf:/etc/logstash/conf.d/02-beats-input.conf \
  -p 5601:5601 -p 9200:9200 -p 5044:5044 \
  -d --name elk sebp/elk