[Spring] Enable logging to ELK with Log4j2 and Spring Cloud Sleuth

Most guides on enabling ELK logging assume Logback as the logging library. However, when migrating an existing application, avoiding dependency changes is one factor in keeping the application stable.
In this demo, we will use Log4j2, another common Java logging library, to write logs to ELK.

Prerequisites

  1. Project migrated to Gradle;

  2. Using Log4j2 as the logging library.

Steps

  1. Add the Spring Cloud Sleuth dependency to the project.
    In build.gradle, add the dependency as below.

    dependencies {
        compile 'org.springframework.cloud:spring-cloud-starter-sleuth:2.2.6.RELEASE'
    }
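    With the starter on the classpath, no extra code is needed to pick up tracing. As a minimal sketch (the package, class and endpoint names are only illustrative), any SLF4J logging done in a request handler will already carry the Sleuth trace and span IDs:

    // Hypothetical controller: spring-cloud-starter-sleuth instruments each request and
    // puts traceId/spanId into the logging context, so the appender configured in the
    // next steps can ship them to ELK along with the message.
    package com.example.usermanagement;

    import org.slf4j.Logger;
    import org.slf4j.LoggerFactory;
    import org.springframework.web.bind.annotation.GetMapping;
    import org.springframework.web.bind.annotation.RestController;

    @RestController
    public class PingController {

        private static final Logger log = LoggerFactory.getLogger(PingController.class);

        @GetMapping("/ping")
        public String ping() {
            // No tracing code is needed here; Sleuth adds the IDs automatically.
            log.info("ping received");
            return "pong";
        }
    }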
  2. Deploy the project to the target server.

  3. Add a Socket appender to write logs via a socket call.
    In log4j2.xml, add the Socket appender and enable it.

    <Configuration status="WARN" monitorInterval="30">
        <Properties>
            <Property name="LOG_PATTERN">%d{yyyy-MM-dd HH:mm:ss.SSS} %-5level --- [%t] %notEmpty{ [%X{x-apigw-requestId}]} -  %logger{36} - %msg%n</Property>
            <Property name="LOG_FILE_NAME">user-management</Property>
            <Property name="LOG_PATH">c:\log\user-management</Property>
        </Properties>
        <Appenders>
            <Socket name="socket" host="10.3.11.114" port="19722" protocol="TCP">
                <JsonLayout properties="true" compact="true" eventEol="true" />
                <PatternLayout pattern="${LOG_PATTERN}" />
            </Socket>
            <Async name="async">
                <AppenderRef ref="socket" />
            </Async>
        </Appenders>
        <Loggers>
            <root level="info">
                <AppenderRef ref="async" />
            </root>
        </Loggers>
    </Configuration>
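    The LOG_PATTERN above reads a request ID from the logging context with %X{x-apigw-requestId}. As a hedged sketch of how that key could be populated (the filter class and its registration are assumptions; only the header/key name comes from the pattern), a servlet filter can copy the header into the Log4j2 ThreadContext, which JsonLayout exports in its contextMap:

    // Hypothetical filter: copies the x-apigw-requestId header into the Log4j2
    // ThreadContext so layouts can reference it via %X / the JSON contextMap.
    package com.example.usermanagement;

    import java.io.IOException;
    import javax.servlet.FilterChain;
    import javax.servlet.ServletException;
    import javax.servlet.http.HttpServletRequest;
    import javax.servlet.http.HttpServletResponse;
    import org.apache.logging.log4j.ThreadContext;
    import org.springframework.stereotype.Component;
    import org.springframework.web.filter.OncePerRequestFilter;

    @Component
    public class ApiGatewayRequestIdFilter extends OncePerRequestFilter {

        @Override
        protected void doFilterInternal(HttpServletRequest request,
                                        HttpServletResponse response,
                                        FilterChain chain) throws ServletException, IOException {
            String requestId = request.getHeader("x-apigw-requestId");
            if (requestId != null) {
                ThreadContext.put("x-apigw-requestId", requestId);
            }
            try {
                chain.doFilter(request, response);
            } finally {
                // Always clear the key so pooled threads do not leak request IDs.
                ThreadContext.remove("x-apigw-requestId");
            }
        }
    }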
  4. Create the Logstash pipeline.
    On the Logstash server, create a config file in /etc/logstash/conf.d/ and add the content below.
    Command

    ## Sample service name: User management service.
    sudo vi /etc/logstash/conf.d/user-management-service-logstash.conf

    Content

    ## This sample uses port 19721 to collect logs from the socket call in JSON format.
    input {
      tcp {
          port => 19721
          codec => json
      }
      stdin {}
    }
    
    output {
      stdout {
        codec => rubydebug
      }
      elasticsearch {
        action => "index"
        hosts => [ "10.3.11.114:9200" ]
        index => "user-management-service-%{+YYYY.MM.dd}"
      }
    }
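    Before pointing the application at it, the pipeline can be smoke-tested with a throwaway client (host and port are taken from the sample configs above; the class name is made up) that pushes a single JSON line into the tcp input:

    // One-off connectivity check: sends one JSON event to the Logstash tcp input so the
    // pipeline and the Elasticsearch index can be verified independently of the application.
    import java.io.OutputStreamWriter;
    import java.io.Writer;
    import java.net.Socket;
    import java.nio.charset.StandardCharsets;

    public class LogstashSmokeTest {
        public static void main(String[] args) throws Exception {
            try (Socket socket = new Socket("10.3.11.114", 19721);
                 Writer out = new OutputStreamWriter(socket.getOutputStream(), StandardCharsets.UTF_8)) {
                // The tcp input uses codec => json, so one JSON object per line is enough.
                out.write("{\"message\":\"logstash smoke test\",\"level\":\"INFO\"}\n");
                out.flush();
            }
        }
    }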
  5. Add an index pattern in Kibana.
    In the ELK main menu, click Stack Management > Index Patterns > Create index pattern.

  6. Configure the new index pattern.
    In Index pattern name, enter user-management-service* and un-tick include system and hidden indices. After clicking Next step, select @timestamp as the Time field and click Create index pattern.


  7. Add the index pattern to Logs.
    In the ELK main screen, click Logs > Settings.
    In Log indices, append the index pattern created in the previous step.

  8. Test.
    In the ELK main screen, click Logs and select the new log index in the filter; if results are shown, the integration is successful.
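    Optionally, the application can emit a recognisable log line at startup so there is something easy to search for in the Logs screen (a sketch; the configuration class and bean name are made up):

    // Hypothetical startup bean: logs a fixed message that should appear in Kibana
    // under the user-management-service-* index if the whole chain is working.
    package com.example.usermanagement;

    import org.slf4j.Logger;
    import org.slf4j.LoggerFactory;
    import org.springframework.boot.CommandLineRunner;
    import org.springframework.context.annotation.Bean;
    import org.springframework.context.annotation.Configuration;

    @Configuration
    public class StartupLogConfig {

        private static final Logger log = LoggerFactory.getLogger(StartupLogConfig.class);

        @Bean
        CommandLineRunner elkSmokeLog() {
            return args -> log.info("ELK integration check: user-management service started");
        }
    }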
