Using Asynchronous Logging in Log4J2 in Spark Scala Application

Problem: I cannot observe the asynchronous capabilities of Log4J2 after initializing SparkContext in local Spark mode.

Log4j2 dependencies in SBT:

  "com.lmax" % "disruptor" % "3.3.5",
  "org.apache.logging.log4j" % "log4j-api" % "2.8.2",
  "org.apache.logging.log4j" % "log4j-core" % "2.8.2",
  "org.apache.logging.log4j" %% "log4j-api-scala" % "2.8.2"

Log4j2 configuration file:

<?xml version="1.0" encoding="UTF-8"?>
<Configuration status="debug">
<Appenders>
    <Console name="Console-Appender" target="SYSTEM_OUT">
        <PatternLayout>
            <pattern>
                [%-5level] %d{yyyy-MM-dd HH:mm:ss.SSS} [%t] %c{1} - %msg%n
            </pattern>
        </PatternLayout>
    </Console>
    <File name="File-Appender" fileName="logs/xmlfilelog.log" >
        <PatternLayout>
            <pattern>
                [%-5level] %d{yyyy-MM-dd HH:mm:ss.SSS} [%t] %c{1} - %msg%n
            </pattern>
        </PatternLayout>
    </File>
</Appenders>
<Loggers>
    <Logger name="guru.springframework.blog.log4j2async" level="debug">
        <AppenderRef ref="File-Appender"/>
    </Logger>
    <Root level="debug">
        <AppenderRef ref="Console-Appender"/>
    </Root>
</Loggers>
</Configuration>

I set the following system property in the IntelliJ run configuration:

-DLog4jContextSelector=org.apache.logging.log4j.core.async.AsyncLoggerContextSelector
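
As an alternative to the IDE VM option, the same selector can also be set programmatically, provided it happens before any Log4j2 class is initialized. A minimal sketch (the object name is hypothetical):

import org.apache.logging.log4j.LogManager

object SelectorCheck {
  def main(args: Array[String]): Unit = {
    // Must run before LogManager or any Logger is first touched; otherwise
    // the default synchronous context selector is already in place.
    System.setProperty(
      "Log4jContextSelector",
      "org.apache.logging.log4j.core.async.AsyncLoggerContextSelector")

    // If the selector took effect, this prints
    // org.apache.logging.log4j.core.async.AsyncLoggerContext.
    println(LogManager.getContext(false).getClass.getName)
  }
}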

To test for asynchronous behavior, I ran the following code snippet before and after initializing SparkContext:

val iterations = 1000
val start = System.nanoTime()

for (i <- 1 to iterations) {
  logger.error("Hello")
}

val end = System.nanoTime()
val timeMS = (end - start) / 1000000
println(s"Parsed ${iterations} reports in ${timeMS} ms ${timeMS / 1000} sec")

Successful result (before initializing SparkContext): I can see the following debug line, confirming that the AsyncContext is enabled:

2017-04-25 14:55:40,541 main DEBUG LoggerContext[name=AsyncContext@6d9c638, org.apache.logging.log4j.core.async.AsyncLoggerContext@758f4f03] started OK.

In addition, my "Parsed ..." println appears somewhere in the middle of the log output, which indicates asynchronous behavior.

Unsuccessful result (after initializing SparkContext): the "Parsed ..." println no longer appears in the middle of the log output, so I cannot observe any asynchronous behavior, even though the debug line 2017-04-25 14:55:40,541 main DEBUG LoggerContext[name=AsyncContext@6d9c638, org.apache.logging.log4j.core.async.AsyncLoggerContext@758f4f03] started OK. is still present.

One suspicion is the logging API I am using: the logger comes from the package org.apache.logging.log4j.scala ("Apache Log4j Scala 2.11 wrapper for Log4j API", version 2.8.2). According to the documentation, the "Log4j 2 Scala API is a Scala wrapper for the Log4j 2 API", so as far as I can tell my calls still go through Log4J2.

Question: could Spark be forcing LOG4J instead of LOG4J2 here, and if so, how can I restore asynchronous logging?


Answer: what helped me get asynchronous logging working with Log4j2 was setting the following system property for the JVM that runs the Spark application:

-DLog4jContextSelector=org.apache.logging.log4j.core.async.AsyncLoggerContextSelector

Note that with this context selector the logging context is an AsyncLoggerContext, which makes all loggers asynchronous; Async Loggers require the LMAX Disruptor library on the classpath.
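
To double-check inside the running Spark application whether the selector actually reached the JVM, you can inspect what Log4j2 reports at runtime. A small sketch (object and app names are placeholders):

import org.apache.logging.log4j.LogManager
import org.apache.spark.{SparkConf, SparkContext}

object ContextSelectorProbe {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(
      new SparkConf().setAppName("selector-probe").setMaster("local[*]"))

    // With the async selector active, the context should be an
    // AsyncLoggerContext and the loggers it hands out should be AsyncLogger
    // instances, even after Spark has finished its own logging setup.
    println(s"Context: ${LogManager.getContext(false).getClass.getName}")
    println(s"Logger:  ${LogManager.getLogger("probe").getClass.getName}")

    sc.stop()
  }
}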


Source: https://habr.com/ru/post/1675658/

