I use Spark in my application, but it produces a lot of unnecessary log output. How can I disable these logs in a Spark Java application?

Below are the logs that I get in my console.

.spark.executor.Executor       : Finished task 185.0 in stage 189.0 (TID 4477). 11508 bytes result sent to driver
2017-05-06 10:00:18.767  INFO 3336 --- [er-event-loop-2] o.apache.spark.scheduler.TaskSetManager  : Starting task 188.0 in stage 189.0 (TID 4480, localhost, executor driver, partition 188, ANY, 6317 bytes)
2017-05-06 10:00:18.769  INFO 3336 --- [launch worker-4] o.a.s.s.ShuffleBlockFetcherIterator      : Getting 1 non-empty blocks out of 1 blocks
2017-05-06 10:00:18.769  INFO 3336 --- [launch worker-4] o.a.s.s.ShuffleBlockFetcherIterator      : Started 0 remote fetches in 0 ms
2017-05-06 10:00:18.769  INFO 3336 --- [result-getter-1] o.apache.spark.scheduler.TaskSetManager  : Finished task 185.0 in stage 189.0 (TID 4477) in 75 ms on localhost (executor driver) (185/200)
2017-05-06 10:00:18.769  INFO 3336 --- [launch worker-5] org.apache.spark.executor.Executor       : Running task 188.0 in stage 189.0 (TID 4480)
2017-05-06 10:00:18.770  INFO 3336 --- [launch worker-4] o.a.s.s.ShuffleBlockFetcherIterator      : Getting 1 non-empty blocks out of 1 blocks
2017-05-06 10:00:18.770  INFO 3336 --- [launch worker-4] o.a.s.s.ShuffleBlockFetcherIterator      : Started 0 remote fetches in 0 ms
2017-05-06 10:00:18.771  INFO 3336 --- [launch worker-4] o.a.s.s.ShuffleBlockFetcherIterator      : Getting 2 non-empty blocks out of 201 blocks
2017-05-06 10:00:18.771  INFO 3336 --- [launch worker-4] o.a.s.s.ShuffleBlockFetcherIterator      : Started 0 remote fetches in 0 ms
2017-05-06 10:00:18.773  INFO 3336 --- [launch worker-4] o.a.s.s.ShuffleBlockFetcherIterator      : Getting 3 non-empty blocks out of 401 blocks
2017-05-06 10:00:18.773  INFO 3336 --- [launch worker-4] o.a.s.s.ShuffleBlockFetcherIterator      : Started 0 remote fetches in 0 ms
2017-05-06 10:00:18.773  INFO 3336 --- [launch worker-5] o.a.s.s.ShuffleBlockFetcherIterator      : Getting 1 non-empty blocks out of 1 blocks
2017-05-06 10:00:18.773  INFO 3336 --- [launch worker-5] o.a.s.s.ShuffleBlockFetcherIterator      : Started 0 remote fetches in 0 ms
2017-05-06 10:00:18.773  INFO 3336 --- [launch worker-5] o.a.s.s.ShuffleBlockFetcherIterator      : Getting 1 non-empty blocks out of 1 blocks
2017-05-06 10:00:18.774  INFO 3336 --- [launch worker-5] o.a.s.s.ShuffleBlockFetcherIterator      : Started 0 remote fetches in 1 ms
2017-05-06 10:00:18.775  INFO 3336 --- [launch worker-5] o.a.s.s.ShuffleBlockFetcherIterator      : Getting 2 non-empty blocks out of 201 blocks
2017-05-06 10:00:18.775  INFO 3336 --- [launch worker-5] o.a.s.s.ShuffleBlockFetcherIterator      : Started 0 remote fetches in 0 ms
2017-05-06 10:00:18.777  INFO 3336 --- [launch worker-5] o.a.s.s.ShuffleBlockFetcherIterator      : Getting 3 non-empty blocks out of 401 blocks
2017-05-06 10:00:18.777  INFO 3336 --- [launch worker-5] o.a.s.s.ShuffleBlockFetcherIterator      : Started 0 remote fetches in 0 ms
2017-05-06 10:00:18.786  INFO 3336 --- [launch worker-6] org.apache.spark.executor.Executor       : Finished task 182.0 in stage 189.0 (TID 4474). 11508 bytes result sent to driver
2017-05-06 10:00:18.786  INFO 3336 --- [er-event-loop-1] o.apache.spark.scheduler.TaskSetManager  : Starting task 189.0 in stage 189.0 (TID 4481, localhost, executor driver, partition 189, ANY, 6317 bytes)
2017-05-06 10:00:18.787  INFO 3336 --- [result-getter-2] o.apache.spark.scheduler.TaskSetManager  : Finished task 182.0 in stage 189.0 (TID 4474) in 132 ms on localhost (executor driver) (186/200)
2017-05-06 10:00:18.787  INFO 3336 --- [launch worker-6] org.apache.spark.executor.Executor       : Running task 189.0 in stage 189.0 (TID 4481)
2017-05-06 10:00:18.790  INFO 3336 --- [launch worker-5] org.apache.spark.executor.Executor       : Finished task 188.0 in stage 189.0 (TID 4480). 11356 bytes result sent to driver
2017-05-06 10:00:18.790  INFO 3336 --- [launch worker-6] o.a.s.s.ShuffleBlockFetcherIterator      : Getting 1 non-empty blocks out of 1 blocks
2017-05-06 10:00:18.790  INFO 3336 --- [launch worker-6] o.a.s.s.ShuffleBlockFetcherIterator      : Started 0 remote fetches in 0 ms
2017-05-06 10:00:18.791  INFO 3336 --- [launch worker-6] o.a.s.s.ShuffleBlockFetcherIterator      : Getting 1 non-empty blocks out of 1 blocks
2017-05-06 10:00:18.791  INFO 3336 --- [launch worker-6] o.a.s.s.ShuffleBlockFetcherIterator      : Started 0 remote fetches in 0 ms
2017-05-06 10:00:18.792  INFO 3336 --- [er-event-loop-2] o.apache.spark.scheduler.TaskSetManager  : Starting task 190.0 in stage 189.0 (TID 4482, localhost, executor driver, partition 190, ANY, 6317 bytes)
2017-05-06 10:00:18.792  INFO 3336 --- [launch worker-6] o.a.s.s.ShuffleBlockFetcherIterator      : Getting 2 non-empty blocks out of 201 blocks
2017-05-06 10:00:18.792  INFO 3336 --- [launch worker-6] o.a.s.s.ShuffleBlockFetcherIterator      : Started 0 remote fetches in 0 ms
2017-05-06 10:00:18.794  INFO 3336 --- [launch worker-6] o.a.s.s.ShuffleBlockFetcherIterator      : Getting 2 non-empty blocks out of 401 blocks
2017-05-06 10:00:18.794  INFO 3336 --- [launch worker-6] o.a.s.s.ShuffleBlockFetcherIterator      : Started 0 remote fetches in 0 ms
2017-05-06 10:00:18.796  INFO 3336 --- [launch worker-4] org.apache.spark.executor.Executor       : Finished task 187.0 in stage 189.0 (TID 4479). 11356 bytes result sent to driver
2017-05-06 10:00:18.798  INFO 3336 --- [er-event-loop-0] o.apache.spark.scheduler.TaskSetManager  : Starting task 191.0 in stage 189.0 (TID 4483, localhost, executor driver, partition 191, ANY, 6317 bytes)
2017-05-06 10:00:18.798  INFO 3336 --- [launch worker-5] org.apache.spark.executor.Executor       : Running task 190.0 in stage 189.0 (TID 4482)
2017-05-06 10:00:18.798  INFO 3336 --- [result-getter-3] o.apache.spark.scheduler.TaskSetManager  : Finished task 188.0 in stage 189.0 (TID 4480) in 31 ms on localhost (executor driver) (187/200)
2017-05-06 10:00:18.798  INFO 3336 --- [result-getter-3] o.apache.spark.scheduler.TaskSetManager  : Finished task 187.0 in stage 189.0 (TID 4479) in 35 ms on localhost (executor driver) (188/200)
2017-05-06 10:00:18.800  INFO 3336 --- [launch worker-4] org.apache.spark.executor.Executor       : Running task 191.0 in stage 189.0 (TID 4483)
2017-05-06 10:00:18.801  INFO 3336 --- [launch worker-5] o.a.s.s.ShuffleBlockFetcherIterator      : Getting 1 non-empty blocks out of 1 blocks
2017-05-06 10:00:18.801  INFO 3336 --- [launch worker-5] o.a.s.s.ShuffleBlockFetcherIterator      : Started 0 remote fetches in 0 ms
2017-05-06 10:00:18.802  INFO 3336 --- [launch worker-5] o.a.s.s.ShuffleBlockFetcherIterator      : Getting 1 non-empty blocks out of 1 blocks
2017-05-06 10:00:18.802  INFO 3336 --- [launch worker-5] o.a.s.s.ShuffleBlockFetcherIterator      : Started 0 remote fetches in 0 ms
2017-05-06 10:00:18.803  INFO 3336 --- [launch worker-4] o.a.s.s.ShuffleBlockFetcherIterator      : Getting 1 non-empty blocks out of 1 blocks
2017-05-06 10:00:18.803  INFO 3336 --- [launch worker-4] o.a.s.s.ShuffleBlockFetcherIterator      : Started 0 remote fetches in 0 ms
2017-05-06 10:00:18.803  INFO 3336 --- [launch worker-5] o.a.s.s.ShuffleBlockFetcherIterator      : Getting 1 non-empty blocks out of 201 blocks
2017-05-06 10:00:18.803  INFO 3336 --- [launch worker-5] o.a.s.s.ShuffleBlockFetcherIterator      : Started 0 remote fetches in 0 ms
2017-05-06 10:00:18.804  INFO 3336 --- [launch worker-4] o.a.s.s.ShuffleBlockFetcherIterator      : Getting 1 non-empty blocks out of 1 blocks
2017-05-06 10:00:18.804  INFO 3336 --- [launch worker-4] o.a.s.s.ShuffleBlockFetcherIterator      : Started 0 remote fetches in 0 ms
2017-05-06 10:00:18.804  INFO 3336 --- [launch worker-5] o.a.s.s.ShuffleBlockFetcherIterator      : Getting 1 non-empty blocks out of 401 blocks

Below is my POM file.

<dependencies>
    <dependency>
        <groupId>org.springframework.boot</groupId>
        <artifactId>spring-boot-starter-data-rest</artifactId>
    </dependency>
    <dependency>
        <groupId>org.springframework.boot</groupId>
        <artifactId>spring-boot-starter-web</artifactId>
    </dependency>
    <dependency>
        <groupId>org.springframework.boot</groupId>
        <artifactId>spring-boot-starter-web-services</artifactId>
    </dependency>
    <dependency>
        <groupId>com.fasterxml.jackson.core</groupId>
        <artifactId>jackson-databind</artifactId>
    </dependency>
    <dependency>
        <groupId>info.debatty</groupId>
        <artifactId>java-string-similarity</artifactId>
        <version>RELEASE</version>
    </dependency>
    <dependency>
        <groupId>com.univocity</groupId>
        <artifactId>univocity-parsers</artifactId>
        <version>2.3.0</version>
    </dependency>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-mllib_2.11</artifactId>
        <version>2.1.0</version>
    </dependency>
    <dependency>
        <groupId>org.codehaus.janino</groupId>
        <artifactId>commons-compiler</artifactId>
        <version>2.6.1</version>
    </dependency>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-core_2.11</artifactId>
        <version>2.1.0</version>
    </dependency>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-sql_2.11</artifactId>
        <version>2.1.0</version>
    </dependency>
    <dependency>
        <groupId>com.oracle</groupId>
        <artifactId>ojdbc6</artifactId>
        <version>11.2.0.3</version>
    </dependency>


    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-network-common_2.10</artifactId>
        <version>1.4.0</version>
    </dependency>
    <dependency>
        <groupId>org.codehaus.janino</groupId>
        <artifactId>commons-compiler</artifactId>
        <version>2.7.5</version>
    </dependency>
    <dependency>
        <groupId>org.springframework.boot</groupId>
        <artifactId>spring-boot-starter-test</artifactId>
        <scope>test</scope>
    </dependency>
</dependencies>
3 answers

I think you can change the log level with:

sparkContext.setLogLevel("WARN")
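In a Java application the same call is available on `JavaSparkContext`. A minimal sketch (the app name and local master are just illustrations):

```java
import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaSparkContext;

public class QuietSparkApp {
    public static void main(String[] args) {
        SparkConf conf = new SparkConf()
                .setAppName("quiet-app")   // hypothetical app name
                .setMaster("local[*]");
        JavaSparkContext sc = new JavaSparkContext(conf);

        // Raise the log threshold before running any jobs,
        // so Spark's INFO-level chatter is suppressed:
        sc.setLogLevel("WARN");

        // ... run your Spark job here ...

        sc.stop();
    }
}
```

Note that `setLogLevel` only affects logging from the point it is called; messages emitted during context startup are printed before it takes effect.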

You can choose the log level among

ALL, DEBUG, ERROR, FATAL, INFO, OFF, TRACE, WARN

If the log output comes from a spark-shell session, you can change the log level in the configuration file conf/log4j.properties (create it by renaming conf/log4j.properties.template). For example, to show only warnings and errors, set:

log4j.rootCategory=WARN, console

Then reopen the shell and you will see far less output.
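The same idea works for a packaged application: put a log4j.properties on the classpath (e.g. under src/main/resources). A sketch based on Spark's bundled template, with a per-package override as an illustration:

```properties
# Send everything at WARN and above to the console
log4j.rootCategory=WARN, console
log4j.appender.console=org.apache.log4j.ConsoleAppender
log4j.appender.console.target=System.err
log4j.appender.console.layout=org.apache.log4j.PatternLayout
log4j.appender.console.layout.ConversionPattern=%d{yy/MM/dd HH:mm:ss} %p %c{1}: %m%n

# Silence Spark's own packages specifically
log4j.logger.org.apache.spark=WARN
```

With Spring Boot in the mix (as in the POM above), be aware that Boot's default Logback setup may route these messages instead, in which case the levels belong in application.properties (e.g. logging.level.org.apache.spark=WARN).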


Set spark.history.fs.cleaner.enabled to true in spark-defaults.conf. The history server's cleaner will then delete event logs in HDFS older than 7 days by default; the retention period is controlled by spark.history.fs.cleaner.maxAge. (Note this affects the history server's event logs, not console logging.)
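In spark-defaults.conf this would look like the following (the maxAge and interval values shown are Spark's defaults; adjust as needed):

```properties
spark.history.fs.cleaner.enabled  true
spark.history.fs.cleaner.maxAge   7d
spark.history.fs.cleaner.interval 1d
```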


I use the code below in Scala:

Logger.getLogger("org").setLevel(Level.OFF)
Logger.getLogger("akka").setLevel(Level.WARN)

You can do the same in Java.
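The Java equivalent, using the log4j 1.x API that Spark 2.1 ships with (the class name is just an illustration):

```java
import org.apache.log4j.Level;
import org.apache.log4j.Logger;

public class SilenceSparkLogs {
    public static void main(String[] args) {
        // Silence the "org" logger hierarchy (covers org.apache.spark)
        // and quiet Akka, before the SparkContext is created:
        Logger.getLogger("org").setLevel(Level.OFF);
        Logger.getLogger("akka").setLevel(Level.WARN);

        // ... create the SparkContext and run your job ...
    }
}
```

Because log4j loggers are hierarchical, setting the level on "org" applies to every logger whose name starts with "org.", which is why this one call covers Spark's internal classes.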


Source: https://habr.com/ru/post/1676529/

