Even simpler: `cd $SPARK_HOME/conf`, then `mv log4j.properties.template log4j.properties`, then open `log4j.properties` and change every occurrence of INFO to ERROR. Here `SPARK_HOME` is the root directory of your Spark installation.
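The steps above can be sketched as a few shell commands; this assumes `SPARK_HOME` is set in your environment and that a simple global INFO-to-ERROR substitution is acceptable for your config:

```shell
# Assumes $SPARK_HOME points at the root of your Spark installation
cd "$SPARK_HOME/conf"

# Activate the bundled template as the live config
mv log4j.properties.template log4j.properties

# Replace every INFO log level with ERROR, keeping a .bak backup
sed -i.bak 's/INFO/ERROR/g' log4j.properties
```

Keeping the `.bak` backup makes it easy to revert if you later need the verbose output for debugging.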
Some may use HDFS as Spark's storage backend and find that HDFS generates registration messages of its own. To change this, open `HADOOP_HOME/etc/hadoop/log4j.properties` and change `hadoop.root.logger=INFO,console` to `hadoop.root.logger=ERROR,console`. Once again, `HADOOP_HOME` is the root of your Hadoop installation; for me it was `/usr/local/hadoop`.
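As a sketch, the Hadoop-side change is a single targeted substitution; this assumes `HADOOP_HOME` is set and that your file uses the stock `hadoop.root.logger=INFO,console` line:

```shell
# Assumes $HADOOP_HOME points at the root of your Hadoop installation
# Change only the root logger line, leaving other INFO settings untouched
sed -i.bak 's/^hadoop.root.logger=INFO,console/hadoop.root.logger=ERROR,console/' \
    "$HADOOP_HOME/etc/hadoop/log4j.properties"
```

Anchoring the pattern to the start of the line (`^`) avoids accidentally rewriting other properties that merely mention INFO.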