Override Spark log4j configurations

Problem description

I'm running Spark on a YARN cluster with log4j.properties configured so that, by default, all logs go to a log file. However, for some Spark jobs I want the logs to go to the console instead, without changing the log4j file or the code of the actual job. What is the best way to achieve this? Thanks, all.

Recommended answer

Per the documentation: upload a custom log4j.properties using spark-submit, by adding it to the --files list of files to be uploaded with the application.

I just tried this with a log4j.properties file on a YARN cluster and it works just fine.

spark-submit --class com.foo.Bar \
  --master yarn-cluster \
  --files path_to_my_log4j.properties \
  my.jar
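
What that uploaded file contains determines where the logs go. Below is a minimal sketch of a log4j 1.x properties file that routes everything to the console; it closely follows Spark's bundled conf/log4j.properties.template, but the appender name and pattern here are illustrative choices, not taken from the original answer.

# Send all logging to the console (stderr) instead of a file.
log4j.rootCategory=INFO, console
log4j.appender.console=org.apache.log4j.ConsoleAppender
log4j.appender.console.target=System.err
log4j.appender.console.layout=org.apache.log4j.PatternLayout
log4j.appender.console.layout.ConversionPattern=%d{yy/MM/dd HH:mm:ss} %p %c{1}: %m%n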

Other recommended answer

I know of at least 4 ways to solve this problem.

  1. You could modify the log4j.properties on your Spark machines (typically the one under Spark's conf/ directory).

  2. When running a job on Spark, you can attach the log4j file as a configuration file at submit time, for example (see also the combined sketch after this list):

    bin/spark-submit --class com.viaplay.log4jtest.log4jtest \
      --conf "spark.driver.extraJavaOptions=-Dlog4j.configuration=file:/Users/feng/SparkLog4j/SparkLog4jTest/target/log4j2.properties" \
      --master local[*] \
      /Users/feng/SparkLog4j/SparkLog4jTest/target/SparkLog4jTest-1.0-jar-with-dependencies.jar

  3. Import log4j into your application code:

    import org.apache.log4j.Logger;
    import org.apache.log4j.Level;

    Put these logger calls where you create your SparkContext():

    Logger.getLogger("org").setLevel(Level.INFO);
    Logger.getLogger("akka").setLevel(Level.INFO);

  4. Use spark.sql.SparkSession and set the log level on its SparkContext:

    import org.apache.spark.sql.SparkSession

    val spark = SparkSession.builder.getOrCreate()
    spark.sparkContext.setLogLevel("ERROR")
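
For completeness, the --files approach from the accepted answer can be combined with option 2 so that both the driver and the executor JVMs pick up the shipped file. The sketch below is an assumption, not part of the original answers: my_log4j.properties, com.foo.Bar, and my.jar are placeholder names, and the bare file name in -Dlog4j.configuration relies on YARN placing files shipped via --files in each container's working directory (for a file that already exists on every node, Spark's docs say to use an explicit file: URL instead).

# Sketch: ship a custom log4j.properties with the application and point
# both the driver and the executors at it (log4j 1.x; names are placeholders).
spark-submit --class com.foo.Bar \
  --master yarn-cluster \
  --files my_log4j.properties \
  --conf "spark.driver.extraJavaOptions=-Dlog4j.configuration=my_log4j.properties" \
  --conf "spark.executor.extraJavaOptions=-Dlog4j.configuration=my_log4j.properties" \
  my.jar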