How to change log level in spark?


Problem Description

I tried all of these methods and nothing works:

In the log4j properties file:

log4j.logger.org=OFF

log4j.rootCategory=ERROR, console
log4j.rootCategory=OFF, console

In code:

import org.apache.log4j.{Level, Logger}

// Option 1
Logger.getLogger("org.apache.spark").setLevel(Level.OFF)

// Option 2
sparkContext.setLogLevel("OFF")

// Option 3
val rootLogger: Logger = Logger.getRootLogger()
rootLogger.setLevel(Level.OFF)

And yes, I also tried putting it both before and after the SparkContext object creation. Nothing seems to work.
What am I missing? Or is there another way to set the log levels?

Recommended Answer

You can see these logs right from the start. The SLF4J warning below shows that logback-classic is the binding actually in use, which means the log configuration needs to be set via Logback rather than log4j:

SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/C:/Users/linzi/.m2/repository/ch/qos/logback/logback-classic/1.2.3/logback-classic-1.2.3.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/C:/Users/linzi/.m2/repository/org/slf4j/slf4j-log4j12/1.7.26/slf4j-log4j12-1.7.26.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [ch.qos.logback.classic.util.ContextSelectorStaticBinder]
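
The log4j calls from the question are silently ignored here because logback-classic, not log4j, is the binding SLF4J actually selected. Besides providing the logback.xml shown below, the level can also be changed programmatically through Logback's own API. A minimal sketch, assuming logback-classic is on the classpath as the warning above indicates:

import ch.qos.logback.classic.{Level, Logger => LogbackLogger}
import org.slf4j.{Logger => Slf4jLogger, LoggerFactory}

// With logback-classic bound, SLF4J's root logger is in fact a Logback logger,
// so it can be cast and its level changed directly at runtime.
val rootLogger = LoggerFactory
  .getLogger(Slf4jLogger.ROOT_LOGGER_NAME)
  .asInstanceOf[LogbackLogger]
rootLogger.setLevel(Level.OFF)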

Add a logback.xml with settings like the following:

<configuration>
    <appender name="CONSOLE" class="ch.qos.logback.core.ConsoleAppender">
        <layout class="ch.qos.logback.classic.PatternLayout">
            <Pattern>
                %d{HH:mm:ss.SSS} [%t] %-5level %logger{36} - %msg%n
            </Pattern>
        </layout>
    </appender>

    <logger name="com.mkyong" level="debug" additivity="false">
        <appender-ref ref="CONSOLE"/>
    </logger>

    <root level="error">
        <appender-ref ref="CONSOLE"/>
    </root>
</configuration>
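
Logback picks this file up automatically when it sits on the driver classpath (for example under src/main/resources). If it lives somewhere else, one option is to point Logback at it explicitly before the first logger is created. A small sketch; the path is purely illustrative:

// logback.configurationFile is Logback's standard system property for an
// explicit config location; replace the path with wherever your file lives.
System.setProperty("logback.configurationFile", "/path/to/logback.xml")

To quiet Spark itself with this configuration, the relevant addition would be a <logger name="org.apache.spark" level="ERROR"/> entry rather than the com.mkyong example logger.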

Other Recommended Answer

You should be able to do it with something like this:

import org.apache.spark.sql.SparkSession;

SparkSession spark = SparkSession.builder().getOrCreate();
spark.sparkContext().setLogLevel("OFF");

https://spark.apache.org/docs/2.3.0/api/java/org/apache/spark/SparkContext.html#setLogLevel-java.lang.String-

Can you share the rest of the code and where you're running it?
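
If you are working in Scala rather than Java, the equivalent call on the session's SparkContext looks like the sketch below. According to the linked docs, the accepted level strings are ALL, DEBUG, ERROR, FATAL, INFO, OFF, TRACE and WARN (the local master is only for illustration):

import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder().master("local[*]").getOrCreate()
// Valid levels: ALL, DEBUG, ERROR, FATAL, INFO, OFF, TRACE, WARN
spark.sparkContext.setLogLevel("OFF")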

Other Recommended Answer

This should change your log level to OFF if you declare it before the SparkSession object is created:

import org.apache.log4j.{Level, Logger}
import org.apache.spark.sql.SparkSession

Logger.getLogger("org").setLevel(Level.OFF)

val spark = SparkSession.builder().appName("test").master("local[*]").getOrCreate()
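
A commonly suggested extension of the same idea, in case startup chatter still comes through on older Spark versions that use Akka, is to silence that logger as well before creating the session. The extra logger name is an assumption about the setup, not something from the question:

import org.apache.log4j.{Level, Logger}
import org.apache.spark.sql.SparkSession

// Turn off both Spark's packages and (on older releases) Akka before the
// session is built, so the startup messages are suppressed too.
Logger.getLogger("org").setLevel(Level.OFF)
Logger.getLogger("akka").setLevel(Level.OFF)

val spark = SparkSession.builder().appName("test").master("local[*]").getOrCreate()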