Problem Description
If I run the following Main.scala class:
object main extends App {
  import org.apache.spark.sql.SparkSession

  val spark = SparkSession
    .builder()
    .config("spark.master", "local")
    .config("spark.network.timeout", "10000s")
    .config("spark.executor.heartbeatInterval", "5000s")
    .getOrCreate()

  println("Hello World")

  // stop spark
  spark.stop()
}
Edit: here is my log4j.properties file, located under main/resources:
log4j.appender.A1=org.apache.log4j.ConsoleAppender
log4j.appender.A1.Threshold=INFO
log4j logs all INFO messages to the console, but each INFO line is classified as [error], like so:
[info] Running controller.main compile:last run
[error] Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
[error] 19/03/14 22:35:08 INFO SparkContext: Running Spark version 2.3.0
[error] 19/03/14 22:35:09 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
[error] 19/03/14 22:35:09 INFO SparkContext: Submitted application: 289d7336-196c-4b3a-8bf9-6e247b7a6883
[error] 19/03/14 22:35:09 INFO SecurityManager: Changing view acls to: elisquared
[error] 19/03/14 22:35:09 INFO SecurityManager: Changing modify acls to: elisquared
[error] 19/03/14 22:35:09 INFO SecurityManager: Changing view acls groups to:
[error] 19/03/14 22:35:09 INFO SecurityManager: Changing modify acls groups to:
[error] 19/03/14 22:35:09 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(elisquared); groups with view permissions: Set(); users with modify permissions: Set(elisquared); groups with modify permissions: Set()
[error] 19/03/14 22:35:09 INFO Utils: Successfully started service 'sparkDriver' on port 51903.
[error] 19/03/14 22:35:09 INFO SparkEnv: Registering MapOutputTracker
[error] 19/03/14 22:35:09 INFO SparkEnv: Registering BlockManagerMaster
[error] 19/03/14 22:35:09 INFO BlockManagerMasterEndpoint: Using org.apache.spark.storage.DefaultTopologyMapper for getting topology information
[error] 19/03/14 22:35:09 INFO BlockManagerMasterEndpoint: BlockManagerMasterEndpoint up
[error] 19/03/14 22:35:09 INFO DiskBlockManager: Created local directory at /private/var/folders/tj/qnhq1s7x6573bbxlkps7p3wc0000gn/T/blockmgr-e2e86467-a8bc-4cec-8ee9-fa55ef8ce99b
[error] 19/03/14 22:35:09 INFO MemoryStore: MemoryStore started with capacity 912.3 MB
[error] 19/03/14 22:35:09 INFO SparkEnv: Registering OutputCommitCoordinator
[error] 19/03/14 22:35:09 INFO Utils: Successfully started service 'SparkUI' on port 4040.
[error] 19/03/14 22:35:10 INFO SparkUI: Bound SparkUI to 0.0.0.0, and started at http://192.168.1.158:4040
[error] 19/03/14 22:35:10 INFO Executor: Starting executor ID driver on host localhost
[error] 19/03/14 22:35:10 INFO Utils: Successfully started service 'org.apache.spark.network.netty.NettyBlockTransferService' on port 51904.
[error] 19/03/14 22:35:10 INFO NettyBlockTransferService: Server created on 192.168.1.158:51904
[error] 19/03/14 22:35:10 INFO BlockManager: Using org.apache.spark.storage.RandomBlockReplicationPolicy for block replication policy
[error] 19/03/14 22:35:10 INFO BlockManagerMaster: Registering BlockManager BlockManagerId(driver, 192.168.1.158, 51904, None)
[error] 19/03/14 22:35:10 INFO BlockManagerMasterEndpoint: Registering block manager 192.168.1.158:51904 with 912.3 MB RAM, BlockManagerId(driver, 192.168.1.158, 51904, None)
[error] 19/03/14 22:35:10 INFO BlockManagerMaster: Registered BlockManager BlockManagerId(driver, 192.168.1.158, 51904, None)
[error] 19/03/14 22:35:10 INFO BlockManager: Initialized BlockManager: BlockManagerId(driver, 192.168.1.158, 51904, None)
[info] Hello World
[error] 19/03/14 22:35:10 INFO SparkUI: Stopped Spark web UI at http://192.168.1.158:4040
[error] 19/03/14 22:35:10 INFO MapOutputTrackerMasterEndpoint: MapOutputTrackerMasterEndpoint stopped!
[error] 19/03/14 22:35:10 INFO MemoryStore: MemoryStore cleared
[error] 19/03/14 22:35:10 INFO BlockManager: BlockManager stopped
[error] 19/03/14 22:35:10 INFO BlockManagerMaster: BlockManagerMaster stopped
[error] 19/03/14 22:35:10 INFO OutputCommitCoordinator$OutputCommitCoordinatorEndpoint: OutputCommitCoordinator stopped!
[error] 19/03/14 22:35:10 INFO SparkContext: Successfully stopped SparkContext
[error] 19/03/14 22:35:10 INFO ShutdownHookManager: Shutdown hook called
[error] 19/03/14 22:35:10 INFO ShutdownHookManager: Deleting directory /private/var/folders/tj/qnhq1s7x6573bbxlkps7p3wc0000gn/T/spark-bd7cacce-4431-442e-8942-62eec678717a
[success] Total time: 11 s, completed Mar 14, 2019 10:35:10 PM
These errors do not appear to prevent the application from functioning, but they are annoying. So I have two questions:
- What is causing these errors?
- Should I just suppress them, and if so, how?
Recommended Answer
These are not real errors. Spark's default log4j profile writes its log output to stderr, and when sbt runs a forked application it prefixes every line the process prints to stderr with [error], even though the messages themselves are ordinary INFO logs. Pointing the console appender at System.out avoids the tag. Could you please update your log4j properties as below:
# Root logger option
log4j.rootLogger=INFO, stdout

# Direct log messages to stdout
log4j.appender.stdout=org.apache.log4j.ConsoleAppender
log4j.appender.stdout.Target=System.out
log4j.appender.stdout.layout=org.apache.log4j.PatternLayout
# Set the ConversionPattern according to your logs.
log4j.appender.stdout.layout.ConversionPattern=[%d] %-5p %c{3}:%L - %m%n
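That redirects the output but keeps the volume. If you would also like to suppress Spark's chatty startup messages (your second question), a common approach is to raise the log level for Spark's own logger in the same file; a minimal sketch:

# Keep application logging at INFO, but show only WARN and above from Spark internals
log4j.logger.org.apache.spark=WARN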
Please let me know if it doesn't work. Thanks!
Hope it helps!
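An alternative is to fix the [error] tagging on the sbt side rather than in log4j. A sketch for build.sbt, assuming the app is launched with sbt run (fork and outputStrategy are standard sbt settings; StdoutOutput passes the forked JVM's output through without per-line tagging):

// Fork the application into its own JVM when using `sbt run`
fork := true
// Send the forked process's stdout/stderr straight to sbt's stdout,
// so stderr lines are no longer prefixed with [error]
outputStrategy := Some(StdoutOutput)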