
logging - How to suppress Parquet log messages in Spark?


How do I stop messages like these from appearing on my spark-shell console?

5 May, 2015 5:14:30 PM INFO: parquet.hadoop.InternalParquetRecordReader: at row 0. reading next block
5 May, 2015 5:14:30 PM INFO: parquet.hadoop.InternalParquetRecordReader: RecordReader initialized will read a total of 89213 records.
5 May, 2015 5:14:30 PM INFO: parquet.hadoop.InternalParquetRecordReader: block read in memory in 2 ms. row count = 120141
5 May, 2015 5:14:30 PM INFO: parquet.hadoop.InternalParquetRecordReader: at row 0. reading next block
5 May, 2015 5:14:30 PM INFO: parquet.hadoop.InternalParquetRecordReader: block read in memory in 2 ms. row count = 89213
5 May, 2015 5:14:30 PM WARNING: parquet.hadoop.ParquetRecordReader: Can not initialize counter due to context is not a instance of TaskInputOutp
[Stage 12:=================================================> (184 + 4) / 200]


Thanks.

Best Answer

The solution from the SPARK-8118 issue comment seems to work:


You can disable the chatty output by creating a properties file with the following contents:


org.apache.parquet.handlers=java.util.logging.ConsoleHandler
java.util.logging.ConsoleHandler.level=SEVERE
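
Note that the messages in the question come from the older parquet.* package, while the property above targets org.apache.parquet. Depending on which Parquet version the cluster ships, the logger name may need to be parquet instead of (or in addition to) org.apache.parquet; a hedged variant covering both names:

# Covers both the old "parquet" and the new "org.apache.parquet" logger names
parquet.handlers=java.util.logging.ConsoleHandler
org.apache.parquet.handlers=java.util.logging.ConsoleHandler
java.util.logging.ConsoleHandler.level=SEVERE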



Then pass the file's path to Spark when the application is submitted. Assuming the file lives at /tmp/parquet.logging.properties (it needs to be available on all the worker nodes, of course):


spark-submit \
--conf spark.driver.extraJavaOptions="-Djava.util.logging.config.file=/tmp/parquet.logging.properties" \
--conf spark.executor.extraJavaOptions="-Djava.util.logging.config.file=/tmp/parquet.logging.properties" \
...
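
If you would rather not repeat these flags on every submission, the same two settings can also be placed in conf/spark-defaults.conf (shown here with the same hypothetical /tmp path):

spark.driver.extraJavaOptions   -Djava.util.logging.config.file=/tmp/parquet.logging.properties
spark.executor.extraJavaOptions -Djava.util.logging.config.file=/tmp/parquet.logging.properties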


Credit goes to Justin Bailey.
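
For quick experiments in the spark-shell itself, a driver-side-only alternative is to raise the level of the relevant java.util.logging loggers programmatically. This is a minimal sketch, assuming Parquet logs through java.util.logging under the parquet (older releases) and org.apache.parquet (newer releases) logger names; it does not reach the executors, so the properties-file approach above is still needed for cluster-wide suppression, and older Parquet versions may re-apply their own logging setup when the first Parquet class loads, in which case the calls have to be repeated after a first read.

import java.util.logging.{Level, Logger}

// Hold strong references: java.util.logging keeps loggers weakly referenced,
// so an unreferenced logger can be garbage collected along with the level set on it.
val parquetLoggers = Seq("parquet", "org.apache.parquet").map(name => Logger.getLogger(name))

// SEVERE drops the INFO chatter shown above while keeping real errors visible.
parquetLoggers.foreach(_.setLevel(Level.SEVERE))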

Regarding logging - How to suppress Parquet log messages in Spark?, we found a similar question on Stack Overflow: https://stackoverflow.com/questions/30052889/
