
java - Apache Flink - Unable to create hourly/daily log files with Log4j


I am unable to create daily and hourly log files with log4j (specifically the task executor logs).

Here is my log4j.properties:

# This affects logging for both user code and Flink
log4j.rootLogger=INFO, file

# Uncomment this if you want to _only_ change Flink's logging
#log4j.logger.org.apache.flink=INFO

# The following lines keep the log level of common libraries/connectors on
# log level INFO. The root logger does not override this. You have to manually
# change the log levels here.
log4j.logger.akka=INFO
log4j.logger.org.apache.kafka=INFO
log4j.logger.org.apache.hadoop=INFO
log4j.logger.org.apache.zookeeper=INFO

# Log all infos in the given file
log4j.appender.file=org.apache.log4j.DailyRollingFileAppender
log4j.appender.file.File=${log.file}
log4j.appender.file.DatePattern='.'yyyy-MM-dd-HH
log4j.appender.file.append=false
log4j.appender.file.layout=org.apache.log4j.PatternLayout
log4j.appender.file.layout.ConversionPattern=%d{yyyy-MM-dd HH:mm:ss,SSS} %-5p %-60c %x - %m%n
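
For reference, the same rollover behaviour can be reproduced outside Flink with a small standalone Java sketch (the class name HourlyRollingDemo and the /tmp/demo.log path are only illustrative, not part of the original setup). With DailyRollingFileAppender the finest-grained field of the DatePattern decides the rollover frequency: '.'yyyy-MM-dd-HH rolls every hour, '.'yyyy-MM-dd every day.

import org.apache.log4j.DailyRollingFileAppender;
import org.apache.log4j.Level;
import org.apache.log4j.Logger;
import org.apache.log4j.PatternLayout;

public class HourlyRollingDemo {
    public static void main(String[] args) throws Exception {
        // Same layout as in the properties file above.
        PatternLayout layout =
                new PatternLayout("%d{yyyy-MM-dd HH:mm:ss,SSS} %-5p %-60c %x - %m%n");

        // '.'yyyy-MM-dd-HH -> new file every hour; '.'yyyy-MM-dd -> every day.
        // /tmp/demo.log is a hypothetical path used only for this sketch.
        DailyRollingFileAppender file =
                new DailyRollingFileAppender(layout, "/tmp/demo.log", "'.'yyyy-MM-dd-HH");

        Logger root = Logger.getRootLogger();
        root.setLevel(Level.INFO);
        root.addAppender(file);
        root.info("rolling file appender configured");
    }
}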

${log.file} points to:

2020-02-13 17:40:51,105 INFO  org.apache.flink.runtime.taskexecutor.TaskManagerRunner       -     -Dlog.file=/.../flink-1.9.1/log/flink-tarantula-taskexecutor-0-...log
  • There is also a logback.xml configuration (which one does Flink use?)

  • The logback.xml file:

<configuration>
    <appender name="file" class="ch.qos.logback.core.FileAppender">
        <file>${log.file}</file>
        <append>false</append>
        <encoder>
            <pattern>%d{yyyy-MM-dd HH:mm:ss.SSS} [%thread] %-5level %logger{60} %X{sourceThread} - %msg%n</pattern>
        </encoder>
    </appender>

    <!-- This affects logging for both user code and Flink -->
    <root level="INFO">
        <appender-ref ref="file"/>
    </root>

    <!-- Uncomment this if you want to only change Flink's logging -->
    <!--<logger name="org.apache.flink" level="INFO">-->
        <!--<appender-ref ref="file"/>-->
    <!--</logger>-->

    <!-- The following lines keep the log level of common libraries/connectors on
         log level INFO. The root logger does not override this. You have to manually
         change the log levels here. -->
    <logger name="akka" level="INFO">
        <appender-ref ref="file"/>
    </logger>
    <logger name="org.apache.kafka" level="INFO">
        <appender-ref ref="file"/>
    </logger>
    <logger name="org.apache.hadoop" level="INFO">
        <appender-ref ref="file"/>
    </logger>
    <logger name="org.apache.zookeeper" level="INFO">
        <appender-ref ref="file"/>
    </logger>

    <!-- Suppress the irrelevant (wrong) warnings from the Netty channel handler -->
    <logger name="org.apache.flink.shaded.akka.org.jboss.netty.channel.DefaultChannelPipeline" level="ERROR">
        <appender-ref ref="file"/>
    </logger>
</configuration>
  • Here is the flink/lib folder: apache-log4j-extras-1.2.17.jar, flink-dist_2.11-1.9.1.jar, flink-table_2.11-1.9.1.jar, flink-table-blink_2.11-1.9.1.jar, log4j-1.2.17.jar, slf4j-log4j12-1.7.15.jar

Final solution

After all my attempts I found the solution. I was unable to create daily log files because the following statement was preventing the daily files from being created:

log4j.logger.org.apache.flink.shaded.akka.org.jboss.netty.channel.DefaultChannelPipeline=ERROR, file
log4j.rootLogger=INFO, file

An example that creates log files every day at noon and at midnight:

log4j.category.org.apache.flink.shaded.akka.org.jboss.netty.channel.DefaultChannelPipeline=ERROR, nettyFileAppender
log4j.rootLogger=INFO, file

# Uncomment this if you want to _only_ change Flink's logging
#log4j.logger.org.apache.flink=INFO

# The following lines keep the log level of common libraries/connectors on
# log level INFO. The root logger does not override this. You have to manually
# change the log levels here.
log4j.logger.akka=INFO
log4j.logger.org.apache.kafka=INFO
log4j.logger.org.apache.hadoop=INFO
log4j.logger.org.apache.zookeeper=INFO

# Log all infos in the given file
log4j.appender.file=org.apache.log4j.DailyRollingFileAppender
log4j.appender.file.file=${log.file}
log4j.appender.file.append=false
log4j.appender.file.DatePattern='.'yyyy-MM-dd-a
log4j.appender.file.layout=org.apache.log4j.PatternLayout
log4j.appender.file.layout.ConversionPattern=%d{yyyy-MM-dd HH:mm:ss,SSS} %-5p %-60c %x - %m%n

# Suppress the irrelevant (wrong) warnings from the Netty channel handler
log4j.appender.nettyFileAppender=org.apache.log4j.FileAppender
log4j.appender.nettyFileAppender.file=/path/to/nettyLog/nettyChannelIrrelevant.log
log4j.appender.nettyFileAppender.append=false
log4j.appender.nettyFileAppender.layout=org.apache.log4j.PatternLayout
log4j.appender.nettyFileAppender.layout.ConversionPattern=%d{yyyy-MM-dd HH:mm:ss,SSS} %-5p %-60c %x - %m%n
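
A side note on this configuration (not part of the original answer, just standard log4j 1.x behaviour): logger additivity defaults to true, so ERROR events from the Netty channel handler are written to nettyFileAppender and still bubble up to the root "file" appender, unless additivity is switched off for that logger (log4j.additivity.<loggerName>=false in the properties file). A minimal sketch to inspect or change this programmatically (the class name is only illustrative):

import org.apache.log4j.Logger;

public class NettyLoggerAdditivityCheck {
    public static void main(String[] args) {
        // By default additivity is true, so events logged here also reach the
        // appenders attached to the root logger (the rolling "file" appender).
        Logger netty = Logger.getLogger(
                "org.apache.flink.shaded.akka.org.jboss.netty.channel.DefaultChannelPipeline");
        System.out.println("additivity = " + netty.getAdditivity());

        // netty.setAdditivity(false); // keep these entries only in nettyFileAppender
    }
}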

Best answer

You did not specify how you are providing the log4j.properties file. A common problem is that people include this file in their jar, but it is then ignored because Flink uses the conf/log4j.properties file located on the cluster.

Assuming that your jar does not contain anything beyond the classes from the slf4j-api jar, Flink will pick up the slf4j-log4j12.jar in flink/lib (since it is on the classpath) and therefore use log4j (not logback), so the logback.xml configuration file will be ignored.
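
If you want to double-check which backend SLF4J actually bound on your classpath, a minimal sketch (the class name WhichLoggingBackend is just for illustration; run it with the same jars as the TaskManager, i.e. with flink/lib on the classpath) is:

import org.slf4j.LoggerFactory;

public class WhichLoggingBackend {
    public static void main(String[] args) {
        // With slf4j-log4j12 bound this prints org.slf4j.impl.Log4jLoggerFactory;
        // with logback-classic it would print ch.qos.logback.classic.LoggerContext.
        System.out.println(LoggerFactory.getILoggerFactory().getClass().getName());
    }
}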

Regarding "java - Apache Flink - Unable to create hourly/daily log files with Log4j", we found a similar question on Stack Overflow: https://stackoverflow.com/questions/60221379/
