logging - Pentaho Kettle - Error writing to log file

We have a Pentaho job that runs fine in our local environment, but after deploying it and running it with Kettle we get an error writing to the log file. The error occurs in a job entry that has the "Execute for every input row?" option checked. Below is how the logging settings are configured; the path and name are variables set earlier in the job. Up to this step it logs to the file without problems.

[Screenshot: job logging configuration]

This is the error I get when running Kettle at the Debug log level. The failing job also writes to a log itself; I don't know whether that is bad practice. Has anyone else run into this problem and found a way around it?

ProcessFiles - Log folder [file:////<ServerPath>/QA/PentahoLogs] exists.
ProcessFiles - ERROR (version 6.1.0.1-196, build 1 from 2016-04-07 12.08.49 by buildguy) : Unable to open file appender for file [${LOGFOLDER}${LOGFILENAME}_20161005.txt] : org.pentaho.di.core.exception.KettleException:
ProcessFiles - There was an error while trying to open file 'file:////<ServerPath>/QA/PentahoLogs/PartImportLog_20161005.txt' for writing
ProcessFiles - Could not write to "file:////<ServerPath>/QA/PentahoLogs/PartImportLog_20161005.txt" because it is currently in use.
ProcessFiles - ERROR (version 6.1.0.1-196, build 1 from 2016-04-07 12.08.49 by buildguy) : org.pentaho.di.core.exception.KettleException:
ProcessFiles - There was an error while trying to open file 'file:////<ServerPath>/QA/PentahoLogs/PartImportLog_20161005.txt' for writing
ProcessFiles - Could not write to "file:////<ServerPath>/QA/PentahoLogs/PartImportLog_20161005.txt" because it is currently in use.
ProcessFiles -
ProcessFiles - at org.pentaho.di.core.logging.LogChannelFileWriter.<init>(LogChannelFileWriter.java:78)
ProcessFiles - at org.pentaho.di.core.logging.LogChannelFileWriter.<init>(LogChannelFileWriter.java:96)
ProcessFiles - at org.pentaho.di.job.entries.job.JobEntryJob.execute(JobEntryJob.java:552)
ProcessFiles - at org.pentaho.di.job.Job.execute(Job.java:723)
ProcessFiles - at org.pentaho.di.job.Job.execute(Job.java:864)
ProcessFiles - at org.pentaho.di.job.Job.execute(Job.java:864)
ProcessFiles - at org.pentaho.di.job.Job.execute(Job.java:864)
ProcessFiles - at org.pentaho.di.job.Job.execute(Job.java:545)
ProcessFiles - at org.pentaho.di.job.Job.run(Job.java:435)
ProcessFiles - Caused by: org.apache.commons.vfs2.FileSystemException: Could not write to "file:////<ServerPath>/QA/PentahoLogs/PartImportLog_20161005.txt" because it is currently in use.
ProcessFiles - at org.apache.commons.vfs2.provider.DefaultFileContent.getOutputStream(DefaultFileContent.java:475)
ProcessFiles - at org.pentaho.di.core.vfs.KettleVFS.getOutputStream(KettleVFS.java:289)
ProcessFiles - at org.pentaho.di.core.logging.LogChannelFileWriter.<init>(LogChannelFileWriter.java:76)
ProcessFiles - ... 8 more

Best Answer

Make sure the log path/file is not already in use by another job, whether it runs under the same repository user or a different one.
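The "currently in use" message comes from Apache Commons VFS, which the stack trace shows Kettle's LogChannelFileWriter using to open the log file (KettleVFS.getOutputStream, then DefaultFileContent.getOutputStream). Commons VFS refuses to hand out a second output stream for a file it already has open for writing, so the error is expected whenever two appenders, for example the parent job's own log and the per-row executions of the sub-job, point at the same ${LOGFOLDER}${LOGFILENAME} path at the same time. The Java sketch below only illustrates that locking behaviour against commons-vfs2, using a hypothetical local path; it is not Kettle's own code.

import java.io.OutputStream;

import org.apache.commons.vfs2.FileObject;
import org.apache.commons.vfs2.FileSystemException;
import org.apache.commons.vfs2.FileSystemManager;
import org.apache.commons.vfs2.VFS;

public class VfsLogLockDemo {
    public static void main(String[] args) throws Exception {
        FileSystemManager fsManager = VFS.getManager();

        // Hypothetical local path for the demo; the real job resolves
        // ${LOGFOLDER}${LOGFILENAME}_<date>.txt on the server share.
        FileObject logFile = fsManager.resolveFile("file:///tmp/PartImportLog_demo.txt");

        // First writer: open the file in append mode and keep the stream open,
        // as a running job's log file appender does.
        OutputStream first = logFile.getContent().getOutputStream(true);
        try {
            // Second writer on the same file: commons-vfs2 refuses and throws
            // "Could not write to ... because it is currently in use.", the same
            // FileSystemException that appears in the Kettle log above.
            logFile.getContent().getOutputStream(true);
        } catch (FileSystemException e) {
            System.out.println("Reproduced: " + e.getMessage());
        } finally {
            first.close();
        }
    }
}

If both the parent job and the looping job entry genuinely need file logging, one workaround is to give each level (or each iteration) its own file name, for example by appending the sub-job name or a timestamp variable to ${LOGFILENAME}, so that no two appenders target the same path at once.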

Regarding "logging - Pentaho Kettle - Error writing to log file", a similar question can be found on Stack Overflow: https://stackoverflow.com/questions/39877014/
