logging - Pentaho Kettle - Error writing to log file

Tags: logging pentaho kettle

We have a Pentaho job that runs fine in our local environment, but after deploying it and running it with Kettle, we get an error when writing to the log file. The error occurs in a job that has the "Execute for every input row?" option checked. Below is how the logging settings are configured; the path and name are variables set earlier. It logs to the file without problems up until this step.

[Screenshot: job logging configuration]

This is the error I get when running Kettle at the Debug log level. The failing job also writes to a log; I don't know whether that is bad practice. Has anyone else run into this problem and found a workaround?

ProcessFiles - Log folder [file:////<ServerPath>/QA/PentahoLogs] exists.
ProcessFiles - ERROR (version 6.1.0.1-196, build 1 from 2016-04-07 12.08.49 by buildguy) : Unable to open file appender for file [${LOGFOLDER}${LOGFILENAME}_20161005.txt] : org.pentaho.di.core.exception.KettleException:
ProcessFiles - There was an error while trying to open file 'file:////<ServerPath>/QA/PentahoLogs/PartImportLog_20161005.txt' for writing
ProcessFiles - Could not write to "file:////<ServerPath>/QA/PentahoLogs/PartImportLog_20161005.txt" because it is currently in use.
ProcessFiles - ERROR (version 6.1.0.1-196, build 1 from 2016-04-07 12.08.49 by buildguy) : org.pentaho.di.core.exception.KettleException:
ProcessFiles - There was an error while trying to open file 'file:////<ServerPath>/QA/PentahoLogs/PartImportLog_20161005.txt' for writing
ProcessFiles - Could not write to "file:////<ServerPath>/QA/PentahoLogs/PartImportLog_20161005.txt" because it is currently in use.
ProcessFiles -
ProcessFiles -    at org.pentaho.di.core.logging.LogChannelFileWriter.<init>(LogChannelFileWriter.java:78)
ProcessFiles -    at org.pentaho.di.core.logging.LogChannelFileWriter.<init>(LogChannelFileWriter.java:96)
ProcessFiles -    at org.pentaho.di.job.entries.job.JobEntryJob.execute(JobEntryJob.java:552)
ProcessFiles -    at org.pentaho.di.job.Job.execute(Job.java:723)
ProcessFiles -    at org.pentaho.di.job.Job.execute(Job.java:864)
ProcessFiles -    at org.pentaho.di.job.Job.execute(Job.java:864)
ProcessFiles -    at org.pentaho.di.job.Job.execute(Job.java:864)
ProcessFiles -    at org.pentaho.di.job.Job.execute(Job.java:545)
ProcessFiles -    at org.pentaho.di.job.Job.run(Job.java:435)
ProcessFiles - Caused by: org.apache.commons.vfs2.FileSystemException: Could not write to "file:////<ServerPath>/QA/PentahoLogs/PartImportLog_20161005.txt" because it is currently in use.
ProcessFiles -    at org.apache.commons.vfs2.provider.DefaultFileContent.getOutputStream(DefaultFileContent.java:475)
ProcessFiles -    at org.pentaho.di.core.vfs.KettleVFS.getOutputStream(KettleVFS.java:289)
ProcessFiles -    at org.pentaho.di.core.logging.LogChannelFileWriter.<init>(LogChannelFileWriter.java:76)
ProcessFiles -    ... 8 more

Best Answer

Make sure the log path/file is not in use by another job, whether under the same repository user or a different one.
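The colliding filename in the trace (`PartImportLog_20161005.txt`) is built only from a date, so any two jobs, or two iterations of a row-driven loop, running on the same day target the same file. A minimal sketch of one way to avoid that, assuming you can add a per-run token to the `${LOGFILENAME}` variable before the logging step (the `uuid` token here is illustrative, not a Kettle built-in):

```python
import datetime
import os
import uuid

def log_path(folder, base, unique=False):
    """Build a path shaped like Kettle's ${LOGFOLDER}/${LOGFILENAME}_YYYYMMDD.txt.

    With unique=True, a per-run token keeps concurrent executions from
    opening the same file and hitting "currently in use" errors.
    """
    stamp = datetime.date.today().strftime("%Y%m%d")
    token = f"_{uuid.uuid4().hex[:8]}" if unique else ""
    return os.path.join(folder, f"{base}_{stamp}{token}.txt")

# Date-only names collide: two runs on the same day get the same file.
a = log_path("/QA/PentahoLogs", "PartImportLog")
b = log_path("/QA/PentahoLogs", "PartImportLog")
print(a == b)   # → True (collision)

# A unique token gives each run its own log file.
c = log_path("/QA/PentahoLogs", "PartImportLog", unique=True)
d = log_path("/QA/PentahoLogs", "PartImportLog", unique=True)
print(c == d)   # → False
```

The same idea applies inside Kettle itself: if a variable such as a run timestamp or an internal job ID is appended to the log filename, "Execute for every input row?" iterations no longer contend for one file.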

Regarding "logging - Pentaho Kettle - Error writing to log file", a similar question was found on Stack Overflow: https://stackoverflow.com/questions/39877014/
