
java - Processor receives multiple messages with the same payload multiple times


I'm starting a new project with "spring-cloud-dataflow", developing a set of jars for my use case.

One of them is a processor that untars files coming from a file source; the application uses a customized version of integration-zip, extended to handle tar and gunzip file compression.

So my problem is the following: when my source sends a single message with a file reference, the processor receives that message multiple times, with the same payload but different IDs.

Here are the log files of both components.

As you can see, only one message is generated for the file:

2017-10-02 12:38:28.013  INFO 17615 --- [ask-scheduler-3] o.s.i.file.FileReadingMessageSource      : Created message: [GenericMessage [payload=/tmp/patent/CNINO_im_201733_batch108.tgz, headers={id=0b99b840-e3b3-f742-44ec-707aeea638c8, timestamp=1506940708013}]]

while the processor receives three incoming messages:

2017-10-02 12:38:28.077  INFO 17591 --- [           -L-1] o.s.i.codec.kryo.CompositeKryoRegistrar  : registering [40, java.io.File] with serializer org.springframework.integration.codec.kryo.FileSerializer
2017-10-02 12:38:28.080  INFO 17591 --- [           -L-1] .w.c.s.a.c.p.AbstractCompressTransformer : Message 'GenericMessage [payload=/tmp/patent/CNINO_im_201733_batch108.tgz, headers={kafka_offset=1, id=1a4d4b9c-86fe-d3a8-d800-8013e8ae7027, kafka_receivedPartitionId=0, contentType=application/x-java-object;type=java.io.File, kafka_receivedTopic=untar.file, timestamp=1506940708079}]' unpacking started...
2017-10-02 12:38:28.080 INFO 17591 --- [ -L-1] .w.c.s.a.c.p.AbstractCompressTransformer : Check message's payload type to decompress
2017-10-02 12:38:29.106  INFO 17591 --- [           -L-1] .w.c.s.a.c.p.AbstractCompressTransformer : Message 'GenericMessage [payload=/tmp/patent/CNINO_im_201733_batch108.tgz, headers={kafka_offset=1, id=cd611ca4-4cd9-0624-0871-dcf93a9a0051, kafka_receivedPartitionId=0, contentType=application/x-java-object;type=java.io.File, kafka_receivedTopic=untar.file, timestamp=1506940709106}]' unpacking started...
2017-10-02 12:38:29.107 INFO 17591 --- [ -L-1] .w.c.s.a.c.p.AbstractCompressTransformer : Check message's payload type to decompress
2017-10-02 12:38:31.108 INFO 17591 --- [ -L-1] .w.c.s.a.c.p.AbstractCompressTransformer : Message 'GenericMessage [payload=/tmp/patent/CNINO_im_201733_batch108.tgz, headers={kafka_offset=1, id=97171a2e-29ac-2111-b838-3da7220f5e3c, kafka_receivedPartitionId=0, contentType=application/x-java-object;type=java.io.File, kafka_receivedTopic=untar.file, timestamp=1506940711108}]' unpacking started...
2017-10-02 12:38:31.108 INFO 17591 --- [ -L-1] .w.c.s.a.c.p.AbstractCompressTransformer : Check message's payload type to decompress
2017-10-02 12:38:31.116 ERROR 17591 --- [ -L-1] o.s.integration.handler.LoggingHandler : org.springframework.integration.transformer.MessageTransformationException: failed to transform message; nested exception is org.springframework.messaging.MessageHandlingException: Failed to apply Zip transformation.; nested exception is java.io.FileNotFoundException: /tmp/patent/CNINO_im_201733_batch108.tgz (File o directory non esistente), failedMessage=GenericMessage [payload=/tmp/patent/CNINO_im_201733_batch108.tgz, headers={kafka_offset=1, id=97171a2e-29ac-2111-b838-3da7220f5e3c, kafka_receivedPartitionId=0, contentType=application/x-java-object;type=java.io.File, kafka_receivedTopic=untar.file, timestamp=1506940711108}], failedMessage=GenericMessage [payload=/tmp/patent/CNINO_im_201733_batch108.tgz, headers={kafka_offset=1, id=97171a2e-29ac-2111-b838-3da7220f5e3c, kafka_receivedPartitionId=0, contentType=application/x-java-object;type=java.io.File, kafka_receivedTopic=untar.file, timestamp=1506940711108}]
at org.springframework.integration.transformer.AbstractTransformer.transform(AbstractTransformer.java:44)

I can't find any solution to this problem. Has anyone run into the same issue and found a workaround? Or is there some configuration I'm missing?

EDIT:

I'm using a local version of SDFS, release 1.2.2.RELEASE, so file IO operations work on the same file system, and for SCS I'm using version Ditmars.BUILD-SNAPSHOT.

Unfortunately, even if I disable the file-delete operation, the application still processes the message several times. Here are some code snippets from my project repo:

This is my processor class:

@EnableBinding(Processor.class)
@EnableConfigurationProperties(UnTarProperties.class)
public class UnTarProcessor {

    @Autowired
    private UnTarProperties properties;

    @Autowired
    private Processor processor;

    @Bean
    public UncompressedResultSplitter splitter() {
        return new UncompressedResultSplitter();
    }

    @Bean
    public UnTarGzTransformer transformer() {
        UnTarGzTransformer unTarGzTransformer = new UnTarGzTransformer(properties.isUseGzCompression());
        unTarGzTransformer.setExpectSingleResult(properties.isSingleResult());
        unTarGzTransformer.setWorkDirectory(new File(properties.getWorkDirectory()));
        unTarGzTransformer.setDeleteFiles(properties.isDeleteFile());

        return unTarGzTransformer;
    }

    @Bean
    public IntegrationFlow process() {
        return IntegrationFlows.from(processor.input())
                .transform(transformer())
                .split(splitter())
                .channel(processor.output())
                .get();
    }
}
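The UnTarProperties class is not shown in the question. As a minimal sketch, reconstructed purely from the getters the processor calls (the prefix, property names, and defaults below are assumptions, not the actual class), it could look like this:

// Hypothetical sketch of UnTarProperties, inferred from the getters used above;
// the "untar" prefix and all defaults are assumptions.
@ConfigurationProperties("untar")
public class UnTarProperties {

    private boolean useGzCompression = true;     // wrap the stream in a GZIPInputStream
    private boolean singleResult = false;        // fail unless exactly one file is extracted
    private String workDirectory = "/tmp/untar"; // where tar entries are unpacked
    private boolean deleteFile = false;          // delete the source archive after unpacking

    public boolean isUseGzCompression() { return useGzCompression; }
    public void setUseGzCompression(boolean useGzCompression) { this.useGzCompression = useGzCompression; }

    public boolean isSingleResult() { return singleResult; }
    public void setSingleResult(boolean singleResult) { this.singleResult = singleResult; }

    public String getWorkDirectory() { return workDirectory; }
    public void setWorkDirectory(String workDirectory) { this.workDirectory = workDirectory; }

    public boolean isDeleteFile() { return deleteFile; }
    public void setDeleteFile(boolean deleteFile) { this.deleteFile = deleteFile; }
}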

And this is the core method used to decompress files:

@Override
protected Object doCompressTransform(final Message<?> message) throws Exception {
    logger.info(String.format("Message '%s' unpacking started...", message));

    try (InputStream checkMessage = checkMessage(message);
         InputStream inputStream = (gzCompression
                 ? new BufferedInputStream(new GZIPInputStream(checkMessage))
                 : new BufferedInputStream(checkMessage))) {

        final Object payload = message.getPayload();
        final Object unzippedData;

        try (TarArchiveInputStream tarIn = new TarArchiveInputStream(inputStream)) {
            TarArchiveEntry entry = null;

            final SortedMap<String, Object> uncompressedData = new TreeMap<String, Object>();

            while ((entry = (TarArchiveEntry) tarIn.getNextEntry()) != null) {

                final String zipEntryName = entry.getName();
                final Date zipEntryTime = entry.getLastModifiedDate();
                final long zipEntryCompressedSize = entry.getSize();

                final String type = entry.isDirectory() ? "directory" : "file";

                // each message gets its own work directory, keyed by the message id
                final File tempDir = new File(workDirectory, message.getHeaders().getId().toString());
                tempDir.mkdirs(); // NOSONAR false positive

                final File destinationFile = new File(tempDir, zipEntryName);

                if (entry.isDirectory()) {
                    destinationFile.mkdirs(); // NOSONAR false positive
                }
                else {
                    unpackEntries(tarIn, entry, tempDir);
                    uncompressedData.put(zipEntryName, destinationFile);
                }
            }

            if (uncompressedData.isEmpty()) {
                unzippedData = null;
            }
            else {
                if (this.expectSingleResult) {
                    if (uncompressedData.size() == 1) {
                        unzippedData = uncompressedData.values().iterator().next();
                    }
                    else {
                        throw new MessagingException(message, String.format("The UnZip operation extracted %s "
                                + "result objects but expectSingleResult was 'true'.", uncompressedData.size()));
                    }
                }
                else {
                    unzippedData = uncompressedData;
                }
            }

            logger.info("Payload unpacking completed...");
        }
        finally {
            // the source archive is deleted here when deleteFiles is enabled
            if (payload instanceof File && this.deleteFiles) {
                final File filePayload = (File) payload;
                if (!filePayload.delete() && logger.isWarnEnabled()) {
                    logger.warn("failed to delete File '" + filePayload + "'");
                }
            }
        }
        return unzippedData;
    }
    catch (Exception e) {
        throw new MessageHandlingException(message, "Failed to apply Zip transformation.", e);
    }
}
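The unpackEntries() helper is referenced above but not shown in the question. A minimal sketch of what it presumably does, copying the current tar entry into the work directory (only the signature is known from the call site; the body is an assumed implementation):

// Hypothetical sketch of unpackEntries(); the body is an assumption based on
// the commons-compress API, not the project's actual code.
private void unpackEntries(TarArchiveInputStream tarIn, TarArchiveEntry entry, File tempDir)
        throws IOException {
    final File destinationFile = new File(tempDir, entry.getName());
    if (destinationFile.getParentFile() != null) {
        destinationFile.getParentFile().mkdirs(); // create intermediate directories
    }
    try (OutputStream out = new BufferedOutputStream(new FileOutputStream(destinationFile))) {
        // copies only the bytes of the current entry from the tar stream
        org.apache.commons.compress.utils.IOUtils.copy(tarIn, out);
    }
}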

Here is the checkMessage() method that throws the exception:

protected InputStream checkMessage(Message<?> message) throws FileNotFoundException {
    logger.info("Check message's payload type to decompress");

    InputStream inputStream;
    Object payload = message.getPayload();

    if (payload instanceof File) {
        final File filePayload = (File) payload;

        if (filePayload.isDirectory()) {
            throw new UnsupportedOperationException(String.format("Cannot unzip a directory: '%s'",
                    filePayload.getAbsolutePath()));
        }

        inputStream = new FileInputStream(filePayload);
    }
    else if (payload instanceof InputStream) {
        inputStream = (InputStream) payload;
    }
    else if (payload instanceof byte[]) {
        inputStream = new ByteArrayInputStream((byte[]) payload);
    }
    else {
        throw new IllegalArgumentException(String.format("Unsupported payload type '%s'. " +
                "The only supported payload types are java.io.File, byte[] and java.io.InputStream",
                payload.getClass().getSimpleName()));
    }

    return inputStream;
}
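For completeness, a minimal sketch of exercising the transformer directly with a File payload (the constructor and setter names come from the snippets above; invoking the protected doCompressTransform() like this assumes a test class in the transformer's package):

// Minimal usage sketch, assuming a test in the transformer's package.
UnTarGzTransformer transformer = new UnTarGzTransformer(true); // gz compression enabled
transformer.setWorkDirectory(new File("/tmp/untar-work"));
transformer.setDeleteFiles(false); // keep the source archive while testing

Message<File> message = MessageBuilder
        .withPayload(new File("/tmp/patent/CNINO_im_201733_batch108.tgz"))
        .build();

// Returns a single File, or a SortedMap of entry name -> File, per the code above.
Object result = transformer.doCompressTransform(message);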

I'd really appreciate any help. Thanks a lot.

Best Answer

We need more info: the versions of SCDF and the SCS apps, and at least the DSL showing how you deployed your apps. An example of the kind of stream definition being asked about is shown below.
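For illustration only, a stream definition in the SCDF shell might look like this (the app name "untar" and the directory are hypothetical, not taken from the question):

stream create untar-stream --definition "file --directory=/tmp/patent | untar | log" --deploy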

I just checked your log. Do you realize that your consumer fails to consume the message due to a FileNotFoundException? You are not receiving the same message multiple times; SCS is simply trying to redeliver it before failing. Check your full log and find out why the file can't be opened at the specified location.
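This is consistent with the log above: three deliveries matches Spring Cloud Stream's default consumer retry of maxAttempts=3. If the goal is to surface the failure once instead of retrying, the binding's retry can be turned down (a sketch; "input" is assumed to be the processor's input binding name):

# Spring Cloud Stream retries a failed delivery maxAttempts times (default 3),
# which shows up as "same payload, different id" messages in the processor log.
# Setting it to 1 disables retry for the processor's input binding.
spring.cloud.stream.bindings.input.consumer.maxAttempts=1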

Regarding "java - Processor receives multiple messages with the same payload multiple times", a similar question was found on Stack Overflow: https://stackoverflow.com/questions/46524126/
