
java - Uploading a RAMDirectory to Azure Cloud creates an EOFException


I am currently trying to use AzureBlobStorage with Lucene. So I created a new Directory, and to avoid too much latency I use a RAMDirectory as a cache (this may not be the best solution, but it seemed easy to do and I am open to suggestions). Anyway, everything seems to work fine, except that writing the .nrm files to the cloud always raises an EOFException when I upload them to the blob.

I will quickly explain how the directory works, since it helps to understand: I created a new IndexOutput called BlobOutputStream, which pretty much wraps a RAMOutputStream; however, when it is closed, it uploads everything to the Azure blob storage. Here is how this is done:

String fname = name;
output.flush();
long length = output.length();
output.close();
System.out.println("Size of the upload: " + length);
InputStream bStream = directory.openCachedInputAsStream(fname);
System.out.println("Uploading cache version of: " + fname);
blob.upload(bStream, length);
System.out.println("PUT finished for: " + fname);

blob is a CloudBlockBlob, output is a RAMOutputStream, and directory.openCachedInputAsStream opens a new InputStream over an IndexInput.
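
For reference, the whole wrapper could look roughly like the sketch below. This is a minimal sketch under assumptions, not the real class: it assumes the Lucene 3.x IndexOutput contract (which matches the stack trace further down) and the CloudBlockBlob.upload(InputStream, long) overload; the CachedDirectory interface and the constructor are hypothetical stand-ins for the custom directory code:

import java.io.IOException;
import java.io.InputStream;

import org.apache.lucene.store.IndexOutput;
import org.apache.lucene.store.RAMOutputStream;

import com.microsoft.windowsazure.services.blob.client.CloudBlockBlob;

/** Hypothetical stand-in for the custom directory's cache-reading method. */
interface CachedDirectory {
    InputStream openCachedInputAsStream(String name) throws IOException;
}

/** Buffers all writes in RAM, then uploads the whole file to blob storage on close(). */
public class BlobOutputStream extends IndexOutput {
    private final String name;
    private final RAMOutputStream output;
    private final CloudBlockBlob blob;
    private final CachedDirectory directory;

    public BlobOutputStream(String name, RAMOutputStream output,
                            CloudBlockBlob blob, CachedDirectory directory) {
        this.name = name;
        this.output = output;
        this.blob = blob;
        this.directory = directory;
    }

    // Plain delegation to the in-memory cache for all writes.
    @Override public void writeByte(byte b) throws IOException { output.writeByte(b); }
    @Override public void writeBytes(byte[] b, int offset, int len) throws IOException {
        output.writeBytes(b, offset, len);
    }
    @Override public void flush() throws IOException { output.flush(); }
    @Override public long getFilePointer() { return output.getFilePointer(); }
    @Override public void seek(long pos) throws IOException { output.seek(pos); }
    @Override public long length() throws IOException { return output.length(); }

    @Override
    public void close() throws IOException {
        output.flush();
        long length = output.length();
        output.close();
        // Re-read the finished file from the RAM cache and push it to the blob.
        InputStream bStream = directory.openCachedInputAsStream(name);
        try {
            blob.upload(bStream, length);
        } catch (Exception e) {
            throw new IOException("Upload of " + name + " failed", e);
        } finally {
            bStream.close();
        }
    }
}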

So, most of the time everything works fine, except that the .nrm files always raise an EOFException on upload. I have checked, though: when there is only one document in the index, they are 5 bytes long and contain "NRM-1 and the norm for that document".

I really don't understand why Azure tries to upload more than what is in the file, since I specify the size of the stream in the upload call.

I'm sorry if I'm not being clear; this is quite hard to explain. If you need more code, just tell me and I'll make everything available on GitHub or somewhere else.

Thanks for your answers.

EDIT

So maybe the code of my InputStream will show the problem:

public class StreamInput extends InputStream {
    public IndexInput input;

    public StreamInput(IndexInput openInput) {
        input = openInput;
    }

    @Override
    public int read() throws IOException {
        System.out.println("Attempt to read byte: " + input.getFilePointer());
        int b = input.readByte();
        System.out.println(b);
        return b;
    }
}

And here is the trace I get:


Size of the upload: 5
Uploading cache version of: _0.nrm
Attempt to read byte: 0
78
Attempt to read byte: 1
82
Attempt to read byte: 2
77
Attempt to read byte: 3
-1
Attempt to read byte: 4
114
Attempt to read byte: 5
Attempt to read byte: 1029
java.io.EOFException: read past EOF: RAMInputStream(name=_0.nrm)
at org.apache.lucene.store.RAMInputStream.switchCurrentBuffer(RAMInputStream.java:100)
at org.apache.lucene.store.RAMInputStream.readByte(RAMInputStream.java:73)
at org.lahab.clucene.core.StreamInput.read(StreamInput.java:18)
at java.io.InputStream.read(InputStream.java:151)
at com.microsoft.windowsazure.services.core.storage.utils.Utility.writeToOutputStream(Utility.java:1024)
at com.microsoft.windowsazure.services.blob.client.BlobOutputStream.write(BlobOutputStream.java:560)
at com.microsoft.windowsazure.services.blob.client.CloudBlockBlob.upload(CloudBlockBlob.java:455)
at com.microsoft.windowsazure.services.blob.client.CloudBlockBlob.upload(CloudBlockBlob.java:374)
at org.lahab.clucene.core.BlobOutputStream.close(BlobOutputStream.java:92)
at org.apache.lucene.util.IOUtils.close(IOUtils.java:141)
at org.apache.lucene.index.NormsWriter.flush(NormsWriter.java:172)
at org.apache.lucene.index.DocInverter.flush(DocInverter.java:71)
at org.apache.lucene.index.DocFieldProcessor.flush(DocFieldProcessor.java:60)
at org.apache.lucene.index.DocumentsWriter.flush(DocumentsWriter.java:581)
at org.apache.lucene.index.IndexWriter.doFlush(IndexWriter.java:3587)
at org.apache.lucene.index.IndexWriter.prepareCommit(IndexWriter.java:3376)
at org.apache.lucene.index.IndexWriter.commitInternal(IndexWriter.java:3485)
at org.apache.lucene.index.IndexWriter.commit(IndexWriter.java:3467)
at org.apache.lucene.index.IndexWriter.commit(IndexWriter.java:3451)
at org.lahab.clucene.server.IndexerNode.addDocuments(IndexerNode.java:139)

It looks like the upload really does read too far...

Best Answer

So the problem was my InputStream, and the fact that I can't read documentation and convert bytes ;). My read function should have been:

System.out.println("file:" + input.getFilePointer() + "/" + input.length());
if (input.getFilePointer() >= input.length()) {
return -1;
}
System.out.println("Attempt to read byte: "+ input.getFilePointer());
int b = (int) input.readByte() & 0xff;
System.out.println(b);
return b;
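
With that fix in place, a quick sanity check against a plain RAMDirectory shows the stream terminating cleanly. This is a hypothetical snippet assuming the Lucene 3.x API; test.bin just mimics the 5-byte .nrm layout from the trace:

RAMDirectory dir = new RAMDirectory();
IndexOutput out = dir.createOutput("test.bin");
out.writeBytes(new byte[] { 'N', 'R', 'M', (byte) -1, 114 }, 5);
out.close();

InputStream in = new StreamInput(dir.openInput("test.bin"));
int b;
while ((b = in.read()) != -1) {
    System.out.println(b); // prints 78, 82, 77, 255, 114, then the loop ends
}
in.close();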

What the Javadoc says about InputStream.read():

Reads the next byte of data from the input stream. The value byte is returned as an int in the range 0 to 255. If no byte is available because the end of the stream has been reached, the value -1 is returned. This method blocks until input data is available, the end of the stream is detected, or an exception is thrown.

And the & 0xff is there to mask the sign bit.
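
To see why the mask matters, look at what the trace bytes actually are: 78 82 77 is "NRM", byte 3 is the -1 from the "NRM-1" header, and 114 is the norm itself. Without the mask, that header byte sign-extends to the int -1, which is exactly the value read() reserves for end of stream:

byte headerByte = (byte) -1;           // byte 3 of the .nrm file ("NRM-1" + the norm)
int unmasked = headerByte;             // sign-extends to -1: a spurious end-of-stream
int masked = headerByte & 0xff;        // 255: an ordinary data byte
System.out.println(unmasked + " / " + masked); // prints "-1 / 255"

Both bugs are visible in the trace: the -1 printed at byte 3 is this sign extension, and the EOFException itself comes from readByte() being called past the real end of the file because read() never returned -1.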

For this question, java - Uploading a RAMDirectory to Azure Cloud creates an EOFException, a similar question was found on Stack Overflow: https://stackoverflow.com/questions/12955397/
