
java.lang.OutOfMemoryError: Direct buffer memory when invoking Files.readAllBytes

Reprinted. Author: 行者123. Updated: 2023-12-05 03:10:26

I have the following code, which is meant to read a directory and compress it into a tar.gz archive. When I deployed the code to a server and tested it with batches of files, it worked for the first few test batches, but after the 4th or 5th batch it started consistently giving me java.lang.OutOfMemoryError: Direct buffer memory, even though the batch size stayed the same and heap space looked fine. Here is the code:

public static void compressDirectory(String archiveDirectoryToCompress) throws IOException {
    Path archiveToCompress = Files.createFile(Paths.get(archiveDirectoryToCompress + ".tar.gz"));

    try (GzipCompressorOutputStream gzipCompressorOutputStream = new GzipCompressorOutputStream(
             Files.newOutputStream(archiveToCompress));
         TarArchiveOutputStream tarArchiveOutputStream = new TarArchiveOutputStream(gzipCompressorOutputStream)) {
        Path directory = Paths.get(archiveDirectoryToCompress);
        Files.walk(directory)
             .filter(path -> !Files.isDirectory(path))
             .forEach(path -> {
                 String stringPath = path.toAbsolutePath().toString()
                         .replace(directory.toAbsolutePath().toString(), "")
                         .replace(path.getFileName().toString(), "");
                 TarArchiveEntry tarEntry = new TarArchiveEntry(stringPath + "/" + path.getFileName().toString());
                 try {
                     byte[] bytes = Files.readAllBytes(path); // It throws the error at this point.
                     tarEntry.setSize(bytes.length);
                     tarArchiveOutputStream.putArchiveEntry(tarEntry);
                     tarArchiveOutputStream.write(bytes);
                     tarArchiveOutputStream.closeArchiveEntry();
                 } catch (Exception e) {
                     LOGGER.error("There was an error while compressing the files", e);
                 }
             });
    }
}

Here is the exception:

Caused by: java.lang.OutOfMemoryError: Direct buffer memory
at java.nio.Bits.reserveMemory(Bits.java:658)
at java.nio.DirectByteBuffer.<init>(DirectByteBuffer.java:123)
at java.nio.ByteBuffer.allocateDirect(ByteBuffer.java:311)
at sun.nio.ch.Util.getTemporaryDirectBuffer(Util.java:174)
at sun.nio.ch.IOUtil.read(IOUtil.java:195)
at sun.nio.ch.FileChannelImpl.read(FileChannelImpl.java:158)
at sun.nio.ch.ChannelInputStream.read(ChannelInputStream.java:65)
at sun.nio.ch.ChannelInputStream.read(ChannelInputStream.java:109)
at sun.nio.ch.ChannelInputStream.read(ChannelInputStream.java:103)
at java.nio.file.Files.read(Files.java:3105)
at java.nio.file.Files.readAllBytes(Files.java:3158)
at com.ubs.gfs.etd.reporting.otc.trsloader.service.file.GmiEodFileArchiverService.lambda$compressDirectory$4(GmiEodFileArchiverService.java:124)
at com.ubs.gfs.etd.reporting.otc.trsloader.service.file.GmiEodFileArchiverService$$Lambda$19/183444013.accept(Unknown Source)
at java.util.stream.ForEachOps$ForEachOp$OfRef.accept(ForEachOps.java:184)
at java.util.stream.ReferencePipeline$2$1.accept(ReferencePipeline.java:175)
at java.util.stream.ReferencePipeline$3$1.accept(ReferencePipeline.java:193)
at java.util.Iterator.forEachRemaining(Iterator.java:116)
at java.util.Spliterators$IteratorSpliterator.forEachRemaining(Spliterators.java:1801)
at java.util.stream.AbstractPipeline.copyInto(AbstractPipeline.java:512)
at java.util.stream.AbstractPipeline.wrapAndCopyInto(AbstractPipeline.java:502)
at java.util.stream.ForEachOps$ForEachOp.evaluateSequential(ForEachOps.java:151)
at java.util.stream.ForEachOps$ForEachOp$OfRef.evaluateSequential(ForEachOps.java:174)
at java.util.stream.AbstractPipeline.evaluate(AbstractPipeline.java:234)
at java.util.stream.ReferencePipeline.forEach(ReferencePipeline.java:418)
at com.ubs.gfs.etd.reporting.otc.trsloader.service.file.GmiEodFileArchiverService.compressDirectory(GmiEodFileArchiverService.java:117)
at com.ubs.gfs.etd.reporting.otc.trsloader.service.file.GmiEodFileArchiverService.archiveFiles(GmiEodFileArchiverService.java:66)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:497)
at org.springframework.expression.spel.support.ReflectiveMethodExecutor.execute(ReflectiveMethodExecutor.java:113)
at org.springframework.expression.spel.ast.MethodReference.getValueInternal(MethodReference.java:102)
at org.springframework.expression.spel.ast.MethodReference.access$000(MethodReference.java:49)
at org.springframework.expression.spel.ast.MethodReference$MethodValueRef.getValue(MethodReference.java:347)
at org.springframework.expression.spel.ast.CompoundExpression.getValueInternal(CompoundExpression.java:88)
at org.springframework.expression.spel.ast.SpelNodeImpl.getTypedValue(SpelNodeImpl.java:131)
at org.springframework.expression.spel.standard.SpelExpression.getValue(SpelExpression.java:330)
at org.springframework.integration.util.AbstractExpressionEvaluator.evaluateExpression(AbstractExpressionEvaluator.java:166)
at org.springframework.integration.util.MessagingMethodInvokerHelper.processInternal(MessagingMethodInvokerHelper.java:317)
... 93 more

I think there is a buffer memory leak, because it runs perfectly for the first 4 test batches but after that consistently throws java.lang.OutOfMemoryError: Direct buffer memory, and I don't know how to fix it. I saw a potential solution here that uses the Cleaner method: http://www.java67.com/2014/01/how-to-fix-javalangoufofmemoryerror-direct-byte-buffer-java.html

But I don't know whether it applies in this case.

------------------------ EDIT ------------------------

I found another way to compress the files, using IOUtils and a buffered input stream, which solved the problem. Updated code:

public static void compressDirectory(String archiveDirectoryToCompress) throws IOException {
    Path archiveToCompress = Files.createFile(Paths.get(archiveDirectoryToCompress + ".tar.gz"));

    try (GzipCompressorOutputStream gzipCompressorOutputStream = new GzipCompressorOutputStream(
             Files.newOutputStream(archiveToCompress));
         TarArchiveOutputStream tarArchiveOutputStream = new TarArchiveOutputStream(gzipCompressorOutputStream)) {
        Path directory = Paths.get(archiveDirectoryToCompress);
        Files.walk(directory)
             .filter(path -> !Files.isDirectory(path))
             .forEach(path -> {
                 TarArchiveEntry tarEntry = new TarArchiveEntry(path.toFile(), path.getFileName().toString());
                 try (BufferedInputStream bufferedInputStream = new BufferedInputStream(new FileInputStream(path.toString()))) {
                     tarArchiveOutputStream.putArchiveEntry(tarEntry);
                     IOUtils.copy(bufferedInputStream, tarArchiveOutputStream);
                     tarArchiveOutputStream.closeArchiveEntry();
                 } catch (Exception e) {
                     LOGGER.error("There was an error while compressing the files", e);
                 }
             });
    }
}

Best Answer

When loading a file into memory, Java allocates a series of DirectByteBuffers from a separate, non-heap pool called the direct memory pool. Each of these buffers also has a Deallocator attached to it that is responsible for freeing the memory once the file is no longer needed. By default, those Deallocators run during garbage collection.

What I suspect is happening (and it is something I have actually seen before) is that your program is not making much use of the heap, so garbage collection is not running often enough to free those DirectByteBuffers. You could therefore try one of two things: increase the size of the direct memory pool with -XX:MaxDirectMemorySize, or periodically force garbage collection by calling System.gc().
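For illustration, the -XX:MaxDirectMemorySize flag is passed at JVM launch; the 512m value and the archiver.jar name below are placeholders, not recommendations:

```shell
# Raise the cap on the direct (off-heap) buffer pool to 512 MB.
# Size the value to your actual batch workload; archiver.jar is hypothetical.
java -XX:MaxDirectMemorySize=512m -jar archiver.jar
```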

Regarding "java.lang.OutOfMemoryError: Direct buffer memory when invoking Files.readAllBytes", we found a similar question on Stack Overflow: https://stackoverflow.com/questions/39983588/
