java - Files.walkFileTree in lexicographic order


I have a unit test that simulates reading from an S3 bucket using the local filesystem. To do this I use Files.walkFileTree to add certain records to a list.

This is the folder being walked; later on I extract data from the .gz files.

$ ls -l /var/folders/8g/f_n563nx5yv9mdpnznnxv8gj1xs_mm/T/s3FilesReaderTest1892987110875929052/prefix/2016-01-01/ | cut -d' ' -f8-

41 Dec 19 18:38 topic-00000-000000000000.gz
144 Dec 19 18:38 topic-00000-000000000000.index.json
48 Dec 19 18:38 topic-00001-000000000000.gz
144 Dec 19 18:38 topic-00001-000000000000.index.json

Here is the mocked method:

final AmazonS3 client = mock(AmazonS3Client.class);
when(client.listObjects(any(ListObjectsRequest.class))).thenAnswer(new Answer<ObjectListing>() {

    private String key(File file) {
        return file.getAbsolutePath().substring(dir.toAbsolutePath().toString().length() + 1);
    }

    @Override
    public ObjectListing answer(InvocationOnMock invocationOnMock) throws Throwable {
        final ListObjectsRequest req = (ListObjectsRequest) invocationOnMock.getArguments()[0];
        final String bucket = req.getBucketName();
        final String marker = req.getMarker();
        final String prefix = req.getPrefix();
        logger.debug("prefix = {}; marker = {}", prefix, marker);

        final List<File> files = new ArrayList<>();
        Path toWalk = dir;
        if (prefix != null) {
            toWalk = Paths.get(dir.toAbsolutePath().toString(), prefix).toAbsolutePath();
        }
        logger.debug("walking\t{}", toWalk);
        Files.walkFileTree(toWalk, new SimpleFileVisitor<Path>() {
            @Override
            public FileVisitResult preVisitDirectory(Path toCheck, BasicFileAttributes attrs) throws IOException {
                if (toCheck.startsWith(dir)) {
                    logger.debug("visiting\t{}", toCheck);
                    return FileVisitResult.CONTINUE;
                }
                logger.debug("skipping\t{}", toCheck);
                return FileVisitResult.SKIP_SUBTREE;
            }

            @Override
            public FileVisitResult visitFile(Path path, BasicFileAttributes attrs) throws IOException {
                File f = path.toFile();
                String key = key(f);
                if (marker == null || key.compareTo(marker) > 0) {
                    logger.debug("adding\t{}", f);
                    files.add(f);
                }
                return FileVisitResult.CONTINUE;
            }
        });

        ObjectListing listing = new ObjectListing();
        List<S3ObjectSummary> summaries = new ArrayList<>();
        Integer maxKeys = req.getMaxKeys();
        for (int i = 0; i < maxKeys && i < files.size(); i++) {
            String key = key(files.get(i));

            S3ObjectSummary summary = new S3ObjectSummary();
            summary.setKey(key);
            logger.debug("adding summary for {}", key);
            summaries.add(summary);

            listing.setNextMarker(key);
        }

        listing.setMaxKeys(maxKeys);
        listing.getObjectSummaries().addAll(summaries);
        listing.setTruncated(files.size() > maxKeys);

        return listing;
    }
});

And the log output:

2018-12-19 18:38:13.469 [main] DEBUG c.s.k.connect.s3.S3FilesReaderTest - prefix = prefix; marker = prefix/2016-01-01
2018-12-19 18:38:13.470 [main] DEBUG c.s.k.connect.s3.S3FilesReaderTest - walking /var/folders/8g/f_n563nx5yv9mdpnznnxv8gj1xs_mm/T/s3FilesReaderTest1892987110875929052/prefix
2018-12-19 18:38:13.475 [main] DEBUG c.s.k.connect.s3.S3FilesReaderTest - visiting /var/folders/8g/f_n563nx5yv9mdpnznnxv8gj1xs_mm/T/s3FilesReaderTest1892987110875929052/prefix
2018-12-19 18:38:13.476 [main] DEBUG c.s.k.connect.s3.S3FilesReaderTest - visiting /var/folders/8g/f_n563nx5yv9mdpnznnxv8gj1xs_mm/T/s3FilesReaderTest1892987110875929052/prefix/2016-01-01
2018-12-19 18:38:13.477 [main] DEBUG c.s.k.connect.s3.S3FilesReaderTest - adding /var/folders/8g/f_n563nx5yv9mdpnznnxv8gj1xs_mm/T/s3FilesReaderTest1892987110875929052/prefix/2016-01-01/topic-00000-000000000000.index.json
2018-12-19 18:38:13.477 [main] DEBUG c.s.k.connect.s3.S3FilesReaderTest - adding /var/folders/8g/f_n563nx5yv9mdpnznnxv8gj1xs_mm/T/s3FilesReaderTest1892987110875929052/prefix/2016-01-01/topic-00001-000000000000.index.json
2018-12-19 18:38:13.477 [main] DEBUG c.s.k.connect.s3.S3FilesReaderTest - adding /var/folders/8g/f_n563nx5yv9mdpnznnxv8gj1xs_mm/T/s3FilesReaderTest1892987110875929052/prefix/2016-01-01/topic-00001-000000000000.gz
2018-12-19 18:38:13.477 [main] DEBUG c.s.k.connect.s3.S3FilesReaderTest - adding /var/folders/8g/f_n563nx5yv9mdpnznnxv8gj1xs_mm/T/s3FilesReaderTest1892987110875929052/prefix/2016-01-01/topic-00000-000000000000.gz
2018-12-19 18:38:13.479 [main] DEBUG c.s.k.connect.s3.S3FilesReaderTest - adding summary for prefix/2016-01-01/topic-00000-000000000000.index.json
2018-12-19 18:38:13.479 [main] DEBUG c.s.k.connect.s3.S3FilesReaderTest - adding summary for prefix/2016-01-01/topic-00001-000000000000.index.json
2018-12-19 18:38:13.479 [main] DEBUG c.s.k.connect.s3.S3FilesReaderTest - adding summary for prefix/2016-01-01/topic-00001-000000000000.gz
2018-12-19 18:38:13.479 [main] DEBUG c.s.k.connect.s3.S3FilesReaderTest - adding summary for prefix/2016-01-01/topic-00000-000000000000.gz
2018-12-19 18:38:13.481 [main] DEBUG c.s.k.c.s3.source.S3FilesReader - aws ls bucket/prefix after:prefix/2016-01-01 = [prefix/2016-01-01/topic-00000-000000000000.index.json, prefix/2016-01-01/topic-00001-000000000000.index.json, prefix/2016-01-01/topic-00001-000000000000.gz, prefix/2016-01-01/topic-00000-000000000000.gz]
2018-12-19 18:38:13.481 [main] DEBUG c.s.k.c.s3.source.S3FilesReader - Skipping non-data chunk prefix/2016-01-01/topic-00000-000000000000.index.json
2018-12-19 18:38:13.481 [main] DEBUG c.s.k.c.s3.source.S3FilesReader - Skipping non-data chunk prefix/2016-01-01/topic-00001-000000000000.index.json
2018-12-19 18:38:13.484 [main] DEBUG c.s.k.c.s3.source.S3FilesReader - Adding chunk-key prefix/2016-01-01/topic-00001-000000000000.gz
2018-12-19 18:38:13.484 [main] DEBUG c.s.k.c.s3.source.S3FilesReader - Adding chunk-key prefix/2016-01-01/topic-00000-000000000000.gz
2018-12-19 18:38:13.485 [main] DEBUG c.s.k.c.s3.source.S3FilesReader - Next Chunks: [prefix/2016-01-01/topic-00001-000000000000.gz, prefix/2016-01-01/topic-00000-000000000000.gz]
2018-12-19 18:38:13.485 [main] DEBUG c.s.k.c.s3.source.S3FilesReader - Now reading from prefix/2016-01-01/topic-00001-000000000000.gz
2018-12-19 18:38:13.513 [main] DEBUG c.s.k.c.s3.source.S3FilesReader - Now reading from prefix/2016-01-01/topic-00000-000000000000.gz

The files are all read correctly (1 value for key0, 2 values for key1), but my unit test expects them to be read in ascending order: everything starting with prefix/2016-01-01/topic-00000 should be read (and, in particular, have its summary added) before anything starting with prefix/2016-01-01/topic-00001.

java.lang.AssertionError: 
Expected :[key0-0=value0-0, key1-0=value1-0, key1-1=value1-1]
Actual :[key1-0=value1-0, key1-1=value1-1, key0-0=value0-0]
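
For what it's worth, the visiting order appears to come straight from the underlying directory stream: Files.walkFileTree visits the entries of each directory in whatever order the filesystem yields them, and nothing guarantees that order is sorted, so it need not match the sorted output of ls. A minimal sketch (the directory argument is hypothetical) that prints the raw stream order next to the sorted order:

import java.io.IOException;
import java.nio.file.DirectoryStream;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;
import java.util.ArrayList;
import java.util.Collections;
import java.util.List;

public class DirectoryOrderDemo {
    public static void main(String[] args) throws IOException {
        // Hypothetical: pass the folder from the listing above as the first argument.
        Path dir = Paths.get(args.length > 0 ? args[0] : ".");

        // This is the same sibling order Files.walkFileTree sees.
        List<String> rawOrder = new ArrayList<>();
        try (DirectoryStream<Path> stream = Files.newDirectoryStream(dir)) {
            for (Path p : stream) {
                rawOrder.add(p.getFileName().toString());
            }
        }

        // The order ls (and a real S3 listObjects call) would show.
        List<String> sortedOrder = new ArrayList<>(rawOrder);
        Collections.sort(sortedOrder);

        System.out.println("raw:    " + rawOrder);
        System.out.println("sorted: " + sortedOrder);
    }
}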

Other than inserting into a sorted collection instead of a plain list, what options are there for meeting that expectation, so that the files are read in the same order a plain ls of a single folder gives?

Best answer

One option is to use a Stream:

try (Stream<Path> tree = Files.walk(toWalk)) {
    tree.filter(p -> !Files.isDirectory(p) && p.startsWith(dir))
        .sorted()
        .forEachOrdered(path -> {
            File f = path.toFile();
            String key = key(f);
            // etc.
        });
}
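
Another option, keeping Files.walkFileTree exactly as in the question, is to sort the collected files list by key once the walk is done; real S3 listings come back in ascending key order, so sorting on the key restores that contract in the test double. A minimal standalone sketch of the idea (here the "key" is simply the path string rather than the question's key(File) helper):

import java.io.IOException;
import java.nio.file.FileVisitResult;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;
import java.nio.file.SimpleFileVisitor;
import java.nio.file.attribute.BasicFileAttributes;
import java.util.ArrayList;
import java.util.Comparator;
import java.util.List;

public class SortedWalk {
    public static void main(String[] args) throws IOException {
        // Hypothetical: pass the directory to walk as the first argument.
        Path toWalk = Paths.get(args.length > 0 ? args[0] : ".");

        // Collect files exactly as the mock does, in filesystem order.
        List<Path> files = new ArrayList<>();
        Files.walkFileTree(toWalk, new SimpleFileVisitor<Path>() {
            @Override
            public FileVisitResult visitFile(Path path, BasicFileAttributes attrs) {
                files.add(path);
                return FileVisitResult.CONTINUE;
            }
        });

        // Sort once after the walk; in the mock this would be
        // files.sort(Comparator.comparing(this::key)) before the maxKeys loop.
        files.sort(Comparator.comparing(Path::toString));

        files.forEach(System.out::println);
    }
}

Either way the sort lives in the test double, which matches what the test is really asserting: that the reader copes with the ascending key order S3 actually returns.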

Regarding java - Files.walkFileTree in lexicographic order, a similar question can be found on Stack Overflow: https://stackoverflow.com/questions/53861136/
