
c# - Fastest way to read the lines of a 100+ GB file in C#


I'm working on a project that will load 100+ GB text files, and one of the steps is counting the number of lines in a given file. I have to do it the following way to avoid an out-of-memory exception. Is there a faster way, or what is the most efficient way to multi-task it? (I know you could do something like running it on 4 threads and combining the 4 partial counts, but I don't know the most efficient way.)

uint loadCount2 = 0;
foreach (var line in File.ReadLines(currentPath))
{
    loadCount2++;
}

The plan is to run the program on a server with 4 dual-core CPUs and 40 GB of RAM once I have a permanent home for it; right now it runs on a makeshift small server with 4 cores and 8 GB of RAM. (I don't know how well threading behaves across multiple CPUs.)
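
For the "run it on 4 threads and combine the counts" idea, a minimal sketch (my addition, with made-up names such as CountNewlinesChunked and degreeOfParallelism) would split the file into byte ranges, count the '\n' bytes in each range on its own worker, and sum the partial counts. It assumes counting newline bytes is an acceptable definition of a line count:

using System;
using System.IO;
using System.Threading.Tasks;

static class LineCounter
{
    // Sketch: each worker opens its own read-only FileStream over one byte range,
    // so the workers never contend for a shared stream position. Splitting on raw
    // byte offsets is safe here because every '\n' byte is counted exactly once,
    // no matter which chunk it falls into.
    public static long CountNewlinesChunked(string path, int degreeOfParallelism = 4)
    {
        long fileLength = new FileInfo(path).Length;
        long chunkSize = fileLength / degreeOfParallelism + 1;
        var partialCounts = new long[degreeOfParallelism];

        Parallel.For(0, degreeOfParallelism, chunkIndex =>
        {
            long start = chunkIndex * chunkSize;
            long end = Math.Min(start + chunkSize, fileLength);
            var buffer = new byte[1024 * 1024];
            long count = 0;

            using (var fs = new FileStream(path, FileMode.Open, FileAccess.Read,
                                           FileShare.Read, buffer.Length))
            {
                fs.Position = start;
                long remaining = end - start;
                int bytesRead;
                while (remaining > 0 &&
                       (bytesRead = fs.Read(buffer, 0, (int)Math.Min(buffer.Length, remaining))) > 0)
                {
                    for (int i = 0; i < bytesRead; i++)
                    {
                        if (buffer[i] == (byte)'\n') count++;
                    }
                    remaining -= bytesRead;
                }
            }
            partialCounts[chunkIndex] = count;
        });

        long total = 0;
        foreach (var c in partialCounts) total += c;
        return total;
    }
}

Whether this helps depends on the storage: on a single spinning disk the work is usually I/O-bound and extra readers mostly add seek overhead, while on an SSD or a striped array the parallel readers can pay off.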


I have tested many of your suggestions:

Stopwatch sw2 = Stopwatch.StartNew();

// Test 1: buffered byte scan (CountLinesMaybe, defined below)
using (FileStream fs = File.Open(json, FileMode.Open))
    CountLinesMaybe(fs);

TimeSpan t = TimeSpan.FromMilliseconds(sw2.ElapsedMilliseconds);
string answer = string.Format("{0:D2}h:{1:D2}m:{2:D2}s:{3:D3}ms", t.Hours, t.Minutes, t.Seconds, t.Milliseconds);
Console.WriteLine(answer);
sw2.Restart();
loadCount2 = 0;

// Test 2: Parallel.ForEach over File.ReadLines
// (note: loadCount2 is incremented from multiple threads without synchronization)
Parallel.ForEach(File.ReadLines(json), (line) =>
{
    loadCount2++;
});

t = TimeSpan.FromMilliseconds(sw2.ElapsedMilliseconds);
answer = string.Format("{0:D2}h:{1:D2}m:{2:D2}s:{3:D3}ms", t.Hours, t.Minutes, t.Seconds, t.Milliseconds);
Console.WriteLine(answer);
sw2.Restart();
loadCount2 = 0;

// Test 3: sequential File.ReadLines (the original approach)
foreach (var line in File.ReadLines(json))
{
    loadCount2++;
}

t = TimeSpan.FromMilliseconds(sw2.ElapsedMilliseconds);
answer = string.Format("{0:D2}h:{1:D2}m:{2:D2}s:{3:D3}ms", t.Hours, t.Minutes, t.Seconds, t.Milliseconds);
Console.WriteLine(answer);
sw2.Restart();
loadCount2 = 0;

// Test 4: byte-by-byte scan for '\n' using Stream.ReadByte
int query = (int)Convert.ToByte('\n');
using (var stream = File.OpenRead(json))
{
    int current;
    do
    {
        current = stream.ReadByte();
        if (current == query)
        {
            loadCount2++;
            continue;
        }
    } while (current != -1);
}

t = TimeSpan.FromMilliseconds(sw2.ElapsedMilliseconds);
answer = string.Format("{0:D2}h:{1:D2}m:{2:D2}s:{3:D3}ms", t.Hours, t.Minutes, t.Seconds, t.Milliseconds);
Console.WriteLine(answer);
Console.ReadKey();

private const char CR = '\r';
private const char LF = '\n';
private const char NULL = (char)0;

public static long CountLinesMaybe(Stream stream)
{
    //Ensure.NotNull(stream, nameof(stream));

    var lineCount = 0L;

    var byteBuffer = new byte[1024 * 1024];
    const int BytesAtTheTime = 4;
    var detectedEOL = NULL;
    var currentChar = NULL;

    int bytesRead;
    while ((bytesRead = stream.Read(byteBuffer, 0, byteBuffer.Length)) > 0)
    {
        var i = 0;

        // Once the EOL character is known, scan 4 bytes per iteration.
        for (; i <= bytesRead - BytesAtTheTime; i += BytesAtTheTime)
        {
            currentChar = (char)byteBuffer[i];

            if (detectedEOL != NULL)
            {
                if (currentChar == detectedEOL) { lineCount++; }

                currentChar = (char)byteBuffer[i + 1];
                if (currentChar == detectedEOL) { lineCount++; }

                currentChar = (char)byteBuffer[i + 2];
                if (currentChar == detectedEOL) { lineCount++; }

                currentChar = (char)byteBuffer[i + 3];
                if (currentChar == detectedEOL) { lineCount++; }
            }
            else
            {
                // EOL not detected yet: advance one byte at a time until it is.
                if (currentChar == LF || currentChar == CR)
                {
                    detectedEOL = currentChar;
                    lineCount++;
                }
                i -= BytesAtTheTime - 1;
            }
        }

        // Handle the tail of the buffer that doesn't fill a full 4-byte step.
        for (; i < bytesRead; i++)
        {
            currentChar = (char)byteBuffer[i];

            if (detectedEOL != NULL)
            {
                if (currentChar == detectedEOL) { lineCount++; }
            }
            else
            {
                if (currentChar == LF || currentChar == CR)
                {
                    detectedEOL = currentChar;
                    lineCount++;
                }
            }
        }
    }

    // Count a final line that doesn't end with a line terminator.
    if (currentChar != LF && currentChar != CR && currentChar != NULL)
    {
        lineCount++;
    }
    return lineCount;
}

(Screenshot of the first-run timings omitted.)

The results show a big improvement, but I was hoping to get down to 20 minutes. I'd like to see the effect of having more CPUs on my more powerful server.

A second run returned: 23 minutes, 25 minutes, 22 minutes, 29 minutes,

meaning these methods don't really make any difference. (I couldn't take a screenshot because I removed the pauses and the program moves on by clearing the screen.)
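
One caveat on the Parallel.ForEach test above: it increments loadCount2 from multiple threads without any synchronization, so besides being slow it can also return a wrong count. A thread-safe variant (just a sketch) keeps a per-thread partial count and merges it once per thread; even then, File.ReadLines has to be enumerated sequentially, so the loop stays I/O-bound:

using System.IO;
using System.Threading;
using System.Threading.Tasks;

long lineCount = 0;

// localInit creates a per-thread count, the body adds to it, and localFinally
// merges each thread's total exactly once with an atomic add.
Parallel.ForEach(
    File.ReadLines(json),                                       // 'json' is the path variable used above
    () => 0L,                                                   // localInit
    (line, loopState, localCount) => localCount + 1,            // body
    localCount => Interlocked.Add(ref lineCount, localCount));  // localFinally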
