
node.js - Writing files asynchronously in a loop: how to manage streams


I'm trying to write 100+ PNG files in a loop using node-canvas. Only 40 files get generated, and then the process finishes.

I create a PNG stream with createPNGStream() and pipe the result into a write stream created by fs.createWriteStream().

The write function:

function writeFile(row, col) {
  // canvas code ...
  const stream = canvas.createPNGStream();
  const path = __dirname + '/imgs/heatmapRow' + row + 'Col' + col + '.png';
  const out = fs.createWriteStream(path);
  stream.pipe(out);
  out.on('finish', () => console.log('The PNG file was created.'));
}

The calling function:

function generateImages() {
  var numRows = 20;
  var numCols = 5;
  for (let row = 0; row < numRows; ++row) {
    for (let col = 0; col < numCols; ++col) {
      writeFile(row, col);
    }
  }
}

The loop runs and completes, and at the end I get a burst of the following lines all at once:

The PNG file was created. The PNG file was created. The PNG file was created. The PNG file was created.

I think a write stream is created on every iteration and runs asynchronously. The process is terminating because I can only have so many streams open.

How can I write all the files asynchronously to keep processing time down? (I'd rather not write the files synchronously.) Do I need to add each writeFile call to a queue? How do I determine the limit on the number of open streams, and how do I manage them?

Best Answer

You have to use Promises for the asynchronous calls. Here's a solution:

function writeFile(row, col) {
  // canvas code ...
  const stream = canvas.createPNGStream();
  const path = __dirname + "/imgs/heatmapRow" + row + "Col" + col + ".png";

  return new Promise(resolve => {
    const out = fs.createWriteStream(path);
    stream.pipe(out);
    out.on("finish", () => {
      console.log("The PNG file was created.");
      resolve();
    });
  });
}

function generateImages() {
  var numRows = 20;
  var numCols = 5;
  var imagesToGenerate = [];
  for (let row = 0; row < numRows; ++row) {
    for (let col = 0; col < numCols; ++col) {
      imagesToGenerate.push(writeFile(row, col));
    }
  }

  Promise.all(imagesToGenerate).then(() => {
    console.log("All images generated");
  });
}

Take a look at the Promise.all docs if you're not clear on how it works.
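Note that Promise.all by itself still starts every write at once, so it doesn't cap the number of streams open simultaneously, which was part of the original question. One way to bound that is to run the writes in fixed-size batches. Below is a minimal sketch; the runBatched helper and the batch size are assumptions for illustration, not part of the answer above:

```javascript
// Run an async worker over a list of jobs, with at most `batchSize`
// jobs (and therefore streams) in flight at any one time.
async function runBatched(jobs, worker, batchSize) {
  const results = [];
  for (let i = 0; i < jobs.length; i += batchSize) {
    const batch = jobs.slice(i, i + batchSize);
    // Wait for the whole batch to finish before starting the next one,
    // so no more than `batchSize` write streams are ever open at once.
    results.push(...(await Promise.all(batch.map(worker))));
  }
  return results;
}
```

You could then build the list of row/column pairs up front and call something like `runBatched(jobs, ([row, col]) => writeFile(row, col), 10)`, tuning the batch size to your system's open-file limit. For finer-grained pooling (starting a new job as soon as any slot frees up, rather than per batch), a library such as p-limit implements a proper concurrency pool.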

Regarding "node.js - Writing files asynchronously in a loop: how to manage streams", we found a similar question on Stack Overflow: https://stackoverflow.com/questions/55398887/
