
node.js - fs.readFile behaves differently in a Docker container


The following code never fails when run in dev on OSX, but fails every time in production, which runs in a Docker container:

DataSyncController.js

let jsonFile = await FileSystemService.ParseCsvToJson(fileName);
if (!jsonFile.success)
    return res.json({ success: false });

let parsedJson = await FileSystemService.ParseJsonFile({ file: jsonFile.fileName });
if (!parsedJson.success)
    return res.json({ success: false });

FileSystemService.js
static async ParseJsonFile(params)
{
    return new Promise((resolve, reject) =>
    {
        try
        {
            // Assumed: the JSON file lives in the same upload directory used by ParseCsvToJson below
            const jsonFilePath = `${filePath}/${params.file}`;

            fs.readFile(jsonFilePath, 'utf-8', (err, data) =>
            {
                if (err)
                {
                    console.log('fs.readFile() error: ', err);
                    return resolve({ success: false });
                }

                var file = [];

                try
                {
                    // This fails every time in the Docker container
                    // Error is [Unexpected end of input]
                    // At this point I've seen `data` evaluate to '',
                    // <Buffer >, undefined and partial JSON data
                    file = JSON.parse(data);
                }
                catch(e)
                {
                    console.log('ERROR parsing JSON file: ', e);
                    return resolve({ success: false });
                }

                // Do Stuff
                return resolve({ success: true });
            });
        }
        catch(exception)
        {
            console.error(exception);
            resolve({ success: false });
        }
    });
}

It appears that, in the Docker container, the JSON file has not finished being written by the time it is read (for example, if I hold the whole JSON.parse() block inside fs.readFile() back with a 5-second timeout, it works), but I don't understand how that could be the case, or why it happens in the Docker container and not on my local machine. As always, any ideas are much appreciated.
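
To illustrate, here is a minimal sketch of the delay experiment described above; the placement of the 5-second setTimeout is an interpretation of that description rather than code from the project:

// Hypothetical experiment: delaying the read/parse by 5 seconds makes it
// succeed inside the container, which points to the JSON file still being
// written when ParseJsonFile first reads it.
setTimeout(() =>
{
    fs.readFile(jsonFilePath, 'utf-8', (err, data) =>
    {
        if (err)
            return resolve({ success: false });

        try
        {
            const parsed = JSON.parse(data);
            resolve({ success: true, parsed });
        }
        catch (e)
        {
            console.log('ERROR parsing JSON file: ', e);
            resolve({ success: false });
        }
    });
}, 5000);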

UPDATE

As requested, here is the implementation of ParseCsvToJson, which actually writes the JSON file to disk. Note that both in development and in production in the Docker container, the JSON file that gets written, while large (roughly 4,400 records), looks fine.
var Converter = require("csvtojson").Converter;
var fs = require('fs');
var filePath = `${config.root}/server/uploadDir`;

static async ParseCsvToJson(fileName)
{
    return new Promise((resolve, reject) =>
    {
        try
        {
            let fileNameWithoutExtension = fileName.replace('.csv', '');
            const jsonFilePath = `${filePath}/${fileNameWithoutExtension}.json`;
            const csvFilePath = `${filePath}/${fileName}`;

            // The parameter false will turn off final result construction.
            // It can avoid huge memory consumption while parsing.
            // The trade off is final result will not be populated to end_parsed event.
            var csvConverter = new Converter({ constructResult: false, toArrayString: true });
            var readStream = fs.createReadStream(csvFilePath);
            var writeStream = fs.createWriteStream(jsonFilePath);
            readStream.pipe(csvConverter).pipe(writeStream);

            resolve({ success: true, fileName: `${fileNameWithoutExtension}.json` });
        }
        catch(exception)
        {
            console.error(exception);
            resolve({ success: false });
        }
    });
}

Best Answer

Regarding the instantiation of the converter:

var csvConverter = new Converter({ constructResult: false, toArrayString: true });
the documentation says:

The parameter false will turn off final result construction. It can avoid huge memory consumption while parsing. The trade off is final result will not be populated to end_parsed event.


For reasons I don't entirely understand, on OSX this was sufficient:
var readStream = fs.createReadStream('path/to/csvfile.csv', { encoding: 'utf-8' });
var writeStream = fs.createWriteStream('path/to/jsonfile.json', { encoding: 'utf-8' });
readStream.pipe(csvConverter).pipe(writeStream);
resolve({ success: true, fileName: 'jsonfile.json' });
On Linux systems, however, the file was not always written completely, or sometimes not written at all. The following was therefore also needed:
csvConverter.on('end_parsed', () =>
{
    // The docs are unclear whether there is a possible error object
    // passed along here in the case of failure, but clearly mention
    // that the constructed result will be unavailable, so just resolve
    // the Promise
    resolve({ success: true, fileName: 'jsonfile.json' });
});
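
Put together, a sketch of what the question's ParseCsvToJson could look like with the end_parsed listener folded in; the class wrapper, the fs require, and the config.root-based upload path are assumptions carried over from the question's own snippets:

var Converter = require("csvtojson").Converter;
var fs = require('fs');
// Assumed to match the question's setup; config comes from the surrounding app
var filePath = `${config.root}/server/uploadDir`;

class FileSystemService
{
    static async ParseCsvToJson(fileName)
    {
        return new Promise((resolve, reject) =>
        {
            try
            {
                let fileNameWithoutExtension = fileName.replace('.csv', '');
                const jsonFilePath = `${filePath}/${fileNameWithoutExtension}.json`;
                const csvFilePath = `${filePath}/${fileName}`;

                var csvConverter = new Converter({ constructResult: false, toArrayString: true });
                var readStream = fs.createReadStream(csvFilePath, { encoding: 'utf-8' });
                var writeStream = fs.createWriteStream(jsonFilePath, { encoding: 'utf-8' });

                // Resolve only when the converter reports that parsing has finished,
                // rather than immediately after wiring up the pipes.
                csvConverter.on('end_parsed', () =>
                {
                    resolve({ success: true, fileName: `${fileNameWithoutExtension}.json` });
                });

                readStream.pipe(csvConverter).pipe(writeStream);
            }
            catch(exception)
            {
                console.error(exception);
                resolve({ success: false });
            }
        });
    }
}

The caller in DataSyncController.js does not need to change, since the resolved value keeps the same shape.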
This is probably an edge case, but hopefully it will help someone else in the future.

Regarding node.js - fs.readFile behaving differently in a Docker container, a similar question can be found on Stack Overflow: https://stackoverflow.com/questions/36313822/
