javascript - Node.js writing files frequently


I have a listener that watches for content changes; as soon as the content is modified, it fires the change handler:

const fs = require('fs');

$('#editor').on('onchange', () => changeHandler('...', '...'));

function changeHandler(filePath, content) {
  // a new write stream is created for every change event
  var ws = fs.createWriteStream(filePath, 'utf8');
  ws.write(content);
}

My problem is that 'onchange' fires too frequently, so the file is written far too often and data may be lost in the process. Can anyone give me some advice?

Update: I have now changed the code based on the answer below:

this.buffer = null; // used to cache the latest change

// called whenever the content changes, possibly very often
changeHandler() {
  if (this.editor.curOp && this.editor.curOp.command.name) {
    var id = $('.nav-items li.active .lk-hosts').attr('data-hosts-id');
    var content = this.editor.getValue();
    // cache the data instead of writing to the file immediately
    this.buffer = {id: id, content: content};
  }
}

setInterval(() => {
  // there is data waiting in the cache
  if (this.buffer !== null) {
    let id = this.buffer.id;
    let content = this.buffer.content;
    // reset the cache
    this.buffer = null;
    // write the file
    this.writeContent(id, content, (err) => {
    });
  }
}, 800);
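
For reference, here is a minimal sketch of what a writeContent helper along these lines could look like. This is purely hypothetical, since the asker's actual implementation is not shown; it assumes each id maps to a file under a hosts directory:

const fs = require('fs');
const path = require('path');

// Hypothetical helper (not shown in the question): map the hosts id to a
// file path and overwrite that file with the cached editor content.
function writeContent(id, content, callback) {
  const filePath = path.join(__dirname, 'hosts', id + '.txt'); // assumed layout
  fs.writeFile(filePath, content, 'utf8', callback);
}

Because the interval clears this.buffer before writing, at most one write is issued per 800 ms window, no matter how often changeHandler fires.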

Thanks everyone for your answers!

Best Answer

Why not simply build up a buffer that collects the text to be written, and only write to the file once a certain amount has accumulated:

const fs = require('fs');

$('#editor').on('onchange', () => changeHandler('...'));

var writeBuffer = ''; // could also be an array of chunks
var writeBufferSize = 0;
var filePath = 'path_to_file';
var ws = fs.createWriteStream(filePath, 'utf8');

function changeHandler(content) {
  // always buffer the incoming content first so nothing is dropped
  writeBuffer += content + '\n';
  writeBufferSize++;
  // flush to the file only once enough changes have accumulated
  if (writeBufferSize >= SOME_THRESHOLD) {
    ws.write(writeBuffer);
    writeBuffer = '';
    writeBufferSize = 0;
  }
}

If the write-buffer threshold you choose is very large, you may want to delegate the write to a worker so it runs in parallel. In that case you can create a second, temporary write buffer that fills up while the original buffer is being written, and then swap the two.
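
As a rough illustration of that double-buffer idea (a sketch only, not taken from the answer; it relies on the write stream's asynchronous write callback rather than an explicit worker thread): new changes keep accumulating in one buffer while the other is being flushed, and the two are swapped on each flush.

const fs = require('fs');

const SOME_THRESHOLD = 50; // flush after this many buffered changes (tune as needed)
const ws = fs.createWriteStream('path_to_file', 'utf8');

let activeBuffer = [];   // collects incoming changes
let pendingBuffer = [];  // holds the text currently being written out
let writing = false;

function changeHandler(content) {
  activeBuffer.push(content + '\n');
  if (activeBuffer.length >= SOME_THRESHOLD && !writing) {
    flush();
  }
}

function flush() {
  // swap roles: new changes go into the now-empty buffer
  // while the accumulated text is written asynchronously
  [activeBuffer, pendingBuffer] = [pendingBuffer, activeBuffer];
  writing = true;
  ws.write(pendingBuffer.join(''), () => {
    pendingBuffer = [];
    writing = false;
  });
}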

Regarding javascript - Node.js writing files frequently, a similar question can be found on Stack Overflow: https://stackoverflow.com/questions/37638523/
