
sql-server - How do I transfer 2 million rows from SQL Server without crashing Node?


I'm using Node to copy 2 million rows from SQL Server to another database, so naturally I'm using the "streaming" option, like this:

    const sql = require('mssql')
    ...
    const request = new sql.Request()
    request.stream = true
    request.query('select * from verylargetable')
    request.on('row', row => {
        promise = write_to_other_database(row);
    })

My problem is that I perform an asynchronous operation on each row (an insert into the other database), and that takes time.

Reading is faster than writing, so the "row" events keep firing, memory gradually fills up with pending promises, and eventually Node.js crashes. This is frustrating; the whole point of "streaming" is to avoid exactly this, isn't it?

How can I solve this?

Best Answer

To stream millions of rows without crashing, pause your request intermittently.

    sql.connect(config, err => {
        if (err) console.log(err);

        const request = new sql.Request();
        request.stream = true; // You can set streaming differently for each request
        request.query('select * from dbo.YourAmazingTable'); // or request.execute(procedure)

        let rowsToProcess = [];

        // Process the buffered rows, then resume the paused stream
        const processRows = () => {
            // process rows
            rowsToProcess = [];
            request.resume();
        };

        request.on('recordset', columns => {
            // Emitted once for each recordset in a query
            //console.log(columns);
        });

        request.on('row', row => {
            // Emitted for each row in a recordset
            rowsToProcess.push(row);
            if (rowsToProcess.length >= 3) {
                request.pause();
                processRows();
            }
        });

        request.on('error', err => {
            // May be emitted multiple times
            console.log(err);
        });

        request.on('done', result => {
            // Always emitted last; flush the final partial batch
            processRows();
            //console.log(result);
        });
    });
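
The processRows stub above is where the asker's per-row writes belong. Here is a minimal sketch of an async version, assuming write_to_other_database (the asker's hypothetical helper) returns a promise that resolves when the insert completes; the stream stays paused until the whole batch has been written, which is what actually bounds memory. The batch size of 3 from the answer is kept only for illustration; a few hundred rows per batch would be more realistic.

    // Sketch only: write_to_other_database is the asker's hypothetical
    // helper, assumed to return a promise per inserted row.
    const processRows = async () => {
        const batch = rowsToProcess;
        rowsToProcess = [];
        try {
            // Await every pending insert before requesting more rows, so
            // memory never holds more than one batch of unresolved promises.
            await Promise.all(batch.map(row => write_to_other_database(row)));
        } catch (err) {
            console.log(err);
        }
        request.resume(); // resume only after the batch has been flushed
    };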

Regarding "sql-server - How do I transfer 2 million rows from SQL Server without crashing Node?", we found a similar question on Stack Overflow: https://stackoverflow.com/questions/48645466/
