
MySQL Node.js crashes when selecting data from a large table

Reposted. Author: 行者123. Updated: 2023-11-29 05:56:04

I am trying to migrate data from one database to another, but I run into problems when fetching data from a large table, saving it to an object, and inserting it into the other database. Here is my code:

let sql;
let index = 0;
let resultsToFetch = true;
while (resultsToFetch) {
  sql = `SELECT X FROM Y LIMIT ${index}, 1000`;
  DB1.query(sql, (err, result) => {
    if (err) {
      resultsToFetch = false;
      throw err;
    } else if (result.length == 0) {
      resultsToFetch = false;
    } else {
      result.forEach(res => {
        const obj = {
          id: res.id,
          name: res.name
        };
        sql = "INSERT INTO X SET ?";
        DB2.query(sql, obj, (err, result) => {
          if (err) throw err;
        });
      });
    }
  });
  index += 1000;
}

I am using LIMIT so that I don't select all 6 million rows at once, but I still get a JavaScript heap out of memory error. I think I am misunderstanding something about Node.js, but I'm not sure what. Here is the error:

<--- Last few GCs --->

[11256:000002A5D2CBB600] 22031 ms: Mark-sweep 1418.5 (1482.0) -> 1418.5 (1451.5) MB, 918.3 / 0.0 ms last resort GC in old space requested
[11256:000002A5D2CBB600] 22947 ms: Mark-sweep 1418.5 (1451.5) -> 1418.5 (1451.5) MB, 915.2 / 0.0 ms last resort GC in old space requested


<--- JS stacktrace --->

==== JS stack trace =========================================

Security context: 000000B356525529 <JSObject>
1: /* anonymous */ [\index.js:~1] [pc=00000042DA416732](this=000000C326B04AD1 <Object map = 0000027D35B023B9>,exports=000000C326B04AD1 <Object map = 0000027D35B023B9>,require=000000C326B04A89 <JSFunction require (sfi = 00000229888651E9)>,module=000000C326B04A39 <Module map = 0000027D35B44F69>,__filename=000002298886B769 <String[52]\

FATAL ERROR: CALL_AND_RETRY_LAST Allocation failed - JavaScript heap out of memory
1: node::DecodeWrite
2: node_module_register
3: v8::internal::FatalProcessOutOfMemory
4: v8::internal::FatalProcessOutOfMemory
5: v8::internal::Factory::NewUninitializedFixedArray
6: v8::internal::WasmDebugInfo::SetupForTesting
7: v8::internal::interpreter::BytecodeArrayRandomIterator::UpdateOffsetFromIndex
8: 00000042DA2843C1

Edit (trying @Grégory NEUT's suggestion):

let query = DB1.query("SELECT * FROM X");
let index = 0;
query
  .on("error", function(err) {
    // Handle error, an 'end' event will be emitted after this as well
  })
  .on("fields", function(fields) {
    // the field packets for the rows to follow
  })
  .on("result", function(row) {
    // Pausing the connection is useful if your processing involves I/O
    DB1.pause();
    const obj = {
      id: row.id,
    };
    console.log(obj);
    const sql = `INSERT INTO X SET ?`;
    DB2.query(sql, obj, (err, result) => {
      if (err) {
        throw err;
      }
      DB1.resume();
    });
    console.log(index);
    index++;
  })
  .on("end", function() {
    // all rows have been received
  });

Best Answer

I don't know how the MySQL driver for Node.js is implemented, but it may be loading everything first and only then applying the limit. Or perhaps 1000 entries at a time is still too many.


Either way, the solution is to use streams:

var query = connection.query('SELECT * FROM posts');

query
  .on('error', function(err) {
    // Handle error, an 'end' event will be emitted after this as well
  })
  .on('fields', function(fields) {
    // the field packets for the rows to follow
  })
  .on('result', function(row) {
    // Pausing the connection is useful if your processing involves I/O
    connection.pause();

    processRow(row, function() {
      connection.resume();
    });
  })
  .on('end', function() {
    // all rows have been received
  });

This way only the row currently being processed is held in memory at any time. With this approach you can be sure you won't hit an allocation failure, no matter how much data there is.

A similar question about this MySQL/Node.js crash when selecting from a large table can be found on Stack Overflow: https://stackoverflow.com/questions/49254124/
