
php - How do I limit PHP memory usage when processing MySQL query results?


So I have a PHP page that lets users download a CSV for what can be a huge set of records. The problem is that the more results the MySQL query returns, the more memory the script uses. That's not surprising, but it does pose a problem.

I tried using mysql_unbuffered_query(), but that made no difference, so I need some other way to free the memory used by what I assume are the previously processed rows. Is there a standard way to do this?
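For reference, the unbuffered attempt was shaped roughly like the following (a minimal sketch, not the exact code). mysql_unbuffered_query() streams rows from the server rather than buffering the whole result set on the client side, with the caveat that no other query can run on the same connection until every row has been fetched:

    // Sketch: stream rows with an unbuffered query so PHP never holds the
    // full result set; only the current row lives in memory at a time.
    $results = mysql_unbuffered_query($query);

    while ($row = mysql_fetch_assoc($results)) {
        echo implode(",", $row), "\n"; // write each row out immediately
        unset($row);                   // release the row before the next fetch
    }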

Here's an annotated log that illustrates what I'm talking about:

// Method first called
2009-10-07 17:44:33 -04:00 --- info: used 3555064 bytes of memory

// Right before the query is executed
2009-10-07 17:44:33 -04:00 --- info: used 3556224 bytes of memory

// Immediately after query execution
2009-10-07 17:44:34 -04:00 --- info: used 3557336 bytes of memory

// Now we're processing the result set
2009-10-07 17:44:34 -04:00 --- info: Downloaded 1000 rows and used 3695664 bytes of memory
2009-10-07 17:44:35 -04:00 --- info: Downloaded 2000 rows and used 3870696 bytes of memory
2009-10-07 17:44:36 -04:00 --- info: Downloaded 3000 rows and used 4055784 bytes of memory
2009-10-07 17:44:37 -04:00 --- info: Downloaded 4000 rows and used 4251232 bytes of memory
2009-10-07 17:44:38 -04:00 --- info: Downloaded 5000 rows and used 4436544 bytes of memory
2009-10-07 17:44:39 -04:00 --- info: Downloaded 6000 rows and used 4621776 bytes of memory
2009-10-07 17:44:39 -04:00 --- info: Downloaded 7000 rows and used 4817192 bytes of memory
2009-10-07 17:44:40 -04:00 --- info: Downloaded 8000 rows and used 5012568 bytes of memory
2009-10-07 17:44:41 -04:00 --- info: Downloaded 9000 rows and used 5197872 bytes of memory
2009-10-07 17:44:42 -04:00 --- info: Downloaded 10000 rows and used 5393344 bytes of memory
2009-10-07 17:44:43 -04:00 --- info: Downloaded 11000 rows and used 5588736 bytes of memory
2009-10-07 17:44:43 -04:00 --- info: Downloaded 12000 rows and used 5753560 bytes of memory
2009-10-07 17:44:44 -04:00 --- info: Downloaded 13000 rows and used 5918304 bytes of memory
2009-10-07 17:44:45 -04:00 --- info: Downloaded 14000 rows and used 6103488 bytes of memory
2009-10-07 17:44:46 -04:00 --- info: Downloaded 15000 rows and used 6268256 bytes of memory
2009-10-07 17:44:46 -04:00 --- info: Downloaded 16000 rows and used 6443152 bytes of memory
2009-10-07 17:44:47 -04:00 --- info: used 6597552 bytes of memory

// This is after unsetting the variable. Didn't make a difference because garbage
// collection had not run
2009-10-07 17:44:47 -04:00 --- info: used 6598152 bytes of memory

I was hoping there was some standard technique for dealing with large result sets like this (or even much larger ones), but my research hasn't turned up anything.

Thoughts?

As requested, here's some code:

    $results = mysql_query($query);

    Kohana::log('info', "used " . memory_get_usage() . " bytes of memory");

    $first = TRUE;
    $row_count = 0;

    while ($row = mysql_fetch_assoc($results)) {
        $row_count++;
        $new_row = $row;

        // Drop the internal user_id column from the export
        if (array_key_exists('user_id', $new_row)) {
            unset($new_row['user_id']);
        }

        // On the first row, emit the CSV header built from the column names
        if ($first) {
            $columns = array_keys($new_row);
            $columns = array_map(array('columns', "title"), $columns);
            echo implode(",", array_map(array('Reports_Controller', "_quotify"), $columns));
            echo "\n";
            $first = FALSE;
        }

        // Log memory usage every 1000 rows
        if (($row_count % 1000) == 0) {
            Kohana::log('info', "Downloaded $row_count rows and used " . memory_get_usage() . " bytes of memory");
        }

        echo implode(",", array_map(array('Reports_Controller', "_quotify"), $new_row));
        echo "\n";
    }
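One thing a loop like this doesn't rule out is output buffering: if the framework or PHP itself captures the echoed CSV instead of sending it to the client, the buffer grows by one row per iteration, which would produce exactly the steady climb shown in the log. A minimal sketch of disabling buffering before the loop (standard PHP calls; a suggestion, not part of the original code):

    // Flush and close any active output buffers so echoed rows go straight
    // to the client instead of accumulating in memory.
    while (ob_get_level() > 0) {
        ob_end_flush();
    }

Calling flush() after each echoed row then pushes the data out through the web server as it is generated.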

Best answer

Some further profiling revealed that the problem was a memory leak somewhere. Once I stripped the code down to its simplest form, memory usage stopped growing with each iteration. I suspect Kohana (the framework I'm using).
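In its simplest form, the test loop looked something like this (a sketch of the idea, not the exact test code). With nothing but the fetch in the loop body, memory usage stays flat, which points at whatever else runs per iteration, such as the framework's logging and output handling:

    // Bare fetch loop: no logging, no framework calls, no CSV output.
    $results = mysql_query($query);
    while ($row = mysql_fetch_assoc($results)) {
        // Fetch and discard; the previous row is freed when $row is reassigned.
    }
    echo "used " . memory_get_usage() . " bytes of memory\n";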

Regarding php - How do I limit PHP memory usage when processing MySQL query results?, there is a similar question on Stack Overflow: https://stackoverflow.com/questions/1535527/
