
php - Is there a better way to process a 300,000-line text file and insert the data into MySQL?


What I am doing right now is reading the contents of the text file and storing them in a variable. After reading the whole content, I loop over chunks of that data and, inside the loop, call a function that reads each line of the chunk and passes every line to another function that processes each column and inserts the data into the database in batches. One batch is one whole chunk.

For every file larger than 500 KB, the code takes far too long to run. My problem is that the text file has no unique identifier I can use, so I could not apply LOAD DATA INFILE, which is what left me processing the file chunk by chunk like this.

A roughly 700 KB file took almost a whole day to process, although that also depends on the machine's specs. The code runs on CentOS. After the first text file was processed, the next one, at 800 KB or more, took almost a week. The other text files over 800 KB, and especially the 1 MB ones, take nearly a week or more to process.

Can anyone tell me what I am doing wrong, and what options I have to make this code run efficiently?


/*
====================================================================================
RECORDS FETCH
====================================================================================

Needs the path and the filename with extension.
The page iterates over the records in the file line by line.
After splitting by line, it splits each line again on the "," delimiter.
It concatenates the parts of the records for the bulk insert process.
PID addresses are incremental; every three PIDs correspond to one Chamber,
and the reading in each Chamber is CO2 for the first PID address, RH for the
second PID address, and TEMP for the third PID address.


====================================================================================
*/
$path = "";
$filename = "";
error_reporting(0);
include_once ("connect.php");
$p_results = mysql_query("SELECT PATH, FILENAME FROM tbl_path");
if(mysql_num_rows($p_results) > 0 ){
    while ( $rows = mysql_fetch_assoc($p_results) )
    {
        $path = $rows['PATH'];
        $filename = $rows['FILENAME'];
    }
}
else
{
    mysql_query("INSERT INTO tbl_log (LOG_DATE, DETAILS) VALUES ( NOW(), 'There is no path and filename to be accessed. Please provide.' )");
}
$path = str_replace('\\','/',$path);
//holds the path..NOTE: Change backslash (\) to forward slash (/)
//$path = "E:/";
//holds the filename.. NOTE: Include the file extension
//$filename = "sample2.txt"; //"chamber_1_con.txt";
if ($path <> "" && $filename <> "")
is_connected($path, $filename);

echo ('<script language="javascript">window.location = "chambers_monitoring.php" </script>');

//function that writes the data into the database table
function InsertData($rec, &$errorDataCnt, &$sql, $y, $i, $x, &$dCnt)
{

$dDate = (!isset($rec[0]) ? 0 : (trim($rec[0]) == "" ? 0 : trim($rec[0])));
$dTime = (!isset($rec[1]) ? 0 : (trim($rec[1]) == "" ? 0 : trim($rec[1])));
$address = (!isset($rec[2]) ? 0 : (trim($rec[2]) == "" ? 0 : trim($rec[2])));
$co2SV = (!isset($rec[3]) ? 0 : (trim($rec[3]) == "" ? 0 : trim($rec[3])));
$co2PV = (!isset($rec[4]) ? 0 : (trim($rec[4]) == "" ? 0 : trim($rec[4])));
$tempSV = (!isset($rec[5]) ? 0 : (trim($rec[5]) == "" ? 0 : trim($rec[5])));
$tempPV = (!isset($rec[6]) ? 0 : (trim($rec[6]) == "" ? 0 : trim($rec[6])));
$rhSV = (!isset($rec[7]) ? 0 : (trim($rec[7]) == "" ? 0 : trim($rec[7])));
$rhPV = (!isset($rec[8]) ? 0 : (trim($rec[8]) == "" ? 0 : trim($rec[8])));


/* include('connect.php'); */
set_time_limit(36000);
ini_set('max_execution_time','43200');
$e_results = mysql_query("SELECT ID FROM tbl_reading WHERE (READING_DATE = '".date("Y-m-d",strtotime($dDate))."' AND READING_TIME = '".date("H:i:s",strtotime($dTime))."') AND READING_ADDRESS = $address LIMIT 1");
if(mysql_num_rows($e_results) <= 0 ){
    if (!($dDate == 0 || $dTime == 0 || $address == 0) ) {
        if ($y == 0){
            $sql = "INSERT INTO tbl_reading (READING_DATE, READING_TIME, READING_ADDRESS, CO2_SET_VALUE, CO2_PROCESS_VALUE, TEMP_SET_VALUE, TEMP_PROCESS_VALUE, RH_SET_VALUE, RH_PROCESS_VALUE) VALUES ('".date("Y/m/d",strtotime($dDate))."','".date("H:i:s",strtotime($dTime))."', ". mysql_real_escape_string($address).",". mysql_real_escape_string($co2SV).",". mysql_real_escape_string($co2PV).",". mysql_real_escape_string($tempSV).",". mysql_real_escape_string($tempPV).",". mysql_real_escape_string($rhSV).",". mysql_real_escape_string($rhPV).")";
        }
        else {
            $sql .= ", ('".date("Y/m/d",strtotime($dDate))."','".date("H:i:s",strtotime($dTime))."', ". mysql_real_escape_string($address).",". mysql_real_escape_string($co2SV).",". mysql_real_escape_string($co2PV).",". mysql_real_escape_string($tempSV).",". mysql_real_escape_string($tempPV).",". mysql_real_escape_string($rhSV).",". mysql_real_escape_string($rhPV).")";
        }
    }
}

if(($x + 1) == $i){
    //echo ($x + 1)." = ".$i."<br>";
    if (substr($sql, 0, 1) == ",")
        $sql = "INSERT INTO tbl_reading (READING_DATE, READING_TIME, READING_ADDRESS, CO2_SET_VALUE, CO2_PROCESS_VALUE, TEMP_SET_VALUE, TEMP_PROCESS_VALUE, RH_SET_VALUE, RH_PROCESS_VALUE) VALUES".substr($sql, 1);
    //echo $sql."<br>";
    set_time_limit(36000);
    try {
        $result = mysql_query($sql) ;
        $dCnt = mysql_affected_rows();
        if( $dCnt == 0)
        {
            $errorDataCnt = $errorDataCnt + 1;
        }
    }
    catch (Exception $e)
    {
        mysql_query("INSERT INTO tbl_log (LOG_DATE, DETAILS) VALUES ( NOW(), '".$e->getMessage()."' )");
    }
    //mysql_free_result($result);
}

unset($dDate);
unset($dTime);
unset($address);
unset($co2SV);
unset($co2PV);
unset($tempSV);
unset($tempPV);
unset($rhSV);
unset($rhPV);

}

//function that loops over the records line by line
function loop($data)
{
$errorDataCnt = 0; $sql = ""; $exist = 0;
$i = count( $data); $x = 0; $y = 0; $tmpAdd = ""; $cnt = 0; $t = 0; $dCnt = 0;

ini_set('max_execution_time','43200');
while($x < $i)
{
    $rec = explode(",", $data[$x]);
    InsertData($rec, $errorDataCnt, $sql, $y, $i, $x, $dCnt);
    $x++;
    $y++;
    unset($rec);
}

$errFetch = ($i - $dCnt);
if($errorDataCnt > 0)
mysql_query("INSERT INTO tbl_log (LOG_DATE, DETAILS) VALUES ( NOW(), 'Error inserting $errFetch records. Check if there is a NULL or empty value or if it is the correct data type.' )");
if($dCnt > 0)
mysql_query("INSERT INTO tbl_log (LOG_DATE, DETAILS) VALUES ( NOW(), 'Saved $dCnt of $i records into the database. Total $exist records already existing in the database.' )");


}

// function that reads the records and collects them into the $contents variable
function DataLoop($file)
{
ini_set("auto_detect_line_endings", true);
set_time_limit(36000);
ini_set('max_execution_time','43200');
$contents = ''; $j = 0;
if ($handle = fopen($file,"rb")){
    while (!feof($handle)) {
        $rdata = fgets($handle, 3359232);//filesize($file));
        //$rdata = fread($handle, filesize($file));
        if(trim($rdata) != "" || $rdata === FALSE){
            if (feof($handle)) break;
            else {
                $contents .= $rdata;
                $j = $j + 1;
            }
        }
    }
    fclose($handle);
    $data = explode("\n", $contents);
    unset($contents);
    unset($rdata);
}
/* echo count($contents)." ".count($data); */
/* $query = "SELECT MAX(`ID`) AS `max` FROM `tbl_reading`";
$result = mysql_query($query) or die(mysql_error());
$row = mysql_fetch_assoc($result);
$max = $row['max']; */
/* $res = mysql_fetch_assoc(mysql_query("SELECT COUNT(*) as total FROM tbl_reading")) or die(mysql_error());
echo "<script>alert('".$res['total']."')</script>"; */
$p = 0;
ini_set('memory_limit','512M');
if($j != 0)
{
    foreach(array_chunk($data, ceil(count($data)/200)) as $rec_data){
        loop($rec_data);
        $p++;
    }
}

}
//function to test if filename exists
function IsExist($file)
{
if ($con = fopen($file, "r"))// file_exists($file))
{
fclose($con);
DataLoop($file);
}
else
mysql_query("INSERT INTO tbl_log (LOG_DATE, DETAILS) VALUES ( NOW(), '$filename is not existing in $path. Check if the filename or the path is correct.' )");

}

//function to test connection to where the file is.
function is_connected($path, $filename)
{
//check to see if the local machine is connected to the network
$errno = ""; $errstr = "";
if (substr(trim($path), -1) == '/')
$file = $path.$filename;
else
$file = $path."/".$filename;

IsExist($file);

}

Best Answer

From your code, it looks like your "unique identifier" (at least for this insert) is the composite (READING_DATE, READING_TIME, READING_ADDRESS).

If you define such a UNIQUE key in the database, then LOAD DATA with the IGNORE keyword should do exactly what you want:

ALTER TABLE tbl_reading
ADD UNIQUE KEY (READING_DATE, READING_TIME, READING_ADDRESS)
;

LOAD DATA INFILE '/path/to/csv'
IGNORE
INTO TABLE tbl_reading
FIELDS
TERMINATED BY ','
OPTIONALLY ENCLOSED BY '"'
ESCAPED BY ''
LINES
TERMINATED BY '\r\n'
(@rec_0, @rec_1, @rec_2, @rec_3, @rec_4, @rec_5, @rec_6, @rec_7, @rec_8)
SET
READING_DATE = DATE_FORMAT(STR_TO_DATE(TRIM(@rec_0), '???'), '%Y/%m/%d'),
READING_TIME = DATE_FORMAT(STR_TO_DATE(TRIM(@rec_1), '???'), '%H:%i:%s'),
READING_ADDRESS = TRIM(@rec_2),
CO2_SET_VALUE = TRIM(@rec_3),
CO2_PROCESS_VALUE = TRIM(@rec_4),
TEMP_SET_VALUE = TRIM(@rec_5),
TEMP_PROCESS_VALUE = TRIM(@rec_6),
RH_SET_VALUE = TRIM(@rec_7),
RH_PROCESS_VALUE = TRIM(@rec_8)
;

(where '???' is replaced with a format string describing how the date and time appear in the CSV; for example, if the file stored dates as 2013/01/22 and times as 13:45:00, the formats would be '%Y/%m/%d' and '%H:%i:%s').

Note that you really should store READING_DATE and READING_TIME together in a single DATETIME or TIMESTAMP column:

ALTER TABLE tbl_reading
ADD COLUMN READING_DATETIME DATETIME AFTER READING_TIME,
ADD UNIQUE KEY (READING_DATETIME, READING_ADDRESS)
;

UPDATE tbl_reading SET READING_DATETIME = STR_TO_DATE(
CONCAT(READING_DATE, ' ', READING_TIME),
'%Y/%m/%d %H:%i:%s'
);

ALTER TABLE tbl_reading
DROP COLUMN READING_DATE,
DROP COLUMN READING_TIME
;

In that case, the SET clause of the LOAD DATA command would instead include:

READING_DATETIME = STR_TO_DATE(CONCAT(TRIM(@rec_0), ' ', TRIM(@rec_1)), '???')
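
For reference, here is a minimal sketch of how the LOAD DATA approach above could be driven from PHP, using mysqli instead of the deprecated mysql_* API, and assuming the UNIQUE key from the answer is already in place. The connection credentials, the '/path/to/csv' location, and the '%Y/%m/%d' / '%H:%i:%s' format strings are placeholder assumptions for illustration only and must be adjusted to the real environment and the actual layout of the text file.

<?php
// Minimal sketch (assumption): bulk-load the chamber log file with
// LOAD DATA LOCAL INFILE through mysqli, relying on the UNIQUE key plus the
// IGNORE keyword to skip rows that already exist in tbl_reading.
// Credentials, file path and the date/time format strings are placeholders.
$mysqli = mysqli_init();
$mysqli->options(MYSQLI_OPT_LOCAL_INFILE, true); // the client must allow LOCAL INFILE
if (!$mysqli->real_connect('localhost', 'user', 'password', 'database')) {
    die('Connect failed: ' . mysqli_connect_error());
}

// The server must also have local_infile enabled for this statement to run.
// Date/time formats assume values like 2013/01/22 and 13:45:00 in the file.
$sql = <<<'SQL'
LOAD DATA LOCAL INFILE '/path/to/csv'
IGNORE
INTO TABLE tbl_reading
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"' ESCAPED BY ''
LINES TERMINATED BY '\r\n'
(@rec_0, @rec_1, @rec_2, @rec_3, @rec_4, @rec_5, @rec_6, @rec_7, @rec_8)
SET
    READING_DATE       = STR_TO_DATE(TRIM(@rec_0), '%Y/%m/%d'),
    READING_TIME       = STR_TO_DATE(TRIM(@rec_1), '%H:%i:%s'),
    READING_ADDRESS    = TRIM(@rec_2),
    CO2_SET_VALUE      = TRIM(@rec_3),
    CO2_PROCESS_VALUE  = TRIM(@rec_4),
    TEMP_SET_VALUE     = TRIM(@rec_5),
    TEMP_PROCESS_VALUE = TRIM(@rec_6),
    RH_SET_VALUE       = TRIM(@rec_7),
    RH_PROCESS_VALUE   = TRIM(@rec_8)
SQL;

if ($mysqli->query($sql)) {
    // $mysqli->info reports how many records were added and how many were skipped.
    echo 'Import finished: ' . $mysqli->info . PHP_EOL;
} else {
    echo 'Import failed: ' . $mysqli->error . PHP_EOL;
}

$mysqli->close();

With the duplicate handling pushed into the UNIQUE key and IGNORE, the per-row existence SELECT and the manual chunked batching in the original script are no longer needed.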

Regarding "php - Is there a better way to process a 300,000-line text file and insert the data into MySQL?", we found a similar question on Stack Overflow: https://stackoverflow.com/questions/14453922/
