
Java - executeBatch not working when inserting records into the database


After extracting words from a file and storing them in a HashSet, I am trying to insert records into my table in MySQL.

I tried to call executeBatch() once I have 500 records, but when execution finished I checked my table and no records had been inserted at all.

Note: when I use executeUpdate(), the records do show up in my table, but not with executeBatch(). I want to insert in batches rather than one by one. What am I doing wrong?

Code:

public void readDataBase(String path, String word) throws Exception {
    try {
        int i = 0;
        // This will load the MySQL driver, each DB has its own driver
        Class.forName("com.mysql.jdbc.Driver");
        // Setup the connection with the DB
        connect = DriverManager
                .getConnection("jdbc:mysql://126.32.3.20/fulltext_ltat?"
                        + "user=root&password=root");

        // Statements allow to issue SQL queries to the database
        // statement = connect.createStatement();
        System.out.print("Connected");

        preparedStatement = connect
                .prepareStatement("insert IGNORE into fulltext_ltat.indextable values (default,?, ?) ");

        preparedStatement.setString(1, path);
        preparedStatement.setString(2, word);
        preparedStatement.addBatch();
        i++;
        // preparedStatement.executeUpdate();

        if (i % 500 == 0) {
            preparedStatement.executeBatch();
        }

        preparedStatement.close();

        // writeResultSet(resultSet);
    } catch (Exception e) {
        throw e;
    } finally {
        close();
    }
}

Here is the loop where I call the method (words is simply an array containing the words to be inserted into the table):

for (int i = 1; i <= words.length - 1; i++) {
    connection.readDataBase(path, words[i].toString());
}

My main method:

public static void main(String[] args) throws Exception {

    StopWatch stopwatch = new StopWatch();
    stopwatch.start();

    File folder = new File("D:\\PDF1");
    File[] listOfFiles = folder.listFiles();

    for (File file : listOfFiles) {
        if (file.isFile()) {
            HashSet<String> uniqueWords = new HashSet<>();
            String path = "D:\\PDF1\\" + file.getName();
            try (PDDocument document = PDDocument.load(new File(path))) {

                if (!document.isEncrypted()) {
                    PDFTextStripper tStripper = new PDFTextStripper();
                    String pdfFileInText = tStripper.getText(document);
                    String lines[] = pdfFileInText.split("\\r?\\n");
                    for (String line : lines) {
                        String[] words = line.split(" ");
                        for (String word : words) {
                            uniqueWords.add(word);
                        }
                    }
                    // System.out.println(uniqueWords);
                }
            } catch (IOException e) {
                System.err.println("Exception while trying to read pdf document - " + e);
            }

            Object[] words = uniqueWords.toArray();

            MysqlAccessIndex connection = new MysqlAccessIndex();

            for (int i = 1; i <= words.length - 1; i++) {
                connection.readDataBase(path, words[i].toString());
            }

            System.out.println("Completed");
        }
    }
}

Best Answer

Your pattern for doing batch updates is off. You should open the connection and prepare the statement only once, then iterate many times, binding the parameters and adding each parameter set to the batch. As written, every call to readDataBase opens a new connection, adds a single row to the batch, and then closes the statement before i ever reaches a multiple of 500, so executeBatch() is never called and the pending batch is simply discarded.

// define a collection of paths and words somewhere
List<String> paths = new ArrayList<>();
List<String> words = new ArrayList<>();

try {
    // presumably you only want to insert so many records
    int LIMIT = 10000;
    Class.forName("com.mysql.jdbc.Driver");
    connect = DriverManager
            .getConnection("jdbc:mysql://126.32.3.20/fulltext_ltat?"
                    + "user=root&password=root");

    String sql = "INSERT IGNORE INTO fulltext_ltat.indextable VALUES (default, ?, ?)";
    preparedStatement = connect.prepareStatement(sql);

    for (int i = 0; i < LIMIT; ++i) {
        preparedStatement.setString(1, paths.get(i));
        preparedStatement.setString(2, words.get(i));
        preparedStatement.addBatch();
        if (i % 500 == 0) {
            preparedStatement.executeBatch();
        }
    }

    // execute remaining batches
    preparedStatement.executeBatch();
}
catch (SQLException e) {
    e.printStackTrace();
}
finally {
    try {
        preparedStatement.close();
        connect.close();
    }
    catch (SQLException e) {
        e.printStackTrace();
    }
}
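
For completeness, here is a minimal sketch of how the batch logic above could be packaged as a single method that the per-file loop from the question calls once. The method name insertWords and the Collection<String> parameter are assumptions for illustration, not part of the original answer:

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.SQLException;
import java.util.Collection;

public class MysqlAccessIndex {

    // Hypothetical method (assumption): inserts every word for one file using a
    // single connection, a single PreparedStatement, and batches of 500 rows,
    // flushing the final partial batch before the resources are closed.
    public void insertWords(String path, Collection<String> words) throws SQLException {
        String sql = "INSERT IGNORE INTO fulltext_ltat.indextable VALUES (default, ?, ?)";
        try (Connection connect = DriverManager.getConnection(
                "jdbc:mysql://126.32.3.20/fulltext_ltat?user=root&password=root");
             PreparedStatement ps = connect.prepareStatement(sql)) {

            int i = 0;
            for (String word : words) {
                ps.setString(1, path);
                ps.setString(2, word);
                ps.addBatch();
                if (++i % 500 == 0) {
                    ps.executeBatch();   // flush every 500 rows
                }
            }
            ps.executeBatch();           // flush the remaining partial batch
        }
    }
}

The caller from the question would then reduce to a single call per file, for example connection.insertWords(path, uniqueWords);, instead of looping over the words array and opening a new connection for every word.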

The key change I made here is adding logic for when the inserting should stop. As it stands, your code looks like it has an infinite loop, which means it would run forever; that is probably not what you intended.

For this question about executeBatch not working when inserting records into the database, a similar question was found on Stack Overflow: https://stackoverflow.com/questions/53001439/
