
java - JDBC batch update for a one-time data conversion

Reposted. Author: 行者123  Updated: 2023-11-30 10:59:08

I have a situation where I need to run a data conversion over nearly 500,000 records. The process is:

  1. From Java we will call the DB and get a set of records (Older data)
  2. These records will be passed to a webservice as input
  3. The webservice response will be updated into the DB (correct data)

I did some research on this and found that JDBC batching is a good option.

What I plan to do is:

  1. Add 10,000 records to a batch (preparedstmnt.addBatch();) in a for loop
  2. Then commit (con.commit();)
  3. Call preparedstmnt.clearBatch()
  4. Go back to step one until all records are processed

Experts, what do you think of this approach? If you have a better idea, please suggest it. Also, if you have any tips to keep in mind when doing batch updates from Java, please let me know.

Also, does executeBatch() call clearBatch() at the end? Or do we need to call clearBatch() explicitly at the end of each batch before starting the next one?
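On the clearBatch() question: per the JDBC specification, a statement's batch is reset to empty once executeBatch() returns, so an explicit clearBatch() between successful batches should be redundant (calling it after a failed executeBatch() is still a reasonable precaution). Below is a minimal sketch of the loop described above; the table name my_table, the UPDATE statement, and the CorrectedRow record holding the web-service response are all hypothetical, not from the original question:

```java
import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.SQLException;
import java.util.List;

public class BatchUpdater {

    static final int BATCH_SIZE = 10_000;

    // How many executeBatch()/commit() flushes a run of `total` records needs.
    static int flushCount(int total, int batchSize) {
        return (total + batchSize - 1) / batchSize;
    }

    // Hypothetical record holding the corrected values returned by the web service.
    public record CorrectedRow(long id, String value) {}

    public static void update(Connection con, List<CorrectedRow> rows) throws SQLException {
        con.setAutoCommit(false); // commit manually, once per batch
        String sql = "UPDATE my_table SET value = ? WHERE id = ?";
        try (PreparedStatement ps = con.prepareStatement(sql)) {
            int pending = 0;
            for (CorrectedRow row : rows) {
                ps.setString(1, row.value());
                ps.setLong(2, row.id());
                ps.addBatch();
                if (++pending == BATCH_SIZE) {
                    ps.executeBatch(); // the batch is reset to empty when this returns
                    con.commit();
                    pending = 0;
                }
            }
            if (pending > 0) { // flush the final partial batch
                ps.executeBatch();
                con.commit();
            }
        }
    }
}
```

With 500,000 records and a batch size of 10,000 this works out to 50 flushes, so the per-commit overhead stays small relative to the work done.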

Best Answer

Basically, if you use Spring Batch, it looks like this. As I said, one of the advantages is that it is simple: easy to implement and Spring-driven, which means it will fit your current architecture well if that is also Spring-driven. Here is approximately what your case would look like:

import javax.sql.DataSource;

import org.apache.commons.dbcp2.BasicDataSource;
import org.springframework.batch.core.Job;
import org.springframework.batch.core.Step;
import org.springframework.batch.core.configuration.annotation.EnableBatchProcessing;
import org.springframework.batch.core.configuration.annotation.JobBuilderFactory;
import org.springframework.batch.core.configuration.annotation.StepBuilderFactory;
import org.springframework.batch.item.ItemProcessor;
import org.springframework.batch.item.ItemReader;
import org.springframework.batch.item.ItemWriter;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;

@Configuration
@EnableBatchProcessing
public class DBBatchProcess {

    @Bean
    public DataSource dataSource() {
        BasicDataSource dataSource = new BasicDataSource();
        dataSource.setDriverClassName("com.mysql.jdbc.Driver");
        dataSource.setUrl("jdbc:mysql://localhost:3306/myDB");
        dataSource.setUsername("username");
        dataSource.setPassword("password");
        return dataSource;
    }

    @Bean
    public ItemReader<InputEntity> reader(DataSource dataSource) {
        // Use some database reader, like JdbcCursorItemReader, JdbcPagingItemReader etc.
        return new MyDatabaseReader();
    }

    @Bean
    public ItemProcessor<InputEntity, OutputEntity> processor() {
        // Do your conversion: convert values from the incoming entity to the outgoing entity.
        return new MyProcessor();
    }

    @Bean
    public ItemWriter<OutputEntity> writer() {
        // Receive the outgoing entity from the processor and write it to the database.
        // You can use JdbcBatchItemWriter, for instance.
        return new MyDatabaseWriter();
    }

    // Create a step. Provide reader, processor and writer. Determine the chunk size.
    @Bean
    public Step step(StepBuilderFactory stepBuilderFactory, ItemReader<InputEntity> reader,
                     ItemWriter<OutputEntity> writer, ItemProcessor<InputEntity, OutputEntity> processor) {

        return stepBuilderFactory.get("step")
                .<InputEntity, OutputEntity>chunk(10000)
                .reader(reader)
                .processor(processor)
                .writer(writer)
                .build();
    }

    // Finally, create a job
    @Bean
    public Job job(JobBuilderFactory jobBuilderFactory, Step step) {
        return jobBuilderFactory.get("job")
                .start(step)
                .build();
    }

}

On failure, you can decide what to do: you can attach listeners to the step or the job, and so on.
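As one sketch of that, a StepExecutionListener can react to a failed step. This is a hedged example, not part of the original answer: the class name ConversionStepListener is hypothetical, and it would be attached via .listener(...) on the step builder shown above.

```java
import org.springframework.batch.core.ExitStatus;
import org.springframework.batch.core.StepExecution;
import org.springframework.batch.core.StepExecutionListener;

// Attach via stepBuilderFactory.get("step")....listener(new ConversionStepListener())...
public class ConversionStepListener implements StepExecutionListener {

    @Override
    public void beforeStep(StepExecution stepExecution) {
        // e.g. record a start timestamp or validate preconditions
    }

    @Override
    public ExitStatus afterStep(StepExecution stepExecution) {
        if (stepExecution.getStatus().isUnsuccessful()) {
            // e.g. alert someone, log the failed chunk range, or mark records for retry
            return ExitStatus.FAILED;
        }
        System.out.println("Converted " + stepExecution.getWriteCount() + " records");
        return ExitStatus.COMPLETED;
    }
}
```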

Regarding java - JDBC batch update for a one-time data conversion, we found a similar question on Stack Overflow: https://stackoverflow.com/questions/32004608/
