
java - Spring Batch: upload a CSV file and insert it into the database accordingly

Reposted. Author: 搜寻专家. Updated: 2023-11-01 02:57:57

My project has a requirement where a user uploads a CSV file, and that file must be pushed into a MySQL database. I know Spring Batch can be used to process large numbers of records, but I can't find any tutorial or sample code that covers this requirement. Every tutorial I've come across simply hard-codes the CSV file name, like this one:

https://spring.io/guides/gs/batch-processing/

I need to take the file the user uploads and process it accordingly. Any help would be appreciated.

If Spring Batch is not used, is there another way to insert the uploaded CSV data into MySQL?

Best Answer

Take this as your main reference: http://walkingtechie.blogspot.co.uk/2017/03/spring-batch-csv-file-to-mysql.html. It explains how to import a CSV file into a MySQL database using Spring Batch.

However, as you said, all the examples assume a hard-coded file name, which is not what you want.

In the code below, the important part (and what differs from the example in the link above) is the controller, which takes the multipart file and saves it to a temporary folder. The file name is then passed to the Job as a parameter:

JobExecution jobExecution = jobLauncher.run(importUserJob, new JobParametersBuilder()
        .addString("fullPathFileName", fileToImport.getAbsolutePath())
        .toJobParameters());

Finally, the importReader uses the fullPathFileName parameter to load the file the user uploaded:

@Bean
@StepScope // required so #{jobParameters[...]} is resolved when the step runs
public FlatFileItemReader<Person> importReader(@Value("#{jobParameters[fullPathFileName]}") String pathToFile) {
    FlatFileItemReader<Person> reader = new FlatFileItemReader<>();
    reader.setResource(new FileSystemResource(pathToFile));

Here is the full code (untested, but it contains most of the components) to give you an idea:

@Configuration
@EnableBatchProcessing
public class BatchConfig {

    @Bean
    public ResourcelessTransactionManager batchTransactionManager() {
        return new ResourcelessTransactionManager();
    }

    @Bean
    protected JobRepository jobRepository(ResourcelessTransactionManager batchTransactionManager) throws Exception {
        MapJobRepositoryFactoryBean jobRepository = new MapJobRepositoryFactoryBean();
        jobRepository.setTransactionManager(batchTransactionManager);
        jobRepository.afterPropertiesSet(); // initialize the factory before getObject()
        return jobRepository.getObject();
    }

    @Bean
    public JobLauncher jobLauncher(JobRepository jobRepository) throws Exception {
        SimpleJobLauncher jobLauncher = new SimpleJobLauncher();
        jobLauncher.setJobRepository(jobRepository);
        return jobLauncher;
    }

}

@Configuration
public class ImportJobConfig {

    @Autowired private JobBuilderFactory jobBuilderFactory;
    @Autowired private StepBuilderFactory stepBuilderFactory;
    @Autowired private DataSource dataSource;

    @Bean
    @StepScope // required for late binding of #{jobParameters[...]}
    public FlatFileItemReader<Person> importReader(@Value("#{jobParameters[fullPathFileName]}") String pathToFile) {
        FlatFileItemReader<Person> reader = new FlatFileItemReader<>();
        reader.setResource(new FileSystemResource(pathToFile));
        reader.setLineMapper(new DefaultLineMapper<Person>() {{
            setLineTokenizer(new DelimitedLineTokenizer() {{
                setNames(new String[]{"firstName", "lastName"});
            }});
            setFieldSetMapper(new BeanWrapperFieldSetMapper<Person>() {{
                setTargetType(Person.class);
            }});
        }});
        return reader;
    }

    @Bean
    public PersonItemProcessor processor() {
        return new PersonItemProcessor();
    }

    @Bean
    public JdbcBatchItemWriter<Person> writer() {
        JdbcBatchItemWriter<Person> writer = new JdbcBatchItemWriter<>();
        writer.setItemSqlParameterSourceProvider(
                new BeanPropertyItemSqlParameterSourceProvider<Person>());
        writer.setSql("INSERT INTO people (first_name, last_name) VALUES (:firstName, :lastName)");
        writer.setDataSource(dataSource);
        return writer;
    }

    @Bean
    public Job importUserJob(JobCompletionNotificationListener listener, Step step1) {
        return jobBuilderFactory.get("importUserJob")
                .incrementer(new RunIdIncrementer())
                .listener(listener)
                .flow(step1)
                .end()
                .build();
    }

    @Bean
    public Step step1(@Qualifier("importReader") ItemReader<Person> importReader) {
        return stepBuilderFactory.get("step1")
                .<Person, Person>chunk(10)
                .reader(importReader)
                .processor(processor())
                .writer(writer())
                .build();
    }

}
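The configuration above references a Person model and a PersonItemProcessor that are not shown in the answer. A minimal sketch of what they might look like: the field names match the firstName/lastName columns the reader maps, the upper-casing transform is just an illustrative example, and in the real project PersonItemProcessor would declare `implements ItemProcessor<Person, Person>` (from spring-batch), whose process() signature the method below matches.

```java
// Plain POJO with the getters/setters that BeanWrapperFieldSetMapper
// and BeanPropertyItemSqlParameterSourceProvider rely on.
class Person {
    private String firstName;
    private String lastName;

    public Person() {}

    public Person(String firstName, String lastName) {
        this.firstName = firstName;
        this.lastName = lastName;
    }

    public String getFirstName() { return firstName; }
    public void setFirstName(String firstName) { this.firstName = firstName; }
    public String getLastName() { return lastName; }
    public void setLastName(String lastName) { this.lastName = lastName; }
}

// In the real project: class PersonItemProcessor implements ItemProcessor<Person, Person>
class PersonItemProcessor {
    public Person process(Person person) {
        // Example transformation: normalize names to upper case before the write
        return new Person(person.getFirstName().toUpperCase(),
                person.getLastName().toUpperCase());
    }
}
```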

@RestController
public class MyImportController {

    @Autowired private JobLauncher jobLauncher;
    @Autowired private Job importUserJob;

    @RequestMapping(value = "/import/file", method = RequestMethod.POST)
    public String create(@RequestParam("file") MultipartFile multipartFile) throws Exception {

        // Save the multipart file in a temporary physical folder.
        // It's assumed you have a folder called tmpuploads in the resources folder.
        String path = new ClassPathResource("tmpuploads/").getURL().getPath();
        File fileToImport = new File(path + multipartFile.getOriginalFilename());
        // IOUtils comes from Apache Commons IO
        try (OutputStream outputStream = new FileOutputStream(fileToImport)) {
            IOUtils.copy(multipartFile.getInputStream(), outputStream);
        }

        // Launch the batch job, passing the saved file's full path as a job parameter.
        // Note: jobLauncher.run() throws several checked exceptions, hence throws Exception above.
        JobExecution jobExecution = jobLauncher.run(importUserJob, new JobParametersBuilder()
                .addString("fullPathFileName", fileToImport.getAbsolutePath())
                .toJobParameters());

        return "OK";
    }

}
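As for the second part of the question (doing this without Spring Batch): for small files you can skip Batch entirely and insert the rows yourself with plain JDBC batching. A minimal sketch, assuming the same two-column people table; the class and method names here are illustrative, not from the original answer, and the parser handles only simple unquoted CSV:

```java
import java.io.BufferedReader;
import java.io.Reader;
import java.sql.Connection;
import java.sql.PreparedStatement;
import java.util.ArrayList;
import java.util.List;

class CsvToMysql {

    // Parse simple comma-separated lines (no quoting/escaping) into rows
    static List<String[]> parse(Reader source) throws Exception {
        List<String[]> rows = new ArrayList<>();
        BufferedReader reader = new BufferedReader(source);
        String line;
        while ((line = reader.readLine()) != null) {
            if (!line.trim().isEmpty()) {
                rows.add(line.split(",", -1));
            }
        }
        return rows;
    }

    // Insert all rows in one JDBC batch; the connection would come from
    // your configured DataSource in a real application
    static void insert(Connection connection, List<String[]> rows) throws Exception {
        String sql = "INSERT INTO people (first_name, last_name) VALUES (?, ?)";
        try (PreparedStatement statement = connection.prepareStatement(sql)) {
            for (String[] row : rows) {
                statement.setString(1, row[0]);
                statement.setString(2, row[1]);
                statement.addBatch();
            }
            statement.executeBatch();
        }
    }
}
```

In the controller you would call parse(new InputStreamReader(multipartFile.getInputStream())) and then insert(...) with a connection from your DataSource. Note that this loads the whole file into memory at once, which is exactly what Spring Batch's chunked processing avoids for large files.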

Regarding "java - Spring Batch: upload a CSV file and insert it into the database accordingly", a similar question was found on Stack Overflow: https://stackoverflow.com/questions/47442909/
