I am working through a Spring Batch partitioning example in which I create a "Partitioner" job with 10 threads; each thread reads records from the database for the 'id' range it is given. I followed http://www.mkyong.com/spring-batch/spring-batch-partitioning-example/
The first time I run the code it works fine, but on every subsequent run it fails with the error below. I want the output files to be recreated on each run. What extra configuration do I need for that?
org.springframework.batch.item.ItemStreamException: File already exists: [C:\Users\userpc\Documents\workspace-sts-3.6.4.RELEASE\SpringBatch-Partitioner-Example\csv\outputs\users.processed21-30.csv]
at org.springframework.batch.item.util.FileUtils.setUpOutputFile(FileUtils.java:61)
at org.springframework.batch.item.file.FlatFileItemWriter$OutputState.initializeBufferedWriter(FlatFileItemWriter.java:559)
at org.springframework.batch.item.file.FlatFileItemWriter$OutputState.access$000(FlatFileItemWriter.java:399)
at org.springframework.batch.item.file.FlatFileItemWriter.doOpen(FlatFileItemWriter.java:333)
at org.springframework.batch.item.file.FlatFileItemWriter.open(FlatFileItemWriter.java:323)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:497)
at org.springframework.aop.support.AopUtils.invokeJoinpointUsingReflection(AopUtils.java:317)
at org.springframework.aop.framework.ReflectiveMethodInvocation.invokeJoinpoint(ReflectiveMethodInvocation.java:190)
at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:157)
at org.springframework.aop.support.DelegatingIntroductionInterceptor.doProceed(DelegatingIntroductionInterceptor.java:133)
at org.springframework.aop.support.DelegatingIntroductionInterceptor.invoke(DelegatingIntroductionInterceptor.java:121)
at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:179)
at org.springframework.aop.framework.JdkDynamicAopProxy.invoke(JdkDynamicAopProxy.java:207)
at com.sun.proxy.$Proxy5.open(Unknown Source)
at org.springframework.batch.item.support.CompositeItemStream.open(CompositeItemStream.java:96)
at org.springframework.batch.core.step.tasklet.TaskletStep.open(TaskletStep.java:310)
at org.springframework.batch.core.step.AbstractStep.execute(AbstractStep.java:197)
at org.springframework.batch.core.partition.support.TaskExecutorPartitionHandler$1.call(TaskExecutorPartitionHandler.java:139)
at org.springframework.batch.core.partition.support.TaskExecutorPartitionHandler$1.call(TaskExecutorPartitionHandler.java:136)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at java.lang.Thread.run(Thread.java:745)
Apr 06, 2016 12:07:29 AM org.springframework.batch.core.step.AbstractStep execute
SEVERE: Encountered an error executing step slave in job partitionJob
User.java
public class User implements Serializable {

    private static final long serialVersionUID = 1L;

    private int id;
    private String username;
    private String password;
    private int age;

    // setters and getters
}
UserRowMapper.java
public class UserRowMapper implements RowMapper<User> {

    @Override
    public User mapRow(ResultSet rs, int rowNum) throws SQLException {
        User user = new User();
        user.setId(rs.getInt("id"));
        user.setUsername(rs.getString("username"));
        user.setPassword(rs.getString("password"));
        user.setAge(rs.getInt("age"));
        return user;
    }
}
RangePartitioner.java
public class RangePartitioner implements Partitioner {

    @Override
    public Map<String, ExecutionContext> partition(int gridSize) {
        Map<String, ExecutionContext> result = new HashMap<String, ExecutionContext>();
        int range = 10;
        int fromId = 1;
        int toId = range;
        for (int i = 1; i <= gridSize; i++) {
            ExecutionContext value = new ExecutionContext();
            System.out.println("\nStarting : Thread" + i);
            System.out.println("fromId : " + fromId);
            System.out.println("toId : " + toId);
            value.putInt("fromId", fromId);
            value.putInt("toId", toId);
            // give each thread a name
            value.putString("name", "Thread" + i);
            result.put("partition" + i, value);
            fromId = toId + 1;
            toId += range;
        }
        return result;
    }
}
UserProcessor.java
@Component("itemProcessor")
@Scope(value = "step")
public class UserProcessor implements ItemProcessor<User, User> {

    @Value("#{stepExecutionContext[name]}")
    private String threadName;

    @Override
    public User process(User item) throws Exception {
        System.out.println(threadName + " processing : " + item.getId() + " : " + item.getUsername());
        return item;
    }

    public String getThreadName() {
        return threadName;
    }

    public void setThreadName(String threadName) {
        this.threadName = threadName;
    }
}
database.xml
<beans xmlns="http://www.springframework.org/schema/beans"
xmlns:jdbc="http://www.springframework.org/schema/jdbc"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://www.springframework.org/schema/beans
http://www.springframework.org/schema/beans/spring-beans-3.2.xsd
http://www.springframework.org/schema/jdbc
http://www.springframework.org/schema/jdbc/spring-jdbc-3.2.xsd">
<!-- connect to database -->
<bean id="dataSource" class="org.springframework.jdbc.datasource.DriverManagerDataSource">
<property name="driverClassName" value="com.mysql.jdbc.Driver" />
<property name="url" value="jdbc:mysql://localhost:3306/toga" />
<property name="username" value="root" />
<property name="password" value="root" />
</bean>
<bean id="transactionManager"
class="org.springframework.batch.support.transaction.ResourcelessTransactionManager" />
<!-- create job-meta tables automatically
<jdbc:initialize-database data-source="dataSource">
<jdbc:script location="org/springframework/batch/core/schema-drop-mysql.sql" />
<jdbc:script location="org/springframework/batch/core/schema-mysql.sql" />
</jdbc:initialize-database>
-->
</beans>
context.xml
<beans xmlns="http://www.springframework.org/schema/beans"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="
http://www.springframework.org/schema/beans
http://www.springframework.org/schema/beans/spring-beans-3.2.xsd">
<!-- stored job-meta in memory -->
<bean id="jobRepository"
class="org.springframework.batch.core.repository.support.MapJobRepositoryFactoryBean">
<property name="transactionManager" ref="transactionManager" />
</bean>
<bean id="transactionManager"
class="org.springframework.batch.support.transaction.ResourcelessTransactionManager" />
<bean id="jobLauncher"
class="org.springframework.batch.core.launch.support.SimpleJobLauncher">
<property name="jobRepository" ref="jobRepository" />
</bean>
</beans>
job-partitioner.xml
<?xml version="1.0" encoding="UTF-8"?>
<beans xmlns="http://www.springframework.org/schema/beans"
xmlns:batch="http://www.springframework.org/schema/batch" xmlns:util="http://www.springframework.org/schema/util"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="
http://www.springframework.org/schema/batch http://www.springframework.org/schema/batch/spring-batch.xsd
http://www.springframework.org/schema/beans http://www.springframework.org/schema/beans/spring-beans.xsd
http://www.springframework.org/schema/util http://www.springframework.org/schema/util/spring-util.xsd">
<!-- spring batch core settings -->
<import resource="../config/context.xml" />
<!-- database settings -->
<import resource="../config/database.xml" />
<!-- ============= partitioner job ========== -->
<job id="partitionJob" xmlns="http://www.springframework.org/schema/batch">
<!-- master step, 10 threads (grid-size) -->
<step id="masterStep">
<partition step="slave" partitioner="rangePartitioner">
<handler grid-size="10" task-executor="taskExecutor" />
</partition>
</step>
</job>
<!-- ======= Jobs to run ===== -->
<step id="slave" xmlns="http://www.springframework.org/schema/batch">
<tasklet>
<chunk reader="pagingItemReader" writer="flatFileItemWriter"
processor="itemProcessor" commit-interval="1" />
</tasklet>
</step>
<bean id="rangePartitioner" class="com.mkyong.partition.RangePartitioner" />
<bean id="taskExecutor" class="org.springframework.core.task.SimpleAsyncTaskExecutor" />
<bean id="itemProcessor" class="com.mkyong.processor.UserProcessor" scope="step">
<property name="threadName" value="#{stepExecutionContext[name]}" />
</bean>
<!-- ========== Paging Item Reader -->
<bean id="pagingItemReader" class="org.springframework.batch.item.database.JdbcPagingItemReader" scope="step">
<property name="dataSource" ref="dataSource" />
<property name="queryProvider">
<bean class="org.springframework.batch.item.database.support.SqlPagingQueryProviderFactoryBean">
<property name="dataSource" ref="dataSource" />
<property name="selectClause" value="select id, username,password, age" />
<property name="fromClause" value="from user" />
<property name="whereClause" value="where id >= :fromId and id <= :toId" />
<property name="sortKey" value="id" />
</bean>
</property>
<!-- Inject via the ExecutionContext in rangePartitioner -->
<property name="parameterValues">
<map>
<entry key="fromId" value="#{stepExecutionContext[fromId]}" />
<entry key="toId" value="#{stepExecutionContext[toId]}" />
</map>
</property>
<property name="pageSize" value="10" />
<property name="rowMapper">
<bean class="com.mkyong.UserRowMapper" />
</property>
</bean>
<!-- ================= csv file writer ============== -->
<bean id="flatFileItemWriter" class="org.springframework.batch.item.file.FlatFileItemWriter" scope="step" >
<property name="resource"
value="file:csv/outputs/users.processed#{stepExecutionContext[fromId]}-#{stepExecutionContext[toId]}.csv" />
<property name="appendAllowed" value="false" />
<property name="lineAggregator">
<bean class="org.springframework.batch.item.file.transform.DelimitedLineAggregator">
<property name="delimiter" value="," />
<property name="fieldExtractor">
<bean class="org.springframework.batch.item.file.transform.BeanWrapperFieldExtractor">
<property name="names" value="id, username, password, age" />
</bean>
</property>
</bean>
</property>
</bean>
<!-- ========= Mongo Item Reader ========-->
<bean id="mongoItemReader" class="org.springframework.batch.item.data.MongoItemReader" scope="step">
<property name="template" ref="mongoTemplate" />
<property name="targetType" value="com.mkyong.User" />
<property name="query"
value="{
'id':{$gt:#{stepExecutionContext[fromId]}, $lte:#{stepExecutionContext[toId]}
} }" />
<property name="sort">
<util:map id="sort">
<entry key="id" value="" />
</util:map>
</property>
</bean>
</beans>
Please help me solve this problem.
App.java
public class App {

    public static void main(String[] args) {
        App obj = new App();
        obj.run();
    }

    private void run() {
        final String[] springConfig = { "spring/batch/jobs/job-partitioner.xml" };
        ApplicationContext context = new ClassPathXmlApplicationContext(springConfig);

        JobLauncher jobLauncher = (JobLauncher) context.getBean("jobLauncher");
        Job job = (Job) context.getBean("partitionJob");

        try {
            System.out.println("-----------------------------------------------");
            JobExecution execution = jobLauncher.run(job, new JobParameters());
            System.out.println("Exit Status : " + execution.getStatus());
            System.out.println("Exit Status : " + execution.getAllFailureExceptions());
            System.out.println("-----------------------------------------------");
        } catch (Exception e) {
            e.printStackTrace();
        }
        System.out.println("Done");
    }
}
Best Answer
Simply remove the appendAllowed configuration and it will work (append already defaults to false; see the source of FlatFileItemWriter):
<bean id="flatFileItemWriter" class="org.springframework.batch.item.file.FlatFileItemWriter" scope="step" >
<property name="resource"
value="file:csv/outputs/users.processed#{stepExecutionContext[fromId]}-#{stepExecutionContext[toId]}.csv" />
<!-- ******* remove this line ***** --><property name="appendAllowed" value="false" />
<property name="lineAggregator">
<bean class="org.springframework.batch.item.file.transform.DelimitedLineAggregator">
<property name="delimiter" value="," />
<property name="fieldExtractor">
<bean class="org.springframework.batch.item.file.transform.BeanWrapperFieldExtractor">
<property name="names" value="id, username, password, age" />
</bean>
</property>
</bean>
</property>
</bean>
As for why this solves the problem, the reason lies inside FlatFileItemWriter:
/**
* Flag to indicate that the target file should be appended if it already
* exists. If this flag is set then the flag
* {@link #setShouldDeleteIfExists(boolean) shouldDeleteIfExists} is
* automatically set to false, so that flag should not be set explicitly.
* Defaults value is false.
*
* @param append the flag value to set
*/
public void setAppendAllowed(boolean append) {
    this.append = append;
    this.shouldDeleteIfExists = false;
}
Regardless of the value passed for append, shouldDeleteIfExists is set to false, and that is what causes your problem.
This looks like a bug; an issue has been created for it as BATCH-2495.
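For reference, here is a minimal Java-config sketch of the same step-scoped writer with appendAllowed left out, so shouldDeleteIfExists keeps its default of true and each run overwrites the previous output file. This is only an illustrative equivalent of the corrected XML above, not code from the original project; the class name WriterConfig and the output path are assumptions.

```java
import org.springframework.batch.core.configuration.annotation.StepScope;
import org.springframework.batch.item.file.FlatFileItemWriter;
import org.springframework.batch.item.file.transform.BeanWrapperFieldExtractor;
import org.springframework.batch.item.file.transform.DelimitedLineAggregator;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.core.io.FileSystemResource;

@Configuration
public class WriterConfig {

    // Step-scoped so the fromId/toId placeholders resolve per partition.
    @Bean
    @StepScope
    public FlatFileItemWriter<User> flatFileItemWriter(
            @Value("#{stepExecutionContext[fromId]}") Integer fromId,
            @Value("#{stepExecutionContext[toId]}") Integer toId) {

        BeanWrapperFieldExtractor<User> fieldExtractor = new BeanWrapperFieldExtractor<User>();
        fieldExtractor.setNames(new String[] { "id", "username", "password", "age" });

        DelimitedLineAggregator<User> lineAggregator = new DelimitedLineAggregator<User>();
        lineAggregator.setDelimiter(",");
        lineAggregator.setFieldExtractor(fieldExtractor);

        FlatFileItemWriter<User> writer = new FlatFileItemWriter<User>();
        writer.setResource(new FileSystemResource(
                "csv/outputs/users.processed" + fromId + "-" + toId + ".csv"));
        writer.setLineAggregator(lineAggregator);
        // Do NOT call setAppendAllowed(false): that would also force
        // shouldDeleteIfExists to false and reproduce the "File already exists" error.
        // Leaving both flags at their defaults lets the writer delete and recreate the file.
        return writer;
    }
}
```

With the defaults in place, each partition's writer deletes any stale output file from a previous run before writing, which is exactly the "recreate on every run" behaviour asked for in the question.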
Regarding java - org.springframework.batch.item.ItemStreamException: File already exists: [SpringBatch-Partitioner-Example\csv\outputs\users.processed21-30.csv], a similar question can be found on Stack Overflow: https://stackoverflow.com/questions/36434643/