Using FlatFileItemReader with TaskExecutor (thread safety)

There are many examples that use FlatFileItemReader with a TaskExecutor. I provide samples below (both XML and Java config):

I used it myself, with the XML configuration, for large CSVs (gigabytes in size), writing to the database with the out-of-the-box JpaItemWriter. There seemed to be no problems, even without setting save-state="false" or doing any special handling.
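For reference, here is a minimal Java-config sketch of the kind of setup described above: a chunk-oriented step that reads a CSV with FlatFileItemReader, writes with JpaItemWriter, and becomes a multi-threaded step once a TaskExecutor is assigned. The Person entity, file path, column names, chunk size and throttle limit are assumptions for illustration (Spring Batch 4 style), not taken from the original samples.

```java
import javax.persistence.EntityManagerFactory;

import org.springframework.batch.core.Step;
import org.springframework.batch.core.configuration.annotation.EnableBatchProcessing;
import org.springframework.batch.core.configuration.annotation.StepBuilderFactory;
import org.springframework.batch.item.database.JpaItemWriter;
import org.springframework.batch.item.file.FlatFileItemReader;
import org.springframework.batch.item.file.mapping.BeanWrapperFieldSetMapper;
import org.springframework.batch.item.file.mapping.DefaultLineMapper;
import org.springframework.batch.item.file.transform.DelimitedLineTokenizer;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.core.io.FileSystemResource;
import org.springframework.core.task.SimpleAsyncTaskExecutor;

@Configuration
@EnableBatchProcessing
public class CsvImportConfig {

    // Reads the large CSV line by line. Note: FlatFileItemReader is documented
    // as not thread-safe; this bean is exactly the questionable part.
    @Bean
    public FlatFileItemReader<Person> csvReader() {
        DelimitedLineTokenizer tokenizer = new DelimitedLineTokenizer();
        tokenizer.setNames(new String[] { "passportNumber", "firstName", "lastName" });

        BeanWrapperFieldSetMapper<Person> fieldSetMapper = new BeanWrapperFieldSetMapper<>();
        fieldSetMapper.setTargetType(Person.class);

        DefaultLineMapper<Person> lineMapper = new DefaultLineMapper<>();
        lineMapper.setLineTokenizer(tokenizer);
        lineMapper.setFieldSetMapper(fieldSetMapper);

        FlatFileItemReader<Person> reader = new FlatFileItemReader<>();
        reader.setResource(new FileSystemResource("/data/persons.csv")); // assumed path
        reader.setLineMapper(lineMapper);
        return reader;
    }

    // Out-of-the-box JpaItemWriter, as in the question.
    @Bean
    public JpaItemWriter<Person> jpaWriter(EntityManagerFactory entityManagerFactory) {
        JpaItemWriter<Person> writer = new JpaItemWriter<>();
        writer.setEntityManagerFactory(entityManagerFactory);
        return writer;
    }

    // Chunk-oriented step; assigning a TaskExecutor turns it into a multi-threaded
    // step, so the reader and writer are called concurrently from several threads.
    @Bean
    public Step importStep(StepBuilderFactory steps,
                           FlatFileItemReader<Person> csvReader,
                           JpaItemWriter<Person> jpaWriter) {
        return steps.get("importStep")
                .<Person, Person>chunk(1000)
                .reader(csvReader)
                .writer(jpaWriter)
                .taskExecutor(new SimpleAsyncTaskExecutor("import-"))
                .throttleLimit(4)
                .build();
    }
}
```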

At the same time, FlatFileItemReader is documented as not thread-safe.

My assumption was that JpaItemWriter "covered up" the problem by persisting sets, i.e. collections without duplicates, as long as hashCode() and equals() are based on the entity's business key. However, even that would not be enough to prevent duplicates if reading and processing are not thread-safe.
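The equals()/hashCode() assumption from the previous paragraph could look like this in code. This is a hypothetical Person entity (passportNumber is an invented business key, not something from the original post); equality ignores the generated surrogate id, so a Set<Person> or a unique constraint on the business key collapses duplicates.

```java
import java.util.Objects;

import javax.persistence.Column;
import javax.persistence.Entity;
import javax.persistence.GeneratedValue;
import javax.persistence.Id;

@Entity
public class Person {

    @Id
    @GeneratedValue
    private Long id;                 // surrogate key, ignored by equals()/hashCode()

    @Column(unique = true)
    private String passportNumber;   // business key

    private String firstName;
    private String lastName;

    // equals()/hashCode() are based on the business key only, so collections
    // without duplicates (sets) and unique constraints collapse repeated rows.
    @Override
    public boolean equals(Object o) {
        if (this == o) {
            return true;
        }
        if (!(o instanceof Person)) {
            return false;
        }
        return Objects.equals(passportNumber, ((Person) o).passportNumber);
    }

    @Override
    public int hashCode() {
        return Objects.hash(passportNumber);
    }

    // getters and setters omitted for brevity
}
```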

Could you clarify: is it correct/safe to use the stock FlatFileItemReader within a Tasklet to which a TaskExecutor is assigned, regardless of the writer? If not, how can the absence of errors when using JpaItemWriter be explained theoretically?

P.S.: The examples I linked above use FlatFileItemReader with a TaskExecutor without mentioning any of the possible thread-safety problems ...


TL;DR: FlatFileItemReader should not be driven by a TaskExecutor, regardless of the Writer (you risk lost lines, duplicated items, a broken restart position, and so on).

Update: judging by the JIRA issue, as long as saveState is set to false (i.e. restartability is given up), FlatFileItemReader can be used with a TaskExecutor.
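A minimal sketch of the saveState setting mentioned in the update; the class and method names here are illustrative only, and the XML namespace equivalent is the save-state="false" attribute referred to in the question.

```java
import org.springframework.batch.item.file.FlatFileItemReader;

public class ReaderSaveState {

    // Turns off state saving: the read position stored in the ExecutionContext
    // is not meaningful once several threads advance the same reader, so
    // restartability is deliberately given up.
    public static <T> FlatFileItemReader<T> withoutSaveState(FlatFileItemReader<T> reader) {
        reader.setSaveState(false);
        return reader;
    }
}
```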


First, here is what the Spring Batch documentation says about its stock readers and writers in combination with a TaskExecutor.

Spring Batch provides some implementations of ItemWriter and ItemReader. Usually, the Javadocs say whether they are thread-safe or not, or what you have to do to avoid problems in a concurrent environment. If there is no information in the Javadocs, you can check the implementation to see whether it keeps any state.

Now to the question itself:

Question: is it correct/safe to use the stock FlatFileItemReader within a Tasklet to which a TaskExecutor is assigned, regardless of the writer? If not, how can the absence of errors with JpaItemWriter be explained?

No, it is not "correct/safe", and the Writer has nothing to do with it. JpaItemWriter is documented as thread-safe, while the Java code of FlatFileItemReader is not: it keeps mutable state (the current line count and the underlying buffered reader). So JpaItemWriter cannot "cover" the reader's problem; at best it hides it, and the absence of errors you observed is luck rather than a guarantee. (See the Spring Batch docs quoted above.)

P.S.: Yes, the examples you linked do use FlatFileItemReader with a TaskExecutor without mentioning the possible thread-safety problems.

Look, for instance, at CoherenceBatchWriter.java from example 6: it accumulates each chunk into a Map (mapBatch) before pushing it to the cache, and, judging by the Coherence API, the NamedCache it writes to is thread-safe.

In other words, when such examples appear to work, it is because the Writer happens to tolerate concurrent chunks, not because the reader is safe to share between threads. The same applies to your JpaItemWriter setup.



Source: https://habr.com/ru/post/1669913/

