Multiple Spring Batch jobs running simultaneously cause deadlocks in the Spring Batch metadata tables

We have several Spring Batch jobs, each of which runs in its own Java instance via CommandLineJobRunner. All jobs run simultaneously; they only read/write flat files and update the same Spring Batch metadata tables hosted on SQL Server. The Spring Batch metadata database is the only database involved.

When multiple jobs run at the same time, we get SQL deadlock exceptions; a detailed stack trace is below. From the database side, we can see that the deadlock victims were executing one of the following statements: insert into BATCH_JOB_SEQ default values, or delete from BATCH_JOB_SEQ where ID < some_number.

We use the default MapJobRegistry and the default database-backed job repository via JobRepositoryFactoryBean. For the data source used to talk to the Spring Batch database, we have tried both DriverManagerDataSource and a DBCP2 BasicDataSource, in both cases with the standard Microsoft SQL Server JDBC driver. I can post more specific configuration files, but in my testing the problem occurs whenever I use SQL Server with otherwise standard Spring configuration.

From my research, I believe the problem lies in how the default incrementer class, org.springframework.jdbc.support.incrementer.SqlServerMaxValueIncrementer, generates job instance and step execution IDs, combined with how the Spring Batch SQL Server tables are created. The code in SqlServerMaxValueIncrementer is synchronized, so if we ran all the jobs in a single Java instance this would not be a problem.
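To make the mechanism visible, here is a deliberately simplified, self-contained stand-in for what the incrementer does (this is an illustration, not Spring's actual code): it inserts a row into the sequence table, reads the generated identity value, and deletes the now-stale lower rows. An in-memory list stands in for the BATCH_JOB_SEQ table.

```java
import java.util.ArrayList;
import java.util.List;

/**
 * Simplified stand-in for SqlServerMaxValueIncrementer. Hypothetical and
 * reduced: the real class issues SQL against BATCH_JOB_SEQ; here an
 * in-memory list stands in for the table so the insert/delete pattern is
 * visible without a database.
 */
class SequenceTableIncrementer {
    // Stand-in for the BATCH_JOB_SEQ table: each element is one ID row.
    private final List<Long> seqTable = new ArrayList<>();
    private long identity = 0;

    // 'synchronized' serializes callers within ONE JVM only. Separate Java
    // processes each hold their own monitor, so their insert/delete pairs
    // can interleave on the shared table and deadlock in SQL Server.
    synchronized long nextLongValue() {
        identity++;                        // insert into BATCH_JOB_SEQ default values
        seqTable.add(identity);            // (IDENTITY column assigns the value)
        long max = identity;               // read the generated identity
        seqTable.removeIf(id -> id < max); // delete from BATCH_JOB_SEQ where ID < max
        return max;
    }
}
```

Within one JVM this hands out 1, 2, 3, … safely; across multiple JVMs the Java-side lock buys nothing, which matches the symptom above.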

If we host the Spring Batch metadata tables in a DB2 database instead, we have no problems. The SQL Server implementation uses actual tables, while the DB2 implementation uses sequence objects.

Has anyone run into this problem? Am I missing something? It seems like whenever we hit a problem like this, the fix turns out to be as simple as "set option xxx to yyy". If not, does anyone know why Spring Batch does not use sequence objects in its SQL Server implementation?

Stack trace:

 [org.springframework.batch.core.launch.support.CommandLineJobRunner] - <Job Terminated in error: Could not increment identity; nested exception is com.microsoft.sqlserver.jdbc.SQLServerException: Transaction (Process ID 74) was deadlocked on lock resources with another process and has been chosen as the deadlock victim. Rerun the transaction.>
 org.springframework.dao.DataAccessResourceFailureException: Could not increment identity; nested exception is com.microsoft.sqlserver.jdbc.SQLServerException: Transaction (Process ID 74) was deadlocked on lock resources with another process and has been chosen as the deadlock victim. Rerun the transaction.
     at org.springframework.jdbc.support.incrementer.SqlServerMaxValueIncrementer.getNextKey(SqlServerMaxValueIncrementer.java:124)
     at org.springframework.jdbc.support.incrementer.AbstractDataFieldMaxValueIncrementer.nextLongValue(AbstractDataFieldMaxValueIncrementer.java:128)
     at org.springframework.batch.core.repository.dao.JdbcJobInstanceDao.createJobInstance(JdbcJobInstanceDao.java:108)
     at org.springframework.batch.core.repository.support.SimpleJobRepository.createJobExecution(SimpleJobRepository.java:135)
     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
     at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)

Configuration:

 <beans xmlns="http://www.springframework.org/schema/beans"
        xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
        xmlns:batch="http://www.springframework.org/schema/batch"
        xmlns:context="http://www.springframework.org/schema/context"
        xmlns:jdbc="http://www.springframework.org/schema/jdbc"
        xsi:schemaLocation="http://www.springframework.org/schema/batch http://www.springframework.org/schema/batch/spring-batch.xsd
                            http://www.springframework.org/schema/beans http://www.springframework.org/schema/beans/spring-beans.xsd
                            http://www.springframework.org/schema/context http://www.springframework.org/schema/context/spring-context.xsd">

     <bean id="transactionManager" class="org.springframework.jdbc.datasource.DataSourceTransactionManager" lazy-init="true">
         <property name="dataSource" ref="batchPoolingDataSource" />
     </bean>

     <bean id="jobRegistry" class="org.springframework.batch.core.configuration.support.MapJobRegistry" />

     <bean id="jobRegistryBeanPostProcessor" class="org.springframework.batch.core.configuration.support.JobRegistryBeanPostProcessor">
         <property name="jobRegistry" ref="jobRegistry" />
     </bean>

     <bean id="jobRepository" class="org.springframework.batch.core.repository.support.JobRepositoryFactoryBean">
         <property name="databaseType" value="SQLSERVER" />
         <property name="dataSource" ref="batchPoolingDataSource" />
         <property name="transactionManager" ref="transactionManager" />
     </bean>

     <bean id="jobLauncher" class="org.springframework.batch.core.launch.support.SimpleJobLauncher">
         <property name="jobRepository" ref="jobRepository" />
     </bean>

     <bean id="jobExplorer" class="org.springframework.batch.core.explore.support.JobExplorerFactoryBean">
         <property name="dataSource" ref="batchPoolingDataSource" />
     </bean>

     <bean id="jobOperator" class="org.springframework.batch.core.launch.support.SimpleJobOperator">
         <property name="jobExplorer" ref="jobExplorer" />
         <property name="jobLauncher" ref="jobLauncher" />
         <property name="jobRegistry" ref="jobRegistry" />
         <property name="jobRepository" ref="jobRepository" />
     </bean>

     <bean class="org.springframework.batch.core.scope.StepScope">
         <property name="proxyTargetClass" value="true" />
     </bean>

     <bean id="batchPoolingDataSource" class="org.apache.commons.dbcp2.BasicDataSource" destroy-method="close">
         <property name="driverClassName" value="com.microsoft.sqlserver.jdbc.SQLServerDriver" />
         <property name="url" value="jdbc:sqlserver://server info" />
         <property name="username" value="${batch.jdbc.user}" />
         <property name="password" value="${batch.jdbc.password}" />
         <property name="initialSize" value="5" />
         <property name="maxTotal" value="15" />
         <property name="maxWaitMillis" value="5000" />
     </bean>

     <bean id="batchDataSource" class="org.springframework.jdbc.datasource.DriverManagerDataSource">
         <!-- driverClassName was set to the DriverManagerDataSource class itself in the
              original post; the SQL Server driver class is almost certainly what was meant -->
         <property name="driverClassName" value="com.microsoft.sqlserver.jdbc.SQLServerDriver" />
         <property name="url" value="jdbc:sqlserver://server info" />
         <property name="username" value="${batch.jdbc.user}" />
         <property name="password" value="${batch.jdbc.password}" />
     </bean>
 </beans>

1 answer

After further research, and after making some progress toward custom DAO versions for the JobRepository that work with SQL Server IDENTITY columns instead of sequences, I came across a way to solve this with nothing more than a bit of configuration.

The easy way to solve this problem is to set the databaseType and isolationLevelForCreate properties on the JobRepositoryFactoryBean. Here are the settings I use with SQL Server 2008:

 <bean id="jobRepository" class="org.springframework.batch.core.repository.support.JobRepositoryFactoryBean">
     <property name="dataSource" ref="dataSource" />
     <property name="transactionManager" ref="transactionManager" />
     <property name="databaseType" value="SQLSERVER" />
     <property name="isolationLevelForCreate" value="ISOLATION_REPEATABLE_READ" />
 </bean>
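For reference, a roughly equivalent Java-based configuration is sketched below. The surrounding @Configuration class and bean wiring are my assumptions (the answer only shows XML), but setDatabaseType and setIsolationLevelForCreate are real JobRepositoryFactoryBean setters; ISOLATION_REPEATABLE_READ replaces the factory's SERIALIZABLE default for createJobExecution.

```java
import javax.sql.DataSource;

import org.springframework.batch.core.repository.JobRepository;
import org.springframework.batch.core.repository.support.JobRepositoryFactoryBean;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.transaction.PlatformTransactionManager;

@Configuration
public class BatchRepositoryConfig {

    // Sketch only: the dataSource and transactionManager beans are assumed
    // to be defined elsewhere, as in the XML configuration.
    @Bean
    public JobRepository jobRepository(DataSource dataSource,
                                       PlatformTransactionManager transactionManager) throws Exception {
        JobRepositoryFactoryBean factory = new JobRepositoryFactoryBean();
        factory.setDataSource(dataSource);
        factory.setTransactionManager(transactionManager);
        factory.setDatabaseType("SQLSERVER");
        // Relax the default SERIALIZABLE isolation used when creating job executions
        factory.setIsolationLevelForCreate("ISOLATION_REPEATABLE_READ");
        factory.afterPropertiesSet();
        return factory.getObject();
    }
}
```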

I tested this with 30 jobs (the same job with different parameters) launched by a Quartz-scheduled group of tasks, and so far I have not seen any problems.

I also kept the retry code (see the comments on the question) around job launching, to catch any remaining deadlocks and rerun the job. It may be moot now, but I cannot risk failed job runs.
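The retry code itself is not shown in the thread, so the following is only a sketch of the idea: SQL Server reports deadlock victims with error number 1205 (visible via SQLException#getErrorCode), so retry on that code and rethrow everything else. In a real Spring application the SQLException arrives wrapped in a DataAccessResourceFailureException, so you would unwrap the cause first.

```java
import java.sql.SQLException;
import java.util.concurrent.Callable;

/**
 * Hypothetical retry wrapper for deadlock-prone job launches.
 * Retries only when SQL Server reports error 1205 (deadlock victim).
 */
class DeadlockRetry {
    static <T> T withRetry(Callable<T> job, int maxAttempts) throws Exception {
        SQLException last = null;
        for (int attempt = 1; attempt <= maxAttempts; attempt++) {
            try {
                return job.call();
            } catch (SQLException e) {
                if (e.getErrorCode() != 1205) {
                    throw e; // not a deadlock victim: do not retry
                }
                last = e;
                Thread.sleep(100L * attempt); // simple backoff before rerunning
            }
        }
        throw last; // all attempts were deadlocked
    }
}
```

Usage would wrap the jobLauncher.run(...) call in the Callable, with maxAttempts tuned to how contended the metadata tables are.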

I think mentioning these settings in the Spring Batch documentation, in the section on running multiple jobs at once, would be very helpful to others using SQL Server as the data source. Then again, I suspect not many people are stuck on SQL Server.
