Take this basic build pipeline (with Gradle tasks; a rough sketch of the build script follows the list):
- Compile / execute unit tests (gradle clean build)
- Integration tests (gradle integrationTest)
- Acceptance tests (gradle acceptTest)
- Deployment (gradle myCustomDeployTask)
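For context, here is a minimal sketch of how those custom tasks might be wired up in `build.gradle`. The task names match the pipeline above, but the test-selection patterns and the deploy step are illustrative assumptions, and the property names follow older Gradle versions:

```groovy
// build.gradle -- illustrative sketch only; test-selection patterns and the
// deploy step are assumptions, and property names use older Gradle conventions.
apply plugin: 'war'

task integrationTest(type: Test) {
    // Assumption: integration tests live in the normal test source set and are
    // picked out by a naming convention.
    testClassesDir = sourceSets.test.output.classesDir
    classpath = sourceSets.test.runtimeClasspath
    include '**/*IntegrationTest*'
    mustRunAfter test
}

task acceptTest(type: Test) {
    testClassesDir = sourceSets.test.output.classesDir
    classpath = sourceSets.test.runtimeClasspath
    include '**/*AcceptanceTest*'
    mustRunAfter integrationTest
}

task myCustomDeployTask(dependsOn: war) {
    doLast {
        // Placeholder deploy step: ship the WAR produced by the 'war' task.
        println "Deploying ${war.archivePath}"
    }
}
```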
According to Jez Humble's Continuous Delivery book, you should build your binaries only once. So, in the theoretical pipeline above, step 1 cleans, compiles, and creates a WAR; step 2 runs integration tests against the compiled code from step 1; step 3 runs acceptance tests, again against the code from step 1; and step 4 deploys the WAR that was built in step 1. So far so good.
I am trying to implement this pipeline in Jenkins. Since each Jenkins job gets its own workspace, steps 2, 3, and 4 end up recompiling the code and rebuilding the WAR, which violates the Continuous Delivery mantra of building your binaries only once.
To work around this, I used the Clone Workspace SCM Jenkins plugin, which clones the workspace from the first build and uses it as the workspace for jobs 2, 3, and 4. However, Gradle still recompiles the code at each step, because it apparently uses the absolute paths of the files to decide whether a task needs to run. Since the plugin moves the files into a new workspace, the absolute paths change, and Gradle believes it has to build everything from scratch rather than perform an incremental build.
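What we would like, roughly, is for the downstream steps to consume the WAR built in step 1 instead of rebuilding it. A hypothetical sketch of that idea (the `warPath` project property and `war.under.test` system property are made-up names, not anything we currently have):

```groovy
// Hypothetical: let a downstream step test against a WAR built earlier,
// passed in as -PwarPath=/path/to/the.war, instead of rebuilding it.
if (project.hasProperty('warPath')) {
    acceptTest.systemProperty 'war.under.test', project.property('warPath')
} else {
    // Fall back to building the WAR locally when no prebuilt artifact is supplied.
    acceptTest.dependsOn war
    acceptTest.doFirst {
        systemProperty 'war.under.test', war.archivePath.absolutePath
    }
}
```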
We could instead share a workspace across the Jenkins jobs, but having two jobs run against a common workspace is something we also frown upon.
So, how can this pipeline be implemented with Jenkins and Gradle while sticking to Continuous Delivery best practices?