How to split declarative pipeline across many projects

I have many projects in different repositories that use the same basic CI workflow, which I can easily express as a declarative pipeline:

    pipeline {
        agent any
        options {
            buildDiscarder(logRotator(numToKeepStr: '20'))
        }
        stages {
            stage('CI') {
                steps {
                    echo 'Do CI'
                }
            }
            stage('QA') {
                steps {
                    echo 'Do QA'
                }
            }
        }
        post {
            always {
                junit allowEmptyResults: true, testResults: '**/target/surefire-reports/TEST-*.xml'
                // etc...
            }
            failure {
                echo 'Failure mail'
                // etc
            }
        }
    }

I would like to use the same declarative pipeline in all my projects, so that I can change the pipeline definition in one place and have the change propagate automatically to every project.

Essentially, this is what I would like each project's Jenkinsfile to do:

 loadPipelineFromScm 'repository', 'pipeline.groovy' 

I can already do this with shared libraries, but then I can no longer use the Declarative Pipeline functions.
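For context, the shared-library variant that already works looks roughly like this (the library and step names are placeholders). The limitation is that the body of the step has to be written as a scripted pipeline, so declarative-only features are lost:

```groovy
// vars/standardBuild.groovy in the shared library (hypothetical name).
// The body must be scripted: a 'pipeline { ... }' block wrapped inside
// a library step is not parsed as a declarative pipeline.
def call() {
    node {
        stage('CI') { echo 'Do CI' }
        stage('QA') { echo 'Do QA' }
    }
}
```

Each project's Jenkinsfile then reduces to:

```groovy
@Library('my-shared-library') _
standardBuild()
```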

Is there a way to share the Declarative pipeline in many repositories?

3 answers

While the pipeline visualizations stay intact with the approach suggested by noober01, the declarative pipeline will not function properly. For instance, `when` clauses are ignored, because the `pipeline` element is expected to be at the top level of the Jenkinsfile; otherwise the file is parsed as a scripted pipeline.

See the following issue, rejected by the Jenkins team: loading an external declarative pipeline problem


I dealt with the same problem in my own work. The best solution I could find was to give every project/repo in my organization the same minimal Jenkinsfile:

    node {
        checkout([$class: 'GitSCM',
            branches: [[name: env.DELIVERY_PIPELINE_BRANCH]],
            userRemoteConfigs: [[credentialsId: env.DELIVERY_PIPELINE_CREDENTIALS,
                                 url: env.DELIVERY_PIPELINE_URL]]])
        stash includes: '*.groovy', name: 'assets', useDefaultExcludes: false
        load './Jenkinsfile.groovy'
    }

I used environment variables so that the repository, branch, and credentials can change without touching every repo; it could probably be made even more dynamic than my current setup (it is still in development anyway).

The stash stores the remaining Groovy scripts so they can be unstashed later, inside the declarative pipeline, wherever they are needed.

Finally, it loads the declarative pipeline. This does not interfere with the pipeline visualizations; basically everything behaves normally.
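As a sketch of what the loaded file might contain (assuming the stash name `assets` from above; `helpers.groovy` and its `build()` method are hypothetical), `Jenkinsfile.groovy` holds an ordinary declarative pipeline that unstashes the helper scripts where it needs them:

```groovy
// Jenkinsfile.groovy -- fetched from the shared pipeline repo and loaded above
pipeline {
    agent any
    stages {
        stage('Build') {
            steps {
                unstash 'assets'  // restore the stashed helper .groovy scripts
                script {
                    def helpers = load 'helpers.groovy'  // hypothetical helper script
                    helpers.build()
                }
            }
        }
    }
}
```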

So it is not exactly what you were looking for; I too would rather just pull the pipeline straight from SCM. But it works well enough for me at the moment.


You can use a shared library to define a pipeline that is configured through a YAML file.

In my repo/project I define a Jenkinsfile to call the Shared Library:

    @Library('my-shared-library') _
    pipelineDefault() // cannot be named 'pipeline'

and Jenkinsfile.yaml to configure build options:

    project_name: my_project
    debug: true
    # you get the idea

Then a very simple shared library step in vars/pipelineDefault.groovy might look like this:

    def call() {
        Map pipelineConfig = readYaml(file: "${WORKSPACE}/Jenkinsfile.yaml")
        node {
            stage('Build') {
                println "Building: ${pipelineConfig.project_name}"
            }
        }
    }

Of course, this is a very simplified example, but the dynamic configuration does the job.
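To flesh this out slightly (still a sketch: the extra stage and the handling of the `debug` flag are assumptions, not part of the original answer), the step can branch on any value read from the YAML file:

```groovy
// vars/pipelineDefault.groovy -- sketch of a config-driven pipeline step
def call() {
    Map pipelineConfig = readYaml(file: "${WORKSPACE}/Jenkinsfile.yaml")
    node {
        stage('Build') {
            if (pipelineConfig.debug) {
                echo "Loaded config: ${pipelineConfig}"  // assumed debug behaviour
            }
            println "Building: ${pipelineConfig.project_name}"
        }
        stage('Test') {
            println "Testing: ${pipelineConfig.project_name}"
        }
    }
}
```

Every repo keeps the same two-line Jenkinsfile; only Jenkinsfile.yaml differs per project.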


Source: https://habr.com/ru/post/1264522/
