Jenkins Pipeline Basics

Recently, I've had to learn Jenkins, so I've decided to dump my notes here for everyone to see.

Basics

All valid Declarative Pipelines must be enclosed within a pipeline block:

pipeline {  
}
  • The top-level of the Pipeline must be a block, specifically: pipeline { }
  • No semicolons as statement separators. Each statement has to be on its own line.
  • Blocks must only consist of Sections, Directives, Steps, or assignment statements.
  • A property reference statement is treated as a no-argument method invocation. So, for example, input is treated as input().
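
Putting these rules together, here is a minimal sketch of a complete Pipeline (the names and values are just placeholders) built only from Sections, Directives, Steps, and assignment statements:

pipeline {
    agent any                        // agent and stages are Sections
    environment {                    // environment is a Directive
        GREETING = 'Hello World'     // an assignment statement
    }
    stages {
        stage('Example') {
            steps {
                echo "${GREETING}"   // echo is a Step
            }
        }
    }
}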

Sections

Agent

Required: Yes
Allowed: In top-level pipeline and each stage block

The agent section specifies where the entire Pipeline, or a specific stage, will execute in the Jenkins environment depending on where the agent section is placed.

Differences between top-level agents and stage-level agents

There are some nuances when adding an agent to the top level or a stage level when the options directive is applied.

In agents declared at the top level of a Pipeline, an agent is allocated and then the timeout option is applied. The time to allocate the agent is not included in the limit set by the timeout option.

pipeline {
    agent any
    options {
        // Timeout counter starts AFTER agent is allocated
        timeout(time: 1, unit: 'SECONDS')
    }
    stages {
        stage('Example') {
            steps {
                echo 'Hello World'
            }
        }
    }
}

Stage Agents

In agents declared within a stage, the options are invoked before allocating the agent and before checking any when conditions. In this case, when using timeout, it is applied before the agent is allocated. The time to allocate the agent is included in the limit set by the timeout option.

pipeline {
    agent none
    stages {
        stage('Example') {
            agent any
            options {
                // Timeout counter starts BEFORE agent is allocated
                timeout(time: 1, unit: 'SECONDS')
            }
            steps {
                echo 'Hello World'
            }
        }
    }
}

Agent Parameters

any

Execute the Pipeline, or stage, on any available agent. For example: agent any

none

When applied at the top-level of the pipeline block, no global agent will be allocated for the entire Pipeline run and each stage section will need to contain its own agent section. For example: agent none

label

A string. The label or label condition on which to run the Pipeline or individual stage. Execute the Pipeline, or stage, on an agent available in the Jenkins environment with the provided label. For example: agent { label 'my-defined-label' }

Label conditions can also be used. For example: agent { label 'my-label1 && my-label2' } or agent { label 'my-label1 || my-label2' }

args

A string. Runtime arguments to pass to docker run; used with the docker and dockerfile agent types, for example: args '-v /tmp:/tmp'. The example below uses the shorthand, top-level docker agent form:

pipeline {
    agent { docker 'maven:3.9.3-eclipse-temurin-17' }
    stages {
        stage('Example Build') {
            steps {
                sh 'mvn -B clean verify'
            }
        }
    }
}

customWorkspace

A string. Run the Pipeline or individual stage this agent is applied to within this custom workspace, rather than the default. It can be either a relative path, in which case the custom workspace will be under the workspace root on the node, or an absolute path.

node

agent { node { label 'labelName' } } behaves the same as agent { label 'labelName' }, but node allows for additional options (such as customWorkspace).
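
For instance, a short sketch combining node with customWorkspace (the label and the workspace path are placeholders):

agent {
    node {
        label 'my-defined-label'
        customWorkspace '/some/other/workspace'
    }
}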

docker

Execute the Pipeline, or stage, with the given container which will be dynamically provisioned on a node pre-configured to accept Docker-based Pipelines, or on a node matching the optionally defined label parameter. docker also optionally accepts an args parameter which may contain arguments to pass directly to a docker run invocation, and an alwaysPull option, which will force a docker pull even if the image name is already present. For example: agent { docker 'maven:3.9.3-eclipse-temurin-17' }

agent {
    docker {
        image 'maven:3.9.3-eclipse-temurin-17'
        label 'my-defined-label'
        args  '-v /tmp:/tmp'
    }
}

docker also optionally accepts registryUrl and registryCredentialsId parameters, which specify the Docker registry to use and its credentials. The registryCredentialsId parameter can be used on its own for private repositories on Docker Hub.

agent {
    docker {
        image 'myregistry.com/node'
        label 'my-defined-label'
        registryUrl 'https://myregistry.com/'
        registryCredentialsId 'myPredefinedCredentialsInJenkins' 
    }
} 

dockerfile

Execute the Pipeline, or stage, with a container built from a Dockerfile contained in the source repository. In order to use this option, the Jenkinsfile must be loaded from either a Multibranch Pipeline or a Pipeline from SCM. Conventionally this is the Dockerfile in the root of the source repository: agent { dockerfile true }. If building a Dockerfile in another directory, use the dir option: agent { dockerfile { dir 'someSubDir' } }. If your Dockerfile has another name, you can specify the file name with the filename option. You can pass additional arguments to the docker build command with the additionalBuildArgs option, like agent { dockerfile { additionalBuildArgs '--build-arg foo=bar' } }

// Equivalent to "docker build -f Dockerfile.build --build-arg version=1.0.2 ./build/"
agent {
    dockerfile {
        filename 'Dockerfile.build'
        dir 'build'
        label 'my-defined-label'
        additionalBuildArgs  '--build-arg version=1.0.2'
        args '-v /tmp:/tmp'
    }
}

Post

Required: No
Allowed: In top-level pipeline and each stage block

The post section defines one or more additional steps that are run upon the completion of a Pipeline's or stage's run (depending on the location of the post section within the Pipeline). post can support any of the following post-condition blocks: always, changed, fixed, regression, aborted, failure, success, unstable, unsuccessful, and cleanup. These condition blocks allow the execution of steps inside each condition depending on the completion status of the Pipeline or stage. The condition blocks are executed in the order shown below.

  • always - Run the steps in the post section regardless of the completion status of the Pipeline's or stage's run.
  • changed - Only run the steps in post if the current Pipeline's run has a different completion status from its previous run.
  • fixed - Only run the steps in post if the current Pipeline's run is successful and the previous run failed or was unstable.
  • regression - Only run the steps in post if the current Pipeline's or stage's run status is failure, unstable, or aborted and the previous run was successful.
  • aborted - Only run the steps in post if the current Pipeline's run has an "aborted" status, usually due to the Pipeline being manually aborted. This is typically denoted by gray in the web UI.
  • failure - Only run the steps in post if the current Pipeline's or stage's run has a "failed" status, typically denoted by red in the web UI.
  • success - Only run the steps in post if the current Pipeline's or stage's run has a "success" status, typically denoted by blue or green in the web UI.
  • unstable - Only run the steps in post if the current Pipeline's run has an "unstable" status, usually caused by test failures, code violations, etc. This is typically denoted by yellow in the web UI.
  • unsuccessful - Only run the steps in post if the current Pipeline's or stage's run does not have a "success" status. This is typically denoted in the web UI depending on the status previously mentioned (for stages this may fire if the build itself is unstable).
  • cleanup - Run the steps in this post condition after every other post condition has been evaluated, regardless of the Pipeline or stage's status.
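
As a sketch of how these conditions are used together (the messages are arbitrary), a post section at the end of the pipeline block might look like this:

pipeline {
    agent any
    stages {
        stage('Example') {
            steps {
                echo 'Hello World'
            }
        }
    }
    post {
        always {
            echo 'This runs regardless of the build result.'
        }
        failure {
            echo 'This runs only when the run has failed.'
        }
        cleanup {
            echo 'This runs last, after all other post conditions have been evaluated.'
        }
    }
}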

Stages

Containing a sequence of one or more stage directives, the stages section is where the bulk of the "work" described by a Pipeline will be located. At a minimum, it is recommended that stages contain at least one stage directive for each discrete part of the continuous delivery process, such as Build, Test, and Deploy.

pipeline {
    agent any
    stages {
        stage('Example') {
            steps {
                echo 'Hello World'
            }
        }
    }
}

Steps

The steps section defines a series of one or more steps to be executed in a given stage directive.

Directives

environment

The environment directive specifies a sequence of key-value pairs which will be defined as environment variables for all steps, or stage-specific steps, depending on where the environment directive is located within the Pipeline.

This directive supports a special helper method credentials() which can be used to access pre-defined Credentials by their identifier in the Jenkins environment.

Supported Credentials Types

  • Secret Text - The environment variable specified will be set to the Secret Text content.
  • Secret File - The environment variable specified will be set to the location of the file that is temporarily created.
  • Username and password - The environment variable specified will be set to username:password, and two additional environment variables will be automatically defined: MYVARNAME_USR and MYVARNAME_PSW (holding the username and the password, respectively).
  • SSH with Private Key - The environment variable specified will be set to the location of the SSH key file that is temporarily created and two additional environment variables will be automatically defined: MYVARNAME_USR and MYVARNAME_PSW (holding the passphrase).

pipeline {
    agent any
    environment {
        CC = 'clang'
    }
    stages {
        stage('Example') {
            environment {
                AN_ACCESS_KEY = credentials('my-predefined-secret-text')
            }
            steps {
                sh 'printenv'
            }
        }
    }
}
pipeline {
    agent any
    stages {
        stage('Example Username/Password') {
            environment {
                SERVICE_CREDS = credentials('my-predefined-username-password')
            }
            steps {
                sh 'echo "Service user is $SERVICE_CREDS_USR"'
                sh 'echo "Service password is $SERVICE_CREDS_PSW"'
                sh 'curl -u $SERVICE_CREDS https://myservice.example.com'
            }
        }
        stage('Example SSH Username with private key') {
            environment {
                SSH_CREDS = credentials('my-predefined-ssh-creds')
            }
            steps {
                sh 'echo "SSH private key is located at $SSH_CREDS"'
                sh 'echo "SSH user is $SSH_CREDS_USR"'
                sh 'echo "SSH passphrase is $SSH_CREDS_PSW"'
            }
        }
    }
}

options

The options directive allows configuring Pipeline-specific options from within the Pipeline itself. Pipeline provides a number of these options, such as buildDiscarder, but they may also be provided by plugins, such as timestamps.

Available Options

  • buildDiscarder - Persist artifacts and console output for the specific number of recent Pipeline runs. For example: options { buildDiscarder(logRotator(numToKeepStr: '1')) }
  • checkoutToSubdirectory - Perform the automatic source control checkout in a subdirectory of the workspace. For example: options { checkoutToSubdirectory('foo') }
  • disableConcurrentBuilds - Disallow concurrent executions of the Pipeline. Can be useful for preventing simultaneous accesses to shared resources, etc. For example: options { disableConcurrentBuilds() } to queue a build when there's already an executing build of the Pipeline, or options { disableConcurrentBuilds(abortPrevious: true) } to abort the running one and start the new build.
  • disableResume - Do not allow the pipeline to resume if the controller restarts. For example: options { disableResume() }
  • newContainerPerStage - Used with docker or dockerfile top-level agent. When specified, each stage will run in a new container deployed on the same node, rather than all stages running in the same container deployment.
  • overrideIndexTriggers - Allows overriding the default treatment of branch indexing triggers. If branch indexing triggers are disabled at the multibranch or organization level, options { overrideIndexTriggers(true) } will enable them for this job only. Otherwise, options { overrideIndexTriggers(false) } will disable branch indexing triggers for this job only.
  • preserveStashes - Preserve stashes from completed builds, for use with stage restarting. For example: options { preserveStashes() } to preserve the stashes from the most recent completed build, or options { preserveStashes(buildCount: 5) } to preserve the stashes from the five most recent completed builds.
  • quietPeriod - Set the quiet period, in seconds, for the Pipeline, overriding the global default. For example: options { quietPeriod(30) }
  • retry - On failure, retry the entire Pipeline the specified number of times. For example: options { retry(3) }
  • skipDefaultCheckout - Skip checking out code from source control by default in the agent directive. For example: options { skipDefaultCheckout() }
  • skipStagesAfterUnstable - Skip stages once the build status has gone to UNSTABLE. For example: options { skipStagesAfterUnstable() }
  • timeout - Set a timeout period for the Pipeline run, after which Jenkins should abort the Pipeline. For example: options { timeout(time: 1, unit: 'HOURS') }
  • timestamps - Prepend all console output generated by the Pipeline run with the time at which the line was emitted. For example: options { timestamps() }
  • parallelsAlwaysFailFast - Set failfast true for all subsequent parallel stages in the pipeline. For example: options { parallelsAlwaysFailFast() }
  • disableRestartFromStage - Completely disable the "Restart from Stage" option, both in the classic Jenkins UI and in Blue Ocean. For example: options { disableRestartFromStage() }. This option cannot be used inside a stage.
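
As a sketch, a few of these options combined at the Pipeline level (the values are arbitrary; timestamps is provided by the Timestamper plugin):

pipeline {
    agent any
    options {
        // Keep artifacts and console output for the last 5 runs only
        buildDiscarder(logRotator(numToKeepStr: '5'))
        // Abort the whole run if it exceeds one hour
        timeout(time: 1, unit: 'HOURS')
        // Prefix every console line with the time it was emitted
        timestamps()
    }
    stages {
        stage('Example') {
            steps {
                echo 'Hello World'
            }
        }
    }
}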

Stage Options

The options directive for a stage is similar to the options directive at the root of the Pipeline. However, the stage-level options can only contain steps like retry, timeout, or timestamps, or Declarative options that are relevant to a stage, like skipDefaultCheckout.

Inside a stage, the steps in the options directive are invoked before entering the agent or checking any when conditions.

  • skipDefaultCheckout - Skip checking out code from source control by default in the agent directive. For example: options { skipDefaultCheckout() }
  • timeout - Set a timeout period for this stage, after which Jenkins should abort the stage. For example: options { timeout(time: 1, unit: 'HOURS') }
  • retry - On failure, retry this stage the specified number of times. For example: options { retry(3) }
  • timestamps - Prepend all console output generated during this stage with the time at which the line was emitted. For example: options { timestamps() }
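
A minimal stage-level sketch using retry (the retry count is arbitrary):

pipeline {
    agent any
    stages {
        stage('Example') {
            options {
                // Retry this stage up to three times on failure
                retry(3)
            }
            steps {
                echo 'Hello World'
            }
        }
    }
}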

parameters

The parameters directive provides a list of parameters that a user should provide when triggering the Pipeline. The values for these user-specified parameters are made available to Pipeline steps via the params object; see the example below for its specific usage.

Each parameter has a Name and Value, depending on the parameter type. This information is exported as environment variables when the build starts, allowing subsequent parts of the build configuration to access those values. For example, use the ${PARAMETER_NAME} syntax with POSIX shells like bash and ksh, the ${Env:PARAMETER_NAME} syntax with PowerShell, or the %PARAMETER_NAME% syntax with Windows cmd.exe.

  • string - A parameter of a string type, for example: parameters { string(name: 'DEPLOY_ENV', defaultValue: 'staging', description: '') }.
  • text - A text parameter, which can contain multiple lines, for example: parameters { text(name: 'DEPLOY_TEXT', defaultValue: 'One\nTwo\nThree\n', description: '') }.
  • booleanParam - A boolean parameter, for example: parameters { booleanParam(name: 'DEBUG_BUILD', defaultValue: true, description: '') }.
  • choice - A choice parameter, for example: parameters { choice(name: 'CHOICES', choices: ['one', 'two', 'three'], description: '') }. The first value is the default.
  • password - A password parameter, for example: parameters { password(name: 'PASSWORD', defaultValue: 'SECRET', description: 'A secret password') }.

pipeline {
    agent any
    parameters {
        string(name: 'PERSON', defaultValue: 'Mr Jenkins', description: 'Who should I say hello to?')

        text(name: 'BIOGRAPHY', defaultValue: '', description: 'Enter some information about the person')

        booleanParam(name: 'TOGGLE', defaultValue: true, description: 'Toggle this value')

        choice(name: 'CHOICE', choices: ['One', 'Two', 'Three'], description: 'Pick something')

        password(name: 'PASSWORD', defaultValue: 'SECRET', description: 'Enter a password')
    }
    stages {
        stage('Example') {
            steps {
                echo "Hello ${params.PERSON}"

                echo "Biography: ${params.BIOGRAPHY}"

                echo "Toggle: ${params.TOGGLE}"

                echo "Choice: ${params.CHOICE}"

                echo "Password: ${params.PASSWORD}"
            }
        }
    }
}
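
Because parameters are also exported as environment variables, they can be read directly from a shell step as well. A minimal sketch, assuming a POSIX shell on the agent:

pipeline {
    agent any
    parameters {
        string(name: 'PERSON', defaultValue: 'Mr Jenkins', description: 'Who should I say hello to?')
    }
    stages {
        stage('Example') {
            steps {
                // Single quotes so the shell, not Groovy, expands the variable
                sh 'echo "Hello ${PERSON}"'
            }
        }
    }
}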

triggers

The triggers directive defines the automated ways in which the Pipeline should be re-triggered. For Pipelines which are integrated with a source such as GitHub or BitBucket, triggers may not be necessary, as webhook-based integration will likely already be present. The triggers currently available are cron, pollSCM, and upstream.

  • cron - Accepts a cron-style string to define a regular interval at which the Pipeline should be re-triggered, for example: triggers { cron('H */4 * * 1-5') }.
  • pollSCM - Accepts a cron-style string to define a regular interval at which Jenkins should check for new source changes. If new changes exist, the Pipeline will be re-triggered. For example: triggers { pollSCM('H */4 * * 1-5') }
  • upstream - Accepts a comma-separated string of jobs and a threshold. When any job in the string finishes with the minimum threshold, the Pipeline will be re-triggered. For example: triggers { upstream(upstreamProjects: 'job1,job2', threshold: hudson.model.Result.SUCCESS) }
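
For example, a sketch of a Pipeline that re-triggers itself on a cron schedule (the schedule is arbitrary):

pipeline {
    agent any
    triggers {
        // Every four hours, Monday through Friday, at a hash-spread minute
        cron('H */4 * * 1-5')
    }
    stages {
        stage('Example') {
            steps {
                echo 'Hello World'
            }
        }
    }
}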

stage

The stage directive goes in the stages section and should contain a steps section, an optional agent section, or other stage-specific directives. Practically speaking, all of the real work done by a Pipeline will be wrapped in one or more stage directives.

pipeline {
    agent any
    stages {
        stage('Example') {
            steps {
                echo 'Hello World'
            }
        }
    }
}

input

The input directive on a stage allows you to prompt for input, using the input step. The stage will pause after any options have been applied, and before entering the agent block for that stage or evaluating the when condition of the stage. If the input is approved, the stage will then continue. Any parameters provided as part of the input submission will be available in the environment for the rest of the stage.

  • message - Required. This will be presented to the user when they go to submit the input.
  • id - An optional identifier for this input. The default value is based on the stage name.
  • ok - Optional text for the "ok" button on the input form.
  • submitter - An optional comma-separated list of users or external group names who are allowed to submit this input. Defaults to allowing any user.
  • submitterParameter - An optional name of an environment variable to set with the submitter name, if present.
  • parameters - An optional list of parameters to prompt the submitter to provide. Refer to parameters for more information.

pipeline {
    agent any
    stages {
        stage('Example') {
            input {
                message "Should we continue?"
                ok "Yes, we should."
                submitter "alice,bob"
                parameters {
                    string(name: 'PERSON', defaultValue: 'Mr Jenkins', description: 'Who should I say hello to?')
                }
            }
            steps {
                echo "Hello, ${PERSON}, nice to meet you."
            }
        }
    }
}

when

The when directive allows the Pipeline to determine whether the stage should be executed depending on the given condition. The when directive must contain at least one condition. If the when directive contains more than one condition, all the child conditions must return true for the stage to execute. This is the same as if the child conditions were nested in an allOf condition (refer to the examples below). If an anyOf condition is used, note that the condition skips remaining tests as soon as the first "true" condition is found.

More complex conditional structures can be built using the nesting conditions: not, allOf, or anyOf. Nesting conditions may be nested to any arbitrary depth.

Built-in Conditions

  • branch - Execute the stage when the branch being built matches the branch pattern (ANT style path glob) given, for example: when { branch 'master' }. Note that this only works on a multibranch Pipeline.
  • The optional parameter comparator may be added after an attribute to specify how any patterns are evaluated for a match:

    • EQUALS for a simple string comparison
    • GLOB (the default) for an ANT style path glob (same as for example changeset)
    • REGEXP for regular expression matching
    • For example: when { branch pattern: "release-\\d+", comparator: "REGEXP"}
  • buildingTag - Execute the stage when the build is building a tag. For example: when { buildingTag() }

  • changelog - Execute the stage if the build's SCM changelog contains a given regular expression pattern, for example: when { changelog '.*^\\[DEPENDENCY\\] .+$' }.

  • changeset - Execute the stage if the build's SCM changeset contains one or more files matching the given pattern. Example: when { changeset "**/*.js" }

  • The optional parameter comparator may be added after an attribute to specify how any patterns are evaluated for a match:

    • EQUALS for a simple string comparison
    • GLOB (the default) for an ANT style path glob, which is case insensitive (this can be turned off with the caseSensitive parameter).
    • REGEXP for regular expression matching
    • For example: when { changeset pattern: ".TEST\\.java", comparator: "REGEXP" } or when { changeset pattern: "*/*TEST.java", caseSensitive: true }
  • changeRequest - Executes the stage if the current build is for a "change request" (a.k.a. Pull Request on GitHub and Bitbucket, Merge Request on GitLab, Change in Gerrit, etc.). When no parameters are passed the stage runs on every change request, for example: when { changeRequest() }.

  • By adding a filter attribute with a parameter to the change request, the stage can be made to run only on matching change requests. Possible attributes are id, target, branch, fork, url, title, author, authorDisplayName, and authorEmail. Each of these corresponds to a CHANGE_* environment variable, for example: when { changeRequest target: 'master' }.
  • The optional parameter comparator may be added after an attribute to specify how any patterns are evaluated for a match:

    • EQUALS for a simple string comparison (the default)
    • GLOB for an ANT style path glob (same as for example changeset)
    • REGEXP for regular expression matching
    • Example: when { changeRequest authorEmail: "[\\w_-.]+@example.com", comparator: 'REGEXP' }
  • environment - Execute the stage when the specified environment variable is set to the given value, for example: when { environment name: 'DEPLOY_TO', value: 'production' }.

  • equals - Execute the stage when the expected value is equal to the actual value, for example: when { equals expected: 2, actual: currentBuild.number }.

  • expression - Execute the stage when the specified Groovy expression evaluates to true, for example: when { expression { return params.DEBUG_BUILD } }.

  • tag - Execute the stage if the TAG_NAME variable matches the given pattern. For example: when { tag "release-*" }. If an empty pattern is provided, the stage will execute if the TAG_NAME variable exists (same as buildingTag()).

  • The optional parameter comparator may be added after an attribute to specify how any patterns are evaluated for a match:

    • EQUALS for a simple string comparison,
    • GLOB (the default) for an ANT style path glob (same as for example changeset), or
    • REGEXP for regular expression matching.
    • For example: when { tag pattern: "release-\\d+", comparator: "REGEXP"}
  • not - Execute the stage when the nested condition is false. Must contain one condition. For example: when { not { branch 'master' } }

  • allOf - Execute the stage when all of the nested conditions are true. Must contain at least one condition. For example: when { allOf { branch 'master'; environment name: 'DEPLOY_TO', value: 'production' } }

  • anyOf - Execute the stage when at least one of the nested conditions is true. Must contain at least one condition. For example: when { anyOf { branch 'master'; branch 'staging' } }

  • triggeredBy - Execute the stage when the current build has been triggered by the given cause. For example:

  • when { triggeredBy 'SCMTrigger' }
  • when { triggeredBy 'TimerTrigger' }
  • when { triggeredBy 'BuildUpstreamCause' }
  • when { triggeredBy cause: "UserIdCause", detail: "vlinde" }

Evaluating when before entering agent in a stage

By default, the when condition for a stage will be evaluated after entering the agent for that stage, if one is defined. However, this can be changed by specifying the beforeAgent option within the when block. If beforeAgent is set to true, the when condition will be evaluated first, and the agent will only be entered if the when condition evaluates to true.

Evaluating when before the input directive

By default, the when condition for a stage will not be evaluated before the input, if one is defined. However, this can be changed by specifying the beforeInput option within the when block. If beforeInput is set to true, the when condition will be evaluated first, and the input will only be entered if the when condition evaluates to true.

beforeInput true takes precedence over beforeAgent true.

Evaluating when before the options directive

By default, the when condition for a stage will be evaluated after entering the options for that stage, if any are defined. However, this can be changed by specifying the beforeOptions option within the when block. If beforeOptions is set to true, the when condition will be evaluated first, and the options will only be entered if the when condition evaluates to true.

beforeOptions true takes precedence over beforeInput true and beforeAgent true.

Example: Single Condition

pipeline {
    agent any
    stages {
        stage('Example Build') {
            steps {
                echo 'Hello World'
            }
        }
        stage('Example Deploy') {
            when {
                branch 'production'
            }
            steps {
                echo 'Deploying'
            }
        }
    }
}

Example: Multiple Conditions

pipeline {
    agent any
    stages {
        stage('Example Build') {
            steps {
                echo 'Hello World'
            }
        }
        stage('Example Deploy') {
            when {
                branch 'production'
                environment name: 'DEPLOY_TO', value: 'production'
            }
            steps {
                echo 'Deploying'
            }
        }
    }
}

Example: Nested condition (same behavior as the previous example)

pipeline {
    agent any
    stages {
        stage('Example Build') {
            steps {
                echo 'Hello World'
            }
        }
        stage('Example Deploy') {
            when {
                allOf {
                    branch 'production'
                    environment name: 'DEPLOY_TO', value: 'production'
                }
            }
            steps {
                echo 'Deploying'
            }
        }
    }
}

Example: Multiple conditions and a nested condition

pipeline {
    agent any
    stages {
        stage('Example Build') {
            steps {
                echo 'Hello World'
            }
        }
        stage('Example Deploy') {
            when {
                branch 'production'
                anyOf {
                    environment name: 'DEPLOY_TO', value: 'production'
                    environment name: 'DEPLOY_TO', value: 'staging'
                }
            }
            steps {
                echo 'Deploying'
            }
        }
    }
}

Example: Expression condition and nested condition

pipeline {
    agent any
    stages {
        stage('Example Build') {
            steps {
                echo 'Hello World'
            }
        }
        stage('Example Deploy') {
            when {
                expression { BRANCH_NAME ==~ /(production|staging)/ }
                anyOf {
                    environment name: 'DEPLOY_TO', value: 'production'
                    environment name: 'DEPLOY_TO', value: 'staging'
                }
            }
            steps {
                echo 'Deploying'
            }
        }
    }
}

Example: beforeAgent

pipeline {
    agent none
    stages {
        stage('Example Build') {
            steps {
                echo 'Hello World'
            }
        }
        stage('Example Deploy') {
            agent {
                label "some-label"
            }
            when {
                beforeAgent true
                branch 'production'
            }
            steps {
                echo 'Deploying'
            }
        }
    }
}

Example: beforeInput

pipeline {
    agent none
    stages {
        stage('Example Build') {
            steps {
                echo 'Hello World'
            }
        }
        stage('Example Deploy') {
            when {
                beforeInput true
                branch 'production'
            }
            input {
                message "Deploy to production?"
                id "simple-input"
            }
            steps {
                echo 'Deploying'
            }
        }
    }
}

Example: beforeOptions

pipeline {
    agent none
    stages {
        stage('Example Build') {
            steps {
                echo 'Hello World'
            }
        }
        stage('Example Deploy') {
            when {
                beforeOptions true
                branch 'testing'
            }
            options {
                lock label: 'testing-deploy-envs', quantity: 1, variable: 'deployEnv'
            }
            steps {
                echo "Deploying to ${deployEnv}"
            }
        }
    }
}

Example: triggeredBy

pipeline {
    agent none
    stages {
        stage('Example Build') {
            steps {
                echo 'Hello World'
            }
        }
        stage('Example Deploy') {
            when {
                triggeredBy "TimerTrigger"
            }
            steps {
                echo 'Deploying'
            }
        }
    }
}

Sequential Stages

Stages in Declarative Pipeline may have a stages section containing a list of nested stages to be run in sequential order.

pipeline {
    agent none
    stages {
        stage('Non-Sequential Stage') {
            agent {
                label 'for-non-sequential'
            }
            steps {
                echo "On Non-Sequential Stage"
            }
        }
        stage('Sequential') {
            agent {
                label 'for-sequential'
            }
            environment {
                FOR_SEQUENTIAL = "some-value"
            }
            stages {
                stage('In Sequential 1') {
                    steps {
                        echo "In Sequential 1"
                    }
                }
                stage('In Sequential 2') {
                    steps {
                        echo "In Sequential 2"
                    }
                }
                stage('Parallel In Sequential') {
                    parallel {
                        stage('In Parallel 1') {
                            steps {
                                echo "In Parallel 1"
                            }
                        }
                        stage('In Parallel 2') {
                            steps {
                                echo "In Parallel 2"
                            }
                        }
                    }
                }
            }
        }
    }
}

Parallel

Stages in Declarative Pipeline may have a parallel section containing a list of nested stages to be run in parallel. In addition, you can force your parallel stages to all be aborted when any one of them fails by adding failFast true to the stage containing the parallel block. Another option for adding fail-fast behavior is adding an option to the pipeline definition: parallelsAlwaysFailFast().

pipeline {
    agent any
    stages {
        stage('Non-Parallel Stage') {
            steps {
                echo 'This stage will be executed first.'
            }
        }
        stage('Parallel Stage') {
            when {
                branch 'master'
            }
            failFast true
            parallel {
                stage('Branch A') {
                    agent {
                        label "for-branch-a"
                    }
                    steps {
                        echo "On Branch A"
                    }
                }
                stage('Branch B') {
                    agent {
                        label "for-branch-b"
                    }
                    steps {
                        echo "On Branch B"
                    }
                }
                stage('Branch C') {
                    agent {
                        label "for-branch-c"
                    }
                    stages {
                        stage('Nested 1') {
                            steps {
                                echo "In stage Nested 1 within Branch C"
                            }
                        }
                        stage('Nested 2') {
                            steps {
                                echo "In stage Nested 2 within Branch C"
                            }
                        }
                    }
                }
            }
        }
    }
}

Matrix

Stages in Declarative Pipeline may have a matrix section defining a multi-dimensional matrix of name-value combinations to be run in parallel. We'll refer to these combinations as "cells" in a matrix. Each cell in a matrix can include one or more stages to be run sequentially using the configuration for that cell.

In addition, you can force your matrix cells to all be aborted when any one of them fails by adding failFast true to the stage containing the matrix. Another option for adding fail-fast behavior is adding an option to the pipeline definition: parallelsAlwaysFailFast().

The matrix section must include an axes section and a stages section. The axes section defines the values for each axis in the matrix. The stages section defines a list of stages to run sequentially in each cell. A matrix may have an excludes section to remove invalid cells from the matrix. Many of the directives available on stage, including agent, tools, when, etc., can also be added to matrix to control the behavior of each cell.

Axes

The axes section specifies one or more axis directives. Each axis consists of a name and a list of values. All the values from each axis are combined with the others to produce the cells; the example below yields 3 × 4 × 2 = 24 cells.

matrix {
    axes {
        axis {
            name 'PLATFORM'
            values 'linux', 'mac', 'windows'
        }
        axis {
            name 'BROWSER'
            values 'chrome', 'edge', 'firefox', 'safari'
        }
        axis {
            name 'ARCHITECTURE'
            values '32-bit', '64-bit'
        }
    }
    // ...
}

Example

pipeline {
    parameters {
        choice(name: 'PLATFORM_FILTER', choices: ['all', 'linux', 'windows', 'mac'], description: 'Run on specific platform')
    }
    agent none
    stages {
        stage('BuildAndTest') {
            matrix {
                agent {
                    label "${PLATFORM}-agent"
                }
                when { anyOf {
                    expression { params.PLATFORM_FILTER == 'all' }
                    expression { params.PLATFORM_FILTER == env.PLATFORM }
                } }
                axes {
                    axis {
                        name 'PLATFORM'
                        values 'linux', 'windows', 'mac'
                    }
                    axis {
                        name 'BROWSER'
                        values 'firefox', 'chrome', 'safari', 'edge'
                    }
                }
                excludes {
                    exclude {
                        axis {
                            name 'PLATFORM'
                            values 'linux'
                        }
                        axis {
                            name 'BROWSER'
                            values 'safari'
                        }
                    }
                    exclude {
                        axis {
                            name 'PLATFORM'
                            notValues 'windows'
                        }
                        axis {
                            name 'BROWSER'
                            values 'edge'
                        }
                    }
                }
                stages {
                    stage('Build') {
                        steps {
                            echo "Do Build for ${PLATFORM} - ${BROWSER}"
                        }
                    }
                    stage('Test') {
                        steps {
                            echo "Do Test for ${PLATFORM} - ${BROWSER}"
                        }
                    }
                }
            }
        }
    }
}