Azure DevOps YAML pipelines


Steps

Conditional execution

The following example shows how to make the execution of a step conditional, e.g. based on a parameter defined in the parameters section of the pipeline:

parameters:
- name: paramX
  type: string
  default: 'false'

...

steps:
- pwsh: |
    ...
  condition: eq('${{ parameters.paramX }}', 'true')
...

Workflow

Making sure that a task is executed even if the previous task fails

There are two approaches to achieve that:

  • Using continueOnError
  • Using always()

Using continueOnError

In this approach, we set continueOnError: true on the previous task. It might also be necessary to make sure that the subsequent task does not set enabled: true, since that setting seems to interfere with continueOnError. The following example shows a simple implementation of this approach:

...
- task: Bash@3
  name: StepX
  displayName: "Step X"
  …
  continueOnError: true

- task: PublishPipelineArtifact@1
  displayName: 'Step Y'
  …
  timeoutInMinutes: 1
...

Using always()

This is the more robust approach if your goal is for the second task to run regardless of whether the previous tasks failed or not. The following example shows how the always() function can be used:

...
- task: PublishPipelineArtifact@1
  condition: always()
  displayName: 'Task Z'
  inputs:
    path: "$(Build.SourcesDirectory)\\myfolder"
    artifact: 'myartifact'
  timeoutInMinutes: 1
...

Artifacts

Publishing/Downloading artifacts

Publishing multiple folders with different paths into the same artifact

Sometimes you need to publish folders or files from multiple sources, possibly at multiple steps, into the same artifact. One possible approach to handle such a situation is to use the artifact staging directory.

Let’s say you have two folders from two different paths (path1/folder1, path2/folder2) that become ready at different steps, but you want them published into the same artifact artifact-x. The following code shows how you could use the artifact staging directory to achieve this:

...

- task: CopyFiles@2
  inputs:
    sourceFolder: 'path1/folder1'
    contents: '**/*'
    targetFolder: '$(Build.ArtifactStagingDirectory)/folder1'

...

- task: CopyFiles@2
  inputs:
    sourceFolder: 'path2/folder2'
    contents: '**/*'
    targetFolder: '$(Build.ArtifactStagingDirectory)/folder2'

...

- task: PublishPipelineArtifact@1
  displayName: 'Publishing to artifact-x'
  inputs:
    path: '$(Build.ArtifactStagingDirectory)'
    artifact: 'artifact-x'

...

Passing data between pipelines

There are different ways to pass data from one pipeline to another. Below are some possible approaches to achieve that goal.

Case: Pipelines run on the same agent

If pipelines run on the same agent, a simple way to pass data from one to another would be to use files stored on the system where the agent is running.

Let’s assume we want to pass the build Id of the first pipeline to the second one. One way to achieve that is to store the data (here, the build Id of the first pipeline) in a file that the second pipeline can access. The following example shows how we can store the build Id of the first pipeline in a file under the agent workspace directory.

Code in the first pipeline:

...
  - task: Bash@3
    displayName: "Output build Id of the first pipeline"
    inputs:
      targetType: inline
      script: |
        mkdir -p $(Pipeline.Workspace)/variables
        echo "Build.BuildId: $(Build.BuildId)"
        echo $(Build.BuildId) > $(Pipeline.Workspace)/variables/firstPipelineBuildId
...

We can then use the same path to read that file with the variable value in the second pipeline.
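For illustration, the reading side in the second pipeline could look like the following sketch. It assumes both pipelines resolve $(Pipeline.Workspace) to the same directory on the shared agent; if each pipeline gets its own workspace directory, a fixed path on the agent can be used instead.

```yaml
...
  - task: Bash@3
    displayName: "Read build Id of the first pipeline"
    inputs:
      targetType: inline
      script: |
        # Read the value written by the first pipeline from the shared location.
        firstPipelineBuildId=$(cat $(Pipeline.Workspace)/variables/firstPipelineBuildId)
        echo "First pipeline build Id: $firstPipelineBuildId"
...
```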

Case: Pipelines run on different agents

If the pipelines are running on different agents, we can’t use the agent file system to pass data between them. In this case, we could achieve our goal by pushing the data as artifacts in the first pipeline and downloading the artifacts in the second pipeline.

Publishing artifacts in one pipeline and downloading them in another

to be completed
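As a starting point, one possible sketch: the first pipeline publishes the data as a pipeline artifact, and the second pipeline downloads it with DownloadPipelineArtifact@2 in 'specific' mode. The project name MyProject and the pipeline definition Id 42 below are placeholders.

```yaml
# First pipeline: write the data to a file and publish it as an artifact.
steps:
- task: Bash@3
  displayName: "Write build Id to a file"
  inputs:
    targetType: inline
    script: |
      mkdir -p $(Build.ArtifactStagingDirectory)/variables
      echo $(Build.BuildId) > $(Build.ArtifactStagingDirectory)/variables/firstPipelineBuildId
- task: PublishPipelineArtifact@1
  inputs:
    path: '$(Build.ArtifactStagingDirectory)/variables'
    artifact: 'variables'

# Second pipeline: download the artifact from a run of the first pipeline.
steps:
- task: DownloadPipelineArtifact@2
  inputs:
    source: 'specific'
    project: 'MyProject'    # placeholder
    pipeline: 42            # definition Id of the first pipeline (placeholder)
    runVersion: 'latest'
    artifact: 'variables'
    path: '$(Pipeline.Workspace)/variables'
```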

PowerShell scripts in DevOps pipelines

Mixing a command and a variable in the same string

A variable can be included together with a command in the same string, as shown below. Note that $(System.DefaultWorkingDirectory) is macro syntax expanded by the pipeline before the script runs, while $(Get-ChildItem ...) is a PowerShell subexpression evaluated at runtime:

$pactsPath = "$(System.DefaultWorkingDirectory)/pacts"
Write-Host "Pacts path: $pactsPath"
Write-Host "Contents: $(Get-ChildItem -Path $pactsPath)"

Predefined DevOps pipeline variables

Here is a list of some of the predefined DevOps pipeline variables and some example values:

Variable                         Comments                                                        Example value
Agent.BuildDirectory             Not available in the PowerShell task in the release pipeline    /home/vsts/work/1 (Linux Agent)
                                 Build or deployment job number
Pipeline.Workspace               Not available in the PowerShell task in the release pipeline    /home/vsts/work/1 (Linux Agent)
Build.Repository.LocalPath       Not available in the PowerShell task in the release pipeline    /home/vsts/work/1/s (Linux Agent)
                                 Name of the stage in the pipeline
System.DefaultWorkingDirectory   Available in the PowerShell task in the release pipeline        /agent/_work/r2/a (Linux Agent),
                                                                                                 C:\agent\_work\r1\a (Windows Agent)
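To check what these variables resolve to on a given agent, a simple step like the following sketch can be added to a build pipeline (in a classic release pipeline, some of the variables are not available, as noted in the table):

```yaml
steps:
- pwsh: |
    # Print the resolved values of some predefined pipeline variables.
    Write-Host "Agent.BuildDirectory: $(Agent.BuildDirectory)"
    Write-Host "Pipeline.Workspace: $(Pipeline.Workspace)"
    Write-Host "Build.Repository.LocalPath: $(Build.Repository.LocalPath)"
    Write-Host "System.DefaultWorkingDirectory: $(System.DefaultWorkingDirectory)"
  displayName: "Print predefined variables"
```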