How to use a custom object in an Azure DevOps YAML pipeline to pass multiple values to a template file

Recently, a customer asked me how to pass several custom objects to an Azure DevOps YAML pipeline and index into each object.

They wanted to pass 3 object arrays to the template file and index into each of them, something like the (non-working) example below.

- template: builds.yml
  parameters:
    imageNames:
      - app1Api
      - app2Api
    pathToSolutionFiles: 
      - App1API/App1API.sln
      - App2API/App2API.sln
    pathToDockerfiles:
      - App1API/App1API/Dockerfile
      - App2API/App2API/Dockerfile

builds.yml

parameters:
- name: imageNames
  type: object
- name: pathToSolutionFiles
  type: object
- name: pathToDockerfiles
  type: object

steps:
- ${{ each imageName in parameters.imageNames }}:
  - task: DotNetCoreCLI@2
    displayName: 'dotnet build'
    inputs:
      projects: "$(pathToSolutionFiles)"
  - task: Docker@1
    displayName: 'Build Image $(imageName)'
    inputs:
      dockerFile: $(pathToDockerfiles)
 ...

Unfortunately, ADO YAML syntax doesn’t let you index into 3 object parameters at the same time the way you might in C# (pseudocode below).

// Walk the three parallel arrays with a single index
for (int i = 0; i < parameters.imageNames.Length; i++) {
  var imageName = parameters.imageNames[i];
  var pathToSolutionFile = parameters.pathToSolutionFiles[i];
  var pathToDockerfile = parameters.pathToDockerfiles[i];
  // ... build and dockerize using all three values
}

Therefore, you need to change how you pass the data to the template: combine the three parallel arrays into a single array of objects, where each entry carries all of the values for one build.

From there you can take either a sequential approach (one job whose steps are generated with an each loop) or a parallel approach (a matrix strategy that fans out into one job per entry).

Sequential

In the sequential approach, you use the each keyword to have the ADO YAML generated for you at template-expansion time. Each set of steps is appended after the previous set, one per entry in the array, and the steps run sequentially within a single job.

template-build.yml

parameters:
  - name: builds
    type: object
  
steps:
- ${{ each build in parameters.builds }}:
  - task: DotNetCoreCLI@2
    displayName: dotnet build
    inputs:
      projects: '${{ build.pathToSolutionFile }}'
  - task: Docker@1
    displayName: Build Image ${{ build.imageName }}
    inputs:
      dockerFile: '${{ build.pathToDockerfile }}'
...

You write the template against the 3 properties you expect each entry in the array to have.

In the calling pipeline, you pass a single complex parameter: an array of objects, where each entry contains those 3 properties (imageName, pathToSolutionFile, pathToDockerfile).

azure-pipeline.yml

- job: jobSequential
  steps:
    - template: template-build.yml
      parameters:
        builds:
          - imageName: app1Api
            pathToSolutionFile: App1API/App1API.sln
            pathToDockerfile: App1API/App1API/Dockerfile
          - imageName: app2Api
            pathToSolutionFile: App2API/App2API.sln
            pathToDockerfile: App2API/App2API/Dockerfile
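
When the pipeline is compiled, the each loop is expanded, so the steps the job actually runs end up roughly like this (values substituted from each entry of builds):

steps:
- task: DotNetCoreCLI@2
  displayName: dotnet build
  inputs:
    projects: 'App1API/App1API.sln'
- task: Docker@1
  displayName: Build Image app1Api
  inputs:
    dockerFile: 'App1API/App1API/Dockerfile'
- task: DotNetCoreCLI@2
  displayName: dotnet build
  inputs:
    projects: 'App2API/App2API.sln'
- task: Docker@1
  displayName: Build Image app2Api
  inputs:
    dockerFile: 'App2API/App2API/Dockerfile'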

Parallel

In this option, instead of writing the each loop yourself, you use a matrix strategy to generate one job per entry, and those jobs run in parallel instead of sequentially.

template-build.yml

parameters:
- name: imageName
  type: string
- name: pathToSolutionFile
  type: string
- name: pathToDockerfile
  type: string
  
steps:
- task: DotNetCoreCLI@2
  displayName: dotnet build
  inputs:
    projects: '${{ parameters.pathToSolutionFile }}'
- task: Docker@1
  displayName: Build Image ${{ parameters.imageName }}
  inputs:
    dockerFile: '${{ parameters.pathToDockerfile }}'

In this option, the template-build.yml file contains the steps for a single job, parameterized by the 3 string values.

azure-pipeline.yml

- job: jobMatrix
  strategy:
    matrix:
      buildApp1:
        imageName: app1Api
        pathToSolutionFile: App1API/App1API.sln
        pathToDockerfile: App1API/App1API/Dockerfile
      buildApp2:
        imageName: app2Api
        pathToSolutionFile: App2API/App2API.sln
        pathToDockerfile: App2API/App2API/Dockerfile
  steps:
    - template: template-build.yml
      parameters:
        imageName: $(imageName)
        pathToSolutionFile: $(pathToSolutionFile)
        pathToDockerfile: $(pathToDockerfile)

The calling pipeline uses the matrix keyword to generate a copy of the job for each entry (buildApp1 and buildApp2), and each entry supplies the 3 values the template needs. Note that these are passed as runtime variables: ${{ parameters.imageName }} expands to the literal text $(imageName) when the template is processed, and the tasks resolve that variable when the job runs.
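
Because the matrix fans out into separate jobs, each copy can run on its own agent. If you want to cap how many run at the same time, the strategy block also accepts maxParallel; a minimal sketch based on the example above:

strategy:
  maxParallel: 2   # run at most 2 of the matrix jobs at a time
  matrix:
    buildApp1:
      imageName: app1Api
      pathToSolutionFile: App1API/App1API.sln
      pathToDockerfile: App1API/App1API/Dockerfile
    buildApp2:
      imageName: app2Api
      pathToSolutionFile: App2API/App2API.sln
      pathToDockerfile: App2API/App2API/Dockerfile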
