ADO Pipelines
ADO Pipelines is a build automation service that manages and controls software delivery processes throughout the entire lifecycle. It orchestrates tasks including build, test, package, deployment, and more. (docs)
- Well-integrated tool within the ADO suite and Azure ecosystem
- Builds, tests, and deploys software continuously and monitors the execution and status of jobs
- Trigger execution automatically on code commits
- Runs tasks on Azure's hosted agents, custom virtual machines, or ephemeral containers
YAML: Pipeline as Code
ADO originally offered pipelines only through a GUI editor. In 2018, ADO released YAML pipelines to provide pipeline-as-code, while keeping the legacy Classic and Release pipeline formats. YAML is the recommended approach, as it supports the modern DevOps way of working by keeping the pipeline file within a Git repository.
```yaml
pool:
  vmImage: 'ubuntu-latest'

steps:
- task: DotNetCoreCLI@2
  inputs:
    command: 'build'
- task: DotNetCoreCLI@2
  inputs:
    command: 'test'
- task: Fortify@2
- task: UniversalPackages@0
  inputs:
    command: 'publish'
    publishDirectory: '$(Build.ArtifactStagingDirectory)'
```
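By default, a push to any branch starts a YAML pipeline. A `trigger:` block narrows this to specific branches or paths; a minimal sketch (the branch and path names are illustrative):

```yaml
# Run only on commits to main, and skip documentation-only changes
trigger:
  branches:
    include:
    - main
  paths:
    exclude:
    - docs/*
```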
Benefits of pipeline as code
- Versioned changes to the pipeline
- Supports everything-as-code software delivery
- Easy rollbacks to previous states
- Auditability and traceability of all changes
- Supports ticket-based engineering, tying code changes to work items
Organization
Pipelines are organized by stages, jobs, and tasks. Working from the inside out, a task (docs) is the code block containing the execution logic, like build or test. A task is the building block for defining automation in a pipeline: simply a packaged script or procedure that has been abstracted with a set of inputs. ADO Pipelines provides built-in tasks for the most common needs. When no pre-defined task exists, developers can write their own custom script.
```yaml
- task: DotNetCoreCLI@2
  inputs:
    command: 'build'
```
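When no built-in task fits, an inline `script` step runs arbitrary shell commands on the agent. A minimal sketch, with a hypothetical helper script:

```yaml
# Inline script step for logic no built-in task covers;
# the helper script path is hypothetical
- script: |
    echo "Installing custom tooling..."
    ./scripts/install-tools.sh
  displayName: 'Install custom tooling'
```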
Tasks are run on agents within a job (docs). Jobs are not a logical grouping, but rather a grouping based on agent configuration.
- If various tasks require different software installation or configuration, additional jobs may be required.
- As a job is run on a single agent, variables and artifacts persist only within a job and not between jobs. However, they can be passed from one job to another with output variables and pipeline artifacts (see the sketch after this list).
- Multiple jobs can be run in parallel.
- Jobs may be left undefined, in which case all steps run in a single implicit job.
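A sketch of passing a value between jobs with output variables; the job names, step name, and `buildTag` value are illustrative:

```yaml
jobs:
- job: Build
  steps:
  # isOutput=true exposes the variable to downstream jobs; the step must be named
  - script: echo "##vso[task.setvariable variable=buildTag;isOutput=true]v1.2.3"
    name: setTag
- job: Deploy
  dependsOn: Build
  variables:
    # map the output variable from the Build job into this job
    buildTag: $[ dependencies.Build.outputs['setTag.buildTag'] ]
  steps:
  - script: echo "Deploying $(buildTag)"
```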
Finally, a stage (docs) can be used for organizing jobs. For most application CI pipelines, stages are unnecessary. However, they can be helpful in non-conventional application pipelines or in deploy pipelines to differentiate deployments to various environments, as sketched below.
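A minimal sketch of a staged deploy pipeline; the stage and job names are illustrative:

```yaml
stages:
- stage: Build
  jobs:
  - job: BuildAndTest
    steps:
    - script: echo "build and test"
- stage: DeployDev
  dependsOn: Build
  jobs:
  - job: Deploy
    steps:
    - script: echo "deploy to the dev environment"
```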
Pipeline Editor
To modify the pipeline, ADO has a feature-rich, easy-to-use pipeline editor that brings source control and pipeline tasks together. It is recommended to use this editor to easily add and modify built-in pipeline tasks. Like all code changes, these should be committed to a branch, tied to a work item, and merged through a pull request.
Templates: DRY, Standards, Innersourcing
As pipelines are adopted for more and more projects in an organization, common patterns are likely to emerge. Often it is useful to share parts of pipeline code between various projects to reduce redundancies, keep code simple, and maintain standards across the enterprise.
For this reason, pipeline code can be shared through templates, which can be defined in external source control repositories and called by passing in a set of parameters.
```yaml
- template: terraform-templates/plan.yml
  parameters:
    environments: dev
```
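When the template lives in a separate repository, that repository must be declared as a resource, and the template reference gains an `@alias` suffix. A minimal sketch, assuming a hypothetical `PlatformTeam/pipeline-templates` repository:

```yaml
resources:
  repositories:
  - repository: templates                  # alias used when referencing templates
    type: git
    name: PlatformTeam/pipeline-templates  # hypothetical project/repository
    ref: refs/tags/v1.0                    # pin a version for reproducible runs

steps:
- template: terraform-templates/plan.yml@templates
  parameters:
    environments: dev
```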
Templates are very versatile: in addition to organization-level sharing, they provide DRY techniques within your own project, across all of its pipeline files.
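On the template side, parameters are declared at the top of the file and consumed with `${{ parameters.* }}` template expressions. A sketch of what a hypothetical `terraform-templates/plan.yml` matching the call above might contain:

```yaml
# terraform-templates/plan.yml -- hypothetical template matching the call above
parameters:
- name: environments
  type: string
  default: dev

steps:
- script: terraform plan -var-file="${{ parameters.environments }}.tfvars"
  displayName: 'Terraform plan (${{ parameters.environments }})'
```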