What is Azure DevOps?
Built by Microsoft, it’s a set of tools commonly used in IT projects: Wiki, Backlogs, Version Control System, Pipelines & Testing tools. This post focuses on Azure Pipelines.
Get to know the terminology
YAML
YAML is the name of the language used to define what happens in your pipeline. It stands for “YAML Ain’t Markup Language” and is best thought of as a set of instructions that can easily be read by humans but, more importantly, understood by machines to perform a series of automated tasks. It’s often used in configuration files and relies on indentation for its structure. This is one of the reasons I like it so much: it forces developers to write clean, readable code!
Here’s an example of what it looks like:
# This YAML automates the drawbridge at Hyrule Castle to avoid manual effort by the guards
# Authored by Adam Walker, Hyrule Consulting
pool:
  vmImage: 'ubuntu-latest'
steps:
- script: |
    echo "**** Lowering Drawbridge ****"
    ./gate.sh 'open'
  displayName: 'Initiate Gate Opening Sequence'
- script: |
    echo "**** Waiting for passers by to enter the castle ****"
    sleep 5m
  displayName: 'Wait for entry'
- script: |
    echo "**** Raising Drawbridge ****"
    ./gate.sh 'close'
  displayName: 'Initiate Gate Closing Sequence'
Pipeline Builds
As YAML is just a set of instructions, nothing actually works unless you have a machine reading the instructions and performing the steps. This is where “Builds” come in. When you trigger a Build, you are essentially using a machine to perform the instructions defined in the YAML. You can declare variables in the Build definition that can be read and used in the YAML steps. Builds can be triggered in a number of ways:
Manually, by clicking a button
On a schedule, by setting a time and frequency (a minimal sketch of this follows below)
Via a CI trigger, whenever someone changes the source code (an example is shown at the end of this post).
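To make the scheduled option concrete, here’s a minimal sketch of what it looks like in YAML. The cron expression and the master branch name are just assumptions for illustration, so adjust them to your own setup:

# A sketch only: run the Build every day at 03:00 UTC against the master branch
# (the cron expression and branch name are placeholders)
schedules:
- cron: '0 3 * * *'
  displayName: 'Nightly build'
  branches:
    include:
    - master
  always: true  # run even if nothing has changed since the last scheduled Build

The always: true flag makes the Build run even when there are no new commits; leave it out if you only want scheduled Builds when something has actually changed.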
YAML Schema
There are many keywords that can be used in the YAML which will be understood by Azure Pipelines when you run a Build. These keywords are designed to help you perform tasks and save time – we can’t go into all of them (as there are so many!) but check them out in the Microsoft Documentation. Here are a few I use in almost every pipeline I build:
pool
Put simply, this defines which machine will run your pipeline. You could use a Microsoft-hosted VM image that is spun up for you in seconds, or your own self-hosted machine (e.g. set up via Azure).
# I'm using a simple linux-based setup...
# ...but I could also use the latest windows installation i.e. 'windows-latest'
pool:
  vmImage: 'ubuntu-latest'
checkout
If you are doing something with the source code, e.g. deploying to Salesforce, you will need to ‘check out’ the code:
# Use Checkout in your steps
# By default, it checks out everything
steps:
- checkout: self
Let’s build the pipeline
There are a few things you need to do before we actually build the pipeline:
1. Set up a connected app. This pipeline uses simple JWT authentication, so create a connected app and upload a certificate to it.
2. Make sure your Azure Repo has all of the metadata and, most importantly, the package.xml in the root folder (src).
3. Save the private key that pairs with the certificate from step (1) as a .key file, then upload it into the Azure Pipelines secure library (Pipelines –> Library –> Secure Files). This is used in the JWT authentication. If you don’t have a key and certificate yet, one way to generate them is sketched below.
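Here’s one way to generate a self-signed key and certificate pair with OpenSSL – a sketch only, and the file names (server.key, server.csr, server.crt) are placeholders:

# Generate a private key, a certificate signing request and a self-signed certificate
# (file names are placeholders; answer the prompts with your own details)
openssl genrsa -out server.key 2048
openssl req -new -key server.key -out server.csr
openssl x509 -req -sha256 -days 365 -in server.csr -signkey server.key -out server.crt
# Upload server.crt to the connected app, and server.key to Secure Files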
Now that you have everything you need, let’s create a starter pipeline from scratch. Head over to Pipelines –> Builds –> New –> Azure Repo –> Select Repository –> Starter Pipeline.
Now paste the pipeline below into the YAML editor. Remember to add the variables listed in the comments by clicking the Variables button.
# Deploys Metadata from a branch to a target org
# Variables required in the Build definition:
# USERNAME : e.g. link_of_hylia@hyrulecastle.co.uk
# CONSUMER_ID : the connected app consumer (client) Id, used in the JWT auth
# INSTANCE_URL : e.g. https://test.salesforce.com
# Authored by Adam Walker, Hyrule Consulting
# I only want to trigger manually, so I set the trigger to none.
trigger: none
# Running this from a linux-based machine, MS Hosted. Simples!
pool:
  vmImage: 'ubuntu-latest'
# Here are my steps; note that I first need to check out the source code
steps:
- checkout: self
# I use the SFDX CLI to deploy, so the machine needs to have it installed
- script: npm install sfdx-cli --global
  displayName: 'Installing SFDX'
# The authentication is via JWT, so you need the private key of the certificate used
- task: DownloadSecureFile@1
  name: key
  inputs:
    secureFile: 'MyCertificate.key'
    retryCount: '5'
# Log in to the target org
- script: sfdx force:auth:jwt:grant -u $(USERNAME) -f $(key.secureFilePath) -i $(CONSUMER_ID) -r $(INSTANCE_URL)
  displayName: 'Authenticating'
# Submit the deployment to the target
# The src folder contains all of the metadata
# Make sure you have a package.xml in there too!
- script: sfdx force:mdapi:deploy -d src -l RunLocalTests -u $(USERNAME)
  displayName: 'Submit Deployment'
# Monitors the deployment through the DX CLI
# If you don't specify a job Id, it just uses the latest deployment. Neat!
- script: sfdx force:mdapi:deploy:report --wait=-1 -u $(USERNAME)
  displayName: 'Monitor Deployment'
# If the build gets cancelled for any reason, I want to cancel the actual deployment too!
# The canceled() condition means this step runs only when the build has been cancelled
- script: sfdx force:mdapi:deploy:cancel -u $(USERNAME)
  condition: canceled()
  displayName: 'Aborting Deployment'
That’s it! You’ve just built a deployment pipeline using Salesforce DX and Azure DevOps. In our example the build is triggered manually, but you could introduce the ‘CI Trigger’ we talked about earlier, which would deploy automatically whenever a commit lands on a specific branch (e.g. master), as sketched below.
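For example, swapping trigger: none for a branch filter would kick off the pipeline on every commit to master – a sketch, assuming master is your deployment branch:

# Run the Build automatically whenever a commit lands on master
trigger:
  branches:
    include:
    - master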