# Azure pipeline variables and parameters

## Variable

### Variable scope
When we set a variable from a script, the new variable is only available from the next step, not in the step where it is defined.
```yaml
variables:
  sauce: orange

steps:
# Create a variable
- bash: |
    echo "##vso[task.setvariable variable=sauce]crushed tomatoes" # remember to use double quotes
    echo "inside the same step, sauce: $(sauce)"
# Use the variable
# "$(sauce)" is replaced by the contents of the `sauce` variable by Azure Pipelines
# before handing the body of the script to the shell.
- bash: |
    echo "from the next step, sauce: $(sauce)"
```
The result will be: the first step still prints `sauce: orange`, because `$(sauce)` was expanded before the script ran and set the new value, while the next step prints `sauce: crushed tomatoes`.
### Json Variable
A parameter can have the object type (like a dict in Python), but a variable cannot. The workaround is to assign a raw JSON string to the variable and use a tool like `jq` to parse it at runtime. The JSON string variable must follow a special format: the double quotes must be escaped, and the whole string must be enclosed in single quotes.
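A minimal sketch of this workaround (the variable name `packages`, its keys, and the unescaping step are illustrative, not from the original; `jq` is assumed to be available on the agent):

```yaml
variables:
  # double quotes escaped, whole value wrapped in single quotes, as described above
  packages: '{\"pip\": [\"requests\", \"pyyaml\"]}'

steps:
- bash: |
    # at runtime, strip the escaping so jq can parse the JSON
    packages_json=$(echo '$(packages)' | sed 's/\\"/"/g')
    echo "$packages_json" | jq -r '.pip[]'
  displayName: list pip packages
```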
## Parameter

### String parameter
For a string parameter with an empty string `""` as its default value, in a bash script task we can use `if [[ -n $PARAM_NAME ]]; then` to check whether a value was provided. In bash, `-n` returns true (0) if the string exists and is not empty.
```yaml
parameters:
- name: paramName
  type: string
  default: ""

steps:
- script: |
    if [[ -n $PARAM_NAME ]]; then
      echo "PARAM_NAME is set with a value: $PARAM_NAME"
    fi
  displayName: check paramName
  failOnStderr: true
  env:
    PARAM_NAME: ${{ parameters.paramName }}
```
### Boolean parameter

- In pipeline YAML syntax, we compare the value against YAML's Boolean type `true` or `false`.
- In a bash script, we should compare it against its string form `True` or `False`.
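A sketch of both comparisons (the parameter name `dryRun` is illustrative, not from the original):

```yaml
parameters:
- name: dryRun
  type: boolean
  default: false

steps:
# Template expression: compare against YAML booleans at compile time
- ${{ if eq(parameters.dryRun, true) }}:
  - script: echo "dry run enabled"

# Bash at runtime: the expanded value is the string "True" or "False"
- bash: |
    if [[ "$DRY_RUN" == "True" ]]; then
      echo "dry run enabled"
    fi
  env:
    DRY_RUN: ${{ parameters.dryRun }}
```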
### Object parameter

A parameter with the `object` type can take any YAML structure. If it holds an array/list type, we can use `${{ each element in parameters.elements }}` to loop through it. But if it holds a mapping/dict type, it is not so easy: Microsoft hasn't provided any official docs (and this one) on how to use complex parameters with the native pipeline syntax, and my tests with different approaches failed too. Fortunately, for a mapping/dict parameter, we can work around it by doing some transformation in a script task with `convertToJson`, like: `echo '${{ convertToJson(parameters.elements) }}'`
Warning

We must use single quotes around the `convertToJson` expression. If we use double quotes, the double quotes inside the JSON data will be stripped from the output, producing invalid JSON.
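A sketch for a mapping/dict parameter (the parameter name `settings` and its keys are illustrative, and `jq` is assumed to be available on the agent):

```yaml
parameters:
- name: settings
  type: object
  default:
    region: eastus
    replicas: 3

steps:
- script: |
    # single quotes around convertToJson, per the warning above
    settings_json=$(echo '${{ convertToJson(parameters.settings) }}' | jq -c)
    echo "region: $(echo "$settings_json" | jq -r '.region')"
  displayName: read dict parameter
```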
### Loop through parameters
We can loop through parameters with:
```yaml
steps:
- ${{ each parameter in parameters }}:
  - script: echo ${{ parameter.Key }}
  - script: echo ${{ parameter.Value }}
```
The above example, provided by the official docs, loops through the parameters script by script: the pipeline will show as many tasks as there are parameters, which looks a bit heavy. Hereunder is how to iterate over all the parameters in a single script.
```yaml
# Suppose the below steps are defined in a template which takes a parameter
# named `parameters`, so we can reuse it in any other pipeline.
parameters:
- name: parameters
  displayName: parameters
  type: object

steps:
- script: |
    parameters_in_json=$(echo '${{ convertToJson(parameters.parameters) }}' | jq -c)
    echo "##vso[task.logissue type=warning]parameters: $parameters_in_json"
  displayName: echo parameters
```
The above example uses only one script to iterate over all the parameters and pipes them to `jq`; as long as `jq` can handle the parameters, we can do anything with them. Here we use `jq -c` to compact all the parameters into single-line JSON, which displays better with `##vso[task.logissue type=warning]`, as it takes only one line.
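If we need to act on each key/value pair inside that single script, `jq`'s `to_entries` filter can split the JSON back out. A sketch extending the step above (the loop body is illustrative):

```yaml
steps:
- script: |
    parameters_in_json=$(echo '${{ convertToJson(parameters.parameters) }}' | jq -c)
    # to_entries turns {"a": 1} into [{"key": "a", "value": 1}]
    echo "$parameters_in_json" | jq -r 'to_entries[] | "\(.key)=\(.value)"' | while read -r pair; do
      echo "parameter: $pair"
    done
  displayName: echo each parameter
```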