Blog
Databricks job/task context
Suppose we're running the following job/task in an Azure Databricks workspace:
jobId: "1111"
jobRunId: "2222"
taskRunId: "3333"
jobName: "the job name"
taskName: "first-task"
databricksWorkspaceUrl: https://adb-4444444444.123.azuredatabricks.net/
Run the command below in a Databricks job (more precisely, in a task):
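A minimal sketch, assuming a Python notebook task and using the unofficial `dbutils` notebook context (an undocumented API, so the exact tag names can vary by runtime version):

```python
import json

# `dbutils` is injected into Databricks notebooks automatically.
# The notebook context carries the job/task metadata as tags.
ctx = json.loads(
    dbutils.notebook.entry_point.getDbutils().notebook().getContext().toJson()
)
tags = ctx.get("tags", {})

# With the example values above we would expect roughly:
#   tags["jobId"] -> "1111"
#   tags["runId"] -> "3333"   (the task run; the parent job run id "2222"
#                              shows up under a tag like "multitaskParentRunId")
print(json.dumps(tags, indent=2))
```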
Azure pipeline conditions
Azure Pipelines has two kinds of conditions:
- With the `condition` keyword
- With the Jinja-like compile-time expression `${{ if … elseif … else }}`
Both syntaxes can use parameters and variables, but there's a big difference between them, which often frustrates DevOps engineers.
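A minimal sketch of the two syntaxes (the `deployEnv` parameter and the echo steps are made up):

```yaml
parameters:
  - name: deployEnv
    type: string
    default: dev

steps:
  # Runtime condition: evaluated just before the step runs, so runtime
  # variables such as Build.SourceBranchName are available.
  - script: echo "deploying"
    condition: and(succeeded(), eq(variables['Build.SourceBranchName'], 'main'))

  # Compile-time expression: evaluated while the YAML is expanded, so only
  # parameters and statically known variables can be used here.
  - ${{ if eq(parameters.deployEnv, 'prod') }}:
      - script: echo "prod-only step"
  - ${{ else }}:
      - script: echo "non-prod step"
```

The key distinction is timing: `${{ }}` expressions are resolved when the YAML is expanded, while `condition` is evaluated at run time.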
Azure Pipeline Checkout Multiple Repositories
This post discusses the values of some Azure Pipelines predefined variables when checking out multiple repositories. The official doc is here.
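For reference, a minimal multi-checkout sketch (the `tools` alias and repository name are hypothetical) showing where those predefined variables come into play:

```yaml
resources:
  repositories:
    - repository: tools            # alias, hypothetical
      type: git
      name: MyProject/tools-repo

steps:
  - checkout: self
  - checkout: tools
  # With a single checkout, code lives directly in $(Build.SourcesDirectory);
  # with multiple checkouts, each repo gets its own folder under
  # $(Pipeline.Workspace)/s, which changes what some predefined variables point to.
  - script: |
      echo "Build.SourcesDirectory = $(Build.SourcesDirectory)"
      echo "Build.Repository.Name  = $(Build.Repository.Name)"
      ls "$(Pipeline.Workspace)/s"
```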
Manage Azure Databricks Service Principal
Most Databricks management can be done from the GUI or the CLI, but an Azure service principal can only be managed through the SCIM API. There's an open PR to add SCIM API support to the Databricks CLI, but its latest update dates back to the beginning of 2021.
This post adds some tips that are not covered by the official API docs.
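As one illustration, adding an Azure service principal through the documented SCIM endpoint `/api/2.0/preview/scim/v2/ServicePrincipals` might look like the sketch below (the token, application ID, and display name are placeholders):

```python
import requests

WORKSPACE_URL = "https://adb-4444444444.123.azuredatabricks.net"  # from the context above
TOKEN = "<personal-access-token>"  # placeholder; use a real PAT or AAD token
HEADERS = {
    "Authorization": f"Bearer {TOKEN}",
    "Content-Type": "application/scim+json",
}

# Add an Azure service principal to the workspace. `applicationId` is the
# Azure AD application (client) ID of the service principal.
payload = {
    "schemas": ["urn:ietf:params:scim:schemas:core:2.0:ServicePrincipal"],
    "applicationId": "00000000-0000-0000-0000-000000000000",  # placeholder
    "displayName": "my-service-principal",                    # placeholder
}
resp = requests.post(
    f"{WORKSPACE_URL}/api/2.0/preview/scim/v2/ServicePrincipals",
    headers=HEADERS,
    json=payload,
)
resp.raise_for_status()
print(resp.json())
```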
Azure pipeline checkout repository from another project
Context
This post can be seen as an extension of my previous post on reusing variables and templates.
In fact, in addition to variables and templates, I also need to reuse some files that are not native Azure Pipelines YAML, for example some Python scripts defined in the shared template repository. If we use the same technique shown in the previous post, the pipeline throws an error saying that it cannot find the Python script. This is because the remote repository must be checked out first before its non-YAML files can be used; native pipeline YAML templates, by contrast, are resolved at compile time without a checkout.
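A minimal sketch, assuming hypothetical project and repository names: declare the remote repository as a resource, check it out, and only then call its Python script:

```yaml
resources:
  repositories:
    - repository: shared                    # alias, hypothetical
      type: git                             # Azure Repos Git
      name: OtherProject/shared-templates   # <project>/<repository>

steps:
  # With multiple checkout steps, each repo lands under $(Pipeline.Workspace)/s/<repo>.
  - checkout: self
  - checkout: shared
  # Only after the checkout does the Python file exist on the agent.
  - script: python $(Pipeline.Workspace)/s/shared-templates/scripts/my_script.py
```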
Azure pipeline reuse variables in template from another repository
Context
In my project, I have several Azure pipelines that share the same variables. Instead of declaring them in each pipeline, I would like to refactor this by storing the shared variables in one central place.
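A minimal sketch with hypothetical names: the shared repository holds a variables template, and each pipeline pulls it in with the `@alias` syntax:

```yaml
# vars/common.yml in MyProject/pipeline-library (hypothetical content):
#   variables:
#     sharedEnv: dev
#     sharedRegion: westeurope

resources:
  repositories:
    - repository: shared               # alias used in the @ reference below
      type: git
      name: MyProject/pipeline-library
      ref: refs/heads/main

variables:
  # Variable templates are resolved at compile time, so no checkout is needed.
  - template: vars/common.yml@shared

steps:
  - script: echo "env=$(sharedEnv) region=$(sharedRegion)"
```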