
Databricks job/task context

Suppose we're running the following job/task in an Azure Databricks workspace:

jobId: "1111"
jobRunId: "2222"
taskRunId: "3333"
jobName: "the job name"
taskName: "first-task"
databricksWorkspaceUrl: https://adb-4444444444.123.azuredatabricks.net/

To retrieve this context, run the command below inside a Databricks job (more precisely, inside a task):

dbutils.notebook.entry_point.getDbutils().notebook().getContext().toJson()
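To pull individual fields out of that JSON on the driver, you can parse it with the standard `json` module. The sketch below is illustrative only: the tag names used here (`jobId`, `multitaskParentRunId`, `runId`, `taskKey`) are assumptions about a typical context payload and may differ between Databricks runtime versions, so check them against the actual output in your workspace.

```python
import json

# Hypothetical context JSON, shaped like what
# getContext().toJson() can return. The field names are
# assumptions -- inspect the real output in your workspace.
context_json = """
{
  "tags": {
    "jobId": "1111",
    "multitaskParentRunId": "2222",
    "runId": "3333",
    "jobName": "the job name",
    "taskKey": "first-task"
  }
}
"""

context = json.loads(context_json)
tags = context.get("tags", {})

# Map the (assumed) tag names onto the values from the example above.
job_id = tags.get("jobId")
job_run_id = tags.get("multitaskParentRunId")
task_run_id = tags.get("runId")
task_name = tags.get("taskKey")

print(job_id, job_run_id, task_run_id, task_name)
```

In a real task you would replace `context_json` with the string returned by `dbutils.notebook.entry_point.getDbutils().notebook().getContext().toJson()`.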

Azure pipeline: check out a repository from another project

Context

This post extends my previous post on reusing variables and templates.

In fact, in addition to variables and templates, I also need to reuse some files that are not native Azure pipeline YAML, for example Python scripts defined in the shared template repository. If we use the same technique shown in the previous post, the pipeline throws an error saying that it cannot find the Python script. This is because YAML templates are resolved at compile time, whereas other files only exist on the agent after a checkout: we need to check out the remote repository first before using its non-YAML files.
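A minimal sketch of what that checkout can look like, assuming a shared repository named `shared-templates` in a project named `OtherProject` containing a script at `scripts/setup.py` (all three names are hypothetical placeholders for your own):

```yaml
resources:
  repositories:
    # Declare the remote repository (hypothetical project/repo names).
    - repository: templates              # alias used by the checkout step
      type: git                          # Azure Repos Git in the same organization
      name: OtherProject/shared-templates
      ref: refs/heads/main

steps:
  # With multiple checkout steps, each repository lands in its own
  # folder under $(Pipeline.Workspace)/s/.
  - checkout: self
  - checkout: templates

  # Now the non-YAML files from the shared repository are on the agent.
  - script: python $(Pipeline.Workspace)/s/shared-templates/scripts/setup.py
    displayName: Run a script from the shared repository
```

Note that YAML templates referenced via `extends` or `template:` work without this step, because they are expanded at compile time; the explicit `checkout` is only needed for files the job reads at run time.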