Update Workflows and Clusters ♻️

Databricks

Introduction

This notebook updates clusters and workflows (jobs) in the current workspace. It works by fetching the existing cluster and workflow configurations, applying the required changes to those configurations, and then pushing the updated configurations back to the same workspace.
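As a rough illustration of the fetch step, the sketch below pulls the current cluster and job configurations through the Databricks REST API. This is not the notebook's exact code: the environment variables and the way credentials are read are assumptions, and inside a Databricks notebook the host and token would typically come from the notebook context or dbutils instead.

```python
# Minimal sketch (illustrative, not the notebook's exact code): fetch cluster
# and job configs from the current workspace via the Databricks REST API.
# Assumes DATABRICKS_HOST and DATABRICKS_TOKEN are set in the environment.
import os
import requests

HOST = os.environ["DATABRICKS_HOST"]  # e.g. "https://<workspace>.cloud.databricks.com"
HEADERS = {"Authorization": f"Bearer {os.environ['DATABRICKS_TOKEN']}"}

# List every cluster config in the current workspace
clusters = requests.get(f"{HOST}/api/2.0/clusters/list", headers=HEADERS).json().get("clusters", [])

# List every job (workflow) config in the current workspace
jobs = requests.get(f"{HOST}/api/2.1/jobs/list", headers=HEADERS).json().get("jobs", [])

print(f"Fetched {len(clusters)} clusters and {len(jobs)} jobs")
```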

Use Cases

Areas where such a notebook may be helpful:

  1. Cluster management: the notebook can automate cluster updates, such as changing the cluster size, node type, or Spark version. This is useful for organizations that need to scale their clusters up or down dynamically, or keep them on the latest Spark releases (see the sketch after this list).
  2. Workflow management: the notebook can automate workflow updates, such as adding or removing tasks, reordering tasks, or changing task parameters. This is useful for organizations that change their workflows regularly, or that need to deploy new workflows to production quickly and reliably (see the sketch after this list).
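A minimal sketch of the update step for both use cases, continuing from the fetch sketch above (it reuses HOST, HEADERS, clusters, and jobs). The target Spark version and the max_concurrent_runs change are placeholder examples rather than the notebook's actual parsing logic, and in practice some read-only fields returned by clusters/list may need to be stripped before calling clusters/edit.

```python
# Continues from the fetch sketch above (reuses HOST, HEADERS, clusters, jobs).
TARGET_SPARK_VERSION = "14.3.x-scala2.12"  # assumed target runtime, adjust as needed

# Use case 1: cluster management — roll every cluster to the target Spark version.
for cluster in clusters:
    if cluster.get("spark_version") != TARGET_SPARK_VERSION:
        # clusters/edit expects a full cluster spec; read-only fields returned by
        # clusters/list may need to be dropped before resubmitting.
        cluster["spark_version"] = TARGET_SPARK_VERSION
        requests.post(f"{HOST}/api/2.0/clusters/edit", headers=HEADERS, json=cluster).raise_for_status()

# Use case 2: workflow management — fetch each job's settings, tweak, and push back.
for job in jobs:
    job_id = job["job_id"]
    settings = requests.get(f"{HOST}/api/2.1/jobs/get", headers=HEADERS,
                            params={"job_id": job_id}).json()["settings"]
    # Placeholder change: any task list or task parameter in `settings` could be parsed and edited here.
    settings["max_concurrent_runs"] = 1
    # jobs/update applies a partial update; jobs/reset would replace the whole job spec.
    requests.post(f"{HOST}/api/2.1/jobs/update", headers=HEADERS,
                  json={"job_id": job_id, "new_settings": settings}).raise_for_status()
```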

See the notebook (ipynb) for more details.