Updating Cluster Libraries for Databricks

You may need to update the libraries used by a Databricks cluster in the following scenarios:

  • A vulnerability is reported for a specific library version that is currently being used by the cluster.

  • You are using a Databricks cluster to run a job that requires a specific library version. (Wherever possible, Lazsa recommends that you use a job cluster in such a scenario, as illustrated in the sketch that follows this list.)
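
The reason a job cluster is preferred here is that a Databricks job can declare the exact library version it needs on its own job cluster. The following is a minimal sketch, independent of the Lazsa workflow described below, of such a pinned library in a Databricks Jobs API (2.1) request; the job name, notebook path, node type, and the requests==2.31.0 pin are placeholder values, and the DATABRICKS_HOST and DATABRICKS_TOKEN environment variables are assumed to be set.

    # Illustrative sketch only: a Databricks job whose job cluster pins a specific
    # library version. All names and values below are placeholders.
    import os

    import requests

    host = os.environ["DATABRICKS_HOST"]    # e.g. https://<workspace>.cloud.databricks.com
    token = os.environ["DATABRICKS_TOKEN"]  # personal access token

    job_spec = {
        "name": "example-job-with-pinned-library",
        "tasks": [
            {
                "task_key": "transform",
                "notebook_task": {"notebook_path": "/Workspace/example/transform"},
                "new_cluster": {  # job cluster created for each run
                    "spark_version": "13.3.x-scala2.12",
                    "node_type_id": "i3.xlarge",
                    "num_workers": 2,
                },
                # The library version the job needs, pinned on the job cluster.
                "libraries": [{"pypi": {"package": "requests==2.31.0"}}],
            }
        ],
    }

    resp = requests.post(
        f"{host}/api/2.1/jobs/create",
        headers={"Authorization": f"Bearer {token}"},
        json=job_spec,
    )
    resp.raise_for_status()
    print("Created job:", resp.json()["job_id"])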

To update libraries for a Databricks cluster

  1. Sign in to the Lazsa Platform and click Configuration in the left navigation pane.
  2. On the Platform Setup screen, on the Cloud Platform, Tools & Technologies tile, click Configure.
  3. On the Cloud Platform, Tools & Technologies screen, in the Data Integration section, click Modify.
  4. Click the ellipsis (...) on the connection, and then click Edit.

    Edit Databricks Cluster Configuration

  5. On the Databricks screen, click View Details or Edit.

    Edit Details of Databricks Configuration

  6. Click Custom Libraries and edit the versions of the required libraries. Click Save.

    Edit Custom Libraries

    Depending on the installation progress of the libraries, one of the following statuses is displayed in the Status column:

    • Pending

    • Installed

    • Failed

  7. If the library installation fails, or if for any other reason you need to revert to a previous or the default library version, you can do so in either of the following ways:

    • Click the ellipsis (...) and then click Reset to previous version.

    • Click the ellipsis (...) and then click Reset to default version.
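
If you also have direct access to the Databricks workspace, the library state behind the Pending, Installed, and Failed statuses can be inspected, and a previously working version reinstalled, through the Databricks Libraries API. The following is a minimal sketch of that approach, independent of the Lazsa workflow above; the cluster ID and the example-lib package pins are placeholder values.

    # Illustrative sketch only: inspect library status on a cluster and fall back
    # to a previously working version via the Databricks Libraries API. The
    # cluster ID and package pins below are placeholders.
    import os

    import requests

    host = os.environ["DATABRICKS_HOST"]
    headers = {"Authorization": f"Bearer {os.environ['DATABRICKS_TOKEN']}"}
    cluster_id = "0123-456789-abcde123"  # placeholder cluster ID

    # 1. Check installation status (PENDING/INSTALLING, INSTALLED, FAILED, ...).
    status = requests.get(
        f"{host}/api/2.0/libraries/cluster-status",
        headers=headers,
        params={"cluster_id": cluster_id},
    ).json()
    for lib in status.get("library_statuses", []):
        print(lib["library"], "->", lib["status"])

    # 2. If the new version failed, remove it and reinstall the previous pin.
    failed = {"pypi": {"package": "example-lib==2.0.0"}}    # version that failed
    previous = {"pypi": {"package": "example-lib==1.9.1"}}  # known-good version

    uninstall = requests.post(
        f"{host}/api/2.0/libraries/uninstall",
        headers=headers,
        json={"cluster_id": cluster_id, "libraries": [failed]},
    )
    uninstall.raise_for_status()

    install = requests.post(
        f"{host}/api/2.0/libraries/install",
        headers=headers,
        json={"cluster_id": cluster_id, "libraries": [previous]},
    )
    install.raise_for_status()
    # Note: libraries uninstalled through this API are removed only after the
    # cluster is restarted.

In the Lazsa UI, the Reset to previous version and Reset to default version actions described in step 7 cover this fallback without requiring direct API access.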

Recommended Topics

What's next? Databricks Custom Transformation Job