Databricks
Note: You must have the necessary Roles associated with your User Profile
Prerequisites
To connect to Databricks, you need:
An existing PAT token or OAuth credentials with the appropriate permissions
USE CATALOG on any of the catalogs you want to create assets from
USE SCHEMA on any relevant schemas
SELECT (at the catalog, schema, or table/view level, depending on the desired granularity)
READ VOLUME (as above - at the catalog, schema, or volume level)
The 'Workspace access' and 'Databricks SQL access' entitlements if a new user is being created for this purpose (these usually default to ON when creating the user/identity from inside the workspace)
A Databricks Managed Service Account is required - Entra accounts are not supported (more information)
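The privileges above map directly to Unity Catalog GRANT statements. The sketch below builds them for a chosen granularity; the principal, catalog, schema, table, and volume names are placeholders, and granting at a higher level (catalog or schema) is equally valid.

```python
def grant_statements(principal, catalog, schema=None, table=None, volume=None):
    """Build the Unity Catalog GRANT statements for the prerequisites above.

    Pass only the levels you need: catalog-only for broad access,
    or schema/table/volume for finer granularity.
    """
    stmts = [f"GRANT USE CATALOG ON CATALOG {catalog} TO `{principal}`;"]
    if schema:
        stmts.append(f"GRANT USE SCHEMA ON SCHEMA {catalog}.{schema} TO `{principal}`;")
    if table:
        stmts.append(f"GRANT SELECT ON TABLE {catalog}.{schema}.{table} TO `{principal}`;")
    if volume:
        stmts.append(f"GRANT READ VOLUME ON VOLUME {catalog}.{schema}.{volume} TO `{principal}`;")
    return stmts

# Example: table-level granularity for a hypothetical service account
for stmt in grant_statements("svc-connector@example.com", "main", "sales", table="orders"):
    print(stmt)
```

Run these in a Databricks SQL editor as a user with the MANAGE (or owner) privilege on the relevant securables.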
Steps to create a Connector
Go to MANAGE on the Navigation bar
Select Connectors to view the Manage Connectors screen
Click the Create connector button at the top right
Enter a Name for the connector and a Description (optional)
Select Databricks as the Type of Connector
Select Authentication
Use either a PAT token from Developer settings in the user area in Databricks,
or OAuth using a service account
Enter Host - this is the workspace URL, e.g.
https://adb-1535878582058128.8.azuredatabricks.net (check that there is no trailing '/'). The service account must also have permission to Use Tokens (for more details, see https://learn.microsoft.com/en-us/azure/databricks/security/auth/api-access-permissions#get-all-permissions)
Add any Integration Metadata * needed for programmatic integrations
The key will be 'httpPath' (case-sensitive)
For the value, navigate to the Databricks workspace, select 'Compute' on the left-hand menu,
then the 'SQL Warehouses' tab. Either select an existing SQL Warehouse you'd like to be used during
the asset copy step, or create a new one. For settings, there is no recommended minimum Cluster Size, but Small or Medium should be sufficient. The Platform will start and stop the warehouse, but feel free to configure an Auto Stop. No Scaling is required.
If using a Service Account for connectivity from the Platform, ensure the Service Account has permissions on the Warehouse.
Click into the Warehouse, navigate to 'Connection Details', and copy the httpPath value, e.g. '/sql/1.0/warehouses/7cff6770269b80c7'
Add this httpPath as the value for the Integration Metadata. This warehouse will be used when loading data from your Databricks instance.
Click the Create button to create your Connector
Those items marked with an * can be edited after the Connector has been created
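The Host and httpPath values are easy to get subtly wrong (a trailing '/' on the host, or a mistyped warehouse path). A small illustrative sketch of the checks described above; the function names are ours, not part of the product:

```python
import re

def normalize_host(url: str) -> str:
    """Strip the trailing '/' the instructions warn about and sanity-check the scheme."""
    url = url.rstrip("/")
    if not url.startswith("https://"):
        raise ValueError("Host should be the https:// workspace URL")
    return url

def check_http_path(path: str) -> bool:
    """The httpPath for a SQL Warehouse looks like /sql/1.0/warehouses/<hex-id>."""
    return re.fullmatch(r"/sql/1\.0/warehouses/[0-9a-f]+", path) is not None

print(normalize_host("https://adb-1535878582058128.8.azuredatabricks.net/"))
# -> https://adb-1535878582058128.8.azuredatabricks.net
print(check_http_path("/sql/1.0/warehouses/7cff6770269b80c7"))
# -> True
```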
Initiating asset updates
Before a Databricks connector can update assets, certain identity configuration must first take place.
The service account must be a user (or in a group) that has been added to your Databricks workspace. This ensures it has a valid identity to authenticate and receive the necessary entitlements.
Ensure the service account has the Databricks SQL Access entitlement enabled for the workspace. Without this, it cannot use SQL Warehouses to run queries.
In addition, the following Catalog & Table permissions must apply:
USE CATALOG on each catalog that contains the data to be read.
USE SCHEMA on the relevant schemas so the account can explore them.
SELECT on each table from which data will be read, so it can run queries against those tables.
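To confirm those permissions are actually in place, you can run SHOW GRANTS against each securable in the chain. A sketch that builds the verification statements (the names are placeholders):

```python
def show_grants_statements(catalog: str, schema: str, table: str) -> list:
    """SHOW GRANTS statements for the catalog, schema, and table levels above."""
    return [
        f"SHOW GRANTS ON CATALOG {catalog};",
        f"SHOW GRANTS ON SCHEMA {catalog}.{schema};",
        f"SHOW GRANTS ON TABLE {catalog}.{schema}.{table};",
    ]

for stmt in show_grants_statements("main", "sales", "orders"):
    print(stmt)
```

Running each statement in a Databricks SQL editor should show the service account (or one of its groups) with the corresponding privilege.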
Finally, a Databricks SQL Warehouse must be created (or already exist) for the platform to query your data. To set this up, follow this guide.
To access a SQL Warehouse, the service account needs permission to attach to or run queries using that particular SQL Warehouse. Note that in the image below, 'ws-az-demo-admin' is the SQL warehouse, with permissions set to 'can use'.
You can view the status of your data product updates at any time.
Summary: The service account must (1) be recognized in your workspace, (2) have Databricks SQL Access, (3) hold USE CATALOG & SELECT privileges on relevant data, (4) have permission to query the designated SQL Warehouse, and (5) supply a valid token or principal credential.
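The five summary points can be treated as a simple pre-flight checklist before creating the connector. This sketch is purely illustrative; the field names are ours and the values would come from your own verification of each item:

```python
def preflight(config: dict) -> list:
    """Return the summary-checklist items that are still missing."""
    checks = {
        "workspace_user": "service account added to the workspace",
        "sql_access": "Databricks SQL Access entitlement",
        "data_grants": "USE CATALOG / USE SCHEMA / SELECT grants",
        "warehouse_use": "'can use' permission on the SQL Warehouse",
        "credential": "valid PAT token or OAuth credential",
    }
    return [desc for key, desc in checks.items() if not config.get(key)]

# Example: everything verified except the warehouse permission
missing = preflight({"workspace_user": True, "sql_access": True,
                     "data_grants": True, "warehouse_use": False,
                     "credential": True})
print(missing)
# -> ["'can use' permission on the SQL Warehouse"]
```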