Running Federated Queries from Unity Catalog on Microsoft Fabric SQL Endpoint
In a previous post, I covered Microsoft Fabric and Azure Databricks Unity Catalog scenarios, highlighting the ability to run federated queries from Unity Catalog on top of the Microsoft Fabric SQL endpoint. This is a quick note on how to set up Lakehouse Federation to enable these federated queries, including the required connection settings.
Note: The views and experiences shared in this article are my own and do not represent the official positions of Microsoft or Databricks. Please note that the connection settings outlined here may change, and the connector may no longer function in the future, or a new connector may be introduced.
To connect to the Fabric SQL analytics endpoint using Unity Catalog lakehouse federation, you’ll need to create the following in your Azure Databricks Unity Catalog metastore:
- A connection to your Fabric SQL endpoint using the SQL Server connection type (an official Fabric connection type is not yet available).
- A foreign catalog that mirrors your Fabric SQL endpoint in Unity Catalog so that you can query underlying tables and data.
We’ll follow the same steps as for connecting to SQL Server, as detailed in Run federated queries on Microsoft SQL Server — Azure Databricks | Microsoft Learn.
Prerequisites
- Unity Catalog enabled workspace in Azure Databricks.
- Service Principal created in Azure. See Run federated queries on Microsoft SQL Server — Azure Databricks | Microsoft Learn.
- A Fabric workspace and a lakehouse with some Delta Lake tables. I tested with a two-part-naming lakehouse (lakehouse.table).
- The CONTRIBUTOR role for the Service Principal in the Fabric workspace.
Creating a Connection
To create a connection, you can use Catalog Explorer, the CREATE CONNECTION SQL command, the Databricks REST API, or the Databricks CLI.
Regarding permissions, you must be a Unity Catalog metastore admin or a user with the CREATE CONNECTION privilege.
Below are the connection properties needed for the Fabric SQL endpoint. For a step-by-step guide on where to click or which SQL statements to use, refer to Run federated queries on Microsoft SQL Server — Azure Databricks | Microsoft Learn.
- Host: <sql_endpoint_conn_id>.datawarehouse.fabric.microsoft.com
- Port: 1433
- Authorization endpoint: https://login.microsoftonline.com/<tenant_id>/oauth2/v2.0/authorize
- Client ID: <client_id>
- Client secret: <client_secret>
- OAuth scope: https://database.windows.net/.default offline_access
- Trust server certificate: false
Note: when creating the Service Principal, remember to set the OAuth redirect URI to https://<databricks_workspace>/login/oauth/azure.html.
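As a rough sketch, the same settings could be expressed with the CREATE CONNECTION SQL command. The OPTIONS keys below are my assumption, mapped one-to-one from the UI fields above, and the connection name is a placeholder; verify the exact syntax against the Microsoft Learn article linked earlier.

```sql
-- Illustrative sketch only: option names are assumed to mirror the UI
-- fields above; fabric_sql_endpoint is a placeholder connection name.
CREATE CONNECTION fabric_sql_endpoint TYPE sqlserver
OPTIONS (
  host '<sql_endpoint_conn_id>.datawarehouse.fabric.microsoft.com',
  port '1433',
  authorization_endpoint 'https://login.microsoftonline.com/<tenant_id>/oauth2/v2.0/authorize',
  client_id '<client_id>',
  client_secret '<client_secret>',
  oauth_scope 'https://database.windows.net/.default offline_access',
  trust_server_certificate 'false'
);
```

Note that OAuth-based connections may need to be completed interactively through Catalog Explorer, since the authorization code flow involves a browser redirect.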
Creating a Foreign Catalog
Once the connection is created, you can create a foreign catalog using Catalog Explorer, the CREATE FOREIGN CATALOG SQL command (in an Azure Databricks notebook or the SQL query editor), or the Databricks REST API or CLI.
Regarding permissions, you will need the CREATE CATALOG permission on the metastore, plus either ownership of the connection or the CREATE FOREIGN CATALOG privilege on it.
- Catalog name: <give_a_catalog_name>
- Type: Foreign
- Connection: <select_previously_created_connection>
- Database: <sql_endpoint_name>
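In SQL, these settings map to the CREATE FOREIGN CATALOG command. A minimal sketch, where fabric_catalog and fabric_sql_endpoint are placeholder names:

```sql
-- fabric_catalog and fabric_sql_endpoint are placeholder names;
-- the database option takes the Fabric SQL endpoint name.
CREATE FOREIGN CATALOG IF NOT EXISTS fabric_catalog
USING CONNECTION fabric_sql_endpoint
OPTIONS (database '<sql_endpoint_name>');
```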
Running Queries
Once the foreign catalog is created, you will be able to query it (read-only) from a notebook or SQL query editor.
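For example, assuming a foreign catalog named fabric_catalog and a table under the endpoint's default dbo schema (all names here are placeholders):

```sql
-- Read-only query through the foreign catalog; fabric_catalog
-- and sales are placeholder names for this sketch.
SELECT *
FROM fabric_catalog.dbo.sales
LIMIT 10;
```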
Note: I only validated that it worked; I did not test data mapping or pushdown support. For more information, refer to Run federated queries on Microsoft SQL Server — Azure Databricks | Microsoft Learn.