Databricks Delta Sharing cloud data source
This article describes how to set up the Databricks Delta Sharing cloud data source.
The Databricks Delta Sharing cloud data source is currently in Early Access and is only available to select customers. Contact your Tealium customer success manager to get started with Databricks Delta Sharing.
For a general overview of setting up a cloud data source, see Manage a cloud data source.
How it works
The Databricks Delta Sharing cloud data source uses the Databricks-to-Databricks sharing protocol. This protocol securely shares data assets between Databricks and Tealium.
In this protocol, you are the provider and Tealium is the recipient. You create shares in Databricks and grant Tealium access to those shares. Shares are collections of read-only tables and assets.
You configure the secure connection to Tealium in Databricks using the Tealium sharing identifier (metastore ID). No database usernames, passwords, or access tokens leave Databricks.
When used with CloudStream, the data pipeline is zero-copy. Data remains in your Databricks environment and is accessed on demand for activation.
Data types
The Databricks Delta Sharing data source supports all Databricks data types. To ensure data is imported correctly, map the Databricks data types according to the following guidelines:
| Databricks | Tealium |
|---|---|
| Numeric data types | Number attributes |
| String and binary data types | String attributes |
| Logical data types | Boolean attributes |
| Date and time data types | Date attributes |
| Arrays | Array of strings, array of numbers, or array of booleans |
| Map, struct, object, variant | String attributes |
For more information, see Databricks: Data Types (AWS, Azure, GCP).
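Because map, struct, object, and variant columns arrive in Tealium as string attributes, it can help to serialize them explicitly before sharing. The following sketch assumes a hypothetical table `main.crm.customers` with a complex `preferences` column; it uses the Databricks SQL `to_json` function to produce a predictable string representation:

```sql
-- Hypothetical source table; to_json serializes map/struct values
-- so they arrive in Tealium as well-formed string attributes.
CREATE OR REPLACE VIEW main.crm.customers_for_tealium AS
SELECT
  customer_id,                              -- BIGINT    -> number attribute
  email,                                    -- STRING    -> string attribute
  is_active,                                -- BOOLEAN   -> boolean attribute
  updated_at,                               -- TIMESTAMP -> date attribute
  to_json(preferences) AS preferences_json  -- MAP/STRUCT -> string attribute
FROM main.crm.customers;
```

Share the view instead of the base table so the type mapping stays under your control.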
Create a connection
Users must have the following Databricks Unity Catalog permissions to configure Delta Sharing: CREATE RECIPIENT, CREATE SHARE, USE CATALOG, USE SCHEMA, and SELECT. A metastore admin already has these permissions.
To create a connection with Databricks Delta Sharing, complete the following steps in Databricks:
- Enable Delta Sharing in the Databricks Unity Catalog metastore that contains the data you want to share. For more information, see Databricks: Enable Delta Sharing on a metastore (AWS, Azure, GCP).
- Create a share with the tables, views, or catalogs that you plan to share with Tealium. For more information, see Databricks: Create and manage shares for Delta Sharing (AWS, Azure, GCP).
- Create a recipient. Contact your Tealium customer success manager for the Tealium `metastore_id` that you need to create the Tealium recipient. For more information, see Databricks: Create and manage data recipients for Delta Sharing (Databricks-to-Databricks sharing) (AWS, Azure, GCP).
- Grant Tealium access to one or more shares. For more information, see Databricks: Manage access to Delta Sharing data shares (for providers) (AWS, Azure, GCP).
- Contact your Tealium customer success manager to create the reusable connection for you.
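The Databricks side of these steps can be sketched in SQL. The share, recipient, and table names below are hypothetical, and the sharing identifier is a placeholder for the actual Tealium `metastore_id` you receive from your customer success manager:

```sql
-- Create a share and add a table for Tealium to read (shares are read-only).
CREATE SHARE tealium_share COMMENT 'Data shared with Tealium';
ALTER SHARE tealium_share ADD TABLE main.crm.customers;

-- Create the Tealium recipient using the sharing identifier (metastore ID)
-- provided by Tealium. Format: <cloud>:<region>:<metastore-uuid>.
CREATE RECIPIENT tealium_recipient
  USING ID 'aws:us-west-2:<tealium-metastore-uuid>';

-- Grant the recipient read access to the share.
GRANT SELECT ON SHARE tealium_share TO RECIPIENT tealium_recipient;
```

Because the recipient is created from the sharing identifier, no credentials are exchanged; Databricks manages the trust relationship.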
After you connect to Databricks Delta Sharing, select the data source table from the Table Selection list.
Multi-table joins are not supported. You can achieve the same result with a Databricks view. For more information, see Databricks: Create and Manage Views (AWS, Azure, GCP).
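For example, a view defined in Databricks can stand in for a multi-table join. The table and column names here are hypothetical:

```sql
-- Join orders to customers in Databricks, then select the view
-- in the data source configuration instead of joining in Tealium.
CREATE OR REPLACE VIEW main.crm.orders_enriched AS
SELECT
  o.order_id,
  o.order_total,
  c.email
FROM main.crm.orders AS o
JOIN main.crm.customers AS c
  ON o.customer_id = c.customer_id;
```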
Query mode
For a general overview, see Query modes.
For Databricks Delta Sharing, note the following requirements:
- Timestamp + Incrementing and Timestamp modes: The selected timestamp column must be of type `TIMESTAMP`. For more information, see Databricks: TIMESTAMP type (AWS, Azure, GCP).
- Incrementing mode: The selected numeric column must increment in value for every row added. A recommended definition for an auto-increment column is `COL1 BIGINT GENERATED ALWAYS AS IDENTITY (START WITH 1 INCREMENT BY 1)`. For more information, see Databricks: CREATE TABLE (AWS, Azure, GCP).
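A table that supports both query modes might be defined as follows. The table and column names are hypothetical; `row_id` satisfies Incrementing mode and `updated_at` satisfies Timestamp mode:

```sql
-- Hypothetical table definition covering both query modes:
-- row_id increments for every inserted row; updated_at is a TIMESTAMP.
CREATE TABLE main.crm.events (
  row_id     BIGINT GENERATED ALWAYS AS IDENTITY (START WITH 1 INCREMENT BY 1),
  event_name STRING,
  updated_at TIMESTAMP
);
```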
WHERE clause
For a general overview, see SQL Query.
The WHERE clause does not support subqueries from multiple tables. To import data from multiple Databricks tables, create a view in Databricks and select the view in the data source configuration.
For more information, see Databricks: What is a view? (AWS, Azure, GCP).
IP access list
If your Databricks workspace is restricted by IP addresses, add the Tealium IP addresses to your Databricks IP access list.
For more information, see Databricks: Manage IP access list (AWS, Azure, GCP).
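For reference, the Databricks workspace REST API manages these lists at `POST /api/2.0/ip-access-lists`. A payload allowing a set of addresses looks like the following sketch; the label and the addresses shown are placeholders, so substitute the Tealium IP ranges provided to you:

```json
{
  "label": "tealium-cloudstream",
  "list_type": "ALLOW",
  "ip_addresses": ["203.0.113.10/32", "198.51.100.0/24"]
}
```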
This page was last updated: March 11, 2026