About cloud data sources
This article describes how to import data from a cloud data warehouse or database.
Requirements
This feature requires the following:
- Tealium EventStream or Tealium AudienceStream
Supported vendors
The configuration for cloud data sources is nearly the same for every vendor. For an overview, see Manage a cloud data source.
For vendor-specific configuration details, see the following:
How it works
A cloud data source connects your cloud data warehouse or database to Tealium, so you can import data as events for processing. The process involves several key components:
- Connection: Set up a secure, reusable connection to your cloud data source by providing the necessary credentials and authentication details. Once connected, select the specific table or view you want to import data from.
- Data source configuration: Choose which columns to import from your table or view. Select a query mode to control how new or updated rows are detected, using a timestamp column, an incrementing column, or both. You can further refine the imported data by adding a SQL `WHERE` clause to filter rows.
- Column mapping: Map the columns from your cloud database to Tealium event attributes. This ensures that data from your source is correctly assigned to the appropriate attributes in Tealium.
- Visitor ID mapping: To unify data across sources, map a column from your data source to a visitor ID attribute in AudienceStream. This enables accurate visitor stitching and profile enrichment.
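The components above can be summarized in a single configuration sketch. This is a hypothetical illustration: every field name below is invented for the example and does not reflect Tealium's actual configuration schema.

```python
# Hypothetical sketch of the pieces a cloud data source configuration
# brings together. All field names are illustrative placeholders.
cloud_data_source = {
    "connection": {"vendor": "example_warehouse", "table_or_view": "customers"},
    "query": {
        "mode": "timestamp+incrementing",
        "timestamp_column": "modification_time",
        "incrementing_column": "customer_id",
        "where": "customer_segment IS NOT NULL",  # optional SQL WHERE filter
    },
    "column_mapping": {"postalCode": "customer_zip"},
    "visitor_id_mapping": {"column": "customer_id",
                           "attribute": "tealium_visitor_id"},
}
```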
Each imported row is processed as an event in Tealium, which allows you to bring in bulk data from your cloud systems and use it alongside other event data for real-time processing, segmentation, and activation.
A maximum of 10 cloud data warehouse data sources per profile can be enabled at a time.
Event processing
Each imported row is processed as an event with columns mapped to event attributes. Columns not mapped to an event attribute are ignored.
In the default Tealium data collection order of operations, events from a cloud data source are processed before the Event received step and do not change the order of operations.
Cloud data source rows are processed in EventStream and AudienceStream in the same way as events from other data sources, but with the following important exceptions:
- Browser-specific attributes: Browser-specific attributes, such as User agent, are not set.
- Enrichments: AudienceStream enriches only the First visit preloaded attribute. All other enrichments for preloaded attributes are skipped.
- Functions: Data transformation functions do not run.
- Single-page visits: Incoming events are exempt from the single-page visit criteria. Single-page visits and visitors from other data sources are not persisted in AudienceStream. For more information, see How are single-page visits processed in AudienceStream?
- Visit length: A visit started by a cloud data source event lasts for 60 seconds.
- Visitor ID mapping: If you map an AudienceStream visitor ID attribute in your cloud data source configuration, the visitor ID is set directly to the value of the column you choose without the need for an enrichment.
Rate limits
Imports from cloud data sources are typically limited to 500 events per second per account, but may vary. Standard attribute size limits still apply. For more information, see About attributes > Size limits.
Batch size
Cloud data sources import data in batches, with a maximum of 1,000 rows per batch. This behavior is important to consider when selecting a query mode.
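The batching behavior can be sketched as follows. This is an illustration of the 1,000-row batch limit described above, not Tealium's actual implementation.

```python
# Illustrative sketch: rows are fetched in batches of at most 1,000.
def batches(rows, batch_size=1000):
    """Yield successive batches of at most batch_size rows."""
    for start in range(0, len(rows), batch_size):
        yield rows[start:start + batch_size]

rows = list(range(2500))  # stand-in for 2,500 table rows
sizes = [len(batch) for batch in batches(rows)]
# 2,500 rows are fetched as three batches: 1000, 1000, and 500 rows
```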
Tables
Each cloud data source supports importing data from one table or view. To import data from multiple tables, create a view in your cloud database and select the view in the data source configuration.
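The multi-table workaround can be sketched with a view definition. SQLite stands in for the cloud warehouse here; the table and column names are invented for the example, and the syntax should be adapted to your vendor's SQL dialect.

```python
import sqlite3

# Sketch: combine two tables into one view, then point the data source
# at the view. SQLite is a stand-in for the cloud warehouse.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE customers (customer_id INTEGER, customer_zip TEXT);
    CREATE TABLE segments (customer_id INTEGER, customer_segment TEXT);
    INSERT INTO customers VALUES (1, '94103');
    INSERT INTO segments VALUES (1, 'A');
    CREATE VIEW customer_export AS
        SELECT c.customer_id, c.customer_zip, s.customer_segment
        FROM customers c JOIN segments s USING (customer_id);
""")

# The view exposes the joined data as a single selectable object
rows = conn.execute("SELECT * FROM customer_export").fetchall()
```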
Data types
The cloud data source supports all data types. In general, use the following data type mappings to ensure data is imported correctly:
Cloud data | Tealium |
---|---|
Numeric | Number attributes |
String | String attributes |
Logical | Boolean attributes |
Date and time | Date attributes |
Arrays | Array of strings, array of numbers, or array of booleans |
Other | String attributes |
For more vendor-specific data types, see:
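The table above can be expressed as a lookup with a string fallback for unrecognized types. The type names below are generic placeholders, not a vendor's exact type identifiers.

```python
# Hedged sketch of the data type mappings in the table above.
TYPE_MAP = {
    "numeric": "number",
    "string": "string",
    "logical": "boolean",
    "date and time": "date",
    "array": "array",
}

def tealium_attribute_type(cloud_type: str) -> str:
    """Map a source column type to a Tealium attribute type;
    fall back to a string attribute for other types."""
    return TYPE_MAP.get(cloud_type.lower(), "string")
```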
Query modes
The cloud data source supports the following three query modes to control how data is imported from your table or view:
- Timestamp + Incrementing
- Timestamp
- Incrementing
Each mode uses a timestamp column, an increment column, or both to determine which rows to import.
Requirements for columns
To ensure a query mode works effectively, you must have one or both of the following:
- Timestamp column: A timestamp column set to the current time when a row is added or modified.
- Increment column: A numeric column that increments in value for every row added. A recommended definition for an auto-increment column is: `COL1 NUMBER AUTOINCREMENT START 1 INCREMENT 1`
For vendor-specific requirements, see:
Select a mode that aligns with the requirements of your use case. For an example of how these modes work, see Query mode example.
Timestamp + Incrementing (Recommended)
This mode uses both a timestamp column and an incrementing column to import new or modified rows.
Rows are imported if they have a newer timestamp than the previous import, a larger increment value than the last imported row, or both.
This mode provides the highest reliability for ensuring that all rows are imported as intended.
Timestamp
This mode uses a timestamp column to import new or updated rows.
Rows are imported if they have a newer timestamp than the previous import.
Use this mode if your table has a timestamp column set on every insert or update operation.
Be aware that if the number of rows with the same timestamp exceeds the batch size, then some rows are not imported.
Incrementing
This mode uses an incrementing column to import rows.
Rows are imported if they have a larger increment value than the last imported row. Rows with an increment value less than or equal to the maximum value from the previous import are skipped. This mode does not detect modifications to existing rows.
Use this mode if your table does not have a timestamp column.
If you maintain your own increment column, as opposed to using an auto-increment column, ensure that the values always increase.
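The incrementing-mode selection rule can be sketched as a filter. This is an illustration of the behavior described above (modified rows keep their old increment value and are skipped), not Tealium's implementation; column names follow the example later in this article.

```python
# Sketch: incrementing mode imports only rows whose increment value
# exceeds the maximum recorded by the previous import.
def select_incrementing(rows, last_max):
    return [row for row in rows if row["customer_id"] > last_max]

rows = [
    {"customer_id": 999, "customer_segment": "B"},   # modified row: skipped
    {"customer_id": 1001, "customer_segment": "E"},  # new row: imported
]
imported = select_incrementing(rows, last_max=1000)
```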
Query mode example
The following example shows how batch processing of rows and query modes work together.
In the following table, `modification_time` is the timestamp column and `customer_id` is the incrementing column.
customer_id | modification_time | customer_segment |
---|---|---|
1 | 01Apr 13:00 | A |
2 | 01Apr 13:00 | B |
… | … | … |
1000 | 01Apr 13:00 | D |
1001 | 01Apr 13:00 | E |
1002 | 02Apr 14:00 | A |
The cloud data source fetches data 1,000 rows at a time and records the highest value found in the timestamp column, the incrementing column, or both columns, depending on the selected query mode.
- Timestamp + Incrementing: The data source fetches rows 1-1000. The next time the data source fetches the data, it looks for rows where either `modification_time` is `01Apr 13:00` and the `customer_id` is greater than `1000`, or `modification_time` is greater than `01Apr 13:00`.
- Incrementing: The data source fetches rows 1-1000 and marks the maximum incrementing value of `1000`. On the next import, rows 1-1000 are skipped even if they were modified since the first import. Only new rows that increment the incrementing column (`customer_id` in the example) are processed.
- Timestamp (`modification_time` in the example): The data source fetches rows 1-1000 and marks the maximum timestamp of `01Apr 13:00`. On the next import, rows with a timestamp greater than `01Apr 13:00` are imported. In this case, row `1001` is skipped because it has the same timestamp value that was fetched in the previous batch of data.
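The Timestamp + Incrementing selection can be sketched as a query. This is a hedged illustration using SQLite as a stand-in for the warehouse; the query shape is an assumption about how such a connector could work, and ISO timestamps replace the `01Apr` shorthand so string comparison behaves like a date comparison.

```python
import sqlite3

# Sketch of Timestamp + Incrementing selection, using the example rows.
conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE customers
    (customer_id INTEGER, modification_time TEXT, customer_segment TEXT)""")
conn.executemany(
    "INSERT INTO customers VALUES (?, ?, ?)",
    [(1000, "2025-04-01 13:00", "D"),
     (1001, "2025-04-01 13:00", "E"),
     (1002, "2025-04-02 14:00", "A")],
)

# State recorded after the previous batch: max timestamp and max increment
last_ts, last_id = "2025-04-01 13:00", 1000

rows = conn.execute(
    """SELECT customer_id FROM customers
       WHERE (modification_time = ? AND customer_id > ?)
          OR modification_time > ?
       ORDER BY modification_time, customer_id""",
    (last_ts, last_id, last_ts),
).fetchall()
# Row 1001 (same timestamp, higher id) and row 1002 (newer timestamp)
# are both picked up; neither is skipped.
```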
SQL Query
In the cloud data source Query Configuration, select the columns you want to import. To add conditions for the import beyond the query mode, use the SQL `WHERE` clause. This option adds a `WHERE` statement to your query. The `WHERE` statement must use valid SQL syntax.
The `WHERE` clause does not support subqueries from multiple tables. To import data from multiple tables, create a view in your cloud database and select the view in the data source configuration.
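The effect of a user-supplied `WHERE` condition can be sketched as follows. SQLite again stands in for the warehouse, and the condition itself is an invented example; the condition must be valid SQL for your vendor.

```python
import sqlite3

# Sketch: the SQL WHERE option adds a user-supplied condition that
# filters which rows are imported.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE customers (customer_id INTEGER, customer_segment TEXT)")
conn.executemany("INSERT INTO customers VALUES (?, ?)",
                 [(1, "A"), (2, "B"), (3, "C")])

where_clause = "customer_segment IN ('A', 'B')"  # example user condition
rows = conn.execute(
    f"SELECT customer_id FROM customers WHERE {where_clause}"
).fetchall()
# Only rows matching the condition are returned for import
```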
Scheduling
The Processing Settings section controls how and when your cloud data source imports new data. You can enable or disable processing and configure the frequency at which data is imported. Each import processes data in batches of 1,000 rows.
Frequency
Select how often to import data:
Option | Description |
---|---|
Near Real-Time | Fetches data every 2 seconds. Ideal for low-latency use cases. |
Hourly | Runs at the beginning of every hour. |
Daily | Runs once per day at the hour of the day you set (UTC). |
Weekly | Runs once per week at the day and hour you set (UTC). |
Fetching data with high frequency (near real-time or hourly) may lead to increased costs and system strain in your data warehouse.
Column mapping
The column mapping configuration determines the event attributes that correspond to each column in the cloud database table.
Column names are often different from the attribute names in your Tealium account, so this mapping ensures that the data is imported properly. For example, if your table uses the column name `postalCode` but the event attribute in Tealium is `customer_zip`, create a mapping to associate these fields.
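The `postalCode` to `customer_zip` example can be sketched as a rename step. The function is illustrative, not part of any Tealium API; as noted in the Event processing section, columns without a mapping are ignored.

```python
# Sketch: rename mapped columns to Tealium attribute names and drop
# columns that have no mapping.
column_mapping = {"postalCode": "customer_zip"}

def apply_mapping(row: dict, mapping: dict) -> dict:
    return {attr: row[col] for col, attr in mapping.items() if col in row}

event = apply_mapping({"postalCode": "94103", "internal_flag": True},
                      column_mapping)
# internal_flag has no mapping, so it does not appear in the event
```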
For information about mapping cloud data types to Tealium data types, see the Data Types section.
For vendor-specific data types, see the following:
Visitor ID mapping
To ensure your imported data is stitched with other sources, such as web, mobile, or HTTP API, ensure that every row in the table has a column with a unique visitor ID.
Map the visitor ID column and corresponding event attribute to a visitor ID attribute (a unique attribute type for visitor identification in AudienceStream). The value in the mapped event attribute is assigned to the `tealium_visitor_id` attribute and matched directly to any existing visitor profiles.
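The direct assignment described above can be sketched as follows. The attribute name `tealium_visitor_id` comes from the text; the function itself is an invented illustration.

```python
# Sketch: the mapped column's value is assigned directly to
# tealium_visitor_id, with no enrichment step.
def set_visitor_id(event: dict, id_column_value: str) -> dict:
    event = dict(event)  # copy so the input event is not mutated
    event["tealium_visitor_id"] = id_column_value
    return event

event = set_visitor_id({"customer_zip": "94103"}, "cust-12345")
```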
For more information about Visitor ID Mapping in AudienceStream, see Visitor Identification using Tealium Data Sources.
IP addresses to allow
If your cloud data account has strict rules about which systems it accepts requests from, add the Tealium IP addresses to your cloud database allowlist.
This page was last updated: June 17, 2025