
Monitor Tableau Cloud Deployments of Any Size with the Platform Data API

2026-01-03 00:07:29

Spencer Czapiewski

Tableau puts self-service visual analytics within reach of everyone, from small businesses with just a few licenses to massive enterprises deploying it across their entire organization. But as adoption grows, activity increases: users generate enormous amounts of content, more dashboards are rendered, and more databases are queried. Groups and permissions become more complex. Invariably, a point comes when help is needed to lead, organize, and shepherd these efforts, lest they devolve into an ungoverned mess that does more harm than good.

Having spent a good portion of my career as a Tableau administrator myself, I believe that every admin has three core jobs: ensuring the platform is safe, functional, and valuable to its users. This means securing the platform from bad actors (external or internal), complying with regulatory requirements, monitoring for operational issues and performance degradation, and helping everyone get the most value by using the platform efficiently.

Observability for enterprises

Doing all those things is easier said than done, especially for large deployments with hundreds, or even thousands, of users working across time zones, geographies, and different sites. When the platform you administer reaches this scale, the traditional built-in tools and processes that worked well at the beginning often just aren't fit for purpose anymore. Working one-on-one with users and clicking through pages manually can work when you have hundreds of users, but it's not sustainable when you have thousands. It's at this point that admins need true observability: access to data about what is happening inside their application at scale. For many admins, this data-based approach means integrating event logs into SIEM tools for monitoring and into data warehouses for usage analytics, and building automation to execute tasks autonomously.

For some time, the Activity Log feature of Tableau Cloud has helped meet some of those needs. Tableau 2025.3 delivers a huge step forward, giving Tableau Cloud admins more observability data than ever before: the Platform Data API!

Introducing the Tableau Cloud Platform Data API

Accessible via new endpoints in Tableau Cloud Manager (TCM), the Platform Data API provides a central, unified way to retrieve event log data from your entire Tableau Cloud deployment. Want to see who logged into TCM to create a new site? There’s auditing for that. How about a permissions change someone made to their workbook? You can pull that data, too. What about monitoring extract refreshes? We’ve got you covered (including the ones that run on Bridge)!

Unlike "push" integrations that only support specific platform vendors, our API allows you to "pull" data down to store, integrate, and analyze it using the systems you prefer. This platform-agnostic approach means more customers can benefit from this observability data. It also gives our partners and DataDev community a standardized way to create new tools and products that enrich the Tableau ecosystem for everyone.
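The pull model described above can be sketched in a few lines of Python. The endpoint path, auth header, and response fields (`events`, `nextPageToken`) below are assumptions for illustration only; the real contract is defined in the Platform Data Methods section of the TCM REST API.

```python
# Hedged sketch of draining a paginated "pull"-style API such as the
# Platform Data API. URL shape and response fields are hypothetical.
import json
import urllib.request

def collect_pages(fetch_page):
    """Call fetch_page(token) until no continuation token remains,
    accumulating every event across pages."""
    events, token = [], None
    while True:
        page = fetch_page(token)
        events.extend(page.get("events", []))
        token = page.get("nextPageToken")
        if not token:
            return events

def fetch_page_http(base_url, auth_token, token=None):
    """One HTTP pull (hypothetical URL shape; requires network access)."""
    url = f"{base_url}/events" + (f"?pageToken={token}" if token else "")
    req = urllib.request.Request(url, headers={"Authorization": f"Bearer {auth_token}"})
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)

if __name__ == "__main__":
    # Offline demo with a stub standing in for the real endpoint.
    pages = [
        {"events": [{"type": "login"}], "nextPageToken": "p2"},
        {"events": [{"type": "extract_refresh"}]},
    ]
    stub = lambda tok: pages[0] if tok is None else pages[1]
    print(len(collect_pages(stub)))  # prints 2
```

Because the data is pulled rather than pushed, the same loop can feed an S3 bucket, a warehouse loader, or a SIEM forwarder, whichever system you prefer.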

I'm excited to share that this feature is available for all customers of Tableau Cloud! This means that for the first time ever, any customer, regardless of edition, can access this same observability data using the new API. We know how important and valuable this data is, and we want to maximize its benefit for you. For our Tableau Enterprise and Tableau+ edition customers with strict monitoring and resiliency requirements, you’ll enjoy access to near real-time data to ensure you’re always on top of anything that needs immediate attention, and extended retention periods that keep the data available for a full year—just in case.

A list of Tableau APIs on the left with "Platform Data API" selected and configurations for the API parameters on the right.

What's next for enterprise observability

The Platform Data API is just the first step in our journey to provide you with a comprehensive set of tools for managing your Tableau Cloud deployment. We are actively working on new features that will further expand the site's observability options:

  • Entity Snapshots: While event data can tell you what's happened, it can’t tell you what things are like now. We’re working to supplement the eventing data with "snapshots" of your deployment's entities—such as workbooks, data sources, users, and groups. This will allow you to answer questions like, "Which workbooks haven't been used in the last six months?" and "What users are no longer active?"
  • Accelerated Admin Insights: As we build this robust data foundation, we are committed to improving our existing admin features. We plan to re-base Admin Insights on this new, more efficient data pipeline, which will allow us to accelerate the data refresh time and provide you with faster, more timely information about your deployment.

The Tableau Cloud Platform Data API is the foundation for a new level of control and insight. It's available to all Tableau Cloud customers, helping everyone—from individual teams to large enterprises—build a more secure, efficient, and data-driven analytics environment. It's the key to truly understanding your deployment at scale.

Get started with the Platform Data API today

You can learn more about how to use the new Platform Data API by reading our Activity Log overview in the Help documentation. Detailed technical specs for the API calls can be found in the Platform Data Methods section of the TCM REST API, and our official Postman collection will help you get up and running quickly. And as always, you can join us in the DataDev Slack workspace if you need some help! 

Introducing Tableau Cloud Custom Domains

2025-08-12 03:12:43

Dzifa Amexo

What are Tableau Cloud Custom Domains?

Custom Domains (per site) is a new feature in Tableau Cloud that allows site administrators to configure a subdomain, like analytics.example.com, enabling them to easily embed and access published content. With a custom domain, they can efficiently route stakeholders to Tableau Cloud content through an organization-specific URL that maintains their brand identity, instead of having to remember and depend solely on today's complicated, user-unfriendly pod URLs.

Key Benefits for Tableau Customers

Seamlessly Embed Your Analytics
Custom Domains addresses a common problem for embedding customers: some browsers block third-party cookies by default, which can break embedded Tableau content. Because a custom domain is a subdomain the customer owns, embedded content is served from a first-party domain, resolving these third-party cookie issues.

Strengthen Your Brand Identity and Security
A custom domain presents your Tableau content under your company's own URL, providing a more professional and trustworthy experience for your employees, partners, and customers. Even in browsers that don't block third-party cookies by default, it lets you embed and access published content without exposing Tableau site details, addressing the third-party cookie privacy and security concerns raised by Safari, Chrome, and other browsers.

Simplify Access for Everyone
With a custom domain, you can provide a simple, memorable web address for your stakeholders to access their dashboards and reports. This makes it easier to share links and direct traffic to your Tableau Cloud site.

How to get started with Custom Domains

Site administrators can create and configure Custom Domains in site settings, at no additional cost. Custom Domains can be set up on any site by going to each site's individual settings.

Note: To add a custom domain to Tableau Cloud, you must meet the following user and TLS certificate requirements. Full information and setup steps are in the online docs; the summary below is just to get you started.

Requirements: 

  • Administrator access to your Tableau Cloud site.
  • Administrator privileges to the DNS configuration of the parent domain.
  • While not required, it's a good idea to acquire a valid TLS certificate chain file and private key from a trusted authority (for example, Verisign, Thawte, Comodo, GoDaddy) before starting setup. It's also wise to give your IT/DNS team a heads-up, as they will need to update records once the custom domain DNS information is provided during setup.

To set up a custom domain (per site), sign in to the Tableau Cloud site as an administrator, go to Settings, and follow the steps under the Custom Domain section.
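Before handing a candidate subdomain to your IT/DNS team, a quick sanity check can catch typos. The helper below is purely illustrative (it is not part of Tableau and not Tableau's official validation); it applies general DNS label rules and confirms the candidate sits under a parent domain you control.

```python
# Illustrative pre-flight check for a candidate custom domain.
# Rules shown (valid DNS labels, must be a subdomain of the parent
# domain) are general DNS conventions, not Tableau's validation logic.
import re

# A DNS label: 1-63 chars, lowercase letters/digits/hyphens,
# no leading or trailing hyphen.
LABEL = re.compile(r"^(?!-)[a-z0-9-]{1,63}(?<!-)$")

def is_valid_subdomain(candidate: str, parent: str) -> bool:
    candidate, parent = candidate.lower(), parent.lower()
    if not candidate.endswith("." + parent):
        return False  # must live under the parent domain you control
    sub = candidate[: -len(parent) - 1]
    return all(LABEL.match(label) for label in sub.split("."))

print(is_valid_subdomain("analytics.example.com", "example.com"))  # prints True
print(is_valid_subdomain("bad_label.example.com", "example.com"))  # prints False
```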

If you’re ready to experience Tableau Cloud yourself, start a free trial.

Choosing the Right Tool for Your Tableau Content Management and Migrations

2025-07-28 23:20:29

Dzifa Amexo

The migration journey from Tableau Server to Tableau Cloud can be a daunting experience. Without a reliable process, migration can be time-consuming and error-prone, potentially leading to a multitude of issues. To help solve this challenge, Tableau offers a suite of tools designed to facilitate content movement, both between Server environments and from Server to Cloud: the Migration SDK, the Cloud Migration App, and the Content Migration Tool.

While they share the common goal of simplifying content transfer, they differ significantly in their design, target audience, capabilities, and intended use cases. Understanding these distinctions is crucial for selecting the right tool for your content migration and management.

Overview of Tableau Tooling 

Here's a breakdown to help clarify which tool might be right for you when considering a move from Tableau Server to Tableau Cloud:

  • Tableau Migration SDK: A Software Development Kit (SDK), essentially a toolbox for developers to build their own custom migration applications. It is the recommended tool for the technical movement of users and content during a migration from Tableau Server to Tableau Cloud. The Migration SDK is an API-driven engine with a networking layer and uses hooks such as Filters, Mappings, and Transformers. It is not a turnkey solution or pre-built application; it requires customization and development effort. It's designed for junior-level developers and above who are familiar with Tableau and have experience in Python or .NET, and it's intended for a one-time migration event from Server to Cloud, not recurring content promotion.
  • Cloud Migration App: An open-source desktop application built on the Tableau Migration SDK. It is designed to let administrators copy content and users from Tableau Server to Tableau Cloud with ease, specifically simplifying the process for administrators with smaller deployments of around 100 workbooks. A major limitation is that the app offers only basic features, such as user mapping, and lacks functionality like filtering, renaming, and support for additional authentication types.
  • Tableau Content Migration Tool (CMT): A Windows-only desktop app (with a console runner) that provides a user interface and is designed for copying or migrating content between Tableau Server sites or between Tableau Cloud sites. It is explicitly not recommended for migrations from Tableau Server to Tableau Cloud.
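To make the Migration SDK's hook model concrete, here is a plain-Python sketch of how Filters, Mappings, and Transformers shape content during a migration. The class and function names are illustrative only; the real Migration SDK (Python/.NET) has its own API surface, documented separately.

```python
# Conceptual sketch of the hook model: Filters drop items, Mappings
# rewrite destinations, Transformers mutate content. Names here are
# illustrative, not the SDK's actual API.
from dataclasses import dataclass, replace

@dataclass(frozen=True)
class Workbook:
    name: str
    project: str
    owner: str

def migrate(items, filters=(), mappings=(), transformers=()):
    """Apply filters, then mappings, then transformers, mirroring the
    order in which the SDK's hooks run."""
    out = []
    for item in items:
        if any(not keep(item) for keep in filters):
            continue  # a filter rejected this item
        for remap in mappings:
            item = remap(item)
        for transform in transformers:
            item = transform(item)
        out.append(item)
    return out

source = [
    Workbook("Sales", "Default", "alice@corp.local"),
    Workbook("Scratch", "Sandbox", "bob@corp.local"),
]
migrated = migrate(
    source,
    filters=[lambda wb: wb.project != "Sandbox"],           # skip sandbox content
    mappings=[lambda wb: replace(wb, project="Migrated")],  # land in a new project
    transformers=[lambda wb: replace(wb, owner=wb.owner.replace(".local", ".com"))],
)
print(migrated)  # one workbook survives, remapped and transformed
```

The same three hook points are where a custom SDK application would encode organization-specific rules, such as remapping usernames to a new identity domain.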
     

Comparison

Primary Use Case

  • Migration SDK: Recommended for all migrations from Tableau Server to Tableau Cloud.
  • Cloud Migration App: Built on the Migration SDK; simplifies Server to Cloud migration for smaller deployments (generally 100 workbooks or fewer).
  • Content Migration Tool (CMT): Server to Server, Cloud to Cloud, and other content management tasks. Not supported for Server to Cloud.

Target Audience

  • Migration SDK: Developers experienced in Python or .NET, plus Professional Services and Partners. Not for business users.
  • Cloud Migration App: Administrators and users with less development expertise.
  • Content Migration Tool (CMT): Administrators and users with the required permissions.

Recommended Customer Setup

  • Migration SDK: Large/Enterprise (more than 100 users).
  • Cloud Migration App: Small/SMB (fewer than 100 workbooks).
  • Content Migration Tool (CMT): Varies by use case.

Nature of Tool

  • Migration SDK: Software Development Kit; an API-driven engine that requires building a custom application.
  • Cloud Migration App: Open-source application with a user interface (built on the SDK).
  • Content Migration Tool (CMT): User interface (UI) to create and run migration plans; supports pre/post migration scripts.

Technical Expertise Required

  • Migration SDK: High (coding).
  • Cloud Migration App: Low to medium (application configuration).
  • Content Migration Tool (CMT): Medium (application configuration; scripting optional).

Supported Content Types

  • Migration SDK: Users, Groups, Projects, Workbooks, Published Data Sources, Embedded Credentials, Custom Views, Subscriptions, Extract Refresh Tasks.
  • Cloud Migration App: Same as Migration SDK.
  • Content Migration Tool (CMT): Workbooks, Published Data Sources, Embedded Credentials (with specific configuration, except OAuth).

Unsupported Content Types

  • Migration SDK: Favorites, Prep Flows, Virtual Connections, Group Sets, Collections, and Subscriptions for Custom Views (support planned); other content types without REST API exposure.
  • Cloud Migration App: Same as Migration SDK.
  • Content Migration Tool (CMT): Users, Groups, Site settings, and many content types including Ask Data Lenses, Collections, Custom Views, Data-driven alerts, Extract refresh schedules (to Cloud), Favorites, Flows, Virtual Connections, etc.

Choosing the Right Tool for the Job

The choice of migration tool depends heavily on the specific scenario, the size and complexity of the Tableau environment, the technical expertise available, and the specific content types that need to be migrated. In general, the scenarios below should help guide the right choice when deciding on what to use.

Server to Cloud Migration:

  • Customers with small deployments (around 100 workbooks), limited development resources, and core content types (users, groups, projects, data sources, workbooks, permissions, custom views) may find the Cloud Migration App to be the most suitable, straightforward option.
  • Customers with larger or more complex Server deployments, significant customization needs, specific content types not supported by the App (e.g. Prep Flows or Virtual Connections in the future), or who require integrating migration into existing workflows should consider using the Migration SDK and building a custom application. This is also the intended path for Professional Services and Migration Partners serving large customers.
  • Using the Tableau Content Migration Tool for Server to Cloud migration is not recommended.

Server to Server or Cloud to Cloud Migration:

  • The Tableau Content Migration Tool (CMT) is the primary and recommended tool for migrations and content management tasks between Tableau Server sites and between Tableau Cloud sites. Its broad capabilities for content promotion, environment migration, site consolidation, and maintenance make it well-suited for these scenarios.
  • The Migration SDK can also support Cloud to Cloud migrations with code customization, subject to limitations around embedded credentials and subscriptions, with plans to expand this functionality in the future.
  • The Cloud Migration App is not able to support either Server to Server or Cloud to Cloud migrations.

Ultimately, the three tools provided by Tableau for content management and migration are distinct offerings with different use cases and supported scenarios. As a rule of thumb, all Server to Cloud migrations should be done with the Migration SDK or the Cloud Migration App, while content migration from Server to Server or Cloud to Cloud should be done with the Content Migration Tool. At the end of the day, organizations should carefully evaluate their specific requirements, technical resources, and migration path to determine the most appropriate tool for their needs.

Learn more about best practices for migrating to Tableau Cloud from Tableau Server with the Trailhead Badge for Cloud Migration.

Keep Payment and Cardholder Data Secure with PCI-DSS Compliance for Tableau Cloud

2025-07-10 22:41:17

Dzifa Amexo

In today's digital landscape, businesses increasingly rely on cloud solutions that often involve sensitive information, including payment card data. Tableau Cloud's PCI-DSS compliance means the platform not only empowers organizations in the financial services industry to leverage their data effectively but also strengthens the robust security and compliance already built into Tableau Cloud.

Tableau Cloud has achieved Payment Card Industry Data Security Standard (PCI-DSS) 4.0 compliance, reinforcing our unwavering commitment to security and ensuring your sensitive payment card data is always protected.

What is PCI-DSS and How Does it Impact Tableau Cloud?

The Payment Card Industry Data Security Standard (PCI-DSS) is a globally recognized set of security standards. Its purpose is to ensure that all companies that accept, process, store, or transmit credit card information maintain a secure environment. For businesses handling customer payments or financial data, PCI-DSS compliance is crucial for safeguarding cardholder data and maintaining customer trust.

Tableau Cloud plays a vital role as a "service provider" in the PCI-DSS ecosystem, supporting payment processing for other companies. Our customers can now operate with even greater assurance knowing that their data environment within Tableau Cloud meets these rigorous standards.

Tableau Cloud's Commitment to Top-Tier Security

As part of our PCI-DSS compliance, we're proud to share that Tableau Cloud is a Level 1 service provider under PCI-DSS. This is the highest level of compliance, signifying that we meet all 12 technical and organizational controls of PCI-DSS.

To meet this standard, Tableau performs rigorous testing and validation, including:

  • An annual on-site assessment and Report on Compliance (ROC) by a Qualified Security Assessor (QSA).
  • Quarterly network scans by an Approved Scanning Vendor (ASV) to identify and address any vulnerabilities.
  • An Attestation of Compliance (AOC) signed by a senior executive, confirming our adherence to the standards.

Furthermore, Tableau Cloud operates on AWS's Infrastructure as a Service (IaaS) platform and Salesforce’s Hyperforce, both of which are themselves PCI-DSS Level 1 compliant. This means our foundational infrastructure is built on a bedrock of security, with AWS responsible for securing the data centers and physical controls.

The Shared Responsibility Model: A Partnership in Security

Tableau’s PCI-DSS compliance is built on the foundational principle of a Shared Responsibility Model. As the cloud service provider, Tableau Cloud manages the underlying infrastructure and platform, securing areas like network firewalls, data encryption at rest and in transit, vulnerability management, patching, and system activity logging. Conversely, our customers are responsible for securing their own application settings, data, and usage of the service through configurations of secure access, encrypting data during transmission to and from Tableau Cloud, monitoring user activity, and securing any custom integrations.

A Venn diagram titled "Shared Responsibility Model in PCI DSS: Ensuring Compliance Together."

It is crucial that customers uphold their responsibility under this model for the environment to be fully PCI-DSS compliant. 

To that end, Tableau provides a bevy of tools that can help customers uphold their compliance requirements under the shared responsibility model:

  • Customer-Managed Encryption Key (CMEK): This critical feature allows customers to encrypt site data extracts with their own customer-managed, site-specific key.
  • Tableau Bridge: Securely connect Tableau Cloud to private network data, including on-premises databases or private cloud data, using HTTPS and WebSockets for reliable, encrypted communication.
  • Activity Log: Gain comprehensive visibility into detailed log events, sent directly to your own Amazon S3 bucket. This enables in-depth analysis, auditing, and monitoring of site activities and permission changes, crucial for compliance.
  • Robust Access Controls: Tableau Cloud offers a variety of features to ensure only authorized users access your data, including:
    • Multi-Factor Authentication (MFA) for strong verification.
    • Single Sign-On (SSO) for streamlined and consistent authentication.
    • SCIM for automated user provisioning and de-provisioning.
    • User Role/Permission Control to enforce the principle of least privilege.
    • Data Access Control and Row-Level Security (RLS) to restrict data visibility based on user identity.
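As one example of putting the Activity Log data mentioned above to work, a small script can filter permission-change events out of the newline-delimited JSON files delivered to your S3 bucket. The field names below (`eventType`, `actorUserName`) are illustrative placeholders; consult the Activity Log documentation for the actual event schema.

```python
# Sketch of consuming Activity Log events after they land in S3.
# Event field names are assumptions for illustration only.
import json

def permission_changes(raw_lines):
    """Yield permission-related events from newline-delimited JSON logs."""
    for line in raw_lines:
        event = json.loads(line)
        if event.get("eventType", "").startswith("update_permissions"):
            yield event

# In practice raw_lines would be read from your S3 bucket; a sample here:
sample = [
    '{"eventType": "update_permissions", "actorUserName": "admin@example.com"}',
    '{"eventType": "view_access", "actorUserName": "viewer@example.com"}',
]
hits = list(permission_changes(sample))
print(len(hits))  # prints 1
```

Routine filters like this are the building blocks of the auditing and monitoring duties that fall on the customer side of the shared responsibility model.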

A diagram titled "The Shared Responsibility Model, Layered Approach" illustrates the division of security responsibilities between Customer, Tableau, and AWS.

Key Considerations for Shared Responsibility

In addition to customer responsibilities, it is important to call out the limitations of Tableau's current PCI-DSS implementation. Customers must take these into account as part of their Tableau Cloud implementation in order to be covered by Tableau's PCI-DSS compliance:

  • Customers must use CMEK to encrypt their extracts.
  • Only data in the form of data extracts or live connections is supported for PCI-DSS compliance. Embedded data sources like Excel or CSV files, which cannot be encrypted with CMEK, are not within the scope of this compliance.
  • Customers are required to use their own Identity Provider (IdP) for Single Sign-On (SSO).

Understanding these specific points helps ensure that customers can effectively leverage Tableau Cloud's PCI-DSS compliant environment while meeting their own compliance obligations.

Your Trusted Partner in Data Security

At Salesforce, trust is our number one value, a principle that guides everything we do. Tableau Cloud is engineered with security at its core, committed to protecting your sensitive cardholder data. Our comprehensive approach, spanning infrastructure security, robust encryption, and detailed logging, provides a resilient environment for your critical information.

We are dedicated to being your trusted partner on the journey toward compliance and data security. 

For more detailed information, please refer to the Tableau Cloud PCI-DSS Whitepaper, visit our Tableau compliance website and resources, or reach out to your Salesforce account executive.

Want to experience this level of security and compliance? Start a free trial of Tableau Cloud.

How to Improve Data Readiness for Tableau Cloud

2025-01-10 05:59:16

Candice Vu

How often have you heard, “Without good data, you won’t have good AI?” It sounds simple, but studies show that “ensuring scalable, reliable data” is one of the top analytics challenges organizations face, according to the Salesforce State of Data and Analytics Report. This challenge has significant downstream effects: organizations that cannot confidently confirm they have reliable data cannot confidently adopt AI or self-service capabilities. To scale responsibly around AI and self-service-driven capabilities, organizations need a better process to explore, improve, and validate the data sources that drive them.

This blog provides the tools to monitor your Tableau data sources more effectively and a framework to drive iterative improvements, so you can be more confident in your data.

The framework consists of four steps:

  1. Monitor your data
  2. Identify a meaningful objective
  3. Create and promote data sources
  4. Enable your users

Step 1: Monitor your data sources

Improving data confidence starts with a baseline understanding of your data. Consider one of your top-level Tableau Projects, and ask yourself:

  • How many data sources are in that project?
  • How do users distinguish trusted, governed sources from experimental ones?
  • Is there supporting documentation to help new users understand the data?

If you can’t answer these questions, scaling for AI or self-service discovery may make you nervous. The first step to alleviate these concerns is to monitor and explore your data sources. Any Tableau Cloud Admin can do so by connecting the Data Source Manager Accelerator to their Tableau Cloud Site and then sharing it for other users to see.

Blue Tableau Data Source Manager screen showing 94 data sources, 10 of which are certified, 10 published, 74 embedded

Start by reviewing the “Know your data sources” section to see how users can interact with their data.

A published data source makes a single source of data accessible to permissioned users, enabling self-service and Tableau Pulse capabilities. Certified data sources go a step further, assuring users that the data has passed organizational governance standards and can be trusted. Embedded data sources, however, live within a Tableau workbook, making them inaccessible for exploration outside of the dashboard through web authoring or Tableau Pulse. Monitoring and managing your data assets along these categories will help you assess your readiness to embrace Tableau’s AI and self-service capabilities in your Tableau Sites.
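The three categories above can be tallied with a few lines of code once you have an inventory of your data sources. The records here are made up for illustration; in practice the inventory would come from the Accelerator or the Tableau REST API.

```python
# Illustrative tally of the certified / published / embedded categories.
def categorize(datasources):
    counts = {"certified": 0, "published": 0, "embedded": 0}
    for ds in datasources:
        if not ds["published"]:
            counts["embedded"] += 1   # lives inside a workbook only
        elif ds["certified"]:
            counts["certified"] += 1  # published and passed governance
        else:
            counts["published"] += 1  # published but not yet vetted
    return counts

inventory = [
    {"name": "Sales", "published": True, "certified": True},
    {"name": "Ops", "published": True, "certified": False},
    {"name": "Scratch", "published": False, "certified": False},
]
print(categorize(inventory))  # prints {'certified': 1, 'published': 1, 'embedded': 1}
```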

Step 2: Select a meaningful objective

Instead of overwhelming yourself by attempting to improve all of your data at once, focus on achieving short-term wins on meaningful objectives. To identify these use cases, take any potential initiative to a digital whiteboard and answer the following questions for the initiative:

List of meaningful objectives and related questions for improving data, including C-suite goals, your initiatives, Success Metrics, and so on

The initiative you choose to focus on should confirm all of the above: it should impact relevant organizational goals, use data that can feasibly be brought into Tableau, and demonstrate that analytics can indeed drive meaningful progress. Once confirmed, you have a worthy use case to focus on. Create a designated Tableau Project and follow the next step to equip users with explorable data assets.

Learn more about creating value maps to help select meaningful objectives on our website.

Step 3: Create and promote data assets

You’ve listed the necessary data sources for your initiative in step 2 above; now make sure they are available in your Tableau Site by referring to the Data Source Manager Accelerator.

Menu with gray bar charts showing the monitoring of data assets for readiness and adoption, and management of published and embedded data sources

Here, all of your data sources are categorized as Certified, Published, or Embedded, as discussed in Step 1. Filter the Accelerator for your Tableau Project, and explore the following sections of the dashboard to learn more about your data.

Embedded Data Sources (bottom right)

On the bottom right, you see all your embedded data sources, sorted by repetitions. Remember, these data sources are embedded within dashboards, meaning they cannot be explored with Tableau Pulse or web-authoring. Review this section to:

  1. Consider publishing highly-used embedded data sources to enable features and consistency
  2. Redirect embedded dashboard connections to their Published/Certified equivalent, if it exists 

Published Data (bottom left)

On the bottom left, you see all your Published data sources, sorted by user activity. While these data sources enable enhanced Tableau features, they lack any indication of whether the data has passed governance checks and can be trusted. Review this section to:

  1. Consider retiring unused data sources
  2. Consider certifying highly-used Published data sources
  3. For certification candidates, collaborate with data owners and experts to complete the certification checklist

Learn more about how to publish data sources on our website.

Certification Checklist (hover over the checklist on the dashboard)

The certification checklist is your organization’s process for vetting data sources, which ensures users understand why they can trust their data. This checklist typically includes a variety of checks to confirm accuracy of the data and the presence of supporting documentation. While we can’t directly monitor data accuracy with this Accelerator, we can track other variables to help data owners organize their certification efforts. Hover over the checklist on the dashboard to view the following six checklist items for each data source.

  • Licensed Owner: Confirms the data owner is still a licensed Tableau user
  • Refreshed Last 30 Days: Confirms the data source has been refreshed in the last month
  • Description: Confirms the data source description includes specific background about the data and guides users where to go with questions (format customizable by you)
  • Data Source Type: Confirms the data comes from preferred sources and databases
  • Title Format: Confirms the title is legible without special characters
  • Accessed Last 90 Days: Confirms users still use the data

Certification checklist for vetting data sources with two red X warnings for Licensed Owner and Refresh Last 30 days

Note: This is a starting point. Customize your checklist items by editing the “Score_” calculations in the Accelerator or adding new ones.
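The six checks can also be expressed as a small scoring function, roughly what the Accelerator's "Score_" calculations do inside the dashboard. The record fields and the preferred-source list below are illustrative assumptions, not the Accelerator's actual field names.

```python
# Illustrative version of the six-item certification checklist.
# Field names and the preferred-source set are example assumptions.
from datetime import datetime, timedelta

def checklist(ds, now=None):
    now = now or datetime.utcnow()
    return {
        "licensed_owner": ds["owner_licensed"],
        "refreshed_30d": now - ds["last_refresh"] <= timedelta(days=30),
        "has_description": bool(ds["description"].strip()),
        # Example preferred systems; substitute your organization's list.
        "preferred_source": ds["source_type"] in {"snowflake", "postgres"},
        "clean_title": ds["title"].isascii() and ds["title"].isprintable(),
        "accessed_90d": now - ds["last_access"] <= timedelta(days=90),
    }

now = datetime(2025, 1, 1)
ds = {
    "owner_licensed": True,
    "last_refresh": now - timedelta(days=10),
    "description": "Weekly sales pipeline; questions -> #data-help",
    "source_type": "snowflake",
    "title": "Sales Pipeline",
    "last_access": now - timedelta(days=5),
}
print(all(checklist(ds, now=now).values()))  # prints True
```

A data source failing any check would surface as a red X in the dashboard, flagging it for the data owner's attention.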

Once a data expert vouches for the accuracy of the data, the data owner can use this Accelerator to ensure the necessary supporting documentation is included. These combined checks, done inside and outside of this dashboard, will ensure data sources are ready to receive the Certification label in Tableau.

Certified Data (top)

At the top, you’ll find all data sources given the Certification label on Tableau Cloud, sorted by user activity, with an icon confirming compliance with the checklist.

Screen with gray bar charts showing the management of published and embedded data sources

These are your prized data assets. They have been reviewed by a data expert to ensure accuracy, supplemented with supporting resources to address user questions, and have passed organizational governance standards to ensure trust and consistency. Review this section to:

  1. Confirm data adoption
  2. Ensure compliance with the certification checklist
  3. Consider retiring unused data
  4. Identify missing data sources from step 2

Greyed out screen showing external data asset labeled "Certified" with green button

Learn more about how to certify data sources on our website. 

Step 4: Enable your users

With a defined symbol to distinguish trusted data and a credible process to back it up, it’s time to enable your users. Start by informing your users via discussion forums, user group meetings, and other channels that they now have access to new, high-quality data sources. Next, teach them best practices on how to validate and explore these data assets by guiding them to free Tableau Modules on Trailhead. Lastly, regularly check the Data Source Manager Accelerator to ensure data adoption levels meet expectations. 

List of free, educational Tableau Modules on Trailhead about data literacy, Tableau Desktop, and becoming data-driven

A framework to improve data readiness

By implementing our four-step framework for a meaningful objective, you have successfully converted the challenge of “ensuring scalable, reliable data” into an ongoing discussion. This dialogue unlocks new features, improves users’ speed-to-insight, and creates a catalog of quality data ready to inform AI and self-service at scale across your business. Now, identify the next meaningful initiative, and repeat the process. Eventually, you’ll flip the narrative and be able to say, “we have great data, so we know we have great AI.”

Download the Data Source Manager for Tableau Cloud Accelerator on the Tableau Exchange to help monitor best practices. 

Tableau and dbt Labs: Strategic Partnership and Integration

2024-10-08 05:59:00

Spencer Czapiewski

Enabling teams to make trusted, data-driven decisions has become increasingly complex due to the proliferation of data, technologies, and tools. For example, MuleSoft’s Connectivity Benchmark Report found that the average enterprise uses 991 applications. This technology sprawl often creates data silos and makes it harder for organizations to enforce data governance while still providing trusted, real-time insights to the business.

Tableau is a leader in the analytics market, known for helping organizations see and understand their data, but we recognize that gaps still exist: while many of our joint customers already benefit from dbt and trust the metrics that result from those workflows, the metrics are often disconnected from, and obscured within, Tableau’s analytics layer.

This reduces trust in the data presented to decision makers and forces analytics engineers and data stewards to replicate semantic layer definitions across their Tableau deployment. To help overcome these challenges, our community has expressed the need for an integration between Tableau and dbt that reduces friction, reinforces trust in insights, and leads to better decision making.

At dbt Labs’ Coalesce 2024, we announced a new strategic partnership with dbt to build a seamless integration between dbt and Tableau to help our customers deliver on trusted, end-to-end analytics.

Through our new partnership with dbt, we are aiming to broaden the trust, extensibility, and value of Tableau by incorporating dbt models and metrics directly into the product. This will bring together two industry-leading solutions to accelerate time to value, foster seamless collaboration, and provide a world-class analytics experience to our customers.

Delivering on our vision together

Our vision is to bring end-to-end trusted data and analytics to organizations, on the data platforms of their choice.

Here are five key areas where we will deliver important features to achieve this:

  1. Export dbt models and metrics: Data teams can instantly create a Tableau semantic model and Tableau Pulse metric from within dbt Cloud, without the need to republish data sources into Tableau or manually set up the metric.
  2. Lineage and data health: We will enhance data details and data lineage in Tableau Catalog by allowing dbt to import key data health information, such as when data was last refreshed, when data quality checks passed, and more. With dbt Exposures, data teams get automatic context into how and where models are used so they can prioritize work to promote data quality and trigger related Tableau content to refresh.
  3. Tableau dashboards: Analysts can build visualizations based on imported dbt data models and seamlessly combine external data with metrics data for deeper analysis.
  4. Tableau Pulse: Tableau Pulse metrics can be directly connected to dbt models and metrics.
  5. Collaboration: Via Tableau apps in Slack and Teams, a user will have a search experience that bridges both platforms, allowing for the one-click creation of new Pulse metrics based off of dbt metrics, all in the flow of work.

Our partnership benefits any data-driven individual, at any level, in any organization:

  • Business users can trust the insights they receive from Tableau, predicated on the dbt governance layer, increasing self-service capabilities.
  • Data professionals no longer have to replicate their dbt semantic layer into Tableau’s own version of a semantic layer, and can reduce time spent servicing ad-hoc requests on data quality, freshness, access, and creating new analytics and metrics for business users and key decision makers.
  • Key stakeholders will increase their overall trust in data in their organization, leading to faster and more confident decision making.

Enabling trust and governance at every step of the analytics process

When viewing Tableau content, such as dashboards, Pulse metrics, or even published data sources, we often have an incomplete view of the underlying data. We can see structured data in the context of the Tableau environment; however, any transformations that occur upstream of that environment are not captured or displayed.

dbt Data Health Tiles, announced as generally available at Coalesce 2024, provide dashboard viewers with an easy way to get information on data quality and freshness checks. From Data Health Tiles embedded in Tableau dashboards, users can navigate to dbt to see the full Exposure, providing even more context about the underlying tables that make up a published data source connection in Tableau.

(Available for Tableau dashboards today.)

A Tableau dashboard showing the dbt Labs integration where Data Health Tiles are embedded and accessible via Tableau
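As a sketch of how a tile lands in a dashboard: the Web Page object points at an embed URL generated in dbt Cloud. The helper below illustrates the general shape of that URL. The `metadata.` host prefix, parameter names, and token handling are assumptions based on dbt's embed snippet, so copy the canonical snippet from your dbt Cloud exposure rather than hand-building it in production.

```python
# Illustrative only: compose a dbt Data Health Tile URL for a Tableau Web
# Page object. The host prefix and query parameter names mirror dbt's embed
# snippet at the time of writing -- verify them in your dbt Cloud account.
from urllib.parse import urlencode

def health_tile_url(access_url: str, exposure_id: str,
                    environment_id: int, token: str) -> str:
    params = urlencode({
        "uniqueId": exposure_id,          # e.g. "exposure.my_project.sales_dashboard"
        "environmentType": "production",
        "environmentId": environment_id,
        "token": token,                   # a metadata-scoped dbt service token
    })
    return f"https://metadata.{access_url}/exposure-tile?{params}"

print(health_tile_url("cloud.getdbt.com",
                      "exposure.my_project.sales_dashboard", 123, "<token>"))
```

Because the token is embedded in the URL, treat it like any other secret: scope it narrowly in dbt and rotate it on your usual schedule.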

In the future, a dbt-to-Tableau Catalog integration will provide richer descriptions, and surface dbt metadata on quality checks and freshness. Within the UI of a Pulse metric, it will be easy to find added dbt metadata for more information about the Pulse metric lineage.

Bringing the dbt Metric Layer to Tableau Pulse and back

Through our one-click Publish-to-Pulse integration, users in dbt can instantly create a trusted Pulse Metric that is always up to date and always connected to the dbt Metric Layer. From an Exposure in dbt, an analytics engineer will be able to explore the model using “Explore in Tableau” in an ephemeral Tableau interface before publishing and certifying content in dbt, all without leaving their flow of work.

(Coming soon!)

dbt Cloud Semantic Layer Connector to Tableau Cloud

The dbt Semantic Layer connector is available to Tableau Server customers in the Tableau Exchange today. As the modern data landscape continues to evolve, organizations are managing increasingly complex data ecosystems. We’ve heard from Tableau and dbt customers how analytics teams struggle to maintain consistency between their dbt-defined metrics and their Tableau Cloud visualizations. This disconnect has forced data teams to rebuild semantic definitions across multiple platforms, creating friction for analytics engineers, analysts, and business users.

The dbt Semantic Layer connector is now available in Tableau Cloud as part of Tableau 2025.2. For organizations leveraging both the dbt platform and Tableau Cloud, this integration eliminates a critical gap in the modern data stack. Data teams can now seamlessly connect their dbt semantic layer directly to Tableau Cloud, ensuring trusted analytics built on trusted and centrally defined metrics.

Empowering Every Analytics Role

  • Business Users gain access to trusted, self-service analytics capabilities in the cloud, with the confidence that their numbers are based on centralized business logic.
  • Data Analysts can focus on generating insights rather than reconciling or duplicating metric definitions in Tableau and the dbt platform. They can create and share analytics insights, knowing that metrics are standardized across platforms and departments.
  • Analytics Engineers no longer need to maintain separate semantic layers for cloud and on-premises deployments. They can define metrics once in dbt and trust that those definitions are propagated to Tableau properly and consistently.

Enabling Trusted Self-Service Analytics in the Cloud

Tableau Cloud support for the connector will enable:

  • Enhanced Data source and Dashboard Authoring: Published data sources and dashboards can be authored in Desktop or natively in web authoring
  • Delegated aggregation: Published data sources in Tableau Cloud can be connected live to the entire semantic layer model or saved queries, delegating metric definition and aggregation to the dbt platform
  • Tableau Pulse Integration (coming soon): Pulse metrics will soon be able to directly connect to dbt models and metrics

Technical Implementation and Getting Started

The dbt Semantic Layer connector and driver are available for all Tableau Cloud customers. 

To get started with the connector in Tableau Cloud, you will need:

  • A dbt Cloud Team or Enterprise account running dbt version 1.6 or higher
  • The dbt Semantic Layer configured in dbt
  • A personal access token (PAT), host name, and environment ID from dbt
  • For Tableau Desktop, the connector (.taco) file and driver installed on each user’s machine
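Before wiring the connector up in Tableau, it can help to confirm that the token and environment ID actually reach the Semantic Layer. The sketch below builds the kind of GraphQL request dbt's Semantic Layer metadata API accepts for listing metric names; the endpoint host and query shape are assumptions based on dbt's Semantic Layer APIs, so check them against your dbt Cloud region and plan before relying on this.

```python
# Sketch: a pre-flight check of the credentials the Tableau connector dialog
# asks for (host, environment ID, token). Endpoint and query shape follow
# dbt's Semantic Layer GraphQL API and may differ by region or plan.
import json
import urllib.request

def metrics_query(environment_id: int) -> dict:
    """GraphQL request body listing the metric names Tableau should see."""
    return {"query": "{ metrics(environmentId: %d) { name } }" % environment_id}

def list_metrics(host: str, environment_id: int, token: str) -> list:
    req = urllib.request.Request(
        f"https://semantic-layer.{host}/api/graphql",
        data=json.dumps(metrics_query(environment_id)).encode(),
        headers={"Authorization": f"Bearer {token}",
                 "Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:  # network call; run manually
        payload = json.load(resp)
    return [m["name"] for m in payload["data"]["metrics"]]
```

If the call returns your metric names, the same host, environment ID, and token should work in the Tableau connection dialog.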

Collaborate in the flow of work

Making dbt metrics and Tableau Pulse Metrics available in the flow of your work should be easy and trusted. A Slack or Teams user will be able to use the Tableau app to surface Pulse Metrics, search for unpublished dbt metrics, and publish dbt metrics to Pulse with one click.

(Coming soon!)

Looking Ahead: The Future of dbt and Tableau 

Enabling the dbt Semantic Layer in Tableau Cloud is the first step toward standardizing external metrics in Tableau. We are excited to bring deeper integrations, such as an improved UI for dashboard authoring with delegated metrics and the ability to combine external metrics with Tableau-defined metrics or even other relational data sources.

The dbt Labs and Tableau partnership is focused on enabling trusted data and analytics, with any data platform of choice. As we continue to partner with dbt Labs, we're excited to see how our community will leverage both platforms to accelerate their analytics initiatives and drive more confident decision-making across their business.

How do I get started?

To get started, enable the dbt Semantic Layer connector on your Tableau Server or Tableau Cloud site today, and set up Data Health Tiles as web page objects in your dashboards.

We’re excited to add more functionality to our integration—stay tuned for future product announcements!