Tableau Blog

Everything we do is driven by our mission to help people see and understand data.

How to Improve Data Readiness for Tableau Cloud

2025-01-10 05:59:16

Candice Vu | Francis Dejonckheere, Senior Principal Success Manager

How often have you heard, “Without good data, you won’t have good AI”? It sounds simple, but according to the Salesforce State of Data and Analytics Report, “ensuring scalable, reliable data” is one of the top analytics challenges organizations face. This challenge has significant downstream effects: organizations that cannot confidently confirm they have reliable data cannot confidently adopt AI or self-service capabilities. To scale responsibly around AI- and self-service-driven capabilities, organizations need a better process to explore, improve, and validate the data sources that drive them.

This blog provides the tools to monitor your Tableau data sources more effectively and a framework to drive iterative improvements, so you can be more confident in your data.

The framework consists of four steps:

  1. Monitor your data
  2. Select a meaningful objective
  3. Create and promote data assets
  4. Enable your users

Step 1: Monitor your data sources

Improving data confidence starts with a baseline understanding of your data. Consider one of your top-level Tableau Projects, and ask yourself:

  • How many data sources are in that project?
  • How do users distinguish trusted, governed sources from experimental ones?
  • Is there supporting documentation to help new users understand the data?

If you can’t answer these questions, scaling for AI or self-service discovery may make you nervous. The first step to alleviate these concerns is to monitor and explore your data sources. Any Tableau Cloud Admin can do so by connecting the Data Source Manager Accelerator to your Tableau Cloud Site and then sharing it with other users.

Start by reviewing the “Know your data sources” section to see how users can interact with their data.

A published data source makes a single source of data accessible to permissioned users, enabling self-service and Tableau Pulse capabilities. Certified data sources go a step further, assuring users that the data has passed organizational governance standards and can be trusted. Embedded data sources, however, live inside a Tableau workbook, making them inaccessible for exploration outside of the dashboard through web authoring or Tableau Pulse. Monitoring and managing your data assets across these categories will help you assess your readiness to embrace Tableau’s AI and self-service capabilities on your Tableau Sites.
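If you prefer to script this inventory alongside the Accelerator, the Tableau Server Client Python library can pull the same information. Here is a minimal sketch, assuming a personal access token; the site URL, site name, and token values are placeholders:

    # Minimal sketch: inventory published data sources on a Tableau Cloud site
    # with the Tableau Server Client library (placeholders throughout).
    import tableauserverclient as TSC

    auth = TSC.PersonalAccessTokenAuth("token-name", "token-secret", site_id="your-site")
    server = TSC.Server("https://10ax.online.tableau.com", use_server_version=True)

    with server.auth.sign_in(auth):
        for ds in TSC.Pager(server.datasources):
            label = "Certified" if ds.certified else "Published"
            print(f"{label:10} {ds.project_name} / {ds.name}")

    # Embedded data sources live inside workbooks, so they won't appear in this
    # list; the Accelerator surfaces those separately.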

Step 2: Select a meaningful objective

Instead of overwhelming yourself by attempting to improve all of your data at once, focus on achieving short-term wins on meaningful objectives. To identify these use cases, take any potential initiative and a digital whiteboard, and answer the following questions for the initiative:

  • Does it impact relevant organizational goals?
  • Can the necessary data feasibly be brought into Tableau?
  • Can analytics drive meaningful progress on it?

The initiatives you choose to focus on should confirm all of the above: they impact relevant organizational goals, use data that can feasibly be brought into Tableau, and demonstrate that analytics can indeed drive meaningful progress. Once confirmed, you have a worthy use case to focus on. Create a designated Tableau Project and follow the next step to equip users with explorable data assets.

Learn more about creating value maps to help select meaningful objectives on our website.

Step 3: Create and promote data assets

You’ve listed the necessary data sources for your initiative in Step 2 above; now make sure they are available on your Tableau Site by referring to the Data Source Manager Accelerator.

Here, all of your data sources are categorized as Certified, Published, or Embedded, as discussed in Step 1. Filter the Accelerator for your Tableau Project, and explore the following sections of the dashboard to learn more about your data.

Embedded Data Sources (bottom right)

On the bottom right, you see all your embedded data sources, sorted by how often they repeat across workbooks. Remember, these data sources are embedded within dashboards, meaning they cannot be explored with Tableau Pulse or web authoring. Review this section to:

  1. Consider publishing highly-used embedded data sources to enable features and consistency
  2. Redirect embedded dashboard connections to their Published/Certified equivalent, if it exists 

Published Data (bottom left)

On the bottom left, you see all your Published data sources, sorted by user activity. While these data sources enable enhanced Tableau features, they lack any indication of whether the data has passed governance checks and can be trusted. Review this section to:

  1. Consider retiring unused data sources
  2. Consider certifying highly-used Published data sources
  3. For certification candidates, collaborate with data owners and experts to complete the certification checklist

Learn more about how to publish data sources on our website.

Certification Checklist (hover over the checklist on the dashboard)

The certification checklist is your organization’s process for vetting data sources, which ensures users understand why they can trust their data. This checklist typically includes a variety of checks to confirm accuracy of the data and the presence of supporting documentation. While we can’t directly monitor data accuracy with this Accelerator, we can track other variables to help data owners organize their certification efforts. Hover over the checklist on the dashboard to view the following six checklist items for each data source.

  • Licensed Owner: Confirms the data owner is still a licensed Tableau user
  • Refreshed Last 30 Days: Confirms the data source has been refreshed in the last month
  • Description: Confirms the data source description includes specific background about the data and guides users where to go with questions (format customizable by you)
  • Data Source Type: Confirms the data comes from preferred sources and databases
  • Title Format: Confirms the title is legible without special characters
  • Accessed Last 90 Days: Confirms users still use the data


Note: This is a starting point. Customize your checklist items by editing the “Score_” calculations in the Accelerator or adding new ones.

Once a data expert vouches for the accuracy of the data, the data owner can use this Accelerator to ensure the necessary supporting documentation is included. These combined checks, done inside and outside of this dashboard, will ensure data sources are ready to receive the Certification label in Tableau.
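If your site has many data sources, parts of this checklist can be pre-screened programmatically. Below is a rough sketch, assuming the Tableau Server Client library; it approximates three of the six items (owner licensing, refresh recency, and description presence), and all names and URLs are placeholders:

    # Rough sketch: pre-screening certification checklist items with the
    # Tableau Server Client library (placeholders throughout).
    from datetime import datetime, timedelta, timezone
    import tableauserverclient as TSC

    auth = TSC.PersonalAccessTokenAuth("token-name", "token-secret", site_id="your-site")
    server = TSC.Server("https://10ax.online.tableau.com", use_server_version=True)

    with server.auth.sign_in(auth):
        cutoff = datetime.now(timezone.utc) - timedelta(days=30)
        for ds in TSC.Pager(server.datasources):
            owner = server.users.get_by_id(ds.owner_id)
            checks = {
                "Licensed Owner": owner.site_role != "Unlicensed",
                # Timestamps from the API are UTC; normalize before comparing.
                "Refreshed Last 30 Days": bool(
                    ds.updated_at
                    and ds.updated_at.replace(tzinfo=timezone.utc) >= cutoff
                ),
                "Description": bool(ds.description),
            }
            failed = [item for item, ok in checks.items() if not ok]
            if failed:
                print(f"{ds.name}: review {', '.join(failed)}")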

Certified Data (top)

At the top, you’ll find all data sources given the Certification label on Tableau Cloud, sorted by user activity, with an icon confirming compliance with the checklist.

These are your prized data assets. They have been reviewed by a data expert to ensure accuracy, supplemented with supporting resources to address user questions, and have passed organizational governance standards to ensure trust and consistency. Review this section to:

  1. Confirm data adoption
  2. Ensure compliance with the certification checklist
  3. Consider retiring unused data
  4. Identify missing data sources from step 2

Learn more about how to certify data sources on our website. 
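Once a data source clears the checklist, the Certification label can be applied in the Tableau Cloud UI or programmatically. Here is a hedged sketch using the Tableau Server Client library; the data source name and certification note are placeholders:

    # Sketch: applying the Certification label after the checklist is complete.
    import tableauserverclient as TSC

    auth = TSC.PersonalAccessTokenAuth("token-name", "token-secret", site_id="your-site")
    server = TSC.Server("https://10ax.online.tableau.com", use_server_version=True)

    with server.auth.sign_in(auth):
        opts = TSC.RequestOptions()
        opts.filter.add(TSC.Filter(TSC.RequestOptions.Field.Name,
                                   TSC.RequestOptions.Operator.Equals,
                                   "Sales Pipeline"))  # placeholder name
        ds = server.datasources.get(opts)[0][0]
        ds.certified = True
        ds.certification_note = "Passed the organizational certification checklist."
        server.datasources.update(ds)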

Step 4: Enable your users

With a defined symbol to distinguish trusted data and a credible process to back it up, it’s time to enable your users. Start by informing your users via discussion forums, user group meetings, and other channels that they now have access to new, high-quality data sources. Next, teach them best practices on how to validate and explore these data assets by guiding them to free Tableau Modules on Trailhead. Lastly, regularly check the Data Source Manager Accelerator to ensure data adoption levels meet expectations. 

A framework to improve data readiness

By implementing our four-step framework for a meaningful objective, you have successfully converted the challenge of “ensuring scalable, reliable data” into an ongoing discussion. This dialogue unlocks new features, improves users’ speed-to-insight, and creates a catalog of quality data ready to inform AI and self-service at scale across your business. Now, identify the next meaningful initiative, and repeat the process. Eventually, you’ll flip the narrative and be able to say, “we have great data, so we know we have great AI.”

Download the Data Source Manager for Tableau Cloud Accelerator on the Tableau Exchange to start monitoring your data sources.

January 11, 2025

Tableau Community-Driven Innovation Recap: Product Ideas Released in 2024

2024-12-21 06:44:13

Danika Harrod | Dan Jewett, Senior Vice President of Product Management

As 2024 draws to a close, we want to express our heartfelt gratitude to the incredible Tableau Community. Your boundless creativity and innovative ideas have been a constant source of inspiration, driving us to develop products that enhance and enrich your experience. It’s your feedback and collaboration that help us push the boundaries of what’s possible in data visualization and analytics.

Thank you, DataFam, for inspiring, voting on, and sharing feedback about the features released in 2024! This year we released features that address 150+ ideas with 46,000+ points in the Forums, across Tableau Desktop, Tableau Cloud, Tableau Pulse, and more. And we made progress on providing visibility into the ideas on the Tableau Community Forums: this year we updated more than 2,900 ideas.

Here are some of the features inspired by ideas you submitted on the forums:

Viz Extensions (2024.2) & Table Viz Extension (2024.3)

See this idea on Sankey Charts and Thierry Jakercevic’s idea on tables

You asked, and we delivered—unlock your creativity and speed to insight with the latest updates to table formatting in Tableau. These enhancements, including resizing, sorting, and conditional formatting, are designed to help you visualize and present your data more efficiently and clearly. Together, these features not only streamline your workflow but also improve clarity and draw attention to important data points and anomalies.

For developers, Viz Extensions provide a unique opportunity to expand Tableau’s analytics capabilities with custom visualizations. These can be developed using any JavaScript visualization library and easily integrated into Tableau through our Extensions API. This makes it straightforward to deliver personalized visualization tools that can be scaled to every analyst in your organization and shared with the wider Tableau Community.

 

"Tables in Tableau have always been in the forefront of user feedback. It's a critical means for how our users deliver analytics, and stands to have more dedicated usability improvements. The community advocacy around these themes helped us build the business case with leadership that we should make a more dedicated approach as a Viz Extension, delivering to a variety of feedback on how users specifically wanted to design and format tables inside Tableau. It provided us the quantitation such that we could discuss usability in Tableau with decision makers, and help them rationalize that the investment to this experience is a data driven decision. The ideas forum continues to be how we think about the key issues we want to explore in our long term roadmap and specifically how we can develop the foundations that allow for us to be more responsive to the feedback as we look ahead."

We encourage developers to join our DataDev developer community to share feedback and ideas for new extensions. Developers can learn more about building new Viz Extensions in the Tableau Extensions API documentation.

Accessibility: Google Fonts in Tableau Cloud (2024.3)

Idea submitted by stelloprint dev

You spoke, and we listened! Support for Google Fonts is a significant win for our front-end developers. Because these are web-safe fonts, they can be used without having to install them on all client machines. This feature helps provide a consistent experience across all viewers’ browsers while maintaining the intended look and feel of your data visualizations and dashboards.

 

"Thanks to our community’s input and ideas regarding our product support for web safe fonts. From their requests and feedback, we were able to prioritize and focus on adding new Google fonts to Tableau Cloud. These ten new fonts add new variety and options for our Cloud customers while also being web safe fonts. This adds a huge amount of value for customers needing a consistent representation of their data across browsers."

Multi-fact Relationships (2024.2)

Idea submitted by Cristhian De la Hoz

The Tableau Community was ecstatic with the release of Multi-fact Relationships! This data modeling capability allows you to combine multiple fact tables and gain insights from complex data models. With Multi-fact Relationships, you can easily connect different data sources and analyze them together, without having to blend them. This is especially useful when you have dimensions that share the same meaning across tables, or when analyzing data that goes through different stages. To use Multi-fact Relationships, simply identify your base tables and create relationships between them and shared dimension tables. With this powerful tool, you can unlock new types of analysis and accelerate your time to insights. 

Say goodbye to linking fields - easily blend data sources with shared tables and trees.

 

"The Multi-Fact Relationship update to the Tableau relationship data model brings together 17 different community ideas and discussion threads into a single feature. According to one customer, "MFR is the most consequential update to analytics by Tableau in the past three years!" We accomplished this by working hand-in-hand with our community to understand their analytic scenarios, iterate through multiple designs that changed the key branding elements of Tableau, and collaborate with partners to help the broader community to build more flexible data models that can answer more sophisticated analytic questions; so they can have fewer published data sources that can power more vizzes and dashboards. We have a tight feedback loop between community and the product team to evolve Tableau as the #1 analytics tool-of-choice for customers."

How the Tableau Community can help us innovate

Since 2011, the Forums have seen over 11,000 ideas contributed by nearly 4,000 members of our community. To everyone who’s posted ideas, voted, offered feedback, helped others learn new features, or shared their experiences: thank you!

To create a better experience for sharing ideas moving forward, we transitioned our platform to the Salesforce IdeaExchange at the end of this year.

The IdeaExchange will host all of the ideas that are already on the Tableau forums and offer functionality the old platform lacked, including a community-driven prioritization system, the ability for the community to propose merging ideas, and an activity page to monitor the status of ideas you care about. A great way to learn about the exchange (besides exploring it yourself) is through this Trailhead.

In addition to sharing your ideas on the IdeaExchange, we’ve compiled a list of ways for you to get involved and help us shape our products below. We’re excited to hear from you!

  • Submit ideas (or vote up ideas) on the Salesforce Idea Exchange
  • Join our Tableau Research Program, where you can provide feedback on UX, design, and pricing & packaging directly to our product teams.
  • Join the Tableau Developer program to provide feedback on our APIs and work with our team on building the solutions on the Tableau platform.
  • Come to Tableau Conference 2025 and interact with our devs at Tableau Labs.

Thank you for your unwavering support and for being an integral part of our journey. We can’t wait to continue to deliver features you'll love in the year ahead!

December 23, 2024

Learn the Basics of Well-Structured Data

2024-12-06 07:11:37

Danika Harrod | Sam Priddy, Sr. Audience Marketing Manager, Tableau

We define Data Literacy as the range of capabilities that describe someone’s ability to explore, understand, and communicate with data. These days, when we have questions, there’s plenty of data around for us to find the answers, and having some data literacy makes that possible. Unfortunately, data isn’t always structured in a way that makes it easy to use, which is why we’ll spend time exploring the different characteristics of data, the ways it can be structured, and why. Lastly, we’ll touch on how to restructure data to make it work better for your needs.

Know what to look for in data

There are lots of words that are used to describe the details of an item: traits, characteristics, specifications, descriptions. All of them help us find the information we need to determine if something is a good fit for what we’re doing. Recently, I needed to replace a broken part on my bike. For the bike to work right, the replacement I bought had to match the specifications of the original and the only way to be sure of that was to read the descriptions of the parts I was considering buying. It’s similar when looking at data: for us to get the answers we need, the data has to have the traits that make finding our answer possible, and the only way to be sure it does is to know what qualities it needs to be a good fit.

The first trait is High Volume. More (relevant) data gives us a much more reliable result. If one person tells you a restaurant is good, you might believe them. However, if 100 people tell you a restaurant is good, it’d be hard to believe otherwise. The same goes for our professional data: more records telling us something is good or bad give us higher confidence in our results.

The second trait is Historical. If I visited Seattle for the first time and it was sunny the first day of my visit, it would be naive of me to say that it’s always sunny in Seattle. I’d need data stretching back quite a while to feel confident in making a statement like this and it’s no different for your analysis. The more historical data you have, the more likely you are to accurately anticipate what’s to come in the future.

Third we have Detail. Recently, the water pressure in my kitchen sink dropped significantly, forcing me to troubleshoot the situation. First I checked the water supply, then the valves under the sink, then the line going to the faucet, and finally the faucet itself, getting more detailed with each dead end. I finally discovered that debris in the faucet was blocking the water flow. It’s no different than exploring our data to understand why something is happening in our business or lives. The more detail we have in our data, the more depth and accuracy we can get in our answers and insights, which we can confidently use to make the best decisions.

Lastly is Consistency and Standardization. Think about the last time you were in a conversation with a friend and you were using the same words but had different understandings of what they meant. You end up with miscommunications that potentially cause issues down the road. Similarly, inconsistency in your data (field names, date formats, number formats) and a lack of standardization in how data is recorded result in numbers that aren’t an accurate representation of what actually happened.

Imagine that you work at a cell phone store and you return five iPhones. When typing in what they were, you type: iPHONE5, I phone5, iphone 5, I Phone 5, and Iphone Five. You then try to report on how many iPhone 5s you returned, and instead of your system showing five iPhone 5s, it shows one return each of five different iPhones, because it’s impossible for your system to know they are the same thing. Standardizing on a naming convention, date format, or currency prevents situations like this and provides the consistency needed to get an accurate picture of your data.
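To make the iPhone example concrete, here is an illustrative Python snippet using pandas that collapses the five inconsistent entries into one canonical name. The normalization rule is ad hoc and only handles the variants above:

    # Illustrative only: collapsing inconsistent product names with pandas.
    import pandas as pd

    returns = pd.DataFrame(
        {"product": ["iPHONE5", "I phone5", "iphone 5", "I Phone 5", "Iphone Five"]}
    )

    def normalize(name: str) -> str:
        # Lowercase, drop spaces, spell out "five" as "5", then compare.
        key = name.lower().replace(" ", "").replace("five", "5")
        return "iPhone 5" if key == "iphone5" else name

    returns["product"] = returns["product"].map(normalize)
    print(returns["product"].value_counts())  # iPhone 5: 5

With a standardized value, the report correctly shows five returns of the same product.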

Know what well-structured data looks like

It’s bound to happen. At some point you’ll come up against data that just feels difficult to work with. You may not know why, but I’d bet it comes down to the data not being well-structured. That’s a bit of a loaded statement, as the meaning of “well-structured” can differ based on the use case and the systems or applications you’re using. However, there are some basic data structure principles that should improve your experience when working with data.

First, you need to figure out how it’ll be used. Are you going to use software to visualize the data? Or are you presenting the data in a spreadsheet? Or are you preparing the data for someone else? Each method will change the way the data needs to be structured. There are exceptions to every rule, but once you know the way it’ll be used, these tips are a good rule of thumb to get you started:

Columns and rows make up the core of our structure. Just like columns in buildings, columns in data are the vertical sections of our data structure stretching up and down the page. In the example below, “Date” would be a column of information along with 7 other columns. Rows on the other hand are the horizontal sections of our data structure spanning left to right across the page. In the example below there are 12 rows of data.

If you are using software like Tableau to visualize your data, each row should be an instance or an event, and each column should represent a detail about that instance or event. In the image below, the first row (or event) is a commute. The details (columns) are the date, the time of day, the temperature, the precipitation type, etc. Each row after the first row is an additional event where we collect the same types of details that we did in the row (event) above:

This allows the software to easily search through your data when trying to retrieve the answers to your questions. The downside to this format is that it’s hard for humans to consume, which is why it’s not uncommon to see data structured like this when people don’t have access to data visualization software:

The format above is human readable but requires you to manually calculate totals and aggregations, and manipulate structure. This structure will also continue to grow wider over time as new data is added, making it much harder to visually analyze without lots of scrolling. Lastly, having individual columns for each date or having multiple types of values within a single column makes it almost impossible for an application to help you analyze your data.

Want to learn more about well-structured data? Check out this Trailhead module today!

What to do if your data is not well-structured

Rather than let these issues prevent us from getting answers, there are a few simple things we can do to solve a good amount of our data problems.

Go back to the source. Restructuring your data can help your immediate problem but doesn’t solve the larger issue of how it got that way in the first place. It also creates additional data sources that may not align to your corporate standards which can be a problem when comparing results across your business. It’s always the best decision to start with your data team to see what’s possible. Remember, for them to make the right changes, they’ll need plenty of context so come prepared to explain what you are hoping to get from the data.

Make the change. In a perfect world your data team would help you fix poorly structured data, but that isn’t always possible, and not everyone has a data team. When it’s not, there are a couple of common structure issues that you can address on your own.

Split ‘em Up! The first issue is when fields that should be separate appear as a single line of information. For example, if the fields [Name] and [Customer ID] showed up together as “SamPriddy CN1357WA” and needed to be separate fields instead, you’d perform what’s called a “Split”. A split gives the user the ability to choose which “delimiter” gets used to decide where to break the field apart. A delimiter is just the name for the character in the line of information that the user chooses as the breaking point. So in the case of the example above, where the field says “SamPriddy CN1357WA”, we could use the space character as our breaking point, or delimiter, since we want to split the name apart from the customer ID. This functionality is built right into Tableau Desktop and Tableau Prep with no coding required. Want to learn how? Check out this help article for more information.
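Outside of Tableau, the same split is a one-liner in Python with pandas; the column names here are hypothetical:

    # Splitting a combined field on the space delimiter with pandas.
    import pandas as pd

    df = pd.DataFrame({"combined": ["SamPriddy CN1357WA"]})
    # n=1 splits on the first space only; expand=True returns two columns.
    df[["Name", "Customer ID"]] = df["combined"].str.split(" ", n=1, expand=True)
    print(df[["Name", "Customer ID"]])  # Name: SamPriddy, Customer ID: CN1357WA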

Pivot, Pivot. The second issue relates back to the commute data example above. In its human readable form the data is wide, having a column for each date, with a mix of different types of values filling in the column below. You’d use a “Pivot” to change the data from being wide to being long instead.

This is by no means the only way to do a pivot, but it illustrates the concept. Here’s a simplified example of the process:

The data below is formatted for human consumption. We want to use this in Tableau, though, and this format won’t work. We’ll need to do a pivot to get it into the appropriate structure.

First we’ll make space by shifting our data out of the way. We’ll then make a column header for the fields that would be better treated as a category. In the example below, we’ll copy/paste the “Product Type” fields down column A for the number of values that showed up across their row. Because “Appliances” had four values (one for each quarter), we’ll copy/paste “Appliances” until we have four rows of this product type. Similarly, we’ll pivot the individual quarters down and to the left, with column B being our pivot point, changing the data from horizontal to vertical. We’ll then rename the column “Quarter”, as it now contains all quarters, not just one specific quarter.

Next, we continue the process until we have all of our product types duplicated for the number of records they originally showed (four each in this example). This is what we mean when we say the data gets “longer”: we haven’t added any data, just changed the layout. We’ll also continue copy/pasting the quarters down our “Quarter” column until each product type has a specific quarter next to it.

Once that’s done, all that’s left is to fill in our values. Remember that Q1–Q4 were originally horizontal, so we’ll move our values into the empty column next to the quarter they belong with. We’ll repeat this process until all of our values are where they belong in their new column, which we’ll name “Values.” Depending on your data, choose whatever name best fits the values.

After the final column is populated we’re ready to connect to this data to answer some questions. 

This example is shown to illustrate the concept; in practice, it’s more likely that you’ll do a pivot in a tool like Tableau Desktop or a data preparation tool like Tableau Prep. In those cases, the tools have functionality that eliminates the manual work shown above but ultimately gives you the same results. Want to learn how to do pivots in Tableau? Check out this help article for in-depth instructions.
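For reference, the manual steps above map to a single reshape in Python with pandas; the table below is hypothetical sample data matching the example’s shape:

    # A pandas equivalent of the manual pivot: wide (one column per quarter)
    # to long (one row per product type per quarter).
    import pandas as pd

    wide = pd.DataFrame({
        "Product Type": ["Appliances", "Furniture"],
        "Q1": [100, 80], "Q2": [120, 95], "Q3": [90, 110], "Q4": [140, 70],
    })

    long = wide.melt(id_vars="Product Type", var_name="Quarter", value_name="Values")
    print(long)  # longer, not wider: no data added, only the layout changed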

What next?

Be a sponge, and practice, practice, practice! The best thing you can do is continue to learn and get hands-on as much as possible. Check out our Data Skills homepage for access to tons of Data Literacy resources. Visit Tableau Public to download and use Tableau Desktop for free, along with free learning resources and sample data sets! If you prefer webinars, check out the Datafam Discovery Kit for more resources and updates on upcoming live webinars.

December 6, 2024

How to Use the Cloud Migration App for Small Deployments

2024-11-07 13:31:52

Spencer Czapiewski | Adiascar Cisneros, Manager, Product Management, Tableau

The Cloud Migration App for Small Deployments is an open-source application based on the Tableau Migration SDK that allows administrators to copy content and users from Tableau Server to their Tableau Cloud site with ease. Created to simplify the migration process for Tableau admins with smaller deployments, this tool supports essential content transfers such as workbooks, users, groups, and more, providing a straightforward solution to make the transition to Tableau Cloud smoother.

The Cloud Migration App for Small Deployments is built to meet core migration needs in the following areas:

  • Authentication — The app supports only local authentication (Tableau Server) to Tableau ID + MFA (Tableau Cloud) for migrated users, helping keep the process secure and manageable.
  • Sites — To ensure a focused, efficient migration, the app transfers content for one site at a time.
  • Data sources — All data sources supported by Tableau Cloud are eligible, including those integrated through Tableau Bridge (note that Bridge must be installed and configured separately by the user).
  • Migrated content — The app transfers users, groups, projects, data sources, workbooks, permissions, and custom views, providing a comprehensive content migration experience.
  • Tableau Server version — While the app can migrate from recent Server versions, it is recommended to have the latest version installed.
  • Operating system — The app runs on Microsoft Windows and MacOS, making it easy to integrate into most IT environments.
  • Recommended workload — Though there is no strict limit, the app is designed to handle migrations of around 100 workbooks smoothly. Implementations with larger workloads should treat this benchmark as a signal to consider alternative or supplementary tools.

For more details on how to best prepare for your migration, see the SDK pre-migration checklist.

1. Ensure the app fits your needs

The Cloud Migration App for Small Deployments is ideal for specific use cases. To experience a smooth migration, we recommend administrators make sure their needs are addressed by the tool.

The following are situations in which the tool may not be the best fit:

  • Data management — Tableau Prep Flows and Virtual Connections are not transferred by the app. Administrators will need to migrate these manually.
  • Embedded content — The app migrates Tableau content only. If your Tableau Server content is embedded within other applications, those embedded integrations will require separate adjustments post-migration.
  • Authentication — The tool does not support SAML or OIDC authentication, meaning advanced authentication configurations must be set up separately on Tableau Cloud.
  • Data sources — The app does not support cube data sources or large Hyper extracts (over 25GB). The app supports data sources compatible with Tableau Cloud, so ensure your data fits within these requirements.

If you’re a large organization or have additional needs not supported by the app, refer to the Tableau Cloud Migration page to learn more about additional resources available to support your migration.

2. Set up your migration

After confirming the Cloud Migration App for Small Deployments meets your needs, you’ll only need a personal system (Windows or MacOS) to run it. To download the app, follow the instructions on the app's page.

You’ll start by providing the web addresses (URLs) of your Tableau Server and Tableau Cloud Site, along with your authentication credentials (Personal Access Token Name and Secret).

You also have the option to specify mappings between Tableau Server usernames and Tableau Cloud usernames. This is helpful since Tableau Cloud, unlike Tableau Server, requires usernames in the format name@domain. A file with username mappings can be used, or the app can apply a default domain name to any usernames not listed in a mapping file.
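If you need a starter mapping file, one can be generated with a few lines of Python. Note that the CSV layout below is hypothetical; check the app’s documentation for the exact format it expects:

    # Sketch: building a username-mapping file by appending a default domain.
    import csv

    server_usernames = ["asmith", "bjones", "cdavis"]  # placeholder names
    DEFAULT_DOMAIN = "example.com"                     # placeholder domain

    with open("username_mapping.csv", "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["server_username", "cloud_username"])
        for name in server_usernames:
            writer.writerow([name, f"{name}@{DEFAULT_DOMAIN}"])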

3. Migrate your content

To initiate the migration, simply click the Run Migration button. A window below the button displays progress in real time, and a log file will be generated for reference. Once migration is complete, all selected content should be available in your Tableau Cloud site. We highly recommend reviewing all workbooks and content post-migration to confirm everything operates as expected.

For administrators who may have large content sets or limited time to manually verify the migration, our trusted partner Wiiisdom offers an automated solution to verify workbooks quickly and efficiently.

How to get started

We’re excited to make the transition to Tableau Cloud more accessible than ever with the Cloud Migration App for Small Deployments—and we are here to support you every step of the way.

If you’re ready to get started, contact your account representative or visit the app's page. We hope you enjoy a seamless migration and we look forward to seeing what you create in the cloud!

November 14, 2024

How to Use VizQL Data Service in Your Tableau Cloud Site

2024-10-16 00:09:42

Spencer Czapiewski | Joe Chirilov, Director, Tableau Product Management | Megan Kelly, Associate Product Marketing Manager

Since the launch of the VizQL Data Service Developer Preview, the DataDev community has shown impressive creativity and innovation. The ability to programmatically query Tableau’s published data sources has empowered developers to simplify workflows, integrate custom applications, automate processes efficiently, and more. Thanks to your valuable feedback and the real-world use cases we’ve seen, we're excited to expand the developer preview to all Tableau Cloud customers.

Are you on-premises with Tableau Server? We haven’t forgotten about you! The developer team is hard at work building VizQL Data Service for general availability for both Tableau Cloud and on-premises Server. We encourage you to join the Developer Preview via our beta site and begin exploring the benefits of VizQL Data Service for yourself.

Your data, your site, your way

If you’ve been waiting to test the VizQL Data Service API with your own data sources in Tableau Cloud, now is the time! As part of the 2024.3 release, you can now experiment with the VizQL Data Service API using your own data within your Tableau Cloud site while it remains in beta. This expansion moves beyond the limitations of the initial beta environment, allowing more users to explore its potential. Whether you’re integrating Tableau data into customer-facing applications or automating critical workflows, this expanded access allows you to use your real, published data within your Tableau Cloud site.

The service remains in beta, and is off by default—meaning you’ll need to grant API Access permissions at the data source level to turn on programmatic queries. Let’s get more into that!

Introducing API Access permissions

Available now in Tableau Cloud as part of the 2024.3 release, API Access permissions offer more control over who can query published data sources via the API.

As developers explored the capabilities of VizQL Data Service, the need for more detailed control over data access became clear. Many of you wanted the ability to manage who could programmatically query data sources for various administrative reasons. In response, we’ve introduced API Access permissions, allowing you to assign specific users or groups the ability to query data sources directly within the data source permissions dialog.

While users still need the existing “view” ability to access a data source, this new permission provides an additional layer of control. For example, an admin might want to manage consumption costs on back-end systems (e.g. Snowflake) or control load and usage on on-premises servers.

Get started easily with the Postman collection

In addition to the new permissions feature, we are making it easier to access and work with our developer tools. To help you get up and running with the VizQL Data Service API, we have created a Postman Collection. This resource is designed to simplify testing and interacting with the API, giving you a straightforward way to start querying your data programmatically. You can find this collection in our GitHub repository.
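To give a flavor of what a call looks like, here is a minimal Python sketch of an aggregated query, following the request shape used in the developer preview and the Postman collection. The pod URL, credentials token, data source LUID, and field captions are all placeholders, and the API may change while in beta:

    # Minimal sketch: querying a published data source via VizQL Data Service.
    import requests

    SITE_URL = "https://10ax.online.tableau.com"             # placeholder pod
    TOKEN = "<X-Tableau-Auth token from a REST API sign-in>"  # placeholder
    DATASOURCE_LUID = "<LUID of a published data source>"     # placeholder

    body = {
        "datasource": {"datasourceLuid": DATASOURCE_LUID},
        "query": {
            "fields": [
                {"fieldCaption": "Category"},                  # group by
                {"fieldCaption": "Sales", "function": "SUM"},  # aggregate
            ]
        },
    }

    resp = requests.post(
        f"{SITE_URL}/api/v1/vizql-data-service/query-datasource",
        json=body,
        headers={"X-Tableau-Auth": TOKEN},
        timeout=30,
    )
    resp.raise_for_status()
    print(resp.json())  # aggregated rows as JSON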

Try VizQL Data Service

With the latest updates, Tableau Cloud users can start using API Access permissions as part of the 2024.3 release. Head over to your Tableau Cloud site and start exploring how these new capabilities can enhance your workflows and give you more control over data access. And don't forget to join the VizQL Data Service Developer Preview to share your feedback!

October 16, 2024

What to Know About Tableau Einstein: FAQ

2024-10-09 05:14:27

Spencer Czapiewski | Southard Jones, Chief Product Officer, Tableau

Since the announcement of Tableau Einstein, we’ve been thrilled by all the enthusiasm and curiosity from our community, customers, and partners. Your excitement is palpable, and we’re grateful for the outpouring of interest and feedback. As with any new innovation, we also understand that you still have many questions about the new platform and what it means for your organization, the community, and your career.

We've gathered some of the most common questions about Tableau Einstein from our community to help provide some clarity—and we plan to update this post as more information becomes available. We appreciate you for sharing what you love about Tableau and partnering with us to continually improve the product for everyone.

What is Tableau Einstein and why now?

Tableau Einstein is a reimagined analytics experience: an open, composable, enterprise-grade data and analytics platform built on the Salesforce Platform and deeply integrated with Agentforce, a suite of customizable AI agents and tools. In today’s autonomous revolution, we have a unique opportunity to bridge the gap between siloed data and everyday workflows where people act on insights. This means we can harness AI to help tackle the most common and pressing data and analytics challenges—including fragmented data landscapes, a lack of trust in data, overlooked insights, and the reusability of analytics assets—in one platform.

Learn more about how we’re addressing these challenges in our blog, “What is Tableau Einstein?” and see Tableau Einstein in action by watching our keynote at Dreamforce 2024.

What are the benefits of Tableau Einstein being built on the Salesforce Platform?

Building Tableau Einstein natively on the Salesforce Platform allows us to offer a unique and truly unified analytics experience that brings together the best of Salesforce and Tableau technology. Tableau Einstein is the first BI platform with a workflow engine that spans the analytics journey from raw data, to a data lake, a semantic layer, a viz layer, and an action layer. As a completely open and API-first development platform, Tableau Einstein helps accelerate the development of analytics solutions and provides the ability to package up reusable, composable assets and offer them on a marketplace.

Salesforce’s Hyperforce cloud infrastructure provides world-class standards for security, compliance, agility, and scalability, backed by Salesforce’s continued commitment to privacy. With enterprise-grade governance and permissions, your analytics can reach a larger, global audience while meeting regional requirements for security and compliance. With a deep integration to Data Cloud, Tableau Einstein has the most modern data capabilities from zero-copy data ingestion to prep, data management, and semantics. Tableau Einstein is also deeply integrated with Agentforce and provides access to all of the latest no-code/low-code AI advancements and innovations.

Lastly, Salesforce CRM customers benefit from one place to centrally manage user permissions, administration, inherited security, and more.

Is Tableau Einstein valuable for me if I don't use other products from Salesforce?

Yes! Any organization can get more value from their data with Tableau Einstein through capabilities like data unification and orchestration, integrating practical insights where people work, and the ability to build, share, and reuse analytics assets and apps. And you don’t have to be an experienced Salesforce user to use Tableau Einstein.

For example, HR can analyze unstructured data from employee surveys with the help of an AI agent. A retail company can automatically forecast inventory risks, receive proactive alerts and relevant questions for exploring the data, and even take action from within Tableau Einstein, like submitting an inventory request to an ERP system. Because the platform is open and API-first, the possibilities are endless.

If Tableau Einstein is the future of Tableau, what happens to Tableau Cloud, Tableau Server, CRM Analytics, etc.?

Whether you have Tableau Cloud, Tableau Server, CRM Analytics, or any other analytics solution from Salesforce, you and your analytics assets are in good hands and will continue to be invested in. We have robust product roadmaps for each of these existing products today and will deliver on them—see all the latest capabilities we’ve released across the Tableau platform.

Remember that Tableau Einstein is going to be a journey as the vision rolls out; for many use cases today, our existing products will help you achieve your data and analytics goals. As we progress, our investments in Tableau Einstein will continue to drive innovation across these offerings. Additionally, Tableau Einstein is being built to work with existing products so that you can use both of them together as you progress in your journey.

Will existing products and Tableau Einstein work together? What happens to my Tableau assets?

Tableau and Tableau Einstein assets will be different, but compatible. For example, you’ll be able to work with Tableau Semantics from Tableau Einstein in Tableau Desktop. And starting soon, you will be able to output your data from Prep Flows to Tableau Einstein’s data layer (Data Lake Objects, or DLOs). We are committed to offering tooling, guidance, and trusted partners to ensure your critical Tableau assets move forward. Helpful resources will be available soon.

Will Tableau Einstein be available for on-premises customers?

Tableau Einstein requires cloud-based deployment. That said, we are dedicated to supporting our existing on-premises customers via enhancements to Tableau Server so they can benefit from new innovations.

Does the Marketplace replace Tableau Public?

Tableau Public is here to stay as our community's free platform to connect and grow by exploring, creating, and sharing data visualizations about everyday topics of personal passion. We are excited to expand Tableau Public's capabilities to enable our community to create, share, and leverage reusable Tableau Einstein components through the new Marketplace.

What guidelines will there be around IP and data privacy for assets offered in the Marketplace?

IP protection and data privacy are critical not only for the marketplace but for the entire community. These are foundational principles in building Tableau Einstein. By developing Tableau Einstein natively on the Salesforce platform, we can leverage the same robust capabilities that independent software vendors (ISVs) use to ensure the security and integrity of their solutions on Salesforce. This makes IP protection and data privacy central to our strategy. Additionally, we are committed to further investing in advanced tools to prevent, detect, and address any potential issues, ensuring ongoing protection for all our users.

Can I use my own data lakehouse or semantic layer if I’ve already invested in another technology?

We understand that many organizations have made deep investments in other tools, and our roadmap includes connecting them to Tableau Einstein to allow you to use them together. Customers can use their own data lakehouse by leveraging Data Cloud connectors—this includes batch and streaming ingestion, zero copy, and bring-your-own-lake capabilities. We are working on the ability to translate an existing semantics layer from another system to reuse in Tableau Einstein. We are also working on opening the Tableau Einstein semantic layer completely, so that it can work with existing semantic layers in the market (like dbt).

Does Tableau Einstein support multiple languages for global teams who want to standardize and scale data sets, models, and workflows across geographies?

Tableau Einstein will support the 18 languages that core Salesforce products support, with initial English-only support for Insight Services and Packaged App Content, expanding after February 2025. Multi-currency is another area we are looking to support.

How can I get started with Tableau Einstein today? What are the licensing and pricing options?

To get started with Tableau Einstein, Tableau+ is the best path forward as this offering includes many of the modular components of Tableau Einstein, available today. For all customers, this package includes everything you need to get started right now: Data Cloud for unified data modeling capabilities, Tableau Pulse to empower your teams with personalized metrics and digests, and Tableau Agent (formerly Einstein Copilot for Tableau) to begin your path with generative AI.

As we prepare to deliver licensing and pricing options in the near future, we are working diligently to ensure existing analytics customers have an affordable path to Tableau Einstein. 

October 24, 2024