How to combine the AI strengths of Tableau and GCP effectively
I have been developing and training in Tableau for most of the last decade, and on the first day of a course I often describe it to my learners as ‘agnostic’: from Tableau’s data source connection screen they can connect to databases of almost any type, regardless of the infrastructure provider, and use that data to create visualisations.
Salesforce’s purchase of Tableau in 2019 has not clipped Tableau’s wings when it comes to compatibility with the big three cloud service providers. As we near the end of a huge year for AI, it is a good time to consider what an investment in two platforms, Google Cloud Platform (GCP) and Tableau, can deliver in terms of AI.
This article will explore two ways the GCP-Tableau relationship can be leveraged to bring the benefits of AI into your organisation.
1: Take care of the heavy stuff before democratising the output
Tableau’s ecosystem is built around broad adoption by non-technical staff, who interact with dashboards, save their own filtered versions of reports, export results, or drag and drop fields into ad hoc analyses without writing code.
To support them, a small number of Tableau Creators is established at every Tableau site; they curate data sources to make them more readable and design reports that others can use to reach their answers quickly.
Similarly, although GCP might be established in an organisation, most employees will not have access to the back end, and only a small number of cloud engineers and data scientists will be in touch with the parts of the cloud platform that provide business answers, like GCP BigQuery.
The main driver behind role demarcation is cost; computational resources in cloud platforms are expensive, and inadvertent or wasteful consumption is typically avoided via restrictive access.
To meet the hunger of employees for more nuanced answers to business challenges, including forecasting and predictions, wise organisations will locate the most complex processes inside their cloud platform, where the tools are optimised for handling volume.
BigQuery’s columnar (rather than row-based) storage means a well-written query returns results quickly, and results are cached intelligently, while AI-aided experimentation in GCP Vertex AI produces efficient models that can be retrained regularly as business needs evolve.
Once the data is captured, the heaviest queries have run and the predictive models are tuned, it’s vital to democratise this information by delivering it to the people around the business who can act on the recommendations. Connecting Tableau to BigQuery natively allows predictions stored in BigQuery to be visualised in Tableau, and Tableau’s built-in AI features (Explain Data and Tableau Pulse) can then be used to extract insights from forecasts and models in the natural language that business users understand.
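As a rough illustration of that hand-off, the sketch below materialises forecasts from a BigQuery ML time-series model into a table that Tableau’s native BigQuery connector can then read like any other data source. It assumes an ARIMA_PLUS model has already been trained; every project, dataset, and model name is a placeholder.

```python
# A minimal sketch, assuming a BigQuery ML ARIMA_PLUS model already exists;
# project, dataset, table, and model names below are placeholders.
from google.cloud import bigquery

client = bigquery.Client(project="my-gcp-project")  # placeholder project ID

sql = """
CREATE OR REPLACE TABLE `analytics.demand_forecast` AS
SELECT
  forecast_timestamp,
  forecast_value,
  prediction_interval_lower_bound,
  prediction_interval_upper_bound
FROM ML.FORECAST(
  MODEL `analytics.demand_arima_model`,            -- assumed pre-trained model
  STRUCT(30 AS horizon, 0.9 AS confidence_level)   -- forecast 30 periods ahead
)
"""

# Materialise the forecast; Tableau's native BigQuery connector then points
# at `analytics.demand_forecast` for visualisation and Pulse-style insights.
client.query(sql).result()
```

Materialising predictions into a table, rather than visualising them ad hoc, keeps the heavy computation inside BigQuery while giving Tableau a simple, governed object to connect to.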
2: Allow staff to use the right tool for the job
It may be tempting for strategic teams to seek out a one-size-fits-all approach to data solutions. However, organisations should create space for data specialists to use whichever platform suits the problems they are trying to solve; organisations that do so will be better placed to retain expert staff while fostering an environment of bottom-up innovation.
AI-driven services in GCP can help with:
- Automated data cleaning.
- Detecting and handling outliers automatically (see the sketch after this list).
- Standardising inconsistent data formats without requiring manual intervention.
- Interpreting natural language data at the point of arrival.
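As an illustration of the outlier-detection point above, the sketch below uses BigQuery ML’s ML.DETECT_ANOMALIES inside a pipeline step to flag suspicious rows as they arrive. The model, dataset, and table names are placeholders, and it assumes a time-series (ARIMA_PLUS) model has already been trained on the metric being monitored.

```python
# A minimal sketch of automated outlier flagging in a pipeline step, using
# BigQuery ML's ML.DETECT_ANOMALIES. Names are placeholders; an ARIMA_PLUS
# model is assumed to be trained on the metric being monitored.
from google.cloud import bigquery

client = bigquery.Client(project="my-gcp-project")  # placeholder project ID

sql = """
SELECT *
FROM ML.DETECT_ANOMALIES(
  MODEL `pipeline.metric_arima_model`,       -- assumed pre-trained model
  STRUCT(0.99 AS anomaly_prob_threshold),    -- sensitivity of the flag
  TABLE `pipeline.incoming_metrics`          -- newly arrived data
)
WHERE is_anomaly
"""

# Rows flagged here could be routed to a quarantine table or an alert,
# rather than being cleaned by hand further downstream.
for row in client.query(sql).result():
    print(dict(row))
```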
These techniques can improve the performance of your data pipelines where the volume and velocity of data would likely present computational challenges for traditional on-premises infrastructure.
For example, an organisation might use a Cloud API to collect and process patient healthcare data before feeding it into a large-scale risk model that predicts readmissions, with these processes occurring within the data flow architecture hosted on GCP.
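A simplified, hypothetical sketch of that risk-model step is shown below, using BigQuery ML so that training happens where the data already lives. The dataset, table, and column names are invented for illustration, and the ingestion and feature engineering are assumed to happen upstream in the GCP pipeline.

```python
# A minimal, hypothetical sketch of the readmission risk-model step in
# BigQuery ML. Dataset, table, and column names are placeholders; ingestion
# and feature engineering are assumed to sit upstream of this step.
from google.cloud import bigquery

client = bigquery.Client(project="my-gcp-project")  # placeholder project ID

train_sql = """
CREATE OR REPLACE MODEL `health.readmission_model`
OPTIONS (
  model_type = 'LOGISTIC_REG',        -- binary readmission outcome
  input_label_cols = ['readmitted']
) AS
SELECT age, length_of_stay, num_prior_admissions, primary_diagnosis, readmitted
FROM `health.discharge_features`      -- assumed curated feature table
"""

# Training runs inside BigQuery, so the sensitive records stay within the
# governed GCP environment rather than being exported for modelling.
client.query(train_sql).result()
```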
This approach aligns with the ‘right tool for the job’ (RTFTJ) strategy, by leveraging the performance of GCP tools, ensuring low latency along with regulatory compliance through data lineage and model transparency.
The same organisation, with Tableau embedded in a custom portal and real-time BigQuery integration, can carry automated insights from the risk model straight to decision makers.
Using Tableau for data visualisation and communication also aligns with RTFTJ, because Tableau prioritises flexibility in visualisation design, so reports can be carefully tuned to communicate complex insights to end users effectively. Moreover, its visualisations support interactive exploration, which in turn supports compliance with regulatory frameworks, because recommendations can be drilled into and the underlying data explored.
The AI capabilities of GCP and Tableau tend to be complementary rather than overlapping. Where GCP excels at the heaviest computational challenges, Tableau shines in making insights available to those who can act on them. Leverage the capabilities of both in tandem to secure the success of your data and AI strategy.
If you're interested in upskilling your organisation's data capabilities, check out our data training solutions.