
I thought it might be helpful to share with you a recent experience where I was helping a customer who was complaining of slow-performing workbooks.

When I arrived, their Tableau analyst showed me a couple of workbooks that were performing slowly. What do we mean by slowly? Well, for one of the workbooks, just opening it took ~5 minutes!

OK – time to start looking for problems, but where to begin? Fortunately, I’m presenting a session at our customer conference next week (Designing Efficient Workbooks: Live on Stage) and in it I propose a framework for identifying performance issues. Basically, the top layers are where you are most likely to find performance issues, so you should start by looking at the design of the dashboards and calculations (which are easy to fix) before you look at the data model or the underlying platform (where you often have less ability to change things).

With this framework in mind, I started reviewing the workbook. I can’t include a screenshot of the report, but let me describe it for you: it was a simple design – a single sheet (not even a dashboard!) with a straightforward layout, essentially one big crosstab. That could be part of the problem, as big crosstabs are not the most efficient viz type to render. However, I noticed that the main delay when opening the workbook was in “executing query” rather than “computing layout”, so this probably wasn’t the main issue we needed to fix.

Note: I’ve created a mock-up of the data, anonymised to avoid any embarrassment for the client. All the screenshots you see from here on reflect the original situation but are totally synthetic.

Moving down the framework, the analytics layer is concerned with the complexity of calculations and how you combine and manipulate the data. Surprisingly, this was extremely straightforward for this report – there were no calculations to speak of.

The next layer down is concerned with the data sources and data connections – how you are retrieving the data from its original location. When I opened up the very simple data source, all I found was a single custom SQL connection.

OK – that explains why there were no calculations in the data source and why it was so simple: all the logic is wrapped up in a custom SQL query. It’s also a bit of a red flag… a live connection using custom SQL isn’t a recommended design pattern. Let’s take a look inside and see what we’re dealing with.

Oh my! I think we have found the problem. The customer’s database development team had provided the Tableau analyst with several SQL statements like this from their existing reporting solution. It’s not optimal SQL and I’m sure it could be improved if we tried, but the analyst (like many Tableau users) didn’t have the time or the SQL knowledge to do so, and simply cut and pasted the query into a custom SQL connection.

The result? The whole query needed to run before we could sub-select off the interim result set, and the experience was very slow.
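To understand why this is slow, it helps to know how Tableau uses custom SQL: it treats your statement as a derived table and wraps its own queries around it. Here is a minimal sketch of the pattern – the table and field names are invented for illustration, and the real query was far more complex:

    -- A hypothetical stand-in for the analyst's custom SQL
    SELECT o.order_id,
           c.region,
           o.amount
    FROM   orders o
    JOIN   customers c ON c.customer_id = o.customer_id;

    -- Roughly what the database receives when a viz is rendered:
    -- Tableau wraps the entire custom SQL statement in a subquery
    -- and selects/aggregates from the result.
    SELECT   "region",
             SUM("amount") AS "sum_amount"
    FROM (
           SELECT o.order_id,
                  c.region,
                  o.amount
           FROM   orders o
           JOIN   customers c ON c.customer_id = o.customer_id
         ) AS "Custom SQL Query"
    GROUP BY "region";

When the inner statement is complex, the database optimiser often can’t push the outer filters and aggregations down into it, so the entire inner query has to be evaluated before the outer one can even start.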
What to do? Well, we could follow basic best practice and make this an extract instead of a live connection, so we only run the slow query once – after that, everything points to the extract. But the customer wanted to maintain up-to-date views of their data, so a live connection was preferable. Even more so once we determined that the extract would require frequent intra-day refreshes and was potentially very large.

So, working with the analyst, we managed to reverse engineer the relationship between the tables – in spirit, something like the sketch below.
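I can’t reproduce the customer’s actual schema, so the following is a hypothetical reconstruction (all names invented): the custom SQL collapsed down to a fact table joined to a couple of dimension tables, which can be defined directly in Tableau as joined tables instead of custom SQL:

    -- Hypothetical reconstruction of the relationship between the tables.
    -- In Tableau this would be modelled as joins in the data source,
    -- not as custom SQL.
    SELECT f.transaction_id,
           f.amount,
           d1.region,
           d2.category_code
    FROM      fact_transactions f
    JOIN      dim_customer d1 ON d1.customer_key = f.customer_key
    LEFT JOIN dim_category d2 ON d2.category_key = f.category_key;

Modelled this way, Tableau generates queries that select only the columns and rows each viz actually needs, letting the database optimiser do its job rather than materialising the whole result set up front.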
We then captured the logic for the CATEGORY and the MEASURE fields into a couple of Tableau calculated fields, along the lines of the sketch below.
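The real business logic belongs to the customer, so this is only a shape-of-it sketch in Tableau’s calculation language, with invented field names and conditions:

    // CATEGORY – hypothetical reconstruction of the bucketing
    // logic that previously lived in the custom SQL
    IF [Transaction Type] = "CR" THEN "Credit"
    ELSEIF [Transaction Type] = "DB" THEN "Debit"
    ELSE "Other"
    END

    // MEASURE – hypothetical: choose the relevant column per row
    IF [Transaction Type] = "CR" THEN [Credit Amount]
    ELSE [Debit Amount]
    END

Because these are row-level calculations, Tableau can fold them into the SELECT clause of the queries it generates, so they add essentially no overhead.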
And the result? We reduced the time taken to render the dashboard from ~5 minutes to under 10 seconds – a 30x improvement that the customer was absolutely delighted with.