The primary table in our BigQuery dataset has about 40 columns and approximately 9.6 million rows, and it is joined to seven other metadata tables. Refresh times are running between 30 and 60 seconds. Is this normal, or is there something I can do to reduce the refresh time?
30-60 seconds is not surprising for BigQuery. In our experience, much of that time is spent transferring the result set from BigQuery to Tableau, so anything you can do to filter out rows and/or remove columns from your queries is likely to speed things up.
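As a rough illustration of that advice (the project, dataset, table, and column names below are hypothetical placeholders), a query that selects only the columns Tableau actually uses and pre-filters rows on a date range can substantially shrink the result set that has to travel over the wire:

```sql
-- Sketch only: replace the identifiers with your own.
-- Select just the needed columns instead of SELECT * ...
SELECT
  order_id,
  customer_id,
  order_date,
  total_amount
FROM `my_project.my_dataset.orders`
-- ... and filter rows server-side so fewer cross the wire.
WHERE order_date >= DATE_SUB(CURRENT_DATE(), INTERVAL 90 DAY);
```

Narrowing the projection and pushing the filter into BigQuery means the reduction happens before transfer, rather than after Tableau has already pulled the full 40-column, 9.6-million-row result.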