You are correct that the API on the Google side will restrict the amount of data that is transferred. We document this here - Google Analytics
Are you calling GA for all of the data for the last 3 years at one time, or are you running an incremental extract? I pull data for our own community forums site from GA, but I use Tableau Server and the incremental extract feature of Tableau to update the data each day.
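Not Tableau-specific, but to make the incremental idea concrete: instead of one request spanning 3 years, you ask for one small date window at a time, so each GA response stays under the sampling threshold. A minimal Python sketch (the helper name `daily_ranges` is mine, not a GA or Tableau API):

```python
from datetime import date, timedelta

def daily_ranges(start, end):
    """Yield (start, end) ISO-date pairs, one per day, for incremental pulls."""
    day = start
    while day <= end:
        yield (day.isoformat(), day.isoformat())
        day += timedelta(days=1)

# e.g. feed each small range to the GA API instead of one 3-year query:
ranges = list(daily_ranges(date(2018, 1, 1), date(2018, 1, 3)))
print(ranges)
# → [('2018-01-01', '2018-01-01'), ('2018-01-02', '2018-01-02'), ('2018-01-03', '2018-01-03')]
```

Tableau's incremental extract feature does effectively the same thing for you: each daily refresh only requests the new slice of data.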
Do you have GA 360? If so, you should be able to link BigQuery to your GA property and access the data that way as well. This topic is discussed in Solution #4 here - https://www.lunametrics.com/blog/2016/07/20/unsampled-google-analytics-data-in-tableau/
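For what it's worth, the GA 360 BigQuery export lands as one `ga_sessions_YYYYMMDD` table per day, which you query with a wildcard plus `_TABLE_SUFFIX`. A rough sketch of building such a query in Python (the dataset id `my_project.my_ga_dataset` is a placeholder, and you would hand the resulting SQL to the BigQuery client or console yourself):

```python
def ga_sessions_query(dataset, start_suffix, end_suffix):
    """Build a query over the daily ga_sessions_YYYYMMDD export tables.

    The GA 360 export is unsampled, so this sidesteps the API limit entirely.
    """
    return (
        f"SELECT date, SUM(totals.visits) AS visits "
        f"FROM `{dataset}.ga_sessions_*` "
        f"WHERE _TABLE_SUFFIX BETWEEN '{start_suffix}' AND '{end_suffix}' "
        f"GROUP BY date ORDER BY date"
    )

print(ga_sessions_query("my_project.my_ga_dataset", "20150101", "20171231"))
```

Tableau can then connect to BigQuery directly, so the 3 years of history comes from the export tables rather than the GA API.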
I hope that helps.
Indeed, I had been calling GA data for the last 3 years at one time.
I plan on exploring the GA 360 route.
What I ended up doing is using an old saved .twbx file with the 2015-2016 data: unpacking it, pulling the .tde into Tableau Prep, and unioning it with newer data refreshed in a .hyper file to get the big picture. This is also awesome in the sense that I had a fairly complex blend to an Excel file that I can now replace with simple joins.
The refresh for new data requires refreshing a Tableau Desktop data source, pulling the new .hyper file into Tableau Prep, and executing the flow, but that is actually quite painless compared to dealing with only the last 2 years of data.
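In case it helps anyone else trying the same trick: a .twbx is just a zip archive, so the unpacking step can even be scripted. A sketch, assuming you want any packaged .tde/.hyper files copied out for Prep (the file and folder names below are made up for the demo):

```python
import pathlib
import zipfile

def extract_data_files(twbx_path, out_dir):
    """Copy any .tde/.hyper members of a packaged workbook into out_dir."""
    extracted = []
    with zipfile.ZipFile(twbx_path) as z:
        for name in z.namelist():
            if name.endswith((".tde", ".hyper")):
                z.extract(name, out_dir)
                extracted.append(name)
    return extracted

# Build a stand-in .twbx so the sketch runs anywhere (a real one would
# come from File > Save As in Tableau Desktop).
demo = pathlib.Path("demo.twbx")
with zipfile.ZipFile(demo, "w") as z:
    z.writestr("workbook.twb", "<workbook/>")
    z.writestr("Data/Extracts/old.tde", b"fake extract bytes")

print(extract_data_files(demo, "unpacked"))
# → ['Data/Extracts/old.tde']
```

Renaming the .twbx to .zip and opening it by hand works just as well; the point is only that the old extract is recoverable without GA access.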
Tableau Prep, although still limited in data connections (no GA yet) and simple in scope, fills a huge gap.
The only problem with this solution is that if I want to start using an attribute that I hadn't pulled in the old 2015-2016 .twbx, then I have completely lost Tableau access to the old data. That's why I am really interested in breaking the direct Google Analytics-to-Tableau dependency with the possibility of GA 360.