It looks like the official limit is 15 million rows per workbook, which is the sum of rows across all the data connections in that workbook: Tableau Public Frequently Asked Questions
That said, I would also consider the 10GB storage limit and performance (both in uploading such a massive file to Tableau Public and in loading the workbook afterward). Based on the row count, you are bringing in the data at a very granular level. Is this level of detail required? Can this be broken up into multiple workbooks? One technique I like to use is Aggregated Extracts; this is a good strategy for summarizing an extract down to the granularity you actually need, rather than the granularity the data is stored at. This might also help keep the extracts more manageable: Tableau Data Extracts - Tips, Tricks and Best Practices | Tableau Software
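To illustrate the aggregated-extract idea, here is a minimal sketch of pre-aggregating detail rows before creating an extract. The column names and grouping levels are hypothetical; the point is that rolling transaction-level rows up to the dashboard's granularity can shrink the row count dramatically:

```python
import pandas as pd

# Hypothetical detail-level data: one row per individual transaction.
detail = pd.DataFrame({
    "region": ["East", "East", "West", "West", "West"],
    "month":  ["2023-01", "2023-01", "2023-01", "2023-02", "2023-02"],
    "sales":  [100.0, 150.0, 200.0, 50.0, 75.0],
})

# Aggregate up to the granularity the dashboard actually needs
# (here: region x month) before building the extract.
summary = (
    detail.groupby(["region", "month"], as_index=False)
          .agg(total_sales=("sales", "sum"), order_count=("sales", "size"))
)

# Write the summarized data out and connect Tableau to this file instead.
summary.to_csv("sales_summary.csv", index=False)
print(len(detail), "detail rows reduced to", len(summary), "summary rows")
```

On real data the same pattern (group by the dimensions your views use, aggregate the measures) often reduces tens of millions of rows to a few thousand.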
Thank you, Wilson, for your reply and suggestion.
Looking at the link you shared, does it matter that I am using the Tableau Desktop Professional version and not the Public edition? The link references 'Tableau Desktop Public Edition':
Are there any limits to the amount of data I can work with?
Tableau Desktop (Public Edition) is limited to 15,000,000 rows of data per workbook.
Unfortunately, that level of detail is required; it's not something we can aggregate and still get the information we need.
I also found a post, Tableau Public Goes Premium for Everyone; Expands Access to 10 Million Rows of Data, which states: 'Support for data sets of up to 10 million rows so that anyone can analyze nearly all publicly available datasets for free.' My question: does this mean all data sets together add up to 10 million rows, or 10 million rows per data set?
It is still the combined total: all data sets together add up to the limit. Your Tableau Desktop Professional edition won't have issues connecting to the different data connections, but publishing the workbook when the data connections total beyond the threshold will likely result in an error. This may ultimately be a design trade-off you need to consider when using Tableau Public; the limits are there to keep this shared public resource running efficiently and not bogged down by the sheer volume of data. If you are managing workbooks with those data volumes, it sounds like you are hitting the limits of what extracts can provide as a data strategy.
I would go back and see if there is a way to duck under this limit. My impression is that some level of aggregation or filtering is still likely to help in these use cases (especially if your users don't intend to scan through millions of rows of detail), or that splitting the workbook up might be the best solution.
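Since the limit applies to the sum across all connections, it can be worth sanity-checking the combined row count before attempting to publish. A quick sketch, assuming the workbook's connections are backed by CSV files (the file names and the `total_rows` helper are hypothetical):

```python
import csv
import tempfile
from pathlib import Path

# Per-workbook row limit to stay under (Tableau Public: 15,000,000).
ROW_LIMIT = 15_000_000

def total_rows(paths):
    """Sum the data rows (excluding headers) across every CSV a workbook uses."""
    total = 0
    for path in paths:
        with open(path, newline="") as f:
            total += sum(1 for _ in csv.reader(f)) - 1  # subtract header row
    return total

# Demo with two small stand-in files written to a temp directory.
tmp = Path(tempfile.mkdtemp())
for name, rows in [("orders.csv", 3), ("returns.csv", 2)]:
    with open(tmp / name, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["id", "value"])
        writer.writerows([[i, i * 10] for i in range(rows)])

count = total_rows([tmp / "orders.csv", tmp / "returns.csv"])
print(count, "rows;", "OK to publish" if count <= ROW_LIMIT else "over the limit")
```

If the total comes out over the threshold, that is the signal to aggregate, filter, or split the workbook before publishing rather than discovering the problem mid-upload.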