This is expected behavior. For a data source such as Excel, Tableau treats every column as either a Dimension or a Measure. You will probably have to prepare/massage the data at the source before you consume it in Tableau. How big or small is the data file, really?
The data file isn't all that big. It's just long. It's tracking machine states from Jan 2016 to today so about 9 months of daily data.
This is only across about 15 machines though.
Hmm...I'm going to have to think about this. I really want to be able to show:
- Status of the machines (Functioning/Degraded/Blocked) on a daily basis
- Status of the machines on a sprint boundary
- Status of the machines on a quarterly boundary
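One way to get the boundary views above is to pre-compute them at the source before loading into Tableau. Here's a minimal pandas sketch of the quarterly case; the column names (`Date`, `Machine`, `Status`) and the sample rows are assumptions for illustration, not taken from the actual file:

```python
import pandas as pd

# Hypothetical long-format data: one row per machine per day.
# Column names and values are illustrative, not from the real file.
df = pd.DataFrame({
    "Date": pd.to_datetime(["2016-01-04", "2016-01-04",
                            "2016-03-31", "2016-03-31"]),
    "Machine": ["M01", "M02", "M01", "M02"],
    "Status": ["Functioning", "Degraded", "Functioning", "Blocked"],
})

# Tag each row with its quarter, then keep the last recorded status
# per machine per quarter -- a "status at the quarterly boundary" view.
df["Quarter"] = df["Date"].dt.to_period("Q")
quarterly = (df.sort_values("Date")
               .groupby(["Machine", "Quarter"], as_index=False)
               .last())
print(quarterly)
```

The sprint-boundary view would work the same way, just grouping on a sprint label derived from the date instead of `dt.to_period("Q")`.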
Essentially, the argument I would be making is: "If machine states are degraded, then progress is impacted. Our highest priority is to make sure machine states are not impacted by the introduction of new code".
Right now, I understand the argument that the team owning the machines needs to do more to work around problems, but their primary purpose is not to work around problems; it is to validate scenario functionality.
Any other suggestions would be helpful. Thanks for responding with the information you have, though. It does clear up the behavior question on Tableau/Excel.