
    Aggregate per-second data at hourly granularity

    Andrew Steed

      I have a worksheet ("Graph Sheet") with per-second data read at different times ("READINGTIMELOCAL"). I need the average of all the datapoints within the hour, represented as one (hourly) number. I have to show the data in a line graph by "HOUR(READINGTIMELOCAL)", but if I just average the data at hourly granularity I get an average of the hourly values, not of the per-second values within the hour.

       

      On the "Current Rates" worksheet I use my other datasource that only has the most recent value, so I can't use that for my graph (it would only ever have a single datapoint).

       

      On both sheets, "Flow Rate" is built from "Fuel" and "Sales" values. I'm expecting "Flow Rate" to match between the two sheets with a value of roughly 10k.

       

      TL;DR: Is there a way to average per-second datapoints within the hour, and show the result as one (hourly) number?
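
      A made-up example of the mismatch I mean (hypothetical numbers, assuming Flow Rate is roughly Fuel over Sales): if one hour contains two readings, Fuel=5/Sales=1 (rate 5) and Fuel=10/Sales=10 (rate 1), the average of the per-second rates is (5+1)/2 = 3, while a rate computed from the hourly totals is 15/11 ≈ 1.36. I want the former.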

        • 1. Re: Aggregate per-second data at hourly granularity
          Stoyko Kostov

          Hi Andrew,

           

          I couldn't connect to the server that your attached workbook uses, so I may not have understood your question correctly.

           

          If I did, I think a level-of-detail (LOD) expression would help. Define a calculated field like this:

           

           { FIXED DATETRUNC('second', [Ts]) : AVG([Value]) }

           

           This makes sure the aggregation always happens per second, regardless of the dimensions in your viz. You can keep a per-hour dimension in the viz and still get per-second aggregation by dragging in the calculated field defined above; see the sketch below.
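
           For example, adapted to the field names in your post ([Value] here is just a stand-in for whatever measure you're averaging, since I couldn't open the workbook):

           { FIXED DATETRUNC('second', [READINGTIMELOCAL]) : AVG([Value]) }

           Then put HOUR(READINGTIMELOCAL) on Columns and the average of this field on Rows; each hourly mark will average the per-second values rather than pre-aggregated hourly numbers.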

           

           I'm attaching a sample viz built on a simple text file I created. It has 2 sheets: one aggregates the LOD defined above (Avg Per Second), and the other averages the raw Value column at the dimensions of the viz (Avg Per Hour). You will see how they differ.

           

          Hope this helps with your problem.  If you have questions, please continue the thread - I'd be happy to help.

           

          Thank you

          Stoyko

          • 2. Re: Aggregate per-second data at hourly granularity
            Andrew Steed

            Thanks very much! I ended up having to alter the SQL query that provided the data, but what you said was part of the solution.
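
             For anyone finding this later, the shape of the change was roughly as follows (a sketch only: the table and column names are illustrative, I'm assuming Flow Rate ≈ Fuel/Sales, and DATE_TRUNC is PostgreSQL syntax, so other databases may spell it differently):

             SELECT
               DATE_TRUNC('hour', READINGTIMELOCAL) AS reading_hour,
               -- average the per-second rates, not the rate of hourly totals
               -- (cast to float first if Fuel/Sales are integer columns)
               AVG(Fuel / NULLIF(Sales, 0)) AS avg_flow_rate
             FROM per_second_readings
             GROUP BY DATE_TRUNC('hour', READINGTIMELOCAL)
             ORDER BY reading_hour;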