4 Replies Latest reply on Jan 14, 2019 12:13 PM by Alex Braga

    Calculated fields with blended data

    Alex Braga

      Hi all,


      I need your help. I have two datasets that are being blended:

      - One with utilization data at daily granularity (meaning I can summarize utilization by quarter, month, week, or day).

      - Another with membership counts at monthly granularity.


      I’d like to calculate utilization per thousand members (per quarter, per month, per week, and per day). I have a parameter set up to switch between these time frames.


      Formulas to accomplish this would be, for a couple of examples:


      Month Granularity:



      Week Granularity:




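      The formula screenshots above did not carry over, but as a rough sketch (hypothetical field and data source names; membership is assumed to come from the blended secondary source), the month-granularity version might look something like:

```
// Sketch: utilization per 1,000 members at month granularity.
// [Number of Records] is assumed to be in the primary (utilization) source;
// [Membership Source].[Membership] is a hypothetical reference to the
// blended secondary source.
SUM([Number of Records]) / SUM([Membership Source].[Membership]) * 1000
```

      The week- and day-granularity versions would be the same ratio, just sliced by a finer date level.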

      Since the membership data is at monthly granularity, the utilization and membership counts display fine for the quarter and month time frames. As you can see from the table below, the data populates nicely:


      The problem comes with the weekly and daily display of the data. Below is what happens:



      Those rows in yellow are the problem. I need them populated for the calculations to work as intended: the 604,484 should repeat on the first 3 blank rows, then the 603,071 should repeat on the next 3 blank rows, and so on.


      So I think I need some sort of RETAIN statement (that's SAS lingo), or maybe a FIXED LOD, to make the membership value repeat and populate as I increase the date granularity.
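      For reference, the FIXED LOD version of that idea would look something like the sketch below (hypothetical field names; note that this is only an option if the membership sits in the same data source as the dates, since LOD expressions are restricted when the field comes from a blended secondary source):

```
// Hypothetical sketch: pin the monthly membership total to every row
// of that month, regardless of the viz's date granularity.
// Only workable if [Membership] and [Date] live in the primary source.
{ FIXED DATETRUNC('month', [Date]) : SUM([Membership]) }
```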


      Is this possible? Any ideas on how I can get this done?


      Thank you for your help!

        • 1. Re: Calculated fields with blended data
          Joe Oppelt

          It depends on how your data is set up and what field you're blending on.


          It's hard to follow all the descriptions and keep the details straight without an actual workbook.  Any chance you can upload a sample workbook?


          If you're concerned about proprietary data, anonymize your data as demonstrated in the video linked here:


          Video demonstrates how to anonymize your workbook/data


          And I wouldn't need ALL your data.  Just a few months, and just enough rows to demonstrate what you're seeing.

          • 2. Re: Calculated fields with blended data
            Alex Braga

            Hi Joe,


            Thanks for the reply. Example dashboard is attached.


            Based on the parameter “Date Level Selector” I want to calculate number of records (numerator) divided by the membership (denominator), for each row.


            The quarterly and monthly membership values aggregate correctly. The problem comes in the weekly and daily tables, because the membership data's granularity is monthly.


            On the weekly table below for example, I need that membership value 604,484 to repeat until it finds the next membership value (603,071), and so on. Same for the daily table.


            Is this doable?


            Thank you in advance for your help!

            • 3. Re: Calculated fields with blended data
              Joe Oppelt

              If I understand what you need, see attached.


              The [index] calc is just there to help me make sure I was addressing the table direction correctly.  I almost always do this when messing with table calcs.


              For now, given the example sheet's layout, the table calc's default of Table (down) is fine.  You may need to tell the calc to "walk" the table a different way if your actual application has a different sheet layout.  If you run into that, ping back.


              So what makes this work here is the PREVIOUS_VALUE() function.  It's a pretty cool tool.  It returns this calculation's result from the previous row of the table (and its argument on the very first row), so it can carry a value forward down the table.  You can use it in place of [Membership] on the viz, as I did in this example.
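              The attached workbook isn't reproduced here, but the carry-forward calc described above typically looks something like this sketch (hypothetical field name, assuming the non-matching weekly/daily rows come through the blend as nulls):

```
// Sketch: carry the last seen membership value down the table.
// When the blended SUM([Membership]) is null (no match at this date level),
// PREVIOUS_VALUE(0) returns this calculation's result from the previous
// row, and 0 on the very first row.
IFNULL(SUM([Membership]), PREVIOUS_VALUE(0))
```

              Set the calc's Compute Using to match the table direction (Table (down) in this example's layout).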

              • 4. Re: Calculated fields with blended data
                Alex Braga

                Amazing work, Joe! Thanks for teaching me this! Truly appreciated!