1 Reply Latest reply on Apr 5, 2012 4:58 AM by Jonathan Drummey

    Color based on deviation

    Alex Welch

      I am attempting to capture where certain points fall on a map based on the standard deviation. What I am struggling with is being able to tell which deviation band (1st, 2nd, 3rd) the value falls into. Currently, to find this I have the below code.


      IF [Weight] >= [1st Deviation Low] AND [Weight] <= [1st Deviation High] THEN

      "1st Deviation"

      END




      Where 1st Deviation LOW and HIGH are calculated similarly

      HIGH: AVG([Weight])+[Standard Deviation Weight]

      LOW: AVG([Weight])-[Standard Deviation Weight]


      SD of Weight: STDEV([Weight])
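
      For reference, the analogous 2nd- and 3rd-band cutoffs (field names assumed here, not confirmed in the thread) would follow the same pattern, widening the band by a multiple of the standard deviation:

      2nd Deviation HIGH: AVG([Weight]) + 2 * [Standard Deviation Weight]

      2nd Deviation LOW: AVG([Weight]) - 2 * [Standard Deviation Weight]

      3rd Deviation HIGH: AVG([Weight]) + 3 * [Standard Deviation Weight]

      3rd Deviation LOW: AVG([Weight]) - 3 * [Standard Deviation Weight]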


      but I am getting the below error:



      Cannot mix aggregate and non-aggregate arguments to function.


      Is there a way to go about this?

        • 1. Re: Color based on deviation
          Jonathan Drummey

          You're getting the error because the SD of Weight and 1st Deviation calculations are aggregates, while the last calculation (to determine the color) is comparing the non-aggregated Weight measure to the aggregated measures. You'll need to change the use of [Weight] in that statement to an aggregate. If you are sure you are only returning one value for [Weight] in the view, then use ATTR([Weight]), otherwise you'll need to use something like AVG([Weight]). ATTR() is a (very handy) aggregate calc that returns the value if there is only one value for the given field/calc in the dataset in the view, otherwise it returns *, meaning there is more than one value.