1 Reply. Latest reply on Nov 9, 2016 7:39 AM by Walt Reed

# Determining Average Scores in a Table (SQL Data Source)

I have a list of data where people received points; an example table is below:

| Date | Person | Points |
| --- | --- | --- |
| 1/1/16 | Jackie | 1 |
| 2/1/16 | Rob | 1 |
| 2/15/16 | Jackie | 3 |
| 3/7/16 | Frank | 5 |
| 6/16/16 | Frank | 1 |
| 6/16/16 | Frank | 1 |

Let's just pretend those were all the possible dates where people could receive points. I want to create a table like the one shown below:

| 11/7/16 | Jackie | Frank |
| --- | --- | --- |
| Daily | 0 | 0 |
| MTD | 0 | 0 |
| YTD | .2 | 1.4 |

You can see that Rob is not present (I know how to filter, so that's not the problem). The YTD value is an average: each person's total points divided by the number of unique days on which they could have gained points (Frank = 7 pts / 5 days, since 6/16 has two entries). I can total up the points gained by each person, and I can determine how many days people could have received points. However, when I enter the calculation as:

SUM([Score])/COUNTD([Days])
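To make the behavior concrete outside Tableau, here is a minimal Python sketch (with hypothetical rows mirroring the example table) of what this calc effectively computes when evaluated inside each person's partition:

```python
from collections import defaultdict

# Hypothetical rows mirroring the example table: (date, person, points).
rows = [
    ("1/1/16", "Jackie", 1),
    ("2/1/16", "Rob", 1),
    ("2/15/16", "Jackie", 3),
    ("3/7/16", "Frank", 5),
    ("6/16/16", "Frank", 1),
    ("6/16/16", "Frank", 1),
]

# SUM([Score]) / COUNTD([Days]) per person: the distinct-day count
# only sees the dates on which that particular person has rows.
totals = defaultdict(int)
seen_dates = defaultdict(set)
for date, person, points in rows:
    totals[person] += points
    seen_dates[person].add(date)

per_person = {p: totals[p] / len(seen_dates[p]) for p in totals}
print(per_person["Frank"])  # 7 points / 2 distinct dates = 3.5
```

(The other people's figures in the thread's screenshots also reflect filters that are not shown here.)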

It only counts the distinct days on which that individual received points instead of all possible days, so I end up with something like this:

| 11/7/16 | Jackie | Frank |
| --- | --- | --- |
| Daily | 0 | 0 |
| MTD | 0 | 0 |
| YTD | 1 | 3.5 |

Here Frank is at 3.5 (7 pts / 2 days) and Jackie is at 1 (1 pt / 1 day). Obviously this inflates their average scores dramatically.
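For comparison, a minimal Python sketch (same hypothetical rows as the example table) of the intended YTD computation, dividing each person's total by the count of distinct dates across the whole table rather than per person:

```python
# Hypothetical rows mirroring the example table: (date, person, points).
rows = [
    ("1/1/16", "Jackie", 1),
    ("2/1/16", "Rob", 1),
    ("2/15/16", "Jackie", 3),
    ("3/7/16", "Frank", 5),
    ("6/16/16", "Frank", 1),
    ("6/16/16", "Frank", 1),
]

# Distinct dates across the WHOLE table (6/16 counted once): 5 days.
all_days = {date for date, _, _ in rows}

totals = {}
for _, person, points in rows:
    totals[person] = totals.get(person, 0) + points

ytd = {person: total / len(all_days) for person, total in totals.items()}
print(ytd["Frank"])  # 7 points / 5 possible days = 1.4
```

In Tableau, one common way to get this behavior (not confirmed anywhere in this thread) is to make the distinct-day count independent of the person partition with a level-of-detail expression, e.g. `SUM([Score]) / MIN({ FIXED : COUNTD([Days]) })`; treat the exact field names as assumptions.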

Could someone explain what I need to do? I'd really appreciate it.

Thank you!

###### 1. Re: Determining Average Scores in a Table (SQL Data Source)

Hi Peter,

Do you have an example workbook you could attach?

Walt