I'm having a really hard time understanding your problem/question. Perhaps you can rephrase?
Please see the 2 screenshots. If you look at the Y axis in each of the shots you will see that the numbers are different; however, the only difference between the 2 sheets is that one includes the dimension "Tier" and the other does not. The totals in both cases should be the same. In the ExcludingTier sheet, I would expect to see a total of ~9k (as can be seen in the IncludingTier chart).
An Area Chart will always sum values visually. You are summing averages. For example:
If you have 10 people with test scores averaging 95%, then you have one level of detail at .95.
If you decide to graphically represent the test score averages of men and women, then you get two values (hypothetically 91% and 98%; the distribution isn't relevant). With an area chart you are now representing .91 + .98, getting a total axis of 1.89. You can't ask what you're asking with an area chart and expect to get your desired result. Is there another way you would want to represent multiple averages?
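To make the stacking behavior concrete, here is a minimal sketch of the hypothetical test-score example above (the group averages are the made-up numbers from that example, not real data):

```python
# Two hypothetical group averages, as in the men/women example.
men_avg = 0.91
women_avg = 0.98

# What an area chart's axis shows: the two averages stacked on top
# of each other, not combined into one average.
stacked_axis_total = men_avg + women_avg      # 1.89

# What a single combined average of the two groups would look like
# (ignoring group sizes, as the example does).
combined_avg = (men_avg + women_avg) / 2      # 0.945

print(stacked_axis_total, combined_avg)
```

The point is that the area chart's Y axis reaches 1.89 because it sums the per-group averages visually, even though no individual score is anywhere near that value.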
To better describe the problem. If you look at the data, for a given day a "Job_Name" can appear more than once, if this is the case then I would like to take the average time it took to run that job for that day. I would then like to plot for each day the total/sum of the average times it took to run each of the jobs defined in "Job_Names".
In the example data, for the 6th, there are 3 jobs (4 runs in total):
A, this was run twice with run times of 3010 and 3012, so the average is 3011
C, this was run once with a time of 3120
D, this was run once with a time of 3040
So the total time taken to run all 3 jobs is 9171.
However, when you plot this in a graph the total is calculated as avg(3010, 3012, 3120, 3040) = 3045.5, which is not displaying the same information as 9171.
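The two-step aggregation described above (average per job per day, then sum the averages) can be sketched in plain Python. The rows below mirror the example data for the 6th; the tuple layout is an assumption for illustration, not the actual data source:

```python
from collections import defaultdict
from statistics import mean

# (Job_Name, run time) rows for one day, matching the example for the 6th.
rows = [
    ("A", 3010), ("A", 3012),
    ("C", 3120),
    ("D", 3040),
]

# Step 1: average the run times of each job for the day.
by_job = defaultdict(list)
for job, run_time in rows:
    by_job[job].append(run_time)
job_avgs = {job: mean(times) for job, times in by_job.items()}
# -> A: 3011, C: 3120, D: 3040

# Step 2: sum the per-job averages to get the value to plot for the day.
total = sum(job_avgs.values())          # 9171

# By contrast, a flat average over all rows gives the misleading number
# the chart currently shows.
flat_avg = mean(t for _, t in rows)     # 3045.5

print(total, flat_avg)
```

This is the distinction at issue: the chart is computing `flat_avg` over every run, while the desired value is `total`, the sum of per-job daily averages.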
Basically, I want to view a graph of the total time taken to run jobs, as opposed to the average run time of all jobs.