Gantt Chart with Resource Loading Report
The response to POST 1 of this Gantt Chart mini-series was swift, and it didn’t take long for Vegard to contact me with a query about resource loading. Vegard had quickly taken the template and adapted it to include resource hours. The issue was getting the total hours for each resource to behave the way he wanted. Naturally, individuals will normally work for a set expected amount of time per day, and if the tasks planned for them exceed that expectation, it’s obvious that they might not get the work done!
In this post, I’ll start to explain how to adapt the model from the previous post to include resource loading and associated subtotals per individual resource.
The Additional Data
The original dataset provided details relating to start and end dates, who was responsible for the task and some categorisation. However, it didn’t include any expectation of how long, in terms of effort, the task would take. For example, you could have a task with a start date of 1st Aug and an end date of 15th Aug, but an expectation of the work only taking 20 hours.
This is the updated Tasks table in Excel:
You’ll notice that I’ve added an additional field called Effort_Hrs. This is an expectation of the physical work commitment to complete the task. I’ve only added values where a task is assigned to a specific individual as the value may be less meaningful for groups of individuals of indeterminate numbers.
You can see the new Effort_Hrs column in the PowerPivot window above. I’ve then created two new calculated columns:
Working_Days – With Effort_Hrs in place, I need to understand how many hours per working day that value equates to. To calculate this, I first need to know how many working days fall between the start and end dates.
The data structure now looks like this with the new fields included:
And, the Dates table looks like this:
The Tasks table doesn’t hold any relationship to the other two tables. The reason is that PowerPivot only accommodates many-to-one relationships, so we can’t establish a relationship where one record in our primary table relates to many records in the linked table. Because each task record carries a start date and an end date, rather than one record per date, we have to write DAX that essentially creates the relationship for us and pulls the information across from those other tables.
Quick note – I wondered if I could use USERELATIONSHIP here, but I hit a brick wall. I still need to investigate why and I’ll come back with the answer if I ever find it!
This calculation jumps across to the Dates table and counts the number of valid dates between the start and end dates on the record in the Tasks table. You’ll notice that it specifically filters for the items to be included using Dates[Include]=1, so we eliminate non-working days – i.e. weekends!
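The real formula lives in the workbook screenshot, but as a rough sketch the idea looks something like this. Note that the column names Tasks[Start_Date] and Tasks[End_Date] are my assumptions for illustration – check them against your own table:

```dax
// Hypothetical sketch of the Working_Days calculated column on the Tasks table.
// Counts rows in the Dates table that fall within the task's span and are
// flagged as working days via Dates[Include] = 1.
// Start_Date / End_Date are assumed column names, not taken from the workbook.
=COUNTROWS (
    FILTER (
        Dates,
        Dates[Date] >= Tasks[Start_Date]
            && Dates[Date] <= Tasks[End_Date]
            && Dates[Include] = 1
    )
)
```

Because FILTER iterates the Dates table while the column is evaluated in the row context of Tasks, the start and end dates of the current task are available inside the filter without any relationship between the tables.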
Effort_Per_Working_Day is as simple as it comes. Since we know the amount of effort required and the number of working days to distribute that value over, I can simply divide one by the other, obviously making sure that I accommodate divide-by-zero errors with IFERROR, together with some rounding for cosmetic reasons and ease of checking.
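As a sketch, and assuming the same column names as before, the formula is along these lines:

```dax
// Hypothetical sketch of the Effort_Per_Working_Day calculated column.
// IFERROR guards against tasks with zero working days (division by zero);
// ROUND to 2 places is purely cosmetic and eases checking.
=IFERROR ( ROUND ( Tasks[Effort_Hrs] / Tasks[Working_Days], 2 ), 0 )
```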
I won’t insult anyone’s intelligence by explaining that!
Job Done Right?
It’s reasonable to suggest that we can now simply add a SUM(Tasks[Effort_Per_Working_Day]) measure and the job would be done. Unfortunately……no!
We have a saying in the UK that goes simply like this – “Pants”! I’m sure we’re not alone. Numbers repeated everywhere and no association with the dates on my report. The reason for this is that the Dates table delivering our column headers isn’t related to the Tasks table.
As a result, we have to drive that association using a dynamic measure in DAX.
Let’s hit the DAX with a measure I’ve called Hours.
“Now, THAT’s a Spicy Meatball”
OK, don’t worry, I’m not stealing Rob’s thunder on the movie quote thing. I honestly found myself saying that exact quote to myself and then spent another hour trying to remember where it came from!
Check out the beauties in that measure… We’ve got SUMX, SUMMARIZE, COUNTROWS, a couple of FILTERs, MAX and an AVERAGE for good measure. It doesn’t seem logical that it should work!
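I’ll save the full walkthrough for the follow-up post, but to give a flavour of the core trick, here’s a stripped-down sketch of my own – not the workbook’s actual measure, and it deliberately ignores the SUMMARIZE, COUNTROWS and AVERAGE parts that deal with the subtotals. For each date column on the report, it filters the Tasks table down to the tasks whose span covers that date and sums their per-day effort:

```dax
// Simplified sketch only -- the real Hours measure in the workbook also uses
// SUMMARIZE, COUNTROWS and AVERAGE to handle the per-resource subtotals.
// MAX(Dates[Date]) picks up the single date behind the current report column.
Hours :=
SUMX (
    FILTER (
        Tasks,
        Tasks[Start_Date] <= MAX ( Dates[Date] )
            && Tasks[End_Date] >= MAX ( Dates[Date] )
    ),
    Tasks[Effort_Per_Working_Day]
)
```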
In truth, I have some concerns over how well this will perform with big datasets. However, for Gantt charts, we’re not really likely to look at a huge dataset. If that’s the case and you’ve got thousands of tasks – good luck delivering your project!
Having said that, I’m not in the business of delivering RAM-munching queries where I can avoid it, so I’ll investigate.
To do this measure justice, I’d be cramming too much into this post and, if I’m honest, I need to take a deep breath before trying to explain it. As a result, I’ll leave it for a future post – “cop out Churchy”!
In the meantime, feel free to DOWNLOAD THE UPDATED WORKBOOK HERE and please let me know your thoughts. If you’ve got an alternative solution, please let me know. Otherwise, I’ll be back next week to make an attempt at trying to explain this baby!