Task Type Use Cases in an IT Environment

In training sessions, my students traditionally have a tough time grasping the concept of task types and how they may be applied in the real world.   I often get the question of “Which task type would you recommend for ‘XYZ’?”  To which the response is almost always, “It depends.”  In this post, I figured I’d spend a little time talking about my own personal practices, and see if that sheds some light on task type usage patterns.

Task Types

First off, what are the available task types?

  • Fixed Duration
  • Fixed Work
  • Fixed Units

These task types play the part of variables in the Duration × Units = Work equation that Microsoft Project uses as the basis for a fair number of scheduling calculations.  (Note that the equation is probably more accurately depicted as Duration × Units × (Person Hours Per Day) = Work.)

The general way to identify which task type you would like to use is to ask yourself the question, “Which of those three variables, Duration, Units, or Work, do I know at this point?  Which variable will not change?”  That is the variable that should be fixed.
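The fix-one-variable idea can be sketched in a few lines of Python.  This is purely my own illustration of the arithmetic, not Project's actual scheduling engine, and the 8-hour day is an assumption:

```python
# Illustrative sketch: Duration x Units x (Hours Per Day) = Work.
# The task type decides which variable stays put and which one
# recalculates when you edit another. (Assumed 8-hour working day.)
HOURS_PER_DAY = 8

def recalculate(task_type, duration_days, units, work_hours, changed):
    """Given which field was just edited ('duration', 'units', or
    'work'), recompute the variable that is neither fixed nor edited."""
    if task_type == "fixed_work":
        # Work is pinned: editing duration recalculates units, and vice versa.
        if changed == "duration":
            units = work_hours / (duration_days * HOURS_PER_DAY)
        else:
            duration_days = work_hours / (units * HOURS_PER_DAY)
    elif task_type == "fixed_units":
        # Units are pinned: editing work recalculates duration, and vice versa.
        if changed == "work":
            duration_days = work_hours / (units * HOURS_PER_DAY)
        else:
            work_hours = duration_days * units * HOURS_PER_DAY
    elif task_type == "fixed_duration":
        # Duration is pinned: editing units recalculates work, and vice versa.
        if changed == "units":
            work_hours = duration_days * units * HOURS_PER_DAY
        else:
            units = work_hours / (duration_days * HOURS_PER_DAY)
    return duration_days, units, work_hours

# 80 hours of Fixed Work stretched over 20 days -> units drop to 50%.
print(recalculate("fixed_work", 20, 1.0, 80, changed="duration"))
```

Stretching the same fixed 80 hours over more days simply dilutes the allocation, which is exactly the behavior described below when duration is estimated after the work is pinned.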

Planning the Project

Great in theory, but how does that work in practice?  Let’s take an IT project as an example.  As a consultant, I am often called upon to decompose the project scope and to identify the tasks required to complete it.  Since I frequently work on similar projects, i.e. Project Server deployments, I might start with a schedule template that already lists the tasks and dependencies.

So my first goal is to work through that schedule list and attach work estimates.  At this point, I am not attaching duration or calculating a real schedule.  All I am doing is working through the list, and adding an estimate of work to each task….80 hours here, 20 hours there, 4 hours there.  Now here’s the trick.  Unless I uncover something new during the planning/estimating process, those hours will not change.  They probably will change when I get into the actual execution of the project, but we haven’t gotten there yet.  Hence, I will set those tasks to Fixed Work.  I don’t want those estimates to change.

Once I’ve gone through a first cut of the schedule and attached hours to the tasks – and set the tasks to Fixed Work, then I’ll go through and play with either the units of allocation of a resource or more likely the duration of a task.  I run through the project and start assessing duration for each of the activities.  Since work is fixed, and I am editing duration, the units field will recalculate – which is what I want.  My duration estimates are usually based on client availability estimates, i.e. I will assume that requirements gathering may take longer or shorter depending on the number of people involved in defining requirements or the complexity of the project.

Again though, unless I uncover something new during this process, the work estimates do not change.

Tracking the Project

Once I’ve reviewed my estimates, added calendars and constraints, then smoothed the overallocations, I tend to flip all of my tasks from Fixed Work to Fixed Units.  “Why?” you ask.  Well, once we get into execution, I am pretty sure that my work estimates will change routinely.  I am equally sure that my units will stay stable – or at least they will for future estimates.  Past estimates will be overridden by the actual hours entered in the system.

So every week, I fill in my timesheet, and my actual hours get transferred back to my plan.  If I book more hours than I expected, units are fixed, and the duration comes in.  If, as happens more often, I book fewer hours than expected, units are still fixed, and the duration gets pushed out.  If, based on partial completion, I change my work estimate, again units are fixed, and the duration is recalculated.
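The arithmetic behind that duration push-out is simple to sketch.  The helper below is my own illustration under an assumed 8-hour day, not anything the Project engine exposes:

```python
# Illustrative sketch (assumed 8-hour day): on a Fixed Units task, the
# allocation level stays put, so any change to remaining work falls
# straight through to the remaining duration.
HOURS_PER_DAY = 8

def remaining_duration(remaining_work_hours, units, hours_per_day=HOURS_PER_DAY):
    """Days still needed at the fixed allocation level."""
    return remaining_work_hours / (units * hours_per_day)

# 40 hours left at 100% units: 5 days to go.
print(remaining_duration(40, 1.0))   # 5.0
# Same 40 hours, but the resource is only 50% allocated: 10 days.
print(remaining_duration(40, 0.5))   # 10.0
```

Booking fewer actual hours than planned leaves more remaining work, and because units are fixed, the only place for that overflow to go is the duration.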

Hence, generally, in an IT environment that tracks hours, I would recommend using Fixed Units tasks.

So there you have it…..a long-winded explanation of why “It depends” is a valid answer.  My recommendation for this specific scenario would be Fixed Work during the estimating and Fixed Units during the execution of the project.


Considering a Large Project Server Implementation? Read This….

I’ve been heads down on a large implementation myself for the last couple months, and must’ve missed this one.  Check out this white paper from my colleague and frequent collaborator, Victor Richardson, where he breaks down deployment scenarios for the poorly understood multi-site collection Project Server install.

Link to PDF


Surfacing Risk Data in an External List

…so now that we have a custom SQL view developed and that data is surfaced in the form of an External Content Type, let’s add it to our SharePoint site.

Navigate to a SharePoint site, select the option to View All Site Content, and create a new External List.  If you plan to modify this list using SharePoint Designer, you’ll need to provision the list on any site but the main PWA site.


The list will appear as follows:


Add a grouping by Project Name, and you get the following…


Formatting the External List

To get fancy, you can even add conditional formatting to the External List.  To do so, open the list in SharePoint Designer.  Click on one of the cells in the column you would like to format, and choose the option to apply Conditional Formatting. 


Add the appropriate options, hit Save, refresh the page in the browser, and you should see something like this….


Search around the Web, and you should be able to find all sorts of blog postings about how to add icons instead of simple conditional formatting.

Removing the HTML Tags

One thing you may note is that some of the text fields have HTML tags.  This is because the fields are stored as rich text fields and then surfaced as plain text.


There are probably a couple of ways to fix this, including using both SQL and SharePoint Designer functionality.  Until I figure those out, however, I simply went to the Project Site and converted the fields in the SharePoint Risk list from rich text to plain text. 
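For what it’s worth, the tag-stripping idea could also be handled downstream in whatever consumes the data.  Here’s a minimal sketch using only the Python standard library; the sample markup is hypothetical, though SharePoint rich text fields typically wrap values in similar HTML:

```python
# Minimal sketch: strip HTML tags from a rich text field value,
# keeping only the text content. Sample markup is hypothetical.
from html.parser import HTMLParser

class TagStripper(HTMLParser):
    """Collects only the text nodes, discarding all tags."""
    def __init__(self):
        super().__init__()
        self.chunks = []

    def handle_data(self, data):
        self.chunks.append(data)

def strip_tags(rich_text):
    """Return the plain text inside a rich-text HTML fragment."""
    stripper = TagStripper()
    stripper.feed(rich_text)
    return "".join(stripper.chunks)

print(strip_tags("<div class=ExternalClass><p>Server outage risk</p></div>"))
# Server outage risk
```

Converting the list fields to plain text at the source, as described above, remains the simpler route when you control the Project Site.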


That solved the issue and doesn’t appear to throw errors on publish.  Note that you will have to republish the Risk InfoPath form if you make any changes to the fields.

…and there you have it, a more or less out of the box way to create an aggregated Risk list.  With a little practice, it shouldn’t take more than a couple of hours to implement.


Creating an External Content Type for Project Risks

In this post, I will talk about how to surface the custom SQL view we developed in the previous post through External Content Types and SharePoint Business Connectivity Services (BCS).

Now this isn’t a new topic.  The only difference is that we’re using it to surface risk information.  For more information on deploying ECT against Project Server, please see one of my older posts.

….and, er….that’s pretty much it.  Not sure how much more I can add to that.  Once you’ve done that against the new custom SQL view, it will look something like this:


Next up….adding the data to an external list.


Customizing SQL Views to Provide Useful Datasets

In my last post, I decided to take a look at how to use External Content Types to surface Project Server Reporting database data on Risks and Issues.  In this post, I plan to talk about some quick and easy steps that are required to provide a useful dataset.

Now, before we get too far into this post, I should probably mention that working directly in the database is always a sensitive issue, but as far as I can tell, adding a custom view to the Reporting database will not cause supportability issues going forward.  Just make sure that everything is documented and backed up.  Now back to the narrative…

The challenge that I found when I first started playing with the risk tables in the Reporting database was that none of the risk tables seemed to include the Project name.  A couple include the Project ID field, but that won’t help when it’s surfaced in an External List.


…so I decided to create a custom SQL view.  To do that, I fired up SQL Management Studio and navigated to the Reporting database.  Once I found that, I right clicked on the Views option to create a new View.


From there, it’s a relatively simple matter to select the WSSRisk_Olapview and EPMProject_Userview tables to be included in my view.  I check the fields to be included, and then add a join at the ProjectUID field.  (If you’re not familiar with this interface, it works much like Access: you just grab the ProjectUID field from the table on the left and drag it to the one on the right to generate the join.)
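The T-SQL the view designer generates will look something like the sketch below.  Treat this as illustrative: the view name is my own, and the non-key columns shown are assumptions that you would replace with whichever fields you checked in the designer.

```sql
-- Hedged sketch of the generated view: joins the risk data to the
-- project data so the Project name travels with each risk.
-- View name and non-key columns are illustrative assumptions.
CREATE VIEW dbo.RisksWithProjectName
AS
SELECT
    p.ProjectName,      -- the field missing from the risk tables
    r.Title,
    r.Status,
    r.ProjectUID
FROM dbo.WSSRisk_OlapView AS r
INNER JOIN dbo.EPMProject_UserView AS p
    ON r.ProjectUID = p.ProjectUID;
```

The inner join on ProjectUID is the whole trick: each risk row picks up its parent project’s name, which is what makes the External List groupable by project later on.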


Save the Custom View, and you can now use it for reporting purposes, whether that report is based on an External Content Type, an Office Data Connection file, or a direct connection back into SQL.



Centralized Risk Repositories with External Lists

I was plugging away at the Project Server newsgroup when one of the contributors asked an interesting question.  After trying to aggregate Issues from each of the project sites to a single centralized repository using Content Query Webparts, Bart E. asked, “So there is no OOTB solution for a sortable aggregation, I take it?”

….well, that got me to thinking.  Is there an easy out of the box solution to deliver centralized risks and issues from Project Server?  The obvious solution would seem to be through the use of External Content Types, which I have come to regard as the Swiss Army knife of the SharePoint 2010 world.  Using External Content Types and External Lists would allow the development of easily customizable lists of aggregated risks.

So I sat down with a virtual image to see if I could make it work.  To my surprise, 20 minutes later, I pretty much had it working – with some minor formatting issues that took me back to the drawing board.

Hence my next couple of posts will be about this topic, specifically how to use External Content Types to surface the risk information embedded in the Project Server reporting database.  This approach is only suitable in the following circumstances:

  • Projects are published primarily from the Microsoft Project client – as that is the action which seems to refresh the database from the Project site data.
  • Risks are managed without custom fields, or at least the aggregation does not require those custom fields to work.  The Reporting database only captures the default fields.
  • Project schedules are updated regularly.  It is through the update process that the risks are populated within the Reporting database.

First, let’s start with a little review of the available literature.  What options are currently available?  Here are the ones I’ve come across so far.

  1. SSRS Reports – as documented by Christophe Fiessinger here.
  2. Third party tools such as this one from iPMO.
  3. Utilizing the Content Query Webpart within SharePoint – which required more coding than I was comfortable with the last time I played with it.  Still, probably worth a blog post at some point in the future.  A variant of this approach would cross site collections and might include something like the Content Monster Webpart.

This series of posts will focus on a fourth method, using External Content Types and an External List to surface Reporting database data.  You could probably do the same with a Web Service pulling data from SharePoint content databases, but I would defer to the SharePoint folks on that.

Hence, over the next couple of days, watch out for the following posts:

  1. Customizing SQL views to create the dataset
  2. Creating an External Content Type from a customized view
  3. Creating an External List with Risk data

(My plan is to come back and add hyperlinks to this post once each of those individual posts have been published.)


Back to the Basics: Calculating Duration in Fixed Duration Tasks

My colleague approached me with an interesting question the other day.  Under specific conditions, he could make seemingly identical tasks appear with different durations.


So the question is “why?”  Why do seemingly identical tasks have different durations?  First off, let’s look at how duration is calculated…

Fixed Duration Activities Without Assignments

In the example below, I have three tasks.  Each task is a different type.


You’ll note that the duration is the same.  Next, I will introduce an exception of one day in the Project Calendar.  In theory, this will push out the finish date for each task.


The results appear as below:


We see that the duration has not changed, even though the tasks now finish up on 6/13 instead of 6/10.  This is as expected, because the duration is calculated as the total number of working days, and excludes any nonworking days such as weekends or holidays.

Fixed Duration Activities With Assignments

Now let’s assign a resource to each of those tasks.


Everything looks correct.  The resource is using the same Project Calendar, and therefore, we anticipate no changes.  Here’s where things get a bit trickier.  I am going to add a 1 day exception to my resource calendar.


…and now my schedule looks like this:


See how the duration shows differently, yet for all intents and purposes, the start and finish dates are identical?

The reason for that is that Fixed Duration activities ignore the resource calendar when calculating duration.  A Fixed Duration task calculates the difference between the start and end date using the Task or Project calendar.  In this case, since I haven’t applied a Task calendar, the calculation uses the Project calendar.

Fixed Units and Fixed Work tasks, on the other hand, calculate duration by counting the days on which work is actually assigned.
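The two counting rules can be sketched with a single helper: the only difference is which calendar’s exceptions you feed in.  This is my own illustration of the behavior described above, not Project’s scheduler:

```python
# Illustrative sketch: duration as a count of working days. A Fixed
# Duration task counts against the project (or task) calendar, while
# Fixed Units / Fixed Work tasks count only the days on which the
# resource actually has work assigned.
from datetime import date, timedelta

def working_days(start, finish, nonworking):
    """Days from start to finish inclusive, skipping weekends
    and any calendar exception dates in `nonworking`."""
    day, count = start, 0
    while day <= finish:
        if day.weekday() < 5 and day not in nonworking:
            count += 1
        day += timedelta(days=1)
    return count

project_exceptions = set()                 # project calendar: no holidays
resource_exceptions = {date(2011, 6, 8)}   # the resource takes 6/8 off

start, finish = date(2011, 6, 6), date(2011, 6, 10)   # Monday..Friday

# Fixed Duration: measured on the project calendar.
print(working_days(start, finish, project_exceptions))   # 5

# Fixed Units / Fixed Work: only days the resource can work.
print(working_days(start, finish, resource_exceptions))  # 4
```

Same start, same finish, two different duration numbers, which is exactly the apparent paradox in the screenshots above.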



Project Center Ribbon Greyed Out After Solution Starter Uninstall

Here’s an issue that’s popped up a couple of times for me, and I figured it was worthy of note.

Upon navigating to the Project Center view in Microsoft Project Server 2010, I see that the ribbon is greyed out and inactive, and a small error notice appears in the bottom left of the screen.  When I click on the icon, it gives me the following message:

User Agent: Mozilla/4.0 (compatible; MSIE 8.0; Windows NT 6.1; WOW64; Trident/4.0; SLCC2; .NET CLR 2.0.50727; .NET CLR 3.5.30729; .NET CLR 3.0.30729; InfoPath.2; MS-RTC LM 8; .NET4.0C; .NET4.0E; InfoPath.3)

Timestamp: Tue, 3 May 2011 00:06:53 UTC

Message: Object required

Line: 2

Char: 199616

Code: 0

URI: http://myserver/_layouts/cui.js?rev=wvoVpqlQb30nGo4DjDk8Kg%3D%3D

There may be a couple of reasons for this to occur.  The most common one that I’ve run into: a solution starter that appears on the ribbon has been uninstalled from the server.  As far as I can tell, this leaves a cached version of the ribbon in your local IE client – which throws an error on load.

Luckily the fix is quite easy.  Simply delete your local IE cache, refresh the page, and everything is back to normal.


Know Thy Operational Integration Model

Sooner or later, every IT department faces the issue of how to integrate its EPM tool with its operational service desk management tool.  This is not only an issue for IT, as I have seen the same challenges in pipeline and oilfield maintenance, but it’s an issue that seems to arise more in IT than in other domains. 

I was thinking about this a couple of weeks ago when I was working on a PMI presentation including items to consider when deploying an EPM tool.  During my preparation, I identified two potential models for Project-Operations integration that I think are helpful to take into consideration when developing a road map for an EPM tool deployment.

If you’re implementing an EPM tool right now, and you plan to integrate project and operational work management, it would behoove you to spend a couple minutes thinking about the following models – and how or when they may be implemented within your organization.

Model #1: Segregated Project & Break-Fix Work Force


This is the model most strongly espoused in all of the IT Service Management literature, i.e. all of the people on staff doing break-fix work are kept separate from the people building new applications.

In the ITIL vision then, each of the costs of the operational work items (read: “trouble tickets”) and the projects can be rolled up and mapped to a specific application to determine the total cost of ownership for that specific application.

What are the ramifications of this from a deployment perspective?  When developing the custom fields for slicing and dicing your project portfolio, ensure that projects are coded such that they may be rolled up to model the TCO for your application portfolio (or, to use non-IT terms, your asset portfolio), i.e. ensure your Project and Ops coding are compatible and may be combined.

Model #2: Integrated Project & Break-Fix Work Force


This is the model that I see more often in my deployments, specifically in low-maturity organizations where resources are often split between building new assets and keeping the existing ones working.  This, of course, is the configuration that makes Agile and TOC practitioners cringe.

In this environment, the resources are shared, and thus to develop an accurate profile of the work demand, a comprehensive picture of both operational and project work is required.  This sort of implementation is significantly more complicated than the first model, as it requires both the integration of the costs as well as the work demand profile – i.e. I need some way of modeling the resource commitment to break-fix work in addition to the pipeline of proposed projects.

This kind of model also gets more complicated as actual costs are booked to both operational and project work – usually meaning either double booking resource time, using two different timesheet interfaces, or implementing a third party solution to provide a shared interface between operational and project tracking tools.

Moral of the Story

If you’re implementing an EPM tool, take the time up front during envisioning and certainly during the development of your road map to identify which operational integration model you will most likely be implementing – and to build into the deployment the specific features that will be required to support it.


Using the Bulk Import Tool to Edit Populated Lookup Fields

Well, I thought I had it all figured out with my review of the Bulk Import Tool….specifically with regards to using the Bulk Import Tool to perform bulk edits of existing data. 

Looks like I may have missed one key element.

Recently, when trying to edit a group of existing projects, the tool kept returning a customfieldmaxvaluesexceeded error.  The weird thing was that the error wouldn’t occur on every project, but only some projects – and then not on the same projects in DEV and PROD.

After playing with a couple scenarios, I think that I’ve figured out the issue.

The Bulk Import Tool will return the customfieldmaxvaluesexceeded error under the following circumstances:

  1. The user is attempting to edit a field connected to a lookup table.
  2. The field is already populated.
  3. The new value is different than the original value.

This issue will not occur when the new value is the same as the old value, or if the lookup value is blank before the edit.
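The three conditions above can be collapsed into a single predicate.  The function below is purely my own sketch of the observed behavior, not part of the Bulk Import Tool:

```python
# Sketch of the observed failure conditions: the error fires only when
# a populated lookup field is being changed to a different value.
def triggers_max_values_error(is_lookup_field, old_value, new_value):
    """True when an edit would raise customfieldmaxvaluesexceeded,
    per the three conditions observed above."""
    return (is_lookup_field
            and old_value is not None       # field already populated
            and new_value != old_value)     # value actually changes

print(triggers_max_values_error(True, "Active", "Closed"))   # True
print(triggers_max_values_error(True, None, "Closed"))       # False: was blank
print(triggers_max_values_error(True, "Active", "Active"))   # False: unchanged
```

Which also explains the DEV/PROD discrepancy: the same import can succeed or fail per project depending on whether each project’s lookup field happens to be blank, unchanged, or changing.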

I am still trying to identify a valid workaround, but for now, I might make the following recommendations:

  1. Manually edit those projects and clear out the lookup values.  Although, of course, by the time you do that, you may as well have just set the values correctly.
  2. Export the data into Excel.  Edit in Excel.  Delete the old lookup field, create a new one, and then use the Bulk Import Tool to modify the new data set.
  3. A variant of topic #2, but instead of deleting the field wholesale, simply delete the original value from the lookup table list.  This will blank the fields that contain that specific value.

All in all, I would say that #1 is probably the least work-intensive approach to getting around this restriction.
