Monte Carlo Simulations, Part III

Published in the Project Post-Gazette, Issue 2014-03
by Paul Lohnes, MBA, PMP

Continuing our series on the concepts and use of Monte Carlo Simulations (MCS) in project management and business analysis, this article begins illustrating their application in the Microsoft Excel spreadsheet application. Over 90% of the world's project managers (PMs) and business analysts (BAs) use Excel for estimation, financial reporting, and even project activity tracking, so it makes sense to work in the application most of us already know and use.

We at the PPG Research and Publishing division do not advocate the use of a technology, or in this case a technique, simply because it is elegant or sophisticated from a mathematical or erudite perspective (Google the word and you will see the joke). A technique must provide value and practical application at the right return on your investment of time and effort to be useful in your daily work life. MCS is valuable and useful, yes, but Excel also supports less stringent, less cerebrally challenging techniques, cousins of the more vaunted MCS, that we can use in situations where the full power of an MCS is not quite required. In addition, they can help us learn how to implement MCS in a step-wise format, dividing the journey into manageable sections with valuable and enjoyable waysides along the way, especially given our limited time together each issue.

The first module on our journey toward mastering MCS is the use of data tables. Data tables in Excel are the foundation for understanding MCS applications and must be mastered if MCS is to be the ultimate goal of our exercise together. They are among the most useful, yet probably most underutilized, higher-level features of Excel among both PMs and BAs. When asked about these amazing little gems of analysis support and decision-making aid, easily implemented and quite elegant in their presentation, most professional project managers and business analysts profess ignorance.

Data tables provide the user with a simple method for calculating straightforward values, such as the duration estimate we will use as our example in this article. Yes, you can use Excel to help you make better estimates for duration, costs, risks, scope impact, and more. There are literally hundreds of uses for Excel data tables and their more powerful cousins, the pivot tables, which we will cover after we have finished with MCS applications.

Data tables in Excel are part of a larger set of features called 'what-if analysis' capabilities, found under the Data menu in the What-If Analysis drop-down (Fig 1). You must first set up the data table before asking Excel to treat a certain portion of your worksheet as one. In our example (which you can download as a file: DataTables1_2014_3.xlsx) you will see the setup shown in Fig 2. Let us familiarize you with the worksheet and how data tables can help you with your duration estimation.


Figure 1

Using Fig 2, we have designed a duration estimate to be dependent on the following input parameters (do not worry about how we created the duration estimate – that will come later when we teach you about estimation):

  • Complexity (1 to 5)
  • Resource Allocation (1-100%)
  • Work hrs / day (1-12)
  • Work days / month (1-31)
  • Normal monthly work hrs (hrs)
  • Multitasking (yes/no)
  • Pure 3-point estimate, in the same units as the estimate, in this case hours.
Figure 2

Complexity is a value indicating the relative level of complexity of the task being estimated: normal complexity is 3, with low being 1 and high being 5. Resource allocation is the amount of effort your resource can apply to the task, in percent, where 100% means full time. Normal monthly work hours is the normal 8 hrs/day multiplied by the number of work days in a month; this gives us the ability to detect over-allocation conditions and supports 'crashing' and 'fast-tracking' the project, if needed. Multitasking indicates whether ('yes') or not ('no') the resource will be doing other things; think of Multitasking as a macro-level parameter where Resource Allocation is the micro-level parameter. Finally, the pure 3-point estimate is the estimate value without worrying about risks, multitasking, resource allocation, etc. It is the purest estimate of doing the work if you had a normal-complexity task with a 100% allocated resource of medium capabilities working normal work days in a normal work month. You can see this when the inputs are set to their defaults: 3, 100%, 8, 20, 160, No, 160. The duration formulation then renders a duration estimate of 160 hours. You will also notice that we have limited the input choices through Excel's data validation 'list' feature; the data for the lists is in the 'Data' worksheet. Complexity must be a value from 1 to 5, as shown in Fig 3.
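The article does not reproduce the workbook's actual CalcEstimate formula, but the relationship between the inputs and the 160-hour baseline can be sketched in code. The function below is a hypothetical stand-in: the ±25% per complexity level, the allocation divisor, and the 20% multitasking penalty are our illustrative assumptions, not the spreadsheet's formula. What matters is that, with all inputs at their defaults, it reproduces the 160-hour pure estimate.

```python
def estimate_duration(base_hours=160, complexity=3, allocation=1.0,
                      multitasking=False):
    """Hypothetical duration model -- a stand-in for the workbook's
    CalcEstimate formula, which the article does not spell out."""
    # Assumed: each complexity level above/below normal (3) adds/removes 25%.
    complexity_factor = 1 + 0.25 * (complexity - 3)
    # Assumed: a partially allocated resource stretches the duration.
    duration = base_hours * complexity_factor / allocation
    # Assumed: multitasking adds a 20% context-switching penalty.
    if multitasking:
        duration *= 1.2
    return duration

print(estimate_duration())  # defaults reproduce the 160-hour baseline
```

Any monotonic model with these inputs would serve the same pedagogical purpose; the point is simply that the estimate is a function of the input cells, which is what the data table will exploit.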

Figure 3

Now, how can data tables help if we want to vary one of the inputs, say complexity? How will this impact our duration estimate? Using Fig 4, you can see that we have created a two-column table with the values of complexity going down the first column; the column to its right shows how the duration would vary for each value of complexity. Note that for complexity = 3 the result is 160, the normal value for this example, but for each value of complexity below and above it, the estimate varies to show how different levels of complexity will impact the duration estimate.

Figure 4

To set up the data table, the column labeled Complexity holds the values of complexity over which we want to vary the scenarios, and the impact on our duration estimates will be calculated in the column labeled Duration. These two columns form the heart of our data table. You will notice in Fig 5 that cell H7 (see your data file worksheet) is set equal to the named value 'CalcEstimate' calculated in C15; this supplies the values of the data table that result from the different scenarios of complexity in cells G8-G12 (Fig 6). Once you have these columns of the data table set up (G8-H12), select the entire data table cell range (G8-H12), and then select the Data Table menu item under the What-If Analysis choice of the Data menu (see Fig 1). In Fig 7, the Data Table input cell selection window opens; since we are using columns for our complexity values, we select $C$7 of the Inputs to indicate that this is the parameter that will be varied for the different scenarios, producing the calculations in cells H8-H12 for Duration.
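Conceptually, a one-variable data table does nothing more than substitute each value from the scenario column into the input cell ($C$7 here), recompute the workbook, and record the result. A minimal sketch of that loop, using an assumed linear duration model (160 hours at complexity 3, ±25% per level; our illustration, not the workbook's formula):

```python
# What Excel's one-variable data table does under the hood: substitute
# each candidate value into the input cell, recompute, record the result.
def duration_for(complexity):
    # Assumed illustrative model: 160 h at complexity 3, +/-25% per level.
    return 160 * (1 + 0.25 * (complexity - 3))

complexities = [1, 2, 3, 4, 5]                       # column G: scenarios
durations = [duration_for(c) for c in complexities]  # column H: results

for c, d in zip(complexities, durations):
    print(f"{c}\t{d:.0f}")
```

Excel handles the substitution and recalculation for you; the two columns you set up are simply the inputs and outputs of this loop.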

Figure 5


Figure 6

Figure 7

Finally, using Fig 8, you can see the chart we provided to show the variation in duration estimates given the change in complexity levels. Since it is a straight line, you could ask Excel to provide you with the linear regression (trendline) equation if you desired. So, we have shown how you can set up a data table to calculate different scenarios for your duration estimate given changes in the complexity of your task. For those interested in the grey column E, we have 'greyed' some intermediate calculations to make the main duration calculation easier to understand; Fig 9 shows some of these calculations 'un-greyed' to illustrate their values. Fig 10 shows the choices that Excel offers through its Quick Analysis window: pre-formatted, ready-made analysis choices if you do not want to build your own.

Figure 8

Figure 9

Figure 10

Now that you know how to set up data tables, we suggest that you play with our spreadsheet over the next 4 to 5 weeks until we bring you the next article in this MCS series, on how to set up and use two-way and multi-way data tables that vary multiple input parameters to see how they impact duration estimates. Yes, this means we can vary not only complexity but also allocation, hours per day, days per month, and multitasking in the same analysis to see how they impact our duration. With a bit of calculus (which we will save you the trouble of working out), you can even find the optimal values of all your inputs that minimize the task duration. Interested? Come back and see us as the series continues.
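As a preview of where the series is headed, varying two inputs at once is, conceptually, just a nested loop over both scenario lists; a two-way data table fills the same grid of results. A sketch, again with an assumed duration model (160 hours at complexity 3 and full allocation; our illustration, not the workbook's formula):

```python
# Two-way variation: complexity down the rows, allocation across the columns.
def duration(complexity, allocation):
    # Assumed illustrative model, not the workbook's formula.
    return 160 * (1 + 0.25 * (complexity - 3)) / allocation

allocations = [0.5, 0.75, 1.0]
grid = {(c, a): duration(c, a)
        for c in range(1, 6) for a in allocations}

print(f"{grid[(3, 1.0)]:.0f}")  # baseline cell: 160
```

Each cell of the grid is one scenario; Excel's two-way data table computes exactly this grid from a row of values, a column of values, and one input-cell reference for each.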
