This is Part 5 of a 6-part series on Portfolio Management. In previous articles, we covered various steps from the Management of Portfolios handbook and made a case for applying Decision Science to the Portfolio Definition cycle. In the table below, we highlight the pitfalls we often see emerge in Portfolio Definition, and how best to avoid them.
Pitfall | How to avoid |
---|---|
Decision makers and senior stakeholders do not own the process, criteria and scores, and hence dismiss the results of the Portfolio Definition cycle. | This pitfall goes against two of our Decision Thinking principles, described in the book “Hard Decisions Made Easy” by Catalyze CEO Paul Gordon. The first principle is Process before Content; the other is Active Stakeholder Participation. In practice, this means getting buy-in from the stakeholders for the overall process before populating the process with content. It also means ensuring that stakeholders are involved in developing the criteria. Their involvement will give them clarity on the value delivered (the criteria), which significantly increases the likelihood that the results of the Definition cycle make sense to them. |
Creating Initiatives and assessing strategic alignment without a clear, coherent strategy and set of strategic objectives. | As the Cheshire Cat said to Alice in Wonderland, it doesn’t matter which road you take if you don’t know where you want to go. Strategy first; without it, you can’t manage the portfolio. |
Using an incomplete set of criteria which do not fully encompass the strategic objectives and therefore fail to reflect some of the value delivered by the Initiatives | Work with senior leaders and stakeholders to develop the final set of criteria. Test the criteria to see that the value they articulate will cover the range of strategic objectives. If the answers from prioritisation don’t make sense, then find out why and look to see if some dimension of value has not been articulated in the criteria used. |
Failing to generate imaginative Initiatives which can deliver the strategy | A key aspect of portfolio prioritisation is ensuring there is plenty of choice, so you can determine where the sweet spot lies in terms of benefit/investment. These choices can be mutually exclusive options that build up (Option 2 is Option 1+, Option 3 is Option 2+, etc.) or cumulative (complementary to each other). |
Treating portfolio definition as a financial planning exercise as opposed to maximising value-for-money from limited resources. | Good practice is to go through the prioritisation without the costs associated with the initiatives being presented to the evaluators. Once benefits are evaluated, the benefit/cost ratios are presented. This keeps the focus of the conversation on the value that Initiatives will deliver, before getting clear on the value for money propositions. |
Prioritising Initiatives using a single subjective judgement, without using decision criteria, and scoring and weighting mechanisms that measure the relative value of the Initiatives. | This pitfall is at the heart of the benefits provided by MCDA and Decision Science overall. These approaches bring academic rigour to the process, which is so often missing from “home grown” prioritisation processes. |
Not taking into account the risks to delivering the Initiatives and their associated value. | If you have a range of risk settings for your initiatives, you need to adjust the expected benefits based on the associated risks. This can be done in a couple of different ways: either treating risk as a “reverse benefit” (a criterion where low-risk projects score highly) or scaling the benefits by the probability of success (the inverse of risk). |
Mistaking urgency for value and hence biasing the portfolio towards the short term. | This is a classic symptom of the human mind being better at processing things that are here and now. A good way of mitigating this “short-termism” is to have a specific criterion called Future Proofing, or some similar construct relevant to the organisation’s circumstances. This provides the opportunity to recognise and compare the long-term value that some projects deliver over short-term projects. |
Arbitrarily applying weights to criteria without adhering to the principles of decision science to derive a measure of relative value for each of the Initiatives. | This is a common issue and well understood in Decision Science. Using MCDA, this is addressed by deriving the criteria weights based on the value available from the highest scoring option on each criterion. To find out more about this approach called Swing Weighting, contact us for our white paper on the topic – just send a request via our contact us page. |
Relying on a tool to define the portfolio, as opposed to people engaged in a group process supported by tools. People make decisions, not tools. | This pitfall goes against our Decision Thinking principle of Active Stakeholder Participation. Decision Conferencing is a key element of our approach, bringing the stakeholders together. The resulting order of priority is not the final answer, though; it is a great place to start the conversation, with stakeholders making the final decision. |
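To make the mechanics behind several of these pitfalls concrete, here is a minimal sketch of an MCDA-style prioritisation: weighted criterion scores produce a benefit figure, benefits are scaled by probability of success (the risk-adjustment approach described above), and only then are costs introduced to rank by benefit/cost ratio. All criteria names, weights, scores, probabilities and costs are invented for illustration; this is not Catalyze’s actual Swing Weighting method, which their white paper describes.

```python
# Illustrative sketch only: hypothetical criteria, weights and data.
# Weights sum to 1; in practice they would be derived via a method
# such as Swing Weighting, not assigned arbitrarily.
weights = {"strategic_fit": 0.5, "customer_value": 0.3, "future_proofing": 0.2}

# Hypothetical initiatives: 0-100 scores per criterion, a probability
# of delivery success (the inverse of risk), and a cost in arbitrary units.
initiatives = {
    "A": {"scores": {"strategic_fit": 80, "customer_value": 60, "future_proofing": 30},
          "p_success": 0.90, "cost": 40},
    "B": {"scores": {"strategic_fit": 50, "customer_value": 90, "future_proofing": 70},
          "p_success": 0.70, "cost": 30},
    "C": {"scores": {"strategic_fit": 30, "customer_value": 40, "future_proofing": 90},
          "p_success": 0.95, "cost": 10},
}

def prioritise(initiatives, weights):
    """Rank initiatives by risk-adjusted benefit per unit cost."""
    results = []
    for name, data in initiatives.items():
        # Weighted sum of criterion scores = overall benefit
        benefit = sum(weights[c] * s for c, s in data["scores"].items())
        # Scale benefit by probability of success (risk adjustment)
        risk_adjusted = benefit * data["p_success"]
        # Only now bring in cost, to keep the earlier conversation on value
        results.append((name, risk_adjusted, risk_adjusted / data["cost"]))
    # Highest value-for-money first
    return sorted(results, key=lambda r: r[2], reverse=True)

for name, value, ratio in prioritise(initiatives, weights):
    print(f"{name}: risk-adjusted benefit={value:.1f}, benefit/cost={ratio:.2f}")
```

Note how the cheap, low-risk initiative can top the value-for-money ranking even with a lower raw benefit score; that is the point of presenting benefit/cost ratios only after benefits have been evaluated.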
In our next and final instalment of this series, we’ll review what we have covered from the Management of Portfolios handbook and outline ways to practically implement the Decision Thinking principles in your portfolio management.