Listen Now
Subscribe on iTunes
Check out the podcast on Google Play Music

Software Process and Measurement Cast 405 is a cornucopia of topics!  We begin by exploring a bit of the psychology of change in four short essays. These topics are important for any change agent at any level to understand. Change at any scale is not an easy task. Change requires establishing a goal, recruiting a sponsor, acquiring a budget, developing a set of plans and then there is the part where the miracle happens and people change. The last step is always the hardest and is often akin to herding cats. Psychology and sociology have identified many of the reasons why people embrace change and innovation in different ways.  

Our second column is from Jon M. Quigley.  We have settled on a name for the column, “The Alpha-Omega of Product Development.” In this month’s column, we discuss using metrics to dispel assumptions. Metrics don’t have to add overhead; for example, one technique we discussed was using planning poker to expose assumptions and then finding tactics to address them.

Anchoring the cast, Jeremy Berriault brings the QA Corner to the Software Process and Measurement Cast.  In this installment of the QA Corner, Jeremy talks about whether test automation scripting for new functions should be tackled or not.  Jeremy has an opinion and provides advice for testing professionals on a sticky topic.  

Re-Read Saturday News

This week we continue our re-read of Kent Beck’s XP Explained, Second Edition with a discussion of Chapters 12 and 13.  This week we tackle two concepts central to XP: planning and testing both done the XP way.   (more…)


Too big to fail?

Moral hazards occur when the potential outcome of taking a risk is disassociated from the party who will bear the cost of that risk.  Moral hazard is often caused by information asymmetry: the risk taker has more information than the person or organization that will bear the cost of the risk. Even though we often assume perfect information, or harp on the need for communication, information asymmetry is a common occurrence. Too big to fail is a form of moral hazard in which an organization may take larger risks, with the potential for larger returns, because it knows it will not be allowed to fail. (more…)

Moral Hazards In Software Development Processes
Thomas M Cagley, Jr.

A moral hazard occurs when a party insulated from risk behaves differently than it would if it were fully exposed to that risk. I recently listened to someone order two pounds of chicken wings and then comment that the drug he took for high cholesterol would protect him from the potential negative impact.  He felt insulated from the risk and therefore made a different choice than he would have if he had not been insulated. Discussions of financial crises and bank bailouts have embedded the term moral hazard into our day-to-day vocabulary (not just into discussions of behavioral economics). While it is rare, even today, to attach the concept of moral hazard to the use of a methodology, project management approach or software development framework, I would suggest that, when applied to the world of project governance, processes and methods can generate moral hazards that are not easily recognized or mitigated.

Projects, project managers and project teams can be insulated from risk through a number of common, innocuous and sometimes beneficial mechanisms.  Risk can be diluted through disassociation, shielding or risk sharing (decision via committee).  Another example of a moral hazard I recently observed was a project manager who strenuously argued to his project oversight committee for a decision to explore an experimental technology even though a more tried-and-true solution existed.  The project manager spun the argument in such a fashion as to favor the experiment.  The committee’s approval of the experiment diffused the risk and responsibility for the project manager, which encouraged a behavior pattern outside of the norm.  This example reflects a scenario where a moral hazard arises when individuals engage in risk sharing under conditions in which privately held knowledge affects the probability distribution of the outcome.

All methodologies, processes and groupthink can create scenarios where a moral hazard can occur, either by separating practitioners from the risk of the project or by diffusing risk in a way that separates actions from consequences. I would suggest that there are four simple scenarios in which frameworks, methods or processes can create an avenue where a moral hazard, or at the least the appearance of one, can occur.

You Don’t Eat Your Own Dog Food – Why would the groups that create process not follow the same level of process discipline they espouse when they create or change their own processes? Even if there is no moral hazard, no real separation of the process designers from risk, the perception is that the process designers know better than to follow the same discipline they build into the processes they design.  Solution:  Eat your own dog food.  Use the same level of discipline to design and implement process.


Zombie Templates – Following the process and completing templates unconsciously, in the belief that there is magic in the steps that will immunize the project from risk, puts everyone at risk until it is too late.  Note:  this type of behavior can also be attributed to passive-aggressive behavior, which is a different type of problem.  Solution:  All processes must be built to encourage active thought and interaction as they are used.  Practitioners involved in reviewing deliverables should actively convince themselves that risks and options have been weighed, rather than treating the review as a simple check-the-box activity.

Process Bailouts – Projects get in trouble for all sorts of reasons. I would suggest that following the process isn’t usually the root issue when a project is in trouble, but it is an easy target. Moral hazard attaches itself when processes are abandoned while other issues are actually at the heart of the problem. In many cases a project improves just because someone pays attention to it (look up the Hawthorne effect). Just because project performance improves does not mean that releasing the project from the process actually made things better; in the long run more harm might have been done.  Solution:  When a project is in trouble, do the due diligence to actually determine why.  Eschew knee-jerk reactions.  If the processes the project is using are at the heart of the matter, FIX them rather than embracing the age-old response of winging it.

High Cover – Meetings can be used as a mechanism for diffusing responsibility through risk sharing.  The diffusion of risk under the banner of collaboration sets up the possibility that privately held knowledge or actions can affect the outcome, because relevant knowledge is not widely held in the collective understanding.  I am not suggesting that a Borg-like consciousness is required.  Solution:  Construct teams to maximize the possibility of transparency.

Building processes and teams that foster knowledge sharing, and that can be changed to highlight and mitigate risk, are great first steps.  Transparency is another good step toward reducing the possibility of moral hazards, but transparency and good processes alone are not sufficient.  Organizational behavior change is required, and that is a topic for another time! Take the first steps of building your process infrastructure, then get ready to change the world.

The podcast version of this essay is available here: