Similar, but not the same.

Models, frameworks, methods, processes, procedures, and the list goes on and on.  Whether we are discussing Agile or plan-based software development, words like methods, models, frameworks, and processes are often used.  The people who use these terms in polite conversation often assume or imply a hierarchy. For example, a model might include one or more frameworks and be instantiated in several methods. Each layer in the hierarchy breaks down into one or more items at the next level. Words and their definitions are an important tool for understanding how all the pieces and parts fit together, and for interpreting conversations about how software is developed and maintained in the lunch room or in the hallways at conferences like Agile 2016. The unfortunate part is that few people agree on the hierarchy of models, methods, and frameworks.  These words are often used synonymously, sowing seeds of confusion and mass hysteria (ok, that might be a teeny tiny overstatement).

A proposed process hierarchy, or architecture, follows.

Baseline, not base line...

Measuring a process generates a baseline.  By contrast, a benchmark is a comparison of a baseline to another baseline.  Benchmarks can compare baselines to other internal baselines or to external baselines.  I am often asked whether it is possible to externally benchmark measures and metrics that have no industry definition or that are team specific. Without developing a common definition of the measure or metric so that data is comparable, the answer is no.  A valid baseline and benchmark require that the measure or metric being collected is defined and consistently collected by all parties using the benchmark.
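The relationship between the two terms can be made concrete with a short sketch. The measurements, the external figure, and the function-points-per-month metric below are all invented for illustration; only the baseline/benchmark relationship itself comes from the text above.

```python
from statistics import mean

# Measuring a process generates a baseline: here, hypothetical
# function points delivered per month over six months.
measurements = [48, 52, 55, 50, 53, 54]
baseline = mean(measurements)  # the internal baseline: 52.0 FP/month

# A benchmark is a comparison of one baseline to another --
# here against an invented external industry baseline.
external_baseline = 60.0
benchmark_ratio = baseline / external_baseline

print(f"Internal baseline: {baseline:.1f} FP/month")
print(f"Benchmark: {benchmark_ratio:.0%} of the external baseline")
```

The sketch also shows why the common definition matters: the division on the last line is only meaningful if both baselines count the same thing the same way.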

Measures or metrics used in external benchmarks need to be based on standards that are published or agreed upon between the parties involved in the benchmark.  Most examples of standards are obvious.  For example, in the software field there are a myriad of standards that can be leveraged to define software metrics.  Examples of standards groups include IEEE, ISO, IFPUG, COSMIC and OMG. Metrics defined by these standards can be externally benchmarked, and there are numerous sources of data.  Measures without international standards require all parties to specifically define what is being measured.  I recently ran across a simple example in which the definition of a month caused a lot of discussion.  An organization compared function points per month (a simple throughput metric) to benchmark data they had purchased.  The organization’s data was remarkably below the benchmark.  The problem was that the benchmark used the common definition of a month (12 in a year) while their data used an internal definition based on a 13-period year. Either the benchmark data or their data should have been modified to make the two comparable.
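The mismatch in that example is simple arithmetic: a 13-period year makes each "month" shorter, so throughput per period understates throughput per calendar month. A minimal sketch of the adjustment (the function name and the 60 FP figure are illustrative assumptions, not from the original benchmark):

```python
def to_calendar_months(fp_per_period: float, periods_per_year: int) -> float:
    """Convert throughput measured per internal period into
    throughput per calendar month (12 per year): annualize
    first, then divide by 12."""
    annual_fp = fp_per_period * periods_per_year
    return annual_fp / 12

# An organization on a 13-period year delivering 60 FP per period
# is actually delivering 65 FP per calendar month:
print(to_calendar_months(60, 13))  # 65.0
```

Without this kind of normalization, the organization's throughput would look about 8% worse than it really was, purely because of how a "month" was defined.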

Applying the defined metric consistently is also critical and not always a given.  For example, when discussing the cost of an IT project, understanding what is included is important for consistency.  Project costs could include hardware, software development and changes, purchased software, management costs, project management costs, business participation costs, and the list could go on ad infinitum.  Another example is the use of story points (a relative measure based on team perception). While a team may well be able to apply the measure consistently, because it is based on comparisons of perceptions, using story points outside of the team would be at best valueless and at worst dangerous.

The data needed to create a baseline and for a benchmark comparison must be based on a common definition that is understood by all parties, or the results will generate misunderstandings.  A common definition is only one step along the route to a valuable baseline or benchmark; the data collection must also be done on a consistent basis.  It is one thing to agree upon a definition and another to have that definition consistently applied during data collection. Even metrics like IFPUG Function Points, which have a standard definition and rigorous training, can show up to a five percent variance between counters.  Less rigorously defined and trained metrics are unknowns that require due diligence by anyone that uses them.


Effectiveness is an important concept that features centrally when discussing the performance of processes, teams and organizations. The problem is that many people don’t know precisely what the word means. Wikipedia defines effectiveness as “the capability of producing a desired result.”[1] The Business Dictionary expands the definition a bit: “The degree to which objectives are achieved and the extent to which targeted problems are solved.”[2] I use a simpler definition when talking about IT or process improvement projects: effectiveness is the capability of a process or project to do the right thing when compared to the goals of the organization.  There are two core questions that need to be reviewed: 1) What is the right thing? and 2) How do we measure effectiveness?

When we are interested in effectiveness, the focus shifts to understanding what the right thing is and then to ensuring we track the right thing. Software development personnel intuitively know that doing the right thing makes sense, but is very hard. Classic waterfall projects (projects with phases such as analysis, design, construction, and testing that are completed before moving to the next phase) create requirements documents to establish what the “right thing” is, then enforce reviews and sign-offs for feedback to stay on track. Agile projects build backlogs, involve product owners and perform sprint reviews to cyclically establish the right thing and to generate a feedback loop to stay on track. All projects want to deliver the right thing so that the organization can reach its goals.

When measuring effectiveness, the question you are answering is: did the work accomplish the goals of the project? It is difficult to directly measure the effectiveness of an IT project, because it generally is a component of a larger business product or program.  Rather, we generally focus on efficiency measures (cost, time-to-market, productivity and velocity). Note that there is a tendency to turn efficiency measures into tactical project goals and then to declare that the project was effective when they are met.  But the real goal of very few projects is to be on-time or on-budget. If the goal of the project is to help the organization deliver a new widget to market, then the metric is whether the new widget was delivered to market when it was promised. Focusing on the business goal of the project provides the basis to determine whether the project was effective or not.  Another commonly used proxy for effectiveness is customer satisfaction.  When measuring customer satisfaction you can ask the respondents whether they think the project delivered the right thing. Even when you ask, it makes sense to go back to the business goals of the project and compare what was delivered to those goals. Unresolved mismatches mean you were not as effective as possible.

A simple, workable definition of effectiveness for IT projects is the capability of a process or project to do the right thing when compared to the goals of the organization.  Effectiveness excites me because it forces me to think about the bigger picture, in particular the organizational goals that the project or process is striving to support.  Knowing that I am supporting the goals of the organization is motivational.





Memorial Day

On Memorial Day 2013, Daily Process Thoughts and The Software Process and Measurement Cast want to thank all of those in the United States Armed Services who have paid the ultimate price in service of their country.

For those not in the US, Memorial Day is a day dedicated to the remembrance of those who have fallen while serving in the United States Armed Services (Wikipedia has a good history).  It also marks the beginning of summer and, if you believe the Connecticut Post, the beginning of barbecue season. Conflating all of these ideas is at best confusing and maybe a bit demeaning to the tribute the day is supposed to reflect.

Definitions matter! A common set of definitions facilitates clarity of thought and communication. As we discussed in earlier Daily Process Thoughts, commitment is saying what we will do and then doing what we say. In order to make this type of commitment work, we need a common understanding of what we are actually trying to do.  A common set of definitions is a required first step.

What exactly do I mean when I use the terms Agile, framework or effective? Lack of common definitions may mean that we are speaking different languages. If you don’t think that can be a problem, just try to order noodles off the beaten track in Beijing if you only speak English. One of the team-building steps I commonly suggest is to create a common set of definitions early in a project. This can be done during project chartering by using brainstorming to identify potential stumbling blocks and then developing a common set of definitions, a team glossary and a common vehicle for communication.


Programming Notes:

Daily Process Thoughts will begin an arc focused on definitions.  We will tackle words like effectiveness, efficiency, framework… maybe even Agile and Waterfall.  Are there other terms you would like to put on the list?