#NoEstimates is a lightning rod.


Estimation is one of the lightning rod issues in software development and maintenance. Over the past few years the concept of #NoEstimates has emerged and become a movement within the Agile community. Due to its newness, #NoEstimates has several camps revolving around a central concept. This essay begins the process of identifying and defining a core set of concepts in order to have a measured discussion. A shared language across the gamut of estimating ideas, Agile or not, including #NoEstimates, is critical for comparing the concepts. We begin our exploration of the ideas around #NoEstimates by establishing a context that includes both classic estimation and #NoEstimates.

Classic Estimation Context: Estimation as a topic is often a synthesis of three related, but different concepts: budgeting, estimation and planning. Because these three concepts are often conflated, it is important to understand the relationship between them. These are typical in a normal commercial organization; the concepts might be called different things depending on your business model. An estimate is a finite approximation of cost, effort and/or duration based on some basis of knowledge (known as the basis of estimation). The flow of activity conflated as estimation often runs from budget, to project estimate, to plan. In most organizations, the act of generating a finite approximation typically begins as a form of portfolio management in order to generate a budget for a department or group. The budgeting process helps make decisions about which pieces of work are to be done. Most organizations have a portfolio of work that is larger than they can accomplish, therefore they need a mechanism to prioritize. Most portfolio managers, whether proponents of an Agile or a classic approach, would defend using value as a key determinant of prioritization. Value requires having some type of forecast of the cost and benefit of the project over some timeframe. Once a project enters the pipeline in a classic organization, an estimate is typically generated. The estimate is generally believed to be more accurate than the original budget due to the information gathered as the project is groomed to begin. Plans break stories down into tasks, often with personnel assigned; an estimate of effort is generated at the task level and the task estimates are summed into higher-level estimates. Any of these steps can (but should not) be called estimation. The three-level process described above, if misused, can cause several team and organizational issues.
Proponents of the #NoEstimates movement often classify these issues as estimation pathologies; we will explore these “pathologies” in later essays.

#NoEstimates Context: There are two macro camps in the #NoEstimates movement (the two camps probably reflect more of a continuum of ideas than absolutes). The first camp argues that a team should break work down into small chunks and then immediately begin completing those small chunks (doing the highest value first). The chunks build up quickly to a minimum viable product (MVP) that can generate feedback, so the team can hone its ability to deliver value. I call this camp the “Feedback’ers”, and luminaries like Woody Zuill often champion this camp. A second camp begins in a similar manner – by breaking the work into small pieces, prioritizing on value (and perhaps risk), and delivering against an MVP to generate feedback – but they measure throughput. Throughput is a measure of how many units of work (e.g. stories or widgets) a team can deliver in a specific period of time. Continuously measuring the throughput of the team provides a tool to understand when work needs to start in order to be delivered within a period of time. Average throughput is used to provide the team and other stakeholders with a forecast of the future. This is very similar to the throughput measures used in Kanban. People like Vasco Duarte (listen to my interview with Vasco) champion the second camp, which I tend to call the “Kanban’ers”. I recently heard David Anderson, the Kanban visionary, discuss a similar #NoEstimates position using throughput as a forecasting tool. Both camps in the #NoEstimates movement eschew developing story- or task-level estimates. The major difference between them is the use of throughput to provide forecasting, a role played by bottom-up estimating and planning at the lowest level of the classic estimation continuum.
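The “Kanban’er” forecasting idea can be sketched in a few lines. This is a minimal illustration, not a prescribed method, and all the numbers are hypothetical:

```python
import math

def forecast_periods(history, remaining):
    """Given items completed per period (e.g. per sprint), forecast how
    many future periods the remaining backlog needs at average throughput."""
    avg = sum(history) / len(history)
    # Round up: a partially used period still has to be worked.
    return math.ceil(remaining / avg)

# Hypothetical team: stories finished in the last five sprints.
completed = [6, 8, 7, 5, 9]             # average throughput = 7 stories/sprint
print(forecast_periods(completed, 42))  # 42 stories remaining -> prints 6
```

The forecast is only as stable as the team's throughput; that is why this camp keeps measuring continuously rather than estimating once.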

When done correctly, both #NoEstimates and classic estimation are tools to generate feedback and create guidance for the organization. In its purest form #NoEstimates uses functionality to generate feedback and to provide guidance about what is possible. The less absolutist “Kanban’er” form of #NoEstimates uses both functional software and throughput measures as feedback and guidance tools. Classic estimation tools use plans and performance against the plan to generate feedback and guidance. The goal is usually the same; it is just that the mechanisms are very different. With this established context and vocabulary, we can explore the concepts more deeply.

Listen to the Software Process and Measurement Cast 303

Software Process and Measurement Cast number 303 features our essay titled “Topics in Estimation.” This essay is a collection of smaller essays that cover a wide range of issues affecting estimation. Topics include estimation and customer satisfaction, risk and project estimates, estimation frameworks, and size and estimation. Something to help and irritate everyone – we are talking about estimation, what would you expect?

We also have a new installment of Kim Pries’s Software Sensei column.  In this installment Kim discusses education as defect prevention.  Do we really believe that education improves productivity, quality and time to market?

Listen to the Software Process and Measurement Cast 303


Software Process and Measurement Cast number 304 will feature our long-awaited interview with Jamie Lynn Cooke, author of The Power of the Agile Business Analyst. We discussed the definition of an Agile business analyst and what they actually do on Agile projects. Jamie provides a clear and succinct explanation of the role and value of Agile business analysts.

Upcoming Events

I will be presenting at the International Conference on Software Quality and Test Management in San Diego, CA on October 1.  I have a great discount code!!!! Contact me if you are interested!

I will be presenting at the North East Quality Council 60th Conference October 21st and 22nd in Springfield, MA.

More on all of these great events in the near future! I look forward to seeing all SPaMCAST readers and listeners that attend these great events!

The Software Process and Measurement Cast has a sponsor.

As many of you know, I do at least one webinar for the IT Metrics and Productivity Institute (ITMPI) every year. The ITMPI provides a great service to the IT profession. ITMPI’s mission is to pull together the expertise and educational efforts of the world’s leading IT thought leaders and to create a single online destination where IT practitioners and executives can meet all of their educational and professional development needs. The ITMPI offers a premium membership that gives members unlimited free access to 400 PDU-accredited webinar recordings and waives the PDU processing fees on all live and recorded webinars. The Software Process and Measurement Cast receives some support if you sign up here. All the revenue our sponsorship generates goes toward bandwidth, hosting and new cool equipment to create more and better content for you. Support the SPaMCAST and learn from the ITMPI.

Shameless Ad for my book!

Mastering Software Project Management: Best Practices, Tools and Techniques was co-authored by Murali Chematuri and myself and published by J. Ross Publishing. We have received unsolicited reviews like the following: “This book will prove that software projects should not be a tedious process, neither for you or your team.” Support SPaMCAST by buying the book here.

Available in English and Chinese.

Sometimes you need a seeing eye dog to see the solution.


In the entry, The Top Five Issues In Project Estimation, we identified the five macro categories of estimation problems generated when I asked a group of people the question “What are the two largest issues in project estimation?” Knowing what the issues are is important; however, it is equally important to have a set of solutions.

  1. Requirements. Techniques that reduce the impact of unclear and changing requirements on budgeting and estimation include release plans, identifying a clear minimum viable product and changing how requirements changes are viewed when judging project success. See Requirements: The Chronic Problem with Project Estimation.
  2. Estimate Reliability. Recognize that budgets, estimates and plans are subject to the cone of uncertainty.  The cone of uncertainty reflects the fact that the earlier you are in a project, the less you know about it.  Predictions of the future will be more variable the less you know about the project.  Budgets, estimates and plans are predictions of cost, effort, duration or size.
  3. Project History.  Collect predicted and actual project size, effort, duration and other project demographics for each project.  Project history can be used both as the basis for analogous estimates and/or to train parametric estimation tools.  The act of collecting the quantitative history and the qualitative story about how projects performed is a useful form of introspection that can drive change.
  4. Labor Hours Are Not The Same As Size.  Implement functional (e.g. IFPUG Function Points) or relative sizing (Story Points) as a step in the estimation process. The act of focusing on size separately allows estimators to gain greater focus on the other parts of the estimation process like team capabilities, processes, risks or changes that will affect velocity.  Greater focus leads to greater understanding, which leads to a better estimate.
  5. No One Dedicated to Estimation.  Estimating is a skill that requires practice to develop consistency.  While everyone should understand the concepts of estimation, consistency will be gained faster if someone is dedicated to learning and executing the estimation process.
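Solution 3 (project history) can be made concrete with a small sketch: use completed projects of similar size as the basis for an analogous estimate. The project figures and the comparability tolerance below are invented for illustration:

```python
def analogous_effort(history, new_size, tolerance=0.25):
    """Estimate effort (hours) for a project of new_size (function points)
    from completed projects within +/- tolerance of that size."""
    similar = [p for p in history
               if abs(p["size"] - new_size) <= tolerance * new_size]
    if not similar:
        raise ValueError("no comparable projects in history")
    rates = [p["effort"] / p["size"] for p in similar]  # hours per FP
    return new_size * sum(rates) / len(rates)

history = [
    {"size": 200, "effort": 2400},  # 12 hours per function point
    {"size": 240, "effort": 2640},  # 11 hours per function point
    {"size": 500, "effort": 9000},  # too large to be comparable to 220 FP
]
print(round(analogous_effort(history, 220)))  # prints 2530
```

The same history that feeds an analogy like this is also the raw material for training a parametric estimation tool.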

Solving the five macro estimation problems requires organizational change.  Many of the changes required are difficult because they are less about “how” to estimate and more about what we think estimates are, which leads into a discussion of why we estimate.  Organizations budget and estimate to provide direction at a high level.  At this level budgets and estimates affect planning for tax accruals and communicating portfolio-level decisions to organizational stakeholders.  Investing in improving how organizations estimate will improve communication between CIOs, CFOs and business stakeholders.

Like IT professionals, perhaps too optimistic?


In Software Project Estimation: The Budget, Estimate, Plan Continuum we defined a numerical continuum that makes up estimation.  There are numerous specific techniques for generating budgets, estimates and plans.  The techniques can be sorted into three basic categories, and hybrids exist that leverage components of each.


Expert techniques use the judgment, generally based on experience, of an individual to determine the cost, duration or effort of a project. The primary strengths of an expert approach are that it can be developed relatively quickly and that it is championed by a person who has developed a high level of organizational trust. The obvious weakness of these techniques is the reliance on an individual with all of his or her biases.  Dr. Ricardo Valerdi in SPaMCAST 84 noted his research has found that IT personnel are notoriously poor estimators.  One of the reasons cited in the interview was that IT personnel are generally overly optimistic about their problem-solving ability. Techniques such as Delphi and Planning Poker use multiple experts to fight individual bias, using collaboration in an attempt to triangulate on a better answer.  Estimating by analogy leverages past performance on a specific project to anchor the estimator’s memory, and then uses judgment to determine how much one project will be like another. Expert techniques make the most sense when there is little or no data on which to base the prediction, for instance when a budget is being developed.

The second category is parametric estimation.  Parametric estimation is generally an estimation technique (as opposed to a budgeting or planning technique, although many commercial products also include planning features) that generates an estimate from historical data on productivity, staffing and quality, which is used to create a set of equations.  These equations are then fed information about the size of the project (IFPUG Function Points, for example), project complexity and the predicted capabilities of the team.  Tools like SEER-SEM and COCOMO II are parametric estimation tools. The strengths of parametric estimates are derived from the historical performance data they use to generate the estimates and from the enforced rigor of the estimation process.  The weakness of any parametric estimation model is that it requires the estimator to generate, or have access to, a numerical size, which can add overhead to the project or take time that might otherwise be spent building the software.  We have discussed the fallacy of these issues in the discussion of IFPUG function points.  A bigger issue exists when there is no historical data that can be used to generate the productivity equations.  When no internal data exists I would recommend seeking external data (many firms, including the David Consulting Group – my day job – and ISBSG, can sell or help you with this issue).  When no trustworthy data exists, parametric estimation does not make sense.
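The parametric idea can be reduced to a toy model: fit a productivity signature from historical (size, effort) pairs, then apply it to a new project's size. Real tools such as SEER-SEM or COCOMO II use far richer models; the power-law form and the history below are illustrative assumptions only:

```python
import math

def fit_power_law(projects):
    """Fit effort = a * size^b by least squares in log space."""
    n = len(projects)
    xs = [math.log(s) for s, _ in projects]
    ys = [math.log(e) for _, e in projects]
    mx, my = sum(xs) / n, sum(ys) / n
    b = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
    a = math.exp(my - b * mx)
    return a, b

# Hypothetical history: (size in function points, effort in hours).
history = [(100, 900), (200, 2000), (400, 4400), (800, 9600)]
a, b = fit_power_law(history)
estimate = a * 300 ** b  # estimated effort for a 300 FP project
```

With this data the fitted exponent b comes out slightly above 1, reflecting the diseconomy of scale that parametric models typically encode.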

Work breakdown structures are the third category.  This category is generally used for planning and, in some cases, as a means of building a bottom-up estimate.  In this category a planner or team generates a list of the tasks needed to complete the job. The level of granularity of the tasks can vary greatly – I had a colleague who planned tasks in hourly increments. Constraints, staffing and sequence can be added to the plan to generate a schedule.  The sprint backlog used in Scrum is a form of this technique.  The power of these techniques is derived from a focus on what is to be done, by whom and when, at an actionable level of detail. The problem is that you need an incredible amount of information about the project and project team to be able to generate an accurate task list, let alone an accurate project schedule.  It is well known that the amount of data needed for this technique is generally only accurately known over short time horizons; however, I have seen processes that require detailed schedules for long projects up to a year before they are scheduled to start.  These techniques are best used for deriving short-term plans.
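Stripped to its essentials, a work breakdown structure used for bottom-up estimation is just task-level effort rolled up into totals. The phases, tasks and hours below are invented for illustration:

```python
# A tiny WBS: phases containing task-level effort estimates (hours).
wbs = {
    "Design": {"Write spec": 16, "Review spec": 4},
    "Build":  {"Code feature": 40, "Unit test": 12},
    "Verify": {"System test": 20, "Fix defects": 8},
}

# Bottom-up roll-up: sum tasks into phases, phases into the project.
phase_totals = {phase: sum(tasks.values()) for phase, tasks in wbs.items()}
project_total = sum(phase_totals.values())

print(phase_totals)   # {'Design': 20, 'Build': 52, 'Verify': 28}
print(project_total)  # 100 hours
```

The arithmetic is trivial; the hard part, as noted above, is knowing enough to write an accurate task list in the first place.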

Most IT organizations tend to fixate on one of these categories of techniques; however, organizations that understand the differences between a budget, an estimate and a plan will use techniques from all three categories, using the data and knowledge gained from each tool or technique as a feedback loop to improve the performance of all the techniques they use.  For example, an organization I recently spoke with uses both parametric and expert techniques to generate estimates for critical projects.  Each technique causes the estimation team to surface different assumptions that need to be understood when deciding which work can be done and how much money to ask for from the business.

Do I budget, estimate or plan the number of crawfish I am going to eat?


Software project estimation is a conflation of three related but different concepts. The three concepts are budgeting, estimation and planning.  These are typical in a normal commercial organization; the concepts might be called different things depending on your business model.  For example, organizations that sell software services typically develop sales bids instead of budgets.  The budget-to-estimate-to-plan evolution follows the path a project team takes as they learn about the project.

Budgeting is the first step.  You usually build a budget at the point at which you know the least about the project.  Looking back on my corporate career, I can’t count how many late nights were spent in November conceptualizing projects with my clients so that we could build a budget for the following year.  The figures we came up with were (at best) based on an analogy to a similar project.  Even more intriguing was that accounting expected you to submit a single number for each project (if you threw in decimal points they were more apt to believe the number).  Budget figures are generated when we know the least, which means we are at the widest point of the cone of uncertainty.  A term that is sometimes used instead of budget is rough order of magnitude.

The second stop on the estimation journey generally occurs as the project is considered or staged in the overall project portfolio.  An estimate generally provides a more finite approximation of cost, effort and duration based on deeper knowledge. There is a wide range of techniques for generating an estimate, ranging from analogies to parametric estimation (a process based on quantitative inputs, expected behaviors and past performance).  The method you use depends on the organization’s culture and the amount of available information.  Mature estimation organizations almost always express an estimate either as a range or as a probability.  Estimates can be generated iteratively as you gather new information and experience.
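One common way to express an estimate as a range rather than a point is a three-point (PERT-style) calculation built from optimistic, most likely and pessimistic values. The input figures are hypothetical, and the beta-distribution weights are the textbook ones, not an organizational standard:

```python
def pert(optimistic, likely, pessimistic):
    """Return the classic three-point mean and standard deviation."""
    mean = (optimistic + 4 * likely + pessimistic) / 6
    std_dev = (pessimistic - optimistic) / 6
    return mean, std_dev

mean, sd = pert(400, 600, 1100)    # effort in hours
low, high = mean - sd, mean + sd   # roughly a one-sigma band
print(f"{mean:.0f} hours, likely range {low:.0f}-{high:.0f}")
# prints: 650 hours, likely range 533-767
```

Reporting the range (or a probability attached to a number) is what distinguishes a mature estimate from a budget figure.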

The final stop in the decomposition from the budget to the estimate is the plan. A plan is the work breakdown structure for the project (generally with task estimates and resources assigned) or a task list. In order to create a plan you must have a fairly precise understanding of what you are building and how you are going to build it.  Good planning can only occur when a team is in the thinnest part of the cone of uncertainty – in other words, where you have significant knowledge and information about what you are planning.  Immature organizations often build a plan for a project, sum the effort and cost, and then call the total an estimate (this is called bottom-up estimating), which means they must pretend to know more than they really can. More mature organizations plan iteratively up to a short-term planning horizon (in Agile, the duration of a sprint) and then estimate (top-down) for periods outside the short-term planning window.

Short Descriptions:

  • Budgeting: Defines how much we have to spend and influences scope.   A budget is generally a single number that ignores the cone of uncertainty.
  • Estimating: Defines an approximation of one or more of the basic attributes that define the size of the project: cost, effort and duration. An estimate is generally given as a range based on where the project is in the cone of uncertainty.
  • Planning: Builds the task list or the work breakdown so that resources can be planned and organized. Planning occurs at the narrowest part of the cone of uncertainty.

Estimating means many things to many people.  In order to understand the process and why some form of estimation will always be required in any organization, we need to unpack the term and consider each of the component parts.  Each step along the continuum from budgeting to planning provides different information and requires different levels of information, ranging from the classic back-of-the-napkin concept (budget) to a task list generated in a sprint planning session (plan).  Having one does not replace the need for the other.

Agile Estimation Using Functional Metrics, Part 1
by Thomas M. Cagley Jr.

The term agile has come to mean many things to many people.  The definitions and connotations range from how work is organized within a project to a description of the speed at which work is completed or alternately a radical rethinking of organizational culture.   Regardless of how you define agile I would suggest that we all would agree that agile methods are now maturing.  Part of the process of maturing is the incorporation of best practices from other methods and frameworks creating a hybrid.  The fringe is influencing the center and the center is influencing the fringe.  The hybrid is at once better than any of the absolutes and threatening to those who believe in absolutes.

Estimation has been a lightning rod in the discussion of all methods (agile, waterfall, iterative or water fountain), with the issues of predictability and standardization radiating outward.  Because of the controversy, this is an area where a wide range of hybridization has always occurred.  Organizations adjust techniques to fit governance structures, culture and risk profiles.  There is no one-size-fits-all solution.  This paper provides a path for incorporating the use of function points into agile estimation techniques.  The process will yield an estimation process that combines one part functional metrics and one part parametric estimation techniques with two parts agile estimation (heavily influenced by Mike Cohn).  I would suggest that functional metrics provide a path for combining the best practices of robust software sizing with the collaborative techniques championed by the agile community in a manner that increases standardization without ignoring the principles of the Agile Manifesto.

Budgeting, Estimation and Planning

I’d like to begin this discussion by challenging your preconceived notion of estimation as compared to the activities of budgeting and planning.  These three concepts are sometimes thought of as synonymous; however, I believe it is important to understand just how different they are.  Each has different inputs and outputs, uses different tools and techniques, and is generally used by different groups within the organization.

A quick overview of the macro differences:

  • Budgeting
      ◦ Defines how much we have to spend based on the influence of scope
      ◦ Tends to ignore the cone of uncertainty
  • Estimation
      ◦ Presents an approximation of effort and duration based on size and project nature
      ◦ Focused by the cone of uncertainty (a range based on knowledge)
  • Planning
      ◦ Defines tasks and allocates resources
      ◦ Focused on the narrow part of the cone of uncertainty (a much smaller range)

Estimation, planning and budgeting might be related, but they are certainly not the same.  The use of functional metrics in agile estimation is targeted at the estimation layer of this three-layer cake but provides support for planning.  Developing a basic understanding of the components of estimation (we are going to ignore budgeting as a bastion of guesses) and its relationship to sizing is critical to using these techniques.


Estimation is several parts science and at least one part magic.  This strange confluence of science and magic defines the transformation of requirements size, skills, people and equipment into how much the project will cost and how much effort it will take.  The whole process of transformation is bound by a cone of uncertainty.  Uncertainty builds boundaries around the false precision of the estimate, providing a range around the estimate based on what is known and unknown.  Collaborative estimation techniques are good at increasing team knowledge while reducing the amount of self-deceit that can occur when knowledge is discussed.

The amount of art increases as the estimation discipline is replaced by the planning discipline.  The art of planning matches specific tasks with people through a process of assignment.  In a perfect world estimation and planning would be done together in a seamless workflow, but estimates generally happen earlier in the project lifecycle, before you can decompose work into the tasks that planning requires.

The simplest form of any estimation model, human or tool based, is a mathematical mash-up of size (implied or counted), team and organizational behavioral attributes, and degree of difficulty (technical complexity) applied to a productivity signature.  As the level of sophistication in the mathematics increases, tools like SEER, SLIM or KnowledgePlan make sense.  Other methods raise the level of collaboration and do any of the required math in the heads of the participants; these techniques include Delphi, analogy and planning poker.  The process in this paper splits the difference, leveraging collaboration to increase participation and self-knowledge while suggesting the use of a simple spreadsheet-based parametric model to increase consistency and standardization.
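That “simplest form” can be written down directly: size times adjustment factors applied to a productivity signature. All the factors and rates below are invented placeholders; a real model would calibrate them from organizational history:

```python
def estimate_effort(size_fp, team_factor, complexity_factor,
                    hours_per_fp=10.0):
    """Size (function points) * behavioral and difficulty adjustments
    * productivity signature (hours per function point)."""
    return size_fp * team_factor * complexity_factor * hours_per_fp

# Hypothetical project: 150 FP, experienced team (0.9 multiplier),
# above-average technical complexity (1.2 multiplier).
effort = estimate_effort(150, 0.9, 1.2)
print(effort)  # about 1620 hours
```

A spreadsheet version of this is exactly the kind of lightweight parametric model the paper suggests pairing with collaborative techniques.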

Sounds simple, right?  Estimation has been a nagging pain in every IT manager’s backside since the first user asked how much a project would cost and when it would be done.  We have gotten pretty good at budgeting using techniques like the “x number of people times 20 hours in a day and you’ll get something next year” method.  It’s when we try to figure out how much functionality will be delivered in real life that things start to break down, or at least get very, very complicated.

There are three main categories of problems that cause estimation to be problematic in the real world.
  1. Uncertainty: how much do you know about what you’re building?
  2. Self-knowledge: what do you really know about yourself and your team?
  3. Consistency of method: do you have a process for estimating?

Just How Badly Do You Want A Number?

Thomas M Cagley Jr.

An audio version of this essay can be found in the Software Process and Measurement Cast 38 (www.spamcast.net)

Every project begins with a prediction of how much it will cost and when it will be delivered. Project managers, as a rule, admit this behavior delivers results that are mistaken, can lead to perception problems and might actually warp space and time. Most projects I see begin with this sort of incantation. Why? The rationale for this seemingly irrational behavior is generated by many competing forces. It might be a reaction to a market deadline, a requirement to secure funding, or the mistaken impression that the project team actually knows what they need to know to complete the project. Every IT manager I know understands the fallacy of the initial estimate; however, I know very few, if any, who will actually stand up and just say no. This behavior causes cognitive dissonance (stress caused by holding two contradictory ideas simultaneously), but it is assuaged by continuing to look for a solution to stop the insanity (or by giving up and embracing the dark side).

Why, if we all know this type of behavior is wrong and that the outcome of the behavior is rarely effective, do we continue to do it? Why do we feel compelled to act in a manner that is non-nominal? Do Information Technology (IT) managers and project managers have a touch of a victim complex? The driver for this behavior begins at the interface between IT and finance known as the budgeting or funding process. While not the root of all evil, financial procedures and thinking are at times at odds with many standard ideas for planning and estimating a project. Concepts like ROI and tax accruals have precise, predictable financial definitions and calculations. Financial control and analysis require a level of precision in reporting project data. The problem is that this level of precision is generally at odds with initial project estimates. There will always be a mismatch between finance’s needs and IT’s data needs unless projects are being actively measured and are using mechanisms to continually assess what they need to know to complete the project.

Agile methods such as XP and Scrum attempt to make peace with the initial estimation conundrum by breaking projects into small bites and only making promises for each small bite prior to beginning the work for that bite. In Scrum terms, the sprint planning exercise is an operational illustration of how agile methods have used a short-term planning horizon to address the ‘how much’ and ‘when’ questions. This does not address the need to know when the overall project will be done and how much it will cost. You will encounter the same problem we began this essay with if you extend the short-term planning window to the entire known project backlog by counting the number of sprints or iterations required.

Non-agile shops have adopted other tools to deal with the vagaries of early estimation and the perceived need for precision. The estimation funnel is an example of one such strategy. An estimation funnel helps enforce the understanding that the variance of any prediction is larger earlier in the project and shrinks as the project team learns what they need to deliver and how to deliver it.
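The funnel amounts to re-expressing the same point estimate as a range that narrows by phase. The phase names and multipliers below are placeholders in the spirit of the cone of uncertainty, not standard values:

```python
# Hypothetical funnel: (phase, range multiplier applied to the estimate).
FUNNEL = [
    ("Initial concept",     4.0),
    ("Approved definition", 2.0),
    ("Requirements done",   1.5),
    ("Design done",         1.25),
]

point_estimate = 1000  # hours
for phase, factor in FUNNEL:
    low, high = point_estimate / factor, point_estimate * factor
    print(f"{phase}: {low:.0f}-{high:.0f} hours")
    # e.g. first line: "Initial concept: 250-4000 hours"
```

Publishing the range instead of the point is what keeps early numbers from being mistaken for commitments.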

In the end, however, nature, our society, finance and our fellow project managers betray the quest to abandon the initial fixed estimate. Nature’s betrayal comes in the form of the belief that Sir Isaac Newton had something to do with project management. We believe that each action has an equal and opposite reaction, or that if we do step A then outcome B will magically appear. Remember that the combination of humans and physics does not a straight line make. Projects are human ventures, and therefore are more akin to herding cats than to the laws of physics. Society’s betrayal is driven by ingrained consumerism. As a consumer, you and I have an expectation that if we’re going to buy something, regardless of complexity, someone will tell us for how much to write the check. An example of this type of behavior can be seen in the general hullabaloo that occurs when NASA overruns a cost estimate for the space station, despite the incredible level of complexity. How can anyone know exactly how much a project of this type will cost when it begins? The finance department’s betrayal is through their need to plan, secure funding and understand cash flow. Making a significant mistake in the financial arena will have disastrous consequences for any company’s value and potentially its ability to make payroll. Most importantly, project managers let themselves down by saying “yes” and giving the number to whomever asks. Why? They feel they must, if for no other reason than that there is always someone who wants the job badly enough to say anything, and managers want a number badly enough to believe the lie and drink the Kool-Aid. Failure to play the game is seen as career limiting.

I believe we need to rethink our concept of initial estimates. To drive this change we will have to change both our vision of the world and that of other constituencies within our companies. We need to de-link estimation from other non-estimation behaviors (we are not shopping for an HDTV), we need to change how the estimation process works, and we need to measure projects, which means embracing concepts such as function points for size and as a proxy for functional knowledge. But most of all we will need to set a standard of behavior for ourselves and our fellow project managers and follow it.

Recognize these issues? Leave a comment or drop me an email at spamcastinfo@gmail.com

Making Tangible The Intangible

Thomas M. Cagley Jr

An audio version of this blog post can be found on SPaMCAST 37 (www.spamcast.net).

It is rare that the ideas espoused by an interviewee affect me to the point that I need to incorporate them into the essay that accompanies their interview; the gestation period is typically longer, at least by a week, but this stuff was just too powerful.  The essay for this cast reflects concepts espoused by Phil Armour in SPaMCAST 36 and Kenji Hiranabe in SPaMCAST 37 (the current cast for those of you reading this on my blog).  The confluence of concepts that so moved me begins with Kenji’s comments on the intangibility of both software and the process used to create software (the process being both intangible and opaque) unless, of course, you are in the IT business.  In SPaMCAST 36 Phil Armour put forth the thought that software is a container for knowledge.  Knowledge is only tangible when demonstrated or, in software terms, executed.  All of this discussion boils down to a product built to harness knowledge, using what is perceived to be an intangible process.  This process is only fully recognizable for the brief period that it executes.  On top of all of that, there is every expectation that the delivery of the product will be on-time and on-budget, have high quality, and be managed in an orderly fashion.  No wonder IT managers have blood pressure issues!

Intangibility creates the need for managers and customers to apply controls to understand what is happening in a project and why. The level of control required for managers and customers to feel comfortable costs a project time, effort and money that could be better spent actually delivering functionality (or, dare I say it, reducing the cost of the project). Therefore, finding tools and techniques that make software, and the process used to create it, more tangible while at the same time more transparent to scrutiny is generally a good goal. I use the term “generally” on purpose: the steps taken to increase tangibility and transparency (boy, doesn’t that sound like an oxymoron) need to be less invasive than those typically seen in command-and-control organizations. Otherwise, why would you risk the process change?

Agile projects have leveraged tools like wikis, standup meetings, big-picture measurements and customer involvement to increase visibility into the process, and functional code to make their knowledge visible. I will attest that when well-defined agile processes are coupled with the proper corporate culture, an environment is created that is highly effective at breaking down walls. But, as you and I both know, there had to be a “but”. The processes aren’t always well defined or applied with discipline, and not all organizational cultures can embrace agile methods. There needs to be another way to solve the tangibility and transparency problems without resorting to draconian command-and-control procedures that cost more than they are normally worth.

In his two SPaMCAST interviews, Mr. Hiranabe has laid out two processes that are applicable across the perceived divide between waterfall and agile projects. Last year in SPaMCAST 7, Kenji talked about mind mapping. Mind mapping is a tool used to visualize and organize data and ideas. It provides a method for capturing concepts, visualizing the relationships between them and, in doing so, making ideas and knowledge tangible. In the current SPaMCAST, Kenji proposes a way to integrate kanban into the software development process. According to Wikipedia, “Kanban is a signaling system to trigger action, which in the Toyota Production System leverages physical cards as the signal.” In other words, the signal is used to indicate when new tasks should start and, by inference, the status of current work. Kenji does a great job of explaining how kanban can be used in system development. The bottom line is that the signal, whether physical or electronic, provides a low-impact means of indicating how the development process is functioning and how functionality is flowing through it. This increases the visibility of the process and makes it more tangible to those viewing from outside the trenches of IT.
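The signaling idea is simple enough to sketch in a few lines of code. The following is a minimal illustration of one common kanban mechanism, a work-in-progress (WIP) limit: a column only “signals” for new work while it has an open slot. The column names and limits here are illustrative assumptions, not part of any formal kanban definition.

```python
# Minimal sketch of kanban-style signaling via WIP limits.
# An open slot in a column is the "signal" that new work may be pulled in.

class KanbanColumn:
    def __init__(self, name, wip_limit):
        self.name = name
        self.wip_limit = wip_limit
        self.items = []

    def has_capacity(self):
        # The signal: True means a new item may be pulled into this column.
        return len(self.items) < self.wip_limit

    def pull(self, item):
        # Work is pulled only when the column signals capacity.
        if not self.has_capacity():
            raise RuntimeError(self.name + " is at its WIP limit")
        self.items.append(item)

doing = KanbanColumn("Doing", wip_limit=2)
doing.pull("Story A")
doing.pull("Story B")
print(doing.has_capacity())  # False: no signal, so no new work starts
```

Whether the board is physical cards or an electronic tool, the point is the same: the state of the board, not a status report, tells everyone whether work should start.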

Code that, when executed, does what was expected is the ultimate evidence that we have successfully captured knowledge and harnessed it to provide the functionality our customer requested. The sprint demos in SCRUM are a means of providing a glimpse into that knowledge and of building credibility with customers. However, if your project is not leveraging SCRUM, daily or weekly builds with testing can provide some assurance that knowledge is being captured and assembled into a framework that functions the way you would expect. Note that demos and daily builds are not an either/or situation. Do both!
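A daily build with testing can be as modest as a scheduled script that runs each step in order and stops at the first failure. The sketch below shows that shape; the `make` targets in the comment are placeholder commands, not any particular project’s tooling.

```python
# Hedged sketch of a daily build-and-test gate: run each step in order,
# stop at the first failure, and report success only if every step passes.
import subprocess

def run_pipeline(steps):
    for cmd in steps:
        result = subprocess.run(cmd)
        if result.returncode != 0:
            print("Step failed:", " ".join(cmd))
            return False
    return True

# A nightly job might invoke something like (placeholder commands):
# run_pipeline([["make", "build"], ["make", "test"]])
```

The value is not in the script itself but in the habit: a build that is exercised every day makes the captured knowledge visible on a daily cadence instead of only at a demo.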

The lack of tangibility and transparency in the process of capturing knowledge and building the knowledgeware we call software has been a sore point between developers and managers since the first line of code was written. We are now finally getting to the point where we recognize that we have to address these issues, not just from a command-and-control perspective, but also from a social engineering perspective. Even if Agile as a movement were to disappear tomorrow, there is no retreat from integrating tools and techniques like mind mapping and kanban while embracing software engineering within the social construct of the organization, and perhaps the wider world outside it. Our goal is to make tangible that which is intangible and visible that which is opaque.