Definition of done

In “User Stories Are For Technical, Operational and Risk Stories Too!” we made the case that user stories are not just for customer-facing user interfaces. But we only partially answered the question:

“I would like to see some sample user stories in infrastructure projects such as network migration or upgrade, storage provisioning, or server provisioning.  I think storage provisioning would be a little bit complex due to hardening and other security requirements.”

The nuance in the question centers on the phrase “hardening and other security requirements.” There are two approaches that I often use for stories with nonfunctional requirements. The first (and not my favorite) is a final hardening sprint. The second is splitting the user stories into thin slices and then incorporating the nonfunctional requirements into the definition of done.

While there is agreement that you should use DoD at scale, how to apply it is less clear.

The Definition of Done (DoD) is an important technique for increasing the operational effectiveness of team-level Agile. The DoD provides a team with a set of criteria that they can use to plan and bound their work. As Agile is scaled up to deliver larger, more integrated solutions, the question that is often asked is whether the concept of the DoD can still be applied. And if it is applied, does the application require another layer of done (more complexity)?

The answer to the first question is simple and straightforward. If the question is whether the Definition of Done technique can be used as Agile projects are scaled, then the answer is an unequivocal ‘yes’. In preparation for this essay I surveyed a few dozen practitioners and coaches on the topic to ensure that my use of the technique at scale wasn’t extraordinary. To a person, they all used the technique in some form. Mario Lucero, an Agile Coach in Chile, (interviewed on SPaMCAST 334) said it succinctly, “No, the use of Definition of Done doesn’t depend on how large is the project.”

While everyone agreed that the DoD makes sense in a scaled Agile environment, there is far less consensus on how to apply the technique. The divergence of opinion and practice centered on whether the teams working together continually integrated their code as part of their build management process. There are two different camps. The first camp typically finds itself in organizations that integrate functions as a final step in a sprint, perform integration as a separate function outside of development, or use a separate hardening sprint. This camp generally feels that applying the Definition of Done requires a separate DoD specifically for integration. This DoD would include requirements for integrating functions, testing integration, and architectural requirements that span teams. The second camp of respondents finds itself in environments where continuous integration is performed. In this scenario each respondent either added integration criteria to the team DoD or did nothing at all. The primary difference boiled down to whether the team members were responsible for making sure their code integrated with the overall system or whether someone else (real or perceived) was responsible.

In practice, applying the DoD includes a bit of the infamous “it depends” magic. During our discussion on the topic, Luc Bourgault from Wolters Kluwer stated, “in a perfect world the definition should be same, but I think we should be accept differences when it makes sense.” Pradeep Chennavajhula, Senior Global VP at QAI, made three points:

  1. The principles and characteristics of the Definition of Done do not change with the size of the project.
  2. However, the considerations and detail will certainly be impacted.
  3. This may, however, create a perception that the Definition of Done varies by the size of the project.

The Definition of Done is useful for all Agile work, whether a single team or a large scaled effort. However, how you have organized your Agile effort will shape how you apply it.

Simplicity and Complexity

The Definition of Done is an important Agile technique to help teams plan and execute work. The simplest definition of the Definition of Done is the set of criteria that a product must meet to be considered complete. While the concept is simple, implementing the technique in the real world rarely is. Both context and interpretation make things just a bit gray!

The first layer of complexity is that the Definition of Done must be understood in the context of an environment that must account for technical completeness, functional completeness, and the product owner’s acceptance. The product owner’s acceptance of the product can be considered a proxy for value. Acceptance criteria or satisfaction conditions, regardless of what they are called, are a reflection of whether the code solves the specific business problem defined by the user story or requirement. Acceptance criteria are used to confirm that the story does what was intended and can be used to create an acceptance test.

Technically, the Definition of Done is often a reflection of organizational technical and process standards. For example, organizations often have coding and unit testing standards, so the criteria in the Definition of Done might include requirements for code structure, documentation, and testing. The Definition of Done may also include functionality components; however, functionality is generally included in the definition through satisfaction or acceptance criteria. In some cases I have seen a statement in the Definition of Done indicating that the acceptance criteria must be met. Finally, the product owner’s evaluation of the value delivered by the solution after meeting the twin hurdles of done and acceptance sets up a final evaluation in which the product owner accepts or rejects the solution. I have heard the outcome of this process described as done-done or even done-done-done.

Implementing a concept of done in a robust manner includes three hurdles: the criteria in the Definition of Done, the acceptance criteria (part of a user story), and the product owner’s acceptance. Implementation also requires weighing the answers to a number of definitional questions that complicate what is otherwise a fairly simple process. The questions that must be answered include:

  1. Who decides what done means technically? In many organizations, the criteria that define done are left to the discretion of individual teams. That discretion typically operates within the limits defined by the organization’s technical and process standards. For example, a few years ago I was asked by a development team, in an organization with stringent coding and unit testing standards, whether they could remove the unit test requirements for their code. Unit testing was one of the criteria in their Definition of Done. Their rationale was that testers would find out if the code worked later in the process and their product owner did not like the coders’ testing. However, the team only had discretion within the boundaries defined by organizational standards. In this case the organization defined one of the requirements for what done meant from a technical perspective, and removing the unit test requirements wasn’t an option.
  2. Can it be acceptable for some of the criteria in the Definition of Done to be left incomplete? Every team I have ever been involved with has sooner or later had to face the question of whether it is ever okay to shortchange the Definition of Done. The answer is often yes. Getting to yes generally involves a discussion of the gray area, accepting technical debt, and then committing to when the debt will be retired.
  3. Can the organization’s needs outweigh the needs of a product owner? This is the alter ego of the previous question. Everyone on the team, including the product owner, needs to understand when and where organizational standards and/or needs will trump a product owner’s acceptance. Pressure to deliver can often incent a team and product owner to push code forward with the intention of refactoring in later sprints, which pushes the envelope of standards and architecture. Most mature organizations establish bright lines so that everyone knows where organizational needs and standards must be respected.
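The layered notion of done described above can be pictured as three gates a story must clear. The following is a minimal sketch only; the data structure and function names are hypothetical, not taken from any Agile tool or framework.

```python
# Illustrative sketch: the three hurdles of "done" modeled as plain data.
# All field names here are hypothetical examples, not a prescribed standard.

def is_done_done_done(story):
    """A story clears all three hurdles: the Definition of Done criteria,
    the acceptance criteria, and the product owner's acceptance of the value."""
    dod_met = all(story["definition_of_done"].values())
    acceptance_met = all(story["acceptance_criteria"].values())
    return dod_met and acceptance_met and story["po_accepted"]

story = {
    "definition_of_done": {"unit tested": True, "code reviewed": True},
    "acceptance_criteria": {"solves the stated business problem": True},
    "po_accepted": False,  # the product owner has not yet accepted the solution
}

print(is_done_done_done(story))  # False: the final hurdle remains
```

The point of the sketch is that each layer is evaluated independently; the story is only truly done when all three are satisfied.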

All definitions of the word done connote a degree of finality, of being complete. In software development, enhancement, and maintenance environments, the concept of done can have many layers. Each layer can have different technical, functional, and value-oriented nuances. Teams quickly learn that to be truly done, a piece of work must really be done, done and done.

Ready to develop ensures that the team is ready to work as soon as they sit down.

The definition of done is an important tool to help guide programs and teams. It is the set of requirements that the software must meet to be considered complete. For example, a simple definition of done is:

  • All stories are unit tested,
  • a code review is performed,
  • the code is integrated into the main build,
  • functionality has been integration tested, and
  • the release documentation is completed.

The definition of done is generally agreed upon by the entire core team at the beginning of a project or program and stays roughly the same over the life of the project. It provides all team members with an outline of the macro requirements that all stories must meet. Therefore the definition helps in estimating by suggesting many of the tasks that will be required. Another perspective on the definition of done is that it represents the requirements generated by an organization’s policies, processes, and methods. For example, the organization may have a policy that requires the code to be scanned for security holes. Bookending the concept of done is the concept of ready to develop (or just ready). Ready to develop is just as powerful a tool as the definition of done. Ready acts as a filter to determine whether a user story is ready to be passed to the team and taken into a sprint. Ready keeps teams from spinning their wheels working on stories that are unknown, malformed, or not understood.
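The paired gates of ready and done can both be treated as simple checklists: work passes a gate only when nothing on the list is outstanding. The sketch below is illustrative only; the specific criteria sets are examples echoing the text, not a prescribed standard.

```python
# Hypothetical sketch: "ready" and "done" as checklist gates.
READY = {"well-formed story", "acceptance criteria written", "estimated"}
DONE = {"unit tested", "code reviewed", "integrated into main build",
        "integration tested", "release documentation completed"}

def outstanding(gate, completed):
    """Return the criteria still outstanding; an empty set means the gate passes."""
    return gate - completed

story_state = {"well-formed story", "estimated"}
print(outstanding(READY, story_state))  # {'acceptance criteria written'}
```

Modeling both gates the same way makes the bookending explicit: ready filters what enters a sprint, done filters what leaves it.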

You know the night is done when they lock the door.

What is the difference between the definition of done and acceptance criteria? If a team has let this question fester for any length of time, they generally will decide that the two concepts are synonymous. Unfortunately, they are wrong.

The definition of done is the requirements that the software must meet to be considered complete. An example of the definition of done is:

All stories must be unit tested, a code review performed, integrated into the main build, integration tested, and release documentation completed.

Almost every team has a different definition of done, as technology, business or government requirements, or organizational culture can have an impact on how a specific team implements the definition. For example, a team building software or hardware for use in a medical device will have different regulatory requirements they must adhere to. The definition of done is generally agreed upon by the entire core team at the beginning of a project or program and stays roughly the same over the life of the project. It provides all team members with an outline of the macro requirements that all stories must meet. Therefore the definition helps in estimating by suggesting many of the tasks that will be required. I have heard the definition of done described as the requirements generated by an organization’s policies, processes, and methods. For example, the organization may have a policy that requires code to be scanned for security holes. This requirement would need to be in the definition of done.

Acceptance criteria, on the other hand, provide confirmation that the story does what was intended and can be used to create an acceptance test. An example of acceptance criteria for a simple data entry screen (a more robust version of this example was shown previously) for a logo glass collection application would include:

  • Brewery name is a required field.
  • Glass logo copy is a required field.
  • Glass type is a required field.

The software must meet these criteria in order to meet the Product Owner and stakeholder needs. During a hands-on demonstration the Product Owner and stakeholders would be able to execute these functions. Acceptance criteria are a part of the description of the stakeholder’s requirements for the software.
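Acceptance criteria like the three required fields above are concrete enough to automate as an acceptance test. The sketch below is illustrative only; the field names are hypothetical renderings of the criteria in the text, not an actual application schema.

```python
# Illustrative only: the three required-field acceptance criteria,
# expressed as a check an automated acceptance test could run.
REQUIRED_FIELDS = ("brewery_name", "glass_logo_copy", "glass_type")

def missing_required_fields(entry):
    """Return the required fields that are absent or blank."""
    return [field for field in REQUIRED_FIELDS if not entry.get(field, "").strip()]

entry = {"brewery_name": "Example Brewing", "glass_logo_copy": "", "glass_type": "pint"}
print(missing_required_fields(entry))  # ['glass_logo_copy']
```

An empty result means the entry satisfies the acceptance criteria; anything else names exactly which criterion the demonstration would fail.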

In order to be part of a demonstration where the story can be accepted, all stories must satisfy both the definition of done and the acceptance criteria. The definition of done provides the team with a clear understanding of their obligations to meet overall organizational and process requirements. Acceptance criteria define how the Product Owner and stakeholders will know that the story meets their requirements for a specific function. Both are required for the team to understand when a story is done.

Having common goals and definitions helps to lift the fog.

The first two articles on sprint planning have sparked a number of interesting side conversations about logistics and precursor concepts.  There are several constraints, concepts and ideas that the team needs to keep in mind to facilitate effective sprint planning. For good planning, all teams must understand the goal of the sprint they are planning, the definition of done and the time box for completing planning. 

The sprint goal (or goals) is the high-level business and technical target for the sprint. The goal acts as an anchor that roots the units of work that the product owner will ask the team to complete in the upcoming sprint. The sprint goal is a touch point that each team member can use to make the moment-to-moment decisions every development team makes. The goal can also be used as a communication vehicle to help stakeholders and others outside the team understand what will be delivered during the sprint.

The second concept all teams must understand prior to beginning sprint planning is the definition of done. The definition of done represents both the team’s and the organization’s expectations of the state of the functional software when the sprint is complete. For example, my standard definition of done is:

  • Design documents developed (if needed)
  • Code written
  • Required code comments written
  • Unit tests performed
  • Integration test executed
  • Release notes developed
  • Acceptance criteria met

The unit of work is not done until all of these criteria are complete.  Every organization will have its own list of macro tasks that are required to show demonstrable value.  For instance, my base list includes design documents. Several of my clients are contractually required to produce design documents.  Without the design (or with an incorrect design document), they will not be able to deliver the software and get paid.

Third, the whole team needs to understand the planning time box. As a general rule, the preplanning meeting should last no more than an hour for a two-week iteration (30 minutes for a one-week iteration). The main planning session should be planned to last no more than two hours for each week of sprint duration. The time box helps teams stay focused (it also keeps me from telling stories) and keeps them from being overly precise (which usually represents false precision), which can cause the team to spin into planning paralysis. When the clock strikes, the planning time box is over and the team needs to start executing. The plan will be revisited during the daily meeting.
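The rules of thumb above scale linearly with sprint length, which can be made explicit. The helper below is a hypothetical convenience, not a standard formula; it simply encodes the 30-minutes-per-week and two-hours-per-week guidelines stated in the text.

```python
# Hypothetical helper encoding the planning rules of thumb above.
def planning_time_boxes(sprint_weeks):
    """Return (preplanning, main planning) time boxes in minutes."""
    preplanning = 30 * sprint_weeks     # 30 min per week: an hour for a two-week sprint
    main_planning = 120 * sprint_weeks  # no more than two hours per week of sprint
    return preplanning, main_planning

print(planning_time_boxes(2))  # (60, 240)
```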

Teams need to begin planning with a firm grasp of the sprint goal to provide direction and focus. Explicitly understanding the expectations for what done means will help the team plan activities and tasks. The definition of done also helps the team keep each other on track because they all have common expectations. Finally, planning, like most team sports, is constrained by a clock. The time box keeps the team from spending the entire sprint planning rather than delivering.

Demonstrations come in many sizes and formats. Some are concise affairs while others are long and labored. Some are interactive and conversational while others are grand presentations with information being shared in one direction. The more complex and unidirectional the conversation, the less interesting the demonstration is for attendees. Interesting and focused demonstrations can be accomplished using a few basic techniques. A good demonstration requires context, preparation, and execution.

Sprints begin with work being accepted by the team into the sprint. The work that the team accepts into the sprint provides the context of the demonstration. The goal of taking work into a sprint is to complete all of that work based on the overall definition of done and as proved by completing all acceptance criteria. Work that does not meet the definition of done should not be demonstrated. When work is not completed, the team needs to communicate that they were not able to complete it, return the unit of work to the backlog, and then move on to the units of work that were completed.

Preparing for a demonstration includes beginning with the end in mind. As units of work are accepted into the sprint, the team should ensure that the acceptance criteria are written so that they can demonstrate that the work met the definition of done. A second step in the preparation process is to have the team walk through the demo prior to doing it live. I believe that the product owner should facilitate the demonstration, explaining how the team has solved the business problem and soliciting interaction and feedback. Therefore a bit of practice is always a good idea to ensure the product owner is not caught flatfooted. Demonstrations in which the team fumbles around, or solicits interaction and feedback when the software isn’t accessible, are at best comedic and at worst gut-wrenching.

When it is time for the demo, execute according to the plan. I suggest beginning with a bit of preamble: explain what a demo is for any new attendees, introduce the team, and then let everyone know which units of work will be demoed and which won’t. Then demo the units of work one by one. Explain the unit of work, walk through an execution of the acceptance criteria, solicit interaction with the software (if appropriate), and finally ask for feedback. As noted before, I have found that having the product owner facilitate the demo, supported by the technical team, is most effective for getting stakeholder involvement and feedback. Keep the process concise without being clipped or abrupt. At the end of demonstrating each unit of work it should be easy to determine if the stakeholders believe the unit of work is complete, if there are issues that will require the unit of work to be returned to the backlog, or if new backlog items have been identified.

Demonstrations can be done using a simple, repeatable process that is straightforward and to the point. By making sure the process is interactive and the material concise, all of the participants will find the demo engaging and focused.

A Demo

Demonstrations reflect the output of the team, i.e. the value they created during a sprint. Because of the pressure to “show” value, there is a tendency for immature Agile teams to add all sorts of interim deliverables to the demo. Assuming that you are involved in a software project, an Agile demonstration would be based on working software. Code listings, PowerPoint slides, and status reports should be avoided.

The demonstration, led by the Product Owner, shows how the unit of work was completed. In the best-case scenario, the stakeholders get to exercise the functionality. By seeing and exercising the software, stakeholders are in a great position to provide feedback about what they need or don’t need and about new ideas that might have been triggered. Only the units of work (also known as stories) that are done should be included in the demonstration. The word “done” has a special meaning in Agile. Done is defined by the tasks required to make the software shippable/implementable. The definition of done usually includes design, coding, testing (various flavors), and documenting. These tasks are completed during the sprint.

Items such as code (the printed representation), status reports, graphs, or PowerPoint slides do not represent progress. PowerPoint slides are very rarely installed in production as operational software. Anything that does not fall into the category of implementable software is not demonstrable. I have seen perfectly fine demos that begin with a slide showing a list of stories that were in the sprint, a few minutes of preamble describing how a demo works, and a flip chart showing the burn-down chart for the sprint and the program. While these are really barnacles (add-ons) attached to the demonstration, they didn’t distract from the real measure of progress.

Demonstrations are a tool to solicit feedback by presenting working software (if this is a software project). Demonstrations are not status meetings or platforms for presentations, reading code, or showing PowerPoint slides or graphs as proof that work has been accomplished. The point is to show what has been completed and could be implemented. That gives the stakeholders the best understanding of direction and allows them to supply targeted feedback based on knowledge.