In week ten of the re-read of L. David Marquet’s Turn the Ship Around! we add two more mechanisms for control and complete part two of the book. This week the two chapters are A New Ship and We Have A Problem. (more…)

Listen Now

Subscribe on iTunes

This week’s Software Process and Measurement Cast features three columns.  The first is our essay on Reviews and Inspections.  Reviews and inspections are a critical tool for improving quality and team effectiveness. Whether you are using Agile or classic techniques, improving your reviews and inspections will directly increase the value you deliver.

We also have a new entry in the QA Corner with Jeremy Berriault.  Jeremy and I discussed the value of independent QA.

Anchoring the cast is a new installment from the Software Sensei, Kim Pries. Kim contrasts planned activities and improvisation in software development.  Kim builds on the differences between the two approaches to help teams understand when to use each.

Call to Action!

For the remainder of September let’s try something a little different.  Forget about iTunes reviews and tell a friend or a coworker about the Software Process and Measurement Cast. Word of mouth will help grow the audience for the podcast.  After all, the SPaMCAST provides you with value, so why keep it to yourself?!

Re-Read Saturday News

Remember that the Re-Read Saturday of The Mythical Man-Month is in full swing.  This week we tackle the essay titled “The Documentary Hypothesis”! Check out the new installment at Software Process and Measurement Blog.

Upcoming Events

Software Quality and Test Management
September 13 – 18, 2015
San Diego, California

I will be speaking on the impact of cognitive biases on teams.  Let me know if you are attending! If you are still deciding whether to attend, let me know; I have a discount code.

Agile Development Conference East
November 8-13, 2015
Orlando, Florida

I will be speaking on November 12th on the topic of Agile Risk. Let me know if you are going and we will have a SPaMCAST Meetup.

More conferences next week including Agile DC and Agile Philly!


The next Software Process and Measurement Cast will feature an interview with Steve Boronski. Steve and I had a great conversation about Agile and Prince2. The conversation focused on the new Prince2 Agile Project Management Best Practice extension to Prince2 and why the world needs another interpretation of Agile project management.

Shameless Ad for my book!

Mastering Software Project Management: Best Practices, Tools and Techniques was co-authored by Murali Chemuturi and myself and published by J. Ross Publishing. We have received unsolicited reviews like the following: “This book will prove that software projects should not be a tedious process, neither for you or your team.” Support SPaMCAST by buying the book here. Available in English and Chinese.

Reviews and inspections focus on quality.

Why are reviews and inspections important?  Because they affect the quality of a project’s deliverable. The quality of a product is generally positively correlated with value: as the quality of a project’s deliverable increases, so does the value placed on that product. Quality influences at least five major components of value:

  1. Usability: Higher quality products are by definition more usable than the same product with lower quality. For example, I am an avid Evernote user; however, I recently discovered a latent bug that broke the app for over a month. The lack of usability negatively impacted my perception of quality (even though their customer service resolved the situation).
  2. Reduced Maintenance: A product with a smaller number of latent defects will require fewer fixes (less maintenance) than the same product with more defects. Every dollar spent on fixing defects in production is a dollar that can’t be spent on new features. Improving quality increases the budget that can be spent on development and enhancements.
  3. More Output: Increasing development quality helps an IT organization deliver more value. As noted, reduced maintenance provides more time for development.  Secondly, it is a commonly held belief that a defect discovered earlier in the development cycle requires less effort to fix than one found later in the development cycle. The effort to fix a defect is typically called rework. The lower the amount of rework a project has to deliver, the more time that will be available to deliver value.
  4. Cost: Improving quality lowers rework and reduces maintenance, which translates directly to a lower cost per unit of work delivered. A simple metric is cost per function point. Lower costs generally make everyone happy, specifically those who have to manage the IT budget.
  5. Customer satisfaction: Quality and customer satisfaction are highly interrelated. Customer satisfaction is influenced by many different attributes and is a broader concept than product quality alone; however, without product quality few of the other project attributes (for example, team professionalism and empathy for your customers’ needs) matter. Increased quality is generally linked to increased customer satisfaction. When a team’s customers are satisfied, the team’s job satisfaction increases, which leads to improved performance and even higher quality.  Improving quality and customer satisfaction together is a virtuous circle.
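The cost component above lends itself to a simple calculation. A minimal sketch in Python, using hypothetical figures (real function point counts would come from your own sizing process):

```python
def cost_per_function_point(total_cost, function_points):
    """A simple unit-cost metric: dollars per function point delivered."""
    return total_cost / function_points

# Hypothetical project: $250,000 total cost for 500 function points.
baseline = cost_per_function_point(250_000, 500)
print(baseline)  # 500.0

# If improved reviews eliminate $25,000 of rework, the same scope
# costs less per unit, freeing budget for new features.
improved = cost_per_function_point(225_000, 500)
print(improved)  # 450.0
```

Tracking this number release over release is one way to see whether quality improvements are actually lowering the cost of delivered work.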

Reviews and inspections are a method for improving the quality of the products a team delivers. In a perfect world we would have a perfect engineering process that would allow a project team to gather requirements, design, build, and deliver a perfect solution. The perfect engineering process would not require the burden of management approvals, reviews, inspections, or even testing. Real world engineering processes will never be perfect. Therefore, every project team must work hard to balance the level of quality an organization needs (or can afford) with the cost of testing.  Reviews and inspections tend to be less costly than testing because they can be applied earlier in the development cycle. As a result, they help to tip the balance toward quality.

Reviews and inspections are powerful tools that can deliver a great deal of value, if they are done correctly. Like all powerful tools, if you use them incorrectly the results will not be as expected. There are several typical anti-patterns that can affect reviews and inspections. The top six are:

  1. Finding Fault: Reviews and inspections remove defects and provide the basis for teachable moments.  Finding fault or assigning blame for the defects found during a review will generate an adversarial environment that incents the person whose deliverable is being reviewed to hide defects or to strenuously debate whether anything is actually a defect.  Neither outcome promotes the goals of the review process. Note: not finding fault does not mean that leaders and managers should not correct problems or get people help when they need it.
  2. Lack of Preparation: Most reviews and inspections require preparation. Preparing ensures that the time spent doing a review or inspection is focused on removing defects rather than reading the deliverable in a group setting. When teams or organizations begin to forego preparing for reviews, they instead spend inordinate amounts of time in review meetings doing the preparation together and then doing the review. This approach is highly inefficient, and because we all tend to multitask, our attention will wander during long meetings, which reduces effectiveness. I have also observed that as review duration increases, participation decreases.
  3. Combining Reviews and Approval Events: Combining reviews and inspections with approval events generally shifts the focus of the process from defect removal to getting approval to progress (sometimes called sign-off). Making a good review or inspection into a hurdle for approval incents the author and his or her allies to hide, downplay or generally spin any potential defects so that the project can keep moving. Going down this path will ensure fewer defects are found, not that fewer defects exist.
  4. Manager Involvement: It is a rare team in which the person who controls (or strongly influences) team members’ salaries or promotion opportunities is truly a peer of those on the team doing the work. Involving managers in reviews and inspections will tend to increase the level of defect hiding, posturing and just plain brown-nosing. None of these behaviors is conducive to efficiently and effectively finding and removing defects. Reviews and inspections work best in an environment where sharing and accepting feedback is not dangerous to one’s career.
  5. Nothing To Compare With: Reviewing any deliverable against nothing, or against just the reviewers’ experience, is generally ineffective and tends to spin down into discussions or debates based on opinions. Reviews are most effective when the deliverable being reviewed is compared to something everyone can reference.  Team, organizational and/or industry standards, architectural frameworks or even previous deliverables are all useful to compare against in a review or inspection.
  6. Not Enough Time: Being required to do a review or inspection without enough time to do it right defeats the reason for doing reviews. I can’t count the number of times I have heard teams rushing through a review say, “well, this is better than nothing.” Well, perhaps it really isn’t better than nothing.  At the very best, a rushed review will leave defects in deliverables that will need to be removed later, or that your customers will get to find for you.  Even worse, shortcutting the time needed to do reviews correctly sends a message that we only pretend to believe in the processes we use and that IT really doesn’t know what it is doing. The consequences in the long run are usually a change in management or outsourcing.

If you recognize any of these anti-patterns, fix the problem.  If you need help making the point get a coach or consultant to help you. Organizationally, if you can’t correct these anti-patterns, stop using reviews and inspections and do something else, such as hiring lots of extra independent testers.  Doing reviews and inspections poorly is not doing you any favors.

Some good team guidelines

Even the most informal review needs a few guidelines to keep it effective and civil. All types of reviews and inspections feature two core elements: someone looks at someone else’s work and then tells them if they see any problems. When the process is rushed or overloaded because reviewers are asked to review too much in too short a time, effectiveness suffers. If reviews lack civility they will either be avoided or someone will start selling tickets as if they were professional wrestling matches. Guidelines make sense.


  1. Review in small chunks, if at all possible.  Small pieces of work are easier to schedule and evaluate. It is the same rationale for having smaller user stories and time-boxed work in Agile.
  2. Allow adequate time to review the material. Rushing a review leads to less than stellar results.  If you rush the pre-work for an inspection, reviewers will either opt out (they can’t get the review completed) or do a cursory job, hoping someone else will find the problems that exist.  Every organization needs to create guidelines for how much can be reviewed in a given timeframe. One organization I have interacted with requires one week of lead time per 20 pages of documentation; another limits code walkthroughs to 200 to 300 lines of code per hour. Collect metrics and create your own guidelines.
  3. A set of meeting guidelines is usually a good idea.  Review meetings should never get personal and never turn into debates. Depending on the type of review, I have reviewers present issues in a round robin format or implement a two-comment limit per person on each topic (a variant of an idea from Robert’s Rules of Order).  Meeting guidelines should foster conversation, limit debate and ensure participants focus on the product not the person. If meetings devolve into finger pointing or debates, get a facilitator.
  4. In reviews and inspections that require reviewers to do pre-work before a meeting, if the pre-work was not done, exclude the unprepared person or reschedule.  Generally I have found that this is a rare occurrence after the first exclusion or cancellation.
  5. Everyone who participates in the process MUST be trained in how your organization does reviews. Every organization does reviews and inspections in a slightly different manner; therefore, specific local training always makes sense.
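The pacing guidelines in point 2 can be turned into a rough scheduling helper. A minimal sketch, assuming the example rates quoted above (one week of lead time per 20 pages; 200 to 300 lines of code per hour); calibrate both rates with your own metrics:

```python
import math

def walkthrough_hours(lines_of_code, loc_per_hour=250):
    """Estimate code walkthrough time at roughly 200-300 LOC per hour."""
    return lines_of_code / loc_per_hour

def document_lead_time_weeks(pages, pages_per_week=20):
    """Estimate review lead time at roughly one week per 20 pages."""
    return math.ceil(pages / pages_per_week)

print(walkthrough_hours(1000))       # 4.0 (hours)
print(document_lead_time_weeks(45))  # 3 (weeks)
```

Even a crude estimator like this makes it obvious when a review is being asked to cover far more material than the available time allows.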

Guidelines are important to help keep reviews and inspections effective and civil. I would like to provide a cautionary note or perhaps a 6th guideline: keep the guidelines as simple as possible. A long list of complex guidelines will cause frustration and will probably be ignored. Have teams do retrospectives focused on reviews and inspections and look for ways to make them more effective. Improving the deliverables and functional code that is delivered leads to higher customer satisfaction and typically to improved capacity to deliver value.

Inspections are the gold standard.


Inspections are peer reviews by trained individuals using a formal, structured method. Inspections represent the gold standard for reviews.  According to Capers Jones, inspections are highly effective, removing over 97% of requirements defects. Data that I have collected supports the view that inspections are the most effective means of early defect removal. The issue is that very few people do inspections, for two basic reasons: inspections are expensive (they require time and effort), and they are uncomfortable for those being reviewed. Inspections require a specific set of roles and a specific process.
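Effectiveness figures like the one above are typically expressed as defect removal efficiency: the share of defects an activity removes out of all defects eventually found in the deliverable. A minimal sketch with hypothetical counts:

```python
def defect_removal_efficiency(found_by_inspection, found_later):
    """DRE: defects removed by an activity as a share of all defects
    eventually found in the deliverable."""
    total = found_by_inspection + found_later
    return found_by_inspection / total

# Hypothetical: an inspection finds 97 defects; 3 more surface downstream.
print(defect_removal_efficiency(97, 3))  # 0.97
```

The catch is that "found later" only becomes known over time, so DRE is a trailing metric; it is best used to compare inspection effectiveness across releases rather than to judge a single inspection.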

The roles:

Moderator – The moderator leads the team through the process ensuring everyone participates and follows the rules.  The moderator will also ensure that metrics are collected and that the defects that are identified are correct and recorded.

Author – The author is the person that created the deliverable being reviewed.  He or she will participate in the inspection, providing feedback and developing an understanding of the defects being identified.

Reviewers – Reviewers read the deliverable and identify defects BEFORE the inspection meeting.  The pre-work is critical and needs to be turned in to the scribe before the inspection meeting.

Scribe – The scribe ensures that all participants have provided their feedback before the inspection team meets.  The scribe also combines all of the feedback so that only unique items are discussed in the inspection meeting.  During the inspection meeting, the scribe records feedback on the defects identified during the pre-work and records any new defects identified during the meeting.  After the meeting, the scribe tracks the defects to completion. The data collected by the scribe feeds the measures and metrics.
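The scribe’s consolidation step, merging each reviewer’s log so that only unique items reach the meeting, can be sketched as follows; the data shape and field values here are illustrative, not a standard format:

```python
def consolidate(defect_logs):
    """Merge per-reviewer defect logs, keeping one entry per unique
    (location, description) pair and noting who reported it."""
    unique = {}
    for reviewer, defects in defect_logs.items():
        for location, description in defects:
            unique.setdefault((location, description), []).append(reviewer)
    return unique

logs = {
    "alice": [("section 2", "term undefined"), ("page 5", "wrong formula")],
    "bob":   [("page 5", "wrong formula"), ("section 4", "missing case")],
}
merged = consolidate(logs)
print(len(merged))  # 3 unique defects to discuss, not 4 raw reports
```

Keeping the reporter list attached to each unique item also gives the scribe a head start on the measures and metrics mentioned above, such as how many reviewers independently found the same defect.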

The Process:

  1. Plan the inspection process:  The plan needs to ensure that the process is followed and that the needed lead times are built in.
  2. Kick-off meeting: The moderator explains the goal and reviews the plan for the inspection. The moderator will also ensure that all reviewers are aware of any organizational standards that need to be applied during the inspection. I suggest that during the kick-off meeting each reviewer let the moderator know if they will have a problem meeting the due dates or if they feel unqualified to perform the review.
  3. Distribute the deliverable that will be inspected.  Note that in today’s collaborative environments this may mean sending a link to the document that will be inspected.
  4. Individual preparation:  Each reviewer reads the material being reviewed and records all defects identified, using the format identified in the kick-off meeting.  Reviewers should also track and record the time they spend on the review, then send the defects identified to the scribe before the inspection meeting.  For large deliverables the scribe may need up to a week to consolidate all of the feedback.  Collaboration tools can save time and effort when consolidating the feedback.
  5. Inspection Meeting: The inspection meeting is the core event.  Generally each reviewer goes over the unique defects they identified. As a moderator, use a round-robin process, with each reviewer walking the author through an item.  The author can ask questions or point out why something identified is not a defect, but under no circumstance can anything identified be debated.  Moderators need to ensure that the feedback stays technical, not personal; being told you made a mistake is never fun.  Reviewers who did not complete the pre-work are not allowed to participate. The scribe makes sure comments are recorded and that confirmed defects are recorded for follow-up.
  6. Rework and Follow Up:  After the inspection meeting the author needs to resolve the defects found in the inspection.  Some organizations have rules about which defects must be corrected and which can be deferred. Major problems may require a further inspection; the moderator will usually make this determination based on organizational standards and the criticality of the project. The time required for rework is useful data for process improvement and for determining whether inspections are effective.
  7. Retrospective:  Always do a retrospective on the process to improve how you are doing inspections. The moderator should ensure that the retrospective occurs (it should be part of the plan).
  8. Process Improvement: The data collected during the review process can be used to improve the process in the IT department, to identify training needs and to decide what types of work should be inspected. The moderator or process improvement personnel will do the analysis needed for process improvement.

Inspections are very effective if the team doing the inspection understands the process and the deliverable being reviewed.  For example, if you are doing a code review on a program written in Ruby, don’t gather a group of reviewers who have never coded in Ruby; they are not qualified. Also, everyone involved in the process needs to trust each other. All reviews in general, and inspections specifically (because they are very formal), can be very stressful for the author. That stress can cause authors to avoid inspections or shortcut the process, which means that defects will be found later in the development process or in production.  Inspections remove defects in code or any other deliverable before they get into production, and before a lot of time is spent building something that will need to be changed before it goes into production.

Reviews and Inspections: With Formality Come Rules

Reviews and inspections are an integral part of building almost everything. You find reviews and inspections in manufacturing, in construction and even in publishing. Software development and maintenance are no different. Reviews and inspections can be powerful tools to remove defects before they can impact production and to share knowledge with the team and stakeholders. They are part of a class of verification and validation techniques called static techniques. These techniques are considered static because the system or application being built is not executed. Instead of executing the code, the code or other written deliverables are examined either by people (generally called reviews and inspections) or mechanically by a tool (known as static analysis).  Reviews and inspections can be applied to any product generated as part of the development process, while static analysis can only be applied to code-based products.

Reviews and inspections come in various levels of formality to meet different needs. Informal reviews typically do not follow a detailed written process, and results are generally not documented for later review and analysis. They are often used to ensure knowledge sharing and training at a one-on-one level. One classic method used for informal reviews is called the desk check.  In a desk check, a team member emails a deliverable or code to another team member, who reviews it and gives them feedback. Another form of informal review is pair programming.

Walkthroughs are a step up the formality ladder.  Walkthroughs are group sessions in which the author takes the group through the deliverable.  The attendees of walkthroughs generally include team members and technical specialists. Walkthroughs can be very informal (an impromptu gathering) or the degree of formality can be increased by requiring meeting preparation and collection of issues and defects. Walkthroughs are used to discover defects, make decisions and to distribute information.

Technical reviews leverage a defined process for defect detection, and include participation by peers, technical experts and often management personnel. They are more formal than the typical walkthrough and much more formal than desk checks. A trained moderator, who is not the author, generally leads a technical review (to enforce independence), comparing the deliverable to organizational standards. In addition to defect discovery, decision making and information distribution, technical reviews are often used as a formal approval mechanism. For example, I recently observed an organization where all projects go through an architectural review. Technical reviews are usually based on defined organizational standards; the architecture review in the example was based on the organization’s published standard architecture.

Inspections are the most formal of the review and inspection techniques. The most well-known inspection technique is based on the process defined by Michael Fagan (the Fagan inspection).  The inspection process includes highly defined roles such as moderator, author, scribe and reviewer. Inspection processes typically include required pre-work, logging of defects, collection and publication of metrics, formal follow-up procedures and, in many cases, the use of statistical process control techniques. The goal of inspections is to find and remove defects.

Reviews and inspections are highly effective and powerful tools for finding and removing defects from software and other deliverables. Reviews and inspections are used in all software development and maintenance methods. The type of review and degree of formality is usually a function of the type of project. For example, inspections are almost always used on mission-critical applications, such as medical devices and weapons systems, regardless of whether they are using Agile or plan-based techniques. Reviews and inspections remove defects and share knowledge so teams can maximize the value they deliver.