The top-ten tenets of software quality assurance, part three: the formal review
Formal reviews are often left by the wayside, but without them annoying defects can become crippling bugs, warns Mark Wilson
I've put the formal review at number three in this ten-part series, but it is the most basic and effective quality control there is.
It's crucial to sit down, preferably NOT in a joint or peer review, and write down all the defects you find when you read a document or piece of code. This is 'NOT testing' (although it's sometimes referred to as static testing), as you are preparing no data and there are no specific conditions.
It's just you and your brain checking that the review item is:
- Correct;
- Complete;
- Consistent;
- Unambiguous;
- Verifiable.
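Those five criteria are simple enough to capture as data, which makes it easy to write findings down as you go. Here's a minimal sketch of recording formal-review findings against them; all the names (`ReviewFinding`, `ReviewRecord`, `CRITERIA`) are my own illustrations, not any real tool's API:

```python
# A sketch of logging formal-review defects against the five criteria above.
from dataclasses import dataclass, field

CRITERIA = ("correct", "complete", "consistent", "unambiguous", "verifiable")

@dataclass
class ReviewFinding:
    location: str      # e.g. "section 3.2, para 2"
    criterion: str     # which of the five criteria is violated
    description: str   # what the reviewer actually found

    def __post_init__(self):
        if self.criterion not in CRITERIA:
            raise ValueError(f"unknown criterion: {self.criterion}")

@dataclass
class ReviewRecord:
    item: str                      # the document or code under review
    findings: list = field(default_factory=list)

    def log(self, location, criterion, description):
        self.findings.append(ReviewFinding(location, criterion, description))

# Usage: one reviewer, one document, defects written down as they are found.
record = ReviewRecord("Requirements spec v0.3")
record.log("section 2.1", "unambiguous", "'fast response' has no target figure")
record.log("section 4", "complete", "no error-handling requirements at all")
print(len(record.findings))  # 2
```

The point of the structure is only that every defect is pinned to a location and to one of the five criteria, which keeps the review honest.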
A product specification for the item being reviewed is a must. People used to refer to this kind of checking or verification as a 'Fagan inspection'. That's not a reference to the Michael Fagan who invaded the Queen's bedroom in Buckingham Palace in 1982, perhaps checking whether she used blankets or duvets (note: almost certainly blankets). It's named after another, less interesting Michael Fagan, a trail-blazer for proper software QA and the verification, validation and test (VVT) model.
Most annoying but true thing that QA people say
I like formal document reviews because they are the opening salvo in the battle against the defects that are present in commissioned, crafted software, and because the process is pretty much straightforward. I also like them because they validate what I think is one of the best, and most true, of all the words of wisdom in the small, tragic world of QA practitioners:
You've never got time to do it right, but you've always got time to do it again.
One of the most depressing things about being the QA person is constantly being proved right; that's partly the purpose of these articles.
Reviewing documents, with a proper time and cost budget, is the single best way of improving the quality of software
It happens again and again, no matter how often the results back it up: the more reviewing and static testing you do before ANYONE cuts ANY code, the less time you will spend fire-fighting and cleaning up a mess towards the end of the project, when everything gets squashed into that horrible period when the user is merrily finding loads of issues with the pock-marked horror you've foisted on them.
Of course, a lot of this is psychology: project managers just love showing they're ahead of the curve. They also love to tell their bosses that the lads and lasses at the coal-face are cutting coal (senior civil servants I knew were gratified if coders were there after 6pm; they couldn't have cared less about the quality). They do not like showing precious resources being planned for deployment on something so boring and lame-sounding as reviewing documents or specs.
But I am here to tell you that reviewing documents, with a proper time and cost budget, is the single best way of improving the quality of software.
Which items to formally review?
All of the key documentation, obviously.
First, there's the project initiation document (PID; you do have one, or whatever you call it in your world, don't you?). Then the terms of reference, the business case and the outline prioritised requirements list; the Test Strategy (again, whatever you call that vital item); and the Configuration Management Plan (how you will identify, version-control, build, integrate, continuous-integration test and deploy software components). Even if you are doing Agile, I have always thought the best results come from producing a proper features-based design specification or document. You will also review or walk through code.
One of the things I have harped on about for 30-odd years is that it's cheaper and easier to fix defects as early in the life-cycle of an item as possible. Many studies will tell you this.
The cost of making changes to software at each stage of the project
If you look at the tiny print on this diagram, from Scott Ambler's Agile Modeling: Effective Practices for eXtreme Programming and the Unified Process, you'll see it's copyrighted 2002.
I recall seeing it earlier than that, and it's still being cited in articles on test-driven development, extreme programming and other Agile methods. The reason it keeps being cited is because it's true. It's common sense. As carpenters always say: Measure twice, cut once. Act in haste, repent at leisure. The only bad review is one that doesn't happen (I made that last one up).
Fixing a document (for example, a Microsoft Word requirements spec or a user story held in a repository) is the work of a few hours. It's not as though you need systems analyst skills, either. I've reviewed lots of technical documents and, for a non-technical person, it's still possible to spot lots of defects simply by reading the item, applying the checklist (is it complete? Is it consistent? Is it unambiguous? Are its statements verifiable?) and applying the product specification.
I like formal document reviews because they are the opening salvo in the battle against the defects that are present in commissioned, crafted software
Fixing a design, after you've spent ages architecting it, because the user story or requirement was defective takes a lot longer, and also involves the time of more skilful people. Fixing a defect found in code testing may mean changing the design and even going back to the requirements. Once the defect has 'leaked' into system testing, or further, into user testing or, worst-case scenario, LIVE, the costs have increased dramatically.
Agile techniques, like pair programming and test-driven development, will drive out early coding errors, but a sit-down document review will find howlers even earlier.
I'll come back to this in a later article, but have a look at the following real-world graph of multiple projects I was involved in. With plenty of formal reviews/inspections, we managed to get the balance of defects to the left, with spectacularly good outcomes for the user (several customers were handed virtually defect-free releases).
Regular reviews can not only keep projects on track, but ensure that they are delivered with minimal bugs
How to review
When I was first involved in reviewing documents, it was an extremely ponderous and top-heavy process. There were review panels, review leaders, scribes, approvers and so on. There were planned cycles of reviews. There was even a two-week series of reviews of the, err, review procedure. It was the most un-Agile process ever. If the process we followed at the University of Leeds, 175 Woodhouse Lane, in 1989 was a car, it wouldn't even be a car. It would be the QE2. We were about as Agile as a steamroller. Admittedly, though, we were designing a validated compiler for a brand-new language that had only just acquired a MIL-STD and ISO standard.
But as we get older, we get wiser. Over the years I have written, used and audited many of these processes, and I've managed to slim it all down and make it slicker. I think the best method is to review using PRINCE2 product specifications, even if you are following Agile.
Product specifications
The best thing about these is that, like user stories, they must have acceptance criteria. They also resemble 'definitions of done'. And don't forget that even the product specifications must themselves be reviewed!
Agile techniques, like pair programming and test-driven development, will drive out early coding errors, but a sit-down document review will find howlers even earlier
You can download a template for a PRINCE2 product description from the internet. What's so good about them is that they do half of the job for you; they force you to state exactly what needs to be in an item (for example, theme, functional specification, test plan) and they force you to say how and by whom they have to be reviewed.
Ideally, the product spec should state:
Production criteria:
- Title;
- Number produced;
- Frequency of update;
- Who writes it/produces it (the product in question - not the description);
- Its purpose (what it's for);
- What it derives from (its predecessor or parent artefact(s));
- What other data or inputs are required;
- What is derived from it (its successor or child artefact(s));
- Any data that it outputs;
- What skills or skill-level are required to produce it;
- Which role can produce it;
- The medium/media it's produced in;
- What meta-data it should bear.
Quality criteria:
- What the acceptability criteria are (e.g. correct, complete, consistent, unambiguous, verifiable). This should in part be derived from asking the questions:
  - Will the customer for this product be easily able to produce the successor/child product(s) with THIS product in its current state?
  - What will make this a good example of this product?
  - What would make an unacceptable example of this product?
- Who reviews it;
- How frequently it must be reviewed;
- The skill-levels required to review it;
- What roles can review it;
- Review method;
- What the acceptable defect level is;
- Where/how the defects will be recorded.
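The production and quality criteria above amount to a structured template, and sketching them as one makes the "even the product specifications must themselves be reviewed" point concrete. The field names below are my own paraphrase of those headings, not an official PRINCE2 schema:

```python
# A sketch of a product description as a structured template.
from dataclasses import dataclass

@dataclass
class ProductDescription:
    # Production criteria
    title: str
    purpose: str
    derived_from: list          # predecessor/parent artefacts
    produces: list              # successor/child artefacts
    author_role: str
    medium: str
    # Quality criteria
    acceptance_criteria: list   # e.g. the five criteria above
    reviewer_roles: list
    review_method: str
    acceptable_defect_level: int  # defects tolerated before rejection

    def incomplete_fields(self):
        """The product description itself must be reviewed: flag empty fields."""
        return [name for name, value in vars(self).items()
                if value in ("", [], None)]

desc = ProductDescription(
    title="System Test Plan",
    purpose="Defines scope, approach and schedule of system testing",
    derived_from=["Test Strategy", "Requirements spec"],
    produces=["System Test Report"],
    author_role="Test Manager",
    medium="Word document under version control",
    acceptance_criteria=["correct", "complete", "consistent",
                         "unambiguous", "verifiable"],
    reviewer_roles=[],          # oops: nobody assigned to review it
    review_method="formal review with tracked changes",
    acceptable_defect_level=0,
)
print(desc.incomplete_fields())  # ['reviewer_roles']
```

Half the value is exactly what the article says: filling the template in forces you to state who reviews the thing and how, and an empty field is itself a defect.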
I would also recommend looking at TryQA.Com, which has an excellent section on Formal Review.
Which documents do you formally review? In my view, there are three main types:
- Specifications (both products and processes);
- Plans;
- Reports.
In other words, I believe you should review things that cause you to do things or contain data that you use to make key decisions.
Then you have to decide what tool to use. By far the easiest way to do a formal review is in Microsoft Word using the 'track changes' feature. So long as you practise version control (it's easier and better to hard-wire the version ID into the file-name), and store your reviews with the relevant version, you won't go far wrong.
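Hard-wiring the version ID into the file-name can itself be scripted. This is a minimal sketch; the naming convention (`name_vMAJOR.MINOR.ext`) is my own example, not a standard:

```python
# A sketch of extracting and bumping a version ID hard-wired into a file-name.
import re

VERSION_RE = re.compile(r"_v(\d+)\.(\d+)\.(\w+)$")

def version_of(filename):
    """Extract (major, minor) from a file-name like 'test_plan_v0.3.docx'."""
    m = VERSION_RE.search(filename)
    if m is None:
        raise ValueError(f"no version ID in file-name: {filename}")
    return int(m.group(1)), int(m.group(2))

def next_minor(filename):
    """File-name for the next review draft: same stem, minor version + 1."""
    m = VERSION_RE.search(filename)
    if m is None:
        raise ValueError(f"no version ID in file-name: {filename}")
    major, minor = int(m.group(1)), int(m.group(2))
    return f"{filename[:m.start()]}_v{major}.{minor + 1}.{m.group(3)}"

print(version_of("system_test_plan_v0.3.docx"))  # (0, 3)
print(next_minor("system_test_plan_v0.3.docx"))  # system_test_plan_v0.4.docx
```

The benefit over relying on a document-management system alone is that the version survives e-mail attachments and shared drives, so a review stored alongside `..._v0.3.docx` can never be mistaken for a review of v0.4.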
Cautionary tales
Review is not enough
I used to review all contracts, PIDs, plans, test strategies, configuration management plans, test plans and test reports as part of my job.
In an ideal world, no item should be presented for a formal review unless it has already been informally reviewed and the author has received feedback that it's worth engaging others to verify it. Once, a very important document was being prepared for delivery to the client: an extremely rich and powerful organisation that didn't even care how much anything cost. They had sacked two previous suppliers who'd attempted to build the same product. They were also very rude, hostile and threatening.
The document in question was a system Test Plan, and after six goes at getting it changed via an exhaustive review cycle, I ended up writing it myself. In this example, even a great review process couldn't help the fact that the wrong person was being kept in post as the Test Manager.
The plumber's tap
In a classic example of ‘the plumber's tap always leaks', on two occasions I bucked the formal review process to release items that were being demanded by powerful clients. In one case, a Go/No Go Gate Review Report, I had jokingly referred to one of the customer's people as ‘The Prince of Darkness', expecting this hilarious deliberate howler would be picked up via the "idiot check" the document librarian normally did. Except no-one checked it. Oops.
Another time, I was editing a customer's logo on a requirements spec for a CRM system. In a playful moment, I added a red-button nose to one of those mandatory folk-gathered-round-a-computer-screen photos. When I remembered I'd done it, I was rather hoping the review would pick it up. But there was no review, and I had some explaining to do.
More tales from the wild, untamed world of quality assurance next week, when I will be looking at methods…
Missed Part One, the Contract Review? Read it now
Missed Part Two, documentation? Read it here