In every software project where I’ve used some form of review process (formal inspections, walkthroughs, or reviews — for this post I refer to all as “reviews”), the gain has always justified the pain. Invariably, some developers really dislike this process, leading to tantrums reminiscent of Orange County Choppers.
But I’ve regretted every time I’ve dropped reviews for expediency (schedule pressure) or to mollify individual objections. Quality, transparency, and communication worsened. Expectations for code quality eroded and were replaced by inflated self-confidence. More bugs were created and escaped.
This post isn’t about the social psychology of development teams, but about the more mundane question of what kind of software support is available for reviews.
Although reviews can be done without any kind of software support, tools and development platforms can greatly increase productivity and effectiveness.
So here, in order from minimal to maximal support for reviews, is a list of platforms and tools that can facilitate group reviews of software artifacts.
I recently was process architect for a global project team of 300 where reviews were used to scrutinize thousands of unique deliverables. We conducted an average of three detailed technical reviews a day over three years, covering large requirement documents, test design/plans, test code, and test run results. All artifacts went into a configuration management server, including documents that tracked review issues.
Every artifact was reviewed (about two dozen distinct types were produced for each work package), taking an average of 16 hours per set of artifacts, with two to four distinct reviews for each work package. Issues were entered as document comments. A simple macro extracted an issue report. After an online meeting, the reviewer tracked every review issue to resolution and controlled advance to the next phase. Re-reviews were conducted when necessary. Click here to view a deck that tells the story – slide 15 shows the review process.
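The original team used a Word macro for this extraction step. As an illustration of the same idea (not the team’s actual macro), here is a minimal Python sketch that pulls reviewer comments out of a .docx file by reading the `word/comments.xml` part inside the zip container:

```python
import zipfile
import xml.etree.ElementTree as ET

# WordprocessingML namespace used throughout a .docx comments part
W = "{http://schemas.openxmlformats.org/wordprocessingml/2006/main}"

def parse_comments(xml_text):
    """Return (author, text) pairs from a WordprocessingML comments part."""
    root = ET.fromstring(xml_text)
    issues = []
    for comment in root.iter(W + "comment"):
        author = comment.get(W + "author", "unknown")
        # A comment's visible text is the concatenation of its w:t runs
        text = "".join(t.text or "" for t in comment.iter(W + "t"))
        issues.append((author, text))
    return issues

def extract_issue_report(docx_path):
    """A .docx is a zip; the comments live in word/comments.xml."""
    with zipfile.ZipFile(docx_path) as docx:
        return parse_comments(docx.read("word/comments.xml"))
```

The output of `extract_issue_report` can then be dumped into a spreadsheet or issue tracker to seed the review meeting agenda.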
This work followed well-defined procedures and relied on using basic features of Word, Excel, and Visual Studio. Despite minimal automated support, it was highly effective in finding and resolving issues. The housekeeping effort was dog work, but was routine and predictable.
The process and collaboration features of SharePoint are layered on top of the Office apps. With this, teams can create, update, and track review issues and use Word or Excel as noted above. Out-of-the-box collaboration support is limited, but it can be extended with SharePoint workflow features.
This server app is designed to work with Visual Studio clients and can support other IDEs. All of the pieces are there to support reviews and inspections, but you’ll need to devise and execute a strategy that makes it work. Very good integration with Visual Studio, of course.
You have to work out conventions about how to attach line-level issues to file contents. This works very nicely for text (code) files, but isn’t possible for binary files (drawings, etc.), though workarounds exist for those.
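One such convention is to key each issue to a file path, a version-control revision, and a line number, so the issue stays meaningful even after the file changes. A minimal sketch (the field names and `path@revision:line` encoding are illustrative, not any tool’s actual format):

```python
from dataclasses import dataclass

@dataclass
class ReviewIssue:
    path: str       # repository path of the reviewed file
    revision: str   # VCS revision the line number refers to
    line: int       # 1-based line number in that revision
    note: str       # reviewer's comment

    def anchor(self):
        # Conventional one-line encoding: path@revision:line
        return f"{self.path}@{self.revision}:{self.line}"

issue = ReviewIssue("src/parser.c", "r1042", 87, "off-by-one in loop bound")
print(f"{issue.anchor()}  {issue.note}")
```

Pinning the revision is the key design choice: it lets a later reader reconstruct exactly what the reviewer was looking at, even if line 87 means something else in the current head revision.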
Both tools are no-cost open source and relatively easy to use, but they do require someone to control content definition and perform system administration.
These tools also do a great job supporting the rest of programming, testing, and bug tracking.
You can easily assemble an Eclipse tool chain that supports inspections, with tracking. DeveloperWorks describes such an approach.
“It’s software development Jim, but not as we know it.”
The leading Agile development platforms can support reviews and inspections — you just have to transliterate the jargon, rituals, and shibboleths, then get buy-in for the discipline.
CodeCollaborator and PeerReview Complete from SmartBear are commercial products that support markup of code and design (binary) files, plus issue identification and tracking for multiple users.
ASSIST is a research tool that provides flexible process definition. It is free for academic use.
Crucible is a SaaS offering that provides excellent support for reviews. I’ve seen it in use and can highly recommend it for both Agile and incremental processes.
The DACS list of inspection and review support tools provides an overview of some free and for-fee products.
A 2005 report about inspection tools focuses more on a comparison framework than actual features, but offers some useful insights.
Process Impact’s links to inspection resources and tools provide checklists, forms, and more guidance.
There are many static analysis tools that can and should be used to augment eyeballs. Let the tool do the dirty work. Human readers are better at looking for what isn’t there, subtle contradictions, and obstacles to maintainability. Static analyzers don’t obviate the need for reviews, but routine use will definitely boost review effectiveness. For example, a clean report from the analyzers is a useful entry criterion for holding a review meeting.
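That entry criterion is easy to automate. As a sketch (assuming the analyzer emits GCC/clang-style `file:line:col: warning/error:` diagnostics — real tools vary), a gate script can refuse to schedule the review meeting until the report is clean:

```python
import re

# Matches GCC/clang-style diagnostics: "file:line[:col]: warning|error: message"
DIAG = re.compile(r"^.+?:\d+(?::\d+)?:\s*(warning|error):", re.MULTILINE)

def review_entry_gate(analyzer_output):
    """Return True when the analyzer report is clean enough to hold a review."""
    findings = DIAG.findall(analyzer_output)
    return len(findings) == 0
```

In practice this would run in CI against the actual analyzer’s output, and a failing gate simply sends the code back to the author before any reviewer time is spent.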
Here are a few analyzers with notable features.
Coverity. Widely used, ferrets out a very wide range of common problems and security exposures.
Crap4J. Augments testing with JUnit or a similar harness. Helps to identify Change Risk Anti-Patterns — i.e., code with a high probability of hiding bugs and leading to high maintenance costs.
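The CRAP1 metric that Crap4J computes combines a method’s cyclomatic complexity with its test coverage; high complexity with low coverage scores worst. A minimal sketch of the published formula:

```python
def crap_score(complexity, coverage_pct):
    """CRAP1(m) = comp(m)^2 * (1 - cov(m))^3 + comp(m),
    where comp is cyclomatic complexity and cov is the
    fraction of the method exercised by tests."""
    cov = coverage_pct / 100.0
    return complexity ** 2 * (1.0 - cov) ** 3 + complexity

# A fully covered method scores just its complexity...
print(crap_score(5, 100))   # 5.0
# ...while an untested complex method scores far higher.
print(crap_score(10, 0))    # 110.0
```

The cubic coverage term is the interesting design choice: partial coverage buys down risk quickly, which matches the intuition that the first tests on a complex method are the most valuable.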
PolySpace. This Mathworks product uses a very sophisticated symbolic execution approach to find data flow anomalies and certain kinds of boundary errors, which are often very hard to find by human inspection.