
Walking Through a Definition of Done

May 31, 2017

“No book of mine is complete without a dog” - Peter Temple 

“Scrum begins with Done”. So remarked Gunther Verheyen, neatly synopsizing a paradox of agile practice in four words. The assertion might seem counter-intuitive, as though we must start by defining the end. The point being made is that knowing when you are “Done” frames the work which must be undertaken in order to get there. So if every book you write must have a dog, then you know you can't be finished unless Bonzo has made an appearance. 

In Scrum each iteration - or Sprint - should yield a valuable product increment of release quality. That's the essential Bonzo moment; in fact, everything leads up to it. An understanding of what makes an increment truly releasable - and therefore genuinely “Done” - provides transparency over the work a Development Team plans to do, and the tasks it brings into progress.

Why is “Done” so important? Well, incomplete work has a nasty habit of mounting up, and without visibility of how much effort truly remains, the deficit can quickly get out of hand. The tyranny of work which is nearly done, but not really done, can put a team in servitude to technical debt. Team members are obliged to repay the “deficit for release” at compounding rates of interest, as it becomes harder and harder with each Sprint to bring that delinquent, half-finished work into a usable and releasable state. They become indentured to debt, laboring to undo decisions which might have seemed expedient at the time, but which compromised their work and made it of less than release quality. Eventually the expense of fixing things, of repaying the debt, may exceed the value to be had from the next product increment. By that time the team are effectively insolvent.

However, if the team have a clear understanding of what “Done” means, the tyranny which can so easily beggar them is routed and overthrown. Transparency reigns, and in each Sprint the "Definition of Done" holds an honest court. What has truly been delivered? Is the increment of demonstrable release quality, immediately usable, and of actual value to stakeholders?

In short, a “Definition of Done” is fundamental to the attainment of transparency in agile practice. Strangely though, many teams fail to recognize this connection and see “Done” as a kind of stage-gate which, for the sake of “agility”, ought to be negotiated fast-and-loose. This disdain for agile rigor can present a real challenge. Ask a team to share their Definition of Done, and a coach may hear vague mutterings about “unit tests passing” or work being “checked in” or “signed off”. Inquire further as to whether this is adequate for work to be of production quality, and the team can become defensive or sheepish. In truth their increments may not be even close to a releasable state. There may be months of further integration and testing, in multiple elevated pre-production environments, before that condition is achieved. Often, the team might not have implemented Scrum at all, but faked it within the constraints of an unchanged organization and the status quo presented by its operating model.

It can be very tempting to try and fix a lack of rigor by patching a Definition of Done with stronger criteria. However, the definition will be poorly applied unless the team understand its terms and are in agreement with them.

It's best to start with whatever little is thought to hold, and jot it down. It doesn't matter if it is only one or two lines on a Scrum board asserting trivia like “unit tests pass” or “code checked in” or “accepted and signed off”. At this stage, what matters is that the Definition of Done has begun to crystallize. The team must own a manifest view of the standard to which they hold their work, however shoddy that standard may be. Only then can they hope to improve it.

EXAMPLE DEFINITION OF DONE

Remember that a Definition of Done properly applies to an increment.

1. Environments are prepared for release

First, check that no unintegrated work in progress has been left in any development or staging environment. Next, check that the continuous integration framework is verified and working, including regression tests and automated code reviews. The build engine ought to be configured to schedule a build on check-in; it may also trigger hourly or nightly builds. Also check that all of the test data used to validate the features in the release has itself been validated.
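
By way of illustration only, a team might automate such a check. The sketch below assumes hypothetical `make` targets (build, regression, lint) standing in for whatever entry points the team's own build engine actually exposes.

```python
# Minimal pre-release environment smoke-check (a sketch, not a standard).
# The `make` targets here are hypothetical placeholders; substitute the
# commands your own continuous integration framework runs on check-in.
import subprocess
import sys

CHECKS = [
    ["make", "build"],       # build compiles cleanly from a fresh checkout
    ["make", "regression"],  # regression test suite passes
    ["make", "lint"],        # automated code review / static analysis passes
]

def main() -> int:
    for cmd in CHECKS:
        result = subprocess.run(cmd)
        if result.returncode != 0:
            print(f"FAILED: {' '.join(cmd)}", file=sys.stderr)
            return result.returncode
    print("All environment checks passed.")
    return 0

if __name__ == "__main__":
    sys.exit(main())
```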

2. Handover to support is complete (Note: This may be elided in a DevOps context or where the Dev Team will follow the product through to support)

All design models and specifications, including user stories and tests, must be accepted by support personnel who will maintain the increment henceforth. Note that they must also be satisfied that they are in competent control of the supporting environment.

3. Review Ready

Part of the work in a Sprint includes preparing for the review. Sprint metrics ought to be available, including burn-down or burn-up charts. Any user stories which have not been completed ought to be re-estimated and returned to the Product Backlog.
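
Very little is needed to derive burn-down data points for the review. In the sketch below the figures are invented; a real team would draw its remaining work from its own tracker at the end of each day.

```python
# Illustrative burn-down data for a ten-day Sprint. The numbers are
# invented placeholders for the remaining story points a team records daily.
COMMITMENT = 40
SPRINT_DAYS = 10
remaining_by_day = [40, 38, 35, 35, 30, 26, 21, 17, 10, 4]

for day, remaining in enumerate(remaining_by_day, start=1):
    # Straight-line "ideal" burn from commitment down to zero.
    ideal = COMMITMENT * (SPRINT_DAYS - day) / SPRINT_DAYS
    print(f"Day {day:2d}: {remaining:3d} pts remaining (ideal {ideal:4.1f})")
```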

4. Code Complete

Any and all “To Do” annotations must have been resolved, and the source code commented to the satisfaction of the Development Team. Source code should have been refactored to make it understandable, maintainable, and better able to support future change.

Unit test cases must have been designed for all of the features in development, and should allow requirements to be traced to the code implementation, such as by clear, feature-relevant naming conventions. The degree of code coverage should be known, and should meet or exceed the standard required. The unit test cases should have been executed, and the increment proven to work as expected.
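
As a sketch of what such traceability can look like, the pytest example below uses a hypothetical story ID (US-123) and a stand-in authenticate function; the point is the naming convention, not the feature.

```python
# Feature-traceable unit tests (pytest). "US-123" and authenticate() are
# hypothetical; the convention is that the story ID in each test name lets
# a requirement be traced to the code that proves it.

def authenticate(username: str, password: str) -> bool:
    """Stand-in for the feature under development."""
    return username == "alice" and password == "s3cret"

class TestUS123Login:
    def test_us123_login_accepts_valid_credentials(self):
        assert authenticate("alice", "s3cret")

    def test_us123_login_rejects_bad_password(self):
        assert not authenticate("alice", "wrong")
```

If the pytest-cov plugin is installed, the same run can gate coverage, e.g. `pytest --cov --cov-fail-under=80`; the threshold itself belongs in the team's working agreement, not in this sketch.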

Peer reviews ought to be done. (Note: If pair programming is used, a separate peer review session might not be required.) Source code should be checked into the configuration management system with appropriate, peer-reviewed comments added. The source code should have been merged with the main branch, and the automatic deployment into elevated environments verified.
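
Verification of that deployment can itself be automated. The sketch below assumes a hypothetical /health endpoint on a staging host; both the URL and the expectations would be the team's own.

```python
# Post-deployment smoke check: a sketch, assuming the elevated environment
# exposes a hypothetical /health endpoint returning HTTP 200 when healthy.
import urllib.request

STAGING_HEALTH_URL = "https://staging.example.com/health"  # hypothetical

def deployment_is_healthy(url: str, timeout: float = 5.0) -> bool:
    try:
        with urllib.request.urlopen(url, timeout=timeout) as response:
            return response.status == 200
    except OSError:  # covers URLError, HTTPError, and timeouts
        return False

if __name__ == "__main__":
    print("healthy" if deployment_is_healthy(STAGING_HEALTH_URL) else "unhealthy")
```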

5. Test Complete

Functional testing should be done. This includes both automated testing and manual exploratory testing, and a test report should have been generated. All outstanding defects (or incidents such as build issues) should be elicited and resolved, or accepted by the team as not contraindicating release. Regression testing should have been completed, showing that the functionality provided in previous iterations still works. Performance, security, and user acceptance testing should have been done, and the product shown to work on all required platforms.
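
One way to keep these suites distinct and reportable is sketched below with pytest markers. The marker names are team conventions rather than anything pytest mandates, and the test bodies are placeholders.

```python
# Sketch: tagging tests so regression and non-functional suites can be
# selected and reported on separately. Marker names are team conventions;
# register them in pytest.ini to avoid warnings. Bodies are placeholders.
import pytest

@pytest.mark.regression
def test_feature_from_previous_sprint_still_works():
    assert 2 + 2 == 4  # placeholder for an earlier increment's behavior

@pytest.mark.performance
def test_response_time_within_agreed_budget():
    assert True  # placeholder for a timed check against the team's budget
```

A run such as `pytest -m regression --junitxml=report.xml` then selects one suite and produces the test report the definition calls for.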

DEFICIT FOR RELEASE

"Done" criteria which are needed to effect a release, but which cannot yet be observed, constitute a deficit. They should be enumerated here (e.g. by moving them out of the Definition of Done).

"Better a little which is well done than a great deal imperfectly" - Plato. 

 

