The Definition of Done: Ensuring Quality without Compromising Value

October 5, 2023

The Definition of Done (DoD) is a sacrosanct measure of quality, ensuring that every piece of work meets the standards necessary for release. On the other hand, acceptance criteria focus on the work's content. Merging the two can risk the integrity of a working, usable product. This article delves into the nuances of maintaining the sanctity of the DoD while ensuring the delivery of valuable increments.

Why having Acceptance Criteria in the Definition of Done risks the sanctity of the working, usable product

The DoD is my touchstone for quality. It's a measurable and automatable standard that ensures every increment is of a certain quality. Introducing optional or non-executable elements into the DoD can diminish its importance, potentially jeopardising the transparency and empiricism foundational to our work.
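To make "measurable and automatable" concrete, here is a minimal, purely illustrative sketch in Python of a DoD expressed as executable checks that gate an increment. The specific commands (pytest, ruff, mkdocs) are assumptions standing in for whatever your pipeline actually runs; the point is that every item is a yes/no check a machine can answer.

```python
import subprocess
import sys

# Illustrative Definition of Done expressed as executable checks.
# Each entry maps a DoD item to a command that must succeed; the
# commands below are placeholders for whatever your pipeline uses.
DEFINITION_OF_DONE = {
    "All unit tests pass": ["pytest", "--quiet"],
    "Static analysis reports no errors": ["ruff", "check", "."],
    "Documentation builds cleanly": ["mkdocs", "build", "--strict"],
}


def increment_is_done() -> bool:
    """Return True only if every DoD check succeeds."""
    done = True
    for item, command in DEFINITION_OF_DONE.items():
        result = subprocess.run(command)
        print(f"{'PASS' if result.returncode == 0 else 'FAIL'}: {item}")
        done = done and result.returncode == 0
    return done


if __name__ == "__main__":
    # A single failing check means the increment is not Done.
    sys.exit(0 if increment_is_done() else 1)
```

A DoD built from checks like these stays transparent: anyone can run them and get the same answer about whether the increment meets the standard.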

The DoD isn't about guaranteeing that we've built the right functionality; it's about ensuring that what we've built works. It's the bedrock of transparency. If stakeholders can't tell whether the work meets the standards necessary for release, they have no clarity about the additional effort still required. Similarly, if Developers are unsure of those standards, they can't accurately gauge the work needed.

Incorporating elements that might undermine this transparency or lower these standards is a risk we shouldn't take. The DoD's primary purpose is to maintain this clarity.

Why having Acceptance Criteria in the Definition of Done may not risk the sanctity of the working, usable product

There are instances where quality measures form part of the acceptance criteria. In such cases, one might be tempted to add "must meet acceptance criteria" to the DoD. However, doing so can jeopardise the team's flexibility to decide the Sprint's contents while still upholding the DoD's sanctity.

Instead, we should scrutinise the acceptance criteria. If specific criteria always need to be true, they should be incorporated into the standards and, by extension, the DoD. For instance, criteria like "all content is spelled correctly and has good grammar" or "all copy meets visual spacing guidelines" can be integrated into the DoD, as in the sketch below.
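As a hedged illustration of that promotion, a spelling criterion that must always hold could become just another executable check in the same kind of gate sketched earlier; codespell and the content/ path are stand-ins for whatever checker and location your team actually uses.

```python
import subprocess
import sys

# Illustrative only: once "all content is spelled correctly" is part of the
# standard, it is an executable DoD check rather than a per-item acceptance
# criterion. codespell and the content/ path are placeholders.
result = subprocess.run(["codespell", "content/"])
sys.exit(result.returncode)
```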

If you're keen on including acceptance criteria in the DoD and wish to maintain the flexibility of value delivery, consider phrasing it as "for all work delivered in the Sprint, quality-based acceptance criteria on each backlog item are fulfilled". This retains the original intent while allowing the team to adapt based on their learnings.

Conclusion

The DoD aims to ensure transparency, confirming that all showcased work meets our product's standards. Anything that jeopardises this transparency or the adaptability of the Sprint's contents should be approached with caution. Only adopt such changes if you can guarantee the preservation of adaptability and transparency.

---

NKDAgility can help!

These are the intricacies that lean-agile aficionados thrive on, but most find daunting. If you find it hard to distinguish between the Definition of Done and Acceptance Criteria, my team at NKDAgility is here to assist. Don't let these issues undermine your value delivery. Seek help sooner rather than later.

Right now, you can request a free consultation with my team or enrol in one of our upcoming professional Scrum classes. Because you don't just need agility, you need Naked Agility.

