Scrum Forum

How can we put QA back in the Sprint?

Last post 07:01 am November 19, 2013
by Ian Mitchell
6 replies
03:46 pm November 15, 2013


I am looking for tips on how to put QA back in the Sprint. At the moment the "support and maintenance" team acts as the QA team. The Development Team does TDD and a little BDD for acceptance testing, but the "support and maintenance" team does too much of the testing.

I have the book "Agile Testing", but I am still at the beginning. What can I do to get the "Definition of Ready" and "Definition of Done" into a useful state, and actually used by people? What else can I do?

(We have a coach, books, videos, trainings, a Scrum Master... they have worked with Scrum for over 5 years now...)


02:17 am November 16, 2013

> What can I do to get the "Definition of Ready"
> and "Definition of Done" in a useful state and
> used by the people?

Have you addressed the Definition of Done in a Sprint Retrospective yet? Improving the DoD is one of its outputs, and it should involve the entire Scrum Team.

02:22 am November 16, 2013

> What else can I do?

Check that the Definition of Done is one that the team can actually meet. If they don't own their process, and can't take backlog items through to completion without external dependency or impediment, the DoD will be of little value.

03:22 am November 16, 2013

It would be great to have some more information about the DoD. I do not know what it could look like at all. For example, "All tests are green" could mean anything.

04:12 pm November 16, 2013

An example:
- every new piece of code has to be covered by unit tests
- test coverage has to be above 75%
- every commit has to have a comment of at least 12 characters that includes the user story or bug number
- no compilation warnings
- no critical issues reported by static code analysis
- no failing regression (old) tests
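Some of these items can be checked mechanically rather than by discipline alone. As a minimal sketch, here is a hypothetical Git commit-msg hook (the `US-`/`BUG-` identifier format is an assumption, not from the thread) that enforces the comment rule above:

```python
import re
import sys

# Assumed ticket-ID format for illustration, e.g. "US-123" or "BUG-45".
STORY_OR_BUG = re.compile(r"\b(US|BUG)-\d+\b")

def commit_message_ok(message: str) -> bool:
    """True if the message is at least 12 characters and cites a story/bug ID."""
    return len(message.strip()) >= 12 and bool(STORY_OR_BUG.search(message))

if __name__ == "__main__":
    # Git invokes a commit-msg hook with the path to the message file.
    with open(sys.argv[1]) as f:
        if not commit_message_ok(f.read()):
            sys.exit("Commit rejected: message must be >= 12 chars and cite a US/BUG id.")
```

Dropped into `.git/hooks/commit-msg`, a check like this makes the DoD item self-enforcing instead of something a reviewer has to remember.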

04:20 pm November 16, 2013

Thanks a lot. That is a great start. :-)

07:01 am November 19, 2013

A Definition of Done should be good enough that the resulting increment is potentially releasable. However, many teams do not initially own their process, and a DoD of such quality is not possible. These teams may separate "story done" from "sprint done" and "release done". This allows technical debt to be controlled, or at least made visible, until such time as a single release-ready DoD can be applied to each and every Sprint.

Here's an example of an organizational DoD. Remember: this isn't an example of what a good DoD should look like, since it contains anti-patterns. A good DoD should not separate "story done" from "sprint done" from "release done". It is an illustration of an intermediate set of definitions that may be used during a controlled agile transformation before CI/CD is in effect:


• Code functionally working, checked in and builds
• Code is traceable to requirements (check in is linked to a work item, e.g. by ID)
• Unit tests written and all unit tests in the test suite passing
• Regression tests passing
• Scrum team tests automated
• Scrum tester testing completed and passing, with test scripts and evidence produced
• Technical debt register updated (as a story in the backlog)
• Approved by Product Owner (Points awarded at demo)


• Code and unit tests peer reviewed
- review on check-in, with a check-in comment “Reviewed by…”


• Code conforms to organizational development standards
• Infrastructure components used appropriately
• Code deployed to a production equivalent integration environment
• Code deployed to Product Owner review area
• Code merged to main line
• Code coverage >= 75%
• New services registered in service repository


• Database transactions analysed for efficiency
• Master database scripts produced from Athena
• Automated confidence tests for deployments


• Master database scripts produced
• Database transactions analysed for efficiency
• Zero critical or high outstanding defects
• Performance & Volume tested
• Security tested
• Supporting documentation completed
• User permissions set up
• Wiki page with links to: Troubleshooting guide, functional spec, High Level Design, user stories, documentation of any non-standard patterns/tools used, location of project code
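The "Code coverage >= 75%" item above is another one that a CI pipeline can gate automatically. As a minimal sketch, assuming a coverage.py-style XML report (the file contents and 75% threshold are illustrative):

```python
import xml.etree.ElementTree as ET

THRESHOLD = 75.0  # the DoD's minimum coverage, as a percentage

def coverage_percent(xml_text: str) -> float:
    """Read line coverage from a coverage.py XML report's root line-rate attribute."""
    root = ET.fromstring(xml_text)
    return float(root.get("line-rate", 0.0)) * 100.0

def coverage_gate(xml_text: str, threshold: float = THRESHOLD) -> bool:
    """True if the build meets the DoD coverage item."""
    return coverage_percent(xml_text) >= threshold
```

A CI job would fail the build when the gate returns False, making the DoD item visible on every commit rather than only at release time.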