
How can we put QA back in the Sprint?

Last post 12:56 pm July 7, 2022 by Chuck Suscheck
10 replies
03:46 pm November 15, 2013


I am looking for tips on how to put QA back in the Sprint. At the moment the "support and maintenance" team is the QA team. The Development Team does TDD and a little bit of BDD for acceptance tests. But the "support and maintenance" team does too much testing.

I have the book "Agile Testing", but I am still at the beginning. What can I do to get the "Definition of Ready" and "Definition of Done" into a useful state, and actually used by the team? What else can I do?

(We have a coach, books, videos, training, a Scrum Master... they have worked with Scrum for over 5 years now...)


02:17 am November 16, 2013

> What can I do to get the "Definition of Ready"
> and "Definition of Done" in a useful state and
> used by the people?

Have you addressed the Definition of Done in a Sprint Retrospective yet? Improving the DoD is one of its outputs, and it should involve the entire Scrum Team.

02:22 am November 16, 2013

> What else can I do?

Check that the Definition of Done is one that the team can actually meet. If they don't own their process, and can't take backlog items through to completion without external dependency or impediment, the DoD will be of little value.

03:22 am November 16, 2013

It would be great to have some more information about the DoD. I do not know what it could look like at all. For example, "All tests are green" can mean anything.

04:12 pm November 16, 2013

An example:
- every new piece of code has to be covered by unit tests
- test coverage has to be above 75%
- every commit has to have a comment of at least 12 characters including the user story or bug number
- no compilation warnings
- no critical issues reported by static code analysis
- no failed regression (old) tests
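Rules like the commit-comment one above are easiest to keep when they are enforced mechanically rather than by review. As a minimal sketch, here is a hypothetical git `commit-msg` hook in Python; the `US-123`/`BUG-456` ID format is an assumption, not something from the thread, so adjust the pattern to your own tracker.

```python
#!/usr/bin/env python3
"""Hypothetical commit-msg hook enforcing the commit-comment rule:
at least 12 characters, and a user story or bug reference
(assumed here to look like US-123 or BUG-456)."""
import re
import sys


def check_message(message: str) -> list[str]:
    """Return a list of rule violations for a commit message."""
    problems = []
    if len(message.strip()) < 12:
        problems.append("comment is shorter than 12 characters")
    if not re.search(r"\b(US|BUG)-\d+\b", message):
        problems.append("no user story or bug number (e.g. US-123)")
    return problems


if __name__ == "__main__" and len(sys.argv) > 1:
    # Git passes the path of the file containing the commit message.
    with open(sys.argv[1], encoding="utf-8") as f:
        violations = check_message(f.read())
    for v in violations:
        print(f"commit rejected: {v}", file=sys.stderr)
    sys.exit(1 if violations else 0)
```

Installed as `.git/hooks/commit-msg`, this blocks non-conforming commits locally; the same check can run server-side so the DoD item is verified on every push.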

04:20 pm November 16, 2013

Thanks a lot. That is a great start. :-)

07:01 am November 19, 2013

A Definition of Done should be good enough that the resulting increment is potentially releasable. However, many teams do not initially own their process, and a DoD of such quality is not possible. These teams may separate "story done" from "sprint done" and "release done". This allows technical debt to be controlled, or at least made visible, until such time as a single release-ready DoD can be applied to each and every Sprint.

Here's an example of an organizational DoD. Remember: this isn't an example of what a good DoD should look like, since it contains anti-patterns: a good DoD should not separate "story done" from "sprint done" from "release done". It is an illustration of an intermediate set of definitions that may be used during a controlled agile transformation before CI/CD is in effect:


• Code functionally working, checked in and builds
• Code is traceable to requirements (check in is linked to a work item, e.g. by ID)
• Unit tests written and all unit tests in the test suite passing
• Regression tests passing
• Scrum team tests automated
• Scrum tester testing completed and passing, with test scripts and evidence produced
• Technical debt register updated (as a story in the backlog)
• Approved by Product Owner (Points awarded at demo)


• Code and unit tests peer reviewed
- review on check in with a check in comment “Reviewed by…”


• Code conforms to organizational development standards
• Infrastructure components used appropriately
• Code deployed to a production equivalent integration environment
• Code deployed to Product Owner review area
• Code merged to main line
• Code coverage >= 75%
• New services registered in service repository


• Database transactions analysed for efficiency
• Master database scripts produced from Athena
• Automated confidence tests for deployments


• Master database scripts produced
• Database transactions analysed for efficiency
• Zero critical or high outstanding defects
• Performance & Volume tested
• Security tested
• Supporting documentation completed
• User permissions set up
• Wiki page with links to: Trouble shooting guide, functional spec, High Level Design, user stories, documentation of any non-standard patterns/tools used, location of project code
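An item like "Code coverage >= 75%" only stays honest if the build enforces it. The sketch below assumes a Cobertura-style `coverage.xml` report (as produced, for example, by coverage.py's `coverage xml` command), whose root element carries a `line-rate` attribute; the file name and threshold are illustrative, not from the thread.

```python
"""Minimal sketch of a coverage gate for a "coverage >= 75%" DoD item,
assuming a Cobertura-style XML report with a `line-rate` root attribute."""
import sys
import xml.etree.ElementTree as ET

THRESHOLD = 0.75  # the 75% figure from the example DoD


def coverage_ok(xml_text: str, threshold: float = THRESHOLD) -> bool:
    """Parse the report and compare the overall line rate to the threshold."""
    line_rate = float(ET.fromstring(xml_text).get("line-rate"))
    return line_rate >= threshold


if __name__ == "__main__" and len(sys.argv) > 1:
    with open(sys.argv[1], encoding="utf-8") as f:
        ok = coverage_ok(f.read())
    print("coverage gate:", "pass" if ok else "fail")
    sys.exit(0 if ok else 1)
```

Run as a CI step after the test suite, a non-zero exit fails the build, which turns the DoD bullet into an automated check rather than a checklist item.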

09:42 am July 7, 2022

@Ian Mitchell

Approved by Product Owner (Points awarded at demo)

Code coverage >= 75%

Could you elaborate further on "Points awarded at demo" and on the code coverage >= 75% criterion?

10:39 am July 7, 2022

Approved by Product Owner (Points awarded at demo)

There's an anti-pattern sometimes referred to as "story point productivity": the belief that value somehow lies in estimated points, rather than in an immediately usable increment. In this case, there's even an assumption that a Product Owner might in some sense "award" them. Other issues include Product Owner approval -- the Developers ought to be respected in their ability to craft a Done increment -- and the possible conflation of a Sprint Review with some kind of "demo".

Code coverage >= 75%

I imagine this is a reference to code quality metrics, such as unit test coverage of statements or conditions. Other metrics include cyclomatic complexity.
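To make the cyclomatic-complexity metric concrete: it is commonly computed as the number of decision points in a piece of code plus one. The counter below is a rough illustration over Python source using the standard `ast` module, not a production tool (real analyzers handle more node types).

```python
"""Rough illustration of cyclomatic complexity:
complexity = number of decision points + 1."""
import ast


def cyclomatic_complexity(source: str) -> int:
    """Count branching nodes in a piece of Python source, plus one."""
    tree = ast.parse(source)
    decisions = 0
    for node in ast.walk(tree):
        if isinstance(node, (ast.If, ast.For, ast.While, ast.ExceptHandler)):
            decisions += 1
        elif isinstance(node, ast.BoolOp):
            # each extra operand of `and`/`or` adds a branch
            decisions += len(node.values) - 1
    return decisions + 1


example = """
def classify(x):
    if x < 0:
        return "negative"
    elif x == 0:
        return "zero"
    return "positive"
"""
print(cyclomatic_complexity(example))  # -> 3
```

Teams that gate on such metrics typically set thresholds per function (e.g. fail the build above 10), the same way a coverage threshold gates the whole suite.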


11:13 am July 7, 2022

Thank you Ian, it makes sense :)

I just realized that you mentioned "this isn't an example of what a good DoD should look like, since it contains anti-patterns".

12:56 pm July 7, 2022

I wrote an article some time back that may help. Check this:

Ultimately, work with the developers and tell them "you must do better with QA," and then hold them accountable for getting everything as done as possible. Make them come up with ways to improve.
