Forums

How can we put QA back in the Sprint?
Last Post 19 Nov 2013 06:01 AM by Ian Mitchell. 6 Replies.
SandraU
New Member
Posts: 10
15 Nov 2013 02:46 PM
    Hello,

    I am looking for tips on how to put QA back into the Sprint. At the moment the "support and maintenance" team is the QA team. The Development Team does TDD and a little BDD for acceptance testing, but the "support and maintenance" team still does too much of the testing.

    I have the book "Agile Testing", but I am still at the beginning. What can I do to get the "Definition of Ready" and "Definition of Done" into a useful state and actually used by people? What else can I do?

    (We have a coach, books, videos, trainings, a Scrum Master ... they have worked with Scrum for over 5 years now ...)

    Regards,
    Sandra
    Ian Mitchell
    Advanced Member
    Posts: 571
    16 Nov 2013 01:17 AM
    > What can I do to get the "Definition of Ready"
    > and "Definition of Done" in a useful state and
    > used by the people?

    Have you addressed the Definition of Done in a Sprint Retrospective yet? Improving the DoD is one of its outputs, and it should involve the entire Scrum Team.
    Ian Mitchell
    Advanced Member
    Posts: 571
    16 Nov 2013 01:22 AM
    > What else can I do?

    Check that the Definition of Done is one that the team can actually meet. If they don't own their process, and can't take backlog items through to completion without external dependency or impediment, the DoD will be of little value.
    SandraU
    New Member
    Posts: 10
    16 Nov 2013 02:22 AM
    It would be great to have some more information about the DoD. I do not know what it could look like at all. For example, "All tests are green" could mean anything.
    Xebord
    New Member
    Posts: 13
    16 Nov 2013 03:12 PM
    An example:
    - every new piece of code has to be covered by unit tests
    - test coverage has to be above 75%
    - every commit has to have a comment no shorter than 12 characters, with the ID of the user story or bug
    - no compilation warnings
    - no critical issues reported by static code analysis
    - no failed regression (old) tests
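    Rules like the commit-comment one above are usually enforced by tooling rather than by hand. A minimal sketch of a git `commit-msg` hook that checks the minimum length and looks for a story or bug ID (the `US-123`/`BUG-456` ticket format here is an assumption, not something from this thread):

    ```python
    #!/usr/bin/env python3
    # Sketch of a git commit-msg hook enforcing two of the rules above:
    # a minimum message length and a reference to a user story or bug.
    # The US-123 / BUG-456 ID format is a hypothetical example.
    import re

    MIN_LENGTH = 12
    ID_PATTERN = re.compile(r"\b(US|BUG)-\d+\b")  # hypothetical ticket format

    def check_message(message: str) -> list[str]:
        """Return a list of rule violations for a commit message."""
        errors = []
        first_line = message.strip().splitlines()[0] if message.strip() else ""
        if len(first_line) < MIN_LENGTH:
            errors.append(f"message shorter than {MIN_LENGTH} characters")
        if not ID_PATTERN.search(message):
            errors.append("no user story or bug ID (e.g. US-123) found")
        return errors

    if __name__ == "__main__":
        # In a real hook, git passes the path of the commit message file
        # as the first argument; here we just demonstrate on a sample.
        for problem in check_message("fixed it"):
            print(f"Commit rejected: {problem}")
    ```

    Installed as `.git/hooks/commit-msg`, a script like this would read the message file git passes in and exit non-zero on any violation, which blocks the commit.
    
    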
    SandraU
    New Member
    Posts: 10
    16 Nov 2013 03:20 PM
    Thanks a lot. That is a great start. :-)
    Ian Mitchell
    Advanced Member
    Posts: 571
    19 Nov 2013 06:01 AM
    A Definition of Done should be good enough that the resulting increment is potentially releasable. However, many teams do not initially own their process, and a DoD of such quality is not possible. These teams may separate "story done" from "sprint done" and "release done". This allows technical debt to be controlled, or at least made visible, until such time as a single release-ready DoD can be applied to each and every Sprint.

    Here's an example of an organizational DoD. Remember: this isn't an example of what a good DoD should look like, since it contains anti-patterns: a good DoD should not separate "story done" from "sprint done" from "release done". It is an illustration of an intermediate set of definitions that may be used during a controlled agile transformation, before CI/CD is in effect:

    -- STORY DONE --

    • Code functionally working, checked in, and building
    • Code is traceable to requirements (check-in is linked to a work item, e.g. by ID)
    • Unit tests written and all unit tests in the test suite passing
    • Regression tests passing
    • Scrum team tests automated
    • Scrum tester testing completed and passing, with test scripts and evidence produced
    • Technical debt register updated (as a story in the backlog)
    • Approved by Product Owner (points awarded at demo)

    Optional:

    • Code and unit tests peer reviewed (review recorded on check-in with a check-in comment "Reviewed by…")


    -- SPRINT DONE --

    • Code conforms to organizational development standards
    • Infrastructure components used appropriately
    • Code deployed to a production equivalent integration environment
    • Code deployed to Product Owner review area
    • Code merged to main line
    • Code coverage >= 75%
    • New services registered in service repository

    Optional:

    • Database transactions analysed for efficiency
    • Master database scripts produced from Athena
    • Automated confidence tests for deployments
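    A threshold like "code coverage >= 75%" is normally checked by the build rather than by a person. A minimal sketch of such a gate, assuming a Cobertura-style XML report whose root element carries a `line-rate` attribute (the report format is an assumption, not from this thread):

    ```python
    #!/usr/bin/env python3
    # Sketch of a build gate for the "code coverage >= 75%" rule above.
    # Assumes a Cobertura-style XML report with a "line-rate" attribute
    # on the root element; the format is a hypothetical example.
    import xml.etree.ElementTree as ET

    THRESHOLD = 0.75

    def line_coverage(report_xml: str) -> float:
        """Return the overall line-coverage rate from a Cobertura-style report."""
        root = ET.fromstring(report_xml)
        return float(root.get("line-rate", "0"))

    def meets_sprint_done(report_xml: str) -> bool:
        """True when the report satisfies the >= 75% coverage rule."""
        return line_coverage(report_xml) >= THRESHOLD

    if __name__ == "__main__":
        # A tiny inline report for demonstration; a build would read the
        # report file produced by the coverage tool.
        sample = '<coverage line-rate="0.81" branch-rate="0.70"></coverage>'
        print(f"Line coverage: {line_coverage(sample):.0%},",
              "passes" if meets_sprint_done(sample) else "fails")
    ```

    A CI job would run this after the test stage and fail the build when the gate returns false, which keeps the rule from silently eroding.
    
    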


    -- RELEASE DONE --

    • Master database scripts produced
    • Database transactions analysed for efficiency
    • Zero critical or high outstanding defects
    • Performance & Volume tested
    • Security tested
    • Supporting documentation completed
    • User permissions set up
    • Wiki page with links to: troubleshooting guide, functional spec, High Level Design, user stories, documentation of any non-standard patterns/tools used, location of project code