
Reduction in repetitive testing effort

Posted 11:08 am October 7, 2017 by Raghunandan Ethiraj Srinivasan

We work on a legacy system, and as such a lot of our testing can't be automated. After we moved to the Scrum approach for delivery and completed a few Sprints, the team identified that during each Sprint, when they built an Increment on top of what was built in the previous Sprint, a lot of effort went into running the same test scripts over and over again to ensure that a potentially shippable product was ready at the end of the Sprint.

We took this up during the next Retrospective and analysed what was going on and what the pain point really was. Under the waterfall model, the developers would build all the functionality first and then test it in one go; in an iterative approach, to ensure that a "Done" Increment was available, they had to run most of the same test cases again and again. We dug deeper into how much effort was actually going into retesting. Surprisingly, the testing activity itself wasn't the major effort. The real pain point was capturing the testing evidence and then formatting it so that it was easy to review and to reference in the future.

Prior to the Retrospective, I googled to see if someone had come up with a solution to this kind of problem. While there were a number of suggestions, there was no off-the-shelf solution I could propose, and I wasn't even sure how the team would feel about me, as a Scrum Master, not really having a solution to this problem. I was concerned that the team would use this as a chance to push for moving back to the older delivery model. During the Retrospective, I asked what the team would like to do in this scenario so that we could still deliver "Done" software at the end of each Sprint. After brainstorming, the team came up with an idea that works perfectly for them and for the Product Owner.

The solution was that the developers would still do the testing but skip capturing all the evidence for review during the Sprint. Only the code review would be completed; the testing review would remain pending. The developers would explain to the reviewer how they tested and why they were comfortable with the amount of testing done before moving forward. Technically this didn't make the Increment a shippable product, as the testing hadn't been reviewed. In our scenario, though, release dates are known well in advance, so the team would capture the results, format them, and get them reviewed in the Sprint prior to the release. From the Product Owner's perspective this was good enough. The team also agreed with the Product Owner that if he made an early call for a release, he would have to wait one Sprint while this technical debt was cleared, and he was OK with that. We updated our Definition of Done accordingly. We also agreed to look at developing tools going forward to automate the job of evidence collection and formatting.
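For anyone curious what such a tool might look like, here is a minimal sketch of the kind of thing we had in mind (this is not what our team actually built; the use of pytest's JUnit XML output, the file names, and the Markdown report layout are all assumptions for illustration):

    # Minimal sketch: turn a JUnit XML test report (e.g. produced by
    # "pytest --junitxml=results.xml") into a Markdown evidence document
    # for review. File names and report layout are hypothetical.
    import xml.etree.ElementTree as ET
    from datetime import datetime

    def format_evidence(junit_xml: str, out_md: str) -> None:
        root = ET.parse(junit_xml).getroot()
        # Newer pytest versions wrap results in <testsuites>; older
        # versions use a single <testsuite> as the root element.
        suite = root[0] if root.tag == "testsuites" else root
        lines = [
            f"# Test Evidence - {datetime.now():%Y-%m-%d}",
            "",
            "| Test case | Result | Time (s) |",
            "| --- | --- | --- |",
        ]
        for case in suite.iter("testcase"):
            failed = (case.find("failure") is not None
                      or case.find("error") is not None)
            skipped = case.find("skipped") is not None
            result = "FAIL" if failed else ("SKIP" if skipped else "PASS")
            lines.append(f"| {case.get('classname')}.{case.get('name')} "
                         f"| {result} | {case.get('time')} |")
        with open(out_md, "w") as f:
            f.write("\n".join(lines) + "\n")

    if __name__ == "__main__":
        format_evidence("results.xml", "evidence.md")

The idea is that the evidence document is generated straight from the test run, so capturing and formatting it costs the developers nothing extra, and the reviewer gets a consistent, review-ready file every Sprint.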

A few things helped us reach this point. The team showed that they trusted their fellow developers' sense of responsibility when they said they had tested something, even without full evidence to prove it. It is also worth noting that since this was a legacy system, the team had a good understanding of the functionality they were developing, so not much could deviate from the expected results. But for me as a Scrum Master, the greatest achievement was seeing that the team's collective thinking is a far more powerful search engine for a solution.

I'm posting this here for two reasons:

  1. To share a story of how Retrospectives can work magic.
  2. To find out whether others have had similar experiences and what they did in those situations.
