
End of Sprint Package Deployment Issues

Last post 08:49 pm November 6, 2013 by Joshua Partogi
2 replies
04:10 pm November 4, 2013

Greetings,

I am a member of a deployment team at a software company that uses the Scrum process for software development. At the end of each sprint, code from multiple developers is merged into a QA environment and then manual testing is done by product reps. Once the testing has been completed and approval is given, the developers cut a build/package and release it to our deployment team.

Unfortunately, we are usually scheduled to install or upgrade the packages in our customers' test environments the same day, or the day after, we are actually given the build. The product owner is usually the product manager of the product, and they have promised a certain feature/fix to the customer by a certain date. This causes a problem, as the development team doesn't test the packages they give the deployment team, and the deployment team is not given any time to test the packages either. It is not uncommon for the packages we receive to fail to install or upgrade cleanly, or, when they do install successfully, for the product to have issues because the deployment package doesn't contain all the changes that were merged into the QA environment. The deployment team, and now the support team, will spend hours troubleshooting problems only to end up getting development involved. The development team in turn is not too happy, as they have already started working on the next sprint and don't have time to work with the deployment or support teams.

This is very frustrating to say the least. This topic has been brought up numerous times with product owners, but we are told that there is no time available in the sprint process to test deployment packages.

From our standpoint, build automation should be used so that testing is done with the actual deployment packages. This would catch packages that don't work before they are released, and would actually test the end product the customer will be getting. Unfortunately, I have been told this is not possible due to our limited testing capabilities.
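To make the idea concrete, here is a minimal sketch of the kind of check I mean, assuming a Python-packaged product; the package path and the `ourproduct` module name are hypothetical placeholders for whatever the real build produces. The point is that the exact artifact the developers cut gets checksummed and smoke-tested in a clean environment before anyone schedules a customer install:

```python
#!/usr/bin/env python3
"""Smoke-test the exact artifact that will be shipped (hypothetical sketch).

Takes the package the developers cut, installs it into a throwaway
virtualenv, and runs a basic import check. The checksum ties the tested
artifact to the one handed to the deployment team.
"""
import hashlib
import subprocess
import sys
import tempfile
import venv
from pathlib import Path


def sha256(path: Path) -> str:
    # Checksum so the tested package and the shipped package are
    # provably the same file.
    return hashlib.sha256(path.read_bytes()).hexdigest()


def smoke_test(package: Path) -> None:
    with tempfile.TemporaryDirectory() as tmp:
        env_dir = Path(tmp) / "env"
        venv.create(env_dir, with_pip=True)  # clean env, like a customer's
        pip = env_dir / "bin" / "pip"        # "Scripts" instead of "bin" on Windows
        python = env_dir / "bin" / "python"
        # Install the actual deployment artifact, not the QA working tree.
        subprocess.run([str(pip), "install", str(package)], check=True)
        # "ourproduct" is a placeholder for the real top-level module.
        subprocess.run([str(python), "-c", "import ourproduct"], check=True)


if __name__ == "__main__":
    pkg = Path(sys.argv[1])  # e.g. dist/ourproduct-1.2.3.tar.gz (hypothetical)
    print(f"sha256 {sha256(pkg)}  {pkg.name}")
    smoke_test(pkg)
    print("package installed and imported cleanly")
```

Even a check this small would catch packages that fail to install at all, and the recorded checksum proves that the file that passed the smoke test is byte-for-byte the file that gets deployed.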

We are just looking for any guidance or experience from others who have been in our shoes on a deployment team. It seems as though the Scrum process puts little emphasis on testing the actual product in its packaged state. I am not sure if this is a flaw of the Scrum process in general or just a symptom of the way it is currently being implemented here.

Sincerely,
Kimberly

05:44 am November 5, 2013

> I am not sure if this is a flaw of the Scrum process in general or
> just a symptom of the way it is currently being implemented here.

It's a flaw in the way it's being implemented. Here's the problem:

- In order to be considered Done, the increment must be deployed into a particular environment
- This criterion is not reflected in the Development Team's Definition of Done
- The responsibility for correct deployment lies with a separate Deployment Team
- Hence, in this implementation, the Development Team do not own their process. They are unable to take a Sprint Backlog through to completion so that the intended Definition of Done is satisfied.

> This is very frustrating to say the least. This topic has been brought
> up numerous times with product owners, but we are told that there
> is no time available in the sprint process to test deployment packages

Who is telling you that? The Product Owners? The length of a Sprint is jointly determined by all members of a Scrum Team, not just POs. If there isn't enough time in the Sprint to satisfy a Definition of Done of suitable quality, then either time (length of sprint), scope (work done in sprint), or cost (developers allocated) must vary.

> It seems as though the Scrum process puts little emphasis on testing
> the actual product in its packaged state

Scrum puts the emphasis firmly on delivering potentially releasable increments of value. In Scrum, it is considered important that each increment be deployed into production so that a return on investment can be earned, and so that real-world feedback can be elicited. Failure to do this will result in technical debt being incurred. This is what the "deployment" team are currently struggling with. They are trying to pay off technical debt that is arising from the shortfall between the Development Team's process and the Definition of Done that is actually needed.


08:49 pm November 6, 2013

I am with you, Kimberly. Throwing half-baked work over the fence to another team is definitely not how we work in Scrum. You need to discuss this in their retrospectives. Have you ever attended their Sprint Retrospectives and raised this issue?

