End of Sprint Package Deployment Issues
Last Post 06 Nov 2013 07:49 PM by Joshua Partogi. 2 Replies.
Kimberly Swanson
New Member
Posts:1
04 Nov 2013 03:10 PM
    Greetings,

    I am a member of a deployment team at a software company that uses the Scrum process for software development. At the end of each sprint, code from multiple developers is merged into a QA environment, and manual testing is then done by product reps. Once testing has been completed and approval is given, the developers cut a build/package and release it to our deployment team.

    Unfortunately, we are usually scheduled to install or upgrade the packages in our customers' test environments on the same day we are given the build, or the next day. The product owner is usually the product manager of the product, and they have promised a certain feature or fix to the customer by a certain date. This causes a problem: the development team doesn't test the packages they give the deployment team, and the deployment team is not given any time to test the packages. It is not uncommon for the packages we receive to fail to install or upgrade cleanly, or, when they do install successfully, for the product to have issues because the deployment package doesn't contain all the changes merged into the QA environment. The deployment team, and now the support team, will spend hours troubleshooting problems only to end up getting development involved. The development team in turn is not happy, as they have already started working on the next sprint and don't have time to work with the deployment or support teams.

    This is very frustrating to say the least. This topic has been brought up numerous times with product owners, but we are told that there is no time available in the sprint process to test deployment packages.

    From our standpoint, a build automation process should be used so that testing is done with the actual deployment packages. This would catch broken deployment packages before release, and would test the actual end product the customer will be getting. Unfortunately, I have been told this is not possible due to our limited testing capabilities.
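The idea above can be sketched in a few lines: build the package once, install that exact artifact into a clean environment, and smoke-test what was installed rather than the QA merge tree. Everything in this sketch (the toy `run.sh`, the tarball name, the smoke test) is an invented stand-in for illustration; your real packaging, install, and test commands would replace each numbered step.

```shell
#!/bin/sh
# Hedged sketch of "test the artifact you ship", with invented stand-ins.
set -e

WORK=$(mktemp -d)

# 1. Build the deployment package once (stand-in for the real packaging step).
mkdir -p "$WORK/src"
printf '#!/bin/sh\necho app-ok\n' > "$WORK/src/run.sh"
chmod +x "$WORK/src/run.sh"
tar -czf "$WORK/package.tar.gz" -C "$WORK/src" run.sh

# 2. Install that exact artifact into a clean environment, the way the
#    deployment team would at a customer site.
mkdir -p "$WORK/clean-env"
tar -xzf "$WORK/package.tar.gz" -C "$WORK/clean-env"

# 3. Smoke-test the installed artifact, not the source tree it came from.
SMOKE_RESULT=$("$WORK/clean-env/run.sh")
echo "smoke test output: $SMOKE_RESULT"

rm -rf "$WORK"
```

The point of the design is that steps 2 and 3 exercise the same tarball the customer receives, so a package missing merged changes fails here rather than on site.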

    We are just looking for any guidance or experience from others who have been in our shoes on a deployment team. It seems as though the Scrum process puts little emphasis on testing the actual product in its packaged state. I am not sure if this is a flaw of the Scrum process in general or if it is just a symptom of the way that it is currently being implemented here.

    Sincerely,
    Kimberly





    Ian Mitchell
    Advanced Member
    Posts:571
    05 Nov 2013 04:44 AM
    > I am not sure if this is a flaw of the Scrum process in general or
    > if it is just a symptom of the way that it is currently being implemented here.

    It's a flaw in the way it's being implemented. Here's the problem:

    - In order to be considered Done, the increment must be deployed into a particular environment
    - This criterion is not reflected in the Development Team's Definition of Done
    - The responsibility for correct deployment lies with a separate Deployment Team
    - Hence, in this implementation, the Development Team do not own their process. They are unable to take a Sprint Backlog through to completion so that the intended Definition of Done is satisfied.

    > This is very frustrating to say the least. This topic has been brought
    > up numerous times with product owners, but we are told that there
    > is no time available in the sprint process to test deployment packages

    Who is telling you that? The Product Owners? The length of a Sprint is jointly determined by all members of a Scrum Team, not just the PO. If there isn't enough time in the Sprint to satisfy a Definition of Done of suitable quality, then either time (length of sprint), scope (work done in sprint), or cost (developers allocated) must vary.

    > It seems as though the Scrum process puts little emphasis on testing
    > the actual product in its packaged state

    Scrum puts the emphasis firmly on delivering potentially releasable increments of value. In Scrum, it is considered important that each increment be deployed into production so that a return on investment can be earned, and so that real-world feedback can be elicited. Failure to do this will result in technical debt being incurred. This is what the "deployment" team are currently struggling with. They are trying to pay off technical debt that is arising from the shortfall between the Development Team's process and the Definition of Done that is actually needed.
    Joshua Partogi
    Posts:99

    06 Nov 2013 07:49 PM
    06 Nov 2013 07:49 PM
    I am with you, Kimberly. Throwing half-baked work over the fence to another team is definitely not how we work in Scrum. You need to discuss this in their retrospectives. Have you ever attended their Sprint Retrospectives and raised this issue?