Agility Path: 3 Basic Questions
I have some questions about the Agility Path framework which I hope someone can answer. The reference I am using is the scrum.org Agility Guide.
Question: What is the co-ordination policy for Agility Teams regarding the Practice Backlog?
Context: It seems that, although there can be multiple Agility Teams, there is only one Practice Backlog. This implies that each team should attend the same Sprint Review, since the Practice Backlog can be reordered and revised in that event. It further implies that the estimation of each backlog item must be a joint activity between teams. Assuming these implications are true, is it then the case that *all* events (Planning, Weekly Scrum, Evaluation, Review, Retrospective, perhaps even the Sprint time-box itself) are shared across all teams? If not, what is the guidance on how Agility Teams should conduct their events independently without compromising the integrity of the Practice Backlog?
Question: How are multi-sprint objectives managed?
Context: The Agility Guide says that it is possible for objectives to span multiple sprints. How do these relate to Sprint Goals, which may be evaluated in terms of a time-box? Who is accountable for setting these longer-range objectives? How are they evaluated, and when and by whom? Are there implications for the long-term batching of work across multiple sprints, if value cannot be released until these objectives are met?
Question: What is the policy regarding "done"ness and debt?
Context: There is no Definition of Done for backlog items in Agility Path. However, there is an Evaluation Backlog containing "the set of practices that need to be re-implemented based on their observed impact on the organization. The Change Team captures previously implemented practices it wants to strengthen." Isn't the Evaluation Backlog just a sink for undone work, that is to say, process debt? Why isn't there a concept of "Done" similar to the one that has proven so invaluable in Scrum?
The Agility Guide is an additive framework, much like Scrum. There has been a conscious choice within it to describe boundaries or constraints, so that each organisation mindfully implements adaptations suitable for it.
Multiple teams would scale in a similar way to scaling a Scrum Product Backlog: higher-level items on the Practice Backlog would be decomposed into smaller items on the Practice Backlog. It would be beneficial to have organisationally cross-functional Practice teams rather than teams drawn from a single business area.
Long-running objectives: given the inherent complexity and uncertainty of organisational change, there will be objectives that take longer than one month to achieve. There should be clear, objective criteria that can be used to assess progress towards the desired end state. As this is organisational change, progress will be continuous, enabling the business to improve the flow of value to the customer. The objective is to move the organisational mindset into a state of continuous improvement, rather than aiming to achieve a final state.
There should be clear achievement criteria; however, you are never done improving.
> It would be of benefit to have organisationally cross functional
> Practice teams as opposed to teams from one business area
Please can you clarify what you mean by "Practice Team"? Is that an Agility Team or Change Team? Or is it a Scrum Team, i.e. one of the teams that will ultimately apply the practices being rolled out?
Interestingly, it seems that Agility Teams are deliberately *not* cross-functional. Agility Path provides for separate business functions (domains) each of which can have at least one Agility Team.
> As this is an organisational change, the progress will be continuous...
This is quite different from Scrum, where the delivery of value occurs in discrete increments. Thanks for clarifying.
> there should be clear achievement criteria, however you are never done improving.
True, but it then raises the question of whether objectives that span multiple sprints can ever be achieved, or more specifically, whether the work remaining for them can ever be summed. I would struggle to provide a figure for that, or a burn rate, if I knew undone work was lurking in the Evaluation Backlog.
My mistake, I meant Agility Team. You will need to scale these as required. They will be responsible for helping the organisation improve, thereby creating a better space for the Scrum Teams.
I would disagree about the guidance on teams not being cross-functional: an Agility Team consists of a Product Owner, Change Team, and Scrum Master, and the team is self-organizing and cross-functional. The guidance regarding Domains is around the value proposition. It is the same principle that we coach in Scrum, with teams that can support vertical slicing. I appreciate that the term Domain may confuse rather than clarify.
With work spanning multiple sprints, I would encourage treating it like a very large Product Backlog Item (an Epic), decomposing it to the point where it is deliverable in a Sprint. You can size the items and, in Sprint Planning, decompose them further and then burn them down. The main caveat is that it may take several sprints to see the effect of the change programme, and the metrics being used may often be lagging indicators. This is where using a subjective measure as an interim sanity check may be viable.
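As a rough sketch of the sizing and burn-down arithmetic described above (the item names, point sizes, and sprint numbers are invented purely for illustration):

```python
# Sketch: a multi-sprint objective decomposed into sprint-sized items,
# then tracked via remaining work and burn rate. All data is illustrative.

# Each tuple: (practice item, size in points, sprint in which it was done;
# None means not yet done).
items = [
    ("Adopt trunk-based development", 5, 1),
    ("Automate release pipeline", 8, 2),
    ("Introduce pairing in Team A", 3, 2),
    ("Roll out pairing org-wide", 13, None),
]

total = sum(size for _, size, _ in items)
done = sum(size for _, size, sprint in items if sprint is not None)
remaining = total - done
sprints_elapsed = max(s for _, _, s in items if s is not None)
burn_rate = done / sprints_elapsed  # points completed per sprint

print(f"Remaining: {remaining} of {total} points")
print(f"Burn rate: {burn_rate:.1f} points/sprint")
print(f"Forecast: ~{remaining / burn_rate:.1f} more sprints")
```

Note that the forecast is only as honest as the backlog: if "undone" items sit in the Evaluation Backlog instead, they never enter `remaining` and the figure is flattering.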
The intention of the guide is to remind people that change is difficult, needs to be structured, and that a template or cookie-cutter approach may not deliver the profound change in organisational mindset that is needed. If you bury yourself in the immediate numbers, you may not even see the competition passing you by.
> ...the metrics being used may often be a lagging indicator
If lagging indicators are supported then I can see the purpose in having an Evaluation Backlog. It means that work cannot be perceived as being "done" until subsequent sprints, and so some sort of debt register will be needed.
I think this might prove to be a bit controversial though:
1. The Evaluation Backlog could be cited as evidence that Agility Path *systematises* the incurring of waste. There's nothing in the Agility Guide which says that this backlog should be proactively reduced, or even challenged.
2. The use of lagging indicators means that a team cannot inspect the output of each sprint and adapt its process. There will be a delay in obtaining actionable metrics. This could also be seen as systematising waste, and as negatively impacting the process of validated learning.
Hopefully none of the above is the case, and I have got the wrong end of the stick entirely...
There may be a lagging metric, however it should not be the only metric. There will be supporting metrics that are a combination of subjective and objective metrics.
If all your metrics for an aspect of the change are lagging, then this is the dangerous situation described above, in which waste is being built in.
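That hazard can be expressed as a simple check over your metric set; a sketch, with invented aspect and metric names:

```python
# Sketch: flag change aspects whose metrics are *all* lagging indicators.
# Aspect and metric names are invented for illustration.
metrics = {
    "release cadence": [
        ("deployment frequency", "leading"),
        ("customer-reported defects", "lagging"),
    ],
    "team morale": [
        ("staff turnover", "lagging"),
        ("exit-interview themes", "lagging"),
    ],
}

# Aspects with no leading (or subjective, interim) signal at all:
# these cannot be inspected sprint by sprint, so waste can build unseen.
flagged = [
    aspect
    for aspect, ms in metrics.items()
    if all(kind == "lagging" for _, kind in ms)
]
print("Only lagging indicators:", flagged)
```

In this illustrative data, only "team morale" would be flagged; adding a subjective interim measure (such as a regular team survey) would clear it.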
The backlog should be treated as a backlog, and continuously refined to ensure that it is in a healthy (refined and ordered) state, ready for work. It should not be over-refined, however, so that effort is not wasted on refining items that will not be done in the near future.
The Sprint output (increment) will be visible at the end of the Sprint, and all of the metrics should be reviewed. Any lagging indicators will be flagged to be monitored.
All of the Lean principles should be applied, and if there is something that you intuitively feel is the right way, you should use your experience to guide you. The objective is to help the organisation continually improve its ability to respond to a changing market, while improving the flow of work from idea to implementation.