
If Duration is historically Derived from Complexity Estimation...

Last post 09:50 pm March 1, 2021 by Ian Mitchell
3 replies
11:39 pm February 28, 2021

Why don't Scrum Teams always make this a retrospective item, to look at how long it took to do something relative to the complexity they assigned it in planning?

This is not something I have ever heard any Scrum Master talk about doing... Is this because I was talking to the 'wrong' Scrum Masters?

Thoughts?


08:47 pm March 1, 2021

Why don't Scrum Teams always make this a retrospective item, to look at how long it took to do something relative to the complexity they assigned it in planning?

I see this as a great avenue for surfacing many of the topics that would emerge at a Sprint Retrospective anyway, but in a data-driven way. That said, I prefer that Scrum Teams allow themselves the freedom to use the Sprint Retrospective in other ways if they consider that more effective.

I frequently discuss cycle time with my teams.

For refinement, I advise the developers to first look at their cycle time at a chosen percentile threshold (e.g. 85% of items have a cycle time of 7 calendar days or less). I then recommend that they ask themselves whether they believe the item being refined can be Done within that number of days. If not, they should reduce its complexity or size in some way, so that they are confident the item can be Done within that time.

This means for that item, one of two things will happen. Either:

  • It will be Done within that number of days, and therefore contribute to a more predictable (and probably lower) cycle time.
  • It will take longer than the predicted number of days to be Done, which gives the developers a perfect example to inspect why they were wrong on this occasion, and gather data for subsequent adaptation.

I generally advise the teams to inspect at least the slowest few examples over a sprint, so that they can look for patterns in what is causing slowness in the most extreme cases.

This could be a result of complexity, but it could also be the result of items waiting in a queue (e.g. Ready for Peer Review), ineffective refinement, the need for rework, dependencies on others, or decisions to shift focus once work on an item had already begun.
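
To make this concrete, here is a minimal sketch in Python, assuming a simple record of completed items with start and finish dates; the item names, dates, and the 85% threshold are invented for illustration. It computes the cycle time within which roughly 85% of items were Done, and lists the slowest few items as candidates for inspection at the Sprint Retrospective:

    from dataclasses import dataclass
    from datetime import date
    from statistics import quantiles

    @dataclass
    class CompletedItem:
        """A finished Product Backlog item; fields are illustrative, not a real tool's API."""
        name: str
        started: date
        finished: date

        @property
        def cycle_time_days(self) -> int:
            # Calendar days from starting work to Done, counting the finish day.
            return (self.finished - self.started).days + 1

    # Invented history of recently completed items.
    history = [
        CompletedItem("Login audit log", date(2021, 2, 1), date(2021, 2, 4)),
        CompletedItem("Export to CSV", date(2021, 2, 2), date(2021, 2, 10)),
        CompletedItem("Fix currency rounding", date(2021, 2, 8), date(2021, 2, 9)),
        CompletedItem("Data migration script", date(2021, 2, 10), date(2021, 2, 22)),
        CompletedItem("Peer review checklist", date(2021, 2, 15), date(2021, 2, 18)),
    ]

    cycle_times = sorted(item.cycle_time_days for item in history)

    # quantiles(..., n=20) returns 19 cut points; index 16 is the 85th percentile,
    # i.e. the number of days within which roughly 85% of items were Done.
    threshold_days = quantiles(cycle_times, n=20, method="inclusive")[16]
    print(f"85% of items were Done within about {threshold_days:.0f} calendar days")

    # Slowest few items: candidates for inspection at the Sprint Retrospective.
    for item in sorted(history, key=lambda i: i.cycle_time_days, reverse=True)[:3]:
        print(f"{item.name}: {item.cycle_time_days} days")

Calendar days are used here, rather than working days, simply to match the 7-calendar-day example above.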


09:07 pm March 1, 2021

I disagree with duration being based upon complexity. Many simple tasks can take much longer than complex ones, because of the need to wait on dependent tasks or for a long-running task to complete before others can be done. For example, starting a script to do a data migration may take a minute, but the script could run for hours. Complexity-wise, starting a script is pretty simple, and waiting is even simpler. I think you might never see this come up in the manner you describe because people realize their mistakes and don't want to revisit them.

In teams I have worked with, that never came up, because the story was discussed every day in the Daily Scrum. During refinement activities, lessons learned from past refinement are applied. During Sprint Planning, history comes into play as they plan. Honestly, revisiting an estimate after the fact does not have much value, because you will never experience that exact situation again. Just the act of doing work means you are learning. Each time you do a task, you get better at doing it, and if you aren't, you really should think about why. Estimates are guesses based upon the information you know at the time you make them; after that point, the exact same situation will never occur again.

As @Simon Mayer points out, discussion of trailing indicators such as cycle time is much more beneficial. Those metrics are based upon actual work rather than guesses about work, and they incorporate learning as you go without requiring special consideration for it.


09:50 pm March 1, 2021

Why don't Scrum Teams always make this a retrospective item, to look at how long it took to do something relative to the complexity they assigned it in planning?

Perhaps they are more interested in meeting their Sprint Goals, and in managing their workflow for that purpose. Product Backlog items might have been assumed to be more or less of the same size, and not estimated differently at all.

