Burndown Chart Cliff

Last post 11:46 am February 2, 2022
by Alexander Wassiljew
5 replies
07:43 pm January 14, 2022

Hello everyone! 

Our team follows a two-week sprint where:

  • Development occurs during week 1
  • QA happens early week 2
  • Acceptance testing happens mid to late week 2
  • Release happens late week 2

Jira tickets are considered "Done" (and therefore burned down) after acceptance testing passes. This causes our burndown to be pretty flat the first 5-8 days of the sprint, and then we have a large burndown cliff at the end of the sprint when tickets are accepted. 

Curious to hear everyone's thoughts on this process. It bothers me every sprint that we don't have a steady downward slope, but I don't see how that would work with our current process.

Has anyone else had this issue (not even sure it's a problem necessarily - maybe just an annoyance?) and what have you done to improve it?

[Attachment: Sample Burndown]
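To make the shape concrete, here is a minimal sketch with hypothetical numbers (not taken from the actual board): a 10-day sprint of 20 points where everything is accepted at the very end, versus the same work reaching Done steadily:

```python
# Hypothetical 10-day sprint with 20 story points total.
DAYS = 10
TOTAL = 20

# "Cliff" pattern: nothing counts as Done until acceptance testing
# passes on days 9 and 10, so remaining work stays flat, then drops.
cliff = [TOTAL] * 8 + [10, 0]  # remaining points at end of each day

# "Steady" pattern: 2 points reach Done every day.
steady = [TOTAL - 2 * (d + 1) for d in range(DAYS)]

print(cliff)   # [20, 20, 20, 20, 20, 20, 20, 20, 10, 0]
print(steady)  # [18, 16, 14, 12, 10, 8, 6, 4, 2, 0]
```

Both series burn the same 20 points; only the timing of "Done" differs, which is exactly what the flat-then-cliff chart is showing.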

05:02 pm January 16, 2022

Has anyone else had this issue (not even sure it's a problem necessarily - maybe just an annoyance?) and what have you done to improve it? 

Looks like you're handling your Sprint as if it were a mini-waterfall. What do you think that means for risk control, and might an improved focus on finishing work help?

05:22 pm January 16, 2022

Ian points out one problem - by doing development on all of the work and then QA on all of the work, you end up with a waterfall. Is there any reason why you're doing this, rather than getting each unit of work (Product Backlog Item) designed, developed, and tested whenever it's done?

I'm also curious about the acceptance testing. Who is performing that? If it's the Product Owner - someone embedded within the Scrum Team - I'd make the same comments as the QA work above in that you have a waterfall. Not a good way to get feedback on work, since you're always getting feedback on a huge chunk of work. However, if it's external users, I'd recommend getting that out of your Definition of Done so your Scrum Team isn't beholden to someone else's schedule and workload when it comes to getting work Done.

04:28 pm February 1, 2022

Thank you both for the helpful responses. 

However, I am struggling to see how we can break this cycle. We plan the sprint and it begins with new PBIs that the team works on - this takes time, so there usually isn't any QA testing until 3-5 days into the sprint. The same delay then pushes out acceptance testing as well.

We also have a platform with a lot of interrelated pieces of work, so when we've tried breaking tickets down into smaller pieces, those pieces usually have dependencies blocking them from QA / acceptance testing, and we end up in the same situation. 

I know there is a solution here - just really having to give it a lot of thought, so would appreciate any further suggestions! 

08:05 am February 2, 2022

Well, if you start a Sprint with all brand-new PBIs that each take 3 to 5 days of work, you will have a plateau at the beginning of the Sprint. There is no way around that, and it is a perfectly valid explanation if someone asks.

So, I see some options:
1. Reduce the waterfall: building on what others already said, I hope these are not "batches", but individual items that just happen to have roughly the same length for each "step". What seems to be happening is that each developer picks up an item, works on it, passes it to the next step, and then all the developers end up working within that same "step" together.

2. Reduce stepping: is it possible for QA to start earlier? Does it have to wait for development to be finished?

3. Kanban's slogan "stop starting, start finishing": can you reduce the number of PBIs being worked on at the same time? Can people work together on one item and thus get it to Done sooner? (Maybe a WIP limit can help here?)
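Option 3 can be illustrated with a toy model (all numbers hypothetical, and assuming work divides perfectly across people, which real swarming never quite does): a team of 4 finishing 4 items of 8 person-days each, either one item per person or swarming on one item at a time:

```python
# Toy model: 4 items, each needing 8 person-days; a team of 4 developers.
# Assumes perfectly divisible work -- an idealization; real swarming has overhead.
ITEMS, EFFORT, TEAM = 4, 8, 4

# One item per person: every item finishes on day 8, so the burndown
# stays flat for 7 days and drops from 4 items to 0 on day 8.
parallel_done_days = [EFFORT] * ITEMS  # [8, 8, 8, 8]

# Swarming with a WIP limit of 1: the whole team finishes one item every
# EFFORT / TEAM = 2 days, so items reach Done on days 2, 4, 6, 8.
swarm_done_days = [(i + 1) * EFFORT // TEAM for i in range(ITEMS)]  # [2, 4, 6, 8]

print(parallel_done_days, swarm_done_days)
```

Total effort is identical in both cases; swarming simply moves the first "Done" much earlier, which is what turns the cliff into a staircase.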

11:46 am February 2, 2022

Hello Donald,


Everything said above is completely right; I have nothing to add here.

I have the same issue in my team right now, and I am trying to resolve it as well.

I found a pretty nice video that describes clearly what happens and visualizes it.

Please let me know whether external links conform to the Terms of Use. I did not find anything that says otherwise.