The Sprint Review is often referred to as the Sprint Demo: the meeting where the team demonstrates what it has done in the past Sprint, and nothing more than that.
This is a gross underinvestment in your own Scrum Team and its success. The Scrum framework is founded on empiricism, with three main pillars:
- Transparency
- Inspection
- Adaptation
If we limit ourselves to just running a demonstration of the past Sprint, we stop at the inspection point. We’re not adapting. It’s one-way communication: we throw our information at those present in the event but don’t incorporate the feedback into our Product Backlog. One of the lines in the Agile Manifesto reads “Responding to change over following a plan”. If we don’t gather feedback and change our path accordingly, we’re not adapting.
Imagine it in a different, more tangible (and horrible) scenario: your house is on fire. That's pretty transparent, everyone can see it. You’re screaming at the top of your lungs, “MY HOUSE IS ON FIRE”. And then… nothing. Your house just burns down as you watch, failing to adapt to the situation.
No one wants that, and everybody would call the fire department, right? Or at least get some water to make a start. That’s adapting. Not adapting to the circumstances is the equivalent of allowing our product to break down.
There are multiple opportunities to inspect and adapt, but Scrum talks about five events. In this article, I share eight things that truly make the Sprint Review an “inspect and adapt” event.
1. Put the product in the hands of your user and stakeholders as much as possible
Demonstrations are often limited to a one-way PowerPoint presentation. This is (a) boring, and (b) tells us nothing about the experience the users have. While developing the product, we design it and gather knowledge about how it works. We already know how the Increment works and how we should use it.
The users themselves, on the other hand, do not. Maybe we assumed the new UI would be very intuitive, only to find it plays out very differently when our users are working with it. We can only validate our assumptions by putting our product in the hands of the users and (relevant) stakeholders as much as we can.
2. Gather feedback
Now that we’re past the demo part, we want to know how our stakeholders feel about the new Increment. This Increment is already integrated with all the Increments created in past Sprints, so our stakeholders now have a complete overview of the whole product, instead of just the new Increment.
You create transparency this way. Stakeholders now have the right information for decision-making and can therefore provide meaningful feedback. Listen to what they’re saying, but also to what they’re not saying. It‘s generally a good idea to predefine what kind of feedback you’re looking for; if it doesn’t come up naturally, you can ask for it. When the question comes from the developers, it shows interest in the stakeholders’ opinions and makes them feel heard. It lands differently than when everything comes from the Product Owner.
I once worked with a team where one person consistently worked with the stakeholders and users while the others developed the product. After a while, the positive feedback coming from those stakeholders no longer felt heartfelt, because that person was the intermediary; it was always the same person relaying the information back to the developers. But when the developers heard it from the stakeholders themselves, it really put a smile on their faces. Direct communication beats having people in between, or always putting the same person in front of the group.
3. Show progress relative to the Product Goal
The whole development effort started because there was a problem to solve, an experience to make more pleasant. A goal has been set to realize that. Yet I rarely see the relative progress toward that goal being discussed during the Sprint Review. This could be done in the form of a roadmap, but it doesn’t have to be.
Even discussing big blocks of work can do the trick. The gist of it is to create transparency and enable better decision-making. If we have until the end of the year to get our product live, and we have four months left, we can decide how to order our Product Backlog to achieve maximum value in pursuit of the Product Goal.
This way we can also discuss what NOT to do. Remember to inform the stakeholders who brought in the Product Backlog Items that you removed. This, too, creates transparency and shows respect.
4. Discuss what to do next
Now that the state of the product is clear, we’ve gathered feedback, and we know where we stand on the path toward our Product Goal, the next thing to do is to discuss, again with our stakeholders, what to do next.
If this hasn’t been brought up during any of the prior topics, assess a few other points as well:
- Have any market circumstances changed?
- How is the budget?
- Any changes in the team’s composition?
- Any assumptions that have proven to be wrong?
The sum of all these points forms the core of what might make the most sense to do next. This doesn't mean that Sprint Planning takes place then and there; it’s a high-level discussion whose conclusions you use as input during Sprint Planning.
5. Adapt the Product Goal and Product Backlog
As mentioned before, the Product Goal can be adapted. Disproven assumptions could necessitate changes to the Product Backlog, and the same goes for the Product Goal. Neither is set in stone, and that is the whole point of agility: responding to change whenever it’s needed.
These five topics are the essentials of the Sprint Review. There’s a lot more we could add; here are a few additions that have proven useful in my experience:
6. Show (useful) metrics
Agile metrics have been a hot topic in the past few years. What do you measure, but more importantly, why? The Evidence-Based Management framework by Scrum.org is a useful place to gather more ideas on that.
What you measure depends on the context and the product. Some metrics that I personally find very useful:
- Feature usage — are our features actually being used as expected? I worked with a team that delivered a highly anticipated feature, exactly as the stakeholders wanted it. After three months, the data showed they were rarely using it.
- Happiness Index — for both the team and the stakeholders. The team can be very happy while the stakeholders aren’t, or vice versa. Both are useful input for seeing if, what, and where we need to improve.
- Technical debt — this might seem a little strange to show, but the amount of technical debt helps the stakeholders understand the level of quality and where, as a group, we should invest. If you have heaps of technical debt, it might make the most sense to spend a fair chunk of the coming Sprint(s) resolving it.
There are many more metrics I like to see during the Sprint Review, but for the sake of this article’s length, I’ll stick to these three.
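As a rough sketch of the feature-usage metric above — assuming your analytics export a simple list of usage events, one feature name per logged use (the event data and function here are hypothetical, not from any specific tool) — computing each feature’s share of total usage could look like this:

```python
from collections import Counter

def feature_usage(events, features):
    """Return each tracked feature's share of all logged usage events.

    `events`: a list of feature names, one entry per logged use.
    `features`: the set of features we want to report on.
    Both are hypothetical stand-ins for your real analytics data.
    """
    counts = Counter(e for e in events if e in features)
    total = len(events)
    return {f: counts[f] / total if total else 0.0 for f in features}

# Example: the highly anticipated "export" feature is barely used,
# while "search" dominates — the kind of signal worth showing
# at the Sprint Review.
events = ["search"] * 90 + ["export"] * 2 + ["search"] * 8
usage = feature_usage(events, {"search", "export"})
```

A table or chart of these shares, shown Sprint over Sprint, tends to spark far better "what next" discussions than a feature checklist.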
7. Keep it informal
The Sprint Review is not a formal handover whatsoever. It’s supposed to be light, fun, and informal. Treat it as such. Bring in some drinks and snacks, create breakouts where it makes sense, and bring in the right people.
8. Mix up facilitation
Repeat after me: “The Scrum Master is not the default facilitator”. In nearly all of my teams, this was the default expectation when I came in, yet nowhere in the Scrum Guide is it mentioned. The Scrum Master can facilitate, but nothing stops anyone else from doing so. A change in facilitation technique or voice can be refreshing, and facilitation is a very useful skill for people to develop. It can even boost the self-esteem of people who aren’t used to being in front of a group. The same goes for the Product Owner, by the way: it doesn’t have to be the same person over and over again. It’s a collaborative event, and multiple people can even share parts of the facilitation.
Again, don’t treat the Sprint Review as just a demo. As this article illustrates, there is so much more to it. The purpose of the Scrum framework is to deliver value and minimize risk. When we fail to inspect and adapt, we fail to minimize risk and maximize the amount of value we can truly deliver.
Run a demonstration, putting the product in the hands of the actual stakeholders as much as possible; gather feedback; check the progress toward the Product Goal; discuss what to do next; and adapt the Product Goal and Backlog wherever needed. This way, the probability of actually delivering value becomes a lot higher.
What are more ways you use in your Sprint Reviews?