Rough estimation of high-level ideas
I've run into a methodological (and slightly ideological) issue with the estimation of barely refined requirements. Our team works on the implementation of an ERP system. We currently run all the Scrum events (including Backlog Refinement), which we use to discuss and analyze requirements that have already passed the initial architectural alignment phase and are reasonably clear and more or less ready for delivery, or at least for solution design.
Over time we have noticed that it's highly beneficial for the PO and Business Analysts to have initial estimates on high-level ideas, so they have at least a faint idea of the trade-off in relation to business value. For that, we've set up a separate ceremony called "Business Requirements Gates". Since "gates" come from waterfall, I was wondering whether there is a similar practice in Scrum, whether some of the Agilists out here use other ceremonies to achieve the same goal, or whether you have other good practices to share?
Refinement is where this has occurred for every team I have worked with. If little is known about the item, it should be really easy to provide an estimate, which will usually be pretty large. But you have to understand that the estimate is going to be wrong, and in some cases very wrong.
I had one team that took a somewhat unusual approach. They used Story Points for their estimates. But to give the PM some idea for forecasting further out, they would use t-shirt sizes for those items where there was minimal knowledge. This helped them separate refinement of items soon to be addressed from those that are still in the early justification stages.
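To make that concrete, here is a minimal sketch of how t-shirt sizes can coexist with Story Points for coarse, long-range forecasting. The size-to-point-range mapping below is purely hypothetical; every team would calibrate its own:

```python
# Hypothetical mapping of t-shirt sizes to Story Point ranges,
# used only for coarse forecasting of barely refined items.
TSHIRT_TO_POINTS = {
    "S": (1, 3),
    "M": (5, 8),
    "L": (13, 20),
    "XL": (40, 100),
}

def forecast_range(sizes):
    """Return the (optimistic, pessimistic) total point range
    for a list of t-shirt-sized backlog items."""
    low = sum(TSHIRT_TO_POINTS[s][0] for s in sizes)
    high = sum(TSHIRT_TO_POINTS[s][1] for s in sizes)
    return low, high

# Example: three loosely understood ideas on the roadmap.
print(forecast_range(["M", "L", "XL"]))  # -> (58, 128)
```

The wide ranges are the point: they signal to the PM that these numbers are for rough prioritization, not commitment, which is exactly the separation the team above was after.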
Thank you, Daniel. The t-shirt sizing approach sounds neat, I will give it a go in the near future!
I am wondering if in your practice you've had cases when architecture and the solution itself is designed prior to splitting an Epic into User Stories? What is your take on such an approach, when the solution is designed prior to delivery, and User Stories pretty much just identify the pieces of deliverables that need to be developed? I understand it may be fundamentally not "Agile" as the continuous learning aspect is diminished, yet it is often the reality of large implementation projects (in my case an ERP implementation project).
The reality you describe appears to be one of a stage-gated project, over which pseudo-Scrum “ceremonies” have been layered as a sort of veneer. In order to advise you regarding such an approach, can you clarify the benefits agile practice is expected to bring in this case?
I see your point here, and that's the main reason I've started this discussion. The desire is to have as short a lead time as possible, and we see that we can start working without having very detailed designs, learn as we progress, and deliver business value more frequently than with stage-gates. The issue is that we are not developing the product by ourselves alone; it is developed by multiple other teams, and at least some alignment on the architectural and solution level is required. All things considered, I feel that there must be a better way, and I'd love to hear more from you. :)
My advice would be to gain that alignment precisely by having shorter lead times.
That could involve multiple teams collaborating to deliver an MVP within a Sprint time-box. Scaled Professional Scrum (Nexus) is a suitable framework for this. The Sprint can be seen as a forcing-function which drives not only the continuous learning you refer to, but transparency over alignment as well. The challenge usually lies in developing the product literacy within the organization to identify an MVP candidate in the first place.
Generally speaking it is best to start with one team first, and only scale if absolutely necessary. You could therefore start with one team, and try to secure enough executive sponsorship to have that team's dependencies prioritized by others. I've seen this work a number of times, although usually adequate sponsorship only materializes if the team is responding to a major incident. The competencies that are required by a cross-functional team to deliver "done" work are quickly thrown into relief.
So basically your idea would be that the leading delivery team, solution architects, the Product Owner, representatives from other teams and any other stakeholders (if necessary) would collaboratively agree on an MVP candidate in a Sprint time-box (ideally I suppose they should deliver it in that time-box as well, though in our case even agreeing on the solution would be a leap forward)?
I will definitely check out Nexus, thank you for the reference.
I would suggest avoiding an MVP that becomes the minimum amount of effort to demonstrate to stakeholders some sort of stage-gated progress towards a fixed solution. It should instead be the minimum amount of effort to obtain valuable empirical evidence from the marketplace.
I see what you're saying and it's a valid point. I agree that your approach would be the desired state. However, in your experience, does that work with delivery of increments of a large solution (10+ teams working on the same system)?
Try looking at it the other way round. What's the smallest useful learning experiment that can be conducted with the smallest number of teams?
Okay, I see that I have some homework to do. :) Really appreciate your comments and responsiveness, guys! Hopefully this discussion will be helpful for someone in the future.