
From Chaos to Control, Part 3 - Done

April 22, 2026
Recap

In the last post we established transparency over the amount of work the team has in hand. We put together what was essentially a "To Do" list -- "To Do", "Doing", and "Done" -- and gained an understanding of what it means to be working on something "right now". From that point we were able to start establishing a degree of discipline and rigor by deliberately limiting the amount of work in progress (WIP). We looked at the "avatar" technique and used it to limit WIP to no more than the number of people we actually have on the team. In other words, we achieved focus, with a clear emphasis on actually completing work, so we could "stop starting, and start finishing". With that came a hint of predictability and flow, so evidence-based forecasts can be made.

Basically, these are the shifts in mindset we've gone through:

  • From Internal to External: Success isn't measured by moving a card on a board, but by putting a feature in a user's hands.
  • From Opinion to Evidence: We’ve replaced internal assumptions with real-world performance data.
  • From Output to Outcome: It’s not just about finishing the work; it’s about the work being usable.


The next step is to tighten all of this up by clearly relating "Done" to "usable", and understanding what that actually means.

What does "Done" mean?

We discovered that our team had conflicting perspectives on what "Done" actually looked like. While a team's "To Do" list is a solid start for visualizing workflow, its value is limited without a clear understanding of usability. We talked about this, and reached a vital conclusion: a task is only truly finished when it provides something that can be put into the actual customer's hands. It has to be of that quality. So for us, "Done" means the product can be deployed into a customer environment with no remaining barriers. We can then immediately gather the real-world feedback we need to measure performance. You see, if quality is assured, we can eliminate it as a variable. We don't have to worry about the quality of a piece of work being the thing that lets us down. We know it's been usability tested, integration tested, regression tested, documented, and so on.
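One way to make a Definition of Done like this concrete is to write it down as an explicit, checkable policy rather than a shared assumption. Here is a minimal sketch in Python: the check names mirror the criteria above, but the `Task` class, field names, and `DEFINITION_OF_DONE` tuple are hypothetical illustrations for this post, not part of any real tool or our actual tracker.

```python
# A hypothetical sketch: the team's Definition of Done expressed as an
# explicit checklist that acts as a quality gate on the "Done" column.

from dataclasses import dataclass, field

# The quality gates a task must pass before it may enter "Done".
# Names mirror the criteria discussed in the post.
DEFINITION_OF_DONE = (
    "usability_tested",
    "integration_tested",
    "regression_tested",
    "documented",
    "deployable",  # can go into a customer environment with no barriers
)

@dataclass
class Task:
    title: str
    checks_passed: set = field(default_factory=set)

def missing_checks(task: Task) -> list:
    """Return the Done criteria this task has not yet satisfied."""
    return [c for c in DEFINITION_OF_DONE if c not in task.checks_passed]

def is_done(task: Task) -> bool:
    """A task is Done only when every quality gate has been passed."""
    return not missing_checks(task)

# Example: a task that is integrated and documented, but not yet
# usability/regression tested, is still not "Done".
task = Task("Export to CSV", checks_passed={"integration_tested", "documented"})
print(is_done(task))          # not yet Done
print(missing_checks(task))   # the gates still to pass
```

The point of the sketch is simply that "Done" stops being a matter of opinion: the checklist either passes or it doesn't.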

The Always Integrated mindset

We continuously evolve our product to meet shifting customer needs. For us, "Done" signifies a feature is fully integrated, tested, and ready for deployment. Before closing any task, we verify that it contributes to a stable release and ensure no technical debt remains that might necessitate future rework.

Before we move a card to the finish line, we ask:

  • Should it still be released? Market conditions can change overnight. 
  • Does this still strengthen the product?
  • Are we doing it right the first time to avoid rework and headaches later?

In summary: "Done" is a quality check

So really, we've learned to think of the "Done" column as our primary quality gate, our quality assurance guarantee. By moving beyond a simple "To Do" list and introducing explicit policies and checkpoints, we've brought valuable rigor to our process. Using "Done" as a quality check has taken us a significant step towards a more disciplined workflow.

The next step is to audit our process and see where else we can add value-driven guardrails. We'll evaluate our entire workflow together and identify any other "rules of the road", or policies, we can implement to keep things running smoothly and to maintain, and further improve, these high standards. That's what we'll look at in the next post.


