The Definition of “Done” is Badly Named!

In Scrum, there is a concept called the “Definition of ‘Done’” that is used to understand what teams produce every Sprint.  Unfortunately, it is not well understood, nor is it consistently applied.  Part of the problem is that the name of the concept is misleading.  Every time we do coaching or training in organizations, we spend an inordinate amount of time getting this concept across clearly.  Recently, one of our coaches, Travis Birch, had an insight about the name: the word “definition” makes us think of something static and permanent.  Definitions don’t normally change.  In Scrum, however, the definition of “done” is always changing.  Doneness is not something that we get perfect at the start; rather, teams develop their capacity to deliver more and more each Sprint… and not just in terms of features.  As I explained in another article called Four Methods of Perfecting Agile, the definition of “done” is something that can and should grow over time.

So what can we do?  Can we rename the concept?  The concept is really a snapshot description of the activities and attributes that a team puts into a Sprint’s worth of work.  For example, at first a team might not be writing automated unit tests.  Therefore, automated unit tests are not part of the definition of “done” – a snapshot of their work at the end of the Sprint does not include automated unit tests.  Then in a retrospective the team decides that they need to create automated unit tests.  They do so in the next Sprint.  Now, automated unit tests are a part of the definition of “done”.  Finally, a few Sprints later, one of the members of the team attends Agile Software Engineering Practices training [shameless plug] and decides that they should start doing test-driven development.  The team learns how to do this and from now on the definition of “done” includes test-driving all production code.
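The progression above can be pictured as a checklist that only grows. The following is a purely illustrative sketch (the class and method names here are hypothetical, not Scrum artifacts) showing how a team’s definition of “done” might expand Sprint over Sprint:

```python
# Illustrative sketch: a team's evolving definition of "done" modeled as
# a checklist of criteria that expands over time. Names are hypothetical.

class DefinitionOfDone:
    def __init__(self, criteria=None):
        # The current snapshot of required activities/attributes.
        self.criteria = set(criteria or [])

    def expand(self, *new_criteria):
        # Retrospectives add criteria; the benchmark only grows.
        self.criteria.update(new_criteria)

    def is_done(self, completed_activities):
        # A work item is "done" only if every criterion was met.
        return self.criteria <= set(completed_activities)

# Sprint 1: no automated unit tests yet.
dod = DefinitionOfDone({"code reviewed", "manually tested"})

# Retrospective outcome: add automated unit tests.
dod.expand("automated unit tests")

# A few Sprints later: test-driven development becomes the norm.
dod.expand("test-driven production code")

print(dod.is_done({"code reviewed", "manually tested",
                   "automated unit tests",
                   "test-driven production code"}))  # True
print(dod.is_done({"code reviewed", "manually tested"}))  # False
```

The point of the sketch is simply that the “definition” is a snapshot of a growing set, not a fixed list.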

A New Name

Let’s try another name for this concept: the “Expanding Benchmark”.  I think this term conveys the sense of the concept much more accurately.  The concept is not expected to be static; rather, as the team overcomes obstacles, automates things, learns new skills, and gains new trust and authority, its work will expand.  Specifically, we are expanding the benchmark of which activities and attributes of the software are delivered at the end of each Sprint.

So – let’s get rid of the Definition of “Done” and start talking about a team’s Expanding Benchmark.  What say you?


11 thoughts on “The Definition of “Done” is Badly Named!”

  1. I don’t personally think of definitions as at all static or permanent. Brief perusal of a good dictionary should be enough to radically change the thinking of anyone who does. Look up the etymology of “nice” some time.

    So I’m not so concerned that the definition of done might change from sprint to sprint. I struggle with the idea of the expanding benchmark, though. I’ve always understood the focus on “done” to be an antidote to the more traditional developer’s syndrome of “it’s done, except for…”, which is great. So the benchmark part sounds good to me. I like the metaphor of a gauge that is applied to the software to determine if it is or is not (and nothing in between) acceptable. On the other hand, it’s not clear to me that a team should need, want, be expected to produce, or benefit from an expansion of the benchmark.

    In fact, almost the opposite: wouldn’t it be great if the shape of the definition of done got smaller and simpler over time?

  2. Keith beat me in pointing out that definitions change over time. Definitions emerge from actual use in a community. So, I think that “definition of done” is an excellent name for that artifact. It’s a snapshot of what that particular community means when they call something “done.” I agree with you, Mishkin, that the definition of done should change over time on a team, but I don’t think anything about the name is an impediment to that happening.

  3. I agree with Keith and Richard. I think the word ‘done’ has a strong connotation of completion that is not implied by ‘expanding benchmark’ and therefore has a stronger resonance with agile teams. In my experience, teams pretty quickly grasp the ‘done’ concept and the fact that it expands over time seems natural to them as they learn more and adapt practices to their context. The challenge is more in the execution of ‘done’ than in grasping the concept.

  4. “Done Bar”… I love the concept, but I think the sound of saying it might lead to odd discussions… just say it five times fast 🙂

  5. I liked the idea of renaming the concept. I have seen many people using the “Definition of ‘Done’” not only to understand what teams produce every Sprint but also to verify what teams produce (from a quality-assurance point of view as well), using it as a checklist to ensure they have not missed anything before making the Sprint output ready for deployment. Instead of the “Definition of ‘Done’”, we are using “Readiness for ‘Done’” to understand and verify what teams produce every Sprint. The readiness criteria cannot always be static; we alter them according to our findings and lessons learnt at the end of each Sprint.

    Most of the time, people use terms such as “Done” or “Completed” to understand and verify the completeness of a Sprint towards its end.

    The term “Expanding Benchmark” also conveys the sense of the concept much more accurately. However, I am not sure how people will use it to verify the output of a Sprint, since we generally use a term such as “Done” for verification/completeness. But it is more helpful and useful than the “Definition of ‘Done’”.

    Aniket Mhala | Agile Coach
    Certified Scrum Master, Certified Scrum Practitioner

  6. I don’t personally think the problem is with the name (“What’s in a name?” after all, as the Bard taught us).

    In my experience, there are two main obstacles to accomplishing what the definition of “done” (whatever you want to call it) is intended for:

    1) Getting the members of the team onside with both determining what work is required before a feature or set of features is ready for the real world AND completing that work each Iteration. This can fail to happen for a variety of reasons, including (but not limited to):

    – disagreements among team members as to the necessity of certain activities or disciplines;
    – lack of expertise in certain areas (e.g. “We really need to have automated tests, but no one knows how to build them so that they don’t fail for environmental reasons or require huge amounts of work to keep current!”);
    – schedule pressure from outside the team causing some members to abandon “best practice” discipline and/or providing the excuse needed by other team members to cut corners in areas in which they hadn’t been completely onside to begin with;
    – a feeling among team members that some of the “done” activities should really be someone else’s responsibility (such as an Integration Lab);
    – lack of access to expensive equipment needed to simulate live-like environments.

    2) Getting those people outside the team who have influence or even control over the teams to buy into the necessity of all of the “done” work. This can fail for a lot of different reasons, too, including (but not limited to):

    – those folks are often on the front lines when it comes to disappointing customers with longer-than-hoped-for schedules and hence have a strong desire to deliver good news only (who wants to tell a customer that she can’t get her new features in time for that Spring campaign?!)
    – there may not exist a good foundation of trust between the agile team and these individuals, in which case “done” work can often be interpreted by those outside the team as “gold-plating”, “busy work” or “padding”
    – people not involved with doing the actual work (of Agile software development) are unlikely to be able to appreciate just how much effort is actually required to take what looks like “a simple task” and get it to production-quality
    – during a transition-to-Agile period, people will often recall traditional (Waterfall) projects with rose-coloured glasses, up to and including reflecting on the length of time that a past feature took to get to “feature complete” (ready to go into QA) while failing to account for the weeks or months that it took to get that same feature through QA and production-ready (hence, the time spent by an Agile team seems so much longer to them)
    – there often exists an undefined sense of “good enough quality” that can cause all kinds of friction between teams and those who interact with them, especially once the can of worms known as “compromising quality to hit a date” gets opened (“I’m willing to accept lower quality, as long as it’s still good enough… but I can’t tell you until you deliver it to me whether or not it’ll be good enough!”)

    Those are just some of the reasons that it can be so difficult to establish that whole healthy Agile definition of “done” culture that we all dream of.

  7. As I understand it, the concept of “done” is that working software is in the hands of its intended users and they are deriving business value from it. Anything short of that isn’t “done.”

    When we introduce agile methods in any given company, the typical case is that our activity is limited to the scope of one or a few development teams. If we’re lucky, we also involve QA and business analysts. If we’re very lucky, we may involve other specialized groups in the organization.

    So, when a development team is using Scrum (or similar) they commit to completing some amount of work per sprint, taking that work all the way to “done.” The problem is, some of the activities necessary to get the work completely into the “done” category lie outside the control of the development team. If we cleaved to the original intent of “done,” then virtually no agile development teams anywhere in the world would have a non-zero velocity, ever. Teams would find that rather depressing. So, we’ve come up with this multi-tiered concept of done-ness. There’s “done,” which often just means the code is ready for acceptance testing; there’s “done-done,” which may mean a story has passed acceptance testing and is now waiting for some downstream group to deal with it (but it’s not yet anywhere near its intended users). I’ve started to see things like “done-done-done” in print lately.

    All of that is a pretense. We’re making up definitions so that we can feel good about the portion of the value stream over which we have some degree of control. “Done” is still what it always was. The true fix to the problem is to address the organizational issues that prevent our delivering working software all the way to the end of the value stream. Redefining “done” so that it matches whatever we’re able to do within present-day constraints doesn’t solve anything. It’s just word-play.

    More textual abuse on the subject may be found here, FWIW:
