Often I get called in to coach teams who have been trying to implement Scrum and are struggling to deliver on their commitments by the end of the sprint. They agree on which features should be achievable, but somehow things don't go to plan. It looks like the problem is testing, but there's more to it than that. Well-known agile expert Jeff Patton illustrates the underlying reality using the following picture. He has used this sketch to support arguments for moving to a Kanban-style approach (which also focuses teams on what Done means).
One source of the pain is that team members lack a shared agreement on what quality level they're aiming for. Some programmers may be writing automated unit tests; others may be delivering code without checking that it works. Some testers may talk to programmers as soon as they find problems; others may spend a lot of time filing lengthy defect reports. Soon a blame game starts.
Instead, if we establish a shared "Definition of Done", we have a better chance of understanding what needs to happen during the sprint to get the features properly ready ("Done-Done").
That sounds fine and dandy, but just how do we go about creating such a Definition of Done in our team? Here are some simple steps that you can try:
- Surface the current practice (Always, Sometimes, Not Yet)
- Discuss what Definition of Done feels practical for the team
- Resolve any puzzles and concerns before putting it into action
- After a couple of weeks, repeat from step 1.
I've posted some pictures below from a conversation with five developers in a small non-profit organisation to show you what this looks like.
Step one is to invite everyone in the team to write sticky notes about what they do now. Expect to get some duplicates. You'll also find that there is some discussion about which category the notes should go in. Most often these debates are resolved by putting the notes into the Sometimes category (if it really were Always or Not Yet, then why the debate?).
It's now clear to the team that they have a pretty minimal test strategy, and also that people on the team have different assumptions about what needs to be done. No wonder there's confusion. Now it's time to turn the conversation to working out a basic Definition of Done that they can start using tomorrow. Help the team get clear on what each proposal means to them and whether they are really ready to do it yet. It's best not to be too ambitious; remind them that they can improve it over time.
The team decided to explore this basic proposal for a new Definition of Done.
Then we explored what this would actually mean for the team. We needed this to become a shared working agreement that every team member was happy to implement. So you may be surprised to hear that we didn't vote on it. Instead, we worked towards consensus by checking for concerns. The pink notes represent each element of Done. The green notes are questions and concerns from the team. We put these in two columns: "What we do...", elaborating detail about what this means (who/what/when), and "How we check it's being done", about making this activity visible.
Don't forget that step 4 is to review how the new Definition of Done worked for the team. Agree a date by which this will happen; a natural point is the next retrospective, but the team may prefer to wait longer to allow the new approach time to bed in.
Hopefully, you find these tips useful. Please do add your own suggestions for other ways of doing this as blog comments below. And, yes, I know that there's more to a test strategy than establishing a Definition of Done :-)