The Ever-Increasing Complexity of Done
“We’ll just edit our definition of done on the Wiki and add that to it”
– anonymous
All the agile projects I’ve worked with have a concept of “Done” at the work item level. For some projects, it’s gloriously printed, framed and given pride of place on a prominent wall. For others, it’s hidden in a wiki page or behind a door somewhere. It’s a noble cause – to ensure everyone knows what is needed to mark a piece of work as completed and claim the story points. Consider it a form of contract between the implementors and the business.
Here’s an interesting challenge – ask your implementation teams what the definition of done is. If they can’t recount what your definition of done is, how can they follow it?
The definition of done often becomes an end in itself. We add more conditions to meet ever more edge cases. Soon, there are too many to remember and we ignore the definition completely. At this point, any more additions to the definition are meaningless.
“Just as preachers, politicians, PR spin masters and the media can’t create truth by writing or speaking words they say are true, authors can’t validate truth by putting it into print”
– Lionel Fisher, Celebrating Time Alone: Stories Of Splendid Solitude
Updating the definition of done doesn’t make it true. “We’ll update it and email the developers” is something I’ve overheard which makes me sad. A lesson from the real world – wishing for something doesn’t make it real, saying something doesn’t make it true, and writing something doesn’t mean people will read it.
I like the naive and simplistic nature of “Done is when the Product Owner says it is done”. In most cases it’s the ugly truth (for instance, you might get stories delivered without appropriate unit tests).
We can progress onwards to a set of principles for “Done”:-
- Code reviewed against code checklist
- WebOps config/deploy tasks completed
- Tests passed in acceptance environment
- Acceptance criteria met
The above principles are nice and concise, and can easily be stored in your head.
Now try holding all the items below in your head as you develop:-
- All tasks completed unless agreed by team and product owner
- All acceptance tests passed
- Security requirements met : data handled safely
- Code commented, checked in and run against current version in source control
- Passed UAT and signed off as meeting requirements
- Updated user story in the project tracking software
- Any build/deployment/configuration changes implemented/documented/communicated
- Regression tests pass
- Product Owner has to agree that the feature is done
- Peer reviewed (or produced by pair programming) and meeting development standards
- Any NFRs met
- Unit tests written and passing with an overall code coverage of at least 80%
- Relevant documentation / diagrams produced and/or updated
My advice is straightforward – keep it simple so everyone understands it. And if the team only checks the definition at delivery time, you don’t have a definition of done – you’ve got a gated delivery process.