Scope | Phases | Roles | Activities | Abstraction/Refinement Level |
---|---|---|---|---|
Single backlog item (epic or user story) | Any sprint/iteration (end of it) | Agile team | Sprint/iteration review meeting | Design and development |
Also known as: Done-Done.
Synopsis (taken from Agile Alliance Glossary): "The team agrees on, and displays prominently somewhere in the team room, a list of criteria which must be met before a product increment (often a user story) is considered 'done'. Failure to meet these criteria at the end of a sprint normally implies that the work should not be counted toward that sprint's velocity."
A Definition of Done (DoD) is the basis for sprint/iteration planning and progress tracking (e.g., via velocity metrics and burndown charts). It also documents a shared understanding of when changes are welcome and when they are not. It typically contains criteria such as: product owner approval obtained, unit tests written and passed, demo to end users given, documentation including examples written and reviewed, code and other artifacts reviewed by at least one non-author.
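To make the velocity rule concrete, the following minimal Python sketch shows how a team tool could exclude stories that miss a DoD criterion from the velocity count. It is an illustration only; the `Story` class, the `sprint_velocity` function, and the sample criteria are assumptions, not part of ESE or IEEE Std. 7000.

```python
# Illustrative sketch only (not part of ESE or IEEE Std. 7000): a story counts
# toward velocity only once every agreed DoD criterion has been met ("done-done").
from dataclasses import dataclass, field


@dataclass
class Story:
    title: str
    points: int
    dod_results: dict[str, bool] = field(default_factory=dict)  # criterion -> met?

    def is_done(self, dod_criteria: list[str]) -> bool:
        # "Done" means that every criterion on the team's DoD list is satisfied.
        return all(self.dod_results.get(criterion, False) for criterion in dod_criteria)


def sprint_velocity(stories: list[Story], dod_criteria: list[str]) -> int:
    # Work that fails the DoD is not counted toward the sprint's velocity.
    return sum(story.points for story in stories if story.is_done(dod_criteria))


dod = ["Unit tests written and passed", "Product owner approval obtained"]
stories = [
    Story("Change address", 3, {"Unit tests written and passed": True,
                                "Product owner approval obtained": True}),
    Story("Upload document", 5, {"Unit tests written and passed": True}),  # sign-off missing
]
print(sprint_velocity(stories, dod))  # -> 3; the second story is excluded
```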
Maintaining a DoD is an established Agile practice, enhanced and extended for ESE (this project/repository).
See entry in Agile Glossary for general motivation.
The completion of a step in the value-based approach of IEEE Std. 7000, as well as of Story Valuation in ESE, must also be agreed upon.
The DoD is checked when the results of the Ethical Risk-Based Design Process (Clause 9) have been implemented.
Also refer to the process mapping tables on the Background Information page.
Add the following criteria to your existing DoD checklist; a machine-readable sketch of these additions follows the list:
- Have the IEEE Std. 7000 artifacts (ConOps documentation, the Value Register, EVRs/VBSRs) been updated for accuracy and completeness?
- Does the chosen design satisfy the EVRs and VBSRs as desired?
- Are all design decisions sound with respect to business requirements as well as EVRs and VBSRs (and their priorities)? Can they be traced back to requirements? Are conflicts and tradeoffs made explicit?
- Will the overall design work in practice (in terms of technical feasibility and user acceptance)? Have all new design elements been validated and verified in that regard?
- Have all design decisions and design updates been documented so that the IEEE Std. 7000 Transparency Management Process can pick them up? For instance, have the decisions been recorded in a recognized ADR format?
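As a machine-readable companion to the list above, the following Python sketch encodes these additions as plain data and appends them to an existing DoD, for example to render the combined checklist as a Markdown task list in the team room or wiki. The names `VALUE_BASED_DOD_ADDITIONS` and `extend_dod` are illustrative assumptions only.

```python
# Illustrative sketch only: the value-based criteria from this page encoded as plain
# data, so that a checklist or issue-tracking tool can append them to an existing DoD.
VALUE_BASED_DOD_ADDITIONS = [
    "IEEE Std. 7000 artifacts (ConOps documentation, Value Register, EVRs/VBSRs) "
    "updated for accuracy and completeness",
    "Chosen design satisfies the EVRs and VBSRs as desired",
    "Design decisions sound w.r.t. business requirements, EVRs, and VBSRs (and their "
    "priorities); traceable to requirements; conflicts and tradeoffs made explicit",
    "Overall design validated and verified for technical feasibility and user acceptance",
    "Design decisions and updates documented (e.g., as ADRs) for the "
    "Transparency Management Process",
]


def extend_dod(existing_dod: list[str]) -> list[str]:
    # Append the ESE additions without modifying the team's existing criteria.
    return existing_dod + VALUE_BASED_DOD_ADDITIONS


existing = ["Automated build is green", "Product owner approval obtained"]
for criterion in extend_dod(existing):
    print(f"* [ ] {criterion}")  # render the combined DoD as a Markdown task list
```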
ESE has a suggestive rather than normative character; it is ok to add context-specific criteria to this list.
In the Lakeside Mutual example used in Story Valuation, the team created and uses the following DoD checklist (a sketch of automated checklist processing appears after the example):
---
title: "ESE: Sample DoD Checklist"
author: Prowno Lakemutstaff
date: "mm, dd, yyyy (Project ESE)"
---
# Definition of Done for "Customer Self-Service" development at Lakeside Mutual
## General DoD Questions
* [ ] Automated build is green
* [ ] Minimum viable documentation is available
* [ ] End-user test has been performed and passed
* [ ] Product manager/owner has signed off on story completion
* [ ] to be continued
## Value-Based Done Criteria
* [ ] ConOps documentation, the Value Register, and the EVRs have been reviewed for accuracy and completeness; these ESE/IEEE Std. 7000 artifacts are up to date.
* [ ] The design satisfies the EVRs.
* [ ] The chosen designs and the decisions made are sound and can be traced back to requirements.
* [ ] There is evidence that the design will work in practice/production; its implementation has been validated and verified.
* [ ] All design updates have been documented and are ready for the Transparency Management Process to pick them up.
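One way to process such a checklist automatically (see the tools listed next) is sketched below in Python: it parses Markdown task-list items and reports the criteria that are still open, so a pipeline step could block completion until the value-based criteria are checked off. The script, its regular expression, and the command-line usage are assumptions for illustration, not part of the Lakeside Mutual example.

```python
# Illustrative sketch only (assumed tooling, not prescribed by ESE): parse a Markdown
# DoD checklist such as the Lakeside Mutual example and report the criteria that are
# still open, e.g., as a CI step that keeps the build from going green prematurely.
import re
import sys

TASK_ITEM = re.compile(r"^\s*[*-]\s*\[(?P<mark>[ xX])\]\s*(?P<text>.+)$")


def open_criteria(markdown: str) -> list[str]:
    # Collect all unchecked task-list items ("* [ ] ...") from the checklist.
    return [
        match.group("text").strip()
        for line in markdown.splitlines()
        if (match := TASK_ITEM.match(line)) and match.group("mark") == " "
    ]


if __name__ == "__main__":
    with open(sys.argv[1], encoding="utf-8") as dod_file:
        remaining = open_criteria(dod_file.read())
    for item in remaining:
        print(f"NOT DONE: {item}")
    sys.exit(1 if remaining else 0)  # non-zero exit signals an unmet DoD
```

A team could, hypothetically, run such a script as `python check_dod.py dod-checklist.md` in its build pipeline; the file name and invocation are placeholders.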
- Task management and issue-tracking tool
- Any virtual or physical tool for checklist processing
- Reusable DoD criteria templates and catalogs
- Be honest in your assessments.[^1]
- Spend as much time as needed on ethics and be as efficient as possible; avoid a mere "checkbox ethics" approach, often perceived as an antipattern.[^2]
- Consistency matters; define and agree on a DoD at the project start and stick to it. Do not change the criteria in flight. Otherwise, the practice might yield false data and lose its value.
- Plan and execute the acceptance testing adequately. Keep an eye on effort vs. benefit; it might not be needed and/or not possible to let end users test every story in each sprint. Choosing a suitable granularity and frequency is a product management responsibility.
In alphabetical order:
- Acceptance Testing
- Definition of Ready
- Ethical Review, performed at sprint/iteration end (all "done" stories)
- Product Backlog
- Sprint Planning
- Story Valuation
- Value Retrospective
Annex I of IEEE Std. 7000 suggests the structure and content of a complete Case for Ethics; the DoD definition can be part of it.
The Design Practice Repository/Reference (DPR) on GitHub and on LeanPub summarizes agile architecting practices such as user stories.
"Enhancing Your "Definition of Done" Can Improve Your Minimum Viable Architecture " on InfoQ suggests architectural extensions to the DoD. There is a DoD for architectural decisions as well.
title: "ESE artifact: Ethics-Enabled DoD"
author: Olaf Zimmermann (ZIO)
date: "11, 13, 2023 (Version 1.0)"
copyright: The author, 2023-2024 (unless noted otherwise). All rights reserved.
license: Creative Commons Attribution 4.0 International License
[^1]: Arguably, honesty is a fundamental ethical value that can be expected to also appear in many Value Registers.

[^2]: See "Stop Using Checkbox Ethics" (Part 1, Part 2) and/or "Research Ethics and Ethical Research: An Example of Integrating Ethics in R&I Research" for a clarification of the term "checkbox ethics" (in a broader context).