Teams cannot explain what done means
Completion criteria are vague enough that teams, stakeholders, and reviewers mean different things by 'done.'
Where you see this
- complex delivery programs
- cross-functional teams
- stakeholder-heavy initiatives
Not necessarily a problem when
- exploratory discovery work is explicitly marked as learning rather than delivery
Often mistaken for
- everyone knows what done means
Time horizon
- near-term
Best placed to act
- delivery lead
- product and engineering leads together
The signal
What you would actually notice
Field observation
Work moves through states, but completion criteria are inconsistent, local, or only implicit.
Also observed
- "It is done for engineering."
- "We are basically done."
Primary reading
What it usually indicates
Most likely underlying patterns when this signal shows up. Not a diagnosis, a starting hypothesis.
Usually indicates
- weak acceptance criteria
- misaligned stakeholders
- quality and operations not integrated into delivery definition
Not necessarily a problem when
Contexts where this signal is expected and does not indicate a deeper issue.
- exploratory discovery work is explicitly marked as learning rather than delivery
Stakes
Why it matters
Vague done definitions create hidden rework, surprise quality gaps, and endless near-completion.
Heuristic
If done is fuzzy, delivery confidence is performative.
Inspection
What to check next
Deliberate steps to confirm or disconfirm the primary reading above. Not a checklist. An order of inspection.
- acceptance criteria quality
- handoff friction
- reopen rates
- release readiness checklist
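One of the checks above, reopen rates, can be estimated directly from ticket history. A minimal sketch, assuming a hypothetical mapping of ticket IDs to ordered status transitions; the status names and data shape are illustrative, not any specific tracker's API:

```python
# Estimate reopen rate from ticket status transitions.
# The data shape (ticket id -> ordered status list) is hypothetical,
# not tied to any specific issue tracker's API.

def reopen_rate(tickets):
    """Fraction of ever-closed tickets that left 'done' at least once."""
    closed = 0
    reopened = 0
    for statuses in tickets.values():
        if "done" not in statuses:
            continue  # never closed; not part of the denominator
        closed += 1
        # A reopen is any non-done status appearing after the first "done".
        first_done = statuses.index("done")
        if any(s != "done" for s in statuses[first_done + 1:]):
            reopened += 1
    return reopened / closed if closed else 0.0

tickets = {
    "T-1": ["open", "in_progress", "done"],
    "T-2": ["open", "done", "reopened", "done"],
    "T-3": ["open", "in_progress", "done", "in_progress", "done"],
    "T-4": ["open", "in_progress"],
}

print(reopen_rate(tickets))  # 2 of the 3 closed tickets were reopened
```

A persistently high value here is the quantitative face of "items appear done then reopen": the team's done state is not predicting actual closure.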
Diagnostic questions
Questions to ask the team, or yourself, before concluding anything.
- What must be true for this to count as done?
- Does done include rollout, observability, docs, and support readiness?
- Who would disagree with our definition of done?
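The answers to these questions can be captured as a single explicit, checkable definition of done rather than a shared assumption. A minimal sketch, assuming hypothetical readiness criteria; the field names are illustrative, and a real team would substitute its own:

```python
from dataclasses import dataclass, fields

@dataclass
class DefinitionOfDone:
    """One shared definition of done that includes operational readiness,
    not just code completion. Field names are illustrative examples."""
    code_merged: bool = False
    tests_passing: bool = False
    rolled_out: bool = False          # rollout, not merely "deployable"
    observability_wired: bool = False
    docs_updated: bool = False
    support_briefed: bool = False

    def is_done(self):
        return all(getattr(self, f.name) for f in fields(self))

    def gaps(self):
        """The criteria still open; what 'basically done' is hiding."""
        return [f.name for f in fields(self) if not getattr(self, f.name)]

item = DefinitionOfDone(code_merged=True, tests_passing=True)
print(item.is_done())  # False: "done for engineering" is not done
print(item.gaps())     # the remaining operational criteria
```

The point of the sketch is that disagreement becomes visible: anyone who objects to a field is disagreeing with the definition in writing, not discovering the gap at handoff.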
Progression
Under the signal
Where this pattern tends to come from, what's holding it up, and where it goes if nothing changes.
Leading indicators
What tends to show up first.
- items appear done then reopen
- handoffs reveal missing assumptions
- teams speak of "development done," "QA done," and "release done" separately, with no shared definition
Common root causes
What is usually sitting under the signal.
- weak cross-functional alignment
- delivery pressure
- role-siloed definitions of completion
Likely consequences
What happens if nothing changes.
- rework
- misalignment
- hidden quality debt
- date slippage
Look-alikes
Not what it looks like
Patterns that can be mistaken for this signal, and 'fix' attempts that make it worse.
- everyone knows what done means
Anti-patterns when responding
Responses that feel sensible and usually make the underlying pattern worse.
- tracking status with multiple incompatible done states
- calling code-complete done
Context
Context and ownership
Where this signal surfaces, who sees it first, who can actually act, and how much runway there usually is before escalation.
Where it shows up
- complex delivery programs
- cross-functional teams
- stakeholder-heavy initiatives
Who sees it first
Before it escalates.
- delivery lead
- QA
- support
- product manager
Who can move on it
Not always the same as who notices it.
- delivery lead
- product and engineering leads together
Time horizon
How much runway there usually is before the signal hardens into the underlying pattern.
- near-term
AI impact
AI effects on this signal
How AI-assisted and AI-driven workflows tend to amplify or hide this signal.
AI amplifies
Ways AI tooling tends to make this signal louder or more common.
- AI can write polished acceptance criteria that still avoid the real operational meaning of done.
AI masks
Ways AI tooling tends to hide this signal, so it keeps growing under the surface.
- Documentation quality can make incomplete readiness look complete.
AI synthesis
- AI-generated test or ticket artifacts create the illusion of completeness without operational closure.
Relationships
Connected signals
Related failure modes, decisions behind the signal, response playbooks, and neighboring red flags.