The Hard Parts.dev
RF-28 Process · Delivery RF Red Flags
Severity: high · Frequency: common

Teams cannot explain what done means

Completion criteria are vague enough that teams, stakeholders, and reviewers mean different things by 'done.'

Severity
high
Frequency
common
First noticed by
delivery lead · QA · support · product manager
Detectability
visible-if-you-look
Confidence
high
At a glance · RF-28
Where you see this

complex delivery programs · cross-functional teams · stakeholder-heavy initiatives

Not necessarily a problem when
exploratory discovery work is explicitly marked as learning rather than delivery
Often mistaken for
everyone knows what done means
Time horizon
near-term
Best placed to act

delivery lead · product and engineering leads together

The signal

What you would actually notice

Vague done definitions create hidden rework, surprise quality gaps, and endless near-completion.

Field observation

Work moves through states, but completion criteria are inconsistent, local, or only implicit.

Also observed

  • "It is done for engineering."
  • "We are basically done."

Primary reading

What it usually indicates

Most likely underlying patterns when this signal shows up. Not a diagnosis; a starting hypothesis.

Usually indicates

  • weak acceptance criteria
  • misaligned stakeholders
  • quality and operations not integrated into delivery definition

Stakes

Why it matters

Vague done definitions create hidden rework, surprise quality gaps, and endless near-completion.

Inspection

What to check next

Deliberate steps to confirm or disconfirm the primary reading above. Not a checklist. An order of inspection.

  1. acceptance criteria quality
  2. handoff friction
  3. reopen rates
  4. release readiness checklist
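Of these checks, reopen rates are usually the fastest to quantify. A minimal sketch, assuming a hypothetical issue-tracker export of (issue_id, status_event) pairs; the field names are illustrative, and a real tracker would expose the same data through its own API:

```python
# Hypothetical status-transition log: one (issue_id, event) pair per
# transition. An item that reopens after being marked done is a signal
# that "done" meant different things to different people.
events = [
    ("PROJ-1", "done"), ("PROJ-1", "reopened"), ("PROJ-1", "done"),
    ("PROJ-2", "done"),
    ("PROJ-3", "done"), ("PROJ-3", "reopened"),
]

done_items = {issue for issue, event in events if event == "done"}
reopened_items = {issue for issue, event in events if event == "reopened"}

# Reopen rate: share of items marked done that were later reopened.
reopen_rate = len(reopened_items & done_items) / len(done_items)
print(f"reopen rate: {reopen_rate:.0%}")  # 2 of 3 done items reopened
```

A persistently high reopen rate is evidence for the primary reading above; a low one suggests the vagueness, if present, is not yet producing rework.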

Diagnostic questions

Questions to ask the team, or yourself, before concluding anything.

  1. What must be true for this to count as done?
  2. Does done include rollout, observability, docs, and support readiness?
  3. Who would disagree with our definition of done?
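One way to make the answers concrete is to hold the definition of done as an explicit, shared checklist rather than a shared assumption. A minimal sketch; the criteria below mirror the diagnostic questions and are illustrative, not prescriptive:

```python
# An explicit, shared definition of done. The specific criteria are
# illustrative assumptions; the point is that the list is written down
# and checked, not inferred per role.
DEFINITION_OF_DONE = [
    "acceptance criteria reviewed and met",
    "rolled out to production",
    "dashboards and alerts in place",
    "docs updated",
    "support runbook handed over",
]

def is_done(completed: set) -> bool:
    """Done means every shared criterion is met, not just code-complete."""
    return all(criterion in completed for criterion in DEFINITION_OF_DONE)

# Code-complete alone does not satisfy the shared definition.
print(is_done({"acceptance criteria reviewed and met"}))
```

Anyone who would extend or dispute this list is exactly the person question 3 is asking about.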

Progression

Under the signal

Where this pattern tends to come from, what's holding it up, and where it goes if nothing changes.

Leading indicators

What tends to show up first.

  • items appear done then reopen
  • handoffs reveal missing assumptions
  • teams track "development done," "QA done," and "release done" separately, with no shared definition

Common root causes

What is usually sitting under the signal.

  • weak cross-functional alignment
  • delivery pressure
  • role-siloed definitions of completion

Likely consequences

What happens if nothing changes.

  • rework
  • misalignment
  • hidden quality debt
  • date slippage

Look-alikes

Not what it looks like

Patterns that can be mistaken for this signal, and 'fix' attempts that make it worse.

False friends

Patterns this signal is often confused with, but is not.
  • everyone knows what done means

Anti-patterns when responding

Responses that feel sensible and usually make the underlying pattern worse.

  • tracking status with multiple incompatible done states
  • calling code-complete done

Context

Context and ownership

Where this signal surfaces, who sees it first, who can actually act, and how much runway there usually is before escalation.

Common contexts

Where it shows up

  • complex delivery programs
  • cross-functional teams
  • stakeholder-heavy initiatives
Most likely to notice

Who sees it first

Before it escalates.

  • delivery lead
  • QA
  • support
  • product manager
Best placed to act

Who can move on it

Not always the same as who notices it.

  • delivery lead
  • product and engineering leads together
Time horizon

near-term

How much runway there usually is before the signal hardens into the underlying pattern.

AI impact

AI effects on this signal

How AI-assisted and AI-driven workflows tend to amplify or hide this signal.

AI amplifies

Ways AI tooling tends to make this signal louder or more common.

  • AI can write polished acceptance criteria that still avoid the real operational meaning of done.

AI masks

Ways AI tooling tends to hide this signal, so it keeps growing under the surface.

  • Polished AI-generated documentation can make incomplete readiness look complete.

Relationships

Connected signals

Related failure modes, decisions behind the signal, response playbooks, and neighboring red flags.