The Hard Parts.dev
RF-30 · Leadership · Behavioral Red Flags

Teams are measured on output, not outcome

Success is tracked through activity volume, ticket throughput, or artifact production more than through user, business, or system outcomes.

Severity: high
Frequency: very common
First noticed by: staff engineers · product leaders · good managers
Detectability: easy-to-normalize
Confidence: high
At a glance
Where you see this

delivery organizations · platform teams · transformation programs

Not necessarily a problem when
output is being used as a temporary proxy in an area where outcome signals genuinely lag and this is acknowledged
Often mistaken for
high output proves execution excellence
Time horizon
medium-to-long-term
Best placed to act

leadership · product leadership · directors

The signal

What you would actually notice


Field observation

Teams optimize for visible delivery counts, not for whether the work improved the situation it was meant to change.

Also observed

  • "We delivered 400 story points."
  • "The question is how much we shipped, not what changed."

Primary reading

What it usually indicates

Most likely underlying patterns when this signal shows up. Not a diagnosis, a starting hypothesis.

  • weak product measurement
  • leadership demand for controllable metrics
  • status-driven incentives

Stakes

Why it matters

Output-oriented measurement often creates local optimization and synthetic velocity.

Inspection

What to check next

Deliberate steps to confirm or disconfirm the primary reading above. Not a checklist. An order of inspection.

  1. KPI structure
  2. team OKRs
  3. work selection incentives
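The first inspection step, KPI structure, can be made concrete with a toy audit. The sketch below is purely illustrative: the keyword lists, function names, and classification rules are assumptions for demonstration, not a validated taxonomy. It sorts a team's stated KPIs into output-counting versus outcome-linked buckets.

```python
# Illustrative sketch only. The marker lists below are hypothetical examples,
# not an authoritative taxonomy of output vs. outcome metrics.

OUTPUT_MARKERS = {"tickets", "story points", "velocity", "deploys", "commits"}
OUTCOME_MARKERS = {"retention", "error rate", "latency", "revenue", "time to recover"}

def classify_kpi(kpi: str) -> str:
    """Tag a KPI as counting output, tracking an outcome, or unknown."""
    text = kpi.lower()
    if any(marker in text for marker in OUTCOME_MARKERS):
        return "outcome"
    if any(marker in text for marker in OUTPUT_MARKERS):
        return "output"
    return "unknown"

def audit(kpis: list[str]) -> dict[str, list[str]]:
    """Group a team's KPIs by classification to expose output bias."""
    groups: dict[str, list[str]] = {"output": [], "outcome": [], "unknown": []}
    for kpi in kpis:
        groups[classify_kpi(kpi)].append(kpi)
    return groups

report = audit([
    "Story points completed per sprint",
    "Tickets closed per week",
    "P95 latency on checkout",
    "Customer retention after 90 days",
])
# A KPI set dominated by the "output" bucket is the signal this page describes.
```

The point is not the heuristic itself but the shape of the inspection: list every number a team is judged on, then ask which of them could change without anything improving for users or the system.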

Diagnostic questions

Questions to ask the team, or yourself, before concluding anything.

  1. What real problem improved because of this work?
  2. Would the team make different choices if measured on outcome?
  3. What is easy to count that is crowding out what matters?

Progression

Under the signal

Where this pattern tends to come from, what's holding it up, and where it goes if nothing changes.

Leading indicators

What tends to show up first.

  • throughput is celebrated more than effect
  • teams close lots of work but core pain remains
  • measurement discussions avoid user or operational reality

Common root causes

What is usually sitting under the signal.

  • measurement convenience
  • weak product strategy
  • leadership comfort with controllable numbers

Likely consequences

What happens if nothing changes.

  • ticket theater
  • synthetic velocity
  • platform-before-product

Look-alikes

Not what it looks like

Patterns that can be mistaken for this signal, and 'fix' attempts that make it worse.

False friends

Things the signal is often confused with, but isn't.
  • high output proves execution excellence

Anti-patterns when responding

Responses that feel sensible and usually make the underlying pattern worse.

  • mistaking utilization and throughput for impact
  • managing by volume because outcomes are harder to discuss

Context

Context and ownership

Where this signal surfaces, who sees it first, who can actually act, and how much runway there usually is before escalation.

Common contexts

Where it shows up

  • delivery organizations
  • platform teams
  • transformation programs

Most likely to notice

Who sees it first

Before it escalates.

  • staff engineers
  • product leaders
  • good managers

Best placed to act

Who can move on it

Not always the same as who notices it.

  • leadership
  • product leadership
  • directors
Time horizon

medium-to-long-term

How much runway there usually is before the signal hardens into the underlying pattern.

AI impact

AI effects on this signal

How AI-assisted and AI-driven workflows tend to amplify or hide this signal.

AI amplifies

Ways AI tooling tends to make this signal louder or more common.

  • AI makes outputs cheaper to generate, which can rapidly worsen output-biased measurement.

AI masks

Ways AI tooling tends to hide this signal, so it keeps growing under the surface.

  • High artifact volume can look like progress even when underlying complexity is unsolved.

Relationships

Connected signals

Related failure modes, decisions behind the signal, response playbooks, and neighboring red flags.