Metrics are visible but not trusted
Dashboards, KPIs, and delivery metrics exist, but people do not believe they reflect reality well enough to act on them confidently.
- Where you see this
- engineering management, delivery reporting, platform dashboards, AI evaluation reporting
- Not necessarily a problem when
- a new metric is still being calibrated openly and trust-building is active
- Often mistaken for
- if it is measured, it is under control
- Time horizon
- medium-term
- Best placed to act
- metric owner, engineering leadership, platform/data owners
The signal
What you would actually notice
Field observation
Teams still ask individuals what is actually happening even though dashboards exist, or interpret dashboards defensively rather than operationally.
Also observed
- "The dashboard says green, but everyone knows we are not green."
- "We do not use that metric for real decisions."
Primary reading
What it usually indicates
Most likely underlying patterns when this signal shows up. Not a diagnosis, a starting hypothesis.
Usually indicates
- poor metric design
- bad data quality
- incentives distorting reporting
- metrics misaligned with lived reality
Not necessarily a problem when
Contexts where this signal is expected and does not indicate a deeper issue.
- a new metric is still being calibrated openly and trust-building is active
Stakes
Why it matters
Measurement without trust creates reporting theater rather than better decisions.
Heuristic
If no one trusts the metric, the visibility is cosmetic.
Inspection
What to check next
Deliberate steps to confirm or disconfirm the primary reading above. Not a checklist. An order of inspection.
- metric definitions
- data quality
- decision use cases
- local workarounds
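The data-quality step above can be made concrete by cross-checking the dashboard's figure against an independently audited sample. A minimal sketch, using entirely hypothetical deploy records and a made-up `divergence` helper:

```python
# Sketch: compare a reported dashboard metric against an independently
# hand-audited sample. All data here is hypothetical illustration.

def divergence(dashboard_value, sampled_values):
    """Relative gap between the reported metric and the audited sample mean."""
    if not sampled_values:
        raise ValueError("need at least one sampled value")
    sampled_mean = sum(sampled_values) / len(sampled_values)
    if sampled_mean == 0:
        return float("inf") if dashboard_value else 0.0
    return abs(dashboard_value - sampled_mean) / abs(sampled_mean)

# Dashboard claims a 95% deploy success rate; a manual audit of 20
# deploys finds 15 successes (0.75).
reported = 0.95
audited = [1] * 15 + [0] * 5
gap = divergence(reported, audited)
print(f"relative divergence: {gap:.2f}")  # prints "relative divergence: 0.27"
```

A large divergence does not say which side is wrong; it says the metric definition, the instrumentation, or the audit procedure needs inspection before the number is used for decisions.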
Diagnostic questions
Questions to ask the team, or yourself, before concluding anything.
- What reality does this metric fail to capture?
- Who trusts it least, and why?
- What behavior is the metric incentivizing?
Progression
Under the signal
Where this pattern tends to come from, what's holding it up, and where it goes if nothing changes.
Leading indicators
What tends to show up first.
- teams caveat every dashboard discussion
- manual spreadsheets or side channels carry the numbers people actually believe
- leaders ask for alternate numbers
Common root causes
What is usually sitting under the signal.
- measurement divorced from practice
- status incentives
- poor instrumentation
Likely consequences
What happens if nothing changes.
- bad decisions
- defensive reporting
- reporting inflation
Look-alikes
Not what it looks like
Patterns that can be mistaken for this signal, and 'fix' attempts that make it worse.
- if it is measured, it is under control
Anti-patterns when responding
Responses that feel sensible and usually make the underlying pattern worse.
- adding more dashboards instead of repairing trust
- using disputed metrics for judgment-heavy performance conversations
Context
Context and ownership
Where this signal surfaces, who sees it first, who can actually act, and how much runway there usually is before escalation.
Where it shows up
- engineering management
- delivery reporting
- platform dashboards
- AI evaluation reporting
Who sees it first
Before it escalates.
- managers
- team leads
- analysts
Who can move on it
Not always the same as who notices it.
- metric owner
- engineering leadership
- platform/data owners
Time horizon
How much runway there usually is before the signal hardens into the underlying pattern.
- medium-term
AI impact
AI effects on this signal
How AI-assisted and AI-driven workflows tend to amplify or hide this signal.
AI amplifies
Ways AI tooling tends to make this signal louder or more common.
- AI can make reporting easier to produce, which can increase the volume of low-trust metrics quickly.
AI masks
Ways AI tooling tends to hide this signal, so it keeps growing under the surface.
- AI-generated narrative around a dashboard can make weak metrics feel more legitimate.
AI synthesis
Evaluation dashboards look rigorous while task-ground truth remains thin.
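One way to probe thin ground truth is to measure how much of an evaluation set rests on human-verified labels rather than model- or heuristic-generated ones. A minimal sketch; the record shape and the `label_source` field are hypothetical assumptions, not a real evaluation framework's schema:

```python
# Sketch: estimate what fraction of an AI evaluation set has
# human-verified ground truth. Record shape is a hypothetical example.

def ground_truth_coverage(eval_records):
    """Fraction of eval items whose label was verified by a human."""
    if not eval_records:
        return 0.0
    verified = sum(
        1 for r in eval_records if r.get("label_source") == "human_verified"
    )
    return verified / len(eval_records)

records = [
    {"id": 1, "label_source": "human_verified"},
    {"id": 2, "label_source": "model_generated"},
    {"id": 3, "label_source": "heuristic"},
    {"id": 4, "label_source": "human_verified"},
]
coverage = ground_truth_coverage(records)
print(f"verified ground truth: {coverage:.0%}")  # prints "verified ground truth: 50%"
```

A polished evaluation dashboard sitting on low verified coverage is exactly the pattern this signal describes: the rigor is in the presentation, not the measurement.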
Relationships
Connected signals
Related failure modes, decisions behind the signal, response playbooks, and neighboring red flags.