By Arjun Mehta
DORA metrics are treated as the gold standard for measuring engineering performance. Your team deploys daily. Lead time is under an hour. Change failure rate is below 5%. MTTR is 15 minutes.
You have elite engineering performance.
But you're shipping features nobody wants.
DORA metrics measure how fast you ship, not what you're shipping or whether it matters. A team with perfect DORA could be building the wrong product at high velocity.
What DORA Actually Measures
The four DORA metrics:
- Deployment frequency. How often do you ship?
- Lead time. How fast from commit to production?
- Change failure rate. How many deployments cause incidents?
- MTTR. How quickly do you recover from failures?
All four answer: "Can we ship reliably and frequently?"
They don't answer: Are we shipping the right things? Do customers care? Are we making money? Are we learning?
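The four metrics themselves are easy to compute from a deployment log. A minimal sketch, assuming a hypothetical `Deployment` record whose field names are illustrative, not from any particular tool:

```python
from dataclasses import dataclass
from datetime import datetime

# Hypothetical deployment record; field names are illustrative assumptions.
@dataclass
class Deployment:
    committed_at: datetime
    deployed_at: datetime
    caused_incident: bool
    recovery_minutes: float = 0.0  # 0 if no incident occurred

def dora_summary(deploys: list[Deployment], days: int) -> dict:
    """Compute the four DORA metrics over a window of `days` days."""
    n = len(deploys)
    failures = [d for d in deploys if d.caused_incident]
    lead_times = sorted(
        (d.deployed_at - d.committed_at).total_seconds() / 60 for d in deploys
    )
    return {
        "deploys_per_day": n / days,
        # median lead time (upper middle for even counts, fine for a sketch)
        "median_lead_time_min": lead_times[n // 2],
        "change_failure_rate": len(failures) / n,
        "mttr_min": (sum(d.recovery_minutes for d in failures) / len(failures))
                    if failures else 0.0,
    }
```

Note what the function's inputs lack: nothing about who used the deployed feature, what it earned, or what it taught you. That is the whole gap.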
The Elite Team Shipping the Wrong Things
Imagine a team:
10 deployments per day. Lead time under 30 minutes. 99% success rate. MTTR under 15 minutes.
DORA metrics are elite.
But users don't adopt the features they ship. Retention doesn't improve. Revenue is flat. The product is slowly dying.
DORA metrics would celebrate this team. Reality: they're optimizing for speed toward the wrong destination.
What DORA Misses
Miss 1: Feature Adoption
You ship a dashboard in 2 days (elite DORA). 95% of users never open it.
DORA says: "Shipped fast." Reality: "Wasted engineering time."
Miss 2: Customer Impact
You improve your metrics by cutting QA. Now you ship bugs faster. Users get a worse experience.
DORA says: "Faster delivery." Reality: "Worse product."
Miss 3: Learning Velocity
You ship 20 experiments per quarter without running A/B tests. You have no idea which ones worked.
DORA says: "High deployment frequency." Reality: "Shipping without learning."
Miss 4: Technical Debt Accumulation
You ship fast by skipping refactoring. In 6 months, feature velocity drops 50% because you're drowning in debt.
DORA doesn't see the debt until velocity crashes.
Miss 5: Team Health
You achieve elite DORA by working weekends. Your best people are burned out.
DORA says: "High performance." Reality: "High burnout."
The Metrics You Need
Metric 1: Feature Adoption
What it measures: Are users actually using what we shipped?
How to measure: % of users who've tried the feature. % of users who use it regularly. Time to first use. Repeat usage rate.
Why it matters: A feature nobody uses is wasted engineering time.
Metric 2: Business Impact
What it measures: Does the feature move business metrics?
Examples: Revenue increased by $X. Conversion improved by Y%. CAC decreased by Z%. Retention improved.
Why it matters: Engineering is a cost center unless it drives business results.
Metric 3: Learning Velocity
What it measures: How much are we learning about customers?
Examples: Number of experiments run. User interviews conducted. Hypotheses validated or invalidated.
Why it matters: Shipping fast in the wrong direction is worse than shipping slow in the right direction.
Metric 4: Code Quality
What it measures: Are we maintaining quality while shipping?
Examples: Defect escape rate. Support tickets from new features. Incident impact. Technical debt ratio.
Why it matters: Elite DORA + poor quality = false sense of success.
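One of these, defect escape rate, is straightforward once you classify where each defect was found. A minimal sketch; the pre-release vs. production split is an assumption about how your team tags defects:

```python
def defect_escape_rate(found_in_prod: int, found_pre_release: int) -> float:
    """Fraction of all known defects that escaped to production.
    A rising value suggests shipping speed is coming at quality's expense."""
    total = found_in_prod + found_pre_release
    return found_in_prod / total if total else 0.0
```

Track the trend, not the absolute number: a team that cuts QA to boost DORA will show this ratio climbing quarter over quarter.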
Metric 5: Developer Satisfaction
What it measures: Are engineers happy?
Examples: Satisfaction scores. Code review time. Time in meetings. Onboarding experience. Retention.
Why it matters: Burned out engineers quit. High DORA + high burnout = unsustainable.
The Three-Dimension Framework
Dimension 1: Engineering Excellence (DORA). Deployment frequency. Lead time. Change failure rate. MTTR.
Dimension 2: Product Success. Feature adoption. Business impact. Learning velocity. Customer satisfaction.
Dimension 3: Team Health. Developer satisfaction. Retention. Onboarding experience. Career growth.
Optimize all three. Optimizing DORA alone creates fast teams shipping the wrong product to burned-out people.
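One way to operationalize "all three": score each dimension and flag any that lags, rather than averaging them into one number. Averaging is exactly how elite DORA masks a failing product or team. The 0-to-1 scores and the 0.6 floor below are illustrative assumptions:

```python
def health_check(engineering: float, product: float, team: float,
                 floor: float = 0.6) -> dict:
    """Flag any dimension scoring below `floor` instead of averaging.
    Scores are assumed to be normalized to 0..1; the floor is illustrative."""
    scores = {"engineering": engineering, "product": product, "team": team}
    lagging = [name for name, s in scores.items() if s < floor]
    return {"lagging": lagging, "healthy": not lagging}
```

A team with engineering 0.95, product 0.3, team 0.8 averages a respectable 0.68, but the check correctly reports the product dimension as lagging.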
The Danger of Single-Metric Optimization
If you optimize DORA metrics alone, you get:
Fast shipping of the wrong features. Burned-out engineers. Accumulated technical debt. Low-quality code. High churn.
This is worse than shipping slow with high quality and learning.
Getting Started
- Track DORA metrics (engineering excellence)
- Add product metrics (are features being adopted?)
- Monitor team health (are people okay?)
- Review all metrics together in retrospectives
- Optimize for the combination, not any single metric
DORA metrics are necessary but not sufficient. Add product and people metrics to get the full picture. Your engineering performance looks very different when you do.
Frequently Asked Questions
Should we deprioritize DORA if product metrics are bad? No. You need both. Elite DORA + poor product = wrong features shipped fast. Poor DORA + great product = right features shipped slowly. Optimize for both simultaneously.
How do we handle trade-offs between DORA and product metrics? They usually don't conflict. Taking time to validate features doesn't require slow deployments. They're orthogonal. Fast deployment velocity and thoughtful product decisions can coexist.
What if business metrics don't improve despite elite DORA? That's a product strategy problem, not engineering. You're shipping the wrong things. Work with product to understand the real problem.