Portfolio Flow Metrics
This framework defines metrics for measuring and managing the flow of initiatives through the Portfolio Kanban system. It enables portfolio managers to track flow efficiency, predictability, and economic performance at both the initiative and portfolio levels.
Measurement Scope
What We Measure
Three measurement domains:
Flow Metrics — How efficiently initiatives move through the delivery system: time to market, WIP, throughput.
Economic Metrics — Cost tracking and spending patterns, budget predictability and burn rates.
Commitment and Predictability Metrics — Delivery reliability and planning accuracy.
What We Do Not Measure
Business outcome metrics — ROI, customer satisfaction, strategic goal achievement (OKRs), and market share are outside this framework.
Why: Outcome measurement requires organizational maturity in value tracking. Business value realization happens months after delivery, depends on adoption and market conditions beyond portfolio control, and requires integration with business analytics systems.
Organizations with mature outcome measurement capabilities should complement these metrics with business outcome KPIs aligned to strategic themes and OKRs.
What is an Initiative?
An Initiative is a significant investment that flows through the Portfolio Kanban system — a substantial value stream investment requiring approval and oversight from Lean Portfolio Management.
The term “Initiative” is preferred over “Epic” to avoid confusion with DVS-level Epics and to emphasize the strategic, investment-oriented nature of portfolio-level work.
Initiative Size Categories
| Size | Duration | Typical characteristics |
|---|---|---|
| Medium | 3–9 months | One or few DVSs, moderate complexity, clear scope |
| Large | 9–18 months | Multiple DVSs, significant organizational impact |
| Very Large | 18–27 months | Enterprise-wide, complex dependencies |
| Strategic | 27+ months | Multi-year, fundamental organizational change |
Initiatives shorter than 3 months are typically features delivered within a DVS. Exceptions exist — a short-duration item may warrant portfolio visibility for governance or strategic reasons — but these should be treated as exceptions, not a standard size category.
Initiative-Level Metrics
20 metrics in total across four categories.
Flow Time Metrics (6)
All flow time metrics use calendar days.
1. Flow Time — TTM (Time to Market)
Total elapsed time from entering Reviewing to Done.
Flow Time TTM = Date entered Done − Date entered Reviewing
2. Flow Time — Reviewing
Date exited Reviewing − Date entered Reviewing
3. Flow Time — Analyzing
Date exited Analyzing − Date entered Analyzing
4. Flow Time — Portfolio Backlog
Wait time after LPM approval before implementation begins.
Date exited Portfolio Backlog − Date entered Portfolio Backlog
5. Flow Time — Implementing
Date exited Implementing − Date entered Implementing
6. Flow Time — Evaluating
Time from entering Evaluating to outcome assessment conclusion.
Date exited Evaluating − Date entered Evaluating
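Because exiting one stage is entering the next, all six flow time metrics can be derived from a single record of stage-entry dates. A minimal sketch, using hypothetical dates and assumed stage names taken from the Kanban states above:

```python
from datetime import date

# Hypothetical stage-entry dates for one initiative (illustrative data only).
stage_entered = {
    "Reviewing": date(2024, 1, 10),
    "Analyzing": date(2024, 2, 5),
    "Portfolio Backlog": date(2024, 3, 1),
    "Implementing": date(2024, 4, 15),
    "Evaluating": date(2024, 11, 20),
    "Done": date(2025, 1, 8),
}

STAGES = ["Reviewing", "Analyzing", "Portfolio Backlog",
          "Implementing", "Evaluating", "Done"]

def flow_times(entered: dict) -> dict:
    """Per-stage flow time in calendar days, plus end-to-end Flow Time TTM."""
    times = {}
    for stage, nxt in zip(STAGES, STAGES[1:]):
        # Exiting a stage is entering the next one.
        times[stage] = (entered[nxt] - entered[stage]).days
    # Flow Time TTM = Date entered Done - Date entered Reviewing
    times["TTM"] = (entered["Done"] - entered["Reviewing"]).days
    return times
```

A useful consistency check falls out of this representation: the per-stage flow times always sum to Flow Time TTM.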
WIP Age Metrics (6)
WIP Age measures how long an initiative has been in its current stage. Used to identify stalled items before they become critical.
7. WIP Age — TTM
Total time active since entering the system.
Current Date − Date entered Reviewing
8. WIP Age — Reviewing Current age in Reviewing. Identifies stalled items in early analysis.
9. WIP Age — Analyzing Current age in Analyzing. Highlights initiatives stuck in business case development.
10. WIP Age — Portfolio Backlog Wait time for approved initiatives awaiting implementation capacity. The most critical WIP age metric — represents direct opportunity cost.
11. WIP Age — Implementing Current duration in implementation. Identifies long-running implementations and potential scope creep.
12. WIP Age — Evaluating
Current time in outcome assessment. Typically one full OKR cycle. High WIP age signals that outcome assessment is not being concluded — the learning is not being fed forward.
Current Date − Date entered Evaluating
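The WIP Age metrics above can be computed from the same stage-entry data as a point-in-time snapshot. The sketch below flags stalled items against per-stage thresholds; the threshold values and field names are illustrative assumptions, not part of the framework:

```python
from datetime import date

# Hypothetical portfolio snapshot (illustrative data only).
today = date(2024, 9, 1)
initiatives = [
    {"id": "INIT-101", "stage": "Analyzing",
     "entered_stage": date(2024, 6, 1), "entered_reviewing": date(2024, 4, 1)},
    {"id": "INIT-102", "stage": "Portfolio Backlog",
     "entered_stage": date(2024, 2, 15), "entered_reviewing": date(2024, 1, 5)},
]

# Assumed per-stage alert thresholds in days; tune to initiative size.
STALL_THRESHOLD = {"Analyzing": 60, "Portfolio Backlog": 90, "Implementing": 270}

def wip_report(items, as_of):
    report = []
    for it in items:
        age_stage = (as_of - it["entered_stage"]).days    # WIP Age, current stage
        age_ttm = (as_of - it["entered_reviewing"]).days  # WIP Age - TTM
        stalled = age_stage > STALL_THRESHOLD.get(it["stage"], 180)
        report.append({"id": it["id"], "stage": it["stage"],
                       "wip_age": age_stage, "wip_age_ttm": age_ttm,
                       "stalled": stalled})
    return report
```

Reviewing this report on the recommended cadence surfaces stalled items, particularly items aging in the Portfolio Backlog, before they become critical.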
Cost in Progress (CIP) Metrics (5)
13. CIP — TTM Total actual cost incurred from entering Reviewing to current date.
14. CIP — Reviewing Cost during Reviewing phase (Initiative Owner time, analysis resources).
15. CIP — Analyzing Cost during Analyzing phase (business case development, subject matter experts).
16. CIP — Portfolio Backlog Cost while waiting (typically minimal; primarily opportunity cost).
17. CIP — Implementing Cost during implementation (DVS capacity, external vendors, infrastructure). Typically the largest cost component.
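If costs are booked against an initiative as a ledger of (phase, amount) entries, the per-phase CIP metrics and CIP TTM reduce to a simple aggregation. A sketch with hypothetical figures:

```python
# Hypothetical cost ledger for one initiative: (phase, amount) entries.
ledger = [
    ("Reviewing", 12_000),
    ("Analyzing", 35_000),
    ("Analyzing", 8_000),
    ("Implementing", 420_000),
    ("Implementing", 150_000),
]

def cip_by_phase(entries):
    """CIP per phase; CIP - TTM is the running total across all phases."""
    totals = {}
    for phase, amount in entries:
        totals[phase] = totals.get(phase, 0) + amount
    totals["TTM"] = sum(amount for _, amount in entries)
    return totals
```

As with flow times, the phase figures sum to the TTM figure, which makes the implementation share of total cost easy to inspect.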
Predictability Metrics (3)
18. Flow Predictability
1 − |Actual Duration − Forecasted Duration| / Forecasted Duration
Example: Forecasted 9 months, actual 12 months → Predictability = 67%
19. Cost Predictability
1 − |Actual Cost − Forecasted Cost| / Forecasted Cost
Example: Forecasted $1M, actual $1.2M → Predictability = 80%
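Both predictability formulas share the same shape, so one helper covers metrics 18 and 19. Flooring the score at zero for extreme misses (actual more than double the forecast) is an assumption added here, not part of the framework definition:

```python
def predictability(actual: float, forecast: float) -> float:
    """1 - |actual - forecast| / forecast, floored at 0 for extreme misses."""
    return max(0.0, 1 - abs(actual - forecast) / forecast)

# Worked examples from the definitions above:
# Flow: forecasted 9 months, actual 12 months -> ~67%
# Cost: forecasted $1M, actual $1.2M -> 80%
flow_score = predictability(actual=12, forecast=9)
cost_score = predictability(actual=1_200_000, forecast=1_000_000)
```

Note that the formula penalizes under-runs and over-runs symmetrically: finishing in 6 months against a 9-month forecast also scores 67%, because the goal is planning accuracy, not speed.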
20. Commitment Stability Count of formal re-baseline decisions after the GO decision.
- 0 = High stability
- 1 = Moderate
- 2+ = Low
Metric Interpretation by Initiative Size
| Metric | Small (<3m) | Medium (3–9m) | Large (9–18m) | Very Large (18–27m) | Strategic (27+m) |
|---|---|---|---|---|---|
| Flow Time TTM | 30–90 days | 90–270 days | 270–540 days | 540–810 days | 810+ days |
| WIP Age threshold | >60 days | >180 days | >365 days | >540 days | Monitor quarterly |
| Flow Predictability | 85%+ | 75%+ | 70%+ | 65%+ | 60%+ |
| Cost Predictability | 85%+ | 75%+ | 70%+ | 65%+ | 60%+ |
| Commitment Stability | 0 ideal | 0–1 acceptable | 1–2 acceptable | 2–3 acceptable | 3+ expected |
The Small column covers only the exceptional sub-3-month items granted portfolio visibility; Small remains outside the standard size categories. Lower predictability thresholds for larger initiatives reflect necessary learning and adaptation, not failure. Strategic initiatives carry inherent uncertainty that cannot be eliminated through better planning.
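The interpretation table can be encoded as a lookup so that health checks apply the right thresholds per size automatically. A sketch covering three of the size categories, with the table's values expressed as days and ratios (the structure and field names are assumptions):

```python
# Assumed encoding of the interpretation table (WIP age in days, predictability as ratios).
THRESHOLDS = {
    "Medium":     {"wip_age_max": 180, "flow_pred_min": 0.75, "cost_pred_min": 0.75},
    "Large":      {"wip_age_max": 365, "flow_pred_min": 0.70, "cost_pred_min": 0.70},
    "Very Large": {"wip_age_max": 540, "flow_pred_min": 0.65, "cost_pred_min": 0.65},
}

def health_flags(size, wip_age, flow_pred, cost_pred):
    """Flag threshold breaches for one initiative of the given size."""
    t = THRESHOLDS[size]
    return {
        "wip_age_breach": wip_age > t["wip_age_max"],
        "flow_pred_breach": flow_pred < t["flow_pred_min"],
        "cost_pred_breach": cost_pred < t["cost_pred_min"],
    }
```

For example, a Large initiative at 400 days of WIP age with 72% flow predictability and 68% cost predictability breaches the WIP age and cost thresholds but not the flow threshold.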
Portfolio-Level Metrics
10 metrics in total across four categories.
Flow Metrics (4)
1. Flow Velocity Initiatives completed per time period. Primary measure of portfolio delivery capacity.
2. Flow Time Median time for initiatives to flow from Reviewing to Done, segmented by size and phase.
3. Flow Load (WIP) Total initiatives in active work, segmented by phase and size. Fundamental flow metric — high WIP causes longer lead times (Little’s Law).
4. Flow Distribution How capacity is distributed across initiative types and value streams. Ensures portfolio investment aligns with strategy.
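The Little's Law relationship behind Flow Load can be made concrete with a back-of-the-envelope calculation: average flow time equals average WIP divided by average throughput. Using hypothetical portfolio figures:

```python
# Little's Law: average flow time = average WIP / average throughput.
# Hypothetical portfolio figures (illustrative only).
avg_wip = 24         # initiatives in active work (Flow Load)
throughput = 2.0     # initiatives completed per month (Flow Velocity)

avg_flow_time_months = avg_wip / throughput

# Halving WIP at the same throughput halves the expected flow time,
# which is why limiting Flow Load is the most direct lever on lead times.
avg_flow_time_halved = (avg_wip / 2) / throughput
```

At these figures an initiative waits a year on average; cutting Flow Load to 12 brings that to six months without delivering any faster.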
Predictability Metrics (2)
5. Flow Predictability Median of individual initiative Flow Predictability scores across the active portfolio.
6. Cost Predictability Median of individual initiative Cost Predictability scores across the active portfolio.
Economic Metrics (2)
7. Cost in Progress (CIP) Total and median cost in active initiatives, segmented by phase and size.
8. Cost Burn Rate Rate of spending per time period. Monitors actual spending pace against budget.
Commitment Metrics (2)
9. Commitment Reliability
(Initiatives delivered as committed / Total with GO decision) × 100
10. Commitment Stability
(Initiatives without re-baseline / Total active with GO) × 100
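The two portfolio commitment formulas above can be sketched directly against a list of post-GO initiative records. The record structure and sample data are hypothetical:

```python
# Hypothetical initiatives that have passed the GO decision (illustrative data).
go_initiatives = [
    {"id": "A", "delivered_as_committed": True,  "active": False, "rebaselined": False},
    {"id": "B", "delivered_as_committed": False, "active": False, "rebaselined": True},
    {"id": "C", "delivered_as_committed": False, "active": True,  "rebaselined": False},
    {"id": "D", "delivered_as_committed": False, "active": True,  "rebaselined": True},
]

def commitment_reliability(items):
    # (Initiatives delivered as committed / Total with GO decision) x 100
    return 100 * sum(i["delivered_as_committed"] for i in items) / len(items)

def commitment_stability(items):
    # (Initiatives without re-baseline / Total active with GO) x 100
    active = [i for i in items if i["active"]]
    return 100 * sum(not i["rebaselined"] for i in active) / len(active)
```

Note the different denominators: reliability divides by everything with a GO decision, while stability looks only at currently active work, matching the metric definitions above.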
Recommended Review Cadence
Mid-PI review (weeks 5–6) — Portfolio health check and early intervention for at-risk initiatives.
End-PI review (Inspect and Adapt) — Complete analysis and improvement planning.
On-demand — When specific initiatives show concerning trends in WIP Age or Cost Burn Rate.
Framework Roots
- SAFe Lean Portfolio Management — Portfolio Kanban, initiative flow, and WSJF as prioritization heuristic
- Lean / Little’s Law — WIP and flow time relationship; flow as the primary optimization target
- Don Reinertsen — Flow economics, cost of delay, and the economics of WIP limits
- Mik Kersten (Project to Product) — Flow metrics applied to software delivery at scale