Concrete Examples of Non-Work Issues

Purpose: Help stakeholders understand what will be excluded from velocity/time metrics


What Are "Non-Work" Issues?

Non-work issues are tickets that were closed without any development work being performed. They typically fall into these categories:

  1. Immediate Rejections (< 1 day): Created and closed immediately
  2. Backlog Cleanup (> 120 days): Sat in backlog for months, then closed without work

Both share one defining characteristic: the issue never entered the "In Progress" status.
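The rule above can be sketched as a small predicate. This is a minimal illustration, assuming the issue's status history is available as an ordered list of status names; the function name and input shape are hypothetical, not an existing API.

```python
def is_non_work(status_history: list[str]) -> bool:
    """An issue is non-work if it was closed without ever
    entering the "In Progress" status."""
    return "In Progress" not in status_history

# UAS-905-style rejection: To Do -> Done, no work performed
print(is_non_work(["To Do", "Done"]))                   # True (exclude)
# UAS-1047-style quick fix: entered In Progress, so real work
print(is_non_work(["To Do", "In Progress", "Done"]))    # False (keep)
```

In practice the status history would come from the issue tracker's changelog rather than a hand-built list.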


Real Examples from UAS Team

Category 1: Immediate Rejections (< 1 day)

Example: UAS-905 (Task)

  • Lead Time: 0.78 days (19 hours)
  • Pattern: To Do → Done
  • Likely Reason: Duplicate, invalid, or already completed elsewhere

Example: UAS-996 (Bug)

  • Lead Time: 1.02 days (24 hours)
  • Pattern: To Do → Done
  • Likely Reason: Cannot reproduce, duplicate bug report, or invalid

Count: 2 issues in dataset (UAS-996, at 1.02 days, is borderline for the threshold)


Category 2: Backlog Cleanup (> 120 days)

Example: UAS-520 (TechTask)

  • Lead Time: 294 days (9.7 months)
  • Created: 2024-10-15
  • Closed: 2025-08-06
  • Pattern: To Do → Done
  • Likely Reason: No longer needed, superseded by other work

Example: UAS-204 (Task)

  • Lead Time: 267 days (8.8 months)
  • Created: 2024-11-12
  • Closed: 2025-08-06
  • Pattern: To Do → Done
  • Likely Reason: Backlog grooming, no longer relevant

Example: UAS-637 (Epic)

  • Lead Time: 294 days (9.7 months)
  • Created: 2024-10-17
  • Closed: 2025-08-06
  • Pattern: To Do → Done
  • Likely Reason: Epic abandoned or completed via other means

Count: 13 issues in dataset

Notable Pattern: Many closed on the same date (2025-08-06), suggesting a deliberate backlog cleanup session.
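This pattern can be surfaced automatically by counting skip-pattern closures per date. A minimal sketch, using a subset of closed dates from the appendix table; the threshold of 5 closures per day is an illustrative choice, not part of the analysis.

```python
from collections import Counter

# Closed dates for skip-pattern issues (subset from the appendix table)
closed_dates = [
    "2025-08-06", "2025-08-06", "2025-08-06", "2025-08-06",
    "2025-08-06", "2025-07-31", "2025-09-06",
]

counts = Counter(closed_dates)
# Flag any date on which many skip-pattern issues were closed at once;
# such a date likely marks a deliberate backlog cleanup session.
cleanup_days = [d for d, n in counts.items() if n >= 5]
print(cleanup_days)  # ['2025-08-06']
```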


Category 3: Quick Fixes (KEPT - These Are Real Work)

Example: UAS-1047 (Bug)

  • Lead Time: 0.19 days (5 hours)
  • Pattern: To Do → In Progress → Done
  • Cycle Time: 0.18 days
  • Status: KEPT - This entered In Progress, so it's real work

Example: UAS-987 (Task)

  • Lead Time: 0.87 days (21 hours)
  • Pattern: To Do → In Progress → Done
  • Cycle Time: 0.09 days (2 hours)
  • Status: KEPT - Actual development work was performed

Key Difference: These issues entered "In Progress", indicating someone worked on them, even if briefly.


Comparison Table

Issue     Type      Lead Time   Pattern                      Classification   Reason
UAS-905   Task      0.78 days   To Do → Done                 EXCLUDE          Immediate rejection
UAS-996   Bug       1.02 days   To Do → Done                 EXCLUDE          Immediate rejection
UAS-520   TechTask  294 days    To Do → Done                 EXCLUDE          Backlog cleanup
UAS-637   Epic      294 days    To Do → Done                 EXCLUDE          Backlog cleanup
UAS-1047  Bug       0.19 days   To Do → In Progress → Done   KEEP             Real work (entered In Progress)
UAS-987   Task      0.87 days   To Do → In Progress → Done   KEEP             Real work (entered In Progress)

Impact on Metrics

Current Metrics (Including Non-Work)

  • Velocity: 175 issues/period
  • Average Lead Time: 54.53 days
  • Median Lead Time: 21.99 days

Adjusted Metrics (Excluding Non-Work)

Using the conservative approach (skip-pattern issues with < 1 day or > 120 days lead time):

  • Velocity: 160 issues/period (-8.6%)
  • Average Lead Time: ~50 days (-8.3%)
  • Median Lead Time: ~20 days (-9.0%)
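The adjustment above can be sketched as a simple filter before the metric calculations. The issue tuples below are illustrative made-up data, not the real dataset; the helper name `is_excluded` is hypothetical.

```python
from statistics import mean

# Each tuple is (lead_time_days, entered_in_progress); values are made up
issues = [
    (0.78, False),   # immediate rejection, never In Progress
    (294.0, False),  # backlog cleanup, never In Progress
    (0.19, True),    # quick fix: entered In Progress, real work
    (21.99, True),
    (54.5, True),
]

def is_excluded(lead_time: float, in_progress: bool) -> bool:
    """Conservative skip-pattern rule: exclude only issues that never
    entered In Progress AND sat < 1 day or > 120 days."""
    return not in_progress and (lead_time < 1 or lead_time > 120)

kept = [lt for lt, ip in issues if not is_excluded(lt, ip)]
print(len(kept))               # adjusted velocity for this sample: 3
print(round(mean(kept), 2))    # adjusted average lead time
```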

What This Means

Before: The team "completed" 175 issues, but 15 of those required zero development work.

After: The team completed 160 issues of actual work:

  • More accurate representation of productivity
  • Better input for trend analysis and capacity planning


Why This Matters

Problem 1: Inflated Velocity

Scenario: Team has backlog cleanup session

  • Close 20 old issues in one day (all To Do → Done)
  • Velocity spikes from 30 to 50 issues/sprint
  • False signal - team didn't actually work faster

Problem 2: Skewed Lead Time

Current Calculation:

Issue created Jan 1, sits in backlog until Aug 1 (212 days), then closed without work
Lead time = 212 days

This inflates average lead time, making the team appear slower than they are.

Adjusted Calculation:

Issue excluded from lead time calculation
Only issues with actual work (In Progress) included
More accurate reflection of development speed

Problem 3: Misleading Comparisons

Comparing Teams:

  • Team A: Does regular backlog cleanup (velocity includes non-work)
  • Team B: Rarely cleans backlog (velocity is pure work)
  • Result: Unfair comparison; Team A appears more productive


Validation Questions

To validate whether an issue should be excluded, ask:

  1. Did anyone develop/code anything for this issue?

    • Yes → KEEP
    • No → Consider excluding
  2. Did it enter "In Progress" status?

    • Yes → KEEP (even if brief)
    • No → Potential non-work
  3. Why was it closed?

    • Duplicate → EXCLUDE
    • Won't fix → EXCLUDE
    • No longer needed → EXCLUDE
    • Completed → KEEP
    • Fixed → KEEP
  4. How long did it sit in backlog?

    • < 1 day: Immediate rejection → EXCLUDE
    • 1-120 days: Possibly real work → KEEP (unless resolution says otherwise)
    • > 120 days: Likely backlog cleanup → EXCLUDE
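The four validation questions combine into a single decision rule. A hedged sketch follows; the resolution strings mirror the list above, but the function signature and the field names are assumptions about what the tracker exposes, not an existing API.

```python
# Resolutions from question 3 that mark an issue as non-work
EXCLUDE_RESOLUTIONS = {"Duplicate", "Won't fix", "No longer needed"}

def should_exclude(entered_in_progress: bool,
                   resolution: str,
                   lead_time_days: float) -> bool:
    """Apply the validation questions in order."""
    if entered_in_progress:
        return False                      # questions 1-2: real work, keep
    if resolution in EXCLUDE_RESOLUTIONS:
        return True                       # question 3: closed without work
    # Question 4: skip pattern with no telling resolution,
    # fall back to the lead-time thresholds
    return lead_time_days < 1 or lead_time_days > 120

print(should_exclude(False, "Duplicate", 0.78))   # True
print(should_exclude(True, "Fixed", 0.19))        # False
print(should_exclude(False, "Done", 60))          # False (1-120 days, keep)
```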


Stakeholder Decision Points

Decision 1: Threshold for "Backlog Cleanup"

Options:

  • 90 days (more aggressive, excludes more issues)
  • 120 days (recommended, balanced approach)
  • 180 days (more conservative, excludes fewer issues)

Recommendation: 120 days

  • Captures obvious backlog cleanup
  • Preserves issues that may have been legitimately delayed

Decision 2: Report Both Metrics?

Option A: Show only adjusted metrics

  • Simpler, cleaner
  • But hides the filtering

Option B: Show both raw and adjusted (recommended)

  • More transparent
  • Stakeholders can see what is excluded

Example:

Velocity: 160 issues (175 total, 15 non-work excluded)
Lead Time: 50.2 days (adjusted), 54.5 days (raw)
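The Option B format above can be produced by a small formatter. A sketch, using the figures from this document; the function name and layout are a suggestion, not an agreed reporting template.

```python
def format_report(raw_velocity: int, adj_velocity: int,
                  raw_lead: float, adj_lead: float) -> str:
    """Render both raw and adjusted metrics so the filtering is visible."""
    excluded = raw_velocity - adj_velocity
    return (
        f"Velocity: {adj_velocity} issues "
        f"({raw_velocity} total, {excluded} non-work excluded)\n"
        f"Lead Time: {adj_lead} days (adjusted), {raw_lead} days (raw)"
    )

print(format_report(175, 160, 54.5, 50.2))
```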

Decision 3: Retroactive Adjustment?

Question: Should we recalculate historical metrics?

Options:

  • Yes: More accurate trend analysis
  • No: Could cause confusion about "changing" past metrics

Recommendation: Yes, but clearly document the methodology change


Next Steps

  1. Review these examples with team leads
  2. Validate classification logic against domain knowledge
  3. Approve threshold for backlog cleanup (120 days?)
  4. Decide on reporting format (show both metrics?)
  5. Implement filter in calculators
  6. Test on other teams to validate consistency

Appendix: All Skip Pattern Issues (UAS)

Immediate Rejections (≈ 1 day or less)

Issue     Type   Lead Time   Created      Closed
UAS-905   Task   0.78d       2025-07-30   2025-07-31
UAS-996   Bug    1.02d       2025-09-05   2025-09-06

Backlog Cleanup (> 120 days)

Issue     Type      Lead Time   Created      Closed
UAS-520   TechTask  294d        2024-10-15   2025-08-06
UAS-204   Task      267d        2024-11-12   2025-08-06
UAS-359   TechTask  267d        2024-11-12   2025-08-06
UAS-365   TechTask  263d        2024-11-16   2025-08-06
UAS-388   Task      257d        2024-11-22   2025-08-06
UAS-399   Task      251d        2024-11-28   2025-08-06
UAS-597   Bug       183d        2025-02-03   2025-08-06
UAS-637   Epic      294d        2024-10-17   2025-08-06
UAS-649   Task      278d        2024-11-02   2025-08-06
UAS-687   Task      262d        2024-11-18   2025-08-06
UAS-712   TechTask  246d        2024-12-04   2025-08-06
UAS-729   Task      237d        2024-12-13   2025-08-06
UAS-901   Task      212d        2025-01-07   2025-08-06

Total for Conservative Approach: 15 issues (8.6% of completed)


Contact: Share feedback with team leads by [date]