
JIRA Non-Work Issue Analysis - UAS Team

Analysis Date: 2025-11-26
Dataset: UAS team (238 total issues, 175 completed)

Executive Summary

Key Finding: 21.7% of completed issues (38 out of 175) were closed without any development work being done.

These issues went directly from "To Do" to "Done" status, never entering "In Progress", indicating they were closed for administrative reasons (duplicate, won't fix, no longer needed, etc.).

1. Cycle Time Analysis

Metric Count Percentage
Issues with cycle time (entered In Progress) 137 78.3%
Issues WITHOUT cycle time (never In Progress) 38 21.7%

Examples of Non-Work Issues

All of these went directly from "To Do" → "Done":

  • UAS-520 (TechTask): To Do → Done
  • UAS-204 (Task): To Do → Done
  • UAS-359 (TechTask): To Do → Done
  • UAS-365 (TechTask): To Do → Done
  • UAS-388 (Task): To Do → Done

2. Lead Time Distribution

All Completed Issues (n=175)

Bucket Count Percentage
< 1 day 9 5.1%
1-7 days 31 17.7%
7-14 days 29 16.6%
14-30 days 34 19.4%
> 30 days 72 41.1%

Statistics:

  • Min: 0.10 days
  • Max: 349.84 days
  • Average: 54.53 days
  • Median: 21.99 days

Skip Pattern Issues Only (To Do → Done, n=38)

Bucket Count Percentage
< 1 day 1 2.6%
1-7 days 1 2.6%
7-30 days 3 7.9%
> 30 days 33 86.8%

Statistics:

  • Min: 0.78 days
  • Max: 323.97 days
  • Average: 124.26 days (2.3x higher than all issues)
  • Median: 117.96 days

Key Insight: Most skip pattern issues (86.8%) sat in the backlog for over 30 days before being closed without work. This suggests they were backlog cleanup activities rather than quick rejections.
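The skip-pattern bucketing above can be reproduced with a small helper. This is a sketch; the boundary handling (lower bound inclusive, upper bound exclusive) is an assumption based on the table labels:

```python
from collections import Counter

def lead_time_bucket(days):
    """Map a lead time in days to the skip-pattern distribution buckets."""
    if days < 1:
        return "< 1 day"
    if days < 7:
        return "1-7 days"
    if days < 30:
        return "7-30 days"
    return "> 30 days"

# Example: bucket the sample skip-pattern lead times from the appendix.
sample = [0.78, 1.02, 104, 112, 294]
print(Counter(lead_time_bucket(d) for d in sample))
```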

3. Resolution Field Analysis

Finding: No resolution field data is available in the current dataset.

The JIRA API response does not include the resolution field (e.g., "Won't Do", "Duplicate", "Cannot Reproduce"). This data may not have been fetched or may not be available in the current data model.

Recommendation: Enhance data fetching to include resolution field for better classification.
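Once the fetch includes the field, extracting it is straightforward. In Jira's REST API, `resolution` comes back on each issue as either `null` or an object with a `name` key; the helper below is a sketch under that assumption, and the example payloads are illustrative, not real responses:

```python
def extract_resolution(issue_json):
    """Return the resolution name from a Jira REST issue payload, or None."""
    resolution = issue_json.get("fields", {}).get("resolution")
    return resolution["name"] if resolution else None

# Example payloads shaped like Jira search-response items (hypothetical values).
resolved = {"key": "UAS-520", "fields": {"resolution": {"name": "Won't Do"}}}
unresolved = {"key": "UAS-204", "fields": {"resolution": None}}
print(extract_resolution(resolved))    # Won't Do
print(extract_resolution(unresolved))  # None
```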

4. Status Transition Patterns

Pattern: Direct Skip (To Do → Done)

Count: 38 issues (21.7% of completed)

This pattern clearly indicates no development work was performed:

  • Issue created in "To Do"
  • Never moved to "In Progress" or any development status
  • Directly transitioned to "Done"
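Given an issue's status-transition history, the skip pattern can be detected directly. The sketch below assumes transitions are dicts with a `to_status` key, as in the implementation code later in this report; the status names are assumptions matching the UAS workflow described here:

```python
# Extend this set if the workflow has additional development statuses.
IN_PROGRESS_STATUSES = {"In Progress"}

def is_direct_skip(transitions):
    """True if the issue reached Done without ever passing a development status."""
    visited = [t["to_status"] for t in transitions]
    return "Done" in visited and not any(s in IN_PROGRESS_STATUSES for s in visited)

# UAS-520-style history: To Do -> Done, no development status in between.
print(is_direct_skip([{"to_status": "Done"}]))                                # True
print(is_direct_skip([{"to_status": "In Progress"}, {"to_status": "Done"}]))  # False
```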

Issue Type Breakdown

Issue Type Count Percentage of Skip Pattern
Task 20 52.6%
TechTask 8 21.1%
Sub-task 6 15.8%
Bug 3 7.9%
Epic 1 2.6%

Notable: Even 3 bugs were closed without entering development, suggesting they were duplicates or invalid reports.

5. Classification Criteria Comparison

Option 1: Resolution-based (Most Reliable)

Status: Cannot implement - resolution field not available in current data

Option 2: Cycle Time-based

Criteria: Exclude issues that never entered "In Progress" status

Impact: 38 issues (21.7% of completed)

Pros:

  • Clear indicator of no actual work
  • Easy to implement with current data
  • Removes obvious non-work items

Cons:

  • May exclude legitimate quick fixes that were resolved immediately
  • Doesn't account for issues that entered "In Progress" but were immediately abandoned

Option 3: Hybrid

Criteria: Exclude issues that meet ANY of:

  1. Never entered "In Progress" status AND lead time < 7 days
  2. Never entered "In Progress" status AND lead time > 90 days (likely backlog cleanup)
  3. Issue type is "Spike" or "Research" with no deliverable

Impact: Would exclude ~15-20 issues (more conservative)

Pros:

  • More nuanced approach
  • Preserves issues that sat in backlog briefly before being validly closed
  • Focuses on clear non-work patterns

Cons:

  • More complex logic
  • Requires tuning thresholds
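The three hybrid criteria can be sketched as a single predicate. Threshold values come from the option above; the `has_deliverable` flag is a hypothetical input, since the current dataset does not record deliverables:

```python
def is_non_work_hybrid(ever_in_progress, lead_time_days, issue_type,
                       has_deliverable=True):
    """Apply the three hybrid exclusion rules; True means exclude the issue."""
    if not ever_in_progress and lead_time_days < 7:
        return True  # quick administrative close
    if not ever_in_progress and lead_time_days > 90:
        return True  # likely backlog cleanup
    if issue_type in ("Spike", "Research") and not has_deliverable:
        return True  # hypothetical rule: no deliverable recorded
    return False

print(is_non_work_hybrid(False, 112, "Task"))  # True (backlog cleanup)
print(is_non_work_hybrid(False, 30, "Task"))   # False (kept by design)
print(is_non_work_hybrid(True, 5, "Bug"))      # False (real work done)
```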

6. Recommendations

Immediate Action (Conservative)

Filter out issues that meet BOTH criteria:

  • Never entered "In Progress" status
  • Lead time < 1 day OR lead time > 120 days

Estimated exclusions: 5-10 issues (2.9-5.7% of completed)

Future Enhancement (When Resolution Field Available)

Add resolution-based filtering:

NON_WORK_RESOLUTIONS = [
    "Won't Do",
    "Won't Fix",
    "Duplicate",
    "Cannot Reproduce",
    "Invalid",
    "Declined",
    "Abandoned",
    "Withdrawn"
]

Implementation Code

def is_non_work_issue(issue, transitions):
    """
    Determine if an issue should be excluded from velocity/time metrics.

    Args:
        issue: Issue dict with created_at, resolved_at, status
        transitions: List of status transitions

    Returns:
        bool: True if issue is non-work
    """
    # Check if the issue ever entered an in-progress status.
    # (get_status_category() and parse_date() are helpers defined elsewhere
    # in the analysis script.)
    ever_in_progress = any(
        get_status_category(t['to_status']) == 'in_progress'
        for t in transitions
    )

    if not ever_in_progress and issue['resolved_at']:
        # Calculate lead time
        created = parse_date(issue['created_at'])
        resolved = parse_date(issue['resolved_at'])
        lead_time_days = (resolved - created).total_seconds() / 86400

        # Exclude if very quick (< 1 day) or very old (> 120 days)
        if lead_time_days < 1 or lead_time_days > 120:
            return True

    return False
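A self-contained usage sketch of the function above. `get_status_category` and `parse_date` are stand-in helpers written for this example; the real analysis script presumably defines its own versions:

```python
from datetime import datetime

def get_status_category(status):
    # Assumed mapping; the real script may derive this from Jira's statusCategory.
    categories = {"To Do": "to_do", "In Progress": "in_progress", "Done": "done"}
    return categories.get(status, "unknown")

def parse_date(value):
    # Assumes ISO-8601 timestamps.
    return datetime.fromisoformat(value)

def is_non_work_issue(issue, transitions):
    """True if the issue should be excluded from velocity/time metrics."""
    ever_in_progress = any(
        get_status_category(t["to_status"]) == "in_progress" for t in transitions
    )
    if not ever_in_progress and issue["resolved_at"]:
        created = parse_date(issue["created_at"])
        resolved = parse_date(issue["resolved_at"])
        lead_time_days = (resolved - created).total_seconds() / 86400
        if lead_time_days < 1 or lead_time_days > 120:
            return True
    return False

# UAS-905-style quick close: To Do -> Done in under a day.
quick = {"created_at": "2025-07-30T12:00:00", "resolved_at": "2025-07-31T06:00:00"}
print(is_non_work_issue(quick, [{"to_status": "Done"}]))  # True
```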

7. Impact on Metrics

Current vs. Adjusted (Conservative Approach)

Metric Current Adjusted Difference
Velocity (issues/period) 175 ~170 -2.9%
Average Lead Time 54.53 days ~52 days -2.53 days
Median Lead Time 21.99 days ~21 days -0.99 days

Current vs. Adjusted (Aggressive - All Skip Patterns)

Metric Current Adjusted Difference
Velocity (issues/period) 175 137 -21.7%
Average Lead Time 54.53 days ~45 days -9.53 days
Median Lead Time 21.99 days ~18 days -3.99 days
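Recomputing adjusted figures from filtered lead times needs only the standard library. A sketch (function name and return shape are illustrative):

```python
from statistics import mean, median

def adjusted_metrics(lead_times_days, exclude):
    """Velocity and lead-time stats after dropping excluded (non-work) issues."""
    kept = [lt for lt, drop in zip(lead_times_days, exclude) if not drop]
    return {
        "velocity": len(kept),
        "avg_lead_time": round(mean(kept), 2),
        "median_lead_time": round(median(kept), 2),
    }

# Toy example: five issues, the last two flagged as non-work.
print(adjusted_metrics([2, 10, 30, 120, 300], [False, False, False, True, True]))
```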

8. Next Steps

  1. Immediate: Implement conservative filtering (< 1 day or > 120 days skip pattern)
  2. Short-term: Enhance data fetching to include resolution field
  3. Medium-term: Analyze other teams to validate pattern consistency
  4. Long-term: Consider workflow changes to prevent backlog pollution

9. Questions for Stakeholders

  1. Should issues that sat in backlog for 100+ days before being closed count toward velocity?
  2. Are there specific issue types that should always be excluded (e.g., Spikes)?
  3. Would you like to track "administrative work" separately from "development work"?
  4. Should we report both raw and adjusted metrics for transparency?

Appendix: Sample Skip Pattern Issues

Issue Type Created Resolved Lead Time Pattern
UAS-905 Task 2025-07-30 2025-07-31 0.78 days Quick close
UAS-996 Bug 2025-09-05 2025-09-06 1.02 days Quick close
UAS-712 TechTask 2025-04-24 2025-08-06 104 days Backlog cleanup
UAS-901 Task 2025-04-16 2025-08-06 112 days Backlog cleanup
UAS-637 Epic 2024-10-17 2025-08-06 294 days Backlog cleanup

Analysis Tool: /Users/maikel.lammers/projects/jira-kpi-system/analyze_non_work_issues.py