Sprint History, Metrics & Audit (Authoritative)

Applies to all closed sprints on the Tinggit Platform
Defines how sprint history is preserved, measured, audited, and trusted


1. Purpose

The purpose of this page is to define how sprint outcomes are preserved and analyzed after a sprint is closed.

This process ensures that:

  • Sprint history is permanent and immutable
  • Metrics can be recalculated months or years later
  • Velocity trends remain trustworthy
  • Spillover and quality issues are visible
  • Internal and external audits are supported without rework

bq. Once a sprint is closed, its data must never be rewritten.


2. Scope

This applies to:

  • All closed sprints
  • All issues that were part of any sprint
  • All teams using Scrum-based delivery
  • All metrics derived from sprint execution

3. Source of Truth for Sprint History

Sprint history does not rely on Sprint Versions.

|_. Artifact |_. Purpose |_. Mutability |
| Completed In Sprint | Permanent sprint completion record | Immutable |
| Spillover Reason | Reason for unfinished work | Immutable |
| Issue Status History | Timeline of execution | Immutable |
| Sprint Wiki | Human context and narrative | Immutable after close |

bq. Sprint Versions are execution containers, not history storage.


4. What Is Preserved After Sprint Close

After Sprint Close, the following data must always remain available:

  • What was completed
  • What was not completed
  • Why work spilled over
  • Who worked on what
  • When work was completed
  • What the sprint goal was

This information enables:

  • Reliable retrospectives
  • Trend analysis
  • Audit readiness

5. Sprint Metrics (Authoritative Definitions)


5.1 Velocity

Definition:
Total Story Points completed in a sprint.

How it is calculated:

Tracker = Story
Completed In Sprint = Sprint YYYY-NN
Status = Done or Closed
Sum(Story Points)

Rules:
  • Only completed stories count
  • Tasks and bugs do not contribute
  • Velocity is never adjusted retroactively
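The velocity rules above can be sketched in code. This is a minimal illustration, assuming issues are exported as plain dicts; the keys (`tracker`, `completed_in_sprint`, `status`, `story_points`) are illustrative placeholders, not a real Redmine export schema.

```python
def velocity(issues, sprint):
    """Sum Story Points of stories completed in the given sprint."""
    return sum(
        i.get("story_points", 0)
        for i in issues
        if i.get("tracker") == "Story"              # only stories count
        and i.get("completed_in_sprint") == sprint  # completed in this sprint
        and i.get("status") in ("Done", "Closed")   # finished work only
    )
```

Because the filter reads only the immutable Completed In Sprint field, re-running this months later yields the same number.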

5.2 Throughput

Definition:
Number of stories completed in a sprint.

How it is calculated:

Tracker = Story
Completed In Sprint = Sprint YYYY-NN
Status = Done or Closed

Metric:
  • Count of issues
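Throughput is the same filter as velocity, counting issues instead of summing points. A minimal sketch under the same illustrative field names:

```python
def throughput(issues, sprint):
    """Count stories completed in the given sprint."""
    return sum(
        1
        for i in issues
        if i.get("tracker") == "Story"
        and i.get("completed_in_sprint") == sprint
        and i.get("status") in ("Done", "Closed")
    )
```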

5.3 Spillover Rate

Definition:
Work not completed in the sprint.

How it is identified:

Spillover Reason is not empty
Completed In Sprint is empty

Spillover is analyzed by:
  • Reason
  • Type of work
  • Repetition across sprints
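Grouping spillover by reason can be sketched with a tally, again assuming illustrative dict keys (`spillover_reason`, `completed_in_sprint`):

```python
from collections import Counter

def spillover_by_reason(issues):
    """Tally unfinished sprint work by Spillover Reason."""
    return Counter(
        i["spillover_reason"]
        for i in issues
        if i.get("spillover_reason")              # reason recorded
        and not i.get("completed_in_sprint")      # work not completed
    )
```

The same tally run across several sprints exposes repeated reasons, which is the "repetition across sprints" signal above.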

5.4 Sprint Completion Ratio

Definition:
Completed stories ÷ committed stories.

This is derived from:
  • Sprint commitment (Sprint Wiki)
  • Completed In Sprint data

Used for:
  • Planning accuracy
  • Continuous improvement
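The ratio itself is simple arithmetic; the only subtlety worth encoding is guarding against an empty commitment:

```python
def completion_ratio(completed_stories, committed_stories):
    """Completed stories divided by committed stories, as a fraction."""
    if committed_stories == 0:
        return 0.0  # no commitment recorded; avoid division by zero
    return completed_stories / committed_stories
```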

6. Saved Queries for Sprint History

The following saved queries are mandatory and reusable.


Q7 – Sprint History: Completed Items

  • Filters:
    • Completed In Sprint = Sprint YYYY-NN
    • Status = Done or Closed
  • Saved as:
    Sprint History – Completed items (Sprint YYYY-NN)

Q8 – Sprint Metrics: Velocity

  • Filters:
    • Tracker = Story
    • Completed In Sprint = Sprint YYYY-NN
  • Options:
    • Sum: Story Points
  • Saved as:
    Sprint Metrics – Velocity (Sprint YYYY-NN)

Q9 – Sprint Metrics: Throughput

  • Filters:
    • Tracker = Story
    • Completed In Sprint = Sprint YYYY-NN
  • Saved as:
    Sprint Metrics – Throughput (Sprint YYYY-NN)

Q10 – Sprint Metrics: Spillover Analysis

  • Filters:
    • Spillover Reason is not empty
    • Completed In Sprint is empty
  • Options:
    • Group by Spillover Reason
  • Saved as:
    Sprint Metrics – Spillover (Reasons)
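The saved queries above live in the Redmine UI, but the same filters can be expressed as REST query parameters for scripted reporting. This is a sketch only: Redmine filters custom fields as `cf_<id>`, and the IDs below (tracker 1 for Story, custom field 42 for Completed In Sprint) are assumptions you must replace with your instance's real IDs.

```python
from urllib.parse import urlencode

def velocity_query_params(sprint):
    """Build the Q8 (Velocity) filter set as REST query parameters."""
    return urlencode({
        "tracker_id": 1,      # assumed ID of the Story tracker
        "cf_42": sprint,      # assumed ID of the "Completed In Sprint" field
        "status_id": "closed",
    })
```

The result is appended to `<base_url>/issues.json?` when calling the API.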

7. Sprint Audit Readiness

Sprint audits must be able to answer the following questions at any time:

  • What was completed in Sprint YYYY-NN?
  • What was committed vs completed?
  • Why did items spill over?
  • Was the Sprint Goal achieved?
  • Were metrics manipulated?

If any of these cannot be answered, the sprint process failed.


8. Audit Procedure (Step-by-Step)

To audit a past sprint:

1. Open Sprint Wiki page
2. Review Sprint Goal and scope
3. Run Sprint History – Completed items
4. Review velocity and throughput
5. Review spillover analysis
6. Check issue histories for anomalies

No additional documentation should be required.
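The audit steps above can be turned into an automated checklist. A minimal sketch, assuming the sprint's artifacts have been gathered into one dict whose keys (`goal`, `completed`, `spillover`) are illustrative placeholders for your own data sources:

```python
def audit_sprint(sprint_record):
    """Evaluate the audit checklist against a dict of sprint artifacts."""
    return {
        "sprint goal recorded": bool(sprint_record.get("goal")),
        "completed items listed": bool(sprint_record.get("completed")),
        "all spillover has reasons": all(
            i.get("spillover_reason")
            for i in sprint_record.get("spillover", [])
        ),
    }
```

Any `False` in the result marks a question the audit cannot answer.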


9. Scope Change Visibility (Approximation)

Redmine does not natively track mid-sprint additions.

Approximation method:

Fixed version = Sprint YYYY-NN
Created on > Sprint start date

Saved as:
Sprint Audit – Scope added mid-sprint (approx)

This is used for:
  • Retrospectives
  • Process improvement
  • Pattern detection
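The approximation above is just two filters, so it can also be applied to exported data. A sketch with illustrative field names (`fixed_version`, `created_on`):

```python
from datetime import date

def added_mid_sprint(issues, sprint, sprint_start):
    """Approximate mid-sprint scope additions: issues in the sprint
    version that were created after the sprint start date."""
    return [
        i for i in issues
        if i.get("fixed_version") == sprint
        and i["created_on"] > sprint_start
    ]
```

It remains an approximation: an issue created earlier but moved into the sprint mid-flight will not appear.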

10. Common Audit Red Flags

The following indicate process violations:

  • Missing Completed In Sprint values
  • Spillover without reasons
  • Changing Story Points after sprint close
  • Reopening closed sprint issues
  • Editing sprint wiki after closure

These must be escalated.
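The first two red flags are visible on the issue record itself and can be checked mechanically; the remaining three (point edits, reopening, wiki edits) require change-history data not modeled here. A sketch with illustrative field names:

```python
def field_level_red_flags(issue):
    """Check the two field-level red flags on a closed-sprint issue."""
    flags = []
    finished = issue.get("status") in ("Done", "Closed")
    if finished and not issue.get("completed_in_sprint"):
        flags.append("missing Completed In Sprint value")
    if not finished and not issue.get("spillover_reason"):
        flags.append("spillover without reason")
    return flags
```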


11. Relationship to Other Processes

Sprint History & Metrics depends on the processes that precede it in the sprint lifecycle.

Failures earlier in the lifecycle degrade history quality.


12. Final Statement

Sprint history is a strategic asset, not a report.

By preserving immutable sprint data, Tinggit ensures:

  • Honest planning
  • Reliable forecasting
  • Continuous improvement
  • Audit confidence
  • Organizational trust

This page is the single authority for sprint history and metrics.

Updated by Redmine Admin about 2 months ago · 1 revision