Metrics and KPIs

Key performance indicators for measuring SBOM program effectiveness

Metrics transform SBOM programs from activities into measurable outcomes. Without metrics, you cannot demonstrate value, identify problems, or prove improvement. "We generate SBOMs" is an activity. "SBOM-enabled vulnerability assessment now completes in 4 hours instead of 3 days" is a measurable outcome proving operational value. Metrics enable data-driven decision making, executive communication, and continuous improvement.

Effective metrics balance comprehensiveness with practicality—measuring what matters without creating measurement overhead that consumes resources better spent on actual SBOM work. Start with core metrics proving basic program health, expand to advanced metrics as program matures and measurement infrastructure develops.

Coverage Metrics

Products with SBOMs

Definition: Percentage of products with current, valid SBOMs.

Calculation: (Products with valid SBOMs / Total products in scope) × 100

Targets:

  • Level 1 (Basic): 30-50% within 6 months, 60%+ within 12 months
  • Level 2 (Advanced): 85%+ within 18 months, 95%+ within 24 months

Measurement frequency: Monthly

Why it matters: Core program metric. Low coverage means SBOM program provides limited organizational value. High coverage demonstrates comprehensive transparency.

Improvement strategies:

  • Prioritize high-value products first
  • Automate generation to reduce per-product effort
  • Address blockers systematically (legacy systems, tooling gaps, organizational resistance)
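Most percentage metrics on this page share the same shape. A minimal sketch of the coverage calculation, assuming a simple illustrative product record with `in_scope` and `has_valid_sbom` fields (the field names are assumptions, not a standard schema):

```python
def sbom_coverage(products: list[dict]) -> float:
    """Percentage of in-scope products that have a current, valid SBOM."""
    in_scope = [p for p in products if p.get("in_scope", True)]
    if not in_scope:
        return 0.0
    covered = sum(1 for p in in_scope if p.get("has_valid_sbom"))
    return round(100 * covered / len(in_scope), 1)

products = [
    {"name": "api-gateway", "has_valid_sbom": True},
    {"name": "billing", "has_valid_sbom": False},
    {"name": "web-ui", "has_valid_sbom": True},
]
print(sbom_coverage(products))  # 66.7
```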

SBOM Freshness

Definition: Average age of SBOMs compared to software versions they describe.

Calculation: Average(Days between software release and current SBOM update date)

Targets:

  • Level 1 (Basic): Under 30 days average
  • Level 2 (Advanced): Under 3 days average, same-day for 90%+ of releases

Measurement frequency: Weekly

Why it matters: Stale SBOMs undermine vulnerability management and supplier transparency. Teams cannot assess current risk using outdated component information.

Improvement strategies:

  • Automate SBOM generation in CI/CD pipelines
  • Implement update triggers aligned with releases
  • Monitor freshness and alert on staleness
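The freshness calculation can be sketched as an average lag between release and SBOM update; the record format here is illustrative:

```python
from datetime import date

def sbom_freshness_days(records: list[dict]) -> float:
    """Average days between a software release and its SBOM's last update."""
    lags = [(r["sbom_updated"] - r["released"]).days for r in records]
    return sum(lags) / len(lags)

records = [
    {"released": date(2024, 3, 1), "sbom_updated": date(2024, 3, 3)},
    {"released": date(2024, 3, 10), "sbom_updated": date(2024, 3, 10)},
]
print(sbom_freshness_days(records))  # 1.0
```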

Supplier SBOM Collection Rate

Definition: Percentage of critical suppliers providing SBOMs.

Calculation: (Suppliers providing SBOMs / Critical suppliers) × 100

Targets:

  • Level 1 (Basic): 30% within 12 months, 50% within 18 months
  • Level 2 (Advanced): 70% within 24 months, 85% within 36 months

Measurement frequency: Quarterly

Why it matters: Supply chain transparency requires supplier participation. Low collection rate means blind spots in third-party risk.

Improvement strategies:

  • Systematic supplier engagement campaigns
  • Contractual SBOM requirements
  • Provide supplier guidance and support
  • Track supplier responsiveness and escalate non-compliance

Quality Metrics

SBOM Completeness Score

Definition: Average completeness score across all SBOMs, measuring metadata richness and component enumeration.

Calculation: Average(Individual SBOM completeness scores 0-100)

Completeness scoring considers:

  • Component enumeration (transitive depth)
  • PURL presence
  • License information
  • Version data
  • Relationship documentation
  • Supplier/author metadata
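
A completeness score can be computed as a weighted checklist over the criteria above. The weights below are hypothetical; tune them to your own quality policy:

```python
# Hypothetical weights per completeness criterion (sums to 100).
CHECKS = {
    "components_enumerated": 25,
    "purls_present": 20,
    "licenses_present": 15,
    "versions_present": 15,
    "relationships_documented": 15,
    "supplier_metadata": 10,
}

def completeness_score(sbom_flags: dict) -> int:
    """0-100 score from pass/fail flags for each completeness criterion."""
    return sum(w for check, w in CHECKS.items() if sbom_flags.get(check))

score = completeness_score({
    "components_enumerated": True,
    "purls_present": True,
    "licenses_present": False,
    "versions_present": True,
    "relationships_documented": True,
    "supplier_metadata": False,
})
print(score)  # 75
```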

Targets:

  • Level 1 (Basic): 60-70 average score
  • Level 2 (Advanced): 80+ average score

Measurement frequency: Monthly

Why it matters: High-quality SBOMs enable operational use. Low-quality SBOMs limit utility to a basic compliance checkbox.

Improvement strategies:

  • Enhance SBOM generation tool configurations
  • Implement quality gates blocking low-quality SBOMs
  • Automated enrichment workflows
  • Training on metadata importance

Validation Pass Rate

Definition: Percentage of generated SBOMs passing validation on first attempt.

Calculation: (SBOMs passing validation / Total SBOMs generated) × 100

Targets:

  • Level 1 (Basic): 70% pass rate
  • Level 2 (Advanced): 95%+ pass rate

Measurement frequency: Weekly

Why it matters: A low pass rate indicates generation process problems, wasted effort on corrections, and delayed distribution.

Improvement strategies:

  • Root cause analysis on common validation failures
  • Improve generation tool configuration
  • Pre-validation in development environments
  • Training developers on common errors

Signed SBOM Percentage

Definition: Percentage of distributed SBOMs with cryptographic signatures.

Calculation: (Signed SBOMs / Total distributed SBOMs) × 100

Targets:

  • Level 1 (Basic): 50% within 12 months
  • Level 2 (Advanced): 100% within 18 months

Measurement frequency: Monthly

Why it matters: Signatures prove authenticity and detect tampering. Unsigned SBOMs cannot be trusted in security-critical decisions.

Improvement strategies:

  • Automate signing in build pipelines
  • Establish key management infrastructure
  • Publish verification instructions
  • Make signing mandatory for production releases

Operational Metrics

Time to Vulnerability Assessment

Definition: Average hours from CVE publication to complete impact assessment of all affected products.

Calculation: Average(Hours between CVE disclosure timestamp and completion of SBOM-based impact analysis)

Targets:

  • Baseline (pre-SBOM): 48-120 hours
  • Level 1 (Basic): 24-48 hours
  • Level 2 (Advanced): 2-8 hours

Measurement frequency: Per-incident, monthly aggregate

Why it matters: Speed of vulnerability assessment directly impacts exposure window and incident response effectiveness.

Improvement strategies:

  • Automate SBOM querying for vulnerable components
  • Pre-compute component inventories
  • Webhook-driven vulnerability alerts
  • Continuous SBOM monitoring
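The "automate SBOM querying" strategy can be sketched as a scan across stored SBOMs for a vulnerable package URL. This assumes CycloneDX-style JSON with a top-level `components` array; the product names and purls are illustrative:

```python
def affected_products(sboms: dict[str, dict], vuln_purl_prefix: str) -> list[str]:
    """Return products whose SBOM lists a component matching the vulnerable purl.

    Assumes each SBOM is CycloneDX-style JSON with a 'components' array
    of objects carrying a 'purl' field.
    """
    hits = []
    for product, sbom in sboms.items():
        for comp in sbom.get("components", []):
            if comp.get("purl", "").startswith(vuln_purl_prefix):
                hits.append(product)
                break  # one match is enough to flag the product
    return hits

sboms = {
    "api-gateway": {"components": [
        {"purl": "pkg:maven/org.apache.logging.log4j/log4j-core@2.14.1"}]},
    "web-ui": {"components": [
        {"purl": "pkg:npm/lodash@4.17.21"}]},
}
print(affected_products(sboms, "pkg:maven/org.apache.logging.log4j/log4j-core"))
# ['api-gateway']
```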

VEX Publication Latency

Definition: Average hours from vulnerability disclosure to VEX document publication.

Calculation: Average(Hours between CVE publication and VEX document release)

Targets:

  • Level 1 (Basic): Under 7 days for high/critical severity
  • Level 2 (Advanced): Under 48 hours for high/critical, under 7 days for medium

Measurement frequency: Per-VEX, monthly aggregate

Why it matters: Timely VEX documents reduce customer uncertainty and support ticket load. Delayed VEX leaves customers making decisions with incomplete information.

Improvement strategies:

  • Automated VEX workflow initiation
  • Pre-written templates for common scenarios
  • Cross-functional VEX response teams
  • SLA-driven prioritization

Remediation Velocity

Definition: Average days from vulnerability identification to remediation deployment.

Calculation: Average(Days from SBOM-identified vulnerability to deployed patch/workaround)

Targets:

  • Baseline (pre-SBOM): 30-60 days
  • Level 1 (Basic): 14-30 days
  • Level 2 (Advanced): 3-7 days for critical, 14-21 days for high

Measurement frequency: Per-vulnerability, monthly aggregate

Why it matters: Faster remediation reduces exposure and risk. SBOM programs should accelerate remediation through better visibility.

Improvement strategies:

  • Automated remediation ticketing from SBOM analysis
  • Pre-approved emergency change processes
  • Vendor engagement for faster patches
  • Workaround documentation and deployment

Incident Response Time Reduction

Definition: Percentage reduction in time to complete incident impact assessment using SBOMs vs. manual investigation.

Calculation: ((Baseline time - Current time) / Baseline time) × 100

Targets:

  • Level 1 (Basic): 40% reduction
  • Level 2 (Advanced): 70% reduction

Measurement frequency: Quarterly retrospective

Why it matters: Demonstrates tangible SBOM program value in crisis situations where time matters most.

Improvement strategies:

  • Documented SBOM-leveraging incident playbooks
  • Tabletop exercises practicing SBOM-based response
  • Integration with incident management systems
  • Metrics collection during actual incidents
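The reduction formula is straightforward but worth pinning down, since baseline and current times must be measured the same way. A minimal sketch with illustrative numbers:

```python
def time_reduction_pct(baseline_hours: float, current_hours: float) -> float:
    """Percentage reduction in assessment time versus the pre-SBOM baseline."""
    return round(100 * (baseline_hours - current_hours) / baseline_hours, 1)

# Example: a 72-hour manual assessment now completes in 18 hours.
print(time_reduction_pct(72, 18))  # 75.0
```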

Business Metrics

Cost Per Product

Definition: Average annual cost to generate, maintain, and distribute SBOM per product.

Calculation: (Total SBOM program costs / Number of products with SBOMs)

Targets:

  • Level 1 (Basic): €500-1,000 per product annually
  • Level 2 (Advanced): €100-300 per product annually (automation efficiency)

Measurement frequency: Quarterly

Why it matters: Demonstrates efficiency and scalability. Automation should reduce per-unit costs as program scales.

Improvement strategies:

  • Automation to reduce manual effort
  • Shared infrastructure across products
  • Self-service tooling reducing support overhead
  • Process optimization based on experience

Regulatory Compliance Achievement

Definition: Percentage of products meeting regulatory SBOM requirements (NIS2, CRA, federal procurement).

Calculation: (Products meeting compliance / Products subject to regulation) × 100

Targets:

  • Level 1 (Basic): 60% at initial deadline, 100% within 6 months post-deadline
  • Level 2 (Advanced): 100% before regulatory deadline

Measurement frequency: Quarterly

Why it matters: Avoids penalties, maintains market access, demonstrates organizational maturity.

Improvement strategies:

  • Early compliance planning
  • Regulatory requirement tracking
  • Phased implementation prioritizing regulated products
  • Documentation of compliance evidence

Customer Satisfaction

Definition: Customer satisfaction with SBOM quality, accessibility, and responsiveness.

Calculation: Survey-based Net Promoter Score or satisfaction rating

Targets:

  • Level 1 (Basic): 60% satisfied
  • Level 2 (Advanced): 80%+ satisfied

Measurement frequency: Semi-annually via customer survey

Why it matters: Captures the customer perspective on SBOM program value. Low satisfaction indicates the program is missing customer needs even when internal metrics look good.

Improvement strategies:

  • Customer feedback collection and action
  • Accessibility improvements
  • VEX responsiveness
  • Documentation and support quality

Supplier Performance Improvement

Definition: Trend in supplier SBOM quality scores and delivery timeliness over time.

Calculation: Average(Supplier SBOM quality scores quarter-over-quarter)

Targets:

  • Positive trend (improving scores)
  • 70%+ suppliers rated "good" or better

Measurement frequency: Quarterly

Why it matters: Demonstrates supply chain maturity impact. Supplier quality improvement validates engagement effectiveness.

Improvement strategies:

  • Constructive supplier feedback
  • Supplier scorecards and transparency
  • Recognition for high-performing suppliers
  • Escalation for poor performers

Advanced Metrics (Level 2+)

Component Health Score

Definition: Percentage of components meeting health criteria (maintained, not EOL, good security history).

Calculation: (Healthy components / Total unique components) × 100

Targets: 85%+ components rated healthy

Measurement frequency: Monthly

Why it matters: Proactive risk management. Declining health scores predict future vulnerability problems.

Transitive Dependency Depth

Definition: Average depth of transitive dependency trees across products.

Calculation: Average(Maximum dependency depth in each SBOM)

Targets: Monitor for excessive depth (over 6-7 levels indicates complexity risk)

Measurement frequency: Quarterly

Why it matters: Deep dependency trees increase attack surface and remediation complexity.
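
Maximum depth can be computed by walking each component's declared dependencies. The `deps` mapping below is an illustrative stand-in for SBOM relationship data:

```python
def max_depth(deps: dict[str, list[str]], root: str, seen=None) -> int:
    """Maximum dependency depth below root (root itself is depth 0).

    'deps' maps each component to its direct dependencies; the 'seen'
    set guards against cycles in malformed dependency data.
    """
    seen = seen or set()
    if root in seen:
        return 0
    children = deps.get(root, [])
    if not children:
        return 0
    return 1 + max(max_depth(deps, c, seen | {root}) for c in children)

deps = {
    "app": ["web-framework", "logger"],
    "web-framework": ["http-client"],
    "http-client": ["tls-lib"],
}
print(max_depth(deps, "app"))  # 3
```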

License Compliance Rate

Definition: Percentage of SBOMs with no license policy violations.

Calculation: (SBOMs with compliant licenses / Total SBOMs) × 100

Targets: 95%+ compliance rate

Measurement frequency: Monthly

Why it matters: Legal risk management. Violations require remediation that preventive measures could have blocked earlier in the pipeline.
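
A license compliance check can be sketched as a scan against a policy denylist. The denylist entries and component records here are illustrative, not a recommended policy:

```python
# Example policy denylist (SPDX license identifiers); define your own.
DENYLIST = {"AGPL-3.0-only", "SSPL-1.0"}

def sbom_license_compliant(components: list[dict]) -> bool:
    """True if no component in the SBOM carries a denylisted license."""
    return all(c.get("license") not in DENYLIST for c in components)

def license_compliance_rate(sboms: list[list[dict]]) -> float:
    """Percentage of SBOMs with no license policy violations."""
    compliant = sum(1 for s in sboms if sbom_license_compliant(s))
    return round(100 * compliant / len(sboms), 1)

sboms = [
    [{"name": "lib-a", "license": "MIT"}],
    [{"name": "lib-b", "license": "AGPL-3.0-only"}],
]
print(license_compliance_rate(sboms))  # 50.0
```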

Automation Coverage

Definition: Percentage of SBOM generation that is fully automated vs. manual.

Calculation: (Products with automated SBOM generation / Total products) × 100

Targets: 90%+ automated (Level 2)

Measurement frequency: Quarterly

Why it matters: Automation drives efficiency and scalability. Low automation limits program growth.

Metrics Dashboard Design

Executive Dashboard:

  • SBOM coverage percentage
  • Time to vulnerability assessment (trend)
  • Regulatory compliance status
  • Cost per product (efficiency trend)

Operational Dashboard:

  • SBOM freshness by product
  • Validation pass rate (weekly trend)
  • VEX publication latency
  • Quality scores distribution

Supplier Dashboard:

  • Supplier compliance rate
  • Supplier quality score distribution
  • Response time trends
  • Top/bottom performers

Metrics Collection Best Practices

Automate collection: Manual metrics become stale and burdensome. Automate wherever possible.

Consistent definitions: Document how each metric is calculated. Ensure consistency over time.

Trend analysis: Single data points have limited value. Track trends revealing improvement or degradation.

Context matters: Metrics without context mislead. "50% SBOM coverage" is poor for a mature program but excellent for a 3-month pilot.

Actionable insights: Metrics should drive decisions. If a metric doesn't inform action, reconsider collecting it.

Regular review: Monthly or quarterly metric reviews with stakeholders. Discuss trends, celebrate improvements, address declines.
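
Putting the practices above together, automated collection might assemble a periodic snapshot of core metrics for the dashboards described earlier. The record fields (`has_valid_sbom`, `sbom_updated`) are illustrative assumptions, not a standard schema:

```python
from datetime import date

def collect_metrics(products: list[dict], today: date) -> dict:
    """Assemble a dated snapshot of core metrics for dashboard trend tracking."""
    total = len(products)
    covered = [p for p in products if p.get("has_valid_sbom")]
    ages = [(today - p["sbom_updated"]).days for p in covered]
    return {
        "date": today.isoformat(),
        "coverage_pct": round(100 * len(covered) / total, 1) if total else 0.0,
        "avg_sbom_age_days": round(sum(ages) / len(ages), 1) if ages else None,
    }

products = [
    {"name": "api-gateway", "has_valid_sbom": True, "sbom_updated": date(2024, 6, 1)},
    {"name": "billing", "has_valid_sbom": False},
]
print(collect_metrics(products, date(2024, 6, 8)))
# {'date': '2024-06-08', 'coverage_pct': 50.0, 'avg_sbom_age_days': 7.0}
```

Running a job like this on a schedule, with consistent definitions and dated snapshots, is what makes trend analysis possible.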
