RedSheep Security
Advanced — Lesson 30

CTI Metrics & Program Evaluation

10 min read

A CTI program that cannot demonstrate its value is a program at risk of losing funding, headcount, and organizational support. Metrics provide the evidence that intelligence work contributes to organizational security outcomes. However, measuring intelligence is inherently difficult — the best outcome (an attack prevented because intelligence enabled early detection) is often invisible. This lesson covers how to build a metrics framework that captures meaningful indicators of CTI program performance and presents them effectively to leadership.

Learning Objectives

  • Distinguish between input, process, output, and outcome metrics for CTI programs
  • Identify and avoid vanity metrics that look impressive but lack substance
  • Build a practical CTI metrics dashboard aligned to organizational priorities
  • Present metrics to leadership in terms of business risk and value
  • Assess CTI program maturity using established frameworks

Why Metrics Matter

CTI programs exist in competition with every other security investment for budget and staffing. Without metrics, intelligence teams rely on anecdotal evidence ("our report helped the SOC last quarter") to justify their existence. This is insufficient for several reasons:

  • Resource allocation: Leadership needs data to decide whether to invest more in CTI, maintain current levels, or redirect resources
  • Program improvement: You cannot improve what you do not measure. Metrics reveal bottlenecks, gaps, and inefficiencies in your intelligence cycle
  • Stakeholder alignment: Metrics create a shared understanding of what the CTI program does and whether it meets expectations
  • Accountability: Metrics establish clear expectations and track whether the program delivers on its mission

Key Principle: The purpose of CTI metrics is not to prove the team is busy. It is to demonstrate that intelligence work improves the organization's security posture and decision-making.

The Four Categories of CTI Metrics

A comprehensive metrics framework spans four categories, each measuring a different aspect of the intelligence function.

Input Metrics

Input metrics measure what goes into the intelligence process — the raw materials the team consumes.

Metric | What It Measures | Example
Intelligence sources monitored | Breadth of collection | 12 commercial feeds, 8 OSINT sources, 3 ISAC feeds
Reports/articles consumed | Volume of intake | 340 reports reviewed this quarter
IOCs ingested | Volume of indicator data | 45,000 indicators ingested from all feeds
Intelligence requirements active | Scope of the program | 15 PIRs actively tracked
Sharing community memberships | Collaborative engagement | Active in 3 ISACs, 2 informal sharing groups

Caution: Input metrics alone are meaningless. Ingesting 100,000 IOCs means nothing if they are not enriched, validated, and operationalized. Input metrics are necessary context but never sufficient evidence of program value.
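To make the caution above concrete, a short sketch (with entirely hypothetical indicator records and field names) shows how a headline ingestion count can mask low enrichment and operationalization rates:

```python
# Sketch: why raw ingestion counts mislead. Hypothetical indicator store
# where each IOC records whether it was enriched and deployed to detection.
iocs = (
    [{"enriched": True,  "deployed": True}]  * 4_000 +
    [{"enriched": True,  "deployed": False}] * 11_000 +
    [{"enriched": False, "deployed": False}] * 30_000
)

total = len(iocs)
enriched = sum(i["enriched"] for i in iocs)
deployed = sum(i["deployed"] for i in iocs)

print(f"Ingested: {total:,}")                      # 45,000
print(f"Enriched: {enriched / total:.0%}")         # 33%
print(f"Operationalized: {deployed / total:.1%}")  # 8.9%
```

An executive slide reporting "45,000 IOCs ingested" hides the fact that under 9% of them ever reached a detection system.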

Process Metrics

Process metrics measure the efficiency and effectiveness of the intelligence cycle itself.

Metric | What It Measures | Target Direction
Average production cycle time | Time from requirement to finished product | Lower is better
Percentage of PIRs addressed | Coverage of intelligence requirements | Higher is better
Indicator enrichment rate | Proportion of IOCs enriched with context | Higher is better
False positive rate of shared indicators | Quality of indicator curation | Lower is better
Time to disseminate critical intelligence | Speed of urgent distribution | Lower is better
Analyst utilization | Time spent on analysis vs. administrative tasks | More analysis time is better

Process metrics help identify operational bottlenecks. If your average production cycle is three weeks but leadership needs intelligence in three days, that gap needs to be addressed through process improvement, automation, or staffing changes.
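The cycle-time and PIR-coverage calculations can be sketched in a few lines; the production log, PIR identifiers, and field names here are hypothetical:

```python
from datetime import date

# Hypothetical production log: each record tracks one intelligence product
# from requirement receipt to dissemination.
production_log = [
    {"pir": "PIR-01", "requested": date(2024, 1, 2), "delivered": date(2024, 1, 9)},
    {"pir": "PIR-02", "requested": date(2024, 1, 5), "delivered": date(2024, 1, 26)},
    {"pir": "PIR-03", "requested": date(2024, 2, 1), "delivered": date(2024, 2, 8)},
]
active_pirs = {"PIR-01", "PIR-02", "PIR-03", "PIR-04", "PIR-05"}

# Average production cycle time: days from requirement to finished product
cycle_days = [(r["delivered"] - r["requested"]).days for r in production_log]
avg_cycle = sum(cycle_days) / len(cycle_days)

# Percentage of active PIRs addressed by at least one product
addressed = {r["pir"] for r in production_log}
pir_coverage = len(addressed & active_pirs) / len(active_pirs) * 100

print(f"Average cycle time: {avg_cycle:.1f} days")  # 11.7 days
print(f"PIR coverage: {pir_coverage:.0f}%")         # 60%
```

Tracking these two numbers per quarter immediately surfaces the kind of gap described above, such as a three-week cycle time against a three-day expectation.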

Output Metrics

Output metrics count what the CTI program produces — the tangible deliverables.

Metric | What It Measures
Intelligence reports produced | Volume of finished intelligence
Briefings delivered | Direct engagement with stakeholders
Detection rules created from intelligence | Operationalization of intel
IOC packages disseminated | Tactical product delivery
Threat actor profiles maintained | Strategic knowledge base
ATT&CK techniques covered by detection | Breadth of detection mapped to adversary behavior
Contributions to information sharing communities | Collaborative output

Output metrics are straightforward to collect but easy to game. A team could produce 50 low-quality reports or 10 high-quality ones. Pair output metrics with quality indicators and outcome metrics.

Outcome Metrics

Outcome metrics are the most important and hardest to measure. They capture the actual impact of intelligence on security outcomes.

Metric | What It Measures | Why It Matters
Incidents detected via CTI-driven rules | Intelligence leading to detection | Direct demonstration of prevention value
Mean time to detect (MTTD) improvement | Speed improvement from intel | Shows intelligence makes detection faster
Incidents where CTI accelerated response | Intelligence reducing IR time | Quantifies time savings during incidents
Threat hunts initiated from intelligence | Proactive security driven by intel | Shows intel drives proactive operations
Hunt findings per intel-driven hunt | Quality of intelligence-driven hunts | Validates that intelligence leads to real discoveries
Vulnerabilities prioritized via threat intel | Risk-informed patching | Shows intel improves vulnerability management decisions
Executive decisions informed by CTI | Strategic influence | Demonstrates value beyond technical operations

Measuring Prevention: One approach to measuring prevented incidents is tracking "near misses" — cases where intelligence-driven detections caught activity that would have progressed to a full incident without intervention. Document these cases carefully; they are your strongest evidence of program value.
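A minimal sketch of the MTTD-improvement metric, assuming you can tag incidents where a CTI-driven rule fired first; the quarterly values are illustrative:

```python
# Quarter-over-quarter MTTD (hours) for incidents first detected by
# CTI-driven rules. Values are hypothetical.
mttd_hours = {
    "Q1": [72, 48, 96, 24],  # baseline quarter
    "Q2": [24, 12, 36, 8],   # after new intel-driven rules deployed
}

def mean(xs):
    return sum(xs) / len(xs)

baseline = mean(mttd_hours["Q1"])
current = mean(mttd_hours["Q2"])
improvement_pct = (baseline - current) / baseline * 100

print(f"MTTD: {baseline:.0f}h -> {current:.0f}h ({improvement_pct:.0f}% faster)")
```

A trend line of this percentage across quarters is exactly the kind of outcome evidence that belongs on an executive dashboard.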

Building a Metrics Dashboard

A CTI metrics dashboard should present information at multiple levels of detail for different audiences.

Executive View

The executive dashboard should fit on a single page and answer three questions:

  1. What threats are we facing? (Threat landscape summary)
  2. How is the CTI program performing? (3-5 key outcome metrics with trend lines)
  3. What should we be concerned about? (Emerging risks requiring decisions)

Avoid technical jargon. Translate metrics into business language: "CTI-driven detections prevented an estimated 12 potential incidents this quarter" rather than "we produced 47 Sigma rules from 23 intelligence reports."

Operational View

The operational dashboard serves CTI team leads and SOC/IR management with:

  • Production cycle times and backlogs
  • PIR coverage status
  • Detection rule effectiveness (true positive vs. false positive rates)
  • Source quality assessment
  • Analyst workload distribution

Analyst View

The analyst dashboard supports day-to-day work with:

  • Current intelligence requirements and assignments
  • Feed health monitoring (are sources delivering data?)
  • Enrichment pipeline status
  • Pending tasks and deadlines
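Feed health monitoring can be as simple as comparing each source's last delivery against its expected cadence; the feed names, intervals, and staleness threshold below are assumptions for illustration:

```python
from datetime import datetime, timedelta

# Hypothetical feed checkpoints: when each source last delivered
# indicators, and how often it is expected to deliver.
now = datetime(2024, 3, 1, 12, 0)
feeds = {
    "commercial-feed-a": {"last_seen": now - timedelta(hours=2),  "expected_every": timedelta(hours=6)},
    "osint-pastes":      {"last_seen": now - timedelta(days=3),   "expected_every": timedelta(hours=12)},
    "isac-portal":       {"last_seen": now - timedelta(hours=20), "expected_every": timedelta(days=1)},
}

# A feed is flagged "stale" once it has been silent for more than
# twice its expected delivery interval.
stale = [name for name, f in feeds.items()
         if now - f["last_seen"] > 2 * f["expected_every"]]

print("Stale feeds:", stale)  # ['osint-pastes']
```

A silent feed discovered weeks late is a collection gap; surfacing it on the analyst dashboard catches it the same day.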

Avoiding Vanity Metrics

Vanity metrics look impressive in presentations but do not indicate program effectiveness.

Common vanity metrics to avoid:

  • Total IOCs in the database: A large number means nothing without context on quality, relevance, and whether they are actually being used for detection
  • Number of reports read: Consumption without production is not intelligence work
  • Feeds subscribed to: More feeds do not equal better intelligence; integration and curation matter
  • Pages of reports written: Length is not a proxy for value
  • Total ATT&CK techniques in the knowledge base: Coverage without depth or operational relevance is misleading

The test: For any metric, ask "If this number doubled, would our security posture measurably improve?" If the answer is no or uncertain, it may be a vanity metric.

SANS CTI Metrics Guidance

The SANS Institute, through its CTI curriculum and summit presentations, has advocated for metrics frameworks that emphasize:

  • Timeliness: Was intelligence delivered in time to be actionable?
  • Accuracy: Was the intelligence correct? Track assessments against eventual ground truth.
  • Relevance: Did the intelligence address actual organizational needs? Survey consumers.
  • Actionability: Did the intelligence lead to a concrete action (detection rule, hunt, patch prioritization, architectural change)?

These four dimensions — timeliness, accuracy, relevance, and actionability — serve as quality multipliers for any output metric. A report is only valuable if it was timely, accurate, relevant, and actionable.
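One way to apply these dimensions is as multiplicative weights on a raw output count; the scoring function and the 0-1 consumer-feedback scores below are a sketch, not a SANS-prescribed formula:

```python
# Sketch: weight a raw output count by the four quality dimensions,
# each scored 0.0-1.0 from consumer feedback. Scores are illustrative.
def quality_weighted(count, timeliness, accuracy, relevance, actionability):
    # Each dimension acts as a multiplier: a report that was late or
    # irrelevant contributes less than its raw count suggests.
    return count * timeliness * accuracy * relevance * actionability

raw_reports = 40
effective = quality_weighted(raw_reports, timeliness=0.9, accuracy=0.95,
                             relevance=0.8, actionability=0.7)
print(f"{raw_reports} reports -> {effective:.1f} effective reports")
```

The gap between raw and effective counts (here 40 versus roughly 19) is itself a useful process signal: it shows where quality, not volume, is the constraint.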

Maturity Assessment Frameworks

Beyond operational metrics, periodic maturity assessments evaluate the overall sophistication of the CTI program.

CTI Maturity Models

Several maturity models exist for CTI programs. Common maturity levels follow a progression:

  1. Ad hoc / Reactive: No formal CTI function. Threat data is consumed opportunistically during incidents.
  2. Emerging: Dedicated CTI staff exist. Basic feed ingestion and IOC matching are operational. Intelligence products are primarily tactical.
  3. Defined: Formal intelligence requirements exist. The intelligence cycle is documented and followed. Products span tactical, operational, and strategic levels.
  4. Managed: Metrics are collected and used for program improvement. Feedback loops with IR, SOC, and vulnerability management are established. Automation handles routine enrichment.
  5. Optimized: CTI is embedded in organizational decision-making. Intelligence drives proactive security strategy. The program continuously self-improves based on outcome metrics.

Assess your program annually against a maturity model and set specific goals for advancement. Moving from Level 2 to Level 3 might mean formalizing PIRs; from Level 3 to Level 4 might mean implementing the metrics framework described in this lesson.
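A self-assessment against the five levels above can be scripted as a checklist; the criteria wording and the rule that levels must be met consecutively are illustrative assumptions:

```python
# Sketch: map yes/no capability answers to the five-level maturity model
# from this lesson. Criterion text is a paraphrase, not a formal standard.
CRITERIA = [
    (2, "dedicated CTI staff and basic feed ingestion"),
    (3, "formal PIRs and a documented intelligence cycle"),
    (4, "metrics collected and feedback loops with SOC/IR"),
    (5, "CTI embedded in organizational decision-making"),
]

def maturity_level(capabilities):
    """Return the highest consecutively-met level (1 = ad hoc / reactive)."""
    level = 1
    for target, criterion in CRITERIA:
        if capabilities.get(criterion):
            level = target
        else:
            break  # a skipped level caps the assessment
    return level

answers = {
    "dedicated CTI staff and basic feed ingestion": True,
    "formal PIRs and a documented intelligence cycle": True,
    "metrics collected and feedback loops with SOC/IR": False,
    "CTI embedded in organizational decision-making": False,
}
print("Maturity level:", maturity_level(answers))  # Level 3: Defined
```

The first unmet criterion doubles as next year's advancement goal, which keeps the annual assessment actionable rather than ceremonial.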

Presenting Metrics to Leadership

Effective metrics presentations follow these principles:

  • Lead with outcomes, not outputs: Start with what intelligence prevented or detected, not how many reports were written
  • Use trend lines, not snapshots: Show improvement over time rather than point-in-time numbers
  • Contextualize with the threat landscape: Frame metrics against the threats the organization faces
  • Include specific examples: One concrete story of intelligence preventing an incident is more compelling than a spreadsheet of numbers
  • Be honest about gaps: Acknowledge areas where the program needs improvement. Leadership respects candor and will fund solutions for problems you identify
  • Make clear asks: If metrics reveal a gap, propose a solution with resource requirements

Key Takeaways

  • CTI metrics span four categories: input, process, output, and outcome — outcome metrics demonstrate the most value
  • Avoid vanity metrics that measure volume without impact; always ask whether doubling the number would improve security
  • Timeliness, accuracy, relevance, and actionability are the quality dimensions that make output metrics meaningful
  • Build dashboards for multiple audiences: executives need business impact, operators need efficiency data, analysts need workflow status
  • Regular maturity assessments provide a roadmap for program improvement
  • Present metrics to leadership with outcome focus, trend lines, concrete examples, and honest gap assessment

Practical Exercise

Build a CTI Metrics Framework

  1. Inventory your outputs: List every type of product your CTI program (or a hypothetical one) produces. For each, identify at least one metric that measures its volume and one that measures its quality.

  2. Define three outcome metrics: Based on your organization's mission, define three outcome metrics that would demonstrate CTI program value to a non-technical executive. For each, describe how you would collect the data.

  3. Identify your vanity metrics: Review any existing metrics your organization tracks. Apply the "if this doubled" test. Flag any that are vanity metrics and propose replacements.

  4. Draft an executive brief: Write a one-page quarterly CTI metrics summary for a fictional CISO. Include 3-5 key metrics with trend data, one specific success story, one gap with a proposed solution, and a forward-looking threat assessment.

  5. Self-assess maturity: Using the five-level maturity model above, assess where your program (or a hypothetical program) falls. Identify three specific actions that would advance it one level.

Further Reading

  • SANS Institute — "CTI Metrics: Measuring the Value of Threat Intelligence" (SANS CTI Summit presentations, available via SANS reading room)
  • Kime, B. — "The SANS 2024 CTI Survey" (annual survey covering CTI program practices and metrics)
  • Chismon, D., Ruks, M. — "Threat Intelligence: Collecting, Analysing, Evaluating" (MWR InfoSecurity / CPNI, 2015)
  • NIST SP 800-55 Rev. 1 — "Performance Measurement Guide for Information Security" (NIST, 2008)