RedSheep Security
Foundations — Lesson 2 of 10

The Intelligence Cycle

11 min read

The Intelligence Cycle is the structured, repeatable methodology that transforms raw data into finished intelligence. Borrowed from traditional military and national intelligence disciplines, it provides CTI teams with a framework for producing intelligence that is relevant, timely, and actionable. This lesson breaks down each of the six phases with cyber-specific examples and highlights common pitfalls that undermine the process.

Learning Objectives

  • Identify and describe the six phases of the Intelligence Cycle
  • Explain how each phase applies specifically to Cyber Threat Intelligence
  • Understand how intelligence requirements drive the entire cycle
  • Recognize common pitfalls that degrade intelligence quality
  • Apply the cycle conceptually to a CTI scenario

Overview of the Intelligence Cycle

The Intelligence Cycle is typically represented as six sequential phases that feed into one another in a continuous loop. Although the cycle is presented linearly, in practice the phases often overlap, and analysts frequently revisit earlier phases as new information emerges.

The six phases are:

  1. Direction and Planning
  2. Collection
  3. Processing
  4. Analysis
  5. Dissemination
  6. Feedback

Key Concept: The cycle is continuous. Feedback from consumers drives new requirements, which restart the cycle. Intelligence is never "done" — it is an ongoing process that adapts as the threat landscape evolves.

Phase 1: Direction and Planning

Direction and Planning is the most critical phase of the cycle. It establishes what the intelligence team needs to find out and why. Without clear direction, collection becomes unfocused, analysis lacks purpose, and the resulting products fail to support decisions.

Intelligence Requirements

The foundation of this phase is defining intelligence requirements — specific questions that the intelligence effort must answer. These are typically structured in a hierarchy:

Requirement Level | Description | Example
Standing Intelligence Requirement (SIR) | Broad, enduring organizational priorities | "Understand threat groups targeting the healthcare sector"
Priority Intelligence Requirement (PIR) | Specific questions supporting a SIR | "Which threat actors are actively exploiting VPN vulnerabilities to target healthcare organizations?"
Essential Elements of Information (EEI) | Granular data points needed to answer a PIR | "What CVEs are being exploited? What infrastructure do they use? What are their initial access TTPs?"
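The SIR → PIR → EEI hierarchy is essentially a tree of questions. A minimal sketch in Python, using the healthcare example above (the class names and structure here are illustrative, not a standard CTI data model):

```python
from dataclasses import dataclass, field

@dataclass
class EEI:
    """Essential Element of Information: a granular data point."""
    question: str

@dataclass
class PIR:
    """Priority Intelligence Requirement: a specific question supporting a SIR."""
    question: str
    eeis: list = field(default_factory=list)

@dataclass
class SIR:
    """Standing Intelligence Requirement: a broad, enduring priority."""
    statement: str
    pirs: list = field(default_factory=list)

sir = SIR(
    statement="Understand threat groups targeting the healthcare sector",
    pirs=[
        PIR(
            question="Which threat actors are actively exploiting VPN "
                     "vulnerabilities to target healthcare organizations?",
            eeis=[
                EEI("What CVEs are being exploited?"),
                EEI("What infrastructure do they use?"),
                EEI("What are their initial access TTPs?"),
            ],
        )
    ],
)
```

Structuring requirements this way makes it easy to trace any collected data point back up to the stakeholder question it serves.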

Intelligence requirements should come from stakeholders — the people who will consume and act on the intelligence. A SOC manager may need intelligence about active exploitation of a newly disclosed vulnerability. A CISO may need a strategic assessment of ransomware trends affecting their industry. These needs must be explicitly captured, not assumed.

Planning Considerations

  • Define the scope — what is in and out of bounds for this effort
  • Identify collection sources relevant to the requirements
  • Establish timelines — when is the intelligence needed?
  • Allocate resources — who will collect, process, and analyze?
  • Determine output format — briefing, written report, IOC feed, detection rules?

Common Pitfall: Skipping This Phase

Many CTI teams jump straight to collection — monitoring feeds, reading reports, and gathering IOCs without clear requirements. This produces intelligence that may be interesting but is not relevant to organizational needs. The result is a team that is busy but not impactful.

Phase 2: Collection

Collection is the systematic gathering of raw data from sources relevant to the intelligence requirements. In CTI, sources span a wide range.

Collection Sources

Source Category | Examples | Strengths | Limitations
Open Source (OSINT) | Vendor threat reports, news, social media, paste sites, public malware repositories | Broadly available, low cost | Volume can be overwhelming, variable quality
Technical | SIEM logs, IDS/IPS alerts, firewall logs, DNS logs, endpoint telemetry | Organization-specific, high relevance | Requires infrastructure, internal only
Human (HUMINT) | Industry contacts, conference hallway conversations, ISAC/ISAO member discussions | Contextual, nuanced | Difficult to scale, relationship-dependent
Dark Web / Underground | Forums, marketplaces, Telegram channels, paste sites | Early warning, adversary perspective | Access challenges, legal/ethical considerations, deception risk
Commercial | Threat intelligence platforms, paid feeds, vendor subscriptions | Curated, enriched, often high quality | Cost, potential for vendor bias
Government | CISA advisories, FBI flash alerts, NSA cybersecurity advisories, Five Eyes partner reports | Authoritative, high confidence | May lag behind real-time activity, sometimes vague

Collection Management

Effective collection requires a collection plan — a documented mapping of intelligence requirements to specific sources. Not every source is relevant to every requirement. A collection plan prevents both gaps (missing critical sources) and waste (collecting data that serves no requirement).

Common Pitfall: Collection Overload

The temptation is to collect everything. This creates a data management burden and makes processing and analysis more difficult. Collection should be guided by requirements, not by what is available. If a source does not serve a defined requirement, it should not be part of the collection effort.

Phase 3: Processing

Processing transforms raw collected data into a format suitable for analysis. Raw data is often noisy, inconsistent, and in multiple formats. Processing standardizes it.

Processing Activities in CTI

  • Normalization — Converting data into consistent formats (e.g., standardizing timestamp formats, IP address notation, hash formats)
  • Deduplication — Removing duplicate indicators or reports from multiple sources
  • Translation — Converting foreign-language sources into the working language
  • Decryption/Decoding — Handling encoded or obfuscated data (Base64, XOR-encoded C2 communications)
  • Enrichment — Adding context to raw indicators (e.g., GeoIP data for IP addresses, WHOIS data for domains, VirusTotal results for file hashes)
  • Validation — Confirming data integrity and filtering out known false positives, sinkholed domains, or researcher infrastructure
  • Structuring — Organizing data into structured formats (STIX, OpenIOC, CSV) for ingestion into tools
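Normalization and deduplication can be sketched in a few lines. This is an illustrative fragment, not a production pipeline; it handles defanged domains and canonical IP notation only:

```python
import ipaddress

def normalize(indicator: str) -> str:
    """Normalize a raw indicator: strip common defanging, lower-case
    domains and hashes, canonicalize IP address notation."""
    value = indicator.strip().replace("[.]", ".").replace("hxxp", "http")
    try:
        return str(ipaddress.ip_address(value))  # canonical IP form
    except ValueError:
        return value.lower()

def process(raw_indicators):
    """Normalize, then deduplicate while preserving first-seen order."""
    seen, out = set(), []
    for raw in raw_indicators:
        norm = normalize(raw)
        if norm not in seen:
            seen.add(norm)
            out.append(norm)
    return out
```

For example, `process(["EVIL[.]example.com", "evil.example.com", "192.0.2.1"])` collapses the first two entries into one, since they differ only in case and defanging.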

Automation in Processing

Processing is the phase most amenable to automation. Threat Intelligence Platforms (TIPs), SOAR tools, and custom scripts can handle normalization, deduplication, and enrichment at scale. However, automation should be validated — automated enrichment can introduce errors if sources are unreliable or stale.

Common Pitfall: Treating Processing as Analysis

Processing and analysis are distinct. Enriching an IP address with GeoIP data is processing. Assessing whether that IP address represents adversary infrastructure targeting your organization based on multiple corroborating sources is analysis. Conflating the two leads to products that are data-rich but insight-poor.

Phase 4: Analysis

Analysis is the intellectual core of the Intelligence Cycle. It is where processed information is evaluated, correlated, and interpreted to produce judgments and assessments. This is the phase that transforms information into intelligence.

Analytical Techniques in CTI

Several structured analytical techniques (SATs) are commonly used:

  • Analysis of Competing Hypotheses (ACH) — Developed by Richards Heuer (CIA), ACH systematically evaluates evidence against multiple hypotheses to reduce cognitive bias. For example, when investigating a breach, ACH can help determine whether the attacker is a nation-state, criminal group, or insider.
  • Diamond Model of Intrusion Analysis — Developed by Sergio Caltagirone, Andrew Pendergast, and Christopher Betz (2013), this model relates four core features of any intrusion event: adversary, infrastructure, capability, and victim. It helps analysts map relationships and identify pivots.
  • Kill Chain Analysis — Mapping adversary activity to Lockheed Martin's Cyber Kill Chain (Hutchins, Cloppert, Amin, 2011) or MITRE ATT&CK to understand where in an attack sequence the adversary is operating.
  • Pattern and Trend Analysis — Identifying recurring behaviors, targeting patterns, or operational cadences across multiple incidents.
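The mechanics of ACH can be illustrated with a toy matrix: each piece of evidence is scored against each hypothesis as consistent (+1), inconsistent (-1), or neutral (0), and the method favors the hypothesis with the *least* inconsistent evidence rather than the most supporting evidence. The hypotheses and evidence below are invented for illustration:

```python
hypotheses = ["nation-state", "criminal group", "insider"]

# Evidence -> score per hypothesis: +1 consistent, -1 inconsistent, 0 neutral.
evidence = {
    "custom malware, no public crimeware":   {"nation-state": 1, "criminal group": -1, "insider": 0},
    "no ransom demand or monetization":      {"nation-state": 1, "criminal group": -1, "insider": 0},
    "activity during local business hours":  {"nation-state": 0, "criminal group": 0,  "insider": 1},
}

def inconsistency_score(hypothesis):
    """Count evidence items inconsistent with the hypothesis."""
    return sum(1 for scores in evidence.values() if scores[hypothesis] < 0)

# ACH favors the hypothesis contradicted by the fewest evidence items.
least_contradicted = min(hypotheses, key=inconsistency_score)
```

Focusing on disconfirmation rather than confirmation is the core bias-reduction mechanism of ACH: an analyst cannot simply stack up evidence for a favored hypothesis while ignoring what contradicts it.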

Confidence and Sourcing

Finished intelligence must communicate how confident the analyst is in their judgments. The intelligence community commonly uses a standardized confidence scale:

Confidence Level | Meaning
High | Multiple independent, reliable sources corroborate the assessment; strong analytical basis
Moderate | Some corroboration exists but sources are limited or analytical basis has gaps
Low | Assessment is plausible but based on limited or unreliable sourcing; significant analytical uncertainty

Assessments should also use estimative language (e.g., "we assess with moderate confidence that...") rather than stating conclusions as absolute facts. This practice, drawn from the U.S. Intelligence Community's ICD 203 standard, ensures consumers understand the uncertainty inherent in any intelligence judgment.
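Teams that template their reporting sometimes encode the confidence scale so estimative phrasing is applied consistently. A minimal sketch (the helper and wording template are illustrative, not an ICD 203 requirement):

```python
from enum import Enum

class Confidence(Enum):
    HIGH = "high"
    MODERATE = "moderate"
    LOW = "low"

def estimative(statement: str, confidence: Confidence) -> str:
    """Phrase a judgment in estimative language rather than as fact."""
    return f"We assess with {confidence.value} confidence that {statement}."

line = estimative(
    "the activity is linked to a financially motivated group",
    Confidence.MODERATE,
)
```

The point is not the code but the discipline: every judgment in a finished product carries an explicit confidence level, never an unqualified assertion.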

Common Pitfall: Confirmation Bias

Analysts naturally seek information that confirms their initial hypothesis. Structured analytical techniques exist specifically to counter this tendency. Without deliberate countermeasures, analysis can become an exercise in justifying a predetermined conclusion rather than objectively evaluating evidence.

Phase 5: Dissemination

Dissemination is the delivery of finished intelligence to consumers in a format and timeline that enables them to act on it. Even excellent analysis is worthless if it does not reach the right people at the right time in the right format.

Dissemination Formats

Product Type | Audience | Format | Timeliness
Flash/Alert | SOC, IR | Short bulletin, IOC list, detection rule | Immediate (minutes to hours)
Threat Advisory | Security operations, vulnerability management | Structured report with context and recommendations | Urgent (hours to 1-2 days)
Finished Intelligence Report | Analysts, hunt teams, security leadership | Long-form written assessment | Scheduled or event-driven
Strategic Briefing | CISO, executive leadership | Presentation or executive summary | Periodic (weekly, monthly, quarterly)
Machine-Readable Feed | SIEM, TIP, SOAR, firewall | STIX/TAXII, CSV, API | Continuous/automated
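For the machine-readable row, a feed entry is typically a STIX object. A minimal sketch of building a STIX 2.1 Indicator as a plain dictionary (the domain and name are invented; production code would normally use a STIX library rather than hand-built dicts):

```python
import json
import uuid
from datetime import datetime, timezone

def stix_indicator(pattern: str, name: str) -> dict:
    """Build a minimal STIX 2.1 Indicator object for a feed entry."""
    # STIX timestamps are UTC with a trailing "Z".
    now = datetime.now(timezone.utc).isoformat(timespec="milliseconds")
    now = now.replace("+00:00", "Z")
    return {
        "type": "indicator",
        "spec_version": "2.1",
        "id": f"indicator--{uuid.uuid4()}",
        "created": now,
        "modified": now,
        "name": name,
        "pattern": pattern,
        "pattern_type": "stix",
        "valid_from": now,
    }

feed_entry = stix_indicator(
    "[domain-name:value = 'evil.example.com']",  # illustrative indicator
    "Suspected C2 domain",
)
serialized = json.dumps(feed_entry)
```

Because the output is structured and self-describing, a SIEM or TIP can ingest it automatically over TAXII with no human in the loop, which is what makes the "continuous/automated" timeliness in the table achievable.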

Dissemination Principles

  • Right audience — Match the product to the consumer's role and needs
  • Right format — Executives need summaries, not packet captures; SOC analysts need IOCs, not geopolitical assessments
  • Right time — Intelligence about an active campaign targeting your sector is useless if delivered two weeks after the campaign ends
  • Right classification — Ensure handling markings (TLP, classification levels) are appropriate and do not prevent the consumer from acting

Common Pitfall: One Product for All Audiences

Producing a single report and sending it to everyone is a common failure. A report written for executive leadership will lack the technical detail SOC analysts need. A deeply technical analysis will be ignored by executives. Tailor products to audiences.

Phase 6: Feedback

Feedback closes the loop. Consumers evaluate whether the intelligence they received met their needs, and their input drives adjustments to requirements, collection priorities, and analytical focus.

Feedback Mechanisms

  • Formal reviews — Periodic meetings with stakeholders to assess intelligence relevance and quality
  • Consumption metrics — Are reports being read? Are IOCs being ingested? Are detection rules being deployed?
  • Outcome tracking — Did the intelligence lead to a detection? Did it inform a decision? Did it prevent an incident?
  • Requirement refinement — Stakeholders update or reprioritize intelligence requirements based on changing needs

Common Pitfall: No Feedback Loop

Without feedback, the intelligence team operates in a vacuum. They may spend weeks producing reports that no one reads, or miss critical requirements because no one communicated a change in priorities. Feedback is what makes the cycle a cycle rather than a one-way pipeline.

How the Cycle Drives Operations

In a mature CTI program, the Intelligence Cycle is not an academic exercise — it directly drives security operations:

  • Direction produces intelligence requirements that focus the team's effort
  • Collection gathers data aligned to those requirements
  • Processing prepares that data for analysis
  • Analysis produces assessments, detection content, and hunt hypotheses
  • Dissemination delivers those products to SOC (detection rules), hunt teams (hypotheses), IR (adversary playbooks), and leadership (strategic assessments)
  • Feedback from those consumers refines the requirements and restarts the cycle

Key Takeaways

  • The Intelligence Cycle provides a structured, repeatable methodology for producing CTI
  • Direction and Planning is the most important phase — without clear requirements, everything downstream suffers
  • Collection must be guided by requirements, not by data availability
  • Processing and analysis are distinct phases — enrichment is not analysis
  • Analysis requires structured techniques and must communicate confidence levels
  • Dissemination must match the product to the audience's role, format needs, and timeline
  • Feedback closes the loop and keeps the cycle aligned with organizational needs

Practical Exercise

Map the Intelligence Cycle to a Scenario

Scenario: Your organization has learned from a CISA advisory that a threat group is actively exploiting a vulnerability in a product your organization uses.

For each phase of the Intelligence Cycle, write one to two sentences describing what you would do:

  1. Direction/Planning — What specific questions do you need to answer?
  2. Collection — What sources would you consult?
  3. Processing — What data normalization or enrichment would you perform?
  4. Analysis — What analytical judgments would you make?
  5. Dissemination — Who would you inform and in what format?
  6. Feedback — How would you evaluate whether your intelligence was useful?

This exercise forces you to think through the full cycle rather than jumping straight to collection or analysis.

Further Reading

  • JP 2-0: Joint Intelligence — U.S. Joint Chiefs of Staff. The foundational military doctrine for the Intelligence Cycle, from which the CTI cycle is adapted.
  • "Psychology of Intelligence Analysis" — Richards J. Heuer Jr. (CIA Center for the Study of Intelligence, 1999). Essential reading on cognitive biases in analysis. Available free from the CIA's public library.
  • "A Practical Model for Conducting Cyber Threat Intelligence Analysis" — Sergio Caltagirone (SANS Reading Room). Applies structured analytical methods to CTI.
  • ICD 203: Analytic Standards — Office of the Director of National Intelligence. Establishes standards for analytic judgments, confidence levels, and estimative language used throughout the U.S. Intelligence Community.