Notes on Intelligence Analysis ATP 2-33.4
Preface
ATP 2-33.4 provides fundamental information to a broad audience, including commanders, staffs, and leaders, on how intelligence personnel conduct analysis to support Army operations. It describes the intelligence analysis process, specific analytic techniques, and the conduct of intelligence analysis performed by intelligence personnel, especially all-source analysts, across all intelligence disciplines. Additionally, ATP 2-33.4 describes how intelligence analysis facilitates the commander’s decision making and understanding of complex environments.
The principal audience for ATP 2-33.4 is junior to midgrade intelligence analysts conducting intelligence analysis. This publication provides basic information on intelligence analysis for commanders, staffs, and other senior military members.
ATP 2-33.4 readers must have an understanding of the following:
- Intelligence doctrine described in ADP 2-0 and FM 2-0.
- Collection management described in ATP 2-01.
- Intelligence preparation of the battlefield (IPB) described in ATP 2-01.3.
- Operational doctrine described in ADP 3-0 and FM 3-0.
- Joint targeting described in JP 3-60.
Introduction
ATP 2-33.4 discusses doctrinal techniques—descriptive methods for performing missions, functions, or tasks as they apply to intelligence analysis. ATP 2-33.4—
- Describes the intelligence analysis process.
- Discusses structured analytic techniques and the methods for implementing them.
- Describes unique considerations related to intelligence analysis.
Intelligence analysis is central to intelligence. It is the basis for many staff activities, including planning, and occurs across the entire Army. Among other results, analysis facilitates commanders and other decision makers’ ability to visualize the operational environment (OE), organize their forces, and control operations to achieve their objectives. To understand the role of intelligence analysis, intelligence professionals must understand how intelligence analysis corresponds with other staff processes, especially the military decision-making process and information collection (including collection management).
The introductory figure on pages xii and xiii displays the intelligence analysis process and shows how intelligence analysis fits with the other staff processes to facilitate the commander’s understanding:
- The commander’s initial intent, planning guidance, and priority intelligence requirements (PIRs) drive the collection management plan.
- The entire staff, led by the intelligence and operations staffs, develops the information collection plan that results in reporting.
- All-source intelligence is based on information from all intelligence disciplines, complementary intelligence capabilities, and other available sources, such as reconnaissance missions, patrol debriefs, and security operations.
- Information collected from multiple sources moves through the intelligence analysis process, resulting in intelligence.
- The intelligence staff conducts all-source analysis and produces timely, accurate, relevant, predictive, and tailored intelligence that satisfies the commander’s requirements and facilitates the commander’s situational understanding and the staff’s situational awareness.
Chapter 9 discusses managing long-term analytical assessments, also referred to as analytic design, to ensure the analytical effort is properly focused and carefully planned and executed, and analytical results are communicated effectively to the requestor.
PART ONE
Fundamentals
Chapter 1
Understanding Intelligence Analysis
INTELLIGENCE ANALYSIS OVERVIEW
1-1. Analysis is the compilation, filtering, and detailed evaluation of information to focus and understand that information better and to develop knowledge or conclusions. In accordance with ADP 6-0, information is, in the context of decision making, data that has been organized and processed in order to provide context for further analysis. Information generally provides some of the answers to the who, what, where, when, why, and how questions. Knowledge is, in the context of decision making, information that has been analyzed and evaluated for operational implications (ADP 6-0). Knowledge assists in ascribing meaning and value to the conditions or events within an operation. Analysis performed by intelligence personnel assists in building the commander’s knowledge and understanding. ADP 6-0 provides an in-depth discussion on how commanders and staffs process data to progressively develop their knowledge to build and maintain their situational awareness and understanding.
1-3. Intelligence analysis is a form of analysis specific to the intelligence warfighting function. It is continuous and occurs throughout the intelligence and operations processes. Intelligence analysis is the process by which collected information is evaluated and integrated with existing information to facilitate intelligence production. Analysts conduct intelligence analysis to produce timely, accurate, relevant, and predictive intelligence for dissemination to the commander and staff. The purpose of intelligence analysis is to describe past and current, and attempt to predict future, threat capabilities, activities, and tactics; terrain and weather conditions; and civil considerations.
1-4. Army forces compete with an adaptive enemy; therefore, perfect information collection, intelligence planning, intelligence production, and staff planning seldom occur. Information collection is not easy, and a single collection capability is not persistent and accurate enough to provide all of the answers. Intelligence analysts will be challenged to identify erroneous information and enemy deception, and commanders and staffs will sometimes have to accept the risk associated with incomplete analysis based on time and information collection constraints.
1-5. Some unique aspects of intelligence analysis include—
- The significant demand on analysts to compile and filter vast amounts of information in order to identify information relevant to the operation.
- The need for analysts to clearly separate confirmed facts from analytical determinations and assessments.
- Insight into how the physical environment (terrain, weather, and civil considerations) may affect operations.
- The ability to assess complex situations across all domains and the information environment.
1-7. Intelligence analysis comprises single-source analysis and all-source analysis. Single-source and all-source intelligence capabilities include but are not limited to—
- Single-source analytical elements:
  - Brigade combat team (BCT) human intelligence (HUMINT) analysis cell.
  - Division signals intelligence cell.
  - Corps counterintelligence analysis cell.
  - Brigade through corps geospatial intelligence cells.
- All-source analytical elements:
  - Battalion intelligence cell.
  - Brigade intelligence support element (also known as BISE).
  - Division analysis and control element (ACE).
  - Corps ACE.
  - Theater army ACE.
  - National Ground Intelligence Center (NGIC).
SINGLE-SOURCE ANALYSIS
1-8. Single-source collection is reported to single-source analytical elements. Single-source analytical elements conduct continuous analysis of the information provided by single-source operations. Following single-source analysis, analytical results are disseminated to all-source analytical elements for corroboration, to update the common operational picture, and to refine all-source intelligence products. A continuous analytical feedback loop occurs between all-source analytical elements, single-source analytical elements, and collectors to ensure effective intelligence analysis.
1-9. Several portions of this publication apply to single-source analysis, especially the intelligence analysis process in chapter 2 and the analytic techniques in chapters 4 through 6. Specific doctrine on single-source analysis is contained in the following publications:
Intelligence disciplines:
- For counterintelligence analysis, see ATP 2-22.2-1, Counterintelligence Volume I: Investigations, Analysis and Production, and Technical Services and Support Activities, chapter 4.
- For HUMINT analysis, see FM 2-22.3, Human Intelligence Collector Operations, chapter 12.
- For open-source intelligence analysis, see ATP 2-22.9, Open-Source Intelligence, chapters 1, 2, and 3.
- For signals intelligence analysis, see ATP 2-22.6-2, Signals Intelligence Volume II: Reference Guide, appendix G.
ALL-SOURCE ANALYSIS AND PRODUCTION
1-10. Various all-source analytical elements integrate intelligence and information from all relevant sources (both single-source and other information collection sources) to provide the most timely, accurate, relevant, and comprehensive intelligence possible and to overcome threat camouflage, counterreconnaissance, and deception.
The intelligence staff is integrated with the rest of the staff to ensure they have a thorough understanding of the overall operation, the current situation, and future operations. Additionally, all-source analytical elements often corroborate their analytical determinations and intelligence products through access to and collaboration with higher, lower, and adjacent all-source analytical elements.
1-11. All-source intelligence analysts use an array of automation and other systems to perform their mission. (See appendix A.) From a technical perspective, all-source analysis is accomplished through the fusion of single-source information with existing intelligence in order to produce intelligence. For Army purposes, fusion is consolidating, combining, and correlating information together (ADP 2-0). Fusion occurs as an iterative activity to refine information as an integral part of all-source analysis and production.
1-12. With the vast amounts of information and broad array of all-source intelligence capabilities, the G-2/S-2 provides the commander and staff with all-source intelligence. All-source intelligence products inform the commander and staff by facilitating situational understanding, supporting the development of plans and orders, and answering priority intelligence requirements (PIRs), information requirements associated with high-payoff targets (HPTs), and other information requirements.
1-13. The G-2/S-2 can use single-source intelligence to support the commander and staff. In those instances, it is best to first send that single-source intelligence to the all-source analytical element to attempt to quickly corroborate the information. Corroboration reduces the risk associated with using that single-source intelligence by comparing it to other information reporting and existing intelligence products. Following corroboration and dissemination of the intelligence to the commander and staff, the all-source analytical element incorporates the single-source intelligence into the various all-source intelligence products and the threat portion of the common operational picture.
CONDUCTING INTELLIGENCE ANALYSIS
1-15. The goal of intelligence analysis is to provide timely and relevant intelligence to commanders and leaders to support their decision making. Intelligence analysis requires the continuous examination of information and intelligence about the threat and significant aspects of the OE. To be effective, an intelligence analyst must—
- Understand and keep abreast of intelligence doctrine.
- Maintain complete familiarity with all aspects of the threat, including threat capabilities, doctrine, and operations.
- Know how to account for the effects of the mission variables (mission, enemy, terrain and weather, troops and support available, time available, and civil considerations [METT-TC]) and operational variables (political, military, economic, social, information, infrastructure, physical environment, and time [PMESII-PT]) on operations.
- Thoroughly understand operational doctrine (especially FM 3-0, Operations), operational and targeting terminology, and operational symbology.
1-16. Analysts conduct intelligence analysis to ultimately develop effective intelligence. They do this by applying the basic thinking abilities (information ordering, pattern recognition, and reasoning) and critical and creative thinking, all described in appendix B. FM 2-0 describes the characteristics of effective intelligence as accurate, timely, usable, complete, precise, reliable, relevant, predictive, and tailored. Beyond those characteristics, intelligence analysts must also understand the six aspects of effective analysis:
- Embracing ambiguity.
- Understanding intelligence analysis is imperfect.
- Meeting analytical deadlines with the best intelligence possible.
- Thinking critically.
- Striving to collaborate closely with other analysts.
- Adhering to analytic standards as much as possible.
EMBRACING AMBIGUITY
1-17. Intelligence personnel must accept and embrace ambiguity in conducting analysis as they will never have all the information necessary to make certain analytical determinations. Intelligence analysts will be challenged by the constantly changing nature of the OE and the threat and by the fog of war, all of which are intensified during large-scale ground combat operations, creating complex, chaotic, and uncertain conditions.
1-18. Analysts operate within a time-constrained environment and with limited information. Therefore, they may sometimes produce intelligence that is not as accurate and detailed as they would prefer. Having both an adequate amount of information and extensive subject matter expertise does not guarantee the development of logical or accurate determinations. To be effective, analysts must have—
- A detailed awareness of their commander’s requirements and priorities.
- An understanding of the limitations in information collection and intelligence analysis.
- A thorough knowledge of the OE and all aspects of the threat.
- Expertise in applying the intelligence analysis process and analytic techniques.
1-19. Effectively combining these factors gives intelligence analysts the best chance to produce accurate and predictive intelligence and also to detect threat denial and deception efforts. To adequately account for complexity and ambiguity, intelligence analysts should continually identify gaps in their understanding of the OE and the threat, and factor in those gaps when conducting intelligence analysis.
ANALYTICAL IMPERFECTION
1-20. Given the ambiguity, the fog of war, and time constraints, intelligence analysts must accept imperfection. As much as possible, analysts should attempt to use validated facts, advanced analytic techniques, and objective analytical means. However, using them and providing completely objective and detailed analytical determinations may be challenging, especially during tactical operations. Analysts should also consider that logical determinations are not necessarily facts.
1-21. When presenting analytical determinations to the commander and staff, intelligence personnel must ensure they can answer the so what question from the commander’s perspective. Additionally, they should clearly differentiate between what is relatively certain, what are reasonable assumptions, and what is unknown, and then provide the degree of confidence they have in their determination as well as any significant issues associated with their analysis. This confidence level is normally subjective and based on—
- The collection asset’s capability (reliability and accuracy).
- Evaluation criteria.
- The confidence in the collected data.
- The analyst’s expertise and experience.
- Intelligence gaps.
- The possibility of threat deception.
1-22. Intelligence analysts should be prepared to explain and justify their conclusions to the commander and staff. Over time, the all-source analytical element should learn the most effective way to present analytical determinations to the commander and staff. A deliberate and honest statement of what is relatively certain and what is unknown assists the commander and staff in weighing some of the risks inherent in the operation and in creating mitigation measures.
MEETING ANALYTICAL DEADLINES
1-23. Analysts must gear their efforts to the time available and provide the best possible intelligence within the deadline.
CRITICAL THINKING
1-24. Intelligence analysts must know how to arrive at logical, well-reasoned, and unbiased conclusions as a part of their analysis. Analysts strive to reach determinations based on facts and reasonable assumptions. Therefore, critical thinking is essential to analysis. Using critical thinking, which is disciplined and self-reflective, provides more holistic, logical, ethical, and unbiased analyses and determinations. Applying critical thinking assists analysts in fully accounting for the elements of thought, the intellectual standards, and the traits of a critical thinker.
COLLABORATION
1-25. Commanders, intelligence and other staffs, and intelligence analysts must collaborate. They should actively share and question information, perceptions, and ideas to better understand situations and produce intelligence. Collaboration is essential to analysis; it ensures analysts work together to achieve a common goal effectively and efficiently.
1-26. Through collaboration, analysts develop and enhance professional relationships, access each other’s expertise, enhance their understanding of the issues, and expand their perspectives on critical analytical issues. Collaboration is another means, besides critical thinking, by which intelligence analysts avoid potential pitfalls, such as mindsets and biases, and detect threat denial and deception efforts.
ADHERING TO ANALYTIC STANDARDS
1-27. As much as possible, the conclusions reached during intelligence analysis should adhere to analytic standards, such as those established by the Director of National Intelligence in Intelligence Community Directive (ICD) 203, to determine the relevance and value of the information before updating existing assessments.
INTELLIGENCE ANALYSIS AND COLLECTION MANAGEMENT
1-28. While collection management is not part of intelligence analysis, it is closely related. Analysis occurs inherently throughout collection management, and intelligence analysts must understand the information collection plan.
1-29. Collection management is a part of the larger information collection effort. Information collection is an integrated intelligence and operations function.
The collection management process comprises the following tasks:
- Develop requirements.
- Develop the collection management plan.
- Support tasking and directing.
- Assess collection.
- Update the collection management plan.
1-30. The intelligence warfighting function focuses on answering commander and staff requirements, especially PIRs, which are part of the commander’s critical information requirements. Intelligence analysis for a particular mission begins with information collected based on commander and staff requirements (which are part of collection management); those requirements are usually developed within the context of existing intelligence analysis.
Together, these two activities form a continuous cycle—intelligence analysis supports collection management and collection management supports intelligence analysis.
1-31. Intelligence analysis and collection management overlap or intersect in several areas. While not all inclusive, the following includes some of these areas:
- The all-source intelligence architecture and analysis across the echelons are important aspects of planning effective information collection. Answering a PIR and presenting the commander and staff with a tailored intelligence product takes time; collection management personnel must understand the all-source intelligence architecture and analysis across the echelons and consider those timelines.
- Collection management personnel depend on the intelligence analysis of threats, terrain and weather, and civil considerations in order to perform the collection management process. Intelligence preparation of the battlefield (IPB) often sets the context for collection management:
  - Intelligence analytical gaps are the start points for developing requirements.
  - All-source analysts and collection management personnel must understand the threat courses of action (COAs) and how the threat would execute those COAs, as reflected in the situation templates.
  - Event templates and event matrices are the start points for developing subsequent collection management tools.
- All-source analysts and collection management personnel—
  - Use and refine threat indicators during the course of an operation.
  - Mutually support and track threat activities relative to the decide, detect, deliver, and assess (also called D3A) functions of the targeting methodology.
  - Must confer before answering and closing a PIR.
- The effectiveness of intelligence analysis is an integral part of assessing the effectiveness of the information collection plan during collection management.
1-32. A disconnect between intelligence analysis and collection management can cause significant issues, including a degradation in the overall effectiveness of intelligence support to the commander and staff. Therefore, intelligence analysts and collection management personnel must collaborate closely to ensure they understand PIRs, targeting and information operations requirements (when not expressed as PIRs), threat COAs and other IPB outputs, the current situation, and the context/determinations surrounding current threat activities.
THE ALL-SOURCE INTELLIGENCE ARCHITECTURE AND ANALYSIS ACROSS THE ECHELONS
1-33. All-source analysis, collaboration, and intelligence production occur both within and between echelons. Intelligence analysts not only integrate the broad array of information collected and intelligence produced at their echelon, but they also collaborate across the various echelons and the intelligence community to benefit from the different knowledge, judgments, experience, expertise, and perceptions—all invaluable to the analytical effort.
1-34. At the different echelons, based on a number of factors, the intelligence staff and supporting all-source analytical element are divided into teams to support the various command posts and to perform the various all-source analytical tasks. There is no standard template on how best to structure the all-source analytical effort.
1-37. While the fundamentals of intelligence analysis remain constant across the Army’s strategic roles, large-scale ground combat operations create some unique challenges for the intelligence analyst. (See table 1-1.) The fluid and chaotic nature of large-scale ground combat operations will cause the greatest degree of fog, friction, uncertainty, and stress on the intelligence analysis effort. Army forces will have to fight for intelligence as peer threats will counter information collection efforts, forcing commanders to make decisions with incomplete and imperfect intelligence. These realities will strain all-source analysis.
1-38. Over the past 20 years, the Nation’s peer threats have increased their capabilities and gained an understanding of United States (U.S.) and allied operations. According to ADP 3-0, a peer threat is an adversary or enemy able to effectively oppose U.S. forces worldwide while enjoying a position of relative advantage in a specific region. Peer threats—
- Can generate equal or temporarily superior combat power in geographical proximity to a conflict area with U.S. forces.
- May have a cultural affinity to specific regions, providing them relative advantages in terms of time, space, and sanctuary.
- Generate tactical, operational, and strategic challenges an order of magnitude more challenging militarily than those posed by other adversaries.
- Can employ resources across multiple domains to create lethal and nonlethal effects with operational significance throughout an OE.
- Seek to delay deployment of U.S. forces and inflict significant damage across multiple domains in a short period to achieve their goals before culminating.
1-40. As in all operations, intelligence drives operations and operations support intelligence; this relationship is continuous. The commander and staff need effective intelligence in order to understand threat centers of gravity, goals and objectives, and COAs. Precise intelligence is also critical to target threat capabilities at the right time and place and to open windows of opportunity across domains. Commanders and staffs must have detailed knowledge of threat strengths, weaknesses, equipment, and tactics to plan for and execute friendly operations.
1-42. One of the ultimate goals of intelligence analysis is to assist the unit in identifying and opening an operational window of opportunity to eventually achieve a position of relative advantage. Opening a window of opportunity often requires a significant amount of intelligence analysis in order to achieve a high degree of situational understanding. This will be difficult as friendly forces are often at a disadvantage in conducting information collection against the threat.
1-43. The staff must thoroughly plan, find creative solutions, and collaborate across echelons to overcome information collection challenges. Once friendly forces have an open window of opportunity to execute information collection, intelligence analysts will receive more information and should be able to provide timely and accurate intelligence products, updates, and predictive assessments. This timely and accurate intelligence can then assist friendly forces in opening subsequent windows of opportunity to reach positions of relative advantage.
1-44. Facilitating the commander and staff’s situational understanding of the various significant aspects of the OE is challenging. Intelligence analysis must address important considerations across all domains and the information environment as well as support multi-domain operations. Intelligence analysis must include all significant operational aspects of the interrelationship of the air, land, maritime, space, and cyberspace domains; the information environment; and the electromagnetic spectrum. Intelligence analysts use information and intelligence from the joint force, U.S. Government, the intelligence community, and allies to better understand and analyze the various domains and peer threat capabilities.
INTELLIGENCE ANALYSIS DURING THE ARMY’S OTHER STRATEGIC ROLES
1-45. As part of a joint force, the Army operates across the strategic roles (shape OEs, prevent conflict, prevail in large-scale ground combat, and consolidate gains) to accomplish its mission to organize, equip, and train its forces to conduct sustained land combat to defeat enemy ground forces and to seize, occupy, and defend land areas.
Chapter 2
The Intelligence Analysis Process
2-1. Both all-source and single-source intelligence analysts use the intelligence analysis process. The process supports the continuous examination of information, intelligence, and knowledge about the OE and the threat to generate intelligence and reach one or more conclusions. The application of analytic skills and techniques assists analysts in evaluating specific situations, conditions, entities, areas, devices, or problems.
2-2. The intelligence analysis process includes the continuous evaluation and integration of new and existing information to produce intelligence. It ensures all information undergoes a criterion-based logical process, such as the analytic tradecraft standards established by ICD 203, to determine the relevance and value of the information before updating existing assessments.
2-3. The intelligence analysis process is flexible and applies to any intelligence discipline. Analysts may execute the intelligence analysis process meticulously by thoroughly screening information and applying analytic techniques, or they may truncate the process by quickly screening collected information using only basic structured analytic techniques. The process becomes intuitive as analysts become more proficient at analysis and understanding their assigned OE. The intelligence analyst uses collected information to formulate reliable and accurate assessments.
THE PHASES OF THE INTELLIGENCE ANALYSIS PROCESS
2-4. The phases of the intelligence analysis process are interdependent. (See figure 2-1 on page 2-2.) Through time and experience, analysts become more aware of this interdependence. The phases of the intelligence analysis process are—
- Screen (collected information): Determining the relevance of the information collected.
- Analyze: Examining relevant information.
- Integrate: Combining new information with current intelligence holdings to begin the effort of developing a conclusion or assessment.
- Produce: Making a determination or assessment that can be disseminated to consumers.
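To make the interdependence of these phases concrete, the following minimal Python sketch models them as a simple pipeline. It is purely illustrative and not part of Army doctrine or any fielded analysis system; the report fields, keywords, and scoring helper are hypothetical examples.

```python
# Illustrative sketch of the four-phase flow: screen -> analyze -> integrate -> produce.
# All field names and helper logic are hypothetical.

def screen(reports, keywords):
    """Screen: keep only reports relevant to the current requirements (keywords)."""
    return [r for r in reports if any(k in r["text"].lower() for k in keywords)]

def analyze(relevant, keywords):
    """Analyze: examine each relevant report (here, a crude keyword-frequency score)."""
    return [{**r, "score": sum(r["text"].lower().count(k) for k in keywords)} for r in relevant]

def integrate(analyzed, holdings):
    """Integrate: combine new information with existing intelligence holdings."""
    return holdings + analyzed

def produce(holdings):
    """Produce: summarize the integrated picture for dissemination."""
    if not holdings:
        return "No relevant reporting integrated."
    top = max(holdings, key=lambda r: r["score"])
    return f"{len(holdings)} report(s) integrated; most significant: report {top['id']}"

reports = [
    {"id": 1, "text": "Artillery battery observed near the river bridge"},
    {"id": 2, "text": "Local market prices unchanged this week"},
]
keywords = ("artillery", "bridge")
holdings = integrate(analyze(screen(reports, keywords), keywords), holdings=[])
print(produce(holdings))
```

In practice the flow never terminates: produced intelligence generates new requirements that drive further screening and analysis.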
2-5. To successfully execute the intelligence analysis process, it is critical for analysts to understand the PIRs and other requirements related to the current OE and mission. This understanding assists analysts in framing the analytic problem and enables them to separate facts and analytical judgments.
Analysts form analytical judgments by generating hypotheses—preliminary explanations meant to be tested to gain insight and find the best answer to a question of judgment.
SCREEN COLLECTED INFORMATION
2-7. During the execution of single-source intelligence or all-source analysis, analysts continuously filter the volume of information or intelligence received through the continuous push and pull of information. It is during the screen phase that analysts sort information based on relevancy and how it ties to the analytical questions or hypotheses they developed to fill information gaps. They do this by conducting research and accessing only the information that is relevant to their PIRs, their mission, and the time available.
2-8. Time permitting, analysts research by accessing information and intelligence from databases, the internet (considered open-source information), collaborative tools, broadcast services, and other sources such as automated systems. This screening enables analysts to focus their analytical efforts on only the information that is pertinent to their specific analytic problem.
ANALYZE
2-9. Analysts examine relevant information or intelligence using reasoning and analytic techniques, which enable them to see information in different ways and to reveal something new or unexpected. It may be necessary to gain more information or apply a different technique, time permitting, until a conclusion is reached or a determination is made.
2-10. Because screening is continuous, analysts also evaluate the volume of information based on the source’s reliability and the information’s accuracy. This occurs when analysts receive information they immediately recognize as untrue or inaccurate based on their knowledge or familiarity with the analytic problem. Analysts should not proceed with the analysis when there is a high likelihood that the information is false or part of a deception, as this may lead to inaccurate conclusions. False information and deception are more prevalent today with the proliferation of misinformation, which is commonly found on social media and readily available on the internet.
2-11. Analysts may decide to retain or exclude information based on results from the screen phase. While the excluded information may not be pertinent to the current analytical question, the information is maintained in a unit repository, as it may answer a follow-on or new analytical question.
2-12. During operations, intelligence analysts must consider information relevancy, reliability, and accuracy to perform analysis:
- Relevancy: Analysts examine the information to determine its pertinence to the threat or the OE. Once the information is assessed as relevant, analysts continue with the analysis process.
- Reliability: The source of the information is scrutinized for reliability. If the source of the information is unknown, the level of reliability decreases significantly.
- Accuracy: Unlike reliability, accuracy is based on other information that can corroborate (or not) the available information. When possible, analysts should obtain information that confirms or denies a conclusion in order to detect deception, misconstrued information, or bad data or information. Additionally, when possible, analysts should characterize their level of confidence in that conclusion.
2-13. There are marked differences in evaluating the accuracy of information between higher and lower echelons. Higher (strategic) echelons have more sources of information and intelligence than lower (tactical) echelons, giving higher echelons more opportunities to confirm, corroborate, or refute the accuracy of the reported data. The role of higher echelons in evaluating the credibility (or probable truth) of information differs from its role in evaluating the reliability of the source (usually performed best by the echelon closest to the source).
2-14. Information is evaluated for source reliability and accuracy based on a standard system of evaluation ratings for each piece of information, as indicated in table 2-1; reliability is represented by a letter and accuracy by a number. Single-source intelligence personnel assign the rating, and it is essential for all-source personnel to understand the evaluation of validated intelligence sources.
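As a simple illustration of pairing a letter-coded source reliability rating with a number-coded accuracy rating, the sketch below uses the commonly used A–F/1–6 evaluation scale. Because table 2-1 is not reproduced here, the descriptive labels in the sketch are assumptions based on that common scale rather than quotations from the table, and the example ratings are invented.

```python
# Hedged sketch: letter = source reliability, number = information accuracy.
# Label wording is an assumption based on the widely used A-F / 1-6 scale,
# not a reproduction of table 2-1.
RELIABILITY = {
    "A": "Completely reliable", "B": "Usually reliable", "C": "Fairly reliable",
    "D": "Not usually reliable", "E": "Unreliable", "F": "Reliability cannot be judged",
}
ACCURACY = {
    1: "Confirmed by other sources", 2: "Probably true", 3: "Possibly true",
    4: "Doubtful", 5: "Improbable", 6: "Truth cannot be judged",
}

def describe(rating: str) -> str:
    """Expand a combined rating such as 'B2' into its two components."""
    letter, number = rating[0].upper(), int(rating[1:])
    return f"{rating}: {RELIABILITY[letter]} / {ACCURACY[number]}"

print(describe("B2"))   # a usually reliable source reporting probably true information
print(describe("F6"))   # neither the source nor the information can be judged
```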
2-15. Reliable and accurate information is integrated into the analytical production. Data that is less reliable or accurate is not discarded; it is retained for possible additional screening with other established information or if new requirements arise that are relevant to existing data.
INTEGRATE
2-16. As analysts reach new conclusions about the threat activities during the analyze phase, they should corroborate and correlate this information with prior intelligence holdings using reasoning and analytic techniques. Analysts determine how new information relates to previous analytical conclusions. New information may require analysts to alter or validate initial conclusions. Analysts must continue to evaluate and integrate reliable and accurate information relevant to their mission.
2-17. Analysts resume the analysis based on questions (hypotheses) they established during the screen and analyze phases. At this point, analysts begin to draw conclusions that translate into an initial determination that is likely to require additional analysis and, in certain instances, additional collection. They employ the analytic tradecraft standards to assess probabilities and confidence levels; they employ the action-metrics associated with analytical rigor to draw accurate conclusions. However, some of these conclusions may present alternative COAs not previously considered during IPB. These COAs must be presented to the commander and staff because they might have operational implications.
2-18. Hypotheses are tested and often validated during the integrate phase and become the basis for analytical production. To properly validate the hypotheses, analysts must demonstrate analytical rigor to determine the analytical sufficiency of their conclusions and be willing to present those points that prove the accuracy of their assessment.
PRODUCE
2-19. Intelligence and operational products are mutually supportive and enhance the commander and staff’s situational understanding. Intelligence products are generally categorized by the purpose for which the intelligence was produced. The categories can and do overlap, and the same intelligence and information can be used in each of the categories. JP 2-0 provides an explanation for each of the categories:
- Warning intelligence.
- Current intelligence.
- General military intelligence.
- Target intelligence.
- Scientific and technical intelligence.
- Counterintelligence.
- Estimative intelligence.
- Identity intelligence.
2-20. Intelligence analysis results in the production and dissemination of intelligence to the commander and staff. Intelligence analysts produce and maintain a variety of products tailored to the commander and staff and dictated by the current situation, standard operating procedures (SOPs), and battle rhythms.
Note. When disseminating intelligence products, intelligence analysts must recognize when intelligence information at a higher classification is essential for the commander’s awareness. Intelligence analysts and the intelligence staff must adhere to all appropriate U.S. laws, DOD regulations, classification guidelines, and security protocols.
The classification of U.S. intelligence presents a challenge in releasing information during multinational operations, although sharing information and intelligence as much as possible improves interoperability and trust. Commanders and staffs should understand U.S. and other nations’ policies about information sharing, since the early sharing of information (during planning) ensures effective multinational operations.
2-21. An analyst’s ultimate goal is finding threat vulnerabilities and assisting the commander and staff in exploiting them, even after the commander’s PIRs have been answered. If the intelligence analysis does not answer the commander’s PIRs, the analyst should reexamine the guidance, consider recommending different collection strategies, and review information previously discarded as nonessential.
2-22. In tactical units, analysts must understand that their adjacent and especially their subordinate units may have degraded communications. In those cases, analysts at each echelon must develop their own conclusions and assessments and should use their unit’s primary, alternate, contingency, and emergency (known as PACE) plan to facilitate continuous dissemination of their products and assessments.
Chapter 3
All-Source Analytical Tasks
3-1. Through the application of the all-source analytical tasks, intelligence analysis facilitates commanders and other decision makers’ ability to visualize the OE, organize their forces, and control operations to achieve their objectives. The all-source analytical tasks are—
- Generate intelligence knowledge.
- Perform IPB.
- Provide warnings.
- Perform situation development.
- Provide intelligence support to targeting and information operations.
3-2. In any operation, both friendly and threat forces will endeavor to set conditions to develop a position of relative advantage. Setting these conditions begins with generate intelligence knowledge, which provides relevant knowledge about the OE that is incorporated into the Army design methodology and used later during other analytical tasks.
3-3. The continuous assessment of collected information also mitigates risk to friendly forces while identifying opportunities to leverage friendly capabilities to open a window of opportunity. Analysis presents the commander with options for employing multiple capabilities and gaining a position of relative advantage over the threat.
3-4. For each all-source analytical task, the challenge for the intelligence analyst is understanding the unique requirements and considerations based on the situation, operational echelon, and specific mission.
3-5. There are many forms of analysis associated with unique operational activities. One important example of these types of activities is identity activities, which result in identity intelligence. Identity intelligence is the intelligence resulting from the processing of identity attributes concerning individuals, groups, networks, or populations of interest (JP 2-0). Identity activities are described as a collection of functions and actions conducted by maneuver, intelligence, and law enforcement components. Identity activities recognize and differentiate one person from another to support decision making. Identity activities include—
- The collection of identity attributes and physical materials.
- The processing and exploitation of identity attributes and physical materials.
- All-source analytical efforts.
- The production of identity intelligence and DOD law enforcement criminal intelligence products.
- The dissemination of those intelligence products to inform policy and strategy development, operational planning and assessment, and the appropriate action at the point of encounter.
GENERATE INTELLIGENCE KNOWLEDGE (ART 2.1.4)
3-6. Generate intelligence knowledge is a continuous task driven by the commander. It begins before receipt of mission and enables the analyst to acquire as much relevant knowledge as possible about the OE for the conduct of operations. Information is obtained through intelligence reach, research, data mining, database access, academic studies, intelligence archives, publicly available information, and other information sources, such as biometrics, forensics, and document and media exploitation (DOMEX).
3-7. Generate intelligence knowledge includes the following five tasks, which facilitate creating a foundation for performing IPB and mission analysis:
- Develop the foundation to define threat characteristics: Analysts create a database of known hostile threats and define their characteristics in a general location. Analysts can refine and highlight important threats through functional analysis that can be prioritized later during steps 3 and 4 of the IPB process.
- Obtain detailed terrain information and intelligence: Analysts describe the terrain of a general location and categorize it by environment type. For example, desert and jungle environments have distinguishing characteristics that can assist in analyzing terrain during step 2 of the IPB process.
- Obtain detailed weather and weather effects information and intelligence: Analysts describe the climatology of a general location and forecast how it would affect future operations. Analysts should rely on the Air Force staff weather officer of their respective echelons to assist in acquiring weather support products, information, and knowledge. If the staff weather officer is not readily available, analysts should use publicly available information and resources. Information regarding climatology characteristics can assist in analyzing weather effects during step 2 of the IPB process.
- Obtain detailed civil considerations information and intelligence: Analysts identify civil considerations (areas, structures, capabilities, organizations, people, and events [ASCOPE]) within a general location. Analysts can refine this information further when they receive a designated area of interest and can assist in determining how civil considerations will affect friendly and threat operations during step 2 of the IPB process.
- Complete studies: Although analysts do not have a specific operation, mission, or area of responsibility when generating intelligence knowledge, they can compile information into products based on the commander’s guidance. This supports the commander’s visualization and completes studies for dissemination. Completed studies or products include country briefs, written assessments, or graphics. These products inform the commander and staff on current and historic situations that may affect future operations when a mission is received.
PERFORM INTELLIGENCE PREPARATION OF THE BATTLEFIELD (ART 2.2.1)
3-8. Analytical support begins during the military decision-making process (MDMP). The MDMP is an iterative planning methodology to understand the situation and mission, develop a course of action, and produce an operation plan or order. Commanders use the MDMP to visualize the OE and the threat, build plans and orders for extended operations, and develop orders for short-term operations within the framework of a long-range plan. During the mission analysis step of the MDMP, intelligence analysts lead the IPB effort; however, they cannot provide all of the information the commander requires for situational understanding. Other staff sections or supporting elements assist in producing and continuously refining intelligence products tailored to the commander’s requirements and the operation.
3-9. As analysts begin the IPB process, they should have a general understanding of their OE based on intelligence produced and acquired when generating intelligence knowledge. IPB is a four-step process:
- Step 1—Define the OE. The intelligence staff identifies those significant characteristics related to the mission variables of enemy, terrain and weather, and civil considerations that are relevant to the mission. The intelligence staff evaluates significant characteristics to identify gaps and initiate information collection. During step 1, the area of operations (AO), area of interest, and area of influence must also be identified and established.
- Step 2—Describe environmental effects on operations. The intelligence staff describes how significant characteristics affect friendly operations. The intelligence staff also describes how terrain, weather, civil considerations, and friendly forces affect threat forces. The entire staff determines the effects of friendly and threat force actions on the population.
- Step 3—Evaluate the threat. Evaluating the threat is understanding how a threat can affect friendly operations. Step 3 determines threat force capabilities and the doctrinal principles and tactics, techniques, and procedures that threat forces prefer to employ.
- Step 4—Determine threat COAs. The intelligence staff identifies and develops possible threat COAs that can affect accomplishing the friendly mission. The staff uses the products associated with determining threat COAs to assist in developing and selecting friendly COAs during the COA steps of the MDMP. Identifying and developing all valid threat COAs minimizes the potential of surprise to the commander by an unanticipated threat action.
PROVIDE WARNINGS (ART 2.1.1.1)
3-10. Across the range of military operations, various collection assets provide early warning of threat action. As analysts screen incoming information and message traffic, they provide the commander with advanced warning of threat activities or intentions that may change the basic nature of the operation. These warnings enable the commander and staff to quickly reorient the force to unexpected contingencies and to shape the OE.
3-11. Analysts can use analytic techniques and their current knowledge databases to project multiple scenarios and develop indicators as guidelines for providing warning intelligence. An indicator is, in intelligence usage, an item of information which reflects the intention or capability of an adversary to adopt or reject a course of action (JP 2-0). Analysts project future events and identify event characteristics that can be manipulated or affected. Characteristics that cannot be manipulated or affected should be incorporated into unit SOPs as warning intelligence criteria.
PERFORM SITUATION DEVELOPMENT (ART 2.2.2)
3-12. Intelligence analysis is central to situation development, as it is a process for analyzing information and producing current intelligence concerning the relevant aspects of the OE within the AO before and during operations. Analysts continually produce current intelligence to answer the commander’s requirements, update and refine IPB products, and support transitions to the next phase of an operation.
3-13. Analysts continually analyze the current situation and information to predict the threat’s next objective or intention. During step 3 of the IPB process, analysts compare the current situation with their threat evaluations to project multiple scenarios and develop indicators.
Understanding how the threat will react supports the planning of branches and sequels, affording the commander multiple COAs and flexibility on the battlefield during current operations. For example, observing a threat unit in a defensive posture may indicate an offensive operation within a matter of hours.
Providing this information to the commander enables the staff to pursue a different COA that can place friendly units in a better position of relative advantage. The commander may use a flanking maneuver on the threat since it is in a relatively stationary position, hindering the future offensive operation.
PROVIDE INTELLIGENCE SUPPORT TO TARGETING AND INFORMATION OPERATIONS (ART 2.4)
3-14. Targeting is the process of selecting and prioritizing targets and matching the appropriate response to them, considering operational requirements and capabilities.
3-15. Intelligence analysis, starting with the IPB effort, supports target development and target detection:
- Intelligence analysis support to target development: Target development involves the systematic analysis of threat forces and operations to determine high-value targets (HVTs) (people, organizations, or military units the threat commander requires for successful completion of the mission), HPTs (equipment, military units, organizations, groups, or specific individuals whose loss to the threat contributes significantly to the success of the friendly COA), and systems and system components for potential engagement through maneuver, fires, electronic warfare, or information operations.
- Intelligence analysis support to target detection: Intelligence analysts establish procedures for disseminating targeting information. The targeting team develops the sensor and attack guidance matrix to determine the sensors required to detect and locate targets. Intelligence analysts incorporate these requirements into the collection management tools, which assist the operations staff in developing the information collection plan.
3-16. Information operations is the integrated employment, during military operations, of information-related capabilities in concert with other lines of operation to influence, disrupt, corrupt, or usurp the decision-making of adversaries and potential adversaries while protecting our own (JP 3-13). Intelligence support to military information operations pertains to the collection of information essential to define the information environment, understand the threat’s information capabilities, and assess or adjust information-related effects. Continuous and timely intelligence is required to accurately identify the information environment across the physical, informational, and cognitive dimensions, including the operational variables (PMESII-PT). Intelligence support to military information operations focuses on the following:
- Aspects of the information environment that influence, or are influenced by, the threat.
- Understanding threat information capabilities.
- Understanding the methods by which messages are transmitted and received in order to assess the cognitive reception and processing of information within the target audience.
- Assessing information-related effects (target audience motivation and behavior, measure of effectiveness, and information indicators of success or failure).
PART TWO
Task Techniques
Chapter 4
Analytic Techniques
OVERVIEW
4-1. Intelligence analysts use cognitive processes and analytic techniques and tools to solve intelligence problems and limit analytical errors. The specific number of techniques and tools applied depends on the mission and situation.
4-2. The following distinguishes between a technique, tool, and method:
- Technique is a way of doing something by using special knowledge or skill. An analytic technique is a way of looking at a problem, which results in a conclusion, assessment, or both. A technique usually guides analysts in thinking about a problem instead of providing them with a definitive answer as typically expected from a method.
- Tool is a component of an analytic technique that facilitates the execution of the technique but does not provide a conclusion or assessment in and of itself. Tools facilitate techniques by allowing analysts to display or arrange information in a way that enables analysis of the information. An example of a tool is a link diagram or a matrix. Not all techniques have an associated tool.
- Method is a set of principles and procedures for conducting qualitative analysis.
APPLYING STRUCTURED ANALYTIC TECHNIQUES
4-3. Structured analysis assists analysts in ensuring their analytic framework—the foundation upon which they form their analytical judgments—is as solid as possible. It entails separating and organizing the elements of a problem and reviewing the information systematically. Structured analytic techniques provide ways for analysts to separate the information into subsets and assess it until they generate a hypothesis found to be either feasible or untrue. Structured analytic techniques—
- Assist analysts in making sense of complex problems.
- Allow analysts to compare and weigh pieces of information against each other.
- Ensure analysts focus on the issue under study.
- Force analysts to consider one element at a time systematically.
- Assist analysts in overcoming their logical fallacies and biases.
- Ensure analysts see the elements of information. This enhances their ability to identify correlations and patterns that would not be apparent unless depicted outside the mind.
- Enhance analysts’ ability to collect and review data. This facilitates thinking with a better base to derive alternatives and solutions.
4-4. Applying the appropriate structured analytic technique assists commanders in better understanding and shaping the OE. One technique may not be sufficient to assist in answering PIRs; therefore, analysts should use multiple techniques, time permitting. For example, determining the disposition and composition of the threat in the OE is like attempting to put the pieces of a puzzle together. Employing multiple analytic techniques facilitates piecing the puzzle together, creating a clearer picture.
4-5. For thorough analysis, analysts should incorporate as many appropriate techniques as possible into their workflow. Although this may be more time consuming, analysts become more proficient at using these techniques, ultimately reducing the amount of time required to conduct analysis. The exact techniques and tools incorporated, as well as the order in which to execute them, are mission- and situation-dependent. There is no one correct way to apply these techniques as each analyst’s experience, preference, and situation are influencing factors.
4-6. Analysts can apply structured analytic techniques in the analyze and integrate phases of the intelligence analysis process to assist them in solving analytic problems. The analytic problem can vary depending on the echelon or mission.
4-7. The vast amount of information that analysts must process can negatively affect their ability to complete intelligence assessments accurately and on time; therefore, analysts should be proficient at using both manual and automated methods to conduct structured analysis. Additionally, analysts conduct analysis from varying environments and echelons in which automation and network connectivity may not be fully mission capable.
Chapter 5
Basic and Diagnostic Structured Analytic Techniques
SECTION I – BASIC STRUCTURED ANALYTIC TECHNIQUES
5-1. Basic structured analytic techniques are the building blocks upon which further analysis is performed. They are typically executed early in the intelligence effort to obtain an initial diagnosis of the intelligence problem through revealing patterns. The basic structured analytic techniques described in this publication are—
- Sorting technique: Organizing large bodies of data to reveal new insights.
- Chronologies technique:
  - Displaying data over time.
  - Placing events or actions in order of occurrence.
  - Linearly depicting events or actions.
- Matrices technique:
  - Organizing data in rows and columns.
  - Comparing items through visual representation.
- Weighted ranking technique:
  - Facilitating the application of objectivity.
  - Mitigating common cognitive pitfalls.
- Link analysis technique: Mapping and measuring relationships or links between entities.
- Event tree and event mapping techniques: Diagramming hypotheses-based scenarios.
5-2. These techniques—
- Improve assessments by making them more rigorous.
- Improve the presentation of the finished intelligence in a persuasive manner.
- Provide ways to measure progress.
- Identify information gaps.
- Provide information and intelligence.
SORTING
5-3. Sorting is a basic structured analytic technique used for grouping information in order to develop insights and facilitate analysis. This technique is useful for reviewing massive data stores pertaining to an intelligence challenge. Sorting vast amounts of data can provide insights into trends or abnormalities that warrant further analysis and that otherwise would go unnoticed. Sorting also assists in reviewing multiple categories of information that when divided into components presents possible trends, similarities, differences, or other insights not readily identifiable.
5-4. Method. The following steps outline the process of sorting:
Step 1: Arrange the information into categories to determine which categories or combination of categories might show trends or abnormalities that would provide insight into the problem being studied.
Step 2: Review the listed facts, information, or hypotheses in the database to identify key fields that may assist in uncovering possible patterns or groupings.
Step 3: Group those items according to the schema of the categories defined in step 1.
Step 4: Choose a category and sort the information within that category. Look for any insights, trends, or oddities.
Step 5: Review (and re-review) the sorted facts, information, or hypotheses to determine alternative ways to sort them. List any alternative sorting schema for the problem. One of the most useful applications of this technique is sorting according to multiple schemas and examining results for correlations between data and categories. For example, analysts identify from the sorted information that most attacks occurring on the main supply route also occur at a specific time.
5-5. A pattern analysis plot sheet is a common analysis tool for sorting information. It can be configured to determine threat activity as it occurs within a specified time. The pattern analysis plot sheet is a circular matrix and calendar. Each concentric circle represents one day and each wedge in the circle is one hour of the day.
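The sorting workflow in paragraphs 5-4 and 5-5 can be illustrated with a minimal Python sketch. The incident records, route names, and field layout below are hypothetical placeholders, not a prescribed data model; the sketch only shows grouping one data set under several schemas (route, hour of day, and both together) to surface the kind of correlation described in step 5 and binned by a pattern analysis plot sheet.

```python
from collections import Counter
from datetime import datetime

# Hypothetical incident reports: (timestamp, route, incident type).
incidents = [
    ("2023-04-01 05:45", "MSR TAMPA", "IED"),
    ("2023-04-03 06:10", "MSR TAMPA", "IED"),
    ("2023-04-04 14:30", "ASR BLUE", "small arms"),
    ("2023-04-08 05:55", "MSR TAMPA", "IED"),
    ("2023-04-09 21:15", "ASR BLUE", "indirect fire"),
]

# Schema 1: sort/group by route.
by_route = Counter(route for _, route, _ in incidents)

# Schema 2: sort/group by hour of day (one "wedge" of a pattern analysis plot sheet).
by_hour = Counter(datetime.strptime(ts, "%Y-%m-%d %H:%M").hour for ts, _, _ in incidents)

# Schema 3: combine schemas to look for correlations (for example, most attacks on
# the main supply route clustering around a specific hour).
by_route_and_hour = Counter(
    (route, datetime.strptime(ts, "%Y-%m-%d %H:%M").hour) for ts, route, _ in incidents
)

print(by_route.most_common())
print(by_hour.most_common())
print(by_route_and_hour.most_common(3))
```

Counting under the combined schema is what makes a pattern such as "IEDs on the main supply route cluster around 0600" visible without manually rereading every report.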
CHRONOLOGIES
5-7. A chronology is a list that places events or actions in the order they occurred; a timeline is a graphical depiction of those events. Analysts must consider factors that may influence the timing of events. For example, the chronological time of events may be correlated to the lunar cycle (moonset), religious events, or friendly patrol patterns. Timelines assist analysts in making these types of determinations.
5-8. Method. Creating a chronology or timeline involves three steps:
- Step 1: List relevant events by the date or in order each occurred. Analysts should ensure they properly reference the data.
- Step 2: Review the chronology or timeline by asking the following questions:
- What are the temporal distances between key events? If lengthy, what caused the delay? Are there missing pieces of data that should be collected to fill those gaps?
- Did analysts overlook pieces of intelligence information that may have had an impact on the events?
- Conversely, if events seem to happen more rapidly than expected, is it possible that analysts have information related to multiple-event timelines?
- Are all critical events necessary and shown for the outcome to occur?
- What are the intelligence gaps?
- What are indicators for those intelligence gaps?
- What are the vulnerabilities in the timeline for collection activities?
- What events outside the timeline could have influenced the activities?
- Step 3: Summarize the data along the line. Sort each side of the line by distinguishing between types of data. For example, depict intelligence reports above the timeline and depict significant activities below the timeline. Multiple timelines may be used and should depict how and where they converge.
5-9. Timelines are depicted linearly and typically relate to a single situation or COA. Multilevel timelines allow analysts to track concurrent COAs that may affect each other. Analysts use timelines to postulate about events that may have occurred between known events. They become sensitized to search for indicators, so the missing events are found and charted. Timelines may be used in conjunction with other structured analytic techniques, such as the event tree technique (see paragraph 5-22), to analyze complex networks and associations.
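A chronology is straightforward to build and review programmatically. The dates and events below are hypothetical; the sketch simply orders reported events and surfaces the temporal distance between them, which supports the step 2 questions about gaps and missing reporting.

```python
from datetime import date

# Hypothetical reported events; dates and descriptions are illustrative only.
events = [
    (date(2023, 5, 14), "Weapons cache discovered"),
    (date(2023, 5, 2), "Threat reconnaissance of bridge reported"),
    (date(2023, 5, 30), "Attack on bridge checkpoint"),
]

# Step 1: list relevant events in the order each occurred.
chronology = sorted(events)

# Step 2: review the temporal distance between key events; large gaps may point
# to missing reporting or collection gaps.
for (d1, e1), (d2, e2) in zip(chronology, chronology[1:]):
    gap = (d2 - d1).days
    print(f"{d1} -> {d2} ({gap} days): {e1} -> {e2}")
```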
5-10. Figure 5-3 illustrates a time event chart, which is a variation of a timeline using symbols to represent events, dates, and the flow of time. While there is great latitude in creating time event charts, the following should be considered when creating them:
- Depict the first event as a triangle.
- Depict successive events as rectangles.
- Mark noteworthy events with an X across the rectangles.
- Display the date on the symbol.
- Display a description below the symbol.
- If using multiple rows, begin each row from left to right.
MATRICES
5-11. A matrix is a grid with as many cells as required to sort data and gain insight. Whenever information can be incorporated into a matrix, it can provide analytic insight. A matrix can be rectangular, square, or triangular; it depends on the number of rows and columns required to enter the data. Three commonly used matrices are the—
- Threat intentions matrix—assists in efficiently analyzing information from the threat’s point of view based on the threat’s motivation, goals, and objectives. (See paragraph 5-14.)
- Association matrix—identifies the existence and type of relationships between individuals as determined by direct contact.
- Activities matrix—determines connections between individuals and any organization, event, entity, address, activity, or anything other than persons.
5-12. A key feature of the matrices analytic technique is the formulation of ideas of what may occur when one element of a row interacts with the corresponding element of a column. This differs from other matrices, such as the event matrix (described in ATP 2-01.3), in which the elements of the columns and rows do not interact to formulate outcomes; the matrix is primarily used to organize information. Table 5-3 briefly describes when to use the matrices technique, as well as the value added and potential pitfalls associated with using this technique.
5-13. Method. The following steps outline the process for constructing a matrix (see figure 5-4):
- Step 1: Draw a matrix with enough columns and rows to enter the two sets of data being compared.
- Step 2: Enter the range of data or criteria along the uppermost horizontal row and the farthest left vertical column, leaving a space in the upper left corner of the matrix.
- Step 3: In the grid squares in between, annotate the relationships, or lack thereof, in the cell at the intersection between two associated data points.
- Step 4: Review the hypotheses developed about the issue considering the relationships shown in the matrix; if appropriate, develop new hypotheses based on the insight gained from the matrix.
5-14. The following steps pertain to the threat intentions matrix technique (see figure 5-4):
Step 1: Enter the decision options believed to be reasonable from the threat’s viewpoint along the farthest left vertical column.
Step 2: Enter the objectives for each option from the threat’s viewpoint in the objectives column.
Step 3: Enter the benefits for each option from the threat’s viewpoint in the benefits column.
Step 4: Enter the risks for each option from the threat’s viewpoint in the risks column.
Step 5: Fill in the implications column, which transitions the analyst from the threat’s viewpoint to the analyst’s viewpoint. Enter the implications from the threat’s viewpoint and then add a slash (/) and enter the implications from the analyst’s viewpoint.
Step 6: Enter the indicators from the analyst’s viewpoint in the indications column. This provides a basis for generating collection to determine as early as possible which option the threat selected.
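A threat intentions matrix can be held in a simple table structure before it is drawn. The options, column entries, and wording below are hypothetical examples rather than doctrinal content; the sketch only shows one row per decision option with the columns described in steps 1 through 6.

```python
# Hypothetical threat intentions matrix rows; options and entries are illustrative only.
columns = ["option", "objective", "benefits", "risks", "implications", "indicators"]
matrix = [
    {
        "option": "Defend river line",
        "objective": "Delay friendly crossing",
        "benefits": "Preserves combat power",
        "risks": "Cedes initiative",
        "implications": "Buys time / signals defensive intent",
        "indicators": "Obstacle emplacement, engineer activity",
    },
    {
        "option": "Spoiling attack",
        "objective": "Disrupt friendly staging",
        "benefits": "Seizes initiative",
        "risks": "Exposes reserve",
        "implications": "Accepts attrition / signals offensive intent",
        "indicators": "Forward movement of artillery, recon probes",
    },
]

# Print one row per decision option so the columns can be compared side by side.
print(" | ".join(columns))
for row in matrix:
    print(" | ".join(row[c] for c in columns))
```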
WEIGHTED RANKING
5-15. The weighted ranking technique is a systematic approach that provides transparency in the derivation and logic of an assessment. This facilitates the application of objectivity to an analytic problem. To simplify the weighted ranking technique, this publication introduces subjective judgments instead of dealing strictly with hard numbers; however, objectivity is still realized. This technique requires analysts to select and give each criterion a weighted importance from the threat’s viewpoint. Analysts use the criticality, accessibility, recuperability, vulnerability, effect, and recognizability (also called CARVER) matrix tool to employ this technique to support targeting prioritization. (See ATP 3-60.) The insight gained from how each criterion affects the outcome allows for a clear and persuasive presentation and argumentation of the assessment.
5-16. Weighted ranking assists in mitigating common cognitive pitfalls by converting the intelligence problem into a type of mathematical solution. The validity of weighting criteria is enhanced through group discussions, as group members share insights into the threat’s purpose and viewpoint; red hat/team analysis can augment this technique. Weighted ranking uses matrices to compute and organize information.
5-17. Method. The following steps describe how to accomplish a simplified weighted ranking review of alternative options:
- Step 1: Create a matrix and develop all options and criteria related to the analytical issue. Figure 5-5 depicts the options as types of operations and the criteria as the five military aspects of terrain (observation and fields of fire, avenues of approach, key terrain, obstacles, and cover and concealment [OAKOC]).
- Step 2: Label the left, uppermost column/row of the matrix as options and fill the column with the types of operations generated in step 1.
- Step 3: List the criteria (OAKOC) generated in step 1 in the top row with one criterion per column.
- Step 4: Assign weights and list them in parentheses next to each criterion. Depending on the number of criteria, use either 10 or 100 points and divide them based on the analyst’s judgment of each criterion’s relative importance. Figure 5-5 shows how the analyst assigned the weights from the threat’s perspective to the OAKOC factors using 10 points.
- Step 5: Work across the matrix one option (type of operation) at a time to evaluate the relative ability of the option to satisfy the corresponding criterion from the threat’s perspective. Using the 10-point rating scale, assign 1 as low and 10 as high to rate each option separately. (See figure 5-5 for steps 1 through 5.)
- Step 6: Work across the matrix again, one option at a time, and multiply the criterion weight by the option rating and record this number in each cell. (See figure 5-6.)
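The arithmetic in steps 4 through 6 is easy to verify with a short sketch. The criterion weights and option ratings below are hypothetical values chosen for illustration; the sketch multiplies each rating by its criterion weight and sums across criteria to rank the options.

```python
# Hypothetical weights and ratings; the numbers are illustrative only.
# Step 4: weights for the OAKOC criteria from the threat's perspective (10 points total).
weights = {"observation": 2, "avenues": 3, "key terrain": 2, "obstacles": 2, "cover": 1}

# Step 5: rate each option (type of operation) against each criterion (1 = low, 10 = high).
ratings = {
    "attack": {"observation": 6, "avenues": 8, "key terrain": 7, "obstacles": 3, "cover": 5},
    "defend": {"observation": 8, "avenues": 4, "key terrain": 9, "obstacles": 8, "cover": 7},
    "delay":  {"observation": 5, "avenues": 7, "key terrain": 5, "obstacles": 6, "cover": 6},
}

# Step 6: multiply each rating by the criterion weight and sum across criteria.
scores = {
    option: sum(weights[c] * rating for c, rating in row.items())
    for option, row in ratings.items()
}

for option, score in sorted(scores.items(), key=lambda kv: kv[1], reverse=True):
    print(option, score)
```

Ranking by the summed products keeps the assessment transparent: changing a single weight or rating shows immediately how sensitive the result is to that judgment.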
LINK ANALYSIS
5-18. Link analysis, often known as network analysis, is a technique used to evaluate the relationships between several types of entities such as organizations, individuals, objects, or activities. Visualization tools augment this technique by organizing and displaying data and assisting in identifying associations within complex networks. Although analysts can perform link analysis manually, they often use software to aid this technique.
5-19. Analysts may use link analysis to focus on leaders and other prominent individuals, who are sometimes critical factors in the AO. Analysts use personality files—often obtained from conducting identity activities using reporting and biometrics, forensics, and DOMEX data—to build organizational diagrams that assist them in determining relationships between critical personalities and their associations to various groups or activities. This analysis is critical in determining the roles and relationships of many different people and organizations and assessing their loyalties, political significance, and interests.
5-20. Method. The following steps describe how to construct a simple link analysis diagram:
Step 1: Extract entities and the information about their relationships from intelligence holdings that include but are not limited to biometrics, forensics, and DOMEX information.
Step 2: Place entity associations into a link chart using link analysis software or a spreadsheet or by drawing them manually:
- Use separate shapes for different types of entities, for example, circles for people, rectangles for activities, and triangles for facilities. (See figure 5-7 on page 5-10.)
- Use colored and varying types of lines to show different activities, for example, green solid lines for money transfers, blue dotted lines for communications, and solid black lines for activities. This differentiation typically requires a legend. (See figure 5-7 on page 5-10.)
Step 3: Analyze the entities and links in the link chart.
Step 4: Review the chart for gaps, significant relationships, and the meaning of the relationships based on the activity occurring. Ask critical questions of the data such as—
- Which entity is central or key to the network?
- Who or what is the initiator of interactions?
- What role does each entity play in the network?
- Who or what forms a bridge or liaison between groups or subgroups?
- How have the interactions changed over time?
- Which nodes should be targeted for collection or defeat?
Step 5: Summarize what is observed in the chart and draw interim hypotheses about the relationships.
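Link analysis software automates this work, but the underlying structure is a simple graph. The entities, links, and activity labels below are hypothetical; the sketch builds an adjacency list from reported associations and uses a degree count as one rough indicator of which entity is central to the network (step 4).

```python
from collections import defaultdict

# Hypothetical entity links extracted from reporting: (entity A, entity B, activity).
links = [
    ("PERSON Alpha", "PERSON Bravo", "communications"),
    ("PERSON Alpha", "FACILITY Safehouse 1", "meeting"),
    ("PERSON Bravo", "ACTIVITY Money transfer", "finance"),
    ("PERSON Charlie", "FACILITY Safehouse 1", "meeting"),
    ("PERSON Alpha", "ACTIVITY Money transfer", "finance"),
]

# Step 2: place entity associations into a link chart (here, an adjacency list).
chart = defaultdict(set)
for a, b, activity in links:
    chart[a].add((b, activity))
    chart[b].add((a, activity))

# Steps 3-4: analyze the chart; a simple degree count suggests which entity
# is central or key to the network.
degree = {entity: len(neighbors) for entity, neighbors in chart.items()}
central = max(degree, key=degree.get)
print("Most connected entity:", central, degree[central])
```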
5-21. The three types of visualization tools used in link analysis to record and visualize information are—
- Link diagram.
- Association matrix.
- Activities matrix.
EVENT TREE
5-22. The event tree is a structured analytic technique that enables analysts to depict a possible sequence of events, including the potential branches of that sequence, in a graphical format. An event tree works best when there are multiple, mutually exclusive options that cover the spectrum of reasonable alternatives. It clarifies the presumed sequence of events or decisions between an initiating event and an outcome. Table 5-6 briefly describes when to use the event tree technique, as well as the value added and potential pitfalls associated with using this technique. The following are pointers for analysts using the event tree technique:
- Use this technique in conjunction with weighted ranking, hypothesis-review techniques, and subjective probability to gain added insights.
- Leverage the expertise of a group of analysts during the construction of an event tree to ensure all events, factors, and decision options are considered.
5-23. Method. The following outlines the steps for creating event trees (see figure 5-10):
- Step 1: Identify the intelligence issue/problem (antigovernment protest in Egypt).
- Step 2: Identify the mutually exclusive and complete set of hypotheses that pertain to the intelligence issue/problem (Mubarak resigns or Mubarak stays).
- Step 3: Decide which events, factors, or decisions (such as variables) will have the greatest influence on the hypotheses identified in step 2.
- Step 4: Decide on the sequencing for when these factors are expected to occur or affect one another.
- Step 5: Determine the event options (Mubarak stays—hardline, reforms, some reforms) within each hypothesis and establish clear definitions for each event option to ensure collection strategies to monitor events are effective.
- Step 6: Construct the event tree from left to right. Each hypothesis is a separate main branch. Start with the first hypothesis and have one branch from this node for each realistic path the first event can take. Proceed down each event option node until the end state for that subbranch is reached. Then move to the next hypothesis and repeat the process.
- Step 7: Determine what would indicate a decision has been made at each decision point for each option to use in generating an integrated collection plan.
- Step 8: Assess the implications of each hypothesis on the intelligence problem.
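An event tree maps naturally to a nested structure that can be walked branch by branch. The hypotheses and event options below extend the Mubarak example from the steps above and are illustrative only; the sketch enumerates every path from a hypothesis to an end state, which is useful when pairing the tree with indicator development (step 7).

```python
# Hypothetical event tree keyed by hypothesis; the branches are illustrative, not an assessment.
event_tree = {
    "Mubarak resigns": {
        "military transition": {"elections held": {}, "elections delayed": {}},
        "opposition-led government": {},
    },
    "Mubarak stays": {
        "hardline": {},
        "reforms": {},
        "some reforms": {},
    },
}

def enumerate_paths(tree, path=()):
    """Walk the tree left to right and yield each branch down to its end state."""
    if not tree:
        yield path
        return
    for option, subtree in tree.items():
        yield from enumerate_paths(subtree, path + (option,))

for path in enumerate_paths(event_tree):
    print(" -> ".join(path))
```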
EVENT MAPPING
5-24. The event mapping technique uses brainstorming to assist in diagramming scenarios/elements stemming from analyst-derived hypotheses. Scenarios/Elements are linked around a central word or short phrase representing the issue/problem to be analyzed.
5-25. Event mapping scenarios/elements are arranged intuitively based on the importance of the concepts, and they are organized into groups, branches, or areas. Using the radial diagram format in event mapping assists in mitigating some bias, such as implied prioritization, anchoring, or other cognitive biases derived from hierarchy or sequential arrangements.
5-26. Method. The following outlines the steps for applying event maps (see figure 5-11):
Step 1: Place the word or symbol representing the issue/problem to be analyzed in the center of the medium from which the event map will be constructed.
Step 2: Add symbols/words to represent possible actions/outcomes around the central issue/problem.
Step 3: Link the possible actions/outcomes to the central issue or problem. If desired, use colors to indicate the major influence the link represents. For example, use green for economic links, red for opposition groups, or purple for military forces. Colors may also be used to differentiate paths for ease of reference.
Step 4: Continue working outward, building the scenario of events into branches and subbranches for each hypothesis in detail.
Step 5: If ideas end, move to another area or hypothesis.
Step 6: When creativity wanes, stop and take a break. After the break, return and review the map and make additions and changes as desired.
Step 7: As an option, number the links or decision points for each hypothesis. On a separate piece of paper, write down the evidence to be collected for each number that would disprove that link or decision. Use the lists for each number to develop an integrated collection strategy for the issue/problem.
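Although event maps are normally drawn by hand or on a whiteboard, the result can be captured in a simple data structure for later review. The central issue, branches, and influence labels below are hypothetical; the sketch records branches and subbranches around the central issue and numbers the links as suggested in step 7.

```python
# Hypothetical event map; the central issue, branches, and influence labels are illustrative only.
event_map = {
    "issue": "Antigovernment protest",
    "branches": [
        {"influence": "economic", "outcome": "Fuel subsidy cut",
         "sub_branches": ["Price protests spread", "Strikes in transport sector"]},
        {"influence": "opposition", "outcome": "Opposition calls general strike",
         "sub_branches": ["Mass rally downtown"]},
        {"influence": "military", "outcome": "Security forces deploy to capital",
         "sub_branches": ["Curfew announced"]},
    ],
}

# Number the links (step 7) so disconfirming evidence for each can be listed
# separately and rolled into a collection strategy.
link_number = 0
for branch in event_map["branches"]:
    link_number += 1
    print(f"{link_number}. {event_map['issue']} -> {branch['outcome']} [{branch['influence']}]")
    for sub in branch["sub_branches"]:
        link_number += 1
        print(f"{link_number}.   {branch['outcome']} -> {sub}")
```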
SECTION II – DIAGNOSTIC STRUCTURED ANALYTIC TECHNIQUES
5-27. Diagnostic structured analytic techniques make analytical arguments, assumptions, and/or intelligence gaps more transparent. They are often used in association with most other analytic techniques to strengthen analytical assessments and conclusions. The most commonly used diagnostic techniques are—
- Key assumptions check technique: Reviewing assumptions that form the analytical judgments of the problem.
- Quality of information check technique:
- Source credibility and access.
- Plausibility of activity.
- Imminence of activity.
- Specificity of activity.
- Indicators/Signposts of change technique:
- Identifying a set of competing hypotheses.
- Creating lists of potential or expected events.
- Reviewing/Updating indicator lists.
- Identifying most likely hypotheses.
KEY ASSUMPTIONS CHECK
5-30. Method. Checking for key assumptions requires analysts to consider how their analysis depends on the validity of certain evidence. The following four-step process assists analysts in checking key assumptions:
- Step 1: Review what the current analytic line of thinking on the issue appears to be:
- What do analysts think they know?
- What key details assist analysts in accepting that the assumption is true?
- Step 2: Articulate the evidence, both stated and implied in finished intelligence, accepted as true.
- Step 3: Challenge the assumption by asking why it must be true and whether it is valid under all conditions. What is the degree of confidence in those initial answers?
- Step 4: Refine the list of key assumptions to contain only those that must be true in order to sustain the analytic line of thinking. Consider under what circumstances or based on what information these assumptions might not be true.
5-31. Analysts should ask the following questions during this process:
- What is the degree of confidence that this assumption is true?
- What explains the degree of confidence in the assumption?
- What circumstances or information might undermine this assumption?
- Is a key assumption more likely a key uncertainty or key factor?
- If the assumption proves to be wrong, would it significantly alter the analytic line of thinking? How?
- Has this process identified new factors that require further analysis?
QUALITY OF INFORMATION CHECK
5-32. Weighing the validity of sources is a key feature of any analytical assessment. Establishing how much confidence analysts have in their analytical judgments should be based on the information’s reliability and accuracy. Analysts should perform periodic checks of the information for their analytical judgments; otherwise, important analytical judgments may become anchored to poor-quality information.
5-33. Determining the quality of information independent of the source of the information is important in ensuring that neither duly influences the other. Not understanding the context in which critical information has been provided makes it difficult for analysts to assess the information’s validity and establish a confidence level in the intelligence assessment. A typically reliable source can knowingly report inaccurate information, and a typically unreliable source can sometimes report high-quality information. Therefore, it is important to keep the two reviews—source and information—separate.
This technique—
- Provides the foundation for determining the confidence level of an assessment and clarity to an analyst’s confidence level in the assessment.
- Provides an opportunity to catch interpretation errors and mitigate assimilation or confirmation bias based on the source:
- Assimilation bias is the modification and elaboration of new information to fit prior conceptions or hypotheses. The bias is toward confirming a preconceived answer.
- Confirmation bias is the conditions that cause analysts to undervalue or ignore evidence that contradicts an early judgment and value evidence that tends to confirm already held assessments.
- Identifies intelligence gaps and potential denial and deception efforts.
5-34. Method. For an information review to be fully effective, analysts need as much background information on sources as is possible. At a minimum, analysts should perform the following steps:
- Step 1: Review all sources of information for accuracy; identify any of the more critical or compelling sources. (For example, a human source with direct knowledge is compelling.)
- Step 2: Determine if analysts have sufficient and/or strong corroboration between the information sources.
- Step 3: Reexamine previously dismissed information considering new facts or circumstances.
- Step 4: Ensure any circular reporting is identified and properly flagged for other analysts; analysis based on circular reporting should also be reviewed to determine if the reporting was essential to the judgments made. (For example, a human source’s purpose for providing information may be to deceive.)
- Step 5: Consider whether ambiguous information has been interpreted and qualified properly. (For example, a signals intelligence transcript may be incomplete.)
- Step 6: Indicate a level of confidence analysts can place on sources that may likely figure into future analytical assessments.
Note. Analysts should consciously avoid relating the source to the information until the quality of information check is complete. If relating the source to the quality of information changes the opinion of the information, analysts must ensure they can articulate why. Analysts should develop and employ a spreadsheet to track the information and record their confidence levels in the quality of information as a constant reminder of the findings.
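The note above recommends a spreadsheet to track the quality of information check. A minimal sketch of such a tracking sheet follows; the report numbers, column names, and ratings are hypothetical, and the standard csv module is used only to show one way to record the findings separately from the source review.

```python
import csv
import sys

# Hypothetical tracking sheet for a quality of information check; report numbers,
# ratings, and flags are illustrative only.
rows = [
    {"report": "RPT-0012", "information": "Bridge rigged for demolition",
     "information_quality": "high", "circular_reporting": "no",
     "confidence_before_source_review": "moderate"},
    {"report": "RPT-0031", "information": "Reserve battalion moved east",
     "information_quality": "medium", "circular_reporting": "suspected",
     "confidence_before_source_review": "low"},
]

# Record the findings as a simple spreadsheet so the check is repeatable and the
# information review stays separate from the source review until it is complete.
writer = csv.DictWriter(sys.stdout, fieldnames=rows[0].keys())
writer.writeheader()
writer.writerows(rows)
```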
INDICATORS/SIGNPOSTS OF CHANGE
5-36. The indicators/signposts of change technique is primarily a diagnostic tool that assists analysts in identifying persons, activities, developments, or trends of interest. Indicators/Signposts of change are often tied to specific scenarios created by analysts to help them identify which scenario is unfolding. Indicators/Signposts of change are a preestablished set of observable phenomena periodically reviewed to help track events, spot emerging trends, and warn of unanticipated change. These observable phenomena are events expected to occur if a postulated situation is developing. For example, some of the observable events of a potential protest include—
- The massive gathering of people at a specific location.
- People’s rallying cries posted as messages on social media.
- An adjacent country’s aggressive national training and mobilization drills outside of normal patterns.
5-37. Analysts and other staff members create a list of these observable events; the detection and confirmation of these indicators enable analysts to answer specific information requirements that answer PIRs. Collection managers often use these lists to help create an intelligence collection plan.
5-38. This technique aids other structured analytic techniques that require hypotheses generation as analysts create indicators that can confirm or deny these hypotheses. Analysts may use indicators/signposts of change to support analysis during all operations of the Army’s strategic roles and to assist them in identifying a change in the operations.
5-39. Method. Whether used alone or in combination with other structured analysis, the process is the same. When developing indicators, analysts start from the event, work backwards, and include as many indicators as possible. The following outlines the steps to the indicators/signposts of change technique:
- Step 1: Identify a set of competing hypotheses or scenarios.
- Step 2: Create separate lists of potential activities, statements, or events expected for each hypothesis or scenario.
- Step 3: Regularly review and update the indicator lists to see which are changing.
- Step 4: Identify the most likely or most correct hypothesis or scenario based on the number of changed indicators observed.
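Steps 1 through 4 amount to keeping indicator lists per hypothesis and periodically scoring them against what has been observed. The scenarios and indicators below are hypothetical; the sketch counts observed indicators per scenario to suggest which one is currently unfolding.

```python
# Hypothetical indicator lists per scenario; scenarios and indicators are illustrative only.
indicator_lists = {
    "Protest escalates": {
        "mass gathering at central square",
        "rallying cries posted on social media",
        "opposition leaders call for marches",
    },
    "Protest dissipates": {
        "security forces withdraw",
        "subsidies restored",
        "organizers call off marches",
    },
}

# Step 3: indicators actually observed during the periodic review.
observed = {"mass gathering at central square", "rallying cries posted on social media"}

# Step 4: the scenario with the most observed indicators is currently the most likely.
counts = {scenario: len(indicators & observed) for scenario, indicators in indicator_lists.items()}
print(max(counts, key=counts.get), counts)
```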
Chapter 6
Advanced Structured Analytic Techniques
SECTION I – CONTRARIAN STRUCTURED ANALYTIC TECHNIQUES
6-1. Contrarian structured analytic techniques challenge ongoing assumptions and broaden possible outcomes. They assist analysts in understanding threat intentions, especially when not clearly stated or known. Contrarian techniques explore the problem from different (often multiple) perspectives. This allows analysts to better accept analytic critique and provides greater avenues to explore and challenge analytical arguments and mindsets. Proper technique application assists analysts in ensuring preconceptions and assumptions are thoroughly examined and tested for relevance, implication, and consequence.
6-2. The contrarian structured analytic techniques described in this publication are—
- Analysis of competing hypotheses (ACH) technique: Evaluating multiple hypotheses through a competitive process in order to reach unbiased conclusions and attempting to corroborate results.
- Devil’s advocacy technique: Challenging a single, strongly held view or consensus by building the best possible case for an alternative explanation.
- Team A/Team B technique: Using separate analytic teams that contrast two (or more) strongly held views or competing hypotheses.
- High-impact/Low-probability analysis technique: Highlighting an unlikely event that would have major consequences if it happened.
- What if? analysis technique: Assuming an event has occurred with potential (negative or positive) impacts and explaining how it might happen.
ANALYSIS OF COMPETING HYPOTHESES
6-3. Analysts use ACH to evaluate multiple competing hypotheses in order to foster unbiased conclusions. Analysts identify alternative explanations (hypotheses) and evaluate all evidence that will disconfirm rather than confirm hypotheses. While a single analyst can use ACH, it is most effective with a small team of analysts who can challenge each other’s evaluation of the evidence.
6-4. ACH requires analysts to explicitly identify all reasonable alternatives and evaluate them against each other rather than evaluate their plausibility one at a time. ACH involves seeking evidence to refute hypotheses. The most probable hypothesis is usually the one with the least evidence against it, not the one with the most evidence for it. Conventional analysis generally entails looking for evidence to confirm a favored hypothesis.
6-5. Method. Simultaneous evaluation of multiple competing hypotheses is difficult to accomplish without using tools. Retaining these hypotheses in working memory and then assessing how each piece of evidence interacts with each hypothesis is beyond the mental capabilities of most individuals. To manage the volume of information, analysts use a matrix as a tool to complete ACH. (See figure 6-1.) The following outlines the steps used to complete ACH:
Step 1: Identify the intelligence problem.
Step 2: Identify all possible hypotheses related to the intelligence problem.
Step 3: Gather and make a list of all information related to the intelligence problem.
Step 4: Prepare a matrix with each hypothesis across the top and each piece of information down the left side.
Step 5: Determine if each piece of information is consistent or inconsistent with each hypothesis.
Step 6: Refine the matrix. Reconsider the hypotheses and remove information that has no diagnostic value.
Step 7: Draw tentative conclusions about the relative likelihood of each hypothesis.
Step 8: Analyze if conclusions rely primarily on a few critical pieces of information.
Step 9: Report conclusions.
Step 10: Identify milestones for future observation that may indicate events are taking a different course than expected.
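The ACH matrix and its scoring can be sketched directly. The hypotheses, evidence, and consistency marks below are hypothetical; the sketch records each piece of information as consistent (C), inconsistent (I), or not applicable (N) against each hypothesis and then counts inconsistencies, since the most probable hypothesis is usually the one with the least evidence against it (step 7).

```python
# Hypothetical ACH matrix; hypotheses, evidence, and marks are illustrative only.
hypotheses = ["H1: Threat defends", "H2: Threat withdraws", "H3: Threat attacks"]
evidence = {
    "Obstacle emplacement observed": {
        "H1: Threat defends": "C", "H2: Threat withdraws": "I", "H3: Threat attacks": "I"},
    "Logistics convoys moving rearward": {
        "H1: Threat defends": "N", "H2: Threat withdraws": "C", "H3: Threat attacks": "I"},
    "Artillery registering new targets": {
        "H1: Threat defends": "C", "H2: Threat withdraws": "I", "H3: Threat attacks": "C"},
}

# Step 7: count the evidence against each hypothesis; fewer inconsistencies
# suggests the more probable hypothesis.
inconsistencies = {
    h: sum(1 for marks in evidence.values() if marks[h] == "I") for h in hypotheses
}

for h, count in sorted(inconsistencies.items(), key=lambda kv: kv[1]):
    print(f"{h}: {count} inconsistent item(s)")
```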
DEVIL’S ADVOCACY
6-6. Analysts use the devil’s advocacy technique to review proposed analytical conclusions. The analysts performing devil’s advocacy are usually not involved in the deliberations that led to the proposed analytical conclusion. Devil’s advocacy is most effective when used to challenge an analytic consensus or a key assumption about a critically important intelligence question. In some cases, analysts can review a key assumption and present a product that depicts the arguments and data that support a contrary assessment. Devil’s advocacy can provide further confidence that the current analytic line of thought will endure close scrutiny.
Devil’s advocacy can lead analysts to draw one of three conclusions:
- Analysts ignored data or key lines of argument that undermine their analysis and should restart the analysis process.
- The analysis is sound, but more research is warranted in select areas.
- Key judgments are valid, but a higher level of confidence in the bottom-line judgments is warranted.
6-7. Method. The following outlines the steps for the devil’s advocacy technique:
- Step 1: Present the main analytical conclusion.
- Step 2: Outline the main points and key assumptions and characterize the evidence supporting the current analytical view.
- Step 3: Select one or more assumptions that appear the most susceptible to challenge.
- Step 4: Review the data used to determine questionable validity, possible deception, and the existence of gaps.
- Step 5: Highlight evidence that supports an alternative hypothesis or contradicts current thinking.
- Step 6: Present findings that demonstrate flawed assumptions, poor evidence, or possible deception.
6-8. Analysts should consider the following when conducting the devil’s advocacy technique:
- Sources of uncertainty.
- Diagnosticity of evidence.
- Anomalous evidence.
- Changes in the broad environment.
- Alternative decision models.
- Availability of cultural expertise.
- Indicators of possible deception.
- Information gaps.
TEAM A/TEAM B
6-9. Team A/Team B is a process for comparing, contrasting, and clarifying two (or more) equally valid analytical assessments. Multiple teams of analysts perform this process, each working along different lines of analysis. Team A/Team B involves separate analytic teams that analyze two (or more) views or competing hypotheses. Team A/Team B is different from devil’s advocacy, which challenges a single dominant mindset instead of comparing two (or more) strongly held views. Team A/Team B recognizes that there may be competing, and possibly equally strong, mindsets on an issue that needs to be clarified. A key requirement to ensure technique success is equally experienced competing mindsets. This mitigates unbalanced arguments.
6-10. Method. The following outlines the steps of the team A/team B technique (see figure 6-2):
Step 1: Identify the two (or more) competing hypotheses.
Step 2: Form teams and designate individuals to develop the best case for each hypothesis.
Step 3: Review information that supports each respective position.
Step 4: Identify missing information that would support or bolster the hypotheses.
Step 5: Prepare a structured argument with an explicit discussion of—
- Key assumptions.
- Key evidence.
- The logic behind the argument.
Step 6: Set aside the time for a formal debate or an informal brainstorming session.
Step 7: Have an independent jury of peers listen to the oral presentation and be prepared to question the teams about their assumptions, evidence, and/or logic.
Step 8: Allow each team to present its case, challenge the other team’s argument, and rebut the opponent’s critique of its case.
Step 9: The jury considers the strength of each presentation and recommends possible next steps for further research and collection efforts.
HIGH-IMPACT/LOW-PROBABILITY ANALYSIS
6-11. The high-impact/low-probability analysis technique sensitizes analysts to the potential impact that seemingly low-probability events could have on U.S. forces. New and often fragmentary data suggesting that a previously unanticipated event might occur is a trigger for applying this technique.
6-12. Mapping out the course of an unlikely, yet plausible event may uncover hidden relationships between key factors and assumptions; it may also alert analysts to oversights in the analytic line of thought. This technique can augment hypotheses-generating analytic techniques.
6-13. The objective of high-impact/low-probability analysis is exploring whether an increasingly credible case can be made for an unlikely event occurring that could pose a major danger or open a window of opportunity.
6-14. Method. An effective high-impact/low-probability analysis involves the following steps:
- Step 1: Define the high-impact outcome clearly. This will justify examining what may be deemed a very unlikely development.
- Step 2: Devise one or more plausible pathways to the low-probability outcome. Be precise, as it may aid in developing indicators for later monitoring.
- Step 3: Insert possible triggers or changes in momentum if appropriate (such as natural disasters, sudden key leader health problems, or economic or political turmoil).
- Step 4: Brainstorm plausible but unpredictable triggers of sudden change.
- Step 5: Identify a set of indicators for each pathway that help anticipate how events are likely to develop and periodically review those indicators.
- Step 6: Identify factors that could deflect a bad outcome or encourage a positive one.
“WHAT IF?” ANALYSIS
6-15. “What if?” analysis is a technique for challenging a strong mindset that an event will not occur or that a confidently made forecast may not be entirely justified. “What if?” analysis is similar to high-impact/low-probability analysis; however, it does not focus on the consequences of an unlikely event. “What if?” analysis attempts to explain how the unlikely event might transpire. It also creates an awareness that prepares analysts to recognize early signs of a significant change.
6-16. “What if?” analysis can also shift focus from asking whether an event will occur to working from the premise that it has occurred. This allows analysts to determine how the event might have happened. This technique can augment hypotheses-generating analytic techniques using multiple scenario generation or ACH. “What if?” analysis shifts the question from “How likely is the event?” to the following:
- How could the event possibly occur?
- What would be the impact of the event?
- Has the possibility of the event happening increased?
6-17. Like other contrarian techniques, “what if?” analysis must begin by stating the conventional analytic line of thought and then stepping back to consider alternative outcomes that are too important to dismiss no matter how unlikely.
6-18. Method. The “what if?” analysis steps are similar to the high-impact/low-probability analysis steps once analysts have established the event itself:
- Step 1: Assume the event has happened.
- Step 2: Select some triggering events that permitted the scenario to unfold to help make the “what if?” more plausible (for example, the death of a key leader, a natural disaster, or an economic or political event that might start a chain of other events).
- Step 3: Develop a chain of reasoning based on logic and evidence to explain how this outcome could have occurred.
- Step 4: Think backwards from the event in concrete ways, specifying what must occur at each stage of the scenario.
- Step 5: Identify one or more plausible pathways to the event; it is likely that more than one will appear possible.
- Step 6: Generate an indicators/signposts of change list to detect the beginnings of the event.
- Step 7: Consider the scope of positive and negative consequences and their relative impact.
- Step 8: Periodically monitor the indicators developed.
SECTION II – IMAGINATIVE STRUCTURED ANALYTIC TECHNIQUES
6-19. Imaginative structured analytic techniques assist analysts in approaching an analytic problem from different and multiple perspectives. These techniques also broaden analysts’ selection of potential COAs, thus reducing the chance of missing unforeseen outcomes. Imaginative techniques facilitate analysts’ ability to forecast events and generate ideas creatively. Additionally, the proper application of imaginative techniques can assist in identifying differences in perspectives and different assumptions among analytic team members. The most commonly used imaginative techniques are—
- Brainstorming technique: Generating new ideas and concepts through unconstrained groupings.
- Functional analysis technique:
- Identifying threat vulnerabilities through knowledge of threat capabilities.
- Identifying windows of opportunity and threat vulnerabilities.
- Outside-in thinking technique: Identifying the full range of basic factors and trends that indirectly shape an issue.
- Red hat/team analysis technique: Modeling the behavior of an individual or group by trying to replicate how a threat would think about an issue.
BRAINSTORMING
6-20. Brainstorming is a widely used technique for stimulating new thinking; it can be applied to most other structured analytic techniques as an aid to thinking. Brainstorming is most effective when analysts have a degree of subject matter expertise on the topic of focus.
6-21. Brainstorming should be a very structured process to be most productive. An unconstrained, informal discussion might produce some interesting ideas, but usually a more systematic process is the most effective way to break down mindsets and produce new insights. The process involves a divergent thinking phase to generate and collect new ideas and insights, followed by a convergent thinking phase for grouping and organizing ideas around key concepts.
6-22. Method. As a two-phase process, brainstorming elicits the most information from brainstorming participants:
Phase 1—Divergent thinking phase:
- Step 1: Distribute a piece of stationery with adhesive and pens/markers to all participants. Typically, a group of 10 to 12 people works best.
- Step 2: Pose the problem in terms of a focal question. Display it in one sentence on a large easel or whiteboard.
- Step 3: Ask the group to write down responses to the question, using key words that will fit on the small piece of stationery.
- Step 4: Stick all of the notes on a wall for all to see—treat all ideas the same.
- Step 5: When a pause follows the initial flow of ideas, the group is reaching the end of its collective conventional thinking, and new divergent ideas are then likely to emerge. End phase 1 of the brainstorming after two or three pauses.
Phase 2—Convergent thinking phase:
Step 6: Ask group participants to rearrange the notes on the wall according to their commonalities or similar concepts. Discourage talking. Some notes may be moved several times as they begin to cluster. Copying some notes is permitted to allow ideas to be included in more than one group.
Step 7: Select a word or phrase that characterizes each grouping or cluster once all of the notes have been arranged.
Step 8: Identify any notes that do not easily fit with others and consider them as either isolated thoughts or the beginning of an idea that deserves further attention.
Step 9: Assess what the group has accomplished in terms of new ideas or concepts identified or new areas that require more work or further brainstorming.
Step 10: Instruct each participant to select one or two areas that deserve the most attention. Tabulate the votes.
Step 11: Set the brainstorming group’s priorities based on the voting and decide on the next steps for analysis.
FUNCTIONAL ANALYSIS USING CRITICAL FACTORS ANALYSIS
6-23. Critical factors analysis (CFA) is an overarching analytic framework that assists analysts in identifying threat critical capabilities, threat critical requirements, and threat critical vulnerabilities that they can integrate into other structured analytic techniques. This assists friendly forces in effectively identifying windows of opportunity and threat vulnerabilities. At echelons above corps, CFA assists in identifying threat centers of gravity that friendly forces can use for operational planning:
- Critical capability is a means that is considered a crucial enabler for a center of gravity to function as such and is essential to the accomplishment of the specified or assumed objective(s) (JP 5-0).
- Critical requirement is an essential condition, resource, or means for a critical capability to be fully operational (JP 5-0).
- Critical vulnerability is an aspect of a critical requirement which is deficient or vulnerable to direct or indirect attack that will create decisive or significant effects (JP 5-0).
6-24. To conduct CFA successfully, identify threat critical capabilities. The more specific the identified threat critical capability, the more specificity analysts can apply to the associated threat critical requirements and vulnerabilities. CFA is more effective when conducted by a team of experienced analysts. Additionally, structured brainstorming can amplify this technique. Analysts can determine windows of opportunity by identifying the common denominator or entity that encompasses those identified threat critical capabilities, requirements, and vulnerabilities.
6-25. Method. The following outlines those steps necessary to conduct CFA (see figure 6-3 on page 6-10):
- Step 1: Create a quad-chart. Identify a specific threat mission objective.
- Step 2: Identify all threat critical capabilities that are essential to achieve the threat mission objective and input in the top-right quadrant of the chart. (Threat must be able to achieve X.)
- Step 3: Identify all threat critical requirements—conditions or resources integral to the critical capabilities identified in step 2—and input in the bottom-right quadrant of the chart. (To achieve X, the threat needs Y.)
- Step 4: Identify all threat critical vulnerabilities—elements related to the threat critical requirements identified in step 3 that appear exposed or susceptible (at risk)—and input in the bottom-left quadrant of the chart. (The threat cannot lose Z.)
- Step 5: Analyze the chart to determine the windows of opportunity by identifying the common denominator (or entity) that encompasses those identified threat critical capabilities, requirements, and vulnerabilities and input in the top-left quadrant of the chart.
- Step 6: Identify all listed critical factors that friendly forces can directly affect to identify potential targets or topics for further collection.
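The quad-chart built in steps 1 through 6 can be captured as a simple structure. The threat objective, critical factors, and the "friendly can affect" flags below are hypothetical assumptions used only for illustration; the sketch filters the listed requirements and vulnerabilities down to those friendly forces can directly affect, as in step 6.

```python
# Hypothetical CFA quad-chart; the objective and factors below are illustrative only.
cfa = {
    "threat_objective": "Seize the river crossing",
    "critical_capabilities": ["Massed artillery fires", "Wet-gap crossing"],
    "critical_requirements": ["Ammunition resupply", "Bridging assets", "Fire direction network"],
    "critical_vulnerabilities": ["Single ammunition supply route", "Limited bridging assets"],
    "windows_of_opportunity": ["Crossing site congestion during bridging"],
}

# Step 6: flag the listed critical factors friendly forces can directly affect
# (the flags are assumptions for illustration) to nominate targets or topics
# for further collection.
friendly_can_affect = {"Single ammunition supply route", "Bridging assets", "Fire direction network"}
candidates = [
    factor
    for key in ("critical_requirements", "critical_vulnerabilities")
    for factor in cfa[key]
    if factor in friendly_can_affect
]
print(candidates)
```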
OUTSIDE-IN THINKING
6-26. The outside-in thinking technique assists analysts in identifying the broad range of factors, forces, and trends that may indirectly shape an issue—such as global, political, environmental, technological, economic, or social forces—outside their area of expertise, but that may profoundly affect the issue of concern. This technique is useful for encouraging analysts to think critically because they tend to think from inside out, focusing on factors most familiar in their specific area of responsibility.
6-27. Outside-in thinking reduces the risk of missing important variables early in the analysis process; it should be the standard process for any project that analyzes potential future outcomes. This technique works well for a group of analysts responsible for a range of functional and/or regional issues.
6-28. Method. The following outlines those steps of outside-in thinking (see figure 6-4):
- Step 1: Identify the topic of study.
- Step 2: Brainstorm all key factors (operational variables [PMESII-PT]) that could impact the topic.
- Step 3: Employ the mission variables (METT-TC) to trigger new ideas.
- Step 4: Focus on those key factors over which a commander can exert some influence.
- Step 5: Assess how each of those factors could affect the analytic problem.
- Step 6: Determine whether those factors can impact the issue based on the available evidence.
RED HAT/TEAM ANALYSIS
6-29. The red hat/team analysis technique facilitates analysts’ modeling of threat behavior by attempting to formulate ideas on how the threat would think about an issue. Red hat/team analysis is also a type of reframing technique performed by analysts attempting to solve an intelligence problem by using a different perspective. They attempt to perceive threats and opportunities as would the threat in order to categorize the threat. Categories include but are not limited to—
- Command and control.
- Movement and maneuver.
6-31. Method. The following outlines the steps to conduct red hat/team analysis:
- Step 1: Identify the situation and ask how the threat would respond to the situation.
- Step 2: Emphasize the need to avoid mirror imaging. Define the cultural and personal norms that would influence the threat’s behavior (use operational variables [PMESII-PT]/civil considerations [ASCOPE] and threat characteristics, threat doctrine, and threat intentions matrices as aids).
- Step 3: Develop first-person questions that the threat would ask about the situation.
- Step 4: Present results and describe alternative COAs the threat would pursue.
Note. Some publications differentiate between red hat analysis and red team analysis, while others describe them as being the same. When differentiated, red team analysis is categorized as a contrarian technique. For this publication, the two techniques are synonymous.
PART THREE
Intelligence Analysis Considerations
Chapter 7
Analytic Support to Army Forces and Operations
OVERVIEW
7-1. Although the intelligence analysis process does not change, the tasks performed by intelligence analysts differ significantly based on the echelon, the supported functional element, the Army strategic role, and the specific mission. As with many tasks, the most significant factor affecting analysis is time. Time includes both the amount of time to analyze a problem and the timeliness of the final analytical assessment to the decision maker.
ANALYSIS ACROSS THE ECHELONS
7-3. Intelligence analysts conduct analysis during combat operations to support Army forces at all echelons. The commander’s need for the continuous assessment of enemy forces focuses intelligence analysis. The analytical output of the intelligence warfighting function assists commanders in making sound and timely decisions. Analysts must understand at which points in an operation the commander needs specific PIRs answered in order to support upcoming decision points. This understanding assists analysts in creating a timeline for conducting analysis and identifying when information is no longer of value to the commander’s decision making.
7-4. Analytical elements at NGIC and at echelons above corps focus primarily on strategic- to operational-level analytic problems, analytical elements at the corps level focus on both tactical- and operational-level analytic problems, and analytical elements at echelons below corps focus on tactical-level analytic problems. The strategic, operational, and tactical levels of warfare assist commanders—informed by the conditions of their OEs—in visualizing a logical arrangement of forces, allocating resources, and assigning tasks based on a strategic purpose:
- Strategic level of warfare is the level of warfare at which a nation, often as a member of a group of nations, determines national or multinational (alliance or coalition) strategic security objectives and guidance, then develops and uses national resources to achieve those objectives (JP 3-0). At the strategic level, leaders develop an idea or set of ideas for employing the instruments of national power (diplomatic, informational, military, and economic) in a synchronized and integrated fashion to achieve national objectives.
- Operational level of warfare is the level of warfare at which campaigns and major operations are planned, conducted, and sustained to achieve strategic objectives within theaters or other operational areas (JP 3-0). The operational level links the tactical employment of forces to national and military strategic objectives, focusing on the design, planning, and execution of operations using operational art. (See ADP 3-0 for a discussion of operational art.)
- Tactical level of warfare is the level of warfare at which battles and engagements are planned and executed to achieve military objectives assigned to tactical units or task forces (JP 3-0). The tactical level of warfare involves the employment and ordered arrangement of forces in relation to each other.
NATIONAL AND JOINT ANALYTIC SUPPORT
7-5. Intelligence analysis support to national organizations and the joint force focuses on threats, events, and other worldwide intelligence requirements.
THEATER ARMY
7-6. At the theater army level, intelligence analysis supports the combatant commander’s operational mission requirements by enabling the theater command to apply capabilities to shape and prevent potential threat action. Theater army-level analytical activities include but are not limited to—
- Supporting theater campaign plans.
- Developing expertise to analyze threat characteristics within a region.
- Long-term analysis of a region and/or country that enables warning intelligence of imminent threat ground operations.
- Detailed analysis of multi-domain specific requirements.
- Serving as the Army’s interface to national and joint support for operational and tactical forces.
7-7. Analysts assigned to the theater army-level all-source intelligence cell can expect to work with other Services as well as other nations. Analytical assessment support to future operations focuses on threat activities, intent, and capabilities beyond 168 hours within a designated global region assigned to the combatant commander.
SUPPORT TO FUNCTIONAL ELEMENTS
7-12. The Army is committed to providing intelligence support across its many unique functional elements. Although all of these elements perform IPB and collection management, the intelligence analysis requirements for these elements vary significantly based on the commander’s designated mission; therefore, when assigned, intelligence analysts must learn the mission-specific intelligence analysis requirements for their functional element.
ANALYSIS ACROSS THE ARMY’S STRATEGIC ROLES
7-13. Intelligence analysts must consider all intelligence requirements for operations to shape, prevent, prevail in large-scale ground combat, and consolidate gains.
- SHAPE OPERATIONAL ENVIRONMENTS
- PREVENT CONFLICT
- PREVAIL IN LARGE-SCALE GROUND COMBAT
- OFFENSIVE AND DEFENSIVE OPERATIONS IN LARGE-SCALE GROUND COMBAT
- CONSOLIDATE GAINS
- Consolidate Gains Through Stability Operations
7-23. A stability operation is an operation conducted outside the United States in coordination with other instruments of national power to establish or maintain a secure environment and provide essential governmental services, emergency infrastructure reconstruction, and humanitarian relief (ADP 3-0). In stability operations, success is measured differently from offensive and defensive operations. Time may be the ultimate arbiter of success: time to bring safety and security to an embattled populace; time to provide for the essential, immediate humanitarian needs of the people; time to restore basic public order and a semblance of normalcy to life; and time to rebuild the institutions of government and market economy that provide the foundations for enduring peace and stability. (See ADP 3-07 for information on stability operations.)
7-24. The main difference between stability operations and other decisive action is the focus and degree level of analysis required for the civil aspects of the environment. Unlike major combat—an environment dominated by offensive and defensive operations directed against an enemy force—stability operations encompass various military missions, tasks, and activities that are not enemy-centric.
7-25. Constant awareness and shared understanding of civil considerations (ASCOPE) about the environment are crucial to long-term operational success in stability operations. Analysts should classify civil considerations into logical groups (tribal, political, religious, ethnic, and governmental). Intelligence analysis during operations that focus on the civil population requires a different mindset and different techniques than an effort that focuses on defeating an adversary militarily.
7-26. Some situations (particularly crisis-response operations) may require analysts to focus primarily on the effects of terrain and weather, as in the case of natural disasters, including potential human-caused catastrophes resulting from natural disasters.
Chapter 8
Analysis and Large-Scale Ground Combat Operations
OVERVIEW
8-1. In future operations, intelligence analysis considerations should include a combination of factors (or elements) that analysts must understand to support the commander. Multi-domain operation considerations differ by echelon. These considerations have their greatest impact on Army operations during large-scale ground combat.
8-2. Situation development enables commanders to see and understand the battlefield in enough time and detail to make sound tactical decisions. Situation development assists in locating and identifying threat forces; determining threat forces’ strength, capabilities, and significant activities; and predicting threat COAs. Situation development assists commanders in effectively employing available combat resources where and when decisive battles will be fought, preventing commanders from being surprised.
8-3. Commanders and staffs require timely, accurate, relevant, and predictive intelligence to successfully execute offensive and defensive operations in large-scale ground combat operations. The challenges of fighting for intelligence during large-scale ground operations emphasize a close interaction between the commander and staff, since the entire staff supports unit planning and preparation to achieve situational understanding against a peer threat.
Since each echelon has a different situational understanding of the overarching intelligence picture, the analytical focus differs from one echelon to another.
9-3. Long-term analytical assessments are produced using a deliberate and specific execution of the intelligence analysis process over a longer period of time that closely complies with the Intelligence Community Analytic Standards (to include the analytic tradecraft standards) established in ICD 203. This form of analysis includes the careful management of the overall effort, dedicating significant resources to the effort (for example, analysis is conducted by an analytic team), executing various iterations of analysis, and applying advanced structured analytic techniques within the effort.
THE BASICS OF ANALYTIC DESIGN
9-4. Managing long-term analytical assessments is accomplished by performing seven analytic design steps, as shown in figure 9-1 on page 9-2:
- Step 1: Frame the question/issue.
- Step 2: Review and assess knowledge.
- Step 3: Review resources.
- Step 4: Select the analytic approach/methodology and plan project.
- Step 5: Develop knowledge.
- Step 6: Perform analysis.
- Step 7: Evaluate analysis.
FRAME THE QUESTION/ISSUE
9-5. Properly framing the question greatly increases the chance of successful long-term analysis. The analytic team starts with understanding the requestor’s requirement by identifying relevant topics and issues that break down into a primary question that can be analyzed. Framing the question includes refining and scoping the question to carefully capture the requestor’s expectations, mitigate bias, craft an objective analytic question, and develop subquestions. This step results in an initial draft of the primary intelligence question and is followed by reviewing and assessing existing knowledge.
REVIEW AND ASSESS KNOWLEDGE
9-6. Reviewing and assessing knowledge involves an overlap of the analytical effort with collection management. Step 2 includes reviewing available information and intelligence, the collection management plan, and results of ongoing intelligence collection, as well as identifying information gaps.
REVIEW RESOURCES
9-7. After understanding what knowledge is available and identifying information gaps, the next step is reviewing available resources, such as tools, personnel, and time.
SELECT THE ANALYTIC APPROACH/METHODOLOGY AND PLAN PROJECT
9-8. Using the results of steps 1 through 3, the analytic team finalizes the primary intelligence question and subquestions, selects the analytic approach/methodology, and develops a project plan. The analytic approach/methodology includes the specific analytic techniques, who will perform each technique, and the sequence of those techniques to ensure analytic insight and mitigate bias.
DEVELOP KNOWLEDGE
9-9. Developing knowledge is the last step before performing analysis. Although discussed as a separate step in the process, developing knowledge occurs continually throughout the process. The analytic team gathers all relevant intelligence and information through ongoing collection, intelligence reach, and internal research.
PERFORM ANALYSIS
9-10. Steps 1 through 5 set the stage for the deliberate execution of analytic techniques, to include adjusting the project plan, if necessary, and assessing the analytical results using the context that was developed while framing the question/issue.
EVALUATE ANALYSIS
9-11. Evaluating analysis, the final step of the process, results in the final analytical results and associated information necessary to make a presentation to the requestor. Evaluating analysis includes assessing the analytical results and the impact of analytic gaps and unconfirmed assumptions, performing analysis of alternatives, and assigning a confidence level to the analytic answer.
COLLABORATION DURING ANALYTIC DESIGN
9-12. Collaboration is critical to long-term analytical assessments and occurs between different stakeholders across the intelligence community. This collaboration ensures a diversity of perspective and depth in expertise that is impossible through any other means. Four specific areas in which collaboration is invaluable are—
- Bias mitigation: Analytic teams with diverse backgrounds and different perspectives can effectively identify and check assumptions, interpret new information, and determine the quality of various types of information.
- Framing/Knowledge review: Analytic teams can engage early in the process to build context, craft analytic questions, share information sources, and develop analytical issues.
- Methodology building: Analytic teams assess the credibility of the analytic approach and clarity of the argument through various means, including peer reviews.
- Perform analysis: Analytic teams can perform various analytic techniques, identify hypotheses, and analyze alternatives as a group to improve the quality of the analytical effort.
TRANSITIONING FROM THE ANALYTIC DESIGN PROCESS TO PRESENTING THE RESULTS
9-13. Managing long-term analytical assessments includes presenting not only an analytic answer but also a confidence level in that answer and alternative hypotheses or explanations for gaps and uncertainty. During evaluate analysis, the last step of the process, the analytic team decides whether the question requires more analysis and, therefore, whether the assessment is exploratory or authoritative and ready to present to the requestor. If the results are ready for presentation, the analytic team deliberately prepares to present them. Transitioning from long-term analysis to presenting the analytic answer includes stepping back from that analysis, reviewing the assessment, and clarifying the relevance of the analytical results. Then the analytic team determines—
- What is the message: The message characterizes whether the assessment is authoritative or exploratory and includes the “bottom line” of the assessment. Additionally, the assessment includes any shifts in analysis that occurred over time, any impacts on the requestor (decisions and future focus areas), the confidence level, alternative hypotheses, and indicators.
- What is the analytical argument: The analytic team develops an outline for logically progressing through the analytical assessment. An argument map is a useful tool to ensure a logical analytical flow during the presentation and to ensure the message is easily understood. The team may use basic interrogatives (who, what, when, where, why, and how) or a similar tool to capture the critical elements of the message to present to the requestor.
- What are critical gaps and assumptions: Gaps and assumptions identified during the evaluate analysis step become limitations to the certainty of the analytical assessment, and, in some cases, drive future analytical efforts. The analytic team may insert gaps and assumptions within the message and clearly discuss the level of impact on the assessment (for example, in the source summary statement or in the “bottom line” statement).
- What reasonable analytical alternatives remain: For authoritative assessments, answering the questions “what if I am wrong” and “what could change my assessment” provides analysis of alternatives that should be included in the assessment to explain what remains uncertain.
- What product or products should be presented: Determine the best format for the presentation that facilitates the discussion of the argument. If it is exploratory analysis, the format should allow the analytic team to effectively describe the new understanding of the topic and its relevance to the requestor. The team should consider the following when choosing the format: requestor preference, specific tasking/requirement, complexity of the argument, urgency/time constraints, and potential interest of others.
CROSSWALKING ANALYTIC DESIGN WITH TACTICAL INTELLIGENCE ANALYSIS
9-14. Tactical intelligence analysis and analytic design have similarities but also differ in a number of ways. Tactical operations are often chaotic and time-constrained, and therefore, driven by specific commander-centric requirements (for example, PIRs and targeting requirements). The commander and staff plan and control operations by employing several standard Army planning methodologies, including but not limited to the Army design methodology, the MDMP, and Army problem solving.
The analytic design to tactical intelligence analysis crosswalk (table not reproduced here) aligns the seven analytic design steps (frame the question/issue; review and assess knowledge; review resources; select the analytic approach/methodology and plan project; develop knowledge; perform analysis; evaluate analysis) with the corresponding activities of tactical intelligence analysis.
Appendix A
Automation Support to Intelligence Analysis
AUTOMATION ENABLERS
A-1. Many different automation and communications systems are vital to intelligence analysis; they facilitate real-time collaboration, detailed operational planning, and support to collection management. Software updates and emerging technologies continue to improve current intelligence analysis systems to operate more effectively in garrison and in deployed environments.
A-2. Automation processing capabilities and tools readily available on today’s computers enable the intelligence analysis process. The software or related programs in current automation systems allow intelligence analysts to screen and analyze significantly more data than in previous years. The development of analytical queries, data management tools, and production and dissemination software enhances the intelligence analysis process, facilitating the commander’s situational understanding and timely decision making across all echelons.
A-3. Automation is crucial to intelligence analysis; there are four aspects for analysts to consider:
- Automation is a key enabler to the processing and fusion of compatible information and intelligence, but the individual analyst remains essential in the validation of any assessment.
- The analyst must still be heavily involved in building specific queries, analyzing the final assessment, and releasing intelligence.
- Automation relies on the cyberspace domain, which requires extensive defensive actions to ensure data is not corrupted from collection to dissemination. Deception and corruption within the cyberspace domain are likely occurrences; therefore, they require monitoring by both cyberspace experts and intelligence analysts.
- Automation relies on available communications to receive, assess, and disseminate information across the command at all echelons. During periods of disrupted or degraded communications, the intelligence analyst must understand and may have to execute intelligence analysis without the aid of automation.
DISTRIBUTED COMMON GROUND SYSTEM-ARMY
A-4. All communications, collaboration, and intelligence analysis within the intelligence warfighting function are facilitated by the DCGS-A—the intelligence element of Army command and control systems and an Army program of record.
The following highlights some of the most significant tools across the phases of the intelligence analysis process:
Screen:
- Axis Pro/Link Diagram is a software product used for data analysis and investigations that assists in mapping and understanding threat networks comprising threat equipment, units, facilities, personnel, activities, and events.
- Threat Characteristics Workstation provides tools to develop and manage threat characteristics, track battle damage assessments (BDAs), and create doctrinal and dynamic situation templates. The workstation also allows analysts to create graphic and written comparisons of threat capabilities and vulnerabilities, which are included in the intelligence estimate.
- MovINT Client provides an integrated, temporal view of the battlefield, and aggregates air- and ground-force locations, moving target intelligence, aircraft videos, sensor points of interest, and target locations.
Analyze:
- SOCET GXP (also known as Softcopy Exploitation Toolkit Geospatial Exploitation Product), an advanced geospatial intelligence software solution, uses imagery from satellite and aerial sources to identify, analyze, and extract ground features, allowing for rapid product creation.
- Terra Builder/Explorer provides professional-grade tools for manipulating and merging imagery and elevation data of different sizes and resolutions into a geographically accurate terrain database. It also allows analysts to view, query, analyze, edit, present, and publish geospatial data.
- Text Extraction allows analysts to quickly extract information from reports, associate elements with relationships, and identify existing matches in the database.
- ArcGIS (also known as Arc Geographic Information System) allows analysts to visualize, edit, and analyze geographic data in both two- and three-dimensional images and has several options for sharing with others.
Integrate:
- Multifunction workstation interface, a customizable interface that streamlines workflow, supports the commander's operations by providing accurate and timely intelligence and analysis to Army forces.
- ArcGIS. (See description under Analyze.)
- Google Earth, a geo-browser that accesses satellite and aerial imagery, ocean bathymetry, and other geographic data over a network, represents the Earth as a three-dimensional globe.
Produce:
- Office 2013 is a suite of productivity applications that includes Microsoft Word, Excel, PowerPoint, Outlook, OneNote, Publisher, Access, InfoPath, and Lync.
- Multifunction workstation interface. (See description under Integrate.)
- i2 Analyst Notebook is a software product used for data analysis and investigation that assists in visualizing links, timelines, and relationships within threat networks.
A-5. DCGS-A, like any automation system, is subject to software updates, including changes to the current hardware as well as lifecycle replacements. As such, future versions may include greater analytical cross discipline and domain collaboration and improved interoperability with command and control systems and knowledge management components.
Appendix B
Cognitive Considerations for Intelligence Analysts
OVERVIEW
Analytic skills are the ability to collect, visualize, and examine information in detail to make accurate analytical conclusions. Analytic skills enable Army Soldiers to complete simple and complex tasks; they enable intelligence analysts to use deliberate thought processes to examine a situation critically and without bias.
THE INTELLIGENCE ANALYST
Intelligence analysis support to any operation involves separating useful information from misleading information, using experience and reasoning, and reaching an assessment or conclusion based on fact and/or sound judgment. The conclusion is based on the intelligence analyst’s—
- Experience, skill, knowledge, and understanding of the operation.
- Knowledge of the various intelligence disciplines.
- Knowledge of information collection.
- Understanding of the threats within an OE.
- In-depth understanding of the threat’s military and political structure.
The intelligence personnel conducting the analysis of information and intelligence use basic to advanced tradecraft skills, tools, and integrated automated programs to sort raw data and information and apply research skills to formulate an assessment. Analysts are responsible for the timely dissemination and/or presentation (using proper writing and presentation techniques) of known facts and assumptions regarding the OE to the commander and staff. Established tradecraft standards guide individual analysts and analytic teams, ensuring the analysis meets a common ethic and achieves analytical excellence.
Intelligence analysts follow guidelines, such as the ICD 203 Intelligence Community Analytic Standards, that promote a common ethic for achieving analytical rigor and excellence and personal integrity in analytical practices. (See appendix C.) Additionally, they must build their foundational understandings and integrate their learned skills—critical thinking and embracing ambiguity. Intelligence analysts must be willing to change their determinations over time. Training, knowledge, and experience further develop analysts' expertise, as these aspects are essential in helping analysts deal with uncertain and complex environments.
The OE is complex, and the threat attempts to hide its objectives, intent, and capabilities when possible. Therefore, intelligence analysts embrace ambiguity, recognize and mitigate their own or others’ biases, challenge their assumptions, and continually learn during analysis. To assist in mitigating some of the uncertainty associated with conducting intelligence analysis, analysts should increase their proficiency in using analytic techniques and tools, including automated analytic tools and systems, to identify gaps in their understanding of the OE. Furthermore, to be effective, intelligence analysts must have a thorough understanding of their commanders’ requirements and the intelligence analysis process (see chapter 2), which directly contributes to satisfying those requirements.
BASIC THINKING ABILITIES
Army intelligence personnel are required to use basic thinking abilities and complex skills to analyze information. These skills relate to an analyst's ability to think. Intelligence analysis focuses primarily on thinking. Intelligence analysts must continually strive to improve the quality of their thinking to support the commander's requirements. The three basic thinking abilities for intelligence analysis are:
- Information ordering.
- Pattern recognition.
- Reasoning.
INFORMATION ORDERING
Information ordering is the ability to follow previously defined rules or sets of rules to arrange data in a meaningful order. In the context of intelligence analysis, this ability allows analysts, often with technology’s assistance, to arrange information in ways that permit analysis, synthesis, and a higher level of understanding. The arrangement of information according to certain learned rules leads analysts to make conclusions and disseminate the information as intelligence. However, such ordering can be inherently limiting—analysts may not seek alternative explanations because the known rules lead to an easy conclusion.
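As a purely illustrative example (the report fields and rules below are hypothetical, not prescribed), the following Python sketch applies a simple, previously defined rule set, grouping by reporting unit and then ordering chronologically, to arrange raw report records into a more meaningful order:

# Illustrative sketch: ordering raw reports by a defined rule set
# (group by reporting unit, then sort chronologically). Field names are hypothetical.
from datetime import datetime
from itertools import groupby

reports = [
    {"unit": "1-12 CAV", "time": "2024-05-02T06:40", "summary": "Vehicle column observed"},
    {"unit": "2-5 IN",   "time": "2024-05-02T05:15", "summary": "Bridging equipment sighted"},
    {"unit": "1-12 CAV", "time": "2024-05-02T04:55", "summary": "Artillery battery displacing"},
]

# Rule 1: order everything by reporting unit; Rule 2: order each unit's reports by time.
ordered = sorted(reports, key=lambda r: (r["unit"], datetime.fromisoformat(r["time"])))

for unit, group in groupby(ordered, key=lambda r: r["unit"]):
    print(unit)
    for r in group:
        print(" ", r["time"], "-", r["summary"])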
PATTERN RECOGNITION
Humans detect and impose patterns on apparently random entities and events in order to understand them, often without awareness. Intelligence analysts impose or detect patterns to identify relationships among entities and events, and often to infer what those entities will do in the future. Pattern recognition lets analysts separate the important from the less important, even the trivial, and conceptualize a degree of order out of apparent chaos. However, imposing or seeking patterns can introduce bias. Analysts may impose culturally defined patterns on random aggregates rather than recognize inherent patterns, thereby misinterpreting events or situations.
REASONING
Reasoning is what allows humans to process information and formulate explanations in order to assign meaning to observed actions and events. The quality of any type of reasoning is based on how well analysts’ analytic skills have been developed, which occurs through practice and application. Improving analytic skills occurs by implementing individual courses of study and organizational training strategies.
There are four types of reasoning that guide analysts in transforming information into intelligence:
- Deductive reasoning is using given factual information or data to infer other facts through logical thinking. It rearranges only the given information or data into new statements or truths; it does not provide new information. Therefore, deductive reasoning is, “If this is true, then this is also true.”
- Inductive reasoning is looking at given factual information or data for a pattern or trend and inferring the trend will continue. Although there is no certainty the trend will continue, the assumption is it will. Therefore, inductive reasoning is, “Based on this trend, this is probably true.”
- Abductive reasoning is similar to inductive reasoning since conclusions are based on probabilities or “guessing.” Therefore, abductive reasoning is, “Because this is probably true, then this may also be true.”
- Analogical reasoning is a method of processing information that relies on an analogy to compare the similarities between two specific entities; those similarities are then used to draw a conclusion—the more similarities between the entities, the stronger the argument.
Note. Of the four types of reasoning, only deductive reasoning results in a conclusion that is always true. However, during the conduct of intelligence analysis, this statement can be misleading. During operations, there are few situations in which both a rule is always true and there is adequate collection on the threat to apply deductive reasoning with certainty.
Even in the best of circumstances, inductive, abductive, and analogical reasoning cannot produce conclusions that are certain. All types of reasoning rely on accurate information, clear thinking, and freedom from personal bias and groupthink.
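To make these distinctions concrete, the following toy Python sketch (hypothetical scenarios and numbers, for illustration only) contrasts deductive, inductive, and abductive reasoning; a similarity-scoring sketch for analogical reasoning follows the discussion of misusing analogies later in this appendix:

# Toy illustrations of reasoning types (hypothetical scenarios; illustration only).

# Deductive: if the rule and the facts are true, the conclusion must be true.
every_artillery_battalion_has_a_resupply_point = True
unit_is_an_artillery_battalion = True
if every_artillery_battalion_has_a_resupply_point and unit_is_an_artillery_battalion:
    deductive_conclusion = "This unit has a resupply point."

# Inductive: infer that an observed trend will probably continue (no certainty).
patrol_observed_each_of_last_four_nights = [True, True, True, True]
inductive_conclusion = ("A patrol will probably occur tonight."
                        if all(patrol_observed_each_of_last_four_nights)
                        else "No clear trend.")

# Abductive: select the most plausible explanation for an observation.
observation = "increased encrypted radio traffic"
candidate_explanations = {"training exercise": 0.3, "attack preparation": 0.5, "equipment test": 0.2}
abductive_conclusion = max(candidate_explanations, key=candidate_explanations.get)

print(deductive_conclusion)
print(inductive_conclusion)
print(observation, "->", abductive_conclusion)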
CRITICAL AND CREATIVE THINKING
Combining good analytic techniques with an understanding of the requirements, area knowledge, and experience is the best way of providing accurate, meaningful assessments to commanders and leaders. However, subject matter expertise alone does not guarantee the development of logical or accurate conclusions. Intelligence analysts apply critical thinking skills to provide more holistic, logical, ethical, and unbiased analysis and conclusions. Critical thinking ensures analysts fully account for the elements of thought, the intellectual standards of thought, and the traits of a critical thinker.
Critical thinking is a deliberate process of analyzing and evaluating thought with a view to improve it. The elements of thought (the parts of a person’s thinking) and the standards of thought (the quality of a person’s thinking) support critical thinking. Key critical thinking attributes include human traits such as intellectual courage, integrity, and humility. Creative thinking involves creating something new or original.
Analysts use thinking to transform information into intelligence. Critical thinking can improve many tasks and processes across Army operations, especially the conduct of intelligence analysis. Critical thinking includes the intellectually disciplined activity of actively and skillfully analyzing and synthesizing information. The key distinction in critical thinking is a reflective and self-disciplined approach to thinking.
For the analyst, the first step in building critical thinking skills is to begin a course of personal study and practice with the goal of improving the ability to reason. This means moving outside the Army body of doctrine and other Army professional writing when beginning this study, because most of the body of thought concerning critical thinking resides in various civilian professions, particularly academia.
ELEMENTS OF THOUGHT
Whenever people think, they think for a purpose within a point of view based on assumptions leading to implications and consequences. People use concepts, ideas, and theories to interpret data, facts, and experiences in order to answer questions, solve problems, and resolve issues. These eight elements of thought assist in describing how critical thinking works:
Element 1—Purpose. All thinking has a purpose. Critical thinkers state the purpose clearly. Being able to distinguish the purpose from other related purposes is an important skill that critical thinkers possess. Periodically checking that the thinking stays on target with the purpose is also important.
Element 2—Question at issue. All thinking is an attempt to figure something out, to settle some question, or to solve some problem. A critical thinker can state questions clearly and precisely, express the questions in several ways to clarify their meaning and scope, and break the questions into subquestions.
Element 3—Information. All thinking is based on data, information, and evidence. Critical thinkers should support their conclusions with relevant information and be open to actively searching for information that supports and contradicts a position. All information should be accurate, clear, and relevant to the situation being analyzed.
Element 4—Interpretation and inference. All thinking contains interpretations and inferences by which to draw conclusions and give meaning to data. Critical thinkers should be careful to infer only what the evidence implies and to crosscheck inferences with each other. They should clearly identify the assumptions and concepts that led to the inferences, as well as consider alternative inferences or conclusions. Developing and communicating well-reasoned inferences represent the most important parts of what intelligence analysts provide because they aid situational understanding and decision making.
Element 5—Concepts. All thinking is expressed through, and shaped by, concepts. A concept is a generalized idea of a thing or a class of things. People do not always share the same concept of a thing. For example, the concept of happiness means something different to each individual because happiness comes in many different forms. For a star athlete, happiness may be winning; for a mother, happiness may be seeing her children do well. To ensure effective communications, critical thinkers identify the meaning they ascribe to the key concepts used in their arguments and determine if others in their group ascribe different meanings to those concepts.
Element 6—Assumptions. All thinking is based, in part, on assumptions. In this context, an assumption is a proposition accepted to be true without the availability of fact to support it. Assumptions are layered throughout a person’s thinking and are a necessary part of critical thinking. The availability of fact determines the amount of assumption an analyst must use in analysis. Critical thinkers clearly identify their assumptions and work to determine if they are justifiable.
Element 7—Implications and consequences. All thinking leads somewhere or has implications and consequences. Analysts should take the time to think through the implications and consequences that follow from their reasoning. They should search for negative as well as positive implications.
Element 8—Point of view. All thinking is performed from some point of view. To think critically, analysts must recognize a point of view, seek other points of view, and look at them fair-mindedly for their strengths and vulnerabilities.
By applying the eight elements of thought, analysts can develop a checklist for reasoning. Developing and using a checklist, as shown in table B-1, can help analysts focus their efforts to a specific problem and avoid wasting time on irrelevant issues or distractions.
AVOIDING ANALYTICAL PITFALLS
Critical thinking is a mental process that is subject to numerous influences. Intelligence analysts involved in analyzing complex situations and making conclusions are prone to the influences that shape and mold their view of the world and their ability to reason. These influences are referred to as analytical pitfalls. The elements of thought, intellectual standards, and intellectual traits assist analysts in recognizing these pitfalls in their own analysis and in the analysis performed by others. Logic fallacies and biases are two general categories of analytical pitfalls.
Fallacies of Omission
Fallacies of omission occur when an analyst leaves out necessary material in a conclusion or inference. Some fallacies of omission include oversimplification, composition, division, post hoc, false dilemma, hasty generalization, and special pleading:
- Oversimplification is a generality that fails to adequately account for all the complex conditions bearing on a problem. Oversimplification results when one or more of the complex conditions pertaining to a certain situation are omitted; it includes ignoring facts, using generalities, and/or applying an inadequately qualified generalization to a specific case.
- Fallacy of composition is committed when a conclusion is drawn about a whole based on the features of parts of that whole when, in fact, no justification is provided for that conclusion.
- Fallacy of division is committed when a person infers that what is true of a whole must also be true of the parts of that whole.
- False dilemma (also known as black-and-white thinking) is a fallacy in which a person considers only two alternatives when in fact there are more than two.
- Hasty generalizations are conclusions drawn from samples that are too few or from samples that are not truly representative of the population.
Fallacies of Assumption
Fallacies of assumption implicitly or explicitly involve assumptions that may or may not be true. Some fallacies of assumption include begging the question, stating hypotheses contrary to fact, and misusing analogies:
- Begging the question (also known as circular reasoning) is a fallacy in which the conclusion occurs as one of the premises. It is an attempt to support a statement by simply repeating the statement in different and stronger terms. For example, a particular group wants democracy. America is a democratic nation. Therefore, that group will accept American-style democracy. In another example, when asked why the enemy was not pinned down by fire, the platoon leader replied, "Our suppressive fire was inadequate." The fallacy in this response is that, by definition, suppressive fire pins down the enemy or is intended to pin him down. Since the platoon failed to pin down the enemy, the inadequacy of the fire was self-evident.
- Stating hypotheses contrary to fact occurs when someone states decisively what would have happened had circumstances been different. Such fallacies involve assumptions that are either faulty or simply cannot be proven. For example, the statement, “If we had not supported Castro in his revolutionary days, Cuba would be democratic today” is contrary to fact. Besides being a gross oversimplification, the assumption made in the statement cannot be verified.
- Misusing analogies occurs when one generalizes indiscriminately from an analogy to the real world. One method for weakening an analogous argument is citing a counter-analogy. Analogies are strong tools that can impart understanding of a complex issue. In the absence of other evidence, intelligence analysts may reason from analogy. Such reasoning assumes that the characteristics and circumstances of the object or event being examined are similar to those of the object or event in the analogy.
The strength of a conclusion drawn from similar situations is proportional to the degree of similarity between the situations. The danger in reasoning from analogy is assuming that because objects, events, or situations are alike in certain aspects, they are alike in all aspects. Conclusions drawn from analogies are inappropriately used when they are accepted as evidence of proof. Situations may often be similar in certain aspects, but not in others. A counter-analogy weakens the original analogy by citing other comparisons that can be made on the same basis.
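As a purely illustrative sketch (the situation characteristics below are hypothetical), the following Python snippet compares an analogy and a counter-analogy on the same basis: the argument's strength rises with the proportion of shared characteristics, and a counter-analogy that shares as many characteristics weakens the original conclusion:

# Illustrative only: scoring an analogy against a counter-analogy by shared characteristics.
def similarity(a: set, b: set) -> float:
    """Proportion of characteristics shared between two situations (Jaccard index)."""
    return len(a & b) / len(a | b)

current = {"urban terrain", "degraded communications", "hybrid threat", "winter"}
analogy = {"urban terrain", "degraded communications", "hybrid threat", "summer"}
counter_analogy = {"urban terrain", "conventional threat", "robust communications", "winter"}

print("analogy strength:        ", round(similarity(current, analogy), 2))
print("counter-analogy strength:", round(similarity(current, counter_analogy), 2))
# If the counter-analogy scores comparably, conclusions drawn from the original
# analogy should not be treated as evidence of proof.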
BIASES
Bias is a subjective viewpoint that indicates a preconceived notion about someone or something. Biases generally have a detrimental impact on intelligence analysis because they obscure the true nature of the information. Intelligence analysts must be able to recognize cultural, organizational, personal, and cognitive biases and be aware of the potential influence they can have on judgment.
Cultural Bias
Americans see the world in a certain way. The inability to see things through the eyes of someone from another country or culture is cultural bias. Biases interfere with the analyst’s ability to think the way a threat commander might think or to give policymakers informed advice on the likely reaction of foreign governments to U.S. policy. Also known as mirror imaging, cultural bias attributes someone else’s intentions, actions, or reactions to the same kind of logic, cultural values, and thought processes as the individual analyzing the situation. Although cultural bias is difficult to avoid, the following measures can lessen its impact:
- Locate individuals who understand the culture:
- Include them in the intelligence analysis process.
- Ask their opinion about likely responses to friendly actions.
- Take care when using their opinions since they may be subject to biases regarding ethnic groups or cultures in the region and their knowledge may be dated or inaccurate.
- Locate regional experts, such as foreign and regional area officers, who have lived or traveled through the area and are somewhat conversant regarding the culture. Assess the quality of the information provided against the level of knowledge and experience the individual has for that culture or region.
Organizational Bias
Most organizations have specific policy goals or preconceived ideas. Analysis conducted within these organizations may not be as objective as the same type of analysis done outside the organization. Groupthink and best case are organizational biases that can significantly skew internal analysis.
Groupthink. This bias occurs when a judgment is unconsciously altered because of exposure to selective information and common viewpoints held among individuals. Involving people outside the organization in the analysis can help identify and correct this bias.
Best case. This bias occurs when an analyst presents good news or bad news in the most optimistic light. The judgment is deliberately altered to provide only the information the commander wants to hear. Analysts can avoid this bias by having the moral courage to tell the commander the whole story, good and bad.
Cognitive Bias
The intelligence analyst evaluates information from a variety of sources. The degree of reliability, completeness, and consistency varies from source to source and even from report to report. This variance often creates doubt about the reliability of some sources. Cognitive biases that affect the analyst are—
Vividness. Clear and concise or vivid information has a greater impact on analytical thinking than abstract and vague information. A clear piece of information is held in higher regard than a vague piece of information that may be more accurate. Analysts must consider that an enemy may use deception to portray vivid facts, situations, and capabilities that they want the friendly intelligence effort to believe.
Absence of evidence. Lack of information is the analyst’s most common problem, especially in the tactical environment. Analysts must do their best with limited information and avoid holding back intelligence because it is inconclusive. To avoid this bias, the analyst should—
- Realize that information will be missing.
- Identify areas where information is lacking and consider alternative conclusions.
- Adapt or adjust judgments as more information becomes available.
- Consider whether a lack of information is normal in those areas or whether the absence of information itself is an indicator.
Oversensitivity to consistency. Consistent evidence is a major factor for confidence in the analyst’s judgment. Information may be consistent because it is appropriate, or it may be consistent because it is redundant, is from a small or biased sample, or is the result of the enemy’s deception efforts. When making judgments based on consistent evidence, the analyst must—
- Be receptive to information that comes in from other sources regardless of whether it supports the hypothesis or not.
- Be alert for circular reporting, which is intelligence already obtained by the unit that is then reformatted by other units and intelligence organizations, modified slightly, and disseminated back to the unit. This is a common problem, particularly in digital units where large volumes of information are processed. It helps to know, to the degree possible, the original source of all intelligence to ensure a circular report is not used as evidence to confirm an intelligence estimate or conclusion. (A provenance-tracing sketch appears at the end of this discussion of cognitive biases.)
Persistence on impressions. When evidence is received, there is a tendency to think of connections that explain the evidence. Impressions are based on these connections. Although the evidence eventually may be discredited, the connection remains and so do the impressions.
Dependency on memory. The ability to recall past events influences judgment concerning future events. Since memory is more readily available, it is easy to rely on memory instead of seeking new information to support analysis.
Acceptance of new intelligence. New intelligence is often viewed subjectively, being assigned either more or less value than current intelligence.
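To illustrate one way of knowing the original source, the following Python sketch (hypothetical report identifiers and fields) walks each report's provenance chain and flags reports that ultimately trace back to the unit's own reporting, the signature of circular reporting discussed under oversensitivity to consistency above:

# Hypothetical sketch: tracing report provenance to flag possible circular reporting.
# 'derived_from' points to the report a given report was reformatted or summarized from.
reports = {
    "RPT-001": {"origin_unit": "own unit",      "derived_from": None},
    "RPT-017": {"origin_unit": "higher HQ",     "derived_from": "RPT-001"},
    "RPT-042": {"origin_unit": "adjacent unit", "derived_from": "RPT-017"},
}

def original_source(report_id: str) -> str:
    """Walk the provenance chain back to the first report in it."""
    seen = set()
    while reports[report_id]["derived_from"] is not None and report_id not in seen:
        seen.add(report_id)
        report_id = reports[report_id]["derived_from"]
    return report_id

def is_circular(report_id: str, own_report_ids: set) -> bool:
    """A report is circular if it ultimately traces back to one of our own reports."""
    return (reports[report_id]["origin_unit"] != "own unit"
            and original_source(report_id) in own_report_ids)

own = {"RPT-001"}
print(is_circular("RPT-042", own))   # True: RPT-042 ultimately restates RPT-001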
Appendix C
Analytic Standards and Analysis Validation
INTELLIGENCE COMMUNITY ANALYTIC STANDARDS
C-1. During intelligence analysis, the conclusions reached should also adhere to analytic standards, such as those established by the Director of National Intelligence in ICD 203. This directive establishes the analytic standards that govern the production and evaluation of national intelligence analysis to meet the highest standards of integrity and rigorous analytic thinking.
The following identify and describe the five ICD 203 Intelligence Community Analytic Standards, including the nine analytic tradecraft standards:
- Objective: Analysts must perform their functions with objectivity and awareness of their own assumptions and reasoning. They must employ reasoning techniques and practical mechanisms that reveal and mitigate bias. Analysts should be alert to the influences of existing analytical positions or judgments and must consider alternative perspectives and contrary information. Analysis should not be unduly constrained by previous judgments when new developments indicate a modification is necessary.
- Independent of political consideration: Analytical assessments must not be distorted by, nor shaped for, advocacy of a particular audience, agenda, or policy viewpoint. Analytical judgments must not be influenced by the force of preference for a particular policy.
- Timely: Analysis must be disseminated in time for it to be actionable. Analytical elements must be continually aware of events of intelligence interest and of intelligence requirements and priorities in order to provide useful analysis at the right time.
- Based on all available sources of intelligence information: Analysis should be informed by all relevant information available. Analytical elements should identify and address critical information gaps and work with collection managers and data providers to develop access and collection strategies.
- Implement and exhibit the analytic tradecraft standards: See paragraphs C-3 through C-14.
ANALYSIS VALIDATION
C-2. Intelligence analysis and the resultant judgments are incomplete without the estimative language that provides both the probability that an event will occur and the confidence level of the analyst making this assessment. Analysts employ the analytic tradecraft standards to assess probabilities and confidence levels and the actions associated with analytical rigor to draw accurate conclusions.
ANALYTIC TRADECRAFT STANDARDS
C-3. Intelligence analysts exhibit and implement the nine analytic tradecraft standards, which together constitute one of the five ICD 203 Intelligence Community Analytic Standards. Specifically, they—
- Properly describe the quality and credibility of all underlying sources, information, and methodologies.
- Properly express and explain uncertainties associated with major analytical judgments.
- Properly distinguish between underlying intelligence information and analysts’ assumptions and judgments.
- Incorporate analysis of alternatives.
- Demonstrate customer relevance and address implications.
- Use clear and logical argumentation.
- Explain change to or consistency of analytical judgments.
- Make accurate judgments and assessments.
- Incorporate effective visual information where appropriate.
Properly Describe the Quality and Credibility of All Underlying Sources, Information, and Methodologies
C-4. Analytical products should include all underlying sources, information, and methodologies on which analytical judgments are based. Factors affecting source quality and credibility should be described using source descriptors in accordance with ICD 206, Sourcing Requirements for Disseminated Analytic Products. Such factors can include accuracy and completeness, possible denial and deception, age and continued currency of information, and technical elements of collection, as well as source access, validation, motivation, possible bias, or expertise. Source summary statements, described in ICD 206, should be used to provide a holistic assessment of the strengths or vulnerabilities in the source base and to explain which sources are most important to key analytical judgments.
Properly Express and Explain Uncertainties Associated with Major Analytical Judgments
C-5. Analysts must properly express and explain uncertainties associated with any major analytical judgment. When briefing their analytical results, analysts, at a basic level, must be able to assess the likelihood of an event happening, expressed by using estimative language. Then, they must express their confidence level—high, moderate, or low—in that assessment. (See figure C-1.) For intelligence analysts to reach a high level of confidence in the accuracy of their analytical assessment, they must apply the actions of high analytical rigor found in table C-1 on page C-5.
Assessing the Likelihood of an Event Happening
C-6. Phrases such as we judge, we assess, and we estimate are commonly used to convey analytical assessments and judgments; such statements are not facts, proofs, or knowledge. Intelligence analysts use estimative language, shown in figure C-1, to convey their assessment of the probability or likelihood of an event and the level of confidence ascribed to the judgment.
Expressing Confidence in Assessments
C-7. Confidence levels express the strength of the assessment given the reasoning, methodologies, gaps, and assumptions; the number, quality, and diversity of sources; and the potential for deception.
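For illustration, the following Python sketch pairs an assessed probability with an estimative phrase and a confidence level. The probability bands are an assumption modeled on commonly published ICD 203 conventions and are not reproduced from figure C-1; the function and scale names are hypothetical:

# Illustrative sketch only: mapping an assessed probability to estimative language and
# pairing it with a confidence level. The probability bands below are an assumption
# modeled on common ICD 203 conventions; consult figure C-1 for the authoritative terms.
ESTIMATIVE_TERMS = [
    (0.05, "almost no chance / remote"),
    (0.20, "very unlikely / highly improbable"),
    (0.45, "unlikely / improbable"),
    (0.55, "roughly even chance"),
    (0.80, "likely / probable"),
    (0.95, "very likely / highly probable"),
    (1.00, "almost certain / nearly certain"),
]

def estimative_phrase(probability: float) -> str:
    for upper_bound, phrase in ESTIMATIVE_TERMS:
        if probability <= upper_bound:
            return phrase
    return ESTIMATIVE_TERMS[-1][1]

def express_judgment(event: str, probability: float, confidence: str) -> str:
    assert confidence in {"low", "moderate", "high"}
    return f"We assess it is {estimative_phrase(probability)} that {event} ({confidence} confidence)."

print(express_judgment("the threat will defend along the river line", 0.7, "moderate"))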
Properly Distinguish Between Underlying Intelligence Information and Analysts’ Assumptions and Judgments
C-8. Analytical products should clearly distinguish statements that convey underlying intelligence information used in analysis from statements that convey assumptions or judgments. Assumptions are suppositions used to frame or support an argument; assumptions affect analytical interpretation of underlying intelligence information. Judgments are conclusions based on underlying intelligence information, analysis, and assumptions. Products should state assumptions explicitly when they serve as the linchpin of an argument or when they bridge key information gaps. Products should explain the implications for judgments if assumptions prove to be incorrect. As appropriate, products should also identify indicators that, if detected, would alter judgments.
Incorporate Analysis of Alternatives
C-9. Analysis of alternatives is the systematic evaluation of differing hypotheses to explain events or phenomena, explore near-term outcomes, and imagine possible futures to mitigate surprise and risk. Analytical products should identify and assess plausible alternative hypotheses. This is particularly important when major judgments must contend with significant uncertainties, or complexity, such as forecasting future trends, or when low probability events could produce high-impact results. In discussing alternatives, products should address factors such as associated assumptions, likelihood, or implications related to Army forces. Products should also identify indicators that, if detected, would affect the likelihood of identified alternatives.
Demonstrate Customer Relevance and Address Implications
C-10. Analytical products should provide information and insight on issues relevant to the commanders and address the implications of the information and analysis they provide. Products should add value by addressing prospects, context, threats, or factors affecting opportunities for action.
Use Clear and Logical Argumentation
C-11. Analytical products should present a clear main analytical conclusion up front. Products containing multiple judgments should have a main analytical conclusion that is drawn collectively from those judgments. All analytical judgments should be effectively supported by relevant intelligence information and coherent reasoning. Products should be internally consistent and acknowledge significant supporting and contrary information affecting judgments.
Explain Change to or Consistency of Analytical Judgments
C-12. Analysts should state how their major judgments on a topic are consistent with or represent a change from those in previously published analysis or represent initial coverage of a topic. Products need not be lengthy or detailed in explaining change or consistency. They should avoid using reused or unoriginal language and should make clear how new information or different reasoning led to the judgments expressed in them. Recurrent products should note any changes in judgments; absent changes, recurrent products need not confirm consistency with previous editions. Significant differences in analytical judgment, such as between two intelligence community analytical elements, should be fully considered and brought to the attention of customers.
Make Accurate Judgments and Assessments
C-13. Analytical products should apply expertise and logic to make the most accurate judgments and assessments possible, based on the information available and known information gaps. In doing so, analytical products should present all judgments that would be useful to commanders and should include difficult judgments in order to minimize the risk of being wrong. Inherent to the concept of accuracy is that the analytical conclusion that the analyst presents to the commander should be the one the analyst intended to send. Therefore, analytical products should express judgments as clearly and precisely as possible, reducing ambiguity by addressing the likelihood, timing, and nature of the outcome or development.
Incorporate Effective Visual Presentations When Feasible
C-14. Analysts should present intelligence in a visual format to clarify an analytical conclusion and to complement or enhance the presentation of intelligence and analysis. In particular, visual presentations should be used when information or concepts, such as spatial or temporal relationships, can be conveyed better in graphic form, such as tables, flow charts, and images coupled with written text. Visual presentations may range from a plain display of intelligence information to interactive displays for complex issues and analytical concepts. Visual presentations should always be clear and pertinent to the product’s subject. Analytical content in a visual format should also adhere to other analytic tradecraft standards.
ANALYTICAL RIGOR
C-15. Analytical rigor is the application of precise and exacting standards to better understand and draw conclusions based on careful consideration or investigation. There are eight primary action-metrics that lead to analytical rigor. When analysts combine these action-metrics with the intelligence analysis process, they can determine the analytical sufficiency of their conclusions.
- Consider alternative hypotheses: Hypothesis exploration describes the extent to which multiple hypotheses were considered in explaining data.
- Evaluate depth of research: Information search relates to the depth and breadth of the search process used in collecting data.
- Validate information accuracy: Information validation details the levels at which information sources are corroborated and cross-validated.
- Examine source bias: Stance analysis is the evaluation of data with the goal of identifying the stance or perspective of the source and placing it into a broader context of understanding.
- Scrutinize strength of analysis: Sensitivity analysis considers the extent to which the analyst considers and understands the assumptions and limitations of their analysis.
- Amalgamate information: Information synthesis refers to how far beyond simply collecting and listing data an analyst went in their process.
- Incorporate expert input: Specialist collaboration describes the degree to which an analyst incorporates the perspectives of domain experts into their assessments.
- Assess breadth of collaboration: Explanation critique is a different form of collaboration that captures how many different perspectives were incorporated in examining the primary hypotheses.
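As a hypothetical self-check (the action-metric names come from the list above; the 0-2 scoring scale and function are assumed conventions, not doctrine), the following Python sketch tallies how thoroughly an analytic effort has addressed the eight action-metrics:

# Hypothetical self-assessment against the eight analytical rigor action-metrics.
# Scoring scale (0 = not addressed, 1 = partially, 2 = fully) is an assumed convention.
RIGOR_METRICS = [
    "Consider alternative hypotheses (hypothesis exploration)",
    "Evaluate depth of research (information search)",
    "Validate information accuracy (information validation)",
    "Examine source bias (stance analysis)",
    "Scrutinize strength of analysis (sensitivity analysis)",
    "Amalgamate information (information synthesis)",
    "Incorporate expert input (specialist collaboration)",
    "Assess breadth of collaboration (explanation critique)",
]

def rigor_summary(scores: dict) -> str:
    total = sum(scores.get(metric, 0) for metric in RIGOR_METRICS)
    maximum = 2 * len(RIGOR_METRICS)
    gaps = [metric for metric in RIGOR_METRICS if scores.get(metric, 0) < 2]
    return f"Rigor score {total}/{maximum}; revisit: {gaps if gaps else 'none'}"

scores = {metric: 2 for metric in RIGOR_METRICS}
scores["Examine source bias (stance analysis)"] = 1   # example gap
print(rigor_summary(scores))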
Appendix E
Intelligence Production
E-1. The fundamental requirement of intelligence analysis is providing timely, accurate, relevant, and predictive intelligence assessments about the threat and OE to the commander and staff. Therefore, intelligence production requires the dissemination of reports and presentations to support operations. These reports involve various updates to IPB and collection management templates and matrices.
INTELLIGENCE PRODUCTS
E-2. The intelligence products described in this appendix are organized based on the following:
- Threat and OE analysis reports.
- Current intelligence reports.
- Supplemental analytical reports.
- Analytical assessments that support orders and briefings.
THREAT AND OPERATIONAL ENVIRONMENT ANALYSIS REPORTS
E-3. The intelligence estimate, intelligence running estimate, and Annex B (Intelligence) to the operation order (OPORD) each maintain an analytical assessment of threat forces’ strengths, vulnerabilities, tactics, composition, disposition, training, equipment, and personnel, as well as other OE considerations before, during, and after operations (revision of the original estimate).
Intelligence Estimate
E-5. An intelligence estimate is the appraisal, expressed in writing or orally, of available intelligence relating to a specific situation or condition with a view of determining the courses of action open to the enemy or adversary and the order of probability of their adoption (JP 2-0). Since intelligence analysts will have performed IPB to support the commander's MDMP effort and likely participated in a thorough staff war-gaming effort to validate friendly and threat COAs, the intelligence estimate is a version of the staff planning effort and part of the larger OPORD.
E-6. The intelligence staff develops and maintains the intelligence estimate to disseminate information and intelligence that define the threat COA along with the requirements to determine the adoption of a COA. The assessments in the intelligence estimate of COA development, including threat strengths, compositions, dispositions, and vulnerabilities, form the basis for future intelligence analytical requirements.
Intelligence Running Estimate
E-7. Effective plans and successful execution hinge on accurate and current running estimates. A running estimate is the continuous assessment of the current situation used to determine if the current operation is proceeding according to the commander’s intent and if the planned future operations are supportable (ADP 5-0). Failure to maintain accurate running estimates may lead to errors or omissions that result in flawed plans or bad decisions during execution. Each staff element is responsible for updating its portion of the running estimate as the operation unfolds.
E-8. The intelligence running estimate enables the intelligence operations officer/noncommissioned officer to continually update the commander on mission execution from the intelligence perspective. Unlike other intelligence products, the intelligence running estimate combines the analysis of both friendly and allied forces' intelligence activities to support current operations.
E-9. Figure E-3 illustrates an example intelligence running estimate. The analysis focuses on current threat activities, strengths, and assessed intent/objectives to provide the commander with a consistent summary of the threat and to satisfy associated reporting requirements. As the operation progresses, the collaborative effort may involve further analysis of the terrain and weather, monitoring the flow of displaced persons on the battlefield as inhibitors to friendly force maneuverability, and, when necessary, identifying additional security requirements.
CURRENT INTELLIGENCE REPORTS
E-10. Current intelligence reports address the current reporting of threat activities on the battlefield. The goal is to provide the commander with predictive analysis of the threat's intentions for future operations based on the conditions created by threat or friendly actions during the past reporting period. This requires extensive analytical rigor in assessing threat activities and vigilance regarding the friendly scheme of maneuver.
Intelligence Summary
E-11. The intelligence summary (also known as INTSUM) is a periodic publication of the G-2/S-2 assessment of the threat situation on the battlefield. It provides the commander with context to support decision making based on the G-2/S-2’s interpretation and conclusions about the threat, terrain and weather, and civil considerations over a designated period of time. This is typically identified in unit SOPs and in associated OPORD reporting instructions. The intelligence summary also provides COA updates based on the current situation. Unit SOPs designate the command’s format for preparing and disseminating an intelligence summary. At a minimum, the intelligence summary should contain the paragraphs and subparagraphs as shown in figure E-4.
Graphic Intelligence Summary
E-12. The graphic intelligence summary (also known as GRINTSUM) can be included with the intelligence summary or disseminated as a separate analytical report. It is a graphical representation of the intelligence summary, with emphasis on threat forces' locations compared to friendly forces' locations. The graphic intelligence summary also includes current PIRs and a summary of threat activities. (See figure E-5 on page E-8.) Since the emphasis of a graphic intelligence summary is graphical, most of the written details should be captured in the intelligence summary or an accompanying report.
E-13. There are challenges with using the graphic intelligence summary:
- The size of the graphical portrayal of the OE is often driven by critical facts about the threat that must be shown. Therefore, it is advisable to begin with a general OE map and zoom in on key areas. Ensure the written assessment includes the necessary details by either referencing the accompanying intelligence summary or other report or including the details in the Notes page of a PowerPoint slide.
- The file size must follow the commander's guidance or unit SOPs. Typically, the graphic intelligence summary is one or two graphics (PowerPoint slides) and limited in file size for ease of emailing and posting on unit web portals.
Intelligence Report
E-14. The intelligence report (also known as INTREP) demonstrates the importance of intelligence analysis. It is a standardized report, typically one page, used to provide a near-current picture of threat operations. It points to the threat's responses to friendly actions and the battlefield environment. Intelligence reports may also highlight time-sensitive critical activities that require corroboration with other units and higher echelons.
SUPPLEMENTAL ANALYTICAL REPORTS
E-15. Supplemental analytical reports, such as the periodic intelligence and supplementary intelligence reports, do not fall into a predetermined dissemination timeline. Periodic intelligence reports and supplementary intelligence reports follow a similar format, designated by a senior operational intelligence officer and staff. These reports allow for expanded analytical efforts, providing assessments of a technical or historical comparative nature. However, once the analysis begins to shape an assessment of threat intentions or capabilities, the urgency for releasing these analytical reports may increase.
Periodic Intelligence Report
E-16. The periodic intelligence report (also known as PERINTREP) is a summary of the intelligence situation that covers a longer period than the intelligence summary. (See figure E-7.) It is a means of disseminating detailed information and intelligence, including threat losses, morale, assessed strength, tactics, equipment, and combat effectiveness.
E-17. The periodic intelligence report includes but is not limited to sketches, overlays, marked maps or graphics, and annexes, providing a written and visual representation of the information and/or intelligence. The report is disseminated through the most suitable means based on its volume and urgency.
Supplementary Intelligence Report
E-18. The supplementary intelligence report (also known as SUPINTREP) is a comprehensive analysis of one or more specific subjects, typically the result of a request or to support a particular operation. This report is formatted similarly to a periodic intelligence report, but it addresses analysis over an extended period of time. Typically, the detailed analysis is from an accumulation of national assessments of threat actions, tactics, and doctrine identified during combat—normally a post-combat review. Maximum use of sketches, photos, overlays, marked maps or graphics, and annexes provides a written and visual representation of the information and/or intelligence. The supplementary intelligence report is disseminated based on the intelligence it contains and the commander’s requirements.
E-19. Specific reports may pertain to but are not limited to the following:
- Technical intelligence summary, which includes detailed analysis of captured military equipment and communications devices and can include post-explosion reports.
- Enemy prisoner of war interrogation reports from tactical to national sources.
- Translations of captured enemy documents (DOMEX).
- Cyberspace security updates.
- Medical or environmental hazards.
- Changes to civil political and other civilian authorities.
ANALYTICAL ASSESSMENTS THAT SUPPORT ORDERS AND BRIEFINGS
E-20. In addition to designated intelligence production requirements, the intelligence staff also provides analytical assessments for orders, briefings, and staff events, as described in FM 6-0. (See table E-1.) Normally, the intelligence analysis identifies the current threat situation and assessed threat capabilities (often tied to a threat COA); the same information exists in the intelligence summary, intelligence report, and intelligence running estimate. For intelligence analysts, the commander (and often the key staff officer) defines the requirement and may provide additional detailed requirements in unit SOPs.
Appendix F
Intelligence Support to Targeting
F-1. The targeting effort is cyclical and closely tied to combat assessments. Targeting is a complex and multidiscipline effort that requires coordinated interaction among many command and staff elements. The functional element necessary for effective collaboration is represented in the targeting working group. Intelligence analysts perform a number of critical tasks as part of this working group and the overall targeting effort. (See ATP 3-60 for more information on targeting.)
TARGETING GUIDELINES
F-2. The threat presents a large number of targets that must be acquired with available information collection assets and engaged with available attack assets. The targeting process assesses the benefits and the costs of engaging various targets in order to achieve the desired end state. Adhering to the five targeting guidelines should increase the probability of creating desired effects while diminishing undesired or adverse collateral effects:
- Targeting focuses on achieving the commander’s objectives.
- Targeting seeks to create specific desired effects through lethal and nonlethal actions.
- Targeting directs lethal and nonlethal actions to create desired effects.
- Targeting is a fundamental task of the fires warfighting function that encompasses many disciplines and requires participation from many staff elements and components.
- Targeting creates effects systematically.
TARGETING GUIDANCE AND CATEGORIES
F-3. The commander’s targeting guidance must be articulated clearly and simply to enhance understanding. The guidance must be clearly understood by all warfighting functions, especially by the intelligence staff. Targeting guidance must focus on essential threat capabilities and functions that interfere with the achievement of friendly objectives.
F-4. The commander’s targeting guidance describes the desired effects to be generated by fires, physical attack, cyberspace electromagnetic activities, and other information-related capabilities against threat operations. Targeting enables the commander to produce the desired effects through various lethal and nonlethal capabilities. Capabilities associated with one desired effect may also contribute to other effects. For example, delay can result from disrupting, diverting, or destroying threat capabilities or targets. Intelligence personnel should understand and use only the 14 terms from ATP 3-60 to describe desired effects (a notional vocabulary check follows the list):
- Deceive.
- Defeat.
- Degrade.
- Delay.
- Deny.
- Destroy.
- Destruction.
- Disrupt.
- Divert.
- Exploitation.
- Interdict.
- Neutralize.
- Neutralization.
- Suppress.
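Because analysts are expected to use only this approved vocabulary, a production workflow might enforce it with a simple check. The sketch below is an assumption about one way to do that, not a doctrinal requirement; the term set itself is the list above.

```python
# The 14 desired-effects terms listed above (per ATP 3-60). This sketch shows
# one notional way an analyst tool might enforce the approved vocabulary.
APPROVED_EFFECTS = {
    "deceive", "defeat", "degrade", "delay", "deny", "destroy", "destruction",
    "disrupt", "divert", "exploitation", "interdict", "neutralize",
    "neutralization", "suppress",
}

def validate_effect(term: str) -> str:
    """Return the normalized term if it is an approved desired-effect term; otherwise raise."""
    normalized = term.strip().lower()
    if normalized not in APPROVED_EFFECTS:
        raise ValueError(f"'{term}' is not one of the 14 approved desired-effect terms")
    return normalized

validate_effect("Suppress")      # accepted
# validate_effect("annihilate")  # would raise ValueError
```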
F-5. To effectively target the threat, friendly forces use deliberate and dynamic targeting. Deliberate targeting prosecutes planned targets, while dynamic targeting prosecutes targets of opportunity and changes to planned targets. During both categories of targeting, friendly forces may prosecute normal, time-sensitive, and sensitive targets.
TARGETING METHODOLOGY
F-6. The targeting methodology organizes the efforts of the commander and staff to accomplish key targeting requirements. This methodology is referred to as the decide, detect, deliver, and assess methodology. The methodology assists the commander and staff in deciding which targets must be acquired and engaged and in developing options to engage those targets. Options can be lethal or nonlethal and can be executed by organic or supporting assets at all levels, including maneuver, electronic attack, psychological operations, attack aircraft, surface-to-surface fires, air-to-surface fires, other information-related capabilities, or a combination of these.
F-7. The decide, detect, deliver, and assess methodology is an integral part of the MDMP. During the MDMP, targeting becomes more focused based on the commander’s guidance and intent. A very important part of targeting is identifying potential fratricide situations and the necessary coordination measures to positively manage and control the attack of targets. These measures are incorporated in the coordinating instructions and appropriate annexes of the operation plan or OPORD.
DECIDE
F-8. The decide function of the targeting methodology provides the overall focus and sets priorities for information collection and attack planning. It is the most important targeting function and requires close interaction among the intelligence, plans, operations, and fires cells and the servicing judge advocate. This step draws heavily on the staff’s knowledge of the threat, a detailed IPB (which occurs simultaneously), and a continuous assessment of the situation. Targeting priorities are addressed for each phase or critical event of an operation. The decisions made are reflected in visual products as follows (a notional sketch of an HPT list entry and an attack guidance matrix row appears after the list):
- HPT list. The high-payoff target list is a prioritized list of high-payoff targets by phase of the operation (FM 3-09). A high-payoff target is a target whose loss to the enemy will significantly contribute to the success of the friendly course of action (JP 3-60). An HPT is an HVT that must be acquired and successfully engaged for the success of the friendly commander’s mission. A high-value target is a target the enemy commander requires for the successful completion of the mission (JP 3-60).
- Information collection plan. The information collection plan focuses the collection effort to answer PIRs and other significant requirements. If an HPT is not designated as a PIR, it must still be supported by collection. The information collection plan usually supports the acquisition of more HPTs. (See ATP 2-01.)
- Target selection standard matrices. These matrices address accuracy or other specific criteria requiring compliance before targets can be attacked.
- Attack guidance matrix. The attack guidance matrix is a targeting product approved by the commander, which addresses the how and when targets are engaged and the desired effects (ATP 3-60).
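As a minimal sketch only, assuming notional field names and values rather than the formats prescribed in FM 3-09, ATP 3-60, or unit SOPs, an HPT list entry and an attack guidance matrix row might be recorded like this:

```python
from dataclasses import dataclass

# Notional HPT list entry: priority of a high-payoff target by operation phase.
@dataclass
class HptEntry:
    phase: str           # phase or critical event of the operation
    priority: int        # 1 = highest payoff within the phase
    target_set: str      # e.g., threat fires, air defense, command and control

# Notional attack guidance matrix row: how and when a target is engaged
# and the desired effect (terms drawn from the ATP 3-60 list above).
@dataclass
class AttackGuidanceRow:
    hpt: HptEntry
    when: str            # e.g., "as acquired", "planned"
    how: str             # delivery means, e.g., "surface-to-surface fires"
    desired_effect: str  # e.g., "neutralize"
    restrictions: str    # coordination or collateral damage restrictions

row = AttackGuidanceRow(
    hpt=HptEntry(phase="Phase II", priority=1, target_set="threat long-range fires"),
    when="as acquired",
    how="surface-to-surface fires",
    desired_effect="neutralize",
    restrictions="observe no-strike list; collateral damage estimate required",
)
```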
Intelligence Preparation of the Battlefield
F-9. In the same manner that targeting involves coordinated interactions among the commander and entire staff, IPB involves the active participation of the entire staff. The interactions between intelligence personnel and fires personnel are important during the IPB process. (For more information on staff collaboration during IPB, see ATP 2-01.3.) Many of the IPB products significantly influence or are brought forward into the targeting effort. These products assist in target value analysis and war gaming. Some examples of important IPB products include—
- The modified combined obstacle overlay.
- Civil considerations (ASCOPE) products.
- Weather effects products.
- Threat models with recommended HVTs.
- Situation templates with threat time phase lines.
- Event templates and matrices, which have named areas of interest (NAIs).
Target Value Analysis and War Gaming
F-10. From the coordination and work performed during the IPB effort, the targeting working group, especially the intelligence staff and targeting officer, performs target value analysis that yields HVT lists (which may include high-value individual lists) for a specific threat COA. Target value analysis continues the detailed analysis of relevant threat factors, including doctrine, tactics, equipment, capabilities, and expected actions for a specific threat COA. The target value analysis process identifies HVT sets associated with critical threat functions.
F-11. Target spreadsheets (or target folders, as appropriate) identify HVTs relative to a type of operation. Target spreadsheets give detailed targeting information for each HVT, which is used during IPB and war gaming. The intelligence staff and targeting officer collaborate to develop and maintain the target spreadsheet.
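As a minimal sketch only, assuming notional threat functions and targets, the output of target value analysis might be organized as HVT sets keyed by critical threat function:

```python
# Notional product of target value analysis: HVT sets grouped by critical
# threat function for one threat COA. Functions and targets are illustrative
# assumptions; real lists come from threat models and the targeting working group.
hvt_sets_by_function = {
    "command and control": ["division main command post", "signal retransmission site"],
    "fires": ["MRL battalion", "target acquisition radar"],
    "air defense": ["SAM battery", "early warning radar"],
    "sustainment": ["ammunition transfer point"],
}

# A target spreadsheet could then pair each HVT with the detail used during
# IPB and war gaming (location indicators, detection means, engagement options).
for function, hvts in hvt_sets_by_function.items():
    print(f"{function}: {', '.join(hvts)}")
```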
Target Development
F-30. Target development is the systematic examination of potential targets and their components, individual targets, and even elements of targets to determine the necessary type and duration of the action that must be exerted on each target to create an effect that is consistent with the commander’s specific objective (JP 3-60). This analysis includes deconfliction, aim point recommendations, target materials production, and collateral damage estimation. Target development generally results in products such as target folders, information collection requirements, and target briefs. Detailed analysis should characterize the function, criticality, and vulnerabilities of each target, linking targets back to targeting objectives and measures of effectiveness. Target development includes target vetting and target validation.
Note. Although target development is discussed under detect in ATP 3-60, for this publication, it is more useful to discuss this step under decide.
Target Vetting
F-31. Vetting is a part of target development that assesses the accuracy of the supporting intelligence to targeting (JP 3-60). Vetting establishes a reasonable level of confidence in a target’s designated functional characterization. The BCT intelligence cell accomplishes this by reviewing all target data for accuracy. At a minimum, the assessment includes a review of target identification, significance, collateral damage estimation, geospatial or location issues, impact on the threat or friendly forces, impact of not conducting operations on the target, environmental sensitivity, and intelligence gain or loss concerns. Vetting does not include an assessment of compliance with the law of war or rules of engagement.
Target Validation
F-32. Validation is a part of target development that ensures all candidate targets meet the objectives and criteria outlined in the commander’s guidance and ensures compliance with the law of war and rules of engagement (JP 3-60). Targets are validated against multinational concerns during some operations. Target vetting and validation should recur as new intelligence is collected or the situation changes. Target validation is performed by targeting personnel, in coordination with planners, servicing judge advocate, and other experts, as required. (See ATP 3-60 for a list of useful target validation questions.)
DETECT
F-33. As much as possible, the procedures and supporting products that are used during the detect function should be developed during the decide function. However, the targeting team must periodically update decisions made during the decide function concerning IPB products, HPT lists, target synchronization matrices, attack guidance matrices, the information collection plan, and the OPORD. Updating these products can occur throughout the detect, deliver, and assess functions of the targeting methodology.
F-34. Based on targeting priorities, the targeting working group establishes target detection and tracking priorities. Target tracking is inherent in target detection. The fires cell provides the intelligence cell with the degree of accuracy required and dwell time for a target to be eligible for engagement. Then the collection manager can match those requirements to the target location error of the information collection asset.
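As an illustration of that matching logic only, and assuming notional asset names and figures rather than real system characteristics, a collection manager’s eligibility check might look like this:

```python
from dataclasses import dataclass

@dataclass
class CollectionAsset:
    name: str
    target_location_error_m: float   # expected target location error in meters
    max_dwell_minutes: float         # how long the asset can maintain coverage

def eligible_assets(assets, required_accuracy_m, required_dwell_minutes):
    """Return assets whose location error and dwell time meet the fires cell's requirements."""
    return [
        a for a in assets
        if a.target_location_error_m <= required_accuracy_m
        and a.max_dwell_minutes >= required_dwell_minutes
    ]

# Notional assets and requirements, for illustration only.
assets = [
    CollectionAsset("UAS-A", target_location_error_m=50, max_dwell_minutes=120),
    CollectionAsset("SIGINT-B", target_location_error_m=500, max_dwell_minutes=240),
]
print(eligible_assets(assets, required_accuracy_m=100, required_dwell_minutes=60))
# -> only UAS-A meets both the accuracy and dwell-time requirements
```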
DELIVER
F-39. The deliver function executes the target attack guidance and supports the commander’s plan once HPTs have been located and identified. Target engagement requires several decisions and actions, which are grouped into tactical and technical decisions.
Tactical Decisions
F-40. Tactical decisions are made based on the analysis that was accomplished during target development. Tactical decisions reconfirm or determine the—
- Time of the engagement.
- Desired effect, degree of damage, or both.
- Delivery system to be used through weaponeering and collateral damage estimation.
Time of Engagement and Desired Effect
F-41. Time of engagement and the desired effect to be achieved on the target are critical considerations. The commander weighs the operational risk of tactical patience against the immediacy of the planned action in the attack guidance matrix.
Delivery System
F-42. This step builds on the analysis performed during target development and includes weaponeering and collateral damage estimation. If the target was already planned, this step starts with determining whether the delivery means is available and still the best weapon or means for the engagement. When the target is a target of opportunity, some analysis is necessary to complete a quick target development.
F-43. Weaponeering is the process of determining the specific means required to create a desired effect on a given target (JP 3-60). As much as possible, weaponeering should be planned during target development. Weaponeering considers munitions delivery error and accuracy, damage mechanisms and criteria, probability of kill, weapon reliability, and trajectory.
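This publication does not prescribe a weaponeering formula, but a common illustrative calculation, assuming independent shots and a notional single-shot probability of kill, combines that probability with the number of munitions delivered:

```python
# Illustrative weaponeering arithmetic (not prescribed by this publication):
# if each munition independently has single-shot probability of kill p,
# the cumulative probability of kill for n munitions is 1 - (1 - p)**n.
def cumulative_pk(single_shot_pk: float, rounds: int) -> float:
    return 1.0 - (1.0 - single_shot_pk) ** rounds

# Notional values: with a single-shot Pk of 0.3, about 6 rounds are needed
# to exceed a 0.85 cumulative probability of kill.
for n in range(1, 7):
    print(n, round(cumulative_pk(0.3, n), 3))
```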
Technical Decisions
F-45. Once the tactical decisions have been made, the G-3/S-3 directs the appropriate unit to engage the target. The fires cell provides the asset or system manager with selected time of engagement, desired effects, and any special restraints or requests for particular munitions types.
ASSESS
F-47. The assess function of the targeting methodology is nested in the overall continuous assessment of operations within the operations process. Assessment is directly tied to the commander’s decisions throughout the planning, preparation, and execution of operations. Planning for assessment identifies the key aspects of the operation that the commander directs be closely monitored and where the commander intends to make decisions. Commanders and staffs consider assessment ways, means, and measures. ADP 5-0 discusses overall operational assessment, including measures of effectiveness, measures of performance, and indicators. Intelligence plays a major role in operational assessment.
F-48. Intelligence also plays a major role in assessment as a part of the targeting methodology. The assess function of the targeting methodology is performed through combat assessment. Combat assessment is the determination of the effectiveness of force employment during military operations (JP 3-60). Combat assessment comprises three elements:
- Battle damage assessment.
- Munitions effectiveness assessment.
- Reengagement recommendation.
F-49. Together, BDA and munitions effectiveness assessment provide the commander and staff with an assessment of the effects achieved against targets and whether the targeting guidance was met. Based on this information, the staff can recommend reengagement when necessary.
Battle Damage Assessment
F-50. Battle damage assessment is the estimate of damage composed of physical and functional damage assessment, as well as target system assessment, resulting from the application of lethal or nonlethal military force (JP 3-0).
Producing BDA is primarily an intelligence cell responsibility but requires coordination across the staff, similar to IPB and most steps of intelligence support to targeting. BDA requirements should be captured as PIRs or as similar high-priority information collection requirements. BDA provides—
- Commanders with an assessment of the target’s mission effectiveness, overall status, capabilities (whether full or partial), and likely reactions or any change to their intent. This assists the staff in determining if the engagement is meeting the targeting guidance and is critical to any recommendation to reengage the target.
- Important analysis used to conduct quick target development and decide on the allocation or redirection of assets or weapon systems for any reengagement.
F-51. BDA has three components (see table F-1; a notional record sketch follows the list):
- Physical damage assessment. The staff estimates the extent of physical damage to a target based on observed or interpreted damage. It is a post-attack target analysis coordinated among all units.
- Functional damage assessment. All-source intelligence analysts assess the remaining functional or operational capability of the threat. The assessment focuses on measurable effects and estimates the threat’s ability to reorganize or find alternative means to continue operations. The targeting cell and staff integrate analysis with external sources to determine if the commander’s intent for fires has been met.
- Target system assessment. The staff conducts a broad assessment of the overall impact and effectiveness of all types of engagement against an entire target system capability (for example, threat air defense artillery systems). All-source intelligence analysts assist the staff in assessing the threat’s combat effectiveness or major threat subordinate elements or capabilities needed to accomplish a threat mission. This is a relatively permanent assessment (compared to functional damage assessment) that can be used for more than one mission.
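A minimal sketch, assuming notional field names rather than a prescribed BDA report format, of how the three components described above might be recorded together for a single engaged target:

```python
from dataclasses import dataclass

# Notional BDA record combining the three components described above.
@dataclass
class BdaRecord:
    target: str
    physical_damage: str       # observed or interpreted extent of physical damage
    functional_damage: str     # remaining functional or operational capability
    target_system_impact: str  # broad impact on the overall target system
    guidance_met: bool         # whether the targeting guidance was met
    reengage_recommended: bool

record = BdaRecord(
    target="threat air defense radar",
    physical_damage="antenna destroyed; shelter moderately damaged",
    functional_damage="site assessed non-mission capable for 72 hours",
    target_system_impact="local early warning coverage gap in sector",
    guidance_met=True,
    reengage_recommended=False,
)
```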
F-52. BDA requirements for specific HPTs are determined during the decide function. Often information collection assets can answer either target development and target acquisition requirements or BDA requirements, but not both. An asset used for BDA may be unavailable for target development and target acquisition requirements. The intelligence cell receives and processes the results, analyzes them against the desired effects, and disseminates them to the targeting team.
F-53. The targeting team should consider the following BDA principles:
- BDA should measure what is important to commanders, not make important what is easily measurable.
- BDA should be objective. When receiving a BDA product from another echelon, the conclusions should be verified (time permitting) to identify and resolve discrepancies among BDA analysts at different headquarters.
- The degree of reliability and credibility of BDA relies largely on information collection assets. The quantity and quality of information collection assets influence whether the assessment is highly reliable (concrete, quantifiable, and precise) or has low reliability (estimation). Effective BDA uses more than one source to verify each conclusion.
F-54. BDA is more than determining the number of casualties or the amount of equipment destroyed. The targeting team can use other information such as—
- Whether the targets are moving or hardening in response to the attack.
- Changes in deception efforts and techniques.
- Whether the damage achieved is affecting the threat’s combat effectiveness as expected.