Notes from Bringing Intelligence About: Practitioners Reflect on Best Practices


Russell G. Swenson, Editor

With a Foreword by Mark M. Lowenthal, Assistant Director of Central Intelligence

INTRODUCTION

Russell G. Swenson with David T. Moore and Lisa Krizan

This book is the product of studious self-reflection by currently serving intelligence professionals, as well as by those who are in a position, with recent experience and continuing contacts, to influence the development of succeeding generations of intelligence personnel. Contributors to this book represent eight of the fourteen organizations that make up the National Foreign Intelligence Community. A positive image of a community of professionals, engaged in public service, and concerned about continuous self-improvement through “best practices,” emerges from these pages.

Community partners, such as the Central Intelligence Agency (CIA), the Defense Intelligence Agency (DIA), the National Security Agency (NSA), and the State Department’s Bureau of Intelligence and Research (INR), share responsibilities for national security issues that allow individual collectors, analysts, issue managers and offices to work together on interagency task forces.

With its strategic focus, the Intelligence Community is expected to be forward-looking, envisioning future developments and their repercussions, whereas law enforcement intelligence efforts have typically focused on exploiting pattern analysis to link together the extralegal behavior of individuals and organizations with clear and legally acceptable evidence.

A facile claim of significant differences between law enforcement and national security intelligence may hold up to scrutiny only in terms of the scale of operations supported, rather than the professional intelligence techniques employed. We may infer from these observations that the principles of intelligence collection and analysis addressed in this book will apply to intelligence creation in the broadly overlapping cultures of law enforcement and national security intelligence.

The U.S. Intelligence Community was subject during the 1990s to a congressionally mandated reduction in personnel levels.2 This reduction occurred despite numerous small wars and the continuation of international criminal activity during the decade. When dissenters, such as former Director of Central Intelligence James Woolsey, “talked about the proliferators, traffickers, terrorists, and rogue states as the serpents that came in the wake of the slain Soviet dragon, [they were] accused of ‘creating threats’ to justify an inflated intelligence budget.”3 Even government reports such as that of the United States Commission on National Security (commonly referred to as the Hart-Rudman Report), which warned of catastrophic attacks against the American homeland and a need for vigilance, were dismissed.4

Even though collection methods are often arcane, methods of analysis are not very esoteric. Analytic methods used by intelligence analysts are readily available to specialists in the academic world.6 The commonalities that do exist among collectors and analysts across the Community have rarely been noted in intelligence literature. The essays in this book will help fill that gap, and should illuminate for non-specialists the important role of self-reflection among intelligence professionals who remain in government service.

6 Many if not most analysts have been exposed, by training or experimentation, to such techniques as link analysis, the Delphi technique, and analysis of competing hypotheses. Morgan D. Jones, former CIA analyst, has distilled the less structured techniques that intelligence analysts may employ in The Thinker’s Toolkit: 14 Powerful Techniques for Problem Solving (New York: Times Business, Random House, 1995 and 1998).
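For readers unfamiliar with these techniques, a minimal sketch of analysis of competing hypotheses (ACH) appears below. The hypotheses, evidence, and ratings are invented for illustration; the scoring follows the technique’s standard rule of thumb, which favors the hypothesis with the least evidence inconsistent with it.

```python
# A minimal sketch of analysis of competing hypotheses (ACH), one of the
# techniques named above. Hypotheses, evidence, and ratings are invented.
# The rule applied: prefer the hypothesis with the LEAST inconsistent evidence.

# Matrix: for each piece of evidence, rate each hypothesis as
# "C" (consistent), "I" (inconsistent), or "N" (neutral/ambiguous).
matrix = {
    "troop movement near border":  {"exercise": "C", "invasion": "C"},
    "no logistics buildup":        {"exercise": "C", "invasion": "I"},
    "leadership denies hostility": {"exercise": "N", "invasion": "N"},
}

def inconsistency_scores(matrix):
    """Count the inconsistent ratings accumulated by each hypothesis."""
    hypotheses = next(iter(matrix.values())).keys()
    return {h: sum(1 for row in matrix.values() if row[h] == "I")
            for h in hypotheses}

scores = inconsistency_scores(matrix)
print(scores)                       # {'exercise': 0, 'invasion': 1}
print(min(scores, key=scores.get))  # 'exercise': fewest inconsistencies
```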

It may now be true that the value of intelligence to consumers is more dependent on the evaluation of information (grappling with mysteries) than on discovering “secrets.”7 If so, then the evaluation of social trends in various regions might best begin with systematic exploitation of authentic or “grass-roots” reporting from newspapers and other mass media.

Language capabilities are indispensable for any country’s intelligence personnel who seek insights through indigenous mass media. Those capabilities must mirror the languages used across the electronic media of the target entities.

Garin defines the intelligence corporation in terms of a “learning organization” and then applies external standards from the Baldrige National Quality Program to selected intelligence-producing offices within the Defense Intelligence Agency. This benchmarking study not only identifies best practices, but also shows how such professional standards could be used to identify exemplary offices or individuals across the entire Intelligence Community.

If a communitarian ethos distinguishes intelligence professionals from their more individualistic and self-absorbed brethren in academia, then self-reflection among intelligence practitioners can also easily become a communal good. Tension between a communitarian and individualistic ethos can resolve itself among intelligence professionals through the strength of their bureaucratic (Weberian), nonmonastic tradition. The essays in this volume illustrate how, through self-reflection, that tension may be resolved. For example, individual professionals can easily spell out connections among these essays that would quickly move the discussion to a classified realm—into their “culture.”

That culture is typically characterized by fast-moving events and requirements that preclude introspection about the phenomena of intelligence collection and production.

Self-reflection not only allows the various agency sub-cultures to be displayed, as portrayed here, but also allows “insiders” to realize the subtle connections of their individual work to the overall enterprise. As a further illustration of this principle, the intense intellectual effort that characterized earlier eras of intelligence production and that continues as a part of the enduring culture, is evoked by the observations of William Millward, a World War II intelligence analyst at the UK’s Bletchley Park:

[Analysis] means reviewing the known facts, sorting out significant from insignificant, assessing them severally and jointly, and arriving at a conclusion by the exercise of judgment: part induction, part deduction. Absolute intellectual honesty is essential. The process must not be muddied by emotion or prejudice, nor by a desire to please.9

National intelligence collection management and intelligence analysis remain inherently governmental functions, and privatized intelligence—with its prospect of reduced congressional oversight—is even more antagonistic to the communal sharing of information than are the more stringently overseen bureaucratic fiefdoms.

To “bring intelligence about” from the point of view of the American people requires peeling back some of the thick mantle of secrecy that has shrouded individual initiatives and management approaches—Community best practices—employed in the execution of ordinary and extraordinary tasks. Readers who look closely at the observations set down by the authors here will find a serviceable tool for unwrapping some of the otherwise enigmatic enthusiasms and motivations of government intelligence professionals.

THE INTELLIGENCE PRO AND THE PROFESSOR: TOWARD AN ALCHEMY OF APPLIED ARTS AND SCIENCES

Pauletta Otis

Recent events have led to an increasing realization that all of the resources of the American public must be engaged in support of national security. The U.S. academic community and the U.S. National Foreign Intelligence Community are institutional managers of “information as power.” Yet their potential for useful cooperation and collaboration is unrealized and the relationship between the communities continues to be somewhat strained. This relationship has been the topic of discussions for many years in a somewhat benign security environment. With dramatic changes in that environment in late 2001, maximizing resources through enhanced cooperation and collaboration assumes critical importance.

From the academic’s point of view, the U.S. Government’s stability and its concern for the common welfare derive from the First Amendment. A stable democracy depends on an informed public, and the public cannot make good decisions without accurate and timely information. Without the “self-correcting” mechanism of free speech, a government tends to be subject to centrifugal forces, to spin on its own axis, and to become an entity quite separate from the citizenry.

Yet, on a practical level, we need to “keep secrets” from U.S. enemies, however they may be defined. A certain level of secrecy is required to protect national interests. How much secrecy is “best” in a democracy is subject to ongoing debate.11 The pendulum swings over time and in response to perceived threats to national security.

COMMONALITIES

Both the IC and the academic community work in the world of word-power. Both are populated by dedicated and committed Americans. In most cases, cooperative efforts have yielded the “best” for the United States in terms of public policy and decisionmaking. Yet, at other times, the communities have been at serious odds. The ensuing problems have been of significant concern to the attentive public and detrimental to the common good. These problems emanate not only from the theory of democracy but from competition over the nature of truth and value for the country. They can also be traced to concern about the nature and processes of intelligence gathering.

The IC and the academic community have much in common: both realize that information is power and that power comes in the form of words. These words form the basis of cultural assumptions, common concepts, and grand theory. In terms of information-handling minutiae, intelligence specialists and academic scholars do much the same thing. They research, write, explain, and predict with excruciating self-consciousness and realization of the power of words. Intelligence professionals and academics contribute to public discussion in both oral forums and in written format. Intelligence professionals and academics are valued, participating citizens: they attend common religious services, participate in the education of children, contribute to community service organizations, and talk to neighbors. At this level, both are truly in the same “game.” In the public domain, both intelligence professionals and academics contribute information, analysis, and opinions. These are read and integrated by the public, by policymakers, and by the “other” community. This common body of public information and analysis is at the core of intelligent decisionmaking in the United States.

For 50-plus years, relations between the IC and academia have wavered between cooperation and competition. The basic tension inherent in keeping “secrets in a democracy” has been played out in specific activities and attitudes.

Between 1939 and 1961, Yale University was overtly supportive of the IC.15 Students and faculty were participants in intelligence collection and analysis. Study centers were financially supported and professional journals published. Information was collected and archived (including the Human Relations Area Files). The money was channeled to individuals or to institutions but seemed to raise few concerns.16 At the political level, the existence of a “community” of intelligence-related agencies and offices has been noted in legislation, suggesting wide acceptance of the idea of government intelligence operations as regularized institutions with special taskings in support of U.S. national security.17

CONTINUITY, CHANGE, AND THE DEVELOPMENT OF DIFFERENCES

The “position descriptions” of an intelligence specialist and of an academic are polar opposites. The basic job of an academic is to find information, collect factoids whether they are useful or not, and scatter the information to the winds, hoping and praying to make more sense of the ensuing chaos. Intelligence specialists collect similar factoids for very specific productive purposes. Whereas the academic is spreading information, the intelligence professional is collecting, narrowing, and refining data to produce reports, even while perhaps continuing an academic interest in the subject. The academic may learn for the fun of it; the intelligence professional prefers, or learns, to actually do something with the information. In the long run, the academic must write, produce, and contribute to the body of professional literature, but without the pressure for immediate production that the intelligence professional must face. The academic can say “that is interesting”; the intelligence professional must certify that “it is important.”

During the Vietnam era, the question was raised in another context: Information concerning populations in Laos, Cambodia, and Vietnam, collected by anthropologists, was subsequently used in support of the war effort.

During the Cold War, the strain between academics and the IC contributed to a number of very ugly scenes. Distinctions were made between those academics who were “patriotic, loyal, and anti-communist,” and those who challenged authority in any context. Simple questioning became a sign of disloyalty.

Administrative penalties for individual scholars included blacklisting, failure to get tenure, lack of research support, and even dismissal from teaching positions. The public presentation of “truth” was simple: teaching loyalty to the country was the real job of the teacher/academic. Anything contrary was a serious and significant challenge to the future of the nation. And, if the public supported the university both financially and with its “children,” it in turn should be able to expect loyalty on the part of its faculty employees.

Both the intelligence professional and the academic are, to some extent, prisoners of their own bureaucracies and are habitually resistant to change.

Each community has a body of common knowledge which it assumes everyone working within the institutional framework knows and accepts. Because common knowledge is “common,” it is not self-conscious and most individuals do not even realize that basic assumptions are seldom challenged. For example: the academic community assumes that books are good and thus libraries should have them. Individuals in the IC often assume that books are inherently historical and that current information and intelligence must be “near-real-time” and continually refreshed. There is a distinct attitude of arrogance when these individuals discuss the currency of information with academics. What is not at issue, of course, is that the intelligence professional has to produce and is charged with an indications and warning responsibility mandating inclusion of current information.

Academics are known for jargon and obfuscation; the IC is known for coining acronyms seemingly at will. The tribal languages are difficult to learn without a willing translator.

Academics are notoriously fractious and democratic. Anyone can challenge anyone on any idea at any time–and they do. Sometimes it is not very polite or courteous, even when the comment begins with: “I understand your premise, but have you considered…?” Bloodletting is the rule rather than the exception. Members of the IC appear to be more polite, although “outsiders” may assume that they derive enjoyment from more discreet disagreement. The various forms of interaction are notable when individuals from the IC quietly participate in sections of the International Studies Association or the Political Science Association, or when academics vociferously contribute to conferences held under the auspices of the IC.

There is further the problem of “group-think” on both sides. Each side not infrequently assumes that it has a lock on reality, and often simply refuses to be open-minded. This occurred during the Vietnam era, when the IC felt itself misunderstood and abused, and the academic community believed that its information and personnel were being used by intelligence agencies in ways of which academe did not approve. The complexity of the story is well known, but the mythology remains extant.19 Part of the mythology is that intelligence agencies will collect information from academics but not return or share information in a collegial manner.

The academic community believes that it is not afflicted with the IC’s tunnel vision and can contribute to lateral thinking that supports enhanced problem-solving capabilities.

THE ACADEMIC’S DECISION

The motivations for an academic to work for the IC are, of course, individualistic and mixed. He or she may simply want a “steady” job that pays well–even if only working over the summer break. There can also be an ego boost when intelligence experts value the information and assessments produced by an academic in a particular field. The value attached to being “appreciated” for the information and analysis cannot be overstated, as there is very little of that appreciation inherent in the academic community.

A number of problems may emerge if an academic goes to work for the IC, becomes a contractor, or produces information that can be used by that community. He may “go native,” that is, become more “intelligence-centric” than the IC needs or desires, all in an attempt to “belong.” There is a tendency toward self-delusion, toward thinking that the information contributed is going to “make a difference.” Sometimes it does; sometimes it is just part of the package provided for a specific tasking. The academic really has no way of knowing what impact his or her work is having.

There are special areas in which the academic can contribute to the IC. A primary contribution is in the collection of facts and information using sources that the IC is precluded from using. Academics can generally travel more easily than intelligence analysts and can establish human networks across international boundaries that can be very useful, especially in times of crisis. Academics generally are trained in the development of analytical frameworks and employ them with a degree of ease and clarity. Many academics contribute a unique ability to critique or evaluate projects based on a wide reading of the literature combined with “on the ground” experience. The good academic is always a perverse thinker–looking to examine the null hypothesis or suggesting alternative explanations. They tend not to accept the “generally accepted explanation.” This habit can make them a bit difficult as conversationalists, especially if “critique” is translated as “criticism.” A good academic scholar will insist on “rigor,” as in following the requirement to test basic assumptions, validate criteria used in hypothesis building, monitor the internal validity of hypotheses, and pinpoint the public implications of certain theories. Academic rigor mandates “elegance”; that is, a hypothesis must be important, clear, and predictive. Even in descriptive analysis, an academic insists on looking at a range of possible alternative choices and at the internal coherence of the final product.

There is another caveat: most academics believe that intelligence professionals are better paid and have more job security for doing similar work.

The ultimate caveat: it is hard to explain to professional colleagues that “I am not a spy.” The mystique outlives the reality. It can be overcome with honesty and some degree of good humor.

There are specific benefits to the IC if academics and scholars are employed wisely. One of the often-heard criticisms of the IC is that it has a special susceptibility to “group-think” because analysts seldom have internal challenges. Projects, after all, are group-produced and tend to reduce an individual’s unique contribution in favor of the final product. The academic, simply by the nature of the job, challenges the norms, supports individual initiative and creativity, and values novel or unconventional thinking.

SUGGESTIONS FOR MUTUAL SUPPORT AND MUTUAL BENEFIT

For all of the reasons just noted, it is important that the IC encourage the participation of academics. Academics can make significant contributions in creative thinking, rigor of analysis, and professional presentation of products.

But the choice of which academic to invite as a participant, and the venue for participation, are not givens. The IC might do well to choose individual academics not for their geographic convenience to the Washington or New York areas, but for the written scholarly work produced in their “home environment.”

CONCLUSION

Academics and intelligence professionals are concerned with much the same set of problems. The approaches to problem solving may differ but certainly the practices inherent in American democratic tradition have constructed the intellectual environment so as to provide common definitions of contemporary threats and challenges. Both communities agree that liberty, democracy, freedom, equality, law, and property are defensible American values. Cooperation between the IC and academics, and specific contributions that academics can make to the production of good intelligence, can and must be further supported. It would be foolish to waste available skills and talent. There is too much at stake.

VIA THE INTERNET:
NEWS AND INFORMATION FOR THE ANALYST FROM NORTH AFRICAN ELECTRONIC MEDIA

John Turner

The remarkable growth of electronic media over the past decade in Francophone North Africa provides specialists in the region with a wealth of news and other information. This survey and analysis of the French-language media of Algeria, Morocco, and Tunisia examines how these media provide information on North African politics, economics, and culture in a way that takes account of expatriates, who make up an important sector of the politically active population. For “outsiders” who specialize in the interpretation of the region, news media offer a “grass-roots”-based prism through which future developments may be anticipated.1 Observable trends in the North African media, when carefully documented and placed in context, validate the contention of various authors, from Robert Steele and Gregory Treverton to Arthur Hulnick, who have addressed the promises and perils of devoting a greater share of intelligence resources to the exploitation of open-source information. They have found much more promise than peril.2

MEDIA ISSUES: OBJECTIVITY AND ACCURACY

The political heritage of North Africa and its impact on the region’s media culture together raise the inevitable questions of editorial freedom, objectivity and accuracy. The ideal standard for North African publications, as in other countries, is one of adherence to strict standards of reporting and analyzing events in a climate of universal press freedom. However, “red lines’’ exist in editorial freedom, objectivity, and accuracy that are perilous to cross. North African media have suffered negative sanctions imposed by government officials as the result of reporting that has broken taboos.19

Objectivity is a concern for analysts and researchers in a media culture where most papers, not only those owned by manifestly political groups, but also the independent press, are controlled by interests that have a marked political agenda. Nevertheless, in both Algeria and Morocco a system of checks and balances exists, with daily and weekly print media and their electronic counterparts in vigorous competition for the domestic and foreign market. These publications maintain high standards of journalism, and are normally quite open about their particular biases or journalistic objectives. Such competition and such declared interests permit a high degree of confidence in political and economic analysis among business and government consumers who seek information for decisionmaking. Tunisia, with its dearth of news dailies and strong government influence in the media, remains problematic for the area specialist seeking in-depth analysis of some events, although most economic and business reporting remains of high quality and supports its citizens’ decisionmaking effectively.20

Accuracy has historically been less of a concern. A flourishing media culture and rivalry among news dailies and weeklies in Algeria and Morocco ensure a degree of competition that normally means news items are reliable; that is, consistently reported in different publications with a sufficient degree of detail and verification of factual data to make them satisfactory records of events. In such an information-rich climate, attempts by government or private parties to plant stories would be subject to immediate scrutiny and detection by the press. The press frequently analyzes government reporting on an issue and takes it to task for various shortcomings, thereby ensuring that validity or accuracy is addressed along with reliability.
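The reliability standard described above lends itself to a simple operational test. The sketch below, with invented outlets and reports, treats an item as satisfactorily recorded only when independent publications agree on its key facts; the outlet names, field names, and threshold are illustrative assumptions, not part of the survey.

```python
# Minimal sketch of the corroboration heuristic described above: treat a news
# item as "reliable" when independent publications report it consistently.
from itertools import combinations

def consistent(a: dict, b: dict, keys=("event", "place", "date")) -> bool:
    """Two reports corroborate each other if their key facts agree."""
    return all(a[k] == b[k] for k in keys)

def is_reliable(reports: list, min_independent: int = 2) -> bool:
    """An item is satisfactorily recorded when at least `min_independent`
    reports from *different* outlets agree on the key facts."""
    for a, b in combinations(reports, 2):
        if a["outlet"] != b["outlet"] and consistent(a, b):
            agreeing = {r["outlet"] for r in reports if consistent(a, r)}
            if len(agreeing) >= min_independent:
                return True
    return False

reports = [
    {"outlet": "Daily A",  "event": "strike", "place": "Algiers", "date": "2002-03-04"},
    {"outlet": "Weekly B", "event": "strike", "place": "Algiers", "date": "2002-03-04"},
    {"outlet": "Daily C",  "event": "rally",  "place": "Oran",    "date": "2002-03-04"},
]
print(is_reliable(reports))  # True: two independent outlets agree
```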

For observers of North African affairs, trends in the French-language electronic news media are reassuring for the near future. Area specialists and other interested professionals will have a steadily growing body of information on which they can draw for a reasonably authentic representation of social trends. If political, economic, and social information remains readily available without the geographic limitations associated with print media distribution, the confidence that government and business interests worldwide place in information from this region will continue to improve. North African intelligence specialists who wish to provide decisionmakers with analysis drawn from multiple rather than single sources will, however, increasingly have to take account of Arabic-language media as well as independent Internet news sites in order to ensure a more complete picture of events.

IMPROVING CIA ANALYSIS BY OVERCOMING INSTITUTIONAL OBSTACLES

Stephen Marrin

The accuracy of CIA intelligence analysis depends in part upon an individual analyst’s expertise, yet programs implemented to increase this expertise may not be sufficient to increase the accuracy of either an individual’s analysis or the institution’s output as a whole. Improving analytic accuracy by increasing the expertise of the analyst is not easy to achieve. Even if expertise development programs were to result in greater regional expertise, language capability, or an improved application of methodological tools, the production process itself still takes place within an institutional context that sets parameters for this expertise. The agency’s bureaucratic processes and structure can impede an analyst’s acquisition and application of additional expertise, preventing the full realization of the potential inherent in expertise development programs. Therefore, any new reform or program intended to improve analytic accuracy by increasing the expertise of its analysts should be supplemented with complementary reforms to bureaucratic processes — and perhaps even organizational structure — so as to increase the likelihood that individual or institutional improvement will occur.

CIA’s intelligence production may be subject to improvement, but making that a reality requires a sophisticated understanding of how an analyst operates within the Directorate of Intelligence’s [DI’s] institutional context. However, empirical verification of this hypothesis is impossible since, as Jervis notes, “[r]igorous measures of the quality of intelligence are lacking” and are insurmountably difficult to create.

This essay is a conceptual “what-if” exploration of the interplay between the acquisition and application of expertise on three levels: the individual, bureaucratic processes, and organizational structure.

THE GOAL: IMPROVING CIA’S ANALYSIS

Improving the accuracy of CIA’s finished intelligence products could make an immediate and direct improvement to national security policymaking as well as reduce the frequency and severity of intelligence failures.

In the author’s experience and observation, a DI analyst interprets the international environment through an information-processing methodology approximating the scientific method to convert raw intelligence data into finished analysis. The traditional “intelligence cycle” describes how an analyst integrates information collected by numerous entities and disseminates this information to policymakers. As William Colby—former Director of Central Intelligence (DCI) and veteran operations officer—notes, “at the center of the intelligence machine lies the analyst, and he is the fellow to whom all the information goes so that he can review it and think about it and determine what it means.”4 Although this model depicts the process in sequential terms, more accurately the analyst is engaged in never-ending conversations with collectors and policymakers over the status of international events and their implications for U.S. policy. As part of this process, intelligence analysts “take the usually fragmentary and inconclusive evidence gathered by the collectors and processors, study it, and write it up in short reports or long studies that meaningfully synthesize and interpret the findings,” according to intelligence scholar Loch Johnson.5
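The distinction between the sequential model and the ongoing conversation can be made concrete. A minimal sketch, with invented function names rather than DI terminology, shows the cycle as a loop in which policymaker feedback reshapes the collection requirement on each pass:

```python
# A minimal sketch of the iterative relationship described above: rather than
# a one-way pipeline, the analyst loops between collectors and policymakers.
# All names here are illustrative, not actual DI terminology.

def intelligence_cycle(requirement, collect, analyze, disseminate, rounds=3):
    """Run a few iterations of the collect -> analyze -> feedback loop."""
    assessment = None
    for _ in range(rounds):
        raw = collect(requirement)              # tasking to collectors
        assessment = analyze(raw, assessment)   # integrate new data with prior judgment
        feedback = disseminate(assessment)      # policymaker reaction shapes the next round
        requirement = feedback or requirement   # refine the question, not just the answer
    return assessment
```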

Intelligence failures of every stripe, from the trivial to the vitally important, occur every day for a variety of reasons, including the mis-prioritization of collection systems, hasty analysis, and inappropriately applied assumptions.

Administrators at the Joint Military Intelligence College note that “analysis is subject to many pitfalls — biases, stereotypes, mirror-imaging, simplistic thinking, confusion between cause and effect, bureaucratic politics, group-think, and a host of other human failings.”7 Yet most intelligence failures do not lead to direct negative consequences for the U.S. primarily because the stakes of everyday policymaking are not high, and errors in fact and interpretation can be corrected as the iterative process between intelligence and policy develops. As a result, most failures or inaccuracies are eventually corrected and usually never even noticed. However, sometimes intelligence failure is accompanied by either great policymaker surprise or serious negative consequences for U.S. national security, or both.

The CIA’s May 1998 failure to warn American policymakers of India’s intentions to test nuclear weapons is an illustration of both kinds of failure. This lapse — widely criticized by foreign policy experts and the press — highlighted intelligence limitations such as the DI’s inability to add together all indications of a possible nuclear test and warn top policymakers. According to New York Times correspondent Tim Weiner, these indicators included “the announced intentions of the new Hindu nationalist government to make nuclear weapons part of its arsenal, the published pronouncements of India’s atomic weapons commissioner, who said…that he was ready to test if political leaders gave the go-ahead, and …missile tests by Pakistan that all but dared New Delhi to respond.”8 CIA’s inability to integrate these indicators — a failure of analysis — led to charges of “lack of critical thinking and analytic rigor.”9 Admiral David Jeremiah—who headed the official investigation into the failure—concluded that intelligence failed to provide warning in part because analysts “had a mindset that said everybody else is going to work like we work,” otherwise known as mirror-imaging.

CIA’s search for ways to improve analytic accuracy and prevent intelligence failure— if successful—could have a positive impact on national security policymaking. The CIA is arguably the centerpiece of the United States’ fourteen-agency intelligence community (IC) and “has primary responsibility for all-source intelligence analysis in the [IC] and the preparation of finished national intelligence for the President and his top policymakers,” according to former CIA Inspector General Fred Hitz.12 If CIA analysis does have this kind of central role in influencing policy, improving its accuracy should provide policymakers with the opportunity to create or implement policies that more effectively protect national security and advance national interests.

THE METHOD: INCREASING ANALYTIC EXPERTISE

Improving the capabilities and knowledge of the individual CIA analyst, through programs reflecting Admiral Jeremiah’s recommendation, is one way to improve the accuracy of intelligence analysis. An analyst’s expertise, defined as “the skill of an expert,”13 is a crucial component of the production of accurate intelligence analysis. “In the lexicon of US intelligence professionals, ‘analysis’ refers to the interpretation by experts of unevaluated [‘raw’] information collected by the [IC],” according to Loch Johnson.14 The presumption is that the more “expert” an analyst is, the more accurate the resulting interpretation will be.

Analytic expertise is a multi-faceted concept because CIA uses a complex web of analytic specialties to produce multi-disciplinary analysis. Not hired directly by the CIA or even the DI, most analysts are hired by the individual DI offices and assigned to “groups” that cover specific geographic areas, and are then assigned a functional specialty—“discipline” or “occupation” in DI terminology—such as political, military, economic, leadership, or scientific, technical, and weapons intelligence, according to CIA’s website.16 An analyst’s expertise can vary depending on his or her relative degree of regional knowledge, familiarity with disciplinary theory, and grounding in intelligence methods in general:

■ Regional expertise is essentially area studies: a combination of the geography, history, sociology, and political structures of a defined geographic region. The DI’s regional offices are responsible for an analyst’s regional expertise and develop it by providing access to language training, regional familiarization through university courses, or in-house seminars.

■ Disciplinary expertise relates to the theory and practice that underlies the individual analytic occupations. For example, economic, military, political and leadership analysis are built on a bed of theory derived from the academic disciplines of economics, military science, political science, and political psychology, respectively. Disciplinary expertise can be acquired through previous academic coursework, on-the-job experience, or supplementary training.

For the most part, each CIA analyst possesses a very small area of direct responsibility defined by a combination of regional area and discipline, working in country teams with analysts of other disciplines and interacting with other regional or disciplinary specialists as the need arises. CIA’s small analytic niches create specialists, but their specialties must be re-integrated in order to provide high-level policymakers with a bigger picture that is more accurate and balanced than that arising from the limited perspective or knowledge of the niche analyst. This process of re-integration—known as “coordination” in DI parlance—allows analysts of all kinds to weigh in with their niche expertise on pieces of finished intelligence before they are disseminated. According to CIA analyst Frank Watanabe: “We coordinate to ensure a corporate product and to bring the substantive expertise of others to bear.”

Former CIA officer Robert Steele observed that “[t]he average analyst has 2 to 5 years’ experience. They haven’t been to the countries they’re analyzing. They don’t have the language, the historical knowledge, the in-country residence time or the respect of their private-sector peers,” as reported by Tim Weiner.

Increasing expertise may not be sufficient to produce accuracy or prevent failure. As Jervis notes, “experts will [not] necessarily get the right answers. Indeed, the parochialism of those who know all the facts about a particular country that they consider to be unique, but lack the conceptual tools for making sense of much of what they see, is well known.”24 In addition, “[e]ven if the organizational problems…and perceptual impediments to accurate perception were remedied or removed, we could not expect an enormous increase in our ability to predict events” because “[t]he impediments to understanding our world are so great that… intelligence will often reach incorrect conclusions.”25 That is because human cognitive limitations require analysts to simplify reality through the analytic process, but reality simplified is no longer reality. As a result, “even experts can be wrong because their expertise is based on rules which are at best blunt approximations of reality. In the end any analytic judgment will be an approximation of the real world and therefore subject to some amount of error,”26 and analytic inaccuracies—and sometimes intelligence failure—will be inevitable. Therefore, although increasing expertise is a goal, it cannot be the only goal for increasing DI capabilities.

FIRST INTERVENING FACTOR: BUREAUCRATIC PROCESSES

Bureaucratic processes can impede the acquisition or application of expertise gained in expertise development programs, thereby limiting any potential improvement in overall analytic accuracy. The DI is a bureaucracy, and like every bureaucracy it creates “standard operating procedures” (SOPs), which are necessary for efficient functioning but over time usually prove rigid in implementation.

Analysts must have the opportunity to apply newly acquired expertise back at their desks in the DI for any improvement to result. However, if analysts do not have the opportunity to apply this expertise, it will likely wither for lack of practice. The DI produces many types of finished intelligence—some reportorial, some analytical, and some estimative—to meet policymakers’ varying needs for information. In addition to daily updates on developments worldwide, policymakers and their staffs also use information and analyses when crafting policy and monitoring its implementation. Obstacles to the development of expertise appear when shorter product types are emphasized over longer, more research-oriented ones. Short-turnaround products—including daily updates known as “current” intelligence—have at times been emphasized over other products for their greater relevance to policymakers, but this emphasis has at the same time reduced the expertise of the DI writ large, because such products require different kinds of mental operations that reduce the scope and scale of an analyst’s research and knowledge.

Robert Jervis once observed that “the informal norms and incentives of the intelligence community often form what Charles Perrow has called ‘an error-inducing system.’ That is, interlocking and supporting habits of the community systematically decrease the likelihood that careful and penetrating intelligence analyses will be produced and therefore make errors extremely likely.”30 Bureaucratic processes can contribute to the creation of this kind of “error-inducing system.”

The Swinging Pendulum

For much of the DI’s history, analysts acquired expertise by writing long reports. As former CIA officer Arthur Hulnick notes: “a great deal of research was being done, but …much of it was done to enable the analyst to speak authoritatively on a current issue, rather than for publication.”

In a 1993 article, former analyst Jay Young argued that “the needs of the policy-maker too often get lost in the time-consuming, self-absorbed and corrosive intelligence research production process. The DI’s research program is run with the same rigid attention to production quotas as any five-year plan in the old USSR. … This fixation on numerical production leads managers to keep analysts working away on papers whose relevance may be increasingly questionable. …

In short, too much of the Agency’s longer-term research is untimely, on subjects of marginal importance and chock-full of fuzzy judgments.”

Losing Expertise to Gain Relevance

The swing of the pendulum that emphasizes policymaker relevance over analytic depth causes the DI to produce a large amount of current intelligence, which prevents the acquisition and application of analytic expertise. Many analysts are facile data interpreters, able to integrate large volumes of new information with existing knowledge and to interpret—based on underlying conceptual constructs—the significance of the new data in terms of its implications for U.S. foreign policymaking. This process provides policymakers with exactly what they are looking for from intelligence analysis. However, if provided with minimal time in which to integrate and process the information, the intelligence analyst by necessity cuts corners. When the DI emphasizes intelligence “on-demand,” analysts meet the much shorter deadlines by reducing the scope and scale of their research, as well as by sidestepping the more laborious tradecraft procedures: not rigorously scrutinizing assumptions or comparing working hypotheses to competing explanations.

Many times current intelligence analysis consists of a single hypothesis — derived within the first hour of the tasking — that the analyst intuitively believes provides the best explanation for the data. Current intelligence as a product has a lesser chance of being accurate because it lacks the self-conscious rigor that tradecraft entails even though it is the best that can be done under the press of quick deadlines.

The production of a single piece of current intelligence has limited effect on expertise because it draws on the knowledge and tools that an analyst has developed through training and prior analysis. However, if over time the analyst does not have the time to think, learn, or integrate new information with old to create new understandings, knowledge of facts and events may increase but the ability to interpret these events accurately decreases.

Intelligence scholar Robert Folker observed that in his interviews of intelligence analysts a “repeated complaint was the analyst’s lack of time to devote to thoughtful intelligence analysis. In a separate interview at CIA it was revealed that [intelligence analysts]… spend little time or effort conducting analysis.”39 Therefore, the effectiveness of expertise development programs in improving analytic accuracy may depend in part on whether the CIA is able to redress the overemphasis on short-term analysis.

A Structural Fix?

The DI appears to be remarkably blind to differentiation in both analysis and analysts, perhaps because it assigns tasks to “analysts” and equates the output with “analysis.” As a result, “[w]e do not and never have used the term ‘analysis’ rigorously in the [IC],” according to Robert Bovey, formerly a special assistant to a DCI.41 Robert Jervis illustrated the problems of equating the two in 1988 by arguing that “most political analysis would better be described as political reporting” and that instead of following an analytical approach “the analyst is expected to summarize the recent reports from the field—‘cable-gisting.’ … [T]he reporting style is not analytical—there are few attempts to dig much beneath the surface of events, to look beyond the next few weeks, to consider alternative explanations for the events, or to carefully marshal evidence that could support alternative views.”42 Jervis correctly differentiated the analytical intelligence product from its non-analytic cousin, but failed to distinguish between the analysts best suited for each. The DI—instead of differentiating between analysts—uses a one-size-fits-all recruitment, placement, training, and promotion strategy, and for the most part views analysts as interchangeable.

As a result, it has perennially had difficulty creating an appropriate mix of analytic abilities and skills for intelligence production when an issue or crisis develops. In particular, the shift in expertise that corresponds, over time, to a preference for longer papers or for shorter, more current pieces is especially noticeable.

SECOND INTERVENING FACTOR: ORGANIZATIONAL STRUCTURE

The DI’s organizational structure also influences which kind of analyst expertise is acquired and applied in finished intelligence products. Political scientist Thomas Hammond argues that organizational structure impacts analytic output. He employs information-flow models to demonstrate that—given the same information—one group of intelligence analysts organized by discipline would produce different analytic output than another group organized by region. He also concludes that it is impossible to design a structure that does not impact output.44 If we accept his argument, then organizational structure always affects internal information flows, and likely outputs as well. Theoretically, therefore, an organizational structure divided along disciplinary lines will accentuate learning of political, economic, military, or leadership methodologies through constant, interactive contact and awareness of the projects of other analysts.
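Hammond’s argument can be illustrated with a toy model. In the sketch below, which uses invented analysts, regions, and report fragments, the same inputs are pooled differently depending on whether information flows within regional or within disciplinary groups, so the resulting products differ:

```python
# A toy illustration of Hammond's point, using invented data: the same
# analysts and the same reporting, grouped two different ways, yield
# different products because information pools freely only inside a group.
from collections import defaultdict

analysts = [
    {"name": "A", "region": "MiddleEast", "discipline": "political"},
    {"name": "B", "region": "MiddleEast", "discipline": "economic"},
    {"name": "C", "region": "EastAsia",   "discipline": "political"},
    {"name": "D", "region": "EastAsia",   "discipline": "economic"},
]

# Each analyst holds one fragment of reporting.
fragments = {"A": "unrest", "B": "currency drop", "C": "elections", "D": "trade deal"}

def products(group_by: str) -> dict:
    """Pool fragments within groups defined by `group_by` ('region' or 'discipline')."""
    groups = defaultdict(list)
    for a in analysts:
        groups[a[group_by]].append(fragments[a["name"]])
    # The "product" of each group is whatever its members could pool together.
    return {g: " + ".join(parts) for g, parts in groups.items()}

print(products("region"))      # {'MiddleEast': 'unrest + currency drop', ...}
print(products("discipline"))  # {'political': 'unrest + elections', ...}
# Same inputs, different syntheses: structure shapes what gets combined.
```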

From approximately the early 1960s to 1981, the DI was structured primarily by discipline, with political, economic, military, and leadership offices each subdivided by geography. In 1981 “[a] [DI]-wide reorganization…shuffled most analysts and created geographically based, or ‘regional’ offices out of the previous organization.”47 According to former CIA officer Arthur Hulnick: “These [new] offices combined political, economic, military, and other kinds of research under ‘one roof,’ thus making more detailed analytic research feasible. After some grumbling, the new offices began to turn out the in-depth products consumers had been seeking, while still providing current intelligence materials.”48 This integration of analytic disciplines in regional offices provided a more productive interpretation of the forces at work within a target country or region, but also negatively affected the DI’s ability to maintain disciplinary knowledge.

The distribution of what had previously been centralized knowledge of analytic tools, specialized product formats, and disciplinary theory throughout the DI meant that new leadership analysts were not well trained in their discipline. These new analysts relied solely on the fast-dissipating knowledge of the handful of officers from the former leadership analysis office (LDA) who happened to be assigned to their team or issue. In addition, actual physical co-location did not occur for months — and in some cases years — due to lack of space in CIA’s overcrowded headquarters building. As a result of being “out of sight, out of mind,” leadership analysts were frequently not informed of ongoing projects, briefings, and meetings, and such omissions had a negative impact on the finished analytical product. When leadership analysts were not included in briefings, the DI risked failing to keep its consumers fully informed of both leadership dynamics and changes within the country or region. In addition, products published and disseminated without coordination at times contained factual errors, such as the wrong names or positions for foreign government officials, or distortions in analysis due to the lack of leadership analyst input.

Therefore, the elimination of a leadership analysis-based office resulted in both increased incorporation of leadership analysis and insight into the regional teams’ products, and decreased corporate knowledge and expertise in leadership analysis as a discipline.

PUTTING THE PIECES TOGETHER AGAIN

By the mid-1990s, DI analysts were realizing that while multi-disciplinary analysis on country teams made for better integration of disciplines, it also led to the dissipation of disciplinary knowledge. Former LDA analysts’ efforts to sustain their hold on disciplinary knowledge triggered similar efforts by political, economic, and military analysts to both sustain and reconstruct occupation-specific knowledge. Without a structural framework to bind each discipline together, the disciplines had over time grown apart.

In 1997 the DI created the “senior-level” Council of Intelligence Occupations (CIOC) with the initial intent of disciplinary “workforce strategic planning…in the areas of recruitment, assignments, and training” as well as “identify[ing] core skills and standards for expertise in the [DI].”

In practice, CIOC became a home for senior analysts interested in learning and teaching their discipline’s methodologies. They “establish[ed] a professional development program for all DI employees that provides explicit proficiency criteria at each level so that everyone can see the knowledge, skills, and experiences required for advancement within each occupation or across occupations.”55 All this was done—according to the CIA website — “so that the DI has the expertise needed to provide value-added all-source analysis to its customers.”

There may be no easy solution to the expertise trade-offs inherent in organizational structure. By definition, the structure will create synergies in the areas emphasized but the opportunity cost of doing so means losing the synergies in other areas.

TO THE FUTURE

Expertise-development programs that create the potential to improve overall analytic accuracy—such as formal training in methodologies, regions, disciplines, or languages, or informal training resulting from greater overseas experience — do not provide much in the way of improvement if in fact the DI’s business practices prevent application of this hard-earned expertise. CIA’s leaders should continue to pursue ways to increase analyst expertise, for they could contribute to increased analytic accuracy. Yet at the same time the DI must adapt its practices to leverage the potential improvements of these programs if the CIA is to provide its policymaking customers with the intelligence they need now and in the future.

APPRAISING BEST PRACTICES IN DEFENSE INTELLIGENCE ANALYSIS

Thomas A. Garin, Lt Col, USAF

ABOUT THE AUTHOR

Lt Col Tom Garin, an Air Force officer assigned to the National Reconnaissance Office (NRO), arrived at the Joint Military Intelligence College in September 1999 to occupy the General Thomas S. Moorman, Jr. Chair for National Reconnaissance Systems. He has taught graduate-level core and elective courses on missiles and space systems, structured techniques for intelligence analysis, and research methodologies, as well as an undergraduate core course on space systems. Prior to this post, Lt Col Garin was Chief, Organizational Development at the NRO. There, he used a balanced scorecard approach to link human resource policies with the overall NRO mission.

An adaptable organization is one that has the capacity for internal change in response to external conditions. To be adaptable, organizations must be able to learn. Organizations can learn by monitoring their environment, relating this information to internal norms, detecting any deviations from these norms, and correcting discrepancies.
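The learning sequence described here is essentially a feedback control loop. A minimal sketch, with invented norms and an invented correction step, shows one pass of the monitor, compare, detect, and correct cycle:

```python
# A minimal sketch of the organizational learning loop described above:
# monitor the environment, relate observations to internal norms, detect
# deviations, and correct them. Norm names and values are illustrative.

def learning_cycle(observe, norms: dict, correct, tolerance: float = 0.1):
    """One pass of monitor -> compare -> detect -> correct."""
    observation = observe()                  # monitor the environment
    deviations = {
        k: observation[k] - expected
        for k, expected in norms.items()
        if abs(observation[k] - expected) > tolerance  # detect deviations
    }
    for norm, delta in deviations.items():
        correct(norm, delta)                 # correct the discrepancy
    return deviations

# Example: an office tracking product timeliness and consumer satisfaction.
norms = {"timeliness": 0.9, "satisfaction": 0.8}
observe = lambda: {"timeliness": 0.7, "satisfaction": 0.85}
correct = lambda norm, delta: print(f"adjusting {norm} (off by {delta:+.2f})")
print(learning_cycle(observe, norms, correct))  # only 'timeliness' deviates
```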

A telling aspect of any knowledge organization is the way in which it manages its information.

Knowledge management can equip intelligence organizations for the fast-paced, high-technology information age. By building upon the best aspects of total quality management, and observing the criteria for performance excellence adopted for the Baldrige National Quality Program, managers can theoretically lead professionals to work together effectively as a group in a situation in which their dealings with one another affect their common welfare.

STUDY DESIGN

The DIA is expected to ensure the satisfaction of the full range of foreign military and military-related intelligence requirements of defense organizations, UN Coalition Forces, and non-defense consumers, as appropriate. Specifically, DIA supports: (1) joint military operations in peacetime, crisis, contingency, and combat; (2) service weapons acquisition; and (3) defense policymaking. In the past fifteen years, several analysis and production units within DIA have received prestigious awards demonstrating the confidence and appreciation that senior U.S. leaders have in the agency’s performance. On the basis of these observations, a case could be made that DIA has become a world-class organization for meeting U.S. military intelligence requirements. Clearly, the DIA is a worthy organization to examine for evidence of how quality intelligence analysis can be carried out.

The three essential characteristics of operations research are (1) systems orientation, (2) use of interdisciplinary teams, and (3) adaptation of the scientific method. The Baldrige criteria serve a similar function for assessing information organizations. From a systems perspective, the Baldrige criteria focus on a leadership triad and a results triad.

A Typical Office

A typical analysis office consists of a manager, the senior intelligence officer (SIO), analysts, liaison officers, and administration/technical support staff. The manager handles a set of processes common to many organizations such as planning, organizing, staffing, directing, coordinating, reporting, and budgeting. The SIO is a senior analyst, a subject matter expert on a grand scale, and is responsible for the content of the analytical product. A typical SIO approves all finished intelligence products, manages the budget for contracting external studies, and serves as the unit’s chief training officer. Analysts tend to be subject matter experts and are accustomed to using technology to support analysis. Liaison officers connect the analysis office either to operational forces in the field or to other organizations in the Intelligence Community. The administration/technical support staff provide a variety of functions such as disseminating the finished intelligence product, providing graphic support, and arranging travel.

Consumer Groups

Analysts produce intelligence for a number of different consumer groups.2 These consumer groups include: (1) civilian and military policymakers; (2) military planners and executors; (3) acquisition personnel responsible for force modernization; (4) diplomatic operators; and (5) intelligence operators.

Participants

This intelligence analysis benchmarking study is the result of a class project for the Joint Military Intelligence College’s graduate-level course ANA 635, “Structured Techniques for Intelligence Analysts.” This course helps students understand and apply descriptive and quantitative methodologies to intelligence analysis.

Process

The team followed established benchmarking procedures in conducting this study, which included on-site visits and interviews with members of each unit. The study team adopted a “systems” perspective that isolated the concepts of leadership, strategic planning, consumer focus, information and analysis, people and resources, process management, and mission accomplishment. Taken together, these seven elements define an organization (or a unit within a larger organization), its operations, and its performance.
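The seven elements named above can be read as a simple assessment rubric. The sketch below encodes them as such; the 0-to-10 scale and the unweighted average are illustrative assumptions, not the official Baldrige point values:

```python
# A sketch of the seven-element "systems" perspective adopted by the study
# team. Element names come from the text; the scoring scale and the
# unweighted average are illustrative assumptions.
from dataclasses import dataclass

ELEMENTS = [
    "leadership", "strategic planning", "consumer focus",
    "information and analysis", "people and resources",
    "process management", "mission accomplishment",
]

@dataclass
class UnitAssessment:
    unit: str
    scores: dict  # element -> score on an assumed 0-10 scale

    def overall(self) -> float:
        """Unweighted average across the seven elements."""
        return sum(self.scores[e] for e in ELEMENTS) / len(ELEMENTS)

office = UnitAssessment("analysis office A", {e: 7 for e in ELEMENTS})
print(f"{office.unit}: {office.overall():.1f}/10")
```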

THE EVALUATION CRITERIA

The Malcolm Baldrige National Quality Improvement Act of 1987, Public Law 100-107, provides for the establishment and conduct of a national quality improvement program under which awards are given to organizations that practice effective quality management. The Act established the Malcolm Baldrige National Quality Award. In implementing this Act, the National Institute of Standards and Technology makes available guidelines and criteria designed for use by business, education, and health care organizations for self-evaluation and improvement of their competitive positions.3 In this paper, the NIST criteria are applied to DIA, under the premise that a U.S. government organization, through its component offices, could benefit from a systematic evaluation of its “competitive” position; namely, its ability to develop and maintain agility in a change-intensive environment.

Task Assignment

Initiative. In many offices, analysis is self-generated. The office leader assigns each analyst to an area of the world and constantly pushes for finished products. The analyst is expected to act on his own initiative, do the analysis, and produce articles. In this scheme, there really is no clear top-down tasking process. One subject observed that leaders need to do a better job of prioritizing analysts’ work because, even as analysts assemble data to put in databases, create briefings, and produce finished intelligence products, they often write on relatively insignificant items. Their workload is usually events-driven. Consumers call in to the office and request certain information. The analysts attempt to tell the consumers what they do not already know. Their work may not always reflect the consumer’s interests. To counter this tendency, some subjects maintain, managers need to incorporate a perspective on vital national interests into their analysts’ tasks.

Another subject identified two ways in which analysis tasks are self-generated. In one, the analyst submits a formal proposal to do general research and, if approved, then does the work. In the second, the analyst submits a more structured proposal to study a region of the world by looking forward perhaps 10 years. If approved, the analyst in this case identifies problems, examines them, and synthesizes them into the most important issues. A problem and its attendant issues could become the basis for a National Intelligence Estimate (NIE).4

Self-generated analysis may have certain implications. First, the analyst may be doing analysis that serves his or her personal interests but does not meet a military leader’s or other policymaker’s requirements. Second, the analyst may be focusing attention on analysis projects that are the easiest or quickest to publish, satisfying a need of the analysis office rather than producing analysis that is most beneficial to the Intelligence Community. Third, the analyst may focus on easier, short-term problems instead of tougher, future-oriented ones in order to appear more productive by producing more intelligence.

Leader and Analyst Interaction

Initiative. Most interview subjects agreed that leaders from inside or outside their office do not interfere with the analyst’s work. They provide oversight only when the analyst asks for it because analysts do not like to be micro-managed. Each manager considers his or her analysts subject matter experts. Managers may ask the analyst, “Are you finished yet?” or “Do you need help on A/B/C?” For additional support, analysts often appeal to other experts as needed.

The SIO for an office gets involved only if an analyst brings an issue or question to him. Sometimes, the SIO walks around the office, talks to the analysts, and looks for ways to help obtain certain information. The SIO gets formally involved in the analytic process, however, when the analyst has created the final product. Usually, it has been coordinated with the analyst’s peers before it gets to the SIO. The SIO makes any substantive or formatting corrections before the final product can leave the office.

Directive. Usually during a crisis situation, there is face-to-face interaction between an office leader and analysts from the moment the requirement appears until it is met. One subject agreed that there should be a constant exchange of information between the leader and the analyst even during routine analysis; some leaders are better at it than others. The lower-level managers in his office work directly with the analysts: they usually assign an analyst a project, discuss with the analyst how to do it, and continually ask how it is going.

Another subject described the production process this way: the analyst produces, the boss does a sanity-check review, the analyst coordinates his or her work with other staffs, and those staffs suggest changes.

Feedback

Initiative. The interaction between leadership and the analyst during feedback sessions varies from office to office. It usually follows a similar pattern: the analyst produces his or her findings; managers review the work; the analyst makes the recommended changes; and the analyst coordinates the work with appropriate offices. In some cases, the analyst decides who should coordinate on his or her analysis. Competition between offices impedes the coordination process. One subject regretted that neither top-down guidance nor tools exist to support the coordination process.

One subject said all input from leaders and peers comes after the analysis is done. As a result, the usual case is for the assessment in the final product to be conceptually close to the analyst’s own findings. Once the feedback process concludes, analysts are free to post their finished products on the electronic “broadcast” media.

An analyst may also send a copy directly to the consumer, or may inform specific consumers of its URL. As feasible, the product will be “downgraded” to a lower classification level by obscuring sources or methods to enable it to reach a wider audience on different electronic systems.

Directive. After completing the analysis, the analyst will normally present his or her information to a team chief or SIO for coordination. After considering the revisions, the SIO will give the analyst permission to send written findings forward. Because this information may appear unaltered in the U.S. President’s read book, an SIO does not hesitate to “wordsmith.” This feedback may be frustrating and humiliating for the analyst. The SIO may direct the analyst to fix the logical development of an argument or to present more and better data. Some analysts will simply rework their product and resubmit it to the SIO. Others, however, will discuss the comments with the SIO. The SIO typically prefers the latter method because the analyst will usually understand the comments better and will learn more from the experience. Almost all of the products go to their consumers; only rarely will a product be killed within the analyst’s office.

Comments

Good leadership demands high quality products to inform warfighters and policymakers. Leaders need analysis to answer tough questions. At the same time, managers need to keep in mind important questions about the analysis process itself:

  • Are military leaders able to do their job better because of their analyst’s product?
  • Are the benefits of doing the analysis outweighing the costs?
  • Are we doing risk analysis? That is, are we attempting to understand the balance between threat and vulnerabilities by examining local trade-offs?

Good leadership ensures a value-added coordination process. Analysts often find the coordination process tough. One subject said that before the first draft, the analyst’s job was the best job; after the first draft, however, the analyst’s job was difficult. Another subject called into question the coordination process because of the low quality of finished products posted in the database. Another subject said the coordination process is a humbling experience for the analyst because it is tough to get one’s paper back marked up with corrections. One subject said a co-worker tested the system: he worked really hard on one project and sent it forward; the next time, he slapped something together and sent it forward. He noticed that it made no difference either way. He was devastated; there was no motivation to excel. From these testimonials, one could conclude that unsystematized (non-standard) coordination may hinder the work of analysts.

Good leadership theoretically removes such barriers to effective use of analytical resources by providing leading-edge administrative and computer support; quick-turnaround graphics support for urgent tasks; and hassle-free travel support. One subject compared DIA administrative support unfavorably with what he grew to expect when he worked for the Central Intelligence Agency. He could drop something off at 5 PM for graphics support and pick it up when he got to work in the morning. When he traveled, his proactive travel office would remove some of the hassles associated with traveling. He would tell someone in the travel office where he needed to go. They would call him back a short time later with all the information he needed to know about the trip. Then, when he got back from his trip, he told them what he did and they reimbursed him. There were no required travel reports and no receipts. In fact, they told him to get rid of all his receipts: in case he was caught, it would be better for no one else to know where he had been!

Summary

Leadership is a critical aspect in determining the success of the intelligence analysis process. Leadership demands high-quality products, ensures a value-added coordination process, removes technical or administrative barriers, and intervenes when needed. Some leaders assign tasks directly to the analysts. Others rely on the Senior Intelligence Officer (SIO) to assign tasks or rely on the initiative of the analysts themselves to seek out the right problems and analyze them. Leadership tends to provide little or no interaction during the analysis process except when the analyst seeks it; analysts do not like to be micro-managed. In a crisis situation, however, the leader takes a more direct and hands-on approach. Although feedback varied from office to office, ideally the analyst should seek short, regular, informal feedback from the office leader.

STRATEGIC PLANNING

Baldrige criteria compel an examination of how an analysis office prepares for future challenges, records its institutional history, and uses academic or professional sources. Within the category of future challenges, a distinction may be made between capability and issue challenges. Future capability challenges deal with people and process issues. Future issue challenges, on the other hand, are country- or target-dependent; these challenges deal specifically with the emerging threat to U.S. interests. Although none of the offices has a written, formal strategic plan, leaders in most claimed to have thought about strategic planning issues and to be working toward a future vision. Their daily mission, though, was to “get the right information to the right people at the right time.”

Future Challenges

Capability Challenges. Organizational capability centers on people and processes. Subjects who indicated that they had thought about the future in strategic planning terms recognized that the commitment of their people to an organizational ideal is a necessary ingredient for success. In practice, then, organizational capabilities must support strategic planning. Analysis organizations need to consider their staff capabilities and ensure that staffs have the necessary knowledge, skills, and tools for success. Our study subjects generally favored a team approach for intelligence analysis. Leaders wanted analysts to consult with experts and coordinate with stakeholders before sending the final product to consumers.

Since none of the subjects claimed to use a formal strategic planning process, our study team could not discuss the process with them in detail. A generic strategic planning process, however, should include (1) doing advanced planning, (2) performing an environmental scan based on external and internal information, (3) setting a strategic direction using the vision statement, (4) translating the strategic direction into action via an implementation plan, and (5) making a performance evaluation. A performance evaluation or assessment should include (1) defining goals, (2) describing success, (3) setting targets, (4) measuring performance, and (5) adjusting goals based on performance. The DIA has a formal process to do strategic planning, but it was not clear to the study team how the organizational strategic planning process might be linked to the lower levels of the organization.
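To make the generic cycle concrete, the performance-evaluation step (defining goals, setting targets, measuring, adjusting) can be sketched as a small data structure and comparison loop. This is a minimal illustration only; the goal names, targets, and numbers are invented and are not drawn from any DIA process.

```python
from dataclasses import dataclass

@dataclass
class Goal:
    """One strategic goal with a measurable target (names and numbers invented)."""
    name: str
    target: float      # desired level of the indicator
    measured: float    # most recently measured performance

def evaluate(goals):
    """Step 5 of the generic cycle: compare measured performance to targets
    and flag goals whose targets or resourcing may need adjustment."""
    for g in goals:
        status = "met" if g.measured >= g.target else "not met -- adjust goal or resources"
        print(f"{g.name}: target {g.target}, measured {g.measured} -> {status}")

evaluate([Goal("consumer feedback responses per quarter", target=20, measured=12),
          Goal("products coordinated outside the team (%)", target=75, measured=80)])
```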

Institutional History

Most subjects said they did not have a written institutional history. There exist no formal analyses of their office’s strengths or weaknesses. The office does rely, then, on retaining experienced individuals for its corporate memory.6 Some offices maintain briefings about their office’s operations as a surrogate for a written institutional history. Anyone assigned to an analytic unit is expected to set about building connections and learning operations and systems. Only one subject said his office had formed a history office, beginning in the late 1990s, to record significant events. Even this admirable initiative, however, with its focus on events rather than process itself, would fail to capture the nuances of truly useful and recoverable institutional history.7

Academic or Professional Sources

In the interviews, office leaders tended to support academic training and attendance at professional conferences and conventions as experiences useful to the development of professional expertise. However, no guides or plans for such professionalization opportunities exist at the office level.

“Factions,” an analytical tool developed at CIA for systematic analysis of political instability, is a good example of research and development (R&D) work done in collaboration with members of the academic community.10 In the 1980s there was no particular pressure or encouragement across the Community to use structured analytic methods. Analysts gathered the facts, looked at them in their uniquely “logical” way, and came to some conclusions. Some DIA analysts use structured techniques to do intelligence analysis. The subjects we spoke to, however, described their analytical process as gathering facts, examining them logically and drawing some conclusions. More emphasis on structured techniques at DIA may be appropriate.

Summary

Although none of our subjects had a written, formal strategic plan, most of them had thought about strategic planning issues and were working toward a future common vision among the personnel in their office. Office leadership is often engaged with senior leadership about future issues. Managers know the direction they want to go in and recognize the need to support a strategic plan. Furthermore, a focus on important future issues, such as which country is most problematic, clearly makes an analysis shop proactive and saves time in crisis situations.

Theoretically, the strategic management process includes (1) advanced planning; (2) an environmental scan that includes a review of internal and external information; (3) setting a strategic direction; (4) translating the strategic direction into action; and (5) evaluating performance. Consumers, stakeholders, and subjects influence the process by stating their requirements, receiving products and services, and providing feedback. Information and analysis influence each step of the process by providing management with important information, data, and feedback.

Most of our subjects relied on the corporate memory of senior analysts instead of maintaining a recorded institutional history. As senior analysts approach retirement age and leave government service, analysis offices will lose a great deal of expertise. Some offices maintained a written office history or had on file significant briefings about their office’s operations. All of our subjects used academic or professional sources to develop their capabilities as needed. They also knew what commercial tools were available to help them to do more for their consumers.

CONSUMER FOCUS

The Baldrige criteria applicable to a consumer focus suit the DIA case very well. As an information organization, this agency is obligated, through its analysis units, to discern consumers’ requirements. A key question for this study is whether relationships with consumers tend to be ad hoc or the result of a deliberate strategy.

Consumer Requirements

Consumers make their requirements known through a formal production database, by telephone, by e-mail, through individual initiatives, and through personal visits. They make their initial requirements known primarily on an ad hoc basis. The most meaningful requirements are informal because the analyst gets to talk directly to the consumer. Subsequently, the analyst can help the consumer refine the formal request in the Community’s intelligence production database. Some consumers telephone analysts with requests that require an answer in two hours or less, which can make the analyst’s job very stressful. Consumers use e-mails, on the other hand, as a secondary contact medium and mostly for follow-up activities. On occasion, a consumer will come to the analyst’s office to maximize the opportunity to gain a very specific pre-deployment, contextual understanding of target information. One subject noted that although most analytic tasks are management-driven, there are also many self-generated products. The latter could have a positive effect, especially in promoting analyst self-esteem, so long as the self-generated products are of interest to the entire Community.

Consumer and Analyst Interaction

The different frames of reference that one might expect to characterize any interaction between analysts and consumers are on vivid display in anecdotes related by most interview subjects. Much of the time, consumers appear to shy away from direct interaction with an analyst. On occasion, the consumer’s rank is too far above that of the analyst to allow the analyst to approach the consumer directly, especially in an organization like DIA, often still characterized by a culture that finds meaning in, and derives behavioral clues from, a hierarchy of authority. Occasionally, however, informal interaction does occur, to include face-to-face conversations or telephone calls to enhance clarity of requirements. One subject noted that his contact with consumers is mainly personal and direct. This analyst works with consumers to build training products that look as much as possible like the real world. Toward that end, an analyst might attend survival school as a student, rather than as a VIP, in order to absorb and then re-create more realistically some part of that training environment.

Production tasking usually passes through bureaucratic levels at both the producer and consumer ends, which reinforces pre-existing tendencies keeping analysts and consumers away from each other. Suddenly, an answer to a formal or informal request simply appears. A general absence of feedback from consumers characterizes the intelligence production process in the offices included in this study. Any interaction with consumers that does occur happens on an ad hoc basis.

Because most products are published electronically on systems such as Intelink, it is possible for the agency to track in considerable detail the organizations and even particular analysts that access and explore particular documents. In practice, however, this is rarely done at the level of detail that allows an analyst to be certain who the consumer really may be. Even if one accepts the concept that, within the Department of Defense, information becomes actionable intelligence mainly through the briefing process, cognitive processes ultimately depend on purposeful information-seeking activity, which can in principle be tracked in enough detail to tell intelligence producers exactly who their true consumers are.
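A minimal sketch of the kind of tracking this paragraph describes might aggregate access-log records by consuming organization. The log format, document identifiers, and organization names below are hypothetical; real Intelink instrumentation is not described in this study.

```python
from collections import Counter

# Hypothetical access-log records: (document_id, accessing_organization).
access_log = [
    ("product-042", "J-2"),
    ("product-042", "CENTCOM J-2"),
    ("product-042", "J-2"),
    ("product-107", "State/INR"),
]

def consumers_of(doc_id, log):
    """Count accesses to one document, grouped by organization --
    an approximation of 'who the consumer really may be.'"""
    return Counter(org for doc, org in log if doc == doc_id)

print(consumers_of("product-042", access_log))
# Counter({'J-2': 2, 'CENTCOM J-2': 1})
```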

Feedback

Analysts provide finished products to their consumers in a variety of ways: through the formal production requirements management database, in hard copy, posted to a website, or by email. Sometimes, depending on the size of the electronic file, analysts post a product on a “hidden” URL. Then, they call a known consumer to let them know where they have posted the product. Once the consumer has the product, the analyst can delete it from the system. Feedback appears in many forms. In one office, informal feedback predominates through personal contact with the consumers. Another office obtains consumer feedback either via the office web site’s “feedback” button, or via a verbal response from the consumer that the product met or did not meet the need. Hard copies automatically include a publication office’s formal feedback form, but not all products get this form. One subject said a team in their division uses a questionnaire to find out if their consumers liked their product. One office maintains a sizable travel budget so analysts may travel to their consumer’s office regularly.

Marketing a product can attain great importance. An analyst typically does not want his or her finished product to sit on the table in the J-2’s office; the usual desire is for the information to reach much lower levels, where the material remains actionable information, which may only later become actionable intelligence. Analysts accomplish this by using different means of communication: the analyst who writes the product can market it by mailing hard copies, by telling people where to find it on Intelink, or by responding to specific requests with email attachments or a personal phone call.

Summary

Building relationships with consumers tends to be more ad hoc than the result of a deliberate strategy. Although consumer feedback varies from office to office, most of the subjects said that “working-level” consumers support their intelligence analysis process. Some got feedback from an office web site.

INFORMATION AND ANALYSIS

Information and analysis criteria address the question of how well DIA offices are meeting challenges in this core functional area, and reveal whether a meaningful approach is being taken by the organization for measuring product quality. Representatives of most of the analysis offices indicated that they are meeting the requirements embodied in these criteria. Informal means for measuring analysis performance included counting products, meeting production schedules, regularly updating databases, answering key consumer-oriented questions, and gauging performance based on “perception.”

Meeting Current Analytic Challenges

Specific indicators of success that were quickly asserted by interview subjects included the following:

  1. Positive comments or lack of negative comments from consumers;
  2. Evidence of consumer use of intelligence products;
  3. Knowledge of the number of terrorist attacks averted;
  4. A downward trend in the number of requests for information (RFI);
  5. Evidence of improvement of analysts’ capabilities based on accumulated experience.

Not surprisingly, a tendency exists for a dissatisfied consumer to be more vocal. A lack of negative comments from consumers is therefore considered a positive indicator of success. Consumers never want DIA to stop sending them daily intelligence products. Even if routine DIA assessments are only placeholders, the consumer still expresses a demand for the products because one article could change everything.

Measurement of actionability is never easy because often no recordable action is required, and the producer faces the difficult task of determining whether the product has simply influenced the way a consumer views an important issue. No one from among the offices under study here claims to have an answer to this difficult issue.

A different type of indicator of success is the level of improvement in analyst savvy, given the accumulation of experience. An office’s SIO is in a good position to determine whether the analyst is making improvement. Comments from the SIOs contacted in the course of this study indicate that they examine the analyst’s work with a critical eye to determine whether the analyst is growing in the job. If the analyst is in fact getting better at doing analysis, then the SIO rewards the analyst with more work and more challenging jobs, as well as with an appropriate performance review. If, however, the analyst does not show improvement in the current task, then the SIO, working with the analyst and manager, will attempt to diagnose the problem and provide appropriate training to improve proficiency in doing analysis.

A Standard Approach to Measuring Quality

No subject claimed to have plans to implement any kind of standard approach to measure analytic quality. One subject said he was not opposed to some standard for measuring quality; he has not, however, seen a useful measure for it. In another office, analysts “just publish in order to keep their database filled.” Little or no attention is given to quality: either the product meets the requirement or it doesn’t.

Comments

All of the subjects said they had no formal performance measurements, though they did employ informal means. To explain the absence of formal measures, one subject said intelligence analysis is purely subjective: there are no numbers to count. If analysts meet their production schedule, then they are considered to be successful. If analysts get all of their work done, then they have met the minimum success requirement.

Managers of intelligence analysis are ambivalent about managing for results. Though interested in demonstrating success, they are uneasy about being held accountable for poor organizational performance. There are severe problems in measuring performance in practice: (1) problems in getting agreement on what good performance means for intelligence analysis; (2) intelligence analysis goals are unreasonable compared with available resources; (3) key performance indicators are unavailable or too costly to collect; and (4) management is unwilling to make necessary changes to improve intelligence analysis.

Office leaders appear to be most interested in how many products the analyst finished, even though this crude measure is not supplemented with any method for independently weighting the relative value of those products.

One subject noted that his office does track certain indicators of quality, such as whether the consumer liked the product. A subsidiary indicator lies in evidence of whether the consumer responded to the product. Did anyone else want a copy of the product?

One subject prefers using the term “gauge” instead of “measure” because “measure” assumes some sort of reasoned technique. Often, there is a bit of a “gut check” in any office with respect to how well managers perceive their office to be meeting its mission. Most of the measurement comes in the form of ad hoc consumer feedback. A well-executed mission, like a successful noncombatant evacuation operation (NEO) for instance, tells the office that they are doing a good job.

Summary

Informal means for measuring analysis performance included counting products, meeting production schedules, updating databases, answering key consumer-oriented questions, and gauging performance based on perception. Two common ways for analysts to know if they are meeting their consumer’s needs are a lack of negative feedback and an increase in product use. Our subjects do not keep track of forecast accuracy because it is too difficult to track; instead, analysts individually keep a mental checklist of any forecasts they make and how the situation developed over time. Although none of our subjects planned to implement any standard approach to measure analytic quality in the near future, they continue to examine ways to do analysis more effectively without destroying their organization’s “can do” ethos.

PEOPLE AND RESOURCES

The people and resources category offers criteria to examine the knowledge, skills, and experiences of the analysts and the availability of analysts’ tools. Leaders can improve the quality of a product by promoting greater expertise among the intelligence analysts and by inducing analysts to take advantage of information found in open sources. Study subjects said they need analysts who think across interdisciplinary boundaries and they prefer analysts who have military operations experience.

Knowledge, Skills, and Experience

According to representatives of the five offices in DIA, analysts bring with them or develop certain skills to help them perform their analysis, and at the same time they must develop an awareness of how certain analytic pitfalls may influence their work. The analytic skills include: (1) technical or topical expertise; (2) knowledge of the target, sources, and analytic techniques; (3) ability to search and organize data; (4) ability to synthesize data or use inductive reasoning; and (5) ability to express ideas. The analytic pitfalls include: (1) personal biases or inexperience; (2) lack of data or access to data; (3) asking the wrong question;14 (4) misunderstanding data; (5) flaws in logic; (6) no review or evaluation process; (7) denial and deception by the target; (8) politicization or non-objectivity; (9) groupthink; and (10) mirror imaging.

One problem frequently voiced is that new hires top out at the GS-13 level within four years; they must then move out of the office to continue advancing in their careers. A consensus exists among office managers that any office struggles to maintain its analysis capability as people move to better jobs for promotion. Having people who are career intelligence officers, especially if they also possess a military operations background, is commonly cited as a key to a quality staff.

PROCESS MANAGEMENT

Process management criteria apply to information handling, intelligence analysis, and collaboration. Processes associated with the dissemination of information and intelligence are beyond the scope of this study, but in this area, the advent of online publication and the loosening of “need to know” rules governing access to sensitive information have relieved some classic dissemination bottlenecks.15 Good analysts are commonly characterized by DIA officials as “focused freethinkers.” Other commonly acknowledged maxims are that the analysis process is a highly competitive process, and that analysts rush to be first to publish vital intelligence findings. At the same time, analysts must work together by collaborating with other analysts to ensure accuracy.

None of our subjects said they observed a formal or structured process in doing intelligence analysis. One subject summed up this approach as simply “sitting down and going to work.” A second subject claimed to use an informal, semi-structured research process: the analyst first has an idea related to his or her “account,” discusses with the team chief the problem of interest, and, if given permission, does the research. Another subject added that, likewise, he does not have a “hard-wired” approach, yet applies a routine to the process. One subject said his office has a process that remains only in the mind of each of the office’s analysts. It is, in the estimation of this subject, an application of the scientific method or at least a disciplined thought process. In this scheme, managers give the analyst a problem, and the analyst does research to determine what is in hand and what is needed. If there is time, he or she will request collection of certain data. The analyst reviews the data and answers the question.

Information Handling

Analysts do their own fieldwork to gather any information beyond that from routinely available, written sources. One SIO emphasized that his analysts “gather” information, they do not “collect” it. For example, they do not have diplomatic immunity when they are visiting other countries; they must get specific permission from the country to be there.

Analysts report that they try to lay down the steps of their analysis in a logical way. For example, they use several paths to move from general information to more specific information. Some analysts write their papers as if they were doing research in a graduate school. Others write products that look like newspaper articles. Analysts tend to protect their sources in a manner similar to the way newspaper reporters protect the identity of their sources.

Intelligence Analysis Process

The analysis process includes (1) problem identification; (2) data gathering; (3) analysis, synthesis, collaboration, and examination of alternative propositions using all sources of data; and (4) direct support to consumers with briefings and publications. In the end, analysts make an assessment of what the data probably mean or what the data could mean. General Colin Powell (USA, Ret.) sums up his expectations as a consumer of intelligence in this way: “Tell me what you know, what you don’t know, and what you think—in that order.” In this oft-quoted dictum, he suggests that analysts can best convey what is known about the problem, evaluate the completeness of their knowledge, and interpret its meaning. It remains unclear whether the same guidance is as suitable for written products as for oral briefings. In the course of analysis, whether presented in written or oral format, it is important for the analyst to answer for himself or herself certain key questions (a brief sketch following the list shows one way to order the resulting product):

  1. What do you know about the issue?
  2. What is a logical, sequential way to present the facts?
  3. What conclusions or implications can be drawn from these facts?
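One way to keep a written product in Powell’s order is to hold the three sections apart until the product is rendered. The sketch below is a hypothetical template for that ordering, not an agency format; all field names and sample content are invented.

```python
from dataclasses import dataclass, field

@dataclass
class Assessment:
    """Hypothetical product skeleton ordered per Powell's dictum:
    what you know, what you don't know, what you think."""
    known: list = field(default_factory=list)
    unknown: list = field(default_factory=list)
    judgments: list = field(default_factory=list)

    def render(self):
        parts = [("WHAT WE KNOW", self.known),
                 ("WHAT WE DON'T KNOW", self.unknown),
                 ("WHAT WE THINK", self.judgments)]
        return "\n".join(title + ":\n" + "\n".join("  - " + i for i in items)
                         for title, items in parts)

print(Assessment(known=["Port activity increased in March"],
                 unknown=["Purpose of the new pier"],
                 judgments=["Expansion likely supports larger vessels"]).render())
```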

Analysts generally find it distasteful to coordinate their products with members outside of their team. Nonetheless, when the “corporate” analysis process works as it can, all of the teams contribute to the analyst’s final product and, as a result, the analyst produces a superior product. Managers do want analysts to exchange information with each other. They have a distaste for competition among analysts. The analysts’ rationale for resisting widespread coordination is that they consider themselves subject matter experts.

Collaboration and Competition

Collaboration. If analysts choose not to collaborate with other Community experts, then the SIO may force them to do so during the coordination and review process. Managers want analysts to verify the validity of all of their information in a manner similar to the way that they feel compelled to check and verify human-source information.

Competition. Collaboration among intelligence analysts is always a problem because analysts, like many school children, tend to be competitive with each other. The acknowledged lack of coordination within DIA is not a new or exceptional phenomenon. New analysts may not know whom to talk to on a particular subject or where to go to get useful information. A more senior mentor or manager can usually provide the analysts with that kind of information, but the exchange of information between analysts doesn’t normally happen in the unsupervised work environment. Analysts are very competitive within the office and especially with other agencies in the IC. There is a strong desire to publish first, a tendency that inhibits full disclosure to competing analysts.

Summary

Good analysts are “focused freethinkers.” Nevertheless, “out-of-the-box”-type thinking needs to be tempered by using an informal, disciplined thought process and by collaborating with other analysts. The scientific method is a streamlined process that ensures the analyst doesn’t wander too far.

Definition of Success

All of our subjects agreed that success is a difficult concept to define. Our subjects did, however, agree that defining it might be possible. First, success is positive feedback from consumers. If consumers do not provide explicit positive feedback, then the analysis shop can look for indirect indicators.

Second, an intelligence analysis success occurs when a unit that is known to depend on DIA products completes its military operation successfully.

Third, success is obtaining an expanding budget and a growing workforce. These developments indicate that the topic focus is perceived as interesting by agency leadership. One subject labeled this type of success as “bureaucratic success.”

Fourth, success to some analysts means that evidence indicates that they are able to look at things in the open sources and link them in ways others could not envision. The analyst would observe a situation, notice a trend, and take a position on a tough issue.

One possible way to measure success is to count the number of electronic hits on a particular document in a database. If consumers are asking questions, then analysts can feel they are being helpful. An informal summary metric, then, is the flow of information out of the office.
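As a rough illustration, this informal metric could be computed by summing outbound hits and inbound questions over a period. All numbers and field names here are invented for the sketch.

```python
# Invented monthly tallies for one analysis office.
monthly = {"document_hits": 340, "consumer_questions": 22, "products_pulled": 15}

def information_flow(stats):
    """Informal summary metric: total information flow into and out of the office."""
    return sum(stats.values())

print(information_flow(monthly))  # 377 -- a gauge, not a reasoned measure
```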

Summary

Intelligence analysis supports DoD organizations and warfighter operations. Mission success is positive feedback from consumers or a successful completion of military operations. Bureaucratic success, on the other hand, results in more money to spend and more analysts to do the job. Analysts consider themselves successful if their consumers change their behavior based on the information provided to them.

STRATEGIES FOR SUCCESSFUL INTELLIGENCE ANALYSIS

“What is good intelligence analysis?”

Good or useful intelligence analysis may be best defined by its opposite. Bad intelligence does not specify what the threat is or how it will manifest itself. For example, an analyst may conclude there is a 50 percent chance of a chemical weapons attack against the U.S. in the next 10 years; that conclusion is not meaningful or helpful to planners and users of intelligence.17 In another example, an analyst may conclude that all ports are dangerous to U.S. ships; this analysis is impractical because ships must use ports occasionally.

In government work, the applied nature of intelligence production would seem to offer the opportunity to develop and apply consistent measures of productivity and quality, precisely because the products are used by specific consumers. If this is true, then there is room to bring “operations research” into play, as well as the Baldrige criteria, to encourage the development and collection of carefully codified, surrogate measures of productivity and quality. Successful use of these tools does not remove from the manager the task of decisionmaking, but rather requires of him or her different kinds of decisions. In other words, using operations research tools provides managers with extra insight into their particular subject and hence leads them into more difficult but fruitful areas. In the author’s opinion, consultants with scholarly backgrounds could come into the intelligence analysis work environment and, using such tools, make vast improvements to the intelligence analysis capability.

CORE COMPETENCIES FOR INTELLIGENCE ANALYSIS AT THE NATIONAL SECURITY AGENCY

David T. Moore and Lisa Krizan

Seekers of Wisdom first need sound intelligence.

— Heraclitus1

What makes an intelligence analyst successful in the profession? This question strikes at the heart of the National Foreign Intelligence Community’s mission to provide actionable information to national leaders and decisionmakers.

What is a qualified intelligence analyst?

In this paper the authors propose a set of functional core competencies for intelligence analysis, shown in the figure below, which provides a starting point for answering fundamental questions about the nature of ideal intelligence professionals, and how analysts who share these ideals can go about doing their work. Keeping in mind the complex nature of the threats to U.S. national security, we argue that the strategy for deploying intelligence analysts and for carrying out intelligence production must become more rigorous to keep pace with 21st Century foes, and to defeat them.

Functional core competencies for intelligence analysis

The authors began exploring the art and science of intelligence analysis at their agency as part of a corporate initiative to add rigor to its analytic practice.

Sherman Kent, who helped shape the national peacetime intelligence community, argues that intelligence requires its own literature. According to Kent, a key purpose of this literature is to advance the discipline of intelligence. Kent believed “[as] long as this discipline lacks a literature, its method, its vocabulary, its body of doctrine, and even its fundamental theory run the risk of never reaching full maturity.”6 Through the publication of articles on analysis and subsequent discussion, “original synthesis of all that has gone before” occurs.7 In keeping with Kent’s mandate to develop an intelligence literature that provokes discussion and further methodological development, we seek comment and further discussion among scholars of intelligence studies.

DEFINITIONS AND CONTEXT

Intelligence refers to information that meets the stated or understood needs of policymakers…. All intelligence is information; not all information is intelligence.

— Mark Lowenthal8

Intelligence is timely, actionable information that helps policymakers, decisionmakers, and military leaders perform their national security functions. The intelligence business itself depends on professional competencies, what John Gannon, former Chairman of the National Intelligence Council, refers to as “skills and expertise.” He notes that “this means people—people in whom we will need to invest more to deal with the array of complex challenges we face over the next generation.”

Ultimately, analysis leads to synthesis and effective persuasion, or, less pointedly, estimation.10 It does so by breaking down large problems into a number of smaller ones, involving “close examination of related items of information to determine the extent to which they confirm, supplement, or contradict each other and thus to establish probabilities and relationships.”11

Since the advent of the Information Age, “[collecting] information is less of a problem and verifying is more of one.”12 Thus the role of analysis becomes more vital as the supply of information available to consumers from every type of source, proven and unproven, multiplies exponentially. Intelligence analysts are more than merely another information source, more than collectors and couriers of information to consumers. Further,

[the] images that are sometimes evoked of policymakers surfing the Net themselves, in direct touch with their own information sources, are very misleading. Most of the time, as [policymakers’] access to information multiplies, their need for processing, if not analysis, will go up. If collection is easier, selection will be harder.13

At its best, the results of intelligence analysis provide just the right information permitting national leaders “to make wise decisions—all presented with accuracy, timeliness, and clarity.”14 The intelligence provided must “contain hard-hitting, focused analysis relevant to current policy issues…. Therefore, analysis of raw information has the most impact on the decisionmaker and [therefore] producing high-quality analytical product should be the highest priority for intelligence agencies.”

Treverton adds that intelligence must anticipate the needs of policy. “By the time policy knows what it needs to know, it is usually too late for intelligence to respond by developing new sources or cranking up its analytic capacity.”

A former policymaker himself, he asserts that intelligence is useful to policy at three stages during the life of an issue:

  • If the policymakers are prescient, when the issue is just beginning; however, there is likely to be little intelligence on the issue at that point.
  • When the issue is “ripe for decision.” Here policymakers want intelligence that permits alternatives to be considered; however, intelligence often is only able to provide background information necessary for understanding the issue.
  • When the policymakers have made up their minds on the issue, but only if intelligence supports their view. They will be uninterested or even hostile when it does not support their view.21

These limitations notwithstanding, Treverton suggests that policymakers can and should establish a symbiotic relationship with the intelligence analysts who advise them:

[If] you call them in, face to face, they will understand how much you know, and you’ll have a chance to calibrate them. You’ll learn more in fifteen minutes than you’d have imagined. And you’ll also begin to target those analysts to your concerns and your sense of the issue.22

Similarly, the analyst has responsibilities to the policymaker. In commenting on this relationship, Sherman Kent asserts

[intelligence] performs a service function. Its job is to see that the doers are generally well informed; its job is to stand behind them with a book opened at the right page to call their attention to the stubborn fact they may be neglecting, and—at their request—to analyze alternative courses without indicating choice.23

In Kent’s view, the intelligence analyst is required to ensure, tenaciously, that policymakers view those “right” pages, even when they may not wish to do so.

MEASURING SUCCESS IN INTELLIGENCE ANALYSIS

Intelligence must be measured to be valued, so let us take the initiative and ask our management, [and] the users, to evaluate us and our products.

— Jan P. Herring24

Any observer can expect that a successful intelligence analyst will have certain personal characteristics that tend to foster dedication to the work and quality of results.

Intelligence Process

Successful intelligence analysis is a holistic process involving both “art” and “science.” Intuitive abilities, inherent aptitudes, rigorously applied skills, and acquired knowledge together enable analysts to work problems in a multidimensional manner, thereby avoiding the pitfalls of both scientism and adventurism. The former occurs when scientific methodology is excessively relied upon to reveal the “truth”; the latter occurs when “inspiration [is] unsupported by rigorous analysis.”26

A vital contributor to the analytic process is a spirit of competition, both within an intelligence-producing agency and especially between intelligence agencies. There is a tendency for analysts working together to develop a common mindset. This trap typically occurs when analysts fail to question their assumptions about their role in the intelligence process and about the target. The Council on Foreign Relations’ independent task force on the future of U.S. intelligence recommends that “competitive or redundant analysis be encouraged” precisely for these reasons.27

Successful analysis adds value—to the information itself, to institutional knowledge, to fellow intelligence professionals, to the process, and to the institution or unit itself—in terms of reputation and the degree to which good analytic practices endure despite changes in target, consumer, and personnel. Successful analysts are those whose work, whenever possible, goes to the level of making judgments or estimating.

What role does management play in ensuring analytic success? First and foremost, management effectively uses financial and political capital to ensure that analysts have access to consumers, and to the resources they require to answer those consumers’ intelligence needs. This includes the organization of the work itself, allocation of materiel and personnel, and coordination with consumers and other producers. When management is successful, the analyst has the necessary tools and the correct information for successful intelligence analysis. Good morale among analytic personnel becomes an indicator of effective management. A good understanding of the unit’s mission and the analyst’s own satisfaction with his or her performance naturally produce a feeling of empowerment and a belief that the organization places great value on analytic talent.

Intelligence Product

The products of successful analysis convey intelligence that meets or anticipates the consumer’s needs; these products reveal analytic conclusions, not the methods used to derive them. Intelligence products are successful if they arm the decisionmaker, policymaker or military leader with the information and context—the answers—needed to win on his or her playing field.

Readiness: Intelligence systems must be responsive to existing and contingent intelligence requirements of consumers at all levels.

Timeliness: Intelligence must be delivered while the content is still actionable under the consumer’s circumstances.

Accuracy: All sources and data must be evaluated for the possibility of technical error, misperception, and hostile efforts to mislead.

Objectivity: All judgments must be evaluated for the possibility of deliberate distortions and manipulations due to self-interest.

Usability: All intelligence output must be in a form that facilitates ready comprehension and immediate application. Intelligence products must be compatible with the consumer’s capabilities for receiving, manipulating, protecting, and storing the product.

Relevance: Information must be selected and organized for its applicability to a consumer’s requirements, with potential consequences and significance of the information made explicit to the consumer’s circumstances.

Measures of success for intelligence products28

Six “underlying ideas or core values” for intelligence analysis, identified by William Brei for operational-level intelligence, and shown in the figure above, establish the analyst’s “essential work processes.”29 Since they are defined in terms of the consumer, they also can be used as a checklist to rate the quality of products provided to the consumer.

William S. Brei, Captain, USAF, Getting Intelligence Right: The Power of Logical Procedure, Occasional Paper Number Two (Washington DC: Joint Military Intelligence College, 1996), 6.

  • Was the reported intelligence accurate? (Accuracy)
  • Are there any distortions in the reported judgments? (Objectivity)
  • Is the reported intelligence actionable? Does it facilitate ready comprehension? (Usability)
  • Does it support the consumer’s mission? Is it applicable to the consumer’s requirements? Has its significance been made explicit? (Relevance)

Brei asserts that accurate data provide the foundation for subsequent objective judgments, and the expression of objective judgments in a usable form provides much of the basis of a relevant product. Thus, unverified data can not only cost an intelligence product its Accuracy, but also damage its Relevance to the customer.32

Brei’s principles provide a means for evaluating a given intelligence product based on the meaning it conveys and the value of that intelligence to the consumer. His approach, when combined with an “insider’s” view of the intelligence production process, analytic methods, and personnel management practices, makes a comprehensive evaluation of intelligence analysis appear possible.
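Because Brei’s six values are defined in terms of the consumer, they can be turned directly into a checklist. The sketch below scores a product as the fraction of criteria met; the yes/no scoring scheme is our own simplification for illustration, not Brei’s.

```python
from dataclasses import dataclass

@dataclass
class BreiChecklist:
    """Brei's six core values as a yes/no product checklist (scoring invented)."""
    readiness: bool
    timeliness: bool
    accuracy: bool
    objectivity: bool
    usability: bool
    relevance: bool

    def score(self):
        marks = list(vars(self).values())
        return sum(marks) / len(marks)

product = BreiChecklist(readiness=True, timeliness=True, accuracy=True,
                        objectivity=True, usability=False, relevance=True)
print(f"{product.score():.0%} of criteria met")  # 83% of criteria met
```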

CHARACTERISTICS OF SUCCESSFUL INTELLIGENCE ANALYSTS

A sophisticated intelligence analyst is one who is steeped in the history and culture of a region, has lifelong interest in the area, and approaches the study of the region as a professional responsibility, and probably as an avocation as well.

— Ronald D. Garst and Max L. Gross34

Who are the most successful intelligence analysts? What makes them successful? In setting forth the functional core competencies for successful intelligence analysis we observe there are characteristics which, while not necessary for successful intelligence analysis per se, do seem to be associated with analysts considered to be the most successful at their trade.35

Probably the most indispensable characteristics of successful intelligence analysts are high self-motivation and insatiable curiosity. Analysts want to know everything they can about the objects under their scrutiny. Reading and observing voraciously, they ferret out every possible piece of information and demonstrate a sense of wonder about what they discover. As new fragments appear, novel connections are discovered between the new and older information as a result of intense concentration leading to epiphanous moments of “aha” thinking. The most successful analysts tend to enjoy their work—“It’s play, not work.” Indeed, they often will stay late at the office to pursue a thorny problem or an engaging line of reasoning.

Employee orientation programs that acknowledge these characteristics may be most successful in initiating new employees into the analytic culture. When personal characteristics are embodied in compelling “war stories” told by mentors and peers, they can reinforce the cultural values of the agency, building corporate loyalty by reinforcing the sense of membership.

ABILITIES REQUIRED FOR INTELLIGENCE ANALYSIS

The competent intelligence analyst must have a unique combination of talents.

— Ronald D. Garst and Max L. Gross37

Abilities arise from aptitudes that can develop from an individual’s innate, natural characteristics or talents. Although aptitudes may largely be determined by a person’s genetic background, they may also be enhanced through training.38

Communicating

Teaming and Collaboration

Teaming and collaboration abilities enhance intelligence analysis, since the analyst’s relationship with consumers, peers, subordinates, and supervisors shapes the intelligence production process. Formalized means of enhancing all these abilities can lead intelligence professionals to considerably greater effectiveness as analysts and leaders of analysts. This is why the Director of Central Intelligence has indicated that collaboration is a cornerstone of strategic intelligence.41 A collaborative environment also minimizes the likelihood of intelligence failures.

We identify four distinct teaming abilities, to show the complexity of the concept. Typically, formal training programs address leadership abilities only in the context of the management function; here, we focus on the analysis process itself.

Influencing: Those with this ability effectively and positively influence superiors, peers, and subordinates in intelligence work. Analysts often need to persuade others that their methods and conclusions are valid, and they often need to leverage additional resources. The ability to influence determines the level of success they will have in these areas.

Leading: Those who are more senior, more skilled, and more successful in intelligence analysis have an obligation to lead, that is, to direct others and serve as role models. The ability to lead involves working with and through others to produce desired business outcomes. Thus, developing leadership abilities enhances the field of intelligence analysis.

Following: Almost every grouping of humans has a leader; everyone else is a follower. Analysts must enhance their abilities to work within a team, taking direction and acting on it.

Synergizing: Drawing on the other three teaming abilities, players in the intelligence process cooperate to achieve a common goal, the value of which is greater than they could achieve when working alone.

Thinking

As our species designation—sapiens—suggests, the defining attribute of human beings is an unparalleled cognitive ability. We think differently from all other creatures on earth, and we can share those thoughts with one another in ways that no other species even approaches.

— Terence W. Deacon, The Symbolic Species.43

Intelligence analysis is primarily a thinking process; it depends upon cognitive functions that evolved in humans long before the appearance of language.44 The personal characteristics of intelligence analysts are manifested in behaviors that reflect thinking and/or the inherent drive to think. Our national survival may depend on having better developed thinking abilities than our opponents.

Information Ordering: This ability involves following previously defined rules or sets of rules to arrange data in a meaningful order. In the context of intelligence analysis, this ability allows people, often with the assistance of technology, to arrange information in ways that permit analysis, synthesis, and extraction of meaning. The arrangement of information according to certain learned rules leads the analyst to make conclusions and disseminate them as intelligence. A danger arises, however, in that such ordering is inherently limiting—the analyst may not look for alternative explanations because the known rules lead to a ready conclusion.
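A trivial sketch of rule-based information ordering: hypothetical reports arranged by an invented rule (newest first, ties broken by a source-reliability grade). The data are made up; the point is that the defined rules, not the data, do the arranging.

```python
# Hypothetical reports; grade 'A' is the most reliable.
reports = [
    {"date": "2002-03-01", "grade": "C", "text": "..."},
    {"date": "2002-03-04", "grade": "B", "text": "..."},
    {"date": "2002-03-04", "grade": "A", "text": "..."},
]

# Apply the rules: best grade first, then (stable sort) newest date first.
ordered = sorted(reports, key=lambda r: r["grade"])
ordered = sorted(ordered, key=lambda r: r["date"], reverse=True)
for r in ordered:
    print(r["date"], r["grade"])
# 2002-03-04 A / 2002-03-04 B / 2002-03-01 C
```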

Pattern Recognition: Humans detect patterns and impose patterns on apparently random entities and events in order to understand them, often doing this without being aware of it. Stellar constellations are examples of imposed patterns, while criminal behavior analysis is an example of pattern detection. Intelligence analysts impose or detect patterns to identify what targets are doing, and thereby to extrapolate what they will do in the future. Pattern recognition lets analysts separate “the important from the less important, even the trivial, and to conceptualize a degree of order out of apparent chaos.”45 However, imposing or seeking patterns can introduce bias. Analysts may impose culturally defined patterns on random aggregates rather than recognize inherent patterns, thereby misinterpreting the phenomena in question.
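Detection of a recurring pattern can be sketched as simple frequency counting over an invented event stream; the threshold separating “pattern” from noise is precisely where the bias this paragraph warns about enters.

```python
from collections import Counter

# Invented observation stream for a hypothetical port target.
events = ["ship_arrival", "night_convoy", "ship_arrival",
          "crane_activity", "ship_arrival", "night_convoy"]

THRESHOLD = 2  # invented cutoff; set too low, it 'imposes' patterns on noise
counts = Counter(events)
patterns = [event for event, n in counts.items() if n >= THRESHOLD]
print(patterns)  # ['ship_arrival', 'night_convoy'] -- candidates, not proof
```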

Reasoning: The ability to reason is what permits humans to process information and formulate explanations, to assign meaning to observed phenomena. It is by reasoning that analysts transform information into intelligence, in three ways, illustrated in a brief sketch following the list:

  1. Induction: Inductive reasoning combines separate fragments of information, or specific answers to problems, to form general rules or conclusions. For example, using induction, a child learns to associate the color red with heat and heat with pain, and then to generalize these associations to new situations.46 Rigorous induction depends upon demonstrating the validity of causal relationships between observed phenomena, not merely associating them with each other.
  2. Deduction: Deductive reasoning applies general rules to specific problems to arrive at conclusions. Analysts begin with a set of rules and use them as a basis for interpreting information. For example, an analyst who follows the nuclear weapons program of a country might notice that a characteristic series of events preceded the last nuclear weapons test. Upon seeing evidence that those same events are occurring again, the analyst might deduce that a second nuclear test is imminent.47 However, this conclusion would be made cautiously, since deduction works best in closed systems such as mathematics, making it of limited use in forecasting human behavior.
  3. Abduction: Abductive reasoning describes the thought process that accompanies “insight” or intuition. When the information does not match what is expected, the analyst asks “why?,” thereby generating novel hypotheses to explain given evidence that does not readily suggest a familiar explanation. For example, given two shipping manifests, one showing oranges and lemons being shipped from Venezuela to Florida, and the other showing carnations being shipped from Delaware to Colombia, abductive reasoning is what enables the analyst to take an analytic leap and ask, “Why is citrus fruit being sent to the worldwide capital of citrus farming, while carnations are being sent to the world’s primary exporter of that product? What is really going on here?” Thus, abduction relies on the analyst’s preparation and experience to suggest possible explanations that must then be tested. Abduction generates new research questions rather than solutions.48
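The following toy sketch, with invented rules and evidence, contrasts the three modes: deduction applies a known rule to a case, induction generalizes a rule from repeated cases, and abduction generates candidate hypotheses for anomalous evidence.

    # Toy contrast of deduction, induction, and abduction (all rules invented).

    observations = {"fuel_delivery", "site_evacuation", "sensor_calibration"}

    # Deduction: apply a general rule to the specific case (cautiously).
    test_precursors = {"fuel_delivery", "site_evacuation", "sensor_calibration"}
    if test_precursors <= observations:
        print("Deduce: the known precursor set is present; a test may be imminent.")

    # Induction: generalize from repeated specific cases to a rule.
    past_cases = [({"fuel_delivery", "site_evacuation"}, "test"),
                  ({"fuel_delivery", "site_evacuation"}, "test")]
    if all(outcome == "test" for _, outcome in past_cases):
        print("Induce: these events tend to precede a test (association, not cause).")

    # Abduction: the evidence fits no known rule, so generate hypotheses to test.
    anomaly = "citrus shipped TO a major citrus-producing region"
    hypotheses = ["mislabeled cargo", "cover for other goods", "data-entry error"]
    print("Abduce:", anomaly, "->", hypotheses)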

SKILLS REQUIRED FOR INTELLIGENCE ANALYSIS

Any institution that relies on professionals for success and seeks to maintain an authentic learning climate for individual growth must require its members to read (to gain knowledge and insight), research (to learn how to ask good questions and find defensible answers), discuss (to appreciate opposing views and subject their own to rigorous debate), and write (to structure arguments and articulate them clearly and coherently).

Critical Thinking

It is by thinking that analysts transform information into intelligence. Critical thinking is the cognitive skill applied to make that transformation. Critical thinking can be defined as

[An] intellectually disciplined process of actively and skillfully conceptualizing, applying, analyzing, synthesizing, and/or evaluating information gathered from, or generated by, observation, experience, reflection, reasoning, or communication, as a guide to belief and action….Thinking about [our] thinking while [we’re] thinking in order to make [our] thinking better.50

There is a clear need to educate and train intelligence analysts to use their minds…[Only] by raising their awareness can the intelligence unit be assured that the analysts will avoid the traps in being slave to conformist thought, precedent and imposed cultural values—all enemies of objective analysis.51

An ordered thinking process requires careful judgments or judicious evaluations leading to defensible conclusions that provide an audit trail. When the results of analysis are controversial, subject to alternate interpretations, or possibly wrong, this audit trail can prove essential in defending the process used to reach the conclusions.
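What such an audit trail might record can be pictured as a simple structure; the fields below are illustrative assumptions, not a prescribed format.

    # Sketch of an analytic audit trail: each judgment carries its evidence,
    # reasoning, and rejected alternatives so the conclusion can be defended.
    judgment = {
        "conclusion": "Second test likely within 30 days",
        "confidence": "moderate",
        "evidence": ["report 03-114 (fuel delivery)", "image 03-221 (site activity)"],
        "reasoning": "Same precursor sequence observed before the prior test.",
        "alternatives_considered": ["routine maintenance", "deliberate deception"],
    }

    for key, value in judgment.items():
        print(f"{key}: {value}")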

Foreign Language Proficiency

Foreign language proficiency provides more than just a translation of non-English materials. The structure of a target’s language and that target’s culture are closely related. One well-known theory of this relationship, by Edward Sapir and Benjamin Whorf, posits that “language is a force in its own right and it affects how individuals in a society conceive and perceive reality.”64 Thus concepts essential to understanding the target are communicated in a context that goes beyond literal translation.

Context must determine the translation, and an analyst lacking foreign language skills must trust the linguist to understand that context correctly. The expertise required for that understanding might render the linguist a better intelligence analyst than the original analyst. This raises the question: “Is such duplication of personnel affordable?”

Research

Research skills provide discipline and consistency for the creation of value-added intelligence. By providing methodologies for defining the requirement to be answered, as well as methodologies for answering that query, research skills ensure analytic consistency and enable thorough exploration of the issues. Necessary research skills include methods of problem definition that ensure that, in collaboration with the consumer, analysts correctly define or redefine the problem in terms of a “research question,” so as to understand the consumer’s and the analyst’s own objectives.

Information Gathering and Manipulation

Information is the grist for intelligence analysis, and to be successful, analysts must aggressively seek it out. Different information/data manipulation skills are required for the various stages of the intelligence process; a brief sketch of such a pipeline follows the list below.

  • Collection: This stage involves gathering information from all sources. The intelligence analyst directs the collection process, causing specific resources to be tasked. Related information manipulation skills include selecting and filtering in order to assess whether the information and its sources are of value.
  • Monitoring: Reliability of sources and the validity of the information are always in question. Monitoring skills focus on information review, and often may involve analysis of descriptors and summaries of that data.
  • Organizing: Skillful arrangement, formatting, and maintenance of data for analysis and technical report generation ensure access to materials in a usable format.
  • Analysis/Synthesis: Information manipulation skills can point to patterns, relationships, anomalies and trends.
  • Interpretation: This is the stage in the process where information is transformed into intelligence by cognitive manipulation, that is, assigning meaning to analyzed and synthesized information using critical thinking. Computers aid in this step; however, a study of 12 major “analytic” software tools concludes that “true analysis will remain a people function, assisted by computer technology.”68
  • Dissemination: Dissemination, except for some graphic products, is now of course mostly electronic. Information preparation and presentation skills allow its transformation and publication, so that the results of analysis appear in usable formats, which may be further tailored by users.
  • Coordination: Coordination requires analysts as well as their managers to employ “collegial” skills in the bureaucratic environment; these skills are also needed to avoid diluting the intelligence message down to the “lowest common level of agreement.”
  • Evaluation: Internal and intra-community evaluation allows the intelligence to be discussed and placed in larger contexts than those viewed by a single agency. Such collaboration may also identify the additional intelligence required to clarify issues. Evaluation can become a continuous part of the production process.69
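As a rough illustration of how these stages chain together, the following Python sketch (sources, weights, and thresholds all invented) filters raw items by reliability, organizes them by topic, and counts crude trends; interpretation, coordination, and dissemination remain human judgments.

    # Toy pipeline mirroring the stages above (all data and thresholds invented).
    raw = [
        {"src": "OSINT", "reliability": 0.9, "topic": "ports"},
        {"src": "RUMOR", "reliability": 0.2, "topic": "ports"},
        {"src": "HUMINT", "reliability": 0.7, "topic": "rail"},
    ]

    def collect_and_filter(items, floor=0.5):  # Collection / Monitoring
        return [i for i in items if i["reliability"] >= floor]

    def organize(items):  # Organizing
        by_topic = {}
        for i in items:
            by_topic.setdefault(i["topic"], []).append(i)
        return by_topic

    def analyze(by_topic):  # Analysis / Synthesis (a crude report count)
        return {topic: len(items) for topic, items in by_topic.items()}

    print(analyze(organize(collect_and_filter(raw))))  # {'ports': 1, 'rail': 1}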

Project/Process Management

Few analysts enjoy the luxury of working full time on only one problem or on one aspect of a particular problem. We distinguish between projects and processes: the former tend to have finite scope and goals, whereas the latter are open-ended. Both require planning, implementation, monitoring, and negotiating skills.70 A project/process plan defines and clarifies what needs to be accomplished; identifies necessary resources; creates a timeline, including milestones; and makes the analyst accountable for successful completion.
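The named elements of such a plan can be sketched as a simple record; this is a hypothetical structure in Python, and none of the field names comes from the chapter.

    # Sketch of a project/process plan as a plain record (fields invented).
    from dataclasses import dataclass
    from datetime import date

    @dataclass
    class AnalyticPlan:
        objective: str    # what needs to be accomplished
        resources: list   # collection assets, experts, tools
        milestones: dict  # timeline checkpoints, name -> date
        owner: str        # the analyst accountable for completion

    plan = AnalyticPlan(
        objective="Assess port-modernization program of country X",
        resources=["imagery archive", "trade statistics", "regional expert"],
        milestones={"draft": date(2003, 6, 1), "coordination": date(2003, 6, 15)},
        owner="analyst-1",
    )
    print(plan.objective, "->", plan.milestones)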

KNOWLEDGE REQUIRED FOR INTELLIGENCE ANALYSIS

Without a solid knowledge base concerning the region or issue to which the analyst is assigned, . . . the individual will not even know what questions to ask. That is, the person will not really be qualified to be called an “analyst.”

— Ronald D. Garst and Max L. Gross71

Knowledge consists of familiarities, awareness, or understanding gained through experience or study; it includes both empirical material and that derived by inference or interpretation.72 Depending on the specific target, the knowledge required can vary widely.

Target Knowledge

Doing intelligence analysis in the information age is often like “being driven by someone with tunnel vision.”73 In the quest to answer a consumer’s questions, the analyst often pushes aside “all the fuzzy stuff that lies around the edge—context, background, history, common knowledge, social resources.”74 Yet, to do so is perilous, for these provide balance and perspective. They offer breadth of vision and ultimately allow analysts to make sense of the information under study. By providing the context into which analysts place their work, fields of study such as anthropology, comparative religion, economics, geography, history, international relations, psychology, and sociology all interact to contribute vital knowledge about the target, which both analysts and consumers need to understand. Changes in the culture, religion, geography, or economic systems (among others) of a target may themselves be subjects of an intelligence requirement.

The following selection of topics exemplifies some non-traditional but essential target knowledge areas required for thorough intelligence analysis.

Culture: Culture can be defined as a group’s values, standards, and beliefs. In turn, culture defines that group. The study of culture reveals the roles of individuals in the community, and how they relate to non-members of the culture. This provides insights into behaviors that are of value in predicting future behavior. This is true when the target is a people or a nation as well as when the target is a specific subgroup or individual member within a culture. Adda Bozeman points out that because political systems are grounded in cultures, “present day international relations are therefore by definition also intercultural relations … [A]nalysts and policymakers in the West would be more successful in their respective callings if they would examine the cultural infrastructures of the nations and political systems they are dealing with.”

Message of Language: The message of language is a part of culture, and while isolating it makes an artificial distinction, we do so to reiterate its importance for intelligence analysis. Which languages are used, by whom, and in what context is essential to understanding the target’s culture. For example, much is revealed if members of an insurgent group primarily communicate using the language of the elite members of their culture. Additionally, what the language indicates about class and personal relationships may provide clues to behaviors.

Technology: Technology itself can be the subject of study by the intelligence analyst. Someone developing a target may analyze specific technologies and their infrastructure as they pertain to that target. Further, the role of technology within a region, nation, or people is an indicator of behavior. The domains of communications, utilities, transportation, manufacturing, and others, as well as the attitudes of the people to them, are rich sources of study. Technology also can provide insights into sources of information that will be available to the intelligence analyst.

Professional Knowledge

In addition to understanding their targets, intelligence analysts need to know a great deal about the context and nature of the intelligence profession, and the resources available to help them do their job well. Understanding the plans and policies of their own government enables analysts to frame their work in terms of the nation’s strategic and tactical objectives. Intelligence consumers are government officials. Their needs drive analytical process and priorities. Analysts base collection tasking on the imperative to match information sources to consumer needs. These information sources, such as human-source reporting, signals intercepts and documentary research, provide the analyst with raw materials for the creation of intelligence through analysis, synthesis and interpretation.

In addition, analysts need to know what specific sources of information relevant to a particular inquiry are available for exploitation. Knowing which expert sources and subject matter experts can guide the analytic process, or can offer different or additional perspectives, enhances intelligence work. The reliability of these sources is also critical. When different sources provide contradictory information, the reliability of one source versus another may provide insights into which information is accurate; the sources may be open or secret, technical or human.
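One naive way to picture the weighing of contradictory reporting is a reliability-weighted tally, sketched below with invented sources and weights. In practice, reliability judgments are themselves analytic conclusions, so such a tally is a starting point rather than an answer.

    # Sketch: weighing contradictory reports by source reliability (invented).
    reports = [
        {"claim": "facility active", "source": "imagery", "weight": 0.9},
        {"claim": "facility inactive", "source": "defector", "weight": 0.4},
        {"claim": "facility active", "source": "intercept", "weight": 0.7},
    ]

    support = {}
    for r in reports:
        support[r["claim"]] = support.get(r["claim"], 0.0) + r["weight"]

    best = max(support, key=support.get)
    print(support, "-> leaning toward:", best)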

Finally, others, known and unknown, may be examining similar information for the same or different consumers. Awareness that sources of information, possibly vital information, exist, even though they remain undiscovered or untapped, keeps the analyst constantly seeking out new connections.

IMPLICATIONS FOR THE INTELLIGENCE ANALYSIS WORKFORCE

Returning to our thesis, what makes an intelligence analyst successful? Given that the analyst’s purpose is to create intelligence, success means following an effective process (rigorous analysis, sound management) and creating a quality product (that conveys intelligence and meets the consumer’s needs). To do this requires appropriate abilities, knowledge and personal characteristics for rigorous intelligence analysis and production. Well- honed capabilities to communicate, cooperate and think, coupled with the skills that ensure technical competency, provide the means for intelligence work. Informed, deep knowledge of the issues and their background provides both content and context for analysis. Analysts who are motivated to succeed, to know targets, and to share that knowledge ensure that consumers receive intelligence of the highest caliber.

[Much] of the work of the intelligence community is highly specialized and requires exceptional creativity…. It is also safe to say that some of the most pressing analytic skills the community will require are precisely those we cannot even foresee at this time.

— Bruce D. Berkowitz and Allan E. Goodman79

A possible uptick in hiring and rising rates of retirement eligibility mean that, at the least, the savvy of the analytic population will continue to dwindle at the lower end and to retire from the upper end.80 Even an analytic workforce of adequate size, if it lacks mentoring and training from senior, expert analysts, will leave the Intelligence Community unable to meet security challenges.

According to the Council on Foreign Relations’ independent report on the future of intelligence, “less than a tenth of what the United States spends on intelligence is devoted to analysis; it is the least expensive dimension of intelligence… This country could surely afford to spend more in those areas of analysis where being wrong can have major adverse consequences.”82 Winning the talent war requires smart investment in the hiring, training, and deployment of analysts.

Training is of little value unless it can be applied immediately. Thus organizational structures, culture, and processes must be aligned to permit and to reward rigorous analysis. Unless analysts are recognized and appreciated for performing sophisticated analysis, they will not embrace change. Significant recognition for high-level analysis will inspire others to follow, creating a culture that fosters and sustains excellence in tailored intelligence production.

Employees transferring into the analytic disciplines from other fields must have the prerequisite abilities and skills before joining the discipline. The field of intelligence analysis cannot safely be a catchall for employees transferring from downsized career fields.

The Aspin-Brown Commission on the Roles and Capabilities of the United States Intelligence Community identified several additional actions to improve the quality of analysis. These include a minimum requirement that analysts visit target countries as part of their analytic orientation, rewards for acquiring and maintaining foreign language proficiency, encouragement to remain within substantive areas of expertise, and periodic rotational assignments to consumer agencies.89 Enacted as part of employee training and orientation, these measures can substantially enhance analysts’ target knowledge and skills.