CNA’s Open Source Analysis of Soviet Military Writings

Explanatory Note: CNA’s Soviet navy studies program in the 1970s and 1980s analyzed the traditional bodies of evidence, classified and unclassified, including Soviet naval operations, exercises, building programs, etc. But its main focus was on analyzing open source Soviet military writings. I was a team member and the program’s director from 1973 to 1982. The program’s work and successes are described in my article “The Return of Great Power Competition: Cold War Lessons about Strategic Anti-Submarine Warfare and Defense of Sea Lines of Communication,” Naval War College Review (Summer 2020, forthcoming). As I note in the post, open source writings yielded conclusions about Soviet strategic intentions eight years before the Intelligence Community reached identical conclusions drawn from traditional intelligence sources. Jamie McConnell and Bob Weinland were lead analysts. Others making important contributions were Susan Clarke, Mary Fitzgerald, Ken Kennedy, Hung Nguyen, Charlie Petersen, Richard Remnek, Abe Shulsky, Lauren Van Meter, and Barry Blechman (work published later as a Brookings Institution monograph).

This post describes the methodologies used at CNA during the Cold War. Its aim is to illuminate techniques that proved successful then with the hope that they may have something useful to say to analysts today. A conclusion suggests several measures that might inform open source work in the future.

FULL POST – CNA’s Open Source Analysis of Soviet Military Writings

Analysts at CNA drew on all the standard sources of information, classified and unclassified, to infer the Soviet navy’s strategic purposes—building programs, operations, exercises, organizational structure, etc. But the major focus of effort was on interpretation of open source Soviet military writings. Analysts examined the adversary’s public statements at two levels: at the level of capabilities and tactics, taking the Soviets’ often revealing statements at face value; and at the strategic level, reading what they said and inferring their true beliefs and intentions.

For the big strategic questions, open source writings provided the best and earliest answers. The experience of the Cold War showed that insights into Soviet planning at the strategic level rarely came from any other source—the 1980-81 SCI breakthrough being the momentous exception. Interpreting the Soviets at the strategic level relied on a variety of content analysis techniques long used in the social sciences: frequency of mention of a topic usually indicated its importance; absence of reference to a salient topic could also signal importance; and imputing to an adversary plans he had never contemplated could reveal one’s own intentions. For example, the Soviets said that when the US Navy deployed the Trident SSBN, it intended to use its ASW forces to defend Trident against possible attack. Using these relatively simple techniques, a close reading of the Soviet military press in 1971-73 showed that the Soviets were seriously concerned about the possible vulnerability of their SSBNs and were intent on defending them—the so-called “pro-SSBN” mission.1
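The frequency-of-mention technique described above can be sketched in a few lines. This is a minimal illustration only: the corpus below is invented stand-in text, and the keywords are hypothetical, not the terms CNA actually tracked.

```python
# A sketch of frequency-of-mention content analysis over a (hypothetical)
# corpus of article texts keyed by year of publication.
import re
from collections import Counter

corpus = {
    1971: "The fleet trains for strikes against shore targets; the submarine arm expands.",
    1972: "Anti-submarine forces must ensure the defense of our missile submarines.",
    1973: "The defense of missile submarines is a priority; submarine bastions require layered defense.",
}

def topic_frequency(texts, keywords):
    """Count keyword hits per document; a rising count suggests growing salience."""
    counts = {}
    for year, text in texts.items():
        tokens = Counter(re.findall(r"[a-z\-]+", text.lower()))
        counts[year] = sum(tokens[k] for k in keywords)
    return counts

trend = topic_frequency(corpus, ["submarine", "submarines", "defense"])
# A sustained rise across years -- or a conspicuous silence on a salient
# topic -- is the analytic signal, not any single year's count.
```

In practice the analytic judgment lies in choosing the keywords and in weighing absences, which no tally can do on its own; the counting merely makes the trend visible.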

Two other, more subtle, techniques yielded deeper strategic meaning: pure linguistic interpretation of the Soviet military vocabulary and inferences drawn from the byzantine forms of expression common in Soviet military discourse. In the latter, the Soviets rarely stated an important point outright; they only implied it.2 Linguistic interpretation was central to James McConnell’s exegesis of the Gorshkov articles (1972-73) in Morskoy sbornik. That interpretation gave a larger strategic meaning to “pro-SSBN.”

The single English word “defense” is rendered in Russian by two words: zashchita (защи́та) and oborona (оборона). McConnell detected in Gorshkov and in other authoritative Soviet writers that zashchita defense tasks were assigned by the General Staff, roughly equivalent to the US Joint Chiefs of Staff. The land-based missiles of the Strategic Rocket Forces were for zashchita (defense) to fight and win a war. Oborona defense tasks, on the other hand, were assigned by the Defense Council, the highest political body dealing with defense, the equivalent of the US National Security Council. The missiles of the Soviet Navy—to be withheld from initial strikes—were for oborona (defense) to achieve the war’s political goals. This linguistic difference was highly indicative of the role of SSBNs as a strategic reserve.3
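The kind of evidence behind this distinction can be pictured as a term-collocation tally: which assigning body co-occurs with which defense term across a body of texts. The sentences below are invented English-glossed stand-ins for translated passages; only the technique, not the data, is meant to reflect McConnell’s work.

```python
# A sketch of term/institution collocation counting. A clean split --
# zashchita with the General Staff, oborona with the Defense Council --
# is the pattern the linguistic argument rests on.
from collections import Counter, defaultdict

# Invented stand-in sentences, not actual translated Soviet passages.
passages = [
    "The General Staff assigns zashchita tasks to the Strategic Rocket Forces.",
    "Zashchita of the homeland is the duty set by the General Staff.",
    "The Defense Council charges the navy with the oborona of state interests.",
    "Oborona tasks for the fleet's missiles flow from the Defense Council.",
]

def collocations(texts, terms, bodies):
    """Tally which assigning body co-occurs with each defense term."""
    table = defaultdict(Counter)
    for text in texts:
        low = text.lower()
        for term in terms:
            if term in low:
                for body in bodies:
                    if body.lower() in low:
                        table[term][body] += 1
    return table

table = collocations(passages, ["zashchita", "oborona"],
                     ["General Staff", "Defense Council"])
```

A real exegesis of course turns on context, authority of the author, and careful translation; the tally only surfaces the pattern for the analyst to interpret.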

Yet there was more evidence of this role to be found in analysis of the byzantine forms that often marked Soviet writings. Metaphor and (ostensibly) historical analogy were used to express ideas with real contemporary meaning. This form of exposition was presumably meant to communicate a message transparently to an internal audience while obscuring it to outsiders. The most telling example, also from McConnell, was Gorshkov’s treatment of the Royal Navy’s Admiral Jellicoe in World War I. With one exception, the Battle of Jutland, Jellicoe did not commit the British Grand Fleet to battle. Instead the Fleet was held back as a “strategic reserve” in protected “bastions” at Scapa Flow in the Orkneys, while a world war raged hundreds of miles to the south.

In the ensuing 40 years every Soviet naval historian, without fail, excoriated Jellicoe. They said he should have come forward, destroyed the German navy, and helped turn the tide (quite implausibly) in a frozen land war. Then, suddenly, in 1973, Gorshkov reversed this assessment: Jellicoe, he said, was right! He made the correct decision. Maintaining the Grand Fleet as a strategic reserve was wise because possessing a reserve of strategic power can be decisive in determining the outcome of a war. In other words, Gorshkov was saying, Jellicoe was smart to have his reserve. And I’ve got mine.

This mode of Soviet expression was surely byzantine. In this sense it was similar to the alien idea it expressed—using a navy to protect a strategic nuclear reserve. It was alien to the modes of expression of US strategic thinking, yet its proper interpretation yielded a valid insight of considerable strategic utility. But it was not recognized as valid at the time. This was not a new problem. The Intelligence Community has had a blind spot for conclusions drawn from open sources since the earliest days of work in the field. Consider some cases where accurate conclusions were ignored or rejected:

  • World War II – Alexander George and others in the US and Britain analyzed Nazi war propaganda and drew valid forecasts of important German moves like the V-1 and V-2 missiles and the tank offensive at Kursk. Their results were generally ignored, as George documented in a doctoral thesis at the University of Chicago.4
  • Korean War – Open source work forecast Chinese intervention if the US moved north.
  • Cold War – Besides CNA’s, important open source work by others was also ignored or rejected. Robert Herrick’s case was the most notable.

Despite, or perhaps because of, this rather dismal record, since the late 1990s, open source work has expanded, its status has been elevated, and, presumably, its conclusions are used more widely by the Intelligence Community today. The establishment of the National Open Source Center and the office of Assistant National Intelligence Director for Open Source bear witness to these advances.

The experience of the Cold War suggests several measures that could be adopted or enhanced across the discipline as a whole to improve the quality and strengthen the utility of open source work. First, analysts should study systematically the phenomenon of “disinformation.” Disinformation is difficult, if not impossible, for any large organization to inject into its planning documents, except perhaps for the briefest periods. The simple reason is that you cannot lie to your own people without engendering confusion, if not chaos. But detecting and guarding against disinformation is always an obligation, both to avoid being tricked and, especially, so the open source analyst can assuage the doubts about the reliability of open source work that many of its consumers harbor. The latter are usually convinced that they themselves would never publicly reveal their own true beliefs and intentions to their adversaries and are similarly convinced that their (secretive and duplicitous) adversaries follow the same dictate. Open source analysts have to be able to explain cogently how they reached their conclusions. In other words, analysts must be able to show that their techniques work not just in practice but also in theory.

Second, analysts must make sure that conclusions drawn from open source work are properly protected. Just because the sources being analyzed are unclassified does not mean the conclusions drawn from them are unclassified as well. Analysts need to be self-policing. For example, they should weigh carefully the desirability of putting into the public domain, most especially via the Internet, important conclusions bearing on important issues.

Third, open source analysis would benefit from a general accounting of which of its many techniques are efficacious and which are less so. This seems especially important where, today or in the future, some “analysts” on the internet may in fact be bogus, intent on misleading or confusing genuine academic debate. A record of systematic assessment of the discipline would also aid in the integration of open source work with other established sources of intelligence to produce genuinely “all source” intelligence. NIEs that do not include a healthy measure of evidence drawn from open sources are unlikely to be as accurate or as substantial as they could be.


1. Hattendorf, citing Dismukes, “Evolving Wartime Missions of the Soviet General Purpose Force Navy” (Secret), June 1973 (Center for Naval Analyses 001061, p. 16); John B. Hattendorf, The Evolution of the U.S. Navy’s Maritime Strategy, 1977–1986, Newport Paper 19 (Newport, RI: Naval War College Press, 2004), chapter 2 (first published in a classified version as Newport Paper 6, 1989).

2. James M. McConnell, with Susan Clark and Mary Fitzgerald, “Analyzing the Soviet Military Press – Spot Report No. 1: The Irrelevance Today of Sokolovskiy’s Book Military Strategy” (Arlington, VA: Center for Naval Analyses, CRM 35-85, May 1985).

3. For a quite accessible account of McConnell’s methods and findings, see Steven Walt, “Analysts in War and Peace,” Professional Paper 458 (Arlington, VA: Center for Naval Analyses, 1987).

4. Alexander George, Propaganda Analysis (Chicago, IL: Row, Peterson and Company, 1959).

Bradford Dismukes, San Francisco, November 24, 2020
