The Devil is in the Details: Seven Tests to Apply to any Cyber Conflict Concept

In September 1997 the US Naval Institute ran an article I wrote titled “The Devil is in the Details: Information War in the Field and Fleet.” Now, 13 years later, there are a few things I wrote in that article I wish I could take back! But for the most part I believe it still offers some cautionary notes relevant to the still-emerging field of cyber conflict.

In its day, the article was one of the first to say it is OK to point out that the emperor has no clothes when it comes to new information warfare concepts. There was already great promise in cyber conflict concepts then, but plenty of snake-oil salesmen too, and I proposed seven tests that officers in the field and fleet could use to vet concepts and form personal opinions on their relevance to the mission.

The article was cited in a few military journals and in military academia, and then by a source I would not have predicted: it was the seventh citation in “Unrestricted Warfare,” the widely read book by two Chinese colonels, Qiao Liang and Wang Xiangsui.

I thought of this old article after reading news of a computer trojan on ground systems being implicated in the crash of a civilian airliner. A trojan contributing to a crash is exactly the kind of thing my old article would have said was not possible. If I had been told that was possible in 1997, or even in 2010, I would have almost certainly accused the briefer of exaggeration. Today I work hard to stay current with the capabilities of the threat, and I know cyber threats are very serious and are costing us billions of dollars in real value and probably trillions in opportunity cost. But contributing to the crash of an airliner?

But now there is a new fact to examine in the progression of the cyber threat, and one that should cause us all to rethink the path we are on regarding our information systems. I’ll write more on that soon.

And some additional thoughts/questions: Aren’t you glad we have a Cyber Command to help reduce the chances of this type of threat to our military aircraft? Don’t you wish we had an organization able to help reduce the chances of this type of threat impacting civilian aircraft? Do you think we should all be moving at warp speed toward a new FAA-designed Next Generation Air Traffic Control System? Will it be safer than the old one? If you think so, what makes you think so? Can you help improve on concepts like “Enhancing Computer Security by Two Orders of Magnitude”?

The September 1997 article is posted below.


The Devil is in the Details: Information War in the Field and Fleet

(Second honorable mention, USNI Colin Powell Joint Warfighting Essay Contest)

Published September 1997 USNI Proceedings

LCDR Robert D. Gourley, USN

Some of the nation’s brightest minds are now shaping our approach to information warfare. They are setting up special schools, writing papers and books, debating strategies and codifying joint information warfare doctrine. The JCS and military services are establishing new information warfare commands and orchestrating organizational changes at the national, theater, and operational levels. Modern information warfare has the potential to do for today’s military what ULTRA and MAGIC did for our forces during World War II. Those programs provided insights into enemy intentions and formed the basis of our successful deception plans. Now invigorated with powerful computer and communications systems, information warfare will soon represent an integral part of the American way of war.

But the Department of Defense has moved so fast to embrace information warfare concepts that mid-level officers of all services have had little time to catch their breath. We will soon have to implement concepts that we hardly understand. There is also a growing body of evidence that suggests some information warfare concepts will not work in the field or fleet.

This article is not a primer on information warfare. Such articles abound in professional journals and the open press. It does, however, seek to widen the debate by encouraging constructive criticism on information warfare. It does so by providing seven tests for evaluating information warfare briefings, point papers, articles or books. Applying these tests to a quality information warfare plan will prove its relevance. But these tests will unravel flawed concepts. Testing concepts that fall somewhere between the ends of this failure/success spectrum will provide insight on how to make good plans better.

These seven tests are:
– A test for an overreliance on metaphors.
– A test for plans that overestimate the threat.
– A test for plans that overestimate our own capabilities.
– A test for historical relevance and accuracy.
– A test for extraordinary attempts to avoid criticism.
– A test for unsupported assumptions.
– A test for nonstandard definitions.

#1: Test for an overreliance on metaphors. Metaphors are an important part of our language. They are especially important in explaining new concepts or ideas. Unfortunately, the metaphor is sometimes mistaken for reality. Test any concept explained by a metaphor to ensure the metaphor does not become the concept.

Some metaphors lead to restrictive thinking. For example, a popular metaphor in information warfare is that of “The five pillars of C2W.” Its advocates draw images of five columns labeled OPSEC, PSYOPS, EW, DECEPTION and PHYSICAL DESTRUCTION that rest on a solid foundation labeled INTELLIGENCE. The pillars hold up a rooftop labeled THE MISSION. While useful for introducing the idea, the metaphor does not reflect how C2W should work in the field or fleet. C2W planners may contribute to plans for the use of military functions other than the five in the metaphor. The pillars also imply separation. C2W should not consist of stovepiped functions developed separately from each other or from the mission. Support to the mission requires planning and executing each function together with other functions and planning efforts. Perhaps a better metaphor in this case is that of the strands of a rope: individually each may be strong, but bound together they are stronger still, and adding more strands makes the rope stronger yet.

Expansive metaphors can also result in misleading interpretations. For example, another common information warfare metaphor is that of “Cyberspace,” the imaginary world behind the screen of your computer. Fantasy novels and the popular press have made the term a household word. Military versions include the “Cyberbattlespace.” The metaphor has reached the point where people talk of “fighting in cyberspace” and of creating teams of “cyberwarriors” to lead those fights. What actually is behind a computer screen is the inner workings of a display device. Like the rest of the computer and every computer network, it is a physical construct of matter that moves energy according to the laws of physics. Strictly speaking, we cannot “fight” in cyberspace any more than we can “walk” inside a Picasso painting.

#2: Test for an overestimation of the threat. There is a serious threat to our nation’s information systems. Hackers attack private and government computer systems on a daily basis. Our economy loses billions of dollars a year to computer crime. The General Accounting Office estimates that hackers conducted up to 250,000 attacks on federal computers last year alone.i Many of these were embarrassing breaches involving DoD information.

The threat is serious, but fortunately our most sensitive networks are well protected, and new technology is making them even more secure. Leaders in government and industry recognize the problems of computer security and are devoting resources to protect them. The Department of Defense and the military services are implementing comprehensive defensive measures. The FBI has computer investigation squads. The Justice Department has doubled its funding for computer crime prosecution. The Secret Service has a computer crime section. The CIA is opening an information warfare technology center. State and local law enforcement officials around the country are following suit.ii Industry’s defensive efforts are even greater than those of the government. Information is industry’s life blood, and they are devoting time, money and brain power to protect it. In an attempt to coordinate the efforts of industry and government, the President has formed a Commission on Critical Infrastructure Protection, which is now conducting public hearings on the issue.iii

Many information warfare experts describe threats from computer literate opponents linking into systems to reroute trains, crash stock markets, open drawbridges, cause midair collisions, reroute HOV lanes and destroy birth records. Some experts paint a threat of computer terrorists crippling our nation by attacks on banking, business communications, power generation, law enforcement and air-traffic control computers. Threat characterizations like those may be neglecting the high investment both government and industry are already making in defensive information warfare. Incorrect threat estimates can result in a waste of resources that should be applied to our real weaknesses. If you suspect an information warfare concept was built to counter an unrealistic threat, you should probably inquire about the details of the threat assessment it is based upon.

#3: Test for an overestimation of our own capabilities. The military can and should expand its capabilities to attack enemy computers. Since computers do so many things that few ever predicted, some people fall into the trap of thinking that we can build computers that can do anything. This is not quite true.

How can you tell if a briefer or a particular reading is guilty of overestimating our capabilities? It is easier to understand the realm of possibilities with a background in quantum mechanics and computer science. But even if you are not conversant in these areas you can still make informed judgments. All you need is a good foundation in the performance of current C4I systems, coupled with an ability to ask probing questions. For example, if you were told that we could degrade an enemy’s oil-producing capability by injecting a virus into a computer, your first question should be “How?” If you are told that the same computer the enemy uses for oil production has been penetrated in labs, ask how the virus will be delivered in hostile territory. If the answer is by space or aircraft, our capability is probably being overestimated. If the answer is by a spy, you should ask if he is in place now or if we need to mount an operation to get him in place. Keep in mind that networks of agents take years to establish. Like many other military operations, the devil is in the details of information warfare plans.

#4: Test for historical relevance and accuracy. Information warfare theorists frequently frame their ideas with historical references. This search for supporting historical tidbits sometimes results in erroneous interpretations. These errors can become so widespread that they begin to be accepted as fact. For example, some information age strategists quote General von Moltke’s vision for reorganizing the German General Staff to make optimum use of telegraph systems. Historians tell us there is no record of von Moltke ever saying any such thing. These fictitious quotes are being used to bolster arguments for organizational change today.iv

Many information war strategists draw parallels between the concept they support and concepts of the past. History can teach us the relevance of these parallels, many of which are right on the mark. For example, Major General Grange and Colonel Kelly highlight the information warfare strategies of Genghis Khan to remind their readers that “armies have conducted information operations throughout history.”v This is certainly true.

Other information warfare advocates draw parallels that may be less relevant. For example, many try to build support for their ideas by referencing the German Blitzkrieg. They describe Blitzkrieg as a revolutionary concept that allowed the Germans to take French forces with similar technologies by surprise. They point out that there were some in Germany who opposed Guderian’s new ideas. The point of the analogy is usually that we must accept the new information warfare idea being advocated or we will fail in war.

Unfortunately, history also shows us that not all new ideas are great ideas. The Maginot line was a new idea, designed to convince the Germans that war would be untenable. The complex appeasement plan negotiated by Prime Minister Chamberlain prior to World War II was considered by many a new idea that would avoid war. Both of these new ideas failed completely. Concepts should never be condoned just because they are new.

Some information warfare concepts use historical models to explain the history of man and then predict the future. Testing these models to see how well they apply to the past can help you determine if they are too simplistic to allow for reliable predictions of the future. For example, one popular model in information warfare concepts is the “Three Waves” theory proposed by Alvin and Heidi Toffler. This model proposes that the way nations make war is tied to the way they make wealth, and that society has changed its economic and military systems in three waves. From 10,000 years ago until the 19th century, society and war were agriculturally based. When an industrial wave swept through the world, the dominant war form became based on mass production. Now that we are riding the crest of an informational wave, knowledge will be central to our way of war.vi

Although this is a good summary of the history of civilization, it is not surprising to learn that there are many historical exceptions that do not fit this model.vii It is also not surprising that this type of model is just too general for short-term predictions. For developments over the next ten years, simple trend analysis will provide a better assessment of the future security environment.

#5: Test the concept for extraordinary attempts to avoid criticism. This may actually be a signal that the idea deserves more of your scrutiny. Avoiding criticism may come in the form of calling attention to the wisdom of the developers of the idea. A briefer may say that “a group of certified geniuses including a Nobel prize winner developed this idea.” If you hear one like that, keep in mind that educated people are not immune from generating foolish ideas, especially if the subject is outside their area of expertise. For example, two noted PhD criminologists recently published an article on information warfare threats that included references to computer viruses that in reality did not exist. They had read some common computer jokes, and believed the jokes to be true. The jokes had made up names of viruses such as “the Gingrich virus” (the joke says the virus makes you sign a contract with your computer).viii

You might also be told that the concept’s supporters include some of the highest ranking officers in our military, perhaps even a service chief or the Chairman of the JCS. This does not mean the idea no longer deserves your scrutiny. Based on the attention given to joint and service professional military education, our seniors encourage independent thought on national security issues and would welcome professional dialog on information warfare.

Criticism avoidance may also take the form of a briefer skimming over key parts of a concept while explaining that “You wouldn’t understand this part, so I’ll skip it.” I’ve heard phrases like that in briefs to mid-grade military officers. There are certainly some complex information warfare issues to sort out, but there are few whose salient aspects cannot be described to today’s well-educated military officers. If someone resorts to avoiding complicated material and says it is because you would not understand, it is a good indication that you are dealing with a modern day snake-oil salesman. As a rejoinder you could make it clear that you expect plain language explanations on the concept. Perhaps you can remind the briefer that Carl Sagan could satisfactorily explain the entire cosmos to the average American using nothing but plain English.

You may encounter a briefer who answers simple questions with “I can’t get into that… it is classified.” If so, you may want to reply with a general question like “Is there anything about it you can describe at our current classification?” If the answer is still no, you may wish to contact a member of the individual’s parent command or a co-worker who may be cleared to higher levels. Keep in mind that your objective should never be to gain access to information that you are not cleared for. Classification can (and sometimes should) be used as a trump card; it may prevent you from giving some concepts your full scrutiny.

#6: Test for unsupported assumptions. These can creep into arguments on any controversial subject. However, the only information warfare assumptions officers in the field or fleet should accept are those defended by good arguments. For example, a common assumption in information warfare is that it “will in and of itself relegate other more traditional forms of warfare to the sidelines.”ix There is no evidence that this is the case at all. In fact, the new JCS Joint Vision 2010 provides the assessment that solving future crises will always require an ability to put “boots on ground.”x

Another common assumption is that we must reorganize to use information warfare strategies. Many of our organizations can and should change, but information warfare strategies need not affect every organization. Efficient staffs have long been able to implement new ideas with current structures. Most Unified CINC staffs and many of their subordinate staffs are now grappling with how to best reorganize to take advantage of information warfare. Often, they may find that they need no substantial changes.

#7: Test for nonstandard definitions. Almost every organization dealing with information warfare (including those in academia and industry) defines information warfare concepts differently. There are those who say we shouldn’t haggle over such a minor point. However, words and how they are defined can have a significant impact on how we transform concepts into reality. Fortunately there is a “no haggle” solution to this issue. We in the military should insist on using the definitions JCS promulgates to us as doctrine. This means the best source (at this writing) is DoD’s Joint Pub 3-13, “Joint Doctrine for C2W.” It defines information warfare as “actions taken to achieve information superiority by affecting adversary information, information-based processes, information systems, and computer-based networks while defending one’s own.”xi

The U.S. military must continue to develop strategic theories of information warfare. Such theories will, in turn, drive joint doctrine, technologies, organizations, and procedures designed for use by operators in the field and fleet. Currently, those with little military experience and senior officers removed from day to day operations are leading the debate on information warfare. But the nation’s real information warfare experts are officers in the field and fleet skilled in making assessments in a data rich environment. By taking a more active role in this debate and applying the tests presented above to current concepts, operators can better the information warfare plans we will be expected to implement in crisis or war.

i. GAO Report. Information Security: Computer Attacks at Department of Defense. (GAO/AIMD-96-84. May 1996).

ii. Shannon Buggs, Court-martial to begin in computer spying case. THE RALEIGH NEWS & OBSERVER (Raleigh, N.C. December 9, 1996).

iii. John Schwartz, Retired General’s Mission: Making Cyberspace Secure. WASHINGTON POST (Washington, D.C., January 31, 1997). A19.

iv. R. L. DiNardo and Daniel J. Hughes, Some Cautionary Thoughts on Information Warfare. AIRPOWER JOURNAL (Winter 1995).

v. Major General David Grange, Colonel James Kelly, Victory through Information Dominance. ARMY (Association of the U.S. Army, March 1997). 33.

vi. Alvin and Heidi Toffler, WAR AND ANTI-WAR: Survival at the Dawn of the 21st Century. (New York, Little, Brown and Co., 1993).

vii. For a critique of this model see Robert J. Bunker’s The Tofflerian Paradox. MILITARY REVIEW (May-June 1995). 7.

viii. David Carter, PhD, and Andra Katz, PhD, “Computer Crime: An Emerging Challenge for Law Enforcement.” LAW AND ENFORCEMENT BULLETIN (FBI Academy, Quantico VA, December 1996).

ix. Steve Kish, Do We Need an Information Warrior? MARINE CORPS GAZETTE (Quantico VA, January 1997). 20.

x. Chairman of the Joint Chiefs of Staff, Joint Vision 2010. (Pentagon, Washington, D.C., 1997). 18.

xi. Joint Pub 3-13.1 Joint Doctrine for Command and Control Warfare (C2W). (Pentagon, Washington, D.C., February 7 1996).
