Editor's note: This post by Sean Lawson provides context on cyber conflict, an area of interest at the nexus of national security and technology. - bg
British sociologist Frank Furedi notes that an increasingly prominent feature of postmodern society is a "crisis of causality," which he describes as "a cultural mood that assumes the uncertainty of causality between action and effect" (p. 8). This crisis of causality, he argues, is at the root of our growing fears of various unseen threats and vulnerabilities. Cyber threats of various sorts are identified as prime examples of the "expanding empire of the unknown" at the heart of our anxieties about life in advanced Western societies (pp. 25-26).
One example of the crisis of causality in cybersecurity is the "attribution problem": the difficulty of identifying the perpetrator of a cyber attack with certainty. But growing fear of cyber threats is also indicative of a more general attribution problem, one in which we often focus upon the most exotic, hypothetical causes of our misfortunes while ignoring the actual ones. Cyber war is the exotic, hypothetical cause du jour.
One recent example comes in a piece by David Rothkopf for Foreign Policy that warns us all that "The Phantom War has Begun." The "phantom war" is, of course, cyber war. And it's scary precisely because it is "rock[ing] your world" even though "most people don't even know it." Rothkopf's piece was in large part inspired by an Office of the National Counterintelligence Executive report on foreign economic and industrial espionage against the United States that "gives special attention to foreign collectors' exploitation of cyberspace." Rothkopf writes,
Imagine wars that were conducted constantly, wars in which both sides might not be bent on destroying one another but would rather focus on capturing resources or slowing down economic performance or producing popular frustration or distributing misinformation or manipulating elections or markets. [...] Imagine what the future might hold.
Imagine the consequence of simply a rapidly spreading perception that a stock market's security had been compromised and that prices were being manipulated. We'd be back to paper trading or no trading at all in a matter of hours.
While the world looks to the economic crisis in Europe this week, acknowledging that international finance is the first place we are feeling the risks associated with living in such a closely interconnected world, there are rumblings of further such risks. But how long will it take for us to come up with a strategy to defend ourselves and limit damage in this already on-going war....or an appropriate code of ethics for conducting it...this war that it seems quite likely no one can ever win.
While the world wonders how to come up with solutions to the financial crisis that we are already in, a crisis caused by a perfect storm of bad economic policies and good old fashioned greed and corruption (not cyber war), Rothkopf and many others are instead worried about strategies for imagined futures.
This is not the first time that Rothkopf has used an actual, ongoing crisis as a means to motivate a response to hypothetical, imagined futures. In April 2011, he seized on the Japanese earthquake, tsunami, and Fukushima nuclear accident to refocus our attention on what really matters: hypothetical cyber wars! Less than one week after the earthquake and tsunami that resulted in the destruction of entire cities, tens of thousands of people killed or displaced, and one of the worst nuclear accidents in history, Rothkopf told readers that all of this "may, in a way, be yesterday's news." Tomorrow's news, he said, will be about "cyber war." Astonishingly, he lamented that "This is the deeply unsettling situation effectively framed by General Alexander in his testimony [earlier that week] and rather than having been obscured by this week's news [Japanese earthquake] it should only have been amplified by it."
The idea that the very real and ongoing tragedy in Japan should have "amplified" the Congressional testimony of a government official angling for more money to respond to hypothetical threats is ridiculous and offensive. This is especially the case when we remember that an OECD report released around the same time as Rothkopf's post indicated that cyber attacks are unlikely to have the kinds of impacts that we have seen in Japan, the ripples of which have had global consequences. (My own analysis of the potential for "cyber-doom" for the Mercatus Center at George Mason University reached many of the same conclusions as the OECD study.)
But this tendency to overlook reality in favor of thinking about imagined futures is not limited to Rothkopf. In fact, it has become quite common among national security professionals in the United States, especially those who write about and work on cybersecurity issues. Less than two weeks after Rothkopf's post, Brent Scowcroft, former National Security Advisor to President George H.W. Bush, "compared the cyber threat to the danger posed by nuclear weapons during the Cold War. He said, 'Cyber has the same capabilities', before adding that actually, 'in many ways it's more daunting.'"
Again, such comments are both ridiculous and offensive when we remember that there were hundreds of thousands of casualties as a result of the atomic bombings of Hiroshima and Nagasaki, and that during the Cold War the United States and Soviet Union produced enough nuclear weapons to kill every living thing on the planet multiple times over. Cyber attacks have not come remotely close to showing that kind of destructive potential. Meanwhile, many of the nuclear weapons built during the Cold War are still with us, and nations like Iran and North Korea are working feverishly to acquire their own nuclear arsenals, despite the emergence of cyber wonder weapons like Stuxnet.
For the last three years, policymakers in the United States, from the White House to the Congress to the Department of Defense, have framed cyber threats primarily in economic terms and with an emphasis on theft of intellectual property and decreased economic competitiveness. Some key examples:
- "Our digital infrastructure has already suffered intrusions that have allowed criminals to steal hundreds of millions of dollars and nation-states and other entities to steal intellectual property and sensitive military information. Other intrusions threaten to damage portions of our critical infrastructure. These and other risks have the potential to undermine the Nation’s confidence in the information systems that underlie our economic and national security interests" (Cyberspace Policy Review, p. i).
- President Obama's speech introducing the Cyberspace Policy Review emphasized first and foremost that “America's economic prosperity in the 21st century will depend on cybersecurity.”
- The Obama administration's International Strategy for Cyberspace states that "Cyberspace can be used to steal an unprecedented volume of information from businesses, universities, and government agencies; such stolen information and technology can equal billions of dollars of lost value. […] Results can range from unfair competition to the bankrupting of entire firms, and the national impact may be orders of magnitude larger. The persistent theft of intellectual property, whether by criminals, foreign firms, or state actors working on their behalf, can erode competitiveness in the global economy, and businesses’ opportunities to innovate" (pgs. 17-18).
- Senators Sheldon Whitehouse and Jon Kyl justified their Cyber Security Public Awareness Act of 2011 by arguing that cyberattacks are causing "what could be the largest illicit transfer of wealth in world history," as well as the "loss of countless American jobs."
- The DoD Strategy for Operating in Cyberspace also frames the threat as primarily economic: "While the threat to intellectual property is often less visible than the threat to critical infrastructure, it may be the most pervasive cyber threat today. Every year, an amount of intellectual property larger than that contained in the Library of Congress is stolen from networks maintained by U.S. businesses, universities, and government departments and agencies. As military strength ultimately depends on economic vitality, sustained intellectual property losses erode both U.S. military effectiveness and national competitiveness in the global economy" (p. 4).
- General Keith Alexander, head of U.S. Cyber Command and director of the National Security Agency, has said that "economic espionage for commercial and technological advantage is an everyday event" that "can take on hitherto unimaginable scale; a conqueror once had to capture a city before his army could loot it" (p. 4).
Certainly, cyber espionage and theft of intellectual property are problems and cybersecurity should be improved. However, cyber attacks are not the cause of the U.S. decline in economic competitiveness. They are not the cause of our education problems. They are not the cause of our financial meltdown, mortgage crisis, and loss of jobs. They are not the cause of our inability to develop and then procure hi-tech weapons that work, in adequate numbers, at a reasonable price, and in a reasonable amount of time.
Two recent reports from the National Academies of Science indicate that the overwhelming cause of the United States' lack of competitiveness is insufficient investment in education, research and development, and infrastructure. Neither report sees the situation improving, and neither identifies cyber attacks or espionage as a prime cause of poor economic competitiveness. A recent report from the JASON group extends this critique to the Department of Defense, finding that DoD has underfunded the basic science research that is key to long-term innovation in favor of spending on more short-term, applied, mission-oriented projects.
It is not nefarious foreigners launching phantom wars via an ethereal cyberspace that are unfairly stealing our future from us. We are doing just fine at that all on our own.
In 1994, futurist and bestselling author Alvin Toffler told a reporter that "They [terrorists or rogue states] won’t need to blow up the World Trade Center. Instead, they’ll feed signals into computers from Libya or Tehran or Pyongyang and shut down the whole banking system if they want to. We know a former senior intelligence official who says, 'Give me $1 million and 20 people and I will shut down America. I could close down all the automated teller machines, the Federal Reserve, Wall Street, and most hospital and business computer systems.'"*
Such scenarios were par for the course in the late 1990s. Even on the eve of the terrorist attacks of September 11, 2001, Bush administration officials worried publicly about the threat of cyber attacks while indications of a physical attack by al-Qa'ida were being missed. In retrospect, Marcus Sachs, a staff member of the President's Critical Infrastructure Protection Board during the Bush administration, said that
"Based on what we knew at the time, the most likely scenario was an attack from cyberspace, not airliners slamming into buildings" [...] Sachs acknowledges that, in hindsight, the effort was misdirected. "We had spent a lot of time preparing for a cyber attack, not a physical attack," says Sachs. "Our priorities had to change a little bit."
Ironically, David Rothkopf's blog purports to tell us "how the world is really run." In one way, he does give us that insight: It is often run by individuals who have a hard time distinguishing between what is happening and what might happen, between what they can imagine and what is possible and/or probable. We need to keep our eye on the actual list of social, economic, and security problems that are affecting us now and which seem to grow by the day. Let's not get distracted once again by "phantom" threats and miss the reality staring us in the face.
* Elias, T.D. (1994). "Toffler: Computer Attacks Wave of Future." South Bend Tribune (Indiana), 2 January.
[Cross-posted from Forbes.com.]