Controlling Cyber Conflict? Arms Control, International Norms, and Strategic Restraint


Date(s) - 6/21/2011, 8:30 am - 11:00 am


On June 21, 2011, the Marshall Institute hosted a panel discussion to explore how cyber conflict can be mitigated.  The panelists were Dr. John B. Sheldon, Professor, School of Advanced Air & Space Studies, Maxwell AFB, Alabama, and Marshall Institute Fellow; Mr. David E. Hoffman, Contributing Editor to Foreign Policy and The Washington Post and author of The Dead Hand: The Untold Story of the Cold War Arms Race and its Dangerous Legacy; Dr. Christopher A. Ford, Hudson Institute; Dr. James A. Mulvenon, Defense Group Inc.; and Professor Martha Finnemore, George Washington University.

Jeff Kueter announced the start of a new program at the Marshall Institute examining cyber security issues, with the goal of bringing together experts and the public to discuss the highest-level policy questions.  The country is moving forward to address the problems we face in the new environment of cyberwarfare, but the Institute is concerned that without addressing these higher-order questions, we may be making bad policy.  This forum on Controlling Cyber Conflict is designed to stimulate a broader policy discussion among the public about some of these questions.

John Sheldon introduced the event’s topic and pointed out that as a strategic theorist, he avoids the term “cyberwar”; war involves a whole range of means, of which cyber means, like air means, are only one.  “It is important,” he pointed out, “that we differentiate between cyber incidents and cyber attacks that are criminal in nature or terroristic, rather than everything being war.”  Cyber conflicts don’t necessarily manifest themselves in a physical sense, so mitigating them, protecting critical infrastructure, and inducing restraint in hostile actors is a challenge.  Options range from arms control to developing international norms of behavior to strategic restraint on the parts of the opponents themselves.  Sheldon described some of the recent efforts to start developing ideas about how to do this, both in academic essays and in conferences with interested parties from around the world.  The utility of such agreements and norms is a matter of debate, and the panelists presented their own opinions on that issue.

David Hoffman began by affirming that cyber conflict, whether criminal, espionage, agitation, or war between nations, is in fact a national security issue for the U.S. and affects our prosperity, innovation, and the heart of our society.  He drew parallels to the Cold War confrontation between the superpowers and particularly the standoff of offensive nuclear weapons.  Technology, he noted, drove the nuclear standoff toward more and more risk and danger, and it will do so today in cyber conflict.  Likewise, the U.S. and U.S.S.R. carried out offensive and defensive biological weapons programs through the 1960s.  Biological weapons show similarities to cyber weapons: both technologies can be used for both good and harm; attacks are extremely hard to attribute and lack a warning mechanism; deterrence against them is uncertain and complex; and resilience may be the best defense.  Even though President Nixon unilaterally renounced offensive biological weapons, the Soviet Union did not believe that the U.S. had actually given up germ warfare and launched an entirely new offensive biological weapons program after signing the treaty.  The drivers here were mistrust, lack of transparency, and technology.

While cyber defense makes sense, Hoffman asked to what extent the U.S. should carry out and participate in offensive cyber actions.  Some have recommended “maintaining America’s offensive military advantage in cyberspace” as a form of deterrence, echoing the policy of the Cold War.  Hoffman doubts the efficacy of that form of deterrence, due to the problems of attribution, civilian control of decision-making, and the unclear threshold for retaliation.  He used the counter-example of Iran adopting the U.S. policy of using all necessary means to defend its nation and interests, announcing that it reserves the right to use any and all means to retaliate against whoever put Stuxnet into its uranium enrichment program.  “If the U.S. government makes a policy of engaging in offensive cyber actions,” he concluded, “others will also.  In fact, they already are.  That will lead us inexorably to a cyber arms race.”  Since we don’t fully understand the consequences of engaging in a cyber arms race, just as we didn’t understand the cycle of reaction that took place in the nuclear and biological arms races, such a race will likely present us with unintended consequences, risks, and dangers as well.

Prof. Finnemore focused her discussion on lessons to be drawn from other policy areas where we have tried to regulate and instill shared expectations in our adversaries, partners, and citizens.  She observed that when discussing cybersecurity, most people don’t think of arms control but concentrate on a smaller area, such as crime or privacy, and that involving all of the many stakeholders will be a challenge.  While the default solution to problems of international coordination is usually to draft a big multilateral treaty, which would have the force of law and some illusion of clarity, it is worth thinking about other kinds of social engineering tools.

Since the treaty-making process cannot keep up with rapid technological change in the cyber world, cultivating norms of behavior, or shared expectations of proper behavior in cyberspace, might be a complementary strategy; people might be willing to coordinate out of self-interest, as stepping on toes costs money and creates security threats.  Such a “building cyber norms” route might be able to involve the variety of parties needed to get effective coordination on a cyber platform, much of which is in private hands.  She explained that parties with very different views, both state and non-state, might be reluctant to sign formal treaties but might be willing to participate and coordinate; over time, their expectations are shaped by their participation in these groups, which makes this a more flexible instrument.  Norms, such as national responsibility for attacks originating on a country’s soil or servers, or for cleaning up a country’s networks, can be grafted onto existing normative architectures, which makes the new norms of behavior being promoted seem intuitive and plausible.  In the same way, a cyber version of the laws of war could be developed; one could plan to protect civilian targets by de-coupling their networks in certain ways to isolate them from the cascades of a cyber attack.

“If you’re going to promote a norm, make it clear, make it useful, make it doable,” said Finnemore.  Americans have a long record of teaching best practices abroad, though some weaker states will require assistance; making it easy for people to change their behavior and do the right thing will help a new practice succeed.  Still, creating new norms to govern the way we secure our nation in an internet world may be harder than similar issues we have tackled, due to the number of actors.  Controlling cyber threats is similar to controlling public health threats; individual actions and behavior make the difference, whether washing hands or changing what people do on their office computers.  Finnemore compared cyber security to controlling carbon emissions and reminded the audience that big coordination problems, such as reducing carbon or agreeing on who is responsible for reducing it, have not worked out so well.  In both cases there are many stakeholders involved, but shifting these expectations is a social problem, and we benefit by thinking about it in sociological ways.

James Mulvenon described his involvement in the unofficial U.S.-China cyber security dialogue.  He noted an important difference in terminology which reflected different political viewpoints: the U.S. wanted to talk about cyber security, which we defined as protecting the network, but the Chinese wanted to talk about information security, which they defined as both protecting the network and policing network content.  When discussing cyber crime and intrusions, the Chinese protested that the U.S. refuses to extradite people or respond in other ways because of the political motivation behind their demands; they want us to help them imprison Falun Gong practitioners and persecute Uyghurs and Tibetans.  The Chinese emphasized that all the servers and routers on the network are within the boundaries of a sovereign nation and are bound by national laws; the U.S. is an outlier among nations because we think of the internet in terms not only of state sovereignty but also of privacy and civil liberties.  Mulvenon argued that the important conflict is really between state sovereignty and the view that there is an actual non-sovereign part of cyberspace that needs to be defended; certainly in the U.S. many people feel a sense of entitlement as to what they are allowed to do on the network.

The attribution problem and our inability to calculate certainty of effects in the cyber world led Mulvenon’s Cyber Conflict Studies Association to conclude that in the cyber realm, we face an environment of instability.  The Chinese appreciate the attribution problem, since it is the heart of their computer network exploitation and attack doctrine.  “I would argue that the Chinese and Russians are much more comfortable with cyber as an overt tool of national power than we are,” he said; they are also comfortable with its use by proxies, even non-national proxies, operating on their behalf, which the U.S. would never be comfortable with.  The Chinese and Russians are interested in arms control, though without enforcement or verification, primarily for international reputation reasons.

He pointed out that there are already many international norms in cyberspace which are not set by governments.  An informal cabal of fiercely anti-government individuals actually runs the global internet and operates by a set of norms which is not codified in any way by any government or multilateral institution.  The Chinese, on the other hand, demand that the governance of cyberspace be state-based, and they dislike the current situation, which legitimates the role of non-state actors.  They want to move internet governance to an international state-based body which they can control and which allows everyone to sidestep blame for intrusions.  Mulvenon doubts that the Chinese will give up intrusion attacks because of their great espionage value, but they are also interested in curbing cyber attacks and in designating targets that would be immune from attack.  That is almost impossible, he explained: “The nature of the network means that unintended cascade effects from designated targets into non-designated targets (the Stuxnet problem) mean you can’t be that clean.”  Most U.S. military cyber traffic rides on leased commercial bandwidth, which means both military and civilian networks would be targets in a cyber conflict.  He closed by urging the audience to update their antivirus software and use encryption on their computers, “because all of you, including my mother, are the weak link.”

Christopher Ford offered his thoughts on alternative approaches to cyber arms control.  He discussed the “Advanced Persistent Threat” (APT) faced by U.S. networks, widely attributed to the People’s Republic of China.  So far, the APT seems to have been used for spying, but it may also have been used to implant malicious code, enabling an attacker to crash or manipulate key computer networks in the future.  Such a threat requires mitigation, but the often-suggested methods, cyber arms control and traditional deterrence, are problematic in different ways.  Conventional arms control requires clear definitions about what is regulated and the ability to observe and verify it, which cannot be done in the cyber realm.

Ford outlined various proposals for internet regulation, some of which are based on the idea that the present internet regulatory system, which prizes freedom, is unfair and threatening.  The Chinese government, for example, believes that the most significant cyber threat is making uncensored information available to its citizens.  He warned that cyber arms control is unlikely to work and may facilitate actions that we cannot ourselves support.  Instead, promulgating a collection of “best practices” to encourage governments and private entities to work together on robust cyber security and on investigating attacks may be helpful; examples are NATO’s help for Estonia after the 2007 cyber attacks and a National Security Agency program to help internet service providers identify malicious code used in cyber attacks.  Identifying the rules which apply to conflict in cyberspace is important, and cyber planners are already moving toward compliance with the law-of-war principles used in other arenas.

In spite of the problems of attribution, deterrence will also play a role.  “I applaud the fact that U.S. military doctrine now reportedly speaks increasingly of cyber attacks as something that could, in theory, provoke a conventional military response,” said Ford.  While conflict in cyberspace presents new challenges, they are not unprecedented, and laws and policies will be developed to meet them, just as they have been in recent counter-insurgency and counter-terrorism operations.
