{"id":1004,"date":"2011-01-10T09:05:17","date_gmt":"2011-01-10T14:05:17","guid":{"rendered":"http:\/\/www.harvardnsj.com\/?p=1004"},"modified":"2015-01-06T12:35:29","modified_gmt":"2015-01-06T17:35:29","slug":"cybersecurity-and-national-policy","status":"publish","type":"post","link":"https:\/\/journals.law.harvard.edu\/nsj\/2011\/01\/cybersecurity-and-national-policy\/","title":{"rendered":"Cybersecurity and National Policy"},"content":{"rendered":"<p><strong><em>On January 10, this essay was nominated for the Third Annual Social Security Blogger Awards. Visit <a href=\"https:\/\/365.rsaconference.com\/blogs\/security-blogger-meetup\/2011\/01\/10\/and-the-winners-are\">https:\/\/365.rsaconference.com\/blogs\/security-blogger-meetup\/2011\/01\/10\/and-the-winners-are<\/a>.<\/em><\/strong><\/p>\n<p>By Daniel E. Geer, Jr., Sc.D. &#8211;<\/p>\n<p><a href=\"https:\/\/journals.law.harvard.edu\/nsj\/wp-content\/uploads\/sites\/82\/2015\/01\/Volume-1_Geer_Final-Version.pdf\">Click here to download the published PDF version<\/a><\/p>\n<p>What follows is the author\u2019s own opinion, <em>i.e.<\/em>, it is not a strategic leak of anything going on in any corridor of power.\u00a0 I am reminded of one of my mentors who said that, if you are lucky, as you age you will compensate for your loss of creativity with burgeoning critical skills.\u00a0 As such, much of what I say here will be critical, but I hope that it is taken as critical in the sense of analytic rather than critical in the sense of harping.<\/p>\n<p>Let me begin with some biases.\u00a0 First, security is a means, not an end.\u00a0 Therefore, a cybersecurity policy discussion must necessarily be about the means to a set of desirable ends and about affecting the future.\u00a0 Accordingly, security is about risk management, and the legitimate purpose of risk management is 
to improve the future, not to explain the past.<\/p>\n<p>Second, unless and until we devise a scorekeeping mechanism that apprises spectators of the state of play on the field, security will remain the province of \u201cThe Few\u201d.\u00a0 Sometimes leaving everything to The Few is positive, but not here as, amongst other things, demand for security expertise so outstrips supply that the charlatan fraction is rising.<\/p>\n<p>Third, the problems of cybersecurity are the same as many other problems in certain respects, yet critically different in others.\u00a0 We often misclassify which characteristics are \u201csame\u201d and which are \u201cdifferent\u201d, beginning with the sharp differences between the realities of time and space in the physical world versus the digital world.\u00a0 Examples of these differences include the original owner continuing to possess stolen data after the thief takes it, and law enforcement lacking the ability to work at the speed of light.<\/p>\n<p>When I think about cybersecurity and national policy, I can only conclude that the problem is the problem statement.\u00a0 At the highest level of abstraction, let me propose that the problem statement for a National Policy is this:<\/p>\n<blockquote><p>To move from a culture of fear,<\/p>\n<p>to a culture of awareness,<\/p>\n<p>to a culture of measurement.<\/p><\/blockquote>\n<p>While this statement is not operationalizable per se, it demonstrates my biases that security is a means and that game play cannot improve without a scorekeeping mechanism.<\/p>\n<p>None of that is new; the worry over fear has been said all but word-for-word for almost five centuries.<\/p>\n<blockquote><p>The thing I fear most is fear. \u2014 Montaigne, <em>ca.<\/em> 1570<\/p>\n<p>There is nothing terrible but fear itself. \u2014 Bacon, 1620<\/p>\n<p>The only thing I am afraid of is fear. \u2014 Wellington, 1836<\/p>\n<p>Nothing is so much to be feared as fear itself. 
\u2014 Thoreau, 1851<\/p>\n<p>The only thing we have to fear is fear itself. \u2014 Roosevelt, 1933<\/p><\/blockquote>\n<p>Neither is the idea of a goal state where measurement holds sway, as in Lord Kelvin\u2019s iconic 1883 remark:<\/p>\n<blockquote><p>When you can measure what you are speaking about, and express it in numbers, you know something about it; but when you cannot measure it, when you cannot express it in numbers, your knowledge is of a meagre and unsatisfactory kind; it may be the beginning of knowledge, but you have scarcely, in your thoughts, advanced to the state of <em>Science<\/em>.<\/p><\/blockquote>\n<p>But enough of my beliefs, though now that you know them, perhaps you can account for them in my remaining remarks.<\/p>\n<p style=\"text-align: center;\">*******<\/p>\n<p>To set the rest of what I am going to say on the bedrock of its foundation, the United States\u2019s ability to project power depends on information technology, and, as such, cyber insecurity is the paramount national security risk.\u00a0 This point bears repetition: because the United States\u2019s ability to project power depends on information technology, cyber insecurity is the paramount national security risk.<\/p>\n<p>Those with either an engineering or management background are aware that one cannot optimize everything at once \u2014 that requirements are balanced by constraints.\u00a0 I am not aware of another domain where this is as true as it is in cybersecurity and the question of a policy response to cyber insecurity at the national level.\u00a0 In engineering, this is said as \u201cFast, Cheap, Reliable: Choose Two\u201d.\u00a0 In the public policy arena, we must first remember the definition of a free country: a place where that which is not forbidden is permitted.\u00a0 As we consider the pursuit of cybersecurity, we will return to that idea time and time again; I believe that we are now faced with \u201cFreedom, Security, Convenience: Choose 
Two\u201d.<\/p>\n<p>Some time ago, I was asked to categorically state what types of risks rose to such a level that they could legitimately be considered national security concerns.\u00a0 Based on my view, much like the Treasury Department\u2019s view on bank failure \u2014 that a public loss of confidence is to be avoided at all bearable cost, but that everything short of this amounts to nothing more than some private tragedy \u2014 my answer then was that only two kinds of vulnerabilities were that important.\u00a0 The first is any mechanism that, to operate correctly, must be a single point of function, thereby containing a single point of failure.\u00a0 The red telephone on the President\u2019s desk is just such a mechanism; having twenty-three red telephones would be far worse than having one red telephone.\u00a0 As such, that single red telephone deserves defense in depth, which is simply a referendum on one\u2019s willingness to spend money for layers; it is rarely, if at all, a research-grade problem.<\/p>\n<p>The other national security scale risk is cascade failure,<a href=\"#_ftn1\">[1]<\/a> and cascade failure is so much easier to detonate in a monoculture \u2014 when the attacker has only to write one bit of malware, not ten million.\u00a0 The idea is obvious; believing in it is easy; acting on its implications is, evidently, difficult.\u00a0 Despite what you might think, I am sympathetic to the actual reason we continue to deploy computing monocultures<a href=\"#_ftn2\">[2]<\/a> \u2014 making systems almost entirely alike remains our only hope for managing them in a consistent manner.\u00a0 Put differently, when you deploy a computing monoculture you are making a risk management decision: that the downside risk of a \u201cblack swan event\u201d<a href=\"#_ftn3\">[3]<\/a> is more tolerable than the downside risk of perpetual inconsistency.<\/p>\n<p>Since first considering that question, I have decided there is now another national security scale issue, 
arising as a side effect of global supply chains and device complexity.\u00a0 It is simply not possible to provide product or supply chain assurance without a surveillance state.\u00a0 This matters not just philosophically, but practically.<\/p>\n<p>The root source of risk is dependence \u2014 dependence on system state, including dependence on expectations of system state reliability.\u00a0 Indeed, my definition of security has co-evolved with my understanding of risk and risk\u2019s source, to where I currently define security as the absence of unmitigatable surprise.\u00a0 Thus, increasing dependence results in heightened difficulty in crafting mitigations.\u00a0 This increasing complexity embeds dependencies in a manner that may diminish the frequency of surprises; however, the surprises will be all the more unexpected when they inevitably occur.<\/p>\n<p>And that is the crux of the matter: our dependence on all things cyber as a society is now inestimably irreversible and irreversibly inestimable.\u00a0 That sounds more apocalyptic than I intend, but the competent risk manager always asks, \u201cHow bad could it be?\u201d or, in the altogether American tortious style, \u201cWho will have to pay?\u201d<\/p>\n<p>This leads to my first conclusion about cybersecurity and national policy: our paramount aim cannot be risk avoidance but rather risk absorption \u2014 the ability to operate in degraded states, in both micro and macro spheres, to take as an axiom that our opponents have and will penetrate our systems at all levels, and to be prepared to adjust accordingly.<a href=\"#_ftn4\">[4]<\/a> To this extent, security becomes a subset of reliability in that an insecure system will not be reliable, but a reliable system is not necessarily secure.<\/p>\n<p>That tenet of a free society, <em>viz.<\/em>, anything not forbidden is permitted, interacts strongly with the rate of change in the digital world.\u00a0 No society needs rules against impossibilities.\u00a0 The 
rate at which we are turning the impossible into the possible is accelerating and will continue to do so because technologic change is now in a positive feedback loop.\u00a0 This leads to my second conclusion: free society rulemaking will trail modalities of risk by increasing margins, even if that rulemaking comes (God forbid) from some one-world government.\u00a0 This second conclusion evokes the Second Amendment, that an armed citizenry is a <em>sine qua non<\/em> of freedom.<\/p>\n<p>Ed Giorgio, then chief cryptanalyst for the NSA, famously remarked that, \u201cIn our line of work, security and privacy are a zero sum game.\u201d\u00a0 I do not intend to argue his point for him.\u00a0 I do not have to; every proposal for reinventing the Internet stresses the law-and-order essentialness of strong authentication in service of attribution, based on the finding that without attribution there is no deterrence to cybercrime because forensics will never close the widening evidentiary gap.\u00a0 Looking backwards, we see ready examples: the general public was only too happy to yield to cell-phone location tracking so that they could call 911 without having to know where they were.\u00a0 Looking forward, without universal strong authentication, tomorrow\u2019s cybercriminal will not need the fuss and bother of maintaining a botnet when, with a few hundred stolen credit cards, he will be able to buy all the virtual machines he needs from cloud computing operators.\u00a0 In short, my third conclusion is that if the tariff of security is paid, it will be paid in the coin of privacy.<\/p>\n<p>As much as I would wish, the market is unlikely to come to the rescue here, since a market only exists where there is demand.\u00a0 I do not see that demand; I do not see it in the general population (which is far more dependent on the digital world than it wants to realize, inured as it is to convenience <em>\u00fcber alles<\/em>), and I do not see it in government.\u00a0 It has been 
said over and over for twenty years, \u201cIf only we could make government\u2019s procurement engine drive the market toward secure products.\u201d\u00a0 This, ladies and gentlemen, is a pleasant fiction.\u00a0 First, the United States government\u2019s buying power is staggering but, on a world scale, it is less than market-driving, and growing less so.\u00a0 Second, we have long since demonstrated that the single greatest barrier to introducing innovation into government is that procurement rules obliterate any hope of real entrepreneurs approaching the city gate.\u00a0 Third, 90-plus percent of the installed base is not in government, but in the private sector.\u00a0 Sadly, then, my fourth conclusion is that market demand is not going to provide, in and of itself, a solution.<\/p>\n<p style=\"text-align: center;\">*******<\/p>\n<p>It would be rude and facile to repeat that when you do not know where you are going, any direction will do.\u00a0 But there are so many directions we could go now that I risk sounding rude and facile.\u00a0 Putting aside bread and circuses, I wish we had some sort of consensus on what goal state we wanted, but I am not even sure of that, and, in any case, the rate of change may not only make means temporary, it may also do so for ends.\u00a0 You can think of this as evolution in that evolution\u2019s winners are a random selection.<\/p>\n<p>Government\u2019s usual triad of tools \u2014 regulation, taxation, and insurance pricing \u2014 are all potential avenues; however, we must remember that all are subject to what I refer to as the Four Verities of Government:<\/p>\n<blockquote><p>Most meaningful ideas are not appealing.<\/p>\n<p>Most appealing ideas are not meaningful.<\/p>\n<p>Not every problem has a good solution.<\/p>\n<p>Every solution comes with side effects.<\/p><\/blockquote>\n<p>Nevertheless, let us consider what we might do.<\/p>\n<p>Government regulation is easiest to apply when the regulatees are few and those few are 
already well regulated; it is far harder to regulate a fragmented vastness and\/or to introduce regulation where there was none.\u00a0 For this reason, and whether just or unjust, the major telecoms will continue to be compelled to play the role of government\u2019s outsourced private police force.\u00a0 This took one form under the Bush administration and it is taking an all but identical form under the Obama administration.\u00a0 The demand for \u201csafe pipes\u201d inexorably leads to deputizing those who own the most pipes.\u00a0 Even so, that beats nationalization.<\/p>\n<p>Accretive sequestration of social policy in the Tax Code is precisely why the Tax Code is complex.\u00a0 Using the Tax Code to encourage investment in cybersecurity is just as possible as using the Tax Code to encourage investment in research and development.\u00a0 One must note, however, that using the Tax Code to do anything has perhaps the richest source of side effects, also known as unintended consequences.\u00a0 Inserting cybersecurity concerns into the Tax Code would be no different.<\/p>\n<p>Government can drive insurance pricing primarily through the codification of liability, which, in turn, forces collectivization of liability risk into insurance pools.\u00a0 There are many options for liability here, and, as any software buyer knows, the high-order bit of every page of every end-user license agreement (EULA) reads, \u201cIt is not our fault.\u201d\u00a0 EULAs are an outrage and require fixing, yet it is all but clear that attempting to regulate software quality, even given the poor quality of monopolistic providers, just exports the software business to China in perpetuity.<\/p>\n<p>Consistent with the paradigm of risk tolerance, it might be possible to say, as an engineer would, that no system may fail silently.\u00a0 In a sense, that is what the \u201cdata breach\u201d laws assume \u2014 that breaches are inevitable and the proper response is notification of affected 
parties.\u00a0 Interestingly, the first of these rules, California SB 1386, was drafted by taking a toxic waste spill rule and substituting data-on-the-information-superhighway for poison-on-the-public-street.<a href=\"#_ftn5\">[5]<\/a> If you consider notification an adequate mitigation, then this law creates a form of security as the surprise is followed by mitigation.\u00a0 The regulation here would be performance standards for latency of cleanup steps, like notification.<\/p>\n<p>Verizon\u2019s 2009 Data Breach Investigations Report<a href=\"#_ftn6\">[6]<\/a> included two findings that are more important than all the others: one, that 75 percent of all data losses are discovered by unrelated third parties; and two, that whether data breaches are preponderantly insider attacks or outsider attacks depends on your definition of insider.\u00a0 If \u201cinsider\u201d means \u201con the payroll\u201d, then insider attacks are not the most important issue.\u00a0 If, however, you define insider to include folks on your payroll plus employees of your partners who have as-of-right access to your data, then the majority of data losses are insider attacks.<\/p>\n<p>Those two findings must be true of government too, and even more so as is implied by the mention of procurement.\u00a0 Government could lead by example here, not as a market-directing procurement model, but in terms of requiring government suppliers to be sufficiently instrumented such that data loss events are discovered by the second party, not some third party, and that, as a condition of contract, the contracting firm must attest to its recent data-handling performance at regular intervals.\u00a0 In other words, treat data as if it were money.\u00a0 As data represents an increasing fraction of total corporate wealth, this is less cathartic than other options.<\/p>\n<p>It is important to understand that cyber insecurity is driven by sentient opponents, not by bad luck or stray alpha particles.\u00a0 At the 
same time, that the opponents think and calculate and are not inanimate random processes only changes the clock, not the azimuth of drift.\u00a0 This, perhaps, strengthens the argument for latency-based performance standards.<\/p>\n<p>That 90-plus percent of the critical infrastructure is in private hands means that state-level opponents unremarkably spend over 90 percent of their effort on the private sector.\u00a0 No component of the commercial world is without compromise, but the primary targets are essential industries, the secondary are the counterparties to those primary targets, and the tertiary are sites that can be prepped for future attacks.\u00a0 It would be wise to structure our defenses accordingly.\u00a0 This leads directly to the question of whether government\u2019s cooperation with the private sector should focus on the Defense Industrial Base, and whether the Defense Industrial Base should itself be expanded to include cybersecurity firms and technology within its remit.<a href=\"#_ftn7\">[7]<\/a><\/p>\n<p>Some degree of international engagement is essential for no other reason than that our opponents are location-less.\u00a0 Much work has been done on this, but the path to any treaty is steep and the clock of upward progress ticks in years, not minutes.\u00a0 The Council of Europe\u2019s Convention on Cybercrime<a href=\"#_ftn8\">[8]<\/a> is a case in point.\u00a0 At the same time, the recent decision of the Internet Corporation for Assigned Names and Numbers (ICANN) to wildly proliferate the number of top-level domains and the character sets in which domains can be enumerated is the single most criminogenic act ever taken in or around the digital world.\u00a0 United Nations treaties are all but useless \u2014 unenforceable and thus popular with the worst state offenders \u2014 so \u201ccoalitions of the willing\u201d are the best we can hope for, taking the G8\u2019s Financial Action Task Force<a href=\"#_ftn9\">[9]<\/a> as an example.\u00a0 Put differently, 
international engagement is likely necessary but certainly insufficient.<a href=\"#_ftn10\">[10]<\/a><\/p>\n<p>It is straightforward to see the value of information sharing: as a matter of logic you cannot, for example, know whether you are a target of choice or a target of chance unless you compare your attack pressure to that of others.\u00a0 Sharing information between the private sector and the government has been worked on in many ways, but, to my mind, little has come of all that good intent and effort.\u00a0 To begin with, no General Counsel in any industry believes that the protections against Freedom of Information Act (FOIA) access to security data shared with the government will actually work when the time comes, nor do they believe that they would have effective recourse if proven correct in their skepticism.\u00a0 Now repeat that same sentence substituting \u201cantitrust\u201d for \u201cFreedom of Information Act\u201d.\u00a0 General Counsels are not being unduly risk averse here \u2014 the codified protections against FOIA and antitrust, as applied to private security data, have never been tested in court.\u00a0 And as our newest Supreme Court Justice has candidly said, \u201cReal policy is made at the Circuit Court of Appeals.\u201d<\/p>\n<p>Yet for all of that, we clearly need information sharing.\u00a0 My own hope has long been for a technical guarantee of non-reversibility of shared data, something you can call de-identification or even anonymization of log data, thereby removing the General Counsel from the equation without giving congressional committees new weapons.\u00a0 Others I trust and admire have repeatedly proven by demonstration that it is all but impossible to craft such a technical guarantee and thus it is the technologists who argue for a procedural guarantee even as the sadder-but-wiser policy people pine for a technical guarantee.\u00a0 There does not seem to be a simple solution to this problem, though, in the private sector, 
some sharing does take place; for example, banks quickly share suspicions about stocks actively involved in pump-and-dump schemes.\u00a0 Providing security clearances to the management committees of every U.S. business with a claim to criticality does not seem workable either.<\/p>\n<p>Above I opined that the ability to operate in a degraded state is an essential capability for government systems and private sector systems.\u00a0 A second essential capability is a means to assure broad awareness of the gravity of the situation.\u00a0 To the extent that awareness is a trained response, we have to ask whether we can get awareness without the \u201ctraining\u201d that a thoroughgoing crisis provides.\u00a0 Yes, decision-making under crisis conditions is especially fraught with unintended side effects, but memory of a crisis usefully serves to keep certain issues clearly in our mind.\u00a0 The unanswered question is whether we can proactively keep awareness of cybersecurity clearly in our mind.\u00a0 There are many who have proposed that the process we went through as a nation to understand, set policy for, and contain nuclear weapons has lessons for us here, because the gravity of the nuclear situation and our awareness of it did not flag despite decades of relative quietude.<\/p>\n<p>There is a third essential, one that flows from recognizing the limits of central action in a decentralized world, and that is some measure of personal responsibility and involvement.\u00a0 We all know that patching behavior leaves much to be desired \u2014 Verizon\u2019s report showed that data loss events frequently involve open vulnerabilities for which patches had been available in excess of a year at the time of breach, and Qualys demonstrated that actual in-the-field patching follows a half-life curve, and thus never completes.<a href=\"#_ftn11\">[11]<\/a> We all have seen the scanning results that show a majority of home machines are compromised.\u00a0 We all have heard that the 
price of stolen personal data is falling as the supply side grows ever more efficient and automated.\u00a0 We are all aware that at the present levels of infection, peer-to-peer pairings almost always involve a transmission opportunity.\u00a0 And so I ask, whose responsibility is this?<\/p>\n<p>You may view an infected machine as a weapon.\u00a0 If I do not lock up my guns and they are used for the commission of a crime, then I will, at the very least, have some explaining to do.\u00a0 You may simply not want to drive through an intersection if you know that a majority of opposing traffic lacks brakes.\u00a0 I do not believe we will find the political will to make personal culpability a serious enough matter to effect widespread change, but I am at a loss to argue in any other direction.\u00a0 I ask this: if it is not the responsibility of the end user to avoid being an unwitting accomplice to constant crime, then whose responsibility is it?\u00a0 If you say that it is the responsibility of Internet Service Providers (ISPs) \u2014 that \u201cclean pipes\u201d argument<a href=\"#_ftn12\">[12]<\/a> \u2014 then you are flatly giving up on not having your traffic inspected at a fine level of detail.\u00a0 If you say that it is the software manufacturer\u2019s responsibility, we will soon need the digital equivalent of the Food and Drug Administration to set standards for efficacy and safety.\u00a0 If you say that it is the government\u2019s responsibility, then the mythical Information Superhighway Driver\u2019s License must soon follow.\u00a0 To my mind, personal responsibility for Internet safety is the worst choice, except for all the others.<\/p>\n<p>At the risk of repetition, let me be clear that contemplative reaction to compromised counterparties is the core of awareness.\u00a0 They must be dealt with, and they demonstrate why there must be a personal responsibility component, or else we\u2019ll be left with Big Brother.\u00a0 Ever since my team created the 
Kerberos network authentication system,<a href=\"#_ftn13\">[13]<\/a> the idea has been \u201cI\u2019m OK and you\u2019re OK, but the big bad network in between us cannot be trusted for a second.\u201d\u00a0 Authentication, authorization, and accountability all begin with authentication \u2014 and that, in turn, begins by asking the Operating System the name of the user.\u00a0 What has really changed is that it is not true that \u201cI\u2019m OK and you\u2019re OK\u201d, since it is entirely likely that the counterparty to whom you are connecting is already compromised.\u00a0 It is more like \u201cI think I\u2019m OK, I have to assume you are 0wned and the network may make this worse.\u201d\u00a0 A secure network connection?\u00a0 Who cares if the other end is hosed?\u00a0 Purdue\u2019s Gene Spafford was correct, but early, when he likened network security in the absence of host security to hiring an armored car to deliver gold bars from someone living in a cardboard box to someone sleeping on a park bench.<\/p>\n<p style=\"text-align: center;\">*******<\/p>\n<p>These are heady problems.\u00a0 They go to the heart of sovereignty.\u00a0 They go to the heart of culture.\u00a0 They go to the heart of \u201cLand of the Free and Home of the Brave\u201d.\u00a0 They will not be solved centrally, yet neither will they be solved without central assistance.\u00a0 We have before us a set of bargains, bargains between the Devil and the Deep Blue Sea.\u00a0 And not to decide is to decide.<\/p>\n<p>For me, I will take freedom over security and I will take security over convenience, and I will do so because I know that a world without failure is a world without freedom.\u00a0 A world without the possibility of sin is a world without the possibility of righteousness.\u00a0 A world without the possibility of crime is a world where you cannot prove you are not a criminal.\u00a0 A technology that can give you everything you want is a technology that can take away everything that you 
have.\u00a0 At some point, in the near future, one of us security geeks will have to say that there comes a point at which safety is not safe.<\/p>\n<p>I know full well that my views are neither pleasant nor fashionable, nor even attention getting enough to be dismissed.\u00a0 While time will tell if I am right, it would give no man pleasure then to say \u201cI told you so.\u201d<\/p>\n<p><em>Dr. Geer is the Chief Information Security Officer at In-Q-Tel.<\/em><\/p>\n<hr size=\"1\" \/>\n<p><a name=\"_ftn1\"><\/a>A cascade failure is a failure\u00a0that may begin at any of a large number of equivalent locations\u00a0but, once begun, proceeds due to the interconnectedness\u00a0of the components of a larger system.\u00a0 In nature, an avalanche\u00a0is a cascade failure; in electric power, a cascade failure occurs when\u00a0one power station goes offline and increases the load on others,\u00a0leading to the failure of the weakest remaining operating\u00a0unit, and so forth.<\/p>\n<p><a name=\"_ftn2\"><\/a> Creating a computing monoculture contributes to managerial\u00a0efficiency but risks having existing vulnerabilities\u00a0being exploited with similar industrial efficiency.\u00a0 In the\u00a0agricultural setting, the spread of a blight across a\u00a0susceptible species is made worse if that species occupies\u00a0entire counties or larger.\u00a0 In the computing setting, an\u00a0exploitable vulnerability invites attack in proportion to\u00a0its widespread identicality, as the attacker has a much\u00a0better return on his investment in crafting the exploit\u00a0when the breadth of vulnerable targets is all but\u00a0universal.<\/p>\n<p><a name=\"_ftn3\"><\/a> \u201cBlack swan event\u201d is a term of art attributable to Nicholas Taleb, who authored a book of the same name.\u00a0 The idea is that the things\u00a0we (should) fear most are too rare to develop experience\u00a0with handling and, frequently, represent complex failure\u00a0modes for which no one can be 
faulted for not having\u00a0recognized <em>a priori<\/em>.\u00a0 Taleb also stresses that our reaction\u00a0to an unprecedented beneficial event will be mild and\u00a0relaxed whereas our reaction to an unprecedented\u00a0malicious event will be panic-driven and anything\u00a0but relaxed, and this asymmetry of effect is present\u00a0even when the beneficial and malicious alternatives\u00a0have equal probability.\u00a0 <em>See generally<\/em> Nassim Nicholas Taleb, The Black Swan (2007).<\/p>\n<p><a name=\"_ftn4\"><\/a> I note, for the record, that the United States can absorb substantially more risk than most small countries, which, on a relative basis, are suffering or will suffer the most harm.<\/p>\n<p><a name=\"_ftn5\"><\/a> <em>See<\/em> Ann. Cal. Civ. Code \u00a7\u00a7 1798.29, 1798.82, 1798.84 (effective July 1, 2003).<\/p>\n<p><a name=\"_ftn6\"><\/a> Verizon Business, 2009 Data Breach Investigations Report (2009), <em>available at <\/em>www.verizonbusiness.com\/resources\/security\/reports\/2009_databreach_rp.pdf.<\/p>\n<p><a name=\"_ftn7\"><\/a> <em>See<\/em> U.S. Dep\u2019t of Homeland Security, National Infrastructure Protection Plan: Partnering to Enhance Protection and Resiliency (2009), <em>available at<\/em> http:\/\/www.dhs.gov\/xlibrary\/assets\/NIPP_Plan.pdf.<\/p>\n<p><a name=\"_ftn8\"><\/a> Convention on Cybercrime, <em>opened for signature<\/em> Nov. 23, 2001, T.I.A.S. 13174, E.T.S. 185.<\/p>\n<p><a name=\"_ftn9\"><\/a> The Financial Action Task Force (FATF) is an inter-governmental body committed to the development and promotion of international policies to combat money laundering and terrorist financing.\u00a0 FATF Home Page, http:\/\/www.fatf-gafi.org.<\/p>\n<p><a name=\"_ftn10\"><\/a> Note that I am not covering the question of what some call cyberwarfare.\u00a0 I refer you to two further readings if that is your interest: the analysis of the Russian-Georgian conflict by the U.S. 
Cyber Consequences Unit, and the National Research Council\u2019s consideration of the state of cyberwarfare policy.\u00a0 <em>See<\/em> U.S. Cyber Consequences Unit, Overview by the US-CCU of the Cyber Campaign Against Georgia in August of 2008 (2009), <em>available at<\/em> http:\/\/www.registan.net\/wp-content\/uploads\/sites\/82\/2009\/08\/US-CCU-Georgia-Cyber-Campaign-Overview.pdf; National Research Council, Technology, Policy, Law, and Ethics Regarding U.S. Acquisition and Use of Cyberattack Capabilities (William A. Owens et al. eds., 2009), <em>available at<\/em> http:\/\/www.anagram.com\/berson\/nrcoiw.pdf.\u00a0 These two collectively make the case that non-governmental actors can play a major role in cyberwarfare and that cyberwarfare is freighted with unanticipatable collateral damage.<\/p>\n<p><a name=\"_ftn11\"><\/a> <em>See<\/em> Qualys, Inc., The Laws of Vulnerabilities: Six Axioms for Understanding Risk (2006), <em>available at<\/em> http:\/\/www.qualys.com\/docs\/Laws-Report.pdf.<\/p>\n<p><a name=\"_ftn12\"><\/a> \u201cClean pipes\u201d is a term of art describing one\u00a0possible mechanism for general Internet safety where\u00a0ISPs are responsible for the\u00a0data they carry and, as such, are obliged to conduct\u00a0inspection before cartage begins.\u00a0 This is in complete\u00a0and sharp contrast to the \u201ccommon carrier\u201d assumption\u00a0that underlies not only Internet service provision\u00a0but also commercial freight handling and many other\u00a0aspects of modern life \u2014 <em>i.e.<\/em>, it is not the responsibility\u00a0of the carrier to inspect what it carries beyond the\u00a0obvious ideas of not accepting, say, a package which\u00a0is emitting smoke at the time of acceptance.<\/p>\n<p><a name=\"_ftn13\"><\/a> The Kerberos network authentication system is a product\u00a0of MIT\u2019s Project Athena, which was the first effective\u00a0and freely available mechanism for two parties who\u00a0share a common administrative 
authority to establish\u00a0mutual proof of identity.\u00a0 It is now so commonplace\u00a0that it has become part of the woodwork, so to speak,\u00a0and is very likely the single most commonly used\u00a0program in the world.\u00a0 The author was\u00a0privileged to be the manager in charge of all\u00a0technical development for Athena, including Kerberos.<\/p>\n","protected":false},"excerpt":{"rendered":"<p>Daniel Geer&#8217;s essay on cybersecurity policy, found in NSJ Volume 1, has been nominated for the Third Annual Social Security Blogger Awards.<\/p>\n","protected":false},"author":2,"categories":[242],"tags":[]}