Trends in Cyber Security
Dan Geer, 6 November 13, NRO

Thank you for the invitation to speak with you today, which, let me be clear, is me speaking as myself and not for anybody or anything else. As you know, I work the cyber security trade, that is to say that my occupation is cyber security. Note that I said "occupation" rather than "profession." On 18 September, the U.S. National Academy of Sciences, on behalf of the Department of Homeland Security, concluded that cyber security should be seen as an occupation and not a profession because the rate of change is too great to permit professionalization.[1] You may well agree that that rate of change is paramount, and that it is why cyber security is the most intellectually demanding occupation on the planet. In writing this essay, I will keep my comments to trends rather than point estimates, just as you asked in your invitation, but let me emphasize the wisdom of your request by noting that the faster the rate of change, the more it is trends that matter and not the value of any given variable at any given time. With luck, none of these trends will be something that you would argue with as a trend; argument, if any, will be in their interpretation. Note also that these trends do not constitute a set of mutually exclusive, collectively exhaustive characterizations of the space in which we live and work. Some of them are correlated with others. Some of them are newly emergent, some not. Some of them are reversible to a degree; some not reversible at all. I am not, today anyway, looking for causality.
Trend #1: Polarization

Much has been written about the increasing polarization of American life.[2] The middle is getting smaller, whether we are noting that the middle class is shrinking, that it is the middle of the country that is depopulating, that the political middle is lonelier and lonelier, that both farms and banks are now only too small to matter or too big to fail, that almost all journalism is now advocacy journalism, or that middle tier college education is a ticket to debt and nothing else. I submit that this trend towards polarization has come to cyber security. High end practice is accelerating away from the low end. The best skills are now astonishingly good, while the great mass of those dependent on cyber security are ever less able to even estimate what it is that they do not know, much less act on it. This polarization is driven by the fundamental strategic asymmetry of cyber security, namely that while the workfactor for the offender is the incremental price of finding a new method of attack, the workfactor for the defender is the cumulative cost of forever defending against all attack methods yet discovered. Over time, the curve for the cost of finding a new attack and the curve for the cost of defending against all attacks to date must cross. Once those curves cross, the offender never has to worry about being out of the money. That crossing event occurred some time ago. I'll come back to this first trend at the end, but I mention it first because polarization is becoming structural and is, of all the trends, the most telling. You can confirm this by asking the best cyber security people what they do on the Internet and what they won't do on the Internet. You will find it sharply different from what the public at large does or will do. The best people know the most, and they are withdrawing; they are rejecting technologies. To use the words and style of the Intelligence Community, they are compartmentalizing.
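The shape of that asymmetry can be put in a few lines of code. The unit costs below are invented purely for illustration, not measurements of anything; only the shapes of the two curves matter.

```python
# Offender pays only to find the next attack method; the defender pays to
# keep defending against every method discovered so far. Unit costs here
# are made up solely to show the shape of the two curves.

FIND_COST = 10    # offender: cost to discover one new attack method
DEFEND_COST = 3   # defender: recurring cost to maintain one defense

def offender_spend(methods_known):
    """Incremental: the offender pays once per new method, regardless of history."""
    return FIND_COST

def defender_spend(methods_known):
    """Cumulative: the defender pays for every method discovered to date."""
    return DEFEND_COST * methods_known

# The curves must cross once the catalog of known attacks is large enough.
crossing = next(n for n in range(1, 1000)
                if defender_spend(n) > offender_spend(n))
print(crossing)  # with these invented costs, the defender falls behind at n = 4
```

The exact crossing point depends entirely on the invented constants; the point that does not is that one curve is flat while the other climbs without bound, so they must cross.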
Trend #2: Trends themselves

The idea that under the pressure of constant change about all you can measure is the slope of the curve has gone from don't-bother-me-with-math to everybody's-doing-it. A Google search for the phrase "information security trends" turns up 13,400 hits, and no two of the top ten are from the same source. Consultancies talk about what they are seeing in the back room, product vendors talk about evolving needs, and reporters talk about what they are seeing out on the street. I am one of those folks. A Wall Street colleague and I run the Index of Cyber Security.[3] The ICS is what is called a sentiment-based index; if you are familiar with the US Consumer Confidence Index,[4] then you already know what a sentiment-based index is. Respondents to the ICS are top-drawer cyber security practitioners with direct operational responsibility who share, each month, how their view of security in several areas has changed since the month before. Because there are no absolutes in cyber security, not even widely agreed upon definitions of the core terms that make up cyber security practice, a sentiment-based index is, in fact, the best decision support that can be done. The Index asks the respondents monthly whether each of two dozen different risks has gotten better, gotten worse, gotten a lot better, gotten a lot worse, or stayed the same since the month before. Out of this, the Index of Cyber Security is calculated and released at 6pm on the last calendar day of the month, in further similarity to the Consumer Confidence Index. We write an analytic annual report that I have given to the organizers for your further reading. As an index of risk, a higher ICS number means higher risk. That risk number has risen, and seems likely to continue to rise. It is a composite trend line, but what is more interesting is that the components of the risk are much more varied, i.e., what is the dominating risk one month may not be the next.
We think that this captures, in part, the dynamic nature of cyber security and does so in a way not otherwise being done. Respondents seem to agree that the ICS does offer decision support to front-line people such as themselves. Trend #2, then, is that there is increasingly wide acceptance that absolute measures are not worth seeking, and a kind of confirmation that cyber security is a practice, not a device.

Trend #3: Physics and its impact on data

As you well know, more and more data is collected and more and more of that data is in play. The general, round-numbers dynamics of this trend are these: Moore's Law continues to give us two orders of magnitude in compute power per dollar per decade, while storage grows at three orders of magnitude and bandwidth at four. These are top-down economic drivers and they relentlessly warp what is the economically optimum computing model. The trend is clear; the future is increasingly dense with stored data but, paradoxically, despite the massive growth of data volume, that data becomes more mobile with time. As is obvious, this bears on cyber security, as data is what cyber security is all about. In 2007, Jim Gray gave a seminal talk[5] about the transformation of science, coining the term "fourth paradigm." By that he meant that the history of science is that science began as an endeavor organized around empirical observation. After that came the age of theory -- theorizing as the paradigm of what science did. Then science became computational, again meaning that the paradigm of what science did was to calculate. His argument for a fourth era was that of a paradigm shift from computational science to data-intensive science. You here at NRO need no primer on the power of that shift in paradigm, but I am here to tell you that cyber security is embracing that fourth paradigm and it is doing it now. Ecology professor Philip Greear would challenge his graduate students to catalog all the life in a cubic yard of forest floor.
Computer science professor Donald Knuth would challenge his graduate students to catalog everything their computers had done in the last ten seconds. It is hard to say which is more difficult, but everywhere you look, cyber security practitioners are trying to get a handle on "What is normal?" so that that which is abnormal can be identified early in the game. Behavioral approaches leading towards intrusion detection are exactly the search for anomaly, and they are data-based. The now-famous attack on RSA Data Security that led to RSA buying NetWitness is an example of wanting to know everything so as to recognize something. I'm on the record at book length[6] that the central organizing principle behind a competent security program is to instrument your data sufficiently well that nothing moves without it being noticed. Physics has made it possible to put computers everywhere. Physics has made it possible to fill them all with data. Cyber security is barely keeping up, and not just because of two, three, or four orders of magnitude in the physics upstream of the marketplace.

Trend #4: Need for prediction

We all know that knowledge is power. We all know that there is a subtle yet important distinction between information and knowledge. We all know that a negative declaration like "X did not happen" can only be proven if you have the enumeration of *everything* that did happen and can show that X is not in it. We all know that a stitch in time saves nine, but only if we know where to put the stitch. We all know that without security metrics, the outcome is either overspending or underprotecting. The more technologic the society becomes, the greater the dynamic range of possible failures. When you live in a cave, starvation, predators, disease, and lightning are about the full range of failures that end life as you know it, and you are well familiar with all of them.
When you live in a technologic society where everybody and everything is optimized in some way akin to just-in-time delivery, the dynamic range of failures is incomprehensibly larger and largely incomprehensible. The wider the dynamic range of failure, the more prevention is the watchword. As technologic society grows more interdependent within itself, the more it must rely on prediction based on data collected in broad ways, not targeted ways. Some define risk as the probability of a failure times the cost of that failure. To be clear, a trend in favor of making predictions is a trend subsidiary to a trend in the cost of failure. I've written at length elsewhere about how an increasing downside cost of failure requires that we find ways to be resilient -- not resilient in the sense of rich redundancy, not resilient in the sense of having quick recovery mechanisms, but resilient in the sense of having alternate primary means that do not share common mode risks. As such, I strongly recommend that manual means be preserved wherever possible, because whatever those manual means are, they are already fully capitalized and they do not share common mode risk with digital means. There is now more information security risk sloshing around the economy than could actually be accepted were it exposed. The tournament now turns to who can minimize their risk the best, which, in the civilian economy at large, means who can most completely externalize their downside information security costs. The weapons here are perhaps as simple as the wisdom of Delphi, "Know thyself" and "Nothing to excess" -- know thyself in the sense of quantitative rigor and a perpetual propensity to design information systems with failure in mind; nothing to excess in the sense of mimicking the biologic world's proof by demonstration that species diversity is the greatest bulwark against loss of an ecosystem.
Trend #5: Abandonment

If I abandon a car on the street, then eventually someone will be able to claim title. If I abandon a bank account, then the State will eventually seize it. If I abandon real estate by failing to remedy a trespass, then in the fullness of time adverse possession takes over. If I don't use my trademark, then my rights go over to those who use what was and could have remained mine. If I abandon my spouse and/or children, then everyone is taxed to remedy my actions. If I abandon a patent application, then after a date certain the teaching that it proposes passes over to the rest of you. If I abandon my hold on the confidentiality of data, such as by publishing it, then that data passes over to the commonweal, not to return. If I abandon my storage locker, then it will be lost to me and may end up on reality TV. The list goes on. Apple computers running OS X 10.5 or earlier get no updates (comprising about half the installed base). Any Microsoft computer running XP gets no updates (comprising about half the installed base). The end of security updates follows abandonment. It is certainly ironic that freshly pirated copies of Windows get security updates when older versions bought legitimately do not. Stating the obvious, if Company X abandons a code base, then that code base should be open sourced. Irrespective of security issues, many is the time that a bit of software I use has gone missing because its maker went missing. But with respect to security, were an abandoned code base open sourced, some constellation of {I,we,they,you} would be willing and able to provide security patches or workarounds as time and evil require. Would the public interest not be served, then, by a conversion to open source for abandoned code bases? But wait, you say, isn't purchased software on a general purpose computer a thing of the past? Isn't the future auto-updated smartphone clients transacting over armored private (carrier) networks to auto-updated cloud services? Maybe; maybe not.
If the two major desktop suppliers update only half of today's desktops, then what percentage will they update tomorrow? If you say "Make them try harder!," then the legalistic, regulatory position is your position, and the ACLU is already trying that route. If smartphone auto-update becomes a condition of merchantability and your smartphone holds the keying material that undeniably says that its user is you, then how long before a FISA court orders a special auto-update to *your* phone for evidence gathering? If you say "But we already know what they're going to do, don't we?," then the question is what to do about the abandoned code bases. Open-sourcing abandoned code bases is the worst option, except for all the others. But if seizing an abandoned code base is too big a stretch for you before breakfast, then start with a Public Key Infrastructure Certifying Authority that goes bankrupt and ask "Who gets the keys?"

Trend #6: Interdependence

The essential character of a free society is this: that which is not forbidden is permitted. The essential character of an unfree society is the inverse: that which is not permitted is forbidden. The U.S. began as a free society without question; the weight of regulation, whether open or implicit, can only push it toward being unfree. Under the pressure to defend against offenders with a permanent structural advantage, defenders who opt for forbidding anything that is not expressly permitted are cultivating a computing environment that does not embody the freedom with which we are heretofore familiar. Put concretely, the central expression of a free society is a free market, and the cardinal measure of a free market is the breadth of real choice -- choice that goes beyond color and trim and body style to choices that optimize discordant, antithetical goal states. The level of choice on the Internet is draining down. You may revel in the hundreds of thousands of supposedly new voices that have found a way to chatter in full view.
You may note that new "apps" for Android plus iPhone are appearing at over a thousand per day. You may rightly remind us all that technology is democratizing in the sense that powers once reserved for the few are now irretrievably in the hands of the many. What stands against that, and why I say that it stands against that, is increasing interdependence. We humans can design systems more complex than we can then operate. The financial sector's "flash crashes" are an example of that; perhaps the fifty interlocked insurance exchanges for Obamacare will soon be another. Above some threshold of system complexity, it is no longer possible to test, it is only possible to react to emergent behavior. The lowliest Internet user is entirely in the game of interdependence -- one web page can easily touch scores of different domains. While writing this, the top-level page from cnn.com had 400 out-references to 85 unique domains, each of which is likely to be similarly constructed and all of which move data one way or another. If you leave those pages up and they have an auto-refresh, then moving to a new network signals to every one of those ad networks that you have so moved. The wellspring of risk is dependence, especially dependence on shared expectations of shared system state, i.e., interdependence on the ground. If you would accept that you are most at risk from the things you most depend upon, then damping dependence is the cheapest, most straightforward, lowest latency way to damp risk, just as the fastest and most reliable way to put more money on a business's bottom line is through cost control.

Trend #7: Automation

Shoshana Zuboff of the Harvard Business School notably described three laws of the digital age:

. Everything that can be automated will be automated.
. Everything that can be informated will be informated.
. Every digital application that can be used for surveillance and control will be used for surveillance and control.
It is irrelevant, immaterial, and incompetent to argue otherwise. For security technology, Zuboff's Laws are almost the goal state, that is to say that the attempt to automate information assurance is in full swing everywhere, the ability to extract information from the observable is in full swing everywhere, and every digital application is being instrumented. Before In-Q-Tel, I worked for a data protection company. Our product was, and I believe still is, the most thorough on the market. By "thorough" I mean the dictionary definition, "careful about doing something in an accurate and exact way." To this end, installing our product instrumented every system call on the target machine. Data did not and could not move in any sense of the word "move" without detection. Every data operation was caught and monitored. It was total surveillance data protection. What made this product stick out was that very thoroughness, but here is the point: unless you fully instrument your data handling, it is not possible for you to say what did not happen. With total surveillance, and total surveillance alone, it is possible to treat the absence of evidence as the evidence of absence. Only when you know everything that *did* happen with your data can you say what did *not* happen with your data. But this trend of automating is now leaving the purely defensive position behind. In a press release two weeks ago today,[7] DARPA signaled exactly that, and I quote:

   [T]he Defense Advanced Research Projects Agency intends to hold the Cyber Grand Challenge -- the first-ever tournament for fully automatic network defense systems. DARPA envisions teams creating automated systems that would compete against each other to evaluate software, test for vulnerabilities, generate security patches and apply them to protected computers on a network. The growth trends ... in cyber attacks and malware point to a future where automation must be developed...
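Returning for a moment to the point above about proving negatives: the logic of "absence of evidence as evidence of absence" fits in one function. The function and event names below are hypothetical, invented only to make the logic concrete.

```python
# "X did not happen" is provable only against a complete record of what did.
# With full instrumentation, absence of evidence becomes evidence of absence;
# with partial capture, a missing event proves nothing.

def provably_did_not_happen(event, observed_events, capture_is_total):
    """Return True only when the negative declaration is actually proven."""
    if event in observed_events:
        return False  # it happened
    if capture_is_total:
        return True   # complete enumeration, event absent: negative proven
    raise ValueError("incomplete capture: cannot prove a negative")

# Hypothetical event log from a fully instrumented machine.
log = {"read:/etc/passwd", "write:/tmp/x"}
print(provably_did_not_happen("exfil:db_dump", log, capture_is_total=True))  # True
```

Note the third branch: with anything less than total capture, the honest answer is not "no," it is "unknowable."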
The automation trend is irreversible, but it raises a question that I fear no one will answer in a way that doesn't merely reflect their corporate or institutional interest, namely: are people in the loop a failsafe or a liability?[8]

Trend #8: Dual use

I've become convinced that all security technology is dual use. While I am not sure whether dual use is a trend or a realization of an unchanging fact of nature, the obviousness of dual use seems greatest in the latest technologies, so I am calling it a trend in the sense that the straightforward accessibility of the dual use characteristics of new technology is growing. There are a lot of examples, but in the physical world any weapon usable for defense can be repurposed for offense. Every security researcher looking for exploitable flaws is deep in the dual use debate, because once discovered, those flaws can be patched or they can be sold. The cyber security products that promise total surveillance over the enterprise are, to my mind, an offensive strategy used for defensive purposes. There was a time when flaws were predominantly found by adventurers and braggarts. Ten-plus years of good work by the operating system vendors elbowed the flaw finders out of the operating system and, as a result, our principal opponents changed over from adventurers and braggarts to professionals. Finding vulnerabilities and exploiting them is now hard enough that it has moved out of the realm of being a hobby and into the realm of being a job. This changed several things, notably that braggarts share their findings because they are paid in bragging rights. By contrast, professionals do not share and are paid in something more substantial than fame. The side effect has been a continued rise in the percentage of all vulnerabilities that are previously unknown.
The trend, in other words, is that by crushing hobbyists we've raised the market price of working exploits to where now our opponents pay for research and development out of revenue. Simulating what the opponent can do thus remains the central task of defensive research. Much of that research is in crafting proofs of concept that such and such a flaw can be taken advantage of. Corman's neologism of "HD Moore's Law" says that the power of the casual attacker grows as does the power of Metasploit.[9] It is hard to think of a better description of dual use.

Trend #9: The blurring of end-to-end

To my mind, the most important technical decision ever made was that the security of the Internet was to be "end-to-end."[10] "End-to-end" is a generic technical term yet simple to explain: the Internet was built on the premise that two entities could connect themselves to each other and decide what they wanted to do. The network was a delivery vehicle, but the form, content, and security of the connection between the two ends was to be their own choice. End-to-end is a model where the terminal entities are smart and the network is dumb. This is completely (completely) different from a smart network with dumb terminal entities at the end of the wire. No other design decision of the Internet comes close to the importance of its being an end-to-end design. With end-to-end, security is the choice of the terminal end-points, not something built into the fabric of the Internet itself. That is American values personified. It is the idea that accountability, not permission seeking, is the way a government curbs the misuse of freedoms, and, as accountability scales but permission seeking does not, accountability wins. End-to-end security is the digital manifestation of the right of association and, in any case, is what enabled the Internet to become relevant in the first place.
End-to-end does precisely what Peter Drucker told us to do: "Don't solve problems, create opportunities." The provision of content from anywhere to anywhere, which is the very purpose of an internetwork, is a challenge to sovereignty. America's Founders wanted no sovereign at all, and they devised a government that made the center all but powerless and the periphery fully able to thumb its nose at whatever it felt like. Much ink has been spilled on the frontier ethic versus the wishful policies favored by the comfortable urbanity of the welfare state, but the Internet's protocols have everything in common with the former and nothing in common with the latter. The free man requires the choice of with what degree of vigor to defend himself. That is a universal; America's Founders laid that down in the Second Amendment, just as did George Orwell in the English democratic socialist weekly "Tribune" when he said, "That rifle on the wall of the laborer's cottage or working class flat is the symbol of democracy. It is our job to see that it stays there." Were George Washington or George Orwell still among us, they would know that smart end-points and dumb networks are what freedom requires, that smart networks protecting dumb end-points breed compliant dependency. But the trend is otherwise, and not just because of the fatuous fashionability of entitlement, but rather because of a blurring of what the term "end" means. So very many people have adopted automatic synchronization of multiple devices they own that one has to ask whether their tablet is an end or their collection of mutually synchronized devices is an end. So many Internet-dependent functions are spread silently across numerous entities and applications that what is the end may well be more dynamic than can be described. If an end implies unitary control on the part of an owner, then set theory says that mutually synchronized devices are a unitary end. 
That blurring of "end" makes end-to-end provisioning problematic, as a set of devices cannot be assumed to be equally on and equally participating in any given transaction. Quoting Clark & Blumenthal:[11]

   There is a risk that the range of new requirements now emerging could have the consequence of compromising the Internet's original design principles. Were this to happen, the Internet might lose some of its key features, in particular its ability to support new and unanticipated applications. We link this possible outcome to a number of trends: the rise of new stakeholders in the Internet,... new government interests, the changing motivations of the growing user base, and the tension between the demand for trustworthy overall operation and the inability to trust the behavior of individual users.

This is nowhere so evident as in security, that is to say in the application of the end-to-end principle to cyber security. What does end-to-end secure transport mean when travelocity.com is showing you a page dynamically constructed from a dozen other entities?

Trend #10: Complexity in the supply chain

Even without resorting to classified information, it is now clear that supply chain attacks have occurred. Whether reading journalistic accounts or Richard Clarke's novel _Breakpoint_, the finding is that the supply chain creates opportunities for badness. None of the things I've yet read, however, blames supply chain risk on its complexity, per se, but that is the trend that matters. Security is non-composable -- we can get insecure results even when our systems are assembled from secure components. The more components, the less likely a secure result. This applies to supply chains that are growing ever more complex under the pressure of just-in-time, spot-market sourcing of, say, memory chips, and so forth and so on.
Because the attacker has only to find one component of that chain to be vulnerable while the defender has to assure that all components are invulnerable, rising supply chain complexity guarantees increased opportunity for effective attack. It cannot do otherwise, and the trend is clear.

Trend #11: Monoculture(s)

Beginning with Forrest in 1997,[12] regular attention has been paid to the questions of monoculture in the network environment. There is no point belaboring the fundamental question, but let me state it for the record: cascade failure is so very much easier to detonate in a monoculture -- so very much easier when the attacker has only to write one bit of malware, not ten million. The idea is obvious; believing in it is easy; acting on its implications is, evidently, rather hard. I am entirely sympathetic to the actual reason we continue to deploy computing monocultures -- making everything almost entirely alike is, and remains, our only hope for being able to centrally manage it in a consistent manner. Put differently, when you deploy a computing monoculture you are making a fundamental risk management decision: that the downside risk of a black swan event is more tolerable than the downside risk of perpetual inconsistency. This is a hard question, as all risk management is about changing the future, not explaining the past. Which would you rather have, the unlikely event of a severe impact, or the day-to-day burden of perpetual inconsistency? When we opt for monocultures we had better opt for tight central control. This supposes that we are willing to face the risks that come with tight central control, of course, including the maximum risk of all auto-update schemes, namely the hostile takeover of the auto-update mechanism itself. Computer desktops are not the point; embedded systems are. The trendline in the number of critical monocultures seems to be rising, and many of these are embedded systems that are both long lived and without a remote management interface.
That combination -- long lived and not reachable -- is the trend that must be reversed. Whether to insist that embedded devices self-destruct at some age or that remote management of them be a condition of deployment is the question. In either case, the Internet of Things and the appearance of microcontrollers in seemingly every computing device should raise hackles on every neck.[13]

Trend #12: Attack surface growth versus skill growth

Everyone here knows the terminology "attack surface" and knows that one of the defender's highest goals is to minimize the attack surface wherever possible. Every coder adhering to a security-cognizant software lifecycle program does this. Every company or research group engaged in static analysis of binaries does this. Every agency enforcing a need-to-know regime for data access does this. Every individual who reserves one low-limit credit card for their Internet purchases does this. I might otherwise say that any person who encrypts their e-mail to their closest counterparties does this, but because consistent e-mail encryption is so rare, encrypting one's e-mail marks it for collection and indefinite retention by those entities in a position to do so, regardless of what country you live in. In cyber security practice, the trend is that we practitioners as a class are getting better and better. We have better tools, we have better understood practices, and we have more colleagues. That's the plus side. But I'm interested in the ratio of skill to challenge, and as far as I can estimate, we are expanding the society-wide attack surface faster than we are expanding our collection of tools, practices, and colleagues. If you are growing more food, that's great; if your population is growing faster than your improvements in food production, that's bad. In the days of radio, there was Sarnoff's Law, namely that the value of a broadcast network was proportional to N, the number of listeners.
Then came packetized network communications and Metcalfe's Law, that the value of a network is proportional to N squared, the number of possible two-way conversations. We are now in the era of Reed's Law, where the value of a network is proportional to the number of groups that can form in it, that is to say 2 to the power N. Reed's Law is the new reality because it fits the age of social networks. In each of these three laws as publicly stated, the sign bit is positive, but in parallel with the claim that everything is dual use, the sign bit can also be negative, because interconnections are a contributor to the net attack surface. If an Internet of Things is indeed imminent, then the upward bend in the curve of the global attack surface will grow steeper regardless of what level of risk there is for any one thing, so long as that level of risk is always non-zero.

Trend #13: Specialization

Everyone my age working in cyber security was trained for something else, and that switch between one field and another brings along the hybrid vigor of seeing the cyber security world through a different lens. Statisticians, civil engineers, and lawyers alike can contribute. But the increasing quality of preparatory education, the increasing breadth of affairs for which cyber security is needful, and the increasing demand for skill of the highest sort mean that the humans in the game are specializing. While some people like to say "Specialization is for insects," tell me that the security field itself is not specializing. We have people who are expert in forensics on specific operating system localizations, expert in setting up intrusion response, expert in analyzing large sets of firewall rules using non-trivial set theory, expert in designing egress filters for universities that have no ingress filters, expert in steganographically watermarking binaries, and so forth. Generalists are becoming rare, and they are being replaced by specialists.
This is biologic speciation in action, and the narrowing of ecologic niches. In rough numbers, there are somewhere close to 5,000 technical certifications you can get in the computer field, and their number is growing, which proves the conjecture: specialization and speciation are not just for insects, and they will not stop.

--------------

What does it all mean?

All of these trends reflect state changes that are ongoing and likely to continue. If we could count on them to maintain some smooth progression, then we might plan actions around them, but we cannot. At any moment a game changer may arrive, and that is not something you can plan for, per se.

I began with the trend of polarization and I end with it. The range of cyber security skills between the best and the worst is growing wider. As the worst outnumber the best and always will, we need look no further than the history of empires where, in the end, it is polarization that kills them. The Internet is an empire. The Internet was built by academics, researchers, and hackers -- meaning that it embodies the liberal cum libertarian cultural interpretation of "American values," namely that it is open, non-hierarchical, self-organizing, and leaves essentially no opportunities for governance beyond a few rules on how to keep two parties in communication over the wire. Anywhere the Internet appears, it brings those values with it. Other cultures, other governments, know that these are our strengths and that we are dependent upon them; hence as they adopt the Internet they become dependent on those strengths and thus on our values. A greater challenge to sovereignty does not exist, which is why the Internet will either be dramatically balkanized or it will morph into an organ of world government. In either case, the Internet will never again be as free as it is this morning.
That polarization of cyber security within the Internet grows from our willing dependence on it despite the other trends of which I've spoken. I don't see us deciding to damp that risk by curbing dependence, though, to be clear, that is precisely the trajectory my own life now follows. I don't see the cyber security field solving the problem, as the problem to be solved is getting bigger faster than we are getting better. I see, instead, the probability that legislatures will relieve the more numerous incapable of the joint consequences of their dependence and their incapability by assigning liability so as to collectivize the downside risk of cyber insecurity into insurance pools. We are forcibly collectivizing the downside risk of disease, most particularly the self-inflicted kinds; why would we not do that for the downside risk of cyber insecurity and, again, particularly the self-inflicted kinds?

Where there are so many questions and so few answers, such deep needs and such shallow appreciation of trend directions, the greatest risk is the risk of simplistic solutions carried forward by charismatic fools.

There is never enough time; thank you for yours.

--------------

[1] "Professionalizing the Nation's Cyber Workforce?"
    www.nap.edu/openbook.php?record_id=18446

[2] _Hollowing Out the Middle_, Carr & Kefalas; _Race Against the Machine_, Brynjolfsson & McAfee; _Average Is Over_, Cowen

[3] "The Index of Cyber Security," cybersecurityindex.org

[4] "The Consumer Confidence Index," Technical Note, 2011
    tinyurl.com/3sb633k

[5] Gray, "eScience," NRC-CSTB, Mountain View CA, 2007
    research.microsoft.com/en-us/um/people/gray/talks/NRC-CSTB_eScience.ppt

[6] Geer, _Economics and Strategies of Data Security_, 2008

[7] www.darpa.mil/NewsEvents/Releases/2013/10/22.aspx

[8] Geer, "People in the Loop: Failsafe or a Liability?", 2012
    geer.tinho.net/geer.suitsandspooks.8ii12.txt

[9] Corman, "Intro to HDMoore's Law," 2011
    blog.cognitivedissidents.com/2011/11/01/intro-to-hdmoores-law

[10] Saltzer, Reed, & Clark, "End-to-End Arguments in System Design," 1981
     web.mit.edu/Saltzer/www/publications/endtoend/endtoend.pdf

[11] Clark & Blumenthal, "Rethinking the Design of the Internet: The End-to-End Arguments vs. the Brave New World," 2001
     cyberlaw.stanford.edu/e2e/papers/TPRC-Clark-Blumenthal.pdf

[12] Forrest, Somayaji, & Ackley, "Building Diverse Computer Systems," HotOS-VI, 1997
     www.cs.unm.edu/~immsec/publications/hotos-97.pdf

[13] Farmer, "IPMI: Freight Train to Hell v2.01," 2013
     fish2.com/ipmi/itrain.pdf

=====

This and other material on file under geer.tinho.net/pubs