Evaluating the harm from closed source
<p>Some people are obsessive about never using closed-source software under any circumstances. Some other people think that because I&#8217;m the person who wrote the foundational theory of open source I ought to be one of those obsessives myself, and become puzzled and hostile when I demur that I&#8217;m not a fanatic. Sometimes such people will continue by trying to trap me in nutty false dichotomies (like <a href="http://esr.ibiblio.org/?p=538&#038;cpage=1#comment-381058">this guy</a>) and become confused when I refuse to play.</p>
<p>A common failure mode in human reasoning is to become too attached to theory, to the point where we begin ignoring the reality it was intended to describe. The way this manifests in ethical and moral reasoning is that we tend to forget why we make rules &#8211; to avoid harmful consequences. Instead, we tend to become fixated on the rules and the language of the rules, and end up fulfilling Santayana&#8217;s definition of a fanatic: one who redoubles his efforts after he has forgotten his aim.</p>
<p>When asking the question &#8220;When is it wrong (or right) to use closed-source software?&#8221;, we should treat it the same way we treat every other ethical question. First, by being very clear about what harmful consequences we wish to avoid; second, by reasoning from the avoidance of harm to a rule that is minimal and restricts people&#8217;s choices as little as possible.</p>
<p>In the remainder of this essay I will develop a theory of the harm from closed source, then consider what ethical rules that theory implies.</p>
<p>Ethical rules about a problem area don&#8217;t arise in a vacuum. When trying to understand and improve them, it is useful to start from widely shared intuitions about the problem. Let&#8217;s begin with the common intuitions about this one.</p>
<p>No matter how doctrinaire or relaxed about this topic they are, most people agree that closed-source firmware for a microwave oven or an elevator is less troubling than a closed-source desktop operating system. Closed-source games are less troubling than closed-source word processors. Any closed-source software used for communications among people raises particular worries that the authors might exploit their privileged relationship to it to snoop or censor.</p>
<p>There are actually some fairly obvious generative patterns behind these intuitions, but in order to discuss them with clarity we need to first consider the categories of harm from closed-source software.</p>
<p>The most fundamental harm we have learned to expect from closed source is that it will be poor engineering &#8211; less reliable than open source. I have argued at length elsewhere that bugs thrive on secrecy, and won&#8217;t rehash that case here. This harm varies in importance according to the complexity of the software &#8211; more complex software is more bug-prone, so the advantage of open source is greater and the harm from closed source more severe. It also varies according to how serious the expected consequences of bugs are; the worse they get, the more valuable open source is. I&#8217;ll call this &#8220;reliability harm&#8221;.</p>
<p>Another harm is that you lose options you would have if you were able to modify the software to suit your own needs, or have someone do that for you. This harm varies in importance according to the expected value of customization; greater in relatively general-purpose software with a large range of potential use cases for modified versions, less in extremely specialized software tightly coupled to a single task and a single deployment. I&#8217;ll call this &#8220;unhackability harm&#8221;.</p>
<p>Yet another harm is that closed-source software puts you in an asymmetrical power relationship with the people who are privileged to see inside it and modify it. They can use this asymmetry to restrict your choices, control your data, and extract rent from you. I&#8217;ll call this &#8220;agency harm&#8221;.</p>
<p>Closed source also raises, in various ways, the cost of transitioning away from the software, making escape from the other harms more difficult. Closed-source word processors using proprietary formats that no other program can fully handle are the classic example of this, but there are many others. I&#8217;ll call this &#8220;lock-in harm&#8221;.</p>
<p>[Update, two days later] A commenter points out another kind of harm from closed source: secrets can be lost, taking capabilities with them. There are magnetic media from the early days of computing &#8211; some famous cases include data of great historical interest recorded by the U.S. space program in the 1960s &#8211; that are intact but cannot be read because they used secret, proprietary data formats embodied only in hardware and specifications that no longer exist. This typifies an ever-present risk of closed-source software that becomes more severe as software-mediated communication gets more important. I&#8217;ll call this &#8220;amnesia harm&#8221;.</p>
<p>Finally, a particular software product is said to have &#8220;positive network externalities&#8221; when its value to any individual rises with the number of other people using it. Positive network externalities have consequences like those of lock-in harm; they raise the cost of transitioning out.</p>
<p>With these concepts in hand, let&#8217;s look at some real-world cases. </p>
<p>First, firmware for things like elevators and microwave ovens. Low reliability harm, because (a) it&#8217;s relatively easy to get right, and (b) the consequences of bugs are not severe &#8211; the most likely consequence is that the device just stops dead, rather than (say) hyper-irradiating you or throwing you through the building&#8217;s roof. Low unhackability harm &#8211; not clear what you&#8217;d do with this firmware if you could modify it. Low agency harm; it is highly unlikely that a toaster or an elevator will be used against you, and if it were it would be as part of a sufficiently larger assembly of surveillance and control technologies that simply being able to hack one firmware component wouldn&#8217;t help much. No lock-in harm, and no positive externalities. [There is some potential for amnesia harm if the firmware embodies good algorithms or tuning constants that can't be recovered by reverse-engineering.]</p>
<p>Because it scores relatively low on all these scales of harm, highly specialized device firmware is the least difficult case for tolerating closed source. But as firmware develops more complexity, flexibility, and generality, the harms associated with it increase. So, for example, closed-source firmware in your basement router can mean serious pain &#8211; there have been actual cases of it hijacking DNS, injecting ads into your web browsing, and so on.</p>
<p>At the other end of the scale, desktop operating systems score moderate to high on reliability harm (depending on your application mix and the opportunity cost of OS failures). They score high on unhackability harm even if you&#8217;re not a programmer, because closed source means you get fixes and updates and new features not when you can invest in them but only when the vendor thinks it&#8217;s time. They score very high on agency harm (consider how much crapware comes bundled with a typical Windows machine) and very high on lock-in [and amnesia] harm (closed proprietary file formats, proprietary video streaming, and other such shackles). They have strong positive externalities, too.</p>
<p>Now let&#8217;s talk about phones. Closed-source smartphone operating systems like iOS have the same bundle of harms attached to them that desktop operating systems do, and for all the same reasons. The interesting thing to notice is that dumbphones &#8211; even when they have general-purpose processors inside them &#8211; are a different case. Dumbphone firmware is more like other kinds of specialized firmware &#8211; there&#8217;s less value in being able to modify it, and less exposure to agency harm. Dumbphone firmware differs from elevator firmware mainly in that (a) there&#8217;s some lock-in [and amnesia] harm (dumbphones jail your contacts list) and (b) in being so much more complex that the reliability harm is actually something of an issue.</p>
<p>Games make another interesting intermediate case. Very low reliability harm &#8211; OK, it might be annoying if your client program craps out during a World of Warcraft battle, but it&#8217;s not like having your financial records scrambled or your novel manuscript trashed. Moderate unhackability harm; if you bought a game, it&#8217;s probably because you wanted to play <em>that game</em> rather than some hypothetical variant of it, but modifying it is at least imaginable and sometimes fun (thus, for example, secondary markets in map levels and skins). No agency harm unless they&#8217;re embedding ads. No lock-in harm, [low odds of amnesia harm,] some positive externalities.</p>
<p>Word processors (and all the other kinds of productivity software they&#8217;ll stand in for here) raise the stakes nearly to the level of entire operating systems. Moderate to high reliability harm, again depending on your actual use case. High unhackability harm for the same reasons as OSes. Lower agency harm than an OS, if only because your word processor doesn&#8217;t normally have an excuse to report your activity or stream ads at you. Very high lock-in [and amnesia] harm. If the overall harm from closed source is less here than for an OS, it&#8217;s mainly because productivity programs are a bit less disruptive to replace than an entire OS.</p>
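<p>One way to make this comparative evaluation concrete is to treat each kind of harm as a rough numeric score and sum the scores per product category. The sketch below does that in Python; the case names and harm axes come from the discussion above, but the 0&#8211;3 scores and the equal weighting are illustrative assumptions paraphrasing those qualitative judgments, not measurements.</p>
<pre><code># Illustrative sketch: treat each harm axis as a rough 0-3 score and
# rank the cases discussed above.  The scores and the equal weighting
# are assumptions paraphrasing the essay's qualitative judgments.

HARM_AXES = ("reliability", "unhackability", "agency",
             "lock-in", "amnesia", "externalities")

# 0 = negligible, 1 = low, 2 = moderate, 3 = high
CASES = {
    "appliance firmware": (1, 1, 1, 0, 1, 0),
    "basement router":    (2, 2, 2, 1, 1, 0),
    "dumbphone firmware": (2, 1, 1, 1, 1, 1),
    "game client":        (1, 2, 0, 0, 1, 1),
    "word processor":     (2, 3, 1, 3, 3, 2),
    "desktop/phone OS":   (3, 3, 3, 3, 3, 3),
}

def total_harm(scores, weights=(1,) * len(HARM_AXES)):
    """Weighted sum of per-axis harm scores."""
    return sum(s * w for s, w in zip(scores, weights))

# Print the cases from most to least harmful under this weighting.
for name, scores in sorted(CASES.items(),
                           key=lambda kv: total_harm(kv[1]),
                           reverse=True):
    print(f"{name:20} total harm = {total_harm(scores)}")
</code></pre>
<p>Under these assumed scores the ranking reproduces the intuitions we started from: operating systems and word processors at the top, games and appliance firmware near the bottom. Changing the weights shifts the margins, which is the point &#8211; the evaluation is comparative, not absolute.</p>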
<p>So far I haven&#8217;t made any normative claims. Here&#8217;s the only one I really need: we should oppose closed-source software, and refuse to use it, in direct proportion to the harms it inflicts.</p>
<p>That sounds simple and obvious, doesn&#8217;t it? And yet, there are people who I won&#8217;t name but whose initials are R and M and S, who persist in claiming that this position isn&#8217;t an ethical stance, that it is somehow fatally unprincipled. Which is what it looks like when you&#8217;ve redoubled your efforts after forgetting your aim.</p>
<p>Really, this squishy &#8220;unprincipled&#8221; norm describes the actual behavior even of people who talk like fanatics about closed source being evil. Who, even among the hardest core of the &#8220;free software&#8221; zealots, actually spends any effort trying to abolish closed-source elevator firmware? That doesn&#8217;t happen; desktop and smartphone OSes make better targets because they&#8217;re <em>more important</em> &#8211; and with that pragmatism, we&#8217;re right back to comparative evaluation of consequential harm, even if the zealot won&#8217;t acknowledge that to himself.</p>
<p>Now that we have this analysis, it leads to conclusions few people will find surprising. That&#8217;s a feature, actually; if there were major surprises it would suggest that we had wandered too far away from the intuitions or folk theory we&#8217;re trying to clarify. Conclusions: we need to be most opposed to closed-source desktop and smartphone operating systems, because those have the most severe harms and the highest positive-externality stickiness. We can relax about what&#8217;s running in elevators and microwave ovens. We need to push for open source in basement routers harder as they become more capable. And the occasional game of Angry Birds or Civilization or World of Warcraft is not in fact a terrible act of hypocrisy.</p>
<p>One interesting question remains. What is the proper ethical response to situations in which there is <em>no</em> open-source alternative?</p>
<p>Let&#8217;s take this right to an instructive extreme &#8211; heart pacemakers. Suppose you have cardiac arrhythmia; should you refuse a pacemaker because you can&#8217;t get one with open-source firmware? </p>
<p>That would be an insane decision. But it&#8217;s the exact kind of insanity that moralists become prone to when they treat normative rules as worship objects or laudable fixations, forgetting that these rules are really just devices for the avoidance of harm and pain.</p>
<p>The sane thing to do would be to notice that there are kinds of harm in the world more severe than the harm from closed source, remember that the goal of all your ethical rules is the reduction of harm, and act accordingly.</p>