Program Provability and the Rule of Technical Greed
<p>In a recent discussion on G+, a friend of mine made a conservative argument for textual over binary interchange protocols on the grounds that programs always need to be debugged, and thus readability of the protocol streams by humans trumps the minor efficiency gains from binary packing.</p>
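<p>By way of concrete illustration (the example below is mine, and the little SET message it encodes is invented for the purpose), here is the same datum framed both ways in Python. The textual form can be read straight out of a wire dump by a human; the binary form cannot:</p>
<pre><code># A hypothetical sensor-update message, framed two ways.
import struct

# Textual framing: self-describing, greppable, parseable by eyeball.
text_msg = b"SET temperature=21.5 unit=C\n"

# Binary framing: a packed struct, smaller on the wire but opaque in a dump.
# ">Bfc" = big-endian: 1-byte opcode, 4-byte float, 1-byte unit code.
OP_SET = 0x01
bin_msg = struct.pack(">Bfc", OP_SET, 21.5, b"C")

print(text_msg)  # b'SET temperature=21.5 unit=C\n'  -- readable on sight
print(bin_msg)   # b'\x01A\xac\x00\x00C'             -- needs a decoder
</code></pre>
<p>The binary message wins a dozen-odd bytes per transaction; the textual one wins every hour some poor soul spends staring at a protocol trace at 3 AM.</p>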
<p>I agree with this argument; I&#8217;ve made it often enough myself, notably in <cite>The Art of Unix Programming</cite>. But it was something his opponent said that nudged at me. &#8220;Provable programs are the future,&#8221; he declaimed, pointing at seL4 and CompCert as recent examples of formal verification of real-world software systems. His implication was clear: we&#8217;re soon going to get so much better at turning specifications into provably correct implementations that debuggability will cease to be a strong argument for protocols that can be parsed by a Mark I Eyeball.</p>
<p>Oh foolish, foolish child, that wots not of the Rule of Technical Greed.</p>
<p><span id="more-6271"></span></p>
<p>Now, to be fair, the Rule of Technical Greed is a name I just made up. But the underlying pattern is a well-established one from the earliest beginnings of computing.</p>
<p>In the beginning there was assembler. And programming was hard. The semantic gap between how humans think about problems and what we knew how to tell computers to do was vast; our ability to manage complexity was deficient. And in the gap software defects did flourish, multiplying in direct proportion to the size of the programs we wrote.</p>
<p>And the lives of programmers were hard, and the case of their end-users miserable; for, strive as the programmers might, perfection was achieved only in toy programs while in real-world systems the defect rate was nigh-intolerable. And there was much wailing and gnashing of teeth.</p>
<p>Then, lo, there appeared the designers and advocates of higher-level languages. And they said: &#8220;With these tools we bring you, the semantic gap will lessen, and your ability to write systems of demonstrable correctness will increase. Truly, if we apply this discipline properly to our present programming challenges, shall we achieve the Nirvana of defect rates tending asymptotically towards zero!&#8221;</p>
<p>Great was the rejoicing at this prospect, and swiftly was the debate resolved, despite a few curmudgeons who muttered that it would all end in tears. And compilers were adopted, and for a brief while it seemed that peace and harmony would reign.</p>
<p>But it was not to be. For instead of applying compilers only to the scale of software engineering that had been customary in the days of hand-coded assembler, programmers were made to use these tools to design and implement ever more complex systems. The semantic gap, though less vast than it had been, remained large; our ability to manage complexity, though improved, was not what it could be. Commercial and reputational victory oft went to those most willing to accrue technical debt. Defect rates rose once again to just shy of intolerable. And there was much wailing and gnashing of teeth.</p>
<p>Then, lo, there appeared the advocates of structured programming. And they said: &#8220;There is a better way. With some modification of our languages and trained discipline exerted in the use of them, we can achieve the Nirvana of defect rates tending asymptotically towards zero!&#8221;</p>
<p>Great was the rejoicing at this prospect, and swiftly was the debate resolved, despite a few curmudgeons who muttered that it would all end in tears. And languages which supported structured programming and its discipline came to be widely adopted, and these did indeed have a strong positive effect on defect rates. Once again it seemed that peace and harmony might prevail, sweet birdsong beneath rainbows, etc.</p>
<p>But it was not to be. For instead of applying structured programming only to the scale of software engineering that had been customary in the days when poorly-organized spaghetti code was the state of the art, programmers were made to use these tools to design ever more complex systems. The semantic gap, though less vast than it had been, remained large; our ability to manage complexity, though improved, was not what it could be. Commercial and reputational victory oft went to those most willing to accrue technical debt. Defect rates rose once again to just shy of intolerable. And there was much wailing and gnashing of teeth.</p>
<p>Then, lo, there appeared the advocates of systematic software modularity. And they said: &#8220;There is a better way. By systematic separation of concerns and information hiding, we can achieve the Nirvana of defect rates tending asymptotically towards zero!&#8221;</p>
<p>Great was the rejoicing at this prospect, and swiftly was the debate resolved, despite a few curmudgeons who muttered that it would all end in tears. And languages which supported modularity came to be widely adopted, and these did indeed have a strong positive effect on defect rates. Once again it seemed that peace and harmony might prevail, the lion lie down with the lamb, technical people and marketeers actually get along, etc.</p>
<p>But it was not to be. For instead of applying systematic modularity and information hiding only to the scale of software engineering that had been customary in the days of single huge code blobs, programmers were made to use these tools to design ever more complex modularized systems. The semantic gap, though less vast than it had been, remained large; our ability to manage complexity, though now greatly improved, was not what it could be. Commercial and reputational victory oft went to those most willing to accrue technical debt. Defect rates rose once again to just shy of intolerable. And there was much wailing and gnashing of teeth.</p>
<p>Are we beginning to see a pattern here? I mean, I could almost write a text macro that would generate the next couple of iterations. Every narrowing of the semantic gap, every advance in our ability to manage software complexity, every improvement in automated verification, is sold to us as a way to push down defect rates. But how each tool actually gets used is to scale up the complexity of design and implementation to the bleeding edge of tolerable defect rates.</p>
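<p>Here, in fact, is roughly that text macro, as a toy Python sketch (the sketch is mine, and the future iterations it emits are placeholders, not predictions):</p>
<pre><code># Generate the next iterations of the parable from a template.
PARABLE = """\
Then, lo, there appeared the advocates of {tool}. And they said:
"There is a better way. With {promise}, we can achieve the Nirvana of
defect rates tending asymptotically towards zero!"

Great was the rejoicing, and {tool} came to be widely adopted, and did
indeed have a strong positive effect on defect rates.

But it was not to be. For instead of applying {tool} only to the scale
of software engineering that had been customary in the days of
{old_way}, programmers were made to design ever more complex systems.
Defect rates rose once again to just shy of intolerable. And there was
much wailing and gnashing of teeth.
"""

# Fill in your decade's fashion here.
iterations = [
    ("automated verification", "machine-checked proofs of correctness",
     "merely well-tested modules"),
    ("whatever comes next", "the next narrowing of the semantic gap",
     "provably-correct kernels"),
]

for tool, promise, old_way in iterations:
    print(PARABLE.format(tool=tool, promise=promise, old_way=old_way))
</code></pre>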
<p>This is what I call the Rule of Technical Greed: As our ability to manage software complexity increases, ambition expands so that defect rates and expected levels of technical debt are constant.</p>
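<p>For those who like their laws compact, one way to formalize this (the notation is mine, and it is meant to be suggestive rather than rigorous): write M(t) for our capacity to manage complexity, A(t) for the ambition of the systems we attempt, and D for the defect rate. Then ambition tracks capability, pinning defects at the tolerable ceiling:</p>
<pre><code>\frac{dA}{dt} \propto \frac{dM}{dt}
\quad\Longrightarrow\quad
D(t) = f\!\left(\frac{A(t)}{M(t)}\right) \approx D_{\mathrm{tolerable}}
</code></pre>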
<p>The application of this rule to automated verification and proofs of correctness is clear. I have little doubt these will be valuable tools in the relatively near future; I follow developments there with some interest and look forward to using them myself. </p>
<p>But anyone who says &#8220;This time it&#8217;ll be <em>different!</em>&#8221; earns a hearty horse-laugh. Been there, done that, still have the T-shirts. The semantic gap is a stubborn thing; until we become as gods and can will perfect software into existence as an extension of our thoughts, somebody&#8217;s still going to have to grovel through the protocol dumps. Design for debuggability will <em>never</em> be a waste of effort, because even if we believe our tools are perfect, proceeding from ideal specification to flawless implementation&#8230;how else will an actual human being actually <em>know</em>?</p>
<p>UPDATE: Having learned that &#8220;risk homeostasis&#8221; is an actual term of art in road engineering and health risk analysis, I now think this would be better tagged the &#8220;Law of Software Risk Homeostasis&#8221;.</p>