Error cascade: a definition and examples
<p>I&#8217;ve used the term &#8220;error cascade&#8221; on this blog several times, notably in referring to AGW hysteria. A commenter has asked me to explain it, and I think that&#8217;s a good idea as (a) the web sources on the concept are a bit confusing, and (b) I&#8217;ll probably use the term again &mdash; error cascades are all too common where science meets public policy.</p>
<p>In medical jargon, an &#8220;error cascade&#8221; is something very specific: a series of escalating errors in diagnosis or treatment, each one amplifying the effect of the previous one. This is a well-established term in the medical literature: <a href="http://www.ncbi.nlm.nih.gov/pubmed/17301195">this abstract</a> is quite revealing about the context of use.</p>
<p>There&#8217;s a slightly different term, <a href="http://www.info-cascades.info/">information cascade</a>, which is used to describe the propagation of beliefs and attitudes through crowd psychology. Information cascades occur because humans are social animals and tend to follow the behavior of those around them. When the social incentives are right, humans will substitute the judgment of others for their own.</p>
<p>A useful, related concept is <a href="http://www.powells.com/biblio/9780674707580?&#038;PID=27627">preference falsification</a>, the act of misrepresenting one&#8217;s desires or beliefs under perceived social pressures. Preference falsification amplifies information cascades &mdash; humans don&#8217;t just substitute the judgment of others for their own, they talk themselves into beliefs that most of those around them don&#8217;t actually hold but have become socially convinced they should <em>claim</em> to hold!</p>
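<p>To make the mechanics concrete, here is a minimal sketch of the standard sequential-choice model of information cascades, the one due to Bikhchandani, Hirshleifer, and Welch. This is my own toy illustration, not something from the sources linked above; the agent count, the 70% signal accuracy, and the tie-breaking rule are all assumptions of the sketch. Each agent privately sees a noisy signal about the true state and publicly sees every earlier agent&#8217;s choice; once the net public evidence reaches two choices in the same direction, it outweighs any single private signal, and every rational agent thereafter follows the herd.</p>
<pre><code>import random

def run_cascade(n_agents=30, p=0.7, true_state=1, seed=None):
    """Sequential-choice cascade (a toy model, not anyone's real data):
    each agent sees a private signal that is correct with probability p,
    plus all earlier agents' public choices, and picks whichever state
    the combined evidence favors."""
    rng = random.Random(seed)
    diff = 0      # net public evidence: revealed 1-signals minus 0-signals
    choices = []
    for _ in range(n_agents):
        signal = true_state if p > rng.random() else 1 - true_state
        if abs(diff) >= 2:
            # Public evidence outweighs any one private signal, so the
            # rational move is to ignore one's own signal and follow the
            # herd.  From here on, choices reveal nothing new: a cascade.
            choice = 1 if diff > 0 else 0
        else:
            # The private signal is still decisive (ties are broken in
            # favor of one's own signal), so the choice reveals it.
            choice = signal
            diff += 1 if signal == 1 else -1
        choices.append(choice)
    return choices

if __name__ == "__main__":
    # Individually sensible 70%-accurate agents, yet a noticeable share
    # of herds lock onto the wrong answer early and never recover.
    runs = [run_cascade(seed=s) for s in range(10_000)]
    wrong = sum(r[-1] == 0 for r in runs) / len(runs)
    print(f"herds settled on the wrong answer: {wrong:.1%}")
</code></pre>
<p>The unsettling part is that no agent in this model is being foolish; every one of them reasons correctly from what it can see. Add preference falsification on top, so that some agents report choices they don&#8217;t privately believe, and the public record becomes even more misleading than the model suggests.</p>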
<p>I use the term &#8220;error cascade&#8221; in a sense halfway between the restricted usage of the medical literature and &#8220;information cascade&#8221;, and I apply it specifically to a kind of bad science, especially bad science recruited in public-policy debates. A <em>scientific</em> error cascade happens when researchers substitute the reports or judgment of more senior and famous researchers for their own, and incorrectly conclude that their own work is erroneous or must be trimmed to fit a &#8220;consensus&#8221; view.</p>
<p>But it doesn&#8217;t stop there. What makes the term &#8220;cascade&#8221; appropriate is that those errors spawn other errors in the research they affect, which in turn spawn further errors. It&#8217;s exactly like a cascade from an incorrect medical diagnosis. The whole field surrounding the original error can become clogged with theory that has accreted around the error and is poorly predictive or over-complexified in order to cope with it.</p>
<p>Here&#8217;s a classic example of missing what&#8217;s in front of your face (which, incidentally, I first learned of from James Blish&#8217;s <cite>Cities In Flight</cite>; never let anyone tell you reading SF isn&#8217;t useful). For a couple of decades, cell biologists ignored the evidence of their own eyes <a href="http://www.ichg2006.com/abstract/1033.htm">when counting human chromosomes</a>. The correct number is 46, but a very respected researcher incorrectly &#8220;corrected&#8221; his early count of 46 to 48 and the error persisted. At least this one was relatively harmless; yes, the wrong number hung around in textbooks for a while, but there wasn&#8217;t any generative theory that depended on it in a big way.</p>
<p>For a cascade with wider theoretical consequences in its field, there&#8217;s the tale of Robert Andrews Millikan and the charge of the electron. The famed <a href="http://en.wikipedia.org/wiki/Oil-drop_experiment">oil-drop experiment</a> of 1909 demonstrated that electrical charge was quantized, and by implication proved the existence of subatomic particles. For this he deservedly got the physics Nobel in 1923 &mdash; but his value for the charge of the electron was significantly wrong. It was too low, because he had used a slightly incorrect figure for the viscosity of air.</p>
<p>Because Millikan was such an eminence, it took a long time and a lot of confusion and thrashing to correct this. If you get the charge of the electron wrong it has lots of consequences; all theories that use it have at least to include unphysical bugger factors to cancel the error. You end up with even applied science getting screwed up; if I recall correctly what I first read long ago about this debacle, it caused some problems for the then-new technique of spectroscopy.</p>
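<p>Feynman famously described the aftermath in &#8220;Cargo Cult Science&#8221;: subsequent published values of the electron&#8217;s charge crept up toward the modern figure in small steps, because experimenters who got results far from Millikan&#8217;s number distrusted them and hunted for reasons to throw them out. Here is a minimal sketch of that creep, with all the numbers and the deference weight invented for illustration rather than taken from the historical record:</p>
<pre><code>import random

def creeping_consensus(true_value=1.00, first_report=0.94, n_labs=25,
                       deference=0.7, noise=0.02, seed=42):
    """Toy model of anchored measurement: every lab measures honestly,
    but publishes a compromise between its own result and the last
    published value, so the record creeps toward the truth in small
    steps instead of jumping straight there.  All parameters invented."""
    rng = random.Random(seed)
    published = [first_report]
    for _ in range(n_labs):
        raw = rng.gauss(true_value, noise)   # unbiased private measurement
        # Deference pulls the reported value toward the prior consensus.
        report = deference * published[-1] + (1 - deference) * raw
        published.append(report)
    return published

if __name__ == "__main__":
    for i, value in enumerate(creeping_consensus()):
        print(f"report {i:2d}: {value:.3f}")
</code></pre>
<p>In this sketch the gap to the true value shrinks by roughly the deference factor with each report, so the record shows exactly the slow, embarrassed convergence Feynman described, rather than one clean correction.</p>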
<p>And yes, preference falsification distorts individuals&#8217; models of what others around them actually believe even in hard science. I once tripped over this in an amusing way, when I volunteered to be on a panel on cosmology and dark matter at some SF convention (might have been Arisia 2004). I did this in the belief that I&#8217;d probably be the lone dark-matter skeptic on the panel &mdash; the stuff smells altogether too damned much like phlogiston to me. But all four of the other panelists (all of them working physicists or astronomers) <em>also</em> turned out to be dark-matter skeptics, surprising not only me but each other as well!</p>
<p>For anybody who wonders, I favor the alternative explanation of why galaxies don&#8217;t fly apart: that gravity departs from inverse square at sufficiently long distances (admittedly a purely aesthetic preference, because that theory is not yet testable). But I digress. I didn&#8217;t tell that story to argue for this theory, but to illustrate how social pressure to falsify preferences can lead scientists to get stuck in <em>erroneous models of what their peers believe</em>, as well as to ignore experimental evidence.</p>
<p>In extreme cases, entire fields of inquiry can go down a rathole for years because almost everyone has preference-falsified almost everyone else into submission to a &#8220;scientific consensus&#8221; theory that is (a) widely but privately disbelieved, and (b) doesn&#8217;t predict or retrodict observed facts at all well. In the <em>worst</em> case, the field will become pathologized &mdash; scientific fraud will spread like dry rot among workers overinvested in the &#8220;consensus&#8221; view and scrambling to prop it up. Yes, anthropogenic global warming, I&#8217;m looking at <em>you!</em> </p>
<p>But climatology is far from the only field to get stuck in a rathole. I have reason to suspect, for example, that Noam Chomsky&#8217;s theory of universal grammar may have done something similar to comparative linguistics. I have spoken with linguists who will mutter, if no colleague can hear them, that Chomskian &#8220;universal grammar&#8221; has Indo-European biases and has to be chopped, diced, and bent out of shape to fit languages outside that group, to the point where it becomes vacuous (and effectively unfalsifiable). The gods alone know what distorting effects this rathole has had on analysis of language morphology (which would be like electron-charge measurements or chromosome counts in this case), but we&#8217;re not likely to be shut of them until Chomsky is dead.</p>
<p>There is an important difference between the AGW rathole and the others, though. Errors in the charge of the electron, or the human chromosome count, or structural analyses of obscure languages, don&#8217;t have political consequences (I chose Chomsky, who is definitely politically active, in part to sharpen this point). AGW theory most certainly does have political consequences; in fact, it becomes clearer by the day that the IPCC assessment reports were <a href="http://www.dailymail.co.uk/news/article-1245636/Glacier-scientists-says-knew-data-verified.html#ixzz0dd7XhIAS">fraudulently designed to fit the desired political consequences</a> rather than being based on anything so mundane and unhelpful as observed facts.</p>
<p>When a field of science is co-opted for political ends, the stakes for diverging from the &#8220;consensus&#8221; point of view become much higher. If politicians have staked their prestige and/or hopes for advancement on being the ones to fix a crisis, they don&#8217;t like to hear that &#8220;Oops! There is no crisis!&#8221; &mdash; and where that preference leads, grant money follows. When politics co-opts a field that is in the grip of an error cascade, the effect is to tighten that grip to the strangling point.</p>
<p>Consequently, scientific fields that have become entangled with public-policy debates are far more likely to pathologize &mdash; that is, to develop inner circles that collude in actual misconduct and suppression of refuting data rather than innocently perpetuating a mistake. The CRU &#8220;team&#8221; isn&#8217;t the only example of this. The sociological literature attacking civilian firearms possession has been <a href="http://esr.ibiblio.org/?p=314">rife with fraud for decades</a>. In a more recent example, prominent sociologist Robert Putnam has admitted that he sat for <em>years</em> on data indicating that increases in ethnic diversity result in a net loss of trust and social capital, because he feared that publishing it would give aid and comfort to political tendencies he disliked.</p>
<p>So&#8230;how do you tell when a research field is in the grip of an error cascade? The most general indicator I know of is a <a href="http://en.wikipedia.org/wiki/Consilience">consilience</a> failure. Eventually, one of the factoids generated by an error cascade is going to collide with a well-established piece of evidence from another research field that is not subject to the same groupthink.</p>
<p>Here&#8217;s an example: Serious alarm bells rang for me about AGW when the &#8220;hockey team&#8221; edited the Medieval Warm Period out of existence. I knew about the MWP because I&#8217;d read Annalist-style histories that concentrated on things like crop-yield descriptions from primary historical sources, so I knew that in medieval times wine grapes &mdash; implying what we&#8217;d now call a Mediterranean climate &mdash; were grown as far north as southern England and the Lake M&auml;laren region of Sweden! When the primary historical evidence grossly failed to match the &#8220;hockey team&#8217;s&#8221; paleoclimate reconstructions, it wasn&#8217;t hard for me to figure which had to be wrong.</p>
<p>Actually, my very favorite example of an error cascade revealed by consilience failure isn&#8217;t from climatology: it&#8217;s the oceans of bogus theory and willful misinterpretations of primary data generated by anthropology and sociology to protect the &#8220;tabula rasa&#8221; premise advanced by Franz Boas and other founders of the field in the early 20th century. Eventually this cascade collided with increasing evidence from biology and cognitive psychology that the human mind is <em>not</em> in fact a &#8220;blank slate&#8221; or completely general cognitive machine passively accepting acculturation. Steven Pinker&#8217;s book <a href="http://www.amazon.com/Blank-Slate-Modern-Denial-Nature/dp/0670031518">The Blank Slate</a> is eloquent about the causes and the huge consequences of this error.</p>
<p>Consilience failures offer a way to spot an error cascade at a relatively early stage, well before the field around it becomes seriously pathologized. At later stages, the disconnect between the observed reality in front of researchers&#8217; noses and the bogus theory may increase enough to cause problems <em>within</em> the field. At that point, the amount of peer pressure required to keep researchers from breaking out of the error cascade increases, and the operation of social control becomes more visible. </p>
<p>You are well into this late stage when anyone invokes &#8220;scientific consensus&#8221;. Science doesn&#8217;t work by consensus, it works by making and confirming predictions. Science is not democratic; there is only one vote, only Mother Nature gets to cast it, and the results are not subject to special pleading. When anyone attempts to end debate by insisting that a majority of scientists believe some specified position, this is the social mechanism of error cascades coming into the open and swinging a wrecking ball at actual scientific method right out where everyone can watch it happening.</p>
<p>The best armor against error cascades is knowing how this failure mode works so you can spot the characteristic behaviors. Talk of &#8220;deniers&#8221; is another one; that, and the moralistic quasi-religious language that it goes with, is a leading indicator that scientific method has left the building. Sound theory doesn&#8217;t have to be buttressed by demonizing its opponents; it demonstrates itself with predictive success.</p>
<p>UPDATE: Kudos to Bore Patch for pointing out a real humdinger of an example error cascade: <a href="http://borepatch.blogspot.com/2010/02/canals-of-mars-climate-research-unit.html">canals on Mars</a>.</p>