December 10, 2004
CODE WARRIORS:
The Origin of Specious: And why reductionists are winning the Darwin wars (Harvey Blume, 9.23.02, American Prospect)
Stephen Jay Gould, who died of cancer at the age of 60 this past May, defined a place in American culture likely to remain vacant now that he is gone. He was, of course, the country's foremost opponent of creationism and champion of Darwinism, with a unique ability to bring the HMS Beagle and baseball batting averages together in a perfect paragraph or two. But what we may come to value most about him is the lonely stance he took in the Darwin wars.
In the heated, often venomous battle over Charles Darwin's legacy, Gould faced a redoubtable crew from the fields of sociobiology, evolutionary psychology, genetics and philosophy. What's more, many of these individuals, including E.O. Wilson, Steven Pinker, Daniel Dennett, Richard Dawkins and Robert Wright, have literary and polemical talents rivaling his own. Science will decide the relative merits of their arguments over topics such as punctuated equilibrium, speciation and the nature of complexity. But the cultural stakes of the dispute are obvious already. Gould's opponents advocate one form or another of a digital Darwinism. Their grand syntheses are unimaginable without the computer revolution. Their reductionist emphasis -- and their hopes for a single, internally coherent theory of everything from mitochondria to the human mind -- draws heavily on the tools, methods and examples of digitalization. Gould's views, on the other hand, owed next to nothing to computers. His Darwinism would have sounded much the same without computer code, artificial intelligence (AI) or the Internet.
Gould was by no means oblivious or opposed to digitalization. He records, for example, that browsing a window display festooned with the "beeping, flashing, almost living and pulsating" offshoots of computer technology forced his "reluctant paleontologist's soul to a recognition that the revolution is already upon us -- the most profound change in human life since everything from trains to television brought us all together." And he did not laugh at the great geek dream that a silicon brain might one day be built that would far surpass the organic model. Gould saw that real AI would signify a break with nature as we thought we knew it, but that didn't bother him. He was a fan of breaks, ruptures and discontinuities; his insistence on their importance to evolution was a chief bone of contention with his opponents. But when it came to digital discontinuity, he lacked any compelling personal need to make it to the other side. The typewriter was his keyboard of choice, when pencil and paper didn't suffice. And he preferred face-to-face encounters to e-mails and the Internet.
Compare this modus operandi with that of, say, Dawkins. His book The Blind Watchmaker was delayed, he confesses with the sly grin of the confirmed hacker, because he felt compelled to first write "Scrivener," his very own word processor. He was "addicted," as he put it, to machine code, the most unevolved (not to say barbaric) and certainly the most demanding of all computer languages, the use of which requires familiarity with the hardware foibles of one's particular machine. It's not a great stretch to see how the author of Scrivener might also be the leading proponent of the notion that the gene -- as opposed to the organism or the species -- is the basic unit and driving force of evolutionary change. Dawkins waxes rhapsodic about the fact that organisms and computers are, beneath it all, code-driven things. "The machine code of the genes," he writes, "is uncannily computer-like. Apart from differences in jargon, the pages of a molecular-biology journal might be interchanged with those of a computer-engineering journal."
Of course, in science, what inspires an idea has no bearing on its validity. Coding Scrivener might well have helped Dawkins understand the mechanics of the selfish gene. And about the inescapability of cultural influence on scientific work, all sides in the Darwin wars agree. As Gould put it, "The social embeddedness of science is not always a negative. Sometimes it helps you along to an insight you didn't have before." Dennett's complementary formulation is that progress "made in science is greatly abetted by the temporary hampering of the imagination [enforced by culture]." It would seem that on this issue, at least, a sort of Christmas truce prevailed among combatants. The problem, though, is that cultural influences as distinct as Gould's and Dennett's reinforce very different views of science. And when computers are the source of influence, as in Dennett's case, it's never clear where culture ends and science begins.
Dennett draws on the discipline of artificial intelligence not just for metaphors but for literal models of the human mind. AI, in his view, gives philosophers, at long last, a lab. If you want to try out a particular theory of mind, he admonishes them, don't sit around and theorize. Get out the manual, write the code, run the damn thing and see what happens. The philosophers, he might have written, have only interpreted the world; the point, however, is to code it. Like Dawkins, Dennett believes that code is the great unifier. If Dawkins can write Scrivener, then over the eons -- during which time its products are launched, tested and debugged by natural selection -- nature can, and in fact has, coded up such things as monkeys.
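Dennett's "write the code, run the damn thing" advice is easy to make concrete. Below is a minimal Python sketch in the spirit of Dawkins's famous "METHINKS IT IS LIKE A WEASEL" demonstration from The Blind Watchmaker -- a reconstruction, not his code, and the mutation rate and brood size are arbitrary choices: random mutation plus cumulative selection reaches a target phrase far faster than blind chance ever could.

    import random

    TARGET = "METHINKS IT IS LIKE A WEASEL"
    ALPHABET = "ABCDEFGHIJKLMNOPQRSTUVWXYZ "
    MUTATION_RATE = 0.05  # per-character mutation chance; an arbitrary choice
    BROOD_SIZE = 100      # offspring per generation; also arbitrary

    def fitness(candidate):
        # Score a string by how many characters already match the target.
        return sum(a == b for a, b in zip(candidate, TARGET))

    def mutate(parent):
        # Copy the parent, randomizing each character with small probability.
        return "".join(
            random.choice(ALPHABET) if random.random() < MUTATION_RATE else c
            for c in parent
        )

    parent = "".join(random.choice(ALPHABET) for _ in range(len(TARGET)))
    generation = 0
    while parent != TARGET:
        generation += 1
        # Breed a brood, keep the fittest; the parent competes too, so
        # fitness never slips backward. Selection does all the work.
        brood = [mutate(parent) for _ in range(BROOD_SIZE)] + [parent]
        parent = max(brood, key=fitness)

    print("Reached the target in", generation, "generations.")

The toy cheats, of course: it is handed its target in advance, which natural selection is not. It illustrates the speed of cumulative selection over blind chance, and nothing more.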
Some notions just parody themselves. That evolution works just like computer code, thereby proving that no one wrote the code, is obviously one of them.
Posted by Orrin Judd at December 10, 2004 5:58 AM
Once the article started waxing nostalgic for the eructations of that unscientific, Marxist mountebank, Stephen Jay Gould, I stopped reading. If you ever read his answer to 'The Bell Curve,' you would know that he was mathematically challenged once he ran out of fingers and toes, not even having the capacity of an Eskimo, who can at least conceive of a number greater than 20. ('1 man' plus 1 = 21)
Posted by: Bart at December 10, 2004 6:56 AM
Your consequent does not follow from the antecedent.
In other words, even if true, that evolution works just like computer code says nada about how the code came to be.
Bart:
Dr. Gould ran himself into a corner. His leftist beliefs could only hold if humans are essentially identical and malleable, yet Evolutionary theory dictates they are not. OJ, in this sense, is an Evolutionist.
Posted by: Jeff Guinn at December 10, 2004 7:02 AM
Jeff,
I am an 'evolutionist' myself and frankly don't like to argue on these threads about it because it strikes me as so self-evident just from powers of observation. The Jeffersonian notion of 'uncaused first cause' works fine for me. Just sit in the bleachers at Yankee Stadium and you will see virtually the entire Ascent of Man chart pass before you.
Posted by: Bart at December 10, 2004 7:22 AM
With us at the top?
Posted by: David Cohen at December 10, 2004 7:51 AM
One hopes, but sometimes my table manners would indicate otherwise.
Posted by: Bart at December 10, 2004 8:09 AM
Hey, I wrote a word processor in 6502 assembly with a friend of mine for a Commodore PET. I used it into college, but the computer only had 16K of memory so it didn't hold much text.
I even held off writing a book to write a program that turned out to be already written (MORE 3.1).
Jeff, believe me, if evolution works like computer code, it can only explain the existence of bugs.
Posted by: Randall Voth at December 10, 2004 9:06 AM
This is nonsense, of course, the notion that we will crack the code of the human mind and reproduce it on silicon. Dawkins and Dennett are overreaching themselves. The mind of a fruit fly possesses more complexity than any word processor. The problem with the brightest people is that they have no one above them in the mental food chain to give them a sense of their mental limits, so they imagine that they can solve any problem they set their minds to.
The complexity of the human entity (mind and body, you can't consider one without the other) is greater than the ability of the human mind to grasp.
Posted by: Robert Duquette at December 10, 2004 12:03 PM
Robert is exactly right.
Posted by: David Cohen at December 10, 2004 1:39 PM
Curiously, Dawkins et al. are trying to do what Orrin contends they ought to do to be called 'science.'
The alternative approach, that of Mayr, Stebbins, etc., is constantly jeered at here.
Make up your minds.
Posted by: Harry Eagar at December 10, 2004 3:47 PM
Boy, you'd think the folks at the American Prospect might check first to see if anybody had ever run that title before -- Ronald Bailey's 1997 article in Reason had almost the exact same headline.
Posted by: Matt Murphy at December 10, 2004 8:06 PM
"The complexity of the human entity (mind and body, you can't consider one without the other) is greater than the ability of the human mind to grasp."
Amen to that.
The most powerful computers today have but a tiny fraction of the capacity of the human brain. Not only that, but the human brain does not need to squeeze all its processing power through one CPU and perhaps one or two additional processors; instead, a hundred billion neurons are arranged and connected in three dimensions. It is beyond our comprehension, and while the simplification of comparing the human brain to a computer is useful in some contexts, on the whole it doesn't come anywhere close to doing the brain justice.
Posted by: creeper at December 13, 2004 4:18 AM
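For what it's worth, creeper's scale claim survives a back-of-envelope check. A minimal Python sketch, using order-of-magnitude figures that are assumptions rather than measurements (roughly 10^11 neurons, about a thousand synapses apiece, against a hundred-million-transistor desktop CPU of the day):

    # Back-of-envelope only; every figure is an order-of-magnitude assumption.
    neurons = 1e11              # ~a hundred billion, per the comment above
    synapses_per_neuron = 1e3   # a conservative, commonly cited rough figure
    synapses = neurons * synapses_per_neuron   # ~1e14 connections

    cpu_transistors = 1e8       # a high-end desktop CPU of 2004, roughly

    print("Synapses: about %.0e" % synapses)
    print("Synapses per CPU transistor: about %.0e" % (synapses / cpu_transistors))

On those crude numbers the brain has on the order of a million times as many connections as the chip has transistors, and, as the comment says, a synapse and a transistor are not even comparable units of work, so the ratio likely understates the gap.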