Thursday, December 11, 2014

The Testament of Gideon Mack by James Robertson

The Testament of Gideon Mack [Amazon, Goodreads] by James Robertson has a single unambiguous theme: doubt. That is the only unambiguous feature of the book. Robertson manages masterfully to question every aspect of his own story until one is forced to question nearly everything. Upon finishing the book, I was almost certain that I had, in fact, read it.
The book's notional plot describes the fantastic story of the eponymous Gideon Mack, a Presbyterian minister in a small Scottish seaside village, and his encounter with the Devil. At least, it might have been the Devil. Mack thought so. Sometimes. He might have just been insane. Many of the signs were there, from the fantasies of his loveless childhood to his literal howling at the base of a standing stone that might, or might not, have been imagined. Mack wrote his testament and it supposedly made its way to a publisher who doubted whether he should publish it. The publisher's notes frame Mack's version and supply both context and, following Mack's death, a conclusion of a sort. In a nice twist in the endnotes, the supposed publisher assures us that each sale of The Testament of Gideon Mack will benefit an aged care home in the fictional town described in the story. Robertson leaves us little choice but to doubt his word from beginning to end. This is fiction that will demand that you think.
Mack himself is a classic anti-hero, a characterization that Robertson uses in a footnote to describe another author's character. Robertson draws on many literary references. His characters read, and are influenced by, novels and histories both real and fictional. The author himself holds a Ph.D. in history from Edinburgh University; his dissertation on the works of Sir Walter Scott, a fellow Scotsman and author of such famous early nineteenth century works as Ivanhoe, echoes in The Testament. The women in Mack's life, Jenny the unloved and now deceased wife, Elsie the lover, and diseased Catherine the disputant, find their archetypes in nineteenth century English and French literature such as Gustave Flaubert's Madame Bovary and Lord Byron's Don Juan.
The real publisher, Hamish Hamilton in Scotland, part of Penguin Books, assures us in a standard disclaimer, "This is a work of fiction. Names, characters, places, and incidents are either the product of the author’s imagination or are used fictitiously, and any resemblance to actual persons, living or dead, business establishments, events, or locales is entirely coincidental." I would personally be saddened to think that our playful Dr. Robertson did not sneak in an intentional resemblance to a living person or two, just to carry his device to its logical conclusion.
Robertson authored two novels prior to The Testament and has authored two since. Hamish Hamilton is publishing a new short story by Robertson every day of 2014, each one 365 words long. None of the dozen or so I read included Robertson's broad Scottish vernacular. The Testament relies upon it. Having a copy of the Oxford English Dictionary handy is critical for American readers and others who are unfamiliar with the meanings of hunkers, boaks, or a smirr of rain. He is forced to explain commonly used words such as kirk (church) and manse (minister's house). Where the Scots vernacular becomes too thick, Robertson helpfully supplies footnotes from his fictional publisher. The scene feels Scottish, from the persistent rain and angry young men to the shale beaches and craggy topography. The result is a novel that is fresh and genuine in its love of setting.
Robertson is also a partner in the grant-funded Scots language children's publisher Itchy Coo. Itchy Coo's Web site proudly offers a Hame page and an Aboot Us description. A character in The Testament amusingly pokes fun at authors who try to set Scots down in print. In fact, The Testament fairly brims with jibes at authors. One character is a perennially unpublished novelist. Another complains that "everybody thinks they have a novel in them" before declaring that writing is "a refuge from confusion". The process of writing does facilitate a certain clarity of thought, in my experience, if only because one must decide what to say.
Robertson holds out much hope for the written word, as if warding off a merely spoken evil. Mack laments the premature telling of his story by saying, "if people could have read this full and honest account rather than heard me announce it amid the din and confusion of that day, then perhaps they might have reacted with more open minds." At least three of his characters are writers.
The core of the book, though, is our relationship to truth. What is it? Can we know it? Would we understand it if it were presented to us? Why do people believe as they do? What are the costs of their beliefs? The Testament asks the big questions, explores them from many angles, and leaves you to answer them as you see fit. Robertson probes our ability to trust "facts" as they are presented, asks what we can truly perceive, reflects on the limitations of human thought, wonders about the wisdom of teaching fairy stories to children, and worries about passions left uncontrolled.
Mack's justification for writing his testament is repeatedly referred to as a drive toward truth. But the absolute truth can be amazingly harmful to a community, as Robertson explores. For his effort, Mack is shunned by everyone by the end of his truncated life. Robertson's story does not lend credence to Sam Harris' more recent thesis that we should all just tell the unabashed truth at all times (in Lying [Amazon, Goodreads]).
Naturally enough for a book set in Scotland, the central pole around which the book swings in its discussion of truth is religion, and its antipode, doubt of religion. "What is religion if not a kind of madness, and what is madness without a touch of religion?" Robertson asks.
We are told, "Human beings are at one and the same time utterly splendid and utterly insignificant." Could there be a more succinct description of the confusion of our times? Our Western civilization has advanced from an ego-centrism under the sky of the only world to one in which we are simultaneously the only intelligent life form we know and yet a mere speck in the vastness of an impossibly large universe. Robertson plays on our confusion from all angles. He gives us the character of Peter Macmurray, elder of the church and Mack's institutional nemesis, who represents the authority of religious institutions, and the character of Lorna Sprott, a fellow minister who is both a true believer and a sad alcoholic crushed under the authority to which she has submitted herself.
Robertson's tight writing shines in his description of Macmurray: "By day he is an accountant and by night, as Jenny used to say, he adds the saved and subtracts the damned, and always comes out with a minus figure."
Robertson deeply questions the role of the Church of Scotland and its relationship with Scottish culture. The nasty element of control in Western religions does not escape Robertson's notice. He has Mack speak of "the overwhelming weight that bears down on most people who enter a church - the weight of years of learning not to disrupt, not to object, not to speak out against authority." At one point he declares that "The great age of religion had passed", only to suggest that the Kirk could still have a role in society. It is this Scottish sensitivity to culture and identity that makes The Testament a wonderfully human book. Robertson recognizes and acknowledges non-traditional roles for traditional institutions but can only hope that they will come to see the world in the doubting way that he does. Not to do so is tantamount to denying our recently-won knowledge and risks living with a permanently entrenched cognitive dissonance. Such dissonance is clearly evident in America's science-denying evangelicalism. Robertson informs us that the same stresses exist within other societies.
Lest one consider Robertson's religious beliefs clearly defined, he dithers. His characters are true believers, agnostics, fakers, atheists, vacillators. Elsie's husband John exclaims, "There are no answers, don't you see?", but John is a troubled and desperately unhappy man. He has not replaced religion with a working philosophy that might, as Aristotle suggested, provide comfort in its stead. Robertson subtly pokes at the religious who express belief in the Devil while steadfastly finding Mack's claims to have met him utterly ridiculous. "The whole religion thing - not being able to reject it and not being able to embrace it", as Elsie says, seems to come closest to his position.
"The present", Robertson says, "was a mere waiting room for the future." That lovely observation is the exact opposite of the mindfulness of the present encouraged by Buddhism. It is also perhaps an unintended or unwanted consequence of our society's current affair with invention and discovery. The scientist or the engineer works toward a future in which theEureka moment will happen. The teenager waits for the new model mobile phone. The salaryman waits for retirement. We are a society of delayed gratification. Those who defer gratification until after death are religion's real losers.
There are many minor recurring themes in The Testament. Perhaps the most central to the book's exploration of truth is the tendency of people to see intelligent action where it is not. This is known to psychologists as hyperactive agency detection. If I suppose a tiger is responsible for the rustling of leaves I just heard and I am right, I might save my life by running away. If I am wrong, little harm is done. This instinctive survival trait causes no end of confusion for the modern person, living as we do with the distinct absence of tigers.
Gideon Mack's life is dominated by hyperactive agency detection. Upon seeing a bee fly out of a drawer, Mack "wondered if there was a message in it; any kind of meaning at all." When confronted with the appearance of the standing stone, he noted:
  • "It seemed to me that the Stone had provoked this crisis, had engineered it in some way."
  • "Because the Stone prevented it."
  • "Perhaps the Stone was wielding some strange power over events and had brought her to my door at this moment."
  • "The Stone did not want to be photographed. I no longer wished to share the Stone with anybody."
The last, of course, makes one immediately think of J.R.R. Tolkien's characters Bilbo and Frodo Baggins and the One Ring.
Later, in a museum exhibition, "I got up on the wooden step, and this seemed to trigger the tape." Mack jumps to unfounded conclusions quickly, seemingly just to avoid missing one. Robertson, whether consciously or not I cannot say, seems to warn against such actions.
Failures of imagination, and an attendant willingness to jump to conclusions, similarly haunt Robertson's characters. Chance appearances of plot devices seem "incredible" to them. When faced with an experience that he cannot readily explain, Mack races to cognitive closure. He seems unwilling to keep, or incapable of keeping, an open mind until additional facts are acquired. The unnamed being that he (possibly) encounters must be the Christian Devil. God must exist if the Devil does. All of this is wrapped in layers of tortured logic by way of justification. When Mack feels that he could not have reasonably survived his near death experience, he proclaims, "I was of the opinion, therefore, that I must be dead."
Robertson's characters are not an unrealistic stretch from everyday human experience. We have all met the gullible, but Robertson's exploration goes deeper than pedestrian gullibility. He probes the limits of humankind's ability to judge likelihood. It is something that we do poorly.
Radiolab, a weekly radio show syndicated across the United States by National Public Radio, recently illustrated our intuitive problem with comprehending statistics in an episode called Stochasticity. Stochasticity is a florid academic word for randomness. The study of stochasticity provides techniques to understand events whose results can only be measured statistically. We cannot know whether a particular coin toss will result in a "heads" or a "tails", but we can know that, given a high number of tosses, the results should be about half of each.
My caveats (a high number, about half) are important and point to our difficulties in understanding the random world. Radiolab interviewed Deborah Nolan, a professor of statistics at UC Berkeley, who demonstrated how poor we are at understanding random acts. She asked a group of students to make up a list of 100 coin tosses. Simultaneously, the Radiolab hosts were asked to toss a real coin 100 times and record the results. Nolan immediately spotted which list constituted the real coin tosses. How? By choosing the list that contained a run of seven "tails" in a row. The students felt that such a run would not appear to be random, but real random sequences include such apparent patterns quite often. Our misunderstanding of randomness stems from the simple fact that the human cortex is a very effective pattern recognition engine. We seek patterns for our own survival. They guide our actions. Lack of patterns, randomness, confuses us. We seek, and often find, patterns that do not exist.
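Nolan's trick is easy to check for oneself. The sketch below is my own Monte Carlo check, not anything from the book or the Radiolab episode; it simply assumes a fair coin and 100 tosses per list, and counts how often a streak of seven or more identical results shows up.

```python
import random

def longest_run(tosses):
    """Length of the longest streak of identical results in a toss sequence."""
    best = current = 1
    for prev, nxt in zip(tosses, tosses[1:]):
        current = current + 1 if nxt == prev else 1
        best = max(best, current)
    return best

TRIALS, TOSSES = 100_000, 100
hits = sum(
    longest_run([random.choice("HT") for _ in range(TOSSES)]) >= 7
    for _ in range(TRIALS)
)
# A streak of seven or more of the same face turns up in roughly half of all
# honest 100-toss sequences -- far more often than intuition suggests.
print(f"P(streak of 7+ in {TOSSES} fair tosses) ~ {hits / TRIALS:.2f}")
```

Students faking a list almost never dare such a streak, which is exactly why Nolan could spot the real one at a glance.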
The phenomenon of seeing patterns that aren't there is common enough that there is a word for it: apophenia. Psychologists associate the onset of such persistent delusional thinking with schizophrenia - unless it is associated with an established religion.
Pascal's Wager appears and reappears throughout The Testament. The French mathematician Blaise Pascal famously argued that everyone should believe in God because "If you gain, you gain all; if you lose, you lose nothing." The problem with Pascal's Wager is that it postulates a lack of cost to belief in a god. Nothing could be farther from the truth. Pascal did not acknowledge that belief requires a separation from reality, and that bad things happen to the human brain when it is forced to resolve cognitive dissonance based on too few facts. Wild leaps of the imagination or other mental gymnastics are required to make sense of nonsense. Robertson compares authorial leaps of imagination, which are the very basis of creativity, to leaps of faith. But in admittedly extreme cases, leaps of faith may also lead to anything from standing on a street corner with a sign reading "The End is Nigh", to suicide bombing, to the Toronto family that left a corpse in their house for six months expecting resurrection. We do ourselves no favors by encouraging delusion.
Or do we? We often think of evolution as a search algorithm that fits an animal to its environment. People have been around long enough to evolve to fit people-dominated environments. We have evolved to cooperate with other people. Sometimes, really quite often in fact, that means that we need to compromise our understanding of the world in order to get along with others. Robertson explores the price of compromise, and of failing to compromise, by presenting us with characters who run the gamut.
Both real and fictional Presbyterian ministers make their appearance in The Testament. Their interests, with the single exception of the true believer Lorna, transcend those of traditional Christianity. Robertson neatly brings in the seventeenth century minister Robert Kirk and his book of folk tales, The Secret Commonwealth of Elves, Fauns and Fairies [Amazon, Goodreads], to make this point. The book makes a variety of appearances across the generations. Robertson intertwines folk myths, mainstream religion, doubt, love, lust, friendships, vanity, and self-obsession into a tapestry that approaches the complexity of real-life thought.
It is almost not worth bothering to mention the fatherhood obsession shared by Gideon and, more subtly, his father James. It is too obvious. However, this does lead to some brilliant foreshadowing with Elsie's daughter Katie and her imaginary friend. The friend seems to be Mack's Devil, lending yet another bit of support to the reality of something one had just decided was an illusion. Similarly, Elsie's eleventh-hour admission of the length of her affair with Mack, and the depth of it, questions Mack's veracity just after others had established it. Robertson's misdirection took some careful construction.
Robertson, while pillorying religious belief, does not spare non-religious thought. The atheists in The Testament are generally unhappy, and the one professed agnostic is depicted as physically crippled and verbally vitriolic. Robertson asks, but does not answer, what makes one happy to live one's life. Perhaps, being Scottish, he has no idea. More likely he simply was not aware of modern scholarship which has started to unravel this conundrum, such as this study of coping strategies of the irreligious.
Robertson employs some beautiful metaphors throughout The Testament. My personal favorite is this:
"Walking through a deserted city in the hours before dawn is sobering way beyond the undoing of the effects of alcohol. Everything is familiar, and everything strange. It’s as if you are the only survivor of some mysterious calamity which has emptied the place of its population, and yet you know that behind the shuttered and curtained windows people lie sleeping in their tens of thousands, and all their joys and disasters lie sleeping too. It makes you think of your own life, usually suspended at that hour, and how you are passing through it as if in a dream. Reality seems very unreal."
Walking past the sleeping multitudes is a wonderful depiction of the atheist experience. One often feels the weight of the mass delusion that grips our world. Naturally, and very Robertson, true believers must feel the same way. One is left to make of it what one will.
My one substantial criticism stems from an experience that Robertson could presumably not personally explore. Mack's character undergoes a near death experience, but Robertson, it seems, could not draw from observation to make his description plausible. My own near drowning left me uncomfortable with Robertson's portrayal. Although I recognized Mack's reported lack of panic, I experienced no flashing of my life before me, nor a feeling that I had left too much unresolved. The immediacy of the situation dominated my mind, even as I began to think that perhaps I should try breathing water after all. Those who told me I could not could have been mistaken. I do not fault Robertson overly much for his reliance on clichés for this part of his story. I am glad for him that he has not gained the insight.
The Testament of Gideon Mack should not have been long-listed for the 2006 Man Booker Prize. It should have won it. Perhaps it didn't because some members of the review committee were themselves religious. The book is designed in a certain sense to offend. It would not offend a doubter, but it threatens the homey comfort of the believer. "How can it be blasphemous? It’s the truth. There isn’t a word of a lie in what you’ve heard," Gideon Mack tells his friend and fellow minister Lorna. She replies, "Of course it’s blasphemous. It goes against everything we stand for. You simply mustn’t repeat it." I urge you to ignore her advice.

Friday, December 05, 2014

Book Review: Can War be Eliminated? by Christopher Coker

In Can War be Eliminated? [Amazon, Goodreads], Christopher Coker takes only 108 pages to provide his answer. No, says Coker. The fault, as Shakespeare's Cassius told his friend Brutus, is not in our stars but in ourselves.
Coker is a professor of international relations at the London School of Economics and Political Science (LSE), although he is described on the book's back cover as an "internationally known philosopher of war" - a much more fetching title to be sure. Coker's "expert's page" at the LSE lists his academic publications since 2000 and prominently includes the cover image for this book. It is easier to see a list of his books on his Amazon author page, which stretches farther back in time.
The book is marketed using a clever summary taken from Coker's prologue: "This book challenges the view that war is an idea that we can cash in for an even better one - peace." It is perhaps more instructive to provide that quote its context:
In this brief essay, I will argue that, contrary to what many would argue, war is not pathological, any more than it is socially dysfunctional, and it most certainly is not just a bad idea that we can cash in for a better one, peace. It has played such a central role in the human story because it is embedded in our cultural evolution and, unfortunately, this is likely to remain the case for some time yet. (pp. xiv)
Coker has been recently prolific, breaking his two-decade pattern of slow and steady publication. He published Warrior Geeks: How 21st Century Technology is Changing the Way We Fight and Think About War [Amazon, Goodreads] in 2013, followed closely by Can War be Eliminated? in early 2014 and Men At War: What Fiction Tells us About Conflict, From The Iliad to Catch-22 [Amazon, Goodreads] in mid 2014. Both Warrior Geeks and Men at War are substantially longer works.
One might wonder why he chose to highlight Can War be Eliminated? on his LSE expert's page instead of his larger volumes, which have frankly attracted better reviews. The answer may lie in the not uncommon criticisms of his books: that they are laden with academic speech, relatively inaccessible to the general reader, and heavily historical. I do not fault Coker for any of those. He is, after all, a philosopher of war. His outlook is deeply historical by design and his understanding sufficiently deep to warrant the use of a large vocabulary. His miserly social media presence (his profile on LinkedIn has only one connection and he does not seem to be present on others) strongly suggests that the general public is not his target audience. It would seem that he is an ivory tower academic, struggling to comprehend his subject rather than to profit by it. Can War be Eliminated? may be more than the most recent book he placed on his LSE page. It might represent his answer to his most vexing question.
Let us examine Coker's thesis that war is neither pathological nor socially dysfunctional. War is unfortunately not pathological in its most common sense, that of being associated with a disease. Our species wars by its very nature. Coker is particularly critical of the political scientist John Mueller, who has argued that war is "just an idea" and may thus be cast aside if we choose to do so. Mueller argues that war is a cultural construct. Coker disagrees, as do I. War is, however, pathological in the sense of "behavior that is habitual, maladaptive, and compulsive" (from The American Heritage® Dictionary of the English Language, 4th Edition). Culture can make us more willing to war, even to the point of compulsion. Coker dances around this second sense, exploring it by example more than by clear-eyed acknowledgement.
Coker's first chapter is entitled simply "Evolution", and his second simply "Culture". These two chapters hold for me the key to Coker's way of thinking. We are evolved to war and we have built cultures that universally acknowledge that central fact of our existence. How could it have been otherwise? Our ancestors, hairless, relatively weak, and fearful as predators go, had but two small advantages to allow them to live in a hostile world: the opposable thumbs that we heard so much about in school, and the social brains that developed in ratchet with them. We banded together and made tools.
It is interesting to note that in Ernest L. Schusky and T. Patrick Culbert's 1967 anthropology textbook, Introducing Culture [Amazon, Goodreads], the index entry for weapons reads: "see also Tools". Indeed, the words are used interchangeably in that text. Perhaps that tells us all we need to know. Our weak, vulnerable species would be in serious trouble in any jungle or forest of the world were it not for our ability to cooperate and make tools. That it is only now catching up with us is what should amaze us.
The psychiatrist Andy Thomson has noted that "We are all trapped in a Stone Age brain." Coker agrees, commenting that "we remain linked to our prehistoric past." (pp. 108). That brain has left us with an ability to ratchet up invention after invention and yet fear our own results. It is perhaps not surprising that the Christian Bible's most highlighted passage on Amazon's Kindle is from Philippians 4:6-7. The passage provides advice to lessen anxiety. We live, as so many have lived before us, in a time of great turmoil. Our turmoil tends to be of the purpose-seeking kind rather than the existential kind our forebears so often faced. Regardless of media hype, we in the West have few truly existential threats at the moment.
We have difficulty separating real problems from relatively minor ones. It is ridiculous, for example, to characterize the Islamic State in the Levant as an existential threat to the United States. A proper existential threat occurs when your food supply runs out or when your enemies have killed your neighbors and are on their way to kill you. The Yazidis of Eastern Kurdistan face an existential threat from the Islamic State. The Munda tribal people of India's Jharkhand state face an existential threat as their forests are destroyed, as do the Sarayaku people of Ecuador. One might feel an existential threat from climate change if holding ocean front land in Kiribati. One may not reasonably fear an existential threat if gasoline prices rise or Walmart finds it more difficult to source cheap travel mugs.
It is not even reasonable to fear an existential threat from the Ebola virus, whether in the United States or even in West Africa. About 5,000 people have died from the recent Ebola outbreak as I write this. Between a quarter million and a half million people die annually from influenza. The World Health Organization estimates that more than 600,000 people die annually from malaria and that roughly half the world's population is at risk of contracting that disease. If we choose to fear diseases, where should our fears reasonably lie? Our stone age brain ensures that we do not always choose our fears based on reason.
We are, as a species, particularly poor at judging long term risks precisely because our brains clamor for attention when short term risks, however unlikely, present themselves. This is a legacy of our stone age brains that Coker rightfully highlights. It is also why we can so suddenly stumble into war.
That is not to say that I agree with all of Coker's evolutionary outlook. In fact, one particular sentence fussed me rather greatly. "Devoid of anti-social instincts we probably might have led a peaceful life," Coker assured, "but that is not how we are designed biologically." (pp. 5) There is much wrong with that. Firstly, it is our social instincts that have led to warfare. Our natural state, as Coker rightly points out, is in small hunter-gatherer tribes. Such tribes are tiny by modern standards, 25-75 people, with typical groups well shy of the upper end. Life in a tribe was as simple as putting up with the oddities of your extended family group for your own protection and steadfastly defending it against anyone in other groups. Other humans, your "out group" in the phrasing of sociologists, constituted the greatest threat to your existence even in the presence of lions, tigers and bears. Oh, my.
Coker acknowledges this strong linkage between in-group bonding and out-group hostility:
Generally speaking, the more co-operative a species is within the group, the more hostility there is between groups. When there is a very variegated society, such as in New Guinea, which [has] more than 800 languages, out-group enmity can be fierce.
It seems odd that he misses the point that our social instincts and our anti-social instincts are two sides of the same coin. We bond for survival and that causes us to fight with others. One immediately thinks of modern religions. "Oh, the Protestants hate the Catholics; And the Catholics hate the Protestants; And the Hindus hate the Moslems; And everybody hates the Jews," go the lyrics to Tom Lehrer's satirical song National Brotherhood Week. We are simply incapable of defining an in-group without inadvertently defining an out-group. We make our brothers and sisters into perceived enemies by the very act of creating society.
Secondly, what life would we be living if our species hadn't evolved to war? It is a counterfactual thought experiment that is impossible to answer, but one strongly suspects that the entire evolution of our species would have been drastically different, so different in fact that "we" would not be here.
Lastly, we were certainly not designed at all. That might be just loose speech. The philosopher Daniel Dennett spent much time in The Intentional Stance (1987) explaining why it is perfectly acceptable to view the products of evolution as if they were "good for something", an idea he reprised in Darwin's Dangerous Idea (1995). Nevertheless, given the contention in my home country related to the theory of evolution, I propose that a more careful characterization is necessary. It is only by being focused on what we are, as evolved animals, and being conscious of our own tendencies to fall into cognitive traps, that we can approach some form of objective truth.
Coker notes the opinion of the English philosopher Thomas Hobbes that war is "central to the human condition". Hobbes' most famous quotation on the subject is undoubtedly from his masterwork Leviathan, "the condition of Man... is a condition of Warre of every one against every one" and it is reasonable to presume that it was to this that Coker referred. It always seems learned to quote from ancient philosophers on the topic of your writing, but I do not believe that Coker has Hobbes rightly aligned in this particular case. The quotation comes from Chapter XIV of Leviathan, entitled "Of the First and Second Naturall Lawes and of Contracts" (sic). The context is Hobbes' discussion of liberty and natural rights and is part of his definition of what became known as social contract theory. It is from Leviathan that we get the idea that the natural state of humanity was "solitary, poor, nasty, brutish and short" and that only the imposition of authority (government) has lifted us from that fate.
There was no room in Hobbes for the innate notions of humanity's group cohesion, or other inborn cognitive mechanisms that encourage us to foster stable social structures. Hobbes' natives are out for number one, always at each others' throats, locked in a deadly competition to get ahead. We know now that hunter-gatherer tribes do not function like that at all. They are certainly not the noble savages of Hobbes' contemporary, the English poet John Dryden, who coined the term in 1672, just 21 years after Leviathan was published. These two competing concepts of the natural state of humanity, cruel and competitive versus noble and uncorrupted, had been strongly debated during the sixteenth and seventeenth centuries. Modern philosophies and legal systems, from Henry David Thoreau and Karl Marx to Bertrand Russell and Jean-Paul Sartre, still echo with the struggle to resolve these competing views. So too does Jared Diamond's The World Until Yesterday: What Can We Learn from Traditional Societies? [Amazon, Goodreads], a recent and brilliant description of primitive societies and our relationship to them. Coker notes that Diamond "makes short work of the idea that primitive societies are innately peaceful". (pp. 4)
My view is that the argument is purely a matter of scale. Humans do have innate cognitive biases that encourage us to form groups for survival. These traits, coupled with other cognitive features, ensure that the groups we form are rather small, most often just 25-50 close individuals with some much larger number of acquaintances. We consciously and subconsciously protect our group, even to the extreme of shunning or killing individuals who threaten group cohesion. That puts the lie to Hobbes. However, we also are naturally fierce when we are both crowded and presented with an out-group that threatens our in-group. That puts the lie to the noble savage.
On to culture. Coker's chapter romps lightly from Richard Dawkins' "selfish genes" to religion to action films to women warriors. In the middle of this scree, he has left out a few key aspects of our love affair with war. Chief among these is the culture of war production and how it can become a self-fulfilling prophecy through economic incentives. US President Dwight D. Eisenhower, himself famously a career military man and the Supreme Commander of the Allied Forces in Europe during World War II, warned against the growth of such a culture in his 1961 farewell address:
In the councils of government, we must guard against the acquisition of unwarranted influence, whether sought or unsought, by the military-industrial complex. The potential for the disastrous rise of misplaced power exists and will persist.
We now live with the consequences of ignoring Eisenhower's words. It is an almost shocking omission by Coker.
He does better when discussing what historian Ronald Wright has labeled a "progress trap" in his excellent book A Short History of Progress [Amazon, Goodreads]. Wright tells us that our pattern of overconsumption causes collapse and cycling, and that with each cycle the cost increases. To make the same point, Coker brings in Charles Darwin, Richard Dawkins, the military theorist Carl von Clausewitz, the political activist Barbara Ehrenreich, the poet and soldier Wilfred Owen, the artist Yoko Ono and others.
Coker slides along well-trodden paths by informing us that the Wright Brothers (inventors of the airplane) saw their invention used for war before being widely used for peaceful purposes. He adds in quotes from Guglielmo Marconi (radio) and Hiram Maxim (the machine gun), just to make sure we got the point. Somehow he missed Alfred Nobel (the inventor of dynamite).
The book brims with quotations. Open it at random, especially near the beginning, and one appears on almost any page. Here is a random sample:
  • "War is not the best way to settle differences, but it is the only way of preventing them from being settled for you." -- G.K. Chesterton
  • "War: thunder against it." -- Flaubert
  • "War is the art of embellishing death" -- Japanese proverb
  • "War is always a matter of doing evil in the hope that some good may come of it" -- Basil Liddel-Hart
  • "[War] is a protean activity... like disease, it exhibits the capacity to mutate, and it mutates fastest in the face of efforts to control or eliminate it" -- John Keegan
Sometimes Coker follows a thread of his own making long enough to drop an insight. "Technology is simply the further evolution of evolution," we are told, "and technological evolution produces a variety of gadgets, machines, tools and techniques which help us to evolve its power to evolve." He nearly (and seemingly independently) recapitulated Stuart Kauffman's theory of the adjacent possible. Typically, though, he fails to follow through. These insights are worth mining from the book if one is willing to piece them together oneself. Coker won't do it for you.
Coker does an excellent job summarizing the many aspects of the small-scale culture of warfare in a short space, but fails to address many large-scale cultural phenomena. The problem with using culture as the only solution to our social problems is that culture is fragile. Very fragile. It doesn't require a social collapse to decimate a culture. All it takes is parents who don't teach their children. An entire spoken language can become endangered within a single generation if it is not taught, as has been the case with many of the native languages of the Americas, from Yupik Eskimo speakers in Alaska to the Ona of Tierra del Fuego. Although a language can amazingly survive for generations with a handful of speakers (often shamans), one could hardly call such a culture healthy. In rare cases, a language and a culture can make a comeback, as with Irish since the independence of that country, but that is not the most common result. The linguist Andrew Dalby reports in Language in Danger: How Language Loss Threatens Our Future [Amazon, Goodreads] that the world loses a language every two weeks.
The multi-generational linguistic research project Ethnologue currently estimates that there are 7,106 known living languages. At a loss of one every two weeks, the last of them would not disappear for nearly three centuries, but the trajectory still points toward monoculture. Let's hope we never get there. I agree with Dalby that a world rich in linguistic diversity helps to make us resilient to a changing world. Unfortunately, the loss of languages erodes that societal resilience, much as modern warfare has become maladaptive to our long term survival.
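For the curious, the arithmetic behind that estimate uses nothing beyond the two figures already quoted, Ethnologue's count and Dalby's rate; the snippet below is merely my own back-of-the-envelope check, not a calculation from either source.

```python
# A back-of-the-envelope check using only the figures cited above.
languages = 7106          # Ethnologue's estimate of known living languages
losses_per_year = 52 / 2  # one language lost every two weeks (Dalby)
print(languages / losses_per_year)  # ~273 years until only one language remains
```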
Of course, cultures can be destroyed more quickly by the simple expediency of genocide. That ugly term is used so often in relation to Hitler's twelve-million-strong butcher's bill in Europe that we tend to forget that 3-4 times that many people were killed in Mao Zedong's China, and that Leopold II of Belgium's reign of terror in the Congo Free State has been estimated to have killed many more than Joseph Stalin's sickening but, by comparison, paltry seven million. We have a much harder time estimating the decimation of native peoples throughout the age of empire, for which no good numbers exist. Hideki Tojo, prime minister of Japan during World War II, and the Cambodian communist Pol Pot don't even make the short list with their scant millions.
Having referenced Jared Diamond to good effect in his chapter on evolution, Coker fails to mention Diamond's later work such as Collapse: How Societies Choose to Fail or Succeed [Amazon, Goodreads]. That book informs us of the many reasons that societies can fail, most of which come down to overoptimism. Negative political and economic reactions to Al Gore's book An Inconvenient Truth: The Planetary Emergency of Global Warming and What We Can Do About It [Amazon, Goodreads] provide an example of such maladaptive optimism from our own time and place. We most often act as if we can afford to pursue short term goals at the expense of long term ones. Messengers who dare to warn of long term consequences are invariably pilloried.
That we invariably consider short term challenges to be more important than long term ones is another mammalian feature driven by evolution for our own immediate safety. Our ancestors could not afford to plan for the future when hunger and danger existed on a daily basis. Further, our brains ensure that we both overreact to any perceived immediate threat and also ascribe an intelligence to it. This is known in psychological circles as hyperactive agency detection. That is certainly a necessary survival skill but it works against us as societies get larger and we create long term problems for ourselves.
Criticisms of Collapse, such as a review in The Economist, often take Diamond to task for not being optimistic enough. This should not be surprising. We humans are optimistic by nature and have a terrible aversion to bad news. Must all writing be so formulaic that it ends on a high note? Jane Goodall's Reason for Hope: A Spiritual Journey [Amazon, Goodreads] is really quite a depressing book in spite of its title. Why should we have hope in the face of facts to the contrary? Because it sells books? Or because there is actually reason to be hopeful? Because, it seems, we need to hope in order to continue to act. Warnings like Diamond's, Gore's and Goodall's tell us in no uncertain terms that we can hope all we like, but we need to act as well. Cultural adaptations to avoid or limit warfare, such as strategic arms reduction treaties, proxy wars and the United Nations, are some of the ways that we can act. Coker does not explore these ideas nearly enough nor does he acknowledge their limitations.
It would have been useful to hear Coker's thoughts on the geopolitical analyst George Friedman's works, such as The Next 100 Years: A Forecast for the 21st Century [Amazon, Goodreads]. There is no question in Friedman's mind that geopolitical realities will cause continued warfare. Coker does not cite him.
Where Coker focuses his prognostication on combined human-robot battlefields of the near future, Friedman spends his time attempting to convince us that warfare will inevitably spread to space. Friedman's position is seemingly at odds with both history and international treaties. Although the United States unilaterally withdrew from the 1972 Anti-Ballistic Missile Treaty with the former Soviet Union in 2002 (an event that led to the formation of the US Missile Defense Agency), space-based weapons are prohibited by the older 1967 Outer Space Treaty. Most of the world's countries have signed that treaty, and over a hundred have ratified it, including the United States, Russia and China. Only three attacks have been conducted in space to date: two satellites destroyed by the United States, in 1985 and 2008, and one by China in 2007. All three were considered to be tests and, tellingly, all three were launched from the Earth. There is no current sign of Friedman's space-based weapons.
That may not be the end of the story. Friedman is probably correct when he worries that the military exploitation of space will not stop with communications and surveillance. All it takes to start an arms race is for one party to pursue advantage. I doubt very much that the United States would go to war with a country for withdrawing from the Outer Space Treaty. Friedman's arms race scenario would be a more natural result. It would have been useful for Coker to consider it.
Coker leaves out some surprising aspects of warfare. The nuclear powers have refrained from conducting a nuclear war since the dropping of the atomic bombs Little Boy and Fat Man by the United States on the Empire of Japan, but few think we are completely done with nuclear weapons. Coker fails to address the world's nuclear arsenal at all. This might be due to his focus on the so-called "Western way of war", which is, in a word, limited.
Coker's book is, in the end, quite seriously dissatisfying. Although it is clearly written and well researched, it is also a collection of contradictions. In it are the fascinating and the mundane, the insightful and the obvious, the frightening and the comforting. It spans the ways people think and have thought about war from the ancient Greeks to the robotic battlefields of our near future. It leaves one with little room to doubt the negative conclusion he gives to his strawman question. The common wisdom, and that used by Coker, is that no individual society dare give up war any more than it dare give up agriculture.
Yet one is left to wonder, as Coker occasionally seems to do, how the situation might need to change anyway. For war is not the academic exercise that the book paints. War is the horrible suffering of millions so that a few in power can pursue their goals. War is death on a massive scale. War is dead sons, scarred daughters, homeless children. War is grown adults, irredeemably reduced to wounded animals, crying out shamelessly to their absent mothers. War leaves a trail of broken people, lost in alcoholism, drugs, domestic violence and private pain. It is war that teaches us the meanings of PTSD, POW, MIA, and KIA much more often than it presents us with heroes. War leaves farmland strewn with land mines and cities plagued with unexploded ordnance. These hideous presents to the future are rediscovered by livestock, children and the elderly. It is war that teaches us to hate so much that we yearn to try it again. War is a drug to which we are addicted and a high we dare not allow ourselves when we wield tools of mass destruction.
I am left dissatisfied with Coker's book for its easy answer as much as for its failings of content. There simply must be a better way to conduct our affairs than to allow war to dominate us, even if it means changing ourselves. Is it possible to give up war if other conditions apply, such as expanding into new frontiers off our planet or daring to make fundamental changes to the human brain? We should really think these options through before we have the technologies to accomplish them. Coker does not dare to discuss any of the more radical alternatives.

Wednesday, November 26, 2014

Book Review: The Philosophical Breakfast Club by Laura J. Snyder

I wandered into Riverby Books in downtown Fredericksburg one blustery autumn afternoon in the mood for some serendipity. Used bookstores may be going the way of the dodo (that is, headed for an early and forced extinction but destined to be brought back by some mad scientist of the near future in a slightly twisted guise), but they are the best sources of serendipity that I know. The smell of the place is distinctive but hardly uncommon. One may find it in used bookstores, libraries and scholars' holes from Timbuktu to Taipei, but never in a chain bookstore. The most concentrated source of the scent of old paper and paper mold may be found in the city block of Powell's City of Books, the Mecca of knowledge nerds. I would pay good money for iBooks or Kindle to produce such a smell.
The two glass windows flanking Riverby's wood-and-glass door invite one to stop and browse. A classic Underwood manual typewriter displays the first line of a famous novel on the top half of a sheet of US Letter. Tell the cashier who wrote it and get 10% off. A selection of banned books is proudly displayed with index cards explaining where and when they were banned. A whiteboard advertises specials, includes cute drawings and exhorts engagement. Inside, cookbooks vie with vintage comics, classic novels with the history of Rome, used philosophy books sold back by university students, and a stack of restored Compact Oxford English Dictionaries in their original slipcases, complete with the reading glasses necessary for the tiny print. The book binder's corner is almost hidden by a display of fine bindings in leather and gold. The rare book cabinets sit just across from the constantly changing novelty shelves that stretch from floor to high ceiling, some at intentionally odd angles. There are stacks of recently acquired books on the stairs, posters and maps on the walls. Original drawings from Maurice Sendak's Where the Wild Things Are catch my wife's eye every time we stroll by. I enter and inhale.
I walk past the novelties on my way to Ancient Egypt, Greece and Rome but something catches my eye. Something almost always does. A copy of The Philosophical Breakfast Club: Four Remarkable Friends Who Transformed Science and Changed the World [Amazon, Goodreads] by Laura J. Snyder sits facing outward on a set of shelves canted sideways right into the natural traffic flow. It is shiny and new with a perfect dust jacket. The cover mirrors Riverby itself with its full height bookshelves, tabletop telescope and photographs of four of the most important scientists of the early nineteenth century: William Whewell, John Herschel, Richard Jones, and Charles Babbage. I knew of the astronomer Herschel and of course Babbage, the mathematical genius who designed the first programmable computer and friend to the first computer programmer, Augusta Ada King née Byron, Countess of Lovelace (known as Ada Lovelace). I recalled vaguely that Whewell had coined the word "scientist". Snyder would soon fill in the many missing details.
The book promises to show how these four friends set out to revolutionize the practice of science and then, incredibly, did it. As a history, it has it all: a catchy title, a true and interesting story told like a novel, and subject matter that keeps giving and giving. Snyder proves herself a historian to watch. I would read anything she writes. Thank you, Riverby. I feel guilty at the recognition that Snyder won't see any of the paltry $5.98 I hand over for the hardback. I would have paid twice that at Amazon but never would have found it there. It sold for six times that amount when it was first published in 2011. Riverby and Powell's capture my business by providing the serendipity still missing from online alternatives.
Fortunately for us all, Snyder has written more. Her Web site lists two more books as grand in scope as The Philosophical Breakfast Club: this year brought Reforming Philosophy: A Victorian Debate on Science and Society [Amazon, Goodreads] and next year promises us Eye of the Beholder: Johannes Vermeer, Antoni van Leeuwenhoek, and the Reinvention of Seeing [Amazon, Goodreads], about seventeenth century developments in optics. She gave a TED talk on The Philosophical Breakfast Club in 2012 and somehow manages to find time to teach at St. John’s University in New York. I console myself that she will see some cash from me for the new works.
The Philosophical Breakfast Club begins with a triumphant speech by Whewell at the British Association for the Advancement of Science at the height of his career and soon flashes back to his childhood. Snyder's story hangs off of Whewell, who served in life as the instigator, foil and correspondent for the other three as well as a first-rate scientist in his own right. Snyder uses the voluminous extant letters of the four men and many of their contemporaries to paint a detailed, personal account of their extraordinary lives. She seems to know as much about the four men as their wives and mothers did and perhaps, with the advantage of distance, more.
William Whewell was a mathematical whiz. He was a poor boy who struggled to pay his way as a student at Cambridge. His home parish raised money to get him there and he worked throughout his studies in order to stay. Before his long run as an academic was over, he would become Master of Trinity College, hold a professorship in mineralogy, write physics textbooks, revolutionize the study of the tides with one of the first worldwide data collection efforts, and write a major work of philosophy. Says Snyder,
Whewell suggested "Eocene, Miocene, and Pliocene" to the geologist Charles Lyell as names for historical epochs, and in a few years he would give Michael Faraday the terms "ion, cathode, and anode" for his electrical research.
Besides "scientist", Whewell would also coin the word "physicist", both by analogy with the word artist. He was one of the most fascinating and capable characters you have probably never heard of.
The Reverend Richard Jones was a founder of modern economics. His Essay on the Distribution of Wealth and on the Sources of Taxation [full text] in 1831 was the first major work of economics to rely on both the massive collection of real-world data and the statistical analysis thereof. He served first as a rural curate, interested mostly in gardening, and, following his book, as a professor of political economy. He would not, Snyder argues, have finished the massive work except for Whewell's constant prodding. The one surviving photo of Jones shows an unhappy man, looking downward, corpulent and clearly depressed. He is the only one of the four whose eyes do not radiate intelligence and learning.
Sir John Herschel was not just any astronomer. He was the son of astronomer William Herschel, the discoverer of the planet Uranus, and carried on his work on cataloging double stars and nebulae in both the northern and southern hemispheres. Throughout a lifetime of science, he discovered seven planetary moons, conducted a major study in the botany of South Africa and was one of the inventors of photography.
Charles Babbage, another mathematician, has had the most lasting reputation of the four due to the influence of computing on our current age. 
Snyder shows herself equally comfortable with history, science and the history of science. The dispassionate eye of the historian mixes nicely with the sympathetic observer of human nature. The only one who is treated at arm's length is perhaps Babbage. Babbage's behavior, especially in middle age, was variously described as "spiteful", "cruel", and "snide". Babbage was "never one to underestimate his own intelligence." Snyder seems to have justification for these descriptions in the letters of her protagonists as they struggled to maintain positive relations with their misanthropic friend.
Here is the voice of the professional historian, introducing the concerns of the original Luddites:
Babbage had realized that just as steam-driven mechanical looms were replacing men and women in the wool and cotton mills, so too could a machine replace the human computers. [He] seemed unconcerned that unemployed English computers would riot like the unemployed wool and cotton workers had done in Whewell's home county of Lancashire in 1813 (it helped that these were still part-time laborers; only after 1832 did computing for the Nautical Almanac become a full-time job). During these labor disturbances and others that took place between 1811 and 1816 in the manufacturing districts in the north of England, displaced workers destroyed mechanical looms and clashed with government troops. The term Luddite was coined in this period, after Ned Ludd, an English laborer who was lionized for having destroyed two stocking frames in a factory around 1779. Babbage would have none of this Luddism. Progress in science and industry required more mechanical means of calculation, as well as mechanical means of factory manufacturing, and nothing would stand in the way of that progress. (pp. 83)
Snyder's clear prose not only introduces Babbage's immediate concerns and the environment in which he worked, but ties Whewell's personal story into the picture, explains a modern term few know the full etymology of and throws in a substantial portion of the early history of computing. All in one easily readable paragraph! She handily ties her history to our own modern day problems, which we instantly recognize in Babbage's.
Here is the science writer, not a scientist, but one who is perfectly comfortable explaining complex ideas:
Newton established that the attractive forces of the sun and moon produced a tide-generating force; but it still remained to be shown how the law of universal gravitation could account for particular tides. His theory as applied to the tides did correctly predict some of the observed phenomena, such as the known fact that the oceanic high tides lag roughly three hours behind the syzygies (when the sun, earth, and moon are aligned, which happens at the time of the full moon and the new moon). But Newton's analysis was inconsistent with many of the observations that did exist, indicating that the relation between the tides and the gravitation between the earth, sun and moon was still not fully understood. In particular, his theory did not provide any understanding of factors that might counteract the attractive force. (pp. 171)
A lesser writer would have dodged the word syzygies altogether, or discussed Whewell's groundbreaking work on understanding tides without bothering to describe why Newton's theory was insufficient. Perhaps she knew from her studies of syzygy's other senses, such as the philosophic one used to denote a close union or, alternatively, a union of opposites.
Of course, no one is perfect. Snyder credits Whewell for bringing about "a particular vision of what a modern scientist does", especially in his massive data collection efforts. She fails to mention Whewell's contemporary Matthew Fontaine Maury, who used identical methods on the other side of the Atlantic in the same years. There is more to that story than Snyder admits. "Where Maury mapped the oceans, Whewell mapped the coasts", notes Caren Cooper of Cornell University.
In describing the mechanism connecting the number-crunching central "mill" in Babbage's Analytical Engine to the "store" that provided memory, Snyder makes a point of referencing a modern expert who calls it a "memory data bus". She apparently missed that the term bus has become the common term of art. She did recognize just how much of Babbage's architecture for his earlier Difference Engine and later, general purpose, Analytical Engine presaged the von Neumann architecture used in the first stored-program digital computers, although she failed to mention John von Neumann by name. Readers may be rightfully surprised to discover that Babbage designed not only the machines but also peripheral devices such as graph plotters and printers.
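To make the comparison concrete, here is a deliberately tiny toy in Python. It is my own illustration, not Snyder's description and certainly not Babbage's actual design: a "store" of addressable cells holds the numbers, a "mill" does all of the arithmetic, and a simple instruction stream shuttles operands between the two.

```python
# A toy illustration (mine, not Snyder's or Babbage's) of the mill/store split
# described above: the "store" holds numbers, the "mill" does the arithmetic.
store = [0] * 8          # the "store": addressable memory cells
program = [              # each step: (operation, destination, source_a, source_b)
    ("SET", 0, 7, None),   # store[0] = 7
    ("SET", 1, 5, None),   # store[1] = 5
    ("ADD", 2, 0, 1),      # store[2] = store[0] + store[1]
    ("MUL", 3, 2, 2),      # store[3] = store[2] * store[2]
]

def mill(op, a, b):
    """The 'mill': all arithmetic happens here, nowhere else."""
    return a + b if op == "ADD" else a * b

for op, dest, x, y in program:
    if op == "SET":
        store[dest] = x                              # load a constant into the store
    else:
        store[dest] = mill(op, store[x], store[y])   # fetch operands, compute, write back

print(store[:4])  # [7, 5, 12, 144]
```

Replace the gears with silicon and hold the instruction stream in the same store as the data, and you arrive at the stored-program layout Snyder credits Babbage with anticipating.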
Snyder does not address that most vexing question for large-scale mechanical computing: How did Babbage intend to overcome the large amount of friction caused by the interactions of his many gears? There is modern evidence to suggest that both friction and the inability to finely machine the parts required doomed the Difference Engine. Snyder instead blames only Babbage. "For Babbage, the enemy of the good was always the better", claimed Snyder, which, although no doubt true, misses the point somewhat. A more careful reading of modern scholarship on Babbage's engines would have uncovered his inability to overcome practical limitations. His engines worked at scale only on paper. Modern reconstructions do exist - in software where friction and machining are not limiting.
Babbage, that master of deductive reasoning, was also the odd man out in other ways. The four friends had come together at Cambridge University and, as students, met regularly for the eponymous breakfast discussions. They decided, at Whewell's prodding, to dedicate their lives to furthering the ideas of Francis Bacon. Bacon was the first to codify what we now think of as the scientific method, and he promoted inductive reasoning as the key to advancement in science. Induction is the collection of many specific examples of a phenomenon followed, eventually, by theorizing a general law to fit the collected evidence; it is the philosophic opposite of deduction, which proceeds directly from one principle to others by the rules of logic. Of course, scientists now use both induction and deduction to analyze nature, just as was done in the early nineteenth century. The members of the philosophical breakfast club weren't after mundane, plodding science. They wanted to uncover grand new laws of nature. For that, they needed induction. Herschel, Jones and Whewell, following Bacon to the letter, pleaded for the use of induction in their great works. It is easy, they thought, to be led astray by logically proceeding from a bogus presumption. The truth of that risk continues to dog many would-be philosophers and their followers today. John Dewey's presumptive twentieth century educational reforms and their aftermath are as fine an example as we could wish.
And British science in the early nineteenth century did need help. The breakfast club's contemporary Charles Dickens could have been describing it in the opening paragraph of A Tale of Two Cities:
It was the best of times, it was the worst of times, it was the age of wisdom, it was the age of foolishness, it was the epoch of belief, it was the epoch of incredulity, it was the season of Light, it was the season of Darkness, it was the spring of hope, it was the winter of despair, we had everything before us, we had nothing before us, we were all going direct to Heaven, we were all going direct the other way - in short, the period was so far like the present period, that some of its noisiest authorities insisted on its being received, for good or for evil, in the superlative degree of comparison only.
Babbage eventually broke ranks as progress on his Difference Engine broke new ground. Deduction has its place, after all. Whewell, himself a mathematician of note, felt that Babbage had betrayed the goals of the club. It is hard for a modern computer scientist to agree. The ability to compute on a massive scale, rapidly and accurately, has opened new ways to pursue both pure and applied science.
Unhappy with the London Statistical Society's focus on the collection of facts alone, Whewell complained that "they would go on better if they had some zealous theorists among them...Unconnected facts are of comparatively small value." (pp. 153) The computer scientist Jeff Hawkins noted a similar problem in late twentieth century neuroscience research when he said:
There was an unending supply of things to study and read about, but I was not gaining any clear understanding of how the whole brain actually worked or even what it did. This was because the field of neuroscience itself was awash in details. It still is. Thousands of research reports are published every year, but they tend to add to the heap rather than organize it. There's still no overall theory, no framework, explaining what your brain does and how it does it.
It is hard to remember that you came to drain the swamp, goes the joke, when you are up to your ass in alligators. It is a problem that is likely to become steadily worse as science and engineering continue to specialize. Hawkins, like a good inductionist, proceeded to put forward his theory based on myriad observations. Whewell, Jones and Herschel did the same in their time.
Any Englishman uncertain whether such philosophic differences had practical consequences would have become certain as workhouses were constructed throughout England during this period. The poorest members of society were at first encouraged, and after 1834 forced, to pay their debts by moving into workhouses that separated family members and reduced laborers to virtual slaves. This system was unabashedly based on the social theories of the political economists Thomas Malthus and David Ricardo, and of the social reformer Edwin Chadwick, a protégé of the philosopher Jeremy Bentham. Bentham founded the philosophic movement of Utilitarianism, which proposed calculating the consequences of any course of action so as to maximize overall happiness. It may be useful to apply such a calculus to an individual, but to apply it to a society runs the substantial risk of justifying the suffering of the socially inconvenient. Philosophy mattered as much in the first half of the nineteenth century as it did in the latter half, when the application of the ideas of Darwin and Marx would instigate social change that would take more than a century to painfully play out.
Babbage's ideas about religion also rankled. All four men were religious, and seemingly honestly so. Jones and Whewell were ordained ministers of the Church of England; Herschel and Babbage were wealthy enough not to need to be. Snyder does a fine job describing the house parties at which Babbage used a small Difference Engine to count from one to one hundred, one unit at a time. As the machine reached one hundred, it would begin to count by twos.
What you just witnessed seemed almost miraculous, did it not? he asked them. It seemed like the machine would just keep counting by one for an eternity. And yet this is not what occurred. The machine suddenly changed its course and began to count by a new rule.
Is this not what we feel when we look at nature, and see wondrous and inexplicable events, such as new species arising as others die off? Babbage inquired. Is this not typically explained by supposing that God, our creator, our inventor if you will, has intervened in the world causing this event, outside of the natural order of things? Is this not exactly what we call a "miracle"?
Babbage had set a feedback mechanism to change the function being computed when the value reached one hundred. He had invented the idea of God as algorithm. Personal interventions were not necessary to keep the world running, as Isaac Newton had believed they were. In this he was expressing, rather uncomfortably for his ordained friends, a quite Deist viewpoint. Snyder never quite hangs the term Deist on Babbage.
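Babbage's trick is easy to replay in a few lines of modern code. This is only a sketch of the idea as Snyder describes it, not the gearing of the actual engine: the rule change at one hundred is set before the demonstration begins, so the apparent "miracle" requires no intervention at all.

# Count by ones until one hundred, then by twos.
# The switch is the pre-set "feedback", not a later intervention.
def party_trick(limit=110):
    value, step = 1, 1
    while value <= limit:
        print(value)
        if value >= 100:
            step = 2
        value += step

party_trick()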
Whewell, Herschel and Jones might have been equally disquieted by the comments of Henry Wilmot Buxton, who said of Babbage that he "had taught wheelwork to think, or at least to do the office of thought." (pp. 88) The implications of artificial intelligence have yet to be fully dealt with today. We have the billionaire entrepreneur Elon Musk warning in 2014 that AI is "our biggest existential threat." It is unclear whether Musk's fears have any basis in reality, but the threat to traditional religion and the concept of the soul is enough to trouble many, and it was certainly enough to trouble our Anglican protagonists.
Snyder reminds us that most men of science at the time thought of science - indeed, fought for science to be seen - as an equally valid avenue to the divine. "The idea was that nature was one of God's two books - the other, of course, being the Bible", says Snyder. She is right to remind us, "At that time, especially in Britain, science and religion were not considered enemies; on the contrary, they were seen as compatriots, both devoted to the appreciation of the Creator of the universe." Times have changed. It has been widely noted that a survey of the U.S. National Academy of Sciences reported that "72.2% did not [believe in a god] and 20.8% were agnostic or had doubts". It is no longer fashionable, or required, for most scientists to express faith in a national religion.
Snyder notes that Whewell closely followed the British philosopher and cleric William Paley in believing that religion and science were necessarily aligned. Whewell's Astronomy and General Physics Considered with Reference to Natural Theology (1833) echoed Paley's argument that we fit our environment so well that it could not be a matter of chance. It would take Darwin to show that the argument had it exactly backward: we fit our environment so well because we evolved to fit it, as Bertrand Russell pointed out in his 1927 essay Why I Am Not a Christian. Paley's "divine watchmaker" has given way to Dawkins' "blind watchmaker".
Paley famously used the human eye as an example of something too complex to have occurred by chance. His lack of imagination led him to conclude that a creator must be required. That reasoning is still a mainstay of the argument for so-called Intelligent Design, and it has been thoroughly debunked by modern science.
Intelligent Design enthusiasts say that certain biological features are far too complex to have evolved by Darwin's algorithm. They (especially Lehigh University biochemistry professor Michael Behe) have claimed that since very complex structures (such as the eye) rely on the simultaneous activation of many genes (tens, hundreds, even thousands), they could not possibly evolve by successive single mutations. This is known as "irreducible complexity", because the complexity of the structure is thought not to be reducible to a series of single mutations. That is just not the case. A research team led by Christoph Adami at the Keck Graduate Institute and Caltech in 2005 used software modeling to show how complex adaptive traits such as the eye can evolve using only the Darwinian algorithm, via the accumulation of mutations. This is possible because random mutations may survive temporarily even if they are not in themselves adaptive - giving other mutations the opportunity to build upon them.
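The underlying idea is simple enough to demonstrate with a toy simulation. The sketch below is not the software-modeling platform Adami's team used; it is a minimal, made-up example in which two mutations are each neutral on their own but advantageous together, yet ordinary mutation plus selection is still enough for the pair to take over the population.

# Toy illustration: single mutations that are neutral on their own can persist
# long enough for a second mutation to combine with them, producing a
# "complex" trait that no single step could produce by itself.
import random

random.seed(1)

GENOME_LEN = 8          # bits in each toy genome
POP_SIZE = 200          # individuals per generation
MUTATION_RATE = 0.02    # per-bit chance of flipping each generation

def fitness(genome):
    # Bits 0 and 1 are only useful together; each alone is neutral.
    return 2.0 if genome[0] and genome[1] else 1.0

def mutate(genome):
    return [b ^ (random.random() < MUTATION_RATE) for b in genome]

def next_generation(population):
    weights = [fitness(g) for g in population]
    parents = random.choices(population, weights=weights, k=POP_SIZE)
    return [mutate(p) for p in parents]

population = [[0] * GENOME_LEN for _ in range(POP_SIZE)]
for generation in range(500):
    population = next_generation(population)
    complex_trait = sum(1 for g in population if g[0] and g[1])
    if complex_trait > POP_SIZE // 2:
        print(f"'Irreducible' pair dominates by generation {generation}")
        break

Because a lone neutral mutation costs its carrier nothing, it can linger in the population until its partner appears - exactly the loophole the irreducible complexity argument ignores.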
It is interesting to note that the men of the philosophical breakfast club reached their respective zeniths during a period of post-Enlightenment revivalism, before Darwin injected the first serious scientific basis for doubting revelation. A young Darwin features as a bit player in Snyder's book, bringing an element of foreshadowing to her story. It is also interesting that Babbage's deduction and Darwin's induction each yielded an understanding of powerful algorithms that would together shape our world. One cannot help but think that both camps had their legitimate points.
We stand on the shoulders of these giants, as Isaac Newton famously observed of his previously unreferenced sources. We need not overly worry that they were merely human and had their faults. As Arthur C. Clarke observed, "If an elderly but distinguished scientist says that something is possible, he is almost certainly right; but if he says that it is impossible, he is very probably wrong."
Snyder ends her history with an almost paradoxical plea: that the separation of science from art - and perhaps, implicitly, of science from religion - be healed. The difference before and after the philosophical breakfast club was the difference before and after the industrial revolution: the age of the generalist was overtaken, rather brutally, by the age of the specialist. She makes much of Herschel's fondness for art while noting that the members of the philosophical breakfast club began the process of separation, as science moved at their behest toward a professional discipline of specialists. It seems wistful and unrealistic to desire a return of the amateur. And yet, how many scientists, engineers and medical doctors today play a musical instrument, draw, paint, write? Many, if not most, in my personal experience. If anything, the results of the club's desire to see Bacon's induction at the forefront of science, and of Babbage's personal journey into dedicated deduction, have helped usher in a golden age of scientific discovery. It is an age when many of us have the opportunity to learn, discover, and play more than ever before.