Wednesday, November 26, 2014

Book Review: The Philosophical Breakfast Club by Laura J. Snyder

I wandered into Riverby Books in downtown Fredericksburg one blustery autumn afternoon in the mood for some serendipity. Used bookstores may be going the way of the dodo (that is, headed for an early and forced extinction but destined to be brought back by some mad scientist of the near future in a slightly twisted guise), but they are the best sources of serendipity that I know. The smell of the place is distinctive but hardly uncommon. One may find it in used bookstores, libraries and scholars' holes from Timbuktu to Taipei, but never in a chain bookstore. The most concentrated source of the scent of old paper and paper mold may be found in the full city block of Powell's City of Books, the Mecca of knowledge nerds. I would pay good money for iBooks or Kindle to produce such a smell.
The two glass windows flanking Riverby's wood-and-glass door invite one to stop and browse. A classic Underwood manual typewriter shows the first line from a classic novel on the top half of a sheet of US Letter. Tell the cashier who wrote it and get 10% off. A selection of banned books is proudly displayed with index cards explaining where and when they were banned. A whiteboard advertises specials, includes cute drawings and exhorts engagement. Inside, cookbooks vie with vintage comics, classic novels with the history of Rome, used philosophy books sold back by university students and a stack of restored Compact Oxford English Dictionaries in their original slipcases, complete with the reading glasses needed for the tiny print. The book binder's corner is almost hidden by a display of fine bindings in leather and gold. The rare book cabinets sit just across from the constantly changing novelty shelves that stretch from floor to high ceiling, some at intentionally odd angles. There are stacks of recently acquired books on the stairs, posters and maps on the walls. Original drawings from Maurice Sendak's Where the Wild Things Are catch my wife's eye every time we stroll by. I enter and inhale.
I walk past the novelties on my way to Ancient Egypt, Greece and Rome but something catches my eye. Something almost always does. A copy of The Philosophical Breakfast Club: Four Remarkable Friends Who Transformed Science and Changed the World [Amazon, Goodreads] by Laura J. Snyder sits facing outward on a set of shelves canted sideways right into the natural traffic flow. It is shiny and new with a perfect dust jacket. The cover mirrors Riverby itself with its full height bookshelves, tabletop telescope and photographs of four of the most important scientists of the early nineteenth century: William Whewell, John Herschel, Richard Jones, and Charles Babbage. I knew of the astronomer Herschel and, of course, Babbage, the mathematical genius who invented the first programmable computer, friend to the first computer programmer Augusta Ada King née Byron, Countess of Lovelace (known as Ada Lovelace). I recalled vaguely that Whewell had coined the word "scientist". Snyder would soon fill in the many missing details.
The book promises to show how these four friends set out to revolutionize the practice of science and then, incredibly, did it. As a history, it has it all: a catchy title, a true and interesting story told like a novel, and subject matter that keeps giving and giving. Snyder proves herself a historian to watch. I would read anything she writes. Thank you Riverby. I feel guilty at the recognition that Snyder won't see any of the paltry $5.98 I hand over for the hardback. I would have paid twice that at Amazon, but I never would have found it there. It sold for six times that amount when it was first published in 2011. Riverby and Powell's capture my business by providing the serendipity still missing from online alternatives.
Fortunately for us all, Snyder has written more. Her Web site lists two more books as grand in scope as The Philosophical Breakfast Club: This year brought Reforming Philosophy: A Victorian Debate on Science and Society [Amazon, Goodreads] and next year promises us Eye of the Beholder: Johannes Vermeer, Antoni van Leeuwenhoek, and the Reinvention of Seeing [Amazon, Goodreads], about seventeenth-century developments in optics. She gave a TED talk on The Philosophical Breakfast Club in 2012 and somehow manages to find time to teach at St. John’s University in New York. I console myself that she will see some cash from me for the new works.
The Philosophical Breakfast Club begins with a triumphant speech by Whewell at the British Association for the Advancement of Science at the height of his career and soon flashes back to his childhood. Snyder's story hangs off of Whewell, who served in life as instigator, foil and correspondent for the other three, as well as being a first-rate scientist in his own right. Snyder uses the voluminous extant letters of the four men and many of their contemporaries to paint a detailed, personal account of their extraordinary lives. She seems to know as much about the four men as their wives and mothers did and perhaps, with the advantage of distance, more.
William Whewell was a mathematical whiz. He was a poor boy who struggled to pay his way as a student at Cambridge. His home parish raised money to get him there, and he worked throughout his studies in order to stay. Before his long run as an academic was over, he would become Master of Trinity College, hold a professorship in mineralogy, write physics textbooks, revolutionize the study of the tides with one of the first worldwide data collection efforts, and write a major work of philosophy. Says Snyder,
Whewell suggested "Eocene, Miocene, and Pliocene" to the geologist Charles Lyell as names for historical epochs, and in a few years he would give Michael Faraday the terms "ion, cathode, and anode" for his electrical research.
Besides "scientist", Whewell would also coin the word "physicist", both by analogy with the word artist. He was one of the most fascinating and capable characters you have probably never heard of.
The Reverend Richard Jones was a founder of modern economics. His Essay on the Distribution of Wealth and on the Sources of Taxation [full text] in 1831 was the first major work of economics to rely on both the massive collection of real-world data and the statistical analysis thereof. He served first as a rural curate, interested mostly in gardening, and, following his book, as a professor of political economy. As Snyder argues, he would not have finished the massive work except for Whewell's constant prodding. The one surviving photo of Jones shows an unhappy man, looking downward, corpulent and clearly depressed. He is the only one of the four whose eyes do not radiate intelligence and learning.
Sir John Herschel was not just any astronomer. He was the son of astronomer William Herschel, the discoverer of the planet Uranus, and carried on his father's work of cataloging double stars and nebulae in both the northern and southern hemispheres. Throughout a lifetime of science, he discovered seven planetary moons, conducted a major study of the botany of South Africa and was one of the inventors of photography.
Charles Babbage, another mathematician, has had the most lasting reputation of the four due to the influence of computing on our current age. 
Snyder shows herself equally comfortable with history, science and the history of science. The dispassionate eye of the historian mixes nicely with the sympathetic observer of human nature. The only one who is treated at arm's length is perhaps Babbage. Babbage's behavior, especially in middle age, was variously described as "spiteful", "cruel", and "snide". Babbage was "never one to underestimate his own intelligence." Snyder seems to have justification for these descriptions in the letters of her protagonists as they struggled to maintain positive relations with their misanthropic friend.
Here is the voice of the professional historian, introducing the concerns of the original Luddites:
Babbage had realized that just as steam-driven mechanical looms were replacing men and women in the wool and cotton mills, so too could a machine replace the human computers. [He] seemed unconcerned that unemployed English computers would riot like the unemployed wool and cotton workers had done in Whewell's home county of Lancashire in 1813 (it helped that these were still part-time laborers; only after 1832 did computing for the Nautical Almanac become a full-time job). During these labor disturbances and others that took place between 1811 and 1816 in the manufacturing districts in the north of England, displaced workers destroyed mechanical looms and clashed with government troops. The term Luddite was coined in this period, after Ned Ludd, an English laborer who was lionized for having destroyed two stocking frames in a factory around 1779. Babbage would have none of this Luddism. Progress in science and industry required more mechanical means of calculation, as well as mechanical means of factory manufacturing, and nothing would stand in the way of that progress. (pp. 83)
Snyder's clear prose not only introduces Babbage's immediate concerns and the environment in which he worked, but ties Whewell's personal story into the picture, explains a modern term few know the full etymology of and throws in a substantial portion of the early history of computing. All in one easily readable paragraph! She handily ties her history to our own modern day problems, which we instantly recognize in Babbage's.
Here is the science writer, not a scientist, but one who is perfectly comfortable explaining complex ideas:
Newton established that the attractive forces of the sun and moon produced a tide-generating force; but it still remained to be shown how the law of universal gravitation could account for particular tides. His theory as applied to the tides did correctly predict some of the observed phenomena, such as the known fact that the oceanic high tides lag roughly three hours behind the syzygies (when the sun, earth, and moon are aligned, which happens at the time of the full moon and the new moon). But Newton's analysis was inconsistent with many of the observations that did exist, indicating that the relation between the tides and the gravitation between the earth, sun and moon was still not fully understood. In particular, his theory did not provide any understanding of factors that might counteract the attractive force. (pp. 171)
A lesser writer would have dodged the word syzygies altogether, or discussed Whewell's groundbreaking work on understanding tides without bothering to describe why Newton's theory was insufficient. Perhaps she knew of syzygy's other senses from her studies, such as the philosophic one used to denote a close union or, alternatively, a union of opposites.
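For readers who want the arithmetic behind the "tide-generating force", the standard textbook result (mine, not Snyder's) is that the moon's pull differs slightly across the Earth, and it is that difference that raises the tides:

$$\Delta a \;=\; GM\left[\frac{1}{(d-r)^2} - \frac{1}{d^2}\right] \;\approx\; \frac{2GMr}{d^3},$$

where $M$ is the mass of the moon (or sun), $d$ its distance, and $r$ the radius of the Earth. Because the lunar and solar terms simply add, the combined force peaks at the syzygies, which is why the highest (spring) tides follow the full and new moons.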
Of course, no one is perfect. Snyder credits Whewell for bringing about "a particular vision of what a modern scientist does", especially in his massive data collection efforts. She fails to mention Whewell's contemporary Matthew Fontaine Maury, who used identical methods on the other side of the Atlantic in the same years. There is more to that story than Snyder admits. "Where Maury mapped the oceans, Whewell mapped the coasts", notes Caren Cooper of Cornell University.
In describing the mechanism connecting the number-crunching central "mill" in Babbage's Analytical Engine to the "store" that provided memory storage, Snyder makes a point of referencing a modern expert who calls it a "memory data bus". She apparently missed that the term "bus" has become the common term of art. She did recognize just how much of Babbage's architecture for his earlier Difference Engine and his later, general-purpose Analytical Engine presaged the von Neumann architecture used in the first stored-program digital computers, although she failed to mention John von Neumann by name. Readers may be rightfully surprised to discover that Babbage not only designed the machines but also peripheral devices such as graph plotters and printers.
Snyder does not address that most vexing question for large-scale mechanical computing: How did Babbage intend to overcome the large amount of friction caused by the interactions of his many gears? There is modern evidence to suggest that both friction and the inability to finely machine the parts required doomed the Difference Engine. Snyder instead blames only Babbage. "For Babbage, the enemy of the good was always the better", writes Snyder, which, although no doubt true, misses the point somewhat. A more careful reading of modern scholarship on Babbage's engines would have uncovered his inability to overcome practical limitations. His engines worked at scale only on paper. Modern reconstructions do exist - in software, where friction and machining are not limiting.
Babbage, that master of deductive reasoning, was also the odd man out in other ways. The four friends had come together at Cambridge University and as students met regularly for the eponymous breakfast discussions. They decided at Whewell's prodding to dedicate their lives to furthering the ideas of Francis Bacon. Bacon was the first to codify what we now think of as the scientific method and promoted inductive reasoning as key to advancement in science. Induction is the collection of many specific examples of a phenomenon followed, eventually, by theorizing a general law to fit the collected evidence; it is the philosophic opposite of deduction, which proceeds directly from some principle to others by the rules of logic. Of course, scientists now use both induction and deduction to analyze nature, just as was done in the early nineteenth century. The members of the philosophical breakfast club weren't after mundane, plodding science. They wanted to uncover grand new laws of nature. For that, they needed induction. Herschel, Jones and Whewell, following Bacon to the letter, pleaded for the use of induction in their great works. It is easy, they thought, to be led astray by logically proceeding from a bogus presumption. The truth of that risk continues to dog many would-be philosophers and their followers today. John Dewey's presumptive twentieth century educational reforms and their aftermath are as fine an example as we could wish.
And British science in the early nineteenth century did need help. The breakfast club's contemporary Charles Dickens could have been talking about British science in his opening paragraph to A Tale of Two Cities:
It was the best of times, it was the worst of times, it was the age of wisdom, it was the age of foolishness, it was the epoch of belief, it was the epoch of incredulity, it was the season of Light, it was the season of Darkness, it was the spring of hope, it was the winter of despair, we had everything before us, we had nothing before us, we were all going direct to Heaven, we were all going direct the other way - in short, the period was so far like the present period, that some of its noisiest authorities insisted on its being received, for good or for evil, in the superlative degree of comparison only.
Babbage eventually broke ranks as progress on his Difference Engine broke new ground. Deduction has its place, after all. Whewell, himself a mathematician of note, felt that Babbage had betrayed the goals of the club. It is hard for a modern computer scientist to agree. The ability to compute on a massive scale, rapidly and with accuracy, has brought new abilities to pursue both pure and applied science.
Of the London Statistical Society's focus on the collection of facts alone, Whewell complained that "they would go on better if they had some zealous theorists among them...Unconnected facts are of comparatively small value." (pp. 153) The computer scientist Jeff Hawkins noted a similar problem in late twentieth century neuroscience research when he said:
There was an unending supply of things to study and read about, but I was not gaining any clear understanding of how the whole brain actually worked or even what it did. This was because the field of neuroscience itself was awash in details. It still is. Thousands of research reports are published every year, but they tend to add to the heap rather than organize it. There's still no overall theory, no framework, explaining what your brain does and how it does it.
It is hard to remember that you came to drain the swamp, goes the joke, when you are up to your ass in alligators. It is a problem that is likely to become steadily worse as science and engineering continue to specialize. Hawkins, like a good inductionist, proceeded to put forward his theory based on the myriad observations. Whewell, Jones and Herschel did the same thing in their time.
Any Englishman uncertain whether such philosophic differences had practical consequence would have become certain as workhouses were constructed throughout England during this period. The poorest members of society were at first encouraged and eventually forced (after 1834) to pay their debts by moving into workhouses that separated family members and reduced laborers to virtual slaves. This system was unabashedly based on the social theories of the political economists Thomas Malthus and David Ricardo, and the social reformer Edwin Chadwick, a protégé of the philosopher Jeremy Bentham. Bentham founded the philosophic movement of Utilitarianism, which suggested calculating the consequences of any proposed course of action so as to maximize overall happiness. It may be useful to apply such a calculus to an individual, but applied to a society it runs the substantial risk of justifying the suffering of the socially inconvenient. Philosophy mattered as much in the first half of the nineteenth century as it did in the latter half, when the application of the ideas of Darwin and Marx would instigate social change that would take more than a century to painfully play out.
Babbage's ideas about religion also rankled. All four men were religious and seemingly honestly so. Jones and Whewell were ordained ministers of the Church of England. Herschel and Babbage were wealthy enough not to need to be. Snyder does a fine job describing Babbage's house parties, in which a small Difference Engine was used to count from one to one hundred, one number at a time. As the machine reached the one hundred mark, it would begin to count by twos.
What you just witnessed seemed almost miraculous, did it not? he asked them. It seemed like the machine would just keep counting by one for an eternity. And yet this is not what occurred. The machine suddenly changed its course and began to count by a new rule.
Is this not what we feel when we look at nature, and see wondrous and inexplicable events, such as new species arising as others die off? Babbage inquired. Is this not typically explained by supposing that God, our creator, our inventor if you will, has intervened in the world causing this event, outside of the natural order of things? Is this not exactly what we call a "miracle"?
Babbage had set a feedback mechanism to change the function being computed when the value reached one hundred. He had invented the idea of God as algorithm. Personal interventions were not necessary to keep the world running, as Isaac Newton had believed they were. In this he was expressing, rather uncomfortably for his ordained friends, a quite Deist viewpoint. Snyder never quite hangs the term Deist on Babbage.
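For the programmers in the audience, the trick is easy to sketch. This is my own illustration, not Babbage's actual mechanism or Snyder's description of it: a hidden rule, set before the demonstration begins, changes the law of the sequence once a threshold is reached.

```python
# A toy sketch of Babbage's party trick (my illustration, not his mechanism):
# the machine counts by ones until a preset threshold, then the "law" of the
# sequence silently changes and it counts by twos.  To an observer who only
# ever saw the first rule, the change looks like a miracle; to the designer,
# it was programmed in from the start.
def babbage_counter(limit=120, switch_at=100):
    value, step = 0, 1
    while value <= limit:
        yield value
        if value >= switch_at:  # the hidden "feedback mechanism"
            step = 2
        value += step

print(list(babbage_counter()))
# ..., 97, 98, 99, 100, 102, 104, ..., 120
```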
Whewell, Herschel and Jones might have been equally disquieted by the comments of Henry Wilmot Buxton, who said of Babbage that he "had taught wheelwork to think, or at least to do the office of thought." (pp. 88) The implications of artificial intelligence have yet to be dealt with today. We have the entrepreneur billionaire Elon Musk warning in 2014 that AI is "our biggest existential threat." It is unclear whether Musk's fears have any basis in reality, but the threat to traditional religion and the concept of the soul is enough to trouble many and was certainly enough to trouble our Anglican protagonists.
Snyder reminds us that most men of science at the time thought of science, and indeed fought for it to be seen, as an equally valid avenue of approach to the divine. "The idea was that nature was one of God's two books - the other, of course, being the Bible," says Snyder. She is right to remind us, "At that time, especially in Britain, science and religion were not considered enemies; on the contrary, they were seen as compatriots, both devoted to the appreciation of the Creator of the universe." Times have changed. It has been widely noted that a survey of members of the U.S. National Academy of Sciences reported that "72.2% did not [believe in a god] and 20.8% were agnostic or had doubts". It is no longer fashionable or required for most scientists to express faith in a national religion.
Snyder notes that Whewell closely followed the British philosopher and cleric William Paley in believing that religion and science were necessarily aligned. His Astronomy and General Physics Considered with Reference to Natural Theology in 1833 echoed Paley's argument that we so well fit our environment that it could not be a matter of chance. It would take Darwin to show that their argument had it exactly backward: We so well fit our environment because we evolved to fit it well, as Bertrand Russell pointed out in his 1927 essay Why I Am Not A Christian. Paley's "divine watchmaker" has given way to Dawkins' "blind watchmaker".
It was Paley who first used the human eye as an example of something that was too complex to have occurred by chance. His lack of imagination led him to conclude that a creator must be required. It is still a mainstay of the case for so-called Intelligent Design. It has been thoroughly debunked by modern science.
Intelligent Design enthusiasts say that certain biological features are far too complex to have evolved using Darwin's algorithm. They (especially Lehigh University biochemistry professor Michael Behe) have claimed that since very complex structures (such as the eye) rely on the simultaneous activation of many (tens, hundreds, thousands) of genes, they could not possibly evolve by successive single mutations. This is known as "irreducible complexity", because the complex nature of the structure is not thought to be reducible into a series of single mutations. That is just not the case. A research team led by Christoph Adami at the Keck Graduate Institute and Caltech in 2005 used software modeling to show how complex adaptive traits such as the eye can evolve using only the Darwinian algorithm via the accumulation of mutations. This is possible because random mutations may survive temporarily even if they are not in themselves adaptive - giving other mutations the opportunity to build upon them.
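A toy simulation makes the point, under assumptions of my own choosing rather than Adami's actual Avida experiments: a "complex" trait that requires several genes to be switched on at once, where no single gene is beneficial by itself, can still arise because individually neutral mutations persist long enough to combine.

```python
# A toy illustration (my own, not Adami's Avida model): a "complex" trait needs
# three specific genes switched on at once.  No single gene is beneficial by
# itself, yet the trait still evolves, because individually neutral mutations
# drift through the population and later combine; once assembled, selection
# sweeps the trait to high frequency.
import random

GENOME_LEN, N_REQUIRED = 8, 3      # hypothetical genome; genes 0..2 form the trait
POP_SIZE, MUTATION_RATE, GENERATIONS = 200, 0.01, 2000

def has_trait(genome):
    return all(genome[:N_REQUIRED])

def fitness(genome):
    return 5.0 if has_trait(genome) else 1.0   # bonus only for the complete trait

def mutate(genome):
    return [(1 - g) if random.random() < MUTATION_RATE else g for g in genome]

random.seed(42)
population = [[0] * GENOME_LEN for _ in range(POP_SIZE)]
first_seen = None

for gen in range(GENERATIONS):
    if first_seen is None and any(has_trait(g) for g in population):
        first_seen = gen
    weights = [fitness(g) for g in population]
    # Fitness-proportional reproduction, then mutation (a bare-bones Darwinian step).
    population = [mutate(random.choices(population, weights)[0]) for _ in range(POP_SIZE)]

carriers = sum(has_trait(g) for g in population) / POP_SIZE
print(f"Trait first appeared in generation {first_seen}; "
      f"carried by {carriers:.0%} of the final population")
```

Until the trait is complete, every genome has the same fitness, so the partial combinations spread or vanish by chance alone; that drift is precisely what gives later mutations something to build upon.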
It is interesting to note that the men of the philosophical breakfast club reached their respective zeniths during a period of post-Enlightenment revivalism and before Darwin injected the first serious scientific basis for doubting revelation. A young Darwin features as a bit player in Snyder's book, bringing elements of foreshadowing to her story. It is also interesting that both Babbage's deduction and Darwin's induction resulted in the understanding of powerful algorithms that would together shape our world. One cannot help but think that both camps had their legitimate points.
We stand on the shoulders of these giants, as Isaac Newton famously observed of his previously unreferenced sources. We need not overly worry that they were merely human and had their faults. As Arthur C. Clarke observed, "If an elderly but distinguished scientist says that something is possible, he is almost certainly right; but if he says that it is impossible, he is very probably wrong."
Snyder ends her history with an almost paradoxical plea, that the separation of science from art - and perhaps implicitly of science from religion - be healed. The difference before and after the philosophical breakfast club was the difference before and after the industrial revolution: The age of the generalist was overtaken, rather brutally, by the age of the specialist. She makes much of Herschel's preferential views on art while noting that the members of the philosophical breakfast club began the process of separation, as science moved at their behest toward a professional discipline of specialists. It seems wistful and unrealistic to desire a return of the amateur. And yet, how many scientists, engineers, and medical doctors today play a musical instrument, draw, paint, write? Many if not most in my personal experience. If anything, the results of the club's desire to see Bacon's induction at the forefront of science and Babbage's personal journey into dedicated deduction have helped usher in a golden age of scientific discovery. It is an age when many of us have the opportunity to learn, discover, and play more than ever before.

Monday, November 17, 2014

Book Review: Why We Believe in God(s) by J. Anderson Thomson

We lined up shoulder-to-shoulder with our arms around each other. We swayed back and forth as we sang. The little old lady to my left was frail, and I was acutely careful not to knock her off balance or bruise her. The song, of course, was Amazing Grace. It is a beautiful and powerful song, perhaps one of the best spirituals ever written. We lifted our voices and sang. We swayed together, left and right.
Slides walked us through the lyrics, projected on a large screen at the front of the room. It seems that everyone knew the first verse, but not the others. "'Twas grace that taught my heart to fear, and grace my fears relieved," we sang, struggling to hit the high notes, "How precious did that grace appear, the hour I first believed!"
Not one of us believed a word. This was, after all, a meeting of the Fredericksburg Secular Humanists. We all sat down as J. Anderson Thomson, Jr., MD, author of Why We Believe in God(s): A Concise Guide to the Science of Faith [Amazon, Goodreads], asked us for a show of hands. Andy is a practicing psychiatrist near the University of Virginia and a faculty member there.
"How many of you feel more positive than you did before singing?", asked Andy. Almost every hand went up. He had us test our pain threshold by pinching the sensitive fleshy area on the back of our hands, between thumb and forefinger. It just didn't hurt as much as it did before we had sang. We had just proven by experiment his contention that ritual group behavior, what Andy calls "song, dance and trance", has a measurable positive effect on the mood and emotions of human beings.
You can do a small scale version of this experiment on yourself. Thomson tells you how on pages 99-100.
The vast majority of people believe in some form of god or other supernatural agency such as ghosts, a guiding hand, a purpose of some kind. Many believe in reincarnation, bodily or not, and an immortal soul. Why should this be? Clearly it is not because we all, from primitive peoples to airplane-riding denizens of skyscrapers, have the same perception of the same god. As Sir James George Frazer and Joseph Campbell showed, religions differ predictably in their fundamentals as the geography of their adherents changes.
Frazer's epic study of comparative religion in the 1890s, The Golden Bough: A Study in Comparative Religion [Amazon, Goodreads], was followed two generations later by Campbell's equally epic Masks of God series: Primitive Mythology [Amazon, Goodreads], Oriental Mythology [Amazon, Goodreads], Occidental Mythology [Amazon, Goodreads] and Creative Mythology [Amazon, Goodreads]. Thomson does not seem to be familiar with either of these studies. Of course, comparing religions was not his goal. Instead, he informs the reader that his studies were informed by a potent combination: his training as a clinical psychiatrist, including Sigmund Freud's The Future of an Illusion [Amazon, Goodreads], his own personal atheism, and the near-death experience of his son Mathew during the horror of the 9/11 attack on New York. He is interested in what is capable of turning a person from a religious believer into a suicide bomber.
Distribution of dominant world religions
No, Thomson did not set out to compare religions. He set out to determine why we believe. That is a tall order and one with a storied history. Heavy thinkers for more than two millennia have focused on explaining why we should believe instead of why we do. Aristotle, noting that all activities have a cause, postulated a prime mover (or unmoved mover) that began all events. St. Thomas Aquinas summarized five reasons (the quinque viae) to believe in the Christian god in his book Summa Theologica [Amazon, Goodreads] in the thirteenth century. There have been so many others, from the ontological argument of René Descartes to Pascal's Wager. All of them, shows Thomson, have missed the point entirely.
I had already read Why We Believe in God(s) when I had the opportunity to hear Andy speak at a lecture in October 2014. He is personally engaging, passionate about his topic and very obviously experienced as a lecturer.
The book is wonderfully concise and has playful, almost humorous, aspects to it. Each chapter is cleverly titled using either a Biblical quotation or one from Christian culture. In spite of its brevity, it has the potential to start a long overdue, critical conversation about religious thought. Why do we believe in gods? Because, as Andy put it in his Fredericksburg lecture, "We are all trapped in a Stone Age brain."
The book is a product of Pitchstone Publishing, a pocket publisher of the "new atheist" genre located in Thomson's home town of Charlottesville, Virginia. Their Web site lists a meager seventeen print titles. I have heard of at least one of them, A Manual For Creating Atheists by philosopher Peter Boghossian [Amazon, Goodreads]. It is on my reading list. Some of the other authors are psychiatrists like Thomson. The book was co-authored by Clare Aukofer, a medical writer and frequent collaborator of Thomson's.
There have been other attempts to explain why religious thought is so prevalent. Bertrand Russell's 1927 essay Why I Am Not A Christian brilliantly and insightfully went to the heart of the matter: addressing the arguments for the existence of god and the usefulness of the church, Russell systematically dismantled the rational reasons to believe. His arguments against the design of nature were powerfully supported by Darwin's theory; with evidence that we have adapted to our environment, it was no longer sensible to argue that the environment was designed for us. Today's generation of both theists and atheists would benefit from dusting off Russell's essay.
More recently, in 2003, Christopher Hitchens took a stand against any form of revealed religion. Hitchens' Razor states, "What can be asserted without evidence can be dismissed without evidence". Hitchens followed astrophysicist Carl Sagan and his "baloney detection kit", introduced in Sagan's 1997 book The Demon-Haunted World [Amazon, Goodreads]. Their arguments echo those of other rational thinkers like the evolutionary biologist Richard Dawkins and the philosopher Daniel Dennett. So-called "new atheists" such as Hitchens, Dawkins and Dennett and their predecessors like Sagan have argued from the standpoint of logic. This is rational because science has uncovered two critical facts:
  1. The mechanisms of the world can be understood and explained through the scientific method, logic and mathematics.
  2. Humans do not think logically very often or for very long.
Unfortunately, point #2 ensures that most people will never be swayed by logical arguments such as Russell's.
The journalist, author and educator Michael Shermer touched on Thomson's core argument in his 1997 book Why People Believe Weird Things: Pseudoscience, Superstition and Other Confusions of Our Time [Amazon, Goodreads]. Although Shermer is not a scientist and did not have the benefit of the last decade of neuroscience, he managed, like Russell, to close in on the core problem:
In my opinion, most believers in miracles, monsters, and mysteries are not hoaxers, flimflam artists, or lunatics. Most are normal people whose normal thinking has gone wrong in some way.
Shermer proceeded to list a large number of logical fallacies where people's thinking goes wrong. He thus follows centuries of atheistic thinkers who could lay out logical reasons for disbelief without being able to explain why people behave as they do. It is not uncommon for scientific explanations to proceed from facts, to knowledge, to theory, to more complete theory. Darwin's On the Origin of Species, published in 1859, did not in fact explain the origin of species, only their differentiation. Others were required to do that, following the discovery of the laws of inheritance by Darwin's contemporary Gregor Mendel and the description of DNA by James Watson and Francis Crick a century later.
By contrast with these earlier authors, Thomson has effectively related the mechanisms of evolutionary psychology to the actions evinced by religious thinkers. Where Shermer was only able to list, Thomson has been able to explain. Thomson systematically describes the reasons and benefits of each cognitive feature prevalent in religious thought so that we may see how and why it has been adopted for that purpose. Religion, it turns out, has empirically come to use each aspect of our brains that enables religion itself to be considered compelling. It is critically important for memes to be sticky; those that are attract adherents and thrive. Those that do not wither and die. A wonderful example is how proselytizing religions such as mainstream Christianity grow while smaller factions that do not proselytize, and in fact erect a high barrier to entry, such as the Shakers, have disappeared entirely or have fallen to just a few members.
Thomson challenges us to think about who we really are and how we really think. Not many will welcome such introspection. Reading Why We Believe in God(s), I am reminded of that other great modern insight, that the concept of self is also an illusion. In The Self Illusion: How the Social Brain Creates Identity by Bruce Hood [Amazon, Goodreads], we are asked to consider what that would mean:
One can imagine all sorts of scenarios in which brain structures are copied or replaced cell by cell until none of the original brain material is left and yet people maintain an intuition that the self somehow continues to exist independently of all these physical changes. If that were true, then one would have to accept a self that can exist independently of the brain. Most neuroscientists reject that idea. Rather, our brain creates the experience of our self as a model - a cohesive, integrated character - to make sense of the multitude of experiences that assault our senses throughout a lifetime and leave lasting impressions in our memory.
This exploration seems dangerous and perhaps it is. Challenging our most closely held beliefs about ourselves may be our final frontier - and maybe even a frontier where most of us dare not go.
In an attempt to summarize Thomson's long list of cognitive features that relate to religious behavior, I decided to group them by their evolutionary benefit. The first and longest list relate to group cohesiveness. Group cohesiveness is critically important to the survival of humans and our closest cousins, chimpanzees and bonobos. Interestingly, it seems that these features align with a category of behavior theorized by Jean Piaget's theory of cognitive development, especially Piaget's symbolic function substage that kicks in between two and four years of age and that is characterized by the beginning of social behavior.
Cognitive Features Driven By Group Cohesiveness
  • Decoupled Cognition. Thomson's definition: This allows us to conduct a complex social interaction in our mind with an unseen other. Notes: Also related to mating behavior. We choose much of our behavior after playing out various scenarios in our heads to judge which would yield results closest to the ones we want.
  • Attachment. Thomson's definition: This most basic of human needs almost defines religion's premise. Religion supplements or supplants family. Notes: Also related to safety. We cling to family in times of hardship or crisis.
  • Intensionality. Thomson's definition: This allows us to speculate about others' thoughts about our thoughts, desires, beliefs, and intentions. Notes: Also related to mating behavior. We take actions based on what we think others are thinking.
  • Theory of Mind. Thomson's definition: This allows us to "read" others' possible thoughts, desires, beliefs, and intentions. Notes: Also related to mating behavior, and closely tied to intensionality. Thomson: "A chimpanzee mother will never share her baby, but humans will because we can judge trust."
  • Mirror Neurons. Thomson's definition: We literally feel each other's pain; this is inborn, not invented by religion. We are born caring about others. Notes: Also related to safety. Mirror neurons are also implicated in spectator sports and probably in systems that inspire via grandeur (e.g. Catholicism, the British Empire).
  • Transference. Thomson's definition: We can accept religious figures as easily as we accepted the family figures we've known since birth. We transfer our familial thoughts to religious figures. Notes: Who are our parental figures once our parents have died or are absent? Not everyone is comfortable becoming a parent without a safety net.
  • Deference to Authority. Thomson's definition: We are all more deferential to authority figures than we can see or want to admit to ourselves. Notes: See Stanley Milgram's experiments.
  • Kin Psychology. Thomson's definition: We are hardwired to prefer our kin over others. Notes: This simple survival mechanism allows us to define an "in group", which has the natural result of simultaneously defining the opposite: the out group. Those not in the in group become The Other.
  • Reciprocal Altruism. Thomson's definition: You scratch my back, I'll scratch yours. Notes: Probably just a human iteration of the social grooming seen in other animals.
  • Moral Feeling Systems. Thomson's definition: These generate moral decisions. They are instinctual and automatic. Because they operate largely outside of awareness, religions can claim ownership of them and insist that we are only moral with faith. Notes: Experiments related to moral sense theory show that moral feelings are innate and not a product of religion.
  • Ritual behavior (song, dance and trance). Thomson's definition: This enhances group cohesion and tests who is committed to the group. Song and dance harness our neurochemistry that reduces pain and fear and increases trust, love, self-esteem, and cooperation. Notes: This was what Thomson was illustrating with the song and dance experiment discussed above.
The following, plus the ones noted in the table above, all relate to safety:
Cognitive Features Related to Safety
  • Hyperactive agency detection (HADD). Thomson's definition: This leads us to assume that unknown forces are human agents. It evolved to protect us. We mistake a shadow for a burglar and never mistake a burglar for a shadow. It encourages anthropomorphism. Notes: It errs toward false positives, with a near-complete elimination of false negatives.
  • Promiscuous Teleology. Thomson's definition: This arises from our bias to understand the world as purpose driven. Notes: Why? What benefit do we get from that? Perhaps we benefit from the presumption that the world is understandable even though nature can be capricious. All of science presumably arises from the bias to understand the world as being understandable.
  • Childhood Credulity. Thomson's definition: We all believe too readily, with too little evidence. Children are even more vulnerable, especially when taught by someone with a mantle of authority. Notes: What happens during adolescence that reduces this?
Some of these features date to the far evolutionary past of our species, so far in fact that we could reasonably suppose that other animals share them.  Others seem to be related to "higher" thought. There is almost surely a continuum between humans and other animals in this regard. Darwin was the first to intuit (or at least to have the guts to express, however lately and reluctantly) that the theory of evolution applied to humans as well as any other animal. Thomson quotes Darwin at the beginning of his seventh chapter, "Such social qualities, the paramount importance of which to the lower animals is disputed by no one, were no doubt acquired by the progenitors of man in a similar manner, namely, through natural selection, aided by inherited habit."
As we approach a complete theory of how cognition operates, we must accept that Darwin was right. Many animals empirically show elements of familiar cognition, although certainly to a lesser degree. The primatology studies of Jane Goodall, which showed that chimps had individual personalities, and the ever-growing catalog of tool usage by animals, have begun to break down the religiously circumscribed belief that humans are somehow separate from the rest of nature. Thomson was right to highlight what he called the "lovely phrase" of William Allman, "We are risen apes, not fallen angels." The quote is from Allman's book Stone Age Present: How Evolution Has Shaped Modern Life - From Sex, Violence and Language to Emotions, Morals and Communities [Amazon, Goodreads].
The many dog cognition studies referenced in The Genius of Dogs: How Dogs Are Smarter Than You Think [Amazon, Goodreads] by Brian Hare and Vanessa Woods seem to lead to the conclusion that dogs do "think" using much of the same physical machinery of the brain that humans have. Hare and Woods illustrate that the special relationship dogs have with people is not just socially (or culturally) created, but a product of a sort of evolutionary co-dependence. Similarly, studies of dolphin cognition show that they, too, are a species with strong social ties. We have no reason to believe that the underlying mechanisms are completely different. To make matters worse for those who would deny that evolved brains produce complex social behavior, the book Animal Architects: Building and the Evolution of Intelligence [Amazon, Goodreads] by James L. Gould and Carol Grant Gould should give any reader pause.
So we are left with a question for Thomson's remaining cognitive features: Do dogs or dolphins, not to mention monkeys and apes, also have these features?
We might think that they do. Wolfgang Köhler's early twentieth-century primatology studies reported the dancing of a group of chimpanzees around a central pole. This probably relates more to group dynamics than religion per se, but it provides an interesting continuity of behavior between our closest extant relatives and the earliest human religions, regardless of the human justifications for the action. The observation is in Köhler's classic The Mentality of Apes [Amazon, Goodreads].
A useful discussion of the study and its wider context is to be found in the first volume of Joseph Campbell's The Masks of God series, Primitive Mythology [Amazon, Goodreads] (pp. 358-359): "It seems to me extraordinary," Köhler concludes, "that there should arise quite spontaneously, among chimpanzees, anything that so strongly suggests the dancing of some primitive tribes." On the same page Campbell says:
We note, furthermore, the surprising detail of the central pole, which in the higher mythologies becomes interpreted as the world-uniting and supporting Cosmic Tree, World Mountain, axis mundi, or sacred sanctuary, to which both the social order and the meditations of the individual are to be directed. And finally, we have that wonderful sense of play, without which no mythological or ritual game of “make believe” whatsoever could ever have come into being.
Other Cognitive Features Implicated in Religious Expression
  • Minimally Counterintuitive Worlds (MCIs). Thomson's definition: This allows belief in the supernatural, as long as it's not too "super" and does not violate too many basic tenets of humanness. Evolutionary benefit: Possibly cognitive closure, which is presumed necessary for understanding the world. Notes: Could this be related to intuitive reasoning, in that one is perhaps a mechanism of cognitive closure and the other (MCI) a minimization of the effort involved?
  • Intuitive Reasoning. Thomson's definition: This helps us "fill in the blanks" of logic. Evolutionary benefit: Possibly cognitive closure, which is presumed necessary for understanding the world. Notes: Could this be a (the?) mechanism of cognitive closure? If we are presented with multiple disconnected facts and need to cognitively close the world, we must do it somehow.
  • Person Permanence. Thomson's definition: (Discussed in Thomson's slides, but not his book.) Evolutionary benefit: Object permanence is necessary for object tracking and recognition, especially in noisy environments. Person permanence probably builds on the same mechanisms. Notes: Also called object-person permanence.
Thomson also lists "Hard to Fake, Costly Honest Signals of Commitment" as an important cognitive feature that religious belief makes use of. An example of a costly signal is the giving of an expensive gift to a friend, a potential mate, or, in the case of potlatch, to an entire community. Costly signals do not seem to fit with an evolutionary argument but with a cultural one. Thomson suggests that costly signals help a person to determine when someone is lying to them (pp. 80-81) and thus associates them with safety. Of course, we cannot perfectly detect lies. Just as shiny hair, full lips or clear skin are often taken as signals for robust health and hence indicate good mating potential, they can also be faked. Faking them is in fact big business. A 2012 study by Michelle Yeomans for the market research firm Lucintel forecast that the global beauty care products industry will grow to approximately $265 billion annually by 2017 (see "Global beauty market to reach $265 billion in 2017 due to an increase in GDP"). Similarly, costly signals can lead us to believe in the sincerity of someone like a religious leader even when the long-term impact of that belief might not be good for us. Thomson naturally uses as examples the mass murder/suicide at Jonestown and Islamist suicide bombers. One is left to wonder just how detrimental mainstream religious belief might be to the interests of believers. There has been wide consensus among literati for millennia that religion helps keep the powerful powerful. "Religion is regarded by the common people as true," said Seneca the Younger in the first century CE, "by the wise as false, and by the rulers as useful." Readers of Alexandre Dumas' The Three Musketeers would recognize the truth of that statement in the character of Cardinal Richelieu. The relationship between established religions and politics seems to be in the headlines of international newspapers on a daily basis.
One of the few criticisms I will make of Why We Believe in God(s) is that the book is in desperate need of an index. The publisher really should have seen to that. Thomson asks in his preface that we take four actions: "Finish the book. Refer to it often. Give it to a friend. Donate it to a library or school." It is in fact rather difficult to refer to specific sections of it quickly in the absence of an index. A good index provides a complement to a table of contents as the other side of the same coin. Readers might approach a table of contents as a map to give them an overview and an index as a key to specific content.
Andy has collected a very large (250 MB) collection of slides and has been incredibly generous in allowing their public distribution. I have placed his entire slide deck online for download here with Andy's permission. Please note that the collection has been modified for various talks over time and does contain some significant duplication. There are no speaker's notes in the slides. However, the book's Web site contains an hour-long video of Dr. Thomson lecturing on the topic using the slides.
I highly recommend Why We Believe in God(s). Perhaps only by understanding why we believe can we begin to consciously decide what we should believe.

An Abbreviated Literature Survey

Andy recommends a number of studies during his lectures. Here is a short list of them for your further enjoyment with notes regarding their applicability:
  • Asp E, Ramchandran K, Tranel D. Authoritarianism, Religious Fundamentalism, and the Human Prefrontal Cortex, Neuropsychology. 2012 Jul;26(4):414-21. doi: 10.1037/a0028526. Epub 2012 May 21.
  • Andrew Newberg and Mark Robert Waldman, Born to Believe: God, Science, and the Origin of Ordinary and Extraordinary Beliefs [Amazon, Goodreads]
  • The first human religion:
    • Nicholas Wade, The Faith Instinct: How Religion Evolved and Why It Endures [Amazon, Goodreads]
  • We attribute to the dead mental states that we cannot shut off, so we think of them still being somewhere. Those raised in religious schools or families lose this tendency later than those raised in secular schools or families:
    • Jesse Bering, The Belief Instinct: The Psychology of Souls, Destiny, and the Meaning of Life [Amazon, Goodreads]
  • Robert Karen, Becoming Attached, Atlantic Monthly, Feb 1990
  • Paul Bloom, Just Babies: The Origins of Good and Evil [Amazon, Goodreads]
  • Matthew D. Lieberman, Social: Why Our Brains Are Wired to Connect [Amazon, Goodreads]
  • Morality is innate and not attributable to religion:
  • Autistic people "simply cannot be religious", as in the case of Temple Grandin:
    • Simon Baron-Cohen, The Essential Difference: Male And Female Brains And The Truth About Autism [Amazon, Goodreads]
  • Neuroimaging evidence that religious experience occurs in Theory of Mind networks:
  • Neuroimaging evidence that belief in self and god overlap; belief in others resides in a different part of the brain:
  • The origin of religion is based on kin group selection, as driven by oxytocin release:
  • Rowers moving in synchrony had higher pain thresholds than those rowing alone:
  • Neuroimaging differences between believers' and nonbelievers' brains: