Monday, November 17, 2014

Book Review: Why We Believe in God(s) by J. Anderson Thomson

We lined up shoulder-to-shoulder with our arms around each other. We swayed back and forth as we sang. The little old lady to my left was frail and I was acutely conscious not to knock her off balance, nor to bruise her. The song, of course, was Amazing Grace. It is a beautiful and powerful song, perhaps one of the best spirituals ever written. We lifted our voices and sang. We swayed together, left and right.
Slides walked us through the lyrics, projected on a large screen at the front of the room. It seems that everyone knew the first verse, but not the others. "'Twas grace that taught my heart to fear, and grace my fears relieved," we sang, struggling to hit the high notes, "How precious did that grace appear, the hour I first believed!"
Not one of us believed a word. This was, after all, a meeting of the Fredericksburg Secular Humanists. We all sat down as J. Anderson Thomson, Jr., MD, author of Why We Believe in God(s): A Concise Guide to the Science of Faith [AmazonGoodreads], asked us for a show of hands. Andy is a practicing psychiatrist near the University of Virginia, where he is also a faculty member.
"How many of you feel more positive than you did before singing?", asked Andy. Almost every hand went up. He had us test our pain threshold by pinching the sensitive fleshy area on the back of our hands, between thumb and forefinger. It just didn't hurt as much as it did before we had sang. We had just proven by experiment his contention that ritual group behavior, what Andy calls "song, dance and trance", has a measurable positive effect on the mood and emotions of human beings.
You can do a small scale version of this experiment on yourself. Thomson tells you how on pages 99-100.
The vast majority of people believe in some form of god or another form of supernatural agency such as ghosts, a guiding hand, a purpose of some form. Many believe in reincarnation, bodily or not, and an immortal soul. Why should this be? Clearly it is not because we all, from primitive people to airplane-riding denizens of skyscrapers, have the same perception of the same god. As Sir James George Frazer and Joseph Campbell showed, religions differ predictably in their fundamentals as the geography of their adherents changes.
Frazer's epic study of comparative religion in the 1890's, The Golden Bough: A Study in Comparative Religion [AmazonGoodreads], was followed two generations later by Campbell's equally epic follow-on, his Masks of God series: Primitive Mythology [AmazonGoodreads], Oriental Mythology [AmazonGoodreads], Occidental Mythology [AmazonGoodreads] and Creative Mythology [AmazonGoodreads]. Thomson does not seem to be familiar with either of these studies. Of course, comparing religions was not his goal. Instead, he informs the reader that his studies were informed by the potent combination of his training as a clinical psychiatrist, including Sigmund Freud's The Future of an Illusion [AmazonGoodreads], his own personal atheism and the near-death experience of his son Mathew during the horror of the 9/11 attack on New York. He is interested in what is capable of turning a person from a religious believer into a suicide bomber.
Distribution of dominant world religions
No, Thomson did not set out to compare religions. He set out to determine why we believe. That is a tall order and one with a storied history. For more than two millennia, heavy thinkers have focused on explaining why we should believe instead of why we do. Aristotle, noting that all activities have a cause, postulated a prime mover (or unmoved mover) that began all events. St. Thomas Aquinas summarized five reasons (the quinque viae) to believe in the Christian god in his book Summa Theologica [AmazonGoodreads] in the thirteenth century. There have been so many others, from the ontological argument of René Descartes to Pascal's Wager. All of them, Thomson shows, have missed the point entirely.
I had already read Why We Believe in God(s) when I had the opportunity to hear Andy speak at a lecture in October 2014. He is personally engaging, passionate about his topic and very obviously experienced as a lecturer.
The book is wonderfully concise and has playful, almost humorous, aspects to it. Each chapter is cleverly titled using either a Biblical quotation or one from Christian culture. In spite of its brevity, it has the potential to start a long overdue, critical conversation about religious thought. Why do we believe in gods? Because, as Andy put it in his Fredericksburg lecture, "We are all trapped in a Stone Age brain."
The book is a product of Pitchstone Publishing, a pocket publisher of the "new atheist" genre located in Thomson's hometown of Charlottesville, Virginia. Their Web site lists a meager seventeen print titles. I have heard of at least one of them, A Manual For Creating Atheists by philosopher Peter Boghossian [AmazonGoodreads]. It is on my reading list. Some of the other authors are psychiatrists like Thomson. The book was co-authored by Clare Aukofer, a medical writer and frequent collaborator of Thomson's.
There have been other attempts to explain why religious thought is so prevalent. Bertrand Russell's 1927 essay Why I Am Not A Christian brilliantly and insightfully went to the heart of the matter: addressing both the arguments for the existence of god and the claimed usefulness of the church, Russell systematically dismantled the rational reasons to believe. His argument against the design of nature was powerfully supported by Darwin's theory: with evidence showing that we have adapted to our environment, it was no longer sensible to argue that the environment was designed for us. Today's generation of both theists and atheists would benefit from dusting off Russell's essay.
More recently, in 2003, Christopher Hitchens took a stand against any form of revealed religion. Hitchens' Razor states, "What can be asserted without evidence can be dismissed without evidence." Hitchens followed the astrophysicist Carl Sagan, whose "baloney detection kit" was introduced in his 1995 book The Demon-Haunted World [AmazonGoodreads]. Their arguments echo those of other rational thinkers like the evolutionary biologist Richard Dawkins and the philosopher Daniel Dennett. So-called "new atheists" such as Hitchens, Dawkins and Dennett, and their predecessors like Sagan, have argued from the standpoint of logic. This is rational because science has uncovered two critical facts:
  1. The mechanisms of the world can be understood and explained through the scientific method, logic and mathematics.
  2. Humans do not think logically very often or for very long.
Unfortunately, point #2 ensures that most people will never be swayed by logical arguments such as Russell's.
The journalist, author and educator Michael Shermer touched on Thomson's core argument in his 1997 book Why People Believe Weird Things: Pseudoscience, Superstition and Other Confusions of Our Time [AmazonGoodreads]. Although Shermer is not a scientist and did not have the benefit of the last decade of neuroscience, he managed, like Russell, to close in on the core problem:
In my opinion, most believers in miracles, monsters, and mysteries are not hoaxers, flimflam artists, or lunatics. Most are normal people whose normal thinking has gone wrong in some way.
Shermer proceeded to list a large number of logical fallacies by which people's thinking goes wrong. He thus follows centuries of atheistic thinkers who could logically establish reasons for disbelief without being able to explain why people behave as they do. It is not uncommon for scientific explanations to proceed from facts, to knowledge, to theory, to more complete theory. Darwin's On the Origin of Species, published in 1859, did not in fact explain the origin of species, only their differentiation. Others were required to do that, following the explanation of inheritance by Darwin's contemporary Gregor Mendel and the description of DNA by James Watson and Francis Crick a century later.
By contrast with these earlier authors, Thomson has effectively related the mechanisms of evolutionary psychology to the actions evinced by religious thinkers. Where Shermer was only able to list, Thomson has been able to explain. Thomson systematically describes the reasons for, and benefits of, each cognitive feature prevalent in religious thought so that we may see how and why it has been adopted for that purpose. Religion, it turns out, has empirically come to exploit each aspect of our brains that makes religion itself compelling. It is critically important for memes to be sticky; those that are attract adherents and thrive. Those that are not wither and die. A wonderful example is how proselytizing religions such as mainstream Christianity grow, while smaller factions that do not proselytize, and in fact erect a high barrier to entry, such as the Shakers, have disappeared entirely or fallen to just a few members.
Thomson challenges us to think about who we really are and how we really think. Not many will welcome such introspection. Reading Why We Believe in God(s), I am reminded of that other great modern insight, that the concept of self is also an illusion. In The Self Illusion: How the Social Brain Creates Identity by Bruce Hood [AmazonGoodreads], we are asked to consider what that would mean:
One can imagine all sorts of scenarios in which brain structures are copied or replaced cell by cell until none of the original brain material is left and yet people maintain an intuition that the self somehow continues to exist independently of all these physical changes. If that were true, then one would have to accept a self that can exist independently of the brain. Most neuroscientists reject that idea. Rather, our brain creates the experience of our self as a model - a cohesive, integrated character - to make sense of the multitude of experiences that assault our senses throughout a lifetime and leave lasting impressions in our memory.
This exploration seems dangerous and perhaps it is. Challenging our most closely held beliefs about ourselves may be our final frontier - and maybe even a frontier where most of us dare not go.
In an attempt to summarize Thomson's long list of cognitive features that relate to religious behavior, I decided to group them by their evolutionary benefit. The first and longest list relates to group cohesiveness. Group cohesiveness is critically important to the survival of humans and our closest cousins, chimpanzees and bonobos. Interestingly, these features seem to align with a category of behavior described in Jean Piaget's theory of cognitive development, especially Piaget's symbolic function substage, which kicks in between two and four years of age and is characterized by the beginning of social behavior.
Cognitive Features Driven By Group Cohesiveness
Decoupled Cognition
  Thomson's definition: This allows us to conduct a complex social interaction in our mind with an unseen other.
  Notes: Also related to mating behavior. We choose much of our behavior after playing out various scenarios in our heads to judge which would yield results closest to the ones we want.

Attachment
  Thomson's definition: This most basic of human needs almost defines religion's premise. Religion supplements or supplants family.
  Notes: Also related to safety. We cling to family in times of hardship or crisis.

Intensionality
  Thomson's definition: This allows us to speculate about others' thoughts about our thoughts, desires, beliefs, and intentions.
  Notes: Also related to mating behavior. We take actions based on what we think others are thinking.

Theory of Mind
  Thomson's definition: This allows us to "read" others' possible thoughts, desires, beliefs, and intentions.
  Notes: Also related to mating behavior, and closely related to intensionality. Thomson: "A chimpanzee mother will never share her baby, but humans will because we can judge trust."

Mirror Neurons
  Thomson's definition: We literally feel each other's pain; this is inborn, not invented by religion. We are born caring about others.
  Notes: Also related to safety. Mirror neurons are also implicated in spectator sports and probably in systems that inspire via grandeur (e.g. Catholicism, the British Empire).

Transference
  Thomson's definition: We can accept religious figures as easily as we accepted the family figures we've known since birth. We transfer our familial thoughts to religious figures.
  Notes: Who are our parental figures once our parents have died or are absent? Not everyone is comfortable becoming a parent without a safety net.

Deference to Authority
  Thomson's definition: We are all more deferential to authority figures than we can see or want to admit to ourselves.
  Notes: See Stanley Milgram's experiments.

Kin Psychology
  Thomson's definition: We are hardwired to prefer our kin over others.
  Notes: This simple survival mechanism allows us to define an "in group", which has the natural result of simultaneously defining the opposite: the out group. Those not in the in group become The Other.

Reciprocal Altruism
  Thomson's definition: You scratch my back, I'll scratch yours.
  Notes: Probably just a human iteration of the social grooming seen in other animals.

Moral Feeling Systems
  Thomson's definition: These generate moral decisions. They are instinctual and automatic. Because they operate largely outside of awareness, religions can claim ownership of them and insist that we are only moral with faith.
  Notes: Experiments related to moral sense theory show that moral feelings are innate and not a product of religion.

Ritual Behavior (Song, Dance and Trance)
  Thomson's definition: This enhances group cohesion and tests who is committed to the group. Song and dance harness neurochemistry that reduces pain and fear and increases trust, love, self-esteem, and cooperation.
  Notes: This is what Thomson was illustrating with the song and dance experiment discussed above.
The following, plus the ones noted in the table above, all relate to safety:
Cognitive Features Related to Safety
Hyperactive Agency Detection (HADD)
  Thomson's definition: This leads us to assume that unknown forces are human agents. It evolved to protect us: we mistake a shadow for a burglar but never mistake a burglar for a shadow. It encourages anthropomorphism.
  Notes: It errs toward false positives while nearly eliminating false negatives.

Promiscuous Teleology
  Thomson's definition: This arises from our bias to understand the world as purpose driven.
  Notes: Why? What benefit do we get from that? Perhaps we benefit from the presumption that the world is understandable even though nature can be capricious. All of science presumably arises from the bias to understand the world as being understandable.

Childhood Credulity
  Thomson's definition: We all believe too readily, with too little evidence. Children are even more vulnerable, especially when taught by someone with a mantle of authority.
  Notes: What happens during adolescence that reduces this?
Some of these features date to the far evolutionary past of our species, so far in fact that we could reasonably suppose that other animals share them. Others seem to be related to "higher" thought. There is almost surely a continuum between humans and other animals in this regard. Darwin was the first to intuit (or at least to have the guts to express, however late and reluctantly) that the theory of evolution applied to humans as well as to any other animal. Thomson quotes Darwin at the beginning of his seventh chapter: "Such social qualities, the paramount importance of which to the lower animals is disputed by no one, were no doubt acquired by the progenitors of man in a similar manner, namely, through natural selection, aided by inherited habit."
As we approach a complete theory of how cognition operates, we must accept that Darwin was right. Many animals empirically show elements of familiar cognition, although certainly to a lesser degree. The primatology studies of Jane Goodall, which showed that chimps have individual personalities, and the ever-growing catalog of tool usage by animals, have begun to break down the religiously circumscribed belief that humans are somehow separate from the rest of nature. Thomson was right to highlight what he called the "lovely phrase" of William Allman: "We are risen apes, not fallen angels." The quote is from Allman's book Stone Age Present: How Evolution Has Shaped Modern Life - From Sex, Violence and Language to Emotions, Morals and Communities [AmazonGoodreads].
The many dog cognition studies referenced in The Genius of Dogs: How Dogs Are Smarter Than You Think [AmazonGoodreads] by Brian Hare and Vanessa Woods seem to lead to the conclusion that dogs do "think" using much of the same physical properties of the brain that humans have. Hare and Woods illustrate the special relationship dogs have with people, not as something merely socially (or culturally) created, but as the product of a sort of evolutionary co-dependence. Similarly, studies of dolphin cognition show that they, too, are a species with strong social ties. We have no reason to believe that the underlying mechanisms are completely different. To make matters worse for those who would deny that evolved brains produce complex social behavior, the book Animal Architects: Building and the Evolution of Intelligence [AmazonGoodreads] by James L. Gould and Carol Grant Gould should give any reader pause.
So we are left with a question for Thomson's remaining cognitive features: Do dogs or dolphins also have these features, much less monkeys?
We might think that they do. Wolfgang Köhler's primatology studies, conducted in the 1910s, reported the dancing of a group of chimpanzees around a central pole. This probably relates more to group dynamics than to religion per se, but it provides an interesting continuity of behavior between our closest extant relatives and the earliest human religions, regardless of the human justifications for the action. The observation is in Köhler's classic The Mentality of Apes [AmazonGoodreads], first published in English in 1925.
A useful discussion of the study and its wider context is to be found in the first volume of Joseph Campbell's The Masks of God series, Primitive Mythology [AmazonGoodreads] (pp. 358-359): "It seems to me extraordinary," Köhler concludes, "that there should arise quite spontaneously, among chimpanzees, anything that so strongly suggests the dancing of some primitive tribes." On the same page Campbell says:
We note, furthermore, the surprising detail of the central pole, which in the higher mythologies becomes interpreted as the world-uniting and supporting Cosmic Tree, World Mountain, axis mundi, or sacred sanctuary, to which both the social order and the meditations of the individual are to be directed. And finally, we have that wonderful sense of play, without which no mythological or ritual game of “make believe” whatsoever could ever have come into being.
Other Cognitive Features Implicated in Religious Expression
Minimally Counterintuitive Worlds (MCIs)
  Thomson's definition: This allows belief in the supernatural, as long as it's not too "super" and does not violate too many basic tenets of humanness.
  Evolutionary benefit: Possibly cognitive closure, which is presumed necessary for understanding the world.
  Notes: Could this be related to intuitive reasoning, in that one is perhaps a mechanism of cognitive closure and the other (MCI) a minimization of the effort involved?

Intuitive Reasoning
  Thomson's definition: This helps us "fill in the blanks" of logic.
  Evolutionary benefit: Possibly cognitive closure, which is presumed necessary for understanding the world.
  Notes: Could this be a (the?) mechanism of cognitive closure? If we are presented with multiple disconnected facts and need to cognitively close the world, we must do it somehow.

Person Permanence
  Thomson's definition: (Discussed in Thomson's slides, but not his book.)
  Evolutionary benefit: Object permanence is necessary for object tracking and recognition, especially in noisy environments. Person permanence probably builds on the same mechanisms.
  Notes: Also called object-person permanence.
Thomson also lists "Hard to Fake, Costly Honest Signals of Commitment" as an important cognitive feature used by religious belief. An example of a costly signal is the giving of an expensive gift to a friend, a potential mate, or, in the case of potlatch, an entire community. Costly signals do not seem to fit with an evolutionary argument but with a cultural one. Thomson suggests that costly signals help a person determine when someone is lying to them (pp. 80-81), and thus associates them with safety.

Of course, we cannot perfectly detect lies. Just as shiny hair, full lips or clear skin are often taken as signals of robust health, and hence of good mating potential, they can also be faked. Faking them is in fact big business. A 2012 report by Michelle Yeomans on research by the market research firm Lucintel forecast that the global beauty care products industry will grow to approximately $265 billion annually by 2017. (See Global beauty market to reach $265 billion in 2017 due to an increase in GDP.)

Similarly, costly signals can lead us to believe in the sincerity of someone like a religious leader even when the long-term impact of that belief might not be good for us. Thomson naturally uses as examples the mass murder/suicide at Jonestown and Islamist suicide bombers. One is left to wonder just how detrimental mainstream religious belief might be to the interests of believers. There has been wide consensus among literati for millennia that religion helps keep the powerful powerful. "Religion is regarded by the common people as true," said Seneca the Younger in the first century CE, "by the wise as false, and by the rulers as useful." Readers of Alexandre Dumas' The Three Musketeers would recognize the truth of that statement in the character of Cardinal Richelieu. The relationship between established religions and politics seems to be in the headlines of international newspapers on a daily basis.
One of the few criticisms I will make of Why We Believe in God(s) is that the book is in desperate need of an index. The publisher really should have seen to that. Thomson asks in his preface that we take four actions: "Finish the book. Refer to it often. Give it to a friend. Donate it to a library or school." It is in fact rather difficult to refer to specific sections of it quickly in the absence of an index. A good index complements a table of contents: the two are opposite sides of the same coin. Readers approach a table of contents as a map that gives them an overview, and an index as a key to specific content.
Andy has collected a very large (250 MB) set of slides and has been incredibly generous in allowing their public distribution. I have placed his entire slide deck online for download here with Andy's permission. Please note that the collection has been modified for various talks over time and does contain some significant duplication. There are no speaker's notes in the slides. However, the book's Web site contains an hour-long video of Dr. Thomson lecturing on the topic using the slides.
I highly recommend Why We Believe in God(s). Perhaps only by understanding why we believe can we begin to consciously decide what we should believe.

An Abbreviated Literature Survey

Andy recommends a number of studies during his lectures. Here is a short list of them for your further enjoyment with notes regarding their applicability:
  • Asp E, Ramchandran K, Tranel D. Authoritarianism, Religious Fundamentalism, and the Human Prefrontal Cortex, Neuropsychology. 2012 Jul;26(4):414-21. doi: 10.1037/a0028526. Epub 2012 May 21.
  • Andrew Newberg and Mark Robert Waldman, Born to Believe: God, Science, and the Origin of Ordinary and Extraordinary Beliefs [AmazonGoodreads]
  • The first human religion:
    • Nicholas Wade, The Faith Instinct: How Religion Evolved and Why It Endures [AmazonGoodreads]
  • We attribute to the dead mental states that we cannot shut off, so we think of them still being somewhere. Those raised in religious schools or families lose this tendency later than those raised in secular schools or families:
    • Jesse Bering, The Belief Instinct: The Psychology of Souls, Destiny, and the Meaning of Life [AmazonGoodreads]
  • Robert Karen, Becoming Attached, Atlantic Monthly, Feb 1990
  • Paul Bloom, Just Babies: The Origins of Good and Evil [AmazonGoodreads]
  • Matthew D. Lieberman, Social: Why Our Brains Are Wired to Connect [AmazonGoodreads]
  • Morality is innate and not attributable to religion:
  • Autistic people "simply cannot be religious", as in the case of Temple Grandin:
    • Simon Baron-Cohen, The Essential Difference: Male And Female Brains And The Truth About Autism [AmazonGoodreads]
  • Neuroimaging evidence that religious experience occurs in Theory of Mind networks:
  • Neuroimaging evidence that belief in self and god overlap; belief in others resides in a different part of the brain:
  • The origin of religion is based on kin group selection, as driven by oxytocin release:
  • Rowers moving in synchrony had higher pain thresholds than those rowing alone:
  • Neuroimaging differences between believers' and nonbelievers' brains:

Tuesday, November 11, 2014

Book Review: How We Think by John Dewey

Our culture is built on ideas from the past. They don't always make sense when laid alongside our rapidly changing understanding of the natural world. Should our political conversations in an urban and post-industrial economy include presumptions of rugged individualism formed in the wilderness of a continent mostly depopulated by horrific diseases brought from Europe? Should our children be taught revelations passed down from the Bronze Age via millennia of interpretations and translations? Should we celebrate holidays whose origins and purposes we do not rationally discuss? Whether we should or should not, we do. We can't help but build in our short lives on the bits and pieces left to us from earlier ages.
My father, a career professional educator, died last year. I found a copy of How We Think [AmazonGoodreads] by John Dewey on his bookshelf. Dewey was a leading philosopher and educational theorist of the early twentieth century. His ideas have shaped a century of educational thought. To this day, reviews of his work by teachers rave breathlessly about his prescience, his clear thinking and his relevance to modern education. They have used this book and some of his others in university classes and relate to Dewey as a forefather of their teaching philosophy.
Amazon boasts a whopping seventy editions and formats of How We Think. Since its original copyright protection has expired, the book remains in print from several publishers. The book was originally written in 1909 and published the following year. I reviewed the 1997 paperback edition published by Dover Publications.
Dewey has that rarest form of Wikipedia street cred: a bibliography page separate from his biographical page. The bibliography page lists 29 books authored by him, an impressive achievement for anyone, academic or not. His biographical page mentions several of his notable ideas, including his founding of the American Association of University Professors (AAUP) and, right up top, his concept of "reflective thinking". The latter found its initial expression in How We Think.
Having built Dewey up to the apex of his pedestal, I will attempt to rip him down from it as if he were a Saddam Hussein impersonator at an Iraqi War veterans reunion. Dewey's ideas are badly dated and, especially in light of his continued prominence among teachers and school administrators, dangerous to our best current concepts of education.
Dewey's intellectual lineage traces from Aristotle through Aristotle's primary medieval interpreter ibn Sīnā, to the Enlightenment philosophers John Locke and Jean-Jacques Rousseau, and to Charles Peirce (pronounced "purse"), a founder of the Pragmatism school of philosophy to which Dewey would subscribe and which he would extend. Some would say that he favored Plato more than Aristotle, and this may be so, but I list Aristotle for two very good reasons. Dewey's philosophy relies on both Aristotle's passé conception of the infant mind as a blank slate, or tabula rasa, and on a "scientific" worldview that was inadequately inductive even for his time. These failings conspire to make his conclusions questionable.
It would be difficult to overestimate the importance of Locke in Dewey's thinking. Dewey didn't make much more progress than Locke in figuring out why some people are educable and others less so. He merely categorized them following Locke's approach. These days we might suggest that people who cannot be educated have a natural variation in neurotransmitters that suppresses neural plasticity, much as some people make good guitar players or athletes partially due to a superior ability to learn "muscle memory".
John Locke was the philosopher behind the modern conception of the self. Locke believed in the human mind as a blank slate at birth, a tabula rasa, which to my mind has been thoroughly debunked even though you will find some adherents still desperately clinging even as they are tossed on the high seas of our age. Locke was clearly an Aristotelian in that he picked up the tabula rasa from Ibn-Sīnā and the cosmological argument directly from Aristotle. He was thus a man of his time. However, his conception of the self appears to have been new and his "Associationism", the idea that the associations formed in the mind of youth are critical for the formation of all later thought, has become the foundation of later attempts at educational reform, including Dewey's.
One area of philosophy modernized by Dewey's time was the recognition that the mind exists in the brain and that the brain is inarguably part of the body. Those who believed that the mind was of different stuff than the body ("dualists") lost ground to modern science. Those who recognized the common elements of brain and body, and the mind as an emergent property of the brain, are known as monists. Locke, notionally a dualist, seems to have acknowledged a hint of monism in his An Essay Concerning Human Understanding when he said that "the body too goes to the making the man." Maybe Dewey, apparently a monist, picked it up from him.
Dewey begins his book by thinking about thinking. He rightly discusses the four major approaches to his subject. His language makes it clear where he stands.
In the first place thought is used broadly, not to say loosely. Everything that comes to mind, that "goes through our heads," is called a thought. To think of a thing is just to be conscious of it in any way whatsoever. Second, the term is restricted by excluding whatever is directly presented; we think (or think of) only such things as we do not directly see, hear, smell, or taste.
One might reasonably suppose that he would also exclude touch, if asked.
Then, third, the meaning is further limited to beliefs that rest upon some kind of evidence or testimony. Of this third type, two kinds - or, rather, two degrees - must be discriminated. In some cases, a belief is accepted with slight or almost no attempt to state the grounds that support it. In other cases, the ground or basis for a belief is deliberately sought and its adequacy to support the belief examined. This process is called reflective thought; it alone is truly educative in value, and it forms, accordingly, the principal subject of this volume.
These definitions certainly sound reasonable, even learned, but they are the very source of my objection to Dewey's philosophy and everything that springs from it. They are anti-science in their way. They were based on no observation, no inductive analysis of many facts leading to generalizations only where patterns naturally emerge. They are as scientific as Aristotle's suppositions that thinking occurred around the heart, that the brain was a mere "cooling organ" as evidenced by the running of classical noses, or his idea that women are cooler than men (they are in fact slightly warmer). As Hawkins noted in On Intelligence [AmazonGoodreads] (p. 32), "looking across the history of science, we see our intuition is often the biggest obstacle to discovering the truth." Dewey's axioms of thinking are simply philosophic, in the navel-gazing sense of the term. We can now do better.
The pioneering neuroscientist Vernon Mountcastle noted in 1978 that the human neocortex consists of the same basic structure throughout, from the parts that process vision to the parts that process hearing to the parts that recognize a friend to the parts that predict the weather from the color, shape and distribution of the clouds overhead. Jeff Hawkins has called Mountcastle's paper the "Rosetta stone of neuroscience". Everywhere are the same vertical columns, always six identifiable layers deep, all consisting of the same types of cells in the same distribution. Only the pliable, changeable connections between them are different, and those connections change with experience and with injury. Indeed, a person blinded does not lose the use of that portion of the cortex that used to process vision. Instead, those areas are co-opted to process other sensory inputs, and they often do so as completely as if they were originally part of the structure that we have come to call the somatosensory cortex. Even the horrific injury sustained by Phineas Gage healed sufficiently to leave him functional after some time, although we now recognize how fortunate he was not to have damaged other specific areas of his brain. This analysis leaves little room for Dewey's assertion that we may somehow separate "mere" consciousness from those cognitive activities based on processing sensory input. The cortex processes both in areas that are seemingly interchangeable.
Dewey's two-part third category of thought is in no better shape. On what basis might we suppose that fantasy or fictional thought may be separated from some form of rational thought that demands evidence? "Reflection," says Dewey, "involves not simply a sequence of ideas, but a consequence - a consecutive ordering in such a way that each determines the next as its proper outcome, while each in turn leans back on its predecessors." There is little doubt here that Dewey relates reflective thought to deductive reasoning, as in a mathematical proof. This would certainly fit with his stated goal to make scientific the process of education. There are at least two problems with this line of thinking.
Firstly, fictional and fact-oriented thought are processed by the same part of the brain, in the same way. Associations between concepts seem to be the cortex's currency, not some (to the brain) arbitrary factual grounding. What about novels, plays or poems that reflect life so sublimely that they inspire or teach us something about ourselves? The playwright Ayad Akhtar has said, "An artist's job is to tease and poke and question the larger racial, ethnic, religious and social conscience and in the process to provoke questions that lead to new practices and new ways of seeing." Akhtar neatly shows the fallacy of Dewey's argument for a separation of thought based on some form of intent. This was well known to the ancient Greeks, so I'm not sure what led Dewey astray. It was, however, philosophic reductionism run amok.
There are also solid evolutionary reasons for the human brain not to separate fact from fiction. We rely on our ability to consider various scenarios before we act. Those scenarios are fictional, surely, and yet by playing them through in our heads we can decide which are most likely to lead to outcomes we prefer. "Imagine that the only way you could think about what might be going on in another person's mind was for that person to be sitting in front of you," suggested Andy Thomson in his impressive little book Why We Believe in God(s): A Concise Guide to the Science of Faith [AmazonGoodreads]. "Human relationships as we know them would be impossible." Indeed. "We need to evaluate the likely thoughts and feelings of others, even when those others are nowhere to be seen." Are these evaluations not fictional? We might ask whether a boss will think badly of us if we are late one more time, or whether a parent will punish us if we don't clean our room, or what it will take for that cute girl to agree to a date. These fictions are the very key to navigating our real-world life. They allow us to determine our eventual behavior. Without our fictional thoughts we would all be sociopaths.
Secondly, Dewey's mathematical analogy doesn't hold up. Mathematical deduction itself proceeds from axioms that are presumed to be self-evident. The classic postulates of Euclid's geometry, for example, are these (translated by Thomas Heath [AmazonGoodreads] with contemporary comments by me in parentheses):
  1. "To draw a straight line from any point to any point." (A straight line is drawn between any two points)
  2. "To produce [extend] a finite straight line continuously in a straight line." (A straight line extends forever in both directions)
  3. "To describe a circle with any centre and distance [radius]." (A circle can be drawn from a center point and any fixed radius)
  4. "That all right angles are equal to one another."
  5. "That, if a straight line falling on two straight lines make the interior angles on the same side less than two right angles, the two straight lines, if produced indefinitely, meet on that side on which are the angles less than the two right angles."
Most school children are introduced to Euclid's axioms, although they might be given in more modern language. Geometry on a flat surface ("plane" geometry) as laid down by Euclid stood unchallenged for more than two millennia. There was only one problem: the fifth postulate (the so-called parallel postulate) can neither be proven from the other four nor is it needed; perfectly consistent geometries exist without it. It got in the way of Einstein's Theory of Relativity, and Einstein needed to jettison it in order to make progress. Jason Socrates Bardi did a fine job tracing the painful history of the thing in his book The Fifth Postulate: How Unraveling A Two Thousand Year Old Mystery Unraveled the Universe [AmazonGoodreads].
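The independence of the parallel postulate is easy to demonstrate for yourself. On the surface of a sphere, where "straight lines" are great circles and the parallel postulate fails, the angles of a triangle sum to more than 180 degrees. A small sketch (my own illustration, not from Bardi's book) computes the angles of the triangle that covers one octant of a globe:

```python
import math

def angle_at(a, b, c):
    """Angle at vertex a of the spherical triangle abc (unit vectors),
    measured between the great-circle arcs a->b and a->c."""
    def tangent(v):
        # component of v tangent to the sphere at point a
        d = sum(x * y for x, y in zip(a, v))
        return [x - d * y for x, y in zip(v, a)]
    u, w = tangent(b), tangent(c)
    dot = sum(x * y for x, y in zip(u, w))
    nu = math.sqrt(sum(x * x for x in u))
    nw = math.sqrt(sum(x * x for x in w))
    return math.acos(dot / (nu * nw))

# The triangle whose vertices are the north pole and two points on the
# equator a quarter-turn apart: it covers one octant of the sphere.
A, B, C = (1, 0, 0), (0, 1, 0), (0, 0, 1)
total = sum(math.degrees(angle_at(p, q, r))
            for p, q, r in [(A, B, C), (B, C, A), (C, A, B)])
print(total)  # ~270 degrees; a flat triangle's angles always sum to 180
```

Each of the three angles is a right angle, so the sum is 270 degrees, which is impossible in any geometry where the fifth postulate holds.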
Euclid's failure, only recognized in modern times, well illustrates the dangers of proceeding in a "scientific" manner from questionable presumptions. The resulting edifice might be useful in a given context but taking it as written can and probably will slow future progress. Dewey's theory was only right, or useful, in the cultural context of the West.
Dewey shows that he is operating within the strict cultural context of the Western world when he states that the aim of his book is to teach children to "convey knowledge and assist thought." And later, "to direct pupils' oral and written speech, used primarily for practical and social ends, so that gradually it shall become a conscious tool of conveying knowledge and assisting thought" (pp. 179). Those goals, however laudable they may be in our current society, are so very different from the immediate survival skills necessary in a hunter-gatherer society, a herding culture or a civilization mired in warfare. They are the goals of a well-fed and generally secure post-agricultural civilization with a lot of time on its hands.
Dewey is hardly the only philosopher in history to overgeneralize in a cultural context. Aristotle's most famous errors, from his diminution of the brain to his many fallacious statements about women, proceeded from presumptions that were founded upon cultural preconceptions, not truth. This way lies the definition of paradigms that must be, painfully, overthrown in time as Thomas Kuhn famously described in The Structure of Scientific Revolutions [AmazonGoodreads]. Kuhn was on to something in spite of the half-century of criticisms around his central idea.
Dewey next pulls an academic sleight-of-hand to justify his "scientific" approach: He redefines the word "logic" to suit his thesis. Redefinition, or at least careful definition, of terms is a sharp tool in the academic toolbox. Often it fosters a more precise understanding when the primary mode of communication, language, is inherently too imprecise for the purpose at hand.  Judge, though, Dewey's usage:
In its broadest sense, any thinking that ends in a conclusion is logical - whether the conclusion reached be justified or fallacious; that is, the term logical covers both the logically good and the illogical or the logically bad. In its narrowest sense, the term logical refers only to what is demonstrated to follow necessarily from premises that are definite in meaning and that are either self-evidently true, or that have been previously proved to be true. Stringency of proof is here the equivalent of the logical. In this sense mathematics and formal logic (perhaps as a branch of mathematics) alone are strictly logical. Logical, however, is used in a third sense, which is at once more vital and more practical; to denote, namely, the systematic care, negative and positive, taken to safeguard reflection so that it may yield the best results under the given conditions. [...] In this sense, the word logical is synonymous with wide-awake, thorough, and careful reflection - thought in its best sense. (pp. 56, my emphasis of the last sentence)
Dewey's idea, clearly expressed here, is that only when we are wide awake and spend time carefully reflecting (that is, working through the logical conclusions of our ideas, structuring them and relating them to proven facts) can we be considered to be thinking "in its best sense". One wonders at the temerity of that statement. There is no doubt that such thinking has led our current Western civilization to construct complex and occasionally complete theories of the natural world, invent new materials, machinery and social structures, and to provide unprecedented billions with more food and physical safety than in all previous generations combined. And, as Carl Sagan correctly said, "Skeptical scrutiny is the means, in both science and religion, by which deep thoughts can be winnowed from deep nonsense." But best? How then to account for the human animal's ability to have successfully spread to every continent save Antarctica before the agricultural revolution, much less the industrial one? Our hunter-gatherer ancestors certainly did not think in Dewey's best sense. Yet they survived rather well and did not create nuclear weapons, biochemical warfare, land mines, nor burn or spill oil on industrial scales.
I do not yearn for some idealized past full of noble savages, nor do I believe that humans have fundamentally changed much in the short millennia since cultural advances created civilization. But I do believe that "best" is a relative term which one should certainly apply with care. To apply it to one's own place and time smacks of, at best, parochialism and, at worst, nationalism or racism.
Taken all together, the book would better have been titled How We Should Think or How I Think We Should Think. The way that Dewey postulates thought is more about how we learn, and even then does not match our best current understanding.
John Dewey, like all the members of his intellectual lineage, was brilliantly wrong. His well-thought-out and finely honed educational theories rest upon a foundation of sand. It is high time that his theories are reviewed and revised for a more monist age. The scientific community now generally accepts the non-theist (if not always atheist) proposition that the mind is housed in the brain and that the brain is made of the same materials, by the same processes, as the rest of the human body. A mind is as much a product of evolution as a heart or toes. Few people could say with a straight face that the scientific consensus is that the mind is a separate creation. Of course, that is not so for the general population.
Not that Dewey was a theist. He was a brilliant iterator of the Pragmatist school of philosophy and apparently an atheist. His conception of education was to maximize the capabilities of the individual in his industrial age. "If these pages assist any to appreciate this kinship" between childish thought and scientific thought "and to consider seriously how its recognition in educational practice would make for individual happiness and the reduction of social waste, the book will amply have served its purpose," said Dewey in his preface (pp. vii, my emphasis). His Pragmatic philosophy seems to have been an American iteration on the British Utilitarian school started the century before by Jeremy Bentham.
The problem with Dewey's poor foundation for reflective thought is how influential he has been on the American educational system. Perhaps this is one reason for America's poor standing in education as judged on a worldwide basis. It is high time that Dewey's contribution to the guiding principles of American education were reexamined. His late contemporary Maria Montessori disagreed with him fundamentally: she did not think that a teacher must feel her tasks "made heavier in that they have come to deal with pupils individually and not merely in mass." (pp. vii) On the contrary, Montessori argued that "Free choice is one of the highest of all the mental processes" when educating a child, and that guiding children to make good choices was a necessary process. "To let the child do as he likes when he has not yet developed any powers of control", Montessori said, "is to betray the idea of freedom." Alas, Dr. Montessori's ideas have never found full expression in the United States in spite of her superior research. This has been due in equal measure to the radical nature of her changes, the necessity for significant education of teachers, and her unfortunate citizenship in the defeated Italian fascist state under Benito Mussolini.
Dewey's understanding of human consciousness has also not stood the test of time. We have come a long, long way toward an understanding of consciousness in spite of loudly repeated claims to the contrary. The common refrain of linguists, philosophers and many psychologists, that the key tool of consciousness is the creation, storage and manipulation of symbols, is itself emblematic of the very progress that they deny. Take, for example, these statements by Dewey in 1910 (pp. 170-171):
Three typical views have been maintained regarding the relation of thought and language: first, that they are identical; second, that words are the garb or clothing of thought, necessary not for thought but only for conveying it; and third (the view that we shall here maintain) that while language is not thought it is necessary for thinking as well as for its communication. When it is said, however, that thinking is impossible without language, we must recall that language includes much more than oral and written speech. Gestures, pictures, monuments, visual images, finger movements - anything consciously employed as a sign is, logically, language. To say that language is necessary for thinking is to say that signs are necessary. Thought deals not with bare things, but with their meanings, their suggestions; and meanings, in order to be apprehended, must be embodied in sensible and particular existences. Without meaning, things are nothing but blind stimuli or chance sources of pleasure and pain; and since meanings are not themselves tangible things, they must be anchored by attachment to some physical existence.
Dewey's expansive definition of language allows for Helen Keller's intelligence, which would otherwise have been excluded. Such an exclusion would have rendered the neat little theory meaningless in the face of evidence to the contrary.
It is worth comparing Dewey's thesis with Hawkins's. Dewey takes quite a behaviorist approach; behavior, in this case language, equals intelligence. Hawkins disagrees:
But intelligence is not just a matter of acting or behaving intelligently. Behavior is a manifestation of intelligence, but not the central characteristic or primary definition of being intelligent. A moment's reflection proves this: You can be intelligent just lying in the dark, thinking and understanding. Ignoring what goes on in your head and focusing instead on behavior has been a large impediment to understanding intelligence and building intelligent machines. (pp. 29)
It is interesting to note that Hawkins so completely refutes Dewey's understanding of the importance of behavior while using the term "reflection" to do it.
The idea of the mental "sign" is finally being overcome. Mountcastle, Hawkins and those neuroscientists who have uncovered the way ideas both concrete and abstract are laid down in the tangled interconnections of the cortex have shown us that the "sign" is merely a collection of activation patterns of neuronal columns that itself adapts to new input. We are close to understanding the algorithm of thought. Indeed, the implementation of Hawkins' admittedly experimental and approximate "cortical learning algorithm" does in fact make solid predictions from streams of input data in a way that is both reminiscent of the way humans predict and different from all other existing forms of artificial intelligence.
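Hawkins's actual cortical learning algorithm is far more sophisticated, but the basic idea of learning to predict from a stream of inputs can be sketched with a deliberately crude stand-in. The toy below is my own and bears no resemblance to Hawkins's implementation: it merely counts first-order transitions and predicts the most frequent successor of the current symbol.

```python
from collections import defaultdict, Counter

class StreamPredictor:
    """Toy online predictor: learns which symbol tends to follow which,
    and predicts the most frequent successor of the current symbol.
    (A crude stand-in, nothing like the real cortical learning algorithm.)"""
    def __init__(self):
        self.transitions = defaultdict(Counter)
        self.prev = None

    def observe(self, symbol):
        # Learn continuously as the stream arrives, never in a batch.
        if self.prev is not None:
            self.transitions[self.prev][symbol] += 1
        self.prev = symbol

    def predict(self):
        options = self.transitions.get(self.prev)
        if not options:
            return None  # no experience yet, so no prediction
        return options.most_common(1)[0][0]

p = StreamPredictor()
for s in "abcabcabc":
    p.observe(s)
print(p.predict())  # having just seen 'c', it predicts 'a'
```

Even this trivial version captures the point of the passage above: prediction falls out of stored associations between inputs, with no "signs" or "symbols" designed in.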
And yet, persistent ideas die slowly. Philosophers have argued for many years whether the mind is of a different nature from the brain, and some still do, although their voices are less loud. Similarly, philosophers have argued (this is what they do) whether some human thought processes are innate, like instincts, or whether we learn absolutely everything. The view that we do have some innate thought patterns, such as facial recognition or the strong tendency to assign an intelligent purpose to rapid movement in our peripheral vision, has been well proven to my mind. These instincts help to keep us alive, safe and social.
The understanding that we have some innate thoughts is known as nativism and comes in two important kinds. The linguist Noam Chomsky has championed the idea of a "Universal Grammar", structures or modules in the brain that are responsible for a child's ability to rapidly learn languages. Others have suggested that the brain is a form of computer, not in its architecture but in its ability to compute. That is, the brain can and does perform computations and it is made of modules that provide computational functions.
Very similar views to Dewey's regarding "signs" in the mind have been expressed in recent times by Chomsky and the philosopher Jerry Fodor. Fodor's book The Mind Doesn't Work That Way [AmazonGoodreads] was a direct criticism of and response to Steven Pinker's book How the Mind Works [AmazonGoodreads], and although I have criticized Pinker in some regards I am much more in his camp than Fodor's. Fodor is a fun read, though. The man is a hoot. But his insistence on the wobbly cairn of mental signs is both dated and increasingly rather silly in the face of new neuroscience. "I think some version of Chomskian nativism will probably turn out to be true and that the current version of New Synthesis nativism probably won't," says Fodor.
Fodor, for many years an MIT professor, is intellectually honest enough to admit that he might be wrong. He states at the opening of his fourth chapter that if cognition is in fact all or mostly modular, as Pinker, Mountcastle and Hawkins say that it is, then "Pinker and Plotkin are probably right about the prospects for New Synthesis Psychology being very good, and I have been wasting your time. (Mine too, come to think of it.)" Fodor might be that rare man who can adapt to yet another revolution in his old age.
Dewey did not have Mountcastle to tell him that the modular components of the human cortex were practically structurally identical throughout, nor Hawkins to prove that prediction could be implemented by emulating that structure. Without proof for those modules, he fell in line with his lineage and attempted to take it one step farther. His reliance on undefined "signs" and "symbols" in the brain to somehow explain thought looks more like mumbo jumbo than science. "Learning," Dewey insists, "in the proper sense, is not learning things, but the meanings of things, and this process involves the use of signs, or language in its generic sense." (pp. 176). He goes on to warn that educational reformers who reject such analysis risk "the destruction of the intellectual life", a serious charge if it had any basis. We are now rather sure that the human cortex has no inherent separation between things and their meanings. It is the associations between similar structures that matter, not some hardwired categorization in our mental hardware.
Were we to replace this loose, unempirical mumbo jumbo with the concrete description of the operation of the human neocortex by Hawkins, something amazing pops out: strong support for the Computational Theory of Mind. The CTM postulates that the brain is fundamentally an information processing system, both enabled and constrained by the same mathematical limits as any other computation. Importantly, the CTM implies that we can eventually understand the brain's operations. That is not to say in any way that the brain is a "computer" as the term is currently used. The brain certainly does not implement the von Neumann architecture used by the artificial computers of our age.
Neuroscientists call high-level concepts in the brain "invariant representations". Invariant representations ensure that an object in motion is still thought of as the same object, even if it is changing position, moves in or out of shadows, if you can only see part of it at any one time, etc. Invariant representations are created automatically by the cortex, says Hawkins, using exactly the same cortical structures that are used elsewhere. Invariant representations are both the result of senses and memory and used by them to understand the world around us. It is my understanding that we could justifiably replace the "signs" and "symbols" of linguistic theory with the invariant representations of modern neuroscience and lose exactly nothing in translation. I think that we could begin to have a serious scientific conversation about the biology of language.
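As a loose analogy only (my own sketch, not a model of the cortex): an invariant representation behaves like a canonical form that survives a transformation, the way describing a shape relative to its own centroid survives any translation of that shape.

```python
def translation_invariant(points):
    """Describe a 2-D shape relative to its centroid, so that shifted
    copies of the same shape yield the identical description."""
    n = len(points)
    cx = sum(x for x, _ in points) / n
    cy = sum(y for _, y in points) / n
    # Sort so the description doesn't depend on point order either.
    return sorted((round(x - cx, 9), round(y - cy, 9)) for x, y in points)

square = [(0, 0), (0, 1), (1, 0), (1, 1)]
shifted = [(x + 5, y - 3) for x, y in square]    # the same square, moved
print(translation_invariant(square) == translation_invariant(shifted))  # True
```

The cortex, of course, builds far richer invariances (to lighting, rotation, partial occlusion) and builds them automatically, but the principle of a representation that stays constant while the raw input changes is the same.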
Of course, Dewey was a certifiable genius and he was not all wrong. Far from it. There is much in his work left to admire if one takes into account the bits that have been surpassed by further research.
To his credit, Dewey seems to have been an early adopter of the embodied cognition school, which embraced the idea that the mind is of the body. It is unclear whether he was influenced primarily by a close reading of Locke, as I suggested earlier, or whether he was motivated by his generation's raucous wrestling with Darwin's theory of evolution. Given Dewey's progressive stance on religion (he didn't seem to have one), it is not unreasonable to think that both played a part.
Dewey suggested that "enlargement of vocabulary" was the key to creating an educated person. He was undoubtedly right about that because associating many words results in more and stronger neural connections between cortical regions. There are other techniques that have the same result, and they should be given equal air time in any theory of education. Learning to play the piano is a commonly cited example. Performing research using the scientific method is another. Dewey was barking up a legitimate tree in a forest full of the same species.
Dewey's statement that "Looseness of thinking accompanies a limited vocabulary" (pp. 181) strikes me as a restatement of the Sapir-Whorf Hypothesis. For the last twenty years, most scholars studying how language intersects with thought have accepted that it does have some effect. Current research focuses on how, and how much.
Dewey also recognized, possibly through his own experience, the tendency of students to prefer material novel to them over the details of the familiar (pp. 221-222). Pat Conroy agreed in his fascinating memoir of teaching poor children on a South Carolina island, The Water is Wide [AmazonGoodreads]. Conroy introduced classical music to children who had little interest in classical education and used the stories behind the songs to introduce geography, history and biography. It was a neat trick and an excellent application of Dewey's approach. Children and adults become rapt by the unexpected or unfamiliar because our evolution ensures that we pay attention to changing circumstances. Neuroscience agrees. Unexpected experiences cause a cascade of neuronal activity throughout the cortex which updates old conceptions of normality.
Thomson has illustrated that we as a species can often be tricked into believing something factually nonsensical. One of the ways that religion becomes embedded in our consciousness is by being introduced to so-called "minimally counterintuitive worlds". In line with Dewey and Conroy, he notes that such stories are "an optimal compromise between the interesting and the expected". They allow us to accept new ideas from others readily precisely because they mix what we already know and can verify with something just a bit novel and therefore interesting.
Given this rare alignment between Dewey's theories, teacher experience and neuroscience, it might reasonably seem amazing that the current US trend is to teach students material in relation to their home town. We have every reason to believe that this works against the process of education. Dewey's advice, gleaned from the Associationist concept he picked up from Locke, was to specifically avoid experiences that would form negative associations. Fostering a love of learning is much more important than forcing the bulk memorization of facts.
Dewey railed against the 'Fallacy of making "facts" an end in themselves' (pp. 188). One can almost hear the frustration of students exposed to the "ten thousand facts" approach to learning, then and now. Our most recent foray into educational reform, the so-called No Child Left Behind Act of 2001, ensured that schools would teach to the many standardized tests and thus run, not walk, away from Dewey's ideas. It is little wonder that teachers trained in Dewey object.
The style of education that teaches only facts relies on each individual having his or her own insightful moment that connects the facts together into some consistent whole. Such teachers create cognitive dissonance and hope that students will resolve it for them. Naturally, many students will not intuit useful connections themselves.
Toward the end of his long career, Dewey did accept significant criticism of his theories and refined them accordingly. However, he never gave up on reflective thought, probably because few dared challenge him on that topic prior to the last decade's worth of neuroscience research. His 1938 book Experience and Education [AmazonGoodreads] redefines his ideas toward the end of his working life.
In that volume, the older Dewey echoes Michel de Montaigne in his concern for the ultimate result of education:
What avail is it to win prescribed amounts of information about geography and history, to win ability to read and write, if in the process the individual loses his own soul: loses his appreciation of things worth while, of the values to which these things are relative; if he loses desire to apply what he has learned and, above all, loses the ability to extract meaning from his future experiences as they occur?
Montaigne said something very similar around 1580 in his Essays when he asked, "What good can we suppose it did Varro and Aristotle to know so many things? Did it exempt them from human discomforts? Were they freed from the accidents that oppress a porter? Did they derive from logic some consolation for the gout?" It is a question that all philosophers must ask as they develop their art. We are all mortal and there will come a time when the carefully constructed neuronal connections in our heads become, literally, fertilizer for new generations. We can but take solace in the fact that those ideas that are written, passed down, discussed, and argued over define not only our culture but have a chance to influence the culture of our descendants. We are still reading Plato, Aristotle, Montaigne and, yes, Dewey. "We are all worms," noted Winston Churchill, "But I believe I am a glow-worm." So was Dewey.
There is of course another way to answer Montaigne's question. Montaigne and Dewey both look at the value of education from the perspective of the individual. But humans are a social species. Society has certainly benefited from the studies of those many unsung heroes through the ages. Many of us currently do escape the pain of illnesses, recover from accidents and even from some of the ravages of age. Our lives are both longer and more pleasant than those of the people who came before us. We owe a mountain of debt to all people who struggle to understand and pass their lessons down. They are the giants on whose shoulders we wobble, squat and sometimes stand.