Our culture is built on ideas from the past. They don't always make sense when laid alongside our rapidly changing understanding of the natural world. Should our political conversations in an urban and post-industrial economy include presumptions of rugged individualism formed in the wilderness of a continent mostly depopulated by horrific diseases brought from Europe? Should our children be taught revelations passed down from the Bronze Age via millennia of interpretations and translations? Should we celebrate holidays whose origins and purposes we do not rationally discuss? Whether we should or should not, we do. We can't help but build in our short lives on the bits and pieces left to us from earlier ages.
My father, a career professional educator, died last year. I found a copy of How We Think [Amazon, Goodreads] by John Dewey on his bookshelf. Dewey was a leading philosopher and educational theorist of the early twentieth century. His ideas have shaped a century of educational thought. To this day, reviews of his work by teachers rave breathlessly about his prescience, his clear thinking and his relevance to modern education. Teachers have used this book and some of his others in university classes and relate to Dewey as a forefather of their teaching philosophy.
Amazon boasts a whopping seventy editions and formats of How We Think. The book remains in print from several publishers since its original copyright protection has expired. It was written in 1909 and published the following year. I reviewed the 1997 paperback edition published by Dover Publications.
Dewey has that rarest form of Wikipedia street cred: a bibliography page that is separate from his biographical page. The bibliography page lists 29 books authored by him, an impressive achievement for anyone, academic or not. His biographical page mentions several of his notable ideas, including his founding of the American Association of University Professors (AAUP) and, right up top, his concept of "reflective thinking". The latter found its initial expression in How We Think.
Having built Dewey up to the apex of his pedestal, I will attempt to rip him down from it as if he were a Saddam Hussein impersonator at an Iraq War veterans' reunion. Dewey's ideas are badly dated and, especially in light of his continued prominence among teachers and school administrators, dangerous to our best current concepts of education.
Dewey's intellectual lineage traces from Aristotle through Aristotle's primary medieval interpreter ibn Sīnā to the Enlightenment philosophers John Locke and Jean-Jacques Rousseau, and on to Charles Peirce (pronounced "purse"), a founder of the Pragmatist school of philosophy to which Dewey would subscribe and which he would extend. Some would say that he favored Plato more than Aristotle, and this may be so, but I list Aristotle for two very good reasons. His philosophy relies on both Aristotle's passé conception of the infant mind as a blank slate, or tabula rasa, and on a "scientific" worldview that was inadequately inductive even for his time. These failings conspire to make his conclusions questionable.
It would be difficult to overestimate the importance of Locke in Dewey's thinking. Dewey made little more progress than Locke in figuring out why some people are educable and others less so; he merely categorized them following Locke's approach. These days we might suggest that people who cannot be educated have a natural variation in neurotransmitters that suppress neural plasticity, much as some people make good guitar players or athletes partially due to a superior ability to learn "muscle memory".
John Locke was the philosopher behind the modern conception of the self. Locke believed in the human mind as a blank slate at birth, a tabula rasa, an idea which to my mind has been thoroughly debunked, even though you will find some adherents still desperately clinging to it as they are tossed on the high seas of our age. Locke was clearly an Aristotelian in that he picked up the tabula rasa from ibn Sīnā and the cosmological argument directly from Aristotle. He was thus a man of his time. However, his conception of the self appears to have been new, and his "Associationism", the idea that the associations formed in the mind of youth are critical for the formation of all later thought, has become the foundation of later attempts at educational reform, including Dewey's.
One area of philosophy modernized by Dewey's time was the recognition that the mind exists in the brain and the brain is inarguably part of the body. Those who believed that the mind was of different stuff than the body ("dualists") lost ground to modern science. Those who recognized the common elements of brain and body, and the mind as an emergent property of the brain, are known as monists. Locke, notionally a dualist, seems to have acknowledged a hint of monism in his An Essay Concerning Human Understanding when he said that "the body too goes to the making the man." Maybe Dewey, apparently a monist, picked it up from him.
Dewey begins his book by thinking about thinking. He rightly discusses the four major approaches to his subject. His language makes it clear where he stands.
In the first place thought is used broadly, not to say loosely. Everything that comes to mind, that "goes through our heads," is called a thought. To think of a thing is just to be conscious of it in any way whatsoever. Second, the term is restricted by excluding whatever is directly presented; we think (or think of) only such things as we do not directly see, hear, smell, or taste.
One might reasonably suppose that he would also exclude touch, if asked.
Then, third, the meaning is further limited to beliefs that rest upon some kind of evidence or testimony. Of this third type, two kinds - or, rather, two degrees - must be discriminated. In some cases, a belief is accepted with slight or almost no attempt to state the grounds that support it. In other cases, the ground or basis for a belief is deliberately sought and its adequacy to support the belief examined. This process is called reflective thought; it alone is truly educative in value, and it forms, accordingly, the principal subject of this volume.
These definitions certainly sound reasonable, even learned, but they are the very source of my objection to Dewey's philosophy and everything that springs from it. They are anti-science in their way. They were based on no observation, no inductive analysis of many facts in which generalizations are drawn only when patterns naturally emerge. They are as scientific as Aristotle's suppositions that thinking occurred around the heart, that the brain was a mere "cooling organ" as evidenced by the running of classical noses, or his idea that women are cooler than men (they are slightly warmer). As Hawkins noted in On Intelligence [Amazon, Goodreads] (p. 32), "looking across the history of science, we see our intuition is often the biggest obstacle to discovering the truth." Dewey's axioms of thinking are simply philosophic, in the navel-gazing sense of the term. We can now do better.
The pioneering neuroscientist Vernon Mountcastle noted in 1978 that the human neocortex consists of the same basic structure throughout, from the parts that process vision to the parts that process hearing to the parts that recognize a friend to the parts that predict the weather from the color, shape and distribution of the clouds overhead. Jeff Hawkins has called Mountcastle's paper the "Rosetta stone of neuroscience". Everywhere are the same vertical columns, always six identifiable layers deep, all consisting of the same types of cells in the same distribution. Only the pliable, changeable connections between them are different, and those connections can change with experience and with injury. Indeed, a person blinded does not lose the use of that portion of the cortex that used to process vision. Instead, those areas are co-opted to process other sensory inputs, and they often do so as completely as if they were originally part of that structure that we have come to call the somatosensory cortex. Even the horrific injury sustained by Phineas Gage healed sufficiently to leave him functional after some time, although we now recognize how fortunate he was not to have damaged other specific areas of his brain. This analysis leaves little room for Dewey's assertion that we may somehow separate "mere" consciousness from those cognitive activities based on processing sensory input. The cortex processes both in areas that are seemingly interchangeable.
Dewey's two-part third category of thought is in no better shape. On what basis might we suppose that fantasy or fictional thought may be separated from some form of rational thought that demands evidence? "Reflection," says Dewey, "involves not simply a sequence of ideas, but a consequence - a consecutive ordering in such a way that each determines the next as its proper outcome, while each in turn leans back on its predecessors." There is little doubt here that Dewey relates reflective thought to deductive reasoning, as in a mathematical proof. This would certainly fit with his stated goal to make scientific the process of education. There are at least two problems with this line of thinking.
Firstly, fictional and fact-oriented thought are processed by the same part of the brain, in the same way. Associations between concepts seem to be the cortex's currency, not some factual grounding that is, to the brain, arbitrary. What about novels, plays or poems that reflect life so sublimely that they inspire or teach us something about ourselves? The playwright Ayad Akhtar has said, "An artist's job is to tease and poke and question the larger racial, ethnic, religious and social conscience and in the process to provoke questions that lead to new practices and new way of seeing." Akhtar neatly shows the fallacy of Dewey's argument for a separation of thought based on some form of intent. This was well known to the ancient Greeks, so I'm not sure what led Dewey astray. It was, however, philosophic reductionism run amok.
There are also solid evolutionary reasons for the human brain not to separate fact from fiction. We rely on our ability to consider various scenarios before we act. Those scenarios are fictional, surely, and yet by playing them through in our heads we can decide which are most likely to lead to outcomes we prefer. "Imagine that the only way you could think about what might be going on in another person's mind was for that person to be sitting in front of you," suggested Andy Thomson in his impressive little book Why We Believe in God(s): A Concise Guide to the Science of Faith [Amazon, Goodreads]. "Human relationships as we know them would be impossible." Indeed. "We need to evaluate the likely thoughts and feelings of others, even when those others are nowhere to be seen." Are these evaluations not fictional? We might ask whether a boss will think badly of us if we are late one more time, or whether a parent will punish us if we don't clean our room, or what it will take for that cute girl to agree to a date. These fictions are the very key to navigating our real-world life. They allow us to determine our eventual behavior. Without our fictional thoughts we would all be sociopaths.
Secondly, Dewey's mathematical analogy doesn't hold up. Mathematical deduction itself proceeds from some axioms that are presumed to be self-evident. The classic axioms of Euclid's geometry, for example, are these (translated by Thomas Heath [Amazon, Goodreads] with contemporary comments by me in parentheses):
- "To draw a straight line from any point to any point." (A straight line is drawn between any two points)
- "To produce [extend] a finite straight line continuously in a straight line." (A straight line extends forever in both direction)
- "To describe a circle with any centre and distance [radius]." (A circle can be drawn from a center point and any fixed radius)
- "That all right angles are equal to one another."
- "That, if a straight line falling on two straight lines make the interior angles on the same side less than two right angles, the two straight lines, if produced indefinitely, meet on that side on which are the angles less than the two right angles."
Most schoolchildren are introduced to Euclid's axioms, although they might be given in more modern language. Geometry on a flat surface ("plane" geometry) as laid down by Euclid stood unchallenged for more than two millennia. There was only one problem: The fifth axiom (the so-called parallel postulate) is both unprovable from the other four and unneeded. It got in the way of Einstein's Theory of Relativity and he needed to jettison it in order to make progress. Jason Socrates Bardi did a fine job tracing the painful history of the thing in his book The Fifth Postulate: How Unraveling A Two Thousand Year Old Mystery Unraveled the Universe [Amazon, Goodreads].
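For a quick sense of why the postulate is not a necessary truth of all geometry (an aside of mine, not an argument from Bardi's book): on the surface of a sphere, where the "straight lines" are great circles, triangles simply refuse to obey it, and their angles add up to more than two right angles. Girard's theorem makes the excess precise:

```latex
% Girard's theorem: for a triangle drawn on a sphere of radius R,
% with interior angles \alpha, \beta, \gamma and area A, the angle
% sum exceeds two right angles (\pi) by an amount proportional to
% the area. As R grows without bound the surface flattens, the
% excess vanishes, and Euclid's familiar angle sum returns.
\alpha + \beta + \gamma \;=\; \pi + \frac{A}{R^{2}}
```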
Euclid's failure, only recognized in modern times, well illustrates the dangers of proceeding in a "scientific" manner from questionable presumptions. The resulting edifice might be useful in a given context but taking it as written can and probably will slow future progress. Dewey's theory was only right, or useful, in the cultural context of the West.
Dewey proves that he is operating within the strict cultural context of the Western world when he states that the thesis of his book is to teach children to "convey knowledge and assist thought." And later, "to direct pupils' oral and written speech, used primarily for practical and social ends, so that gradually it shall become a conscious tool of conveying knowledge and assisting thought" (p. 179). Those goals, however laudable they may be in our current society, are so very different from the immediate survival skills necessary in a hunter-gatherer society, a herding culture or a civilization mired in warfare. They are the goals of a well-fed and generally secure post-agricultural civilization with a lot of time on its hands.
Dewey is hardly the only philosopher in history to overgeneralize in a cultural context. Aristotle's most famous errors, from his diminution of the brain to his many fallacious statements about women, proceeded from presumptions that were founded upon cultural preconceptions, not truth. This way lies the definition of paradigms that must be, painfully, overthrown in time as Thomas Kuhn famously described in The Structure of Scientific Revolutions [Amazon, Goodreads]. Kuhn was on to something in spite of the half-century of criticisms around his central idea.
Dewey next pulls an academic sleight-of-hand to justify his "scientific" approach: He redefines the word "logic" to suit his thesis. Redefinition, or at least careful definition, of terms is a sharp tool in the academic toolbox. Often it fosters a more precise understanding when the primary mode of communication, language, is inherently too imprecise for the purpose at hand. Judge, though, Dewey's usage:
In its broadest sense, any thinking that ends in a conclusion is logical - whether the conclusion reached be justified or fallacious; that is, the term logical covers both the logically good and the illogical or the logically bad. In its narrowest sense, the term logical refers only to what is demonstrated to follow necessarily from premises that are definite in meaning and that are either self-evidently true, or that have been previously proved to be true. Stringency of proof is here the equivalent of the logical. In this sense mathematics and formal logic (perhaps as a branch of mathematics) alone are strictly logical. Logical, however, is used in a third sense, which is at once more vital and more practical; to denote, namely, the systematic care, negative and positive, taken to safeguard reflection so that it may yield the best results under the given conditions. [...] In this sense, the word logical is synonymous with wide-awake, thorough, and careful reflection - thought in its best sense. (p. 56, my emphasis of the last sentence)
Dewey's idea, clearly expressed here, is that only when we are wide awake and spend time carefully reflecting (that is, working through the logical conclusions of our ideas, structuring them and relating them to proven facts) can we be considered to be thinking "in its best sense". One wonders at the temerity of that statement. There is no doubt that such thinking has led our current Western civilization to construct complex and occasionally complete theories of the natural world, invent new materials, machinery and social structures, and to provide unprecedented billions with more food and physical safety than in all previous generations combined. And, as Carl Sagan correctly said, "Skeptical scrutiny is the means, in both science and religion, by which deep thoughts can be winnowed from deep nonsense." But best? How then to account for the human animal's ability to have successfully spread to every continent save Antarctica before the agricultural revolution, much less the industrial one? Our hunter-gatherer ancestors certainly did not think in Dewey's best sense. Yet they survived rather well, and they did not create nuclear weapons, biochemical warfare or land mines, nor did they burn or spill oil on industrial scales.
I do not yearn for some idealized past full of noble savages, nor do I believe that humans have fundamentally changed much in the few short millennia since cultural advances created civilization. But I do believe that "best" is a relative term which one should certainly apply with care. To apply it to one's own place and time smacks of, at best, parochialism and, at worst, nationalism or racism.
Taken all together, the book would better have been titled How We Should Think or How I Think We Should Think. The way that Dewey postulates thought is more about how we learn, and even then it does not match our best current understanding.
John Dewey, like all the members of his intellectual lineage, was brilliantly wrong. His well-thought-out and finely honed educational theories rest upon a foundation of sand. It is high time that his theories were reviewed and revised for a more monist age. The scientific community now generally accepts the non-theist (if not always atheist) proposition that the mind is housed in the brain and that the brain is made of the same materials, by the same processes, as the rest of the human body. A mind is as much a product of evolution as a heart or toes. Few people could say with a straight face that the scientific consensus is that the mind is a separate creation. Of course, that is not so for the general population.
Not that Dewey was a theist. He was a brilliant iterator of the Pragmatist school of philosophy and apparently an atheist. His conception of education was to maximize the capabilities of the individual in his industrial age. "If these pages assist any to appreciate this kinship" between childish thought and scientific thought "and to consider seriously how its recognition in educational practice would make for individual happiness and the reduction of social waste, the book will amply have served its purpose," said Dewey in his preface (p. vii, my emphasis). His Pragmatic philosophy seems to have been an American iteration of the British Utilitarian school started the century before by Jeremy Bentham.
The problem with Dewey's poor foundation for reflective thought is how influential he has been on the American educational system. Perhaps this is one reason for America's poor standing in education as judged on a worldwide basis. It is high time that Dewey's contribution to the guiding principles of American education were reexamined. His late contemporary Maria Montessori disagreed fundamentally with him in that she did not think that a teacher must feel their tasks "made heavier in that they have come to deal with pupils individually and not merely in mass." (p. vii) By contrast, Montessori argued that "Free choice is one of the highest of all the mental processes" and that guiding a child to make good choices is a necessary part of education. "To let the child do as he likes when he has not yet developed any powers of control", Montessori said, "is to betray the idea of freedom." Alas, Dr. Montessori's ideas have never found full expression in the United States in spite of her superior research. This has been due in equal measure to the radical nature of her changes, the necessity for significant education of teachers and her unfortunate citizenship in the defeated Italian fascist state under Benito Mussolini.
Dewey's understanding of human consciousness has also not stood the test of time. We have come a long, long way toward an understanding of consciousness in spite of loudly repeated claims to the contrary. The common refrain of linguists, philosophers and many psychologists, that the key tool of consciousness is the creation, storage and manipulation of symbols, is itself emblematic of their denial of that very progress. Take, for example, these statements by Dewey in 1910 (pp. 170-171):
Three typical views have been maintained regarding the relation of thought and language: first, that they are identical; second, that words are the garb or clothing of thought, necessary not for thought but only for conveying it; and third (the view that we shall here maintain) that while language is not thought it is necessary for thinking as well as for its communication. When it is said, however, that thinking is impossible without language, we must recall that language includes much more than oral and written speech. Gestures, pictures, monuments, visual images, finger movements - anything consciously employed as a sign is, logically, language. To say that language is necessary for thinking is to say that signs are necessary. Thought deals not with bare things, but with their meanings, their suggestions; and meanings, in order to be apprehended, must be embodied in sensible and particular existences. Without meaning, things are nothing but blind stimuli or chance sources of pleasure and pain; and since meanings are not themselves tangible things, they must be anchored by attachment to some physical existence.
Dewey's expansive definition of language allows for Helen Keller's intelligence, which would otherwise have been excluded. Such an exclusion would have rendered his neat little theory meaningless in the face of evidence to the contrary.
It is worth comparing Dewey's thesis with Hawkins's. Dewey takes quite a behaviorist approach: behavior, in this case language, equals intelligence. Hawkins disagrees:
But intelligence is not just a matter of acting or behaving intelligently. Behavior is a manifestation of intelligence, but not the central characteristic or primary definition of being intelligent. A moment's reflection proves this: You can be intelligent just lying in the dark, thinking and understanding. Ignoring what goes on in your head and focusing instead on behavior has been a large impediment to understanding intelligence and building intelligent machines. (p. 29)
It is interesting to note that Hawkins so completely refutes Dewey's understanding of the importance of behavior while using the term "reflection" to do it.
The idea of the mental "sign" is finally being overcome. Mountcastle, Hawkins and those neuroscientists who have uncovered the way ideas both concrete and abstract are laid down in the tangled interconnections of the cortex have shown us that the "sign" is merely a collection of activation patterns of neuronal columns that itself adapts to new input. We are close to understanding the algorithm of thought. Indeed, the implementation of Hawkins' admittedly experimental and approximate "cortical learning algorithm" does in fact make solid predictions from streams of input data in a way that is both reminiscent of the way humans predict and different from all other existing forms of artificial intelligence.
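To give a concrete flavor of what predicting from streams of input data means, here is a minimal sketch in Python. It is emphatically not Hawkins' cortical learning algorithm, which operates on sparse distributed representations across thousands of columns; it is a toy transition counter of my own, but it shows the basic loop of observing a stream, updating memory, and predicting the next input.

```python
from collections import defaultdict

class StreamPredictor:
    """Toy stream predictor: NOT the cortical learning algorithm.

    A simple first-order transition counter, used only to illustrate the
    idea of continuously learning from a stream of inputs and predicting
    what should come next.
    """

    def __init__(self):
        # counts[a][b] = how many times symbol b has followed symbol a
        self.counts = defaultdict(lambda: defaultdict(int))
        self.previous = None

    def observe(self, symbol):
        """Update the model with the next symbol from the stream."""
        if self.previous is not None:
            self.counts[self.previous][symbol] += 1
        self.previous = symbol

    def predict(self):
        """Return the most likely next symbol given the last one seen."""
        followers = self.counts.get(self.previous)
        if not followers:
            return None  # nothing learned yet for this context
        return max(followers, key=followers.get)

# Feed a repeating pattern and ask what the model expects next.
predictor = StreamPredictor()
for symbol in "abcabcabcab":
    predictor.observe(symbol)
print(predictor.predict())  # prints 'c': after 'b', 'c' has always followed
```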
And yet, persistent ideas die slowly. Philosophers have argued for many years whether the mind is of a different nature from the brain, and some still do, although their voices are less loud. Similarly, philosophers have argued (this is what they do) whether some human thought processes are innate, like instincts, or whether we learn absolutely everything. The view that we do have some innate thought patterns, such as facial recognition or the strong tendency to assign an intelligent purpose to rapid movement in our peripheral vision, has to my mind been well proven. These instincts help to keep us alive, safe and social.
The understanding that we have some innate thoughts is known as nativism and comes in two important kinds. The linguist Noam Chomsky has championed the idea of a "Universal Grammar", structures or modules in the brain that are responsible for a child's ability to rapidly learn languages. Others have suggested that the brain is a form of computer, not in its architecture but in its ability to compute. That is, the brain can and does perform computations and it is made of modules that provide computational functions.
Very similar views to Dewey's regarding "signs" in the mind have been expressed in recent times by Chomsky and philosopher Jerry Fodor. Fodor's book The Mind Doesn't Work That Way [Amazon, Goodreads] was a direct criticism and response to Steven Pinker's book How the Mind Works [Amazon, Goodreads] and although I have criticized Pinker in some regards I am much more in his camp than Fodor's. Fodor is a fun read, though. The man is a hoot. But his insistence on the wobbly cairn of mental signs is both dated and increasingly rather silly in the face of new neuroscience. "I think some version of Chomskian nativism will probably turn out to be true and that the current version of New Synthesis nativism probably won't." says Fodor.
Fodor, an MIT professor, is intellectually honest enough to admit that he might be wrong. He states at the opening to his fourth chapter that if cognition is in fact all or mostly modular, as Pinker, Mountcastle and Hawkins say that it is, then "Pinker and Plotkin are probably right about the prospects for New Synthesis Psychology being very good, and I have been wasting your time. (Mine too, come to think of it.)" Fodor might be that rare man who can adapt to yet another revolution in his old age.
Dewey did not have Mountcastle to tell him that the modular components of the human cortex were practically structurally identical throughout, nor Hawkins to prove that prediction could be implemented by emulating that structure. Without proof for those modules, he fell in line with his lineage and attempted to take it one step further. His reliance on undefined "signs" and "symbols" in the brain to somehow explain thought looks more like mumbo jumbo than science. "Learning," Dewey insists, "in the proper sense, is not learning things, but the meanings of things, and this process involves the use of signs, or language in its generic sense." (p. 176). He goes on to warn that educational reformers who reject such analysis risk "the destruction of the intellectual life", a serious charge if it had any basis. We are now rather sure that the human cortex has no inherent separation between things and their meanings. It is the associations between similar structures that matter, not some hardwired categorization in our mental hardware.
Were we to replace this loose, unempirical mumbo jumbo with the concrete description of the operation of the human neocortex offered by Hawkins, something amazing pops out: strong support for the Computational Theory of Mind. The CTM postulates that the brain is fundamentally an information-processing system, both enabled and constrained by the same mathematical limits as any computation. Importantly, the CTM implies that we can eventually understand the brain's operations. That is not to say in any way that the brain is a "computer" as the term is currently used. The brain certainly does not implement the von Neumann architecture used by artificial computers of our age.
Neuroscientists call high-level concepts in the brain "invariant representations". Invariant representations ensure that an object in motion is still thought of as the same object, even if it changes position, moves in or out of shadow, or can only be partly seen at any one time. Invariant representations are created automatically by the cortex, says Hawkins, using exactly the same cortical structures that are used elsewhere. They are both the result of the senses and of memory, and they are used by both to understand the world around us. It is my understanding that we could justifiably replace the "signs" and "symbols" of linguistic theory with the invariant representations of modern neuroscience and lose exactly nothing in translation. I think that we could begin to have a serious scientific conversation about the biology of language.
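To make "invariant" a little more concrete, here is a deliberately crude sketch (my illustration, not a model of the cortex): a detector whose answer does not change when the feature it has learned shifts position within its input.

```python
# A toy "invariant" detector: it reports the presence of a learned
# feature regardless of where that feature appears in the input.
# This illustrates only the idea of invariance, not how the cortex
# actually builds invariant representations.

def detects_pattern(signal, pattern):
    """Return True if `pattern` occurs anywhere in `signal`.

    The answer is invariant to position: shifting the pattern left or
    right within the signal does not change it, just as a familiar face
    is recognized whether it appears left or right of center in view.
    """
    n = len(pattern)
    return any(signal[i:i + n] == pattern for i in range(len(signal) - n + 1))

feature = [1, 0, 1]  # a stand-in for some learned feature
print(detects_pattern([0, 0, 1, 0, 1, 0], feature))  # True: feature present, shifted
print(detects_pattern([0, 1, 1, 0, 0, 0], feature))  # False: feature absent
```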
Of course, Dewey was a certifiable genius and he was not all wrong. Far from it. There is much in his work left to admire if one takes into account the bits that have been surpassed by further research.
To his credit, Dewey seems to be an early adopter of the embodied cognition school, which embraced the idea that the mind is of the body. It is unclear whether he was influenced primarily by a close reading of Locke, as I suggested earlier, or whether he was motivated by his generation's raucous wrestling with Darwin's theory of evolution. Given Dewey's progressive stance on religion (he didn't seem to have one), it is not unreasonable to think that both were at work.
Dewey suggested that "enlargement of vocabulary" was the key to creating an educated person. He was undoubtedly right about that, because associating many words results in more and stronger neural connections between cortical regions. There are other techniques that have a similar result, and they should be given equal air time in any theory of education. Learning to play the piano is a commonly used technique. Performing research using the scientific method is another. Dewey was barking up a legitimate tree in a forest full of the same species.
Dewey's statement that "Looseness of thinking accompanies a limited vocabulary" (p. 181) strikes me as an anticipation of the Sapir-Whorf Hypothesis. For the last twenty years or so, most scholars studying how language intersects with thought have accepted that language does have some effect on thought. Current research focuses on how and how much.
Dewey also recognized, possibly through his own experience, the tendency of students to prefer material that is novel to them over a focus on details of the familiar (pp. 221-222). Pat Conroy agreed in his fascinating memoir of teaching poor children on a South Carolina island, The Water is Wide [Amazon, Goodreads]. Conroy introduced classical music to children who had little interest in classical education and used the stories behind the songs to introduce geography, history and biography. It was a neat trick and an excellent application of Dewey's approach. Children and adults become rapt by the unexpected or unfamiliar because our evolution ensures that we pay attention to changing circumstances. Neuroscience agrees. Unexpected experiences cause a cascade of neuronal activity throughout the cortex which updates old conceptions of normality.
Thomson has illustrated that we as a species can often be tricked into believing something factually nonsensical. One of the ways that religion becomes embedded in our consciousness is through the introduction of so-called "minimally counterintuitive worlds". In line with Dewey and Conroy, he notes that such stories are "an optimal compromise between the interesting and the expected". They allow us to accept new ideas from others readily, precisely because they mix what we already know and can verify with something just a bit novel and therefore interesting.
Given this rare alignment between Dewey's theories, teacher experience and neuroscience, it might reasonably seem amazing that the current US trend is to teach students material in relation to their home town. We have every reason to believe that this works against the process of education. Dewey's advice, gleaned from the Associationist concept he picked up from Locke, was to specifically avoid experiences that would form negative associations. Fostering a love of learning is much more important than forcing the bulk memorization of facts.
Dewey railed against the 'Fallacy of making "facts" an end in themselves' (p. 188). One can almost hear the frustration of students exposed to the "ten thousand facts" approach to learning, then and now. Our most recent foray into educational reform, the so-called No Child Left Behind Act of 2001, ensured that schools would teach to the many standardized tests and thus run, not walk, away from Dewey's ideas. It is little wonder that teachers trained in Dewey object.
The guise of education that teaches only facts relies on each individual having their own insightful moment that connects the facts together into some consistent whole. Such teachers create cognitive dissonance and hope that students will resolve it on their own. Naturally, many students will not intuit useful connections themselves.
Toward the end of his long career, Dewey did accept significant criticism of his theories and refined them accordingly. However, he never gave up on reflective thought, probably because few dared challenge him on that topic prior to the last decade's worth of neuroscience research. His 1938 book Experience and Education [Amazon, Goodreads], written near the end of his working life, redefines his ideas.
In that volume, the older Dewey echoes Michel de Montaigne in his concern for the ultimate result of education:
What avail is it to win prescribed amounts of information about geography and history, to win ability to read and write, if in the process the individual loses his own soul: loses his appreciation of things worth while, of the values to which these things are relative; if he loses desire to apply what he has learned and, above all, loses the ability to extract meaning from his future experiences as they occur?
Montaigne said something very similar around 1580 in his Essays when he asked, "What good can we suppose it did Varro and Aristotle to know so many things? Did it exempt them from human discomforts? Were they freed from the accidents that oppress a porter? Did they derive from logic some consolation for the gout?" It is a question that all philosophers must ask as they develop their art. We are all mortal, and there will come a time when the carefully constructed neuronal connections in our heads become, quite literally, fertilizer for new generations. We can but take solace in the fact that those ideas that are written down, passed down, discussed, and argued over not only define our culture but have a chance to influence the culture of our descendants. We are still reading Plato, Aristotle, Montaigne and, yes, Dewey. "We are all worms," noted Winston Churchill, "But I believe I am a glow-worm." So was Dewey.
There is of course another way to answer Montaigne's question. Montaigne and Dewey both look at the value of education from the perspective of the individual. But humans are a social species. Society has certainly benefited from the studies of those many unsung heroes through the ages. Many of us currently do escape the pain of illnesses, recover from accidents and even from some of the ravages of age. Our lives are both longer and more pleasant than those of the people who came before us. We owe a mountain of debt to all the people who struggle to understand and pass their lessons down. They are the giants on whose shoulders we wobble, squat and sometimes stand.