Friday, October 24, 2014

Afternoon with a Rabbi

My friend Marsha and I were discussing anti-Semitism. Well, I was discussing anti-Semitism and pulled Marsha into it because she is both Jewish and unlikely to be offended when I start asking questions. I asked Marsha to introduce me to a local rabbi and she suggested that I speak with Neumiro daSilva. We met in a Panera as he ate lunch.
Neumiro is both a rabbi and a Class A general contractor. His rabbinical duties include officiating at weddings and funerals, Bar & Bat Mitzvah tutoring, Shabbat & holiday service music (he plays guitar), teaching the Torah, consoling the sick, the dying, and the bereaved, and general counseling. As a builder, he focuses on home improvements such as custom decks, porches, fences, kitchens, bathrooms and such. I suppose he is building something either way.
Neumiro opens strongly, jumping into theology immediately after looking clearly and openly into my eyes, studying me. He throws out the word spirituality and then notes that it contains the word ritual: "Judaism is ritual". That is a good mnemonic, but not linguistically accurate. The English word comes from the Latin spiritualis, from spiritus "of breathing". Ritual, on the other hand, shares the same root as the English word rite. Both stem from the Latin ritualis from ritus, suggesting a usage of something, particularly in a religious context. The linguistic sleight of hand makes me wary. Was it intentional, meant to be just a useful way to think about Judaism or was he repeating something that he had heard without checking his sources? I may never know. As Eric Hoffer, the longshoreman and author, so keenly observed, "Nothing so baffles the scientific approach to human nature as the vital role words play in human affairs."
Neumiro's point is that one may be Jewish without believing in God. Ritual is the thing. He does believe in God. It doesn't bother him if others don't. "God believes in you", he says. I have heard that before and am not offended.
Neumiro was raised Catholic in Brazil. His mother converted to Seventh Day Adventism when he was a child. She simply came home one day and announced that the family was now all Seventh Day. Neumiro was confused by that. At 25 years old, Neumiro met a young woman who claimed to be of the Seventh Day faith. She was 20. He brought her home. "Hey, Mom! I met this wonderful girl and she's Seventh Day!" Only later did he discover that the young woman, Fern, was from a Jewish family. She was also American, and the happily married young couple moved to the United States. He converted to Judaism eleven years later and eventually became a rabbi.
Neumiro's English is heavily accented in the way that an Iberian language filtered through dense jungle can be. He says the word "Judaism" almost continuously but pronounces it "Judaísmo" as in his native language.
His rabbinical training was taken in New York but required him to travel to Israel to walk the land of the Torah. He was fascinated to discover a "garden of death" perched against the city walls of Jerusalem. The garden was dedicated to a God that occasionally asked for the sacrifice of children. That got him thinking about the story of Abraham's intended sacrifice of his son, Isaac. "Was God testing Abraham or was Abraham testing God?", he asked. I said that I am very comfortable with the idea that Abraham was testing God. Many people, including atheists, test God. We watch people every day ask, "Is God there?" This fits in easily with an atheist perspective. I have never experienced God testing anyone. Neumiro also thinks that Abraham was testing God, which is why he brought it up.
On evolution: Most Jews accept evolution. Neumiro notes that the Catholic Church accepts evolution, but most Catholics and Protestants he knows (and that I know) do not. This is especially true in the United States, but also in South America.
On Genesis: Neumiro notes that woman was God's second act of creation in relation to humans. His point was lost in translation. I sincerely hope that he doesn't infer something along the lines of the New Testament's 1 Timothy 2:12, where Paul puts women in their supposed place, but I'm not sure.
The God of Judaísmo, says Neumiro, is not a loving God. "He is in a bad mood all the time. I like that." If the God of the Torah wants people to move to a land, or an enemy to be destroyed, the people have to do it for Him. "God asks us to do his dirty work." Hah! That's a nice metaphor. Nobody is going to do anything for you. You will need to get out and do it for yourself. It reminds me of Mahatma Gandhi's famous quote, "Be the change you want to see in the world." I tell this to Neumiro and he responds, "Martin Luther King, too. 'I have a dream' and all that. You must do it."
"Elohim is God", Neumiro says into the natural pause, "which in Hebrew starts with an aleph. The aleph in Hebrew is sometimes silent when a vowel is not present. You must seek. That is, you must add a vowel yourself to make a sound in order for God to speak to you." That is a beautiful metaphor but my atheist mind wants to say something along of lines of, "If you need to do all the work yourself and make the sounds yourself for God to speak, what purpose exactly does God serve?". But I don't. To do that would be to fall into the same trap of literalism that I rail against. Be happy with the beautiful metaphor, I think.
How can God be everywhere? Neumiro answers his own question with an analogy. No matter where you stand on Earth, he says, you look up and you can see the sun. I don't mention nighttime. That would be rude and really quite unproductive. If you are somewhere else, anywhere else, in the world, you can look up and see the same sun. That's how God can be everywhere. I'm not sure how to take this one. Were I only a few stars away, or even between here and, say, our nearest neighboring star, our sun wouldn't look like anything special. From the Andromeda galaxy, it would effectively have ceased to exist. And Andromeda is close, astronomically speaking. This story reminds me of how effective analogy can be when restricted to a purely human scale. It would have worked wonders during the Bronze Age, I see that. My modern, educated mind jumps to poke holes in it in terms of scale, and in terms of movement. I know that I am not stationary. The sun is big only from my puny perspective.
Neumiro moves on to the story of Babel. It doesn't matter how it happened, he thinks. Something happened. The story recorded what more than how. It is a perspective that I appreciate. He is trying to find middle ground with me, and he succeeds. He probably thought he lost me with God being everywhere.
The story of Cain and his younger brother Abel comes up next. How could it not? We are on a stroll through the twisted paths of the Torah, as if the pages had been flung far into the air and we were coming upon one random one and then another. Judaísmo requires that a witness be produced or a crime cannot be punished. There was no witness to Cain's murdering of Abel, says Neumiro, and so Cain could not be punished. There is the lesson - it is a recording of an aspect of Jewish law.
Was Cain not punished? The Christians and Muslims seem to think so. My Revised Standard Version of the Bible, hardly a canonical reference, states that Cain was "cursed from the ground" and he "shall be a fugitive and a wanderer on the earth." Yet it also says that God marked him to protect him from harm by others (What others? Who are these others that are not Adam and Eve's children - we are not told - another indication that the earliest biblical stories are the folk tales of a tribe), and that he goes to the land of Nod and builds a city. The Islamic version seems on the surface to agree that Cain was a murderer and punished, but its location (in Book 5, section 5, entitled "Cain and Abel - murderous plots against the Prophet") has led some scholars to suggest that the reference is more to the otherwise well-known story of Cain and Abel and not a retelling. A contemporary allegory may have been the thing.
The mythologist Joseph Campbell hinges his reading on the fact that Cain was a farmer and Abel a herder. God liked the offering of Abel better, which caused the fight in the first place. Campbell dissects the myth in relation to the murder motifs of many early planting myths of tropical societies and comes to the conclusion that the myth is in many ways backward from its peers. "Here the murder motif does not precede, but follows, the end of the mythological age, in contrast to the sequence in all the primitive myths. Moreover, it has been transformed to render a duplication of the Fall motif. The ground no longer bears to Cain its strength and he is to wander on the face of the earth - which is, of course, just the opposite result to that which the ritual murder of the agricultural myth produced." (Occidental Mythology, Arkana Penguin, 1964, pp. 105.) I note that Campbell was also referencing a Revised Standard Version of the Christian Bible.
With due respect to the scholarship of others, I wonder, as I so often do, if the point of the story isn't easier than all that. The story of Cain and Abel undoubtedly was recorded by early Hebrews in a desert climate. Campbell himself notes later (pp. 106) the existence of a Sumerian cuneiform text of c. 2050 BCE in which a goddess prefers a farmer over a herder. "One millennium later", says Campbell, "the patriarchal desert nomads arrived, and all judgements were reversed in heaven, as on earth." One immediately notes the similarity to the Chinese concept of actions on earth mirroring actions in heaven. Perhaps climate change, locally caused by over-farming or irrigation as in so much of the ancient Mediterranean, from the silting of the harbor of Athens to the "granary of Rome" that was once Libya, was to blame for the Sumerian withdrawal and the ascendance of the nomadic herders. But times changed again. The ancestral memory of God might have preferred herders by the time of the recording of Genesis, but Cain was the farmer (again) who lived, albeit in a different land "far to the East". Farmers do not eat as well (they are "cursed from the ground"), and are therefore not as healthy as their herding cousins, but their settled ways allow them to feed more children. Farmers, unlike herders, do not traditionally practice infanticide. Populations rise (Cain founded his city) and eventually the unhappy farmer takes over, much to the herding God's chagrin.
In any event, one must be careful when playing a game of telephone over three millennia. I try this on Neumiro. He is unimpressed.
Why do educated people so often fail to follow up on new perspectives or respond to newly available knowledge? One reason often discussed in the media is that new perspectives can threaten outlooks that are based upon so-called revealed knowledge. Another, and possibly better, reason has been known to psychologists since the 1940s: the Einstellung Effect. The Einstellung Effect is a natural consequence of the way your brain functions that literally (yes, literally) blinds you to new experiences and better judgements, descriptions or solutions when the one you have settled on is already "good enough". In other words, if your brain is not forced into cognitive dissonance then you don't feel the need for cognitive closure. You just won't revisit the issue. A recent study reported in Scientific American provides more detail on the causes and mechanisms of the effect.
The Einstellung Effect alone is not sufficient to explain all of cultural history. There are other factors. Another important one may be the cycles in larger society. There would seem to be several of these cycles, of longer and shorter time scales. The fascinating 2009 book Secular Cycles [AmazonGoodreads] by Peter Turchin and Sergey Nefedov describes some of the economic and especially population-driven cycles without particular reference to religion. Alternatively, the anthropologist Marvin Harris, writing in the 1970s, described a longer religious thought cycle that he dubbed the messiah/witch cycle. In his fun little book Cows, Pigs, Wars and Witches [AmazonGoodreads] Harris traces the rise of "messiah" cultures, where the idea of "a final and decisive struggle" is promised to "achieve redemption and salvation on a cosmic scale", and "witch" cultures, where (I dare not say "in which") the dominant idea is "a lifestyle dreamwork whose social function is to dissolve and fragment the energies of dissent." Put another way, the messiah/witch cycle takes us from everyone joining to everyone doing their own thing and back again. Monotheistic Judaism and its more literalist progeny, Christianity and Islam, are stalwarts of the messiah archetype. They fight against the witch wherever it arises, be that in post-medieval Europe or in the mid-twentieth century "counterculture".
Do we have any hope of escaping such cycles? Nobody is certain how our unprecedented scientific explorations and creation of technology will impact them. Harris suggested:
I make no claim for the millenarian splendors that will come from a better understanding of the causes of lifestyle phenomena. Yet there is a sound basis for assuming that by struggling to demystify our ordinary consciousness we shall improve the prospects for peace and economic and political justice. If this potential change of odds in our favor be ever so slight, I think, we must regard the expansion of scientific objectivity into the domain of lifestyle riddles as a moral imperative. It's the only thing that's never been tried.
All of this flashes through my mind as Neumiro takes me on a tour of Bronze Age thought. It works. Humans, after all, haven't changed much if at all. Our only real change has been cultural. Our culture is currently more amorphous, more under stress, from the process of globalization as we are forced against our will to recognize that other humans with other ideas are as human as we are. It makes people uncomfortable. Those Bronze Age ways of defining an in-group and, unavoidably, the out-group serve to bind people into a covenant of comfort in a changing and dangerous world.
Neumiro returns to the story of Cain and Abel. "God asks for a sacrifice and then changes His mind. That reminds us that we can also change our minds." Does he mean that God changed his mind because he didn't accept Cain's sacrifice? "Yes." That may be so. It is certainly a plausible reading and, like so much of what Neumiro has said, it is a nice metaphor. The entire conversation reminds me why I get along better with Jews than with Christians. I can appreciate the metaphors while still rejecting their god. They don't generally seem to be upset about that. Christians uncomfortably insist on proselytizing. No doubt that is why there are more Christians than Jews. I still find it annoying.
"You don't need to go to the synagogue to pray.", Neumiro continues. "Life is about feelings." The synagogue is a tool to help you reach those feelings that make you feel in touch with the rest of the world. Fair enough.
"Judaism, reform, orthodox, conservative, changes constantly. They do not believe what they did even ten years ago." I wonder about the rise of fundamentalist Islam and Christianity. Evangelical Christians have certainly changed their thinking dramatically in my lifetime, at least in the US. They are, as a group, much more vocal in politics and thus more insistent upon having their views reflected in the laws of the land. I'm not certain about radical Islam. The radical fringes of both religions claim to be returning to the basics of their religion. I know for a fact that they are doing no such thing in Christianity. But Islam? I suspect that Islam is suffering from its incredibly detailed early literature and history. Islam simply got started late and is better documented than either Judaism or Christianity. Perhaps that gets in the way of people changing their thoughts to match new situations. We have never in the history of the world encountered such a radical global change as we have now, especially in the pace of scientific and thus technological advancement. Those unable to close their cognitive dissonance in a flexible way are naturally stuck. It is easier to burn down the house than map its rapidly changing rooms.
This ability to adapt and change seems to be central to the Jewish experience. They are so few, surrounded by so many, and scattered literally over the face of the Earth. I know several Jews who know of no other Jews in their town. Lucky are those who live near a synagogue. The ability to adapt their religion to their immediate needs has no doubt been both a source of strength and a guarantee of a sort that there is a community out there somewhere to which they belong. Christians and Muslims, each strong majorities in many countries, do not face the world's uncertainties with such resolve. The senseless persecution of members of those religions causes large-scale outcry. The Jews are, outside of Israel, left to adapt.
It is always easier to take the side of a minority. I get along with US-based Muslims better than I do with US-based evangelical Christians and with Indian Christians better than Indonesian Muslims. Certainly US Christians were more careful in their public views during their fractious minority in the Enlightenment than they are currently in their majority. I commented upon this in more depth in my review of Susan Jacoby's Freethinkers: A History of American Secularism [AmazonGoodreads].
It may in fact be time for Islam to revive its "lost tradition of independent thinking", known as ijtihad, as Irshad Manji suggested in The Trouble with Islam Today [AmazonGoodreads]. That would certainly help moderate Muslims combat the growing fundamentalism in their own communities. Harking back to Islam's intellectual golden age of roughly 900-1200 CE instead of ancient tribal rivalries also wouldn't hurt. I have no idea what the Christians could do. Protestant Christianity shows no sign of returning to the authoritarian fold of Catholicism and in its insistence that each individual can reach God directly promotes a sort of democratic acceptance of ignorance that sits poorly with the new science. Protestantism is in fact a form of Harris' witch culture in spite of its messianic promises.
"That's why I don't like the Talmud!", declares Neumiro. I think back. The Talmud contains a record of generations of rabbinical thinking. It records interpretations of the Torah and how challenges to Jewish life have been met for generations. It is roughly equivalent to Islam's Hadith. I do not know of a similar concept in Christianity. "The rules are so strict! Think about a traffic light. It can be red, yellow or green. Do you speed through a red or yellow light if you have someone behind you who will rear-end you if you stop? The Orthodox say no! You will break a commandment!" He is excited at this point. "The Talmud says 'no'. I say 'of course!'"
I am reminded of my wife's thought that you can make heaven or hell for yourself right here, right now. By stopping at the light, you will make hell not only for yourself, but for the people behind you. You will cause them to repair their car at a minimum. Someone might even be hurt or killed. My atheistic, at least partially Utilitarian, philosophy appreciates Neumiro's position.
The mention of my wife brings Neumiro to his. He is honest about both the joys and frustrations of marriage. He congratulates me with a hearty "Mazel tov!" when I mention that we just celebrated our twentieth anniversary. "We are at thirty", he says. "Do you know that some Jews like to break the glass at weddings? It is not about throwing something away. I tell people that they need to pick up the pieces every day, try to put it back together again." It is yet another living metaphor. This is a fun conversation.
"What does it mean to you that we are created in God's image?", I ask. Neumiro thinks for a moment. "There is a kid's song." He hums for a few seconds. "I don't know how to sing it in English. It says, 'my spirit and my soul'. What is the difference between a spirit and a soul? When you sleep, your soul wanders." Suddenly, we are back to the Bronze Age. Does he mean that it literally wanders or is this another metaphor for dreaming and free association during memory formation? It is hard to tell. I recall a conversation with a Methodist minister in Northern Virginia where I tried to pin him down on Biblical literalism. He squirmed out of it every time like some kind of ecclesiastical snake, not of course in a Garden of Eden way. His genius is to avoid traps from people like me. He let his partitioners believe in a literal Jesus if they wanted to. He challenged no one.
"The spirit is your breathing. God gave Adam life by breathing it into him." I am reminded of the Sanskrit word prana, which means both "life force" and "breath". The concepts are conflated in yoga practice, in many martial arts and apparently in Judaism.
Neumiro is uncomfortable being Jewish in the US. "Christians say that we are going to hell. Jews don't believe in hell." He mentions the 2012 bombings of synagogues in Newark, New Jersey. He says that Jews regularly receive notes from the FBI telling them to "be alert". One gets the idea that Neumiro is alert. "It is more dangerous in Brazil, though, or Europe." He is right. The rise of anti-Semitism in Europe, European Jewish Congress President Moshe Kantor stated in April 2014, is such that "Normative Jewish life in Europe is unsustainable." A late 2013 survey found that almost a third of Europe's Jews were considering emigration as a response to rising anti-Semitism.
The amazingly erudite Christopher Hitchens wrote in his introduction to the 2007 Penguin edition of Rebecca West’s classic but controversial pre-WWII book Black Lamb and Grey Falcon [AmazonGoodreads]:
West reflects on the virus of anti-Semitism, shrewdly locating one of its causes in the fact that “many primitive peoples must receive their first intimation of the toxic quality of thought from Jews. They know only the fortifying idea of religion; they see in Jews the effect of the tormenting and disintegrating ideas of skepticism.”
I found that particularly interesting when laid alongside the recent Pew survey How Americans Feel About Religious Groups that demonstrated widespread uneasiness regarding atheists.  Of course, the Pew survey reflects current US values, not historical ones nor those in other countries. It would seem that there is a strong similarity between those who introduce, intentionally or not, doubt into religious convictions. They, we, are naturally vilified.  The underlying evolutionary mechanism is probably human group dynamics, especially the cohesiveness so necessary to the formation of stable tribes.
In spite of some press reports to the contrary, examples of atheist bashing in the US are not uncommon. Research by Margaret Downey (Discrimination Against Atheists: The Facts) shows that atheists are "losing their jobs, facing abusive family situations, being subjected to organized shunning campaigns in their communities, receiving death threats, and the like." This should not be a surprise. Just like domestic violence, rape, and other socially stigmatizing crimes, it is underreported. I suspect the same is true for anti-Semitism.
We were interrupted by a woman sitting at the next table. "I'm sorry to interrupt your conversation", she says, "but I just wanted to say that I just love the Jewish people." She was concerned that the US "stand by Israel". I wondered whether she was going to inform us that Jesus had been a Jew and indeed she did. "Israel's boundaries were established Biblically", she informed us, no doubt parroting her Christian minister. It didn't sound like something she had thought of herself. Her name was Sarah. She was sweet in a rather innocent way. "A few of my friends feel the same way." Neumiro was later to pick up on the word "few". "Not many or most", he said. Neumiro gave Sarah a business card. She thanked him. I hesitated, frankly afraid of being too open with a person I didn't know and didn't particularly trust.
"Thank you for interrupting", I tried. "It is not often that a Christian, a Jew and an atheist can have a peaceful conversation."
She made pleasant sounds but returned immediately to reassure Neumiro that Jews were fine with her. She was not interested in me. I breathed a sigh of relief. They share a god, I thought. At least they have that in common.
I turned back to Neumiro when Sarah left. "I fear the combination of nationalism and fundamental Christianity. It would seem to be an explosive mix. The combination of national socialism and blaming others didn't work out so well. This could turn the same way if we let it."
"God told Abraham to go to a place", Neumiro replied. "He never gave him a destination. Each step Abraham asks where and God says, 'keep going'. The journey is the thing."
We shook hands and parted friends.

Sunday, October 19, 2014

Book Review: On Intelligence by Jeff Hawkins

On Intelligence [AmazonGoodreads] purports to explain human intelligence and point the way to a new approach toward artificial intelligence. It partially succeeds on the former and knocks it out of the park on the latter.
This is the only book that Jeff Hawkins has written. Silicon Valley insiders may remember Hawkins as the creator of the PalmPilot back in the 1990s; when the owners restricted his vision, he left to create Handspring. Both companies made a lot of money, which is all that matters on the Sand Hill Road side of Silicon Valley. The tech side of the Valley cares more about the fact that Hawkins succeeded in the handheld computing market where the legendary Steve Jobs had failed (with the Newton).
Hawkins' journalist co-author Sandra Blakeslee, on the other hand, has an Amazon author page that scrolls and scrolls.  She has co-authored ten books, several of which have related to the mind, consciousness and intelligence.  Her most recent book, Sleights of Mind: What the Neuroscience of Magic Reveals About Our Everyday Deceptions, was published as recently as 2011 with neuroscientists Stephen L. Macknik and Susana Martinez-Conde and was an international best seller. She has seemingly made a career out of helping scientists effectively communicate thought-provoking ideas.
Hawkins focuses all of his attention on uncovering the algorithm implemented by the human neocortex. Where that is impossible due to lack of agreement or basic science, he makes some (hopefully) reasonable assumptions and proceeds without slowing down. That will strike most neuroscientists as inexcusable. It makes perfect sense to an engineer.
Albert Einstein is often credited with saying, "Scientists investigate that which already is; Engineers create that which has never been." Or, to quote myself, scientists look at the world and ask, "How does this work?". Engineers look at the world and say, "This sucks! How can we make it better?" There is a fundamental difference in philosophy required of scientists and engineers. Hawkins proves himself to be an engineer through and through even when he bends over backward attempting to do some science.
There is a particularly useful review on Goodreads that drives a crowbar through the core of the book as if it were the left frontal lobe of Phineas Gage. The reviewer, who goes solely by the name Chrissy, rightly points out Hawkins' overfocus on the neocortex.
It became clear that Hawkins was so fixated on the neocortex that he was willing to push aside contradictory evidence from subcortical structures to make his theory fit. I've seen this before, from neuroscientists who fall in love with a given brain region and begin seeing it as the root of all behaviour, increasingly neglecting the quite patent reality of an immensely distributed system.
Chrissy is correct. Hawkins' work is nevertheless critically important. Although the cortex is without doubt only part of the brain and only part of the "seat" of consciousness, his work to define a working theory of the "cortical learning algorithm" has led directly to a new branch of machine learning. It is one that has borne substantial fruit since the book's 2004 debut.
It shouldn't surprise anyone that Hawkins' reviewers confuse science and engineering. Professionals are often confused about the separation themselves. Any such categorization is arbitrary and people have the flexibility to change their perspective, and thus their intent, on demand. To make matters worse, computer science is neither about computers nor science. It is the Holy Roman Empire of the engineering professions. Computer science involves the creation and implementation of highly and increasingly abstract algorithms to solve highly and increasingly abstract problems of information manipulation. It is certainly different from computer engineering, which actually does involve building computers, and it is also generally different from its subfield software engineering. Of course reporters and even scientists get confused.
Writing On Intelligence has not made Hawkins into a neuroscientist. That does not seem to have been his goal. Hawkins' goal was to build a more intelligent computer program - one that "thinks" more like a human thinks. His explorations of the human brain have had that goal constantly in mind.
Hawkins himself states his goal differently, but I stand by my interpretation. Why? Consider what he says (pp. 90):
What has been lacking is putting these disparate bits and pieces into a coherent theoretical framework. This, I argue, has not been done before, and it is the goal of this book.
That makes him sound like a scientist. But he went on to do exactly what I claim. He described a framework and then implemented it as a computer program. That's engineering.
It seems almost strange that it took fully five years from the book's publication for Hawkins' group at the Redwood Neuroscience Institute (now called the Redwood Center for Theoretical Neuroscience at UC Berkeley) to publish a more technical white paper detailing the so-called cortical learning algorithm (CLA) described in the book. The white paper provides sufficient detail to create a computer program that works the way that Hawkins understands the human neocortex to work. Again surprisingly, another four years passed before an implementation of that algorithm became available for download by anyone interested. The Internet generally works faster than that when a good idea comes along. The only reasonable explanation is that a fairly small team has been working on it.
You can, since early 2013, download an implementation of the CLA yourself and run it on your own computer to solve problems that you give it. Programmers normally love this sort of thing. It is interesting to note that the Google self-driving car uses exactly the traditional artificial intelligence techniques that Hawkins denigrates in his first chapter. Hawkins may have come too late for easy acceptance of his ideas. There are entrenched interests in AI research and Moore's Law ensures that they can still find success with their existing approaches. A specialist might note that the machine learning algorithms in the Google car have stretched traditional neural networking well beyond its initial boundaries and toward many of the aspects described by Hawkins, without ever quite buying into his approach.
The implementation is called the Numenta Platform for Intelligent Computing (NuPIC). It is dual licensed under a commercial license and the GNU GPL v3 Open Source license. That means you can use it for free under the GPL, or pay Numenta for a commercial license and support. You can choose.
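To make that concrete, here is a deliberately tiny sketch of the flavor of online sequence learning that the CLA performs. Let me be clear: this is not NuPIC's API, and it has none of NuPIC's sparse, columnar machinery. It is a hypothetical toy, in Python, that learns transitions in a stream one input at a time and offers a prediction for the next input:

```python
# Hypothetical illustration only -- this is NOT NuPIC's API.
# A toy "memory-prediction" loop: learn transitions in a stream of
# symbols and predict the next symbol from what has been seen so far.
from collections import defaultdict, Counter

class ToySequenceMemory:
    def __init__(self):
        # transition memory: symbol -> counts of the symbols that followed it
        self.followers = defaultdict(Counter)
        self.prev = None

    def step(self, symbol):
        """Feed one input; learn from it and return a prediction for the next."""
        if self.prev is not None:
            self.followers[self.prev][symbol] += 1  # learn online, no training phase
        self.prev = symbol
        if self.followers[symbol]:
            return self.followers[symbol].most_common(1)[0][0]
        return None  # nothing remembered yet, so no prediction

mem = ToySequenceMemory()
for ch in "abcabcabcab":
    mem.step(ch)
print(mem.step("c"))  # prints 'a': the memory has learned that 'a' follows 'c'
```

NuPIC does something far richer with sparse distributed representations and columns, but the loop is the same in spirit: feed one input at a time, predict the next, and learn from the miss.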
Hawkins lists and briefly critiques the major branches of artificial intelligence, specifically expert systems, neural networks, auto-associative memories and Bayesian networks. He is right to criticize all of them for not having looked more carefully at the brain's physical structure before jumping to simple algorithmic approaches. The closest of the lot is perhaps neural networks, which is notionally based on composing collections of software-implemented "neurons". These artificial neurons are rather gross simplifications of biological neurons and the networks, with their three-tier structure, are poor substitutes for the complex relationships known to exist in the brain of even the most primitive animals. Still, the timing of Hawkins' book was unfortunate in that its publication occurred at the beginning of our current golden age of neuroscience. AI is back and AI research is suddenly well funded again. So-called deep learning networks currently contain many more than the three traditional layers, up to eight or even more. IBM has recently moved neural networks to hardware with the announcement of its SyNAPSE chip that "has one million neurons and 256 million synapses" implemented in silicon. All approaches are currently blooming for AI and are being applied to everything from voice and facial recognition to automatically filling spreadsheet cells to autonomous robots. There is currently less reason for the AI community to investigate, or lobby for hardware implementing, a brand new general approach. None of that makes Hawkins wrong. The human brain is still the only conscious system we know of and neuroscience is still doing a bad job of looking at its structures from the top down.
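For contrast, the classic three-tier network that Hawkins criticizes fits in a few lines. Below is a minimal sketch, assuming the textbook XOR exercise; the layer sizes, learning rate and iteration count are arbitrary choices of mine:

```python
# A minimal classic three-tier (input/hidden/output) neural network,
# trained by backpropagation on XOR. Illustrative sketch only.
import numpy as np

rng = np.random.default_rng(0)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

W1, b1 = rng.normal(size=(2, 4)), np.zeros(4)   # input -> hidden weights
W2, b2 = rng.normal(size=(4, 1)), np.zeros(1)   # hidden -> output weights
sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))
lr = 0.5

for _ in range(20000):
    h = sigmoid(X @ W1 + b1)              # hidden activations
    out = sigmoid(h @ W2 + b2)            # output activations
    d_out = (out - y) * out * (1 - out)   # error gradient at the output
    d_h = (d_out @ W2.T) * h * (1 - h)    # backpropagated to the hidden tier
    W2 -= lr * h.T @ d_out; b2 -= lr * d_out.sum(axis=0)
    W1 -= lr * X.T @ d_h;   b1 -= lr * d_h.sum(axis=0)

print(np.round(out, 2))  # approaches [[0], [1], [1], [0]]
```

Note what is missing relative to Hawkins' concerns: no hierarchy, no time, no feedback connections. It is a static mapping from input to output, however well trained.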
The largest single criticism of On Intelligence from me is that the cortex Hawkins describes is a blank slate, also called a tabula rasa. We know that the human brain is not. The idea that a mind is empty until filled solely by experience dates back at least to Aristotle. The Persian philosopher Ibn Sīnā, popularly called Avicenna in Europe - a name still taught in Western universities - coined the term tabula rasa a thousand years ago as he interpreted and translated Aristotle's De Anima. We have known for decades that we are born with a number of innate functions, such as facial perception, so the brain is not a blank slate. Other animals have their own innate behavior, such as the fear that many bird species have for the shape of a hawk. Hawkins does address the changing nature of brain function during life but does not even peripherally describe how innate functions fit into his theory.
Hawkins is often criticized for failing to provide a collated list of his assumptions. They are indeed buried in the prose. He does, however, follow the book's last chapter with an appendix that lists eleven predictions. They are all testable given the right science. Scientists are explicitly asked to validate or repudiate those predictions. A decade later, I am not aware of a comprehensive attempt to do so.
I have attempted to find all of Hawkins' assumptions and have listed them here in the hope that they will help both other reviewers and the neuroscientists who might pick away at them. All page numbers are from the 2004 St. Martin's Griffin paperback edition. All indications of emphasis are in the original text unless otherwise marked. The assumptions generally flow from the highest level of abstraction to the lowest, as Hawkins mostly does.
1. "We can assume that the human neocortex has a similar hierarchy [to a monkey cortex]" pp. 45. This one not only seems reasonable but is an assumption held by many scientists. It is in line with the many independent threads of evidence from evolutionary theory. Hawkins was intentionally careful when he used the word "similar".
2. "We don't even have to assume the cortex knows the difference between sensation and behavior, to the cortex they are both just patterns." pp. 100. This is actually a negative assumption in that he is not making one. This kind of thinking, determining what assumptions are necessary to a system, is in keeping with Hawkins' coding background. It is an engineering necessity.
3. "Prediction is not just one of the things your brain does. It is the primary function of the neocortex, and the foundation of intelligence." pp. 89. This is Hawkins' central idea and the one that informs not only the book and the implementation of NuPIC but the philosophic approach to his understanding of the brain and its functions. Hawkins relates the traditional AI approach of artificial auto-associative memories and declares, "We call this chain of memories thought, and although its path is not deterministic, we are not fully in control of it either." pp. 75. He proposes that "the brain uses circuits similar to an auto-associative memory to [recall memories]" pp. 31.
Here is also where Hawkins is forced to leave the cortex and venture into its relationships with another area of the brain. He notes the large number of connections between the cortex and the thalamus and the delay inherent in passing signals that way. He declares that the cortex-thalamus circuit is "exactly like the delayed feedback that lets auto-associative memory models learn sequences." pp. 146. He is onto something here, but one must question his oversimplification. The thalamus is also known to be involved in the regulation of sleep and thus almost assuredly implements more than just a delayed communication loop with the cortex.
Eventually he is able to bring his prediction model into sharp focus: "If the cortex saw your arm moving without the corresponding motor command, you would be surprised. The simplest way to interpret this would be to assume your brain first moves the arm and then predicts what it will see. I believe this is wrong. Instead I believe the cortex predicts seeing the arm, and this prediction is what causes the motor commands to make the prediction come true. You think first, which causes you to act to make your thoughts come true." pp. 102. This focus on the predictive nature of the neocortex is key to Hawkins' understanding. Either the neocortex implements an algorithm really quite similar to the CLA as described by Hawkins and is therefore a "memory-prediction framework" or he has got it wrong. The predictive abilities of NuPIC suggest that he is on the right track in spite of his many assumptions.
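Because auto-associative memory carries so much of the argument, a sketch may help. Below is the classic Hopfield-style version, which recovers a complete stored pattern from a corrupted cue. The patterns and sizes are invented, and this is the static form rather than the delayed, sequence-learning variant Hawkins invokes:

```python
# A classic Hopfield-style auto-associative memory: store patterns,
# then recover a full pattern from a partial or corrupted cue.
import numpy as np

patterns = np.array([
    [1, -1, 1, -1, 1, -1, 1, -1],
    [1, 1, 1, 1, -1, -1, -1, -1],
])

# Hebbian storage: strengthen connections between co-active units.
W = sum(np.outer(p, p) for p in patterns).astype(float)
np.fill_diagonal(W, 0)

cue = np.array([1, -1, 1, -1, 1, -1, 1, 1])  # first pattern, last unit flipped
state = cue.copy()
for _ in range(5):  # iterate until the state settles
    state = np.where(W @ state >= 0, 1, -1)

print(state)  # recovers the first stored pattern exactly
```

Feed it a fragment and it returns the whole: that is the property Hawkins wants, extended through time so that recalling the start of a sequence plays back the rest.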
4. Hawkins makes two interesting and useful assumptions for the purposes of developing a top down theory: "For now, let’s assume that a typical cortical area is the size of a small coin" pp. 138 (he does acknowledge there is substantial variation), and "I believe that a column is the basic unit of prediction" pp. 141. Why does it matter to Hawkins how large a cortical area is, much less a typical one? It shouldn't matter to a typical neuroscientist. They take the anatomy the way they find it. Remember though that Hawkins' purpose is to build a more intelligent computer program. He betrays his intent in making assumptions that all cortical regions have fundamentally the same structure (in spite of minor variations that he readily admits are in the literature) and in setting a typical size for an area of cortex. These assumptions will help him to design a computer program that learns in a new way. He is on better footing with the purpose of a cortical column. Cortical columns are indeed very regular in their construction and distribution, a fact that Hawkins dug out of 1970s research and relies upon heavily. It is striking and probably key to any successful high-level theory.
From this point forward Hawkins' assumptions get progressively more technical as he moves toward something that he can implement using existing technology. This may be the most important criticism of On Intelligence even though I personally find it perfectly excusable. Those seeking new neuroscience will be disappointed. Those seeking new and more general ways to approach artificial intelligence will be rapt.
Any review attempting to list Hawkins' more technical assumptions will need to pause to introduce new vocabulary for the general reader. A cortex, animal or human, is the outer layer of the brain. It consists of valleys and folds in order to increase its surface area in the small space afforded it in the skull. Its basic structure is a "cortical column" of six layers. The human brain has "some 100,000 neurons to a single cortical column and perhaps as many as 2 million columns." The Blue Brain Project of the Brain and Mind Institute at the École Polytechnique Fédérale de Lausanne in Switzerland is currently attempting to model a complete brain, or at least the cortex. They have already succeeded in modeling a rat's cortical column. This is much more than Hawkins attempted, but a top-level theory of cortical function has yet to emerge from the project.
The six layers of a cortical column have many connections to other layers, other columns, other regions of the cortex and other areas of the brain. It is a complex network. Each layer consists of differently shaped cells. Hawkins collected the many, many neurons in a cortical column into functions at each layer. That alone may be a very valuable contribution if it can be shown that this level of abstraction is possible without sacrificing higher-level function.
It will be useful and fascinating to see what emerges from a study of the Blue Brain Project's cortical column models. In the meantime, Hawkins has provided us with a roadmap of questions to ask.
5. Noting the obvious disparity between streams of sensory inputs and highly abstract thought, Hawkins illustrates how a hierarchical set of relationships between cortical areas could produce abstractions ("invariant representations") at the higher levels. "The transformation—from fast changing to slow changing and from spatially specific to spatially invariant—is well documented for vision. And although there is a smaller body of evidence to prove it, many neuroscientists believe you’d find the same thing happening in all the sensory areas of your cortex, not just in vision." pp. 114. Hawkins goes on to take this as written, which is just what he needs to do in the absence of established science in order to build a system.
6. Continuing with the vision system, possibly the best-studied area of the brain to date, Hawkins discusses some of the key regions called by neuroscientists V1, V2 and so on. He says, "I have come to believe that V1, V2, and V4 should not be viewed as single cortical regions. Rather, each is a collection of many smaller subregions." pp. 122. Hawkins is making a rather classic reductionist argument here. The question is not how arbitrary regions are defined or what they are called. The problem in front of our engineer is how they are connected. He needs that information to make reasonable (not necessarily physiologically accurate) assumptions if he is to uncover the mechanisms of the brain's learning system.
7. A region of cortex, says Hawkins, "has classified its input as activity in a set of columns." pp. 148. It is hard to argue with this suggestion given the success of Hawkins' artificial CLA in making predictions without the traditional training necessary to other forms of AI. Further, the cortex gets around limits on variation handling found in early artificial auto-associative memories, "partly by stacking auto-associative memories in a hierarchy and partly by using a sophisticated columnar architecture." pp. 164.
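"Activity in a set of columns" is easy to picture in code as a sparse distributed representation (SDR). Here is a toy sketch; the sizes echo the sort NuPIC's examples use (on the order of 2,048 columns with roughly 2 percent active), but the encoding itself is made up:

```python
# Toy sparse distributed representations: an input's "classification"
# is simply which few columns are active. Illustrative sizes only.
import numpy as np

N_COLUMNS = 2048  # columns in a toy region
N_ACTIVE = 40     # ~2% of columns active at once

def random_sdr(rng):
    sdr = np.zeros(N_COLUMNS, dtype=bool)
    sdr[rng.choice(N_COLUMNS, size=N_ACTIVE, replace=False)] = True
    return sdr

rng = np.random.default_rng(42)
a, b = random_sdr(rng), random_sdr(rng)

# Overlap (shared active columns) measures similarity. Two unrelated
# sparse codes share almost no active columns by chance.
print((a & b).sum())  # near 0 for unrelated inputs
print((a & a).sum())  # 40 for identical inputs
```

The key property is that overlap alone is a robust similarity measure, which is what lets a region treat "which columns are active" as a classification.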
8. There are several assumptions about the detailed workings of a cortical column. "Let's also assume that one class of cells, called layer 2 cells, learns to stay on during learning sequences", says Hawkins (pp. 152). He makes no judgement whether that "learning" is innate or actively learned during life. He doesn't even know that it is really there. Something like it must be in order to make his theory work. That is no criticism! It is instead a testable hypothesis and thus the very model of scientific advancement. It also allows him to build something.
"Next, let’s assume there is another class of cells, layer 3b cells, which don’t fire when our column successfully predicts its input but do fire when it doesn’t predict its activity. A layer 3b cell represents an unexpected pattern. It fires when a column becomes active unexpectedly. It will fire every time a column becomes active prior to any learning. But as a column learns to predict its activity, the layer 3b cell becomes quiet." pp. 152. This might seem unjustified. What would make Hawkins jump to a conclusion in the apparently complete absence of supportive science. The answer is that the engineer clearly sees the necessity of feedback when it is presented to him. There simply must be a mechanism that fills the role or no learning could occur. Hawkins merely suggests a reasonable place for it and encourages the neuroscience community to look for it.
As for the lowest level, layer 6: "cells in layer 6 are where precise prediction occurs." pp. 201.
9. Finally, Hawkins rightly notes some differences between biological neurons and the artificial neurons used in neural networking models. It makes one wonder what IBM implemented on their SyNAPSE chip. How biologically correct were they? Hawkins says, "neurons behave differently from the way they do in the classic model. In fact, in recent years there has been a growing group of scientists who have proposed that synapses on distant, thin dendrites can play an active and highly specific role in cell firing. In these models, these distant synapses behave differently from synapses on thicker dendrites near the cell body. For example, if there were two synapses very close to each other on a thin dendrite, they would act as a 'coincidence detector.' That is, if both synapses received an input spike within a small window of time, they could exert a large effect on the cell even though they are far from the cell body. They could cause the cell body to generate a spike." pp. 163. This is exactly the sort of thing that can have great biologic effect and cause great trouble for overly simplistic implementors. It would seem that Hawkins was careful to avoid this oversimplification even while embracing others.
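That "coincidence detector" behavior is concrete enough to sketch. In the hypothetical toy model below, two synapses on the same thin distal dendrite produce a dendritic event only when their input spikes arrive within a few milliseconds of each other; the window and the spike times are invented:

```python
# Toy model of the dendritic "coincidence detector" described above:
# two synapses on a thin distal dendrite drive the cell only when
# their input spikes arrive close together in time. Numbers invented.
COINCIDENCE_WINDOW_MS = 5.0

def coincident_events(spikes_a, spikes_b, window=COINCIDENCE_WINDOW_MS):
    """Return the times (ms) at which the two synapses fire together."""
    return [
        (ta + tb) / 2.0
        for ta in spikes_a
        for tb in spikes_b
        if abs(ta - tb) <= window
    ]

a = [10.0, 50.0, 90.0]  # input spike times at synapse A (ms)
b = [12.0, 70.0, 91.0]  # input spike times at synapse B (ms)

print(coincident_events(a, b))  # [11.0, 90.5] -- only near-simultaneous pairs count
```

A single-point artificial neuron that simply sums weighted inputs cannot express this timing dependence, which is the simplification Hawkins is warning about.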
Hawkins has also uncovered something really quite important and almost painfully subtle. Philosophers of mind, psychologists and priests have for centuries argued that the mind is fundamentally different from the body. We moderns have become comfortable with considering huge swaths of the body as mechanistic in nature. We can replace an arm, a leg, a kidney, even a heart for a while. We can insert a pacemaker, or a hearing aid. Surgery can cut, sew and sometimes almost magically repair, replace or augment much of our bodily infrastructure. We tend to view the body as a mechanism, however complicated, as a natural result. The brain, though, the mind, is a different matter. All the neuroscience conducted to date fails to convince most of us that the brain implements an algorithm. We cannot, so it is said, be reduced to an algorithm because that would imply that we could - one day - make a machine with all the abilities of people. Perhaps it would need to have all the rights, too. That scares people badly.
Parts of the brain have come to be accepted as algorithmic. Are you aware that a computerized cerebellum has been created for a rat? That was in 2011. Scientists and engineers are starting to soberly discuss creating such a device for paralyzed human beings.
The slow, painfully slow, admission that the body is a series of devices each of which chemically implement algorithms has been a long time coming. Parts of the brain have now unarguably fallen to the algorithmic worldview. First the ears, the eyes, the entire vision system. The cerebellum. The pineal gland. Hormonal balances. Most of the pons. Hawkins takes on the neocortex and, in spite of Chrissy's complaint, he did find it necessary to include the thalamus in his model. The bottom line is that the cortical learning algorithm is an algorithm. Philosophers of mind fear such a finding.
The idea that thinking is a form of computation dates from 1961, when Hilary Putnam first expressed it publicly. It has become known as the Computational Theory of Mind, or CTM. Although CTM has its detractors (especially John Searle's Chinese Room, although that has been debunked to my personal satisfaction), it has become the basis for current thinking in evolutionary and cognitive psychology. The so-called new synthesis of CTM is roughly a combination of the ideas of Charles Darwin's evolution, mathematician Alan Turing's universal computation and limits-to-computability proofs, and linguist Noam Chomsky's rationalist epistemology. The basic idea is still the same: that human thought in human brains is algorithmic, even if the algorithms are quite complex ones that we haven't fully deconstructed. The new synthesis is about proving that theory.
"The dissociation between mind and matter in men and machines is very striking", observed David Berlinski in his book The Advent of the Algorithm[AmazonGoodreads], "it suggests that almost any stable and reliable organization of material objects can execute an algorithm and so come to command some form of intelligence."
We know what to do with algorithms. We implement them. It doesn't really matter how. We can implement algorithms in computer software or by creating DNA from a vat of chemicals or by lining up sticks and stones in clever ways. The only difference is the efficiency of the implemented algorithm. Electronic computers give us a way to perform calculations - implement algorithms - blindingly fast but they aren't the fastest way to implement all algorithms. Optical computers can do some things faster. Bodily chemistry, too. Or quantum computing. Each is just another way to implement algorithms be they designed by people or discovered by the search algorithm that we call evolution.
Discovering that the brain is algorithmic is arguably the most important realization of this or any other century. It means we can make more minds by any means we choose. That will shatter many world views even if Hawkins only got us part way there.

Tuesday, October 14, 2014

Book Review: Freethinkers: A History of American Secularism by Susan Jacoby

Freethinkers: A History of American Secularism [AmazonGoodreads] is exactly what it claims to be: a major piece of the missing history of secular thought, heretofore diligently and thankfully incompletely suppressed. Jacoby has joined Jennifer Michael Hecht, author of Doubt: A History: The Great Doubters and Their Legacy of Innovation from Socrates and Jesus to Thomas Jefferson and Emily Dickinson [AmazonGoodreads], and Christopher Hitchens, author of The Portable Atheist: Essential Readings for the Nonbeliever [AmazonGoodreads], as one of the preeminent historians of nonbelief.
Negative reviews of Freethinkers invariably call the book "a slog" or "packed with too much information", or describe Jacoby's writing style as "condescending". They are not completely without basis. Later chapters veer from the impassioned and erudite opening in which Jacoby, at her best, quotes the prominent nineteenth century orator Robert Ingersoll as saying, "We have retired the gods from politics", and immediately contrasts that sentiment with President George W. Bush's post-9/11 sermon thunderingly delivered from the pulpit of the National Cathedral.
As Jacoby sings the praises of the secular founding of the United States, she fails to follow up on the irony of the National Cathedral itself. Why does a secular government have a National Cathedral? The answer goes back almost to where Jacoby's history starts: 1792. That is when the architect of Washington, D.C., Pierre L'Enfant, set aside a place for a central church, prominently on the National Mall between the Congress and the White House. Congress itself chartered the building of the cathedral in the late nineteenth century and has designated the building as the "National House of Prayer". The congressional mandate was made in spite of the building being operated by the Episcopal Church and owned by the Protestant Episcopal Cathedral Foundation. Our nationwide conversation regarding the separation of church and state has never been fully resolved.
How many of the positive reviews of Freethinkers (sample: "A must-read for freethinkers!!") are due to the nearly complete removal of freethinking from history books adopted as texts in US schools?  Perhaps many atheists, agnostics and freethinkers have been simply stunned to discover that their thoughts should not make them lonely. I certainly have been.
Stealing history from a subculture does not make them love you. Yet it doesn't keep rulers from trying. Other modern examples of minority cultures losing their history include Australia's shameful Stolen Generations, the kidnapping of children by Nazi Germany for the purpose of "Germanification", and stories of European Jews hiding their heritage from their children (who sometimes regained it). Earlier examples abound, especially in areas once controlled by native Americans, conquered in war or folded into empires. The hiding of American secularism from new generations hardly approaches these extremes. Nevertheless, a shock of recognition is bound to occur when the light is eventually allowed to shine.
Jacoby sometimes gets so close to her subject that she forgets that the rest of us do not command her source material as well as she. She refers to a petition signed by 'four hundred Quakers, wittingly signed "your real Friends".' The joke may well be lost on those coastal Americans unfamiliar with the Religious Society of Friends, who are only colloquially called Quakers for their founder's admonition to "tremble at the word of the Lord".
I am nearly forced at this point to pause long enough for the only Quaker joke that I know (perhaps because it is one of very few):
A Quaker farmer is milking his cow. The cow had been walking through brambles and has a tail full of burrs. The cow whacks the farmer across the face with her tail. The farmer shakes his head and continues to milk. The half-ton cow then steps on the farmer's foot. The farmer puts his shoulder into the cow, pushes and extracts his foot. When the milking is finished, the farmer stands up. The cow kicks over the bucket of milk.
The farmer looks at the spilled milk and then walks around to look at the cow in her eye. "Thee knows," says the farmer, "that I may not strike thee. And thee knows that I may not curse thee. But what thee does not know is that I may sell thee to a Methodist."
It may be difficult for us moderns to comprehend the subtlety and difficulties inherent in that recitation. What was it like in Western Europe or the fledgling United States at a time when Catholics and Jews were considered heathen alongside the Deists, so prevalent in the countryside, scores of minor Protestant sects, outright atheists and the smattering of Asian faiths seeping in during the imperial precursor to globalization? So many sects abounded in the early US that it was in almost everyone's best interest to keep the others from gaining too much power. Our situation today is so different partly because evangelical Protestant Christianity now commands a widespread majority. Evangelicals have forgotten, as Jacoby points out, that they were the natural political allies of atheists during the contentious negotiations leading to the Constitution. More to the point, they have fallen into the desire to wield their powerful majority when they have it. John Adams coined the phrase "tyranny of the majority" in 1788 and he knew exactly what he was talking about.
Those interested in Jacoby's star of the show, Robert Ingersoll, will be pleased to learn that Jacoby recently reprised her biosketch in Freethinkers with a complete biography. The Great Agnostic: Robert Ingersoll and American Freethought [AmazonGoodreads] has garnered the same high marks as its predecessor. Jacoby could not have initially benefited from Pulitzer Prize winning journalist Tim Page's distillation of Ingersoll's work in What's God Got to Do with It?: Robert Ingersoll on Free Thought, Honest Talk and the Separation of Church and State [AmazonGoodreads] since it was published the year following Freethinkers.
Jacoby has many successes in this book and I do not wish to diminish them. There is one aspect of the book that did not stand up to the rest. She seems to have entirely missed the European atheistic influence on her American heroes.
Ingersoll seems to have been a staunch Utilitarian in his philosophy. Utilitarianism judges each course of action by its effects, positive or negative, on the greatest number of people. This ever-so-practical philosophy was pioneered by Jeremy Bentham in his 1789 book An Introduction to the Principles of Morals and Legislation [AmazonGoodreads] and was intended to form the underpinnings of an atheistic moral philosophy to replace the prevailing Judeo-Christian and Deistic philosophies of his day. Amazingly, and to Jacoby's theme, the word "atheist" does not currently appear on Jeremy Bentham's Wikipedia entry.
The following quotation from Ingersoll is wholly in line with Bentham's "greatest happiness principle" and with the philosophy espoused by Ingersoll's British contemporary John Stuart Mill:
Happiness is the only good. The place to be happy is here. The time to be happy is now. The way to be happy is to make others so.
Bentham went on to have great influence on the founding of University College London, the first university in England to allow the tertiary education of atheists, Jews, Hindus and members of other religious minorities. The leading universities of the time, Oxford and Cambridge, required membership in the Church of England. Bentham was also the first person to donate his body to science. Prior to Bentham, anatomists acquired their corpses from graveyards, with or without the permission of the law, or by receiving the bodies of executed criminals. Bentham's body was publicly dissected by his friend Dr. Thomas Southwood Smith; the remains were later preserved, dressed in Bentham's own clothes and placed on permanent display in UCL's South Cloisters. His so-called "auto-icon" may still be seen there today. All of this was in accordance with Bentham's atheism, his Utilitarianism and his last will and testament.
Can we be certain that Bentham influenced Ingersoll? Yes. Project Gutenberg contains a freely available copy of Ingersollia, or "Gems of thought from the lectures, speeches, and conversations of Col. Robert G. Ingersoll, representative of his opinions and beliefs". In that useful collection we find this particular gem: 'The glory of Bentham is, that he gave the true basis of morals, and furnished the statesmen with the star and compass of this sentence: "The greatest happiness of the greatest number."' Ingersoll himself admits the influence, and yet Jacoby seems to have either missed it or ignored it. Interestingly, Hecht too seems to have missed the connection, although she focuses more of her considerable attention on Bentham than on Ingersoll. To miss the connection is to claim an Americanism for Ingersoll's thought that belies its European inheritance.
Together, Bentham, Mill, Ingersoll and other Utilitarians created the atheistic golden age of the nineteenth century. Jacoby's America-centric history inexplicably leaves out this European connection to Ingersoll's life and times.
Jacoby also fails to mention, alongside her discussions of women's suffrage and civil rights, that the expansion of voting rights has corresponded directly with the simplification of political speech. George Washington's first inaugural address is full of big, juicy words as little understood by the common people of his time as by the uneducated of ours: "Among the vicissitudes incident to life, no event could have filled me with greater anxieties than that of which the notification was transmitted by your order, and received on the fourteenth day of the present month." Can one imagine Barack Obama or George W. Bush using a word like vicissitudes and getting away with it? Actually, Obama tried early in the 2007 campaign season and was castigated as "professorial" as a result. There is little doubt in my mind that the democratic expansion of suffrage brought along the religious concerns of the masses.
The state of atheism and agnosticism in the US today is, as it has always been, complicated. I agree with Jacoby that it is difficult in the extreme for an openly atheistic person to be elected to national office. Indeed, the Huffington Post reported earlier this year that 24 US congressmen admitted to being "privately" nonbelievers but would not say so publicly. American atheists and agnostics are, as a group, mostly in the closet. Contrast this with the widely reported 1997 survey of members of the National Academy of Sciences. The original article in Nature requires a subscription to access, but a summary is available via the Internet Archive's Wayback Machine. 93% of our nation's best scientists reported being either atheist or agnostic (72.2% atheists, 20.8% agnostics). A 2009 survey by the Pew Research Center confirmed much higher rates of secularism among scientists of all kinds than among the general public. The sense of tension reported by Jacoby between theists and atheists, with agnostics sometimes caught in the middle, has survived intact to the modern day, with the vastly increased pace of scientific discovery bringing the conversation to an uncomfortable head.
Jacoby's litany of examples from modern America in her final chapter, from Justice Antonin Scalia to President G.W. Bush to Al Gore, earns that chapter its title: Reason Embattled. It certainly feels that way. Scalia's repetitive quotations from St. Paul should make any of us wonder how he thinks about the New Testament's 1 Timothy 2:12. However, the First Amendment is still in force, even in a period when the Fourth is held to be first among equals. We have not lost yet. Jacoby's call to arms to regain the pride of place of American secularism should be heeded; if it is not, the fault is ours indeed.

Friday, October 03, 2014

Rewriting the US Pledge of Allegiance

The Humanist, a publication of the American Humanist Association, is having a contest to rewrite the US pledge of allegiance.

I have never, and I mean never, been a fan of the pledge. I first encountered the pledge in kindergarten where, along with all the other five-year-olds, I was informed that I would stand every morning, face the flag in the corner of the room and recite:
I pledge allegiance to the Flag of the United States of America, and to the Republic for which it stands, one Nation under God, indivisible, with liberty and justice for all.
The phrase "under God" sits poorly with most non-theists. I also never understood the point of pledging allegiance to a flag per se.

The pledge itself contains a respectable amount of baggage for such a short sentence. "one Nation" and "indivisible" are clear reflections of their time. The pledge was written to coincide with the quadricentennial of Columbus' initial landing in the New World on an unknown island in The Bahamas. America in 1892 was still reeling from the society-wide shock of the Civil War, the economic rebuilding of the American South and the migration of both Native Americans and former slaves into some semblance of citizenship. Although direct memories of the war were fading and US patriotism was rising to the crescendo that would culminate in the Spanish-American War, no educated person of the time could have mistaken the words "one Nation" and "indivisible" as meaning anything other than "Let us never again fight a civil war".

The contentious phrase "under God" was snuck into the pledge in 1954 after six years of agitation from various groups of a religious bent, including both the Sons and the Daughters of the American Revolution, the Knights of Columbus and the organizers of the National Prayer Breakfast.

Arguably (because I will argue it) the best words in the pledge are "with liberty and justice for all". These simple words betray a social commitment that was extreme from the day they were written. Socialist, Baptist minister and pledge creator Francis Bellamy originally wanted to include the words "equality" and "fraternity" in the pledge. Those words recall the Enlightenment-era motto of the French Revolution (and the current national motto of both France and the Republic of Haiti): Liberté, Égalité, Fraternité. Fear of extending the franchise to women, African Americans and other marginalized people would have kept the pledge from wide adoption had those words been included. Bellamy successfully conned his ship of patriotism past the shoals of overt prejudice.

The America of today may not be less prejudiced, but it is certainly more multicultural. Most adults among the country's roughly 12% African American and 16% Hispanic or Latino population have the ability to vote, as do most of the 5% of adults of Asian descent. The total population is also more than five times what it was in the 1890s.

How can we repair the current pledge? I doubt we can. The fractious nature of the US Congress, especially the House of Representatives, and the vocal nature of our religious fellow citizens make any agreement to change the pledge elusive. Some have argued that we should not pledge at all, to a country or to a god. That need not stop a reasonable discourse. It is just likely to stop an official agreement.

My own suggestion is to get back to basics. The original purpose of the pledge was to "instill into the minds of our American youth a love for their country and the principles on which it was founded". This is social engineering if I have ever heard it. Call it instilling patriotism in the new generation or mindwashing or anything else, it is absolutely an explicit form of cultural transmission. So what form of cultural transmission should we wish for?

I would love to get any idea of someone else's god out of the picture. Religion is best when it is silent in public. Nationalism, too, does not tend to serve us well, as it leads us into wars that may not need to be fought.

The Preamble to the US Constitution is, for me, one of the cleanest statements of the goals of a secular society: one in which citizens have the liberty to pursue their own interests as long as the general welfare is not harmed. My right to swing my arm, as my father would say, ends at the tip of your nose.

I therefore propose the following restatement of the US pledge of allegiance:
I pledge to work for a more perfect union, the establishment of justice, the insurance of domestic tranquility, the provision for common defense and the promotion of general welfare in order to secure the blessings of liberty for ourselves and our posterity.