I wandered into Riverby Books in downtown Fredericksburg one blustery autumn afternoon in the mood for some serendipity. Used bookstores may be going the way of the dodo (that is, headed for an early and forced extinction but destined to be brought back by some mad scientist of the near future in a slightly twisted guise), but they are the best sources of serendipity that I know. The smell of the place is distinctive but hardly uncommon. One may find it in used bookstores, libraries and scholars' holes from Timbuktu to Taipei, but never in a chain bookstore. The most concentrated source of the scent of old paper and paper mold may be found in the city block of Powell's City of Books, the Mecca of knowledge nerds. I would pay good money for iBooks or Kindle to produce such a smell.
The two glass windows flanking Riverby's wood-and-glass door invite one to stop and browse. A classic Underwood manual typewriter shows the first line of a famous novel on the top half of a sheet of US Letter. Tell the cashier who wrote it and get 10% off. A selection of banned books is proudly displayed with index cards explaining where and when they were banned. A whiteboard advertises specials, includes cute drawings and exhorts engagement. Inside, cookbooks vie with vintage comics, classic novels with the history of Rome, used philosophy books sold back by university students, and a stack of restored Compact Oxford English Dictionaries in their original slipcases, complete with the reading glasses necessary for the tiny print. The bookbinder's corner is almost hidden by a display of fine bindings in leather and gold. The rare book cabinets sit just across from the constantly changing novelty shelves that stretch from floor to high ceiling, some at intentionally odd angles. There are stacks of recently acquired books on the stairs, posters and maps on the walls. Original drawings from Maurice Sendak's Where the Wild Things Are catch my wife's eye every time we stroll by. I enter and inhale.
I walk past the novelties on my way to Ancient Egypt, Greece and Rome but something catches my eye. Something almost always does. A copy of The Philosophical Breakfast Club: Four Remarkable Friends Who Transformed Science and Changed the World [Amazon, Goodreads] by Laura J. Snyder sits facing outward on a set of shelves canted sideways right into the natural traffic flow. It is shiny and new with a perfect dust jacket. The cover mirrors Riverby itself with its full-height bookshelves, tabletop telescope and photographs of four of the most important scientists of the early nineteenth century: William Whewell, John Herschel, Richard Jones, and Charles Babbage. I knew of the astronomer Herschel and, of course, of Babbage, the mathematical genius who invented the first programmable computer and friend to the first computer programmer, Augusta Ada King née Byron, Countess of Lovelace (known as Ada Lovelace). I recalled vaguely that Whewell had coined the word "scientist". Snyder would soon fill in the many missing details.
The book promises to show how these four friends set out to revolutionize the practice of science and then, incredibly, did it. As a history, it has it all: a catchy title, a true and interesting story told like a novel, and subject matter that keeps giving and giving. Snyder proves herself a historian to watch. I would read anything she writes. Thank you, Riverby. I feel guilty at the recognition that Snyder won't see any of the paltry $5.98 I hand over for the hardback. I would have paid twice that at Amazon but never would have found it there. It sold for six times that amount when it was first published in 2011. Riverby and Powell's capture my business by providing the serendipity still missing from online alternatives.
Fortunately for us all, Snyder has written more. Her Web site lists two more books as grand in scope as The Philosophical Breakfast Club: This year brought Reforming Philosophy: A Victorian Debate on Science and Society [Amazon, Goodreads], and next year promises us Eye of the Beholder: Johannes Vermeer, Antoni van Leeuwenhoek, and the Reinvention of Seeing [Amazon, Goodreads], about seventeenth-century developments in optics. She gave a TED talk on The Philosophical Breakfast Club in 2012 and somehow manages to find time to teach at St. John's University in New York. I console myself that she will see some cash from me for the new works.
The Philosophical Breakfast Club begins with a triumphant speech by Whewell at the British Association for the Advancement of Science at the height of his career and soon flashes back to his childhood. Snyder's story hangs on Whewell, who served in life as the instigator, foil and correspondent for the other three as well as a first-rate scientist in his own right. Snyder uses the voluminous extant letters of the four men and many of their contemporaries to paint a detailed, personal account of their extraordinary lives. She seems to know as much about the four men as their wives and mothers did and perhaps, with the advantage of distance, more.
William Whewell was a mathematical whiz. He was a poor boy who struggled to pay his way as a student at Cambridge. His home parish raised money to get him there, and he worked throughout his studies in order to stay. Before his long run as an academic was over, he would become Master of Trinity College, hold a professorship in mineralogy, write physics textbooks, revolutionize the study of the tides with one of the first worldwide data collection efforts, and write a major work of philosophy. Says Snyder,
Whewell suggested "Eocene, Miocene, and Pliocene" to the geologist Charles Lyell as names for historical epochs, and in a few years he would give Michael Faraday the terms "ion, cathode, and anode" for his electrical research.
Besides "scientist", Whewell would also coin the word "physicist", both by analogy with the word artist. He was one of the most fascinating and capable characters you have probably never heard of.
The Reverend Richard Jones was a founder of modern economics. His Essay on the Distribution of Wealth and on the Sources of Taxation [full text] in 1831 was the first major work of economics to rely on both the massive collection of real-world data and the statistical analysis thereof. He served first as a rural curate, interested mostly in gardening, and, following his book, as a professor of political economy. As Snyder argues, he would not have finished the massive work without Whewell's constant prodding. The one surviving photo of Jones shows an unhappy man, looking downward, corpulent and clearly depressed. He is the only one of the four whose eyes do not radiate their intelligence and learning.
Sir John Herschel was not just any astronomer. He was the son of astronomer William Herschel, the discoverer of the planet Uranus, and carried on his father's work of cataloging double stars and nebulae in both the northern and southern hemispheres. Throughout a lifetime of science, he discovered seven planetary moons, conducted a major study of the botany of South Africa and was one of the inventors of photography.
Charles Babbage, another mathematician, has had the most lasting reputation of the four due to the influence of computing on our current age.
Snyder shows herself equally comfortable with history, science and the history of science. The dispassionate eye of the historian mixes nicely with the sympathetic observer of human nature. The only one of the four treated at arm's length is perhaps Babbage, whose behavior, especially in middle age, is variously described as "spiteful", "cruel", and "snide". Babbage was "never one to underestimate his own intelligence." Snyder seems to have justification for these descriptions in the letters of her protagonists as they struggled to maintain positive relations with their misanthropic friend.
Here is the voice of the professional historian, introducing the concerns of the original Luddites:
Babbage had realized that just as steam-driven mechanical looms were replacing men and women in the wool and cotton mills, so too could a machine replace the human computers. [He] seemed unconcerned that unemployed English computers would riot like the unemployed wool and cotton workers had done in Whewell's home county of Lancashire in 1813 (it helped that these were still part-time laborers; only after 1832 did computing for the Nautical Almanac become a full-time job). During these labor disturbances and others that took place between 1811 and 1816 in the manufacturing districts in the north of England, displaced workers destroyed mechanical looms and clashed with government troops. The term Luddite was coined in this period, after Ned Ludd, an English laborer who was lionized for having destroyed two stocking frames in a factory around 1779. Babbage would have none of this Luddism. Progress in science and industry required more mechanical means of calculation, as well as mechanical means of factory manufacturing, and nothing would stand in the way of that progress. (pp. 83)
Snyder's clear prose not only introduces Babbage's immediate concerns and the environment in which he worked, but also ties Whewell's personal story into the picture, explains a modern term whose full etymology few know, and throws in a substantial portion of the early history of computing. All in one easily readable paragraph! She handily ties her history to our own modern-day problems, which we instantly recognize in Babbage's.
Here is the science writer, not a scientist, but one who is perfectly comfortable explaining complex ideas:
Newton established that the attractive forces of the sun and moon produced a tide-generating force; but it still remained to be shown how the law of universal gravitation could account for particular tides. His theory as applied to the tides did correctly predict some of the observed phenomena, such as the known fact that the oceanic high tides lag roughly three hours behind the syzygies (when the sun, earth, and moon are aligned, which happens at the time of the full moon and the new moon). But Newton's analysis was inconsistent with many of the observations that did exist, indicating that the relation between the tides and the gravitation between the earth, sun and moon was still not fully understood. In particular, his theory did not provide any understanding of factors that might counteract the attractive force. (pp. 171)
A lesser writer would have dodged the word syzygies altogether, or discussed Whewell's groundbreaking work on understanding tides without bothering to describe why Newton's theory was insufficient. Perhaps Snyder knew syzygy's other senses from her philosophical studies, such as its use to denote a close union or, alternatively, a union of opposites.
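For readers who want the quantity behind the phrase "tide-generating force", the standard textbook expression (my addition, not Snyder's) for the tidal acceleration produced at the Earth's surface by a body of mass $M$ at distance $d$ is approximately

\[ a_{\text{tide}} \approx \frac{2GMR}{d^{3}}, \]

where $R$ is the Earth's radius and $G$ the gravitational constant. The $d^{3}$ in the denominator is why the nearby Moon, and not the far more massive Sun, dominates the tides.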
Of course, no one is perfect. Snyder credits Whewell for bringing about "a particular vision of what a modern scientist does", especially in his massive data collection efforts. She fails to mention Whewell's contemporary Matthew Fontaine Maury, who used identical methods on the other side of the Atlantic in the same years. There is more to that story than Snyder admits. "Where Maury mapped the oceans, Whewell mapped the coasts", notes Caren Cooper of Cornell University.
In describing the mechanism connecting the number-crunching central "mill" in Babbage's Analytical Engine to the "store" that provided memory, Snyder makes a point of referencing a modern expert who calls it a "memory data bus". She apparently missed that "bus" alone has become the common term of art. She did recognize just how much of Babbage's architecture for his earlier Difference Engine and later, general-purpose, Analytical Engine presaged the von Neumann architecture used in the first stored-program digital computers, although she failed to mention John von Neumann by name. Readers may be rightly surprised to discover that Babbage designed not only the machines but also peripheral devices such as graph plotters and printers.
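That mill-and-store split maps directly onto the fetch-and-execute structure of later stored-program machines. Here is a minimal sketch of the idea in Python, entirely my own illustration rather than anything in Babbage's designs or Snyder's book: a single store holds both instructions and numbers, and a simple "mill" performs arithmetic on values fetched from it.

```python
# Toy stored-program machine: one store holds both program and data;
# a "mill" (here, a single accumulator) does the arithmetic.
# The instruction set and encoding are illustrative only.

def run(store, pc=0):
    acc = 0                            # the mill's working value
    while True:
        op, arg = store[pc]            # fetch the next instruction from the store
        if op == "LOAD":
            acc = store[arg]
        elif op == "ADD":
            acc += store[arg]
        elif op == "STORE":
            store[arg] = acc
        elif op == "HALT":
            return store
        pc += 1

# Program lives in cells 0-3, data in cells 10-12.
memory = {
    0: ("LOAD", 10), 1: ("ADD", 11), 2: ("STORE", 12), 3: ("HALT", None),
    10: 2, 11: 3, 12: 0,
}
print(run(memory)[12])                 # prints 5
```

The point of the toy is only that instructions and data share one store and one path to the mill, which is the essence of the architecture Snyder credits Babbage with anticipating.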
Snyder does not address that most vexing question for large-scale mechanical computing: how did Babbage intend to overcome the friction caused by the interaction of his many gears? There is modern evidence to suggest that both friction and the inability to machine parts to the required precision doomed the Difference Engine. Snyder instead blames only Babbage. "For Babbage, the enemy of the good was always the better", she writes, which, although no doubt true, misses the point somewhat. A more careful reading of modern scholarship on Babbage's engines would have uncovered his inability to overcome practical limitations. His engines worked at scale only on paper. Modern reconstructions do exist, in software, where friction and machining are not limiting.
Babbage, that master of deductive reasoning, was also the odd man out in other ways. The four friends had come together at Cambridge University and, as students, met regularly for the eponymous breakfast discussions. They decided, at Whewell's prodding, to dedicate their lives to furthering the ideas of Francis Bacon. Bacon was the first to codify what we now think of as the scientific method, and he promoted inductive reasoning as the key to advancement in science. Induction is the collection of many specific examples of a phenomenon followed, eventually, by the theorizing of a general law to fit the collected evidence; it is the philosophic opposite of deduction, which proceeds directly from some principle to others by the rules of logic. Of course, scientists now use both induction and deduction to analyze nature, just as was done in the early nineteenth century. The members of the philosophical breakfast club weren't after mundane, plodding science. They wanted to uncover grand new laws of nature. For that, they needed induction. Herschel, Jones and Whewell, following Bacon to the letter, pleaded for the use of induction in their great works. It is easy, they thought, to be led astray by logically proceeding from a bogus presumption. The truth of that risk continues to dog many would-be philosophers and their followers today. John Dewey's presumptive twentieth-century educational reforms and their aftermath are as fine an example as we could wish.
And British science in the early nineteenth century did need help. The breakfast club's contemporary Charles Dickens could have been talking about British science in his opening paragraph to A Tale of Two Cities:
It was the best of times, it was the worst of times, it was the age of wisdom, it was the age of foolishness, it was the epoch of belief, it was the epoch of incredulity, it was the season of Light, it was the season of Darkness, it was the spring of hope, it was the winter of despair, we had everything before us, we had nothing before us, we were all going direct to Heaven, we were all going direct the other way - in short, the period was so far like the present period, that some of its noisiest authorities insisted on its being received, for good or for evil, in the superlative degree of comparison only.
Babbage eventually broke ranks as progress on his Difference Engine broke new ground. Deduction has its place, after all. Whewell, himself a mathematician of note, felt that Babbage had betrayed the goals of the club. It is hard for a modern computer scientist to agree with him. The ability to compute on a massive scale, rapidly and with accuracy, has brought new abilities to pursue both pure and applied science.
Lamenting the London Statistical Society's focus on the collection of facts alone, Whewell complained that "they would go on better if they had some zealous theorists among them...Unconnected facts are of comparatively small value." (pp. 153) The computer scientist Jeff Hawkins noted a similar problem in late-twentieth-century neuroscience research when he said:
There was an unending supply of things to study and read about, but I was not gaining any clear understanding of how the whole brain actually worked or even what it did. This was because the field of neuroscience itself was awash in details. It still is. Thousands of research reports are published every year, but they tend to add to the heap rather than organize it. There's still no overall theory, no framework, explaining what your brain does and how it does it.
It is hard to remember that you came to drain the swamp, goes the joke, when you are up to your ass in alligators. It is a problem that is likely to become steadily worse as science and engineering continue to specialize. Hawkins, like a good inductionist, proceeded to put forward his theory based on the myriad observations. Whewell, Jones and Herschel did the same thing in their time.
Any Englishman uncertain whether such philosophic differences had practical consequences would have become certain as workhouses were constructed throughout England during this period. The poorest members of society were at first encouraged and eventually forced (after 1834) to pay their debts by moving into workhouses that separated family members and reduced laborers to virtual slaves. This system was unabashedly based on the social theories of the political economists Thomas Malthus and David Ricardo, and the social reformer Edwin Chadwick, a protégé of the philosopher Jeremy Bentham. Bentham founded the philosophic movement of Utilitarianism, which proposed calculating the consequences of any course of action so as to maximize overall happiness. It may be useful to apply such a calculus to an individual, but to apply it to a society runs the substantial risk of justifying the suffering of the socially inconvenient. Philosophy mattered as much in the first half of the nineteenth century as it did in the latter half, when the application of the ideas of Darwin and Marx would instigate social change that would take more than a century to painfully play out.
Babbage's ideas about religion also rankled. All four men were religious, and seemingly sincerely so. Jones and Whewell were ordained ministers of the Church of England; Herschel and Babbage were wealthy enough not to need to be. Snyder does a fine job describing the house parties at which Babbage used a small Difference Engine to count from one to one hundred, one number at a time. As the machine reached the one hundred mark, it would begin to count by twos.
What you just witnessed seemed almost miraculous, did it not? he asked them. It seemed like the machine would just keep counting by one for an eternity. And yet this is not what occurred. The machine suddenly changed its course and began to count by a new rule. Is this not what we feel when we look at nature, and see wondrous and inexplicable events, such as new species arising as others die off? Babbage inquired. Is this not typically explained by supposing that God, our creator, our inventor if you will, has intervened in the world causing this event, outside of the natural order of things? Is this not exactly what we call a "miracle"?
Babbage had set a feedback mechanism to change the function being computed when the value reached one hundred. He had invented the idea of God as algorithm. Personal interventions were not necessary to keep the world running, as Isaac Newton had believed they were. In this Babbage was expressing, rather uncomfortably for his ordained friends, a quite Deist viewpoint. Snyder never quite hangs the term Deist on him.
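The parlor trick is trivial to reproduce today. The sketch below is my own Python rendering of the idea as Snyder describes it, not Babbage's mechanism: the rule for producing the next number is part of the program from the start, and it silently changes once the count reaches one hundred.

```python
# A toy version of Babbage's counting demonstration. The "miracle"
# (counting by twos after one hundred) is a rule set in advance;
# no intervention is needed while the program runs.

def babbage_counter(limit=110):
    value, step = 1, 1
    while value <= limit:
        yield value
        if value >= 100:   # the pre-set feedback point
            step = 2       # the law of the sequence changes
        value += step

print(list(babbage_counter(106)))
# [1, 2, ..., 99, 100, 102, 104, 106]
```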
Whewell, Herschel and Jones might have been equally disquieted by the comments of Henry Wilmot Buxton, who said of Babbage that he "had taught wheelwork to think, or at least to do the office of thought." (pp. 88) The implications of artificial intelligence have yet to be dealt with, even today. We have the billionaire entrepreneur Elon Musk warning in 2014 that AI is "our biggest existential threat." It is unclear whether Musk's fears have any basis in reality, but the threat to traditional religion and the concept of the soul is enough to trouble many and was certainly enough to trouble our Anglican protagonists.
Snyder reminds us that most men of science at the time believed science to be, and indeed fought for it to be seen as, an equally valid avenue of approach to the divine. "The idea was that nature was one of God's two books - the other, of course, being the Bible," says Snyder. She is right to remind us, "At that time, especially in Britain, science and religion were not considered enemies; on the contrary, they were seen as compatriots, both devoted to the appreciation of the Creator of the universe." Times have changed. It has been widely noted that a survey of the U.S. National Academy of Sciences reported that "72.2% did not [believe in a god] and 20.8% were agnostic or had doubts". It is no longer fashionable or required for most scientists to express faith in a national religion.
Snyder notes that Whewell closely followed the British philosopher and cleric William Paley in believing that religion and science were necessarily aligned. Whewell's Astronomy and General Physics Considered with Reference to Natural Theology of 1833 echoed Paley's argument that we fit our environment so well that it could not be a matter of chance. It would take Darwin to show that their argument had it exactly backward: we fit our environment so well because we evolved to fit it, as Bertrand Russell pointed out in his 1927 essay Why I Am Not A Christian. Paley's "divine watchmaker" has given way to Dawkins' "blind watchmaker".
It was Paley who famously used the human eye as an example of something too complex to have occurred by chance. His lack of imagination led him to conclude that a creator must be required. The argument is still a mainstay of the case for so-called Intelligent Design. It has been thoroughly debunked by modern science.
Intelligent Design enthusiasts say that certain biological features are far too complex to have evolved via Darwin's algorithm. They (especially Lehigh University biochemistry professor Michael Behe) have claimed that since very complex structures (such as the eye) rely on the simultaneous activation of many (tens, hundreds, thousands) of genes, they could not possibly evolve by successive single mutations. This is known as "irreducible complexity", because the complexity of the structure is not thought to be reducible to a series of single mutations. That is just not the case. A research team led by Christoph Adami at the Keck Graduate Institute and Caltech in 2005 used software modeling to show how complex adaptive traits such as the eye can evolve using only the Darwinian algorithm via the accumulation of mutations. This is possible because random mutations may survive temporarily even if they are not in themselves adaptive, giving other mutations the opportunity to build upon them.
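The underlying point, that mutations conferring no immediate benefit can persist long enough for later mutations to build on them, is easy to illustrate. The toy simulation below is my own sketch, not Adami's model: a "complex" trait requires three specific genes at once, none of which is selected for on its own, yet the combination still assembles by chance.

```python
import random

# Toy illustration (not Adami's model): a "complex" trait requires all
# three genes at once, yet no partial combination is selected for. In
# this simplified model mutations, once acquired, are never lost, so
# the full set eventually assembles through chance alone.

REQUIRED = {"g1", "g2", "g3"}
POP_SIZE, MUTATION_RATE = 200, 0.01

random.seed(42)
population = [set() for _ in range(POP_SIZE)]

for generation in range(1, 10_000):
    parents = random.choices(population, k=POP_SIZE)   # drift only, no fitness
    population = []
    for parent in parents:
        child = set(parent)
        if random.random() < MUTATION_RATE:
            child.add(random.choice(sorted(REQUIRED)))  # a neutral mutation
        population.append(child)
    if any(REQUIRED <= genome for genome in population):
        print(f"all three genes present in one individual by generation {generation}")
        break
```

A real model would also let mutations be lost and would reward the completed trait; the sketch only shows why "each step must be useful" is not a requirement of the Darwinian algorithm.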
It is interesting to note that the men of the philosophical breakfast club reached their respective zeniths during a period of post-Enlightenment revivalism and before Darwin injected the first serious scientific basis for doubting revelation. A young Darwin features as a bit player in Snyder's book, bringing elements of foreshadowing to her story. It is also interesting that both Babbage's deduction and Darwin's induction resulted in the understanding of powerful algorithms that would together shape our world. One cannot help but think that both camps had their legitimate points.
We stand on the shoulders of these giants, as Isaac Newton famously observed of his previously unreferenced sources. We need not overly worry that they were merely human and had their faults. As Arthur C. Clarke observed, "If an elderly but distinguished scientist says that something is possible, he is almost certainly right; but if he says that it is impossible, he is very probably wrong."
Snyder ends her history with an almost paradoxical plea: that the separation of science from art - and perhaps implicitly of science from religion - be healed. The difference before and after the philosophical breakfast club was the difference before and after the industrial revolution: the age of the generalist was overtaken, rather brutally, by the age of the specialist. She makes much of Herschel's high regard for art while noting that the members of the philosophical breakfast club began the process of separation, as science moved at their behest toward a professional discipline of specialists. It seems wistful and unrealistic to desire a return of the amateur. And yet, how many scientists, engineers and medical doctors today play a musical instrument, draw, paint, write? Many if not most, in my personal experience. If anything, the results of the club's desire to see Bacon's induction at the forefront of science and Babbage's personal journey into dedicated deduction have helped usher in such a golden age of scientific discovery. It is an age when many of us have the opportunity to learn, discover, and play more than ever before.