Thursday, December 31, 2009

What once loomed large and now seems almost quaint

I think it was Calvin Coolidge who said that if you see ten problems coming down the road to meet you, the likelihood is that nine of them will fall off in a ditch before you ever have to deal with them. Fifty years ago tonight, I was at the home of my grandmother in Memphis, watching the TV news announcer count down to 1960. Within a year, President Eisenhower would announce to the nation on TV the severing of diplomatic relations with Cuba over Castro's increasing closeness to the Soviets. We worried that Russia would bomb us and had drills at school where we huddled under our desks.

Forty years ago tonight, I was at my church's parsonage in Newark with several high school buddies. Back then, we earnestly debated when the Second Coming of Christ would occur (we were all quite sure that it would be long before now) and whether Paul McCartney was really dead (as seemed to be indicated by mysterious clues on a Beatles album). The United States was still fighting in Vietnam, and my friends and I were just short of draft age.

Thirty years ago tonight, I had gone to a bar in Memphis, one that played classical music and once stood on the site of Zinnie's East, and had been invited home by a group of people at the next table. I went with them and sat up all night talking and singing along with two people who played the guitar. The Iranian hostage crisis had erupted just over a month before. As I did some last-minute Christmas shopping at Target, I saw footage of the poor hostages bravely singing "Silent Night" in their confinement; a clergyman of some Orthodox jurisdiction had been allowed in to hold a Christmas service for them.

Twenty years ago tonight, my ex-wife and I had a late supper at the Peabody. The Food and Beverage Manager was our next-door neighbor, and as we dined in the old Dux restaurant, just off the lobby, he stopped by our table to greet us. The United States had recently deposed Manuel Noriega, the drug-connected strongman of Panama. It was my son's first Christmas. The mercury in Memphis had hovered around zero on Christmas Day, and the pipes had frozen beneath our kitchen sink. The Berlin Wall had fallen a month or two before, and there was intense debate as to whether this really heralded the beginning of the end for the Soviet empire (a surmise confirmed just two years later, when the Soviet Union ceased to exist on Christmas Day of 1991) or was merely a retrenchment.

Ten years ago tonight, I sat at home recovering from the flu and nervously looking at news reports online to see whether the countries where it was already New Year's Day were suffering from the anticipated effects of Y2K. I was not one of those who thought civilization would collapse, but I did think that on a scale of 1 to 10, with 10 representing major disasters, we would suffer a 5 or 6. Having spent most of a technical writing career working in information technology environments, I fully believed in the severity of the problem and had almost no confidence in the likelihood of it being fixed in time.

Tonight, my worries are about Iran once more, climate change, Afghanistan, and the future of the Republican party.

© Michael Huggins, 2009. All rights reserved.

Sunday, October 4, 2009

A one-way ticket to Mars? You first!

Physicist Lawrence M. Krauss recently suggested, in The New York Times, that one way to cut the costs of a manned mission to Mars was to make it a one-way trip for the astronauts. After all, Krauss reasons, the original American colonists didn't expect to return to England. Krauss claims that, heartless as his proposal may sound (really, you think?), informal polls among scientists encountered in his travels show that the majority would be happy to go to Mars with no thought of return.

Which only goes to show how extraordinarily intelligent people sometimes seem to lack the sense to come in out of the rain. Krauss is at least properly skeptical of the claim that human space exploration is justified because humans can conduct scientific experiments better than robots, a claim that is probably untrue. His real argument is that we need to establish ongoing human life on Mars in case something catastrophic happens to our native planet. Considering the almost insane challenges of the Martian environment for human life, Krauss's purposes would be nearly as well served by a proposal to colonize the submerged parts of Earth's continental shelves.

No one should doubt the invaluable additions to knowledge that properly conducted scientific research on Mars would bring. Its age is similar to that of Earth, and it is the most earth-like planet in our Solar System, though the two planets' respective outcomes have been radically different. Whether liquid water exists far beneath its surface and, more intriguing still, whether biological life exists there in some primitive form are important questions for understanding our own planet's history.

But none of that is the issue here. No human could survive unaided on Mars's surface for ten seconds. Its atmosphere is as thin as Earth's at nineteen times the altitude of Denver; liquids boil and evaporate very quickly, and a human's blood would boil inside him in seconds. Mars's temperatures are generally worse than Antarctica's, while the thin atmosphere leaves the surface more exposed to the Sun's radiation than the hottest parts of the Sahara. The atmosphere is 95% carbon dioxide, and the planet is plagued by storms of red dust lasting months at a time and capable of raising dust clouds 25 miles high.

No, the original American colonists did not expect to return to England, but they did expect to hunt, fish, and farm. Mars is not a candidate for any of those things. Indeed, the very need to protect astronauts from the radiation they are likely to encounter simply getting to Mars in the first place (the shortest possible trip would take 7–8 months) might leave their transport craft too heavy to make the trip at all! Krauss acknowledges the issue, supposing a crop of astronauts arriving on Mars with their life expectancies radically cut short by radiation exposure. A promising start for establishing human life on the red planet!

Of course we have, or can develop, the technology to create habitable environments on Mars, perhaps beneath the surface. Let's suppose that, to prepare for such an eventuality, NASA constructs an artificial habitat somewhere on Earth and confines a group of male and female scientists there for some months. There is no TV, radio, or internet, and no real-time communication with the rest of humanity, only data links twice a day, as has been the case with the Mars rovers. One can't go outside without heavy protective equipment, and one may not be able to go outside for months at a time because of the fierce dust storms, whose winds can approach a hundred kilometers per hour. Oxygen and water must be manufactured, and attempts must be made to begin cultivating edible plants inside. No health care is available except for what can be provided right there. And these conditions will never change, because of the very nature of the environment itself.

I suspect the eventual human result would include murder, insanity, sexual slavery, and rationing of food, water, oxygen, and medical care by some dominant personality and his clique to enforce his will on the rest of the group.

But suppose that didn't happen, and humans somehow learned to adapt and coexist in a civilized way completely inside an artificial environment, forever. Mars still has two remaining disadvantages. Since it has so little atmosphere, it is much more vulnerable to meteor strikes than Earth, whose atmosphere burns up much of the debris from space that would otherwise wreak havoc here. Finally, Mars is a great deal smaller than Earth, so its likely future as a human outpost must be quite limited.

And lest we forget, what we know of evolution tells us that the isolation of two previously compatible groups of the same species generally results in each group developing characteristics so different that their members can no longer mate and reproduce with members of the other group. The facts of biology tell us that unless we dispatched additional colonies to Mars at regular intervals to add to its human population, there would eventually come a time when the two groups would be of no further use to each other for propagating common descendants.

We are still too haunted by the ghost of Star Trek, which showed humans boldly going, not only to places where man had never been before, but where he simply can't go, unless we discover usable shortcuts through space-time. Mars, the one planet in our Solar System where humans might have even a remote chance of establishing an outpost, has the disadvantages described above. The closest possibility of another Earth-like planet lies in the vicinity of Alpha Centauri, 4.37 light-years from Earth. Light travels nearly 5.9 trillion miles in a single year; at our current 18,000-mile-per-hour speed of space travel, it would take some 37,200 years to travel the extent of a single light-year. Which reminds me of a joke by Johnny Carson. "The space shuttle is under warranty...120,000 miles or ten seconds." I think the late lamented king of late-night television had more common sense about this issue than our physicist friend Krauss. In the dawning age of robotics, there is no more reason to send humans to Mars or any other inhospitable environment than there is to station some hapless soul 12,500 miles above the Earth's surface on a GPS satellite to make sure motorists here below can continue to find their way.
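The travel-time figures above are easy to verify with a little arithmetic. Here is a quick sketch; the 18,000 mph figure is the text's own, roughly low-Earth-orbit speed, and the only added constant is the speed of light, 186,282 miles per second:

```python
# Back-of-the-envelope check of the interstellar travel figures cited above.

LIGHT_SPEED_MPH = 186_282 * 3600      # speed of light, converted to miles per hour
HOURS_PER_YEAR = 24 * 365.25
SPACECRAFT_MPH = 18_000               # the text's figure for current spaceflight

# Distance light covers in one year (the "nearly 5.9 trillion miles")
miles_per_light_year = LIGHT_SPEED_MPH * HOURS_PER_YEAR

# Years for a spacecraft at 18,000 mph to cover one light-year;
# note this reduces to the simple ratio of the two speeds
years_per_light_year = miles_per_light_year / SPACECRAFT_MPH / HOURS_PER_YEAR

# Alpha Centauri lies 4.37 light-years away
years_to_alpha_centauri = years_per_light_year * 4.37

print(f"One light-year: {miles_per_light_year:.2e} miles")
print(f"Years per light-year at 18,000 mph: {years_per_light_year:,.0f}")
print(f"Years to Alpha Centauri: {years_to_alpha_centauri:,.0f}")
```

At the speeds we can actually manage, in other words, even the nearest star system lies more than 160,000 travel-years away.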

© Michael Huggins, 2009. All rights reserved.

Thursday, October 1, 2009

The buck...well, bounces around a great deal

Anita Tedaldi, a military wife and mother of five daughters who has made a name for herself blogging about motherhood, gave up her adopted 18-month-old son when she realized she just didn't feel all that close to him. She told her story to Lisa Belkin of The New York Times, who also appeared with her when Tedaldi was interviewed on The Today Show. Apparently encouraged by her exposure to the world of journalism to be even-handed, Tedaldi gently informed her audience that the failure to bond "really went both ways." Well, I'm all for holding kids accountable, certainly.

There is the awkward matter of Tedaldi having outspokenly criticized another adoptive couple, in print, for doing pretty much the same thing just last year, but, as Lincoln once observed, "The dogmas of the quiet past are inadequate to the stormy present." Meanwhile, the U.S. military, which owns the website on which Tedaldi's earlier article appeared, is obligingly treating the matter much as it did the death of Pat Tillman; the text is no longer there.

A couple of years ago, I read the troubling story of a single mom in England who adopted an African girl about the same age as the mom's biological 7-year-old daughter. If her account is to be believed, she did everything she could to welcome the adopted child and blend her into the family, to no avail. Eventually, the adopted girl's hostility, not only toward the mother, but even more so against her adoptive sister, reached a point at which the mother feared for her biological daughter's safety. With tremendous reluctance and chagrin, she made the decision to give up the adopted child. Perhaps there was nothing else she could do.

I certainly don't wish for little "Baby D," as Tedaldi refers to her adopted son, to grow up in a house where his closest caregiver is continually judging his bonding skills and finding them wanting; he deserves better, and I hope he is placed in an emotionally healthy home. I could even respect Tedaldi if, chastened by her experience, she took time off from blogging about motherhood for a period of reflection. But we must be realistic; book deals and appearances on Oprah wait for no one. Who knows but that one day the little tyke may pen his own book about "Mommy T" and the strange mismatch between her blogging skills and her nurturing abilities.

This week's other poster child for forgiving one's own mistakes and blowing off the stodgy critics is Roman Polanski, on whose behalf over 100 luminaries of the entertainment world have signed a petition demanding his immediate release from custody, following his recent arrest in Switzerland. These include Woody Allen, whose nude photos of Mia Farrow's adopted daughter broke up his long-time partnership with Farrow, and the noted moral philosopher Harvey Weinstein, who can see more clearly than most of us that Polanski was sufficiently punished for his "so-called crime" with a 30-year inability to attend Hollywood parties.

As is well known, Polanski accepted an unchaperoned visit from aspiring 13-year-old model Samantha Gailey at the home of Jack Nicholson (never mind!) in 1977, photographed her nude, plied her with champagne and quaaludes, and then sexually assaulted her, ignoring her repeated protests and requests to leave.

No one but Hollywood libertines is in serious doubt as to the hideous nature of Polanski's actions that night. Yes, I know future Chief Justice John Marshall started courting his future wife when she was 14 and Marshall was 26, but that was in a day when Marshall would have been shot by her outraged father had he so much as kissed her and not followed through shortly after with a trip to the church to make good. And it may be that 15-year-old Nastassia Kinski acted with perfectly free choice upon beginning a sexual liaison with Polanski; frankly, if I had a maniac like Klaus Kinski for a father, I too might find even Polanski's company a desirable alternative.

Polanski's actions with Gailey, in any case, were completely beyond the pale, and he was rightly convicted. The moral issue is clear. What is tangled is the legal issue, an entanglement caused by the egregious misconduct of the late judge Laurence Rittenband, who first approved, and then gave every indication of intending to renege on, a plea bargain supported by the victim's own family. Rittenband seems to have done this, moreover, on the advice of a District Attorney who wasn't even involved in the case, itself an instance of judicial misbehavior. In desperation, Polanski fled the court's jurisdiction and then went abroad, which was another crime added to the one for which he had already been convicted.

If Polanski's celebrity status should not win him special treatment, neither should it have made him the special victim of a judge's personal pique, in violation not only of judicial ethics but of an agreement that the victim and her family had acknowledged was in her best interests. The larger legal issue is whether, having reached a court-approved plea bargain, a defendant for any crime, at any level of wealth or social prominence, should have to wonder if the court will honor its own agreement or decide, on a whim, to suddenly "get tough."

Polanski is apparently an unrepentant reprobate, and one could wish to see him humiliated and abused as his victim was that night all those years ago. But the law should serve justice, not become an instrument of popular revenge. If the court wanted his hide, it should have rejected the plea bargain and insisted on imposing the maximum sentence to begin with. If a foolish, publicity-hungry judge can do this to a celebrity, what might he do to any of us? Polanski's original sentence was for time previously served; to this, a reasonable penalty of additional time should be added for having fled legal jurisdiction.

© Michael Huggins, 2009. All rights reserved.

A bit different from "Jingle Bells"

I've been listening to one of my favorite pieces from Bach's Christmas Oratorio: Schlafe, mein Liebster ("Sleep, my dearest," imagined as a melody sung by the shepherds to the Christ Child). What if they played this in department stores? The lengthy and sustained development of the theme, so demanding on the baroque soloist, might just about equal the length of one's wait to be checked out by a sales clerk, and meanwhile, beguiled by the peaceful and contemplative mood induced by the music, you might forget to buy anything at all!

I think Macy's set a record of sorts about three years ago by starting to play Christmas music around September 10th. Of course, denouncing commercialism at Christmas is a favorite trope the world over, but I was startled to learn, a few years ago, that Rudolph, the Red-Nosed Reindeer was created by a man whose wife was dying of cancer at the time and who sought to divert their small daughter. Robert May, a copywriter for Montgomery Ward department stores, was asked to write a promotional storybook as a gimmick and came up with Rudolph. His only hesitation was that the image of a red nose was popularly associated with drunkenness, but he had an artist friend sketch a deer with a shiny nose, which sold his employers on the idea. Eight years later, he persuaded the store to assign the rights to him, so that he could discharge the medical bills left over from his wife's death.

© Michael Huggins, 2009. All rights reserved.

Monday, September 28, 2009

Global dimming?

How ironic that Josiah Franklin wanted his son Benjamin, whose experiments with electricity would one day make him famous, to follow Josiah's own trade of soap and candle maker, removing his son from school for that purpose when Ben was just ten years of age. As a child, fascinated by the Founding Fathers, I sometimes regretted that I had not lived in the 18th century, but as someone born in the 20th century, I am too used to the conveniences of bright light. A Christmas Eve candlelight service is all very well, but imagine having that kind and degree of light as one's sole illumination all the time. This scene from Kubrick's 1975 film Barry Lyndon gets the look about right. I can understand how Dr. Johnson had to sit so close to his candle for reading that he would singe his wig; what seems nearly incredible is the idea of men and women of that day reading and playing cards by the hour without going nearly blind.

I thought of that when I read the following from Amy Myers Jaffe of The Economist, quoted in the current issue of The Week:

To replace the global energy produced today by fossil fuels, we would need to build 6,020 new nuclear plants across the globe, or to produce 133 times the combined solar, wind, and geothermal energy currently harvested. Barring such a “monumental” transformation, we’re stuck with oil—or with “walking.”

Or candles. The problem is that it takes many candles to equal the illumination of a single bulb, and candles emit more carbon for the light they give.
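To put rough numbers on "many candles," here is a sketch; the lumen figures are my own assumptions rather than the text's (roughly 13 lumens for a wax candle, 800 lumens for a common 60-watt incandescent bulb):

```python
import math

# Rough estimate: how many candles does it take to match one bulb?
# Assumed luminous outputs (not from the text):
CANDLE_LUMENS = 13    # typical wax candle
BULB_LUMENS = 800     # typical 60 W incandescent bulb

candles_per_bulb = math.ceil(BULB_LUMENS / CANDLE_LUMENS)
print(f"Candles needed to match one 60 W bulb: about {candles_per_bulb}")
```

On those assumptions, one ordinary bulb is worth upwards of five dozen candles, which is why Dr. Johnson singed his wig.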

It gets worse, and more ironic. The same issue of The Week quotes The New York Times as saying that

To satisfy the exploding worldwide electricity demand caused by flat-screen TVs, game consoles, personal computers, and other gadgets, nations will have to build the equivalent of 560 coal-fired power plants, or 230 nuclear plants, over the next two decades. The average American now owns 25 electronic products.

I read once that if the whole world enjoyed the American standard of living, it would take the resources of three planet earths to support such consumption. Now imagine the world going dark for the sake of the Xbox, Twitter, and flat screen TVs!

And speaking of differences between the 18th century and our own, if an educated man of that day could be resurrected in ours and read the following, which opened an auto review that I read this evening, I think he would quickly ask to be reentombed:

Retirees love Cadillac’s flagship DTS, and the CTS goes up against sporty European rivals, but the SRX is taking on the Lexus RX 350 in the crossover SUV market.

© Michael Huggins, 2009. All rights reserved.

Sunday, September 27, 2009

A single spot on a pristine surface

One doesn't know whether to laugh or cry upon finding the word euphemisms misspelled as "euphamisms" in The New York Times obituary of William Safire. The writer, Robert D. McFadden, is not likely to have responded to my profile on dating websites, back when I used them; but his sisters-in-kind used to write that they enjoyed "quite" evenings at home, which immediately caused me to stop reading.

Safire himself is probably laughing at McFadden's gaffe. It reminds me of Garrison Keillor, whose Writer's Almanac program I greatly enjoy, telling us every April 9 about Lee and Grant meeting "at the Appomattox Court House" to negotiate the surrender of the Army of Northern Virginia. They met not in a courthouse but in the parlor of Wilmer McLean's house; Appomattox Court House was simply the name of the village where it stood!

On the other hand, I liked it very much when Keillor described one of the last acts of William Shawn, longtime editor of The New Yorker:

"Four days before he died in 1992, Shawn had lunch with Lillian Ross, and she showed him a book cover blurb she had written and asked if he would check it. She later wrote of that day, "He took out the mechanical pencil he always carried in his inside jacket pocket, and ... made his characteristically neat proofreading marks on a sentence that said 'the book remains as fresh and unique as ever.' He changed it to read, 'remains unique and as fresh as ever.' 'There are no degrees of uniqueness,' Mr. Shawn said politely."

I had not known that Safire arranged Nixon's famous "kitchen debate" with Khrushchev and had forgotten that, as a former White House speechwriter, he was apparently the source of the phrase made famous by Spiro Agnew: "Nattering nabobs of negativism."

The ability to convey a great deal with a few well-chosen words is all too rare. I greatly admire the opening paragraph of James Gleick's biography of Sir Isaac Newton:

Isaac Newton said he had seen farther by standing on the shoulders of giants, but he did not believe it. He was born into a world of darkness, obscurity, and magic; led a strangely pure and obsessive life, lacking parents, lovers, and friends; quarreled bitterly with great men who crossed his path; veered at least once to the brink of madness; cloaked his work in secrecy; and yet discovered more of the essential core of human knowledge than anyone before or after. He was chief architect of the modern world. He answered the ancient philosophical riddles of light and motion, and he effectively discovered gravity. He showed how to predict the courses of heavenly bodies and so established our place in the cosmos. He made knowledge a thing of substance: quantitative and exact. He established principles, and they are called his laws.

The prose is lithe and supple; Gleick's longest sentence, the second, is a series of short clauses and is immediately followed by the elegant eight-word "He was chief architect of the modern world." He uses three-word series but does not overuse them, as Samuel Johnson sometimes did. In 142 words, only 15 are of three syllables or more, and only two (quantitative and philosophical) are of four syllables or more. The tone is intimate and yet somehow commanding, making the reader feel as if he has encountered something important that ought to require his full attention.

C.S. Lewis would have quarreled fiercely with Gleick's assertion that knowledge became "a thing of substance" by being "quantitative," but he shared Gleick's talent for powerful writing, intelligent and yet accessible to the general reader. As someone who no longer agrees with Lewis's beliefs in the most important things, I was still startled, this past week, to hear an intelligent friend who shares Lewis's faith remark that Mere Christianity was so written that he couldn't really consider it very clear or engaging. Out of curiosity, I reread it for the first time in about 15 years, surprised to find it available online in PDF format.

I remembered Lewis's flaws well enough but had forgotten some of his strengths. I can't finally agree with the book but found it quite gripping in places. As to writing style, Lewis knew the value of pithiness as much as anyone, but he was still a very perceptive critic of that quality used for meretricious ends. I have often enjoyed, and still agree with, his comments on Bacon's Essays from Lewis's English Literature in the Sixteenth Century:

"Even the completed Essays of 1625 is a book whose reputation curiously outweighs any real pleasure or profit that most people have found in it, 'a book' (as my successor admirably says) 'that everyone has read but no one is ever found reading.' The truth is, it is a book for adolescents. It is they who underline (as I see from the copy before me) sentences like 'There is little friendshipe in the worlde, and least of all betweene equals': a man of 40 either disbelieves it or takes it for granted. No one, even if he wished, could really learn 'policie' from Bacon, for cunning, even more than virtue, lives in minute particulars. What makes young readers think they are learning is Bacon's manner; the dry, apophthegmatic sentences, in appearance so unrhetorical, so little concerned to produce an effect, fall on the ear like oracles and are thus in fact a most potent rhetoric...."

In another place (I thought it was here but can't find it), Lewis says something like "Nothing could be less practical than the desperate practicality of Bacon's maxims." Actually, it may be just as well that poor Lewis has gone on to his reward; were he alive today, so far from complaining about the too-great attraction of the Essays for adolescents, he would be chagrined to learn that high school students are now considerably less likely than those of his own day to appreciate or even comprehend Bacon's work on any grounds. In fact, I had forgotten until writing this that one night, about 10 years ago, some young person instant-messaged me on AOL out of the blue, having seen the word "writer" in my profile, and wondered if I could help her understand Bacon's essay "Of Truth," which she had been assigned as homework. With more charity than sense, I gave her a long explanation of it, at the end of which she said, "Yes, but what does it mean?"

© Michael Huggins, 2009. All rights reserved.

Monday, March 2, 2009

What shall it profit a man, if he gains the world and has no clue?

I won't say I'm glad I'm not wealthy. Samuel Johnson once rebuked his long-suffering friend, Hester Thrale, for repeating a rather fatuous moral from David Garrick: "I'd smile with the simple and feed with the poor." Johnson rightly replied, "Nay, madam, I'd smile with the wise and feed with the rich," although he might have added that one doesn't always find both in the same place.

When West Virginia businessman Jack Whittaker won his record lottery payout in 2002, I noticed that his granddaughter's name was Brandi Bragg, and since Bragg is my maternal grandmother's maiden name, I facetiously suggested that my family call Whittaker and tell him we were long-lost cousins. Of course, his own buffoonish decline since and poor Brandi's pathetic end just two years later are a cautionary tale about the influence of riches without a sufficient sense of purpose.

On the two occasions in my own life when I suddenly acquired rather sizable cash windfalls, I noticed that, if anything, my new and temporary fortunes only made me even more irascible and peremptory than usual, so I'm not sure that wealth would be my best condition. I would at least want to avoid the frame of mind of the two museum-goers that I read about in The American Scholar 20 years ago: seeing some priceless artifact, an Etruscan drinking vessel or the like, one expressed her admiration for it, while the other replied, "Yes, but if I bought it, where would I put it?"

I was reminded of this today on reading Peter Plagens' survey of the impact of the recession on the world of contemporary art, published a week ago in Newsweek:

Right up until last September, even the greenest postgraduate painter showing for the first time in a barely reputable gallery was asking—and getting—$10,000 to $20,000 per picture. The number of still-living (not to mention merely middle-aged) contemporary artists commanding a cool million dollars for a single work at auction is edging toward 100. Anecdotes about art-world excess are legion. A collector at an art fair was shown a previously undiscovered canvas by a midlevel abstractionist from the 1960s and told that the price was under $100,000. "Well, I suppose I could enjoy that," she said to the dealer, "if I were poor."

Well, we must all keep up our standards, certainly.

I would like to have enough money not to worry about whether Courvoisier is costing me too much, to buy books from the 17th, 18th, and 19th centuries, and to collect black-and-white photography and antique clocks. I don't know if this will ever happen. But I certainly want to avoid the trap of the "vanishing wealthy" whom I heard interviewed a few years ago in an investigative feature on NPR.

The reporter, an enterprising young woman, asked who was considered financially well off in today's society. First, she approached a two-paycheck couple in a nice suburb in Northern New Jersey, with a combined income of $100,000 (which is most certainly not wealthy, especially if you're raising a family). The wife replied, "Well off? Good heavens, no! Sure, we have our kids in private schools and give them music and ballet lessons, but we live from paycheck to paycheck."

Next, the reporter approached a businessman in the same area with an income of $325,000. "Well off?" the man exclaimed, clearly annoyed. "Are you kidding me? You should see the taxes I pay!"

Finally, the reporter talked her way into a four-story brownstone, the single-family home of a $1 million-a-year investment banker and his wife and children. "Wealthy?" the wife reflected. "Well no, not really. I actually don't have that many designer dresses in my closet."

The reporter was almost speechless. "Well if you're not wealthy," she exclaimed, "who in the world is?"

The wife replied, "Oh, I don't know—perhaps someone with $20 or $30 million."

For another interesting look at the contemporary art world, see Who the #$&% Is Jackson Pollock?, the story of a feisty, truck-driving grandma from the Midwest who became convinced that her $50 thrift shop purchase was a genuine lost Pollock (a painter of whom she had never heard). Needless to say, the art world, including still-living patrons and collectors who had known Pollock, was having none of it.

© Michael Huggins, 2009. All rights reserved.

Tuesday, February 10, 2009

No Rime for the Ancient Naturalist

Why did Darwin lose his taste for poetry late in life?

That may seem like a strange question in our day when, for many of us, the chief experience of poetry was memorizing Paul Revere's Ride and reciting it in grade school. Darwin, of course, born 200 years ago, grew up in a culture where daily exposure to the poetic cadences of the King James Bible and the Anglican Book of Common Prayer was standard, where acquaintance with the works of Shakespeare and Dryden was expected of any educated man and the poetry of Lord Byron was fashionable. Former President John Adams, in his 80s when Darwin was a child, reread the complete works of Shakespeare every year; just 35 years ago, veteran journalist Arthur Krock recited Thackeray's whimsical Ballad of Bouillabaisse for a young visitor, having read it once 50 years previously.

One expects the senses and appetites to diminish with age but, one hopes, never the taste for art, music, or poetry. One need not argue that Darwin's scientific interests were a bar to appreciation of the Muse; it was on the voyage of the Beagle that he took along a volume of Milton and read through Paradise Lost.

Yet in his old age, Darwin penned this forlorn confession in his autobiography:

Up to the age of 30 or beyond it, poetry of many kinds…gave me great pleasure, and even as a schoolboy I took intense delight in Shakespeare…. Formerly pictures gave me considerable, and music very great, delight. But now for many years I cannot endure to read a line of poetry: I have tried to read Shakespeare, and found it so intolerably dull that it nauseated me. I have also almost lost any taste for pictures or music.… I retain some taste for fine scenery, but it does not cause me the exquisite delight which it formerly did.… My mind seems to have become a kind of machine for grinding general laws out of large collections of facts, but why this should have caused the atrophy of that part of the brain alone, on which the higher tastes depend, I cannot conceive.… The loss of these tastes is a loss of happiness, and may possibly be injurious to the intellect, and more probably to the moral character, by enfeebling the emotional part of our nature.

Of course, he was still in his twenties when he made his voyage. This loss of aesthetic enjoyment apparently became part of the Darwin legend, especially after the publication of some of his private correspondence, to the point where his son, William, felt compelled to deny it at a Darwin Centennial gathering in 1909.

Unacquainted with any but the bare outline of Darwin's life and career, I knew none of this—the early appreciation of poetry and its subsequent loss—until I read today that Darwin's great-great-granddaughter, Ruth Padel, had published a poetic biography of her distinguished ancestor. The review, in that noted journal of the arts, The Economist, was favorable, though I was keeping my fingers crossed, imagining filiopietistic hagiography buttressed with bad verse (how many rhymes are there for fossil?).

Fortunately, it seems I was wrong, if this excerpt from her book is typical:

The deck is dazzle, fish-stink, gauze-covered buckets.
Gelatinous ingots, rainbows of wet flinching amethyst
and flubbed, iridescent cream. All this
means he's better; and working on a haul of lumpen light.

Polyps, plankton, jellyfish. Sea butterflies, the pteropods.
'So low in the scale of nature, so exquisite in their forms!
You wonder at so much beauty - created,
apparently, for such little purpose!' They lower his creel

to blue pores of subtropical ocean. Wave-flicker, white
as a gun-flash, over the blown heart of sapphire.
Peacock eyes, beaten and swollen,
tossing on lazuline steel.

Whatever her other accomplishments, Padel is certainly a poet.

But what of her poor great-great-grandfather? The religious critic has no trouble seeing, in Darwin's loss, a just requital of his supposed offense against faith; the man whose works imply denial of Divine creation ends by seeing part of his own humanity wither away.

Even in a less orthodox context, I can imagine Coleridge casting a Darwin-like figure as a ruthless hunter whose scientific inquiry, like a crossbow, transfixes and kills the creatures he studies, making everything dead and dry in proportion to his knowledge.

I don't know what happened. I would like to think that, like the man utterly convinced of a fact that consumes and shapes his entire being, as described in Emerson's essay on Character, he came to need nothing else but this knowledge that changed everything—yet he describes himself as having suffered a loss. I would like to think that, like the yogis who achieve the state of nirvikalpa samadhi, described as an entrance into unitary consciousness from which the adept never returns, he reached a point where any works of the imagination seemed feeble and derivative compared to the reality that he had come to know intimately through his studies.

But Darwin himself describes his state as unhappy, and in any case, all this is uninformed speculation on my part.

I am certain that if rejection of religious faith led to desertion by the Muse, we would have a hard time accounting for the poetic power of A.E. Housman, whose unbelief was, if anything, even more definite than Darwin's own. I have read no biographies of Darwin and have nothing but the evidence of his own words; if we must take them at face value, I am sorry for what happened to him but grateful for his unrelenting pursuit of the truth; as Dobzhansky famously put it, "Nothing in biology makes sense except in the light of evolution."

© Michael Huggins, 2009. All rights reserved.

Tuesday, January 20, 2009

It was the minor moments that counted

One of the very best things about today's inaugural ceremony was the closing prayer by Rev. Joseph Lowery, a veteran of the civil rights struggles of 40 years ago. Lowery, who has more gravitas in his little finger than the simpering Rick Warren does in his entire body, gave an eloquent benediction that made one mercifully forget the clumsy "poem" by Elizabeth Alexander that preceded it. His prayer was the most honorable and dignified presentation possible of the new President's commitment to govern the nation by the ideals of his faith, and, at the end, it erased Warren's comically condescending attempt to be inclusive of Jews and Muslims.

As to dignity, I don't know what possessed the Chief Justice of the United States, who is my age, to act like the president of a local high school student council, overwhelmed at the opportunity to be present at a grand event, misquoting the oath of office to the point that Obama, self-possessed as always, was reduced to waiting in dignified silence until the Chief Justice got it right. I can only hope that Roberts, who seems to have a well-deserved reputation as a distinguished jurist, admired by right and left alike, is better at conducting sessions of the Supreme Court. Speaking of the Supreme Court, it was interesting, as Aretha Franklin ascended the podium, to see the brutish mug of Antonin Scalia right behind her, staring out at the world with his customary look of belligerence and self-complacency.

Warren, who doesn't belong within 10 miles of any occasion to which the words "grand" or "solemn" might be attached, reminds me of someone who intends to sign me up for a multi-level marketing plan and, when he learns that I prefer reading, assures me, with a wink and a nudge, that he can probably get me a good deal on a set of Reader's Digest Condensed Books (so you can get through them faster!). His prayer did, indeed, contain some good things about the hopes and struggles of the American people, but it was destroyed by the cringe-inducing climax, in which he said "I pray this in the name of the one who changed my life, Yeshua, Isa, Jesus," etc. Technically, one can't fault a Christian minister for offering a prayer in the name of Jesus, which is all but a formal theological requirement (although fellow-Protestant Lowery simply ended with "Amen"), but to assume, as Warren must have, that he would somehow make Jews and Muslims feel better by including Jesus's Jewish name or the name by which he is referred to in the Koran (where, of course, he is referred to as a prophet only and not worshipped as divine) was astonishing in its fatuousness. There are times, as Warren perhaps has yet to learn, that the best way to show awareness of something is a prudent silence.

Obama himself gave a competent and workmanlike speech, as he always does, though little in it rose to the level of anything that could be called inspirational, and I can only assume that he had let Al Gore's speechwriter contribute a phrase or two when he ran into that clumsily worded patch in which he said "These things are subject to data, statistics, and analysis"—good God! It's probably a good thing the statue of Lincoln sitting in the Memorial down the Mall could not come alive at that point, or he might have uttered something hardly in keeping with the decorum of the occasion—or, better still, spat a marble gob of tobacco juice into the Reflecting Pool to give that part of the speech a fitting response. I turned the TV off after about 12 minutes, reflecting that watching Obama speak reminds me of what Emerson said about the elder William Pitt: "It was said of the Earl of Chatham that there was something finer in the man, than in anything he said." Obama inspires, all right, but it is by the impression he makes, more than by what he says. There was more applause when he appeared than there was during the speech itself (indeed, the camera caught his brother-in-law suppressing a yawn as he sat behind him!). Nevertheless, he said one thing, at least, that was extremely important: that we as a nation repudiate the belief that we must sacrifice our ideals for the sake of security.

Aretha Franklin's appearance was symbolically important, but the measured, majestic 18th-century musical phrasing of "My Country, 'Tis of Thee" hardly suited her rather informal performance style. For my money, one of the best parts of the ceremony was the brief instrumental ensemble of Yo-Yo Ma, Itzhak Perlman, pianist Gabriela Montero, and clarinetist Anthony McGill, performing an arrangement by the famous film composer John Williams of themes from Aaron Copland's Appalachian Spring. Once again, today's arrangement wasn't exactly right—Williams had a fine opportunity, which he seems to have missed, to include a theme built on a black spiritual—but the performance seemed a musical reflection of how our new President seeks to present himself and his proposed government: cool, simple, elegant, direct, drawing from history but arranging the themes in new ways, a blending of different voices, a performance executed without flaw. It seemed to me that it was that performance, as much as his own inaugural address, that set the standard by which he will be judged.

© Michael Huggins, 2009. All rights reserved.

Monday, January 19, 2009

If not now...

For years, I never thought I would live to see the day when Leningrad would be called St. Petersburg once more. If my expectations of an African-American President were not quite so dismal as my hopes for the fall of Communism, they were at least projected into an ever-receding future of perhaps 30 to 50 years. The last time I watched a Presidential inauguration on television, in 1961, the Civil Rights Act had not been passed, and the University of Mississippi had not been integrated. Many adults I knew regarded Dr. Martin Luther King, Jr. as a dangerous radical.

I remember that in 1988, George F. Will suggested that the Republican Party nominate Colin Powell for Vice-President and steal a march from the Democrats, but that opportunity was forfeited by both parties. (Senator, you were no Colin Powell!)

Tomorrow's inauguration comes 200 years after the birth of the Great Emancipator, 120 years after the death of a sad and unrepentant Jefferson Davis, 100 years after the founding of the NAACP, about 70 years after FDR nominated Benjamin O. Davis as the first African-American general in the U.S. military (his son, Benjamin O. Davis, Jr., later became the first black Air Force general), and roughly 100 years after Teddy Roosevelt outraged many Southerners by having Booker T. Washington to the White House as his dinner guest. When Roosevelt had visited Memphis, in 1902, he had spoken at Church Auditorium, built several years earlier by millionaire black Memphis businessman Robert Church Sr., since local laws forbade him and his fellow blacks to use city parks and other facilities.

Writing in today's New York Times, Henry Louis Gates and John Stauffer argue, quite plausibly, that Lincoln himself, a man of his own time, would likely have been horrified by the thought of the government of the United States being entrusted to a black man. I agree. As the article points out, Lincoln casually used such terms as "Sambo," "Cuffee," and "nigger," and addressed Sojourner Truth as "Aunty." On the eve of issuing the Emancipation Proclamation, he invited black leaders to meet with him and discuss the possibility of founding a black republic in Central America to which freed slaves would be urged to emigrate. Like the author of the words "All men are created equal," Lincoln regarded racial equality as incompatible with a stable system of government.

Having said that, Lincoln should be honored, not only for political measures, but for his own efforts to transcend the attitudes of his day and stretch his understanding of the possibilities between whites and blacks, as he did, for instance, in cultivating a personal friendship with his contemporary, the charismatic black spokesman Frederick Douglass. Nor was he alone; even former Confederate General Nathan Bedford Forrest, who had become notorious for the slaughter of black troops at Fort Pillow, attended an Independence Day picnic in Memphis as the invited guest of black organizers in 1875, 2 years before his death. Admitting privately after the event that he had been quite uncomfortable, the former slave trader addressed the gathering and said that he was ready to offer the hand of friendship and assist the black man in achieving any station in life to which his talents entitled him. For the founder of the Ku Klux Klan to utter such words was like walking a thousand miles, and I doubt that any of us today, having been raised in this more inclusive age, have progressed as far in our own attitudes about race.

© Michael Huggins, 2009. All rights reserved.

Monday, January 12, 2009

Google Warming

I'd heard about the losing battle that newspapers are fighting to stay afloat as they steadily lose ad revenue to the net, and perhaps that accounts in part for the article in yesterday's Times of London that painted Google as a carbon-spewing behemoth. But I see the article is actually carried in their online edition, so perhaps they are daring Google to block searches for their damning report and thus incidentally redeem itself from the sin of environmental spoliation. Or perhaps, since they draw a comparison between a Google search and the homely English cup of tea, it's a belated rebuke, 235 years late, to our disrespect of their favorite brew in Boston Harbor long ago.

In any case, the article quotes Harvard physicist Alex Wissner-Gross as saying that two Google searches generate about as much carbon as brewing a single cup of tea, about 15 grams, and of course, if we multiply that by all the millions who are looking up baseball scores, their favorite celebrities, or their horoscopes online, instead of brewing a cup of tea and opening a paper copy of The Times, the worldwide impact could be severe. If Samuel Johnson could be made to understand the issue here, he, at least, would feel vindicated, since he is said to have consumed tea by the basinful after he gave up wine and liquor. On the other hand, he peopled his attic with a swarm of scribes to help him mark and copy passages from books for use in illustrating the definitions in his Dictionary. Presumably, he supplied them with candles, so perhaps their labors were not more energy-efficient than if they had used Google.

To be sure, the computer industry worldwide is not lagging in its efforts to burn kilowatt hours and is said to account for about 2% of the planet's carbon emissions, roughly the same as world aviation. On the other hand, since greater energy use adds not only to the deterioration of the materials in networks and computers but to their operating costs, it is in the interest of information technology to discover better materials, more rational designs, and greater energy efficiency. For that matter, the growing capacity of smart phones is leading us to a future in which much of what is done from desktop or laptop computers today will take place in the palm of the user's hand instead. Meanwhile, the consumer who shops online, the reader who reads online, and the traveler who goes online to find the quickest routes and lowest prices all save energy compared to older and more conventional means of accomplishing the same things.

Google published a response that disputes Wissner-Gross's figures and asserts, instead, that the carbon involved in a Google search is many times smaller, involving about the same amount of energy as the human body burns in a tenth of a second. Indeed, Google says, its networks are so efficient that the seeker's journey to the desired information consumes less energy than the computer sitting on one's desk.

Meanwhile, the Washington Post's TechCrunch feature notes that the average book is responsible for 2,500 grams of carbon (Ayn Rand, you should have trimmed some of John Galt's speech in Atlas Shrugged!), while a cheeseburger accounts for an unseemly 3,600 grams, not to mention what it does to the profile and the digestion.
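Taking all these quoted figures at face value—Wissner-Gross's estimate that two searches equal one 15-gram cup of tea, and TechCrunch's numbers for books and cheeseburgers—a quick back-of-the-envelope sketch in Python puts them side by side (the script and its groupings are my own, not anything from the articles):

```python
# Grams of CO2 per item, as quoted in the articles discussed above.
# The per-search figure assumes two searches = one 15 g cup of tea.
co2_grams = {
    "Google search": 7.5,
    "cup of tea": 15,
    "average book": 2500,
    "cheeseburger": 3600,
}

# Express everything in cheeseburger-equivalents, the largest offender.
burger = co2_grams["cheeseburger"]
for item, grams in co2_grams.items():
    print(f"One cheeseburger = {burger / grams:g} x {item} ({grams:g} g each)")
```

By these numbers, a single cheeseburger accounts for as much carbon as 480 searches or 240 cups of tea—which rather suggests the lunch counter, not the search box, is where the savings lie.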

The London Times article certainly has a point in saying that a great deal of time and energy is wasted by sharing with the world what would have been confined to one's diary a century ago ("Walked the dog; it was hot today; rosebushes are not doing well," etc.). Indeed, that's one of the reasons that I've never seriously considered carrying a cell phone and would have little interest even if I had more room in my budget: I refuse to become part of an enterprise in which millions worldwide pay to say at a distance what was never worth saying face to face in the first place. Air quality isn't the only issue here; we also have noise, thoughtlessness, and what I'll call spiritual pollution to reckon with. Still, Google represents a force that, used wisely, ought to increase our efficiency and make knowledge more widely available, and at a lesser cost.

© Michael Huggins, 2009. All rights reserved.