Wednesday, December 31, 2008

Get thee behind me

Anxious to know when he would see his mistress once more, Benvenuto Cellini tells us in his autobiography that he went to the deserted ruins of the Colosseum late at night with a friend, to consult the spirits of the dead on the happy event. The spirits apparently manifested themselves in such a frightful way that Benvenuto, beside himself with fear, confessed that he made "the noise of a thousand flatulent trumpetings." The sonnet that opens his interesting book reminds us that "the wind, it beareth man's thoughts away," but unfortunately it faces a tall order in disposing of the methane of cattle—it seems that the average cow emits 500 liters of methane every day—and since methane has 20 times the heat-trapping properties of carbon dioxide, we may have Clarabelle to blame if the time comes when Miami can only be viewed from above, through glass-bottomed excursion boats. The government is considering the possibility of a tax on cow emissions, a measure that might be aptly named "Cork and trade," and of course farmers are very displeased.

Perhaps it was solicitousness for public opinion on that proposed measure that caused our Memphis Pink Palace Museum to offer an educational exhibit on animal waste (I'm not kidding). The flyer I received last night invited me to a members-only preview to "Get the Scoop on Poop: learn the science of what animals leave behind." Actually, I feel I have sufficient education in that already from my neighbors' two unsupervised dogs; indeed, one evening, having stepped unawares in their leavings while walking to my car to go to a movie, I spent much of the show wondering why the dolt sitting a few seats away couldn't practice basic hygiene before going out in public and actually considered moving to another seat to escape the smell, which just goes to prove the adage, "Wherever you go, there you are."

As to the exhibit, yes, I know that important scientific knowledge may be gleaned from feces, and I have listened to jokes about what bears do in the woods, but I never expected to be invited to ponder such matters in a museum. To ensure that members would find the prospect sufficiently attractive, the flyer helpfully noted, "Light refreshments will be served," which, for sheer tone-deafness, reminds me of the honest comment once made by a courageous young model: "When you're in Playboy, you really get a lot of exposure."

The sights, though fortunately not the smells, of nature were on silent and austere display this afternoon when my son and I drove to the William B. Clark Nature Preserve near Rossville, Tennessee. Several hundred acres of river bottomland are protected from development and may be viewed by walking a third of a mile along a sturdy boardwalk that takes you into the heart of the swamp. But for the gentle swaying of the trees, bare in winter and standing like unlit candles in the afternoon sun, the only movement we saw was a solitary hawk wheeling overhead. No planes, no wires, no voices. At first, I wondered why Mark kept taking out his cell phone, but I now realize it was to photograph the quiet beauty of the scene.

© Michael Huggins, 2008. All rights reserved.

Tuesday, December 30, 2008

Hast thou philosophy, shepherd?

It was my father who went to college, but it is my mother whom I always remember absorbed in a book. Mom did not go to college until she was in her 30s. Dad was intelligent and well-spoken, but for him, the purpose of knowledge was to learn useful things or guide his thoughts in the right paths. For Mom, reading was a key to asking why things were this way instead of another.

That difference appeared again a few years ago when our new Memphis Public Library building was dedicated, and local citizens were outraged to discover that among quotes from famous authors etched into the pavement near the Library's entrance, there were some from authors of whom they didn't approve, including Marx. One outraged citizen wrote to the local newspaper in protest, declaring, with perfect sincerity, that a library was supposed to be "a place of indoctrination."

For all I know, the person who wrote that absurdity held a college degree, though it didn't save him from completely misunderstanding the whole educational enterprise. Indoctrination is instruction in a prescribed set of norms that are not meant to be disputed; training is the impartation of facts, principles, and techniques meant to be mastered by rote, though that mastery may eventually lead to insights over and above the mere body of material that the student originally learned. Education, to be sure, builds on facts—there's not much point in discussing the effects of European discovery of the New World if one doesn't know when Columbus came over—but it is more than that. Education takes facts and teaches students to think. And that is really the problem.

This has nothing to do with whether most people could cultivate contemplative and analytical habits of mind if they wished; it is to reflect, instead, on the fact that the willingness to sift, to compare, to ask "why" and "what if" often causes discomfort not only to others but to the questioner himself. Philosopher James P. Carse was right to comment that "Many people read to have their views confirmed; the educated person reads to be surprised."

It may be that just about anyone could benefit from wrestling, at some point in his life, with the insights of Plato or Shakespeare; the question is whether he should pursue this as a private interest or be forced to pay thousands of dollars to do so as a requirement for obtaining the most ordinary employment. Charles Murray, of the American Enterprise Institute, made this point in an excellent article in The New York Times last Sunday:

My beef is not with liberal education, but with the use of the degree as a job qualification.

For most of the nation’s youths, making the bachelor’s degree a job qualification means demanding a credential that is beyond their reach. It is a truth that politicians and educators cannot bring themselves to say out loud: A large majority of young people do not have the intellectual ability to do genuine college-level work.

If you doubt it, go back and look through your old college textbooks, and then do a little homework on the reading ability of high school seniors. About 10 percent to 20 percent of all 18-year-olds can absorb the material in your old liberal arts textbooks. For engineering and the hard sciences, the percentage is probably not as high as 10....

But I’m not thinking just about students who are not smart enough to deal with college-level material. Many young people who have the intellectual ability to succeed in rigorous liberal arts courses don’t want to. For these students, the distribution requirements of the college degree do not open up new horizons. They are bothersome time-wasters.

A century ago, these students would happily have gone to work after high school. Now they know they need to acquire additional skills, but they want to treat college as vocational training, not as a leisurely journey to well-roundedness.

Lest this seem like another dyspeptic rant on "today's good-for-nothing youngsters," a similar perspective was provided in the June Atlantic by an anonymous professor teaching English 101 and 102 in a "college of last resort" to classes made up mostly of forty-somethings who must complete a degree for job advancement:

Some of their high-school transcripts are newly minted, others decades old. Many of my students have returned to college after some manner of life interregnum: a year or two of post-high-school dissolution, or a large swath of simple middle-class existence, 20 years of the demands of home and family. They work during the day and come to class in the evenings. I teach young men who must amass a certain number of credits before they can become police officers or state troopers, lower-echelon health-care workers who need credits to qualify for raises, and municipal employees who require college-level certification to advance at work.

My students take English 101 and English 102 not because they want to but because they must. Both colleges I teach at require that all students, no matter what their majors or career objectives, pass these two courses. For many of my students, this is difficult. Some of the young guys, the police-officers-to-be, have wonderfully open faces across which play their every passing emotion, and when we start reading “Araby” or “Barn Burning,” their boredom quickly becomes apparent. They fidget; they prop their heads on their arms; they yawn and sometimes appear to grimace in pain, as though they had been tasered. Their eyes implore: How could you do this to me?

The goal of English 101 is to instruct students in the sort of expository writing that theoretically will be required across the curriculum. My students must venture the compare-and-contrast paper, the argument paper, the process-analysis paper (which explains how some action is performed—as a lab report might), and the dreaded research paper, complete with parenthetical citations and a listing of works cited, all in Modern Language Association format. In 102, we read short stories, poetry, and Hamlet, and we take several stabs at the only writing more dreaded than the research paper: the absolutely despised Writing About Literature.

The author relates the heartbreaking story of Mrs. L., a mature student assigned to do a research paper citing both sides of a historical controversy. Not only could she not write a coherent paragraph; she was never really able to understand the nature of the assignment in the first place. This has nothing to do with socio-economic status; I remember an article in The American Scholar some years ago remarking that for a certain sort of 60-something member of the country club class, taking graduate courses was seen as an interesting alternate form of recreation.

I overheard half of a telephone conversation once, in which one of my fellow students tried to reassure her caller that she would give her the help she needed in writing a comparison-and-contrast paper, a concept that the caller seemed unable to grasp. After the call was over, my fellow student chuckled merrily and said "Oh, that Anne! What a character! She just loves education. She has got herself two Master's degrees, and she has come back for more!" And if she continued to pay fees, no doubt the school saw no reason not to collect them.

A college degree has become a sort of über-high school diploma in the minds of many employers and for no good reason. While I agree that a study of Shakespeare's Julius Caesar is probably one of the best introductions ever to office politics, I see no reason to require a clerical worker to learn it as an indispensable step to promotion, unless she simply wants to, and if she does, more power to her. Meanwhile, I have a step-cousin, a very sharp individual who has contributed computer code to NASA's missions to Mars, who cannot get permanent positions in the private sector for want of a college degree.

Once, it was assumed that a college degree was undertaken only as preparation for the ministry or a teaching career, and I agree with that archaic standard to the extent that everyone who sought it knew exactly what they were after and why. Again, to admit that college is not for everyone has nothing to do with misanthropy or invidious social distinctions. In 1983, 30-year-old Robert Martin was found living near Rossville, Tennessee, barefoot, with half his teeth missing, in a shack with no electricity or running water, with his elderly grandmother. He owned a Bible and a copy of Milton's works, and he knew both very nearly by heart. Taken to Vanderbilt, he amazed the professors with his knowledge. He had a hunger to know, and to think consequentially about what he had learned. I think it's time to leave liberal arts educations to those constituted like Martin and let the rest of the workforce demonstrate their competence through certification exercises that actually have something to do with their occupations. If they discover, at some point, that they have an urge to learn what Chaucer's pilgrims were up to and why, then I hope they find a willing teacher who can make those characters speak once more.

© Michael Huggins, 2008. All rights reserved.

Monday, December 29, 2008

Brokeback or bareback?

From an article posted Friday in Slate, I learn that two names that were googled this year with a frequency matching Sarah Palin and John McCain were Heath Ledger and Miley Cyrus. Ledger I can understand, and I hope he gets a posthumous Oscar for his performance in The Dark Knight, which was simply amazing; as to the other search object, I'd certainly rather gaze on a pretty young woman, even chastely, than on McCain any day—but Miley Cyrus? Who is she and why should anyone outside her friends and family be interested in her? Two years ago, her name was an item that I included on a list of practice questions for my office trivia team; now, tickets to her performances are scalped.

Her notoriety this year came from the Annie Leibovitz photos in Vanity Fair, and if viewers of the pictures took exception to Leibovitz tastelessly posing Cyrus with her boneheaded father in ways that looked like boyfriend and girlfriend, I agree. It seems there are times when no one has less common sense than artists, as David Lynch showed, for instance, when he filmed the humiliating near-rape scene between Willem Dafoe and Laura Dern in Wild at Heart and called it "an episode of female empowerment" (I won't say "Hand me a barf bag" because that motif was another element all too evident in the film). But what got everyone searching for Miley was a single shot that revealed no more than my then-14-year-old daughter did when she donned her new Speedo and went swimming a few years ago: a bare back. Of course society must reject the prurient exploitation of minors, but the last time I checked, shoulder blades were not secondary sexual characteristics, nor their display pornographic.

It's ironic—in April, touring a restored mansion in the annual "pilgrimage" of ante-bellum homes in nearby Holly Springs, Mississippi, I was startled to see several photographs, in one room, of what I assume was the owner's 12-year-old granddaughter in various states of undress, though the photos were very tastefully posed and could not be taken as pornographic. For that matter, on another historic homes tour a year or two ago, passing through some private family rooms, I came across a nude oil portrait of a young woman, and a moment later, found myself face to face with the original (dressed, of course), trying to contain her amusement at the startled glances of the guests. Cyrus's photo is pretty tame compared to those instances, and it seems the feverish interest is based on "concern" over what might have happened had she not adequately draped herself; frankly, it's all beginning to sound to me like John Ashcroft veiling the statues in the lobby of the Justice Department. Let little 15-year-old Destiny Hope do her television show and worry about dating and leave the rest of us to recollect ourselves and consider when our fixation on a young woman's bare back reflects things about our own thoughts that we'd just as soon not reveal.

© Michael Huggins, 2008. All rights reserved.

No bailout bargain

It would be nice if the economy could have a Clive Huggins moment. Clive was my late father, and, having been raised in the Depression, was always fearful of the prospect of paying too much for something; thus, he was often punished by the rule that says you get what you pay for. The Christmas trees he brought home looked like the last survivors of a worldwide drought, while he grumbled that he had overpaid. I, on the other hand, being perfectly willing to pay a premium price where I can afford it (though, on my budget, such willingness is more often a state of mind than an actual monetary transaction) and where I am likely to gain real quality by doing so, sometimes find myself buying perfectly creditable items for absurdly low prices. When that happens—e.g., paying $85 for $250 luggage or $18 for a $120 lounging robe—I call it a Clive Huggins moment. One of the largest Christmas trees I ever had, a tree so large I actually had to block one of the doors of my apartment to find a place to set it up, cost me $2, which was ridiculously low even for 1974.

Jeffrey Garten, writing in Newsweek recently, was convinced that we are trying to nickel-and-dime our way out of the current financial crisis and that nothing but an investment of $1 trillion, or 7% of GDP over the next 2 years, will give the economy the necessary infusion of capital and inspire consumer confidence once more:

The fundamental issue is fear. Despite the colossal problems in the U.S. economy, the dollar continues to strengthen, which just shows that investors fear other markets even more. Billions of dollars are flowing into three-month U.S. Treasury bills, whose interest rate is zero, so investors are merely trying to minimize losses, not make money. Clearly, the governments have not succeeded in restoring calm. Their efforts look improvised, confused and ineffective to the average consumer or investor. The poster child for this problem is the $700 billion Troubled Asset Relief Program in the United States. The bitter congressional debates over the program and its shifting purpose—from buying toxic assets to injecting cash—have left the public feeling that Washington isn't quite sure what it is doing. For many weeks now, the Treasury and the Fed have appeared to be constantly on the brink of unveiling yet another new program, leaving the impression that even they don't believe the current ones will work.

I only wish someone could figure out the ratio of fear to real potential economic damage. Of course the government needs to spend, but how much? The exasperating thing is the extent to which the crisis is driven by emotion. I think of this, for instance, when I read of homeowners who, we are told, are still stuck paying on homes that are now worth less than the balance of the mortgage. So? My Saturn Ion is worth less than what I owe on it, even at the 0% APR financing I obtained from a desperate car dealer last year (another Clive Huggins moment), but that doesn't keep it from getting me to work, and I am still making payments. There is a real economic crisis—no argument there—and an additional amount of hand-wringing that is no doubt making the situation potentially much worse than it needs to be.

Actually, psychology is as interesting in figuring out how we got here as it is in figuring out how to get out of this mess. Henry Blodget, once notorious as a Wall St. tech stock analyst forced out of his occupation after the dot-com meltdown by Eliot Spitzer (never mind), provides a very incisive analysis of why there will always be economic bubbles and why perfectly intelligent, well-intentioned people will miss them until it's too late, in his article "Why Wall Street Always Blows It" in the current Atlantic:

...most bubbles are the product of more than just bad faith, or incompetence, or rank stupidity; the interaction of human psychology with a market economy practically ensures that they will form. In this sense, bubbles are perfectly rational—or at least they’re a rational and unavoidable by-product of capitalism (which, as Winston Churchill might have said, is the worst economic system on the planet except for all the others). Technology and circumstances change, but the human animal doesn’t.

One of the biggest culprits, as Blodget points out, is the recurring belief that "it's different this time" in a way that is supposed to make caution irrelevant:

Those are said to be the most expensive words in the English language, by the way: it’s different this time. You can’t have a bubble without good explanations for why it’s different this time. If everyone knew that this time wasn’t different, the market would stop going up. But the future is always uncertain—and amid uncertainty, all sorts of faith-based theories can flourish, even on Wall Street.

In the 1920s, the “differences” were said to be the miraculous new technologies (phones, cars, planes) that would speed the economy, as well as Prohibition, which was supposed to produce an ultra-efficient, ultra-responsible workforce. (Don’t laugh: one of the most respected economists of the era, Irving Fisher of Yale University, believed that one.) In the tech bubble of the 1990s, the differences were low interest rates, low inflation, a government budget surplus, the Internet revolution, and a Federal Reserve chairman apparently so divinely talented that he had made the business cycle obsolete. In the housing bubble, they were low interest rates, population growth, new mortgage products, a new ownership society, and, of course, the fact that “they aren’t making any more land.”

In hindsight, it’s obvious that all these differences were bogus (they’ve never made any more land—except in Dubai, which now has its own problems). At the time, however, with prices going up every day, things sure seemed different.

In fairness to the thousands of experts who’ve snookered themselves throughout the years, a complicating factor is always at work: the ever-present possibility that it really might have been different. Everything is obvious only after the crash.

The other deadly ingredient in bubbles is that investment professionals won't keep their jobs if they restrict themselves to prudent courses leading to merely reasonable returns; the same competition that fuels a free market also drives each fund manager to chase larger and larger returns for his investors, on peril of a forced retirement. Blodget cites the instance of fund manager Julian Robertson, whose Tiger Management company lost 66% of its assets to withdrawals by disgruntled investors and finally closed its doors because Robertson correctly anticipated the tech stock meltdown and moved his investors' funds elsewhere (where returns were lower, though on safer ground).

In other words, we don't want the slick mortgage broker, the BMW-driving realtor, or the glib investment pitchman to do the right thing for us but the thing, instead, that will make us feel as well off as our neighbors occupying the houses that they also couldn't afford. Frankly, I wonder if the Dutch had the right idea when they went nuts over tulip bulbs in 1634; the Semper Augustus bulb was commanding prices equal to that of a house on the Amsterdam market, but at least that was a product of nature.

Actually, Jane Bryant Quinn quotes financial advisor Steve Leuthold in a recent column as saying that investors should buy just about anything at this point, on the grounds that it is likely to be a bargain. That was certainly true of the $22.99 box of firelogs I talked an exasperated Kroger manager into selling me for $5.99 yesterday, after it had been mislabeled, but I don't think that's what Leuthold meant. If stocks scare you too much, perhaps you could consider bidding on Jeff Koons's metal "Hanging Heart" sculpture, a colossal piece of kitsch that recently sold for $27 million, which proves that a fool and his money are soon parted. I think my best buy today was ordering Facing Mount Kenya from Amazon.com on the advice of the Kenyan clergyman of the church I have been visiting; an ethnographic study done at the London School of Economics by a young Jomo Kenyatta in the 1930s, it is supposed to be a very insightful analysis of Kikuyu life and culture. I got it for just $1.75.

© Michael Huggins, 2008. All rights reserved.

Sunday, December 28, 2008

Barack Jindal

I voted for Obama and would be delighted to see him fulfill his apparent promise. After the sheer boneheadedness of the officers of government my fellow Republicans had been content to elect, it was time for a change. I heartily agreed with Christopher Buckley's explanation for his defection:

While I regret this development, I am not in mourning, for I no longer have any clear idea what, exactly, the modern conservative movement stands for. Eight years of "conservative" government has brought us a doubled national debt, ruinous expansion of entitlement programs, bridges to nowhere, poster boy Jack Abramoff and an ill-premised, ill-waged war conducted by politicians of breathtaking arrogance. As a sideshow, it brought us a truly obscene attempt at federal intervention in the Terri Schiavo case.

So, to paraphrase a real conservative, Ronald Reagan: I haven't left the Republican Party. It left me.

Exactly. Still, don't throw out your Republican campaign literature yet, Christopher; like 10-year-old Cameron Bright reappearing as Nicole Kidman's deceased husband in Birth comes the Gipper reborn—or so many are beginning to hope—as a 36-year-old up-and-comer restoring competence and honesty to a state nearly as corrupt as Rod Blagojevich's Illinois. The comparisons are a bit of a stretch; Reagan never pretended to a résumé that included graduating from Brown at 21 and completing a Rhodes Scholarship at 23, but this most quintessentially American of 20th-century Presidents would certainly warm to a small boy, a son of Indian immigrants, who suddenly announced at age 5 that he would answer to no name but Bobby because that was his favorite character on The Brady Bunch. Republicans today, as described in Andrew Romano's piece in a recent issue of Newsweek, are flocking to the young Governor of Louisiana for a combination of style and substance, as Grover Norquist describes it:

First of all, he's brilliant....Two, he's from an immigrant community, so that speaks to immigrant experience, period. Three, he's a Catholic who lives his values instead of shouting at you about them. Four, he's a principled Reagan Republican. Five, he's from the South but doesn't look like a Southern sheriff. And he's got more successes as a governor, already, one year in, than George W. Bush or Obama had when they ran for president. He's exactly what we need.

I don't want Jindal to run as a sort of rebuke or comeuppance to Barack Obama; if the President-elect is as principled as he is intelligent, if he governs wisely and well, if he can truly effect needed changes in energy, the economy, and healthcare without incurring a ruinous debt, if he can restore our damaged credibility among nations, more power to him. No, I want Jindal to enter the arena because more nearly-matched competition puts each contestant on his mettle and forces voters to be very sure of why they are choosing one over the other; I want him to run also so that the incoherent, swaggering, shoot-a-moose-from-a-helicopter style of another governor is seen once and for all as the tasteless national joke that it is.

Jindal claims he has no intention of running in 2012, and if he really means that, he needs to be reminded that Iowa is not in his jurisdiction, but whenever he chooses to run, can he win? There are at least two points of vulnerability: his uncompromising opposition to abortion under any circumstances whatsoever, and his self-attested participation in an amateur exorcism at Brown.

The majority of voters—even self-described pro-choice advocates, it seems—do not wish to see the wanton taking of unborn life for convenience but wish only to see abortion, as the phrase goes, become "safe, legal, and rare." It has to be a comfort even to those who are strongly pro-life, as I am, that the incidence of abortion has actually declined since the early 1990s. I hope for a cultural change of the type that caused the rate of smoking to drop by half over the past 40 years: the emergence of a culture that regards unborn life with such care and reverence as to see abortion as a regrettable choice and avoid it if possible; a blanket refusal to so much as consider it even for medical necessity is not likely to win votes for a candidate for national office. As to the rest of his religious views, I can only trust that someone with a biology degree from Brown will exercise his apparently considerable intelligence and not resort, for political or any other reasons, to the asinine prescription to "teach the controversy" to public school students, a controversy that would never have existed but for the perverse refusal of the scientifically ignorant to assent to what they don't want to understand.

© Michael Huggins, 2008. All rights reserved.

Saturday, December 27, 2008

Detroit's Appomattox

If Southerners act in concert, their purpose must be sinister, or so says Michael Lind in his article last week in Salon, "How to End the South's Economic War on the North." The action of Senators Shelby, Corker, and McConnell to block the bailout of Detroit's Big Three cannot possibly be ascribed to a failure to understand why Toyota, Nissan, and BMW should be penalized for not raising their per-hour labor costs from $46 to Detroit's standard $71 or to add $1,500 in worker health-care costs to the price of every car, as Ford does; it must, instead, be a plot to avenge Jeff Davis and express contempt for the rest of the country. If Lind were not the Whitehead Senior Fellow at the New America Foundation, he should be a novelist or perhaps a psychologist; while analyzing others, he might try to get a clearer insight into his own outrage over the fact that the South will not acquiesce in his contempt for the region or submit to being told what is best for it by enlightened liberals like Lind. I think this is one Whitehead that needs to be squeezed.

I take no pleasure in reading about the wasteland of Flint or the prospect of faithful and competent auto-workers being turned out into the cold because of factors over which they have no control. For that matter, I do think the Big Three must not be allowed to fail altogether, for the ripple effects both of the size of their work force and of their supplier relationships. But their prospective failure is not a plot, from the South or anywhere else, but simply a result of management's own mistakes and of competition from those who do their jobs better. Yes, it is true that the states of the old Confederacy spend less on public services such as education and healthcare than others, and that ought to be rectified; it is also true that the large payrolls made possible by foreign auto-makers in those regions help to improve the quality of life for the workers and their families (see Daniel Gross's article in the December 22 Newsweek on the economic impact of the foreign auto industry on the South). In any case, is there something sacrosanct about living in the Snow Belt? Just as Southern laborers once migrated north for better jobs, perhaps it's time for workers in Michigan to abandon their frozen habitations and do the opposite. If they come in sufficient numbers, they may even vote higher taxes for public services, if they wish.

I understand that the elephant in the room is union vs. non-union labor. Union work rules are sometimes necessary to protect the worker against the caprice of management; I remember reading the reminiscences of an early labor leader in which he recalled that no matter how bad economic conditions became, somehow, the members of the company baseball team were never included in layoffs. Unions happened because corporate management of the late 19th and early 20th centuries was as blind as GM and Chrysler management have been for some time now, and their contributions to equity in the workplace should not be simply dismissed. If union contracts eventually saddled the employer with health-care and pension obligations that can no longer be sustained, they were at least an understanding for worker and employer alike that faithful service over the employee's working life would result in comfort and economic security.

Now it is time to renegotiate all of it, if not the basic principle of mutual obligation, then the details of how such a principle can be honored in today's economy. The solution lies neither in abandoning the Detroit worker to freeze and starve nor in wild-eyed calls from the likes of Lind for a "New Reconstruction" in which the South eventually becomes as bankrupt and hamstrung as the North, but in an honest understanding between management and labor about building cars that work, that do not despoil the environment, and that the worker can afford, along with reasonable health and pension benefits. Instead of punishing Dixie for its success, others would do well to emulate it.

© Michael Huggins, 2008. All rights reserved.

Friday, December 26, 2008

Better to light a candle

I admire the ways in which civilization is maintained and spread in the face of hardship. In the rude days of the Saxon Heptarchy, Theodore of Tarsus taught the Scriptures, Greek, poetry, and astronomy to young Anglo-Saxon scholars at Canterbury. Young Edward Taylor took his leave of Harvard President Charles Chauncy and set out into a New England winter in 1671 to travel 100 miles to the frontier town of Westfield, Massachusetts, where he served the next 58 years as pastor, doctor, and counselor, while writing the only known Metaphysical poetry by an American author. On Sundays, his parishioners often worshipped while listening for a drumbeat that would warn them of Indian attack.

Today I learn that Andrew Jackson, thought to be the least educated U.S. President of his day (his predecessor, John Quincy Adams, who detested him, refused to attend a ceremony in which Harvard, Adams's alma mater, conferred an honorary degree on Jackson) studied, as a boy in upstate South Carolina, with a Presbyterian minister, emerging with an ability to quote Plutarch and Pope, and having memorized verbatim the Westminster Shorter Catechism, a 4,500-word-long compendium of Reformed Protestant theology, consisting of 107 questions and answers on the Bible and church doctrine. Famous for his violent temper and his irregular courtship of a still-married woman (amazingly, it had been his mother's hope that he would become a minister), Jackson read 3 chapters of the Bible daily and, for inspiration in valor, cherished Jane Porter's 1809 The Scottish Chiefs, her rather breathless retelling of the struggles of Sir William Wallace. (It's a tribute to Jackson's love of reading that he could be so influenced by a book that was not published until he was 42 years old and had already served as a Senator, Major-General of Militia, and Tennessee Superior Court Judge.) He was raised and educated in a frontier settlement so primitive that his own father's pallbearers drank themselves into besotted confusion and lost Jackson senior's casket in the snow, as they stumbled through the wilderness to bury him. Orphaned at 14, young Jackson estranged himself from his only surviving family, relatives of his mother, by his violence and irascibility.

Sometimes even those qualities served him in good stead. Riding the circuit as a Tennessee judge, he found a small-town sheriff too frightened to arrest a man accused of mutilating one of his own children in a drunken frolic. Jackson confronted the man, who was armed, and told him to surrender or die.

His legend has come down to us as that of a man almost pathologically angry and I wondered what, besides his physical courage, made men respect and follow him, but as Jon Meacham's biography points out, Jackson, as a perpetual outsider, had managed to cultivate a certain degree of charm, and that, added to his bravery, drew men to him. Fleeing hostile Indians in the wilderness of Tennessee with his legal colleague, John Overton, Jackson was nearly swept over the edge of a waterfall to his death; at the last moment, he calmly extended an oar to Overton, standing on the riverbank, who caught it and pulled Jackson to safety. “You were within an ace, Sir, of being pulled over the brink and dashed to pieces,” Overton observed.

Jackson calmly replied, “A miss is as good as a mile. Follow me and I’ll save you yet.”

© Michael Huggins, 2008. All rights reserved.

Wednesday, December 24, 2008

We are all descended from grandfathers

Not least among the mysterious laws of nature is the principle that if you had a past life, you must be descended from a pharaoh or priestess; either the street sweepers and outhouse collectors didn't reincarnate or the rest of us are new creations. Whatever the reason, I can never get enough of the historical recreation reality shows, in which modern people from various backgrounds and occupations are placed in historically accurate settings and required to live for 3 months or more as they might have lived in 1900, 1867, and so on. Participants generally find that they can eventually adjust to cold baths and shampoo improvised from egg yolks; what often breaks them and sometimes causes angry defections is the requirement to treat anyone else besides one's own inner child with deference.

All these tendencies are on full display in the series I've been watching this week, Regency House Party, filmed 5 years ago for British TV. The program places 10 eligible singles in an English country house and asks them to engage in the social rituals, including the quest for a suitable mate, that would have been their most important business in 1811. It may be true that in our day, dating has gone the way of the minuet and been replaced by hookups, but in 1811, such behavior was social suicide, and physical contact between the sexes is confined mostly to furtive hand squeezing during formal dances. As was true of that time, the men find plenty to do, what with shooting matches, athletic contests of various sorts, and heavy drinking, flavored with indiscreet references to their female housemates, while the ladies must content themselves, for the most part, with embroidery and occasional outside walks, carefully guarded by parasols against the sun. Interestingly, the first defection is by a man from a working-class background in modern life; assigned to portray a British Army Captain from the minor aristocracy of that day, he balks at learning the bearing, walk, and manners of a gentleman and, declaring that he wants to be a "regular bloke," packs and leaves early on.

Naturally, modern entertainment devices are nowhere to be seen (indeed, a high point of technological progress comes when the assembled guests witness the lighting of an early gas lamp, enabling them to see each other more clearly than by candlelight), so they must resort to the diversions of that day. Among the amusements on display is the armonica, an instrument in which shallow glass dishes are suspended on a wooden spindle that runs through their centers like an axle; the whole assembly is suspended, in turn, in an open cabinet-like structure. It is played by lightly wetting one's fingers and then stroking the edges of the glass dishes, which make eerie tones sounding rather like the calls of whales or dolphins. The reverberations last so long that you can't play anything very fast on it; it seems designed to play the likes of the Pachelbel Canon except that there would be too much overlap the moment the second round began, and it would all melt into a meaningless shimmering mixture.

What surprised me in a series so well researched (e.g., it turns out that meals in the Regency Era were not served in courses, as was customary later, but in balanced groups of sweet and spicy dishes, following the lead of George IV's French chef) is that they never mention that the armonica was the invention of our great American polymath, Ben Franklin. Perhaps the omission can't be charged exclusively to the state of modern scholarship; in Franklin's own day, Wedgwood sold figurines of Franklin sometimes labeled "Country Gent." or even "Geo. Washington" (I actually saw a copy with the Washington label on it in a house in Charleston some years ago).

One ought to follow the path of learning wherever it leads, but what I could have done without in the Regency series was the introduction among the visitors of controversial Professor Gunther von Hagens, who practices plastination of the dead for public display. Yes, I know Michael Jackson tried to purchase the skeleton of the Elephant Man in 1987, that Jeremy Bentham's skeleton presided over board meetings of a London Hospital for 92 years, and that deceased Spanish royalty were deposited for centuries in the "rotting room" of the Escorial, but the public display of plastinated human bodies in various states of dissection, arranged in artistic poses, is obscene if the word means anything at all. In one scene of the series, von Hagens comments at dinner(!) that when he dissected a close friend, who had died young, he felt the friend was communicating with him more than ever in a special way; I can only hope the poor fellow meant to say something like, "If I could get up off this table, I would tear your balls off." It's something of a relief when an heiress comments later, "I could see him wanting my body, but for all the wrong reasons."

I suspect that if von Hagens had encountered the feisty seventh President of the United States, Old Hickory would have horsewhipped him and then run him through with his sword cane for good measure. My children thoughtfully gave me Jon Meacham's new biography of Jackson, American Lion, for Christmas, so I think I'll get my mind off plastic cadavers and get started on that instructive book. Meacham's photos always give me the impression that he is about to weep under the intensity of a self-awareness of his own enlightenment and the pleasurable feelings resulting therefrom, but he does write well, so I have high hopes.

© Michael Huggins, 2008. All rights reserved.

Tuesday, December 23, 2008

It ain't over 'til it's over

If only scientists would stop learning new things, I could be comfortable in my previous assumptions. For several years, I have accepted as true that we confront the modern age with essentially Stone Age minds and that human evolution had receded to vanishingly small levels, shielded from adaptive challenges by technology.

That some people speak and act with Neolithic sensibilities seems indisputable in my experience, but in general, I may be premature in marking an end to human evolution and guilty at least of oversimplification in thinking that we are mostly wired to hunt mastodons. Writing in the December Scientific American, Peter Ward describes the ways in which agriculture and urban life have contributed to changes in the human genome:

...over the past 10,000 years humans have evolved as much as 100 times faster than at any other time since the split of the earliest hominid from the ancestors of modern chimpanzees. [Scientists attribute] the quickening pace to the variety of environments humans moved into and the changes in living conditions brought about by agriculture and cities. It was not farming per se or the changes in the landscape that conversion of wild habitat to tamed fields brought about but the often lethal combination of poor sanitation, novel diet and emerging diseases (from other humans as well as domesticated animals).

As to the Stone Age mind, philosopher David Buller argues in the same issue that we can't really know enough about the environment of our ancestors to describe their psychology with any precision and, that, indeed, the concept of a Stone Age mind does an injustice both to our pre-human past and to our more recent development:

[The] claim that human nature was designed during the Pleistocene, when our ancestors lived as hunter-gatherers, gets it wrong on both ends of the epoch.

Some human psychological mechanisms undoubtedly did emerge during the Pleistocene. But others are holdovers of a more ancient evolutionary past, aspects of our psychology that are shared with some of our primate relatives. Evolutionary neuroscientist Jaak Panksepp of Bowling Green State University has identified seven emotional systems in humans that originated deeper in our evolutionary past than the Pleistocene. The emotional systems that he terms Care, Panic and Play date back to early primate evolutionary history, whereas the systems of Fear, Rage, Seeking and Lust have even earlier, premammalian origins....

The view that “our modern skulls house a Stone Age mind” gets things wrong on the contemporary end of our evolutionary history as well. The idea that we are stuck with a Pleistocene-adapted psychology greatly underestimates the rate at which natural and sexual selection can drive evolutionary change. Recent studies have demonstrated that selection can radically alter the life-history traits of a population in as few as 18 generations (for humans, roughly 450 years).

Of course, such rapid evolution can occur only with significant change in the selection pressures acting on a population. But environmental change since the Pleistocene has unquestionably altered the selection pressures on human psychology. The agricultural and industrial revolutions precipitated fundamental changes in the social structures of human populations, which in turn altered the challenges humans face when acquiring resources, mating, forming alliances or negotiating status hierarchies. Other human activities—ranging from constructing shelter to preserving food, from contraception to organized education—have also consistently altered the selection pressures. Because we have clear examples of post-Pleistocene physiological adaptation to changing environmental demands (such as malaria resistance), we have no reason to doubt similar psychological evolution.

I don't know if the Internet rises to the level of a selection pressure or not (for me, post-divorce encounters with Internet dating sites have acted more as an inducement to celibacy), but some hold high hopes for it. Futurist Raymond Kurzweil believes that human and machine intelligence will eventually merge, perhaps improving both. Other observers believe and hope that the Internet will enable a qualitative advance in communication that results in the merging of all minds into a super-mind, a whole greater than the sum of its parts (an expectation that I think is completely unfounded, for the simple reason that two cars don't make a bus).

Still others are deeply skeptical about the influence of the Net on civilization, and the two sides have lined up in a clear and vigorous debate, as noted by David Brin in his article "Will the Net Help Us Evolve?" in today's Salon:

Some of today's most vaunted tech philosophers are embroiled in a ferocious argument. On one side are those who think the Internet will liberate humanity, in a virtuous cycle of e-volving creativity that may culminate in new and higher forms of citizenship. Meanwhile, their diametrically gloomy critics see a kind of devolution taking hold, as millions are sucked into spirals of distraction, shallowness and homogeneity, gradually surrendering what little claim we had to the term "civilization."

Call it cyber-transcendentalists versus techno-grouches.

Nicholas Carr weighed in for the skeptics with his article "Is Google Making Us Stupid?" in the July Atlantic. Carr laments:

...what the Net seems to be doing is chipping away my capacity for concentration and contemplation. My mind now expects to take in information the way the Net distributes it: in a swiftly moving stream of particles. Once I was a scuba diver in the sea of words. Now I zip along the surface like a guy on a Jet Ski.

I’m not the only one. When I mention my troubles with reading to friends and acquaintances—literary types, most of them—many say they’re having similar experiences. The more they use the Web, the more they have to fight to stay focused on long pieces of writing....Scott Karp, who writes a blog about online media, recently confessed that he has stopped reading books altogether. “I was a lit major in college, and used to be [a] voracious book reader,” he wrote. “What happened?”

....Bruce Friedman, who blogs regularly about the use of computers in medicine, also has described how the Internet has altered his mental habits. “I now have almost totally lost the ability to read and absorb a longish article on the web or in print,” he wrote earlier this year. A pathologist who has long been on the faculty of the University of Michigan Medical School, Friedman elaborated on his comment in a telephone conversation with me. His thinking, he said, has taken on a “staccato” quality, reflecting the way he quickly scans short passages of text from many sources online. “I can’t read War and Peace anymore,” he admitted. “I’ve lost the ability to do that. Even a blog post of more than three or four paragraphs is too much to absorb. I skim it.”

Responding to Carr in the Encyclopedia Britannica Blog, Clay Shirky agrees that

The web presents us with unprecedented abundance. This can lead to interrupt-driven info-snacking, which robs people of the ability to find time to think about just one thing persistently. I also think that these changes are significant enough to motivate us to do something about it. I disagree, however, about what it is we should actually be doing.

Shirky suspects that Carr has misdiagnosed his own pain and only thinks it's about the Net while actually, it is a lament for a literary culture that was vanishing before the computer age even began:

Despite the sweep of the title, it’s focused on a very particular kind of reading, literary reading, as a metonym for a whole way of life. You can see this in Carr’s polling of “literary types,” in his quoting of Wolf and the playwright Richard Foreman, and in the reference to War and Peace, the only work mentioned by name. Now War and Peace isn’t just any piece of writing, of course; it is one of the longest novels in the canon, and symbolizes the height of literary ambition and of readerly devotion.

But here’s the thing: it’s not just Carr’s friend, and it’s not just because of the web—no one reads War and Peace. It’s too long, and not so interesting.

This observation is no less sacrilegious for being true. The reading public has increasingly decided that Tolstoy’s sacred work isn’t actually worth the time it takes to read it, but that process started long before the internet became mainstream. Much of the current concern about the internet, in fact, is a misdirected complaint about television, which displaced books as the essential medium by the 1970s....

And this, I think, is the real anxiety behind the essay: having lost its actual centrality some time ago, the literary world is now losing its normative hold on culture as well. The threat isn’t that people will stop reading War and Peace. That day is long since past. The threat is that people will stop genuflecting to the idea of reading War and Peace.

While agreeing that the need to focus must not be lost, Shirky argues that the challenge with which the Net presents us is not an avalanche of intellectual junk but merely a selection problem analogous to what happened a couple of centuries ago when the printing press produced so many works that one man could no longer hope to be master of all knowledge. The concept of the sage as cathedral-like structure, Shirky says, must give way to the idea of a shopper in a bazaar.

Brin isn't ready to wholeheartedly sign up with either side. He agrees with Carr that the internet tempts some to become part of a dim-witted mob but also hails the same abundance that so delights Shirky and his fellow techno-enthusiasts. His solution is to ensure that there are tools on the web for winnowing the wheat from the chaff:

...what's needed is not the blithe enthusiasm preached by Ray Kurzweil and Clay Shirky. Nor Nicholas Carr's dyspeptic homesickness. What is called for is a clear-eyed, practical look at what's missing from today's Web. Tools that might help turn quasar levels of gushing opinion into something like discourse, so that several billion people can do more than just express a myriad of rumors and shallow impulses, but test, compare and actually reach some conclusions now and then.

But what matters even more is to step back from yet another tiresome dichotomy, between fizzy enthusiasm and testy nostalgia. Earlier phases of the great Enlightenment experiment managed to do this by taking a wider perspective. By taking nothing for granted.

Being temperamentally inclined toward the conservative and curmudgeonly, I ought to give Shirky his due. Folly didn't begin with the Internet, and gullibility existed in the days of the papyrus scroll and before. By itself, the Net can't keep people from reading Milton any more than widespread popular use of the transistor radio in the '50s could destroy interest in Antonio Vivaldi; indeed, Vivaldi enjoyed a revival in that decade, as did Georg Philipp Telemann in the decade after, which was also, of course, the decade of Woodstock. As to reading great works, it was true once, as Samuel Johnson said, that "Classical quotation is the parole of educated men the world over," but sadly, no longer (I didn't know whether to laugh or cry at the moment in The Sixth Sense when Bruce Willis, playing a Ph.D. in Psychology, had to look up the meaning of De Profundis).

Still, I'm less concerned over whether or not someone can quote Vergil than whether he is inclined to examine questions rigorously on the evidence, and in sufficient detail to verify anything. As to verification, Farhad Manjoo correctly points out in his book True Enough that many people no longer care as much for the truth as their truth, and their truth may be that Obama is a Muslim or that the Pentagon secretly planned 9/11. How is this any worse on the Net than in print 200 years ago? It spreads more quickly, and its credit is aided by the almost superstitious awe in which many people hold technology—for some, the Internet itself is taken as a sort of verification of even foolish claims.

As to reading at length, I have seen the way instant messaging and the BlackBerry® have transformed the workplace; for many, if it can't fit on a BlackBerry screen, it's superfluous. Shirky partly misses the point by focusing specifically on Carr's mention of War and Peace; under the trend to fragmented reading and thinking promoted by the Internet, how many people does he imagine are willing, even today, to read a piece of the length and conceptual complexity of his own essay?

Lord Chesterfield wrote:

A man is fit for neither business nor pleasure, who either cannot, or does not, command and direct his attention to the present object, and, in some degree, banish for that time all other objects from his thoughts....steady and undissipated attention to one object is a sure mark of a superior genius; as hurry, bustle, and agitation are the never-failing symptoms of a weak and frivolous mind.

Were he alive now, Chesterfield might be hired for his skill at making himself liked, but his disapproval of multitasking would make him an odd egg in today's business environment.

A book or essay, whether sitting on my shelf or downloadable online, reminds me of what Robert Maynard Hutchins called "The Great Conversation," a dialogue that has lasted for centuries, on texts that were the result of wrestling with important questions. The Internet, which makes its users impatient to read any one thing for very long, is less like a symposium than a food fight. Just as the democratic character of Wikipedia® made guardians of content more necessary than ever, the openness of the internet, the variety of its distractions, and the brevity of many of its offerings make it necessary for the user to recollect himself and ask if the entertaining site he has discovered conveys the truth of the matter or is the online equivalent of a supermarket tabloid. It is a question that I think fewer and fewer are willing to ask.

© Michael Huggins, 2008. All rights reserved.

Monday, December 22, 2008

The purpose-driven inaugural

Many in Barack Obama's base feel betrayed by the planned appearance of Rick Warren at the inaugural ceremony next month, though their candidate openly avowed his opposition to gay marriage in his appearance at the Saddleback Church last summer. My fellow skeptics of religion can hardly feel similarly hoodwinked, since Obama is openly Christian. I could wish, though, that the President-elect had exercised as much taste as calculation in his choice and avoided selecting someone who represents the McDonaldization of religion.

I won't complain that Warren's appearance makes it doubtful that he has made much progress in the time-honored religious virtue of fasting, since my own waistline has been overdrawn by several inches for some years now. But I must wonder what the fortunes of his church would be if there were a prolonged power blackout. The ancient church sustained its faith in darkened catacombs illuminated only by torchlight; its members steeled themselves against the prospect of hideous deaths or were transported by religious ecstasies at the prospect of the Savior's imminent return to judge the world. The modern megachurch, by contrast, would silently implode after 3 successive weeks without electric power; its kind of spirituality, lacking tongues of fire, is sustained only by flashing lights, PowerPoint, large display screens, and high-priced sound systems. Its culture invites the attendee to be seated, be entertained, and be generous in support of this institution so happily designed to allow the worshipper to bathe in good feeling about himself.

It so happens that Saddleback and its pastor are apparently among the best of the breed—Warren not only accepts no salary from the church but has purposely repaid every penny of salary he received during its first 25 years of existence; moreover, he lives on just 10% of his income, donating the balance to worthy causes. Neither a sleazy hypocrite in his personal life, like Swaggart or Ted Haggard, nor a severe ranter on putative damnation for trivial offenses, Warren has actively worked to highlight environmental and social justice concerns among Evangelicals, giving in support of AIDS relief and differing markedly from his compeer James Dobson in raising awareness of global warming.

All this is to the good. Still, what kind of recommendation is it to praise him on the grounds that he is simply not as objectionable as other instances of a phenomenon that is meretricious at its core? Should we admire Warren and his church members because they have finally acknowledged what the Intergovernmental Panel on Climate Change has been documenting for nearly 20 years?

Really, why did Obama invite this man to be a central figure at his inauguration? It reminds me of what I wondered when I read an article in Time last week about the social contagion of happiness; I was startled to read that the odds of one's increased personal happiness were 34% greater if his neighbor were happy (and 10% greater if that neighbor's friend were happy, even if the neighbor's friend were unknown to the original subject!), the odds for happiness increased by just 14% if one's sibling were happy, and by only 8% if one's spouse were happy. One is tempted to leave that last point alone since, sadly, it isn't hard to imagine two married partners discovering that the happiness of each is inversely proportional to the contentment of the other, but still, you have to wonder. I suppose we must have evolved in such a way that whereas we assume a reciprocal commitment to each other's welfare in our relations with spouses and family members, we realize that our neighbor's benevolence is by no means so certain and, thus, feel an obligation to work harder to win his good will. If there is anything to that, I suspect that Obama is confident that his core supporters will trust the integrity of his voting record and formal beliefs, which are politically liberal, whereas he hopes to encourage Warren, a bellwether of Evangelical opinion, to lead his flock in a more centrist direction.

There's a great deal of political good sense in that, certainly, but as someone who declines to share Warren's metaphysics altogether, I still hope the day will come, decades from now, when a candidate for President, invited to come and be cross-examined at Saddleback or its like, will respond with a statement like the following:

Thank you for your interest in my campaign, and I welcome the support of all fair-minded voters. I will not accept your invitation, and I hope my opponent will join me in declining as well. I refuse not because I am uninterested in what your members think but because I cannot discover on what grounds a church is a fitting venue for examining the qualifications of a candidate for high office under the Constitution of the United States. I may happen to hold similar or even identical moral positions to those held by you or some of your members, but I cannot consent to have that agreement linked, in the public mind, to beliefs about a Sky Spirit that I simply do not share; in particular, I refuse to give even token consent to the very foolish notion that without believing in such a Sky Spirit, we would all revert to savagery. Those are your opinions, and you are entitled to them, but they have nothing to do with my ability to devise or implement policies that would make this country prosperous and safe. Instead of meeting in a temple of religious worship, I propose that we all gather in a more neutral environment where every voter of every shade of belief or no belief feels that he enters, and his opinion is valued, without regard to his views on matters that no one can ever prove.


© Michael Huggins, 2008. All rights reserved.

Sunday, December 21, 2008

To have died in vein

I knew next to nothing about Interview with the Vampire before watching a Netflix® copy this evening, and since I associate an ineluctable quality of glossiness with both Tom Cruise and Brad Pitt, it was hard for me to discover the poignancy and horror that Neil Jordan seemed to be searching for; the movie came closest to those qualities in the slow corruption of 12-year-old Kirsten Dunst. For creating horror and poignancy, I think The Hunger, with Catherine Deneuve, David Bowie, and Susan Sarandon, which I watched a couple of months ago, was more successful.

Paul Fussell commented in his cruel but witty 1983 book, Class, that the very wealthiest Americans, the type who used to build palaces on 5th Avenue and "cottages" on Bellevue Avenue in Newport, had prudently vanished from public view after the mid-20th century; even the very rich who grace the covers of Town and Country are likely to be one rung down from the "top out of sight," according to Fussell. The rest of us vaguely intuit that there are two classes of humanity who seem wholly other, apart, and, to most people, very strange: the ultra-wealthy and the very cerebral, and I think that such images as vampires, the Addams family, and Hannibal Lecter in pop culture are one way that our culture deals with them. The very wealthy, like Southerners, cannot be merely happy and wise; they must be saddled with cripplingly narrow sympathies, live in obvious or impending ruin, and have ambiguous relationships with their fathers or sisters; as to the very bright, in the case of a Hannibal Lecter, they are savage in proportion to their very refinement.

I did like the portrayal of Brad Pitt's character as operating under a burden because he retained something like a human soul, a quality that had fled even from his child protégée. In his last scene with Tom Cruise, he gently but firmly dismissed his ruined mentor, forced to embrace the pursuit of blood for his own survival but refusing to take pleasure in it.

© Michael Huggins, 2008. All rights reserved.

Saturday, December 20, 2008

You can't turn it off

Beowulf begins with the bard exclaiming Hwaet! (the ancestor of our "What!" but meant, in this context, to stand for "Listen to this!"). That arresting moment occurs in a different type of production when you realize that the conventional frame doesn't contain the story; it comes early in Noises Off when the action on the stage is suddenly interrupted by the director of the play, striding down an aisle of the theatre from behind the viewer and upbraiding the performers on the stage for getting it wrong; it comes at the end of eXistenZ, a film about virtual reality, when one participant, dazed by the ordeal, plaintively inquires, "Tell me the truth, are we still in the game?" In The French Lieutenant's Woman, you're aware of it from the first and watch the actors go continually from the 20th century to the 19th and back again, but it also adds to the viewer's uncertainty about the motives of both the Victorian characters and the modern actors who play them in the film within a film.

In England, My England, Tony Palmer's "experimental biography" of 17th-century composer Henry Purcell, the "What!" moment comes immediately after watching Simon Callow crowned as Charles II upon the Stuart Restoration in 1660; with no warning, someone off camera hands Charles an already-lit cigarette, which he begins to enjoy as the camera pulls back and reveals a present-day London theatre employee assisting Callow backstage during a performance about the dissolute monarch and his court. Though much is known of Restoration England, and though the works Purcell composed in a short life of only 35 years guaranteed him a place among the immortals of music, few facts about the composer's own life are verifiably documented, a condition reflected in the movie itself, which conveniently has Purcell born in 1660, the year of Charles's Restoration, whereas it is more likely that he had actually been born the year before. The haunting march that begins the music he composed for the 1694 funeral of Mary II, Queen and Consort of William III (the same music was used for Purcell's own funeral a year later) may be remembered as the music that begins Kubrick's A Clockwork Orange, where its use was either boldly apt or a travesty, depending on how you look at it.

In the Purcell film, it's interesting to see that the actor Callow encounters when he returns to his dressing room is Murray Melvin, who was wonderful in his minor role as Lady Lyndon's chaplain in Kubrick's Barry Lyndon and does a brief turn here that reminds me of Tom Courtenay in The Dresser. Callow himself is always great; he was memorable as Emanuel Schikaneder in Miloš Forman's Amadeus, with Tom Hulce, as well as in Mike Newell's Four Weddings and a Funeral, but for me, one of his most characteristic and loathsome roles was the cynical barrister in an earlier Newell film, The Good Father, with Jim Broadbent and Anthony Hopkins. Callow often portrays the kind of sang-froid that would have enabled him to write the entry in Samuel Pepys' diary for October 13, 1660, upon witnessing the execution for treason of Thomas Harrison, one of the Parliamentary regicides:

I went to see Major General Harrison hanged, drawn, and quartered; he looked as cheerful as any man could in that condition.

In fact, that line is spoken in England, My England, though it is used to refer to the posthumous fate of Oliver Cromwell.

© Michael Huggins, 2008. All rights reserved.

Friday, December 19, 2008

Debris, diabolical and divine

I once corresponded with a prisoner and visited him when I could. Jerry was in for 12 years for DUI resulting in a fatality; his own wife had been killed when Jerry drove through an intersection at a very high speed. After a few years, he came up for parole, and I testified for him at his hearing. Jerry was released but soon violated parole and fled the state. Eventually, he was recaptured. From prison, he placed a collect call to an elderly widow that he and I both knew and asked her to give me a message: "Tell Michael to buy me a TV set." When the widow, who herself lives on a small Social Security pension, protested that I probably couldn't afford such a gesture, Jerry was undaunted and proposed that she and I split the cost.

I thought of Jerry when reading about the impudence of Rod Blagojevich. The late Frank Gorshin once defined chutzpah as a policeman writing you a traffic ticket and borrowing your pen to do so. The Governor of Illinois, who looks and comports himself like an aging 24-year-old detected in the act of shoplifting smutty magazines, is apparently so bereft of shame or sense that, incredibly, he had even talked to associates about a Presidential run in 2016, or perhaps an ambassadorial post, according to Time. At the rate things are going, I half expect him to end up like former Dyersburg, Tennessee Chancery Court Judge David Lanier, who, outraged that he had been indicted for corruption, went on the run and presently called the police who had a warrant for his arrest and warned them that he would give them just one more chance to stop pursuing him.

Equally lacking in shame or sense are Heath and Deborah Campbell of Easton, Pennsylvania, who named their little girl JoyceLynn Aryan Nations Campbell and their son Adolf Hitler Campbell. The dad, who touted the names as an indication of pride in his German heritage, took umbrage at the manager of a local ShopRite®, who refused to sell the Campbells a birthday cake bearing little Adolf's name, for his third birthday. (Again, this must go under the heading of things you couldn't make up!)

Beside all this, the arrest of Bristol Palin's future mother-in-law on drug charges sounds refreshingly normal. If, as Dilbert creator Scott Adams suggests in his book, God's Debris, the Universe and all within it consist of particles of the Deity progressively reassembling themselves as the reinstantiation of the Supreme Mind, Blagojevich and the senior Campbells may belong to a class of particles that need to be vacuumed up, instead.

© Michael Huggins, 2008. All rights reserved.

Thursday, December 18, 2008

Smiles of a winter night

Had I known that Robert Prosky lived within blocks of my mother and stepfather's former home on Capitol Hill in Washington, I would probably have knocked on his door on one of the many Christmases we visited the area to express appreciation of his work. I know he played a variety of roles, not all of them sinister and some even kindly—in fact, my then six-year-old son sat on my lap, a few minutes from Prosky's house, watching him as kindly judge Harper in Miracle on 34th St. in 1995—but the role I'll never forget was Prosky's first, the chillingly soft-spoken mobster, Leo, in Michael Mann's 1981 film, Thief. This scene is the epitome of what Prosky was capable of (be warned: the language is very foul, though the performance is amazing).

It's not certain that even Leo the mobster was the equal of Sam Zell, the foul-mouthed corporate pirate who bought the Tribune Company, parent of major American newspapers, for $8.2 billion and then started cutting newsroom staff; the supreme irony is that Zell had put up just $315 million of his own, financing the balance with Tribune employee pension funds, without their consent. (I can think of nothing that such an action reminds me of so much as committing sexual assault and also convincing your victim to take out a second mortgage to pay you for doing so.) Commenting on the subprime mortgage crisis last April, the veteran capitalist offered this hardheaded advice:

This country needs a cleansing. We need to clean out all those people who never should have bought in the first place, and not give them sympathy.

You said it, Sam. As a firm believer in free enterprise and individual initiative, I look at the likes of Zell and wonder if Marx perhaps had virtues that we've overlooked; as someone who doesn't believe there is an afterlife, I am tempted to hope that Zell is headed where his tactics are rapidly sending major newspapers: a place that rhymes with his last name.

But speaking of Marx, it's interesting that Poland's last Communist dictator, the heavy-handed General Wojciech Jaruzelski, now 85 and on trial for his 1981 imposition of martial law, is winning some sympathy, even from still-living victims of his repressive methods, according to Time. Did Jaruzelski pre-empt a planned Soviet invasion by his crackdown and save Poland from worse, or did he in fact do the Soviets' dirty work for them, disingenuously using the invasion threat to justify his own desire for power? Even erstwhile opponent Lech Walesa, one of 10,000 detained during the crackdown (90 were killed), thinks Jaruzelski's trial after so long—if convicted, he faces 10 years' imprisonment—may be a mistake. I admire Walesa's moderation, but the truth needs to be shown for what it is, no matter what sentence is handed down. After all, such tactics are not confined to the past, and age is not keeping 84-year-old Robert Mugabe from holding a feast while Zimbabweans starve.

© Michael Huggins, 2008. All rights reserved.

Wednesday, December 17, 2008

Hold that thought

On the morning of my late maternal grandfather's funeral, when I and others came back from the cemetery and were allowed into his room to select a single keepsake apiece, I quietly plucked his reprint of Thwaites and Kellogg's 1905 Dunmore's War from his book shelf to take home; 15 years earlier, he had gently intimated that he would donate it to the public library instead, by way of expressing disapproval of my youthful personal life. Tonight, at my son's request, I ordered something for him for Christmas that I trust an apt descendant will take from his bookshelf on the morning of his passing, about 65 years from now, though with the donor's previous blessing: the Odes of Pindar. Fourteen years ago, at age five, Mark came home from kindergarten, face aglow, reciting "A-a-apple...buh-buh-ball," fascinated by phonics and the power it gave him to unlock texts for himself; within months, he looked over my shoulder, spotted the Norton Critical Anthology of Swift's writings on the bookshelf behind me, and triumphantly sounded out, "The Witches [Writings] of Joe-Nathan Swift!"

I like to discover personal inscriptions in old copies of books. My copy of Richard Halliburton's Royal Road to Romance, which I picked up in a street fair a few years ago for a dollar, is inscribed "Ada Norfleet Fuller, 1930"; Mrs. Fuller was a Memphis socialite of the early 20th century. More surprising was the inscription in a copy of the 1834 novel Alice Paulet, which I have been reading online: about a quarter of the way through the book, some 19th-century reader warns us, in antique penmanship:

Kind reader if you have endured this book to this point, do not proceed further for it grows no better.

Actually, that reader picked an odd place to break off; his or her warning comes immediately after a country clergyman is murdered and before a duel and a suicide; in any case, I'm about halfway through it, so I think I'll continue.

© Michael Huggins, 2008. All rights reserved.

Tuesday, December 16, 2008

So Czarry

Mickey Kaus aptly observes in Slate that the government might as well appoint a Czar Czar to oversee the work of all the other ad hoc plenipotentiaries. Kaus links to Laura Meckler's balloon-deflating piece in The Wall Street Journal on a concept that seems to be about as useful as a Fabergé egg:

"There've been so many czars over last 50 years, and they've all been failures," said Paul Light, an expert on government at New York University. "Nobody takes them seriously anymore." He pointed to officials placed in charge of homeland security and drug policy.

The problem is that "czars" are meant to be all-powerful people who can rise above the problems that plague the federal agencies, he said, but in the end, they can't.

"We only create them because departments don't work or don't talk to each other," Mr. Light said, adding that creation of a White House post doesn't usually change that. "It's a symbolic gesture of the priority assigned to an issue, and I emphasize the word symbolic. When in doubt, create a czar."

The enterprising reporter traces the Czar concept only as far back as the Clinton Administration, but a few of us were born before that time, and I seem to remember that the first person to be called Czar was former Treasury Secretary William E. Simon, appointed Energy Czar by Richard Nixon in the early seventies. It's strangely relevant that Simon eventually authored a book called A Time for Truth. Indeed.

Czar is too haughty a title to ascribe to the Prince of Peace, humbly born into the world in a manger, but that doesn't satisfy the curmudgeonly Christopher Hitchens, who vents articulately as always in his article "'Tis the Season to be Incredulous." Christopher definitely feels crowded:

The core objection, which I restate every December at about this time, is that for almost a whole month, the United States—a country constitutionally based on a separation between church and state—turns itself into the cultural and commercial equivalent of a one-party state.

As in such dismal banana republics, the dreary, sinister thing is that the official propaganda is inescapable. You go to a train station or an airport, and the image and the music of the Dear Leader are everywhere. You go to a more private place, such as a doctor's office or a store or a restaurant, and the identical tinny, maddening, repetitive ululations are to be heard. So, unless you are fortunate, are the same cheap and mass-produced images and pictures, from snowmen to cribs to reindeer. It becomes more than usually odious to switch on the radio and the television, because certain officially determined "themes" have been programmed into the system. Most objectionable of all, the fanatics force your children to observe the Dear Leader's birthday, and so (this being the especial hallmark of the totalitarian state) you cannot bar your own private door to the hectoring, incessant noise, but must have it literally brought home to you by your offspring. Time that is supposed to be devoted to education is devoted instead to the celebration of mythical events. Originally Christian, this devotional set-aside can now be joined by any other sectarian group with a plausible claim—Hanukkah or Kwanzaa—to a holy day that occurs near enough to the pagan winter solstice.

His facts are all quite true, of course, though the reaction is his own (and the label "mythical events," even though I agree with him, seems an unnecessary gibe). I'm just glad they don't display Warner Sallman's Head of Christ that used to be so ubiquitous in my childhood, or poor Hitchens might have to be blindfolded for his own sanity. I agree with him that the concept of Heavenly Hosts is more inspirational than factual, but fortunately, I don't find myself in the same distress as he does, though I may come close when the holiday mélange played over my office intercom includes, of all things, Christmas Tree from Home Alone II, God help us, as though that were becoming a holiday treasure! If Hitchens wants to man the barricades on that one, I'm with him!

Perhaps Hitchens, whose mental acuteness I respect a great deal, can so condition himself that whenever he hears the Christmas Muzak, he can go into a trance and believe himself to be listening, instead, to the marvelous Christmas Concerto of Arcangelo Corelli. Exactly what Corelli had in mind when he wrote it I can't say, but it fits the Season of Advent, reminding the listener of someone hastening to a momentous event, quickened by anticipation of a meeting long desired.

© Michael Huggins, 2008. All rights reserved.

Monday, December 15, 2008

Sink or swim

I can hardly think of a more fitting way to ring in the New Year in the current economic crisis than to spend an evening tasting the menu once offered to passengers of the Titanic, a pleasure advertised by Chateau on the Lake in Branson, Missouri, of all places, in connection with Branson's Titanic Museum (and I'm not sure which is stranger: London Bridge rebuilt in the Arizona desert or a Titanic Museum in Branson). A setting more suited to my taste and about 295 miles closer to my apartment than Branson is the casually elegant Equestria Restaurant, 3165 Forest Hill-Irene, in Germantown, Tennessee, which offers polite and attentive service (attentiveness in no way diminished by the fact that I eat there on the strength of gift certificates won in trivia contests) and superbly prepared food; their four-course New Year's Eve menu is an outstanding value.

Wine writer Lettie Teague, quoted in The Week, refers to champagne as a "consolation" in trying times and tries to console the reader with the fact that Pol Roger is available for as little as $35 a bottle, and I'm sure that must be encouraging to someone; frankly, I'm only sorry that the price of Courvoisier doesn't decline at a comparable rate with the price of oil.

Too much attention to the consolations of bottled spirits and a shocking disregard of morals is the focus of Bertrand Tavernier's sharply observed 1974 drama, Que la Fête Commence (English title, Let Joy Reign Supreme), with Philippe Noiret and Jean Rochefort. While peasants starve or are kidnapped for deportation to Louisiana in the France of 1719, Philippe d'Orléans, nephew of the recently deceased Louis XIV, makes fitful attempts to rule more humanely while devoting his chief interest to palace orgies, where restraint is quickly cast aside along with court dress, while his own musical compositions are performed by blind musicians. Following the autopsy of his daughter, the Duchesse de Berri, who ate and drank herself to death at 23, Philippe expresses irreligious opinions that shock even his mistress, while he weakly accedes to the demands of his former tutor and fellow atheist, the Abbé Dubois, for elevation to the Archbishopric of Cambrai. The only surprise in all this is that the French Revolution did not occur for another 70 years.

Had Philippe or the Abbé wished to examine the traditions of faith more closely, they could have done worse than to travel to Rome to see the ancient and august San Giovanni in Fonte, the 4th-century baptistery built next to St. John Lateran by Constantine the Great and now an awe-inspiring microcosm of the architectural, decorative, and devotional styles of its various ages. I missed this site on my one visit to Rome some years ago, though perhaps I'd better go back before it is sold, dismantled, and rebuilt in Orlando, Florida. In any case, two friends who had the rare privilege of seeing their goddaughter baptized there a few years ago shared this link to the church's website.

© Michael Huggins, 2008. All rights reserved.

Sunday, December 14, 2008

Still kicking

Liza Minnelli's show at the Palace Theatre in Manhattan is drawing good reviews, and whether or not her kind of musical performance is your favorite, and it's certainly not mine, one has to admire her spirit; as The Week notes:

...in the nearly 10 years since her last revue, the 62-year-old Minnelli has had two hip replacements, knee surgery, encephalitis, more than a few addictions, and one very strange and public divorce. Put that all out of your mind. From her first moment onstage, spiky-haired, Halston-clad, and striking her signature one-arm-aiming-skyward pose, “Liza with a Z” proves she can still hold an audience “in the palm of her hand.”

It's also a fitting tribute on Minnelli's part to revive the "Palace Medley," which her mother performed there in 1951. Liza's first husband, Peter Allen, had been Judy Garland's protégé, and her fourth, David Gest, seems to have adopted I'm a Celebrity, Get Me Out of Here!, the title of the British TV reality show on which he formerly appeared, as his operating principle. For a brief moment, Gest bestowed his presence on Memphis, Tennessee, announcing his planned residence in the nation's 18th-largest city with the comment, "I always wanted to live in a small town."

In fact, he was the neighbor, two doors down, of my mother and stepfather, and he happened to be outside one day, four years ago, when they and I rode past him. Since we're all so neighborly down here, I waved to him, and he favored me with a look something like the one that Mr. Darcy must have bestowed on the hopeful citizens of Meryton. Still, he did pay for many billboard notices in November, 2004, inviting the less fortunate of Memphis to "Be David's Gest" on Thanksgiving Day by showing up at a variety of local restaurants for free meals.

In any case, Gest is seldom seen here any more, and if he's looking for a setting worthy of him and has sufficient fortune left over from his Thanksgiving largesse, he might consider moving to the Channel Island of Sark. In that unhappy quarter, which held its first democratic elections last week after voting earlier this year to dissolve Europe's last feudal polity, 140 jobs will be lost after the cancellation of a £5 million per year investment program by the Barclay brothers, millionaire British investors, who were dissatisfied with certain provisions of the outcome. If Gest's bank balance comes up a little short for replacing their bounty, he can at least join the Sercquiais in the island's distinctive Clameur de Haro, a Norman-era plea for justice in which the supplicant recites the Lord's Prayer in French in front of witnesses and then cries aloud:

Haro, Haro, Haro! À mon aide mon Prince, on me fait tort! ("Haro, Haro, Haro! To my aid, my Prince! I am being wronged!")

As long as Gest doesn't forget and cry out "Je suis remarquable! À la sortie!" all should go well.

© Michael Huggins, 2008. All rights reserved.

Saturday, December 13, 2008

Not a retreat, but a backward advance

ServiceMaster®, which, until recently, contemplated building a new 100-acre $122 million corporate headquarters in Memphis, will lay off 50 headquarters employees instead. A company spokeswoman, who must have done her Master's in the works of Orwell, explained that this was a way to "Lead the market and grow the company."

If ServiceMaster merely found themselves short of office space, they might have relocated to Dubai (surely one of the world's largest potential markets for TruGreen® lawn care), a country that owns 20% of the London Stock Exchange and where 42 million square feet of office space were under construction recently, according to Christopher Dickey in Newsweek. In line with ServiceMaster's thinking, Dubai opened its Atlantis resort with a celebrity-packed event, featuring a $20 million fireworks display that was visible from space, while it laid off 500 immigrant workers and construction on its unfinished skyscrapers began to shut down, a ripple effect of the world financial crisis and falling oil revenues.

Speaking of fireworks visible from space, the latest flashpoint in the abortion issue is the discovery that choice is open not only to pregnant women but to health-care providers who don't approve of terminating pregnancy at will, an exercise of choice that many pro-choicers have greeted with astonishment and outrage. As described by attorney and columnist Dahlia Lithwick:

What does it tell us about the state of the abortion wars, that battles once waged over the dignity and autonomy of pregnant women have morphed into disputes over the dignity and autonomy of their health-care providers?

What it suggests to me, Dahlia, is that framing the entire matter in terms of "choice" was always a too-simple and, ultimately, self-defeating approach; it was always just a matter of time until actors who were philosophical opponents of "reproductive freedom" suddenly realized that they had freedom as well and decided to exercise it. Just as the state's right to execute a criminal does not compel a doctor to participate in the execution, the legal right to terminate pregnancy cannot compel a pharmacist, nurse, or doctor to participate.

Of course Lithwick is well aware of this already and is specifically concerned about two emerging legal issues in the contest. She sees the first as both superfluous and harmful:

The first dispute concerns a new rule purporting to protect the "right of conscience" of American health-care workers. Under a new midnight regulation crammed through by the Bush Department of Health and Human Services and poised to become law any day now, any health-care worker may refuse to perform procedures, offer advice or dispense prescriptions, if doing so would offend their "religious beliefs or moral convictions." Congress has protected the right of physicians to opt out of providing abortions for decades. This new rule, which President-elect Obama can overturn (although it may take months), is far broader. It allows one's access to birth control, emergency contraception and even artificial insemination to turn on the moral preferences of a pharmacist, nurse or ambulance driver.

True, it does, and legal and medical relief that no one for 500 miles around will willingly provide is hollow, but in a free society, that can't be helped. Laws in a free society have much more to do with prohibiting behavior than compelling it; my religious neighbor who believes that prayer is the sovereign remedy for all ills may not prevent me from going to the hospital, but neither is he compelled to drive me there. Of course we're talking specifically about health-care providers, not the public at large, and it is argued that a health-care provider who refuses to provide legally recognized services at discretion is at best acting with a lack of integrity; but that, again, begs the question of whether the only morally defensible position is to support human intervention at will in reproductive matters, a position that many—including many in health care—find objectionable. Whatever the merits of that belief, it is as much their right to hold it—and act on it—as it is the right of a woman to terminate or a couple to use contraception. The nurse's or pharmacist's freedom does not end where someone else's preferences begin.

Lithwick is on much more solid ground with the second issue she raises:

The second dispute involves a South Dakota law that went into effect last summer after an appeals court lifted a preliminary injunction. The law requires physicians providing abortions to read from a state-mandated script advising the patient that she is about to "terminate the life of a whole, separate, unique, living human being" with whom she has an "existing relationship." The doctor must have her patient sign each page of a form indicating that she has been warned of the "statistically significant" risks of the procedure, including "increased risk of suicide ideation and suicide." These "risks" are almost completely unsupported by the scientific literature. A new comprehensive study released by Johns Hopkins found "no significant differences in long-term mental health between women in the United States who choose to terminate a pregnancy and those who do not." The disparity between the empirical data and the mandatory script thus forces physicians into a Hobson's choice between providing patients with accurate medical information, and possible license suspension and misdemeanor charges.

Even as someone who is strongly pro-life, I find the South Dakota law preposterous. As to psychological consequences to women, I'm not convinced of the virtually risk-free picture portrayed by the Johns Hopkins study, but the South Dakota script has other issues that render it absurd, and the doctor's legal duty to get it signed equally so. Without being explicitly religious, it comes as close as possible to giving the force of law to philosophical pronouncements about the present state of a fetus that simply can't be proven. The reasons for objecting to abortion on demand cannot and must not rest on near-metaphysical speculation about when consciousness and "personhood" begin but upon the fetus's potential value as a whole human life, a value that is likely to be realized unless we or nature intervene. If South Dakota isn't careful, the likely result of such a misbegotten law will be to leave citizens with no health-care providers but Native American shamans, the doctors all having fled to venues governed by more reasonable statutes. The obvious recourse is for the doctor to administer the script, but only after advising the patient that it is being done under legal compulsion and that it represents an unwarranted interference in the administration of health care. Certainly, doctors who believe that a specific abortion decision is a mistake also have a right to present their views—voluntarily—and no one should prevent them. Awkward as it may be, it's part of what a free society is about.

© Michael Huggins, 2008. All rights reserved.