Showing posts with label New York Times.

Monday, July 12, 2010

It's 3 a.m. Do you know where your roofer is?

It's 18 minutes after 3 Monday morning. Forty minutes ago, I felt a single drop of water hit my shoulder when I stood up in my bedroom to investigate the tap...tap...tap coming from just above the ceiling while a thunderstorm sounded outside.

Thunderstorms generally don't bother me, even though I believe they will get worse and more frequent with climate change. I like living on the third floor of my well-constructed building, probably no more than 40 years old, because no one is walking over my head, my heating bill is probably lower in winter, there is a vaulted ceiling in my living room, and it feels like living in a treehouse. The main view of the world from my apartment is a large sliding glass door just off my living room, which looks out past my balcony to the upper branches of large pine trees. If the wind ever blows hard enough to sound the chimes on my balcony, or the rain falls fiercely enough to splash it, I know it is worse beyond the stand of pines. A flowering tree, standing between my balcony and the parking lot, is left untrimmed to the point where its blossom-laden branches hang down very nearly to the roof of my parked car.

I've always liked "living up under the eaves" and have done so when I could. In 1967, my family bought an old Dutch Colonial house with an absurdly large 3rd-floor attic, which my dad and I finished into an apartment for me. In graduate school, the professor I worked for occupied a large corner office on the top floor of the University of Memphis's Patterson Hall, and when I entered the office each morning, I looked out the windows directly into leafy tree branches. When I was 10, I remember being awake at 3 one morning with one of my brothers, and we sat at the attic windows of my paternal grandparents' house, listening to the pigeons cooing just outside, in the nest they had built under the eaves.

This morning, alas, the outside world is no longer merely a show and the fourth wall has been breached. The tap...tap I heard some time ago was like a small, sinister footstep in the dark, though fortunately it didn't end with a visit from Samara Morgan. After I noticed water stains in a corner of my bedroom ceiling a few weeks ago following heavy rains, the property manager asked me to wait a week or two until we had had no rain, after which he would have the roof repaired above my ceiling. This was supposed to have been done a couple of weeks ago, and a man came to the apartment Friday and put a new piece of sheetrock in the damaged corner of my ceiling.

Well, it looks like he'll probably have to do so again. I can't see any fresh stains, but there must be water in the attic space between the ceiling and the roof.

The last time anything like this happened to me was nearly 40 years ago, when I occupied the upper floor of a house that had probably been built before the First World War. I had just moved in a few days before and was awakened by the sound of water spilling outside my bedroom door but inside the house. Rushing out into the hall, I saw a stream pouring down from the attic, splashing water and plaster flakes all over a box of books. I thought I remembered it hitting my copy of the Greek New Testament, but I found it just now and couldn't see any water stains.

I say that was the last time, but a variation happened about 9 years ago in the building where I lived before moving here, a place a little older and not so well maintained. A fire broke out in a nearby apartment, and smoke came pouring through the air vents into mine. No fire came to my apartment, but a fireman had to enter and punch a hole in my dining room ceiling with his fire axe, to make sure there was no fire in the attic. After he left, I taped a garbage bag over the hole until it could be repaired. To this day, some of my books and papers remain a dingy gray, left that way by the smoke.

Those episodes were unpleasant but not completely unexpected; this, however, is not "supposed" to happen. A friend who lives in the African nation of Chad looks out through her screen door to find herself being observed by a curious goat that has wandered by, but in the richest country in the world, we are supposed to be sealed, sanitized, waterproofed, and warrantied. With a thunderstorm raging outside, I have a computer desk, too large and heavy to move, laden with snaking wires and cables, and I don't expect the least danger or inconvenience from the elements to my HP color printer-copier-scanner, my Dell computer, my Acer monitor, my DSL modem, or my Logitech webcam. My expected mode of life is 90 years and a world away from that of my paternal grandfather as a boy, waking on a winter morning just 85 miles east of here to brush off his blanket the snow that had sifted through the roof in the night, or of my paternal grandmother, unable to sleep for the hideous din of rain and hail on the tin roof of the farmhouse she and her family occupied in a field in St. Francis County, Arkansas.

Actually, something like this still does happen to my mother and stepfather, who live in a very large and attractive house downtown overlooking the river, for which they were willing to pay a handsome price a few years ago, thinking it had been well built. It seems now that that wasn't strictly true, and if they experience leaks, the water may drip on polished hardwood and tasteful antiques and works of art, which is certainly worse than anything happening here!

What can the roofers have thought they were doing a couple of weeks ago? I assume they actually did climb to the top of my building and didn't somehow place fresh shingles above the wrong apartment by mistake, so why is a tenant reduced to listening watchfully in the dark at 3:30 a.m., wondering if a corner of his ceiling is about to give way and pour a muddy mixture from the storm outside into his room, ruining the fragile tangle of wires that connects him to e-mail, news updates from The New York Times and The Washington Post, Facebook, Amazon.com, Netflix, and his online banking?

And that, really, is the thing that causes the most worry in 2010. My maternal grandparents were related by marriage to a family in Point Pleasant, West Virginia, who, 80 years ago, left their house on purpose every year in anticipation of the spring flooding of the Ohio and Kanawha Rivers and later returned to clean out the mud and snakes from their house's lower storey. Within the last 10 years, when the neighbor of a former colleague of mine died in rural Tennessee, the hearse could not get to the home of the deceased to pick up the remains because of flooding, and the neighbors had to improvise their own way to get the body to where it needed to be. Two months ago, one of my colleagues at work had neighbors camping out in her house in Nashville and her entire neighborhood was cut off from the rest of the city by flooding. She could not reach her office, but she remained connected to the outside world because her cellphone service included a data plan.

And when I get right down to it, that's why I keep listening for the dripping sound to resume: the risk that a breach of the fragile fabric that keeps out the elements may throw me back to a time when I didn't have the choices that now make me feel deprived if they are closed off. On one of the eight bookcases in my apartment is a two-volume life of St. Paul published in 1858 that I have owned since 1977 but not read, as well as a four-volume edition of Boswell's Life of Johnson that my parents bought for me in a garage sale in 1967; I actually have read through that one, but, like Shakespeare, Homer, and the Bible, it bears rereading throughout one's life. My entire dining room table and my coffee table are piled high with books and magazines, and if the Internet went the way of the Hummer, I would still have plenty to do.

But not the same choice. Now, if I awake to a noise in the night, I can immediately publish a reflection about it that at least two people are likely to read, both living in other states, one of whom I haven't seen in 40 years; or, if I wish, I can enter a message board and explain to someone in England or Australia why Hitler was not a Christian, despite their wish to believe it so. I can listen to Orlando Gibbons via Radio at AOL or click on a scene from Citizen Kane posted to YouTube. If I lose all those choices, even for a few days, it's no more than irksome, but I'd still as soon avoid it. Now, turning to my e-mail in-box, I see the daily headlines from the Times, informing me that governors have expressed grave concerns about immigration, as well as another Times notice assuring me that there is a "boatload of water fun" to be had at Clearwater and St. Petersburg, Florida. Meanwhile, the dripping has resumed 4 feet above my shoulder, steadier now and more insistent.

© Michael Huggins, 2010. All rights reserved.

Sunday, October 4, 2009

A one-way ticket to Mars? You first!

Physicist Lawrence M. Krauss recently suggested, in The New York Times, that one way to cut the costs of a manned mission to Mars was to make it a one-way trip for the astronauts. After all, Krauss reasons, the original American colonists didn't expect to return to England. Krauss claims that, heartless as his proposal may sound (really, you think?), informal polls among scientists encountered in his travels show that the majority would be happy to go to Mars with no thought of return.

Which only goes to show how extraordinarily intelligent people sometimes seem to lack the sense to come in out of the rain. Krauss is at least properly skeptical of the claim that human space exploration is justified because humans can conduct scientific experiments better than robots; that claim is probably false. His real argument is that we need to establish ongoing human life on Mars in case something catastrophic happens to our native planet. Considering the almost insane challenges of the Martian environment for human life, Krauss's purposes would be nearly as well served by a proposal to colonize the submerged parts of the continental shelves of Earth's various land masses.

No one should doubt the invaluable additions to knowledge of properly conducted scientific research on Mars. Its age is similar to that of Earth, and it is the most earth-like planet in our Solar System, though the two planets' respective outcomes have been radically different. Whether liquid water exists far beneath its surface and, even more intriguing, whether biological life exists in some primitive form on Mars are important issues for understanding our own planet's history.

But not the issue here. No human could survive unaided on Mars's surface for 10 seconds. Its atmosphere is as thin as Earth's at an altitude 19 times that of Denver; liquids boil and evaporate very quickly, and a human's blood would boil inside him in seconds. Mars's temperatures are generally worse than those of Antarctica, while its thin atmosphere leaves the surface more vulnerable to the Sun's radiation than the hottest parts of the Sahara. Its atmosphere is 95% carbon dioxide, and it is plagued by storms of red dust lasting months at a time and capable of raising dust clouds 25 miles high.

No, the original American colonists did not expect to return to England, but they did expect to hunt, fish, and farm. Mars is not a candidate for any of those things. Indeed, the very need to protect astronauts from the radiation they are likely to encounter simply getting to Mars in the first place (the shortest possible trip would take 7–8 months) might make their transport craft too heavy to make the trip at all! Krauss acknowledges the issue, supposing a crop of astronauts arriving on Mars with their life expectancies radically cut short by radiation exposure. A promising start for establishing human life on the red planet!

Of course we have, or can develop, the technology to create habitable environments on Mars, perhaps beneath the surface. Let's suppose that, to prepare for such an eventuality, NASA constructs an artificial habitat somewhere on Earth and confines a group of male and female scientists there for some months. There is no TV, radio, or internet, and no real-time communication with the rest of humanity, only data links twice a day, as has been the case with the Mars Rover. One can't go outside without heavy protective equipment, and one may not be able to go outside for months at a time, because of the fierce dust storms, raging at speeds of hundreds of kilometers per hour. Oxygen and water must be manufactured, and attempts must be made to begin cultivating edible plants inside. No health care is available except for what can be provided right there. And these conditions will never change because of the very nature of the environment itself.

I suspect the eventual human result would include murder, insanity, sexual slavery, and rationing of food, water, oxygen, and medical care by some dominant personality and his clique to enforce his will on the rest of the group.

But supposing that didn't happen, and humans somehow learned to adapt and coexist in a civilized way completely inside an artificial environment, forever, Mars has two remaining disadvantages. Since it has so little atmosphere, it is much more vulnerable to meteor strikes than Earth, whose atmosphere burns up much of the debris from space that would otherwise wreak havoc here. Finally, Mars is a great deal smaller than Earth, so its likely future as a human outpost must be quite limited.

And lest we forget, in the light of what we know of evolution, the isolation of two previously compatible groups from the same species generally results in each group eventually developing characteristics so different that they can no longer mate and reproduce with members of the other group. The facts of biology tell us that unless we dispatched additional colonies to Mars at regular intervals to add to its human population, there would eventually come a time when the two groups would be of no further use to each other for propagating common descendants.

We are still too haunted by the ghost of Star Trek, which showed humans boldly going, not only to places where man had never been before, but where he simply can't go, unless we discover usable shortcuts through space-time. Mars, the one planet in our Solar System where humans might have even a remote chance of establishing an outpost, has the disadvantages described above. The closest possibility of another Earth-like planet lies in the vicinity of Alpha Centauri, 4.37 light-years from Earth. Light travels about 5.88 trillion miles in a single year; at our current 18,000-mile-per-hour speed of space travel, it would take some 37,200 years to cover a single light-year. Which reminds me of a joke by Johnny Carson. "The space shuttle is under warranty...120,000 miles or ten seconds." I think the late lamented king of late-night television had more common sense about this issue than our physicist friend Krauss. In the dawning age of robotics, there is no more reason to send humans to Mars or any other inhospitable environment than there is to station some hapless soul 11,000 miles above the Earth's surface on a GPS satellite to make sure motorists here below can continue to find their way.
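For the skeptical reader, the travel-time figure above checks out as a few lines of back-of-the-envelope arithmetic (a sketch only; the light-year length and the 18,000 mph cruising speed are the rounded values used in the essay, not precise astronomical constants):

```python
# Back-of-the-envelope check: years to cover one light-year at
# the speed of present-day crewed spacecraft.

MILES_PER_LIGHT_YEAR = 5.88e12   # roughly 5.88 trillion miles
SPEED_MPH = 18_000               # approximate low-Earth-orbit speed
HOURS_PER_YEAR = 24 * 365.25

years = MILES_PER_LIGHT_YEAR / SPEED_MPH / HOURS_PER_YEAR
print(round(years))  # → 37265, close to the essay's 37,200
```

Multiplying by the 4.37 light-years to Alpha Centauri puts the one-way trip at over 160,000 years, which is the point of the comparison.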

© Michael Huggins, 2009. All rights reserved.

Thursday, October 1, 2009

The buck...well, bounces around a great deal

Anita Tedaldi, military wife and parent of five daughters, who has made a name for herself blogging about motherhood, gave up her adopted 18-month-old son when she realized she just didn't feel all that close to him. She told her story to Lisa Belkin of The New York Times, who also appeared with her when Tedaldi was interviewed on The Today Show. Apparently encouraged by her exposure to the world of journalism to be even-handed, Tedaldi gently informed her audience that the failure to bond "really went both ways." Well, I'm all for holding kids accountable, certainly.

There is the awkward matter of Tedaldi's having outspokenly criticized another adoptive couple, in print, for doing pretty much the same thing just last year, but, as Lincoln once observed, "The dogmas of the quiet past are inadequate to the stormy present." Meanwhile, the U.S. military, which owns the website on which Tedaldi's earlier article appeared, is obligingly treating the matter much as it did the death of Pat Tillman; the text is no longer there.

A couple of years ago, I read the troubling story of a single mother in England who adopted an African girl about the same age as her biological 7-year-old daughter. If her account is to be believed, she did everything she could to welcome the adopted child and blend her into the family, to no avail. Eventually, the adopted girl's hostility, not only toward the mother, but even more so toward her adoptive sister, reached a point at which the mother feared for her biological daughter's safety. With tremendous reluctance and chagrin, she made the decision to give up the adopted child. Perhaps there was nothing else she could do.

I certainly don't wish for little "Baby D," as Tedaldi refers to her adopted son, to grow up in a house where his closest caregiver is continually judging his bonding skills and finding them wanting; he deserves better, and I hope he is placed in an emotionally healthy home. I could even respect Tedaldi if, chastened by her experience, she took time off from blogging about motherhood for a period of reflection. But we must be realistic; book deals and appearances on Oprah wait for no one. Who knows but that one day the little tyke may pen his own book about "Mommy T" and the strange mismatch between her blogging skills and her nurturing abilities.

This week's other poster child for forgiving one's own mistakes and blowing off the stodgy critics is Roman Polanski, on whose behalf over 100 luminaries of the entertainment world have signed a petition demanding his immediate release from custody, following his recent arrest in Switzerland. These include Woody Allen, whose nude photos of Mia Farrow's adopted daughter broke up his long-time partnership with Farrow, and the noted moral philosopher Harvey Weinstein, who can see more clearly than most of us that Polanski was sufficiently punished for his "so-called crime" by a 30-year inability to attend Hollywood parties.

As is well known, Polanski accepted an unchaperoned visit from aspiring 13-year-old model Samantha Gailey at the home of Jack Nicholson (never mind!) in 1977, photographed her nude, plied her with champagne and quaaludes, and then sexually assaulted her, ignoring her repeated protests and requests to leave.

No one but Hollywood libertines is in serious doubt as to the hideous nature of Polanski's actions that night. Yes, I know future Chief Justice John Marshall started courting his future wife when she was 14 and Marshall was 26, but that was in a day when Marshall would have been shot by her outraged father had he so much as kissed her and not followed through shortly after with a trip to the church to make good. And it may be that 15-year-old Nastassia Kinski acted with perfectly free choice upon beginning a sexual liaison with Polanski; frankly, if I had a maniac like Klaus Kinski for a father, I too might find even Polanski's company a desirable alternative.

Polanski's actions with Gailey, in any case, were completely beyond the pale, and he was rightly convicted. The moral issue is clear. What is tangled is the legal issue, an entanglement caused by the egregious misconduct of the late judge Laurence Rittenband, who first approved, and then gave every indication of intending to renege on, a plea bargain supported by the victim's own family. Rittenband seems to have done this, moreover, on the advice of a District Attorney who wasn't even involved in the case, itself an instance of judicial misbehavior. In desperation, Polanski fled the court's jurisdiction and then went abroad, which was another crime added to the one for which he had already been convicted.

If Polanski's celebrity status should not win him special treatment, neither should it have made him the special victim of a judge's personal pique, in violation not only of judicial ethics but of an agreement that the victim and her family had acknowledged was in her best interests. The larger legal issue is whether, having reached a court-approved plea bargain, a defendant for any crime, at any level of wealth or social prominence, should have to wonder if the court will honor its own agreement or decide, on a whim, to suddenly "get tough."

Polanski is apparently an unrepentant reprobate, and one could wish to see him humiliated and abused as his victim was that night all those years ago. But the law should serve justice, not become an instrument of popular revenge. If they wanted his hide, the court should have rejected the plea bargain and insisted on imposing the maximum sentence to begin with. If a foolish, publicity-hungry judge can do this to a celebrity, what might he do to any of us? Polanski's original sentence was for time previously served; to this, a reasonable penalty of additional time should be added for having fled legal jurisdiction.

© Michael Huggins, 2009. All rights reserved.

Monday, September 28, 2009

Global dimming?

How ironic that Josiah Franklin wanted his son Benjamin, the future pioneer of electrical science, to follow Josiah's own trade of soap and candle making, removing his son from school for that purpose when Ben was just ten years of age. As a child, fascinated by the Founding Fathers, I sometimes regretted that I had not lived in the 18th century, but as someone born in the 20th century, I am too used to the conveniences of bright light. A Christmas Eve candlelight service is all very well, but imagine having that kind and degree of light as one's sole illumination all the time. This scene from Kubrick's 1975 film Barry Lyndon gets the look about right. I can understand how Dr. Johnson had to sit so close to his candle for reading that he would singe his wig; what seems nearly incredible is the idea of men and women of that day reading and playing cards by the hour without going nearly blind.

I thought of that when I read the following from Amy Myers Jaffe of The Economist, quoted in the current issue of The Week:

To replace the global energy produced today by fossil fuels, we would need to build 6,020 new nuclear plants across the globe, or to produce 133 times the combined solar, wind, and geothermal energy currently harvested. Barring such a “monumental” transformation, we’re stuck with oil—or with “walking.”

Or candles. The problem is that it takes many candles to equal the illumination of a single bulb, and candles emit more carbon.

It gets worse, and more ironic. The same issue of The Week quotes The New York Times as saying that

To satisfy the exploding worldwide electricity demand caused by flat-screen TVs, game consoles, personal computers, and other gadgets, nations will have to build the equivalent of 560 coal-fired power plants, or 230 nuclear plants, over the next two decades. The average American now owns 25 electronic products.

I read once that if the whole world enjoyed the American standard of living, it would take the resources of three planet Earths to support such consumption. Now imagine the world going dark for the sake of the Xbox, Twitter, and flat-screen TVs!

And speaking of differences between the 18th century and our own, if an educated man of that day could be resurrected in ours and read the following, which opened an auto review that I read this evening, I think he would quickly ask to be reentombed:

Retirees love Cadillac’s flagship DTS, and the CTS goes up against sporty European rivals, but the SRX is taking on the Lexus RX 350 in the crossover SUV market.


© Michael Huggins, 2009. All rights reserved.

Monday, January 19, 2009

If not now...

For years, I never thought I would live to see the day when Leningrad would be called St. Petersburg once more. If my expectations of an African-American President were not quite so dismal as my hopes for the fall of Communism, they were at least projected into an ever-receding future of perhaps 30 to 50 years. The last time I watched a Presidential inauguration on television, in 1961, the Civil Rights Act had not been passed, and the University of Mississippi had not been integrated. Many adults I knew regarded Dr. Martin Luther King, Jr. as a dangerous radical.

I remember that in 1988, George F. Will suggested that the Republican Party nominate Colin Powell for Vice-President and steal a march from the Democrats, but that opportunity was forfeited by both parties. (Senator, you were no Colin Powell!)

Tomorrow's inauguration comes 200 years after the birth of the Great Emancipator, 120 years after the death of a sad and unrepentant Jefferson Davis, 100 years after the founding of the NAACP, about 70 years after FDR nominated Benjamin O. Davis as the first African-American general in the U.S. military (his son, Benjamin O. Davis, Jr., later became the first black Air Force general), and roughly 100 years after Teddy Roosevelt outraged many Southerners by having Booker T. Washington to the White House as his dinner guest. When Roosevelt had visited Memphis, in 1902, he had spoken at Church Auditorium, built several years earlier by millionaire black Memphis businessman Robert Church Sr., since local laws forbade him and his fellow blacks to use city parks and other facilities.

Writing in today's New York Times, Henry Louis Gates Jr. and John Stauffer argue, quite plausibly, that Lincoln himself, a man of his own time, would likely have been horrified by the thought of the government of the United States being entrusted to a black man. I agree. As the article points out, Lincoln casually used such terms as "Sambo," "Cuffee," and "nigger," and addressed Sojourner Truth as "Aunty." On the eve of issuing the Emancipation Proclamation, he invited black leaders to meet with him and discuss the possibility of founding a black republic in Central America to which freed slaves would be urged to emigrate. Like the author of the words "All men are created equal," Lincoln saw no possibility of racial equality as consistent with a stable system of government.

Having said that, Lincoln should be honored, not only for political measures, but for his own efforts to transcend the attitudes of his day and stretch his understanding of the possibilities between whites and blacks, as he did, for instance, in cultivating a personal friendship with his contemporary, the charismatic black spokesman Frederick Douglass. Nor was he alone; even former Confederate General Nathan Bedford Forrest, who had become notorious for the slaughter of black troops at Fort Pillow, attended an Independence Day picnic in Memphis as the invited guest of black organizers in 1875, two years before his death. Admitting privately after the event that he had been quite uncomfortable, the former slave trader addressed the gathering and said that he was ready to offer the hand of friendship and assist the black man in achieving any station in life to which his talents entitled him. For the founder of the Ku Klux Klan to utter such words was like walking a thousand miles, and I doubt that any of us today, having been raised in this more inclusive age, have progressed as far in our own attitudes about race.

© Michael Huggins, 2009. All rights reserved.

Tuesday, December 30, 2008

Hast thou philosophy, shepherd?

It was my father who went to college, but my mother whom I always remember absorbed in a book. Mom did not go to college until she was in her 30s. Dad was intelligent and well-spoken, but for him, the purpose of knowledge was to learn useful things or guide his thoughts in the right paths. For Mom, reading was a key to asking why things were this way instead of another.

That difference appeared again a few years ago when our new Memphis Public Library building was dedicated, and local citizens were outraged to discover that among quotes from famous authors etched into the pavement near the Library's entrance, there were some from authors of whom they didn't approve, including Marx. One outraged citizen wrote to the local newspaper in protest, declaring, with perfect sincerity, that a library was supposed to be "a place of indoctrination."

For all I know, the person who wrote that absurdity held a college degree, though it didn't save him from completely misunderstanding the whole educational enterprise. Indoctrination is instruction in a prescribed set of norms that are not meant to be disputed; training is the impartation of facts, principles, and techniques meant to be mastered by rote, though that mastery may eventually lead to insights over and above the mere body of material that the student originally learned. Education, to be sure, builds on facts—there's not much point in discussing the effects of European discovery of the New World if one doesn't know when Columbus came over—but it is more than that. Education takes facts and teaches students to think. And that is really the problem.

This has nothing to do with whether most people could cultivate contemplative and analytical habits of mind if they wished; it is to reflect, instead, on the fact that the willingness to sift, to compare, to ask "why" and "what if" often causes discomfort not only to others but to the questioner himself. Philosopher James P. Carse was right to comment that "Many people read to have their views confirmed; the educated person reads to be surprised."

It may be that just about anyone could benefit from wrestling, at some point in his life, with the insights of Plato or Shakespeare; the question is whether he should pursue this as a private interest or be forced to pay thousands of dollars to do so as a requirement for obtaining the most ordinary employment. Charles Murray, of the American Enterprise Institute, made this point in an excellent article in The New York Times last Sunday:

My beef is not with liberal education, but with the use of the degree as a job qualification.

For most of the nation’s youths, making the bachelor’s degree a job qualification means demanding a credential that is beyond their reach. It is a truth that politicians and educators cannot bring themselves to say out loud: A large majority of young people do not have the intellectual ability to do genuine college-level work.

If you doubt it, go back and look through your old college textbooks, and then do a little homework on the reading ability of high school seniors. About 10 percent to 20 percent of all 18-year-olds can absorb the material in your old liberal arts textbooks. For engineering and the hard sciences, the percentage is probably not as high as 10....

But I’m not thinking just about students who are not smart enough to deal with college-level material. Many young people who have the intellectual ability to succeed in rigorous liberal arts courses don’t want to. For these students, the distribution requirements of the college degree do not open up new horizons. They are bothersome time-wasters.

A century ago, these students would happily have gone to work after high school. Now they know they need to acquire additional skills, but they want to treat college as vocational training, not as a leisurely journey to well-roundedness.

Lest this seem like another dyspeptic rant on "today's good-for-nothing youngsters," a similar perspective was provided in the June Atlantic by an anonymous professor teaching English 101 and 102 in a "college of last resort" to classes made up mostly of forty-somethings who must complete a degree for job advancement:

Some of their high-school transcripts are newly minted, others decades old. Many of my students have returned to college after some manner of life interregnum: a year or two of post-high-school dissolution, or a large swath of simple middle-class existence, 20 years of the demands of home and family. They work during the day and come to class in the evenings. I teach young men who must amass a certain number of credits before they can become police officers or state troopers, lower-echelon health-care workers who need credits to qualify for raises, and municipal employees who require college-level certification to advance at work.

My students take English 101 and English 102 not because they want to but because they must. Both colleges I teach at require that all students, no matter what their majors or career objectives, pass these two courses. For many of my students, this is difficult. Some of the young guys, the police-officers-to-be, have wonderfully open faces across which play their every passing emotion, and when we start reading “Araby” or “Barn Burning,” their boredom quickly becomes apparent. They fidget; they prop their heads on their arms; they yawn and sometimes appear to grimace in pain, as though they had been tasered. Their eyes implore: How could you do this to me?

The goal of English 101 is to instruct students in the sort of expository writing that theoretically will be required across the curriculum. My students must venture the compare-and-contrast paper, the argument paper, the process-analysis paper (which explains how some action is performed—as a lab report might), and the dreaded research paper, complete with parenthetical citations and a listing of works cited, all in Modern Language Association format. In 102, we read short stories, poetry, and Hamlet, and we take several stabs at the only writing more dreaded than the research paper: the absolutely despised Writing About Literature.

The author relates the heartbreaking story of Mrs. L., a mature student assigned to do a research paper citing both sides of a historical controversy. Not only could she not write a coherent paragraph; she was never really able to understand the nature of the assignment in the first place. This has nothing to do with socio-economic status; I remember an article in The American Scholar some years ago remarking that for a certain sort of 60-something member of the country club class, taking graduate courses was seen as an interesting alternate form of recreation.

I overheard half of a telephone conversation once, in which one of my fellow students tried to reassure her caller that she would give her the help she needed in writing a comparison-and-contrast paper, a concept that the caller seemed unable to grasp. After the call was over, my fellow student chuckled merrily and said "Oh, that Anne! What a character! She just loves education. She has got herself two Master's degrees, and she has come back for more!" And if she continued to pay fees, no doubt the school saw no reason not to collect them.

A college degree has become a sort of über-high school diploma in the minds of many employers, and for no good reason. While I agree that a study of Shakespeare's Julius Caesar is probably one of the best introductions ever to office politics, I see no reason to require a clerical worker to learn it as an indispensable step to promotion, unless she simply wants to, and if she does, more power to her. Meanwhile, I have a step-cousin, a very sharp individual who has contributed computer code to NASA's missions to Mars, who cannot get a permanent position in the private sector for want of a college degree.

Once, it was assumed that a college degree was undertaken only as preparation for the ministry or a teaching career, and I agree with that archaic standard to the extent that everyone who sought it knew exactly what they were after and why. Again, to admit that college is not for everyone has nothing to do with misanthropy or invidious social distinctions. In 1983, 30-year-old Robert Martin was found living near Rossville, Tennessee, barefoot, with half his teeth missing, in a shack with no electricity or running water, with his elderly grandmother. He owned a Bible and a copy of Milton's works, and he knew both very nearly by heart. Taken to Vanderbilt, he amazed the professors with his knowledge. He had a hunger to know, and to think consequentially about what he had learned. I think it's time to leave liberal arts educations to those constituted like Martin and let the rest of the workforce demonstrate their competence through certification exercises that actually have something to do with their occupations. If they discover, at some point, that they have an urge to learn what Chaucer's pilgrims were up to and why, then I hope they find a willing teacher who can make those characters speak once more.

© Michael Huggins, 2008. All rights reserved.

Monday, December 1, 2008

Extraordinarily popular bargains and the madness of crowds

How is it that 27 million people visited the 1893 Chicago World's Fair—716,000 on a single day, the largest peacetime gathering in American history—with no trampling fatalities, but Wal-Mart can't manage a crowd it has been expecting for weeks or protect its staff? Concerning the 1893 exposition, which took place, remember, in the same era as lynchings and gunfights, we learn that

According to the security department report, only 954 arrests were made over the six months of operation, 10 attempts were made to pass counterfeit coins, 408 people were able to get over the fence into the grounds, and only 33 attempts were made to gain admission on fraudulent passes.

Of course, nothing lasts forever, and Time printed a decidedly different account of the closing of Chicago's Century of Progress Exposition in 1934:

In the Avenue of Flags elderly matrons fought like savages for bits of bunting. For their backyard gardens housewives stripped the Horticultural Building of rare plants and flowers, some worth as much as $200 each. Roving bands of youths stormed the booths of concessionaires. A 13-year-old boy was caught by police lugging off two huge bones of a prehistoric monster, to feed to his dog. Recurring showers of bottles from the 64-story Skyride Tower grew so alarming that the elevators were finally stopped. Dancing feet stomped into ruin landscaped lawns. Into Lake Michigan went benches and tables, and when policemen sought to admonish the revelers, they tossed the policemen in, too.

On the night of those fateful events in Chicago, Sam Walton was just 16 and had presumably followed his normal practice of delivering surplus milk from the family cow to his neighbors in Columbia, Missouri that morning, so we can't blame everything on the man from Bentonville, whatever may be said about his company's emerging corporate ethics or labor practices.

But what was management thinking at the Wal-Mart store in Valley Stream, New York, when it opened before dawn on Friday, November 28? (The peculiarly American touch of someone being trampled to death in a mall named Green Acres is something you couldn't make up.) According to The New York Times, Hank Mullany, a Wal-Mart regional vice president, claimed the store had hired additional security guards and erected barricades that morning. Really? Where were they? Why was a temporary employee, Haitian immigrant Jdimytai Damour, delegated to be a human barrier, trying to hold two sliding glass doors secure against a surging crowd of hundreds? Was it because he weighed about 270 pounds?

Four shoppers were also injured in the melee, including a woman eight months pregnant, and Times reporter Peter Goodman was right to call the scene "A Shopping Guernica." It's heartening to read that union representatives are demanding an investigation:

"This incident was avoidable," said Bruce Both, President of United Food and Commercial Workers Union Local 1500. "Where were the safety barriers? Where was security? How did store management not see dangerous numbers of customers barreling down on the store in such an unsafe manner?" asked President Both. "This is not just tragic; it rises to a level of blatant irresponsibility by Wal-Mart. UFCW Local 1500 will demand a full investigation by all levels of Government to ensure both justice for the surviving family members and to ensure the safety of current employees and the general public. This can never be allowed to happen again and those responsible must be held accountable," Both concluded.

Director of Special Projects for Local 1500 Patrick Purcell called Wal-Mart's comments in response to the incident both "cold and heartless." "If the safety of their customers and workers was a top priority, then this never would have happened," Purcell stated. "Wal-mart must step up to the plate and ensure that all those injured, as well as the family of the deceased, be financially compensated for their injuries and their losses. Their words are weak. The community demands action," Purcell concluded.

Sympathetic Wal-Mart employees held a prayer vigil at the shattered front door for their hapless fellow worker. An employee in the store's electronics department offered a different perspective on the circumstances of Damour's death:

"It was crazy—the deals weren't even that good."

© Michael Huggins, 2008. All rights reserved.

Sunday, November 30, 2008

But it doesn't have quite the same ring as "Everett Dirksen"

For those to whom it matters, it seems that Ashlee Simpson and Pete Wentz named their newborn son Bronx Mowgli; by one account, they hoped for a name that would be suitable for either a rock star or a United States Senator, and I certainly agree that we should strive for versatility. Contemplating the question of what kind of adult would give a child such a name should at least add to the baby's eventual happiness, since neurologists have found that increased activity in the left prefrontal cortex is an important contributor to a sense of well-being.

What sensation of personal happiness may have been enjoyed by Neanderthal man is an open question, which, among other reasons, makes the question of cloning them an ethical issue. Mapping their genome is one thing, but inserting them into 21st-century life is another. Though they vocalized in some manner, it is not certain that they had what we would call speech, so at least they would not add to the growing public chatter on cell phones, and since they required over 3,000 calories per day, the fast food chains would gain a new and eager addition to their customer base. The October issue of National Geographic provides informative text and artistic recreations of these hardy folk.

Nicholas Wade examined the topic in The New York Times, along with the question of reviving the Woolly Mammoth. Since cloning a Neanderthal would presumably be done by altering modern human DNA, it raises a human dignity objection from Richard Doerflinger of the United States Conference of Catholic Bishops. An alternate approach, Wade notes, would be to alter the genome of a chimpanzee.

Slate's William Saletan, usually a very acute reasoner who writes with clarity and incisiveness on questions of bioethics and is always eager to show how science must not be waylaid by religion, takes Doerflinger and his bishops to task and, thus, strangely misses the point. The issue of whether a Neanderthal should be cloned from modern human or chimp DNA is made moot by the question of whether we should clone a Neanderthal at all: we shouldn't. Whether or not it would be an offense against modern human dignity, it would compromise the recreated Neanderthal's own dignity.

Primitive or not, a living Neanderthal would be a conscious creature, vaguely aware of his inability to understand or cope with our world and with no immunity to the crowd diseases that have been a major factor in selecting human genes for the past several centuries. Used to being a member of small hunter-gatherer bands that almost never saw other humans for long, he would be placed in a world that was, to him, intolerably overcrowded. Never having developed devices so simple as projectile weapons or the concept of division of labor, he would have little to contribute to his own sense of efficacy except brute strength, endurance, and the pursuit of the very simple life his fellows once knew, little valued in our own day. If his cognitive functions were not of as high an order as ours, it would become really necessary, for the first time in human history, to evolve two tiers of civil rights for two different types of Homo sapiens, based on their respective capabilities. The Neanderthal would become a sort of living zoo exhibit, to be observed by tourists in safari parks, if not exploited for his strength. Study them as topics in biology and paleoanthropology, by all means, but as for the rest, let them rest in peace.

© Michael Huggins, 2008. All rights reserved.

Saturday, November 22, 2008

Yon Cassius hath a lean and hungry look

When I read that Barack Obama was seriously considering Hillary Clinton for Secretary of State, I thought he had lost his mind. Looking at it again, I think it's a shrewd move. Resigning her Senate seat, she can't hinder any of his preferred legislation out of pique, and if she displays the same persona on the world stage that she did in the primaries, she will appear even more plainly for what she is, while Obama will only win sympathy for enduring her.

Michael Hirsh argues that the position of Secretary of State is as much subject to Presidential control as any other and cites three instances in which, rightly or wrongly, Secretaries of State were pushed aside. However, his argument seems to depend on the presence of an Acheson- or Kissinger-like figure in an Administration to take the place of the original appointee, while Obama specifically tries to avoid conditions that make such changes necessary to begin with. I hope that Newsweek's observation is accurate: that Obama is unusually detached and self-aware for a politician. The seven words that even waterboarding could never force from Hillary's lips were also spoken by someone of unusual detachment, who ate locusts and wild honey.

(A friend kindly passes along David Brooks's column from yesterday's New York Times; Brooks reminds us that Clinton, whatever else she may be, is one of a galaxy of daunting talent assembled by the President-elect. The Governor of Alaska, on the other hand, seems to be among those who embrace the creed, "Ye need not any man to teach you," though admittedly, the author of those words was speaking of a spiritual assurance and not knowledge of geopolitics. This week's Time notes that Palin will get a $7 million book deal, and Oliver Stone nominated her for Time's Person of the Year.)

Speaking of detachment, it appears that Obama is about to govern a nation in which, "According to a 2006 study by the Pew Forum on Religion & Public Life, a third of white evangelicals believe the world will end in their lifetimes." Michael Gross analyzed the effect of this strain of thought in The Atlantic a few years ago. One zealous soul runs a web site that he wishes to be known as "the eBay of prophecy," a concept so amazingly oblivious to the context of its own religious origins that one hardly knows where to begin. Those who are less convinced that the dread day is upon us may have been among the admirers lined up at Wolfchase Mall in Memphis yesterday to get the autograph of Thomas Kinkade, "Painter of Light™." It seems that Kinkade's works hang in 1 in 20 American homes, a fact that, in itself, is enough to make one hope that the apocalypse is not so far off after all. Eighty-five years ago, the art of choice for 1 in 4 American homes was Maxfield Parrish's Daybreak. To be sure, Parrish was no Rembrandt, but at least his work reminds one of Alma-Tadema.

I was in 6th grade when Kennedy was shot. The Zapruder film became the horrifying precursor of YouTube. I remember seeing John Jr. salute his father's casket as it passed down Pennsylvania Avenue, though I had forgotten that his mother induced him to do so; I thought I remembered it as spontaneous. A classmate of mine visited Martinique the week after the assassination, and locals were asking him if there would be a coup in the United States.

© Michael Huggins, 2008. All rights reserved.