Friday, March 2, 2012

I'm glad The Atlantic's Conor Friedersdorf included a video of Georgetown law student Sandra Fluke's intended testimony on women's health. Fluke, moved by the plight of a fellow student who needs birth control to protect against ovarian cysts, reports that an outlay of $3,000 for birth control during law school sans insurance represents a hardship for many students. I for one have to question that; this source says that birth control pills cost about $20-$50 per month without insurance, which, even at the high end, comes to roughly $1,800 over three years of law school, well short of $3,000, and the monthly figure is probably less than most people spend in vending machines. Still, let's give these young people the benefit of the doubt for a moment and acknowledge that it may be idle to say that if they can afford Georgetown Law, they can afford birth control: the high tuition and living expenses may be precisely the reason they can't very well afford birth control unless it is covered by their health insurance.
Needless to say, Rush Limbaugh's description of Fluke as a "slut" and his demand that she perform in a sex tape to earn money for contraception is vile beyond description. If there is a fit target for his taunts, it isn't Sandra Fluke or her unfortunate friend.
Still, we do have to deal with what another public figure, closer to the Fluke way of thinking, called "an inconvenient truth," and if this were a court case, someone ought to ask future lawyer Sandra if she really believes that her friend with the ovarian cysts represents the typical seeker of other people's money to pay for her contraception.
It will surprise no one that many college students are more sexually active than you or I. A report released several years ago by the University of Minnesota Boynton Health Service said, among other things, that 72% of college-age respondents reported having been sexually active in the year before the study. Young blood runs hot, and we need not be shocked at this nor object to the very practical consideration that most of these encounters probably should not result in a live birth; people of that age are not necessarily ready to be parents.
And the cost of contraception, unassisted by insurance, may not be so affordable on a student's budget, no matter what future earnings they may expect to pull down from a Wall St. law firm. As to denying coverage to people with medical conditions such as ovarian cysts, on the grounds that the product they use to treat them is also used for purely discretionary behavior, that is monstrously wrongheaded and should not be tolerated in any humane society. It is of a piece with denying terminal patients in hideous pain "too much" pain medication on the grounds of the risk of dependency, which, in their case, is absurd.
No, I do not deny that there are practical considerations on Sandra's side, but having said that, I do think she has played the fool.
When a nicely dressed student at Georgetown Law appears in a video, announces to the public that a certain sum for contraception is an undue burden on her and her friends, and cites, as her only example, someone who suffers from a medical condition, what can she possibly expect to be running through the minds of many of her hearers, and not Limbaugh alone?
Their likely thought is something like this: "Lady, what planet are you from? If college students wish to enjoy their sexual freedom, that is their choice, but then to expect the rest of us to put up the money for it? Are you out of your mind? Are you really this clueless and expecting to make a career in the practice of the law? Puh-lease!"
So here is my question: since a month's supply of condoms, at 50 cents to a dollar apiece, costs roughly as much as the lower end of a month's supply of birth control pills...
...if a male jock testified that he had a very active "social" life, and if he urged insurance companies to pay for condoms on the grounds that many of his fellow athletes were scholarship students from modest backgrounds with little discretionary income, and after all, the provision of condoms to males promoted public health...
...and if a female commentator replied by saying "Buy your own condoms, super stud, or keep it zipped"...
...would we see the same level of outrage?
© Michael Huggins, 2012. All rights reserved.
Tuesday, December 30, 2008
Hast thou philosophy, shepherd?
It was my father who went to college, but my mother whom I always remember absorbed in a book. Mom did not go to college until she was in her 30s. Dad was intelligent and well-spoken, but for him, the purpose of knowledge was to learn useful things or guide his thoughts in the right paths. For mom, reading was a key to asking why things were this way instead of another.
That difference appeared again a few years ago when our new Memphis Public Library building was dedicated, and local citizens were outraged to discover that among quotes from famous authors etched into the pavement near the Library's entrance, there were some from authors of whom they didn't approve, including Marx. One outraged citizen wrote to the local newspaper in protest, declaring, with perfect sincerity, that a library was supposed to be "a place of indoctrination."
For all I know, the person who wrote that absurdity held a college degree, though it didn't save him from completely misunderstanding the whole educational enterprise. Indoctrination is instruction in a prescribed set of norms that are not meant to be disputed; training is the impartation of facts, principles, and techniques meant to be mastered by rote, though that mastery may eventually lead to insights over and above the mere body of material that the student originally learned. Education, to be sure, builds on facts—there's not much point in discussing the effects of European discovery of the New World if one doesn't know when Columbus came over—but it is more than that. Education takes facts and teaches students to think. And that is really the problem.
This has nothing to do with whether most people could cultivate contemplative and analytical habits of mind if they wished; it is to reflect, instead, on the fact that the willingness to sift, to compare, to ask "why" and "what if" often causes discomfort not only to others but to the questioner himself. Philosopher James P. Carse was right to comment that "Many people read to have their views confirmed; the educated person reads to be surprised."
It may be that just about anyone could benefit from wrestling, at some point in his life, with the insights of Plato or Shakespeare; the question is whether he should pursue this as a private interest or be forced to pay thousands of dollars to do so as a requirement for obtaining the most ordinary employment. Charles Murray, of the American Enterprise Institute, made this point in an excellent article in The New York Times last Sunday:
My beef is not with liberal education, but with the use of the degree as a job qualification.
For most of the nation’s youths, making the bachelor’s degree a job qualification means demanding a credential that is beyond their reach. It is a truth that politicians and educators cannot bring themselves to say out loud: A large majority of young people do not have the intellectual ability to do genuine college-level work.
If you doubt it, go back and look through your old college textbooks, and then do a little homework on the reading ability of high school seniors. About 10 percent to 20 percent of all 18-year-olds can absorb the material in your old liberal arts textbooks. For engineering and the hard sciences, the percentage is probably not as high as 10....
But I’m not thinking just about students who are not smart enough to deal with college-level material. Many young people who have the intellectual ability to succeed in rigorous liberal arts courses don’t want to. For these students, the distribution requirements of the college degree do not open up new horizons. They are bothersome time-wasters.
A century ago, these students would happily have gone to work after high school. Now they know they need to acquire additional skills, but they want to treat college as vocational training, not as a leisurely journey to well-roundedness.
Lest this seem like another dyspeptic rant on "today's good-for-nothing youngsters," a similar perspective was provided in the June Atlantic by an anonymous professor teaching English 101 and 102 in a "college of last resort" to classes made up mostly of forty-somethings who must complete a degree for job advancement:
Some of their high-school transcripts are newly minted, others decades old. Many of my students have returned to college after some manner of life interregnum: a year or two of post-high-school dissolution, or a large swath of simple middle-class existence, 20 years of the demands of home and family. They work during the day and come to class in the evenings. I teach young men who must amass a certain number of credits before they can become police officers or state troopers, lower-echelon health-care workers who need credits to qualify for raises, and municipal employees who require college-level certification to advance at work.
My students take English 101 and English 102 not because they want to but because they must. Both colleges I teach at require that all students, no matter what their majors or career objectives, pass these two courses. For many of my students, this is difficult. Some of the young guys, the police-officers-to-be, have wonderfully open faces across which play their every passing emotion, and when we start reading “Araby” or “Barn Burning,” their boredom quickly becomes apparent. They fidget; they prop their heads on their arms; they yawn and sometimes appear to grimace in pain, as though they had been tasered. Their eyes implore: How could you do this to me?
The goal of English 101 is to instruct students in the sort of expository writing that theoretically will be required across the curriculum. My students must venture the compare-and-contrast paper, the argument paper, the process-analysis paper (which explains how some action is performed—as a lab report might), and the dreaded research paper, complete with parenthetical citations and a listing of works cited, all in Modern Language Association format. In 102, we read short stories, poetry, and Hamlet, and we take several stabs at the only writing more dreaded than the research paper: the absolutely despised Writing About Literature.
The author relates the heartbreaking story of Mrs. L., a mature student assigned to do a research paper citing both sides of a historical controversy. Not only could she not write a coherent paragraph; she was never really able to understand the nature of the assignment in the first place. This has nothing to do with socio-economic status; I remember an article in The American Scholar some years ago remarking that for a certain sort of 60-something member of the country club class, taking graduate courses was seen as an interesting alternate form of recreation.
I overheard half of a telephone conversation once, in which one of my fellow students tried to reassure her caller that she would give her the help she needed in writing a comparison-and-contrast paper, a concept that the caller seemed unable to grasp. After the call was over, my fellow student chuckled merrily and said "Oh, that Anne! What a character! She just loves education. She has got herself two Master's degrees, and she has come back for more!" And if she continued to pay fees, no doubt the school saw no reason not to collect them.
A college degree has become a sort of über-high-school diploma in the minds of many employers, and for no good reason. While I agree that a study of Shakespeare's Julius Caesar is probably one of the best introductions ever to office politics, I see no reason to require a clerical worker to learn it as an indispensable step to promotion, unless she simply wants to, and if she does, more power to her. Meanwhile, I have a step-cousin, a very sharp individual who has contributed computer code to NASA's missions to Mars, who cannot get permanent positions in the private sector for want of a college degree.
Once, it was assumed that a college degree was undertaken only as preparation for the ministry or a teaching career, and I agree with that archaic standard to the extent that everyone who sought it knew exactly what they were after and why. Again, to admit that college is not for everyone has nothing to do with misanthropy or invidious social distinctions. In 1983, 30-year-old Robert Martin was found living near Rossville, Tennessee, barefoot, with half his teeth missing, in a shack with no electricity or running water, with his elderly grandmother. He owned a Bible and a copy of Milton's works, and he knew both very nearly by heart. Taken to Vanderbilt, he amazed the professors with his knowledge. He had a hunger to know, and to think consequentially about what he had learned. I think it's time to leave liberal arts educations to those constituted like Martin and let the rest of the workforce demonstrate their competence through certification exercises that actually have something to do with their occupations. If they discover, at some point, that they have an urge to learn what Chaucer's pilgrims were up to and why, then I hope they find a willing teacher who can make those characters speak once more.
© Michael Huggins, 2008. All rights reserved.
Labels: Atlantic, Chaucer, College, Columbus, Education, Indoctrination, James P. Carse, Library, Marx, Memphis, NASA, New York Times, Plato, Shakespeare, Vanderbilt
Monday, December 29, 2008
No bailout bargain
It would be nice if the economy would have a Clive Huggins moment. Clive was my late father, and, having been raised in the Depression, was always fearful of the prospect of paying too much for something; thus, he was often punished by the rule that says you get what you pay for. The Christmas trees he brought home looked like the last survivors of a worldwide drought, while he grumbled that he had overpaid. I, on the other hand, being perfectly willing to pay a premium price where I can afford it (though, on my budget, such willingness is more often a state of mind than an actual monetary transaction) and where I am likely to gain real quality by doing so, sometimes find myself buying perfectly creditable items for absurdly low prices. When that happens—e.g., paying $85 for $250 luggage or $18 for a $120 lounging robe—I call it a Clive Huggins moment. One of the largest Christmas trees I ever had, a tree so large I actually had to block one of the doors of my apartment to find a place to set it up, cost me $2, which was ridiculously low even for 1974.
Jeffrey Garten, writing in Newsweek recently, was convinced that we are trying to nickel-and-dime our way out of the current financial crisis and that nothing but an investment of $1 trillion, or 7% of GDP over the next 2 years, will give the economy the necessary infusion of capital and inspire consumer confidence once more:
The fundamental issue is fear. Despite the colossal problems in the U.S. economy, the dollar continues to strengthen, which just shows that investors fear other markets even more. Billions of dollars are flowing into three-year U.S. Treasury bills, whose interest rate is zero, so investors are merely trying to minimize losses, not make money. Clearly, the governments have not succeeded in restoring calm. Their efforts look improvised, confused and ineffective to the average consumer or investor. The poster child for this problem is the $700 billion Troubled Asset Relief Program in the United States. The bitter congressional debates over the program and its shifting purpose—from buying toxic assets to injecting cash—has left the public feeling that Washington isn't quite sure what it is doing. For many weeks now, the Treasury and the Fed have appeared to be constantly on the brink of unveiling yet another new program, leaving the impression that even they don't believe the current ones will work.
I only wish someone could figure out the ratio of fear to real potential economic damage. Of course the government needs to spend, but how much? The exasperating thing is the extent to which the crisis is driven by emotion. I think of this, for instance, when I read of homeowners who, we are told, are still stuck paying on homes that are now worth less than the balance of the mortgage. So? My Saturn Ion is worth less than what I owe on it, even at the 0% APR financing I obtained from a desperate car dealer last year (another Clive Huggins moment), but that doesn't keep it from getting me to work, and I am still making payments. There is a real economic crisis—no argument there—and an additional amount of hand-wringing that is no doubt making the situation potentially much worse than it needs to be.
Actually, psychology is as interesting in figuring out how we got here as it is in figuring out how to get out of this mess. Henry Blodget, once notorious as a Wall St. tech stock analyst forced out of his occupation after the dot.com meltdown by Eliot Spitzer (never mind), provides a very incisive analysis of why there will always be economic bubbles and why perfectly intelligent, well-intentioned people will miss them until it's too late, in his article "Why Wall Street Always Blows It" in the current Atlantic:
...most bubbles are the product of more than just bad faith, or incompetence, or rank stupidity; the interaction of human psychology with a market economy practically ensures that they will form. In this sense, bubbles are perfectly rational—or at least they’re a rational and unavoidable by-product of capitalism (which, as Winston Churchill might have said, is the worst economic system on the planet except for all the others). Technology and circumstances change, but the human animal doesn’t.
One of the biggest culprits, as Blodget points out, is the recurring belief that "it's different this time" in a way that is supposed to make caution irrelevant:
Those are said to be the most expensive words in the English language, by the way: it’s different this time. You can’t have a bubble without good explanations for why it’s different this time. If everyone knew that this time wasn’t different, the market would stop going up. But the future is always uncertain—and amid uncertainty, all sorts of faith-based theories can flourish, even on Wall Street.
In the 1920s, the “differences” were said to be the miraculous new technologies (phones, cars, planes) that would speed the economy, as well as Prohibition, which was supposed to produce an ultra-efficient, ultra-responsible workforce. (Don’t laugh: one of the most respected economists of the era, Irving Fisher of Yale University, believed that one.) In the tech bubble of the 1990s, the differences were low interest rates, low inflation, a government budget surplus, the Internet revolution, and a Federal Reserve chairman apparently so divinely talented that he had made the business cycle obsolete. In the housing bubble, they were low interest rates, population growth, new mortgage products, a new ownership society, and, of course, the fact that “they aren’t making any more land.”
In hindsight, it’s obvious that all these differences were bogus (they’ve never made any more land—except in Dubai, which now has its own problems). At the time, however, with prices going up every day, things sure seemed different.
In fairness to the thousands of experts who’ve snookered themselves throughout the years, a complicating factor is always at work: the ever-present possibility that it really might have been different. Everything is obvious only after the crash.
The other deadly ingredient in bubbles is that investment professionals won't keep their jobs if they restrict themselves to prudent courses leading to merely reasonable returns; the same competition that fuels a free market also drives each fund manager to chase larger and larger returns for his investors, on peril of a forced retirement. Blodget cites the instance of fund manager Julian Robertson, whose Tiger Management company lost 66% of its assets to withdrawals by disgruntled investors and finally closed its doors because Robertson correctly anticipated the tech stock meltdown and moved his investors' funds elsewhere (where returns were lesser, though on safer ground).
In other words, we don't want the slick mortgage broker, the BMW-driving realtor, or the glib investment pitchman to do the right thing for us but the thing, instead, that will make us feel as well off as our neighbors occupying the houses that they also couldn't afford. Frankly, I wonder if the Dutch had the right idea when they went nuts over tulip bulbs in 1634; the Semper Augustus bulb was commanding prices equal to that of a house on the Amsterdam market, but at least that was a product of nature.
Actually, Jane Bryant Quinn quotes financial advisor Steve Leuthold in a recent column as saying that investors should buy just about anything at this point, on the grounds that it is likely to be a bargain. That was certainly true of the $22.99 box of firelogs I talked an exasperated Kroger manager into selling me for $5.99 yesterday, after it had been mislabeled, but I don't think that's what Leuthold meant. If stocks scare you too much, perhaps you could consider bidding on Jeff Koons's metal "Hanging Heart" sculpture, a colossal piece of kitsch that recently sold for $27 million, which proves that a fool and his money are soon parted. I think my best buy today was ordering Facing Mount Kenya from Amazon.com on the advice of the Kenyan clergyman of the church I have been visiting; an ethnographic study done at the London School of Economics by a young Jomo Kenyatta in the 1930s, it is supposed to be a very insightful analysis of Kikuyu life and culture. I got it for just $1.75.
© Michael Huggins, 2008. All rights reserved.
Tuesday, December 23, 2008
It ain't over 'til it's over
If only scientists would stop learning new things, I could be comfortable in my previous assumptions. For several years, I have accepted as true that we confront the modern age with essentially Stone Age minds and that human evolution has slowed to a vanishingly small pace, shielded as we are from adaptive challenges by technology.
That some people speak and act with Neolithic sensibilities seems indisputable in my experience, but in general, I may be premature in marking an end to human evolution and guilty at least of oversimplification in thinking that we are mostly wired to hunt mastodons. Writing in the December Scientific American, Peter Ward describes the ways in which agriculture and urban life have contributed to changes in the human genome:
...over the past 10,000 years humans have evolved as much as 100 times faster than at any other time since the split of the earliest hominid from the ancestors of modern chimpanzees. [Scientists attribute] the quickening pace to the variety of environments humans moved into and the changes in living conditions brought about by agriculture and cities. It was not farming per se or the changes in the landscape that conversion of wild habitat to tamed fields brought about but the often lethal combination of poor sanitation, novel diet and emerging diseases (from other humans as well as domesticated animals).
As to the Stone Age mind, philosopher David Buller argues in the same issue that we can't really know enough about the environment of our ancestors to describe their psychology with any precision and, that, indeed, the concept of a Stone Age mind does an injustice both to our pre-human past and to our more recent development:
[The] claim that human nature was designed during the Pleistocene, when our ancestors lived as hunter-gatherers, gets it wrong on both ends of the epoch.
Some human psychological mechanisms undoubtedly did emerge during the Pleistocene. But others are holdovers of a more ancient evolutionary past, aspects of our psychology that are shared with some of our primate relatives. Evolutionary neuroscientist Jaak Panksepp of Bowling Green State University has identified seven emotional systems in humans that originated deeper in our evolutionary past than the Pleistocene. The emotional systems that he terms Care, Panic and Play date back to early primate evolutionary history, whereas the systems of Fear, Rage, Seeking and Lust have even earlier, premammalian origins....
The view that “our modern skulls house a Stone Age mind” gets things wrong on the contemporary end of our evolutionary history as well. The idea that we are stuck with a Pleistocene-adapted psychology greatly underestimates the rate at which natural and sexual selection can drive evolutionary change. Recent studies have demonstrated that selection can radically alter the life-history traits of a population in as few as 18 generations (for humans, roughly 450 years).
Of course, such rapid evolution can occur only with significant change in the selection pressures acting on a population. But environmental change since the Pleistocene has unquestionably altered the selection pressures on human psychology. The agricultural and industrial revolutions precipitated fundamental changes in the social structures of human populations, which in turn altered the challenges humans face when acquiring resources, mating, forming alliances or negotiating status hierarchies. Other human activities—ranging from constructing shelter to preserving food, from contraception to organized education—have also consistently altered the selection pressures. Because we have clear examples of post-Pleistocene physiological adaptation to changing environmental demands (such as malaria resistance), we have no reason to doubt similar psychological evolution.
I don't know if the Internet rises to the level of a selection pressure or not (for me, post-divorce encounters with Internet dating sites have acted more as an inducement to celibacy), but some hold high hopes for it. Futurist Raymond Kurzweil believes that human and machine intelligence will eventually merge, perhaps improving both. Other observers believe and hope that the Internet will enable a qualitative advance in communication that results in the merging of all minds into a super-mind, a whole greater than the sum of its parts (an expectation that I think is completely unfounded, for the simple reason that two cars don't make a bus).
Still others are deeply skeptical about the influence of the Net on civilization, and the two sides have lined up in a clear and vigorous debate, as noted by David Brin in his article "Will the Net Help Us Evolve?" in today's Salon:
Some of today's most vaunted tech philosophers are embroiled in a ferocious argument. On one side are those who think the Internet will liberate humanity, in a virtuous cycle of e-volving creativity that may culminate in new and higher forms of citizenship. Meanwhile, their diametrically gloomy critics see a kind of devolution taking hold, as millions are sucked into spirals of distraction, shallowness and homogeneity, gradually surrendering what little claim we had to the term "civilization."
Call it cyber-transcendentalists versus techno-grouches.
Nicholas Carr weighed in for the skeptics with his article "Is Google Making Us Stupid?" in the July Atlantic. Carr laments:
...what the Net seems to be doing is chipping away my capacity for concentration and contemplation. My mind now expects to take in information the way the Net distributes it: in a swiftly moving stream of particles. Once I was a scuba diver in the sea of words. Now I zip along the surface like a guy on a Jet Ski.
I’m not the only one. When I mention my troubles with reading to friends and acquaintances—literary types, most of them—many say they’re having similar experiences. The more they use the Web, the more they have to fight to stay focused on long pieces of writing....Scott Karp, who writes a blog about online media, recently confessed that he has stopped reading books altogether. “I was a lit major in college, and used to be [a] voracious book reader,” he wrote. “What happened?”
....Bruce Friedman, who blogs regularly about the use of computers in medicine, also has described how the Internet has altered his mental habits. “I now have almost totally lost the ability to read and absorb a longish article on the web or in print,” he wrote earlier this year. A pathologist who has long been on the faculty of the University of Michigan Medical School, Friedman elaborated on his comment in a telephone conversation with me. His thinking, he said, has taken on a “staccato” quality, reflecting the way he quickly scans short passages of text from many sources online. “I can’t read War and Peace anymore,” he admitted. “I’ve lost the ability to do that. Even a blog post of more than three or four paragraphs is too much to absorb. I skim it.”
Responding to Carr in the Encyclopedia Britannica Blog, Clay Shirky agrees that
The web presents us with unprecedented abundance. This can lead to interrupt-driven info-snacking, which robs people of the ability to find time to think about just one thing persistently. I also think that these changes are significant enough to motivate us to do something about it. I disagree, however, about what it is we should actually be doing.
Shirky suspects that Carr has misdiagnosed his own pain and only thinks it's about the Net while actually, it is a lament for a literary culture that was vanishing before the computer age even began:
Despite the sweep of the title, it’s focused on a very particular kind of reading, literary reading, as a metonym for a whole way of life. You can see this in Carr’s polling of “literary types,” in his quoting of Wolf and the playwright Richard Foreman, and in the reference to War and Peace, the only work mentioned by name. Now War and Peace isn’t just any piece of writing, of course; it is one of the longest novels in the canon, and symbolizes the height of literary ambition and of readerly devotion.
But here’s the thing: it’s not just Carr’s friend, and it’s not just because of the web—no one reads War and Peace. It’s too long, and not so interesting.
This observation is no less sacrilegious for being true. The reading public has increasingly decided that Tolstoy’s sacred work isn’t actually worth the time it takes to read it, but that process started long before the internet became mainstream. Much of the current concern about the internet, in fact, is a misdirected complaint about television, which displaced books as the essential medium by the 1970s....
And this, I think, is the real anxiety behind the essay: having lost its actual centrality some time ago, the literary world is now losing its normative hold on culture as well. The threat isn’t that people will stop reading War and Peace. That day is long since past. The threat is that people will stop genuflecting to the idea of reading War and Peace.
While agreeing that the need to focus must not be lost, Shirky argues that the challenge with which the Net presents us is not an avalanche of intellectual junk but merely a selection problem analogous to what happened a couple of centuries ago when the printing press produced so many works that one man could no longer hope to be master of all knowledge. The concept of the sage as cathedral-like structure, Shirky says, must give way to the idea of a shopper in a bazaar.
Brin isn't ready to wholeheartedly sign up with either side. He agrees with Carr that the internet tempts some to become part of a dim-witted mob but also hails the same abundance that so delights Shirky and his fellow techno-enthusiasts. His solution is to ensure that there are tools on the web for winnowing the wheat from the chaff:
...what's needed is not the blithe enthusiasm preached by Ray Kurzweil and Clay Shirky. Nor Nicholas Carr's dyspeptic homesickness. What is called for is a clear-eyed, practical look at what's missing from today's Web. Tools that might help turn quasar levels of gushing opinion into something like discourse, so that several billion people can do more than just express a myriad of rumors and shallow impulses, but test, compare and actually reach some conclusions now and then.
But what matters even more is to step back from yet another tiresome dichotomy, between fizzy enthusiasm and testy nostalgia. Earlier phases of the great Enlightenment experiment managed to do this by taking a wider perspective. By taking nothing for granted.
Being temperamentally inclined toward the conservative and curmudgeonly, I ought to give Shirky his due. Folly didn't begin with the Internet, and gullibility existed in the days of the papyrus scroll and before. By itself, the Net can't keep people from reading Milton any more than widespread popular use of the transistor radio in the '50s could destroy interest in Antonio Vivaldi; indeed, Vivaldi enjoyed a revival in that decade, as did Georg Philipp Telemann in the decade after, which was also, of course, the decade of Woodstock. As to reading great works, it was true once, as Samuel Johnson said, that "Classical quotation is the parole of literary men all over the world," but sadly, no longer (I didn't know whether to laugh or cry at the moment in The Sixth Sense when Bruce Willis, playing a Ph.D. in Psychology, had to look up the meaning of De Profundis).
Still, I'm less concerned over whether or not someone can quote Vergil than whether he is inclined to examine questions rigorously on the evidence, and in sufficient detail to verify anything. As to verification, Farhad Manjoo correctly points out in his book True Enough that many people no longer care as much for the truth as their truth, and their truth may be that Obama is a Muslim or that the Pentagon secretly planned 9/11. How is this any worse on the Net than in print 200 years ago? It spreads more quickly, and its credit is aided by the almost superstitious awe in which many people hold technology—for some, the Internet itself is taken as a sort of verification of even foolish claims.
As to reading at length, I have seen the way instant messaging and the BlackBerry® have transformed the workplace; for many, if it can't fit on a BlackBerry screen, it's superfluous. Shirky partly misses the point by focusing specifically on Carr's mention of War and Peace; under the trend to fragmented reading and thinking promoted by the Internet, how many people does he imagine are willing, even today, to read a piece of the length and conceptual complexity of his own essay?
Lord Chesterfield wrote:
A man is fit for neither business nor pleasure, who either cannot, or does not, command and direct his attention to the present object, and, in some degree, banish for that time all other objects from his thoughts....steady and undissipated attention to one object is a sure mark of a superior genius; as hurry, bustle, and agitation are the never-failing symptoms of a weak and frivolous mind.
Were he alive now, Chesterfield might be hired for his skill at making himself liked, but his disapproval of multitasking would make him an odd egg in today's business environment.
A book or essay, whether sitting on my shelf or downloadable online, reminds me of what Robert Maynard Hutchins called "The Great Conversation," a dialogue that has lasted for centuries, on texts that were the result of wrestling with important questions. The Internet, which makes its users impatient to read any one thing for very long, is less like a symposium than a food fight. Just as the democratic character of Wikipedia® made guardians of content more necessary than ever, the openness of the internet, the variety of its distractions, and the brevity of many of its offerings make it necessary for the user to recollect himself and ask if the entertaining site he has discovered conveys the truth of the matter or is the online equivalent of a supermarket tabloid. It is a question that I think fewer and fewer are willing to ask.
© Michael Huggins, 2008. All rights reserved.
Labels: Atlantic, Chesterfield, Evolution, Genome, Internet, Johnson, Kurzweil, Milton, Obama, Pleistocene, Salon, Scientific American, Singularity, Stone Age, Telemann, Vergil, Vivaldi, Woodstock
Friday, December 12, 2008
Justice delayed
Visitors to London who find Buckingham Palace passé can sign up for an after-dark guided tour of the haunts of Jack the Ripper, as reported in Newsweek. The Ripper, whoever he was, was never caught, and one hopes it was not because the London police succumbed to the laissez-faire attitude of police in Frederick, Maryland who did not show up for 3 hours after Sears employees called last week to report a shoplifter, who didn't even try to flee but waited patiently to surrender. Finally, the Sears employees sent him home.
The Decider of Crawford is also about to be sent home, and the question arises (for some, anyway), as to what to do about his Administration's flagrant disregard of the law. I was aghast at Nixon's misdeeds 35 years ago, but if Nixon could be brought back from the dead, I suspect that even he would wonder what country he had landed in. As New York attorney Scott Horton crisply summarizes in the December issue of Harper's (online access by subscription only):
This administration did more than commit crimes. It waged war against the law itself....it also introduced a sweeping surveillance program that was so clearly illegal that virtually the entire senior echelon of the Justice Department threatened to (but did not in fact) tender their resignations over it....And through it all, as if to underscore its contempt for any authority but its own, the administration issued more than a hundred carefully crafted “signing statements” that raised pervasive doubt about whether the president would even accede to bills that he himself had signed into law.
No prior administration has been so systematically or so brazenly lawless....in weighing the enormity of the administration’s transgressions against the realistic prospect of justice, it is possible to determine not only the crime that calls most clearly for prosecution but also the crime that is most likely to be successfully prosecuted. In both cases, that crime is torture.
There can be no doubt that torture is illegal. There is no wartime exception for torture, nor is there an exception for prisoners or “enemy combatants,” nor is there an exception for “enhanced” methods. The authors of the Constitution forbade “cruel and unusual punishment,” the details of that prohibition were made explicit in the Geneva Conventions (“No physical or mental torture, nor any other form of coercion, may be inflicted on prisoners of war to secure from them information of any kind whatever”), and that definition has in turn become subject to U.S. enforcement through the Uniform Code of Military Justice, the U.S. Criminal Code, and several acts of Congress.
Nor can there be any doubt that this administration conspired to commit torture: Waterboarding. Hypothermia. Psychotropic drugs. Sexual humiliation. Secretly transporting prisoners to other countries that use even more brutal techniques. The administration has carefully documented these actions and, in many cases, proudly proclaimed them. The written guidelines for interrogations at Guantánamo Bay, for instance, describe several techniques for degrading and physically debilitating prisoners, including the “forceful removal of detainees’ clothing” and the use of “stress positions.” And in a 2006 radio interview, Dick Cheney said simply that the use of waterboarding to obtain intelligence was a “no-brainer.”
It's interesting that, as the article reminds us, a U.S. Army Captain was court-martialed for using waterboarding on Filipinos in 1902, and waterboarding was among the war crimes for which the death penalty was sought in prosecuting Japanese war criminals after World War II. Washington and Lincoln expressly forbade inhumane treatment of enemy prisoners in their respective wars, but then, they were never inspired by the work of Jack Bauer.
We can't have our cake and eat it too. To pursue terrorists and tyrants because they are inhumanly cruel, and then resort ourselves to cruel methods discredited 100 years ago, all in the name of cleansing the world of cruelty, is a no-brainer, all right, though not in the way the Vice-President meant it. Aside from the monumental stupidity of such a policy, we lose our moral distinction.
Not to mention the practical question of how much useful intelligence such methods actually yield. Mark Bowden's hard-headed look at torture ("The Dark Art of Interrogation" in the October 2003 Atlantic) examines many aspects of the issue; Bowden is actually willing to countenance some uses of what he calls "Torture Lite," but he also tellingly quotes Bill Cowan, a retired Marine lieutenant colonel who conducted interrogations in Vietnam:
I don't see the proof in the pudding. If you had a top leader like Mohammed [notorious terrorist Khalid Sheikh Mohammed] talking, someone who could presumably lay out the whole organization for you, I think we'd be seeing sweeping arrests in several different countries at the same time. Instead what we see is an arrest here, then a few months later an arrest there.
Montaigne, who lived through an especially brutal period in French history that saw religious wars, the St. Bartholomew's Day Massacre, and the assassination of a French King, wrote this about torture:
The putting men to the rack is a dangerous invention, and seems to be rather a trial of patience than of truth. Both he who has the fortitude to endure it conceals the truth, and he who has not....when all is done, 'tis, in plain truth, a trial full of uncertainty and danger: what would not a man say, what would not a man do, to avoid so intolerable torments?
But to return to Horton's question in Harper's, what should be done about Bush, Rumsfeld, and Cheney? I wonder, after viewing Errol Morris's film Standard Operating Procedure, about Abu Ghraib, whether we shouldn't require the three stalwarts, along with Alberto Gonzales and John Yoo, to endure for a weekend what was done for months at Abu Ghraib—not as cruel and unusual punishment, of course, but only as an exercise in fact-finding. Perhaps, like boys inured to the sadism said to be practiced in upper-class British schools, they will emerge unrepentant, and that, in itself, would constitute as much truth as is likely to be discovered by any "Truth Commission," no matter how well managed.
© Michael Huggins, 2008. All rights reserved.
Sunday, December 7, 2008
Ironies of history
The 67th anniversary of Pearl Harbor sees the appointment of retired General Eric Shinseki, the first Japanese-American to achieve four-star rank in our military, to the Cabinet-level post of Secretary of Veterans Affairs. It's a fitting benchmark of progress since the savage conflict of more than 60 years ago (and some in Japan, where denial of wartime atrocities stands in odd contrast to that country's progress in other areas, would do well to learn from it); born on the Hawaiian island of Kauai 11 months after the Japanese attack, Shinseki is a West Point graduate and decorated Vietnam veteran and was the 34th Chief of Staff of the U.S. Army.
Of course, Shinseki is best known for his warning to Donald Rumsfeld and Paul Wolfowitz that their plan to fight a cut-rate war in Iraq would not work, a warning that they derided at the time but that has been vindicated by subsequent events, as James Fallows reminds us today in The Atlantic. As someone known for his care in preparing estimates, Shinseki should provide a welcome change in the handling of veterans' affairs—particularly in advocating for their health care needs, strangely "misunderestimated" by the outgoing Administration.
On that Sunday morning in 1941, 17-year-old Joe Riley was working as a gas station attendant in Memphis when he heard the news of the Pearl Harbor attack on a customer's car radio. Riley, who later served in combat in Europe and then went on to teach literature at the University of Memphis, rushed inside and blurted out the news to his employer, who scoffed at him and told him to stop listening to Orson Welles. When young Riley was tending cars, a Japanese Garden stood in Memphis's historic Overton Park; angry citizens tore it down the day after the attack. Now, and for some years past, a Japanese "Garden of Tranquility" has graced the Memphis Botanical Gardens.
Across Park Avenue from the Botanical Gardens is one of my favorite Memphis sights, the Dixon Gallery and Gardens, set up under the will of the former owners of the 1938 Neo-Georgian mansion and 17-acre grounds as a setting in which to display their art to the public. Opening in 1976, the Dixon has built a fine permanent collection and hosted outstanding exhibits, including an exhibit of the work of Rodin in 1988 and art treasures of Chatsworth in 2003. A 2006 exhibit on the work of Margaret Bourke-White included a lecture on her collaboration with Erskine Caldwell, resulting in their 1937 book about sharecroppers, You Have Seen Their Faces.
While Bourke-White photographed sharecroppers in the South, Dorothea Lange documented the plight of victims of the Depression in the West for the Farm Security Administration; her work includes the iconic photograph of Florence Owens Thompson, then a widow with small children, whose daughter, Katherine McIntosh, pictured in the photo as a small child and now 77, recalled those years in an interview last week with CNN. One of Lange's photos of Thompson is the first in this collection of images from 65–75 years ago in the United States, assembled by a photography teacher at a community college in North Carolina and forwarded to me by my aunt this afternoon. These were made before I was born, but I have met people who looked like this; indeed, the photo of small children living in company housing in a Pennsylvania mining town reminds me of what I know of my mother's family when she and her five sisters were children in West Virginia.
© Michael Huggins, 2008. All rights reserved.
Saturday, December 6, 2008
Wanted, dead or alive
Watching Angel Heart once more, for the first time since I saw it in the theatre 21 years ago, I am struck by how completely Robert De Niro dominates his few scenes, saying almost nothing and quietly indifferent to Mickey Rourke's insouciance. It reminds me of what someone, perhaps Capote, once wrote after seeing Brando as Stanley Kowalski:
I can't explain how Brando, wordlessly, did what he did, but he had found a way, no doubt instinctively, to master a paradox—he had implicitly threatened us and then given us pardon. Here was Napoleon, here was Caesar, here was Roosevelt. Brando had not asked the members of the audience merely to love him; that is only charm. He had made them wish that he would deign to love them. That is a star. That is power, no different in its essence than the power that can lead nations.
One sees again why De Niro had been chosen to play the young Vito Corleone at age 31; indeed, the café scene with Rourke is rather too close a reference to De Niro's own scene with Gaston Moschin's Don Fanucci in Godfather II (it also reflects Rourke's giving Burt Young his comeuppance in The Pope of Greenwich Village). Of course, Rourke had already shown that he, too, knew how to steal a scene, in his cameo as Teddy the arsonist in Body Heat. Watching him warn William Hurt not to go down the path of ruin, I was convinced that I was seeing, not an actor but a real ex-convict whom they had cast for authenticity.
De Niro's Louis Cyphre of course also calls to mind the same sort of laconic and disdainful character in literature: for example, du Maurier's Maxim de Winter (I was fascinated by Jeremy Brett's 1979 performance in the role and am sorry that it is not available on DVD) and George Eliot's Mallinger Grandcourt, the villain of her novel Daniel Deronda. For that matter, that character type provided a brief road to stardom for Jonathan Frid and David Selby in the late-sixties Gothic soap Dark Shadows.
The bad boy nature of the character continues to grip the imagination (though Bret Harte poked witty fun at it in Miss Mix, his short parody of Jane Eyre). Its latest incarnation is Edward, the undead hero of Stephenie Meyer's Twilight series of novels. Caitlin Flanagan analyzes its preternatural appeal to thousands of teen girls, including her own pre-teen daughter, in the current Atlantic; Flanagan notes that these novels, which are apparently flying off the shelves faster than awakened bats, feature teen girls who don't text or use MySpace and a teen boy who loves a girl so much that he won't sleep with her, circumstances that, in today's culture, make one wonder if the characters are really in the next life already and simply don't realize it.
And speaking of the next life, RIP wealthy but unfortunate Martha von Bülow, dead today at 76 after lingering in a 28-year coma. It was a strange irony that a minor player in this real-life Gothic soap opera and tragedy was Alexandra Isles, who had appeared in Dark Shadows and was later Claus von Bülow's lover. For a fascinating portrayal of the rather strange character that is von Bülow, see Jeremy Irons's performance in Barbet Schroeder's 1990 film, Reversal of Fortune.
© Michael Huggins, 2008. All rights reserved.
Labels: Angel Heart, Atlantic, Body Heat, Brando, Caitlin Flanagan, De Niro, Mickey Rourke, William Hurt
Friday, December 5, 2008
The next voice you hear
Some of my fellow Republicans seem to be in a race to make the GOP deserve the name of "Know-Nothings." First, the Governor of Alaska was taken in by radio pranksters who convinced her that she was talking to the President of France, and now, Florida Congresswoman Ileana Ros-Lehtinen has hung up on Barack Obama, twice, refusing to believe it was he. Had she thought to ask him a difficult question, she could have told at once: intelligent as he is, he can't stop himself from blurting out "Well—" or "I tell you what—" before answering, when he anticipates that what he says may not be liked. From the first Presidential debate, I became convinced that Obama needs to hire a debate coach who will fire off a flare pistol whenever he does that. He has things to say, and he needs to say them, in the face of likely disagreement, without awkwardness or apology.
The President-elect regularly heads for the basketball court to work out and clear his mind; perhaps he should take up chess as well, since the rapidly emerging crises will force anyone to think several moves ahead. As Edward Tenner writes in the current Atlantic:
ChessBase, introduced for Atari in 1987, is now a compendium of 3.75 million games reaching back more than five centuries. Compiling statistics, including the results from games just downloaded from the Web, it also shows percentages of games won after various alternative moves.
....Knowing thine adversary has never been easier. Even the victorious defending champion Viswanathan Anand has said he can’t afford to have a favorite opening. Under pressure because of efficient scrutiny through databases and analysis engines like Fritz (another popular high-level software program that works out new moves), top players must prepare more variations than ever.
That sounds like an excellent analogy for what the new Administration faces, in both economics and foreign affairs. Of course politics has always resembled chess, but the rapid spread and relentless retention of information, forcing even the masters to vary their favorite strategies, seems peculiar to our own day.
Failure to think several moves ahead seems to have been a major contributor to the worst airline disaster of all time, the crash of KLM flight 4805 with Pan-Am 1736 at Tenerife, Canary Islands, in March, 1977. Airline pilot Patrick Smith's gripping account of the tragedy, appearing in Salon last year to mark the 30th anniversary of the event, describes not only its senseless horror but the irony that the individual more responsible than any other for the crash, KLM pilot Jacob van Zanten, was distinguished in the aviation world as a safety expert—indeed, on first hearing of the crash on the radio, KLM authorities tried to reach van Zanten, in the hopes that he could lead the investigation!
Van Zanten's last recorded words were "We gaan" (let's go), a phrase that, in the blind obstinacy in which it was spoken, might be a fitting epitaph for the Bush Presidency.
For memories of better times in aviation, see this fascinating gallery of early photos from Dayton and Kitty Hawk. I also like the photos of Wilbur and Orville Wright, both of whom look as if nothing could get past them. (Years ago, when saying her evening prayers, my daughter said, "Dear God, thank you for inventing the Wright brothers so we could fly to Granny's house for Christmas.") A quote from Wilbur is equally applicable to aviation and statecraft:
It is possible to fly without motors, but not without knowledge and skill.
© Michael Huggins, 2008. All rights reserved.
Saturday, November 22, 2008
Yon Cassius hath a lean and hungry look
When I read that Barack Obama was seriously considering Hillary Clinton for Secretary of State, I thought he had lost his mind. Looking at it again, I think it's a shrewd move. Resigning her Senate seat, she can't hinder any of his preferred legislation out of pique, and if she displays the same persona on the world stage that she did in the primaries, she will appear even more plainly for what she is, while Obama will only win sympathy for enduring her.
Michael Hirsh argues that the position of Secretary of State is as much subject to Presidential control as any other and cites three instances in which, rightly or wrongly, Secretaries of State were pushed aside. However, his argument seems to depend on the presence of an Acheson- or Kissinger-like figure in an Administration to take the place of the original appointee, while Obama specifically tries to avoid conditions that make such changes necessary to begin with. I hope that Newsweek's observation is accurate: that Obama is unusually detached and self-aware for a politician. The seven words that even waterboarding could never force from Hillary's lips were also spoken by someone of unusual detachment, who ate locusts and wild honey.
(A friend kindly passes along David Brooks's column from yesterday's New York Times; Brooks reminds us that Clinton, whatever else she may be, is one of a galaxy of daunting talent assembled by the President-elect. The Governor of Alaska, on the other hand, seems to be among those who embrace the creed, "Ye need not any man to teach you," though admittedly, the author of those words was speaking of a spiritual assurance and not knowledge of geopolitics. This week's Time notes that Palin will get a $7 million book deal, and Oliver Stone nominated her for Time's Person of the Year.)
Speaking of detachment, it appears that Obama is about to govern a nation in which, "According to a 2006 study by the Pew Forum on Religion & Public Life, a third of white evangelicals believe the world will end in their lifetimes." (Michael Gross analyzed the effect of this strain of thought in The Atlantic a few years ago.) One zealous soul runs a web site that he wishes to be known as "the eBay of prophecy," a concept so amazingly oblivious to the context of its own religious origins that one hardly knows where to begin. Those who are less convinced that the dread day is upon us may have been among the admirers lined up at Wolfchase Mall in Memphis yesterday to get the autograph of Thomas Kinkade, "Painter of Light™." It seems that Kinkade's works hang in 1 in 20 American homes, a fact that, in itself, is enough to make one hope that the apocalypse is not so far off after all. Eighty-five years ago, the art of choice for 1 in 4 American homes was Maxfield Parrish's Daybreak. To be sure, Parrish was no Rembrandt, but at least his work reminds one of Alma-Tadema.
I was in 6th grade when Kennedy was shot. The Zapruder film became the horrifying precursor of YouTube. I remember seeing John Jr. salute his father's casket as it passed down Pennsylvania Avenue, though I had forgotten that his mother induced him to do so; I thought I remembered it as spontaneous. A classmate of mine visited Martinique the week after the assassination, and locals were asking him if there would be a coup in the United States.
© Michael Huggins, 2008. All rights reserved.