
Tuesday, March 9, 2010

Bitten by bytes

It appalls but does not completely surprise me that a 3-month-old Korean girl slowly starved to death while her feckless parents haunted a nearby internet café playing computer games; the horrible irony is that the game that addicted them was a simulation in which they raised a virtual child online.

The article from the Guardian Online cited yet other instances of the same social evil:

A 22-year-old Korean man was charged last month with murdering his mother because she nagged him for spending too much time playing games. After killing her the man went to a nearby internet cafe and continued with his game, said officials. In 2005 a young man collapsed in an internet cafe in the city of Taegu after playing the game StarCraft almost continuously for 50 hours. He went into cardiac arrest and died at a local hospital.

Admittedly, distraction from the obligations of the immediate was not born with the Internet. In Gulliver's Travels, Swift satirized absent-minded thinkers who needed minders to follow them around through the day and periodically tap them on the shoulder to remind them where they were. A guilt-ridden Mark Twain confessed in his Autobiography that his oldest son's death was his fault; sunk deep in thought as he took a carriage ride one winter day with the toddler, Clemens did not notice that the blanket had slipped off the boy's bare legs; his son caught a chill and died shortly after.

Perhaps the difference with the Internet is that it is interactive and that there is an immediate payoff; this, plus the distraction from tedium, must be among the reasons that people text while driving. As long as people are obsessed with the world online, they could do worse than to spend their time addressing one of the next great issues in national security: the ease with which unmanned aerial vehicles (UAVs), or drones, can be assembled, chillingly detailed last week in Newsweek by P.W. Singer:

At least 40 other countries—from Belarus and Georgia to India, Pakistan, and Russia—have begun to build, buy, and deploy unmanned aerial vehicles, or UAVs, showcasing their efforts at international weapons expos ranging from the premier Paris Air Show to smaller events in Singapore and Bahrain. In the last six months alone, Iran has begun production on a pair of weapons-ready surveillance drones, while China has debuted the Pterodactyl and Sour Dragon, rivals to America's Predator and Global Hawk. All told, two thirds of worldwide investment in unmanned planes in 2010 will be spent by countries other than the United States.

When we invaded Iraq, I explained to my worried son, then 14, why Saddam couldn't send planes to bomb us as we could bomb Baghdad. Times are changing: Singer's article mentions that a 77-year-old blind man in Canada designed a drone that flew across the Atlantic to Ireland. These home-made gadgets actually gain from being less advanced than the machinery of our current defenses:

Smaller UAVs' cool, battery-powered engines make them difficult to hit with conventional heat-seeking missiles; Patriot missiles can take out UAVs, but at $3 million apiece such protection comes at a very steep price. Even seemingly unsophisticated drones can have a tactical advantage: Hizbullah's primitive planes flew so slowly that Israeli F-16s stalled out trying to decelerate enough to shoot them down.

According to a robotics expert cited in the article, an amateur could build a machine for less than $50,000 that could shut down Manhattan. Actually, our own government nearly achieved that when some nitwit let Air Force One fly over the city for a photo opportunity last year, panicking thousands.

Getting back to UAVs, the "Popular Mechanics" aspect isn't the only problem; even worse, it seems that overlooked and easily exploitable security flaws aren't limited to the Giant of Redmond:

More recently, The Wall Street Journal reported, the U.S. ignored a dangerous flaw in its UAV technology that allowed Iraqi insurgents to tap into the planes' video feeds using $30 software purchased over the Internet.

Until this Terminator-like future arrives, one can still take refuge in the quiet pleasures of an art museum (though the guard at the Phillips Collection in Washington nearly assaulted me last year when my flash went off as I photographed Renoir's Luncheon of the Boating Party), but according to Newsweek staffer Jennie Yabroff, art appreciation can have its own hazards:

Stendhal syndrome isn't included in the draft version of the new Diagnostic and Statistical Manual of Mental Disorders, released last month, but with proposed additions including "apathy syndrome" and Internet addiction, it's probably only a matter of time. The affliction takes its name from the 19th-century French writer, who was overcome after visiting Florence's Basilica di Santa Croce. In 1989 an Italian psychiatrist named Graziella Magherini published La Sindrome di Stendhal, describing more than 100 tourists who suffered dizziness and heart palpitations (some requiring hospitalization) after seeing the Florentine sights. According to Magherini, great art can make you sick.

Yabroff cites Stendhal's own account of the experience that caused Magherini's diagnosis:

Stendhal visited Florence in 1817: maybe he was suffering Grand Tour pressure to have a properly edifying travel experience. But what actually happened? He writes, "On leaving the Santa Croce church, I felt a pulsating in my heart. Life was draining out of me, while I walked fearing a fall."

I doubt that any age has ever equalled ours for discovering previously unknown disorders and tagging them with clinical names, but I think there may be something to this. What happens if a work of art really grips you? If it is sufficiently powerful, it may affect the viewer, on a smaller scale, like the feeling described in Sylvia Plath's poem, "Mystic":

Once one has seen God, what is the remedy?
Once one has been seized up

Without a part left over,
Not a toe, not a finger, and used,
Used utterly, in the sun’s conflagrations, the stains
That lengthen from ancient cathedrals
What is the remedy?

How many transformative experiences can one endure in a single day? As my son wisely observed after we had toured the Isabella Stewart Gardner Museum in Boston, "I think I'm all museumed out for now."

© Michael Huggins, 2010. All rights reserved.

Tuesday, December 23, 2008

It ain't over 'til it's over

If only scientists would stop learning new things, I could be comfortable in my previous assumptions. For several years, I have accepted as true that we confront the modern age with essentially Stone Age minds and that human evolution has slowed to a vanishingly small rate, shielded from adaptive challenges by technology.

That some people speak and act with Neolithic sensibilities seems indisputable in my experience, but in general, I may be premature in marking an end to human evolution and guilty at least of oversimplification in thinking that we are mostly wired to hunt mastodons. Writing in the December Scientific American, Peter Ward describes the ways in which agriculture and urban life have contributed to changes in the human genome:

...over the past 10,000 years humans have evolved as much as 100 times faster than at any other time since the split of the earliest hominid from the ancestors of modern chimpanzees. [Scientists attribute] the quickening pace to the variety of environments humans moved into and the changes in living conditions brought about by agriculture and cities. It was not farming per se or the changes in the landscape that conversion of wild habitat to tamed fields brought about but the often lethal combination of poor sanitation, novel diet and emerging diseases (from other humans as well as domesticated animals).

As to the Stone Age mind, philosopher David Buller argues in the same issue that we can't really know enough about the environment of our ancestors to describe their psychology with any precision and that, indeed, the concept of a Stone Age mind does an injustice both to our pre-human past and to our more recent development:

[The] claim that human nature was designed during the Pleistocene, when our ancestors lived as hunter-gatherers, gets it wrong on both ends of the epoch.

Some human psychological mechanisms undoubtedly did emerge during the Pleistocene. But others are holdovers of a more ancient evolutionary past, aspects of our psychology that are shared with some of our primate relatives. Evolutionary neuroscientist Jaak Panksepp of Bowling Green State University has identified seven emotional systems in humans that originated deeper in our evolutionary past than the Pleistocene. The emotional systems that he terms Care, Panic and Play date back to early primate evolutionary history, whereas the systems of Fear, Rage, Seeking and Lust have even earlier, premammalian origins....

The view that “our modern skulls house a Stone Age mind” gets things wrong on the contemporary end of our evolutionary history as well. The idea that we are stuck with a Pleistocene-adapted psychology greatly underestimates the rate at which natural and sexual selection can drive evolutionary change. Recent studies have demonstrated that selection can radically alter the life-history traits of a population in as few as 18 generations (for humans, roughly 450 years).

Of course, such rapid evolution can occur only with significant change in the selection pressures acting on a population. But environmental change since the Pleistocene has unquestionably altered the selection pressures on human psychology. The agricultural and industrial revolutions precipitated fundamental changes in the social structures of human populations, which in turn altered the challenges humans face when acquiring resources, mating, forming alliances or negotiating status hierarchies. Other human activities—ranging from constructing shelter to preserving food, from contraception to organized education—have also consistently altered the selection pressures. Because we have clear examples of post-Pleistocene physiological adaptation to changing environmental demands (such as malaria resistance), we have no reason to doubt similar psychological evolution.

I don't know if the Internet rises to the level of a selection pressure or not (for me, post-divorce encounters with Internet dating sites have acted more as an inducement to celibacy), but some hold high hopes for it. Futurist Raymond Kurzweil believes that human and machine intelligence will eventually merge, perhaps improving both. Other observers believe and hope that the Internet will enable a qualitative advance in communication that results in the merging of all minds into a super-mind, a whole greater than the sum of its parts (an expectation that I think is completely unfounded, for the simple reason that two cars don't make a bus).

Still others are deeply skeptical about the influence of the Net on civilization, and the two sides have lined up in a clear and vigorous debate, as noted by David Brin in his article "Will the Net Help Us Evolve?" in today's Salon:

Some of today's most vaunted tech philosophers are embroiled in a ferocious argument. On one side are those who think the Internet will liberate humanity, in a virtuous cycle of e-volving creativity that may culminate in new and higher forms of citizenship. Meanwhile, their diametrically gloomy critics see a kind of devolution taking hold, as millions are sucked into spirals of distraction, shallowness and homogeneity, gradually surrendering what little claim we had to the term "civilization."

Call it cyber-transcendentalists versus techno-grouches.

Nicholas Carr weighed in for the skeptics with his article "Is Google Making Us Stupid?" in the July Atlantic. Carr laments:

...what the Net seems to be doing is chipping away my capacity for concentration and contemplation. My mind now expects to take in information the way the Net distributes it: in a swiftly moving stream of particles. Once I was a scuba diver in the sea of words. Now I zip along the surface like a guy on a Jet Ski.

I’m not the only one. When I mention my troubles with reading to friends and acquaintances—literary types, most of them—many say they’re having similar experiences. The more they use the Web, the more they have to fight to stay focused on long pieces of writing....Scott Karp, who writes a blog about online media, recently confessed that he has stopped reading books altogether. “I was a lit major in college, and used to be [a] voracious book reader,” he wrote. “What happened?”

....Bruce Friedman, who blogs regularly about the use of computers in medicine, also has described how the Internet has altered his mental habits. “I now have almost totally lost the ability to read and absorb a longish article on the web or in print,” he wrote earlier this year. A pathologist who has long been on the faculty of the University of Michigan Medical School, Friedman elaborated on his comment in a telephone conversation with me. His thinking, he said, has taken on a “staccato” quality, reflecting the way he quickly scans short passages of text from many sources online. “I can’t read War and Peace anymore,” he admitted. “I’ve lost the ability to do that. Even a blog post of more than three or four paragraphs is too much to absorb. I skim it.”

Responding to Carr in the Encyclopedia Britannica Blog, Clay Shirky agrees that

The web presents us with unprecedented abundance. This can lead to interrupt-driven info-snacking, which robs people of the ability to find time to think about just one thing persistently. I also think that these changes are significant enough to motivate us to do something about it. I disagree, however, about what it is we should actually be doing.

Shirky suspects that Carr has misdiagnosed his own pain and only thinks it's about the Net, when it is actually a lament for a literary culture that was vanishing before the computer age even began:

Despite the sweep of the title, it’s focused on a very particular kind of reading, literary reading, as a metonym for a whole way of life. You can see this in Carr’s polling of “literary types,” in his quoting of Wolf and the playwright Richard Foreman, and in the reference to War and Peace, the only work mentioned by name. Now War and Peace isn’t just any piece of writing, of course; it is one of the longest novels in the canon, and symbolizes the height of literary ambition and of readerly devotion.

But here’s the thing: it’s not just Carr’s friend, and it’s not just because of the web—no one reads War and Peace. It’s too long, and not so interesting.

This observation is no less sacrilegious for being true. The reading public has increasingly decided that Tolstoy’s sacred work isn’t actually worth the time it takes to read it, but that process started long before the internet became mainstream. Much of the current concern about the internet, in fact, is a misdirected complaint about television, which displaced books as the essential medium by the 1970s....

And this, I think, is the real anxiety behind the essay: having lost its actual centrality some time ago, the literary world is now losing its normative hold on culture as well. The threat isn’t that people will stop reading War and Peace. That day is long since past. The threat is that people will stop genuflecting to the idea of reading War and Peace.

While agreeing that the need to focus must not be lost, Shirky argues that the challenge with which the Net presents us is not an avalanche of intellectual junk but merely a selection problem analogous to what happened a couple of centuries ago when the printing press produced so many works that one man could no longer hope to be master of all knowledge. The concept of the sage as cathedral-like structure, Shirky says, must give way to the idea of a shopper in a bazaar.

Brin isn't ready to wholeheartedly sign up with either side. He agrees with Carr that the internet tempts some to become part of a dim-witted mob but also hails the same abundance that so delights Shirky and his fellow techno-enthusiasts. His solution is to ensure that there are tools on the web for winnowing the wheat from the chaff:

...what's needed is not the blithe enthusiasm preached by Ray Kurzweil and Clay Shirky. Nor Nicholas Carr's dyspeptic homesickness. What is called for is a clear-eyed, practical look at what's missing from today's Web. Tools that might help turn quasar levels of gushing opinion into something like discourse, so that several billion people can do more than just express a myriad of rumors and shallow impulses, but test, compare and actually reach some conclusions now and then.

But what matters even more is to step back from yet another tiresome dichotomy, between fizzy enthusiasm and testy nostalgia. Earlier phases of the great Enlightenment experiment managed to do this by taking a wider perspective. By taking nothing for granted.

Being temperamentally inclined toward the conservative and curmudgeonly, I ought to give Shirky his due. Folly didn't begin with the Internet, and gullibility existed in the days of the papyrus scroll and before. By itself, the Net can't keep people from reading Milton any more than widespread popular use of the transistor radio in the '50s could destroy interest in Antonio Vivaldi; indeed, Vivaldi enjoyed a revival in that decade, as did Georg Philipp Telemann in the decade after, which was also, of course, the decade of Woodstock. As to reading great works, it was true once, as Samuel Johnson said, that "Classical quotation is the parole of literary men all over the world," but sadly, no longer (I didn't know whether to laugh or cry at the moment in The Sixth Sense when Bruce Willis, playing a Ph.D. in Psychology, had to look up the meaning of De Profundis).

Still, I'm less concerned over whether or not someone can quote Vergil than whether he is inclined to examine questions rigorously on the evidence, and in sufficient detail to verify anything. As to verification, Farhad Manjoo correctly points out in his book True Enough that many people no longer care as much for the truth as their truth, and their truth may be that Obama is a Muslim or that the Pentagon secretly planned 9/11. How is this any worse on the Net than in print 200 years ago? It spreads more quickly, and its credit is aided by the almost superstitious awe in which many people hold technology—for some, the Internet itself is taken as a sort of verification of even foolish claims.

As to reading at length, I have seen the way instant messaging and the BlackBerry® have transformed the workplace; for many, if it can't fit on a BlackBerry screen, it's superfluous. Shirky partly misses the point by focusing specifically on Carr's mention of War and Peace; under the trend to fragmented reading and thinking promoted by the Internet, how many people does he imagine are willing, even today, to read a piece of the length and conceptual complexity of his own essay?

Lord Chesterfield wrote:

A man is fit for neither business nor pleasure, who either cannot, or does not, command and direct his attention to the present object, and, in some degree, banish for that time all other objects from his thoughts....steady and undissipated attention to one object is a sure mark of a superior genius; as hurry, bustle, and agitation are the never-failing symptoms of a weak and frivolous mind.

Were he alive now, Chesterfield might be hired for his skill at making himself liked, but his disapproval of multitasking would make him an odd egg in today's business environment.

A book or essay, whether sitting on my shelf or downloadable online, reminds me of what Robert Maynard Hutchins called "The Great Conversation," a dialogue that has lasted for centuries, on texts that were the result of wrestling with important questions. The Internet, which makes its users impatient to read any one thing for very long, is less like a symposium than a food fight. Just as the democratic character of Wikipedia® made guardians of content more necessary than ever, the openness of the internet, the variety of its distractions, and the brevity of many of its offerings make it necessary for the user to recollect himself and ask if the entertaining site he has discovered conveys the truth of the matter or is the online equivalent of a supermarket tabloid. It is a question that I think fewer and fewer are willing to ask.

© Michael Huggins, 2008. All rights reserved.