Friday, October 29, 2004

NASA Image Analyst Confirms Bush Wore Wire During Debate

Dr. Robert M. Nelson, a senior research scientist for NASA and Caltech's Jet Propulsion Laboratory, declares that he is "willing to stake my scientific reputation to the statement that Bush was wearing something under his jacket during the debate" that caused the notorious bulge. Obviously, Bush's cheating during the debates is a minor offense compared to, say, lying about the weapons of mass destruction in Iraq. Nonetheless, this incident deserves attention, as it is indicative of Bush's character, demonstrating that our pampered president, who was born with a silver spoon in his mouth, believes that he doesn't have to play by the rules by which most people are expected to abide. Had John Kerry or Bill Clinton worn a wire during one of their presidential debates, you can be damn sure the Republicans would have been calling for their heads.

Thursday, October 28, 2004

How Has Bush Betrayed Thee?: Let Us Count the 100 Ways

Judd Legum has compiled an extensive (though no doubt not entirely comprehensive) list of 100 facts about the Bush Administration that every American should read, particularly those who intend to vote on November 2. The piece is titled "100 Facts and 1 Opinion" and can be found in the November 8, 2004 edition of The Nation. The online version contains links to sources that corroborate the 100 facts, most of which did receive mainstream media coverage, but in an age of near-instantaneous amnesia and information overload were, perhaps, too soon forgotten or overlooked by many Americans.

Wednesday, October 27, 2004

Pro-Dope Students Duped into Registering Republican

Actually, my headline is slightly sensationalistic. Indiana University of Pennsylvania students thought that they were signing a petition in support of legalizing marijuana for medical use, but the documents were actually used by duplicitous Republicans (sadly, these days it seems as though the 'duplicitous' tag almost goes without saying for most GOP politicos) to register them as Republicans. As this article from The Indiana Gazette reports, the party-registration issue should not matter in the November 2 election; however, it could be an issue in the next primary election. It's going to be an ugly election day this year, I think, as the GOP appears to be going to great lengths to disenfranchise voters inclined to vote for John Kerry.

Tuesday, October 26, 2004

Eminem Moshes All Over Bush, But Will His Fans Follow?

I'm no Eminem fan, though I've enjoyed a few of his tracks, so I was pleased to hear about, and even more impressed by, the video for his new single "Mosh," in which he holds Bush accountable for the disastrous war in Iraq and encourages his fans to give Bush the bum's rush. How will "Mosh" play for all the suburban frat boys in Eminem's fan base who applaud Bush's shallow cowboy act but, like Bush, are chicken hawks who would never think of enlisting in the military?

Unfortunately, in what seems to be an act of cowardice on the part of the record company, "Mosh" isn't the first single to be released from Eminem's new album, which, I understand, isn't scheduled to be released until November 16, when the ballots will have long been cast and, presumably, the next President will have been elected. I'm hoping that MTV and other major media outlets pick the video up, though my guess is that they can't handle a real rap 'controversy.' The entertainment industry made it seem as though Eminem's dissing Britney Spears was shocking, but let's see if they have the cojones to generate some pre-election buzz for "Mosh."

If you want to see Eminem's "Mosh" video now, it's available at the Guerrilla News Network.

Thursday, October 21, 2004

Ronald Reagan, Neuromancer

William Gibson has been blogging lately, and this post is simply a reminder to myself to archive the following quote for the next time I teach one of his novels: "If I were to put together a truly essential thank-you list for the people who most made it possible for me to write my first six novels, I'd certainly owe as much to Ronald Reagan as to Bill Gates or Lou Reed. Reagan's presidency put the grit in my dystopia. His presidency was the fresh kitty litter I spread for utterly crucial traction on the icy driveway of uncharted futurity. His smile was the nightmare in my back pocket."

Most of my students were just infants during Reagan's second term, and what they 'know' of Reagan is largely the stuff of campaign ads, i.e., the morning in America myth, etc. When teaching the literature that came out of the 1980s, of course, a very different account of life in America emerges.

Sunday, October 17, 2004

Faith-Based Fanaticism

Ron Suskind's profile of George W. Bush and his faith-based presidency, "Without a Doubt," is truly frightening, because it makes clear that Bush and his inner circle are not only Machiavellian, believing that might makes right, but that they believe they have a divine mandate to define and construct reality. Bush's mix of arrogance and stupidity makes that conviction all the more dangerous.

It's ironic to think that, in the past week, I've been reading in publications, including The New York Times and Spiked, about "the pernicious influence of Derrida's philosophy." The latter claim is made by James Heartfield, who argues (wrongly) that Jacques Derrida's legacy is nihilistic, denying the possibility of true knowledge, and that Derrida contributed to the "unreason of the age."

Now, I agree that we are living in an age of unreason, but to identify Derrida, a left-leaning French philosopher, as being a "cunning articulator" of unreason and a facilitator in the undermining of rationality is ridiculous. The Left (Heartfield) and the Right (Lynne Cheney, etc.) love to attack the so-called 'postmodernists' and 'deconstructionists' for abandoning the truth, while, in actuality, it's the perverse culture of lying in the name of faith propagated by the political Right that, at present, is denying the possibility of the truth outright.

As Suskind's article demonstrates, Bush and his inner circle simply override any information that might conflict with their ideological agenda and make dialogue, let alone dissent, impossible, even from members of their own party. Read the following anecdote and shudder...

In the summer of 2002, after I had written an article in Esquire that the White House didn't like about Bush's former communications director, Karen Hughes, I had a meeting with a senior adviser to Bush. He expressed the White House's displeasure, and then he told me something that at the time I didn't fully comprehend -- but which I now believe gets to the very heart of the Bush presidency.

The aide said that guys like me were "in what we call the reality-based community," which he defined as people who "believe that solutions emerge from your judicious study of discernible reality." I nodded and murmured something about enlightenment principles and empiricism. He cut me off. "That's not the way the world really works anymore," he continued. "We're an empire now, and when we act, we create our own reality. And while you're studying that reality -- judiciously, as you will -- we'll act again, creating other new realities, which you can study too, and that's how things will sort out. We're history's actors . . . and you, all of you, will be left to just study what we do."

Who besides guys like me are part of the reality-based community? Many of the other elected officials in Washington, it would seem. A group of Democratic and Republican members of Congress were called in to discuss Iraq sometime before the October 2002 vote authorizing Bush to move forward. A Republican senator recently told Time Magazine that the president walked in and said: "Look, I want your vote. I'm not going to debate it with you." When one of the senators began to ask a question, Bush snapped, "Look, I'm not going to debate it with you."

Friday, October 15, 2004

Tips on Teaching Lolita

The following remarks were originally posted on Scott Rettberg's blog in response to comments he made about teaching Lolita this semester. Scott's reflections were prompted, in part, by his reading of Mark Edmundson's essay "All Entertainment, All the Time."

Scott,

I taught Lolita last spring and experienced the same difficulty you describe: getting the students to move beyond their impulse to condemn Humbert Humbert's reprehensible behavior and to reflect upon Nabokov's artistry.

Here’s what I found worked well to overcome this obstacle.

1. Read the annotated edition. We read The Annotated Lolita, edited by Alfred Appel Jr. Appel's introductory essay and his useful annotations cue students in to things such as Nabokov's intricate wordplay. This edition costs a bit more, but is worth every penny.

2. Beware the Morality Fallacy. Explain why reading literature for a moral lesson is lame. Every semester, I typically give a lecture in which I explain what I like to call the 'morality fallacy,' which is based on the premise that art and literature differ from a sermon and that it is a critical error to evaluate art or literature as though they were merely models for right, proper, or 'politically correct' behavior.

3. Explain your affective reaction. I asked students to reflect carefully upon their feelings toward Humbert. In which passages did they find him most reprehensible, and where did they find themselves feeling some pity for him? After pinpointing some of these passages, including, of course, the account of the first seduction, we discussed how the narrative strategies Nabokov deployed via his unreliable, pompous, but nonetheless rhetorically savvy narrator encouraged particular emotional or affective responses.

4. Read smart literary criticism. We read several essays from Vladimir Nabokov's Lolita: A Casebook, edited by Ellen Pifer, that do a great job of addressing issues such as (1) how the novel can be read as a romance in a parodic mode ("Parody and Authenticity in Lolita"), (2) why Humbert is only partially successful in his rhetorical manipulations ("The Art of Persuasion in Lolita"), and (3) how Humbert's attempts at self-exoneration lead him to denigrate America ("The Americanization of Humbert Humbert").

5. Screen both versions of the film. We watched both Stanley Kubrick's and Adrian Lyne's film versions of Lolita and discussed how the two movies approached the novel differently, particularly in the extent and manner in which each film leads us to identify with Humbert. I argued that Kubrick treated the book as a black comedy and emphasized the outrageous humor in the novel, whereas Lyne emphasized the more melodramatic aspects of the narrative. The result, as I saw it, was that Lyne's Humbert, played by Jeremy Irons, seemed more authentic and elicited more pathos from viewers, due in part to his awkwardness, whereas Kubrick's Humbert, played by James Mason, played up the cultivated, aloof, somewhat arrogant European. I highly recommend screening both versions, in part because doing so will demonstrate how time constraints and the need for a certain cinematographic consistency require filmmakers to adhere more strictly to one genre than novelists, who are freer to vary the 'tone' of their work.

6. Discuss Lolita as a popular culture phenomenon. We also read Michael Wood's essay "Revisiting Lolita" (also in Vladimir Nabokov's Lolita: A Casebook), which was written as a response to the 1997 media controversy surrounding Adrian Lyne's film. Wood addresses important issues, including how filmic constraints are both limiting and enabling when remediating a work of fiction into a film, how the term "Lolita" has entered our vocabulary, and why the colloquial use of the term signifies something vastly different from Humbert's and Nabokov's use of it.

7. Let the master have his say. Finally, before beginning the novel, I familiarized students with some of Nabokov’s views on aesthetics and literature.

• “I do not give a damn for public morals, in America or elsewhere.”

• “Now if we want to pin down poshlost in contemporary writing, we must look for it in Freudian symbolism, moth-eaten mythologies, social comment, humanistic messages, political allegories, overconcern with class or race, and the journalistic generalities we all know.”

• “Let me suggest that the very term ‘everyday reality’ is utterly static since it presupposes a situation that is permanently observable, essentially objective, and universally known.”

I look forward to hearing or reading more about your students’ responses to Lolita. Good luck!

Derridian Wisdom Explained

After publishing Jonathan Kandell's ugly (for both its xenophobia and its arrogant ignorance) obituary on the occasion of Jacques Derrida's death, it's somewhat heartening to see that the New York Times has published a fairer and more informed account of Derrida's project by Mark C. Taylor, who edited Deconstruction in Context: Literature and Philosophy, a useful anthology of 19th- and 20th-century continental philosophy from Kant to Derrida, and who actually has read and engaged with Derrida's texts.

Unfortunately, in a situation similar to the Times's willingness to propagandize on behalf of the pro-Iraq-war neoconservatives, it may be a case of too little, too late. Taylor's piece, "What Derrida Really Meant," was published on the op-ed pages, implying, perhaps, that Taylor's account is more subjective and less accurate than Kandell's ostensibly objective obituary.

I don't want to suggest that any obituary can be entirely objective; indeed, Derrida's thought teaches us to be attentive to the dangers of understanding any situation in terms of the crudely reductive objective/subjective binary. Nonetheless, for those who have read any Derrida, it is clear that Kandell went beyond the typical obituary format, a death notice accompanied by a biographical account of the person's life, to launch a vitriolic attack on a straw man dubbed 'deconstruction.'

But enough on Kandell and his hack work. I want to congratulate Taylor for managing to convey a few important Derridean insights in a short, fourteen-paragraph essay. For those of us who occasionally teach Derrida to undergraduates, Taylor's essay is a particularly welcome gift, for it not only explains why Derrida is such an important philosopher, but also explains how not to approach Derrida's texts.

Taylor's second paragraph opens with this fantastic observation: "To people addicted to sound bites and overnight polls, Mr. Derrida's works seem hopelessly obscure." I say fantastic, because Taylor doesn't hesitate to imply that it is intellectual laziness, not philosophical complexity, that should be condemned. One of the particularly disturbing and disgusting trends to be discerned in the mass mediasphere is a general willingness to condone intellectual laziness, either through outright dismissals of attempts by people, such as John Kerry or Ralph Nader, to articulate complex positions, or through the failure to provide an adequate forum for intelligent debate.

Anticipating a typical anti-intellectual response to Taylor's claim, I'd just like to add that Taylor is not being elitist or endorsing complexity for complexity's sake. Rather, he is simply suggesting that the world is complex, and any truly thoughtful engagement with it should reflect that fact. Thus, Taylor goes on to note, correctly, that "density and complexity [are] characteristic of all great works of philosophy, literature, and art."

I particularly appreciated the following sentence, which pinpoints Derrida's especial relevance at a time when "cultural conservatism and religious fundamentalism" are on the rise around the globe: "Fortunately, he [Derrida] also taught us that the alternative to blind belief is not simply unbelief but a different kind of belief - one that embraces uncertainty and enables us to respect others whom we do not understand. In a complex world, wisdom is knowing what we don't know so that we can keep the future open."

'Nuff said, for now anyway...

Thursday, October 14, 2004

Remembering Jacques Derrida

The School of Humanities at the University of California, Irvine is hosting a website in honor of Jacques Derrida. At this website, you can register your name on an In Memoriam page, addressed to the New York Times, that has been established in order to testify to the lasting influence of Derrida's writing, teaching, and life.

As I mentioned in a previous post, in many of the obituaries for Derrida, including that written by Jonathan Kandell in the New York Times, Derrida's legacy has been deliberately distorted and maligned. The most typical distortion is some version of the claim that Derrida wanted to destroy the Western canon. In fact, Derrida's writings were intense and thorough engagements with some of the most important texts in the Western canon, ranging from Plato's Phaedrus to Joyce's Ulysses to the Declaration of Independence.

By signing the letter, you can show your support for Derrida and send a message to the editors of the New York Times and the world that their dismissal of Derrida as an "abstruse theorist" is ignorant.

Sunday, October 10, 2004

Giant Ape 'Discovered' in the DR Congo

If this giant ape is actually a new species or a gorilla-chimpanzee hybrid, it's amazing news. Protect these primates, please!

Saturday, October 09, 2004

Jacques Derrida Dead of Pancreatic Cancer

Sad news, indeed, though not that surprising. Ned Lukacher, a friend of Derrida's and the English translator of Cinders, told me last year that Derrida had announced to participants at a UC-Irvine conference that he had cancer. Ned said that he was rather shocked at how frail Derrida looked at that time and noted that the philosopher, known for his vigor, was clearly tired and didn't participate as actively in the proceedings.

What's particularly frustrating is that this obituary is full of erroneous statements and distortions of Derrida's thought and project. Derrida's readings of philosophical texts don't aim to reveal hidden meanings; they proceed to interrogate rigorously the basic trope of revelation as it functions to perpetuate the myth of communicative immediacy. Moreover, why not quote Derrida directly, rather than telling us that "Fellow academics have charged that Derrida's writings 'deny distinction between reality and fiction'"? Which academics have said this, and why not name them and hold them accountable for their misinterpretation? It will be interesting to see which, if any, of the major journalistic publications publish an accurate description of Derrida's philosophy, particularly his account of language's capacity to generate signifying effects beyond the control of any single language user.

Thursday, October 07, 2004

Was Bush Wired for the Debate?

Who knew that the rules governing the presidential debates allowed for electronic earpieces and offstage promptings? Sadly, this latest rumor is probably true and, even more disheartening, it's not even surprising. Everybody knows (as Leonard Cohen sings) that we've got an animatronic president with a history of lying and cheating.

Friday, October 01, 2004

Here We Are Now, Entertain Us

I'm taking the liberty of posting the following essay, "All Entertainment, All the Time" by Mark Edmundson, which pinpoints many of the challenges facing professors in the contemporary university, particularly professors who teach in the humanities.

The following essay is excerpted from Why Read? by Mark Edmundson, which was published in September by Bloomsbury.

I can date my sense that something was going badly wrong in my own teaching to a particular event. It took place on evaluation day in a class I was giving on the works of Sigmund Freud. The class met twice a week, late in the afternoon, and the students, about fifty undergraduates, tended to drag in and slump into their chairs looking slightly disconsolate, waiting for a jump start. To get the discussion moving, I often provided a joke, an anecdote, an amusing query. When you were a child, I had asked a few weeks before, were your Halloween costumes id costumes, superego costumes, or ego costumes? Were you monsters—creatures from the black lagoon, vampires and werewolves? Were you Wonder Women and Supermen? Or were you something in between? It often took this sort of thing to raise them from the habitual torpor.

But today, evaluation day, they were full of life. As I passed out the assessment forms, a buzz rose up in the room. Today they were writing their course evaluations; their evaluations of Freud, their evaluations of me. They were pitched into high gear. As I hurried from the room, I looked over my shoulder to see them scribbling away like the devil’s auditors. They were writing furiously, even the ones who struggled to squeeze out their papers and journal entries word by word.

But why was I distressed, bolting out the door of my classroom, where I usually held easy sway? Chances were that the evaluations would be much like what they had been in the past: They’d be just fine. And in fact, they were. I was commended for being “interesting,” and complimented for my relaxed and tolerant ways; my sense of humor and capacity to connect the material we were studying with contemporary culture came in for praise.

In many ways, I was grateful for the evaluations, as I always had been, just as I’m grateful for the chance to teach in an excellent university surrounded everywhere with very bright people. But as I ran from that classroom, full of anxious intimations, and then later as I sat to read the reports, I began to feel that there was something wrong. There was an undercurrent to the whole process I didn’t like. I was disturbed by the evaluation forms themselves with their number ratings (“What is your ranking of the instructor?—1, 2, 3, 4 or 5”) which called to mind the sheets they circulate after a TV pilot plays to the test audience in Burbank. Nor did I like the image of myself that emerged—a figure of learned but humorous detachment, laid-back, easygoing, cool. But most of all, I was disturbed by the attitude of calm consumer expertise that pervaded the responses. I was put off by the serenely implicit belief that the function of Freud—or, as I’d seen it expressed on other forms, in other classes, the function of Shakespeare, of Wordsworth or of Blake—was diversion and entertainment. “Edmundson has done a fantastic job,” said one reviewer, “of presenting this difficult, important and controversial material in an enjoyable and approachable way.”

Enjoyable: I enjoyed the teacher. I enjoyed the reading. Enjoyed the course. It was pleasurable, diverting, part of the culture of readily accessible, manufactured bliss: the culture of Total Entertainment All the Time.

As I read the reviews, I thought of a story I’d heard about a Columbia University instructor who issued a two-part question at the end of his literature course. Part one: What book in the course did you most dislike; part two: What flaws of intellect or character does that dislike point up in you? The hand that framed those questions may have been slightly heavy. But at least it compelled the students to see intellectual work as confrontation between two people, reader and author, where the stakes mattered. The Columbia students were asked to relate the quality of an encounter, not rate the action as though it had unfolded across the big screen. A form of media connoisseurship was what my students took as their natural right.

But why exactly were they describing the Oedipus complex and the death drive as interesting and enjoyable to contemplate? Why were they staring into the abyss, as Lionel Trilling once described his own students as having done, and commending it for being a singularly dark and fascinatingly contoured abyss, one sure to survive as an object of edifying contemplation for years to come? Why is the great confrontation—the rugged battle of fate where strength is born, to recall Emerson—so conspicuously missing? Why hadn’t anyone been changed by my course?

To that question, I began to compound an answer. We Americans live in a consumer culture, and it does not stop short at the university’s walls. University culture, like American culture at large, is ever more devoted to consumption and entertainment, to the using and using up of goods and images. We Americans are six percent of the world’s population: We use a quarter of its oil; we gorge while others go hungry; we consume everything with a vengeance and then we produce movies and TV shows and ads to celebrate the whole consumer loop. We make it—or we appropriate it—we “enjoy” it and we burn it up, pretty much whatever “it” is. For someone coming of age in America now, I thought, there are few available alternatives to the consumer worldview. Students didn’t ask for it, much less create it, but they brought a consumer Weltanschauung to school, where it exerted a potent influence.

The students who enter my classes on day one are generally devotees of spectatorship and of consumer-cool. Whether they’re sorority-fraternity denizens, piercer-tattooers, gay or straight, black or white, they are, nearly across the board, very, very self-contained. On good days, there’s a light, appealing glow; on bad days, shuffling disgruntlement. But there is little fire, little force of spirit or mind in evidence.

More and more, we Americans like to watch (and not to do). In fact watching is our ultimate addiction. My students were the progeny of two hundred available cable channels and omnipresent Blockbuster outlets. They grew up with their noses pressed against the window of that second spectral world that spins parallel to our own, the World Wide Web. There they met life at second or third hand, peering eagerly, taking in the passing show, but staying remote, apparently untouched by it. So conditioned, they found it almost natural to come at the rest of life with a sense of aristocratic expectation: “What have you to show me that I haven’t yet seen?”

But with this remove comes timidity, a fear of being directly confronted. There’s an anxiety at having to face life firsthand. (The way the word “like” punctuates students’ speech—“I was like really late for like class”—indicates a discomfort with immediate experience and a wish to maintain distance, to live in a simulation.) These students were, I thought, inclined to be both lordly and afraid.

The classroom atmosphere they most treasured was relaxed, laid-back, cool. The teacher should never get exercised about anything, on pain of being written off as a buffoon. Nor should she create an atmosphere of vital contention, where students lost their composure, spoke out, became passionate, expressed their deeper thoughts and fears, or did anything that might cause embarrassment. Embarrassment was the worst thing that could befall one; it must be avoided at whatever cost.

Early on, I had been a reader of Marshall McLuhan, and I was reminded of his hypothesis that the media on which we as a culture have become dependent are themselves cool. TV, which seemed on the point of demise, so absurd had it become to the culture of the late sixties, rules again. To disdain TV now is bad form; it signifies that you take yourself far too seriously. TV is a tranquilizing medium, a soporific, inducing in its devotees a light narcosis. It reduces anxiety, steadies and quiets the nerves. But also deadens. Like every narcotic, it will, consumed in certain doses, produce something like a hangover, the habitual watchers’ irritable languor that persists after the TV is off. It’s been said that the illusion of knowing and control that heroin engenders isn’t entirely unlike the TV consumer’s habitual smug-torpor, and that seems about right.

Those who appeal most on TV over the long haul are low-key and nonassertive. Enthusiasm quickly looks absurd. The form of character that’s most ingratiating on the tube, that’s most in tune with the medium itself, is laid-back, tranquil, self-contained, and self-assured. The news anchor, the talk-show host, the announcer, the late-night favorite—all are prone to display a sure sense of human nature, avoidance of illusion, reliance on timing and strategy rather than on aggressiveness or inspiration. With such figures, the viewer is invited to identify. On what’s called reality TV, on game shows, quiz shows, inane contests, we see people behaving absurdly, outraging the cool medium with their firework personalities. Against such excess the audience defines itself as worldly, laid-back, and wise.

Is there also a financial side to the culture of cool? I believed that I saw as much. A cool youth culture is a marketing bonanza for producers of the right products, who do all they can to enlarge that culture and keep it humming. The Internet, TV, and magazines teem with what I came to think of as persona ads, ads for Nikes and Reeboks, and Jeeps and Blazers that don’t so much endorse the powers of the product per se as show you what sort of person you’ll inevitably become once you’ve acquired it. The Jeep ad that featured hip outdoorsy kids flinging a Frisbee from mountaintop to mountaintop wasn’t so much about what Jeeps can do as it was about the kind of people who own them: vast, beautiful creatures, with godlike prowess and childlike tastes. Buy a Jeep and be one with them. The ad by itself is of little consequence, but expand its message exponentially and you have the central thrust of postmillennial consumer culture: buy in order to be. Watch (coolly) so as to learn how to be worthy of being watched (while being cool).

To the young, I thought, immersion in consumer culture, immersion in cool, is simply felt as natural. They have never known a world other than the one that accosts them from every side with images of mass-marketed perfection. Ads are everywhere: on TV, on the Internet, on billboards, in magazines, sometimes plastered on the side of the school bus. The forces that could challenge the consumer style are banished to the peripheries of culture. Rare is the student who arrives at college knowing something about the legacy of Marx or Marcuse, Gandhi or Thoreau. And by the time she does encounter them, they’re presented as diverting, interesting, entertaining—or perhaps as objects for rigorously dismissive analysis—surely not as goads to another kind of life.

As I saw it, the specter of the uncool was creating a subtle tyranny for my students. It’s apparently an easy standard to subscribe to, the standard of cool, but once committed to it, you discover that matters are different. You’re inhibited, except on ordained occasions, from showing feeling, stifled from trying to achieve anything original. Apparent expressions of exuberance now seem to occur with dimming quotation marks around them. Kids celebrating at a football game ironically play the roles of kids celebrating at a football game, as it’s been scripted on multiple TV shows and ads. There’s always self-observation, no real letting-go. Students apparently feel that even the slightest departure from the reigning code can get you genially ostracized. This is a culture tensely committed to a laid-back norm.

In the current university environment, I saw, there was only one form of knowledge that was generally acceptable. And that was knowledge that allowed you to keep your cool. It was fine to major in economics or political science or sociology, for there you could acquire ways of knowing that didn’t compel you to reveal and risk yourself. There you could stay detached. And—what was at least as important—you could acquire skills that would stand you in good financial stead later in life. You could use your educations to make yourself rich. All of the disciplines that did not traduce the canons of cool were thriving. It sometimes seemed that every one of my first-year advisees wanted to major in economics, even when they had no independent interest in the subject. They’d never read an economics book, had no attraction to the business pages of the Times. They wanted economics because word had it that econ was the major that made you look best to Wall Street and the investment banks. “We like economics majors,” an investment banking recruiter reportedly said, “because they’re people who’re willing to sacrifice their educations to the interest of their careers.”

The subjects that might threaten consumer cool, literary study in particular, had to adapt. They could offer diversion—it seems that’s what I (and Freud) had been doing—or they could make themselves over to look more like the so-called hard, empirically based disciplines.

Here computers come in. Now that computers are everywhere, each area of inquiry in the humanities is more and more defined by the computer’s resources. Computers are splendid research tools. Good. The curriculum turns in the direction of research. Professors don’t ask students to try to write as Dickens would were he alive today. Rather, they research Dickens. They delve into his historical context; they learn what the newspapers were gossiping about on the day that the first installment of Bleak House hit the stands. We shape our tools, McLuhan said, and thereafter our tools shape us.

Many educated people in America seem persuaded that the computer is the most significant invention in human history. Those who do not master its intricacies are destined for a life of shame, poverty, and neglect. More humanities courses are becoming computer-oriented, which keeps them safely in the realm of cool, financially negotiable endeavors. A professor teaching Blake’s “The Chimney Sweeper,” which depicts the exploitation of young boys whose lot is not altogether unlike the lot of many children living now in American inner cities, is likely to charge his students with using the computer to compile as much information about the poem as possible. They can find articles about chimney sweepers from 1790s newspapers; contemporary pictures and engravings that depict these unfortunate little creatures; critical articles that interpret the poem in a seemingly endless variety of ways; biographical information on Blake, with hints about events in his own boyhood that would have made chimney sweepers a special interest; portraits of the author at various stages of his life; maps of Blake’s London. Together the class might create a Blake—Chimney Sweeper Web site: www.blakesweeper.edu.

Instead of spending class time wondering what the poem means, and what application it has to present-day experience, students compile information about it. They set the poem in its historical and critical context, showing first how the poem is the product and the property of the past—and, implicitly, how it really has nothing to do with the present except as an artful curiosity; and second how, given the number of ideas about it already available, adding more thought would be superfluous.

By putting a world of facts at the end of a keystroke, computers have made facts, their command, their manipulation, their ordering, central to what now can qualify as humanistic education. The result is to suspend reflection about the differences among wisdom, knowledge, and information. Everything that can be accessed online can seem equal to everything else, no datum more important or more profound than any other. Thus the possibility presents itself that there really is no more wisdom; there is no more knowledge; there is only information. No thought is a challenge or an affront to what one currently believes.

Am I wrong to think that the kind of education on offer in the humanities now is in some measure an education for empire? The people who administer an empire need certain very precise capacities. They need to be adept technocrats. They need the kind of training that will allow them to take up an abstract and unfelt relation to the world and its peoples—a cool relation, as it were. Otherwise, they won’t be able to squeeze forth the world’s wealth without suffering debilitating pains of conscience. And the denizen of the empire needs to be able to consume the kinds of pleasures that will augment his feeling of rightful rulership. Those pleasures must be self-inflating and not challenging; they need to confirm the current empowered state of the self and not challenge it. The easy pleasures of this nascent American empire, akin to the pleasures to be had in first-century Rome, reaffirm the right to mastery—and, correspondingly, the existence of a world teeming with potential vassals and exploitable wealth.

Immersed in preprofessionalism, swimming in entertainment, my students have been sealed off from the chance to call everything they’ve valued into question, to look at new forms of life, and to risk everything. For them, education is knowing and lordly spectatorship, never the Socratic dialogue about how one ought to live one’s life.

These thoughts of mine didn’t come with any anger at my students. For who was I to blame them? They didn’t create the consumer biosphere whose air was now their purest oxygen. They weren’t the ones who should have pulled the plug on the TV or disabled the game port when they were kids. They hadn’t invited the ad flaks and money changers into their public schools. What I felt was an ongoing sense of sorrow about their foreclosed possibilities. They seemed to lack chances that I, born far poorer than most of them, but into a different world, had abundantly enjoyed.

As I read those evaluation forms and thought them over, I recalled a story. In Vienna, there was once a superb teacher of music, very old. He accepted a few students. There came to him once a young violinist whom all of Berlin was celebrating. Only fourteen, yet he played exquisitely. The young man arrived in Austria hoping to study with the master. At the audition, he played to perfection; everyone surrounding the old teacher attested to the fact. When it came time to make his decision, the old man didn’t hesitate. “I don’t want him,” he said. “But, master, why not?” asked a protégé. “He’s the most gifted young violinist we’ve ever heard.” “Maybe,” said the old man. “But he lacks something, and without this thing real development is not possible. What that young man lacks is inexperience.” It’s a precious possession, inexperience; my students have had it stolen from them.

But what about the universities themselves? Do they do all they can to fight the reign of consumer cool?

From the start, the university’s approach to students now has a solicitous, maybe even a servile tone. As soon as they enter their junior year in high school, and especially if they live in a prosperous zip code, the information materials, which is to say the advertising, come rolling in. Pictures, testimonials, videocassettes, and CD-ROMs (some hidden, some not) arrive at the door from colleges across the country, all trying to capture the students and their tuition dollars.

The freshman-to-be sees photographs of well-appointed dorm rooms; of elaborate phys-ed facilities; of expertly maintained sports fields; of orchestras and drama troupes; of students working joyously, off by themselves. It’s a retirement spread for the young. “Colleges don’t have admissions offices anymore, they have marketing departments,” a school financial officer said to me once. Is it surprising that someone who has been approached with photos and tapes, bells and whistles, might come to college thinking that the Shakespeare and Freud courses were also going to be agreeable treats?

How did we reach this point? In part, the answer is a matter of demographics and also of money. Aided by the GI Bill, the college-going population increased dramatically after the Second World War. Then came the baby boomers, and to accommodate them colleges continued to grow. Universities expand readily enough, but with tenure locking in faculty for lifetime jobs, and with the general reluctance of administrators to eliminate their own slots, it’s not easy for a university to contract. So after the baby boomers had passed through—like a tasty lump sliding the length of a boa constrictor—the colleges turned to promotional strategies—to advertising—to fill the empty chairs. Suddenly college, except for the few highly selective establishments, became a buyers’ market. What students and their parents wanted had to be taken potently into account. That often meant creating more comfortable, less challenging environments, places where almost no one failed, everything was enjoyable, and everyone was nice.

Just as universities must compete with one another for students, so must individual departments. At a time of rank economic anxiety (and what time is not in America?), the English department and the history department have to contend for students against the more success-ensuring branches, such as the science departments, and the commerce school. In 1968, more than 21 percent of all the bachelor’s degrees conferred in America were humanities degrees; by 1993 that total had fallen to about 13 percent, and it continues to sink. The humanities now must struggle to attract students, many of whose parents devoutly wish that they would go elsewhere.

One of the ways we’ve tried to be attractive is by loosening up. We grade much more genially than our colleagues in the sciences. In English and history, we don’t give many D’s, or C’s either. (The rigors of Chem 101 may create almost as many humanities majors per year as the splendors of Shakespeare.) A professor at Stanford explained that grades were getting better because the students were getting smarter every year. Anything, I suppose, is possible.

Along with easing up on grades, many humanities departments have relaxed major requirements. There are some good reasons for introducing more choice into the curricula and requiring fewer standard courses. But the move jibes with a tendency to serve the students instead of challenging them. Students can float in and out of classes during the first two weeks of the term without making any commitment. The common name for this span—shopping period—attests to the mentality that’s in play.

One result of the university’s widening elective leeway is to give students more power over teachers. Those who don’t like you can simply avoid you. If the students dislike you en masse, you can be left with an empty classroom. I’ve seen other professors, especially older ones, often those with the most to teach, suffer real grief at not having enough students sign up for their courses: Their grading was too tough; they demanded too much; their beliefs were too far out of line with the existing dispensation. It takes only a few such incidents to draw other professors into line.

Before students arrive, universities ply them with luscious ads, guaranteeing them a cross between summer camp and lotusland. When they get to campus, flattery, entertainment, and preprofessional training are theirs, if that’s what they want. The world we present them is not a world elsewhere, an ivory tower world, but one that’s fully continuous with the American entertainment and consumer culture they’ve been living in. They hardly know they’ve left home. Is it a surprise, then, that this generation of students—steeped in consumer culture before they go off to school; treated as potent customers by the university well before they arrive, then pandered to from day one—are inclined to see the books they read as a string of entertainments to be enjoyed without effort or languidly cast aside?

So I had my answer. The university had merged almost seamlessly with the consumer culture that exists beyond its gates. Universities were running like businesses, and very effective businesses at that. Now I knew why my students were greeting great works of mind and heart as consumer goods. They came looking for what they’d had in the past, Total Entertainment All the Time, and the university at large did all it could to maintain the flow. (Though where this allegiance to the Entertainment-Consumer Complex itself came from—that is a much larger question. It would take us into politics and economics, becoming, in time, a treatise in itself.)

But what about me? Now I had to look at my own place in the culture of training and entertainment. Those course evaluations made it clear enough. I was providing diversion. To some students I was offering an intellectualized midday variant of Letterman and Leno. They got good times from my classes, and maybe a few negotiable skills, because that’s what I was offering. But what was I going to do about it? I had diagnosed the problem, all right, but as yet I had nothing approaching a plan for action.

I’d like to say that I arrived at something like a breakthrough simply by delving into my own past. In my life I’ve had a string of marvelous teachers, and thinking back on them was surely a help. But some minds—mine, at times, I confess—tend to function best in opposition. So it was looking not just to the great and good whom I’ve known, but to something like an arch-antagonist, that got me thinking in fresh ways about how to teach and why.


From the book Why Read? by Mark Edmundson. Published by Bloomsbury USA. Copyright (c) 2004 by Mark Edmundson. Reprinted courtesy of Bloomsbury Publishing. Available wherever books are sold.

Mark Edmundson is a professor of English at the University of Virginia. A prizewinning scholar, he has published a number of works of literary and cultural criticism, including Literature Against Philosophy, Plato to Derrida, and Teacher: The One Who Made the Difference. He has also written for such publications as the New Republic, the New York Times Magazine, the Nation, and Harper's, where he is a contributing editor.