Saturday, January 28, 2006

Common Dreams

http://www.commondreams.org/views06/0121-22.htm

Should the President be King? Reflections from the Deep Origins of America

by Robert Freeman

When he wrote the Constitution in 1787, James Madison had a specific goal in
mind: to create a system of government that would constrain the tyrannous
behavior of an unaccountable executive. Only in this way, Madison knew,
would the "blessings of liberty" be able to flourish and grow in the new
United States.

The essential features of the government he envisioned to carry out this aim
included representation of the people, separation of powers, checks and
balances, the rule of law, and protection of the citizenry from unwarranted
intrusion by the government.

But many of those ideals are at risk today in President Bush's breathtaking
assertion that he is accountable to no one in his determination to spy on
American citizens. Indeed, according to the theory of the "unitary
executive" espoused by Samuel Alito, there are literally no limits to
presidential actions so long as they are couched as part of the "war on
terror."

Yet claims of unchallengeable authority rooted in the Constitution are
belied by a straightforward understanding of what Madison intended to
create. The founders had just fought the Revolutionary War to free
themselves from the tyranny of an unbridled King - one who would not even
deign to obey his own laws. And before that, in the 1600s, the English
people had fought a Civil War to prevent their own subjugation to a series
of despotic monarchs.

It is preposterous, therefore, to imagine that Madison would then turn
around and design a government where the Executive, the president, had
uncontrollable powers in any circumstance. Only a fantastically licentious,
indeed, deceitful reading of the history of the time can produce such an
interpretation.

Democracy first died when Augustus Caesar overthrew the Roman Republic in 27
B.C. It was not reborn until the 1600s when the English people confronted a
new king, James I, who claimed to rule under the doctrine of "the divine
right of Kings."

James was an arrogant man. He had members of Parliament arrested for
questioning his conduct of the war with Spain. Parliament responded that
such arrests challenged its very existence and that that existence was not
subject to the king's whim. This was the first statement of the inherent
right to legislative representation independent of the authority of the
king.

James' son, Charles I, was even more imperious than his father. He extracted
"forced loans" from wealthy members of Parliament and threw 76 of them in
jail when they refused to pay. Parliament responded in 1628 with the
Petition of Right, a landmark in western constitutional law. It stated that
the king could not force money from people without the approval of
Parliament. This idea would become the rallying cry of the American
Revolution: no taxation without representation.

Unbowed, Charles began imprisoning adversaries on trumped-up charges
including treason and even murder. Parliament replied that before such
charges would be taken up, the king would have to "show us the body," habeas
corpus. This became the foundation of the due process of law, one of the
most important protections of the citizenry against a vengeful or renegade
executive.

In 1637, Charles started a war with Scotland. But he did so without
consulting Parliament, something that had not happened since 1323. The war
ended before Parliament could rebuke Charles but when the Irish rebelled
four years later, Parliament refused to grant Charles an army. It was a
harbinger of the U.S. Congress' power both to declare war and to allocate
monies for public purposes.

Charles' conflict with Parliament escalated into Civil War. Charles lost the
War and in 1649 Parliament cut off his head. It was the first time in the
western world that a sitting monarch had been executed by a rebellion of his
own people. It signaled an astonishing reversal in the historical
relationship between executive and legislature, and between citizen and
sovereign.

In 1685, Charles' son, James II, became King. His conflicts with Parliament
harked back to those of his father. But by 1689, Parliament had wearied of
James' contemptuous treatment and threatened him with the same fate as his
father: beheading. James abdicated and quietly quit England for France.

In his stead, Parliament invited James' son-in-law, William of Orange, then
stadtholder of the Dutch Republic, to become king of England. William and
his wife, Mary, accepted the position, but only after they had acceded to
the creation of the English Bill of Rights.

This "Glorious Revolution" of 1689 was a peaceful, bloodless coup d'etat.
For the first time in history the rights of an entire people were enshrined
into a Constitution and a Bill of Rights-a framework of laws that define how
a king may govern and how a government must relate to its citizens.
Over the course of this century, then, England made the
first-time-in-the-history-of-the-world transition from an absolute monarchy
based on the claim of the divine right of kings to a constitutional monarchy
based on the twin ideals of the rule of law and the consent of the governed.
It was a breathtakingly noble ascent to political maturity, the willingness
of a people to govern themselves by laws rather than submit as cattle to the
autocratic dictates of a single man.

It should come as no surprise that the two leading theorists of modern
government emerged from this epochal conflict. Thomas Hobbes was appalled at
the disorder of the country and wrote Leviathan, claiming that the highest
duty of the King was to protect the security of the citizens. Hobbes, a firm
believer in absolute sovereign authority, is the philosopher on whom George
Bush relies to legitimize his peremptory actions.

John Locke, on the other hand, theorized that a government was made
legitimate, not by divine right, but rather by the "consent of the
governed". Locke believed that people had "natural rights" that could not be
taken away and that among these were "life, liberty, and property."
Pointedly, it was Locke and not Hobbes whom Thomas Jefferson was channeling
(however imperfectly) when he wrote the Declaration of Independence in 1776.

The colonists, of course, were Englishmen. The American Revolutionary War
occurred because a new King, George III, refused to honor these ideals,
denying the protections of English law to his own citizens, the colonists.
As Thomas Jefferson wrote in the Declaration of Independence, Americans had
"suffered a long train of abuses and usurpations... designed to reduce them
under absolute Despotism."

In the Declaration, Jefferson listed 27 specific offenses including, among
others, that the King had:

- Dissolved Representative Houses repeatedly
- Obstructed the Administration of Justice
- Quartered large bodies of armed troops among us
- Imposed taxes without our consent
- Deprived us, in many cases, of the benefits of Trial by Jury

So grave were these violations, and so intransigent was the King in
remedying them, that the colonists had no recourse but to go to war.

There is no room for interpretation here: the Revolutionary War was fought
and the Constitution was written to free the colonists from the abuse of
"absolute Despotism." The manner of securing such freedom was the system of
separation of powers embodied in the Executive, Legislative, and Judicial
branches of government and the checks and balances attendant on each of
their roles.

Given this history, it is startling, even brazen, that some try to claim a
"unitary executive" that cannot be challenged by Congress, at least in times
of war. Challenging the executive in time of war is precisely the way that
America was born. Madison himself could not have been more lucid on this
point.

In 1795, he wrote, "Of all the enemies of true liberty, war is, perhaps, the
most to be dreaded, because it comprises and develops the germ of every
other. In war, the discretionary power of the Executive is extended; its
influence is multiplied; and all the means of seducing the minds are added
to those of subduing the force of the people." A more prescient description
of the allure of war - at least for the executive - could hardly be written.

The supreme irony - if not hypocrisy - of the theory of the "unitary
executive" is that it is espoused by the very same people who purport that
the Judiciary should be bound by an equally phantasmical theory of "original
intent." Under this theory the Supreme Court should interpret the
Constitution according to the intent of its authors, an intent only these
latter-day "originalists" claim to be able to accurately divine.

But the Executive, on the other hand, should be freed entirely from such
original intent, liberated to pursue a starkly post-modern vision of a
virulently anti-democratic authoritarianism that would have been wholly
repugnant to the very same founders. Either Madison and the founders were
schizophrenic or the current "theorists" are duplicitous. They can't have it
both ways.

The most dangerous of George Bush's formulations surrounding the issue of
unwarranted wiretapping is that his own usurpations must continue so long as
the country is at "war." Bush's "war on terror" is effectively endless
because it is inherently self-catalyzing, spawning more terror than it is
capable of eradicating.

Before Bush's invasion, Iraq was not a source of terrorism. Today, it is the
world's pre-eminent trainer and exporter of terror. Major incidents of
international terrorism have tripled since the invasion in 2003. Perhaps it
is this auto-inflammatory dynamic that Dick Cheney referred to when he
claimed we were facing a war "that will not end in our lifetimes."

Tellingly, Madison wrote, "No nation can preserve its freedom in the midst
of continual warfare."

The confluence of these two startling facts, the claim for unlimited power
based on war, and the endless nature of the war itself, poses grave threats
to the American Constitutional order. And the threat is made all the more
dire in the realization that the war had been planned since the first days
of the Bush administration and that it was sold to the American people
through a vigorous, sustained campaign of Executive deceit.

Shorn of all distractions, the "unitary executive" and Bush's claim to
legitimacy in spying amount to this: that one man can lie the country into
war and then, on the basis of that war, declare himself above the law -
essentially suspending the Constitution. It is a legal prescription for the
self-destruction of democratic government.

But the American form of government is a legacy that belongs to all
Americans, indeed, to all humans. It is the product of four centuries of
human aspirations for protection from an abusive executive. It is not Bush's
to take away. Which is not to say that it cannot be lost. Hannah Arendt once
wrote that, "Although tyranny may successfully rule over foreign peoples, it
can stay in power only if it first destroys the national institutions of its
own people."

Bush has openly declared and imperiously acted out his preference for a
dictatorship - provided, of course, that he is the dictator. But dictatorship
stands against every value, every virtue that lies at the heart of America.

It will require a fierce determination on the part of the people to keep
what is their most long-lived, hard-won, and (hopefully) deeply treasured
gift. But it is a fight that can and must be waged. The alternative is a
return to a medieval darkness of divine right, autocracy and oppression.
--------------------------------
Robert Freeman writes on history, economics, and education. He can be
reached at robertfreeman10@yahoo.com.

Hindu Textbooks

Amitava Kumar on English Textbooks
The Hindu, Literary Review, October 2, 2005

Textbook of Laughter and Forgetting

We have repeatedly witnessed in recent years, almost like the seasonal outbreak of a distressing form of cholera, controversies over the contents of history textbooks.

But why is there no discussion about what school-children are asked to read in their English textbooks?

I have very little memory now of what I had read in the books used in my history classes, although I do remember the attention with which I would copy out on clean sheets of paper the line-drawings that represented the portraits of emperors. Akbar's moustache drooped. Humayun was thin and wizened, already preparing, it seemed, for a premature death. The rounded lines in the portrait of Shah Jahan contained all the sorrow of love's futile striving. Nearly everything else in those books escapes me at the moment.

This might be entirely because I was a mediocre student and, like the uninspired everywhere, I found my classes stultifying. But the fact remains that I still have vivid and exact memories of what I read in my English textbooks. It was there that I read George Orwell's account of shooting an elephant in Burma, Dom Moraes on a trip to the Thar, Khushwant Singh's depiction of life in the village of Mano Majra, Somerset Maugham describing the solitude on his seventieth birthday.

When I was sixteen, I left my hometown Patna to go to school in Delhi. The school where I got admission, Modern School on Barakhamba Road, was a prestigious enclave where the children of the rich and the powerful came each day as if they were visiting a familiar club. Our teachers, for the most part drawn from the Punjabi middle-class, could only use a puritanical and unimaginative pedagogy to prop themselves up against the display of wealth. They knew in their hearts that they were superfluous and stuck to the dull routine of making us read and repeat the words in the textbooks prescribed by the school board.

Nevertheless, the English textbooks that I read and reread during those two years gave me a sense of language and an idea of how to express my own sense of the world that I inhabited. This is what literature can do, even without your knowing it. Shouldn't there be wider debate, then, on what our students read in their books?

I recently received a letter from an editor at Macmillan-India. He had written to say that he was preparing a textbook for the Intermediate level students in Bihar and he wanted permission to use an essay of mine in which I had written about a visit to the Khudabaksh Library in Patna. *

The letter brought back the mixed memories from my youth. In my reply to the editor, I readily granted permission. I didn't ask for any payment. It seemed to me that even one poor student reading me in Bihar would be worth a thousand readers in South Delhi or abroad.

When I remembered my own alienating classroom experiences, it gave me pleasure to think that now a reader in Bihar would be able to rediscover his or her own world in my writing. The names of places as well as the people, the sentiments shared by the writer, even the dust on the streets—all of this would be familiar to the student in towns like Ara or Motihari. How many times before had Bihari students found their lives reflected in the English textbooks prescribed for their courses?

Then, just last week, the postman brought a registered package from India. It was the textbook with my essay in it. I read the book quickly. The search for relevance by the education council had meant not only the inclusion of Bihari writers like Tabish Khair among the contributors but also pieces that provided urgent social critique. A good example was a poem "Voice of the Unwanted Girl" by Sujata Bhatt, written in the voice of a destroyed foetus, presenting a protest against female infanticide. Textbooks elsewhere in India should include writings like this that touch the heart and challenge the mind.

Our students need to be freed from the claustrophobia of the classroom. The prose and poetry that we offer them should appear to them fresh and enlivening. The Macmillan-India book began with a brilliant, hopeful piece by Jawaharlal Nehru, its elegant rhetoric paying homage to the arrival of Gandhi. I felt my senses lift while reading the essay. However, I'd like to see students also reading well-written critical pieces on subjects as seemingly trivial as Bombay films. Let's give them Ashis Nandy's incisive essay on P.C. Barua and Devdas. It will engage—and educate—students as much if not more than Shakespeare and Blake.

Why is it that English textbooks, including the one I was sent, are top-heavy with hagiographies of our national leaders? I have rarely seen letters printed in these books. There is very little travel-writing. There is no space ever for quality journalism. In general, we should also be publishing more women writers. To my students in America, I have taught Mahasweta Devi, Ismat Chughtai, Urvashi Butalia, and Arundhati Roy. Why are these writers not being taught in the places where I studied in India?

In the textbook that sparked these reflections, I found a story by O. Henry called "After Twenty Years." I had read this story in my English class twenty years ago. The lines of dialogue and the characteristic, surprising O. Henry twist at the end of the story came flooding back as I turned the pages. But this experience also made me distrust my pleasure and my nostalgia. Why are textbooks so remarkably unchanged even after decades?

The most disturbing aspect of the controversies over the history textbooks has been the extent to which current political interests determined what was taught in the classroom. That was detrimental, no doubt, but in the matter of English textbooks the opposite has been true. Our textbooks have remained for the most part trapped in the bubble of their own past. They continue to be hodge-podge collections of quaint pieces, somewhat suspect in their usefulness, a bit like the clay-objects strewn beside a corpse in a ceremonial grave. It is no surprise that in our professional use of the English language, as a people, we remain stiff, formal, awkward. Unless these textbooks are radically changed, our teachers will remain mummy-makers, wrapping cotton around our children's mouths.


****

* In the same issue of the Hindu, the following commentary on Amitava Kumar's Bombay-London-New York was also published. The writer is Pradeep Sebastian.

On an impulse, I decided to read Amitava Kumar's Bombay, London, New York again. I read it in a hurry when it first came out in 2002, noting with pleasure that it was, among many other things, the first really good book on reading written by an Indian. Reading it this time, I discovered with excitement that it is not only still the best Indian book about how and why we read but also an original, riveting piece of non-fiction. (His last book, Husband of a Fanatic, is another brilliantly sustained work of literary non-fiction.) Bombay, London, New York is a meditation on the self, the home and the world as experienced in books — a sort of "Amitava Through the Looking Glass of Books." It had been a long time since I read a book with such absorption. When I first began to reread it a few weeks ago, I read chapter after chapter hungrily, admiring its quiet brilliance, taking pleasure in its prose. I realised I would finish it too quickly and wanted to savour it. I took to rationing the chapters — two a day, I decided. One day I found myself returning eagerly from wherever I had gone to get back to the book.

While the theme of Diaspora runs memorably through his work, Kumar uses it to explore what home means in poignant, complex ways. "What I am always going back to is the moment when I was going away," he writes in Bombay, London, New York. "The movement I am most conscious of now is the movement of memory, shuttling between places. One place is home, the other the world." The different journeys Amitava Kumar makes in the book — actual and from memory — are insightful, deeply moving and clarifying for both the reader at home and the reader abroad. His perspectives are tough-minded, unsentimental, nuanced. For instance, in the chapter "Traveling Light," he writes that what he is asking for is not that we turn our backs on the past but, "Rather, the point is to ascertain what our narratives of travel are going to be... what I'd like to know more about are the day to day struggles, successes, failures, and confusions of the ones who leave home to seek better fortune elsewhere. And equally crucial, what I want to see are accounts of what is suffered as well as celebrated in the most ordinary of ways by those who do not leave, those who stay behind, whether because they want to or simply because it cannot be otherwise... What if we were to replace all the hypocritical, self-mythologising accounts of expatriate fiction... with imaginative maps of toils and tales of small, unnoticed triumphs?"

The book's structure is beguiling: moving back and forth in suspenseful and surprising ways from personal narrative to marginalia on contemporary Indian fiction to cultural and political criticism. The photographs (by the author) that accompany the book are lovely — both the pictures themselves and the idea to use them that way. (One photograph in it — two young, striking-looking South Asian women taking a cigarette break on a stoop — is something you almost want to own: It feels like a favourite still from a favourite movie.) The epilogue titled "Indian Restaurant," an account of an older, burnt-out academic, Shastriji, befriending a younger Kumar at an American university, reads like a wonderful chapter from a novel you don't want to see end. (His new, eagerly awaited, yet-to-be-out book is a novel!)

I have a favourite passage in the book: a visit Kumar makes to the Khudabaksh library in Patna, his hometown. With my love for descriptions of the holding and handling of books, I found myself seduced. The library, the author tells us, is perhaps the richest manuscript library on Islam in the world and it is full of hidden treasures, such as 22,000 handwritten books, out of which at least 7,000 are rare manuscripts. The old, gentle librarian with his shaky right hand shows Kumar a "priceless book of poems by the Persian Hafiz that was presented by the Mogul ruler Humayun to the emperor of Iran... The librarian's dark finger hovers over the lines that the emperor had inscribed. The page is filigreed in gold, the bare portions stained with age. I want to touch the page myself. I ask the librarian's permission, and he says yes, I gently place my index finger where the emperor has signed his name."

What is particularly remarkable about Amitava Kumar's writing (to read his essays look at http://www.amitavakumar.com/) is the way he puts himself on the line over and over again in a way few Indian writers would. He writes in the tradition of the best personal essayists such as Philip Lopate, Joan Didion and Vivian Gornick, who write about the self and the world with a sense of discovery and intrepid candour. Kumar takes himself as the starting point and then goes on to examine his relationship with the world with even rarer, brutal, moving honesty. And yet the personal details in his books don't amount to self-absorption or self-promotion: more remarkably, his presence in the narrative, because of the risks he takes, feels self-effacing, illuminating, heroic.

Friday, January 20, 2006

Who's Afraid of the MLA?

Who's Afraid of the MLA?
By Nick Gillespie | 27 Dec 2005
http://www.tcsdaily.com/article.aspx?id=122705A
No academic conference draws more smirks and bitch-slaps than the annual Modern Language Association convention. Held every December 27-30, the MLA convention pulls together upwards of 10,000 literary scholars ranging in status from rock-star professors feeling the love of their intellectual acolytes to starving, hysterical grad students desperate for any position in a perennially tight job market.

This year's meeting, which is taking place in Washington, D.C., features almost 800 panels and presentations, ranging from Tuesday's "Women and Devotional Writing in Early Middle English" (the first literature panel listed in a program as thick as a phone book) to Friday's finale, "Gypsies in European Literature, Culture, and the Arts."

In between are meetings of groups devoted to Andre Gide, Margaret Fuller, William Carlos Williams, and seemingly every other author with more than a haiku to his name; endless job interviews in which those nervous grad students throw off more flop sweat than Thomas Jefferson contemplating a just god; and, not uncoincidentally, more cash bars than there are in heaven (or at least Brooklyn).

Despite its preeminence within academic literary and cultural studies, the MLA convention is the Rodney Dangerfield of such confabs, getting little or no respect not just from right-wingers who reliably scoff at the unmistakable left-wing bent to the proceedings but from the liberal mainstream media, who eye the jargon-choked pronouncements of the professoriate with equal helpings of disdain, derision, and dismissiveness.

Indeed, the MLA has been a running joke since 1989, when The New York Times ran a story mocking the titles of some of the conference's papers, most memorably one called "Jane Austen and the Masturbating Girl" (which on the face of it sounds far more interesting than the latest remake of Pride and Prejudice). Since then, the MLA's public image, to the extent that it has one, has been as a sort of cruise ship of fools, where loony tenured radicals prattle on about defining "the Lacanian gaze," undermining the persistence of "late capitalism," and resisting commodity fetishism (while ostensibly embracing every other sort of fetishism), and more. Such postmodern antics caused the conservative journal The New Criterion to say "Farewell to the MLA" in a 1995 article. But the MLA's politically correct and arguably even more annoying obscurantist tendencies have also provided fertile ground for an annually repeated story in the Times and elsewhere, one every bit as worn out and tedious as an Art Buchwald holiday column.

Last year, for instance, the Times pooh-poohed what it dubbed "Eggheads' Naughty Word Games" and ran through a quick litany of silly-sounding titles (somehow, the paper of record never seems to stop chuckling long enough to get around to actually reading the essays in question), including "She's Just Like Alvy Singer! Kissing Jessica Stein and the Postethnic Jewish Lesbian"; "A Place for Giggling Field Hands: Queer Power and Social Equality in the Mid-20th-Century Plantation Myth"; "'Dude! Your Dress Is So Cute!' Patterns of Semantic Widening in 'Dude'"; and "A Pynch in Time: The Postmodernity of Prenational Philadelphia in Thomas Pynchon's Mason and Dixon and Mark Knopfler's 'Sailing to Philadelphia.'" Concluded the Times:

“What any of it has to do with teaching literature to America's college students remains as vexing a question to some today as it was a decade ago.... The association has come to resemble a hyperactive child who, having interrupted the grownups' conversation by dancing on the coffee table, can't be made to stop.”

This is, to say the least, a peculiar way to frame coverage of a major academic conference in which leading scholars get together to discuss new research. As I've noted elsewhere, beneath the mechanical reproduction of basically the same story every year is the buried presumption that literary scholarship properly should be about "teaching literature to America's college students." Does the Times worry about this same question when the annual chemistry conference comes to town?

As important, such a take misrepresents the vast bulk of MLA papers and panels, which not only don't have laughable titles but are devoted to recognizable subject areas, historical periods, influential authors, and serious examination of new and old texts important to specialists. But panels called "American Neoclassicism," "John Donne and the Crises of His Times: Intellectual, Political, Religious," and "Troilus and Criseyde" (to name three from this year's offerings) just don't get the belly laughs. It's also worth pointing out that the assembled scholars do indeed spend time thinking about connections between their research and the classroom. Hence, panels such as "Iconicity and Literature: Teaching Strategies"; "Teaching Indigenous and Foreign Languages"; and "How to Teach Prerevolutionary French Literature to Undergraduates and Why We Still Should."

None of this is to suggest that the MLA doesn't in many -- perhaps most -- ways live up to its reputation as one of the most reliable bastions of political correctness. In 1999, for instance, the group passed a resolution opposing the "use of sweatshop, prison, and nonunion labor throughout the academic world," as if there are no meaningful distinctions to be drawn between, say, a convict working in a Chinese textile mill and a FedEx driver delivering packages at Harvard. At an annual convention in the recent past, French social critic Pierre Bourdieu, beamed in via satellite from Paris, exhorted the tweed-cloaked masses to join unions en masse. This year's conference features sessions on "Marxism and Globalization," "Marxist Theory: Between Aesthetics and Politics," and "Academic Work and the New McCarthyism," suggesting that Karl Marx remains more warmly remembered in U.S. literature departments than anywhere else in the world outside of Havana and Pyongyang. Similarly, this year's program leaves little doubt that the p.c. Holy Trinity of race, class, and gender will not go begging for attention.

But still, if you care about literature or culture, pat dismissals of the MLA are a shame. Despite its excesses, the annual convention amounts to a State of the Union address when it comes to lit-crit studies. If it's true that we're in the midst of a culture boom -- a massive and ongoing expansion in art, music, print, video, and other forms of creative expression -- we'd be wise to keep up with the analytical tools being created and perfected in the nation's universities. And, truth be told, the MLA is far less ideologically homogeneous than one might think. Over the past several years, I even managed to organize special sessions on such market-friendly, libertarian topics as "The Economics of Culture: Non-Marxist Materialist Approaches to Literary and Cultural Studies" and "The Anti-Capitalist Mentality in Literary and Cultural Studies"; both sessions were well attended and well received. Hell, this year even boasts a luncheon arranged by the "Conference on Christianity and Literature." As with most things, there's surely more here than what you read about in The New York Times.

To that end, I'll be covering this year's MLA convention for TCS, filing dispatches on a daily basis through the week. I won't stint on reporting on ridiculous political excesses, but I'll also be on the lookout for new and interesting developments in lit-crit that might just help us understand our text-soaked world a little better. And I'll be checking out those cash bars, too. Really, who could walk by something called "Romantics Cash Bar and Dinner," arranged by the Keats-Shelley Association of America, without stopping for a drink or two?

The Kids Are All Right, Dammit

WASHINGTON -- As the 2005 Modern Language Association annual convention officially got underway last night, attendees could choose from panels on "Travel Writing and Empire" (a growth field over the past decade or so, as "postcolonial studies" looking at the interplay between cultural artifacts and geopolitics has gained in popularity); "Contemporary Fiction and the Novel of Ideas" (featuring readings of books by Richard Powers, Tariq Ali, and Nicholas Mosley); "Religion and Cultural Studies: Postmodern Approaches" (which included interesting-sounding papers on "Selling Religion and Literature in Cold War America" and "Georges Bataille's Yoga Practice" -- somehow I'm guessing the Surrealist author of several of the dirtiest books ever was into the Tantric variety); and much, much more.

I sat in on a panel called "English Studies and Political Literacy." As one of several "presidential forums" scattered throughout the week's proceedings, the panel not only tackled a big-picture topic but pulled in scholars from outside traditional literature departments -- in this case a journalism professor and a political scientist.

In the end, the panel didn't really come up with any silver-bullet solutions to what all agreed was a stunning and troubling decline in student knowledge of and participation in not just partisan politics but civic engagement more broadly defined. But far more interesting than what they said was how they said it: Panelists alternated between recognizing that they had to change their teaching practices in order to connect with increasingly conservative students (see table 272) and invoking large-scale social, economic, and political forces and, in the words of one panelist, a "concerted effort" by "Goebbelsian" masters of rhetoric that have turned the country to the right over the past generation or so.

In other words, the panel accurately summed up the state of exasperation that many liberal and left-leaning academics feel not just about the kids these days but about American society more generally. More promising, perhaps, were the signs that such exasperation is leading to a moderation of ideological excess rather than a heightening of it. That is, faced with a choice between a sort of bitter righteousness and increasing irrelevance on the one hand and engaging students with more fair-minded argumentation and open-ended discussion on the other, some academics are choosing the latter. That's certainly good news for kids stuck in freshman composition classes, those dreary required courses which are often little more than clumsy attempts at political indoctrination.

Political literacy, noted Emory University's Mark Bauerlein, matters because "of the heavy burden that democracy places on its citizens. Every government that is not watched closely slides into tyranny." If students don't know the basic facts of government and politics, they really have no role in the debates that will greatly affect their lives, he said. Among the signs of declining political literacy in college students, said the panelists, were lower rates of news consumption.

In 1972, said David Mindich, a journalism professor at St. Michael's College and author of Tuned Out: Why Americans Under 40 Don't Follow the News, half of all college students read a newspaper every day. Now the figure is 21 percent. He and other panelists invoked a long-term decline in youth political participation, a trend which is at the very least complicated by the turnout in the 2004 election, in which, according to the Pew Charitable Trusts, the youth vote surged more than any other group's. Indeed, with some notable exceptions, the notion of today's youth as disengaged is not particularly convincing.

Nonetheless, the panel's moderator, the University of Tennessee's Donald Lazere, attributed political apathy largely to economic forces. Students from all socioeconomic backgrounds need to work outside jobs to pay for school, and that financial squeeze leaves them less time for study. I'm not convinced that students are working more outside jobs than in the past, but if they are, that may be one of the prices paid to have more kids going directly to college after graduating high school. Since the mid-'90s, about two-thirds of graduating seniors go on to college, which is a sign of a strong higher education system (even if far fewer than 66 percent actually graduate). So is, for that matter, the amazing diversity of educational institutions -- indeed, one wishes that K-12 education offered up as many competing alternatives as you see in higher education.

And is working a job really antithetical to intellectual and political engagement? I never worked fewer than 30 hours a week during my undergraduate years, and still I found plenty of time to kill in the library, engage in wee-hours bull sessions, and indulge in lost weekends.

Lazere, who identified himself openly as a "progressive," also noted that many students are simply "lacking the basic Hirschian cultural literacy" required for engagement in civic life. Adolph Reed, Jr., a political scientist at the University of Pennsylvania, contributor to The Nation, and Labor Party stalwart, argued that "kids are sponges, they soak up what's around them." The past 25 years, he said, have been a period in which there's been "a concerted effort" to push the country to the right, to stifle left-wing political activity and dissent, and to create a consumer model of education in which the professor is really little more than glorified counter help. He singled out the rise of the for-profit University of Phoenix as a particularly reactionary trend (ironically, the University of Phoenix was created by billionaire John Sperling, a former political science professor, drug legalization advocate, and one of the biggest supporters of the Democratic Party). Such forces, concluded Reed, work to keep students from being politically conscious and engaged.

So what is to be done? Reed champions a plan for the federal government to fund tuition for any American who wants to go to college. Journalism professor Mindich argued half-heartedly for a non-binding national exam in current events and civics given to all 18-year-olds; he also called for a reinvigorated Federal Communications Commission to start forcing networks to show more news programming. Reed's plan is unlikely to ever be put in motion for any number of political and economic reasons. What's more, it is an unnecessary solution. There's no reason why middle- and upper-middle-class students shouldn't pay for their education. Targeted aid programs, both privately and publicly funded, are already available to help lower-income students; what is probably more needed is not more money per se, but a spreading of social capital that will match students with no family experience of college with schools at which they will flourish. But that hardly necessitates a massive program that Reed likened to the G.I. Bill (the remarkable self-interest of an academic arguing for essentially unlimited funding for higher education went uncommented upon). Mindich's exam seems ridiculous on the face of it -- and his view of the FCC as something other than a negative force on public discourse seems positively nostalgic.

Certainly, the last 20 years or so -- precisely the period in which cable and satellite services gave viewers an end-run around the FCC-regulated broadcast networks -- have seen a massive flourishing in all sorts of informational programming.

The University of Chicago's Kenneth Warren emphasized the role of pre-college education, even as he gently chided moderator Lazere for subtly equating "political literacy" with agreement on a particular political agenda. Lazere argued that instructors shouldn't shy away from politics in their classrooms, because "literature can't be studied independent of political literacy." In fact, he said, they should bring in a wide array of sources, including The Nation and The Weekly Standard, where appropriate or relevant. That's all well and good. But Warren keyed into one of the unfortunate subtexts of any discussion of politics in academe (and, truth be told, everywhere else too). There's always a sense that a speaker thinks that once you understand things as clearly as he does, you'll of course agree completely with him. In this context, to be educated and smart strongly implies agreement on major issues. Which is one reason why the good faith of the classroom instructor is paramount: Students will turn off immediately if they realize they are being railroaded into agreement when discussing a topic.

Emory's Bauerlein -- who during a stint at the National Endowment for the Arts produced the widely discussed report "Reading at Risk" -- pushed the point of true ideological diversity. "We need more and wider perspectives represented in the classroom," he said. "Bring in a little less Foucault and a little more Hayek. Some Whittaker Chambers to go along with Ralph Ellison." Bauerlein urged bringing in a libertarian perspective, one that will upset longstanding Manichean right-left categories. One policy proposal with which I agreed wholeheartedly was his insistence that, along with The Nation and Weekly Standard, instructors should "bring in Reason magazine" (full disclosure: Bauerlein reviewed The Anti-Chomsky Reader for Reason earlier this year). In a more contentious moment, Bauerlein also pushed for instructors to provide students with an American identity that is positive. "Often the identity students get is too negative," he said. "We need not uncritical patriotism, but some line of argument about American history that students can espouse while criticizing other elements." That sort of positive feeling would, he argued, make it easier for students to want to become engaged politically and civically.

Arguably the most surprising presentation was offered up by Patricia Roberts-Miller, an associate professor of rhetoric and composition at the University of Texas at Austin. Roberts-Miller argued that in the classroom, "everyone's politics" -- including the professor's -- "should be open to change." She talked about the downsides of what she called "Calvinist political literacy," in which individuals, irrespective of ideology, look for reasons not to engage in political conversation. If Calvinism separates people into saints and sinners whose fates are predetermined and fixed forever, Calvinist political literacy means you don't have to argue with anyone with whom you disagree, because such interaction can only reveal differences rather than persuade.

Channeling radical education theorist Paulo Freire, she warned against thinking of students as "empty vessels" into which knowledge or enlightenment is poured. Rather, they need to be respected and taken seriously even and especially when they appear to be politically reactionary or obtuse.

Most of this is common sense, of course. But what is surprising is that it's coming from a composition theorist. When one digs into press accounts about the most tendentious classes in today's universities and colleges, they are often freshman comp classes. Over the past two decades or so, many of the designers of composition curricula have consciously seen those classes as the ideal place for political indoctrination to a sort of standard left-wing agenda. As one professor friend of mine told me, she's been in department meetings where comp doyennes have declared, "This is our best shot at getting into the minds of students."

So it's heartening to hear someone in Roberts-Miller's position talking the way she does. It suggests that one of the great virtues of higher education -- open-ended discussion -- is hardly a dead letter. Ironically, the beleaguered position of the left in contemporary America, if not in the country's universities, may lead to its resurgence as it is forced to engage and persuade indifferent -- or skeptical -- students.


When Darwin Meets Dickens


Editor’s note: This is the third installment of Nick Gillespie’s coverage of the Modern Language Association’s annual meeting.

One of the subtexts of this year's Modern Language Association conference -- and, truth be told, of most contemporary discussions of literary and cultural studies -- is the sense that lit-crit is in a prolonged lull. There's no question that a huge amount of interesting work is being done -- scholars of 17th-century British and Colonial American literature, for instance, are bringing to light all sorts of manuscripts and movements that are quietly revising our understanding of liberal political theory and gender roles -- and that certain fields -- postcolonial studies, say, and composition and rhetoric -- are hotter than others. But it's been years -- decades even -- since a major new way of thinking about literature has really taken the academic world by storm.

If lit-crit is always something of a roller-coaster ride, the car has been stuck at the top of the first big hill for a while now, waiting for some type of rollicking approach to kick in and get the blood pumping again. What's the next big thing going to be? The next first-order critical paradigm that -- like New Criticism in the 1940s and '50s, cultural studies in the '60s, French post-structural theory in the '70s, and New Historicism in the '80s -- really rocks faculty lounges? (Go here for summaries of these and other movements).

It was with this question in mind that I attended yesterday's panel on "Cognition, Emotion, and Sexuality," which was arranged by the discussion group on Cognitive Approaches to Literature and moderated by Nancy Easterlin of the University of New Orleans. Scholars working in this area use developments in cognitive psychology, neurophysiology, evolutionary psychology, and related fields to figure out not only how we process literature but, to borrow the title of a forthcoming book in the field, Why We Read Fiction.

Although there are important differences, cognitive approaches often overlap with evolutionary approaches, or what The New York Times earlier this year dubbed "The Literary Darwinists"; those latter critics, to quote the Times:

“...read books in search of innate patterns of human behavior: child bearing and rearing, efforts to acquire resources (money, property, influence) and competition and cooperation within families and communities. They say that it's impossible to fully appreciate and understand a literary text unless you keep in mind that humans behave in certain universal ways and do so because those behaviors are hard-wired into us. For them, the most effective and truest works of literature are those that reference or exemplify these basic facts.“

Both cognitive and evolutionary approaches to lit-crit have been gaining recognition and adherents over the past decade or so. Cognitive critics are less interested in recurring plots or specific themes in literature, but they share with the Darwinists an interest in using scientific advances to help explore the universally observed human tendency toward creative expression, or what the fascinating anthropologist Ellen Dissanayake called in Homo Aestheticus: Where Art Comes From and Why, “making special.”

This unironic -- though hardly uncritical -- interest in science represents a clear break with much of what might be called the postmodern orthodoxy, which views science less as a pure source of knowledge and more as a means of controlling and regulating discourse and power. The postmodern view has contributed to a keener appreciation of how appeals to science are often self-interested and obfuscating. In this, it was anticipated in many ways by libertarian analyses such as F.A. Hayek's The Counter-Revolution of Science: Studies on the Abuse of Reason (1952) and Thomas Szasz's The Myth of Mental Illness, which exposed a hidden agenda of social control behind the helper rhetoric of the medical establishment and, not uncoincidentally, appeared at nearly the same time as Michel Foucault's The Birth of the Clinic. (For more on connections between libertarian thought and postmodernism, go here and here.)

At the same time, the postmodern view of science as simply one discourse among many could be taken to pathetic and self-defeating extremes, as the Sokal Hoax, in which physicist Alan Sokal planted a deliberately nonsensical parody in a leading pomo journal, illustrated. Indeed, the status of science -- and perhaps especially evolution and theories of human cognition that proceed from it -- in literary studies is curious. On the one hand, a belief in evolution as opposed to creationism or Intelligent Design is considered by most scholars a sign of cosmopolitan sophistication and a clear point of difference with religious fundamentalists. On the other hand, there are elements of biological determinism implicit in evolution that cut against various left-wing agendas -- and against the postmodern assertions that all stories are equally (in)valid.

Yet if evolution is real in any sense of the word, it must have a profound effect on what we do as human beings when it comes to art and culture.

Which brings us back to the "Cognition, Emotion, and Sexuality" panel, which sought, pace most literary theory of the past few decades, to explore universal processes by which human beings produce and consume literature. That alone makes the cognitive approach a significant break with the status quo.

The first presenter was Alan Palmer, an independent scholar based in London and the author of the award-winning Fictional Minds. For Palmer, how we process fiction is effectively hardwired, though not without cultural emphases that depend on social and historical context; it also functions as a place where we can understand more clearly how we process the "real" world. After summarizing recent cognitive work that suggests "our ways of knowing the world are bound up in how we feel the world...that cognition and emotion are inseparable," he noted that the basic way we read stories is by attributing intentions, motives, and emotions to characters. "Narrative," he argued, "is in essence the description of fictional mental networks," in which characters impute and test meanings about the world.

He led the session through a close reading of a passage from Thomas Pynchon's The Crying of Lot 49. The section in question was filled with discrepant emotions popping up even in the same short phrases. For instance, the female protagonist Oedipa Maas at one point hears in the voice of her husband "something between annoyance and agony." Palmer -- whose argument was incredibly complex and is hard to reproduce -- mapped out the ways in which both the character and the reader made sense of those distinct emotional states of mind. The result was a reading that, beyond digging deep into Pynchon, also helped make explicit the "folk psychology" Palmer says readers bring to texts -- and how we settle on meanings in the wake of unfamiliar emotional juxtapositions. As the panel's respondent, the University of Connecticut's Elizabeth Hart, helpfully summarized, Palmer's reading greatly "complexified the passage" and was "richly descriptive" of the dynamics at play.

The second paper, by Auburn's Donald R. Wehrs, argued that infantile sexual experiences based around either the satisfaction of basic wants by mothers or proximity to maternal figures grounded the metaphors used by various philosophers of religious experience. Drawing on work that argues that consciousness emerges from the body's monitoring itself in relation to objects outside of it, Wehrs sketched a metaphoric continuum of images of religious fulfillment with St. Augustine at one end and Emmanuel Levinas on the other; he also briefly located the preacher Jonathan Edwards and Ralph Waldo Emerson on the continuum too. As Hart the respondent noted, Wehrs showed that there's "an emotional underwebbing to the history of ideas." That is, a set of diverse philosophers expressed a "common cognitive ground rooted in infantile erotic experience rather than practical reasoning."

Augustine, says Wehrs, conflates the divine and human and locates the origin of love and religious ecstasy with the stilling of appetite or desire. In essence, peace is understood as the absence of bad appetites, which accords with one basic infantile erotic or physical response to wants. Levinas, on the other hand, also draws on infantile experience but focuses not on ingestion but on proximity to the mother. Both of these reactions are basic cognitive realities that all humans experience as infants; together, they create a range of possible metaphors that recur in religious discussions. On the one hand, Augustine talks of being one with God (and the mother), of an inviolate bond that shows up in somewhat attenuated form in Jonathan Edwards's imagery of being penetrated by God. On the other, Levinas stresses proximity to the Other, which mirrors infantile cognitive experience of closeness with the mother. This understanding, he said, is also reflected in Emerson's metaphors of resting and laying in Nature.

Will cognitive approaches become the next big thing in lit-crit? Or bio-criticism of the Darwinian brand? That probably won't happen, even as these approaches will, I think, continue to gain in reputation and standing. More to the point, as I argued in a 1998 article, the scholars linking Darwin and Dickens have helped challenge an intellectual orthodoxy that, however exciting it once was, seems pretty well played out. In his tour de force Mimesis and the Human Animal: On the Biogenetic Foundations of Literary Representation (1996), Temple's Robert Storey -- one of Nancy Easterlin's doctoral advisors -- warns:

“If [literary theory] continues on its present course, its reputation as a laughingstock among the scientific disciplines will come to be all but irreversible. Given the current state of scientific knowledge, it is still possible for literary theory to recover both seriousness and integrity and to be restored to legitimacy in the world at large.”

Ten years out, Storey's warning seems less pressing. The lure of the most arch forms of anti-scientific postmodernism has subsided, partly because of their own excesses and partly because of challenges such as Storey's. As important, the work being done by the cognitive scholars and others suggests that literature and science can both gain from ongoing collaboration.


http://www.tcsdaily.com/article.aspx?id=123005A
English Patients: Literature in the Digital Age

The final session I attended at the 2005 Modern Language Association convention -- "Taking It Digital: Teaching Literature in the 21st Century" -- wasn't just interesting in and of itself (though it was certainly that). It also opens up a broader discussion about the future of academic literary studies -- and suggests some ways that literature departments might turn around a long, slow decline in the number of students majoring in English and related fields.

Data from the Association of Departments of English are pretty grim. In 1950, English majors accounted for about four bachelor's degrees out of every 100 granted. That number rose steadily for about 20 years, peaking in 1971, when English degrees accounted for 7.66 degrees per 100 earned. Since then, it's been basically downhill (with the exception of a slight uptick in the late 1980s and early '90s). In 2003, English departments were back where they were in 1950, accounting for about four degrees per 100. The trends in foreign language degrees are similar, and you'll search in vain for a professor of foreign language who is not in an absolute panic over declines in student enrollment. These trends generally track with other disciplines in the humanities and social sciences and hold up when you control for gender, too (English has always been a more popular major for women than men but enrollments follow the same pattern for each).

Which is not to say that all is darkness, at least in terms of English departments. According to government data, in 2003 English was the 10th most popular major (the most popular by far was business; go here and check table 289 for a detailed breakdown). Still, the relative loss of undergraduate majors leads directly to a loss of institutional clout, which in turn leads to a loss of faculty resources, from tenure-track lines to research leaves and more. A decline in majors hits humanities programs especially hard, since they have less opportunity than, say, most of the research sciences or professional schools to attract major grants from private industry or governments.

There's no simple accounting for the decline in English and foreign language enrollments. (Indeed, there's no simple accounting for their increase in the period from the end of World War II through the 1960s.) Some of it is surely due to the changing makeup of student populations. Despite the myths that surround the G.I. Bill, attending college only really became a mass phenomenon in the United States in the 1970s. (There has been a corresponding boom in the sheer number and variety of institutions of higher education, too. There are roughly 4,200 two- and four-year colleges and universities in the United States; in 1980, there were only 3,200.) It seems logical to expect that first-generation college students are more likely to focus on majors and courses of study that are more directly tied to job possibilities. I rush to say that I'm not fully convinced of this: I was among the first generation in my family to attend college and majored in English; my sister majored in French. Still, it might be that as college became more democratic, the perceived luxury of a lit degree seemed less appealing, especially as college costs have climbed. Part of it is surely economic in another sense. In terms of starting salaries, English majors actually do pretty well, but their immediate prospects are dwarfed by those taking degrees in fields such as electrical engineering, accounting, and economics.

Part of it is likely due to the changing nature of literary criticism. There's little doubt that over the past 30 years or so, academic literary criticism, like virtually every other field, has become more insular, segmented, and jargon-ridden. Some of this is inevitable -- it represents an ever keener division of labor among scholars -- and much of it has resulted in work that is, ironically, fascinating to a broad reading public. For all the screeds -- which come from the left as well as the right -- decrying the rise of French theory and especially deconstruction, it surely means something that the term deconstruction has entered the American vernacular. Sure, the common usage may have precious little to do with the precise sense that Jacques Derrida and Paul de Man gave it, but they, more than most, would recognize that concepts morph and change over time and circumstance. It's tempting to think that if all academic literary critics wrote like, say, the great Leslie Fiedler, lit-crit would be packing in undergraduates like the carnival freak sideshows he wrote about so memorably.

Perhaps more important -- and this is something that Fiedler recognized in his excellent 1982 meditation on the changing nature of cultural consumption, production, and elite gate-keeping, What Was Literature? -- much of the work traditionally done by academics has seeped into the culture at large. In an age of cultural proliferation, where more of us can make and take whatever we want, whenever we want, Literature with a capital "L" doesn't command the same position it did even 30, much less 50, years ago (think of the difference in contemporaneous cultural standing between, say, Ernest Hemingway and Don DeLillo). The world we live in is not simply awash in an increasing amount of print, video, music, art, and other forms of creative expression; it's awash in an increasing number of critics of the same. Such a world is increasingly dispersed and decentralized, and it is extremely hard for any single locus of power to exercise much control over what we consume or how we interpret what we consume. That used to be a role academic literary studies could lead, if not quite dominate. But no more. Ironically, a world filled with more culture may inevitably be one where the ostensible guardians of culture are less important than before.

So those are some -- and only some, for sure -- of the possible reasons literature departments have been losing students. And as Zachary Karabell argued in What's College For?: The Struggle to Define American Higher Education (1999), the loss of students eventually means the loss of institutional power at colleges and universities.

The presenters at the "Taking It Digital" panel suggested some interesting ways to rejuvenate the lit classroom. Olin Bjork, a graduate student at the University of Texas, described "The Tempest Multimedia Project," in which he and a fellow instructor had students build an extensive online compendium of primary and secondary resources, including audio recordings of music used in various stagings of the Shakespeare play, period maps, and more. The students worked collaboratively and picked up Web and organizational skills while sharpening their critical faculties and deepening their knowledge of a play whose themes and various interpretations remain vital to understanding conceptions of the New World. (Bjork said the project was still available online, but I was unable to dig it up via Google.)

Mary Michele Bendel-Simso and Julianne Jasken, two professors at a small Maryland college, presented a summary of "The McDaniel College Short Story Project," in which students created extremely rich sites built around individual short stories by authors ranging from Sarah Orne Jewett to Kate Chopin to Mark Twain. The students also worked with area high-school teachers to create study guides and resources, forging meaningful community ties between the college and nearby secondary schools.

The most interesting presentation was by Alain-Philippe Durand, an associate professor of French at the University of Rhode Island, who detailed his experiences with two "online seminars" he taught in 2001 and 2005. As a member of a French department, Durand was especially cognizant of falling enrollments, and he noted with pride that between 1999 and 2005, URI had upped its French majors from 35 students to 125, largely by trying to make its offerings more interesting and relevant -- and more focused on content and analysis than on language instruction.

Following Foucault (mais oui), he noted that we live in an "epoch of juxtaposition...of the side by side" in which "networks rule." For his online seminars, he combined traditional classroom instruction with interactive Web forums in which students directly engaged some of the authors and filmmakers they were discussing as part of the course's requirements. He contacted the writers through their publishing houses and reported that most were not only willing to participate gratis but were energized by the direct connection with readers, especially readers in a foreign country. The 2005 seminar forum remains online; one set of students also created an elaborate site built around the writer Marie Darrieussecq. Such exercises allowed students not only to hone their French skills but also to explore more fully the way the tools of literary criticism and analysis function in a broad variety of settings. As Durand emphasized throughout his remarks, parents routinely insist on the "practicality" of their children's courses of study.

This isn't to suggest that the only way literary studies can or should survive is by teaching some mixture of Web skills and critical reading tools that might be applied outside of literary studies. Still, it's clear that in an information- and media-drenched world such as ours, critical reading and writing skills are at more of a premium than ever before. On this point, Durand cited Roland Barthes, who once said, with characteristic overstatement, that if the university could teach only one subject it should be literature -- because literature includes all other disciplines. What is blogging if not literary criticism gone wild?

What each of these presentations had in common was an understanding of what University of Tulsa communication professor Joli Jensen has described as an "expressive view" of culture. That is, culture broadly defined "is a way that all of us, even those of us who are not in a special guardian class, understand and symbolically engage the world." This understanding puts art, music, literature, and other forms of creative expression, including political expression, at the very center of our individual and collective experience. Which means that lucid interpretation of the same is vital.

And it need not rely on cutting-edge multimedia technologies; in the end, the panel was less about "taking it digital" and more about engaging students in the creation of meaning. To the extent that literature professors can make it clear that what they do is central to what we all do -- engage and interpret the world within ever-changing and ever-evolving traditions and communities -- literary studies may well be poised for a great 21st century.
Nick Gillespie is Editor of Reason.

Thursday, January 12, 2006

Tom Wolfe's Laundry List

http://www.thestatesman.net/page.news.php?clid=30&theme=&usrsess=1&id=102289
Silhouette: It’s for real

A classic Tom Wolfe laundry list reads much like a reporter's notebook verbatim; but then it is funny and effective, writes Stephen Bayley.
Very few are deliriously optimistic about the survival prospects of print-on-paper journalism, not least those sorry drones who get paid for it. So, two new publications are a timely stimulus to wondering what's now happening to the writing economy. First is The Hotspur, parish magazine of St John's, Healey, a village in the North Pennines.
The whim of Jamie Warde-Aldam, a rococo Newcastle ad-man with a long local family history, The Hotspur is interesting, clever and funny. Warde-Aldam’s first edition has a Gilpin painting on the cover, a starburst offering a “free concrete poem”, articles on Inuit vocabulary, WH Auden and recipes for the Northumberland snow-bound who may, at this time of year, take recourse to fried squirrel or crow and mushroom stew. Through sheer neck and nerve, Warde-Aldam has attracted, with charm rather than editorial budgets, real writers: it is free and it is fabulous.
The second is Marc Weingarten’s The Gang That Wouldn’t Write Straight (Crown Publishers, New York, 2006). This is the first proper study of The New Journalism, the name taken from a 1973 anthology which described a loosely connected group taking journalism beyond reportage and the rancid solemnities of criticism into new territory where they dealt with the “art of fact”.
Weingarten's title is taken from Jimmy Breslin's novel about hopeless Brooklyn mafiosi who had problems with the accuracy of their firearms. Breslin, a University of Hard Knocks sportswriter, was a pioneer New Journalist. But it was not a "movement" in the sense that the Beats -- Kerouac, Ginsberg, Corso et al -- were a movement, with beliefs and objectives. Rather, the New Journalists were united (mostly) by ambition and (mostly) by talent. And they were responding to a changed world. Norman Mailer described the Eisenhower years as "the triumph of the corporation", a period whose cultural achievement was "tasteless, sexless, odourless sanctity in architecture, manners, modes, styles". He must have been thinking of IBM. The suntanned JFK was a sort of catalyst, releasing among Americans, again in Mailer's words, "untapped, lonely and romantic ideas". And so we got the Sixties. The New Journalists were scarcely boho, counter-cultural radicals (Joan Didion and Gay Talese were, for instance, very up-town indeed, as was the immaculately groomed Tom Wolfe). But they extended the boundaries of journalism to cope with the greater opportunities of the age: book-length articles based on close observation in which the writer was a participant rather than a neutral observer.
There had been precedents. Jonathan Swift, Dickens, Balzac. Or Jack London living among the London poor. George Orwell, posing as a down-and-out in Paris and in London. Mailer himself, and even John Hersey, whose 1946 New Yorker article Hiroshima is regularly cited in journalism school polls as the greatest of the genre. But of the New Journalism there is no better, nor more typical, exponent than Tom Wolfe. With his bravura effects, innovative language, sesquipedalian habits, mischief, his chutzpah, his hubris, his passion, satire and humour -- not to mention his questions, even problems, of method -- Wolfe stands for the New Journalism, good or bad.
At just the time Truman Capote was finishing his six years' research on the Kansas murders whose account became 1965's In Cold Blood, Wolfe was sent as a general assignment reporter to cover a redneck stock-car race in North Wilkesboro. The Last American Hero is Junior Johnson, Yes! was published in 1964; in it the patrician Wolfe was, it is fair to assume, and to use one of his favourite words, boondoggled by the beer and the noise and the pheromones. Thus, awesomely, Wolfe describes the noise Junior Johnson's car makes: "Ggggghhzzzzzzzhhhhhhggggggzzzzzzzzz eeeeeeong! gawdam!" The year after, a collection of Wolfe's journalism appeared as The Kandy-Kolored Tangerine-Flake Streamline Baby, low-brow culture given high-brow treatment.

His dander sufficiently up, Wolfe felt able to attack The New Yorker, sanctum sanctorum of the literary establishment. With the secretive William Shawn presiding over its closed world, Wolfe wrote an article in the Sunday supplement of The New York Herald Tribune called The True Story of the Ruler of 43rd Street's Land of the Walking Dead. Shawn called it "murderous", although it was in truth no more than cheerfully disrespectful. The New Yorker's counter-attack accused Wolfe of "parajournalism", of being more interested in elaboration than development, and then, helpfully, defined New Journalism as "A bastard form, having it both ways, exploiting the factual authority of journalism and the atmospheric licence of fiction. Entertainment rather than information..." That accusation of frivolity has stuck, John Updike skewering Wolfe's recent novels as entertainment rather than literature. And as a criticism, the "bastard form" charge also has some basis. But that is exactly the point.
Wolfe compared his technique to method acting: he was turning history into novels, a process that was complete when his account of Nasa was published as The Right Stuff in 1979. Thereafter, Wolfe devoted himself to "pure" fiction, although his novels have been written with a reporter's painstaking diligence and tireless observation. The literary establishment continues to disdain him. And when you read a classic Wolfe laundry list -- "bangs manes bouffants beehive Beatle caps butter faces brush-on lashes decal eyes puffy sweaters French thrust bras flailing leather blue jeans" -- you do wonder if this is not merely a reporter's notebook verbatim; but then you remember it is funny and effective.
Others were involved in changing journalism: The New Publishers, for instance. There was Jann Wenner of Rolling Stone, Wolfe’s patron. Or Clay Felker, the editor who spun New York magazine out of the Herald Tribune’s Sunday supplement to become the most influential and most imitated magazine of the Seventies and Eighties.
New York established a service-journalism template that made stories like The Ten Best Upper East Side Sushi Restaurants with Dinner Under $100 the staples of magazine journalism to this day. Only when he manoeuvred a misalliance with the more radical Village Voice did Felker's sure step stumble, and parodists pitched stories called The Favourite Recipes of the Ten Worst Bisexual Judges in New York.
It all seems so quaint now, these battles about style and method. With its commitment to research, impressive effects, controversy and pleasure, The New Journalism was an end, not a beginning. It was not so new after all; in fact, it was rather old-fashioned. It was the last time magazine writing really mattered.
It does not really exist any more, but you can read about it in Marc Weingarten’s engrossing account. Or, if you happen to be in the North Pennines, you could always try to find a copy of The Hotspur.
— The Independent, London