A PERSONAL JOURNAL, KEPT LARGELY TO RECORD REFERENCES TO WRITINGS, MUSIC, POLITICS, ECONOMICS, WORLD HAPPENINGS, PLAYS, FILMS, PAINTINGS, OBJECTS, BUILDINGS, SPORTING EVENTS, FOODS, WINES, PLACES AND/OR PEOPLE.
About Me
- Xerxes
- New Orleans, Louisiana, United States
- Admire John McPhee, Bill Bryson, David Remnick, Thomas Merton, Richard Rohr and James Martin (and most open and curious minds)
29.11.08
ROGET
Peter Mark Roget, the future Linnaeus of the English word, began compiling word-lists at the age of eight. Why was he not playing with other children, honing his social skills? The problem was his mother, a widow at 28, who drained her son of sympathy. Catherine Romilly gave birth to a wonderful, handsome, talented boy, but couldn’t let him be himself. Independence, he would write in his Thesaurus under list 744, equals freedom of action, unilaterality; freedom of choice, initiative. But for freedom see also non-liability, disobedience, seclusion and liberation: the way one insists on freedom in the face of opposition.
Catherine Roget née Romilly came from a well-regarded and successful London Huguenot family blighted by mental illness. After the early death of her Swiss-born husband, Catherine never recovered her capacity for normal life. Her own mother had been mentally incapable, and Catherine slipped inexorably into a lesser version of her mother's state. Schlepping with his sister backwards and forwards between London and the country on the wheels of maternal restlessness, Peter never felt he had a home, except in his word-lists. He worked on them in solitude, while qualifying as a doctor.
Fully fledged at 20, five years too young to practise, he was exceptionally able but also peculiar and solitary. He hated disorder and dirt. When he took a job accompanying two rich teenagers on their European Grand Tour, their notebooks revealed his crabbed and pernickety mind. He taught them to count the windows in cathedrals, tally visitor numbers, and record how many paintings hung in a collection. He taught them to structure the world prosaically and reliably, and, at all costs, to avoid emotional surrender. His response to both human and natural life was to classify it, the foundation of his great work to come.
Roget’s next job in London was a desperate couple of months living with Jeremy Bentham, working, in the pre-Bird’s Eye 1820s, on an unlikely project involving frozen food. The cat-loving philosopher was abysmally untidy and there were no locks on the bedroom doors. Addicted to privacy, Roget couldn’t retreat to his own room and his lists without encountering Bentham’s brother and his wife and a lot of cat hair. The born classifier couldn’t get out fast enough.
Then he was a volunteer testing nitrous oxide—laughing gas—as a possible cure for consumption. This was a high-class operation: the other guinea-pigs included Wordsworth and Coleridge, who enjoyed getting high in a good cause. But for Dr Roget this was torture. He got stiff with misery at being out of control and upset the statistics by claiming the gas had no effect.
One of the problems with Dr Roget, about whom Lennon and McCartney could have written a song, was that he wasn’t terribly interesting, despite his hang-ups and misfortunes. He finally became the top professor of physiology in the country, secretly stroking his word-lists while addressing his first audience lest his vocabulary fail him. His new role in public life urged a new perfection of his long work in progress. Meanwhile he also made his mark in natural science: Darwin read the work for which Roget was best known in his own lifetime, and held him in esteem for something about frogs’ toes. Aged 45, Peter Roget even managed to get married. Having frustrated several admirers along the way, he eventually made a happy union with a beautiful and intelligent woman 16 years his junior. Then fate hurled everything personal downhill again, with loved ones dying or going mad in quick succession. He got better at dealing with such things.
While the tidy functionality of Roget’s projected personality has been reproduced by a series of modern editors talking about his achievement, the latest of them in 2002, what still remains a mystery is the nature of Roget’s imagination. To flick through the Thesaurus, which he finessed for publication in his sixties, is to feel the enormous energy it took even to aim to structure all human experience in objective and specific language. The goal was to contain imagination within the parameters of the known world. As the young Wittgenstein would declare a century later, the world is everything that is the case. Roget hoped his lists of concepts would become an intellectual and a moral tool: a real mind-expander, not just a lexical crib. Groups of words like (list 60): disciplined, obedient, schematic, systematic, tidy, dinky, just so, suggest there was something of Aristotle in him, teaching us the golden mean, but also the name for the fallen standards we actually live by.
The irony is that what helped Dr Roget avoid emotion became the tool of others struggling to express it. Sylvia Plath, noting her first kiss with Ted Hughes, observed that she was two-timing him with Dr Roget. Dylan Thomas, observed one scholar drily, used Roget in place of inspiration.
Like a work of art, the Thesaurus works on different levels. It helps generate new ideas and captures a hundred tarnished states of being, animate and inanimate, on every page. It’s an essay-writing tool and more. The Lists show exactly how a rich culture benefits from normative language, for if you don’t have the norm, how can you have the (shared) nuance? At a deeper, unofficial level, the Thesaurus is obviously a clue to Roget’s psyche. See, for instance, the record number of paragraphs of sub-lists under the heading "Disorder." Roget was a Freudian case half a century before Freud, and one might deconstruct his real magnum opus as a secret autobiography, to be matched alongside the recorded life. *
Yet that would be a pity, because the point of the Thesaurus is to be a bible of objectivity. For its merits, see list 3 under Substantiality, essentiality; subhead: reality. Consider also list 494: True, objective, rational; certain, undisputed. Dr Roget’s 990 Lists are an enduring pleasure in deviant days.
* Joshua Kendall, The Man Who Made Lists, G.P. Putnam’s Sons, 2008.
C.S. Lewis
It's the birthday of C.S. Lewis, born in Belfast, Northern Ireland, in 1898. His family's house was filled with books, and he said that finding a new book to read was as easy as finding a blade of grass. Lewis moved to England, and at first he hated everything about it: the landscape, the accents, the people. But he taught at Oxford for almost 30 years, and while he was there he became part of a literary group, a gathering of friends who called themselves "The Inklings." They met twice a week, drank beer, and discussed their writing. One of these Inklings was J.R.R. Tolkien, and he and C.S. Lewis became close friends; Tolkien inspired Lewis, who had been an atheist, to convert to Christianity. Lewis became one of the most important Christian thinkers of his day. He published Mere Christianity (1952), but his most famous books are The Chronicles of Narnia, beginning with The Lion, the Witch, and the Wardrobe (1950). C.S. Lewis said, "Some day you will be old enough to start reading fairy tales again."
28.11.08
NINE THINGS WOMEN SAY
(1) Fine: This is the word women use to end an argument when they are right and you need to shut up.
(2) Five Minutes: If she is getting dressed, this means half an hour. Five minutes is only five minutes if you have just been given five more minutes to watch the game before helping around the house.
(3) Nothing: This is the calm before the storm. This means something, and you should be on your toes. Arguments that begin with nothing usually end in fine.
(4) Go Ahead: This is a dare, not permission. Don't Do It!
(5) Loud Sigh: This is actually a word, but is a non-verbal statement often misunderstood by men. A loud sigh means she thinks you are an idiot and wonders why she is wasting her time standing here and arguing with you about nothing. (Refer back to #3 for the meaning of nothing.)
(6) That's Okay: This is one of the most dangerous statements a woman can make to a man. That's okay means she wants to think long and hard before deciding how and when you will pay for your mistake.
(7) Thanks: A woman is thanking you; do not question it, or faint. Just say "you're welcome." (I want to add a clause here - this is true unless she says 'Thanks a lot' - that is PURE sarcasm and she is not thanking you at all. DO NOT say 'you're welcome' ... that will bring on a 'whatever.')
(8) Whatever: This is a woman's way of saying "Up yours!"
(9) Don't Worry About It, I Got It: Another dangerous statement, meaning that a woman has asked a man several times to do something and is now doing it herself. This will later result in the man asking, "What's wrong?" (For her response, refer back to #3.)
27.11.08
GIVE THANKS!
Today is Thanksgiving Day.
When we talk about the first Thanksgiving, we're referring to an event that happened in 1621 in Plymouth, Massachusetts. But there were actually Thanksgiving ceremonies in the United States much earlier — in 1565, 600 Spanish settlers arrived in what is now St. Augustine, Florida, and had a Mass of Thanksgiving to celebrate their safe arrival, and followed it up with a feast. Other Thanksgiving celebrations occurred in El Paso, Texas, and in the Virginia Colony.
But the most famous is the Thanksgiving in the fall of 1621, when the Plymouth colonists celebrated with the Wampanoag Indians. It was the colonists' first harvest, so it was a joyful occasion. The Pilgrims had barely survived the last winter and had lost about half their population. But since then, they had built seven houses, a meeting place, and three storehouses for food. Now they actually had food to store.
They invited the Wampanoag Indians to feast with them. The Wampanoag people and their chief, Massasoit, were friendly toward the Pilgrims and helped teach them how to live on different land and with new food sources. A man known as Squanto, a Patuxet living with the Wampanoag tribe, knew English because he had been a slave in England. He taught the settlers how to plant corn, beans, and squash and how to catch eel and shellfish. And he was their interpreter.
So the Pilgrims asked the Native Americans to share in their first harvest. Harvest festivals were nothing new; both the English and the Wampanoag had similar traditions in their culture.
At the first Thanksgiving, they didn't eat mashed potatoes and pumpkin pie, and they probably didn't even eat turkey. The only two foods that are actually named in the primary accounts are wild fowl and venison. The meal was mostly meat and seafood, but probably included squash, cabbage, corn, and onions, and spices like cinnamon, ginger, nutmeg, and pepper.
Unlike our modern Thanksgiving, this event wasn't just one day. Many of the Wampanoag had to walk two days to get to the Plymouth settlement. There were about 50 English people and 90 Wampanoag, and since there wasn't enough room in the seven houses for the guests, they went ahead and built themselves temporary shelters. In between eating, they played games and sports, danced and sang.
The most detailed account of the first Thanksgiving comes from one of the Pilgrims, Edward Winslow. He wrote:
Our harvest being gotten in, our governor sent four men on fowling, that so we might after a special manner rejoice together after we had gathered the fruits of our labor. […] At which time, amongst other recreations, we exercised our arms, many of the Indians coming amongst us, and among the rest their greatest king Massasoit, with some ninety men, whom for three days we entertained and feasted.
Thanksgiving has been celebrated as a national holiday on different dates, in different months, and one year it was even celebrated twice. It wasn't standardized until 1941, when President Roosevelt signed a bill declaring that the fourth Thursday in November would be Thanksgiving Day.
As we gather together around this table
laden with your plentiful gifts to us,
we thank You for always providing
what we really need
and for sometimes granting wishes
for things we don’t really need.
Today, let us be especially thankful
for each other--for family and friends
who enrich our lives in wonderful ways,
even when they present us with challenges.
Let us join together now
in peaceful, loving fellowship
to celebrate Your love for us
and our love for each other.
Abraham Lincoln's Thanksgiving Proclamation
The year that is drawing towards its close, has been filled with the blessings of fruitful fields and healthful skies. To these bounties, which are so constantly enjoyed that we are prone to forget the source from which they come, others have been added, which are of so extraordinary a nature, that they cannot fail to penetrate and soften even the heart which is habitually insensible to the ever watchful providence of Almighty God.
In the midst of a civil war of unequalled magnitude and severity, which has sometimes seemed to foreign States to invite and to provoke their aggression, peace has been preserved with all nations, order has been maintained, the laws have been respected and obeyed, and harmony has prevailed everywhere except in the theatre of military conflict; while that theatre has been greatly contracted by the advancing armies and navies of the Union. Needful diversions of wealth and of strength from the fields of peaceful industry to the national defence, have not arrested the plough, the shuttle, or the ship; the axe has enlarged the borders of our settlements, and the mines, as well of iron and coal as of the precious metals, have yielded even more abundantly than heretofore. Population has steadily increased, notwithstanding the waste that has been made in the camp, the siege and the battle-field; and the country, rejoicing in the consciousness of augmented strength and vigor, is permitted to expect continuance of years with large increase of freedom.
No human counsel hath devised nor hath any mortal hand worked out these great things. They are the gracious gifts of the Most High God, who, while dealing with us in anger for our sins, hath nevertheless remembered mercy. It has seemed to me fit and proper that they should be solemnly, reverently and gratefully acknowledged as with one heart and voice by the whole American People.
I do therefore invite my fellow citizens in every part of the United States, and also those who are at sea and those who are sojourning in foreign lands, to set apart and observe the last Thursday of November next, as a day of Thanksgiving and Praise to our beneficent Father who dwelleth in the Heavens. And I recommend to them that while offering up the ascriptions justly due to Him for such singular deliverances and blessings, they do also, with humble penitence for our national perverseness and disobedience, commend to his tender care all those who have become widows, orphans, mourners or sufferers in the lamentable civil strife in which we are unavoidably engaged, and fervently implore the interposition of the Almighty Hand to heal the wounds of the nation and to restore it as soon as may be consistent with the Divine purposes to the full enjoyment of peace, harmony, tranquillity and Union.
It is the duty of nations as well as of men to own their dependence upon the overruling power of God; to confess their sins and transgressions in humble sorrow, yet with assured hope that genuine repentance will lead to mercy and pardon; and to recognize the sublime truth, announced in the Holy Scriptures and proven by all history, that those nations are blessed whose God is the Lord.
- Abraham Lincoln's Thanksgiving Proclamation October 3, 1863
Giving thanks in a hard time
We've come a long way from Plymouth. It is said that in October of 1621, after a winter of privation and even starvation, the native landowners, the Wampanoags, helped the English colonists put on a celebration of Thanksgiving.
The steely and sometimes grim form of Christianity those pilgrims practiced included several formally proclaimed days of fasting and a few of thanksgiving. Say what you will about their style and humorlessness, they got the connection between being alive and the Source of Life.
We live in a pluralistic society with people of many faiths and many of no (acknowledged) faith. Presidents gingerly navigate the religious question and proclaim a Day of Thanksgiving. Merchants usually welcome the day to rev up sales. This year it seems they're dreading the losses.
And how about us? Our outgoing and incoming presidents tell us these are the worst days since the Great Depression. Our religious tradition tells us that sometimes we go deeper, and to a more honest place, when things aren't easy. I don't hold to the "no atheists in foxholes" theology, but I do agree that human stubbornness sometimes needs an outside nudge to allow the truth to penetrate.
Buddhists often characterize their practice as "mindfulness." It's a rich and subtle notion, welcomed and used by many of the most committed Christians I know. Thankfulness might be seen as a form of mindfulness--a way of "paying attention in a particular way, on purpose, in the present moment, and non-judgmentally." Instead of trying to solve the situation we're in, mindfulness means "sitting with it." This year especially, I'm going to think of Thanksgiving as the thing I do because of the actual shape we're in, not in spite of it.
If I get anywhere near that, it will be because, by some force of grace or love, I am just able to "sit with" my life, with its many gifts and its sober challenges together. Period.
I hope you'll find a way to go deeper--this Sunday as we wind up the church year with the powerful lesson of serving "the least of these," or at our Thanksgiving observance, or at a service near where you live. I hope that wherever you are, you'll let those who know what to do be your guide, just as the Wampanoags were to those pilgrims long ago.
26.11.08
Cheers to Kingsley Amis
Kingsley Amis wrote three short books on drink, collected for the first time in Everyday Drinking. The first, On Drink, is a witty, belligerent and often profound defence of the kind of drinking habits that Kingsley acquired in the Old England of mixed drinks and beer. Its recipes are based on spirits, the repeated recourse to which enabled Kingsley to suffer fools, if not gladly, then at least with a recognition that their defects are largely human.
These recipes belong to a vanished world, in which you had to think hard as to how to get as much alcohol into the system for as little outlay as possible, and in which those noxious medicines Dubonnet, Martini, Advocaat and Noilly Prat stood on the sideboard, waiting to be enlivened with vodka or gin. Wine occasionally gets a look in, but it is clear that Kingsley despised the stuff, as representing an alcohol-to-price ratio far below the horizon of a real drinker's need.
At the start, Amis announces certain 'general principles' to be followed in creating drinks, all of which can be derived, by natural drinkers' logic, from the first of them, which holds that 'up to a point [i.e. short of offering your guests one of those Balkan plonks marketed as wine, Cyprus sherry, poteen and the like], go for quantity rather than quality'. Spirits prevail over the stuff that might soften their impact, as illustrated by the Lucky Jim, which consists of 12 to 15 parts vodka to one part vermouth and two parts cucumber juice, and there is a drink for just about every ordeal that Kingsley's ordeal-filled life could be expected to present.
Thus Paul Fussell's Milk Punch (one part brandy, one part bourbon, four parts milk, plus nutmeg and frozen milk cubes) is 'to be drunk immediately on rising, in lieu of eating breakfast. It is an excellent heartener and sustainer at the outset of a hard day: not only before an air trip or an interview, but when you have in prospect one of those gruelling nominal festivities like Christmas, the wedding of an old friend of your wife's or taking the family over to Gran's for Sunday dinner'.
The books were written between 1971 and 1984; as a guide to prices, availability and so on, they are therefore entirely out of date. But who cares? Each chapter is packed with observations that, in their utter disregard for political correctness, social inclusiveness and phoney compassion, are as punchy and uplifting as the vile cocktails they describe.
The famous hangover scene in Lucky Jim is complemented here by a philosophical chapter on the hangover that is one of the great English essays of our time. Kingsley dismisses the run-of-the-mill cures that you can find in any newspaper, since they omit 'all that vast, vague, awful, shimmering metaphysical superstructure that makes a hangover a [fortunately] unique route to self-knowledge and self-realisation'.
Other writers, he believes, have illuminated the metaphysical hangover while ostensibly writing of something else: parts of Dostoevsky and Poe can be read in this way, although the greatest attempt at capturing the experience, according to Amis, is Kafka's 'The Metamorphosis', in which the hero wakes up to discover that he has been transformed into a man-sized cockroach.
Elsewhere, Amis laments the destruction wrought on the English pub. His apprenticeship as a drinks-man began with quiet conversations among smoke-blackened trophies, with drink as but one component in a profoundly English routine of social consolation. This consolation was arbitrarily destroyed during the Seventies by one-armed bandits, kitsch signs and the conversation-stopping noise of pop music. Kingsley blames the brewery chains for this violence against the very heart of English society. To the violence of the brewers, however, has been added that of the politicians, who have banned the activity - smoking - that brought people from their homes of an evening, and which both conserved and overcame their shyness.
Judging from the effect of the smoking ban on our village pub, this great English institution has now been consigned to history. And if you are seeking a requiem for the pub culture and all that it meant, then this is the book for you. It will not console you for the loss, but it will teach you how to be rude about it, with that inimitable rudeness that Kingsley perfected and which was the breast-plate across a warm, vulnerable and thoroughly decent heart.
Thanksgiving
When Americans sit down to our annual Thanksgiving meal with family and friends, we like to imagine that we are reenacting a scene that first took place in 1621. That year, having made a successful harvest after a brutal winter that killed half their number, the 50 or so surviving Colonists in Plymouth "entertained and feasted," in the words of one, a visiting delegation of nearby Wampanoag Indians, led by "their greatest king," Massasoit.
American holidays, however, sometimes reveal more about what we have forgotten about the past than what we remember. Historical records indicate that the parties dined on venison and corn rather than on the stuffing, cranberry sauce and pumpkin pie Americans have come to associate with Thanksgiving, and that the feast probably took place in the early autumn rather than November. Moreover, it is not even clear that the Pilgrims referred to their 1621 celebration as a thanksgiving. To devout Pilgrims, a day of thanksgiving was usually a solemn religious undertaking, marked by worship and, often, fasting. It was not a day spent gorging on wild deer and engaging in "recreations" with one's Indian neighbors.
Although there were sporadic local Thanksgiving days in Colonial and early America, it was not until the middle of the Civil War -- 1863 -- that President Lincoln issued a proclamation making the last Thursday in November a national holiday of Thanksgiving. Lincoln's statement suggested that thanks were being given as much for "the advancing armies and navies of the Union" as for a bountiful harvest, and the president urged special prayers for "all those who have become widows, orphans, mourners or sufferers in the lamentable civil strife in which we are unavoidably engaged."
Not surprisingly, few at the time viewed Thanksgiving as a private, family occasion. Instead, Northern civilians donated turkey and cranberries to feed Union troops, while Jefferson Davis declared separate Thanksgiving holidays for the Confederacy.
During Reconstruction, many Southerners initially expressed reluctance at celebrating what they saw to be a Yankee holiday. And yet it was at this moment, as the recently rejoined United States struggled to reconcile its populace after a divisive Civil War, that it became useful to reinvent the history of Thanksgiving. Most Americans found it far more pleasant to imagine this American holiday as originating not during the traumas of the 1860s but rather during the more distant past of the early 1600s. To partisans of the Union and the Confederacy alike, the image of Pilgrims and Indians sitting down together to a shared meal offered a comforting vision of peace between potential rivals.
Yet this new image of Thanksgiving not only allowed Americans to gloss over the deep divisions that had led to the Civil War, it also overlooked much of the subsequent history of the Pilgrims' relations with their Indian neighbors. About 50 years after Massasoit and his fellow Wampanoags enjoyed their harvest meal at Plymouth, the Colonists' seizures of Wampanoag land would precipitate a vicious war between Plymouth Colony and the Wampanoags, now led by Massasoit's son, Metacom.
Most of the other peoples in New England at first tried to avoid the conflict between the onetime participants in the "first Thanksgiving." But the confrontation soon engulfed the entire region, pitting the New England Colonies against a fragile alliance of Wampanoags, Narragansetts, Nipmucs and other Native American groups. Although these allies succeeded in killing hundreds of Colonists and burning British settlements up to the very fringes of Boston itself, the losses suffered by New England's indigenous peoples were even more devastating. Thousands died over the two years of the war, and many of those captured were sold into slavery in the British West Indies, including Metacom's wife and 9-year-old son.
Metacom met his end at the hands of a Colonial scouting party in August of 1676. His killers quartered and decapitated his body and sent Metacom's head to Plymouth, where for two decades it would be prominently displayed on a pike outside the colony's entrance. That same year, as the violence drew to a close, the colony of Connecticut declared a "day of Publique Thankesgiving" to celebrate "the subdueing of our enemies."
Perhaps it is not surprising that we choose to remember the Thanksgiving of 1621 and to forget the Thanksgiving of 1676. Who, after all, would not prefer to celebrate a moment of peaceful unity rather than one of bloody conflict? But if our public holidays are meant to be moments for self-reflection as well as self-congratulation, we should think of Thanksgiving not as a perpetual reenactment of the "first Thanksgiving" of 1621 but instead as a dynamic event whose meaning has shifted over time.
We need not forget Massasoit's pleasant experiences dining with the Pilgrims in order to remember the more troubling fate of his son at the hands of the Pilgrims' descendants. Indeed, commemorating all the many reasons Americans have expressed thanks over the centuries allows us to come to a more complete and more honest understanding of our history. For while we cannot change events in the past, we do have the power to decide what we wish to be thankful for now and in the future.
Karl Jacoby is an associate professor of history at Brown University and the author of "Shadows at Dawn: A Borderlands Massacre and the Violence of History."
25.11.08
Thanksgiving
We Gather Together
The American Thanksgiving has its origins in two very old and very different holidays: the Harvest Home feast and the formal day of thanksgiving proclaimed by church or government authorities in gratitude for a particular event, such as a military victory. Very few days of thanksgiving coincided with harvest celebrations. Thanksgiving Day began in Plymouth Colony (Massachusetts) in 1621, when the Pilgrims gave thanks for their survival and a good first crop. In 1863, President Abraham Lincoln set aside the last Thursday in November for a national celebration of Thanksgiving.
24.11.08
Criticism
An iron law of American life decrees that the provinces of thought be limited in the collective consciousness to a single representative. Like a poor man's Noah, we take one of each. One physicist: Stephen Hawking. One literary theorist: Harold Bloom. One radical social critic: Noam Chomsky. Before her death, we had one intellectual, Susan Sontag, and one only. (Now we've dispensed with the category altogether.) We are great anointers in this country, a habit that obviates the need for scrutiny. We don't want to have to go into the ins and outs of a thing--weigh merits, examine histories, enter debates. We just want to put a face on it--the logic of celebrity culture--and move on.
How Fiction Works
by James Wood
How Wood Works
It has been decided of late that the face of literary criticism shall belong to James Wood. A writer first at the Guardian (from 1992 to 1996), then at The New Republic and now, since last year, at The New Yorker, Wood has long been considered, in a formulation that soon assumed a ritual cast, "the best critic of his generation." Coming from elders like Sontag, Bloom and Saul Bellow, and nearly always incorporating that meaningless word "generation," these consecrations have bespoken a kind of Oedipal conflict, betraying the double urge first to possess one's offspring by defining them, then to destroy them altogether. For Wood has come to be seen as something more than the best of his generation: not just the best, full stop, regardless of generation, but the one, the only, even the last. Beside him, none; after him, none other. The line ends here.
Contributing to this mythology is a belief in cultural decline, as constitutive a feature of modern consciousness as its reciprocal faith, a belief in material progress. Cynthia Ozick recently called for a "thicket of Woods," a battalion of critics raised in Wood's image, to renovate not only literary criticism but literature itself. Wood is the "template," Ozick announced, from which a new cultural "infrastructure" must be built--or rebuilt. For Ozick, Wood recalls the glory days of American criticism during the middle of the last century, the age of Edmund Wilson, Lionel Trilling, Alfred Kazin and Irving Howe. Indeed, he may surpass these forebears. "We have not heard a critical mind like this at work," Ozick declared, "since Trilling's The Liberal Imagination." The Liberal Imagination was published in 1950. Everything since includes some of Wilson, most of Trilling himself and nearly all of Kazin and Howe. Perhaps Ozick was only indulging in a bit of polemical hyperbole, but the comparison she urges convinces me that there may be something to the idea of cultural decline after all. Wood may be the best we have, but to set him next to Wilson, Trilling, Kazin and Howe is to see exactly how far we have fallen.
But before we measure that distance, let us first give Wood his due. It is large. A critic's first necessity is learning, and Wood's learning is immense. He has not only read all the novels; he has read all the lives, all the letters and all the manifestoes, and he quotes them, with an exquisite ear for accent and echo, as if he'd read them all yesterday. He has read criticism, theory, aesthetics and, a special interest of his, theology. The apostate son of an evangelical childhood, he writes about literature as if our souls depended on it, which, for any serious reader, they do. For Wood, literature is truth, or as close as we're going to get to it in a world without God. In this postmodernist age, when Wood's New Yorker colleague Louis Menand can note, with a sense of amused relief, that the language of entertainment has displaced the language of moral seriousness in popular literary discourse; when Caleb Crain can declare, in the hipster literary journal n+1, that "literature is only an art," no more worthy of university instruction than wine tasting; when those same universities can turn literature over to ideologues who fear and despise it; Wood's unapologetic commitment to literature's transcendent value and criticism's high calling is his most important virtue.
The charge of that commitment is transmitted by the electricity of his style. Wood's writing is stretched taut by his command of syntax, made brilliant by his virtuosity of metaphoric coloration. For Wood, Melville "went tidally, between belief and unbelief"; "Flaubert's characters are doomed, while Chekhov's are only imprisoned"; Coleridge's style features "voluminous sentences that stretch like library corridors"; "Woolf's work is a kind of tattoo peeled off the English poets and rubbed onto her sentences"; Pynchon's Mason & Dixon, a faux eighteenth-century novel, "functions as an allegorical picaresque, rolling the brougham of itself from implication to implication, taking on extra implications at one town, and throwing off a few at the next." These are more than ornaments. In his essay on Woolf, Wood argues that the metaphoric abundance of her critical prose marks "her nearness to her subjects--her ability to use an artistic language," and so it is with him. So intimate is Wood with his authors, so feelingly does he follow the movements of their minds, that he seems to write from within the books themselves. His essays, snatches of an ongoing conversation, open midstride. Eschewing the old or fashionable question, he is always already on the second thought.
That conversation spirals around the issue that organizes all of Wood's thinking about fiction. In the words of the subtitle of his first collection, The Broken Estate, it is the issue of "literature and belief." How do novels coax us toward belief in things we know to be untrue? How do authors create believable characters? How can novelistic language be at once literary and realistic? How has the nature of verisimilitude--the image we consent to call "real"--changed over the course of literary history? What does it mean to believe a fiction in the first place? Wood addresses these questions through two principal means: broad examination of the evolution of novelistic technique and close stylistic analysis. (It is the first of these, his attention to "indebtedness" and "connectedness," his ability to think the whole of the novel's history in a single thought, that especially recommends him to Ozick.) Wood shows us how unreliable narration developed from Dostoyevsky to Knut Hamsun to Italo Svevo, how stream of consciousness begins in Shakespearean soliloquy and moves to Jane Austen's refinement of free indirect discourse before erupting into the Modernist technique that bears that name. These discussions are never abstractly theoretical but grounded in Wood's microscopic alertness to verbal texture: how Chekhov "bends" his language around his characters so that it seems to spring from their minds, how Bellow renders detail so as to make it both modernly impressionistic and classically solid, how D.H. Lawrence causes a seemingly ordinary description to glimmer with religious intimations.
In all this, Wood is centrally concerned with the ways novelists tell the truth about the world, how they "produce art that accurately sees 'the way things are,'" and it is here that we begin to see both his project's deepest motives and the first of its limitations. Wood's ideal authors are those, like Chekhov and Mann and the Sicilian writer Giovanni Verga, who are able to invent characters who seem to break free of their creators' intentions, who feel "real to themselves"--and thus to us--because they "forget" they are fictional. A novelist's ultimate achievement is to enable us to know a character so well that we catch a glimpse of his inviolable unknowability, his singular quiddity--in other words, though Wood doesn't use the term, his soul. While Wood esteems Flaubert and, to a lesser extent, Nabokov, he finally finds their exquisite artistic control too confining (hence his remark about Flaubert's characters being "doomed"). For Wood, the essential authorial endowment is what Keats called "negative capability"--the ability to remain hospitable to alien styles of being and antithetical beliefs and values.
A critic who prizes realness above all else will naturally favor a realist aesthetic. Wood's touchstones are Austen, Tolstoy, James, Chekhov, Mann, Bellow, Naipaul. Of antirealist writers--experimentalists, postmodernists, magical realists; Rushdie, DeLillo, Zadie Smith--he is famously critical (though with exceptions, like José Saramago). He extols Joyce and tips his hat to Kafka and Beckett, but he has little to say about them, and when he does say something about Joyce--or Woolf, for that matter--he makes them sound like realists, avoiding everything that is characteristically experimental or Modernist about them. He acknowledges the force of J.M. Coetzee's work, which is founded on allegory and parable, but rather than investigating that force, he chooses to pick an argument with Coetzee's most realistic and therefore least characteristic novel, Disgrace.
There is a larger issue at work here, and it is precisely that of literature and belief. The Broken Estate is framed as an inquiry into the transmutations of belief as it migrated from religion to fiction during the nineteenth and twentieth centuries. In its title essay, Wood remarks that "the child of evangelicalism, if he does not believe, inherits nevertheless a suspicion of indifference. He is always evangelical. He rejects the religion he grew up with, but he rejects it religiously." A kind of theologian manqué, Wood confesses his atheism with a ritual regularity; as vigorously as literature mobilizes his emotions, he saves his deepest feelings for theological dispute, the one place his impeccably confident prose tends to lose its composure. As he says of Melville, he can neither believe nor do without belief. What he finally seems to want from fiction is that it recapture his lost faith without spilling a drop.
In order to do so, it must be, as it were, "literally" true--transparently true. It must feel exactly like life, must exhibit not, as he puts it, "lifelikeness" but "lifeness: life on the page." Of a descriptive phrase in Henry James, Wood exclaims, "Aren't these exactly the best words in the best order?" The idea suggests the possibility of a perfect transmission from world to work, from life to "lifeness," as if the artistic medium could function like a clear pane of glass. Wood knows, of course, that realism is a set of conventions, but like a liberal Catholic who understands that Jesus wasn't really divine, he would prefer to forget it. Hence his discomfort with the artful distortion, the allegorical dislocation--the bank shot, the knight's move, the indirect approach.
Too much is sacrificed on the altar of this aesthetic theology--too much in fiction that is fine; too much, finally, that is true. Magical realism is indeed unconvincing in Rushdie and Morrison, as Wood says, but what of García Márquez, who integrates it into a seamlessly imagined world? Does it matter that Borges doesn't create realistic characters? Nabokov's characters may be "galley slaves," as the novelist boasted, but he is still able to use them as, in his words, "a kind of springboard for leaping into the highest region of serious emotion." To Roland Barthes's charge that realism is merely a collection of effects, Wood correctly replies that "realism can be an effect and still be true." But so can antirealism. Wood defends realism, justly, from accusations of naïveté, but the terms in which he does so make him susceptible to the same charge. "Almost all the great 20th-century realist novels," he says, "are full of artifice," which makes artifice sound like a kind of optional ingredient, sort of like sugar, that novelists are free to add in greater or lesser amounts. Of course, everything, in every novel, is artifice. The only distinction to be made is between artifice that is flaunted and artifice that is concealed.
Wood's unwillingness to confront the contradictions in his thinking about these matters--to distinguish between realism and reality, artifice and experiment, character and person--points to a larger problem. Wood is a daring thinker, but he is not a particularly rigorous one. His powerfully associative mind tends to run him into logical cul-de-sacs that his supreme self-assurance prevents him from noticing. He often wanders from topic to topic, always too willing to be seduced from his path by the dappled description, the blooming detail. The general question tends to make him especially approximate; reading his critique of the gaseous George Steiner, I sometimes feel like I am watching two men beat each other with balloons. As at the larger scale, so at the smaller. Wood asserts that Mann's fiction is childlike because, among other things, it contains a lot of children. He says that human free will is not necessarily important to God, since God could have made us less free. While we might grant him enough room to argue that Morrison loves her characters too indulgently and then, four pages later, that she loves them less than she loves her language, we must draw the line when he tells us that Hamsun's characters "lie both to themselves and to us" and then, later in the same paragraph, that in his work "a character can lie neither to us nor to himself."
Wood's critical authority has become so daunting, it seems, that even he is afraid to challenge it. His argumentative method rests far too heavily on hand-waving, and while he is superb at turning a phrase, the fact that something sounds good doesn't guarantee that it makes any sense. Wood never stops to ask himself what his favorite formulas actually mean: characters who feel "real to themselves," who "forget" they're in a novel and so forth. These are obviously only metaphors, but metaphors for what? What, for that matter, does "lifeness" mean? And to what extent is Wood willing to take responsibility for his assertion, near the end of How Fiction Works, his new treatise on novelistic technique, that we should "replace the always problematic word 'realism' with the much more problematic word 'truth'"? Is something true (or beautiful, or good) just because James Wood says so?
For so imperious a critic, Wood is surprisingly sloppy. He repeatedly writes "literature" when he means "fiction." He confuses Jane and Lydia Bennet in Pride and Prejudice, thinks the Professor in Conrad's The Secret Agent is a real professor and fails to see that Don Quixote and Sancho Panza know exactly how they're depicted in the first half of Cervantes's work, since someone tells them at the beginning of the second (a particularly surprising oversight, given that their resulting self-consciousness shapes the couple's behavior throughout the rest of the novel). In How Fiction Works, he spends two full pages burbling over the delicious mystery, in Joyce's A Portrait of the Artist as a Young Man, of Mr. Casey's having gotten his "three cramped fingers making a birthday present for Queen Victoria" (Why Queen Victoria? Whatever could the present have been?), when anyone can see that something sardonically political is intended, a suspicion confirmed by Richard Ellmann's standard biography of the author. Wood's prodigious ability to trace lines of descent across novelistic history, usually so illuminating, can become first a bookkeeper's compulsion (he'll complete his double entry whether it's relevant to the discussion or not), then an obsessive's delusion. Cormac McCarthy's Anton Chigurh is not a "reprise" of Conrad's Professor, even if one makes Wood think of the other; the only thing the two characters have in common is that they're both scary.
The looseness extends even to style. Aside from its profusion of metaphor, the most conspicuous feature of Wood's prose is his taste for the angled modifier: "royal fatalism," "fat charity," "white comment," "trapped loyalties." A book is described as "curlingly set in the present." Again, these sound good--they are essentially a kind of compressed metaphor--but what do they mean? Sometimes Wood unpacks them; sometimes he doesn't. Sometimes I can guess; sometimes I can't. At times it seems like he just throws an adjective at a noun and hopes it will stick. At others, the technique involves the displacement of a modifier from its expected syntactic position, a trick he probably picked up from Shakespeare. We're told that "Melville's faith quivered" on a "violent bevel," but since it's hard to see how a bevel could be violent, we understand Wood to mean that Melville's faith quivered violently, as on a bevel. This kind of wobble is frequently to be found among his metaphors, as well. Melville fingers his torment "like a wounded rosary"; Woolf "embarrasses words into confessing their abstract pigments"; Steiner's wager on the existence of meaning "is no more than the milk of optimism, and is soaked in errors." But soaking milk is like burning fire, a confessed pigment is a verbal color improperly mixed and a wounded rosary is one incarnation too many.
These stylistic imprecisions, however, are programmatic. For Wood, metaphor naturally runs away with itself, overreaches, and will necessarily at times be unsuccessful. His angled adjectives and mixed metaphors are a form of cognitive exploration, a search party stumbling on its way to new discoveries. What's more, he claims metaphor is the proper language not only of fiction but of criticism. Because literary criticism, unlike that of music or art, shares its subject's language, it can "never offer a successful summation...one is always thinking through books, not about them." Moreover, "all criticism is itself metaphorical in movement, because it deals in likeness. It asks: what is art like? What does it resemble?" The language of literary criticism, Wood says, must be literary, "which is to say metaphorical."
This is a provocative idea, but it is based on several false premises and, indeed, faulty metaphors. Literary criticism may share its subject's language, but unlike music or painting, words can also be used to form concepts. Language is not only representational and emotive--that is, literary--it is also logical and analytic--that is, critical. Wood's distinction between "thinking through" and "thinking about" (both prepositions are also spatial metaphors) is another rhetorically attractive statement that turns out upon examination to be logically void. If Wood's work cannot be described as "thinking about" books--making conceptual statements about them--then neither word has any content. And the reason criticism needs to make use of the conceptual resources of language is that while it does indeed deal with what art is like, it also deals with what it means.
But beyond his theological preoccupations, Wood never shows much interest in what novels mean. His criticism shuttles between the largest scale and the smallest, the development of fictional technique over the course of novelistic history and the minute particulars of authorial style. His brilliance in describing both is unequaled, but he ignores just about everything that lies in between. He ignores the broad middle ground of novelistic form--narrative structures, patterns of character and image, symbols that bind far-flung moments and disparate levels of a text--and he ignores the meanings that novelists use those methods to propose. (This explains his factual mistakes and interpretive blunders; he simply isn't paying attention at a certain level.) Wood can tell us about Flaubert's narrator or Bellow's style, but he's not very curious about what those writers have to say about the world: about boredom, or grief, or death, or anything else in the wide, starred universe of human experience. He would always rather spend his time tasting the flavor of a phrase or giving the deck of his theoretical interests another shuffle.
Nor is it a very large deck. Wood doesn't develop his ideas from essay to essay so much as reiterate them, often in the same words and with the same examples, like a professor pulling out his old lectures year after year. (In How Fiction Works, he mutes his rhetorical pyrotechnics in an effort not to frighten off the common reader, with the predictable result that, as the title suggests, he comes off sounding even more condescending than usual.) If The Broken Estate is about literature and belief, his second collection, The Irresponsible Self, ostensibly takes up the question of "laughter and the novel." But beyond the very general notion that humor in fiction involves a historically new symbiosis of laughter and tears, comedy and sympathy, he has nothing to say about the subject, which in any case appears in the volume only intermittently. Already in the first essay, he is drawn back willy-nilly to his intellectual lodestone, the question of belief. Wood's reading may be vast, but the stock of his ideas is rather small.
It is not hard to see why. For all his interest in fiction's ability to tell the truth about the world, there is something remarkably self-enclosed about his criticism--a sense that nothing exists beyond the boundary of his consciousness, and that his consciousness contains nothing but books. In a preface to the new work, Wood assures us that he has used "only the books I actually own--the books at hand in my study" to produce the volume. The statement is truer than he knows. Wood has read all the novels and all the volumes that bear upon the novels, and he seems to think that is all one needs to do. But there is a world outside his study, and the books in his study, and one can't understand fiction without understanding that. The novel, more than other literary forms, embodies a massive engagement with the world--has massive designs upon the world--and demands a comparable engagement from its critics. Wood thinks that McCarthy got Chigurh from Conrad because he can't imagine that he got him from anywhere other than another novel. How Fiction Works offers us "a brief history of consciousness" from Homer to Modernism but without any suggestion that the representation of consciousness in literature might have something to do with what has happened outside literature. Wood treats the novelistic canon like one giant Keatsian urn, a self-sufficient aesthetic artifact removed from commerce with the dirty, human world.
Here we begin to glimpse the enormous gulf that lies between James Wood, the best we have to offer, and the New York critics, Wilson, Trilling, Kazin and Howe--and, let us add, lest we fetishize that quartet, which in any case begins to sound like a white-shoe law firm that's being taken over by Jews, another critic with strong claims to inclusion in that company, Elizabeth Hardwick. What made these thinkers so distinguished, what made their criticism so significant not only for American literature but also for this country's intellectual culture as a whole, was not great learning, or great thinking, or great expressive ability, or great sensitivity to literary feeling and literary form, though they exhibited all of these, but a passionate involvement with what lies beyond the literary and creates its context. Wilson, who wrote about everything during his teeming career, from politics to popular culture, socialist factions to Native American tribes, warned about "the cost of detaching books from all the other affairs of human life." Trilling's whole method as a critic was to set the object of his consideration within the history of what he called "the moral imagination." Kazin, whose criticism, like Hardwick's, focused on the literature of this country in particular, sought to illuminate nothing less than "the nature of our American experiences." The goal of Howe's criticism, he said, was "the recreation of a vital democratic radicalism in America." The New York critics were interested in literature because they were interested in politics, culture, the moral life and the life of society, and all as they bore on one another. They placed literature at the center of their inquiry because they recognized its ability not only to represent life but, as Matthew Arnold said, to criticize it--to ask questions about where we are and how where we are stands in relation to where we should be. They were not aesthetes; they were, in the broadest sense, intellectuals.
To turn from Wood to any one of these writers is to breathe an incomparably richer mental atmosphere. Wood's reading of the theatrical performance in Mansfield Park, which causes so much trouble for the characters in Austen's novel by stirring up impermissible feelings, is a crabbed conceit about the difference between theatrical and novelistic pacing. Trilling's, which revolutionized our understanding of this recalcitrant work, brings the history of nineteenth-century ideas about duty, sincerity and much else to bear on the novel's uncongenial insistence on "fixity" of identity and conduct. Wood's essay on Updike conducts a critique of the novelist's religious complacency that feels like an argument in a Cambridge common room--pedantic, persnickety and finally rather bloodless. Hardwick's, slyly sophisticated and elegantly droll, swift and easy and knowing, bespeaks a critic who has lived and moved in the world, and who has made her studies of character from life as well as books. For the New York critics, novelists are people; for Wood, people, including novelists, are ideas. To set his pages on Bellow next to Kazin's or Howe's is to remember everything we've been missing. On the novelist's style they are no less incisive than he, but they understand it in relation to an incomparably broader range of relevancies: Jewishness, the city, the American vernacular--human relevancies, not just aesthetic ones. Nor is this a matter of greater demographic closeness. Compare Kazin on Jewish-American fiction to Wood on postwar English fiction ("English" as in England). The former is nothing less than the psychic biography of an entire community. The latter confines itself to purely aesthetic issues; what has happened in England since the end of World War II--anything that has happened in England since the war, politically, socially or culturally--simply doesn't enter into his thinking. The comprehensiveness of his omissions is staggering.
One cannot finally fault Wood for this. No one is doing what the New York critics once did. The real question is why. The first answer, it seems to me, has to do with a general loss of cultural ambition. We no longer have anyone who aspires to be the next Joyce or Proust either. The Modernist drive to remake the world has given way to a postmodern sense of enfeeblement. The very idea of heroic criticism, like that of heroic art, is, as Menand so gleefully announced, no longer credible. Related to this is the so-called "cultural turn," the abandonment of the political dimension of radical critique over the past several decades in favor of an exclusive emphasis on social meanings--the domain of cultural studies in the academy and "cultural criticism" in the media. Gone is any sense that politics and culture are connected, or that their criticism should be connected--the real reason we no longer have any public intellectuals, whose activity is predicated on precisely that belief. It was a belief indebted, to a large extent, to Marxism. The struggle to come to terms with Marxism was central to the New York critics' intellectual formation. It is no accident that the few older writers who still practice a broad-gauge criticism, most notably Terry Eagleton, Clive James and Christopher Hitchens, have all had extensive business with that ideology, or that its disappearance has coincided with a diminishment of critical possibility. Wood's distaste for the postmodern and admirable refusal of its easy cynicisms notwithstanding, his narrow aestheticism marks him unmistakably as a product of his time.
I don't know how we're going to get back to the kind of criticism the New York critics wrote, or the kind of intellectual life that criticism made possible. Their emergence was the result of a historical juncture that will probably never recur. But I do know that we won't get back to it by taking Wood as our critical template. Ozick's thicket of Woods would be a dwarf forest. We are immensely fortunate to have him--his talent, his erudition, his judgment--but if American criticism were to follow his lead, it would end up only in a desert.
William Deresiewicz is a regular contributor to The Nation's Books & the Arts section. He was nominated for a 2008 National Magazine Award for reviews and criticism.
23.11.08
John Muir
No dogma taught by the present civilization seems to form so insuperable an obstacle in the way of a right understanding of the relations which culture sustains to wilderness as that which regards the world as made especially for the uses of man. Every animal, plant and crystal controverts it in the plainest terms. Yet it is taught from century to century as something ever new and precious, and in the resulting darkness the enormous conceit is allowed to go unchallenged. -- John Muir, "Wild Wool"
Given John Muir's status as the iconic representative of the preservationist wing of the modern environmental movement -- not to mention his influential work as a writer, amateur scientist and founder of the Sierra Club -- it is remarkable that a comprehensive account of his life has been so long in coming. But Muir's life story is complex, and an accurate telling of it has required nearly a century of the kind of scholarly sifting and sorting that Donald Worster does so expertly in A Passion for Nature. When he died in 1914, Muir left voluminous, disorganized manuscript journals and correspondence that would have to be gathered, ordered and studied before the private side of this very public man's life could be understood.
Born in 1838 in Dunbar, Scotland, John Muir emigrated with his family to the Wisconsin prairies in 1849. After a decade of backbreaking labor on the family farm, the 22-year-old Muir made it to the State Fair in Madison, where his "inventions" -- including handcrafted wooden clocks of his own design -- earned him an invitation to become a student at the University of Wisconsin. Muir studied in Madison for several years but never finished his degree, instead tramping off to Canada in order to botanize and to avoid being drafted into service in the Civil War.
When the war ended, Muir returned to America and put his mechanical genius to work in a carriage factory in Indianapolis, but in 1867 he was temporarily blinded in an industrial accident, which left him deeply uncertain about his future. As he recovered his sight, however, he became confident that he was called not to be an inventor or a captain of industry, but rather to be a student of nature in all its forms. With his life's direction clarified, Muir left Indianapolis and walked a thousand miles, through the Appalachians to the Gulf of Mexico, botanizing nearly every step of the way.
Muir ultimately followed the nation west, to California. He was in search of mountains rather than gold. He would make seven trips to Alaska to study glaciers and would also travel the globe to investigate trees, but he was most strongly associated with the Sierra Nevada (which he famously called "the range of light") and Yosemite. Although Yosemite was not the nation's first national park (Yellowstone has that distinction), Muir is nonetheless often credited with having been the father of the national park system, for it was his argument on behalf of nature aesthetics, recreation and preservation that helped Yosemite to become the first national park established with the goal of protecting nature from commercial logging, grazing and mining.
Muir would later assist in founding the Sierra Club to help protect the environment. By the start of the 20th century, he was the best-known and most public spokesman for a new environmental ethic -- he challenged the presumptions of anthropocentrism, argued for the spiritual and aesthetic value of nature, and advocated for environmental preservation in an age largely dominated by ruthless industrial forces.
It is difficult to take the measure of Muir's life as a scientist. He was a gifted amateur naturalist, and his expertise as a botanist was widely celebrated. More important was his insightful work in glaciology. A young and mostly untutored Muir challenged the dominant geological theory regarding Yosemite's formation. Basing his arguments on glacial striation patterns on rock and on other observations made during his Sierra rambles, Muir insisted that the sublime landscape of Yosemite had been sculpted by ice rather than caused by a catastrophic subsidence of the valley floor (the scientific explanation then endorsed by most professional scientists -- including Josiah Whitney, who lambasted Muir as an "ignoramus"). Muir was right, of course. He not only wrote articles and gave lectures making his case, but also used his courage and skill as a mountaineer to discover that living glaciers, remnants of the last ice age, still existed in the remote high Sierra. Even the geological catastrophists found it difficult to repudiate this physical evidence for the glacial theory.
Despite such achievements, science was not Muir's true vocation; he soon abandoned strictly scientific writing in favor of direct wilderness experience, a more lyrical form of nature writing, and a calling to save what he could of America's wild places. He referred to himself not as a glaciologist, but rather as a "poetico-trampo-geologist-botanist and ornithologist-naturalist." Although he received honorary degrees from Harvard, Yale and Berkeley, Muir refused offers to become a professional scientist or an academic, instead choosing a more experiential and reverential path through the wilderness.
Worster is an environmental historian whose distinguished career includes such books as Nature's Economy: A History of Ecological Ideas, The Ends of the Earth: Perspectives on Modern Environmental History and A River Running West, a biography of John Wesley Powell. In A Passion for Nature, Worster has given us what is clearly the most detailed and complete account of Muir's life. He has soundly rooted this biography in manuscript evidence rather than depending on popular accounts of Muir's adventures.
Muir's status as the iconic representative of the fledgling environmental movement gave him a larger-than-life role in the American imagination. As a result of this exaggerated public persona it has sometimes been difficult to distinguish the iconic "John O' Mountains" from John Muir the man.
A Passion for Nature helps correct a number of distortions and omissions that are perennial in popular representations of Muir's life. For example, we have generally preferred to envision Muir as a solitary rambler, alone in the mountain tabernacle of the high Sierra. Worster's account, if less dramatic, makes clear that although Muir was often a solo wilderness adventurer, he was also a remarkably sociable man who nurtured and enjoyed his relationships with family and friends.
Likewise, Muir has often been depicted as an inspired pauper who renounced commercial activity in favor of the unsullied wilderness, a distortion that has caused us to elide the long chapter of his adult life that was devoted to profit-generating agricultural enterprise. Worster brings Muir's Martinez, California, fruit-growing business into focus. He also shows that Muir was close friends with many of the wealthiest and most powerful men of his day (including Theodore Roosevelt and the railroad magnate E. H. Harriman), and that the adult Muir was rather well off for an inspired tramp.
Worster is also more honest than many previous commentators in acknowledging that Muir was, despite his generally egalitarian views of human and nonhuman beings alike, deeply ambivalent about Native Americans and also about Chinese immigrants, a number of whom worked as laborers on his ranch. If A Passion for Nature is biography rather than hagiography, however, it is far from being an exposé; on the contrary, Worster remains respectful of Muir's vision and accomplishments throughout.
Among the few disappointments is Worster's tendency in the book to focus on examining Muir as an actor in the drama of American environmental history, only rarely analyzing his work as a writer. It is true that the literary Muir was a late bloomer who did not publish his first book until 1894, when he released The Mountains of California at age 55. Nevertheless, it was his writing that moved so many Americans to recognize, develop and express their own relationship to nature. In scores of periodical essays, as well as in the half dozen books he published during his lifetime, Muir crafted new approaches to writing about the relationship between nature and the self, and in so doing helped shape a genre of American environmental literature that remains vibrant today. Although A Passion for Nature is not intended as a literary biography, closer analysis of Muir's literary techniques -- including his deliberate self-fashioning of the "John O' Mountains" icon -- would have been a welcome addition to Worster's treatment.
A Passion for Nature is an excellent, readable, engaging piece of scholarship that should now be considered the definitive biography of one of America's most influential advocates for nature. Although John Muir was not an "American scientist" in the strict sense, his life and work provide an inspiring model of the sort of broad-minded, multidisciplinary, ethically engaged relationship to nature on which support for environmental protection still depends.
22.11.08
STUFF
The Comfort of Things
Daniel Miller
This book sums up how far social anthropology has progressed since Henry Mayhew wrote about the skull shapes of costermongers in the 19th century. Daniel Miller's approach is more in keeping with that of the wild and weird Tom Harrisson and the pioneers of Mass Observation in the 1930s. Having studied cannibalistic tribes in the New Hebrides, Harrisson despatched researchers to Bolton and north London to spy on the British working class at play. They reported on, among other things, the fixation with astrology, the football Pools, and "the cult of the aspidistra". These brief expeditions were undertaken as a tentative consumerism began to lighten the lives of the masses. At the time, George Orwell, having returned from his sojourn in Wigan, suggested that fish and chips, tinned salmon, radio and strong tea might have averted revolution.
Ultimately, if hope lay with the proles it lay with them as consumers. This, at least, was the contention of Dr Gallup, whose market research techniques attempted to understand the British as consumers, just as Mass Observation attempted to understand them as citizens. In The Comfort of Things, Miller investigates the citizens of contemporary London by way of their consumerism - or at least their material possessions, in an era of unprecedented mass consumption.
Initially Miller - currently a professor of anthropology at University College London - took the conventional approach to his craft, using expeditions to India, Trinidad and the Solomon Islands to investigate contemporary humanity "through the material form". In this new book he challenges the assumption that an attachment to things makes us more materialistic and superficial, consequently ruining the true potential of relationships. It is an assumption that environmental fundamentalists and certain psychologists line up behind when blaming the "affluence" of the masses for every earthbound evil. It is these and those of a similar mindset that you hope Miller might be addressing when arguing that such clichés and assumptions are seldom put to the test. "Possessions often remain profound," he says, "and usually the closer our relationships are with objects, the closer our relationships are with people."
But this is only part of the wider question he addresses in The Comfort of Things. It's a question that goes to the root of social science: what rituals and customs do human beings create to bring each other together? He argues that contemporary Londoners do not live their lives according to the cosmology of a religion or a belief in society. In fact, he echoes Margaret Thatcher in suggesting that there may no longer be such a thing as society, simply individuals in relationships with other people and objects. The latter is the impetus for this book, and a year and a half of research spent interviewing the inhabitants of a street in south-east London.
The "Stuart Street" of The Comfort of Things is an arbitrary choice, but its very ordinariness makes it interchangeable with other neighbourhoods in the capital. Diversity rather than homogeneity is what interests Miller, and this is what distinguishes the endeavour from the Britain that Mass Observation investigated. Only 23 per cent of the inhabitants of Stuart Street are London-born, and many of the 30 individuals allocated a "portrait" in this book hold allegiances to foreign localities. The modern London is fragmented, and the loss of identity has become its defining characteristic. In Miller's findings collectivism and community do not have an effective role to play in Stuart Street or elsewhere in the metropolis. "If ever we lived in a post-society, whose primary focus is on diversity rather than shared or systematically ordered culture, the London street is that post-society."
As such, for Miller, the study of material culture is the clue to understanding modern values. However, it is the characters less defined by the objects that surround them which prove to be the biggest finds in this book. There is Malcolm, a man whose email address is more of a "home" than his accommodation on Stuart Street, where his desire to embrace a digital existence has him jettisoning ornaments and accoutrements for a virtual life on the laptop. And the opening chapter of the book, "Empty", is the story of George, a 76-year-old who is more the stuff of fiction - the missing link between Melville's Bartleby and Miss Havisham. "It was after meeting George that we found ourselves in tears," writes Miller. "Because in every other instance there was a sense that at least that person had once lived. This was a man more or less waiting for his time on earth to be over, but who had never seen his life actually begin."
George is someone whose existence has been entirely dependent on the say-so of others, ranging from his parents to the state. Even the business of obtaining objects and decorating his flat requires decisions that are too big for him to deal with. His environment is beyond that self-conscious minimalism, that ethical thrift or that anti-consumerism which becomes its own lifestyle choice. His is a home where nothing survives as a clue to the history, or even the existence of its sole inhabitant: no mementoes, ornaments, photographs.
George is therefore the character who rattles part of Miller's thesis: "People sediment possessions, lay them down as foundations, material walls mortared with memory, strong supports that come into their own when times are difficult and the people who laid them down face experiences of loss."
Each portrait in The Comfort of Things is a chapter that reads like a short story. Miller has a tenderness and an affection for these characters, and his descriptions sometimes soar like passages in a novel, although there are moments when the author's projections tend to hint more at his own limitations than those of his subject. One such is the suggestion that George's lack of identity and passion for royalty make him ideal fodder for fascism.
Between the alienated and dysfunctional figures unearthed by Miller's research are those who find a joy and a passion in the things that help them nest and settle in a fragmented city. There is the cockney Londoner of old here, too, the breed whose bones lie beneath the city's paving stones; those forgotten by the new model "Londoner" who has rebranded the capital by way of a beloved multiculturalism that is as mythical as the "Middle England" he or she loathes. Working-class Marjorie accumulates things that, according to Miller, "never lose their rapport with the present". She is constantly changing the gallery of framed photographs that shroud her living room and watching old videos in an abode stacked with images of her family, as well as those of celebrities from the Beatles to television newsreaders.
Marjorie, perhaps more than any of the other characters in The Comfort of Things, best epitomises the theory that Miller is left with when his work in Stuart Street is done: in modern London, households and individuals alike must create for themselves the values that once defined us as a society. This is the departure that, here in the 21st century, has made social anthropology embark on a rethink. In losing the opportunity to study something known as society, it has been forced to focus solely on the individual and the home.
To a contemporary anthropologist such as Daniel Miller, "This street is New Guinea and every household in this book is a tribe."
Michael Collins is the author of "The Likes of Us: a Biography of the White Working Class", published by Granta Books (£7.99)
22 November 1963
I was driving from Cambridge to New Haven in advance of the Yale Game when President John F. Kennedy was assassinated.
The most notorious political murder in recent American history occurred this day in 1963, when John F. Kennedy, the 35th U.S. president (1961–63), was shot and killed in Dallas, Texas, while riding in an open car.
21.11.08
Michael Lewis
Excellent and perceptive piece.
http://www.portfolio.com/news-markets/national-news/portfolio/2008/11/11/The-End-of-Wall-Streets-Boom?print=true
He brought us Liar's Poker; now he brings us the endgame.
20.11.08
Celebritocracy
Fifty years ago, the sociologist Michael Young—my father—published a book that, in his own words, gave him a minor claim to immortality. A dystopian satire in the same vein as 1984, it was an attempt to sound a warning bell about various social and political trends by describing a future in which they had come to fruition. It wasn't as successful as Orwell's book, but it did enjoy some afterlife thanks to a word my father coined to describe the new ruling class that would hold sway in this nightmarish future. It was called The Rise of the Meritocracy.
People are often surprised when I tell them that my father invented the word "meritocracy"—they assume it must have been around for ever—and even more astonished to learn that he wasn't a fan. How could anyone be against meritocracy? It seems incomprehensible today. The commitment to making Britain more meritocratic has become an ideological shibboleth that almost no one dissents from.
Michael disapproved of meritocracy because he saw it as a way of legitimising inequality. After all, if everyone starts out on a level playing field, then the resulting allocation of rewards—however unequal—seems fair. Those at the very pinnacle of our society might not inherit their privileged position, as their forebears had done, but its pyramid-like shape would be preserved. Indeed, once this hierarchical structure became legitimised, as it would in a meritocratic society, it was likely that power and wealth would become concentrated in even fewer hands.
Just how prescient was The Rise of the Meritocracy? Equality of opportunity has become every bit as entrenched as my father thought it would, but that hasn't had a corresponding impact on the composition of Britain's elites. Much of today's ruling class is still drawn from a narrow band of schools and universities, and while those institutions accept only the "brightest" applicants, they have not had to compete with the rest of the population on a level playing field. They have not earned their place at the top on merit alone, which, for the purposes of his book, my father defined as IQ + effort.
The Sutton Trust—which tirelessly compiles evidence to show that Britain is not a meritocracy—has calculated that the proportion of privately educated high court judges has barely changed in the past 18 years: 74 per cent in 1989 compared to 70 per cent in 2007. And according to the trust, "the proportion of independently educated top newspaper editors, columnists and news presenters and editors has actually increased over the past 20 years."
Analysts of the broader sweep of social mobility are divided on how much it has slowed down (see David Goodhart's previous article), but there is some consensus that there has been a falling off since the time my father wrote Meritocracy. What there is no dispute about is the surge in inequality in Britain in recent decades. According to a recent survey by the OECD, income inequality grew steadily from the mid-1970s, dipped briefly in the mid-1990s, then continued to grow until 2000 when it started to dip again. Overall, the long-term trend is towards greater inequality. In 2005, the earnings gap between rich and poor was 20 per cent wider than it was in 1985.
This raises the question of what, if anything, legitimises Britain's current levels of inequality. In the absence of genuine equality of opportunity, what secures the consent of ordinary people to the unequal distribution of rewards? To put it another way, if the meritocratic bulwark against egalitarianism that my father identified has failed to materialise, why are higher taxes on the rich not more popular?
Writing in the 1960s, the sociologist WG Runciman, author of Relative Deprivation and Social Justice, argued that ordinary people tolerate high levels of inequality because they don't compare themselves with those at the top, but with people like themselves. By that measure, they are far better off than they were 50 years ago, even if their incomes have grown by a smaller percentage than the top earners.
However, this argument doesn't seem plausible any longer. Mark Pearson, the head of the OECD's social policy division, has identified something he calls the "Hello! magazine effect" whereby people now compare themselves with the most successful members of society, thereby increasing their insecurity and sense of deprivation. This appears to be tied up with the decline of deference. A person's social background may still affect their life chances, but it no longer plays such an important role in determining their attitudes and aspirations, particularly towards those higher up—and lower down—the food chain. That famous sketch on the Frost Report in which Ronnie Corbett, Ronnie Barker and John Cleese explained the workings of the British class system—"I look up to him because he is upper class, but I look down on him because he is lower class"—now belongs to a bygone age.
As Ferdinand Mount notes in Mind the Gap: "The old class markers have become taboo… The manners of classlessness have become de rigueur." To put it another way: a profound increase in economic inequality has been accompanied by a dramatic increase in social and cultural equality. We can see this most clearly in changing attitudes to popular culture. It is a cliché to point out that the distinction between high and low culture has all but disappeared in the past 25 years or so. In this free-for-all it is high culture that has been the loser, with most educated people under 45 embracing popular culture almost exclusively. As a student in the mid-80s, I was proud to call myself an "Oxbridge Gooner"—one of several dozen students at Oxford and Cambridge who regularly attended Arsenal games—and such groupings are commonplace now. The rich and the poor no longer live in two nations, at least not socially. Economic divisions may be more pronounced than ever, but we support the same football teams, watch the same television programmes, go to the same movies. Mass culture is for everyone, not just the masses.
***
Yet if Britain is no longer a deferential nation—if its citizens don't accept that their place in life should be dictated by their class status—why is egalitarianism still the dog that hasn't barked in British politics since 1979? Could it be that, partly because of the power and ubiquity of popular culture, Britain is now perceived to be far more meritocratic than it actually is? This phenomenon has been widely documented in America, where belief in the meritocratic American dream persists despite low social mobility.
If this is the case, I believe it is largely due to the emergence of a new class that my father didn't anticipate and which, for want of a better word, I shall call the "celebritariat." I am thinking of the people featured in Heat magazine, rather than Hello!—the premier league footballers and their wives, pop stars, movie stars, soap stars and the like. For all its shortcomings, the celebrity class is broadly meritocratic and because it is so visible it may help to persuade people that Britain is a fairer place than it really is.
One of the criticisms levelled at Britain's professional elites is that they have become closed shops, creating insurmountable barriers to entry. The same could not be said of the celebritariat, a class that is constantly being refreshed, with old members being forced out to make way for the new. Indeed, we now hold national competitions—The X Factor, Pop Idol, Britain's Got Talent—to discover genuinely deserving candidates to promote into the celebrity class.
If the celebritariat really does play a role in legitimising economic inequality, it is also because ordinary people imagine that they, too, could become members. A YouGov poll of nearly 800 16-19-year-olds conducted on behalf of the Learning and Skills Council in 2006 revealed that 11 per cent said they were "waiting to be discovered."
Some commentators believe that the preponderance of reality shows and their casts of freaks and wannabes—the lumpen celebritariat—have devalued the whole notion of stardom. Yet the YouGov survey discovered that appearing on a reality television programme was a popular career option among teenagers, and another poll found 26 per cent of 16 to 19 year olds believe it is easy to secure a career in sports, entertainment or the media. If the existence of the celebrity class does play a role in securing people's consent to our winner-takes-all society, then the fact that the entry requirements are so low helps this process along. If people believe there is a genuine chance they might be catapulted to the top, they're more likely to endorse a system in which success is so highly rewarded. To paraphrase the advertising slogan for the National Lottery, it could be them. As with the lottery, people may know that the actual chances of winning are low but the selection mechanism itself is fair—a level playing field. After that, their "specialness" will take care of the rest.
The music critic Albert Goldman identified this attitude in his 1978 book on the disco phenomenon: "Everybody sees himself as a star today. This is both a cliché and a profound truth. Thousands of young men and women have the looks, the clothes, the hairstyling, the drugs… the self-confidence, and the history of conquest that proclaim a star… Never in the history of showbiz has the gap between amateur and professional been so small." (Quoted by James Wolcott in "Now, Voyeur," Vanity Fair, September 2000.)
The celebritariat—and the illusion of easy access to it—has played the role in postwar Britain that my father expected to be played by the educational meritocracy. The Rise of the Meritocracy ends with a riot at Peterloo in which the disenfranchised masses overthrow their new masters. This is largely because the meritocratic class has become so efficient at identifying the most able children at birth that the ones left behind have no hope of making it. Will the day come when the celebritariat endangers its own existence by becoming a self-perpetuating elite, closed off to new members? There are signs that this is beginning to happen, with the children of famous people inheriting their celebrity status, just as aristocrats inherited their parents' estates. It sounds odd to say it, but for those like my father who dream of turning Britain into a socialist paradise, the greatest cause for hope may be the existence of Peaches Geldof.
19.11.08
Milton
When Richard J. DuRocher, a professor of English at St. Olaf College, in Northfield, Minn., told one of his classes that he was running a marathon, everybody cheered. Then he told them what kind of marathon: a straight-through, out-loud reading of John Milton's Paradise Lost — all 12 books of it, from Satan's fall to Adam and Eve's eviction from the Garden of Eden.
If that sounds eccentric, even masochistic, consider that December 9 is the poet's 400th birthday. What better way to mark the quatercentenary than to read his greatest work aloud? Marathons are happening at the University of Cambridge, Milton's alma mater; at the University of Richmond; and at dozens of other places, notes Mr. DuRocher.
If it's good enough for James Joyce, whose Ulysses gets a public airing every Bloomsday (June 16), it's good enough for John Milton. But is it heaven or hell for the participants?
At 9 a.m. on a blustery Saturday in late October, Mr. DuRocher kicked off the Milton marathon in the two-story atrium, or Crossroads, of Buntrock Commons, St. Olaf's student center. The quiet-enthusiast type, he had corralled a crowd of about 30 to get things started, including the students in his seminar on Milton and ethics. Most people brought their own copies of Paradise Lost, but the professor had a stack of extras handy.
Paula Carlson, a vice president at the college, took the microphone and delivered the famous opening lines: "Of man's first disobedience, and the fruit/Of that forbidden tree, whose mortal taste/Brought death into the World, and all our woe,/With loss of Eden, till one greater Man/Restore us, and regain the blissful Seat,/Sing, Heavenly Muse... ."
A red apple made the rounds, each reader tempting the next. By 9:53 they'd made it to Book II. Students traveled the stone stairs to the second-floor cafeteria like angels ascending to and descending from the celestial realm.
In between his reading stints, Chad Goodroad, a senior majoring in English and political science, hawked black "Milton Marathon" T-shirts at a card table. Someone asked him how sales were. "Crazy," he said. "Actually, kinda slow." A student in one of the shirts knitted her way through Book III. A professor's toddlers played nearby on a harvest display of pumpkins and sheaves.
By Book IV, the lunchtime crowd had begun to wander into the Commons, making it hard to hear, but Milton's blank verse was unstoppable. The marathon relocated to a gable room in the campus library. New readers showed up; others left for sports matches, then wandered back a couple of hours later. Mr. DuRocher estimated that 200 people had shown up over the course of the day.
Participants fortified themselves with coffee and Subway sandwiches. Another English professor contributed a devil's-food cake and a pair of devil's horns. Somebody drew a picture of the archangel Michael on the chalkboard.
"It's cool," Mr. Goodroad observed midafternoon, when the group had made it to Book VI. "It's kind of like a purging."
Miraculously, nobody's energy flagged. It wasn't just the coffee and sandwiches; read out loud, Milton's blank verse can be propulsive, and the readers had caught the rhythm.
At 5 p.m. a student passed around a basket of apples, just in time for Book IX and the temptation of Eve. A colleague of Mr. DuRocher's with just the right British accent turned up to read the part of Satan. When Eve took a bite, everyone else did, too. The apples tasted pretty good.
Around 7:30 and Book XI, there were still 15 people in the room. A few had been along since the beginning, including Mr. Goodroad and Johanna Ruprecht, a senior English major from Lewiston, Minn., who were two of the most enthusiastic readers. Charles Drotning, of the Class of 1970, was another. A thin man with an Old Testament beard, Mr. Drotning appeared determined to stick it out until the bittersweet end.
"Have we hit the wall yet?" someone asked.
"Oh, we're way beyond the wall," Mr. DuRocher joked.
At 8:16, the end of Book XI arrived. By 9 p.m. the readers had reached the finish line: Adam and Eve's expulsion from Paradise, at the end of Book XII. Everybody, including Mr. Drotning, clapped and cheered. Then Mr. DuRocher and a handful of students went out for ice cream. Somewhere, Milton smiled.
Here are some of the things you learn when you participate in a Milton marathon:
Milton is not as boring as you think. Paradise Lost has something for everyone: Hot but innocent sex! (You thought Adam and Eve spent all their time in Eden gardening?) Descriptions of hellfire that would make The Lord of the Rings' archfiend, Sauron, weep with envy! Epic battles, with angels hurling mountains at their demonic foes! This is edge-of-your-seat material. "It's a really cool story, which I wasn't expecting," said Anna Coffey, a sophomore who took part in the reading to get a jump on her homework for a "Great Conversations" core-curriculum course.
Milton is not that hard to read out loud. As Mr. DuRocher pointed out in a set of "Guidelines for Reciting" he handed out before the marathon, "Paradise Lost is written in modern English." Compared with Beowulf, Paradise Lost is a walk in the park.
Milton is really hard to read out loud. Very few people get words like "puissance" right on the first try. Milton loved a runaway sentence and just about any now-obscure classical or geographical reference he could get his hands on, many of them polysyllabic nightmares. Partway through Book VI, Mr. DuRocher offered advice to the tongue-tied. "Whenever you encounter a word you don't know, that's a word to pronounce with special certainty," he said. "It's probably best to mispronounce demonic names anyway."
It's worth it. "It's really a good poem," said Mr. Goodroad. "It's a lot better to hear it than to read it."
If that sounds eccentric, even masochistic, consider that December 9 is the poet's 400th birthday. What better way to mark the quatercentenary than to read his greatest work aloud? Marathons are happening at the University of Cambridge, Milton's alma mater; at the University of Richmond; and at dozens of other places, notes Mr. DuRocher.
If it's good enough for James Joyce, whose Ulysses gets a public airing every Bloomsday (June 16), it's good enough for John Milton. But is it heaven or hell for the participants?
At 9 a.m. on a blustery Saturday in late October, Mr. DuRocher kicked off the Milton marathon in the two-story atrium, or Crossroads, of Buntrock Commons, St. Olaf's student center. The quiet-enthusiast type, he had corralled a crowd of about 30 to get things started, including the students in his seminar on Milton and ethics. Most people brought their own copies of Paradise Lost, but the professor had a stack of extras handy.
Paula Carlson, a vice president at the college, took the microphone and delivered the famous opening lines: "Of man's first disobedience, and the fruit/Of that forbidden tree, whose mortal taste/Brought death into the World, and all our woe,/With loss of Eden, till one greater Man/Restore us, and regain the blissful Seat,/Sing, Heavenly Muse... ."
A red apple made the rounds, each reader tempting the next. By 9:53 they'd made it to Book II. Students traveled the stone stairs to the second-floor cafeteria like angels ascending to and descending from the celestial realm.
In between his reading stints, Chad Goodroad, a senior majoring in English and political science, hawked black "Milton Marathon" T-shirts at a card table. Someone asked him how sales were. "Crazy," he said. "Actually, kinda slow." A student in one of the shirts knitted her way through Book III. A professor's toddlers played nearby on a harvest display of pumpkins and sheaves.
By Book IV, the lunchtime crowd had begun to wander into the Commons, making it hard to hear, but Milton's blank verse was unstoppable. The marathon relocated to a gable room in the campus library. New readers showed up; others left for sports matches, then wandered back a couple of hours later. Mr. DuRocher estimated that 200 people had shown up over the course of the day.
Participants fortified themselves with coffee and Subway sandwiches. Another English professor contributed a devil's-food cake and a pair of devil's horns. Somebody drew a picture of the archangel Michael on the chalkboard.
"It's cool," Mr. Goodroad observed midafternoon, when the group had made it to Book VI. "It's kind of like a purging."
Miraculously, nobody's energy flagged. It wasn't just the coffee and sandwiches; read out loud, Milton's blank verse can be propulsive, and the readers had caught the rhythm.
At 5 p.m. a student passed around a basket of apples, just in time for Book IX and the temptation of Eve. A colleague of Mr. DuRocher's with just the right British accent turned up to read the part of Satan. When Eve took a bite, everyone else did, too. The apples tasted pretty good.
Around 7:30 and Book XI, there were still 15 people in the room. A few had been there since the beginning, including Mr. Goodroad and Johanna Ruprecht, a senior English major from Lewiston, Minn., who were two of the most enthusiastic readers. Charles Drotning, of the Class of 1970, was another. A thin man with an Old Testament beard, Mr. Drotning appeared determined to stick it out until the bittersweet end.
"Have we hit the wall yet?" someone asked.
"Oh, we're way beyond the wall," Mr. DuRocher joked.
At 8:16, the end of Book XI arrived. By 9 p.m. the readers had reached the finish line: Adam and Eve's expulsion from Paradise, at the end of Book XII. Everybody, including Mr. Drotning, clapped and cheered. Then Mr. DuRocher and a handful of students went out for ice cream. Somewhere, Milton smiled.
Here are some of the things you learn when you participate in a Milton marathon:
Milton is not as boring as you think. Paradise Lost has something for everyone: Hot but innocent sex! (You thought Adam and Eve spent all their time in Eden gardening?) Descriptions of hellfire that would make The Lord of the Rings' archfiend, Sauron, weep with envy! Epic battles, with angels hurling mountains at their demonic foes! This is edge-of-your-seat material. "It's a really cool story, which I wasn't expecting," said Anna Coffey, a sophomore who took part in the reading to get a jump on her homework for a "Great Conversations" core-curriculum course.
Milton is not that hard to read out loud. As Mr. DuRocher pointed out in a set of "Guidelines for Reciting" he handed out before the marathon, "Paradise Lost is written in modern English." Compared with Beowulf, Paradise Lost is a walk in the park.
Milton is really hard to read out loud. Very few people get words like "puissance" right on the first try. Milton loved a runaway sentence and just about any now-obscure classical or geographical reference he could get his hands on, many of them polysyllabic nightmares. Partway through Book VI, Mr. DuRocher offered advice to the tongue-tied. "Whenever you encounter a word you don't know, that's a word to pronounce with special certainty," he said. "It's probably best to mispronounce demonic names anyway."
It's worth it. "It's really a good poem," said Mr. Goodroad. "It's a lot better to hear it than to read it."
THE TIE
Back in 1968, When a Tie Was No Tie
By MANOHLA DARGIS
For most of the world, I suspect, the year 1968 signifies upheaval, revolution, power to the people, Vietnam and My Lai, Paris in flames, Martin and Bobby, Nixon versus Humphrey. Another great rivalry played out that year in the form of a college football game. And while it seems absurd to include such a picayune event in the annals, the filmmaker Kevin Rafferty makes the case for remembrance and for the art of the story in his preposterously entertaining documentary “Harvard Beats Yale 29-29,” preposterous at least for those of us who routinely shun that pagan sacrament.
True gridiron believers doubtless know every unlikely, heart-skipping minute of this showdown. (The schools, like some others, honor their football rivalry with vainglorious capitalization, calling each matchup The Game.) On Nov. 23, 1968, the undefeated Yale team and its two glittering stars — the quarterback Brian Dowling and the running back Calvin Hill — went helmet to helmet against its longtime rival, Harvard, also undefeated. Mr. Dowling, a legendary figure whom grown men still call god (and the inspiration for Garry Trudeau’s Doonesbury character B. D.), had not lost a game he started since the sixth grade, a record that well into the fourth quarter, with Yale leading by 16 points, seemed safe.
Everything changed in the final 42 seconds as all the forces of the universe, or so it seemed, shifted and one player after another either rose to the occasion or stumbled with agonizing frailty. Gods became men as the ball was lost and found and one improbable pass after another was completed. In front of the increasingly raucous packed stadium, each play became an epic battle in miniature with every second stretching into an eternity. As in film, time in football doesn’t tick, it races and oozes, a fact that Mr. Rafferty, working as his own editor and using the simplest visual material — talking-head interviews and game footage — exploits for a narrative that pulses with the artful, exciting beats of a thriller.
What’s most surprising about this consistently surprising movie is how forcefully those beats resonate, even though you know how the story ends from the start. (Take another look at the coyly, cleverly enigmatic title, borrowed from the famous headline in The Harvard Crimson.) One reason for the excitement is the game, of course, which remains a nail-biter despite the visual quality of the footage, which is so unadorned and so humble — and almost entirely in long shot — it looks like a dispatch from a foreign land. And in some ways it was: Football fans still wore raccoon coats to games and the women in the stands cheering for Yale could not attend the college. The same month, Yale announced it was (finally) opening that door.
This history helps explain why there are no women here, at least in close-up. “Harvard Beats Yale 29-29” is very much about men, triumphant, regretful, defiant, sentimental, touchingly vulnerable men who are made all the more poignant with each image of them as young players. For some, the game was and remains the greatest moment of their lives — even better than sex, one volunteers, prompting Mr. Rafferty to ask off-camera if the man had then been a virgin (no). Mr. Rafferty, himself a Harvard man, films his subjects (Tommy Lee Jones, a Harvard lineman, included) with a lack of fuss in plain kitchens and cluttered offices. He lets them roam around their memories and, for a time, gives them back sweet youth.
HARVARD BEATS YALE 29-29
Opens on Wednesday in Manhattan.
Produced, directed and edited by Kevin Rafferty; director of photography, Mr. Rafferty; released by Kino International. At the Film Forum, 209 West Houston Street, west of Avenue of the Americas, West Village. Running time: 1 hour 45 minutes. This film is not rated.