The giant video screen at 745 Seventh Avenue, in Manhattan, is still lit up: only now, instead of the old Lehman Brothers promo, with its tossing oceans and desert sunsets, it projects the ice-blue bling of Barclays Capital, five storeys high. The problem is, though the lights are still on for finance capital, ideologically there's nobody home.
Lehman's bankruptcy marked the end of a 20-year experiment in financial deregulation. But it was Alan Greenspan's congressional testimony, a month later, that marked the collapse of something bigger: the neoliberal ideology that had underpinned it all.
It was Greenspan who had begun ripping away restrictions on financial speculation and investment banking in 1987. Last month, he said: "I have found a flaw. I don't know how significant or permanent it is. But I have been very distressed by that fact . . . Those of us who have looked to the self-interest of lending institutions to protect shareholders' equity, myself especially, are in a state of shock and disbelief."
The belief in self-interest as the guiding principle of commerce is as old as Adam Smith. What happened with the Anglo-Saxon model of capitalism was something different: the principle of rational self-interest was elevated to replace regulation and the state. Selfishness became a virtue. Inspired by Ayn Rand's credo - "I will never live for the sake of another man, nor ask another man to live for mine" - the giants of global finance revelled in amoralism. Morgan Stanley boss John Mack's legendary trading-floor motto - "There's blood in the water, let's go kill somebody" - sums up the era.
But the theory was flawed. Instead of safeguarding the property of shareholders, self-regulation drove the system to the point of collapse. Trillions of dollars' worth of capital has been destroyed. "My view of the range of dispersion of outcomes has been shaken," Greenspan conceded. That's a logical response when the range of outcomes is clustered around the collapse of the savings system, the evaporation of global credit and the bankruptcy of most banks.
But selfishness was not the only tenet of neoliberalism. Any definition of the term would include: a belief in the market as the only guarantor of prosperity and democracy; the futility of state intervention in pursuit of social justice; the creative destruction of cherished institutions and stable communities; the shrinkage of the state to regulatory functions only, and then as minimal as possible.
And the problem for the G20 leaders who will assemble at the Washington summit on 15 November is this: every single one of them has, to a greater or lesser extent, bought into the neoliberal ideology. It has dictated the direction of travel even in economies such as Brazil, India, Indonesia and China, classified as "mostly unfree" on the neoliberal league table.
The summit's most pressing task is to come up with a co-ordinated crisis response: for all the rhetoric, this is a firefighting operation not a second Bretton Woods. In the end, the route to a Bretton Woods-style settlement may be impassable for the weakened, multi-polar capitalism represented by the G20. But, even to begin that journey, there must be an honest reckoning with neoliberalism.
An ideology does three things: it justifies the economic dominance of a ruling group; it is transmitted through that group's control of the media and education; and it describes the experience of millions of people accurately enough for them to accept it as truth. But it does not have to be logical. For this reason, picking logical flaws in neoliberalism has been an exercise with diminishing returns.
For example, Milton Friedman's assertion that free-market capitalism and democracy are mutually reinforcing always looked a non-sequitur after he hotfooted it to Chile in 1975, personally urging General Pinochet to inflict a neoliberal economic "shock", even as the secret police were administering electric shocks to the genitals of oppositionists. But his theories continued to inspire policymakers.
Instead of logic, any balance sheet of neoliberalism has to begin from its outcomes. I will list five negative outcomes for countries following the Anglo-Saxon model:
In the first place, rising inequality. Between 1947 and 1973 the income of the poorest fifth of US families grew 116 per cent, higher than any other group. From 1974 to 2004 it grew by just 2.8 per cent. In the UK, the share of national income received by the bottom 10 per cent fell from 4.2 per cent in 1979 to 2.7 per cent in 2002.
Second, the replacement of high wages by high debt. The real wages of the average American male worker are today below what they were in 1979; and for the poorest 20 per cent, much lower. In 1979, personal household debt was 46 per cent of America's GDP; now it is 98 per cent. In the UK, real household incomes grew, but more slowly than in the postwar boom, until early this decade, since when they have fallen. The debt pattern, however, followed the US: 30 years ago, British households were in debt to the tune of 20 per cent of GDP; now it is 80 per cent.
Third is the redistribution of profits from non-financial companies to the finance sector. In 1960s America, the pretax profits of financial firms made up 14 per cent of corporate profits; now they make up 39 per cent. Most of this profit is not generated from financing productive business: the world's total stock of financial assets is three times as large as global GDP. In 1980, it was about equal to GDP.
The new power of finance capital not only creates asset bubbles, as with the dotcom, housing and commodity bubbles of the past decade, but it allows speculative capital to descend on individual companies, countries and industry sectors, smash them and move on. I present the current economic plight of Hungary as Exhibit A.
Fourth is the growth of personal and financial insecurity, the destruction of social capital and the resulting rise in crime. If you want data, then the four stark pages of membership graphs at the end of Robert Putnam's celebrated book Bowling Alone show the decline of almost every voluntary association in America during the neoliberal age. If you prefer qualitative research, walk the streets of any former industrial city at night.
Fifth is the relentless commercialisation of all forms of human life: the privatisation of drinking water that provoked the people of Cochabamba, Bolivia, to revolt in 2000; the creation of a private army of 180,000 military contractors in Iraq, unaccountable to international law. In these and many other instances, the functions of the state have been turned over to private companies to the financial detriment of taxpayers, the material detriment of consumers and the loss of democratic accountability.
But there is a plus side. Since 1992, there has been stability and growth across the OECD countries and beyond, albeit lower than the average growth achieved during the postwar boom years. There has been a marked fall in absolute poverty, with the number of people living on less than $2 a day falling by 52 per cent in Asia and 30 per cent in Latin America (though rising by 3 per cent in Africa) between 1982 and 2002. And though the data is mixed, many of neoliberalism's critics accept that inequality declines as per-capita GDP growth improves.
There has been a huge movement of humanity from the farm to the factory, and 200 million people have migrated from the poor world to the rich. Access to the financial system has brought rising liquidity: access to homeownership and overdrafts for families on low pay was real, whatever its macroeconomic outcome. And above all, the musty cultural and institutional barriers that made life a misery for the young in the 1960s and 1970s are largely gone; the flipside of commoditisation has been the decline of dependency and paternalism in social life.
And this has been the source of neoliberalism's strength as an ideology: borrow big-time, negotiate your own salary, duck and dive, lock your door at night. That is the new way of life for the world's workforce. My father's generation, the generation of organised workers which saw industry and social solidarity destroyed in the 1980s, could never really accept it. But hundreds of millions of people under the age of 40 know nothing else. And if you live in a Kenyan slum or a Shenzhen factory, you have seen your life chances rise spectacularly higher than those of your father's generation, even if the reverse is true in, say, Salford or Detroit.
Until 15 September 2008 (the day Lehman Brothers filed for Chapter 11 bankruptcy protection, the largest bankruptcy in US history), the left and the right were engaged in a political debate that revolved around the balance of these positive and negative impacts. Today that debate is over: we now know that neoliberalism nearly crashed the whole financial system. I will repeat, because the adrenalin rush has subsided and it is easy to forget: neoliberalism brought the world to the brink of an economic nuclear winter. Not by accident but because of a flaw in its central mechanism. It is for this reason that President Sarkozy (once labelled "Monsieur Thatcher" by the French left) declared it dead - not flawed - but dead. "The idea that markets were always right was mad . . . The present crisis must incite us to re-found capitalism on the basis of ethics and work . . . Laissez-faire is finished. The all-powerful market that always knows best is finished," he said.
So what comes next? Though governments are scrambling to deploy Keynesian anti-crisis measures - from George Bush's tax cut to Gordon Brown's borrowing hike - it is axiomatic that the developed world cannot return to the way things were before the 1980s: the Keynesian model broke down spectacularly, and could cure neither high inflation nor economic stagnation.
It is clear, from the sheer level of pain and trauma inflicted by the changes of the past 20 years, that we have lived through the birth of something. Its founding ideology was neoliberalism; its most tangible result was globalisation; and it was achieved through class struggle by the rich against the poor. But none of these facts can encompass the scale of change.
Because none of them allows for the most fundamental change - that information has become a primary factor of production. Computing power has doubled every 24 months; the internet and mobile telephony have, in the past ten years, altered the patterns of human life more profoundly than any single economic policy. Info-capitalism has been inadequately theorised: call it "post-Fordism", techno-capital or the knowledge economy; whatever the label, it remains the central fact of the early 21st century.
If you accept this, then the experience of neoliberalism looks less like the dawn of a free-market empire, more like the period between the invention of the factory system and the passing of the first effective factory legislation: between the establishment of Arkwright's mill at Cromford in 1771 and the Factory Act of 1844.
For much of that period, the pioneers of industrial capitalism believed that any regulation would kill the dynamism of the system. They too had a celebrity economist to justify their actions, namely Nassau Senior, the author of the theory that all profits were made in the last hour of the working day. The fate of capitalism, quipped reformer William Cobbett, depended on 300,000 little girls in Lancashire: "For it was asserted, that if these little girls worked two hours less per day, our manufacturing superiority would depart from us."
Child labour was abolished; minimal standards of order and humanity were imposed on the factories. But capitalism did not die - it took off. It is no accident, incidentally, that 1844 was also the year Britain, traumatised by recurrent financial panics, enshrined the supremacy of the central bank and the gold standard in legislation. If the parallel is valid, then the new regulations and institutions under discussion in Washington stand a chance not of killing info-capitalism but of unleashing it.
What are the intellectual sources for the system that will replace neoliberalism? Most of the prophets of doom in advance of the credit crunch were survivors from the Keynesian era: Paul Krugman, Joseph Stiglitz, George Soros, Nouriel Roubini, Morgan Stanley economist Stephen Roach. But with the partial exception of Stiglitz, they remain dislocated from the grass-roots opposition to neoliberalism. In turn, this opposition, dominated by the principles of anarchism and charity, has revelled in its own diversity and lack of engagement with state-level solutions.
As for the world's policymakers, they, for now, resemble the Hoover administration in 1930 or, if you are feeling really unkind, Chamberlain's British government in 1940. They are confronted by a crisis they did not think would happen. They are approaching it with the only tools they have - but they are the old tools: the old alliances, the old experts, the unreconstructed ideas and plans: Doha, Basel II, the Lisbon agenda. The IMF's conditions for bailing out eastern Europe - public spending cuts, interest-rate rises, privatisations - confirm the pattern.
The aim, made explicit during a speech on 28 October by Catherine Ashton, the EU's new trade commissioner, is to enact crisis measures while explaining to the public that "interventions and excessive use of public subsidies - while attractive today, will damage us tomorrow". This does not match the rhetoric coming out of Paris and Washington about the "end of trickle-down" and the death of laissez-faire; and it tends to ignore the fact that the most fundamental problem created by neoliberalism was not deregulation but the replacement of high wages by high debt. In other words, it is not the policy framework that is in trouble, it is the growth model.
There are three possible ways out. First, the revival of neoliberalism in a hair shirt: less addicted to the celebration of greed; with government spending temporarily replacing consumer debt as the driver of demand; and with some attempt at co-ordinated re-regulation. That is the maximum that can come out of the Washington summit.
Second, the abandonment of a high-growth economy: if it can't be driven by wages, debt or public spending then it can't exist. And if it can't exist in America, then Asia's model of high exports and high savings does not work, either. In previous eras the proposal to revert to a low-growth economy would have been regarded as simply barbarism and regression. Yet there is a strong sentiment among the anti-globalist and deep-green activists in favour of this solution, and it has found echoes in mass consciousness and micro-level consumer behaviour as the world has come to understand the dangers of global warming. Even a mainstream corporate economist, such as Morgan Stanley's Roach, has called for "a greater awareness of the consequences of striving for open-ended economic growth . . . This crisis is a strong signal that [high-growth] strategies are not sustainable."
The third alternative is the Minsky option. Hyman Minsky (1919-1996) was the godfather of modern financial crisis theory: his works, while largely ignored by politicians, are revered by both Marxists and hedge-fund managers. The "Minsky Moment" - a systemic financial crisis that crashes the real-world economy - was not only predicted in his work but theorised as a natural and intrinsic feature of capitalism. What we are going through now, Minsky argued, is the normal consequence of achieving growth and full employment through an unfettered private financial system.
But he had a solution - outlined in the chapters the hedge-fund managers skip and the Marxists dismiss: the socialisation of the banking system. This he conceived not as an anti-capitalist measure but as the only possible form of a high-consumption, stable capitalism in the future.
Minsky argued: "As socialisation of the towering heights is fully compatible with a large, growing and prosperous private sector, this high-consumption synthesis might well be conducive to greater freedom for entrepreneurial ability and daring than is our present structure."
Minsky never bothered to spell out the details of how it might be done. But there is no need to, now.
Stumbling through the underground passageways of 10 Downing Street on the morning of 8 October, I saw it done. Tetchy and bleary-eyed, fuelled by stale coffee and takeaway Indian food, British civil servants had designed and executed it in the space of 48 hours. Within days, much of the western world's banking system had been stabilised by massive injections of taxpayer credit and capital.
The problem is, though they have now been there, done that, the G20 politicians have no desire to get the T-shirt.
The G20 leaders will meet in the context of a global finance system on life support. Their impulse is to get it off the respirator as quickly as possible, to put things back to normal. But the ecosystem which sustained global finance in its previous form is also in crisis: easy credit and speculative finance were the oxygen, and they have gone.
The policy challenge, in short, is much more fundamental than is being recognised in the run-up to the G20 summit. Gordon Brown speaks of a "new global order" emerging out of Washington. But in reality he is talking about multilateral crisis-resolution mechanisms, not a rethink of the relationship between finance capital, growth and debt.
If the world's leaders seriously intend to "refound capitalism on the basis of ethics and work", there is plenty of source material to start brainstorming from.
I would throw this into the mix, from Franklin Roosevelt's Oval Office in January 1934: "Americans must forswear that conception of the acquisition of wealth which, through excessive profits, creates undue private power over private affairs and, to our misfortune, over public affairs as well. In building toward this end we do not destroy ambition . . . But we do assert that the ambition of the individual to obtain a proper security, a reasonable leisure, and a decent living throughout life is an ambition to be preferred to the appetite for great wealth and great power."
Paul Mason is economics editor of BBC Newsnight; his book "Meltdown: the End of the Age of Greed" is published by Verso in April 2009
7.11.08
American Election: Winners & Losers
WINNERS:
The Davids - Axelrod and Plouffe: they spearheaded a near flawless campaign.
Katie Couric: her multi-part interview with Sarah Palin was the turning point in how the country saw Palin -- and by extension John McCain. And she did it in a way that left no room for accusations of being unfair or playing "Gotcha!"
Colin Powell, Scott McClellan, Ken Adelman, Chris Buckley, Kenneth Duberstein, et al: crossing party lines to endorse the eventual winner can't hurt the rep.
Saturday Night Live: went from "Is that still on?" to Must See TV (or, at least, Must See on YouTubeTV)
Tina Fey: her take on Palin was pitch perfect; a comedy mugging for the ages. And with Palin's obvious weight loss during the campaign, she ended up looking more and more like her 30 Rock doppelganger.
Sarah Palin: lost an election but there has to be a reality show in her future.
Michelle Obama: smarts, grace, style, charm, and a serious "good mommy" vibe -- she's got the whole package.
The View: went from gal chat to political headline maker.
MSNBC: Keith, Rachel, Chris... they sent a collective tingle up the leg of progressive viewers everywhere.
The Internet: click here.
LOSERS:
Joe Lieberman: failed to deliver Democrats, independents, or Jews. And on the way to losing his committee chairmanships.
Sean Hannity, Rush Limbaugh, Dick Morris, and hate-mongers everywhere: the stink didn't stick to Obama but it stuck to them.
Bill Clinton: it's gonna take a lot of work to repair the rep.
John McCain: see Bill Clinton.
Liddy Dole: see Clinton and McCain. Her "Godless" ad will be taught in What Not To Do poli sci classes for a century.
George W. Bush: the repudiation of his presidency was overwhelming and across-the-board.
The Republican Party: the emptiness of its philosophic underpinnings was exposed for all to see.
Joe the Plumber: the clock just hit 15 minutes, and the wakeup call will not be pleasant. Joe the Plumber, meet Clara Peller ("Where's the beef?!").
Okay, these are my picks... what are yours? Please post your winners and losers in the comments section below.
Speaking of Irritating
Oxford compiles list of top ten irritating phrases
Heading the list was the expression 'at the end of the day', which was followed in second place by the phrase 'fairly unique'.
The tautological statement "I personally" made third place – an expression that BBC Radio 4 presenter John Humphrys has described as "the linguistic equivalent of having chips with rice." Also making the top 10 is the grammatically incorrect "shouldn't of", instead of "shouldn't have".
The phrases appear in a book called Damp Squid, named after the common error of saying "damp squid" instead of "damp squib", a type of firework.
The researchers who compiled the list monitor the use of phrases in a database called the Oxford University Corpus, which comprises books, papers, magazines, broadcasts, the internet and other sources.
The database alerts them to new words and phrases and can tell them which expressions are disappearing. It also shows how words are being misused.
As well as the above expressions, the book's author Jeremy Butterfield says that many annoyingly over-used expressions actually began as office lingo, such as 24/7 and "synergy".
Other phrases to irritate people are "literally" and "ironically", when they are used out of context.
Mr Butterfield said: "We grow tired of anything that is repeated too often – an anecdote, a joke, a mannerism – and the same seems to happen with some language."
The top ten most irritating phrases:
1 - At the end of the day
2 - Fairly unique
3 - I personally
4 - At this moment in time
5 - With all due respect
6 - Absolutely
7 - It's a nightmare
8 - Shouldn't of
9 - 24/7
10 - It's not rocket science
A New America
Few who love and admire America could fail to be moved by the victory of Barack Obama in the election on Tuesday. It was, to adapt a phrase of Tony Blair's used of another distinctive moment, an occasion when people could sense the hand of history upon this man's shoulder.
America's capacity to reinvent itself has rarely been more tellingly displayed than by the rise of the brilliant, black senator from Chicago, who rallied a vast army of supporters across the country, many young or poor, many from minority communities, many previously alienated, under the single banner of "change". His power as an orator exceeds even that of John F. Kennedy, the predecessor with whom he is most often compared; and he has used it to transform American politics. On his coat-tails both Houses of Congress have been secured for the Democrats. Those who thought they had established a near-permanent Republican hegemony, an awkward but effective alliance between socially conservative Evangelicals, neoconservative ideologues and free-marketeering big business, found their dreams discarded overnight. Suddenly such things no longer define American politics at home or America's presence abroad; the world will have to learn to reckon with a very different American role in international affairs.
The issue that dominated the start of the presidential campaign, Iraq, faded in significance towards the end, both because of the economic downturn and because the withdrawal of American troops from Iraq ceased to be so contentious following developments on the ground. And it was the economy that finally destroyed Senator John McCain's chances. This is by far the most pressing challenge facing the incoming president, and it is a weakness of the system that, when decisions have to be made urgently, he does not formally take power until late January. President Bush, already seen as a lame duck, must now take his lead from his successor, for instance at the forthcoming international summit on the financial crisis on 15 November. The restructuring of international finance, which means designing a successor to the Bretton Woods agreement of 1944, cannot wait. Nor can efforts to reverse the recession in the American economy, which has already started to destroy jobs and displace people from their homes. If Mr Obama's policies are the right medicine, the patient needs it now.
Africa's powerful new ally
America's new president is an African American in an unusually literal way - his father was a black African, from Kenya, and his mother was a white American. He was raised by his white American grandparents, and rose through law school to be head of the Harvard Law Review, a position among the nation's intellectual elite. Unlike that of his wife, his family history does not carry the humiliation and hurt of slavery; indeed the African American community was at first a little cautious about his qualifications to be counted one of them. But his election is already demonstrating the power to transform their perceptions of what being a black American means, to lift a historical burden from their shoulders that at times seemed to shape their destiny. And this change of perception also applies to how America will be seen abroad. With a black president there can be no hint of white supremacy. And it will affect how Africa is seen too. The president-elect has many relatives still living in Kenya, whose own president has announced a public holiday in honour of this week's victory. Those with Africa's interests at heart, like Gordon Brown, have a powerful new ally. But Mr Obama has yet to admit that the protectionism he has sometimes seemed to favour will in the long run harm developing economies such as those in Latin America and Africa. Maybe an election in the midst of an economic crisis is not a good time to be candid about such matters.
Doctrine of 'soft power'
In his victory speech Mr Obama spoke eloquently of America's true influence in the world being not because of its military might or wealth but through its ideals, "democracy, liberty, opportunity and unyielding hope". This is a doctrine of "soft power" which the world is ready to hear, having seen how the spread of American influence by armed force backfired so dangerously in Iraq. From that misadventure he now has to withdraw, but responsibly. From the more justified military campaign in Afghanistan he has to find ways to turn armed struggle into peaceful progress: the troops must come home from there eventually too. He has another chance to revisit America's most depressing international failure, its inconsequential interventions in the interminable Middle East conflict. This, too, cannot wait. He takes seriously the threat to the planet from global warming, and his administration will owe nothing to Big Oil and similar vested interests. Clean alternative energy will now be America's way forward, not the notorious "drill, baby, drill" of the defeated vice presidential candidate, Sarah Palin.
In many respects Mr Obama's policies resonate with the social justice that the Judaeo-Christian tradition promotes, such as the relief of poverty, health care for all, new jobs to replace those lost, affordable housing, care for the environment and so on. He is a Christian, although not of the fundamentalist kind, and he has Catholic connections in his background. But it appears that some leaders of the Catholic Church, America's largest denomination, failed once more to read the signs of the times, and tried to insist that this inspiring and epoch-transforming election, this turning point in American history, was once again just about abortion. The laity saw things differently; indeed this time the Catholic vote was almost indistinguishable from the population as a whole.
Policies of social justice
A rethink of the American bishops' strategy on abortion is urgently necessary; at the moment their message is not being heard by the majority of Americans. The letter from Cardinal Francis George of Chicago, written on behalf of the US bishops to President-elect Obama, congratulating him on his victory and urging him to defend the vulnerable and the life and dignity of every person, signals a more pragmatic approach and a willingness to engage in dialogue.
Americans perceive their society as unique in the world; certainly it is uniquely successful. One arrogant and bullying model of that exceptionalism has died and been buried this week - the imperial version represented by President Bush and those who surround him - while another has been reborn. It is all the more inspiring for having risen from the ranks of ordinary people, the "we the people" of whom the constitution speaks, led by a man whose skin colour still marked him as an outsider. The powerless have taken power, snatching it cleanly from the entrenched interests that clutched it close. It is a version of the American dream that has hardly been seen before, where the mighty are cast down from their thrones and the humble are exalted. And that is not the end of it, just the beginning.
John Leonard
John Leonard, journalist, novelist, and critic, died of lung cancer Wednesday at the age of 69. In 2003, Meghan O'Rourke recalled Leonard's editorship of the New York Times Book Review in the '70s, a time she called the publication's "golden age." The article is reprinted below.
For 107 years, the New York Times Book Review has been the Goliath of American book reviews. It has also been the section that everyone loves to hate: Decade after decade, the epithets pile up, from "terminally dull" to "the drab wallpaper of the book world." One gets the sense that readers find its very judiciousness annoying, like finding yourself seated next to a chaste, fair-minded guest at a raucous, gossipy dinner party. This may be an institutional problem, but now that the Book Review's current editor, Charles McGrath, who has held the position since 1995, is stepping down, the Times will inevitably wrestle once again with its image.
In looking forward, the Times might want to look back—to what was widely agreed to be the Book Review's golden age, from 1971 to 1975, under the editorship of John Leonard. Nostalgia is obviously a perilous emotion, but in this case, the golden years prove to be more than just the gilt of yesteryear. They provide a useful model for what tomorrow's Book Review could look like—should it choose to.
What was so special about Leonard's Book Review? From the very start—his first issue was January 10, 1971—it stood out for its editorial brazenness and its engagement with current affairs. The reviews of Horace translations and the histories of Modernist little magazines slimmed down or shuffled to the back; in their place came a riotous thicket of pieces on film, the black arts movement, the Vietnam War, E. M. Cioran, B. F. Skinner, Michel Foucault. (Remember, it was 1971.) Women began to review political books. Feminist novelists were evaluated thoughtfully but not forgivingly. In 1972, Don DeLillo's second novel, End Zone, was given the lead review—which in those days meant it began on the cover. DeLillo was a relative unknown. When I spoke to Leonard by phone last week, he told me he'd made the unusual decision to put him on the cover because he liked the review enough to read the novel—and when he did he saw something new in it.
Mostly, though, Leonard's Book Review was distinctive because its pieces took a clear position—not only on the book at hand, but on the subject at hand. You get the sense that someone sat down and said, OK, what's a provocative way to talk about this book—why are we interested in reviewing it in the first place? (And if we don't have an answer, let's not review it.) The result was opinionated writing by journalists and specialists alike, often polemical but rarely prescriptive (as one might have expected): pointed re-examinations of everything from Vladimir Mayakovsky's "hooligan communism" to Noam Chomsky's antiwar stance (more an "outcry against the absurdity … than a sustained argument against it") to the critical reception of Albert Speer's memoir. On the fiction front, the equivocation that has become the hallmark of today's reviews nationwide was nowhere to be found. Instead, heavy-hitters like Eudora Welty, John Hawkes, and Toni Morrison weighed in firmly, dispensing with disposable literary novels—a "drafty little fable"—and fiercely taking on top-tier writers like James Jones: He "writes abominably, like Dreiser." A sub-par Joyce Carol Oates novel was described (by a reviewer who liked her work) as "bad, very bad." This wasn't the dismissive showboating that's recently won the name "snark"; it was the conviction of engaged people holding one another to high standards.
Finally, unlike many things of its era, Leonard's Book Review remains visually stylish—jazzy and unorthodox but not too arty. Leonard exploited illustrations for their documentary value and power: contact sheets of Charles Manson; a stunningly bored young Carson McCullers at a party; a nearly full-page black-and-white photo of Janis Joplin performing at Madison Square Garden, eyes closed, fists clenched, her dark hair striped by stage lights. This image, from Leonard's second issue, sent a visceral message about the modernity of the new editorial sensibility. It also bolstered a point that Jonathan Yardley was making about the perils of rock journalism.
In short, Leonard's achievement lay in recognizing that the majority of books published any given year are most interesting as an expression of their culture—not as things to be assessed in and of themselves. And so he selected books accordingly, turning the reviews into a probing dialogue about that culture, finding reviewers who were eager to plunge into controversy and urging them on. Senior statesmen like Alfred Kazin and John Kenneth Galbraith shared space with brash younger writers (now our senior statesmen) eager to take one another on—Jonathan Yardley, Toni Morrison, Morris Dickstein, John Ashbery, A. Alvarez, Nancy Milford, Hilton Kramer. One of Leonard's early, risky moves was to make the Review the voice of the antiwar movement. Characteristically, he began, in March of 1971, by publishing a long, splashy essay on whether civilian deaths in Vietnam should be tried as war crimes; it touched on tens of books, including Seymour Hersh's My Lai 4.
In a sense, Leonard had it easy when he set out to make a talked-about section. He presided during a moment that had a distinct sense of itself as a historical anomaly. The Civil Rights movement, Vietnam, the push for a younger voting age—all stirred up debate. It was an era that lent itself to the creation of an opinionated, youthful, intelligent literary journal. And he had about 80 pages to work with, where today's has about 30. (It helped, too, that in the midst of this cultural turmoil the Times wasn't sure what editorial direction it wanted to take with the Review, according to Leonard.) Of course, people did complain—mostly that the Review was too much like the Voice, he told me—and in recent years, as he recalls, one colleague said that Leonard had not been a "Timesman." Ironically, this failure may have contributed to his achievement: He saw his role as an occasion to put a critical sensibility to work, rather than to keep an institution intact. And when people mention Leonard's Review today, it's with the kind of wistfulness with which Kane said, "Rosebud."
What lesson might the Book Review's next editor draw from the Leonard years? For one thing, we're clearly at another crucial historical moment—one at which smart people disagree about big issues (the aftermath of Sept. 11 and the war in Iraq being only the most obvious examples). American culture is on the defensive, its distinctiveness under renewed scrutiny. Meanwhile, with the rise of the superstore, the advent of the Internet, and the ever-increasing number of books published each year, the Review, though influential, no longer makes or breaks a book as it did even 10 years ago. The editor in chief of one publishing house describes the Review instead as a "piece of the puzzle"—which now includes TV publicity and decisions made by Barnes & Noble's buyers. You could lament such a diminishment of cultural authority, except this one provides an opportunity for the Review: It may have more freedom than ever before to re-imagine its role. Leonard's provocative tack simply may not be what the Times wants, or has ever wanted, but at this particular moment Rosebud looks within reach.
Meghan O'Rourke is Slate's culture critic and the author of Halflife, a collection of poetry.
6.11.08
Obama
Can Obama heal America?
Amid the excitement and frenzy of election night, one man stayed calm and collected – the next president. Rupert Cornwell reports on a historic night and examines the huge challenges ahead.
The most striking thing about Barack Obama is not his youth, his oratory, or even the colour of his skin. It's simply that he knows what he's about. The vast crowd spread out before him in Chicago late on Tuesday night was transported with joy. America and most of the rest of the world hailed the moment as if it were the Second Coming.
But there in the midst of the frenzy, at this moment of supreme accomplishment, stood Mr Obama – cool, collected and already focused not on the historic victory he had just won in defeating the Republican John McCain and becoming America's first black president, but on the monumental problems he will confront, and that will not await his inauguration on 20 January 2009.
This outcome had been predicted (this time, mercifully, the polls were pretty much spot-on). But when the epochal event finally came to pass, it was still hard yesterday for most people to come to grips with it. The implications for the foreign and domestic policy of the US, for how America sees itself and how the world sees America, are too vast. The one person who appeared to grasp exactly what had happened, and what it might mean, was... Mr Obama himself.
In two-and-a-half months' time, he will take the helm of a country embroiled in two draining wars, with its name tarnished around the world, with a slumping economy and bleeding financial system, facing a federal deficit of a mind-boggling $1 trillion. No president since Franklin Roosevelt, three-quarters of a century ago, has faced such challenges.
But as the country's very foundations have trembled, the one point of calm has been a man of 47, his name all but unknown barely four years ago, and by conventional political yardsticks with next to no experience to speak of. Yet as he spoke yesterday to America and the world, Mr Obama projected an almost preternatural sense of destiny, as if he had been preparing for the moment all his life. "A new dawn of American leadership is at hand," he said, and a world long disenchanted with American leadership ached to believe him.
Mr Obama's mission is now to transform his country. But even before he starts, he has transformed its political geography. The continental divide evident in the two most recent elections is no more. The cleavage between the coasts and the upper Midwest, coloured Democratic blue, and the heartlands, the West and the South, which remained a uniform Republican red, has been blurred to the point of invisibility.
On Tuesday night, Mr Obama captured the Republican strongholds of Ohio, Florida and Virginia. Yesterday, even more remarkably, he was declared the winner in Indiana, a state that previously had voted Republican in every election since 1964. He is the first Democrat to win more than 50 per cent of the national vote since Jimmy Carter in 1976. He has secured the most convincing electoral college victory in a two-candidate contest since the elder Bush's rout of Michael Dukakis in 1988.
Not since Ronald Reagan has a president entered the White House in a stronger position. Mr Obama prevailed in every major demographic group except the over-65s. He may draw on a colossal reservoir of goodwill, even among many of his opponents. President George Bush, not noted for his generosity to Democrats, yesterday hailed the extraordinary nature of what had happened. "All America", he said, "can be proud of the history that was made".
Condoleezza Rice, the thoroughly Republican Secretary of State (but of course an African American), seemed almost moved to tears. The election was "an extraordinary step forward," she said.
The Republicans have lost at least 16, and perhaps as many as 25, seats in the House of Representatives. In the Senate they have lost at least six seats, though they will prevent the Democrats from reaching the 60 required to break a filibuster, and have thus retained the power to block legislation. In reality, however, Republicans are leaderless and ideologically bankrupt. Mr Obama may well be given more trouble by his friends than his opponents – by a Democratic leadership on Capitol Hill giddy with victory and eager to tip the country further to the left than it wants to go.
America cannot wait to see the back of Messrs Bush and Cheney. It is fed up with negative campaigning and with the politics of slash and burn. It is tired of the endless, unproductive fights on Capitol Hill. That is why it has elected a first-term senator who is still relatively uncontaminated by Washington.
Every victorious presidential candidate says he wants to govern across the political divide: even the super-polarising George W Bush did so in 2000 and 2004. But if anyone means it, Mr Obama does. This election may have swept an African American into the White House, and given Democrats simultaneous control of the White House and Congress for the first time since 1992. But as the President-elect knows full well, the US remains a centre-right country. Back in 2004, Mr Obama first made his name with an electrifying speech in which he proclaimed there was not a "blue America" or a "red America", but a single United States of America. On the printed page, those words read like a cliché. But like most clichés they are true – or at least Americans fervently pray they are true.
During the campaign, the Republicans constantly pointed out that he had among the most, if not the most, liberal voting record in the Senate. "The truth is, he's a socialist," one normally judicious Republican senator intoned. But Mr Obama in office will not behave like one.
Though his mandate for change is sweeping, he is likely to govern from the centre. Yes, the US is entering a period of more activist, interventionist government. The economic crisis, the urgent need to improve the nation's education, health care, and infrastructure, demand no less. Indeed, one of President Obama's first acts will be the signing of a massive new economic stimulus programme (that is, if a lame duck President Bush has not done so already, in the waning weeks of this administration). But it is also a truism of American politics that really important legislation can only pass with bipartisan support, and if ever really important legislation is needed, it is now.
Yesterday Mr Obama was in Chicago, working out at the gym, but also working with his closest advisers on the transition and the make-up of his cabinet. Never has an incoming administration more urgently needed to hit the ground running.
To signal his intentions, Mr Obama will probably soon name a Republican to one of the two top national security jobs, either Secretary of State or Secretary of Defence. He may also choose a non-party figure to be Treasury Secretary, right now the most important cabinet post of all. Democratic sources revealed last night that Rahm Emanuel, a former aide to Bill Clinton, had been appointed his White House chief of staff.
From day one, they will have their work cut out. Abroad, the Obama administration must find a way out of Iraq, and prevent the war in Afghanistan, and the situation in neighbouring Pakistan, from slipping out of control.
It will be dealing with a resurgent, prickly Russia – which chose yesterday of all days to announce it was moving missiles to its Baltic region to counter the "threat" posed by the US missile defence installations in Europe. It must restart work on the stalled Doha round of global trade talks.
At home, the new president not only has to tackle the immediate economic crisis. He must deliver on lavish election promises of middle-class tax cuts and the expansion of health care coverage, even though these can only push the federal deficit still higher. He must build a credible energy policy. He must place the US at the forefront of the battle against global warming.
On the campaign trail, the disconnect between the candidates' economic rhetoric and the grim daily economic reality experienced by voters was often almost surreal. Now Mr Obama has somehow to bring his country down to earth, yet without destroying the enchantment that lifted him to power.
In Grant Park, the President-elect made a start. The changes would come: "We as a people will get there." But the journey would be long, and the problems so great that they might not be solved in a year or two years, or even by the end of his first term. Again and again, he wove into his speech his campaign refrain of "Yes, we can".
But he uttered the phrase less as a triumphant statement of fact than as a quiet aspiration. Henceforth Mr Obama and the Democrats will no longer have the most unpopular president in modern US history to kick around. If they mess up, and on occasion they will, the fault will be theirs alone.
But if the overwhelming weight of expectation is Mr Obama's greatest problem, he by every indication knows how to handle it. Just as during the tough stretches of the campaign, everyone else might lose their heads – but not him.
One way or another, the man about to become the 44th president knows what he's about.
How I see it: Reactions to the Obama victory
"I wish God speed to the man who was my former opponent and will be my president. I call on all Americans... to not despair of our present difficulties but to believe in the promise and greatness of America."
John McCain
"I am especially proud because this is a country that's been through a long journey inovercoming wounds, and making race not the factor in our lives. That work is not done, but yesterday was obviously an extra-ordinary step forward."
Condoleezza Rice
"Your victory has demonstrated that no personanywhere in the world should not dare to dream of wanting to change the world for a better place."
Nelson Mandela
"I congratulate President-elect Obama on his historic victory. Now it's time to begin unifying the country so we can take on the extraordinary challenges that this generation faces."
George Clooney
"It feels like hope won. It feels like it's not just victory for Barack Obama. It feels like America did the right thing. It feels like anything is now possible."
Oprah Winfrey
Harold Ross
It's the birthday of the man who founded The New Yorker magazine, Harold Ross, born in Aspen, Colorado (1892). His father worked in the mining business. He ran away from home when he was 16, and he worked at various newspapers from New Orleans to California. He was known for his love of the nightlife in San Francisco.
In the 1920s, Ross worked in the New York City publishing industry and became friends with F. Scott Fitzgerald, Edna St. Vincent Millay, Irving Berlin, and George Gershwin. Ross raised money from a friend whose father had made a fortune in yeast, and on February 21, 1925, the first issue of The New Yorker hit the stands.
Ross was obsessed with the details of the magazine. He believed in accuracy above all else, and he used fact checkers for everything, including fiction and cartoons. He never let a cartoonist draw a lamp without showing the cord plugged into a socket.
Obama
The Next President
Laurence H. Tribe 11.05.08, 12:30 PM ET
I am watching the sun rise over Lake Michigan in the land of Lincoln on this new day in America. This is the morning after a great divide in the biography of the United States. As a nation, we have come of age.
I flew to Chicago on Tuesday afternoon to witness history as the United States of America went to the polls on Election Day, 2008. Hours later, as President-Elect Barack Obama spoke in Grant Park to claim his victory before a great throng of supporters and an eagerly listening world--almost exactly 40 years after the chaos of 1968--I felt myself in the flow of time, a minor participant in a great saga punctuated by events that shaped my life, as it shaped the lives of so many others.
The year 1968 was, for me and most of my friends, a year of tragedy and disillusion. Through the years that followed, years punctuated by Watergate and Vietnam and by decades of political polarization and paralysis, politics was the game that disappointed. Yesterday it was the game that delivered. The work of governing lies ahead, but the sun is rising and the challenges we face--in reconstructing a broken economy, restoring a threatened constitution, ending a misguided war and waging a necessary one, starting to heal a wounded planet--look from here like opportunities to be seized, not obstacles to be feared.
How different this feels from the crazy election of 2000, brought to an abrupt and puzzling end by the Supreme Court's ill-starred decision to stop counting the ballots, when another new president was installed to preside over a nearly dysfunctional country. Having served as counsel before the court to the losing candidate during that sad chapter in our democratic trajectory, I returned to ordinary life but wondered when, if ever, I could fully believe in the process again.
As the decade progressed, the most impressive student I had ever taught was quietly pursuing his own political trajectory. In 1989, I had met Barack Obama and hired him as my research assistant while he was still just a first-year Harvard law student. His stunning combination of analytical brilliance and personal charisma, openness and maturity, vision and pragmatism, was unmistakable from my very first encounter with the future president.
I thought about that encounter as he and his wife Michelle each gave me a hug in one of the off-stage tents in Grant Park last night. I recalled it as I found myself unable to express in words my sense of gratitude and of possibility. The president-elect and the first lady-designate both thanked me for the part I had played in Barack Obama's education and his rise to power, but it was I, of course, who owed thanks to them, thanks for the journey on which they had embarked to reclaim America for all who dare to hope.
There will be countless efforts to dissect their improbable path from that cold winter morning in Springfield, Ill., nearly two years ago--when a still-new senator from Illinois announced his candidacy for the highest office in the land--to the unseasonably warm evening in Chicago when that quest reached its climax and when those who had led it confronted the daunting challenges of actually governing. This is not another attempt at such dissection. Nor is this another post-mortem on the failed efforts of president-elect Obama's more than formidable foes. It is simply a personal note to commemorate a milestone in a great nation's history.
As an immigrant to the United States, born in Shanghai to Russian Jewish parents who brought me with them when they settled in California in 1947, I have always felt great pride--both in that ancestry and in the gift of citizenship conferred on me by the nation that went on to provide me with such extraordinary opportunities--to thrive and to give something back for all that I have been given. My pride in that citizenship has never been greater than it is today. Truth to tell, I find myself unable to stop smiling, just as last night I found it difficult to stop crying.
Barack Obama's unique ability to explain and to motivate, coupled with his signature ability to listen and to learn, and linked with the calm that marked his nearly flawless campaign, will serve him--and all of us--well as we grapple with as daunting a set of problems as the nation has faced in three-quarters of a century. It is of course true that only time will tell just how successful this brave, brilliant and caring man will be in charting a new course for the country, something that will depend only partly on decisions that Obama will make as president.
But one thing is already certain: The very fact of Barack Obama's election at this defining moment--quite apart from the programs he pursues and the ways in which he pursues them--already speaks volumes to everyone on the planet. His election in and of itself displays how dramatically America has moved to transcend the divisions of its past and bids fair to give us a new lease on life in a world that had come, and not without reason, to see us in an awful light--a world that will now give this nation a fresh look and a second chance.
The sun is now high over Lake Michigan. It is a new day in America. We can do this. Yes, we can.
Laurence H. Tribe is the Carl M. Loeb university professor and professor of constitutional law at Harvard.
Michael Crichton
Michael Crichton, who delighted lovers of his fiction and enraged environmentalists, is dead at the age of 66...
4.11.08
Hic Ririte
Obama and McCain Walk Into a Bar ...
By JOHN TIERNEY
While Americans choose their next president, let us consider a question more amenable to science: Which candidate’s supporters have a better sense of humor? In strict accordance with experimental protocol, we begin by asking you to rate, on a scale of 1 (not funny at all) to 9 (hilarious) the following three attempts at humor:
A) Jake is about to chip onto the green at his local golf course when a long funeral procession passes by. He stops in midswing, doffs his cap, closes his eyes and bows in prayer. His playing companion is deeply impressed. “That’s the most thoughtful and touching thing I’ve ever seen,” he says. Jake replies, “Yeah, well, we were married 35 years.”
B) I think there should be something in science called the “reindeer effect.” I don’t know what it would be, but I think it’d be good to hear someone say, “Gentlemen, what we have here is a terrifying example of the reindeer effect.”
C) If you saw two guys named Hambone and Flippy, which one would you think liked dolphins the most? I’d say Flippy, wouldn’t you? You’d be wrong, though. It’s Hambone.
Those were some of the jokes rated by nearly 300 people in Boston in a recent study. (You can rate some of the others at TierneyLab, nytimes.com/tierneylab.) The researchers picked out a variety of jokes — good, bad, conventional, absurdist — to look for differences in reactions between self-described liberals and conservatives.
They expected conservatives to like traditional jokes, like the one about the golfing widower, that reinforce racial and gender stereotypes. And because liberals had previously been reported to be more flexible and open to new ideas, the researchers expected them to get a bigger laugh out of unconventional humor, like Jack Handey’s “Deep Thoughts” about the reindeer effect and Hambone.
Indeed, the conservatives did rate the traditional golf and marriage jokes as significantly funnier than the liberals did. But they also gave higher ratings to the absurdist “Deep Thoughts.” In fact, they enjoyed all kinds of humor more.
“I was surprised,” said Dan Ariely, a psychologist at Duke University, who collaborated on the study with Elisabeth Malin, a student at Mount Holyoke College. “Conservatives are supposed to be more rigid and less sophisticated, but they liked even the more complex humor.”
Do conservatives have more fun? Should liberals start describing themselves as humor-challenged? To investigate these questions, we need to delve into the science of humor (not a funny enterprise), starting with two basic kinds of humor identified in the 1980s by Willibald Ruch, a psychologist who now teaches at the University of Zurich.
The first category is incongruity-resolution humor, or INC-RES in humor jargon. It covers traditional jokes and cartoons in which the incongruity of the punch line (the husband who misses his wife’s funeral) can be resolved by other information (he’s playing golf). You can clearly get the joke, and it often reinforces stereotypes (the golf-obsessed husband).
Dr. Ruch and other researchers reported that this humor, with its orderly structure and reinforcement of stereotypes, appealed most to conservatives who shunned ambiguity and complicated new ideas, and who were more repressed and conformist than liberals.
The second category, nonsense humor, covers many “Far Side” cartoons, Monty Python sketches and “Deep Thoughts.” The punch line’s incongruity isn’t neatly resolved — you’re left to enjoy the ambiguity and absurdity of the reindeer effect or Hambone’s affection for dolphins. This humor was reported to appeal to liberals because of their “openness to ideas” and their tendency to “seek new experiences.”
But then why didn’t the liberals in the Boston experiment like the nonsense humor of “Deep Thoughts” as much as the conservatives did? One possible explanation is that conservatives’ rigidity mattered less than another aspect of their personality. Rod Martin, the author of “The Psychology of Humor,” said the results of the Boston study might reflect another trait that has been shown to correlate with a taste for jokes: cheerfulness.
“Conservatives tend to be happier than liberals in general,” said Dr. Martin, a psychologist at the University of Western Ontario. “A conservative outlook rationalizes social inequality, accepting the world as it is, and making it less of a threat to one’s well-being, whereas a liberal outlook leads to dissatisfaction with the world as it is, and a sense that things need to change before one can be really happy.”
Another possible explanation is that conservatives, or at least the ones in Boston, really aren’t the stiffs they’re made out to be by social scientists. When these scientists analyze conservatives, they can sound like Victorians describing headhunters in Borneo. They try to be objective, but it’s an alien culture.
The studies hailing liberals’ nonconformity and “openness to ideas” have been done by social scientists working in a culture that’s remarkably homogenous politically. Democrats outnumber Republicans by at least seven to one on social science and humanities faculties, according to studies by Daniel Klein, an economist at George Mason University. If you’re a professor who truly “seeks new experiences,” try going into a faculty club today and passing out McCain-Palin buttons.
Could it be that the image of conservatives as humorless, dogmatic neurotics is based more on political bias than sound social science? Philip Tetlock, a psychologist at the University of California, Berkeley, who reviews the evidence of cognitive differences in his 2005 book, “Expert Political Judgment,” said that while there were valid differences, “liberals and conservatives are roughly equally closed-minded in dealing with dissonant real-world evidence.”
So perhaps conservatives don’t have a monopoly on humorless dogmatism. Maybe the stereotype of the dour, rigid conservative has more to do with social scientists’ groupthink and wariness of outsiders — which, come to think of it, resembles the herding behavior of certain hoofed animals. Ladies and gentlemen, what we have here is a terrifying example of the reindeer effect.
Who Are We?
You might be surprised by what's in charge of your body
The Caterpillar and Alice looked at each other for some time in silence: at last the Caterpillar took the hookah out of its mouth, and addressed her in a languid, sleepy voice.
'Who are YOU?' said the Caterpillar.
This was not an encouraging opening for a conversation. Alice replied, rather shyly, 'I — I hardly know, sir, just at present.'
— Lewis Carroll
The answer might seem simple, even trivial: name, rank, serial number, family connections, occupation, and so forth. But there is more to the caterpillar's contemptuous question than meets the eye, for biologists no less than for a philosophically inclined insect.
Who are we?
It turns out that the business of being a "self" is more fraught than cultural tradition and subjective experience tell us. John Donne be damned: Each person, most of us assume, is indeed an island, entire of him or herself, everyone his own man or woman — separate, distinct, independent, and in charge. An army of one, a skin-encapsulated ego whose identity is beyond dispute. But wait a moment. Where, exactly, does that "us" reside? Whereas "self psychology" is widely attributed to the work of the psychoanalyst Heinz Kohut and often appears murky enough, "self biology" is even more so.
The deepest, most private recesses of our genome might seem, if not sacrosanct, at least the inner sanctum of our "selves." But consider this: More than four times as much space in the human genome is occupied by "endogenous retroviruses" than by all the genes that code for the enzymes and other proteins used to create and run our bodies. These viruses long ago incorporated themselves into our ancestors, apparently by using reverse-transcriptase enzymes of the sort employed by such malefactors as HIV to pry their way these days into the DNA of the immune cells they exploit. The silent viral debris we all carry is "foreign" in nearly every way, yet it must be seen as a large part of "us." (Some of these fossil bits of hitchhiking nucleic acid have even been resurrected in the laboratory, whereupon they can infect mice and act like the perfectly independent viruses they once were.)
Add to that the saga of mitochondria, fabled "powerhouses" of our cells, the sites of metabolic energy production. It is now generally agreed that these fundamental subcellular organelles are actually derived from primitive bacteria that, long ago, became incorporated into the genome of nucleated organisms. From our DNA to our ATP, the biological alphabet soup of selfhood is not so easily identified.
Think of the morgue scene in the movie Men in Black, when what appears to be a human corpse is dissected and revealed to be a highly realistic robot, its skull inhabited by a little green man from outer space. Or for a less fanciful example, turn to the disconcerting fact that there are many more parasitic creatures alive today than free-living counterparts. Endogenous retroviruses and mitochondria are either benign or downright beneficent, but pretty much every multicellular animal is also home to numerous additional fellow travelers, and — this is especially noteworthy — each of these creatures may have its own agenda. Considering just one group of worms, the biologist N.A. Cobb suggested, "If all the matter in the universe except the nematodes were swept away, our world would still be dimly recognizable. ... Trees would still stand in ghostly rows representing our streets and highways. The location of the various plants and animals would still be decipherable, and, had we sufficient knowledge, in many cases even their species could be determined by an examination of their erstwhile nematode parasites." Put another way, if a biologist were to answer Frank Zappa's question, "Suzy Creamcheese, honey, what's got into you?" the response might well be: "A whole lot of other living things."
What difference does this make? For many of us supposedly free-living creatures, quite a lot. Providing room and board to other life-forms not only compromises one's nutritional status (not to mention peace of mind), it often reduces freedom of action, too. The technical phrase is "host manipulation." For example, the tapeworm Echinococcus multilocularis causes its mouse host to become obese and sluggish, making it easy pickings for predators, notably foxes, which — not coincidentally — constitute the next phase in the tapeworm's life cycle. Those the gods intend to bring low, according to the Greeks, they first make proud; those tapeworms intending to migrate from mouse to fox do so by first making their mouse fat, sluggish, and thus, fox food.
Sometimes the process is more bizarre. For example, the life cycle of a trematode worm, known as Dicrocoelium dentriticum, involves doing time inside an ant, followed by a sheep. Getting from its insect host to its mammalian one is a bit of a stretch, but the resourceful worm has found a way: Ensconced within an ant, some of the worms migrate to its formicine brain, whereupon they manage to rewire their host's neurons and hijack its behavior. The manipulated ant, acting with zombielike fidelity to Dicrocoelium's demands, climbs to the top of a blade of grass and clamps down with its jaws, whereupon it waits patiently and conspicuously until it is consumed by a grazing sheep. Thus transported to its desired happy breeding ground deep inside sheep bowels, the worm turns, or rather, releases its eggs, which depart with a healthy helping of sheep poop, only to be consumed once more, by ants. It's a distressingly familiar story ... distressing, at least, to those committed to bodily "autonomy."
A final example, as unappetizing as it is important: Plague, the notorious Black Death, is caused by bacteria carried by fleas, which, in turn, live mostly on rats. Rat fleas sup cheerfully on rat blood, but will happily nibble people, too, and when they are infected with the plague bacillus, they spread the illness from rat to human. The important point for our purposes is that once they are infected with plague, disease-ridden fleas are especially enthusiastic diners because the plague bacillus multiplies within flea stomachs, diabolically rendering the tiny insects incapable of satisfying their growing hunger. Not only are these fleas especially voracious in their frustration; because bacteria are cramming its own belly, an infected flea vomits blood back into the wound it has just made, introducing plague bacilli into yet another victim. A desperately hungry, frustrated, plague-promoting flea, if asked, might well claim that "the devil made me do it," but in fact, it was the handiwork of Pasteurella pestis. ("So, naturalists observe, a flea/ Has smaller fleas that on him prey,/ And these have smaller still to bite 'em,/ And so proceed ad infinitum," observed Jonathan Swift.)
Not that a plague bacterium — any more than a mouse-dwelling tapeworm or ant-hijacking brainworm — knows what it is doing when it reorders the inclinations of its host. Rather, a long evolutionary history has arranged things so that the manipulators have inherited the earth.
Once again, however, things aren't quite so simple. The ways of natural selection are devious and deep, embracing not only would-be manipulators but also their intended victims. Hosts needn't meekly follow just because others seek to lead. In Shakespeare's Henry IV, Part I, Owen Glendower boasts, "I can call spirits from the vasty deep," to which Hotspur responds, "Why, so can I, or so can any man; But will they come when you do call for them?" Sometimes it is unclear whether our spirits, no less than those of Glendower, might seek to enlist others, act at another's behest, or suit themselves. Take coughing, or sneezing, or even — since we have already broached some indelicate matters — diarrhea.
When people get sick, they often cough and sneeze. Indeed, aside from making you feel crummy or possibly run a fever, coughing and sneezing are important ways we identify illness in the first place. It may be beneficial for an infected person to cough up and sneeze out some of her tiny organismic invaders, although to be sure, it isn't so beneficial for others nearby. That, in turn, leads to an interesting possibility: What if coughing and sneezing aren't merely symptoms of disease, but also — even primarily — a manipulation of us, the host, by, say, the influenza virus? Shades of fattened mice and grass-blade-besotted ants. As to diarrhea, consider that it is a major (and potentially deadly) consequence of cholera. To be sure, as with a flu victim's sneezing and coughing, perhaps it helps a cholera sufferer to expel the cholera-causing critter, Vibrio cholerae. But it also benefits Vibrio cholerae. Just as Lenin urged us to ask "Who, whom?" with regard to socioeconomic interactions — who benefits at the expense of whom? — an evolutionary perspective urges upon us the wisdom of asking a similar question. Who benefits when a cholera sufferer dies from unremitting diarrhea? Answer: the cholera bacillus.
That dramatic symptom of cholera is caused by a toxin produced by the bacillus, making the host's intestines permeable to water, which generates a colonic flood that washes out much of the native bacterial intestinal flora, leaving a comparatively competitor-free environment in which Vibrio cholerae can flourish. A big part of that flourishing, moreover, occurs when more than 100 billion V. cholerae per liter of effluent sluices out of the victim's body, whereupon, if conditions are less than hygienic, they can infect new victims. Diarrhea, then, isn't just a symptom of cholera, it is a successful manipulation of Homo sapiens, by the bacteria and for the bacteria.
Troublesome as such tales of alien invasion and pathologic manipulation may be, it is easy to shrug them off when it comes to the daily, undiseased lives most of us experience. Things feel pretty straightforward when we are "doing our own thing," if only because then — we like to insist — we are acting on our volition and not for the benefit of some parasitic or pathogenic occupying army. When we fall in love, we do so for ourselves, not at the behest of a romance-addled tapeworm. When we help a friend, we aren't being manipulated by an altruistic bacterium or a long-pacified retrovirus. If we eat when hungry, sleep when tired, scratch an itch, or write a poem, we aren't knuckling under to the needs of our nematodes or anything else. Or are we?
You might be surprised by what's in charge of your body
The Caterpillar and Alice looked at each other for some time in silence: at last the Caterpillar took the hookah out of its mouth, and addressed her in a languid, sleepy voice.
'Who are YOU?' said the Caterpillar.
This was not an encouraging opening for a conversation. Alice replied, rather shyly, 'I — I hardly know, sir, just at present.'
— Lewis Carroll
The answer might seem simple, even trivial: name, rank, serial number, family connections, occupation, and so forth. But there is more to the caterpillar's contemptuous question than meets the eye, for biologists no less than for a philosophically inclined insect.
Who are we?
It turns out that the business of being a "self" is more fraught than cultural tradition and subjective experience tell us. John Donne be damned: Each person, most of us assume, is indeed an island, entire of him or herself, everyone his own man or woman — separate, distinct, independent, and in charge. An army of one, a skin-encapsulated ego whose identity is beyond dispute. But wait a moment. Where, exactly, does that "us" reside? Whereas "self psychology" is widely attributed to the work of the psychoanalyst Heinz Kohut and often appears murky enough, "self biology" is even more so.
The deepest, most private recesses of our genome might seem, if not sacrosanct, at least the inner sanctum of our "selves." But consider this: More than four times as much space in the human genome is occupied by "endogenous retroviruses" as by all the genes that code for the enzymes and other proteins used to create and run our bodies. These viruses long ago incorporated themselves into our ancestors, apparently by using reverse-transcriptase enzymes of the sort employed by such malefactors as HIV to pry their way these days into the DNA of the immune cells they exploit. The silent viral debris we all carry is "foreign" in nearly every way, yet it must be seen as a large part of "us." (Some of these fossil bits of hitchhiking nucleic acid have even been resurrected in the laboratory, whereupon they can infect mice and act like the perfectly independent viruses they once were.)
Add to that the saga of mitochondria, fabled "powerhouses" of our cells, the sites of metabolic energy production. It is now generally agreed that these fundamental subcellular organelles are actually derived from primitive bacteria that, long ago, became incorporated into the genome of nucleated organisms. From our DNA to our ATP, the biological alphabet soup of selfhood is not so easily identified.
Think of the morgue scene in the movie Men in Black, when what appears to be a human corpse is dissected and revealed to be a highly realistic robot, its skull inhabited by a little green man from outer space. Or for a less fanciful example, turn to the disconcerting fact that there are many more parasitic creatures alive today than free-living counterparts. Endogenous retroviruses and mitochondria are either benign or downright beneficent, but pretty much every multicellular animal is also home to numerous additional fellow travelers, and — this is especially noteworthy — each of these creatures may have its own agenda. Considering just one group of worms, the biologist N.A. Cobb suggested, "If all the matter in the universe except the nematodes were swept away, our world would still be dimly recognizable. ... Trees would still stand in ghostly rows representing our streets and highways. The location of the various plants and animals would still be decipherable, and, had we sufficient knowledge, in many cases even their species could be determined by an examination of their erstwhile nematode parasites." Put another way, if a biologist were to answer Frank Zappa's question, "Suzy Creamcheese, honey, what's got into you?" the response might well be: "A whole lot of other living things."
What difference does this make? For many of us supposedly free-living creatures, quite a lot. Providing room and board to other life-forms not only compromises one's nutritional status (not to mention peace of mind), it often reduces freedom of action, too. The technical phrase is "host manipulation." For example, the tapeworm Echinococcus multilocularis causes its mouse host to become obese and sluggish, making it easy pickings for predators, notably foxes, which — not coincidentally — constitute the next phase in the tapeworm's life cycle. Those the gods intend to bring low, according to the Greeks, they first make proud; those tapeworms intending to migrate from mouse to fox do so by first making their mouse fat, sluggish, and thus, fox food.
Sometimes the process is more bizarre. For example, the life cycle of a trematode worm, known as Dicrocoelium dendriticum, involves doing time inside an ant, followed by a sheep. Getting from its insect host to its mammalian one is a bit of a stretch, but the resourceful worm has found a way: Ensconced within an ant, some of the worms migrate to its formicine brain, whereupon they manage to rewire their host's neurons and hijack its behavior. The manipulated ant, acting with zombielike fidelity to Dicrocoelium's demands, climbs to the top of a blade of grass and clamps down with its jaws, whereupon it waits patiently and conspicuously until it is consumed by a grazing sheep. Thus transported to its desired happy breeding ground deep inside sheep bowels, the worm turns, or rather, releases its eggs, which depart with a healthy helping of sheep poop, only to be consumed once more, by ants. It's a distressingly familiar story ... distressing, at least, to those committed to bodily "autonomy."
A final example, as unappetizing as it is important: Plague, the notorious Black Death, is caused by bacteria carried by fleas, which, in turn, live mostly on rats. Rat fleas sup cheerfully on rat blood, but will happily nibble people, too, and when they are infected with the plague bacillus, they spread the illness from rat to human. The important point for our purposes is that once they are infected with plague, disease-ridden fleas are especially enthusiastic diners because the plague bacillus multiplies within flea stomachs, diabolically rendering the tiny insects incapable of satisfying their growing hunger. Not only are these fleas especially voracious in their frustration; because bacteria are cramming its own belly, an infected flea vomits blood back into the wound it has just made, introducing plague bacilli into yet another victim. A desperately hungry, frustrated, plague-promoting flea, if asked, might well claim that "the devil made me do it," but in fact, it was the handiwork of Pasteurella pestis. ("So, naturalists observe, a flea/ Has smaller fleas that on him prey,/ And these have smaller still to bite 'em,/ And so proceed ad infinitum," observed Jonathan Swift.)
Not that a plague bacterium — any more than a mouse-dwelling tapeworm or ant-hijacking brainworm — knows what it is doing when it reorders the inclinations of its host. Rather, a long evolutionary history has arranged things so that the manipulators have inherited the earth.
Once again, however, things aren't quite so simple. The ways of natural selection are devious and deep, embracing not only would-be manipulators but also their intended victims. Hosts needn't meekly follow just because others seek to lead. In Shakespeare's Henry IV, Part I, Owen Glendower boasts, "I can call spirits from the vasty deep," to which Hotspur responds, "Why, so can I, or so can any man; But will they come when you do call for them?" Sometimes it is unclear whether our spirits, no less than those of Glendower, might seek to enlist others, act at another's behest, or suit themselves. Take coughing, or sneezing, or even — since we have already broached some indelicate matters — diarrhea.
When people get sick, they often cough and sneeze. Indeed, aside from making you feel crummy or possibly run a fever, coughing and sneezing are important ways we identify illness in the first place. It may be beneficial for an infected person to cough up and sneeze out some of her tiny organismic invaders, although to be sure, it isn't so beneficial for others nearby. That, in turn, leads to an interesting possibility: What if coughing and sneezing aren't merely symptoms of disease, but also — even primarily — a manipulation of us, the host, by, say, the influenza virus? Shades of fattened mice and grass-blade-besotted ants. As to diarrhea, consider that it is a major (and potentially deadly) consequence of cholera. To be sure, as with a flu victim's sneezing and coughing, perhaps it helps a cholera sufferer to expel the cholera-causing critter, Vibrio cholerae. But it also benefits Vibrio cholerae. Just as Lenin urged us to ask "Who, whom?" with regard to socioeconomic interactions — who benefits at the expense of whom? — an evolutionary perspective urges upon us the wisdom of asking a similar question. Who benefits when a cholera sufferer dies from unremitting diarrhea? Answer: the cholera bacillus.
That dramatic symptom of cholera is caused by a toxin produced by the bacillus, making the host's intestines permeable to water, which generates a colonic flood that washes out much of the native bacterial intestinal flora, leaving a comparatively competitor-free environment in which Vibrio cholerae can flourish. A big part of that flourishing, moreover, occurs when effluent carrying more than 100 billion V. cholerae per liter sluices out of the victim's body, whereupon, if conditions are less than hygienic, they can infect new victims. Diarrhea, then, isn't just a symptom of cholera; it is a successful manipulation of Homo sapiens, by the bacteria and for the bacteria.
Troublesome as such tales of alien invasion and pathologic manipulation may be, it is easy to shrug them off when it comes to the daily, undiseased lives most of us experience. Things feel pretty straightforward when we are "doing our own thing," if only because then — we like to insist — we are acting on our volition and not for the benefit of some parasitic or pathogenic occupying army. When we fall in love, we do so for ourselves, not at the behest of a romance-addled tapeworm. When we help a friend, we aren't being manipulated by an altruistic bacterium or a long-pacified retrovirus. If we eat when hungry, sleep when tired, scratch an itch, or write a poem, we aren't knuckling under to the needs of our nematodes or anything else. Or are we?
As modern evolutionary biologists increasingly recognize, bodies are our genes' way of projecting themselves into the future. Bodies are temporary, ephemeral, short-lived apparatuses for those genes, which are the only entities that persist over evolutionary time. No matter how much money, time, or effort is lavished on bodies, regardless of how much they are exercised, pampered, or monitored for bad cholesterol, they don't have much of a future. In the scheme of things, they are as ephemeral as a spring day, a flower's petal, a gust of wind. What does persist are genes. Bodies go the way of all flesh: ashes to ashes, and atom to atom. Genes, on the other hand, are potentially immortal.
In his poem "Heredity," Thomas Hardy had a premonition of modern evolutionary biology and the endurance of genes: "I am the family face;/ Flesh perishes, I live on."
More troublesome than the fact that our genes — at least in theory — are eternal, whereas our bodies are ephemeral, is the caterpillar's question to Alice: Who are we? Or, rephrasing Lenin by way of Glendower, Who's calling and who's heeding?
The biologically informed answer is that we're not all that different from those alarming rat/tapeworm, ant/trematode, flea/bacteria relationships, only this time it's genes/body. Unlike the cases of parasites or pathogens, when it comes to genes manipulating bodies, the situation seems less dire to contemplate, if only because it is less a matter of demonic possession than of our genes, ourselves. The problem, however, is that those presumably personal genes aren't any more hesitant about manipulating us than a brainworm is about hijacking an ant.
Consider a seemingly more benign behavior, indeed, one that is highly esteemed: altruism. This is a favorite of evolutionary biologists because, superficially, every altruistic act is a paradox. Natural selection should squash any genetically mediated tendency to confer benefits on someone else while disadvantaging the altruist. Such genes should disappear from the gene pool, to be replaced by their more selfish counterparts. To a large extent, however, the paradox of altruism has been resolved by the recognition that "selfish genes" — per Richard Dawkins and others — can promote themselves (rather, identical copies of themselves) by conferring benefits on genetic relatives, who are likely to carry copies of the genes in question. By that process, known as "kin selection," behavior that appears altruistic at the level of bodies is revealed to be selfish at the level of genes. When someone favors a genetic relative, who, then, is doing the favoring: the nepotist, or the nepotist's genes?
Just as sneezing may well be a successful manipulation of "us" (Homo sapiens) by "them" (viruses), what about altruism as another successful manipulation of "us," this time by our own "altruism genes"? Admirable as altruism may be, it is therefore, in a sense, yet another form of manipulation. After all, just as the brainworm gains by orchestrating the actions of an ant, altruism genes stand to gain when we are nice to Cousin Sarah, never mind that such niceness is costly to that entity we unthinkingly call "ourselves."
All this may seem a bit naïve, since biologists know that genes don't order their bodies around. No characteristic of any living thing emerges full-grown from the coils of DNA, like Athena leaping out of the forehead of Zeus. Every trait — including behavior — results from a complex interaction of genetic potential and experience, learning as well as instinct, nurture inextricably combined with nature. Life is a matter of genetic influence, not determinism.
But does that really resolve the problem? Let's say that a brainworm-bearing ant still possesses some free will. And that a trematode-carrying mouse has even a bit more. So what if their behaviors were influenced, but not determined? Wouldn't even "influence" be enough to cast doubt on their agency, their independence of action? And (here comes the Big Question) why should human agency or free will be any less suspect? Even if we are manipulated just a teensy-weensy bit by our genes, isn't that enough to raise once again that disconcerting question: Who R we? Or, more to the point, who's in charge?
Maybe it doesn't matter. Or, put differently, maybe there is no "one" home, nobody minding the store. If so, that is because the "environment" is no more outside us than inside, part tapeworm, part bacterium, part genes, part hitchhiking retroviruses, collaborating mitochondria, and no independent, self-serving, order-issuing homunculus. Purveyors of Buddhist wisdom note that our skin doesn't separate our organismic selves from the environment; it joins us, just as purveyors of biological wisdom know that we are manipulated by, no less than manipulators of, the rest of life. Who R we? Well, that depends on what the meaning of "who" is. Who's left after the parasites and pathogens and other fellow travelers are removed? And after "you" are separated from your genes?
"That what we call mind," wrote David Hume, "is nothing but a heap or collection of different perceptions, united together by certain relations, and supposed, though falsely, to be endowed with a perfect simplicity and identity." The question of whether human consciousness reflects a single, coherent identity is fraught enough to have occupied battalions of philosophers and, these days, more than a few neurobiologists as well (even without the vexed question of free will). Add to that the identity-challenging insights of evolutionary genetics and ecology, and the caterpillar's question takes on even greater weight. One thing we know, however, is that no one can issue a bona fide Declaration of Independence.
Accordingly, let's leave the last words to a modern icon of organic, oceanic wisdom: SpongeBob SquarePants. Mr. SquarePants, a cheerful, talkative — although admittedly, somewhat cartoonish — fellow of the phylum Porifera, "lives in a pineapple under the sea/ Absorbent and yellow and porous is he!" I don't know about the pineapple or the yellow, but absorbent and porous are we, too.
David P. Barash is a professor of psychology at the University of Washington and a frequent Chronicle contributor. His most recent book, Natural Selections: Selfish Altruists, Honest Liars, and Other Realities of Evolution (Bellevue Literary Press), is based on his recent Chronicle articles.
3.11.08
STUDS
An Ear for the Lyrical Voice of Everyman
Much of what's important to know about Studs Terkel could be shorthanded in that nickname. Who calls anyone "Studs" anymore? Who even called guys that back when -- the 1930s, '40s, '50s, when Louis "Studs" Terkel honed his craft as a journalist, raconteur, and chronicler and champion of the working class?
As it happened, the nickname came from the character Studs Lonigan, hero of James T. Farrell's novels about a kid from Chicago's South Side, which was the Bronx-born Terkel's adopted home.
But "Studs" suggested much of what Terkel, who died yesterday at 96, was: a man who came up on the real people's side of town, who knew the gangsters and the showgirls, and who never lost his feel for the people at the back of the bus, slumped and slumbering in the dreary light. He was rumpled, and smoked cigars. Of course.
Reporters and priests and psychologists know it takes a certain kind of personality to get a certain kind of person to speak honestly. Terkel's gift -- displayed on his syndicated radio program for decades, as well as in print -- was just this. He perfected a kind of shoe-leather approach to writing the history of America in the last century that coaxed extraordinary tales out of nobodies.
His method was to travel the country, sometimes for years, interviewing hundreds of people about some enormous epoch or theme. Terkel essentially asked everyone a simple question: What was it like (or, what are your thoughts about . . . )? What was it like to live through the Great Depression ("Hard Times," 1970), to simply do your job ("Working," 1974), to live through World War II ("The Good War," 1984)? The result -- a series of oral histories -- was the poetry of ordinary people, shot through with desperation, hatred, love, dreams realized and lost.
"What first comes out of an interview are tons of ore; you have to get that gold dust in your hands," Terkel wrote in his 2007 memoir, "Touch and Go." "Now, how does it become a necklace or a ring or a gold watch? You have to get the form; you have to mold the gold dust."
In "Hard Times," Terkel's got an astounding cross section of people -- tycoons and autoworkers, farmers and stickup artists, even the fan dancer Sally Rand -- to open up not just about bread lines and poverty, but about heartache. The overriding theme of "Hard Times" isn't just deprivation but shame. Shame about losing a job and going "on relief." Shame about not being able to provide for one's family. Shame about the breakdown of families and, almost, the fabric of an entire society. You read it and think that the most terrible thing to happen to a nation is not to lose its economy, but to lose its faith in itself.
If "Hard Times" was about humiliation, "The Good War," Terkel's Pulitzer Prize-winning exploration of World War II, was about fear. The title was, of course, ironic. No wars are good, no matter how just the cause. In contrast to the Depression, America's economy revived with the spark of war production. People had jobs and stuff again (even with severe rationing), but the book illustrates the limitations of comfort and material things.
For what doth it profit a man to gain the whole world (or at least a job again) if he wakes up every morning fearing something far more terrifying than his own death: the loss of his son in distant battle?
An unapologetic and lifelong lefty, Terkel had a special affinity for the waitress and the tool-and-die man. "I never met a picket line or petition I didn't like," Terkel once said. He was an avid New Dealer in the 1930s and was blacklisted in the 1950s, suspected of communist leanings, a suspicion that cost him his national TV show.
Terkel's younger contemporaries were the fading class of bar-stool journalists (Terkel undoubtedly would hate that term), hard-boiled guys like Jimmy Breslin, Jimmy Cannon and a fellow Chicagoan, Mike Royko. Terkel's latter-day musical analogue is Bruce Springsteen, whose anthems and ballads have the same bottom-up view as Terkel's clerks and mechanics.
The perfect Terkel quote: "When the Chinese Wall was built, where did the masons go for lunch?" he said when he received an honorary National Book Award medal in 1997. "When Caesar conquered Gaul, was there not even a cook in the army? And here's the big one: When the Armada sank, you read that King Philip wept. Were there no other tears? And that's what I believe oral history is about. It's about those who shed those other tears, who on rare occasions of triumph laugh that other laugh."
Sarah Vowell & Words
In an era of election buzzwords, “mavericks” and Katie Couric interviews, it's strange to remember there was once a time when highly literate speeches were all the rage.
Tragically, that was almost four centuries ago.
As the bestselling New York writer Sarah Vowell notes in her new book The Wordy Shipmates, there is a lot of carry-over rhetoric from the earliest English-American settlers to the present day. The original sermons uttered by Puritans washing ashore in the New World not only defined what was to become the American belief in a chosen land for a chosen people; their words still echo in today's stump speeches.
But oh, how those original words have been maligned, says Vowell.
The author of the 2005 bestseller Assassination Vacation and contributor to Public Radio International's This American Life specializes in the quirks of Americana and what she sees as a duplicity between the ideals of the United States and its actual deeds. For a long time, she has wanted to write about early American Puritans, specifically about the words of the first governor of the Massachusetts Bay Colony, John Winthrop, which she feels have been co-opted by conservatives.
One of his most famous sermons, A Model of Christian Charity, holds particularly strong appeal for her with its message of community and caring within a unified body of people. The sermons, with their biblical promised land imagery, have also been popular with Republicans, most notably former U.S. president Ronald Reagan, and now, vice-presidential candidate Sarah Palin.
But the phrases aren't simply Reagan's legacy, Vowell notes. They are in fact messages that have wafted through American culture for centuries. Most famously, Reagan used Winthrop's notion of “a city upon a hill,” which in turn was borrowed from the biblical Sermon on the Mount. (So Winthrop admirers today can't complain that the colonial governor's words were cribbed by Reagan and Palin. As it turns out, Jesus holds the copyright.) Still, it's Winthrop who first applied the imagery to a nascent America.
On board the sailing ship Arbella in 1630, Winthrop uttered “a city upon a hill” in his Model of Christian Charity sermon, aimed at preparing these God-fearing Puritans to build a new society in an unknown environment. (Or maybe he delivered the sermon before the ship actually departed England. Historians are not 100 per cent sure.) Imagine the flock listening to the speech – stern-faced and stalwart. They were Puritans after all, although less rigidly anti-England than the Mayflower bunch who arrived a decade earlier. Nevertheless, they must also have been anxious, to say the least. In leaving England, they faced such joys as possible shipwreck in the Atlantic or death in the cold of the New World. So, strong words gave them reason for their fateful, if possibly fatal, decision.
And words, along with education and general learnedness, continued to be held in high esteem once they settled in Massachusetts. “They just loved words and writing and reading … not just the volume of it, but the quality as well. I'm just continually amazed by how much writing they did considering they were in a lot of ways regular old pioneers,” Vowell says over the phone during her current book tour.
There is one line from Winthrop that especially captivates her. She calls it one of the most beautiful sentences in the English language: “We must delight in each other, make others' conditions our own, rejoice together, mourn together, labour and suffer together, always having before our eyes our commission and community in the work, our community as members of the same body.”
After the destruction of the World Trade Center, Vowell says that, as a New Yorker, she felt the strong relevance of those words as the city around her pulled together. And Winthrop's message easily slides into a speech given by any presidential hopeful in recent memory: One electoral body. One nation under God, indivisible. It's all codified in Winthrop's utterances.
But while the rhetoric continues, it's clearly not the reality. Not then, and not today, Vowell suggests.
“I really got cracking on the book after Ronald Reagan's funeral [in 2004] for two reasons. The day Sandra Day O'Connor [then a Supreme Court justice] read Winthrop's sermon as part of the funeral service at the National Cathedral is the day when, it seems to me, Reagan's saintification was more or less complete. And it always bothered me that he used the ‘city on a hill' as his catchphrase, when it's a sermon about charity and generosity,” Vowell says.
“To me, [Reagan's] agenda and legacy and administration were about the opposite of charity and generosity,” she says, citing cuts in housing and lunch programs, and lack of attention to people with AIDS during his term in office.
In addition, O'Connor included Winthrop's phrase that “the eyes of all people are upon us.” What Winthrop meant was that the rest of the world was watching and expected the colony to fail. So, in other words, we need to put on even sterner faces and stiffer upper lips, and work even harder together to make this colony work. But when O'Connor read Winthrop's words centuries later, it was at the same time that the horrific images of the torture and humiliation of prisoners at Iraq's Abu Ghraib jail by American forces were everywhere in the news.
“That part of Winthrop's sermon seemed like a prophecy fulfilled. The eyes of all people were upon us, and what did they see?” Vowell says.
The sermon now has increasingly become a sound bite, she argues. But if read more fully and analyzed more carefully, Winthrop's A Model of Christian Charity describes, as Vowell writes in her book, “an America that might have been, an America fervently devoted to the quaint goals of working together and getting along. Of course, this America does exist. It's called Canada.”
So what has “city upon a hill” become? Another phrase to heap onto all the other phrases spoken at campaign stops, without much real meaning actually communicated? That's what New York writer and veteran magazine editor Glenn O'Brien thinks, calling it in his blog an element in “a new kind of language in which conventional structure is replaced by blocks and stacks of code and buzzwords, pre-digested button-pushing ideograms that simulate speech but are in fact its opposite.” He says this in a commentary highly critical of Palin.
But no matter how distant the language of Winthrop may now seem from its original context, some of the underlying sentiments still linger. For example, it's the view held by some – George W. Bush's administration, for one – that America is “always right and good, inherently best and better than everyone else in the rest of the world. I don't think [Bush] needs a catchphrase. To him, probably the word ‘America' is enough,” Vowell says. (When Vowell tells a joke or is disparaging, she speaks utterly matter-of-factly. It's safe to infer here that she's not complimenting Bush.) This belief in America “the right and good” has then served at times as self-justification for unilateral, bloody action. And yet it's also a sentiment that those who grew up in America, Vowell included, can't help feeling innately. As she writes in her book, “Even though my head tells me that the idea that America was chosen by God as His righteous city on a hill is ridiculous, my heart still buys into it.”
And as she adds on the phone, “The United States has these grand ideals and these beautiful founding documents that are all about liberty and equality. So there's that: There's what we say.
“And then there's what we do.”
Prayers
Thousands turn to online prayer and advice as financial situation worsens
Web users looking for support during the current financial situation have boosted traffic by more than 70 percent to a Church of England website section focusing on debt advice, and visitor numbers to the church's online prayer page have increased by more than a quarter.
The Matter of Life and Debt website section -- containing a new 'debt spiral' feature enabling visitors to work out if they are among the many families that will be seriously affected by the credit crunch, and useful advice for those worried about debt -- has seen a 71 percent increase in traffic in recent weeks.
A new Prayer for the Current Financial Situation has been viewed nearly 8,000 times since it was published online in September - increasing traffic to the popular Prayers for Today section by 28 percent.
Archbishop of Canterbury Rowan Williams recently said: "At this time of international financial turbulence, it is important that the church should be offering the opportunity for prayer and reflection."
Prayers for Today also contains many other useful contemporary prayers, covering issues such as exam stress and world peace.
Prayer for the current financial situation
Lord God, we live in disturbing days:
across the world,
prices rise,
debts increase,
banks collapse,
jobs are taken away,
and fragile security is under threat.
Loving God, meet us in our fear and hear our prayer:
be a tower of strength amidst the shifting sands,
and a light in the darkness;
help us receive your gift of peace,
and fix our hearts where true joys are to be found,
in Jesus Christ our Lord. Amen.
Loneliness
Lonely together
Loneliness: Human Nature and the Need for Social Connection
“The human being is by nature a social animal,” Aristotle wrote in his treatise on politics. The proof, he believed, lay in the fact that “the human being alone among the animals has speech.” Bees buzz; sheep bleat; if humans speak, they must be even more gregarious. So integral are groups to human nature, Aristotle believed, that a person who tries to go it alone must be something other than human – “either a beast or a god.”
Alas, by Aristotle’s definition, more and more people are abandoning their humanity every year. Industrialization brings material comforts, and representative democracy brings political power, and as soon as people have these goods they seem to use them to procure privacy and independence – and, inadvertently, social isolation. Instead of exchanging gossip while walking a footpath, we honk at one another from inside the bubbles of our automobiles. Instead of amicably trading opinions over a leisurely cup of tea, we leave anonymous vitriol in the comments sections of blogs. In Loneliness, the psychologist John T Cacioppo and the science writer William Patrick report on the situation in the United States: Between 1985 and 2004, the number of Americans who said they had no close confidants tripled. Single-parent households are on the rise, and the US Census estimates that 30 percent more Americans will live alone in 2010 than did so in 1980. As the American way of life spreads around the world, no doubt loneliness is being exported with it.
People do like to be alone sometimes. But no one likes to feel lonely – to feel that they are alone against their will, or that the social contacts they do have are without deeper meaning. According to Cacioppo and Patrick the feeling of loneliness is the least of it. They present scientific evidence suggesting that loneliness seriously burdens human health. By middle age, the lonely are less likely to exercise and more likely to eat a high-fat diet, and they report experiencing a greater number of stressful events. Loneliness correlates with an increased risk of Alzheimer’s. During a four-year study, lonely senior citizens were more likely to end up in nursing homes; during a nine-year study, people with fewer social ties were two to three times more likely to die.
To explain why loneliness hurts so bad, Cacioppo and Patrick turn to evolutionary psychology. Like Aristotle, they believe that humans were designed to live together. They posit that a willingness to cooperate helped propel humans to the top of the food chain, and they agree with the emerging consensus among evolutionary psychologists that human brains grew powerful in order to interpret the signals humans exchanged as they cooperated. (Where did you put the eland shank? In the cave? Really? But I thought you said a man-eating ocelot lived there.) The earliest human societies were made up of hunter-gatherers who shared everything. On the prehistoric savannahs where humans evolved, to be alone was dangerous, and so loneliness, for humans, is a distress signal akin to pain, thirst, cold, or hunger. It burdens the human body and mind so as to force us into company, where we will be safer.
It is through the mind that loneliness damages the body, and Cacioppo and Patrick explain a number of psychological experiments, conducted by themselves and others, that offer clues about the mechanism. Loneliness erodes will power, for one thing. If subjects are told for the purposes of experiment that they will face a lonely future, they score lower on intelligence tests and abandon tasks sooner. If cookies are set before subjects who have been told that no one else in the experiment wants to work with them, they eat twice as many as those who have been told that everyone else in the experiment wants to work with them.
The mind’s perception of social information seems particularly distorted. In experiments, lonely people recall social information more accurately but are worse at interpreting the emotional meaning of briefly displayed faces. They are more likely than non-lonely people to attribute failure to themselves and success to the situation they find themselves in. In a game where two players split $10 (Dh37) if they agree on how much each one gets, lonely players accept unfair divisions more often. When Cacioppo’s team watched the brains of lonely and non-lonely people under a functional magnetic resonance imaging scanner, they noticed that the lonely respond more strongly to pleasant-looking objects than to pleasant-looking people. When shown unpleasant images, however, the lonely pay more attention to people. Cacioppo and Patrick suggest that these distortions of perception might trap a chronically lonely person in “a defensive crouch” that keeps others at bay. “Fear of attack fosters a greater tendency to preemptively blame others,” they write. “Sometimes this fear makes us lash out. Sometimes it makes us desperate to please, and sometimes it causes us to play the victim.”
With this plausible hypothesis, we approach the chief flaw of Cacioppo and Patrick’s book. In addition to a summary of the damage caused by loneliness and a reasonable-sounding (if somewhat rambling) explanation of it grounded in evolutionary psychology, Cacioppo and Patrick offer a way out. “With a little encouragement, most anyone can emerge from the prison of distorted social cognition and learn to modify self-defeating interactions,” they promise. You, too, can learn “the secret to gaining access to social connection and social contentment.” In their closing chapters, Cacioppo and Patrick begin to sound like a late-night television infomercial. Cacioppo even reveals that he’s a “scientific consultant” for an online dating service.
Loneliness: Human Nature and the Need for Social Connection
“The human being is by nature a social animal,” Aristotle wrote in his treatise on politics. The proof, he believed, lay in the fact that “the human being alone among the animals has speech.” Bees buzz; sheep bleat; if humans speak, they must be even more gregarious. So integral are groups to human nature, Aristotle believed, that a person who tries to go it alone must be something other than human – “either a beast or a god.”
Alas, by Aristotle’s definition, more and more people are abandoning their humanity every year. Industrialization brings material comforts, and representative democracy brings political power, and as soon as people have these goods they seem to use them to procure privacy and independence – and, inadvertently, social isolation. Instead of exchanging gossip while walking a footpath, we honk at one another from inside the bubbles of our automobiles. Instead of amicably trading opinions over a leisurely cup of tea, we leave anonymous vitriol in the comments sections of blogs. In Loneliness, the psychologist John T Cacioppo and the science writer William Patrick report on the situation in the United States: Between 1985 and 2004, the number of Americans who said they had no close confidants tripled. Single-parent households are on the rise, and the US Census estimates that 30 percent more Americans will live alone in 2010 than did so in 1980. As the American way of life spreads around the world, no doubt loneliness is being exported with it.
People do like to be alone sometimes. But no one likes to feel lonely – to feel that they are alone against their will, or that the social contacts they do have are without deeper meaning. According to Cacioppo and Patrick the feeling of loneliness is the least of it. They present scientific evidence suggesting that loneliness seriously burdens human health. By middle age, the lonely are less likely to exercise and more likely to eat a high-fat diet, and they report experiencing a greater number of stressful events. Loneliness correlates with an increased risk of Alzheimer’s. During a four-year study, lonely senior citizens were more likely to end up in nursing homes; during a nine-year study, people with fewer social ties were two to three times more likely to die.
To explain why loneliness hurts so bad, Cacioppo and Patrick turn to evolutionary psychology. Like Aristotle, they believe that humans were designed to live together. They posit that a willingness to cooperate helped propel humans to the top of the food chain, and they agree with the emerging consensus among evolutionary psychologists that human brains grew powerful in order to interpret the signals humans exchanged as they cooperated. (Where did you put the eland shank? In the cave? Really? But I thought you said a man-eating ocelot lived there.) The earliest human societies were made up of hunter-gatherers who shared everything. On the prehistoric savannahs where humans evolved, to be alone was dangerous, and so loneliness, for humans, is a distress signal akin to pain, thirst, cold, or hunger. It burdens the human body and mind so as to force us into company, where we will be safer.
It is through the mind that loneliness damages the body, and Cacioppo and Patrick explain a number of psychological experiments, conducted by themselves and others, that offer clues about the mechanism. Loneliness erodes will power, for one thing. If subjects are told for the purposes of experiment that they will face a lonely future, they score lower on intelligence tests and abandon tasks sooner. If cookies are set before subjects who have been told that no one else in the experiment wants to work with them, they eat twice as many as those who have been told that everyone else in the experiment wants to work with them.
The mind’s perception of social information seems particularly distorted. In experiments, lonely people recall social information more accurately but are worse at interpreting the emotional meaning of briefly displayed faces. They are more likely than non-lonely people to attribute failure to themselves and success to the situation they find themselves in. In a game where two players split $10 (Dh37) if they agree on how much each one gets, lonely players accept unfair divisions more often. When Cacioppo’s team watched the brains of lonely and non-lonely people under a functional magnetic resonance imaging scanner, they noticed that the lonely respond more strongly to pleasant-looking objects than to pleasant-looking people. When shown unpleasant images, however, the lonely pay more attention to people. Cacioppo and Patrick suggest that these distortions of perception might trap a chronically lonely person in “a defensive crouch” that keeps others at bay. “Fear of attack fosters a greater tendency to preemptively blame others,” they write. “Sometimes this fear makes us lash out. Sometimes it makes us desperate to please, and sometimes it causes us to play the victim.”
With this plausible hypothesis, we approach the chief flaw of Cacioppo and Patrick’s book. In addition to a summary of the damage caused by loneliness and a reasonable-sounding (if somewhat rambling) explanation of it grounded in evolutionary psychology, Cacioppo and Patrick offer a way out. “With a little encouragement, most anyone can emerge from the prison of distorted social cognition and learn to modify self-defeating interactions,” they promise. You, too, can learn “the secret to gaining access to social connection and social contentment.” In their closing chapters, Cacioppo and Patrick begin to sound like a late-night television infomercial. Cacioppo even reveals that he’s a “scientific consultant” for an online dating service.
You already know the advice that Cacioppo and Patrick are recommending, so let me save you the price of the book: do unto others as you would have them do unto you. If you’re lonely and want people to pay attention to you, Cacioppo and Patrick recommend that you pay attention to them. It’s great advice, of course. Time-tested. Easy to remember. And I don’t doubt that it works, for those able to follow through with it. But it will be useless to those too trapped by circumstances, habits, or brain chemicals to change their ways by mere will power – which, as Cacioppo and Patrick show, is often undermined by loneliness. There’s also very little science behind it. Cacioppo and Patrick cite evidence that mortality drops 25 percent among those who regularly attend religious services. Strength of religious feeling has nothing to do with the health benefit – mere attendance is all that’s required – and Cacioppo and Patrick infer that “seeing others committed to compassionate helping ... reinforces various positives, including a healthier lifestyle.” That’s not quite proof that the Golden Rule cures loneliness, though. It suggests, rather, that joining a church, mosque or synagogue cures loneliness. (Indeed, joining any organization at all might help; Cacioppo and Patrick report that mortality is lower in American states whose citizens belong to more groups of whatever kind.) And it should perhaps be noted that none of the major religions promise that the Golden Rule will win you friends and lovers. They merely say it’s the right thing to do. For the sake of clarity, let me repeat that I think so, too. But you don’t need Cacioppo and Patrick to tell you so, and there’s no reason to believe that being told to practise the Golden Rule will make you less lonely – or even to believe that being told to practise it will cause you to practise it. Cacioppo and Patrick have not tested their proposed remedy with the same scientific rigour that they have tested their claims about loneliness’s physiological effects and psychological mechanisms.
Of course science doesn’t have a monopoly on the interpretation of a human phenomenon like loneliness. Literature and political philosophy have much to say as well, and in another recent book, Loneliness as a Way of Life, Thomas Dumm turns to such thinkers and artists as William Shakespeare, Hannah Arendt, Herman Melville, Arthur Miller, Sigmund Freud and Ralph Waldo Emerson. Dumm writes that his “core insight” is that loneliness explains much of modern political life, and he weaves into his analysis a partial account of what he felt as he lost his wife, who died in 2003 after struggling for four and a half years with cancer. The story of his widowerhood appeals to the reader’s sympathy. It is therefore somewhat painful to be disappointed by his analysis, which is vague and breaks no new ground.
Dumm defines loneliness as “the experience of the pathos of disappearance.” The definition fails to distinguish loneliness from grief, and it doesn’t help Dumm reach any new insights. It isn’t clear, however, that Dumm intends to reach any. For the most part, he presents the insights of others, retelling, with the occasional critical remark, Arendt’s theory that totalitarian governments prefer their citizens lonely, Freud’s distinction between successful mourning and stalled brooding, the plot of the movie Paris, Texas and Judith Butler’s idea that the Bush administration shunted into violence the emotional energy that ought to have gone into grieving over America’s losses of September 11.
Such dependence on the thinking of others is hardly a fault. Writers like Alain de Botton have had great success in making the ideas of the Western tradition accessible and intriguing to new readers. But when Dumm does stray from his primary texts, he fails to convince the reader to follow. Though Dumm is a professor of political philosophy, Loneliness as a Way of Life is in method a work of literary criticism, and as a reader, Dumm is sloppy. For example, consider his discussion of a famous passage in Emerson’s essay Experience. Emerson expected the death of his young son to scar him but it didn’t. “It was caducous,” Emerson wrote. According to the Oxford English Dictionary, the word caducous is “applied to organs or parts that fall off naturally when they have served their purpose”, such as autumn leaves after they have turned. Dumm misses the beautiful metaphor implicit in the word. He defines it simply as “the falling off of a limb”, which sounds reptilian, and then wanders irrelevantly into a discussion of words near it in the dictionary, including cad, cadet and cadre. Elsewhere Dumm writes of his late wife that in her absence “she becomes, as Emerson says, a part of my estate,” but in fact Emerson wrote that “in the death of my son ... I seem to have lost a beautiful estate,” so Dumm has reversed Emerson’s meaning. Such errors may seem picayune, but in literary criticism, interpretations are built out of observations. Dumm’s lapses in noticing eventually lead him to claim, improbably, that in Herman Melville’s novel Moby-Dick, “Ishmael is Pip.”
Dumm also is hindered by a high-academic style whose hallmarks are rhetorical questions (“Why is Cordelia so unhappy?”), wildly general assertions (“Capitalism may be thought of as a symptom of the lonely self”), the po-faced delivery of puns (“Our very reality is fundamentally shaped by realty”) and the heavy-handed use of the literary figure of chiasmus (“Love is all we need to overcome absence – and loneliness is the absence we cannot overcome”). The style is unfortunate. You might even liken it to a defensive crouch. The professor alone among the animals has tenure. He should use it to communicate – not to set himself apart.
If industrial capitalism is fostering loneliness, then neither science nor political philosophy is likely to save us from it. I happen to fare poorly at the cookie-eating experiment described above and suspect myself, for this and other reasons, of not having the most thoroughly socialized personality, so I don’t dare offer any advice of my own. But I will mention a use of literature overlooked by Dumm: solidarity in loneliness. It’s strangely pleasant to read about the runaway boy in Denton Welch’s novel Maiden Voyage, or about the disillusioned widow in Angus Wilson’s The Middle Age of Mrs Eliot, even if the reader has no expectation of learning from the characters’ predicaments. One isn’t any less alone for reading them, but then loneliness has nothing to do with the number of actual people one is in touch with, even in Cacioppo and Patrick’s experiments. Some books make solitude bearable.
1.11.08
ALL SAINTS DAY

A Date With the Departed
By THOMAS LYNCH
The pumpkins, penny candy and neighborly hordes of goblins and ghosts shouting “Trick or treat!” remind us of the ancients and their belief that the souls of the dead must be appeased. But it’s the days that follow Halloween that most interest me.
All Saints’ Day and All Souls’ Day are time set aside to broker peace between the living and the dead. Whether you are pagan or religious, Celt or Christian, New Age believer or doubter-at-large, these are the days when you traditionally acknowledge that the gone are not forgotten. The seasonal metaphors of reaping and rotting, harvest and darkness, leaf-fall and killing frost supply us with plentiful memento mori. Whatever is or isn’t there when we die, death both frightens and excites us.
Thus, throughout most of the Western world, graves are decorated on these first days of November with candles and fresh flowers. Picnics are held among the old stones and markers; relatives gather round family plots to give the dead their due of prayers and remembrances.
We humans are bound to and identified with the earth, the dirt, the humus out of which our histories and architectures rise — our monuments and memorials, cairns and catacombs, our shelters and cityscapes. This “ground sense,” to borrow William Carlos Williams’s idiom, is at the core of our humanity. And each stone on which we carve our names and dates is an effort to make a human statement about death, memory and belief. Our kind was here. They lived; they died; they made their difference. For the ancient and the modern, the grave is an essential station.
But less so, lately, especially here in the United States, where we whistle past our graveyards and keep our dead at greater distance, consigned to oblivions we seldom visit, estranged and denatured, tidy and Disney-fied memorial parks with names like those of golf courses or megachurches.
In her honors seminar, “Death in American Culture,” at Gardner-Webb University in Boiling Springs, N.C., June Hobbs takes her students on a field trip to Sunset Cemetery in nearby Shelby. She believes that cemeteries have much to tell us about ourselves. For most of her students, it is their first visit to a cemetery.
“I find this astonishing,” says Professor Hobbs. “This county had more casualties during the Civil War than any other. The dead were everywhere, the churchyards filled up, Sunday afternoons were spent visiting graves. The dead were very much a part of the community, kept alive in everyday conversations.” Now they’ve been downsized or disappeared.
She speaks to a culture that quietly turned the family “parlor” into a “living room,” the “burial policy” into “life insurance” and the funeral into a “celebration of life,” often notable for the absence of a corpse, and the subtle enforcement of an emotional code that approves the good laugh but not the good cry. Convenience and economy have replaced ethnic and religious customs.
The dead get buried but we seldom see a grave. Or they are burned, but few folks ever see the fire. Photographs of coffins returned from wars are forbidden, and news coverage of soldiers’ burials is discouraged. Where sex was once private and funerals were public, now sex is everywhere and the dead go to their graves often as not without witness or ritual.
Still, there remains something deeply human in the way we process mortality by processing mortals in the journey between life as we know it and life as we imagine it, in whatever space the dead inhabit. Wherever the dead go or don’t, it is the duty of the living to get them to the edge of that oblivion.
Since the first cave-dwelling Neanderthal awakened next to a dead kinsman and knew something would have to be done about it, we humans have looked into the tomb or grave or fire and asked ourselves the signature questions of our species: Is that all there is? Can it happen to me? What comes next? Only the dead know the answers. And the living are well and truly haunted by them.
Perhaps Professor Hobbs is right. The dead have something to teach us still. A visit to your local cemetery, here in the month of all saints and souls, is a course in humanity. There are inklings to answers among the stones.