About Me

New Orleans, Louisiana, United States
Admire John McPhee, Bill Bryson, David Remnick, Thomas Merton, Richard Rohr and James Martin (and most open and curious minds)

31.10.08

October is Over

The End is Near

LAST JUDGEMENT

43's Legacy (collected by the Guardian)

Paul Auster
Author
I'm hard-pressed to think of a single thing the Bush administration has done to promote the arts. Things have gone on as before: novelists are writing books, people read them or don't read them, movies are being made and people go or don't go, artists are painting pictures, people are making music. I don't see that the Bush people have affected the cultural landscape that much.

These past eight years have been about the worst that I can imagine. For the first time as a writer I've addressed, here and there, the situation that we're living through. I'd never done that before and I guess because I've been so alarmed, so distraught, the pressure of this unhappiness has spilled over into my work at times.

If McCain wins, I feel like going into a cellar for the next four years or going out in the streets every day and screaming. Obama, if he does win, is going to have so many problems to deal with that the most one could hope for would be to undo some of the damage. Most artists seem to be for Obama. In fact, I haven't met a single one who is for McCain, so our spirits would be lifted. The problems in the country will remain as serious as ever.

Art isn't journalism. Some of the greatest historical novels were written long after the events discussed in the book. You think of War and Peace, written in 1870 about things that happened in 1812. I think there's this confusion in the minds of the public that artists are supposed to respond immediately to things that are going on. We've been living through a new era. Everyone knows the world has changed, but exactly where the story is taking us is unclear right now and until it plays out further I don't know if anyone has a clear vision of what's happening.

Joyce Carol Oates
Writer
The "cultural legacy of George W Bush" would seem to be the punchline of a cruel joke, if there could be anything remotely funny about the Bush administration. (There isn't: nothing funny, and nothing of a legacy.) But the National book festival, hosted each September by former librarian Laura Bush on the Mall in Washington is a truly valuable cultural event, which we can hope that the wife of George W's successor will wish to continue.

The cultural life of America is a thing quite apart from the federal government. It can flourish, as in the Johnson-Nixon eras, as counter-culture; in times of political debasement, art can be idealistic and ahistorical. Most artists live through a sequence of administrations, and their art evolves in ways too individual to be related to larger, generic forces.

The cultural life of America would be relatively unchanged if McCain wins, since he is a variant of Bush. If Obama wins, very likely there would be an efflorescence of a kind, perhaps most evident in the more public artforms - dance, music, theatre.

We can hope!

Gore Vidal
Writer
Although all politicians tell lies, Bush has gone right round the bend as a liar and he'll be remembered for a great many of the lies, starting with weapons of mass destruction and going on and on. That's the only legacy. Oliver Stone, I gather, is doing father-and-son stories. I'm very fond of Oliver, but you don't need Freud when you're dealing with Caligula.

One of the problems is that journalists think this is just a familiar phenomenon, this administration. It isn't - there isn't anything like it. It doesn't bode well for anything at all - it's just there. Culture goes on. People go on writing novels even though the general public doesn't want to read them. I think the plucky few will continue and to predict what the next wrinkle will be is not very useful.

We have a president who cannot read. He's dyslexic, as was his father before him. It must have an effect. I watch a good deal of television because of the elections. The professional television people, all of them graduates of our finest universities, can't use proper English. We are losing the language, I suppose.

Art is always needed in a country that doesn't much like it. Performance is all anybody cares about.

Paul Miller
AKA DJ Spooky
Under Reagan and Thatcher you had the rise of an internationalist view of music, especially with punk, reggae, and dub. The Bush administration has left a legacy of numbness - what do you rebel against when, essentially, the establishment just doesn't care what you think?

Usually when you have a rightwing lunatic such as Nixon, or more cynical regimes such as Reagan or Bush I's administration, there's a counterpoint. What ended up happening with Bush II is that the counter-culture response became incoherent.

The "culture-entertainment" industry is different now. They realise that the idea of rebellion can be made into an echo-chamber and sold back to you. We have rebellion on the radio, songs that are anti-establishment, "mavericks" for president, but on the other hand you have the Dixie Chicks and the reaction when they said stuff against the war. You can see that there's still a tremendous reactionary culture in the US.

The meta-narrative is that anything goes: Britney Spears giving herself a haircut or the "hyper-realism" of the execution of Saddam Hussein spreading like video wildfire on people's cellphones. It's incoherence - montaged and edited a la Oliver Stone. Is the president a mirror that speaks to you? I don't think so.

Elizabeth LeCompte
Theatre director
He has fostered the rise of political satire as an art form again. It hasn't been very strong for the last 30 years or so and I think television programmes such as The Daily Show, The Colbert Report and South Park are all political works of art. Without the Bush administration I don't think satire would have been as strong. It revived irony.

Theatre in America is in decline, however. A lot of the people who would have been writing for the theatre 100 years ago are now writing in television. In America, all art is denigrated, basically, with the possible exception of music. Written and spoken arts aren't taken seriously here, and I don't think they've ever been.

People are starting to view politics as entertainment much more. That's why the number of people voting is up. YouTube has made politicians entertainers. With satire there's an incredibly powerful challenging of the powers that be, which I think is very healthy. There's also a trivialising effect at the same time. But it is a change, because young people are going to be involved in politics in a way that they haven't been before.

When Obama had trouble, before he beat Hillary, they began to make fun of him as a pompous teacher, so let's see. I think it'll be interesting. I just know that for me, under the Bush administration, things like The Daily Show and South Park will be remembered as real satire, not just parody and caricature.

Edward Albee
Playwright
What cultural legacy? There is no cultural legacy. We have an administration of criminality, complicity and incompetence but no cultural legacy whatever from those eight years. It doesn't seem to have produced the kind of rage that I would have expected it to. It shows me that we have a far more passive and ignorant society than I thought we had.

The only value the arts have is commercial. I have found over the past eight years that commerce has taken over the arts in the United States.

I don't think that the Republican administration could have gotten away with everything that it did had it not had a complacent and compliant society. That troubles me a lot. It tells me sad things about the United States now. The only art that is allowed any great exposure is commercial art that is not going to rock the boat.

I always have hope. Somebody asked Beckett once why he writes if he's such a pessimist. He said, "If I were a pessimist I wouldn't write." I'm something of an optimist. I hope that we're capable of getting back on the right track and continuing our peaceful social revolution.

Alex Gibney
Film director
I think the Bush administration did its best to create a vast wasteland. At the same time, because of the perfidy and corruption and utter lawlessness it created a very interesting backlash of politically oriented materials that were inspiring. Unintentionally, the administration provoked a lot of political art that I think was very valuable.

It contributed to an extraordinary flowering of political documentaries - and not necessarily pure anti-Bush ones. The administration provoked a thoughtfulness, both in aesthetic terms and in terms of political thinking, that expressed itself in documentaries in a very exciting way. Iraq in Fragments, for instance, was a beautiful film - not overtly political but political in a deeper sense.

I was associated with a global series called Why Democracy? in which film-makers from all over the world looked at democracy at a time when Bush was trying to "make the world safe for democracy" - or to pulverise the world to accept his version of it. They were very interesting, perceptive and valuable. So the rise of new documentary in the age of Bush has been a great contribution, if unintended.

I think under a McCain administration you'd see a renewed sense of opposition. The cultural world may get even wilder. Under an Obama administration it'll be interesting to see what happens. My fear about Obama is that he's motivated to sweep a lot of stuff under the rug, about his own administration and its ties to very high-powered financial interests. I'm getting ready to start looking at stuff and holding people to account.

But I was joking with a friend of mine, Eugene Jarecki, who made Why We Fight, about how we'd better be honing our skills as the directors of romantic comedies, because without the Bush administration, what are political film-makers gonna do? We'll all be out of work. So please, McCain, win!

Lionel Shriver
Author
As Oliver Stone's film illustrates, W has been a great inspiration, a beacon on the hill, if you will, for artists in every field. Although perhaps not the kind of inspiration that the president would have wished.

Among many other works, Michael Moore's mocking Fahrenheit 9/11, Ian McEwan's Saturday, Brian Haw's Iraq-protest-turned-Turner-prize-winner-turned-West-End-play (The State We're In), and David Hare's Stuff Happens all have a notoriously incompetent American president to thank for their success. Bush's inability to put a sentence together without repeating the same word five times, and his chronic mispronunciation "nucular", have provided a feast for comics all over the world.

So the Bush years have been great for the arts, restoring a collusive, adversarial climate last seen circa 1968. Hate figures are far more motivating than heroes, and W has graciously provided the collectively leftwing artistic community an embarrassment of riches. In fact, the biggest problem with the Bush era's artistic legacy is that this widely despised president has tended to inspire polemics and agitprop. Many novels, films, plays, and artworks from the last eight years have been spitting with indignation, painfully obvious in their political intent, sledgehammer subtle in their execution, and clubby - since most of these works are preaching to the converted. Thus W may have bequeathed a whack of subject matter, but whether any of this stuff will be of enduring value is open to question. You have to ask yourself whether the diatribes denouncing Bush in a novel, such as JM Coetzee's Diary of a Bad Year (a book trying enough when it was published in 2007), will hold the faintest interest after January 2009.

And here's the really bad news: Obama could be terrible for the arts. Why, when there's barely an artist in the States who doesn't support him? Art thrives on resistance. There's nothing more arid, more enervating, more stultifying, or more utterly uninspiring than getting your way.

Trisha Brown
Choreographer
I was given a list of people in Congress who might be open to talking about the arts. One senator asked me, "Is this like that woman who does dance?", then turned to an assistant and said, "Who was that person?" He meant Martha Graham. The discussion was not bitter whatsoever, but it was frightening because I learned that these people are not thinking at all about the arts.

One Congressman said to me, "Well, is Joe Six-Pack gonna really be interested in this?" I said, well, we certainly have a country brim-full of great artists and maybe Joe Six-Pack has a couple of kids in the house and they might be interested in music, or painting, or dance. That was the most I could move him without getting into combat.

I was lobbying, trying to bring them information. The other person who's memorable from that excursion said, "Do you know who I think is the greatest artist?" so I said, "No, I don't know" and he said, "God". I went home like a whipped puppy. I saw that there was no thought about it. My heart was broken.

I thought young artists would go to bat sooner. There are some people who are dedicated to responding to political issues and many who are not. If you're a painter or a sculptor there's money. If you're in dance, there's very little for independent artists. It's very discouraging. I was so in love with art-making - but I'm tired of the suppression of arts and I've shifted into other disciplines to find vitality and exchange.

David Simon
TV writer/producer
Enron, Afghanistan, Iraq, New Orleans, Wall Street. An untenable drug war. A non-existent energy policy. An obliviousness to climate change. An unwillingness to recognise our problems, much less begin the hard work of solving them. Incompetence - rank incompetence - has become the American standard. We are no longer a competent, responsible nation-state. America. The can't-do superpower. Quite a legacy. Mr Bush is a remarkable man.

Naomi Wolf
Author
Bush's cultural legacy? It's disturbing that my initial response is to draw a blank. But it's a sign of the fact that the past eight years have simply pushed the arts to an underground place. It's not just that [Bush] didn't fund the arts or invite artists to the White House; it's not just that he doesn't read poetry, doesn't read books: there's something about the brute force of this administration, and the fetishisation of brute force by this administration, which literally stands in opposition to civilisation and the arts.

I've done a lot of work on Germany from the Weimar period to the late 30s. There was a similar hostility then to the cosmopolitan, the urbanite, the avant garde, to any originality in art. Some of the most interesting visual artists we've seen in recent times, for example, were working behind the iron curtain, and of course, they had to work allegorically.

Much of the protest work I've seen [in America] has been very bad, pedantic, heavy-handed. I've seen so many bad monologues about the Iraq war, so many dreadful photo-montages. I think it's because Americans don't quite understand repression yet. They're not yet understanding the nature of the force that has come down on them, by drawing on their subconscious, by expressing themselves in an allegorical way. And I think artists in America are scared. Respected journalists are being arrested. Film documenting the Republican national convention has been destroyed. And artists are next on the list after journalists. So if, God forbid, there's a McCain/Palin presidency, we'll see a crackdown of the police state, there's no doubt.

I'm really quite ashamed of the American people - and of course I include myself in this. We have seen what was happening, and we kept right on internet shopping. All these writers and artists, good people, have just looked around and quietly aligned themselves. Novelists have been really silent. Usually writers are at the forefront of denouncing a regime: look at Václav Havel. Here, people have complained a lot, but in terms of organising a vanguard of resistance, of people getting out there and saying this is not the American way ... Where is the Arthur Miller of this generation? Who is out front, somewhere visible and tricky and scary?

• The End of America, a documentary based on Naomi Wolf's book of the same name, has its UK premiere on November 7 at the Sheffield Doc/Fest

Daniel Libeskind
Architect
How can you even begin to speak of a cultural legacy? It's been wholly negative. Culture's a dirty word to these people, like "liberal" or "literate". We've experienced a complete bankruptcy of the culture of ideas over the past eight years. The intellect has been denigrated. Deep cuts have been made in education and in investment in cultural institutions.

At Ground Zero, we're not sure if the performing arts centre planned will ever happen. This was a key part of the masterplan, but all that's mattered in the World of Bush is the workings, and failures, of the market economy. So, Ground Zero could yet end up, unless we get a sympathetic new president, as a purely commercial venture, with a memorial, rather than as a springboard for American culture.

It's hard to believe Bush, a man who's proud not to read books and who makes fun of words longer than one syllable, has been the inheritor of the mantle of the Founding Fathers, or of Woodrow Wilson, FDR or even Bill Clinton. These people believed in the value of American culture being seen as an inspiring and civilising force around the world. Jefferson was a fine architect. All Bush has offered the world is military force. This is still a great country, but Bush and Cheney have ensured that only the negative side of US culture has spread around the world.

• Interviews by Andrew Purcell, Jon Henley and Jonathan Glancey

Mr. Obama

Baratunde Thurston: Obama has tapped into hope – and triggered a backlash of fear


We are almost there. The mood here in the USA is tense. I am excited for this election, but with only a few days left, I am even more excited for it to be over. We Obama supporters are exhilarated yet exhausted, confident but cautious. Mostly we cannot stand the suspense.


This campaign has been going on for nearly two years, and "change" has already come to America. In that time, I have changed jobs, moved homes and upgraded my marital status. For the country, all the complicated debt financing our consumer shopping spree seems to have come due at once, and Britney Spears, well, no one even cares what she's up to anymore.

Since January, I have been consumed: cable news, YouTube, blogs, door-to-door voter contact, conventions, debates, conference calls, rallies, interviews, meetings, music videos and more. Like our dependence on fossil fuels, this pace is unsustainable. I have lost Facebook friends over this election. Facebook friends! Enough!

At least I get to play a direct role in the process through donations and actual voting. I can only imagine what people overseas must feel like having no say whatsoever in a process that could mean the difference between receiving shipments of humanitarian aid and shipments of bombs.

We are almost there.

The polling is clear. Obama has a steady national lead. More importantly, he is leading in the handful of states whose electoral votes actually matter, including four states that voted for George W. Bush in 2000 and 2004. John McCain had hoped to make California competitive. Instead he finds himself ahead by only two points in his home state of Arizona, forced to commit resources and to start running deceptive automated phone calls in a desperate attempt to prevent an embarrassing loss.

Assuming he wins, what can we expect from Barack Obama? Can he unite the country as he's promised? He will have his work cut out. The divisions in America run deep racially and politically, but there can and will be progress, if not total resolution.

A friend and fellow comedian who produces a video series called "This Week In Blackness" spoke with me about the oft-repeated concept of a "post-racial" America. The notion is that we will have won the War On Racism by electing Obama and once and for all healed America's racial divide. Claims of employment discrimination, systematic imprisonment and economic segregation could be met with, "But you have a black president." The country could finally move on to more pressing matters, like selecting the next flavour of Coca Cola.

This simple resolution will not happen. In fact, Obama's mere candidacy (and the reaction of his opponents to it) has exacerbated that racial divide in small but poignant ways. Obama has tapped into hope, but he has also triggered a backlash of fear from the more ignorant realms of our society.

News organisations have scrambled to display their understanding of blackness but often showcase massive ignorance instead. Fox News refers to Michelle as Obama's "baby mama". CNN tries to summarise all of Black America in a two-part series almost exclusively highlighting pain, struggle and failures. Most outlets fail to understand that Obama is not here to represent "blackness" to begin with but rather "Americanness."

Many of us have seen the ugliness flow forth from Republican rallies. Looking like images ripped from history books, crowds that seem more like mobs scream "terrorist" or "traitor" or "kill him". All that's missing are the police, fire hoses, attack dogs and bell bottoms. Post-rally interviews turn up more foolishness: Obama is a "monkey"; Obama is a "Muslim"; Obama will "let the blacks take over".

Of course, we have heard of the plots by extremist white supremacist groups to assassinate Obama.

Yet the ignorance of the media, the vocal minority at some rallies and the plots of extremists do not define the entire country.

We will know from the exit polling data in a few days, but it appears that Americans think they are more racist than they actually are. Despite much hype, the so-called Bradley Effect, in which white voters lie to pollsters about their willingness to vote for black candidates, has not emerged, while something more interesting has.

When asked if they would vote for a black presidential candidate, approximately 95 per cent of white Americans said yes. When asked if they thought their neighbours would, only 75 per cent said yes. It's the equivalent of the nation collectively saying, "Oh, I'm not racist, but they are."

We lack faith in one another, and restoration of that faith is something that Barack Obama is ideally positioned to do. His temperament and his biography speak loudly to black children and rural white factory workers alike in a common language that few others can. There is hope that, although he cannot successfully resolve the race question with his mere election, Barack Obama may be just the president to move the conversation further in the right direction.

Should he be elected, another difficult and unenviable job for Obama will be to unite the increasingly polarised left and right political wings.

This election has seen a hardening of lines. Both sides are massively afraid of victory by the other. Sarah Palin has terrified the Left. Conservatives quake at the notion of a federal government controlled by Democrats. How do you come together with people who think you're a secret communist or people you fear will sabotage the courts with religious extremists?

More than any other candidate in many decades, Obama would be the president most likely to lead both factions. He has garnered support from the most left-leaning environmentalists to the most conservative elected officials. He speaks in a language of rights and responsibilities. When faced with challenges, he appeals to our shared hopes rather than our divided fears.

In addition to being exceptionally well-produced – I think he should campaign for an Oscar – his 30-minute television special on Wednesday night gave us a preview of the type of mature dialogue he would be willing to have with the nation. By example, he refuted the most extreme of his critics, displaying a family man in touch with the concerns and aspirations of average citizens.

So we have come a long way both historically and in this election. We are so close. The question isn't so much whether Barack Obama can close the deal. It is, can America? I believe we can. Next Tuesday, we'll find out.

The writer is a New York-based editor at 'The Onion' and co-founder of the black political blog Jack and Jill Politics
V.S. Naipaul, that clever and often wise man, once laid down: ‘One always writes comedy at the moment of deepest hysteria.’ Well, where’s the comedy now? There is certainly plenty of hysteria. Old Theodore Roosevelt used to say: ‘Men are seldom more unreasonable than when they lose their money. They do not seek to apportion blame by any rational process but, like a wounded snake, strike out against what is most prominent in their line of vision.’ I notice that the OED, as a rule politically correct, thinks hysteria is chiefly female: ‘Women being much more liable than men to this disorder, it was originally thought to be due to a disturbance of the uterus... Former names for the disease were vapours and hysteric passion.’ Women certainly laugh more than men, more frequently too, a form of anti-hysteria therapy Nancy Mitford called ‘shrieks’.

We first hear of it in Chapter 18 of the Book of Genesis, one of my favourite biblical scenes, taking place outside and within Abraham’s tent. John Frederick Lewis, so good at tents, ought to have painted it. Angels, one of whom turns out to be God, visit the patriarch, and God tells him that Sarah, his elderly wife, will conceive and bear him an heir. Sarah, within and rustling up a meal for the visitors, overhears this. She ‘laughed within herself, saying, “After I am waxed old, shall I have pleasure, my lord being old also?”’ God overheard this laugh, not so much a shriek, more a snort, and resented it. He recognised it was not a joyous laugh, more sardonic, cynical and sceptical. It seemed to doubt his powers to order babies, and he angrily asked aloud: ‘Is anything too hard for the Lord?’ Sarah denied her snort: ‘“I laughed not,” for she was afraid. And he said: “Nay; but thou didst laugh.”’ This is the first recorded laugh in history, and the scene is so vivid it makes one believe in the literal truth of the Old Testament, or at least in the imaginative talents of those ancient Hebrews. Also, it is interesting to note that the first joke arose out of what can only be called the sex war: God is a tremendously male figure in the OT. It may be, indeed, that Sarah’s laugh reflected resentment that God had not given her a child before.

That would square with the view shared by most philosophers and analysts that laughter contains an element of aggression. Hobbes thought it was vainglorious. He wrote in Leviathan (1651): ‘The passion of laughter is nothing else but sudden glory arising from a sudden conception of some eminency in ourselves by comparison with the infirmity of others.’ Henri Bergson, the French oeuftête who wrote a famous essay on the subject, insisted: ‘In laughter we always find an unavowed intention to humiliate and so to correct our neighbour.’ That is why, for instance, the otherwise genial Charles Lamb, playing whist with his much-loved but scruffy young friend Martin Burney, said: ‘If dirt were trumps, what hands you would hold!’ Max Beerbohm added: ‘There are two elements in the public’s humour: delight in suffering, contempt for the unfamiliar.’ Arthur Koestler, in the best short essay on laughing I know, printed in the latest edition of the Encyclopedia Britannica (under ‘humour and wit’) speaks of the laugh as a ‘trigger-releaser’, letting out ‘vast amounts of stored emotions, derived from various often unconscious sources: repressed sadism, sexual tumescence, unavowed fear, even boredom’.

The German-speaking world has wormed its way into the problem of laughter with deadly seriousness, as one would expect. Kant pondered deeply on the topic and came up with this definition: ‘A laugh is the sudden transformation of a tense expectation into nothing.’ Freud took up the view of Herbert Spencer that laughter was a case of emotions being translated into bodily movements; Freud argued that, since emotions were repressed, the tension was relieved when the muscles of the lips followed the line of least resistance in a smile, so ‘laughter is a form of respiratory gymnastics’.

Odd that German philosophers should go into the matter so earnestly (Hegel, Schopenhauer and Wittgenstein all had things to say about it), when humour is not the first thing you think of when contemplating the Teutonic intellect, massive though it may be. (I am told that all jokes have to be explained to Angela Merkel, the German Chancellor.) Some important Germans are recorded as passing through life without laughing: Frederick Barbarossa, for example, and Schiller’s aunt. Von Moltke, the great 19th-century military strategist, only laughed twice in his life: once when told a certain French fortress was impregnable, and once when his mother-in-law died. Martin Heidegger, whom many believe to be the greatest philosopher of the 20th century (he is impenetrable, I find, though I once understood his main point for about half an hour, until my mind crumbled), laughed only once. That was when, on a picnic in the Harz mountains, Ernst Jünger bent over to pick up a sauerkraut roll, and split his lederhosen with a tremendous crack. But after ‘a fierce shout of mirth’, Heidegger checked himself, and ‘his expression reverted to its habitual ferocity’.

Germans of course are funniest when unaware they are provoking mirth. Paddy Leigh Fermor, that grand repository of recondite anecdotes, told me that Colonel-Count Von Rausching, the officer commanding the famous Prussian cavalry regiment, the Death’s Head Hussars, became worried, about the year 1900, by the way his subalterns were laughing. He summoned a meeting in the mess and told them: ‘There is a right and a wrong way to laugh. I do not want my young officers sniggering, or tittering, or yelping or guffawing, like tradesmen or Jews or Poles. There is only one admissible way for a military gentleman to laugh, and that is by a short, sharp ejaculation. Thus: "Ha!" Understand? Now, let us see if you can do it properly. Ready? One, two, three, Ha! One, two, three, Ha! Very gut! Now once more: Ha! And again, Ha! So now you know how to laugh. Never let me hear any of you snigger again.’

What about the English, then? William Cory, the famous (or notorious, he was sacked for pederasty) Eton master, used to say: ‘If two or three Englishmen are together any length of time and do not laugh, something has gone wrong.’ Of course he lived before New Labour. Indeed his maxim does not always apply to politicians generally. Take the case of Gladstone. As Margot Asquith put it: ‘Mr Gladstone was not exactly lacking in a sense of humour. But he was not often in the mood to be amused.’ This particularly applied if Disraeli were ever in the vicinity. However they were once observed, and heard, laughing together behind the Speaker’s chair. What happened was this. Disraeli knew that Gladstone and Browning greatly admired each other. He had been reading a volume of Browning’s collected poems, and had spotted a curious error towards the end of ‘Pippa Passes’. Browning was a very learned man, for a poet, so learned that he did not often consult others to correct any mistakes. He thought the word ‘twat’ was an item of a nun’s vestments, instead of a vulgar word for the female pudenda. So towards the end of the poem, Disraeli found, were the lines

Then, owls and bats,

Cowls and twats,

Monks and nuns …

Disraeli pointed this out to the G.O.M., hoping to stump him. Instead, Gladstone laughed uproariously. Afterwards, he was asked: ‘What was the joke?’ But Gladstone would not tell. The moment had passed, and he never laughed with Disraeli again.
At this juncture, my best credit-crunch advice is to keep beside your armchair at all times an atlas of the world, a modern American dictionary and a bottle of whisky. If your constitution is strong, you might also want a copy of the Financial Times but do keep the television zapper handy, so you can hit the ‘mute’ button when the news comes on.

You can tell from the order of the silent pictures whether markets have plunged or rallied, which is really as much as you want to know. If the first shots to appear are of Russell Brand or yachts at anchor off Corfu, it has been a relatively good day for your savings. If it’s stock footage of traders mouthing the word ‘carnage’ or forests of estate agents’ For Sale signs, you know your net worth has just taken another terrible kicking. As for the voices you’ve chosen not to listen to, most of them are just guessing and extrapolating from extreme and unprecedented short-term market events: as I’ve said here before, no one really knows where this is going.

But if you do turn the sound up, you need the dictionary to explain words like ‘deleveraging’, which means reducing debt levels for every kind of borrower, including governments, financial institutions and individuals, and is going to have to happen everywhere; and ‘decoupling’, which means disconnecting yourself from the fate of the American economy, and doesn’t seem to be happening anywhere. If you want to grasp how those two concepts interact with markets, politics and real-world economic activity, the FT and the atlas are by far your best daily guide, and the whisky your only solace.

Thus on Monday, one double-page spread offered as telling a set of headlines as the venerable pink paper can ever have published. ‘IMF outlines $16.5 billion Ukraine loan’; ‘German bail-out: Berlin believes danger persists’; ‘Kuwait guarantees deposits after setback at leading bank’; ‘Tokyo urged to boost “insufficient” bail-out’; ‘Seoul seeks action to halt fall’; ‘India’s banking chief looks to follow G7 lead’; and rather forlornly, ‘Iceland puts hope in its neighbours’.

What began as the American subprime mortgage crisis is now the biggest global financial crisis of our lifetimes and probably of all time. Six weeks ago, we were being invited to feel sorry for sacked Lehman Brothers staff carrying their belongings out of that shiny Canary Wharf tower in cardboard boxes. Now we see just how far afield the damage is spreading; how many hard-working people, most of them much poorer than ourselves, are going to get hurt. And we begin to sense how chaos theory actually works, how the flapping of all those flamboyant butterfly wings in the financial industry has finally caused a tornado to rage across the world.

Asian and East European economies that looked immune to trouble, because they were driven by low-wage manufacturing and not yet as decadent as the West, turn out to be just as vulnerable to the downturn as the buy-to-let-plagued cities of provincial England. Any ‘emerging market’ that was hot last year is suddenly stone-cold, because foreign investors are fleeing as fast as they can, causing mayhem for local stock markets and currencies. And it turns out that many of these eager new players of the globalised game had all too swiftly learned our bad habits: their banks are dangerously undercapitalised and overexposed to real estate, while their profligate governments lack sufficient firepower for the necessary bail-outs.

Hungary, for example, cannot raise money to rescue its banks (which are in any case largely foreign-owned) by issuing government bonds, because in present conditions almost no one is willing to buy them. In Budapest there is talk of cutting public-sector pensions to help make ends meet, while the government of socialist prime minister Ferenc Gyurcsány this week became the first in the EU to ask for emergency support from the International Monetary Fund.

But Iceland and Ukraine were already at the front of the IMF queue, asking for loans far larger than the Fund’s rules would normally allow and raising questions as to whether the IMF itself, even with 185 subscribing member countries, will have enough in its coffers to answer all the calls for help. What happens if all its members, right down to the tiny Pacific island republic of Palau which is its smallest contributor, are in trouble at the same time?

That thought may be enough to make IMF chief Dominique Strauss-Kahn wish he had run off to Palau with the Hungarian-born mistress who recently got him into the headlines, but it is one that has to be considered. Only ten years ago, in the case of the collapsed US hedge fund Long-Term Capital Management which almost crippled Wall Street, we saw an example of what happens when financiers bet against the once-in-an-aeon probability that all the world’s markets would move against them simultaneously. LTCM’s Nobel laureate mathematicians calculated the risk to be infinitesimal, but it happened anyway.

It certainly feels as if it’s happening again now. Banks that looked solid last month are suddenly doubted. HSBC stood smugly aside from the great Brown bail-out, because its ratio of deposits to loans was healthy and its property exposure was relatively small, but now its shares have taken a hit because investors are worried about its exposure to emerging markets. Santander was strong enough to be invited to take over Bradford & Bingley’s branches and deposits, but analysts now worry about its Latin American exposure. Owning shares in natural-resources companies looked a relatively safe bet until the stock market figured out what global recession would do to commodity demand. Not owning shares in Volkswagen, or rather going short on them, looked good too, given the dismal prospects for the car industry, until the short-sellers got squeezed and lost billions.

So we just have to pour ourselves another whisky and wait to see what goes down next. World leaders seem to have reached a consensus that banks cannot be allowed to fail and take voters’ deposits with them: so your current account will probably be the last thing to go before Mad Max anarchy reaches your high street. But insurance companies and pension managers look vulnerable, not only as the biggest holders of bombed-out bank shares and vulnerable corporate bonds, but also because they are hugely exposed to commercial property, which is heading for a global crash any time now. Have you already planned a shop-till-you-drop trip to Westfield, Europe’s biggest retail complex which opened in Shepherd’s Bush this week? No, neither have I, and I wouldn’t want a piece of it in my pension fund.

But first it looks as if the tornado will hit the hedge-fund sector. These exclusive boutiques, so emblematic of the fast-money boom, so arrogant in their modus operandi, now face a pincer movement from investors who want their money out and banks that have withdrawn their lines of credit. Hundreds of London-based funds may be wiped out this winter. We may not feel very sorry for them, but we must watch for the unintended consequences. The scramble to unwind huge hedge-fund positions is certainly one factor in the sudden fall of the pound against the dollar, and the rise of the yen which is so damaging to Japan’s export industries.

I say again, we just don’t know where this is going. In Mayfair, with despair in Hedge Fund Alley will come deep recession for Michelin-starred restaurateurs. In emerging markets, governments may fall and democracy itself may be dispensed with as swiftly as Gordon Brown’s public borrowing rules. As for daily life in Britain, we hear that the Home Office has already briefed the Prime Minister about the threat of a rise in theft and violence as the recession bites. Who knows how a generation wholly devoted to grasping materialism will behave as abundance turns to scarcity?

So the other thing you might want to keep beside your chair this winter, just for reassurance, is a shotgun. In the winter of 1973, amid a market collapse and a miners’ strike, the diarist James Lees-Milne recorded a conversation with Sir Sacheverell Sitwell, the rarified art critic: ‘Influential City friend warned him that “we” had only three months to clear out of England. Another told him to hoard his cartridges, for there would be shooting within that time.’

Of course it never came to that, and perish the thought that it might do so this time. As for clearing out of England, well — looking at the atlas and the FT — where would we go? What’s the weather like in Palau?

30.10.08

Ezra Pound & John Adams

It's the birthday of the poet and critic Ezra Pound, born in 1885 in Hailey, Idaho. Pound is famous for championing the Modernist movement, and he did this by celebrating and encouraging other writers like W.B. Yeats, Robert Frost, William Carlos Williams, Marianne Moore, H. D., James Joyce, Ernest Hemingway, and T.S. Eliot. He is most famous for editing T.S. Eliot's huge poem The Waste Land and eventually cutting out half of it.
Ezra Pound said, "Man reading should be man intensely alive. The book should be a ball of light in one's hand."


It's the birthday of the second president of the United States, John Adams, born in Quincy, Massachusetts, in 1735. He was a lawyer, a writer, and a philosopher, famous for the articles he wrote in opposition to the British Stamp Act. But even though John Adams supported the American patriot cause, he agreed to defend the British soldiers who killed civilians during the Boston Massacre, and he managed to get most of them acquitted. He did it because the soldiers couldn't find another lawyer and he felt that it was his duty to humanity.
He represented Massachusetts at the Continental Congress. He served on the committee to draft the Declaration of Independence, and even though Thomas Jefferson wrote most of it, John Adams edited it, and he defended it to the rest of the Congress and helped get it passed.
Adams was vice president under George Washington, but he didn't like it much. In 1796, he was elected the second president of the United States. But his party, the Federalist Party, ended up divided, and the next time around he lost to Jefferson. Eventually, the two Founding Fathers made up, and they began a long correspondence, more than 150 letters.
On July 4, 1826, the 50th anniversary of the adoption of the Declaration of Independence, Adams and Jefferson died on the same day, in two different places.
John Adams said, "Democracy never lasts long. It soon wastes, exhausts, and murders itself. There was never a democracy that did not commit suicide."

29.10.08

After America

In 1946, George Kennan, then the deputy head of the US mission in Moscow, sent a 5300-word telegram to Washington, hoping to alert his superiors to the threat of Soviet expansionism. Kennan had complained repeatedly and fruitlessly about what he saw as America’s indulgent attitude towards the Soviet Union, but for a crucial moment in 1946 his idea that the US should strike an alliance with Western Europe in order to contain Soviet Communism found listeners in Washington. The so-called Long Telegram, subsequently turned into an article in Foreign Affairs, became the basis of the Truman Doctrine, which proclaimed America’s willingness to fight the spread of Communism, militarily as well as economically.

Kennan would later complain that he had never advocated making force such an important aspect of American policy. The logic of military containment entrapped the US in Vietnam, and would disgrace friends and colleagues who had eagerly taken over new international responsibilities from the exhausted European empires after the Second World War. Kennan lost his influence inside the Beltway in the mid-1950s, after he began exhorting Americans to pursue ‘self-perfection’ and ‘spiritual distinction’ instead of exporting freedom and democracy to the rest of the world. But for the innumerable think-tank experts and ambitious academics and columnists who long to leave a mark on history, Kennan’s telegram remains the model: a set of policy prescriptions perfectly and powerfully in tune with the zeitgeist.

Kennan died in 2005 at the age of 101: he had lived to see the emergence of a whole industry of geopolitical speculation – foundations, research institutes, area studies programmes – intended to service the military-industrial complex. He and other civil servants of his generation, nurtured at Yale, Princeton, Harvard, Wall Street and other playgrounds of the Wasp elite, took badly paid jobs in government out of a spirit of noblesse oblige. In the 1960s, however, they found themselves pushed aside by the ‘professional elite’: people, often from Jewish, Irish, Italian or mixed ethnic backgrounds, who weren’t born into power and money and had little experience in business or government. Like careerists everywhere, these professionals with degrees in international relations or history tended to logroll. Much of their work involved legitimising their own employment. For decades they routinely exaggerated the Soviet Union’s military and economic capabilities, and the threat from Communism. Even in the mid-1980s few of them noticed that the Soviet Union was near collapse; in 1991, many rushed to hail the new ‘unipolar’ world where America was the ‘indispensable nation’. Fervently promoting free markets in Russia, they didn’t anticipate its descent into gangster capitalism or its vulnerability to authoritarianism. Many of them are still awaiting the arrival in China of the liberal democracy which they believe inevitably accompanies capitalism.

The years since 9/11 have been particularly confusing for policy intellectuals. ‘America’s dominance,’ Fareed Zakaria, the former managing editor of Foreign Affairs, asserted in the New Yorker in 2003, ‘now seems self-evident.’ Reprinting large parts of this article in his new book, The Post-American World, Zakaria adds: ‘That was then. America remains the global superpower today, but it is an enfeebled one.’

Policy intellectuals looking for the next big paradigm that will transform policy-making – and their own careers – have suddenly realised that they can’t avoid the prospect of American decline, something that was unthinkable five years ago, when both America and globalisation seemed unstoppable, and the war in Iraq was a brisk investment for the future. The US has many times more tanks, fighter jets, missiles and warships than any other country, but rag-tag armies of insurgents in Iraq and Afghanistan still defy its military authority. The American economy, which for years has depended on Asian willingness to finance US deficits, now needs cash from China, Singapore and Abu Dhabi to prop up some of its most revered financial institutions.

‘Who Shrunk the Superpower?’ a cover story in the New York Times magazine asked earlier this year. According to its author, Parag Khanna, ‘America’s unipolar moment has inspired diplomatic and financial counter-movements to block American bullying and construct an alternate world order.’ ‘America has lost its momentum, and it cannot turn things around simply because it wants to,’ he writes in his new book, The Second World: Europe and China have not only emerged from America’s ‘regional security umbrellas’ to become superpowers in their own right, but they too ‘now use their military, economic and political power to build spheres of influence around the world, competing to mediate conflicts, shape markets and spread customs’. Europe and China are challenging American hegemony in what Khanna calls the ‘second world’, a broad grouping that includes Kazakhstan and Libya as well as India and Brazil. ‘America’s false assumptions of dominance,’ he writes, ‘are laid bare in every second-world region: the EU can stabilise its East, the Chinese-led SCO can organise Central Asia, South America can reject the United States, Arab states can refuse American hegemony, and China cannot be contained in East Asia by military means alone.’

And Russia, Khanna might have added, can do whatever it wants in the Caucasus. Georgia is among the dozens of countries Khanna visited while writing his book, and though he mocks American support for Saakashvili’s ‘sham democracy’, he puts too much faith in the EU’s ability to stabilise the Caucasus and underestimates Putin’s keenness to assert Russian influence. His assessment of American debility, however, seems broadly right, even if it is far from the mainstream view among the Anglo-American commentariat. Robert Kagan, for example, remains bullishly confident of America’s supremacy in his new (and ruefully titled) book, The Return of History and the End of Dreams.[*] He believes that the US should assume ‘leadership of a united democratic bloc’ against the authoritarian powers of China and Russia and has found an influential reader in John McCain, an old-style promoter of American toughness. But it’s now quite hard to imagine Hu Jintao and Vladimir Putin, not to mention Osama bin Laden and other ‘enemies of the Free World’, quivering at the thought of a ‘concert of democracies’. Kagan’s resurrection of this tired notion shows that the idea of American over-reach and decline remains incomprehensible to American elites, or too painful for them to accept.

Zakaria, Khanna’s Indian-American compatriot and probably the most admired foreign affairs pundit in America today, has written a shrewder tract: ‘not about the decline of America’, as he writes on the first page, ‘but rather about the rise of everyone else’. He knows that the military superiority of the US can’t make up for its poor economic, financial, industrial, political and cultural performance. America’s dominance, he explains, ‘was possible only in a world in which the truly large countries were mired in poverty, unable or unwilling to adopt policies that made them grow’. Now ‘the natives have gotten good at capitalism.’

Zakaria came to America from Bombay as a student in 1982, when, he writes, Ronald Reagan was the embodiment of ‘a strikingly open and expansive country’. He is wary of the growing American backlash against immigration and free trade: ‘Just as the world is opening up, America is closing down.’ Muslims are increasingly dazzled by the shopping malls of Dubai, while Americans remain unhealthily obsessed with Islamist terror. ‘The ideological watchdogs have spent so much time with the documents of jihad that they have lost sight of actual Muslim societies.’

Zakaria himself, meanwhile, believes that the US political system, ‘captured by money, special interests, a sensationalist media and ideological attack groups’, is ‘dysfunctional’. Like the well-travelled Khanna, he is exasperated by jingoistic politicians such as Mitt Romney, who wants to double the size of Guantánamo. But where Khanna points to high income inequality, corporate fraud, anti-immigrant hysteria and an obsession with guns and incarceration as symptomatic of America’s insuperable self-delusion and irrevocable decline, Zakaria is all smooth reassurance: the American economy remains dynamic, and the US is as competitive and technologically innovative as any European country. The politicians in Washington may be know-nothings, but the country’s major research universities are still the best, attracting talent from all over the world. Besides, ‘the rise of the rest is a consequence of American ideas and actions.’

It follows that countries such as India, Brazil and China want to become ‘responsible stakeholders’ in a US-dominated international system. Zakaria argues that the US can accommodate these countries by offering them membership of clubs like the G8. China may remain a prickly ‘challenger’ but with the right kinds of inducement the democratic giant next door can be turned into an ‘ally’. Zakaria believes that the nuclear agreement the Bush administration offered New Delhi, which aroused fierce opposition from both left and right-wing parties in India, ‘will alter the strategic landscape, bringing India firmly and irrevocably onto the global stage as a major player’.

Zakaria first came to prominence with a 7000-word article called ‘The Politics of Rage: Why Do They Hate Us?’, published in Newsweek a few weeks after 9/11. Something of the glamour of Kennan’s Long Telegram now attaches to this article: New York magazine described it as ‘a defining piece on the meaning of the terror attacks’ in a profile studded with praise from Henry Kissinger which also proposed Zakaria as America’s first Muslim secretary of state. Tina Brown called him ‘New York’s hot brainiac of choice’. Zakaria’s article appeared during the moment of primitive fury that overcame even ‘liberal’ commentators. Amid the clamour for retribution, Zakaria sounded calm and judicious. Read now, however, his article seems notable mostly for its evasions: he was careful not to say anything that might get him stigmatised as a radical. Blaming the Arabs for their failure to modernise, he didn’t mention the American obsession with energy security, which has shaped the politics of the Middle East for more than half a century. He found space in his paragraph on Iran to mock ‘fashionable’ supporters of the Islamist upsurge in London and Paris, but didn’t bring up the Anglo-American coup against Mossadegh in 1953 or the American mollycoddling of the shah. He wrote about the collaboration between the Pakistani dictator Zia-ul-Haq and the Saudi Islamists, but left out the middleman in the affair, the CIA.

The threat of terrorism, he asserted, had given America ‘a chance to reorder the international system’. Here he seemed at one with the neocon hawks circling over Washington, who saw a similar opportunity in 9/11. And yet he was unable to shed his awareness – the result, perhaps, of a childhood spent in non-aligned India during the Cold War – of the way the United States is perceived in the wider world.

The United States dominates the world in a way that inevitably arouses envy or anger or opposition. That comes with the power, but we still need to get things done. If we can mask our power in – sorry, work with – institutions like the United Nations Security Council, US might will be easier for much of the world to bear. Bush’s father understood this, which is why he ensured that the United Nations sanctioned the Gulf War. The point here is to succeed, and international legitimacy can help us do that.

Previous administrations had indeed succeeded in securing international legitimacy for their military interventions through the UN, especially after Clinton replaced Boutros Boutros-Ghali with the more pliant Kofi Annan. The Clinton administration also made effective use of the World Bank, the IMF and the WTO to impose the ‘Washington Consensus’: a regime of radical economic restructuring and financial deregulation that helped American companies further globalise their investment and trade, boosting large pro-business and apparently transnational elites, even in countries like India and China which were traditionally distrustful of the US. Condoleezza Rice seemed to acknowledge Clinton’s success in an article in Foreign Affairs in 2000. ‘America,’ she wrote, ‘can exercise power without arrogance and pursue its interests without hectoring and bluster. When it does so in concert with those who share its core values, the world becomes more prosperous, democratic and peaceful.’

But it was clear soon after 9/11 that the Bush administration, far from masking American power in – sorry, working with – high-minded UN resolutions, believed that the awesome demonstration of American military muscle would intimidate present and potential enemies everywhere. The administration had its own intellectual cheerleaders and experts on the Middle East: Bernard Lewis, for instance, whose pet conviction that ‘in that part of the world, nothing matters more than resolute will and force’ was validated by the swift capitulation of the Taliban. Iraq was logically the next target. As the columnist Thomas Friedman told Charlie Rose, what the Iraqis ‘needed to see was American boys and girls going house to house, from Basra to Baghdad, and basically saying: “Which part of this sentence don’t you understand? You don’t think, you know, we care about our open society, you think this bubble fantasy, we’re just gonna let it grow? Well, Suck. On. This.”’ However enamoured of American power Zakaria is – ‘properly harnessed’, he writes, it benefits both America and the world – he blanched at its crude application. Initially a supporter of the war in Iraq, he quickly became a critic, joining a group of officials from previous administrations – Richard Holbrooke, Anthony Lake, Zbigniew Brzezinski – in blaming Bush for undermining the post-World War Two system of international alliances and treaties which had institutionalised American dominance.

But the neocon ascendancy in Washington meant that calls for a more multilateralist policy went unheard. It is only now, in the last phase of the Bush presidency, with a Democratic victory in November looking likely, that the exiled liberal internationalists appear closer to regaining influence. And, although Zakaria isn’t set to become secretary of state, his book seems to be on the Democrats’ reading list: Barack Obama has been seen carrying a well-thumbed copy. The Post-American World belongs to a genre of high-class briefing material which cannot ever question certain basic assumptions about American power; only a brisk comparison between India and the US in the 19th century works up some intellectual energy. Convinced that globalisation is an irreversible success, Zakaria doesn’t stop to examine its costs: the thousands of Indian farmers driven to suicide by the vagaries of international markets or – now – the inability of even economically strong countries to protect themselves from the ‘financial innovations’ of American bankers. His book has nothing to say about the likely effect on the environment of more than two billion Indians and Chinese embracing the consumption habits of middle-class Europeans and Americans, or about the problems of uneven growth and gross inequality inherent in globalisation.

He offers instead an ambitious historical overview of the rise of the West and of the United States: the two great power shifts of the past 500 years, he contends, which led in turn to the third big shift – the rise of the rest. The most noticeable thing here is Zakaria’s attempt – a formidable task – to describe the rise of the West and the decline of the East without using the word ‘imperialism’. Not surprisingly, he is forced to draw on Montesquieu’s hoary notion of Oriental despotism: ‘From the 15th century through the 19th, Asian rulers largely fit the stereotype of the Oriental tyrant.’ Though keen to show that the East derived almost all of its cultural and political inspiration from the West (‘Marx, Engels, Rosa Luxemburg and Lenin were all Western intellectuals’), he is unwilling even to mention the role of Western powers in the swift decline of the Arab lands they liberated from Ottoman rule. Here is his mechanistic explanation: ‘In the 20th century, an effort to create “modern” and powerful nation-states resulted in dictatorships that brought economic and political stagnation.’

His Power-Pointish prose shows signs of excitement only when it describes India, which he claims to be a ‘powerful package’. When he left the country in the early 1980s it was sunk in socialistic darkness: now, thanks to its globalised economy, India is ‘boisterous, colourful, open, vibrant, and, above all, ready for change’, particularly in its relationship with the US. ‘A common language, a familiar worldview and a growing fascination with each other is bringing together businessmen, non-governmental activists and writers.’

The Wasp mandarins of Kennan’s generation distrusted naturalised outsiders like Kissinger and Brzezinski. Averell Harriman and John J. McCloy believed that the Polish-born Brzezinski, who secretly armed Muslim fanatics in Afghanistan in order to entrap the USSR in its own Vietnam, was ‘perfectly willing to get the US into a confrontation with Russia for the sake of Poland’. Some ancient grandee in Nantucket may well mutter that Zakaria is not to be trusted when he advocates closer political and business ties between America and his ancestral country, and the grandee may well be right. Zakaria describes India as having been a ‘peaceful, stable and prosperous’ country since 1997, even though over the last ten years India and Pakistan have come close to nuclear war, the pro-business Hindu nationalist BJP has been responsible for the deaths of two thousand Muslims in Gujarat, and thousands more have died in insurgencies led by Maoists or separatists in Kashmir and the North-East.

Zakaria describes India’s pro-American prime minister, Manmohan Singh, as ‘a man of immense intelligence, unimpeachable integrity and deep experience’, whose ‘breadth, depth and decency as a person are unmatched by any Indian prime minister since Nehru’. This was written before Singh’s colleagues were accused of bribing members of parliament in order to push through India’s controversial nuclear deal with the United States, despite the opposition of the Communist parties and the Hindu nationalists, who feared the loss of national sovereignty. A few weeks ago, while thanking Bush for the nuclear deal, Singh blurted out, ‘The people of India deeply love you,’ thereby bringing his judgment even further into doubt.

Boosted by wealthy Indian-Americans, who constitute a special-interest group often thought to be as strong as the Israel lobby, India tends to be described in the US as ‘rising’. (Kissinger, who reveres all rising suns, has publicly apologised for calling Indians ‘bastards’ during a dispute in the 1970s.) Zakaria’s simplifications and misreadings are central to his argument: that the US can maintain its hegemony by making friends and influencing people in newly powerful countries. India, a large capitalist democracy with pro-American elites, is apparently particularly keen to help the next president ‘renew’, in Obama’s words, ‘America’s moral leadership’.

In a speech he gave in March, Obama proposed a return to ‘the traditional bipartisan realistic policy of George Bush’s father, of John F. Kennedy, of, in some ways, Ronald Reagan’. ‘Realistic’ is an odd description of Kennedy’s escapades in Cuba and Indochina and Reagan’s equally macho forays into Lebanon, Afghanistan and Central America. The Post-American World fuels a suspicion that the next president will try to carry on business as usual. Zakaria tries to deflect the disabling charge frequently directed by neocon ideologues at Democratic liberal internationalists, that their desire to consult foreigners amounts to a surrender of national sovereignty and appeasement of the bad guys. There is quite a bit of flag-waving in the book – ‘America has transformed the world with its power but also with its ideals’ – and lots of keep-your-pecker-up stuff: ‘The world is going America’s way. Countries are becoming more open, market-friendly and democratic.’ The non-American reader may wonder if Zakaria really believes in the possibility of an international system made consensual and peaceable by American-style capitalist modernity, when American-style capitalism has severely disrupted that system, plunging entire societies into chaos, first in Asia and Latin America, and now in Europe and the US.

Trusting in ‘global growth’ above all, Zakaria often sounds like an updated practitioner of the ‘modernisation theory’ that was popular on Ivy League campuses during the Cold War. A cruder version can be found in Thomas Friedman’s ‘Golden Arches’ postulate: countries privileged enough to be able to eat McDonald’s burgers don’t go to war with each other. These Panglosses of globalisation ignore, among other things, the fact that ‘global growth’ still means nothing to much of the world’s population, some of whom – like India’s poor majority, who are certain to vote out Manmohan Singh in the next general election – have their own ideas about how to organise their lives. Given the slightest opportunity, China’s masses express a fierce nationalism rooted in long, carefully sustained memories of humiliation by Western powers. Even the elites of China, East Asia, Russia and Brazil, whose political edge might have been blunted by extended sojourns in Davos and Aspen, are likely to credit their success to state-directed capitalism or resource-extraction rather than to ‘American ideas and actions’, which they would blame for the recent disasters of Western capitalism.

Khanna seems more aware of political sentiment in Asia, Africa and Latin America when he claims that ‘the West can expect no allegiance to a Western order masquerading as representative of global values.’ His unvarnished assessment of America – ‘a first-world country in need of a Marshall Plan to stay where it is’ – also sounds more accurate now that the scale of the crisis brought on by recklessly deregulated capitalism and the War on Terror is becoming clear. Even the Economist, which is usually eager to play the wise Greek to America’s triumphant Rome, recently admitted that ‘the world seems very multipolar. Europeans no longer worry about American ascendancy. The French, some say, understood the Arab world rather better than the neoconservatives did. Russia, the Gulf Arabs and the rising powers of Asia scoff openly at the Washington consensus. China in particular spooks America.’

There is more: American concessions in talks on global warming have not prompted any generous offers from India and China, whose determination to protect their farmers from international competition has already scuppered the Doha trade negotiations. Iraq, though only nominally sovereign, insists on its own timetable for the withdrawal of American forces. Dismissing American protests, Pakistan strikes ceasefire deals with the Taliban inside its borders, while its intelligence agency, the ISI, props up the Taliban in Afghanistan. Even Israel defies its American patrons by using Turkey and Egypt as mediators in negotiations with Syria and Hamas. Hailed not long ago by George Bush as a shining example of democracy in the Middle East, the Lebanese government has been forced to make concessions to Hizbullah. American tough talk seems to have made little impression on Iran, not to mention Burma and Zimbabwe. North Korea, once part of the Axis of Evil, has had to be appeased with Chinese assistance; and China appears to have more influence over Sudan than any Western power. Ties between China and Taiwan, and China and Japan, have markedly improved, making American military buffers in much of East Asia appear increasingly superfluous. With its battering of Georgia, another of Bush’s ‘beacons of liberty’, Russia has shown that it will deal viciously with any challenge to its hegemony by a pro-American country in its neighbourhood; and there is not much the US will be able to do about it. In recent weeks the Bush administration has meekly imitated European steps to contain the credit crisis; it now watches helplessly as European politicians scramble to replace the financial system built at Bretton Woods in 1944, taking on responsibilities and privileges that Zakaria and others thought belonged to America alone.

Taken together, these developments point to the emergence of a new international system, where America is far from being the ‘indispensable nation’ that can place what it needs to do ahead of all other national interests either through brute power or by striking postures of consultation and co-operation. A world with a multiplicity of national power centres with conflicting and often irreconcilable interests and values will eventually find its own equilibrium, which may prove less precarious than the order maintained by the imperial powers of Europe and America in the 20th century. The ‘rise of the rest’ will also correct what Kennan in his last years defined as ‘this whole tendency to see ourselves as the centre of political enlightenment and as teachers to a great part of the rest of the world’. ‘Unthought-through, vainglorious and undesirable’ is how Kennan described it, and the vast majority of the world’s population agrees.

Libertarianism

The Libertarian

Strident And Wrong

The endless debate over whether the government or the market is "the cause" of our current financial malaise has yet to reach its final resting point. One strident voice in this debate belongs to Jacob Weisberg, writing in Slate. His "big idea" is that the current financial meltdown proves that the End of Libertarianism is upon us.

Let us tremble at this premature death sentence. No fair and balanced account of the current meltdown can dwell exclusively on the failure of government to regulate credit-market derivatives. It must ask deeper questions about the antecedent events that brought credit markets to their knees.

Weisberg offers no such account. Alas, the financial rot started in the underlying home-mortgage market, with the government's decision to subsidize home mortgages generally through low interest rates, a problem compounded by special Fannie and Freddie guarantees at the low end of the market.

These foolish decisions prompted market actors to react just as libertarians fear: to profit privately from public foolishness. Savvy lenders looked less to the creditworthiness of their borrowers and more to unwise government guarantees that insulated them from risk. A high-risk loan of $1,000, without that guarantee, could be worth half that sum before the ink was dry. But who cares, if a government agency will pick up the slack?
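
To make the incentive point concrete, here is a minimal back-of-the-envelope sketch; the default probability and recovery figures are invented for illustration and are not from Epstein's column.

# A rough sketch of the incentive described above: a government guarantee
# makes a borrower's creditworthiness irrelevant to the lender.
# All figures are hypothetical, chosen only to illustrate the point.

def expected_loan_value(principal, default_prob, recovery_rate, guaranteed):
    # If the loan is guaranteed, the lender is made whole on default;
    # otherwise it recovers only a fraction of the principal.
    loss_if_default = 0.0 if guaranteed else principal * (1 - recovery_rate)
    return principal - default_prob * loss_if_default

risky_loan = dict(principal=1000, default_prob=0.5, recovery_rate=0.0)

print(expected_loan_value(**risky_loan, guaranteed=False))  # 500.0 -- "worth half that sum"
print(expected_loan_value(**risky_loan, guaranteed=True))   # 1000.0 -- screening no longer pays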

Similarly, self-interested borrowers eagerly grasped at cheap-money loans, thereby driving up the price of underlying assets. But once these subsidies became too expensive to sustain, the capital markets raised interest rates, which killed off refinancing for persons living beyond their means.

At this point, securitization, which diversified some risks, accentuated others by spreading the bad paper throughout the entire system. Now the high default rates on mortgages introduced massive uncertainty for valuing these financial instruments, which triggered government mark-to-market re-evaluations--which in turn forced a premature liquidation of assets. And presto, the failure of the Wall Street investment banks mushroomed into a larger financial crisis.

Weisberg is so intent on attacking libertarians as "intellectually immature" that he overlooks the point of this cautionary tale. Private markets magnify government errors. But in light of this history, it is plain foolish to treat the current failures solely as the result of an unregulated market. The hard question is what kind of regulation is appropriate, and why.

Unfortunately, Weisberg is adamant about the need for regulation but clueless about its form. It was just these messy problems of implementation that slowed up non-libertarians like Robert Rubin. How do we regulate derivatives in the U.S., for example, if those contracts could migrate offshore? Indeed, private clearing-houses, as responsible intermediaries, might just outperform government regulation, precisely because they don't face the territorial constraints of domestic oversight or the Byzantine complexities of multilateral treaties.

Weisberg's crudest charge, however, is that all libertarians suffer from the incurable dogmatism of high school students captivated by Ayn Rand novels (which--confession--I have never read). His stick-figure image of libertarians does not square with the current intellectual landscape. Limited-government libertarians like me are not anarchists. We have a presumption against government regulation, which can be rebutted by showing long-term social improvements. We know not only about the virtues of competitive markets but also about the challenges posed by asymmetrical information, public goods, prisoner's dilemmas and market cascades.

Accordingly, we recognize that the specter of bank runs, illiquidity and credit freezes might justify some regulation. In dealing with the current crisis, we have to accept some role for the Federal Reserve as a lender of last resort under our current institutional arrangements. But we are equally adamant that bad regulation can wreck credit markets. And we insist that governments must mend their lending habits to reduce the odds of credit trains going off the rails yet again. We also strenuously oppose using the credit crisis as a lever for introducing all sorts of senseless gimmicks to disrupt labor and product markets.

Weisberg is oblivious to these strands of the libertarian tradition. Pathologically, he overrates market failures and underestimates government ones. So says the indignant, but ever thoughtful, libertarian.

Richard Epstein writes a weekly column for Forbes.com. He is a senior fellow at Stanford's Hoover Institution and a professor of law at the University of Chicago, visiting this fall at the New York University School of Law.

28.10.08

Good for the Godwit

Birds Fly More Than 7,000 Miles Nonstop

In Its Annual Fall Migration, One Godwit Traveled From Alaska to New Zealand in Eight Days



The bar-tailed godwit, a plump shorebird with a recurved bill, has blown the record for nonstop, muscle-powered flight right out of the sky.

A study being published today reports that godwits can fly as many as 7,242 miles without stopping in their annual fall migration from Alaska to New Zealand. The previous record, set by eastern curlews, was a 4,000-mile trip from eastern Australia to China.

The birds flew for five to nine days without rest, a few landing on South Pacific islands before resuming their trips, which were monitored by satellite in 2006 and 2007.

As a feat of sustained exercise unrelieved by sleeping, eating or drinking, the godwit's migration appears to be without precedent in the annals of vertebrate physiology.

"The human species doesn't work at these levels. So you just have to sit back in awe of it all," said Robert E. Gill Jr., a biologist with the U.S. Geological Survey, who headed the study.

The birds were expending energy at eight to 10 times the rate they do at rest. The previous record for a boost in energy output is seven times the "basal metabolic rate." Peak output in human beings, achieved by Tour de France bicyclists, is a sixfold increase.

"What this suggests to me is that we haven't yet mined the depths, we really don't know what the extremes are," said Kimberly A. Hammond, a physiological ecologist at the University of California at Riverside not involved in the research.

As astounding as the feat is the fact that it represents a highly evolved solution to a problem, not a fluke or one-time occurrence.

The nonstop, over-water route is free of predators and substantially shorter than a hopscotching route down the eastern coast of Asia, which is the alternative. Landing and eating -- literally, refueling -- would expose the birds to disease and parasites when they are probably somewhat immune-suppressed. Refueling also would add weeks to the trip and itself take energy.

All in all, flying nonstop across most of the north-south span of the Pacific Ocean is the safest thing to do.

The death rate during the migration is unknown but presumably low, as the population of bar-tailed godwits, estimated at 100,000, has remained stable over the long term.

"This system would not have perpetuated itself if mortality were a big problem," said Gill, whose study is being published today in Proceedings B, a journal of The Royal Society, in England.

Gill and his colleagues outfitted 23 bar-tailed godwits with satellite transmitters that periodically sent a signal detected by a satellite.

Female godwits are substantially larger than males. A one-ounce, battery-powered device was surgically implanted in them, with the antennas exiting their bodies just beneath the tail. The smaller males got a solar-powered device weighing less than half an ounce strapped to their backs.

Nine of the transmitters functioned well enough on the southward flight to provide evidence of sustained, nonstop flight.

One female flew directly from the Yukon-Kuskokwim Delta of Alaska to New Zealand in eight days. Other birds either landed short of their destination in the Solomon Islands and Papua New Guinea, or the signal was lost near those places. Four were later identified in New Zealand by leg bands.

The birds weigh no more than 1.5 pounds when they leave. Half of that is fat, which they burn off completely during the flight. Some of the males may have lost their transmitters in flight as their bodies shrank.
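
A bit of arithmetic on the figures quoted above gives a sense of the pace and fuel economy involved; this is a rough reader's calculation, not a result from the study.

# Back-of-the-envelope arithmetic using the round numbers quoted in the article.
miles_flown = 7242        # longest recorded nonstop flight
days_aloft = 8            # Alaska to New Zealand, for the tracked female
takeoff_weight_lb = 1.5   # maximum departure weight
fat_fraction = 0.5        # roughly half of that weight is fat, burned en route

avg_speed_mph = miles_flown / (days_aloft * 24)    # about 38 mph, around the clock
fat_burned_lb = takeoff_weight_lb * fat_fraction   # about 0.75 lb of fat
miles_per_lb_of_fat = miles_flown / fat_burned_lb  # nearly 10,000 miles per pound

print(round(avg_speed_mph, 1), round(fat_burned_lb, 2), round(miles_per_lb_of_fat))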

The starting and stopping places are not chosen by chance. The Kuskokwim Delta is rich in food supply, which the birds must consume in prodigious quantities before leaving. The wintering site in New Zealand is largely free of predators. When the birds arrive in early October, they molt almost immediately.

The birds leave from late August to late September, departing only with favorable tail winds. How much of their journey is wind-aided is something the researchers hope to determine by overlaying the birds' routes with day-by-day meteorological data.

A major mystery is how high the birds fly. Gill said that since word of his research has spread, researchers on boats in the Pacific have told him of seeing godwits 3,000 feet high and "smoking by at deck level."

Getting Old

In 1873, in a hotel room in Brussels, the dastardly boy-poet Arthur Rimbaud was shot in the wrist, half-accidentally, by the outrageously hideous alcoholic man-poet Paul Verlaine. (Verlaine went to prison, whereupon Rimbaud shut himself in his mom’s attic, moaned his scandalized lover’s name, and single-handedly invented Surrealism.) The year before, halfway across the world in patrician Baltimore, an unusually large baby was born and christened Emily Price. She went on, as Emily Post, to publish Etiquette, the twentieth century’s most comprehensive encyclopedia of high-society tyranny (or, if you prefer, social betterment). As she was revising its second edition, a newborn version of Hugh Hefner managed to emerge from the presumably unphotographed loins of a conservative Methodist housewife in Chicago. In 1940, Hefner was just beginning high school, one of the golden periods of his life, when, during a brief lull between German air raids, John Lennon’s birth incrementally increased the wartime population of Liverpool. In 1972, as President Nixon worked up a case to deport the ex-Beatle from New York, Marshall “Eminem” Mathers was born into deeply unpromising circumstances in Missouri.


I know all of this because the American publishing industry seems to be having an extended moment of biographilia. We have seen, or are about to see, full-length treatments of Chaplin, Cheever, Naipaul, Reagan, Ol’ Dirty Bastard—and, although this hardly seems possible, an acceleration of books about Abraham Lincoln (his remains turn 200 in February). Dipping into this biographical torrent more or less at random, I recently extracted what strikes me as a promising batch of lives: Rimbaud, Post, Hefner, Lennon, and (via autobiography) Eminem—roughly 2,300 pages covering 154 years.


Rimbaud once famously wrote, in a letter to a friend, “I is another,” and the paradox seems doubly profound when you’re working your way through a small mountain of biographies. Even radically different lives, encountered in quick succession, have a way of blending. I was unaware, for instance, that Rimbaud, Post, and Eminem were all talented mimics. Or that Hefner and Lennon shared a deep love of animals: The former won a prize, as a child, for his poem “Be Kind to Dumb Animals”; the latter mourned when his cat died jumping out a window after a pigeon. Rimbaud had to be introduced to Paris literary society, as a teen, by the established Verlaine; Eminem was introduced to the world of inner-city-Detroit rap battling by the established MC Proof (“he was my ghetto pass”). Four of the five were born in October. Rimbaud and Lennon both titillated the world by writing free-associative nonsense verse while neglecting to cut their hair. (Eminem bleached his hair while high on Ecstasy.) Rimbaud, like Hefner, preferred to work in his pajamas, although instead of black silk he wore a buttonless cotton number of his own design. (Post also designed her own clothes.) Rimbaud lamented, in his biographer Edmund White’s words, “young people whose natural lust is blasted by the curse of puritanical religion”; Hefner says that “Puritan repression is really the key that unlocks the mystery of my life.” Hefner and Post were both compulsive record-keepers. Lennon, Hefner says, once stubbed out a cigarette on a Matisse painting at the Playboy Mansion.


Seventeen was, for all five of these figures, an oddly crucial year. At 17, Emily Post outshone every other socialite at an elaborate debutante ball: She wore “stark white mousseline de soie,” danced the cotillion for three straight hours, and received so many flowers from admirers it took four men to help her carry them out. The formerly shy 17-year-old Hefner became one of the most popular students at school when, having been rejected by a girl the previous summer, he consciously invented what would become his iconic persona: He put on stylish clothes, started acting suave around ladies, used “hip” expressions like “Jeeps Creeps,” and referred to himself as “Hef.” Rimbaud, at 17, became “increasingly bizarre in his slovenliness, his jerky movements, and his outrageous insults.” Lennon was just 17 (if you, you know, know what I mean) when he lost his mom in a car accident and recorded his first songs with the Quarrymen, the band that would eventually morph into the Beatles. Eminem met the woman he would one day marry, and divorce, twice.


The pop-psychological common denominator in these five lives seems to be an absent father. (As Eminem puts it, with his signature grace: “It takes a real special kind of asshole to abandon a kid.”) Rimbaud’s father walked out on his family when Arthur was 6, leaving the boy all alone to systematically outrage his unsmiling, conservative mother. (Although one of the fun revelations of White’s book is how thoroughly Mrs. Rimbaud, along with Verlaine’s mom, bankrolled their sons’ legendary rebellion.) Emily Post’s father, a seminal architect during New York’s early skyscraper boom, worked such late hours he often slept over at gentlemen’s clubs—Emily idolized him from a distance while spending mundane hours with her mother. Hefner’s father was a workaholic bookkeeper who claimed never to have masturbated in his entire life. (This is just one of many blindingly obvious foreshadowings in the Hefner story.) Lennon’s dad was a luckless merchant seaman who disappeared for long stretches, leaving the boy to bounce between relatives; in probably the most tragic scene in this entire bio-pile, 5-year-old John’s parents actually forced him to choose between them. He picked his father, then ran crying after his mother when she turned to go.


All five of these figures warmed their hands around a common fire: the public performance of morality. Fatherlessness seems to have frozen them in a kind of permanent adolescence. They answered adult questions (How should one behave?) prematurely and exaggeratedly, then stubbornly clung to those answers for life. Their careers were built entirely on bad manners—whether excoriating them, glorifying them, or reveling in them. They sacrificed their lives to oversize visions of righteous living. And while they all have their own special failures and triumphs—that’s what makes them fit for biography—the saddest figures, to me, for precisely opposite reasons, are Rimbaud and Hefner. The French poet burned through his world-stomping revolutionary phase in less time than it takes most people to finish college. By 19, he was facing a whole second lifetime of pure sad, unheroic frustration: He wound up in Africa, trying unsuccessfully to get rich, and died of very painful cancer at 37. Hefner, on the other hand, still clings to his adolescence. At 82, he brags of being a “babe magnet” and collects young platinum-blonde “girlfriends.”


Etiquette offers, as a central maxim, “Try to do and say those things only which will be agreeable to others.” As a young husband, Hefner organized strip poker with other couples, slept with his sister-in-law, and made a porn film starring himself. Rimbaud wrote “Shit on God” all over the walls of his hometown, sunbathed naked, pretended to fling his lice on people, and broke his Parisian hosts’ china. Lennon stole from charity bake sales, laughed at the disabled, and publicly excoriated former bandmates. And yet it’s fascinating to watch these figures, as they age, slide down the continuum from rebellion to convention. Rimbaud’s African co-workers knew him as “a pleasant guy who seldom speaks”; Lennon and Eminem (a surprisingly lovable figure) both proudly gave up music to raise children; and Hefner tried marriage again. (It didn’t work.) Even Emily Post had a little of the rebel in her. As a 4-year-old, enraged by getting a miniature tea set for Christmas, she took it outside, smashed it on rocks, and threw the shards into her uncle’s pool. “I is another!” one can imagine her screaming.

$2.8tn

Autumn's market mayhem has left the world's financial institutions nursing losses of $2.8tn, the Bank of England said today, as it called for fundamental reform of the global banking system to prevent a repeat of turmoil "arguably" unprecedented since the outbreak of the first world war.

In its half-yearly health check of the City, the Bank said tougher regulation and constraints on lending would be needed as policymakers sought to learn lessons from the mistakes that have led to a systemic crisis unfolding over the past 15 months.

27.10.08

1000 Works of Art

These things are always fun. The Guardian has assembled some mighty talent. Of course, it is a superficial exercise, but it's fun nonetheless. The site is a bit difficult to navigate and the whole opus ought to be re-cast as a book, but here it is: http://www.guardian.co.uk/artanddesign/1000-artworks-to-see-before-you-die

Jacking the Brain

Jacking into the Brain--Is the Brain the Ultimate Computer Interface?

How far can science advance brain-machine interface technology? Will we one day pipe the latest blog entry or NASCAR highlights directly into the human brain as if the organ were an outsize flash drive?

By Gary Stix

The cyberpunk science fiction that emerged in the 1980s routinely paraded “neural implants” for hooking a computing device directly to the brain: “I had hundreds of megabytes stashed in my head,” proclaimed the protagonist of “Johnny Mnemonic,” a William Gibson story that later became a wholly forgettable movie starring Keanu Reeves.

The genius of the then emergent genre (back in the days when a megabyte could still wow) was its juxtaposition of low-life retro culture with technology that seemed only barely beyond the capabilities of the deftest biomedical engineer. Although the implants could not have been replicated at the Massachusetts Institute of Technology or the California Institute of Technology, the best cyberpunk authors gave the impression that these inventions might yet materialize one day, perhaps even in the reader’s own lifetime.

In the past 10 years, however, more realistic approximations of technologies originally evoked in the cyberpunk literature have made their appearance. A person with electrodes implanted inside his brain has used neural signals alone to control a prosthetic arm, a prelude to allowing a human to bypass limbs immobilized by amyotrophic lateral sclerosis or stroke. Researchers are also investigating how to send electrical messages in the other direction as well, providing feedback that enables a primate to actually sense what a robotic arm is touching.

But how far can we go in fashioning replacement parts for the brain and the rest of the nervous system? Besides controlling a computer cursor or robot arm, will the technology somehow actually enable the brain’s roughly 100 billion neurons to function as a clandestine repository for pilfered industrial espionage data or another plot element borrowed from Gibson?

Will Human Become Machine?
Today’s Hollywood scriptwriters and futurists, less skilled heirs of the original cyberpunk tradition, have embraced these neurotechnologies. The Singularity Is Near, scheduled for release next year, is a film based on the ideas of computer scientist Ray Kurzweil, who has posited that humans will eventually achieve a form of immortality by transferring a digital blueprint of their brain into a computer or robot.

Yet the dream of eternity as a Max Headroom–like avatar trapped inside a television set (or as a copy-and-paste job into the latest humanoid bot) remains only slightly less distant than when René Descartes ruminated on mind-body dualism in the 17th century. The wholesale transfer of self—a machine-based facsimile of the perception of the ruddy hues of a sunrise, the constantly shifting internal emotional palette and the rest of the mix that combines to evoke the uniquely subjective sense of the world that constitutes the essence of conscious life—is still nothing more than a prop for fiction writers.

Hoopla over thought-controlled prostheses, moreover, obscures the lack of knowledge of the underlying mechanisms of neural functioning needed to feed information into the brain to re-create a real-life cyberpunk experience. “We know very little about brain circuits for higher cognition,” says Richard A. Andersen, a neuroscientist at Caltech.

What, then, might realistically be achieved by interactions between brains and machines? Do the advances from the first EEG experiment to brain-controlled arms and cursors suggest an inevitable, deterministic progression, if not toward a Kurzweilian singularity, then perhaps toward the possibility of inputting at least some high-level cognitive information into the brain? Could we perhaps download War and Peace or, with a nod to The Matrix, a manual of how to fly a helicopter? How about inscribing the sentence “See Spot run” into the memory of someone who is unconscious of the transfer? How about just the word “see”?

These questions are not entirely academic, although some wags might muse that it would be easier just to buy a pair of reading glasses and do things the old-fashioned way. Even if a pipeline to the cortex remains forever a figment of science fiction, an understanding of how photons, sound waves, scent molecules and pressure on the skin get translated into lasting memories will be more than mere cyberpunk entertainment. A neural prosthesis built from knowledge of these underlying processes could help stroke victims or Alzheimer’s patients form new memories.

Primitive means of jacking in already reside inside the skulls of thousands of people. Deaf or profoundly hearing-impaired individuals carry cochlear implants that stimulate the auditory nerve with sounds picked up by a microphone—a device that neuroscientist Michael S. Gazzaniga of the University of California, Santa Barbara, has characterized as the first successful neuroprosthesis in humans. Arrays of electrodes that serve as artificial retinas are in the laboratory. If they work, they might be tweaked to give humans night vision.

The more ambitious goal of linking Amazon.com directly to the hippocampus, a neural structure involved with forming memories, requires technology that has yet to be invented. The bill of particulars would include ways of establishing reliable connections between neurons and the extracranial world—and a means to translate a digital version of War and Peace into the language that neurons use to communicate with one another. An inkling of how this might be done can be sought by examining leading work on brain-machine interfaces.

Your Brain on Text
Jacking text into the brain requires consideration of whether to insert electrodes directly into tissue, an impediment that might make neural implants impractical for anyone but the disabled. As has been known for nearly a century, the brain’s electrical activity can be detected without cracking bone. What looks like a swimming cap studded with electrodes can transmit signals from a paralyzed patient, thereby enabling typing of letters on a screen or actual surfing of the Web. Niels Birbaumer of the University of Tübingen in Germany, a leading developer of the technology, asserts that trial-and-error stimulation of the cortex using a magnetic signal from outside the skull, along with the electrode cap to record which neurons are activated, might be able to locate the words “see” or “run.” Once mapped, these areas could be fired up again to evoke those memories—at least in theory.

Some neurotechnologists think that if particular words reside in specific spots in the brain (which is debatable), finding those spots would probably require greater precision than is afforded by a wired swim cap. One of the ongoing experiments with invasive implants could possibly lead to the needed fine-level targeting. Philip R. Kennedy of Neural Signals and his colleagues designed a device that records the output of neurons. The hookup lets a stroke victim send a signal, through thought alone, to a computer that interprets it as, say, a vowel, which can then be vocalized by a speech synthesizer, a step toward forming whole words. This type of brain-machine interface might also eventually be used for activating individual neurons.

Still more precise hookups might be furnished by nanoscale fibers, measuring 100 nanometers or less in diameter, which could easily tap into single neurons because of their dimensions and their electrical and mechanical properties. Jun Li of Kansas State University and his colleagues have crafted a brushlike structure in which nanofiber bristles serve as electrodes for stimulating or receiving neural signals. Li foresees it as a way to stimulate neurons to allay Parkinson’s disease or depression, to control a prosthetic arm or even to flex astronauts’ muscles during long spaceflights to prevent the inevitable muscle wasting that occurs in zero gravity.

Learning the Language
Fulfilling the fantasy of inputting a calculus text—or even plugging in Traveler’s French before going on vacation—would require far deeper insight into the brain signals that encode language and other neural representations.

Unraveling the neural code is one of the most imposing challenges in neuroscience—and, to misappropriate Freud, would likely pave a royal road to an understanding of consciousness. Theorists have advanced many differing ideas to explain how the billions of neurons and trillions of synapses that connect them can ping meaningful messages to one another. The oldest is that the code corresponds to the rate of firing of the voltage spikes generated by a neuron.

Whereas the rate code may suffice for some stimuli, it might not be enough for booting a Marcel Proust or a Richard Feynman, supplying a mental screen capture of a madeleine cake or the conceptual abstraction of a textbook of differential equations. More recent work has focused on the precise timing of the intervals between each spike (temporal codes) and the constantly changing patterns of how neurons fire together (population codes).
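
As a toy illustration of the difference between these candidate codes (a deliberate oversimplification, with invented spike times, not how any lab actually decodes neural activity), the same recording can be summarized three ways:

# Toy contrast of the three candidate neural codes mentioned above.
# Spike times and the population matrix are invented for illustration.
import numpy as np

spike_times = np.array([0.012, 0.030, 0.041, 0.078, 0.095])  # seconds, one neuron
window_s = 0.1                                               # length of the recording

# Rate code: only the average firing rate is assumed to carry information.
rate_hz = len(spike_times) / window_s                        # 50 spikes per second

# Temporal code: the precise intervals between spikes also matter.
inter_spike_intervals = np.diff(spike_times)                 # 0.018, 0.011, 0.037, 0.017

# Population code: the joint pattern of many neurons firing together matters.
population = np.array([[1, 0, 1, 1, 0],   # neuron A, spikes per time bin
                       [0, 1, 1, 0, 0],   # neuron B
                       [1, 1, 0, 0, 1]])  # neuron C
pattern_in_each_bin = population.T        # each row: which neurons fired in that bin

print(rate_hz, inter_spike_intervals, pattern_in_each_bin)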

Some help toward downloading to the brain might come from a decadelong endeavor to build an artificial hippocampus to help people with memory deficits, which may have the corollary benefit of helping researchers gain insights into the coding process. A collaboration between the University of Southern California and Wake Forest University has worked to fashion a replacement body part for this memory-forming brain structure. The hippocampus, seated deep within the brain’s temporal lobe, sustains damage in stroke or Alzheimer’s. An electronic bypass of a damaged hippocampus could restore the ability to create new memories. The project, funded by the National Science Foundation and the Defense Advanced Research Projects Agency, might eventually go further, enhancing normal memory or helping to deduce the particular codes needed for high-level cognition.

The two groups—led by Theodore W. Berger at U.S.C. and Samuel Deadwyler at Wake Forest—are preparing a technical paper showing that an artificial hippocampus took over from the biological organ the task of consolidating a rat’s memory of pressing a lever to receive a drop of water. Normally the hippocampus emits signals that are relayed to cortical areas responsible for storing the long-term memory of an experience. For the experiment, a chemical temporarily incapacitated the hippocampus. When the rat pressed the correct bar, electrical input from sensory and other areas of the cortex was channeled through a microchip, which, the scientists say, dispatched the same signals the hippocampus would have sent. A demonstration that an artificial device mimicked hippocampal output would mark a step toward deducing the underlying code that could be used to create a memory in the motor cortex—and perhaps one day to unravel ciphers for even higher-level behaviors.

If the codes for the sentence “See Spot run”—or perhaps an entire technical manual—could be ascertained, it might, in theory, be possible to input them directly to an electrode array in the hippocampus (or cortical areas), evoking the scene in The Matrix in which instructions for flying a helicopter are downloaded by cell phone. Artificial hippocampus research postulates a scenario only slightly more prosaic. “The kinds of examples [the U.S. Department of Defense] likes to typically use are coded information for flying an F-15,” says Berger.

The seeming simplicity of the model of neural input envisaged by artificial hippocampus-related studies may raise more questions than it answers. Would such an implant overwrite existing memories? Would the code for the sentence “See Spot run” be the same for me as it is for you or, for that matter, a native Kurdish speaker? Would the hippocampal codes merge cleanly with other circuitry that provides the appropriate context, a semantic framework, for the sentence? Would “See Spot run” be misinterpreted as a laundry mishap instead of a trotting dog?

Some neuroscientists think the language of the brain may not be deciphered until understanding moves beyond the reading of mere voltage spikes. “Just getting a lot of signals and trying to understand what these signals mean and correlating them with particular behavior is not going to solve it,” notes Henry Markram, director of neuroscience and technology at the Swiss Federal Institute of Technology in Lausanne. A given input into a neuron or groups of neurons can produce a particular output—conversion of sensory inputs to long-term memory by the hippocampus, for instance—through many different pathways. “As long as there are lots of different ways to do it, you’re not even close,” he says.

The Blue Brain Project, which Markram heads, is an attempt that began in 2005 to use supercomputer-based simulations to reverse-engineer the brain at the molecular and cellular levels—modeling first the simpler rat organ and then the human version to unravel the underlying function of neural processes. The latter task awaits a computer that boasts a more than 1,000-fold improvement over the processing power of current supercomputers. The actual code, when it does emerge, may be structured very differently from what appears in today’s textbooks. “I think there will be a conceptual breakthrough that will have significant implications for how we think of reality,” Markram says. “It will be quite a profound thing. That’s probably why it’s such an intractable problem.”

The challenge involved in figuring out how to move information into the brain suggests a practical foreseeable limit for how far neurotechnology might be advanced. The task of forming the multitude of connections that make a memory is vastly different from magnetizing a set of bits on a hard disk. “Complex information like the contents of a book would require the interactions of a very large number of brain cells over a very large area of the nervous system,” observes neuroscientist John P. Donoghue of Brown University. “Therefore, you couldn’t address all of them, getting them to store in their connections the correct kind of information. So I would say based on current knowledge, it’s not possible.”

Writing to the brain may remain a dream lost in cyberspace. But the seeming impossibility does not make Donoghue less sanguine about ultimate expectations for feeding information the other way and developing brain-controlled prostheses for the severely disabled. He has been a leader in studies to implant an array of multiple electrodes into the brain that can furnish a direct line from the cortex to a prosthetic arm or even a wheelchair.

Donoghue predicts that in the next five years brain-machine interfaces will let a paralyzed person pick up a cup and take a drink of water and that, in some distant future, these systems might be further refined so that a person with an upper spinal cord injury might accomplish the unthinkable, perhaps even playing a game of basketball with prosthetics that would make a reality of The Six Million Dollar Man, the 1970s television series. Even without an information pipeline into the brain, disabled patients and basic researchers might still reap the benefits of lesser substitutes. Gert Pfurtscheller of the Graz University of Technology in Austria and his colleagues reported last year on a patient with a spinal cord injury who was able, merely by thinking, to traverse a virtual environment, moving from one end to the other of a simulated street. Duke University’s Miguel A. L. Nicolelis, another pioneer in brain-machine interfaces, has begun to explore how monkeys connected to brain-controlled prosthetic devices begin to develop a kinesthetic awareness, a sense of movement and touch, that is completely separate from sensory inputs into their biological bodies. “There’s some physiological evidence that during the experiment they feel more connected to the robots than to their own bodies,” he says.

The most important consequences of these investigations may be something other than neural implants and robotic arms. An understanding of central nervous system development acquired by the Blue Brain Project or another simulation may let educators understand the best ways to teach children and determine at what point a given pedagogical technique should be applied. “You can build an educational development program that is engineered to, in the shortest possible time, allow you to acquire certain capabilities,” Markram says. If he is right, research on neural implants and brain simulations will produce more meaningful practical benefits than dreams of the brain as a flash drive drawn from 20th-century science-fiction literature.