About Me

New Orleans, Louisiana, United States
Admire John McPhee, Bill Bryson, David Remnick, Thomas Merton, Richard Rohr and James Martin (and most open and curious minds)

28.6.06


Slow Down



As we attempt to cool ourselves this summer, let us reflect with David Hockney on Fuji.

I hope that all of us may retreat from our daily grinds this holiday season, whether it be by water or mountain, with great literature or not, and take time to simply enjoy.

THANK YOU NYTIMES

June 28, 2006

Editorial

Patriotism and the Press

Over the last year, The New York Times has twice published reports about secret antiterrorism programs being run by the Bush administration. Both times, critics have claimed that the paper was being unpatriotic or even aiding the terrorists. Some have even suggested that it should be indicted under the Espionage Act. There have been a handful of times in American history when the government has indeed tried to prosecute journalists for publishing things it preferred to keep quiet. None of them turned out well — from the Sedition Act of 1798 to the time when the government tried to enjoin The Times and The Washington Post from publishing the Pentagon Papers.
As most of our readers know, there is a large wall between the news and opinion operations of this paper, and we were not part of the news side's debates about whether to publish the latest story under contention — a report about how the government tracks international financial transfers through a banking consortium known as Swift in an effort to pinpoint terrorists. Bill Keller, the executive editor, spoke for the newsroom very clearly. Our own judgments about the uproar that has ensued would be no different if the other papers that published the story, including The Los Angeles Times and The Wall Street Journal, had acted alone.
The Swift story bears no resemblance to security breaches, like disclosure of troop locations, that would clearly compromise the immediate safety of specific individuals. Terrorist groups would have had to be fairly credulous not to suspect that they would be subject to scrutiny if they moved money around through international wire transfers. In fact, a United Nations group set up to monitor Al Qaeda and the Taliban after Sept. 11 recommended in 2002 that other countries should follow the United States' lead in monitoring suspicious transactions handled by Swift. The report is public and available on the United Nations Web site.
But any argument by the government that a story is too dangerous to publish has to be taken seriously. There have been times in this paper's history when editors have decided not to print something they knew. In some cases, like the Kennedy administration's plans for the disastrous Bay of Pigs invasion, it seems in hindsight that the editors were over-cautious. (Certainly President Kennedy thought so.) Most recently, The Times held its reporting about the government's secret antiterror wiretapping program for more than a year while it weighed administration objections.
Our news colleagues work under the assumption that they should let the people know anything important that the reporters learn, unless there is some grave and overriding reason for withholding the information. They try hard not to base those decisions on political calculations, like whether a story would help or hurt the administration. It is certainly unlikely that anyone who wanted to hurt the Bush administration politically would try to do so by writing about the government's extensive efforts to make it difficult for terrorists to wire large sums of money.
From our side of the news-opinion wall, the Swift story looks like part of an alarming pattern. Ever since Sept. 11, the Bush administration has taken the necessity of heightened vigilance against terrorism and turned it into a rationale for an extraordinarily powerful executive branch, exempt from the normal checks and balances of our system of government. It has created powerful new tools of surveillance and refused, almost as a matter of principle, to use normal procedures that would acknowledge that either Congress or the courts have an oversight role.
The Swift program, like the wiretapping program, has been under way for years with no restrictions except those that the executive branch chooses to impose on itself — or, in the case of Swift, that the banks themselves are able to demand. This seems to us very much the sort of thing the other branches of government, and the public, should be nervously aware of. We would have been very happy if Congressman Peter King, the Long Island Republican who has been so vocal in citing the Espionage Act, had been as aggressive in encouraging his colleagues to do the oversight job they were elected to do.
The United States will soon be marking the fifth anniversary of the war on terror. The country is in this for the long haul, and the fight has to be coupled with a commitment to individual liberties that define America's side in the battle. A half-century ago, the country endured a long period of amorphous, global vigilance against an enemy who was suspected of boring from within, and history suggests that under those conditions, it is easy to err on the side of security and secrecy. The free press has a central place in the Constitution because it can provide information the public needs to make things right again. Even if it runs the risk of being labeled unpatriotic in the process.

26.6.06

HOLIDAY

Vacation Deprivation

The Lost Art of Taking Time Off

By Amy Joyce
Washington Post Staff Writer

Are you running off to the beach this year? Is your car packed with beach umbrellas, rusty beach chairs, a BlackBerry, laptop and several cellphones? Are you taking just one week instead of your allotted two?
Then I guess you're all set.
That's because one in four workers plans to work while on vacation this year, according to CareerBuilder.com's annual survey. (So is it still considered vacation? Or are you just working from a prettier office?)
Meanwhile, workers are expected to give back 574 million vacation days in 2006, depriving themselves of much-needed breaks, according to Expedia.com's annual vacation deprivation survey. The number of vacation days employees are skipping this year increased by one over last year. On average, Americans leave at least four days unclaimed annually.
"People in America don't take all the vacation time they should or could," said Helen Darling, president of the National Business Group on Health, which is based in the District. "People in this country, especially in this geographic area, actually work very hard, have very long hours and are under a lot of stress. . . . It's a very tough world out there, and unfortunately it has the effect of leading people to take less vacation or, when they take it, they take much of their work with them. And they are under a lot of stress and having problems balancing their lives."
Vacation, she said, could help ease stress and would, therefore, cut down on health problems.
Compared with other developed countries, Americans receive the fewest vacation days per year on average -- 14 days, as opposed to 17 in Australia, 19 in Canada, 24 in Great Britain, 27 in Germany and 39 in France, according to the Expedia survey. So not only do we earn less vacation time, but we also take less than we're provided. (Could this be the root of road rage?)
At least one company, however, has put the vacation dilemma into the laps of its own employees. UCG, a Rockville publisher of business newsletters, electronic magazines and directories, has had an open leave policy since 1994. That means none of its 1,000 employees has a set amount of sick leave or vacation time.
The enlightened plan stemmed from a realization that no matter the policy any employer puts in place, someone will ask to change or bend the rules. Because many employers have to go through all sorts of machinations when it comes to vacation time anyway, UCG decided to let its employees judge how much vacation they need and when. "We have a lot of respect for our employees, and they know what they need to get the job done," said Jerry Purcell, director of human resources at UCG. The employees need to work through their managers when they determine which days they need off. But there is no limit.
One gentleman took eight weeks to ride his bike across the country (you call that a vacation?), and one woman had the opportunity to tour China for six weeks after being with the company for just two months. UCG paid for two weeks of her vacation and provided half pay for four more weeks, Purcell recalled. "Every now and again, a once-in-a-lifetime opportunity comes up," he said. "I don't think you should have to sit there and say 'Well, I used my vacation last year.' Talk to us."
He said the amount of time people take doesn't really vary from a normal structured plan.
In yet another vacation survey (hey, this stuff is important, folks!), 20 percent of 400 workers surveyed had not taken a vacation (defined as three or more consecutive days off) in the last three or more years. The same survey, conducted by TrueCareers, a division of Sallie Mae, found that 91 percent of respondents said the amount of vacation offered is very or somewhat important when applying for a job. But only 26 percent said they had considered changing jobs because of insufficient vacation time.
Chris McManus, however, did. He turned down a job offer several years ago because he wanted four to six weeks of vacation. He even tried to negotiate a lower salary to get more vacation, but the company said it was a no-go. "People say I'm crazy," he said. But this man needs his vacation.
McManus has since launched his own marketing consultancy, CenterStage Communications, in Brooklyn. You would think that, as a small-business owner, he wouldn't have time for vacation now anyway. But in fact, he takes the entire month of August off. "I can't believe in the U.S. we only get two weeks of vacation a year. Then we're discouraged from taking two at the same time. Then we're to be accessible and do work on that vacation," he said.
He tells his clients that they can save their monthly fee in August while he takes a vacation or that he will reduce the fee and do a small amount of work while he is away. The fact is, however, things are slow in August, anyway. So he argues that he saves his clients money while also getting his own energy back.
This year, it looks as if he will be joining some friends who have rented a house in southern France.
"A week is not enough to whet my palate," he said. So when he takes a vacation, he really takes it. His trips usually involve touring foreign cities, and he tries to pick one starting point with a few stops along the way. "I go for a change of pace. I have a daily routine at home, and I love it. But for three or four weeks in August, I want to do something else."
And then, it's back to work at a frenetic pace -- which he can handle with aplomb, thanks to his month of vacation.

WOODY ALLEN'S DIET

THUS ATE ZARATHUSTRA

by WOODY ALLEN

NEW YORKER

There’s nothing like the discovery of an unknown work by a great thinker to set the intellectual community atwitter and cause academics to dart about like those things one sees when looking at a drop of water under a microscope. On a recent trip to Heidelberg to procure some rare nineteenth-century duelling scars, I happened upon just such a treasure. Who would have thought that “Friedrich Nietzsche’s Diet Book” existed? While its authenticity might appear to be a soupçon dicey to the niggling, most who have studied the work agree that no other Western thinker has come so close to reconciling Plato with Pritikin. Selections follow.
·
Fat itself is a substance or essence of a substance or mode of that essence. The big problem sets in when it accumulates on your hips. Among the pre-Socratics, it was Zeno who held that weight was an illusion and that no matter how much a man ate he would always be only half as fat as the man who never does push-ups. The quest for an ideal body obsessed the Athenians, and in a lost play by Aeschylus Clytemnestra breaks her vow never to snack between meals and tears out her eyes when she realizes she no longer fits into her bathing suit.


It took the mind of Aristotle to put the weight problem in scientific terms, and in an early fragment of the Ethics he states that the circumference of any man is equal to his girth multiplied by pi. This sufficed until the Middle Ages, when Aquinas translated a number of menus into Latin and the first really good oyster bars opened. Dining out was still frowned upon by the Church, and valet parking was a venial sin.


As we know, for centuries Rome regarded the Open Hot Turkey Sandwich as the height of licentiousness; many sandwiches were forced to stay closed and only reopened after the Reformation. Fourteenth-century religious paintings first depicted scenes of damnation in which the overweight wandered Hell, condemned to salads and yogurt. The Spaniards were particularly cruel, and during the Inquisition a man could be put to death for stuffing an avocado with crabmeat.
No philosopher came close to solving the problem of guilt and weight until Descartes divided mind and body in two, so that the body could gorge itself while the mind thought, Who cares, it’s not me. The great question of philosophy remains: If life is meaningless, what can be done about alphabet soup? It was Leibniz who first said that fat consisted of monads. Leibniz dieted and exercised but never did get rid of his monads—at least, not the ones that adhered to his thighs. Spinoza, on the other hand, dined sparingly because he believed that God existed in everything and it’s intimidating to wolf down a knish if you think you’re ladling mustard onto the First Cause of All Things.

Is there a relationship between a healthy regimen and creative genius? We need only look at the composer Richard Wagner and see what he puts away. French fries, grilled cheese, nachos—Christ, there’s no limit to the man’s appetite, and yet his music is sublime. Cosima, his wife, goes pretty good, too, but at least she runs every day. In a scene cut from the “Ring” cycle, Siegfried decides to dine out with the Rhine maidens and in heroic fashion consumes an ox, two dozen fowl, several wheels of cheese, and fifteen kegs of beer. Then the check comes and he’s short. The point here is that in life one is entitled to a side dish of either coleslaw or potato salad, and the choice must be made in terror, with the knowledge that not only is our time on earth limited but most kitchens close at ten.

The existential catastrophe for Schopenhauer was not so much eating as munching. Schopenhauer railed against the aimless nibbling of peanuts and potato chips while one engaged in other activities. Once munching has begun, Schopenhauer held, the human will cannot resist further munching, and the result is a universe with crumbs over everything. No less misguided was Kant, who proposed that we order lunch in such a manner that if everybody ordered the same thing the world would function in a moral way. The problem Kant didn’t foresee is that if everyone orders the same dish there will be squabbling in the kitchen over who gets the last branzino. “Order like you are ordering for every human being on earth,” Kant advises, but what if the man next to you doesn’t eat guacamole? In the end, of course, there are no moral foods—unless we count soft-boiled eggs.
To sum up: apart from my own Beyond Good and Evil Flapjacks and Will to Power Salad Dressing, of the truly great recipes that have changed Western ideas Hegel’s Chicken Pot Pie was the first to employ leftovers with meaningful political implications. Spinoza’s Stir-Fried Shrimp and Vegetables can be enjoyed by atheists and agnostics alike, while a little-known recipe of Hobbes’s for Barbecued Baby-Back Ribs remains an intellectual conundrum. The great thing about the Nietzsche Diet is that once the pounds are shed they stay off—which is not the case with Kant’s “Tractatus on Starches.”

Breakfast

Orange juice
2 strips of bacon
Profiteroles
Baked clams
Toast, herbal tea
The juice of the orange is the very being of the orange made manifest, and by this I mean its true nature, and that which gives it its “orangeness” and keeps it from tasting like, say, a poached salmon or grits. To the devout, the notion of anything but cereal for breakfast produces anxiety and dread, but with the death of God anything is permitted, and profiteroles and clams may be eaten at will, and even buffalo wings.

Lunch

1 bowl of spaghetti, with tomato and basil
White bread
Mashed potatoes
Sacher Torte
The powerful will always lunch on rich foods, well seasoned with heavy sauces, while the weak peck away at wheat germ and tofu, convinced that their suffering will earn them a reward in an afterlife where grilled lamb chops are all the rage. But if the afterlife is, as I assert, an eternal recurrence of this life, then the meek must dine in perpetuity on low carbs and broiled chicken with the skin removed.

Dinner

Steak or sausages
Hash-brown potatoes
Lobster thermidor
Ice cream with whipped cream or layer cake
This is a meal for the Superman. Let those who are riddled with angst over high triglycerides and trans fats eat to please their pastor or nutritionist, but the Superman knows that marbleized meat and creamy cheeses with rich desserts and, oh, yes, lots of fried stuff is what Dionysus would eat—if it weren’t for his reflux problem.

Aphorisms

Epistemology renders dieting moot. If nothing exists except in my mind, not only can I order anything; the service will be impeccable.

Man is the only creature who ever stiffs a waiter.

THE FORTNIGHT

Today marks the commencement of the fortnight of the All-England Championships at Wimbledon. This year David Beckham's boys will certainly divert some attention away from tennis, and the tournament got off to an inauspicious start in the rain. Nonetheless, it is a fine time to meditate on what makes England so wonderfully enjoyable. So order up a Pimm's... and reflect.

BUFFETT'S BILLIONS

A remarkable day. Warren Buffett has decided to donate the bulk of his $60 billion fortune to the Bill & Melinda Gates Foundation. How much money is $60 billion? "By comparison," the WSJ writes, "the United Nations and its agencies spend about $12 billion per year." The next-largest charitable foundation, the Ford Foundation, has an endowment less than one-fifth the size. The donation rivals any in history. Andrew Carnegie, one of America's greatest philanthropists, gave away around $7.6 billion in inflation-adjusted dollars, according to the WP.

In addition, according to the Fortune article—and this is a detail none of the papers pick up on, so far as TP can see—Buffett's agreement with Gates specifies that after two years, during which the foundation will "resize its operations," it will then be required "to annually spend the dollar amount of his contributions as well as those it is already making from its existing assets." As TP understands it, that means if Buffett gives the foundation $1.5 billion (as he will this year), it will have to dole out $1.5 billion. That is roughly the amount UNICEF spent in 2004.


In other words: Grant writers, start your engines. While the donation was generally applauded in the world of public health, where the Gates Foundation is a huge player, there is some worry about concentrating so much spending power in one organization. No one mentions—as a recent Financial Times profile of Gates did (registration required, but it's posted various other places online)—that there are some concerns about whether the foundation is distributing its already enormous resources wisely. There is a lot of touching detail about the relationship between Buffett, 75, and Gates, 50. The NYT says the two "have become extremely close business associates and confidants since they met in 1991," traveling together, advising one another, and "regularly playing online bridge games."

24.6.06

WORLD CUP


KINDLY NOTE HERE TO SEE HOW WELL THE U.S.A. IS SUPPORTING THE WORLD CUP. AND THEY SAID IT WOULD NEVER BE POPULAR!
http://www.eltiempolatino.com/2004-02-27Edition/Pictures/foto_deportes_futbol.jpg

Ms. ALBRIGHT

Breakfast with the FT:
Blast from the past

By Daniel Dombey


The first I see of the politician who was once the most powerful woman in the world is a slight figure walking up the steps of the Dorchester Hotel.

Madeleine Albright is returning from an interview with BBC radio. A small woman with whitish hair and a crisp blue blazer, she makes her way through the lobby. I linger behind, fiddling with tape recorders and notebooks.
That first glimpse of her is strangely poignant. Albright is a reminder of what now seems a more hopeful age, when solutions to some of the world’s most intractable problems appeared within reach.
During the eight years of the Clinton administration, when she served first as US ambassador to the United Nations and then as secretary of state, hopes ran high of a final deal in the Israeli-Palestinian conflict, a peace treaty with North Korea and US rapprochement with Iran.
In the end, none was achieved. Perhaps none was achievable. The Clinton years are now often remembered for the Monica Lewinsky affair and Albright for the forthrightness that led her to clash with Colin Powell, deride the Cuban government’s lack of “balls” and push for military action in Kosovo.
But, with Hamas in power in the Palestinian territories, Pyongyang boasting of its atomic weapons and tensions running high over Iran’s nuclear programme, things seem a good deal worse today.
When I finally make my way into the Dorchester Grill Room, she is slightly nonplussed by my request that she order breakfast. She doesn’t seem to have been informed of the FT’s strange interest in watching her eat. In fact, she must be rather confused as to what she is doing in the Grill Room at all, since she has already grabbed a bite before her radio interview.
This is all the more so since the Grill Room is a slightly weird place, with pastel-coloured murals of Scottish scenes and a menu that attempts to celebrate Britishness with dishes such as raspberry porridge and freshly baked stilton bread.
Despite describing herself as a “classic extrovert”, Albright sits rather stiffly on a huge throne-like sofa, her small frame dwarfed by the imperial red backrest that climbs half way up the wall. “I’m going to have to have coffee and water,” she says by way of beginning the conversation. “We’re on a totally different schedule but you have to have something delicious.”
After a little pleading from me, she agrees to play along and gamely saws some slivers off a couple of sausages, which she pronounces excellent. I manage half of a serving of scrambled eggs and salmon. Her assistant does marginally better than both of us, making solid progress on the clutter of patisseries on the table.
We talk about London, a city Albright first came to as a two-year-old when her family fled her native Prague in 1939. (They returned to Czechoslovakia after the war, but moved to the US after the communists took power.)
She says the second world war gave her a sense of “the importance of America’s involvement in international affairs and the goodness of American power”.
Ten years ago her parents were criticised in the press for failing to have told Albright that three of her grandparents were Jewish and had perished in Nazi camps. (She only learnt of their fate as she was taking office as secretary of state, after an investigation by The Washington Post.)
“The hardest through all of this was the allegations made about my parents, that drove me crazy,” she says. “They were the most loving, protective parents... People have no right to make judgments about people who they don’t know, who are not able to defend themselves and did nothing wrong.”
Albright’s father, Josef, is also a link with Condoleezza Rice, the current secretary of state, who was his favourite student when he taught international relations at the University of Denver.
In a passage hidden away in a footnote in her autobiography, Albright writes of her amazed reaction 18 years ago on learning that Rice was a Republican. “How could that be?” she asked the younger woman. “We had the same father.”
“I can’t say we have a relationship,” Albright says of Rice. “We have a bond through my father and I feel she has been very generous to talk about the importance that my father played in her life.”
But what does she make of how Rice is doing in her old job? “I think she probably wonders how things are going,” Albright says, alluding to Iraq. “Kosovo was a relatively short war, 78 days, and I worried every night as to whether we’d done the right thing... so I’m sure she worries about that.”
Albright herself seems to be affected by two frustrations. The first is that George W. Bush’s administration has in her eyes made the world a much worse place in which to live. The second is that she is not in power any more.
“We have damaged our reputation very badly,” she says. “Iraq may turn out to be the greatest disaster in American foreign policy, which by its very nature means that it is worse than Vietnam... It is in the middle of the most volatile area in the world.”
She has just published another book, The Mighty and the Almighty, about governments’ failure to take due account of religious feelings. Albright argues the USSR and the US underestimated the role of religion in places such as Afghanistan and Iran, and the world’s great powers still need to do much more to factor religion into their equations.
She is promoting the book at a hectic pace. The week after our breakfast she is scheduled to travel to Sweden, the Netherlands and Russia, returning to her Washington DC home only for a change of clothes.
“It’s taken me a very long time to develop a career and I’m not ready to give it up, so I do a lot of different things,” she says.
Indeed, she seems to have multiple careers. She works as a consultant and a professor at Georgetown University, and heads an anti-poverty commission and the National Democratic Institute for International Affairs.
Such hyperactivity is nothing new. When she was secretary of state she was so busy that her daughter had to manage her life, scolding her for travelling to flashpoints in the Balkans and for spending too much on shoes.
Would she like to serve in office again? Sure, says the 69-year-old. “I loved being secretary of state - there was nothing better I did in my life... But you don’t get to go around twice as secretary of state.”
She reaches over the table and investigates a muffin. “I definitely can’t eat this,” she says after a semi-bite. “This is chocolate... Can’t have chocolate. Given up chocolate.”
Might she want a croissant, instead?
“No,” she says, a hint of steel in her voice.
She brushes away my attempt to bring up the Lewinsky affair as firmly as she did the chocolate muffin.
“As far as his policies are concerned, every passing day Bill Clinton will look like what he really is - a great American president,” she says. “A great force of nature... Unbelievably smart.” She remembers how she would be irritated when he worked on crossword puzzles during briefings, and how afterwards he would show he had taken in every word.
But didn’t his legacy fall far short of his promise? In his last year in office, despite hectic negotiations, there was no historic deal on the Middle East or on Korea.
In fact, her new book notes that the deal Clinton proposed for sharing out Jerusalem was uncannily similar to one suggested by Richard the Lionheart in 1192. Maybe many more centuries will have to go by before the region finds a lasting peace.
“The mistake we made was that while [Palestinian leader Yassir] Arafat could certainly make decisions about the size of the Palestinian state because he was their elected leader, he was not in a position to make the decisions about the disposition of the holy places because he wasn’t the only one within the Arab world who had responsibility for that,” she says. “And when we started calling people about it, it was too late.”
In retrospect, Albright says, Clinton should have gone to North Korea rather than spend the last days of his presidency at Camp David trying to get a Middle East deal.
Another regret is Iran. Clinton had hoped he could establish an understanding with the relatively liberal government of Mohammad Khatami in the late 1990s and once sought to shake Khatami’s hand at the UN. But mutual distrust and limits on the Iranian president’s power prevented any breakthrough.
Now Iran is involved in a face-off with the world’s big powers over its nuclear programme. Albright is as nervous as anyone else, and asks me whether Jack Straw, lately Tony Blair’s foreign secretary, was sacked for opposing military action.
She is very glum about the state of the world. To cheer her up, I foolishly suggest that western and central Europe is in reasonable shape compared with the divisions of the cold war era. She snorts in disbelief, pointing to Poland’s rightwing government and the continent’s failure to come to terms with large-scale immigration from the south.
It is a slightly sour note on which to end the breakfast. Albright gets up to go. “Please don’t say that I chose this table,” she says, pointing to the preposterous red sofa that is too regal for her taste. I promise not to. And then the former secretary of state leaves, still energetic, still wistful for the days when she had a real seat of power.
The Dorchester Grill Room, London
1 x orange juice
1 x breakfast cereal
2 x pork sausages
1 x scrambled eggs with salmon
2 x white toast
2 x coffee
Total: £55.30

AMERICAN EMPIRE

The European Union is a new political contraption, and we Europeans have the same problem in identifying it as Maier has with the United States. Is it a federal state in the making? Is it a confederation plus? I suggest it is an experiment in a form of governing a group of countries for which we have not yet found a name. And the same, I suspect, is true of the position of the United States in the world. So, illuminating though it is, the attempt to fit the United States into historical patterns of empire is ultimately misguided. The United States is not in transition from hegemony to empire. The world is in transition to new forms of political organization, whose outlines can be dimly perceived, but whose frontiers cannot yet be fixed.


NYRB Review

Hot, Cold & Imperial

By Robert Skidelsky

1945: The War That Never Ended
by Gregor Dallas
Yale University Press, 739 pp., $40.00
Among Empires: American Ascendancy and Its Predecessors
by Charles S. Maier
Harvard University Press, 373 pp., $27.95

The question of how the world should be run, and America's part in its running, is the subject of much academic and political discussion in Washington these days. The factual questions are: Is the United States on the road to becoming an empire like the Roman and British Empires before it? What are the prospects for such an enterprise in today's world? More speculatively, does globalization require an imperial underpinning? There are also questions of value: Is imperialism a good or bad thing? Should the United States sacrifice its republican institutions in order to fulfil an imperial vocation?[1]
Gregor Dallas's 1945: The War That Never Ended can be read as setting the scene for this discussion. The Second World War cleared away the European empires, actual and aspiring, leaving the United States and the Soviet Union as the two contending superpowers. The collapse of the Soviet Union concluded the "unfinished business" of the war, by leaving the United States the sole superpower and simultaneously creating a single world economy. The dynamics of postwar US supremacy and the question of whether they are pushing the United States toward formal empire are the subject matter of Charles Maier's Among Empires: American Ascendancy and Its Predecessors.
1.
World War II, according to Gregor Dallas, never ended: it just stopped where the armies of East and West met, and almost immediately morphed into the cold war. This was because although the Soviet Union had achieved its war aim—an empire stretching from the Baltic to the Balkans—America had not achieved its aim, which, it will come as no surprise, was to convert the whole of Europe to democracy and free enterprise. The cold war started when Truman realized that "democracy" did not mean quite the same to Stalin as it did to the Americans.
That, in a nutshell, is the main argument of Dallas's discursive but fascinating book, made up of myriad fragments like a collage. Dallas justifies his method by quoting the Polish poet Czesław Miłosz: "You can only express things properly by details. When you've observed a detail, you must discover the detail of the detail." Nevertheless, underlying the book is an eminently sound proposition: that the war against Germany (Japan is scarcely mentioned) was simultaneously a struggle to control the post-Nazi future. Behind every military decision lay a political calculation. Indeed, Dallas's book is so much taken up with the jostling for postwar position that it sometimes loses sight of the fact that till 1945 a war was still being fought against Nazi Germany. But even in defeat, Hitler, too, influenced the shape of post-Nazi Europe, by his choice of where to fight, how hard to fight, whom to surrender to—and whom to kill. By the end, he preferred to see Germany conquered by Slavic communism rather than by the decadent democracies.
Dallas dates the turning point of the war to July 1943, with the German failure to push back the Russian salient at Kursk and with the Allied landings in Sicily. But Hitler may have realized that the war was lost—in the sense that he would not be able to impose his will on events by military force alone—as early as December 1941, following the disastrous German defeat before Moscow, one of the forgotten but decisive battles of the war.[2] Thereafter the most he expected from his armies was to achieve "temporary" victories to put him into a better bargaining position. He thought that the alliance between the West and the USSR would soon be torn asunder by conflicting interests, leaving him with room for a "political" solution. He was right to believe the Grand Alliance would break up, wrong to think it would happen before his own empire had been swept away. His aims, methods, and crimes had put him beyond the pale for the Western Allies.
Dallas suggests that Hitler would have had a better chance with Stalin. The main evidence for this is the Ribbentrop–Molotov, or Nazi–Soviet, Pact of August 23, 1939. By the terms of this pact, Hitler recognized half of Poland, the Baltic States, Finland, and Bessarabia as being in the Russian "sphere." Most historians have regarded the pact as a marriage of convenience: Hitler avoided the danger of a two-front war; Stalin bought time. Dallas accepts the argument for Hitler—Hitler's sights were always set on the conquest and settlement of European Russia—but not for Stalin. Stalin looked on the pact as a long-term arrangement because, to put it brutally, Hitler could give him what the Western democracies could not: reconstitution of the tsarist empire and further gains for the future. In November 1940, he and Hitler toyed with the idea of carving up the British Empire between them. But this was not Hitler's dream, and he was probably leading Stalin on in order to keep supplies flowing from Russia till he was ready to strike.[3]
Dallas claims that even after the Germans invaded Russia, Stalin never abandoned the "fantastic perspectives" opened up by the Nazi–Soviet Pact. "In the spring and early summer of 1943, Stalin's representatives in Stockholm attempted to negotiate a revived Pact; it failed because Hitler insisted on holding on to the Ukraine." It was the Soviet victory at Kursk in July 1943, not at Stalingrad in December 1942 (which was followed by a successful German counterattack), that finally convinced him that Hitler had no more to offer. "As the Soviet armies rolled forward, Stalin could nourish the dream of imposing on Europe a novel kind of Nazi–Soviet Pact—one minus the Nazis."
The strength of Dallas's hypothesis is that it helps explain the movement of Stalin's armies, and thus links the Nazi–Soviet Pact to the origins of the cold war. Stalin's postwar annexations, including Poland up to the "Curzon Line," closely followed the contours of the Nazi–Soviet Pact. His designs on the Balkans, and even on the Middle East (where Israel was originally conceived as a Soviet satellite), were foreshadowed in the November 1940 conversations between Molotov and Ribbentrop in Berlin. Stalin's ambitions were bound to break up the Grand Alliance. What Britain and France were not prepared to concede to Russia in 1939 as the price for an anti-Nazi pact, America and Britain were not prepared to concede as the price for continuing the Grand Alliance. The seeds of the cold war, in Dallas's view, were laid when Stalin insisted at Tehran in November 1943 that the terms of the Nazi–Soviet Pact still applied to Poland.
Dallas's thesis is not without its problems, though. Does one need the pact to explain the "movement" of Stalin's armies? Was Stalin not just helping himself to the spoils of war that fell into his lap? For in fact Stalin got much more than Hitler offered. The cold war may have started with the Soviet takeover of Poland, but it got going seriously only in 1948 with the Communist coup in Czechoslovakia, which had nothing to do with the pact. (It was Czechoslovakia rather than Poland that was regarded as the litmus test of Soviet intentions in 1948, as it had been of German intentions ten years earlier.)
One could argue that the "fantastic perspectives" that opened up to the Soviet Union between 1945 and 1948 had less to do with the pact than with the power vacuum across Western and Central Europe. The claim that Stalin had a preference for Hitler also seems overdone, in view of his earlier efforts to negotiate an anti-German alliance with Britain and France. Then again, if Stalin was so keen to have Hitler as a companion in world conquest, why did he send such a spectacularly sour envoy as Molotov to Berlin in November 1940? Furthermore, Dallas offers no evidence for his contention that Stalin's representatives tried to renegotiate the Ribbentrop–Molotov Pact in Stockholm in 1943. The truth is we will never know for sure what went on in Stalin's mind. Dallas offers a powerful provocation to thought rather than conclusions from evidence.
Roosevelt was never concerned about who should liberate whom, because he dreamed of a post-territorial condominium with "Uncle Joe," exercised through multilateral institutions like the IMF, the World Bank, and the UN. This can be counted as the most spectacular misjudgment in American history, aided and abetted by a network of Soviet spies in the Treasury and State Departments. Churchill, who was defending a territorial empire, evinced a much greater interest in frontiers: hence his attempt to limit Soviet expansionism by means of the "percentages" agreement with Stalin in October 1944. For the same reason, Roosevelt showed no interest in getting the British and American armies into Eastern Europe through Germany ahead of the Russians. This would have been quite feasible in the autumn of 1944, when Germany lay defenseless against Western assault. Nor did he back Churchill's plan of seizing Hungary and Austria by forcing the attack on the German lines in northern Italy. "A break through the Ljubljana Gap and a march into Austria," argued Harold Macmillan, Churchill's envoy in the Mediterranean, in his memoirs, "might have altered the whole political destinies of the Balkans and Eastern Europe."[4] But implementing such strategies would have required the Churchillian concept of the "balance of power," which had no place in Roosevelt's brave post-territorial world. And Churchill, the weakest of the Big Three, did not control Western policy.
Of the three victors, Britain's victory was the most equivocal. The Soviet Union gained an empire in Eastern Europe, the United States became the world's leading power, but Britain emerged too weak to hold on to the empire which it had been Churchill's object to preserve. Dallas emphasizes a point with which I am bound to agree since I have made the argument myself[5]—that Roosevelt's persistent war aim was "the ejection of the British Empire as a Great Power." Churchill in his geopolitics and Keynes in his economic policy fought as hard as they could to maintain independence from the Americans, but the shrunken assets Britain controlled by the end of the war were inadequate for the job.
Was there an alternative? Dallas reminds us that De Gaulle proposed an Anglo-French alliance on November 12, 1944. "Should England and France agree to act together..." he told Churchill, "they will wield enough power to prevent anything being done which they themselves have not accepted or decided. Our two countries will follow us. America and Russia, hampered by their rivalry, will be unable to counter it." Others would join the Anglo-French camp because of their "instinctive fear of giants." Churchill refused. "It is better to persuade the stronger than to go against them." Dallas tends to ascribe Churchill's choice of America over Europe to his rootlessness (abetted by his American mother) and his free-trade perspective, but who was the fantasist: Churchill with his hopes of encouraging the Americans into the paths of realism or De Gaulle with his dream of a bankrupt Europe as a third force?
2.
Dallas's parallel theme is the struggle that went on within Hitler's fortress— between collaborators and resisters, and between different groups of resisters. The collaborators hoped to find an acceptable place within Hitler's empire; the partisans and resisters fought over the post-Nazi future—as spearheads for the armies and vanguards of postwar governments. Which political tendency won out depended on the "movement of armies." Dallas reminds us of just how open the outcome was. In 1943 the "non-Communist resistance in Poland awaited the Western Allies to liberate them; the Communists in France stood ready for a Soviet liberation." Neither expectation was as absurd as it now seems. The Normandy invasion had not yet happened; and the German armies were still deep in Russia.
In all those parts of German-dominated Europe whose populations were not destined for slavery or liquidation, the Germans had allies and collaborators who, whether from conviction or perceived necessity, sought a privileged place in Hitler's New Order. For a time, at least, the Vichy regime in France enjoyed more legitimacy than the postwar puppet regimes set up by the Russians in Eastern Europe. It seemed to many in 1940 and 1941 that Germany had won the war. On that assumption, what was the alternative to making the best of a bad situation? Pierre Laval and Admiral Jean-François Darlan hoped to join with Germany in carving up Africa at Britain's expense. This was not entirely foolish: as we have seen, it was one of Hitler's options, though not his preferred one. Even on the murderous eastern front there were pro-Nazi forces inspired by ethnic anti-Russianism or anti-communism or anti-Semitism. Had Hitler been less brutal toward local populations he might have been received as a liberator over much of the Soviet Union, and not just in the Ukraine. But his racial theories left no room for Slav allies.
The different Resistance factions contended among themselves for control in order to establish their claim to postwar rule. Often they seemed keener to eliminate their rivals than fight the occupier. In France, the Communists tried to ensure that they, not De Gaulle, inherited liberated France. This might involve betraying Gaullist Resistance leaders, like Jean Moulin, to the Germans. The problem faced by De Gaulle—known as Ramrod, Dallas writes, "because he had all the rigidity of a poker without its occasional warmth"—was that he was the self-appointed leader in exile, whereas most of the internal Resistance, reacting to mass deportations of French workers to Germany, was run by the Communists. De Gaulle's other problem was that Roosevelt detested him. FDR was "at heart an ally of Vichy, thinking always that at any moment Vichy would switch sides and become a convenient client state of the Americans." For two years it was Churchill and Macmillan alone who upheld the claims of the prickly French general against American hostility. De Gaulle won the battle of legitimacy when his supporters gained control of the insurrection in Paris in August 1944 shortly before the Americans arrived, aided by the German military commander Dietrich von Choltitz, who ignored Hitler's order to "raze" the French capital. The price De Gaulle had to pay was to share his legitimacy with the Communists. Dallas rightly notes that De Gaulle's ambiguous relationship with the Communists at home and with the Soviets abroad continued for the rest of his life.
Paris was in another world from Warsaw. The cities were "two symbols of the closing months of the Second World War: Paris was liberated, Warsaw was annihilated." In the German-occupied part of Poland there was very little scope for collaboration with the Germans, since the Nazi plan was to make it a slave state. The murder of tens of thousands of Polish officers at Katyn and elsewhere in 1940 on Stalin's orders showed that Stalin had an almost equally grim fate in store for the Polish elite in his part of Poland. With Władysław Sikorski's death in an aircraft accident in 1943, Poland lost its De Gaulle, but even he would have been powerless to save Poland from the Soviets in face of the "movement of armies"—unless the London Polish government had received much stronger support from Britain and America. This would probably have required a showdown with Stalin—perhaps even a threat to cut off crucial supplies to Russia—which Churchill may have been ready for, but which Roosevelt was not.
Without determined Western backing, the Warsaw Uprising started by the non-Communist "Home Army" on August 1, 1944, was destined for catastrophe. Its motive was to seize Warsaw before the arrival of the Red Army, so that the Home Army, as the city's rightful owner, could greet the arriving Soviets. It partly succeeded in its initial aim, but the Red Army remained encamped on the right bank of the Vistula and never came, leaving the Germans time to destroy the Home Army and most of Warsaw in two months of savage fighting. Stalin even denied landing rights to the RAF and USAF to fly in supplies.
Historical debate has centered on the reason for Stalin's decision to halt his armies. Was it because they were exhausted? Did he want to give the Germans time to eliminate the anti-Communist resistance? Or was he concerned about a German counter-attack? Dallas has another explanation: Stalin stalled the advance into Germany, and diverted part of his forces to the Balkans, in order to "reestablish the gains he had made and expected out of the Nazi–Soviet Pact of 1939...." Seized by a single idea, Dallas explains too much by it.[6]
3.
Of the 18 million Nazi victims in Europe, 11 million, including millions of Jews, died in Poland. The final section of Dallas's book reminds us that Hitler and Stalin were not "normal" statesmen pursuing "realpolitik" goals, but mass murderers who aspired to reshape the societies they controlled by transporting or liquidating entire populations. Dallas's special provocation is to argue (as Solzhenitsyn has done) that in the matter of forced labor and extermination Hitler was a pupil of Stalin, who only "caught up" with his master in the war, and then in the special circumstances of "total" war. "Russian and German camps breathed death into one another, like the winds of the plain, in an alternating cycle." In Poland and Russia, the "Nazis moved into former Soviet camps; the Soviets then took over, and used the old Nazi camps. Worse still, some of these camps would have the same inmates and the same administrators. This is the story that has so far not been told."
Slave labor was started by Stalin: there were already millions in huge industrial camps like Kolyma in 1939 when there were only 21,000 in German concentration camps. During the war Hitler copied Stalin by rounding up millions of foreign workers to produce munitions and food in Germany, a disastrous policy that drove young men in the occupied territories into the Resistance. The policy was the logical consequence of refusing to allow the mobilization of German women, Hitler believing that "our long-legged, slender German women" were unsuitable for factory work.[7] As in the Soviet Gulag, many of these workers died from brutality, malnutrition, and disease.
Dallas follows Solzhenitsyn in denying the uniqueness of the Holocaust. The classic distinction between stigmatizing a race (which could not change its characteristics) and a class (which could be "reeducated") breaks down with Stalin. He too "dumped whole nations down the sewer pipes," wrote Solzhenitsyn. Stalin deported the nations who he thought had collaborated, or might collaborate, with the Germans—Georgians, Chechens, Ingushi, Kalmuks—straining the Russian transport system just as the deportation of Jews to the East strained the Nazi transport system. The reasoning was the same in both cases: their ethnic characteristics made the victims actual or potential enemies of the regime. In 1941, Hitler wavered between deporting and exterminating the Jews. He had been considering evacuating all Jews first to Madagascar and then east of the Urals. It was "the loss of any chance for control of these lands...[which] pushed the Nazis towards...the 'Final Solution.'"
In another twist to the story, Dallas argues that the "event decisive for the fate of the Jews" was initiated not by Hitler but by Stalin when he deported the Volga Germans to Siberia in September 1941. Alfred Rosenberg, the Nazi minister for the eastern territories, told Hitler that virtually none would survive. "It seems that it was between late September and October 1941 that Hitler, not a forgiving man, decided to exterminate the Jews of Europe in return." Thus the two regimes' policies were linked in a murderous tit for tat. The acceleration of Hitler's extermination program in 1942 was a reaction to a war that was being lost. After the defeat in front of Moscow, Dallas argues, Hitler "was obliged to imagine ways in which his Nazi ideology could survive.... The Jews, all the Jews, would have to be murdered while he still had control, before the war was ended."
According to Dallas, the main difference between the Nazis and the Soviets was that "the Nazis had specific aims whereas the Communists fired in every direction." The Nazis were the more determined killers, but their targets were much more limited. "The Russian Gulag penetrated every aspect of Soviet society," whereas most Germans "had nothing to fear from the Nazis." It was the Soviet Union, not Hitler's Germany, that was "in strictest terms the totalitarian state."
Different interpretations are possible of the origins of the Holocaust, and Dallas's is entirely plausible. Its great strength is its insistence that this appalling tragedy was not predetermined. His account raises large questions about what other nations might have done to prevent the genocide of the Jews. The most uncomfortable question of all is: Would it have happened at all had Britain and France conceded Danzig to Hitler?
The end of the fighting left a huge population of displaced persons: the survivors of the camps, German populations fleeing from the Red Army, non-Germans who had collaborated with the Germans, voluntarily or involuntarily. At Yalta the Big Three decided that the "scum of Europe," as Koestler called them, would be repatriated to their "home countries," even against their will. This was a parody of "national self-determination." Jews suffered most: for months survivors of the Nazi concentration camps stayed in them, many of them dying of typhus and dysentery. The Cossacks, Caucasians, Muslims, Christians, Ukrainians, even Poles who had fought or worked for the Germans were deported to the Gulag. The most infamous episode was the consignment of 25,000 Cossacks and thousands of Yugoslav "Chetniks" from British-occupied Austria to the sanguinary attentions of the Red Army and Tito's partisans. The uprooting and murder of peoples went on after the movement of armies had stopped.
When, then, did World War II eventually end? It was only in the years after 1989 that the Nazi–Soviet Pact was finally liquidated. Post-Communist Russia has lost all the imperial gains it made under the pact: eastern Poland (now Belarus), the Baltic States, Bessarabia (now Moldova), plus their surrounding circle of Eastern European satellites have all become independent states. Even some of Stalin's original dominions, inherited from Imperial Russia, have gone—Ukraine, Georgia, Armenia, Azerbaijan, Kazakhstan, Kyrgyzstan, Tajikistan, Uzbekistan, and Turkmenistan. Russia's influence in the Middle East is virtually extinguished. NATO, said Solzhenitsyn in a recent interview with Moskovskiye Novosti, "is methodically developing its military deployment in Eastern Europe and on Russia's southern flank."[8] The United States is embarked on a revised version of FDR's mission to spread democracy and free markets around the world. It would take a rash person to say that frontiers in all these places are finally fixed, though it is far from clear where they will be fixed, or whether fixing them will make that much difference.
4.
The US is the only belligerent discussed by Dallas without territorial ambitions. Germany, Italy, and Japan were trying to acquire empires, Russia to restore the Russian Empire, Britain to retain its empire. (China is barely mentioned.) America aspired to be a post-territorial "empire of liberty," not a territorial dominion imposed by force. The US was only compelled to establish "frontiers" in Europe and Asia in 1947 and 1949, something FDR had disdained, because of the collapse of his delusion of a US–Soviet condominium. In the postwar world, as Professor Charles Maier of Harvard University describes it in Among Empires: American Ascendancy and Its Predecessors, the US had "far-flung, but real, frontiers." Woodrow Wilson and FDR had dreamed of American leadership not based on territory; rivalry with the Soviet Union forced the United States to construct a territorial and post-territorial domain simultaneously.
Maier's book has much to say about how this construction took place after 1945. In this sense it follows from where Dallas left off. The US compromise between traditional empire and a Kantian comity of democratic republics was to establish American "hegemony" over the "free world," backed by military commitments and military bases, and underpinned by nuclear weapons and Ford assembly-line technology. Maier distinguishes between the "empire of production" and the "empire of consumption." In the first phase, the American productive system was transferred to its allies through Marshall Aid and other aid packages; Phase II's "empire of consumption" was based on the dominance of the dollar, and culminated in the "twin deficits" of today—the budget deficit and the balance of payments deficit.
Maier contrasts the ways in which Britain and the United States financed their world domination. He shows how the adventurism of the Kennedy- Khrushchev period, which culminated in the Cuban missile crisis and in the Vietnam debacle, gave way to the Nixon-Kissinger-Brezhnev efforts to stabilize frontiers between the rival systems, and how this failed attempt to "adjourn...the cold war" was followed by a new "forward movement" by the Soviet Union in Afghanistan and by the Carter doctrine of human rights. Of the Nixon-Kissinger design for imperial multipolarity—in which the American superpower would share the world with China and the USSR—he writes: "Not since Hitler had offered Molotov the domination of South and Central Asia in November 1940 was such a fundamental world political order presented as a grand bargain to international rivals."
Although there are some interesting ideas here, this is the less satisfactory part of Maier's book, since the history of the cold war is well-trodden ground. It shows signs of being hastily written, and is full of irritating small mistakes. It has not escaped the blight of the academic field called political economy—vague treatment of fundamental concepts, and of the linkages between economics and politics.
More thought-provoking is the first part of Maier's book, in which he inquires about the meaning of the word "empire" and to what extent the US position in the world resembles past empires like the Roman and British Empires. His method is to identify "recurring themes" in imperial history and ask how far the US experience fits them. The inquiry has become especially relevant, because an academic consensus is developing that by its military interventions in Afghanistan and Iraq, its establishment for the first time of military bases in Eastern Europe and Central Asia, and its threat of an attack on Iran, the United States has moved beyond the forms of "primacy," "hegemony," "leadership," or "ascendancy," by which its role has until recently been described, and aspires to reach a new stage in which the term "empire" might apply. The historian Niall Ferguson has called America an "empire in denial"[9]; Maier suggests it might be an empire in the making.
Maier wants to reserve the term "empire" for a "territorially extensive structure of rule," which subordinates "diverse ethnolinguistic groups" and reserves preponderant power to an executive authority and its elites. On this definition, the United States is not and never has been an empire, because it has not sought formal sovereignty over foreign territory. (Domestic expansion in North America doesn't qualify, in Maier's view, because the Native Americans were semi-nomads; the Philippines was an exception.) However, Maier finds it difficult to use the word consistently, talking about the US as an "empire of consumption." In the end, the best he can do is to say America has some characteristics of an empire and not others.
Take the way the United States exerts power. Subordinate rulers abroad defer to the United States. Washington, too, is an "imperial" capital, attracting academic and other elites who want to be near the center of power. However, unlike in the case of traditional empires, these arrangements are voluntary, resting on shared values, and were created by a common perception of an external threat by the Soviet Union: if this is empire, it is "empire by invitation."
Empires had emperors. The title emperor was connected to military rule. The emperor personifies rule, has an intimate relationship with military resources, possesses (or claims) a moral grandeur, and, unlike a monarch, is not necessarily a hereditary dynast. In Maier's view, Rome remains the most convincing model for discussing the United States because foreign conquest changed it from a republic to an empire. It retained eviscerated republican institutions like the Senate, but power shifted to the emperor, and voting became plebiscitary. According to this view, the US is not yet an empire because its domestic politics haven't yet become Bonapartist. But perhaps it is on the way. There has been a slippage of power from the legislature to the executive, from open discussion to expert control, and from the politics of political parties to the politics of religious and other groups. According to the Bush doctrine of the "unitary executive," the president as commander in chief has supreme power and does not have to be accountable to Congress for its exercise.
America does not yet have emperors, though it is possible to think of its presidents as elected emperors, with dynastic elements. Since World War II, considerations of "national security" have increasingly subverted civilian institutions, even though the one genuine US proconsul, General Douglas MacArthur, was put in his place by President Truman. However, the official and popular ideology of the United States is anti-imperial, and for that reason alone America is unlikely to complete the classical transition from republic to empire.
Another recurring theme of empire is the psychological satisfactions it provides: heroism, glory, valor, honor, opportunity of service for elite groups, vicarious identification for the masses. It has been seen as an antidote to decadence.[10] Maier has little to say by way of moral evaluation of empire. He writes, that is, as a political scientist or sociologist, not as a political philosopher. He does not consider the role of ideas as influences on forms of rule. This results in a defective discussion of reasons for empire and of imperial collapse. Empires, as Thucydides realized long ago, arise from a belief in the right to rule, and collapse when that belief wanes. To be sure, there is a strong ideological element in the current US drive for empire, especially among neoconservatives in the academy and Washington think tanks. It is based on the belief that the West is best, and will only be secure if the Western way becomes the universal norm. Those who resist the embrace of the West are thought to be savages and must be persuaded, or forced, to recognize the error of their ways. This is classic European imperial-speak, and it is heard in Washington today. However, the doctrine of Western superiority has not yet crystallized into an overt imperial ideology. It lacks the nineteenth-century, as well as the Nazi, ingredient of racism, without which it is difficult to justify rule without consent, though the Soviets managed it for a time.
Maier discusses whether empires by definition exploit their subjects, concluding sensibly that all theories of exploitation make "unresolvable normative claims." Leaving these aside, Maier raises some factual questions which can in principle be resolved: Do the costs of empire outweigh the benefits to the imperial power? Can these costs be reclaimed from imperial subjects through taxation? Which groups gain and lose through the imperial connection? Maier tends to support the liberal view that empires, with all their military and other costs, are a net drain on the imperial power, but that political and business elites, and special interests, both in the imperial center and at the peripheries, may gain at the expense of those with lower incomes.
This seems to fit recent US experience (for example, in relation to Latin America), but is unlikely to sustain an imperial project in the absence of popular support. Maier perceptively notes the reluctance of liberals to admit the connection between markets and empire. In economics as well as in psychology, they tend to view the satisfactions and rewards of empire as residues of past conditions rather than as part of the workings of markets. Thus, Maier writes, today's market model of globalization hides the role of US multinationals in spreading "imperial employment patterns" through offshore production. To the extent that empires were always a contest for control of resources, the current American adventure in the oil-rich Middle East fits the traditional imperial logic.
The most interesting discussion in Maier's book concerns the role of borders and of violence in the imperial experience. Empires, like states (at least in the Western tradition), are defined by having fixed borders; but the fixing of borders by conquest, and the maintenance of borders thus fixed, is a source of recurring violence "on the frontier," which affects domestic politics. Imperial borders, Maier argues, are inherently contestable, since they do not rest on consent. So empires, unlike nations, are unstable structures. His main point, though, is that the attempt to "fix" borders marks a retreat from claims to universal sovereignty, a recognition that the imperial writ can be made to run only so far. The classic example is Augustus' decision to limit the Roman Empire at the Rhine following the loss of Varus' legions beyond it in 9 AD. "The utopia of the United States," writes Maier, "has been of a system of free worldwide transactions.... When the utopia is punctured, the logic of territory reasserts itself." Just as the consolidation of the Soviet Empire in Eastern Europe forced the US to limit its claims after World War II, so the emergence of terrorism has forced the US to shift from its post-territorial utopia to territorial defense in the post-Communist world.
Thus the fundamental contradiction at the heart of empires is that they promise peace but beget war. Their claim to be benevolent institutions is subverted by continuous conflict on frontiers and the revolt of subject peoples. Maier recalls England's butchery of the native Irish population in the sixteenth century. With their mania for classification, empires also hardened ethnicities, identifications, and divisions: what are called "ancient hatreds" usually turn out to be products of colonial policies.
Maier is agnostic on the most popular current defense of empire as an agent of globalization, whatever its historical validity. The argument is that globalization requires conditions of peace and security that only empires can bring about. But advocates of empire forget that the last age of globalization, which was also the age of imperialism, ended in World War I. The truth is that no imperial substitute for multilateral institutions and rules will be accepted in a globalizing world that is also becoming more pluralist. There is no alternative but to advance at the pace which the slowest Great Power finds acceptable.
6.
So where is America headed? The idea that the US is an "empire of invitation" rather than conquest will be harder to sustain in the future in the absence of the Soviet threat. Clearly the United States is not in Iraq by invitation. Some line seems to have been crossed and, as in Vietnam, the US either will have to make its new frontier effective—which in Iraq it is manifestly not, with sectarian civil war raging almost unchecked—or get out. In any case, the notion of an "empire of invitation" was always partly a fiction: the United States was not "invited" into Germany and Japan in 1945; it conquered them, and they have been "garrisoned" by American troops ever since. One could say that since World War II both countries have been somewhere between being independent and being client states: and the same is true for the European Union as a whole, whose leaders and populations lack the will and cohesion to break out of their US-protected cage.
The main conclusion which emerges from Maier's study, though it does not seem to me that he spells it out explicitly, is that between the two poles of "empire" and "independence" there are a large number of intermediate positions which exhibit different mixtures of independence and subordination. It is the fiction that there are only two alternatives—a fiction which is the joint product of Wilsonian idealism and anti-colonialism—which causes most of the current confusion. Any exertion of power by the strong is called "imperialist" by its opponents, while the imperialist has to pretend that his actions are fully consistent with national independence.
Yet while this disguise may offend simple souls who crave sharp contrasts, it may also be a sign of progress. There is some evidence that forms of rule have been growing softer, more subtle, and more humane; being less transparent, they are harder to define. Despite the mass killings and other atrocities that still disfigure parts of the world, the systematic "imperial" brutality of Hitler or Stalin which Dallas documents is past history. They tortured and killed millions; now a relatively small number of violent deaths, or of "human rights" abuses attributable to imperial efforts, arouses universal condemnation—partly, but not wholly, because of the difficulty of keeping violence off the airwaves.
The European Union is a new political contraption, and we Europeans have the same problem in identifying it as Maier has with the United States. Is it a federal state in the making? Is it a confederation plus? I suggest it is an experiment in a form of governing a group of countries for which we have not yet found a name. And the same, I suspect, is true of the position of the United States in the world. So, illuminating though it is, the attempt to fit the United States into historical patterns of empire is ultimately misguided. The United States is not in transition from hegemony to empire. The world is in transition to new forms of political organization, whose outlines can be dimly perceived, but whose frontiers cannot yet be fixed.
Notes
[1] The latest offering is Harold James's brilliant essay, The Roman Predicament: How the Rules of International Order Create the Politics of Empire (Princeton University Press, 2006). James, whose translation from Cambridge, England, to Princeton, USA, may be viewed as an example of the imperial job market in action, quotes the bon mot once applied to the British Empire: "Britannia waives the rules in order to rule the waves."
[2] On this see Rodric Braithwaite's excellent Moscow 1941: A City and Its People at War (Knopf, 2006).
[3] This interpretation has been challenged by John Lukacs, in June 1941: Hitler and Stalin (Yale University Press, 2006), pp. 23–24. Lukacs depicts Hitler as wavering in the autumn of 1940 between the attractions of a "peripheral" (German, Italian, French, Spanish) anti-British coalition which would include the Soviet Union and attacking the Soviet Union. It was Molotov's demand for military bases in Finland and Bulgaria when he came to Berlin in November 1940 which convinced Hitler to give the go-ahead to "Barbarossa."
[4] H. Macmillan, The Blast of War 1939–1945 (Macmillan, 1967), pp. 510–511, quoted in Dallas, 1945: The War That Never Ended, p. 442. For a different view, see Theodore Draper, "Eisenhower's War—II," The New York Review, October 9, 1986.
[5] In John Maynard Keynes: Fighting for Freedom 1937–1946 (Viking, 2001), pp. xiii–xv.
[6] Norman Davies, in Rising '44: The Battle for Warsaw (Viking, 2003), p. 272, agrees that "Soviet policy was ruthless, inhumane, and coldly calculating," but suggests that "their hesitations may have been inspired as much by disorientation as by deliberate policy."
[7] Quoted in Joachim Fest, Speer: The Final Verdict (Harcourt, 2002), p. 155.
[8] Quoted in William Pfaff, "Solzhenitsyn's Righteous Outrage," International Herald Tribune, May 4, 2006.
[9] Niall Ferguson, Colossus (Penguin, 2004), p. 6.
[10] This was one of the main themes of Leo Strauss, guru of the "neocons." See Edward Skidelsky, "No More Heroes," Prospect, March 2006, pp. 34–38.

NIGHY (& KNIGHTLEY TOO)

Have you seen The Girl in the Cafe? It is a wonderful film, if only to hear Kelly Macdonald ask Bill Nighy to "risk it" and to have Nighy muse about giving the Rolling Stones a bit of a musical lift.

Anyway, for you Nighy fans:

Bill Nighy is behaving like a child, rushing around the Royal Suite of the Lanesborough Hotel opening doors, looking in cupboards and wondering whether he's allowed to jump on the bed.
When he finds the bathroom, he is in heaven examining all the toiletries. "You take this," he says, handing me a bar of tissue-wrapped soap. "My wife has banned me from bringing any more home."
The north London house he shares with actress Diana Quick must be littered with bars of soap because Nighy spends half his life in hotels, whether staying in them during filming or occupying a room - as we are now - for an interview. It's one of the consequences of being in constant demand, which he has been ever since he stole the show as an ageing rocker in Love Actually in 2003.
This summer, his admirers will be able to enjoy him in two very different new roles. Next month, Stormbreaker will open with Nighy as a grey man in a grey suit from MI6. "My name is Blunt, Alan Blunt," he says, assuming a stern expression and dodging into character for a few seconds to discuss the junior version of James Bond, based on Anthony Horowitz's novels about a 14-year-old spy.
Meanwhile, in two weeks, he can be seen with Johnny Depp in Pirates of the Caribbean: Dead Man's Chest. "It's an odd engagement," he says in his deadpan way, "because I appear as a computer generated sea-creature, half crab and half squid. Whether this will further my career or not is a moot point."
Frankly, his career doesn't need much furthering. One "good gig", as he calls them, seems to follow another. He has made the shift from being an actor whose name used to spring into casting directors' minds whenever the words "shabby, lanky, shallow or run to seed" appeared in a script. Now he is known as someone who can inject a little humour into even a solemn situation.
He is even recognised by doormen in the US. " 'I like your choices,' one of them said the other day," he recounts with glee. "Now that is Big Information." (More than once Nighy seems to speak in capitals.)
What it means is that Nighy has arrived. He is box office. But he doesn't brag about it like that. Instead, he adopts what he calls a "craven light entertainment pose" and makes fun of the fact that at the age of 56, with his skinny body and tight-lipped face, he has become a name who can carry a film and even get the girl.

"The degree of attention I've been getting of late is slightly unnerving, but I'm not complaining," he says, smoothing the sleeves of his immaculate Alfred Dunhill suit and giving a quizzical look over his glasses. "There was a period when Nelly - my dog - and I couldn't go for a walk without ending up in a newspaper.
"Last Christmas, three different people wanted me to do a single [he hit number one with Christmas Is All Around from Love Actually]. And for a time there was a part of the house I couldn't go to any more because so many scripts were coming through the door and I thought I had to read them. Then I realised that most of them get sent to whoever next pops their head above the parapet."
His way of coping - and being charming - is not to take it all too seriously. It could be a pose, but it probably comes naturally to a career neurotic who, as we sit at the desk in the hotel suite, is constantly straightening pen, paper and glasses. He is, he admits, obsessive about neatness, an interesting trait for someone who describes himself as a "mess", and dislikes talking about the wild rock-and-roll days of his youthful excesses.
"There was nothing wild about it at all," he says, for once without a trace of humour. "The neatness absolutely goes with the territory. I don't know why. I'm not a shrink. As a kid I always had this neatness thing, but I've learnt, if I go into someone else's messy room, to take that bit out of my brain and think: 'These people are beyond help.' "
There seems to be a constant dialogue inside his head between Nighy the nutcase and a more sensible voice telling him to calm down. Being aware of this puts him in the position of wry observer who can see the comic potential in every situation, whether the self-righteousness of the messy or the bizarre nature of late-onset fame.
"A Big Change has been that people now start to accommodate me in terms of dates. It used to be that you'd spend months doing nothing then get three gigs at once and have to choose. Now they plan the filming around me. Rushing from one set to another, I find I'm looking for a prop when rehearsing and thinking; 'Not that. Not glasses today' or, 'No, don't talk posh. That's Thursday.'
"I've even had producers asking me 'What play would you like to do?' That is Big News. I think, 'Blimey,' and my mind goes blank."
I find myself in helpless giggles. What he says is often funny, but his delivery is even better. And yet he only became a comic actor by accident. In fact, after fluffing his O-levels at school in Caterham, Nighy - son of a garage owner and a nurse - set out to become a Great Writer in Paris (the habit of thinking in capital letters is catching).
He got only as far as the first line. A girlfriend suggested to this confused youth that he should go to drama school instead, and there he found the talent that he has spent the past 35 years honing.
"I've never quite recovered from standing on stage and everyone laughing," he says. "I am endlessly interested in why certain pauses or pronunciations are funnier than others, how to get louder and longer laughs. Did you know that, if you stress the last consonant of the final word, everyone in an audience laughs at the same time but if you don't, they laugh in pockets because nobody receives the word at the same time? Things like that are very satisfying. It's not world peace, but it makes us useful to have around."
Being funny is so important to him that he has not only "retired" from Shakespeare - he blames the tights; he likes to act in lounge suits - he has also retired from doing plays that don't contain jokes. "I think it's vulgar to ask people to sit in the dark for hours and not tell them a joke," he declares. "I don't mind other people doing plays like King Lear or Hedda Gabler, but I do think information travels well in joke form."
The success, the laughter, as well as Diana Quick, his stalwart not-quite-wife of many years and their daughter Mary Nighy, a budding actress - all these have helped to transform him from a highly strung young man who used to go on benders into a highly-strung middle-aged man who remains sober, but still doesn't seem to have any fundamental confidence.
"I have problems in that area," he says, shifting as he often does from the first person singular to the more universal "you". "Some days you can get up and you're capable of certain things and other days you aren't. I have average difficulty persuading myself I can do my job, as it is one of those jobs which is quite hard to pull off on occasions.
"People, quite understandably, imagine that actors have a different response to standing up in front of hundreds of strangers and being the only one allowed to speak. In fact, you have the same response as anybody else: how did this happen? Why am I here? Please God let me go now. Can I go home?
"I used to think I was in trouble because I couldn't rely on having confidence on any given day. But the Big News is that I've discovered it doesn't matter. You can just go to work without any confidence. No one knows what's going on in your mind. You can stand there paranoid and anxious but they don't know, so you just say the lines anyway."
And get the laughs and the good reviews. Agonising though he may find day-to-day life, his nervous disposition does mean that he is grateful for every shred of success, every cheque that allows him to buy smart clothes for himself and his family, or indulge his habit of sticking his hand out to hail a taxi whenever he sees an orange light. (Predictably, he doesn't drive.)
He is not even blasé about the hotels in which he spends so much time. "I'm obsessively keen on them. I like the fact that the environment is controlled. I like the stationery. They change the sheets every day and bring you food. So all I have to do is work. I read, I put my iPod onto Dylan shuffle or Stones shuffle and I'm pretty happy."
Now, whenever I am in the bath, I look at the piece of soap he encouraged me to nick and feel glad that some people need to make others laugh.

INTELLECTUAL PROPERTY

The problem of intellectual property in the digital age is enormous. Along with the multifarious technical problems, however, there is a problem of mindset. The typical internet user is just not mentally attuned to 'content' that is not freely available -- or not freely usable without proper attribution or compensation.

Many of the new generation of students raised on the internet see nothing wrong with copying other people's work, says Professor Sally Brown.

Prof Brown, of Leeds Metropolitan University, will tell an international conference that the net has made copying and pasting too easy.

She suggests personalising assignments would make plagiarism difficult.

Prof Brown, pro vice-chancellor for assessment, learning and teaching at Leeds Metropolitan, will be speaking in Gateshead this week.

Net wise

Wikipedia is a giant, free online encyclopaedia to which anyone can contribute.
Google is the most popular net search engine. It has an index of billions of web pages.
In her presentation for the conference, she says students do not necessarily see anything wrong with copying other people's work.

She says they say things like "if they are stupid enough to give us three assignments with the same deadline, what can they expect?" and "I just couldn't say it better myself".
Some do not understand the rules, while others know there are rules but get them wrong, she says.

Widespread problem

They might have poor academic practices - not keeping good records of where the material they were using came from, for example, she adds.
"They are post-modern, eclectic, Google-generationists, Wikipediasts, who don't necessarily recognise the concepts of authorships/ownerships."
Research indicates that plagiarism - whether done as deliberate cheating or not - is widespread in UK universities.
And all universities have a problem with it, Prof Brown says.
"The ones that say they haven't got a problem have got their heads in the sand."
She outlines four basic strategies for tackling plagiarism:
try to deter and punish
make the penalties known and try to educate the students on the issue
try to "design it out" - her preferred option - for example by setting assignments that required personal knowledge or keeping a diary or showing work in progress
change the culture in which students are working - the hardest option
'Bullying tactics'
A big issue - but one hard to tackle - was students who deliberately paid someone else to produce work for them, Prof Brown says.
She says she has even heard of students bullying other people into doing their work.
But group working could also present problems.
"There is a very wobbly line between collusion, co-operation and cheating," Prof Brown says.
"And students don't know where the wobbly, fuzzy lines are."
Increasingly, software is being used to detect work that is similar to other people's.
But Prof Brown has this warning: "The good plagiarists aren't caught."
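For the curious, here is a minimal sketch, in Python, of how such similarity-detection software can work in principle: split each submission into overlapping word sequences ("shingles") and score the overlap against a source text. The function names, the five-word window, and the example sentences are my own illustrative choices, not the workings of any particular product.

# A sketch of one common approach: compare the sets of overlapping
# word n-grams ("shingles") in two documents. Window size and names
# here are illustrative assumptions, not any vendor's actual method.

def shingles(text, n=5):
    """Return the set of overlapping n-word sequences in the text."""
    words = text.lower().split()
    return {" ".join(words[i:i + n]) for i in range(len(words) - n + 1)}

def similarity(a, b, n=5):
    """Jaccard similarity of the two documents' shingle sets, 0.0 to 1.0."""
    sa, sb = shingles(a, n), shingles(b, n)
    if not sa or not sb:
        return 0.0
    return len(sa & sb) / len(sa | sb)

submitted = "the quick brown fox jumps over the lazy dog near the river"
source = "the quick brown fox jumps over the lazy dog beside the stream"
print(round(similarity(submitted, source), 2))  # a high score suggests copied passages

As Prof Brown's warning implies, surface matching of this kind is exactly what a "good plagiarist" defeats by rewording every sentence.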

20.6.06


Answering my own question, YES! Ready as always, on 21 June, to admire the mysterious celestial geometry of our ancestors. What will we leave to posterity that will be so evocative? Perhaps nothing more than Carl Sagan's SETI message.

THE (NOT SO) FORBIDDEN CITY

The type of technology that allows gamers to interact online will soon merge real and virtual space for people visiting China's Palace Museum and those who cannot travel to the actual site.
IBM announced "The Forbidden City: Beyond Space and Time," a philanthropic cultural heritage initiative Friday that will enhance real tours of the 800-building complex, while also bringing the city to a global audience. In partnership with The Palace Museum, Big Blue is creating a fully immersive, interactive, three-dimensional online environment that corresponds to the grounds and cultural artifacts of such places as the majestic Hall of Supreme Harmony.
Online visitors will be able to talk with each other and tour guides or tour the grounds anonymously and go back in time. Actual visitors will also benefit from the technology, through projectors that will put their experience in cultural and historical context and help guide them through the massive site. They also may be able to record their experience, which can later be enhanced and relayed to friends and relatives.
"Unlike the real Forbidden City, this will allow you to view different time periods," John Tolva, Program Manager for Cultural Strategy and Programs for IBM, said during an interview Friday. "One of the questions we're dealing with now is: What happens if you and I are in that space and all of a sudden I jump back 200 years? Will you come with me? There are some real interesting challenges, but they're the fun kind."
The project begins immediately and is scheduled for unveiling in 2008. It will be presented in Mandarin and English, and developers are considering a translation service so visitors can overcome language barriers to connect with each other. "The architecture and history at the Palace Museum create a unique sense of culture that has no equal. It is this unique spatial experience of Chinese culture that 'Beyond Space and Time' seeks to bring to the world," Henry Chow, general manager, IBM Greater China Group, said in a prepared statement. "To create this experience requires an innovative approach and IBM is set to use the latest technology to tell the stories of Chinese culture through artifacts, people and places."
Chow and Xin Miao Zheng, China's vice minister of Culture and head of The Palace Museum, participated in a signing ceremony to begin work officially on the project, which is expected to benefit China in many ways. Not only will the project provide a showcase for cultural pride and a potential tourism boost during the year the country hosts the Olympics, it will also present training opportunities with cutting-edge technology.
"The Forbidden City: Beyond Space and Time" is the next significant step in a series of major IBM initiatives to demonstrate the powerful role technology plays in bringing the arts and culture to people all over the world.
In 1998, IBM unveiled the results of its partnership with the Hermitage State Museum in St. Petersburg, Russia, which, for the first time, captured some of the world's most beautiful masterpieces online.
Two years ago, IBM announced the results of its Eternal Egypt project, a breakthrough collaboration with the Egyptian government to provide global access to more than 5,000 years of Egyptian history.
Tolva, who studied English before earning a degree in information design and technology at Georgia Tech, has worked on all three projects. He said he and the members of his team establish dimensions by taking direct measurements from models, by analyzing photographs, or both. Then they use simple shapes to build more complex ones and apply textures.
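As a rough illustration of that workflow (measured primitives assembled into more complex shapes, then textured), here is a minimal sketch in Python; the classes, dimensions, and texture name are invented for the example and bear no relation to IBM's actual tools.

# A toy version of the pipeline Tolva describes: primitives sized from
# measurements or photographs, assembled into a compound model, textured.
from dataclasses import dataclass, field

@dataclass
class Box:
    """An axis-aligned primitive; dimensions in metres (hypothetical)."""
    width: float
    depth: float
    height: float
    texture: str = "untextured"

@dataclass
class Compound:
    """A complex shape assembled from simpler primitives."""
    name: str
    parts: list = field(default_factory=list)

    def add(self, part):
        self.parts.append(part)

    def apply_texture(self, texture):
        for part in self.parts:
            part.texture = texture

hall = Compound("hall")
hall.add(Box(width=60.0, depth=30.0, height=3.0))   # stone terrace, from site measurements
hall.add(Box(width=50.0, depth=25.0, height=12.0))  # timber body, estimated from photos
hall.apply_texture("vermilion_lacquer")             # hypothetical texture name
print(hall.name, len(hall.parts), hall.parts[0].texture)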
IBM's new Cell chip, which will power the new PlayStation 3, provides realistic three-dimensional rendering for people to download. Users will see a representation of themselves unless they choose to visit anonymously.
Eventually, the application will be open to the input of other developers but subject to the approval of the team from China's Palace Museum, Tolva said.

19.6.06

AIM HIGH

Are we all ready for the Solstice?

SEDARIS AT COMMENCEMENT

Although I went to Harvard, a generation ahead of David, his thoughts ring very true to me.


WHAT I LEARNED
by DAVID SEDARIS


And what I said at Princeton.

The New Yorker

It’s been interesting to walk around campus this afternoon, as when I went to Princeton things were completely different. This chapel, for instance—I remember when it was just a clearing, cordoned off with sharp sticks. Prayer was compulsory back then, and you couldn’t just fake it by moving your lips; you had to know the words, and really mean them. I’m dating myself, but this was before Jesus Christ. We worshipped a God named Sashatiba, who had five eyes, including one right here, on the Adam’s apple. None of us ever met him, but word had it that he might appear at any moment, so we were always at the ready. Whatever you do, don’t look at his neck, I used to tell myself.
It’s funny now, but I thought about it a lot. Some people thought about it a little too much, and it really affected their academic performance. Again, I date myself, but back then we were on a pass-fail system. If you passed, you got to live, and if you failed you were burned alive on a pyre that’s now the Transgender Studies Building. Following the first grading period, the air was so thick with smoke you could barely find your way across campus. There were those who said that it smelled like meat, no different from a barbecue, but I could tell the difference. I mean, really. Since when do you grill hair? Or those ugly, chunky shoes we all used to wear?
It kept you on your toes, though, I’ll say that much. If I’d been burned alive because of bad grades, my parents would have killed me, especially my father, who meant well but was just a little too gung ho for my taste. He had the whole outfit: Princeton breastplate, Princeton nightcap; he even got the velvet cape with the tiger head hanging like a rucksack from between the shoulder blades. In those days, the mascot was a sabretooth, so you can imagine how silly it looked, and how painful it was to sit down. Then, there was his wagon, completely covered with decals and bumper stickers: “I hold my horses for Ivy League schools,” “My son was accepted at the best university in the United States and all I got was a bill for a hundred and sixty-eight thousand dollars.” On and on, which was just so . . . wrong.
One of the things they did back then was start you off with a modesty seminar, an eight-hour session that all the freshmen had to sit through. It might be different today, but in my time it took the form of a role-playing exercise, my classmates and I pretending to be graduates, and the teacher assuming the part of an average citizen: the soldier, the bloodletter, the whore with a heart of gold.
“Tell me, young man. Did you attend a university of higher learning?”
To anyone holding a tool or a weapon, we were trained to respond, “What? Me go to college?” If, on the other hand, the character held a degree, you were allowed to say, “Sort of,” or, sometimes, “I think so.”
“So where do you sort of think you went?”
And it was the next bit that you had to get just right. Inflection was everything, and it took the foreign students forever to master it.
“Where do you sort of think you went?”
And we’d say, “Umm, Princeton?”—as if it were an oral exam, and we weren’t quite sure that this was the correct answer.
“Princeton, my goodness,” the teacher would say. “That must have been quite something!”
You had to let him get it out, but once he started in on how brilliant and committed you must be it was time to hold up your hands, saying, “Oh, it isn’t that hard to get into.”
Then he’d say, “Really? But I heard—”
“Wrong,” you’d tell him. “You heard wrong. It’s not that great of a school.”
This was the way it had to be done—you had to play it down, which wasn’t easy when your dad was out there, reading your acceptance letter into a bullhorn.
I needed to temper my dad’s enthusiasm a bit, and so I announced that I would be majoring in patricide. The Princeton program was very strong back then, the best in the country, but it wasn’t the sort of thing your father could get too worked up about. Or, at least, most fathers wouldn’t. Mine was over the moon. “Killed by a Princeton graduate!” he said. “And my own son, no less.”
My mom was actually jealous. “So what’s wrong with matricide?” she asked. “What, I’m not good enough to murder?”
They started bickering, so in order to make peace I promised to consider a double major.
“And how much more is that going to cost us?” they said.
Those last few months at home were pretty tough, but then I started my freshman year, and got caught up in the life of the mind. My idol-worship class was the best, but my dad didn’t get it. “What the hell does that have to do with patricide?” he asked.
And I said, “Umm. Everything?”
He didn’t understand that it’s all connected, that one subject leads to another and forms a kind of chain that raises its head and nods like a cobra when you’re sucking on a bong after three days of no sleep. On acid it’s even wilder, and appears to eat things. But, not having gone to college, my dad had no concept of a well-rounded liberal-arts education. He thought that all my classes should be murder-related, with no lunch breaks or anything. Fortunately, it doesn’t work that way.
In truth, I had no idea what I wanted to study, so for the first few years I took everything that came my way. I enjoyed pillaging and astrology, but the thing that ultimately stuck was comparative literature. There wasn’t much of it to compare back then, no more than a handful of epic poems and one novel about a lady detective, but that’s part of what I liked about it. The field was new, and full of possibilities, but try telling that to my parents.
“You mean you won’t be killing us?” my mother said. “But I told everyone you were going for that double major.”
Dad followed his “I’m so disappointed” speech with a lecture on career opportunities. “You’re going to study literature and get a job doing what?” he said. “Literaturizing?”
We spent my entire vacation arguing; then, just before I went back to school, my father approached me in my bedroom. “Promise me you’ll keep an open mind,” he said. And, as he left, he slipped an engraved dagger into my book bag.
I had many fine teachers during my years at Princeton, but the one I think of most often was my fortune-telling professor—a complete hag with wild gray hair, warts the size of new potatoes, the whole nine yards. She taught us to forecast the weather up to two weeks in advance, but ask her for anything weightier and you were likely to be disappointed.
The alchemy majors wanted to know how much money they’d be making after graduation. “Just give us an approximate figure,” they’d say, and the professor would shake her head and cover her crystal ball with a little cozy given to her by one of her previous classes. When it came to our futures, she drew the line, no matter how hard we begged—and, I mean, we really tried. I was as let down as the next guy, but, in retrospect, I can see that she acted in our best interests. Look at yourself on the day that you graduated from college, then look at yourself today. I did that recently, and it was, like, “What the hell happened?”
The answer, of course, is life. What the hag chose not to foretell—and what we, in our certainty, could not have fathomed—is that stuff comes up. Weird doors open. People fall into things. Maybe the engineering whiz will wind up brewing cider, not because he has to but because he finds it challenging. Who knows? Maybe the athlete will bring peace to all nations, or the class moron will go on to become the President of the United States—though that’s more likely to happen at Harvard or Yale, schools that will pretty much let in anybody.
There were those who left Princeton and soared like arrows into the bosoms of power and finance, but I was not one of them. My path was a winding one, with plenty of obstacles along the way. When school was finished, I went back home, an Ivy League graduate with four years’ worth of dirty laundry and his whole life ahead of him. “What are you going to do now?” my parents asked.
And I said, “Well, I was thinking of washing some of these underpants.”
That took six months. Then I moved on to the shirts.
“Now what?” my parents asked.
And, when I told them I didn’t know, they lost what little patience they had left. “What kind of a community-college answer is that?” my mother said. “You went to the best school there is—how can you not know something?”
And I said, “I don’t know.”
In time, my father stopped wearing his Princeton gear. My mother stopped talking about my “potential,” and she and my dad got themselves a brown-and-white puppy. In terms of intelligence, it was just average, but they couldn’t see that at all. “Aren’t you just the smartest dog in the world?” they’d ask, and the puppy would shake their hands just like I used to do.
My first alumni weekend cheered me up a bit. It was nice to know that I wasn’t the only unemployed graduate in the world, but the warm feeling evaporated when I got back home and saw that my parents had given the dog my bedroom. In place of the Princeton pennant they’d bought for my first birthday was a banner reading, “Westminster or bust.”
I could see which way the wind was blowing, and so I left, and moved to the city, where a former classmate, a philosophy major, got me a job on his rag-picking crew. When the industry moved overseas—this the doing of another former classmate—I stayed put, and eventually found work skinning hides for a ratcatcher, a thin, serious man with the longest beard I had ever seen.
At night, I read and reread the handful of books I’d taken with me when I left home, and eventually, out of boredom as much as anything else, I started to write myself. It wasn’t much, at first: character sketches, accounts of my day, parodies of articles in the alumni newsletter. Then, in time, I became more ambitious, and began crafting little stories about my family. I read one of them out loud to the ratcatcher, who’d never laughed at anything but roared at the description of my mother and her puppy. “My mom was just the same,” he said. “I graduated from Brown, and two weeks later she was raising falcons on my top bunk!” The story about my dad defecating in his neighbor’s well pleased my boss so much that he asked for a copy, and sent it to his own father.
This gave me the confidence to continue, and in time I completed an entire book, which was subsequently published. I presented a first edition to my parents, who started with the story about our neighbor’s well, and then got up to close the drapes. Fifty pages later, they were boarding up the door and looking for ways to disguise themselves. Other people had loved my writing, but these two didn’t get it at all. “What’s wrong?” I asked.
My father adjusted his makeshift turban, and sketched a mustache on my mother’s upper lip. “What’s wrong?” he said. “I’ll tell you what’s wrong: you’re killing us.”
“But I thought that’s what you wanted?”
“We did,” my mother wept, “but not this way.”
It hadn’t occurred to me until that moment, but I seemed to have come full circle. What started as a dodge had inadvertently become my life’s work, an irony I never could have appreciated had my extraordinary parents not put me through Princeton.