About Me

New Orleans, Louisiana, United States
Admire John McPhee, Bill Bryson, David Remnick, Thomas Merton, Richard Rohr and James Martin (and most open and curious minds)

30.9.12

The world has changed


The World We’re Actually Living In

For the first time in a long, long time, a Democrat is running for president and has the clear advantage on national security policy. That is not “how things are supposed to be,” and Republicans sound apoplectic about it. But there is a reason President Obama is leading on national security, and it was apparent in his U.N. speech last week, which showed a president who understands that we really do live in a more complex world today — and that saying so is not a cop-out. It’s a road map. Mitt Romney, given his international business background, should understand this, but he acts instead as if he learned his foreign policy at the International House of Pancakes, where the menu and architecture rarely change.
Rather than really thinking afresh about the world, Romney has chosen instead to go with the same old G.O.P. bacon and eggs — that the Democrats are toothless wimps who won’t stand up to our foes or for our values, that the Republicans are tough and that it is 1989 all over again. That is, America stands astride the globe with unrivaled power to bend the world our way, and the only thing missing is a president with “will.” The only thing missing is a president who is ready to simultaneously confront Russia, bash China, tell Iraqis we’re not leaving their country, snub the Muslim world by outsourcing our Arab-Israel policy to the prime minister of Israel, green light Israel to bomb Iran — and raise the defense budget while cutting taxes and eliminating the deficit.
It’s all “attitude” — without a hint at how we could possibly do all these contradictory things at once, or the simplest acknowledgment that two wars and a giant tax cut under George W. Bush have limited our ability to do even half of them.
Let’s look at the world we’re actually living in. It is a world that has become much more interdependent so that our friends failing (like Greece) can now harm us as much as our enemies threatening, and our rivals (like China) collapsing can hurt us as much as their rising. It’s a world where a cheap YouTube video made by a superempowered individual can cause us more trouble than the million-dollar propaganda campaign of a superpower competitor. It is a globalized economy in which the U.S. Chamber of Commerce, America’s largest business lobby, has opposed Romney’s pledge to designate China as a currency manipulator and is pressing Congress to lift cold war trade restrictions on Russia, a country Romney has labeled America’s “No. 1 geopolitical foe.” It is a world where, at times, pulling back — and focusing on rebuilding our strength at home — is the most meaningful foreign policy initiative we can undertake because when America is at its best — its institutions, schools and values — it can inspire emulation, whereas Russia and China still have to rely on transactions or bullying to get others to follow. It is still a world where the use of force, or the threat of force, against implacable foes (Iran) is required, but a world where a nudge at the right time and place can also be effective.
Add it all up and it’s a world in which America will have greater responsibility (because our European and Japanese allies are now economically enfeebled) and fewer resources (because we have to cut the defense budget) to manage a more complex set of actors (because so many of the states we have to deal with now are new democracies with power emanating from their people, not just one man — like Egypt — or failing states like Pakistan) where our leverage on other major powers is limited (because Russia’s massive oil and gas income gives it great independence and any war we’d want to fight in Asia we’d have to borrow the money from China).
This complexity doesn’t argue for isolationism. It argues for using our power judiciously and in a nuanced fashion. For instance, if you had listened to Romney criticizing Obama for weakness after the attack on the U.S. consulate in Benghazi, Libya, you’d have thought that, had Romney been president, he would have immediately ordered some counterstrike. But, had we done so, it would have aborted what was a much more meaningful response: Libyans themselves taking to the streets under the banner “Our Revolution Will Not Be Stolen” and storming the headquarters of the Islamist militias who killed the U.S. ambassador. It shows you how much this complexity can surprise you.
The one area where Romney could have really challenged Obama on foreign policy was on the president’s bad decision to double-down on Afghanistan. But Romney can’t, because the Republican Party wanted to triple down. So we’re having no debate about how to extricate ourselves from our biggest foreign policy mess and a cartoon debate — “I’m tough; he’s not” — about everything else. In that sense, foreign policy is a lot like domestic policy. The morning after the election, we will face a huge “cliff”: how to deal with Afghanistan, Iran and Syria, without guidance from the candidates or a mandate from voters. Voters will have to go with their gut about which guy has the best gut feel for navigating this world. Obama has demonstrated that he has something there. Romney has not.

What might have been


Obamanomics: A Counterhistory

Washington
Working out of cramped, bare offices in a downtown building here in Washington, President-elect Obama’s economic team spent the final weeks of 2008 trying to assess how bad the economy was. It was during those weeks, according to several members of the team, when they first discussed academic research by the economists Carmen M. Reinhart and Kenneth S. Rogoff that would soon become well known.
Ms. Reinhart and Mr. Rogoff were about to publish a book based on earlier academic papers, arguing that financial crises led to slumps that were longer and deeper than other recessions. Almost inevitably, the economists wrote, policy makers battling a crisis made the mistake of thinking that their crisis would not be as bad as previous ones. The wry title of the book is “This Time Is Different.”
In my interviews with Obama advisers during that time, they emphasized that they knew the history and were determined to avoid repeating it. Yet of course they did repeat it. After successfully preventing another depression in 2009, they have spent much of the last three years underestimating the economy’s weakness. That weakness, in turn, has become Mr. Obama’s biggest vulnerability, helping cost Democrats control of the House in 2010 and endangering his accomplishments elsewhere.
Entire books and countless articles have taken Mr. Obama to task on the economy, and administration officials have a rebuttal that makes a couple of important points. The Federal Reserve and many private-sector economists were also too optimistic, Obama aides note. And they argue that the Senate would not have passed a much larger stimulus in 2009, given Republican opposition, regardless of the White House’s wishes.
But from these reasonable points, the Obama team then jumps to a larger and more dubious conclusion: that their failure to grasp the severity of the slump has had no real consequences. Even if they had seen the slow recovery coming, they say, they couldn’t have done much about it. When Mr. Obama has been asked about his biggest mistake, he talks about messaging, not policy.
“The mistake of my first term — couple of years — was thinking that this job was just about getting the policy right,” he has said. “The nature of this office is also to tell a story to the American people that gives them a sense of unity and purpose and optimism, especially during tough times.”
We can never know for sure what the past four years would have been like if the administration and the Fed had been more worried about the economy. But my reading of the evidence — and some former Obama aides agree — points strongly to the idea that the misjudging of the downturn did affect policy and ultimately the economy.
Mr. Obama’s biggest mistake as president has not been the story he told the country about the economy. It’s the story he and his advisers told themselves.
The notion of insurance is useful here. Suggesting that Mr. Obama and his aides should have bucked the consensus forecast and decided that a long slump was the most likely outcome smacks of 20/20 hindsight. Yet that wasn’t their only option. They also could have decided that there was a substantial risk of a weak recovery and looked for ways to take out insurance.
By late 2008, the full depth of the crisis was not clear, but enough of it was. A few prominent liberal economists were publicly predicting a long slump, as was Mr. Rogoff, a Republican. The Obama team openly compared its transition to Franklin D. Roosevelt’s and, in private, discussed the Reinhart-Rogoff work.
So why didn’t that work do more to affect the team’s decisions?
There are two main answers. First, the situation was unlike anything any living policy maker had previously experienced, and it was deteriorating quickly. Although officials talked about the Depression, they struggled to treat the downturn as fundamentally different from a big, relatively brief recession.
“The numbers got ramped up,” one former White House official told me, referring to the planned size of the stimulus in late 2008. “But the basic frame did not get altered.” In particular, the administration did not imagine that the economy would still need major help well beyond 2009 and that Congress would not comply.
The second problem was that Mr. Obama and his advisers believed — correctly — that they and the Fed were already responding more aggressively than governments had in past crises. Even before the election, President George W. Bush signed the financial bailout, a decidedly un-Hooveresque policy. The Fed began flooding the economy with money. The Obama administration pushed for the stimulus and, with the Fed, conducted successful stress tests on banks.
Whatever the political debate over these measures, the economic evidence suggests they made a large difference. Analyses by the Congressional Budget Office and other nonpartisan economists have come to this conclusion. Europe, which was less aggressive, has fared worse. And the chronology of the crisis tells the same story.
In 2008 and early 2009, the global economy was deteriorating even more rapidly than in 1929, according to research by Barry Eichengreen and Kevin H. O’Rourke. Global stock prices and trade dropped more sharply. But the policy response this time was vastly different, and by the spring of 2009 — just as the measures were taking effect — the economy stabilized.
In this success came the seeds of future failures. Knowing in late 2008 how much policy help was on the way, Mr. Obama and his economic advisers decided that the disturbing pattern of financial crises was not directly relevant. “In a way, they fell into a ‘This Time Is Different’ trap,” another former White House official said.
A banner headline in The Financial Times in June 2009 pronounced the White House “Upbeat on Economy.” Nine months later, after the recovery had run into new problems, the administration said the economy was on the verge of “escape velocity.”
Even now, the Obama team sometimes suggests that the weak recovery isn’t related to the financial crisis. Some problems, like the rise in oil prices, are not in fact related. Many others, like Europe’s troubles and this country’s still-depressed consumer spending, are.
Imagine if the transition team had instead placed, say, 25 percent odds on a protracted slump. Political advisers like David Axelrod would have immediately understood the consequences. Mr. Obama’s policies would look like a failure during the midterm campaign, and the prospects of winning additional stimulus would dwindle. Which is exactly what happened.
Contemplating this outcome, the new administration would have had urgent reasons to take out insurance policies. For starters, Mr. Obama would indeed have told a different story about the economy. Rather than promising a “recovery summer” in 2010, he and his aides would have cautioned patience. Bill Clinton’s recent Democratic convention speech was a model.
More concretely, the administration would have looked for every possible lever to lift the economy. Despite Republican opposition, such levers existed.
Upon taking office, Mr. Obama could have immediately nominated people to fill the Fed’s seven-member Board of Governors, rather than leaving two openings. Ben S. Bernanke, the chairman, works hard to achieve consensus on the Fed’s policy committee, and in 2010 and 2011 the committee was skewed toward officials predicting — wrongly, we now know — that inflation was a bigger threat than unemployment.
Two more appointees might well have shifted the debate and made the Fed less cautious. After the vacancies were finally filled this year, the Fed took further action.
The administration also could have added provisions to the stimulus bill that depended on the economy’s condition. So long as job growth remained below a certain benchmark, federal aid to states and unemployment benefits could have continued flowing. Crucially, these provisions would not have added much to the bill’s price tag. Because the Congressional Budget Office’s forecast was also too optimistic, the official budget scoring would have assumed that the provisions were unlikely to take effect. They would have been insurance.
Perhaps most important, the administration might have taken a different path on housing. With the auto industry and Wall Street, Mr. Obama accepted the political costs that come with bailouts. He rescued arguably undeserving people in exchange for helping the larger economy. With housing, he went the other way, even leaving some available rescue money unspent — at least until last year, when the policy became more aggressive and began to have a bigger effect.
No one of these steps, or several other plausible ones, would have fixed the economy. But just as the rescue programs of early 2009 made a big difference, a more aggressive program stretching beyond 2009 almost certainly would have made a bigger difference. It would have had the potential to smooth out the stop-and-start nature of the recovery, which has sapped consumer and business confidence and become a problem in its own right.
By any measure, Mr. Obama and his team faced a tremendously difficult task. They inherited the worst economy in 70 years, as well as an opposition party that was dedicated to limiting the administration to one term and that fought attempts at additional action in 2010 and 2011. And the administration can rightly claim to have performed better than many other governments around the world.
But their claim to have done as well as could reasonably have been expected — to have avoided major mistakes — is hard to accept. They considered the possibility of a long, slow recovery and rejected it.
In the early months of the crisis, Mr. Obama and his aides made clear that they would try to learn from the errors of the Great Depression and do better. They achieved that goal. They also left a whole lot of lessons for the people who will have to battle the next financial crisis.

Another Autumn?

29.9.12

When the Globe was pink


Cutting the British Empire Down to Size


The ‘British Empire’ was the name given by imperialists in the late 19th century to Britain’s territorial possessions. It was meant to create an image of unity and strength. But such a view is illusory, argues Bernard Porter.
'From the Cape to Cairo', Puck, 1902. Britannia leads civilising soldiers and colonists against Africans as Civilisation conquers Barbarism. Library of Congress

With the British Empire finally dead and buried – give or take a Falkland or two – now may be a good time to pause and try to take stock of what it was while it still had breath in it. This won’t be easy. For a start, it may be too early. Historical judgements take a while to bed down. Even then they are subject to revision by successive generations, influenced by new discoveries and their own historical environments. Subjects like imperialism are complex and can be approached from many different angles. In the case of the British Empire the problem is exacerbated by the fact that its death is still too recent to be looked on dispassionately and its legacy too present to be ignored. Hence the controversy that rages today between the broadly pro-imperial Niall Ferguson (Empire: How Britain Made the Modern World, 2003) and the uncompromisingly anti-imperialist Richard Gott (Britain’s Empire: Resistance, Repression and Revolt, 2011). That debate would not be so heated if the ghost of the old empire was not felt to be haunting us still.
The ghost of yet another empire lurks behind it. British imperialists often made comparisons between their empire and the much longer-dead Roman one. It was the image of this great empire, present in popular culture and in contemporary school syllabuses, much more than the British Empire ever was, that largely determined perceptions of imperialism. It was where the word ‘imperial’ came from: a Latin term (imperium) associated with notions of power, authority and control. It is a big, strong, singular word, implying a big, strong, singular thing. That is why British imperialists liked it. One of the issues that needs to be determined before making any assessment of the effects and legacies of the British Empire, therefore, is how big, strong and singular it really was.
We shouldn’t be fooled by appearances. All those red-bedaubed world maps that became fashionable in Britain around 1900, for example, give an impression of uniform British power, which is certainly false. A truer picture would have been conveyed by colouring most parts a much lighter pink and some with only the faintest blush (to be fair, cartographers often did this with Egypt and the Indian princely states). If we are measuring British imperialism in terms of informal influence, certain countries outside the empire can be pinked in, too. You might also put a few drops of red into the oceans, to reflect Britain’s naval dominance.
In fact those red patches on the map covered an extraordinary variety of relationships between the colonies and the ‘mother’ country, which would require a whole new palette to colour-code them accurately. These ranged from absolute despotisms and racist tyrannies; through colonies ruled paternalistically, in intention at any rate, and territories simply ‘protected’ by the British; to colonies whose (white) people were far more ‘free’ than stay-at-home Britons and those whose (non-white) subjects were so little touched by the system that they could barely have been aware that they were colonies at all. Beyond these there were disguised colonies like Egypt; territories mandated after the First World War, in one of which, Palestine, Britain’s role was mainly a thankless peace-keeping one; her ‘informal’ colonies – nominally independent, but dominated, for example, by British commercial companies; and Ireland, which could be said to straddle both sides of the imperial-colonial divide. That’s without taking any account of what is more problematically termed British ‘cultural imperialism’: problematic because, if ‘imperialism’ means anything at all, it must surely involve some sort of duress or domination, which is difficult to show in the case of, say, Brazilians choosing to play football.
To bundle all these together under the rubric of ‘empire’ seems perverse. Equating the experience of a colonial Nigerian with that of a New South Waleian, for example, which is what you get when you categorise them both as imperial victims, makes no sense at all. Insofar as they were victims (and there are, of course, other ways of looking at them) the latter certainly were no more so than most Britons. My old granddad in Essex was probably more exploited there than he would have been if his granddad had made the journey to Australia that apparently was at one time planned for him. (Family legend has it that he narrowly escaped transportation for keeping a muck-heap outside his house in Writtle.) In most cases there were other forces at work in addition to the discretely imperial one. In my granddad’s it was industrial capitalism. The same applied in many colonial cases, too.
Another image of imperial strength to be wary of is the great public show the empire made around 1900: the fantastic gubernatorial uniforms, the ceremonial puffery of the 1897 Diamond Jubilee and the ‘mafficking’ of the early Boer War. But in 1900 the empire was coming to seem under threat. Many of these displays of loyalty to it were nervous reactions to this. It was only around then, too, that British imperialists began thinking of their colonial possessions as a unity; of the empire as an empire, in order to strengthen it against these threats. Before this it had been a much more messy affair.
The association of capitalism with imperialism is well-known. It used to be denied by imperialists in more social-democratic times, when ‘capitalism’ was a term of implied abuse. It is acknowledged by all (and positively celebrated by Niall Ferguson) now that capitalism has become respectable again. But the precise relationship between the two is not always understood. The expansion of British trade and finance into the wider world generally came before the more formal kind of imperialism; in other words the flag followed trade rather than vice-versa. That is assuming it followed at all. You could have foreign trade without imperialism, or at least, that is what contemporaries believed, before modern historians decided that this should be called imperialism, too. Whether this is a valid extension of the meaning of the word is a matter of opinion: imperialism doesn’t have one set definition and is used in a variety of ways. My own preference is for a usage that preserves the notion of compulsion or pressure.
Mau Mau suspects led away for questioning by the police in Nairobi, Kenya, 1952.

For the Victorians there was a clear distinction between imperialism and mere commercial expansion, which explains why, until the last quarter of the 19th century, they often denied they were engaged in the former. Denial was not easy in the light of the formal colonies Britain already possessed, which were often regarded as embarrassing obligations incurred in less enlightened times to be shed as soon as respectably possible. (Canada would be the first.) Even in these places the flag was not supposed to give Britain any particular commercial advantages, with their markets usually open to all other nations.
This was not colonialism in the 18th-century, mercantilist sense. Free trade, in fact, was widely supposed to be both the antithesis and the antidote to imperialism, bringing an end to, as the great anti-Corn Laws agitator Richard Cobden put it in 1846, ‘the desire and the motive for large and mighty empires; [and] for gigantic armies and great navies – for those materials which are used for the destruction of life and the desolation of the rewards of labour’; all this ‘as man becomes one family, and freely exchanges the fruits of his labour with his brother man’. Today it is possible to read all kinds of imperialist inferences between these lines: who are we to tell the world what is best for it? But it is easy to see how contemporaries could be persuaded that they were embarking on an entirely different and more ethical course. This was what differentiated mid-19th-century Britain from all previous imperial times and nations.
It was also a highly convenient position from a practical point of view. This kind of (theoretically) peaceful expansion was well suited to a nation whose military capacities (as opposed to naval) were not of the highest order, by comparison with three or four continental European armies. Luckily European armies before the 1880s were less interested in challenging Britain in these commercial theatres, which left it with only technologically backward ‘native’ opponents when it came to defending its economic interests outside Europe.
Cost was the other convenience. Britain’s commercial expansion, and the formal imperialism that rode on the back of it, were on the whole cheap. Free trade was more than merely a commercial policy. It was tied in to a whole economic ideology, called ‘political economy’ then – ‘free marketism’ or ‘neo-liberalism’ today – one tenet of which was that enterprise worked best if it was not taxed. Anything that required tax revenues was therefore discouraged. That is partly why the British army was relatively skimpy and why colonies could not be allowed to become a direct burden on the British Treasury. They had to be ‘self-sufficient’: their revenues, even for their defence, raised locally.
This had profound implications for the way the empire was subsequently ruled. The main one was that it could only be done on the cheap. Sometimes this was achieved by sending out men (always men) from the public schools (nearly always the public schools) in pith helmets and khaki shorts, or more exotic clothing if they went as ‘governors’. These imperial proconsuls were more old-fashioned paternalistic than new capitalist, thanks to the values they had imbibed at their public schools. (‘He’s in trade’ was a common put-down among this class.) True free market capitalists didn’t reckon much to governing in any circumstances; it was against the grain to work at an occupation which was neither productive nor profitable. Colonial rulers did not generally share the free market ideology of the people whose activities had given rise to the necessity of their presence in the first place. Consequently, and to avoid unsettling their charges and the risk of provoking rebellion, many of them actually obstructed the capitalist exploitation of the colonies they were in charge of. Certainly it was not they – the highest-profile and most conventionally imperialist of the imperialists – who were responsible for spreading capitalism to Africa, Asia and elsewhere; or ‘democracy’, which they scarcely understood; or any of the other features of modernisation that present-day apologists for imperialism attribute to British rule in the 19th and 20th centuries.
The main point about this cadre of men, however, is how few they were. This was because more of them could not be afforded without inflicting unacceptable taxation either on the British or their colonial subjects. Taxes for colonial purposes could provoke rebellions in both places. There was a ‘Hut Tax war’ in Sierra Leone in 1898, for example, and a Commons revolt against a tax to police and govern Mesopotamia (Iraq) in 1921. In total there were just 2,000 British imperial servants ‘in the field’ over the whole of British India in the 1900s, plus about another 2,000 in the rest of the empire: just 4,000 to control hundreds of millions of native subjects. Of course they had soldiers to back them, many in India (albeit a minority British) but far fewer elsewhere, as well as native collaborators to help them with the more mundane tasks.
Collaborators, however, need to be collaborated with. This was crucial to Britain’s governance of its ‘Crown’ colonies, which was generally indirect, both in terms of using local traditional rulers and of generally preserving the natives’ own customs and cultures. Again this was found to be less unsettling than trying to change them (the Indian ‘Mutiny’ in 1857 was a lesson in the dangers of that) and some British colonial officials did grow genuinely to value Indian and African cultures. The common idea, therefore, that British imperialists invariably set out to westernise their subjects is not altogether sound. Even if they had wanted to they couldn’t have.
Economy with regard to colonial government was achieved elsewhere by delegating its duties to others. Most of the British Empire in the 19th and 20th centuries was what we would now call ‘privatised’, for two of the reasons usually adduced for privatisation today: to save money and to shed responsibility. The other reason, that ‘private’ always works better, was not so much in evidence. The three main sorts of beneficiary were local rulers (Indian princes, for example); European settlers (in Australasia, British North America, the ex-slave islands of the West Indies for a brief period and southern and eastern Africa); and ‘chartered’ capitalist companies (in India, going back centuries, Africa and the Pacific). Liberal governments from the 1880s were particularly keen on this last device, which chimed with their free market principles and also – they believed – with their anti-imperialism. In 1906 Britain even granted representative government, with a racist constitution of which no one approved, to the Transvaal Boers, a people it had just beaten in a bruising war. The reason given was that the government was powerless to do anything else. This could be regarded as a sort of anti-imperialism. These were colonies where metropolitan control – ‘imperialism’ by one definition – was minimal, allowing the forces of capitalism and settlerism to operate freely and ‘naturally’. These were the major forces here, with the empire’s main role being a negative one, not to hold them back.
The back cover of the catalogue for the Empire Exhibition at Johannesburg in 1936/7 shows the extent of Cecil Rhodes' ambitions.

All this undermined Britain’s effective control over its empire and its ability either to do good there or to prevent harm. Southern Rhodesia, outsourced in the 1880s to both a private company – Cecil Rhodes’ British South Africa Company (BSAC) – and the white settlers who came in its train, did all the ‘ruling’ over the black African majority, with the Colonial Office hardly getting a look in, until the country’s genuine independence, as Zimbabwe, in 1980. This is not to absolve Britain from moral or legal accountability for what went on in Rhodesia during that period, any more than a government that privatises railways or schools or health provision can avoid either the credit or the blame for the way they operate subsequently.
The licence that settlers and capitalists were given in so many of the British colonies had mixed and controversial effects. They, rather than the conservative and paternalistic proconsuls, were the main agents of modernisation, economic, cultural and social. If you consider this was a good thing, or perhaps a regrettable necessity (which was the way a lot of ‘progressive’ Victorians regarded it) you will look upon these enterprising colonists indulgently. Against this are two main arguments. The first questions the value of capitalism in this form for the good of these countries and even of the world: encouraging unbalanced colonial monocultures, for example; exhausting soils with intensive and large-scale farming methods; and uprooting native societies. (It was this last consequence that set the paternalists against it.) The second points to the sheer human suffering, way beyond ‘uprooting’, which came in the train of these developments.
The history of the British and of other European empires is punctuated by ‘atrocities’: slavery, mass killings, avoidable famines. Many of these were inflicted by settlers: for example against the aborigines in Australia, the natives of southern Africa, Native Americans, in the West Indies, in Kenya and in French Algeria; or by capitalist entrepreneurs: Atlantic slave traders; the BSAC in southern Africa; Malayan planters; King Leopold’s rubber-tapping ‘concessionaires’ in the Congo Free State. If you count the British East India Company as capitalist, which is stretching things a bit for the mid-19th century, by which time it had lost its trading function but was still largely independent of government, you can also debit the several Indian wars that culminated in the 1857 ‘Mutiny’ to this. Britain took over the Company directly in 1858 as a result. One might even include the great famines in Ireland (1840s) and Bengal (1940s) as exacerbated by contemporary economic-liberal theory, which in its most severe form taught that these matters were best left to the markets. The key point, however, is that they were not directly caused by the kind of imperialism represented by pith-helmeted, khaki-shorted young prefects.
This suggests a paradox: that the less genuine and formal colonial government was, the worse things were likely to be. Indeed it was for this reason that the most thoughtful anti-imperialists of the early 20th century were against Britain’s simple withdrawal from its colonies. Thinkers such as the economist J.A. Hobson (1858-1940) argued that this wouldn’t genuinely liberate them, but would leave them more vulnerable to capitalist imperialism. Hobson and others called instead for international ‘trusteeship’ over European empires until their subjects could properly be prepared for self-rule. This was one of the supposed purposes of the post-First World War mandates system.
Britain’s imperial weakness was finally exposed in the period from the First World War through the 20 years after 1945. The empire became impossible to defend against rising nationalist movements, helped from outside, without resorting to a degree of firmness or brutality that people in Britain, whose attachment to the empire was predicated on the belief that it was essentially liberal, were reluctant to tolerate. Repression was tried at first: in Kenya, Malaya, Cyprus and elsewhere. (These were the occasions of many of Britain’s worst colonial ‘atrocities’.) But its failure and the remarkably rapid dismantling of the empire that followed show just how weak the empire had been for years and how little its disappearance affected the majority of Britons. The collapse was presented by government as ‘granting’ self-rule to the natives, in India a ‘transfer of power’: almost the culmination of the old, Cobdenite liberal project; but that fooled no one.
These realities challenge the notion of a British imperium in the Roman sense. Britain was only able to rule most of its empire on the sufferance of local collaborators: native chiefs and princes, empowered settlers and companies, even many of its prefects, all of whom had agendas of their own. This was a cause of weakness in other ways, too. Though the soldiers of the Dominions and India contributed military strength in the two world wars, this must be weighed against the advantage taken by colonial nationalists as a result of those conflicts to push their own anti-imperialist claims. Furthermore the First World War (and consequently, indirectly, the second) was partly provoked by the existence of the British Empire. Diplomatically this form of rule was frequently a liability: drawing Britain unproductively into Afghanistan and the Nile Valley, for example, in order to defend India and the routes to it; tying up military forces that might be needed for its own defence; creating for rival European nations (and later the Soviet Union) unnecessary points of irritation; and overall restricting Britain’s freedom of movement on the European stage.
The empire’s great size was misleading if it was thought to reflect world power. If only it could be organised and consolidated to realise its full potential, thought the imperialists – tariff, defence and political unions were mooted – something might be made of it, so long as ordinary Britons could become as committed as the imperial zealots were. (The jury is still out on how committed the former were; but it certainly wasn’t enough.) Most efforts in this direction, however, came to nothing. The empire continued to disappoint, on these grounds at least. Later, its successor, the multiracial, self-governing ‘Commonwealth’, acquired a certain moral authority, but that was different.
The empire was just too much of a mess to make this kind of impact. It was never organised or even loosely administered centrally. At least three separate government departments were involved. There was little training offered for the young men sent out to run it, beyond the occasional university summer school; nothing on the pattern of France’s Ecole Coloniale. Governing was learned on the job, underpinned by a public school education: lessons from Classical history that were supposed to be good for all time; hints about ‘character’, ‘stiff upper lip’ and the like; a tradition of noblesse oblige; and maybe some prejudices against the working classes that could be carried over to ‘natives’. Beyond that, attitudes and policies were largely formed by experience in the field and how a man regarded his function there. It could be a steep learning curve.
This applied especially to racial attitudes. It is commonly assumed that imperialism and ‘racism’ went together, with the racism deriving from whites’ home-taught feelings of racial superiority; but this is an oversimplification. For a start the most racist Europeans were often anti-imperialist: like the French writer Arthur de Gobineau, who was afraid of the racial mixing that might ensue; and Charles Dickens, who saw no point in trying to do good for inferior races that were doomed in any case (and at the expense of the British working classes, who deserved it more).
At the other end of the spectrum, some people became working imperialists because they believed in human equality. This was true of most Christian missionaries, whose whole purpose rested on the assumption that the peoples they were there to convert were at least convertible. They were probably the major agents of cultural westernisation in the colonial field, hence the suspicion and even hostility shown to them by many conservative colonial officers. The missionaries’ prejudices tended to be cultural rather than racial. For government officers it was different: they needed to respect native religions and hierarchies in order to work through them. Some believed these were racially determined: that animism, for example, was in the West African’s genes; but it was possible, theoretically, to hold that they were ‘separate but equal’, to use a term later beloved of apartheid politicians, which indicates how treacherous the idea could be.
Public school paternalism also carried the implication that the wards were ‘children’; but children grow up. (The question was, when?) Small traders, who had to negotiate with their customers, generally got along with the indigenous peoples pretty well. So did most explorers. The most conventionally racist of the colonialists were the European settlers, whose attitudes again conformed to their function: they needed to believe in the natives’ racial inferiority in order to justify taking their lands and forcing them to labour. That is quite apart from the military, whose prime function was to kill them. We know from recent wars what can happen to soldiers’ perceptions of their enemies. Hence the settlers’ and military’s disproportionate involvement in colonial atrocities. Colonists’ racial attitudes were mainly acquired on the spot, not brought with them from home; where, incidentally, the teaching of racism in schools was far less common than in European countries that didn’t have colonies. This might suggest that the effect of having an empire, or at least one that needed to be collaborated with, was to undermine racial prejudice, rather than the opposite. (But this needs more comparative research.)
Of course Britain’s general influence in the world, during the 19th century especially, was immense. You can still see it today. The omnipresence of the English language points to this. Her peoples’ DNA is widely spread. Sites all over the world bear British names. Many developing countries have frontiers, often awkward ones, negotiated between Britain and other powers. The Industrial Revolution started in Britain and expanded from there. Many of the world’s railways bear witness to this. Cricket and association football may be considered, if not its greatest contributions to world civilisation, at least the ones that are least alloyed. The constitutions and legal systems of several countries bear the mark of Britain’s. There are some gaps. Its cuisine doesn’t appear to have caught on so widely. Other countries have been far more influential artistically, with the exception of literature.
The point is, however, that very little of this required an empire, or anything that can usefully be called imperialism (implying compulsion or pressure), to achieve it. Frontiers – certainly. Migrant settlers and certain markets needed protection by the Royal Navy and just a few land troops. Cricket may have required a longer period of imperial tutelage to establish it than football because it is a more complicated game (which is why the two sports have such different geographical spreads). Most of the rest, however, could have done – and done better – without the empire, or ‘imperialism’ in the limited sense that I prefer. At most the formal empire was just a part of this and not nearly as potent a part as these other softer kinds of influence, including the great natural force of unfettered capitalism, which of course continues.
As well as being historically misleading, attaching to the British Empire all this (Roman) baggage can actually be dangerous. We are familiar with African leaders blaming European imperialism for most bad things and even suspecting western countries of imperial ambitions against their countries still. In Uganda, British colonists are charged with introducing homosexuality; West African Anglican bishops resist Canterbury’s (relative) tolerance in this area as ‘liberal imperialism’; while over in Jamaica, confusingly, liberals blame the empire for local homophobia. Muammar Gaddafi used to rally his followers against the ‘crusader-imperialism’ of the West: the crusader prefix, of course, adding a particular frisson for Muslims. Apart from anything else, this kind of charge attributes far too much potency to imperialism in the sense they mean.
More serious, however, is when pro-imperialists attribute all the good in the world to western imperialism, in its role of disseminating what they like to call western values. The title and subtitle of Niall Ferguson’s latest book (and TV series) Civilization: The West and the Rest (2011) perfectly encapsulate this. The flaw is that most values described as western are not exclusively that. Nearly all of them are also found rooted in certain Chinese and Indian societies, as Jack Goody has shown in his book The Theft of History (2007), about the way the West has falsely appropriated these ideas. Unfortunately many ‘Resterners’, as Ferguson dismissively calls them, seem to have fallen for the imperial view of this. Reacting against their former imperial masters, they often reject what they take to be their ideas, also. So, as a Sri Lankan commentator put it in 1999 in response to a typically ‘liberal imperialist’ speech by Tony Blair, what the West presents as universal human rights are not that really, but only ‘relative to the present West, which has been successful in establishing its hegemony over the whole world’.
No one likes to be told even the right thing to do by a bully; even if the bully – in this case the British Empire – isn’t as tough as he used to dress himself up to be.
Bernard Porter is the author of The Lion's Share: A History of British Imperialism, 1850 to the Present (5th edn., Pearson, 2012).

NYTimes


Publisher Who Transformed The Times for New Era

Arthur Ochs Sulzberger, who guided The New York Times and its parent company through a long, sometimes turbulent period of expansion and change on a scale not seen since the newspaper’s founding in 1851, died early Saturday at his home in Southampton, N.Y. He was 86.
His death, after a long illness, was announced by his family.
Mr. Sulzberger’s tenure, as publisher of the newspaper and as chairman and chief executive of The New York Times Company, reached across 34 years, from the heyday of postwar America to the twilight of the 20th century, from the era of hot lead and Linotype machines to the birth of the digital world.
The paper he took over as publisher in 1963 was the paper it had been for decades: respected and influential, often setting the national agenda. But it was also in precarious financial condition and somewhat insular, having been a tightly held family operation since 1896, when it was bought by his grandfather Adolph S. Ochs.
By the 1990s, when Mr. Sulzberger passed the reins to his son, Arthur Sulzberger Jr., first as publisher in 1992 and then as chairman in 1997, the enterprise had been transformed. The Times was now national in scope, distributed from coast to coast, and it had become the heart of a diversified, multibillion-dollar media operation that came to encompass newspapers, magazines, television and radio stations and online ventures.
The expansion reflected Mr. Sulzberger’s belief that a news organization, above all, had to be profitable if it hoped to maintain a vibrant, independent voice. As John F. Akers, a retired chairman of I.B.M. and for many years a Times Company board member, put it, “Making money so that you could continue to do good journalism was always a fundamental part of the thinking.”
On Saturday, President Obama praised Mr. Sulzberger as “a firm believer in the importance of a free and independent press, one that isn’t afraid to seek the truth, hold those in power accountable and tell the stories that need to be told.”
Mr. Sulzberger’s insistence on independence was shown in his decision in 1971 to publish a secret government history of the Vietnam War known as the Pentagon Papers. It was a defining moment for him and, in the view of many journalists and historians, his finest.
In thousands of pages, this highly classified archive detailed Washington’s legacy of deceit and evasion as it stumbled through an unpopular war. When the Pentagon Papers were divulged in a series of articles in June 1971, an embarrassed Nixon administration demanded that the series be stopped immediately, citing national security considerations. The Times refused, on First Amendment grounds, and won its case in the United States Supreme Court in a landmark ruling on press freedom.
Mr. Sulzberger reshaped The Times. In the mid-1970s, another financially difficult period in which he might have chosen to retrench, he expanded the paper to four sections from two, creating separate sections for metropolitan and business news and introducing new ones oriented toward consumers.
They were a gamble, begun in the hope of attracting new readers, especially women, and advertisers.
Some critics dismissed the feature sections as unworthy of a serious newspaper. But the sections — SportsMonday, Science Times, Living, Home and Weekend — were an instant success, without compromising the paper’s hard-news core. They were widely imitated.
Over the next two decades, a billion-dollar investment in new printing facilities made still more innovations possible, among them a national edition, special regional editions and the daily use of color photos and graphics.
“Adolph Ochs is remembered as the one who founded this great enterprise,” Richard L. Gelb, a longtime member of the Times board, said in 1997, when Mr. Sulzberger stepped down as chairman. “Arthur Ochs Sulzberger will be remembered as the one who secured it, renewed it and lifted it to ever-higher levels of achievement.”
Even while the enterprise was put on a secure financial footing, ultimate control never passed from the Sulzberger family. It managed to avoid the internal strife and jealousies that tore apart other newspaper dynasties and traumatized their companies.
At Mr. Sulzberger’s death, The Times was being run by a fourth generation of his family, a rarity in an age when the management of most American newspapers is determined by distant corporate boards. A family trust, unaffected by his death, guarantees continued control by Adolph Ochs’s descendants.
It was no coincidence, Mr. Sulzberger believed, that some of the country’s finest newspapers were family-owned. “My conclusion is simple,” he once said with characteristic humor. “Nepotism works.”
Pentagon Papers Champion
A newspaper publisher may be a business executive, but the head of an institution like The Times is also inevitably cast as a leader in legal defenses of the First Amendment. It was a role Mr. Sulzberger embraced, never with more enduring results than in his decision to publish the Pentagon Papers.
“This was not a breach of the national security,” Mr. Sulzberger said at the time. “We gave away no national secrets. We didn’t jeopardize any American soldiers or Marines overseas.” Of the government, he added, “It’s a wonderful way if you’ve got egg on your face to prevent anybody from knowing it, stamp it secret and put it away.”
The government obtained a temporary restraining order from a federal judge in Manhattan. It was the first time in United States history that a court, on national security grounds, had stopped a newspaper in advance from publishing a specific article. The Washington Post soon began running its own articles based on the same documents, and both papers took their case to the Supreme Court. In late June, the court issued its decision rejecting the administration’s national-security arguments and upholding a newspaper’s right to publish in the face of efforts to impose “prior restraint.”
The significance of that ruling for the future of government-press relations has been debated, but this much was certain: It established the primacy of a free press in the face of a government’s insistence on secrecy. In the 40 years since the court handed down its ruling, there has not been another instance of officially sanctioned prior restraint to keep an American newspaper from printing secret information on national security grounds.
In a 1996 speech to a group of journalists, Mr. Sulzberger said of the documents that he “had no doubt but that the American people had a right to read them and that we at The Times had an obligation to publish them.” But typically — he had an unpretentious manner and could not resist a good joke or, for that matter, a bad pun — he tried to keep even a matter this weighty from becoming too ponderous.
The fact is, Mr. Sulzberger said, the documents were tough sledding. “Until I read the Pentagon Papers,” he said, “I did not know that it was possible to read and sleep at the same time.”
Nor did he understand why President Richard M. Nixon had fought so hard “to squelch these papers,” he added.
“I would have thought that he would bemoan their publication, joyfully blame the mess on Lyndon Johnson and move on to Watergate,” Mr. Sulzberger said. “But then I never understood Washington.”
He did, however, understand the stakes in publishing the Pentagon Papers, he said in 1998. He was risking heavy fines and a sullied reputation for the newspaper, and perhaps even jail for him and his top editors. No editor alone could make a decision of that magnitude, Mr. Sulzberger said. The buck stopped with the publisher.
An Adjudicator, Not an Editor
But more commonly, indeed nearly all the time, Mr. Sulzberger left the task of putting out The Times to the people hired for the job: the managers on the business side and the editors in the newsroom.
“His confidence in the people he chose to trust was almost total,” said Max Frankel, one of five executive editors during Mr. Sulzberger’s time as chairman. “He did not want to edit the paper, plain and simple. He was there to adjudicate disputes and to set standards and values.”
Most afternoons he attended the meeting at which the editors determined what would be on the next day’s front page. But he was there mainly to keep himself informed, not to join the discussion. He generally kept his hands off the editorial page, too. Except perhaps for the most hotly disputed issues and political endorsements, he often did not learn of the page’s opinions until the paper was delivered to his Manhattan home, an apartment on Fifth Avenue.
There were, to be sure, departures from this self-imposed restraint.
A notable one came in 1976, when Mr. Sulzberger insisted that The Times endorse Daniel Patrick Moynihan over Bella S. Abzug in the New York Democratic primary election for United States senator. This brought strong objections from John B. Oakes, who was editor of the editorial page and the publisher’s first cousin once removed. Mr. Oakes received permission to register his disapproval in a most unusual fashion. He wrote a 40-word letter to the editor — in effect, a letter to himself.
Long after his retirement, the episode continued to rankle Mr. Oakes. But he acknowledged that such intervention from the 14th floor — where the publisher had his office in the Times building on West 43rd Street — was rare.
Mr. Sulzberger did not always appreciate the content or the tenor of the editorial page. He thought it was anti-business, especially in the 1960s and ’70s. “If you were a corporation, you were wrong, whatever you were doing,” he said. But he almost never imposed his views. Though he did nudge Mr. Oakes into retirement a year or two early, Mr. Oakes said, “I literally can count on the fingers of one hand the number of times that I had to go to bat for an editorial.” Mr. Oakes died in 2001.
When Mr. Sulzberger wanted to exercise his prerogative and make his likes and dislikes known, he usually did so through interoffice memorandums, signing them with the nickname he had carried since childhood, Punch.
The topics might be of national importance, like the structure of the Joint Chiefs of Staff, the military being a subject he cared about as a former Marine who had served in World War II and the Korean War. When he felt the boundaries of good taste had been crossed, he let the editors know it. But the notes could also be about mundane matters. In one, in 1976, he complained to A. M. Rosenthal, the executive editor, about a published recipe for cooked eel.
“Why can’t our food pages have something on them that most people like?” Mr. Sulzberger asked.
Sincerely, A. Sock
Quality-of-life issues in New York were dear to him. He never pretended to have a solution for America’s balance-of-payments problems, but he had ideas about crime and grime in his hometown. In 1989, they led to a series of editorials that ran under the rubric “New Calcutta.”
Once in a while he wrote brief letters to the editor using the name A. Sock, a wordplay on Punch. Mr. Sock was always as pleased as punch with puns. “The nationalist Chinese seem extremely apprehensive when Mr. Nixon drinks with Premier Chou,” he wrote in 1972. “Are they scared he’ll Taiwan on?”
One A. Sock letter, an unflattering take in 1979 on the National Organization for Women, brought a sharp rebuttal letter. “Mr. Sock deserves a punch,” it concluded. It was signed Gail Gregg, the publisher’s daughter-in-law at the time. Convinced his cover was blown, Mr. Sulzberger rarely wrote as A. Sock again.
His suggestions to editors and his criticisms tended to be delivered in a velvet glove, not with a mailed fist. “He didn’t have to be obeyed; he just wanted to be heard,” said Jack Rosenthal, a former editor of the editorial page. Arthur Gelb, a former managing editor, put it this way:
“He never said forcefully, ‘This is what I believe and this is what I want.’ It was always mild and often with a little humor. He listened to what others had to say. Some people saw that as weakness. But it was the nature of a true gentleman.”
Mr. Sulzberger involved himself far less with the content of the news columns than with the large business decisions that had to be made. It had been that way from the moment he was named publisher, on June 20, 1963, succeeding his brother-in-law Orvil E. Dryfoos, who had died of heart trouble a month earlier at age 50.
The line of Times publishers, from Adolph Ochs in 1896 to Arthur Sulzberger Jr. a century later, has run through the men in the family.
Mr. Ochs had a daughter, Iphigene, but no sons. When he died in 1935, authority passed to Iphigene Ochs’s husband, Arthur Hays Sulzberger. Iphigene and Arthur had three daughters: Marian, Ruth and Judith, called Judy.
Then came a son, Arthur Ochs Sulzberger, born on Feb. 5, 1926. The father enjoyed composing light verse, and he celebrated this birth with an illustrated book describing the boy as having “come to play the Punch to Judy’s endless show.” The nickname stuck.
In 1961, his health deteriorating, the elder Sulzberger considered it time to step aside. His son was not even considered as a possible successor — not then, anyway. Many Times executives and close relatives felt Arthur was too young and not up to the challenge. The family turned to Mr. Dryfoos, who was married to the oldest Sulzberger daughter, Marian, and who had been a senior Times executive for years.
Unexpected Promotion
But a mere two years into the job, Mr. Dryfoos was dead, and the family looked to the young Mr. Sulzberger. At 37, he became the youngest publisher in Times history.
His mother, the paper’s guiding spirit until her death at 97 in 1990, wrote in her memoir, “Iphigene,” that tapping her son amounted to “something of a gamble.” Until then he had held only lower-level company positions. In one, assistant treasurer, he had been given little to do except sign payroll checks.
“It’s impossible to be an assistant to your father,” he said years later.
Mr. Sulzberger himself — square-shouldered, pipe-smoking, affable, unaffected — knew how lightly he was regarded. “I’ve made my first executive decision,” he told his sister Ruth after his first day on the job. “I’ve decided not to throw up.”
It did not take him long to prove the doubters wrong. “It was remarkable how quickly Punch began to demonstrate initiative and decisiveness,” his mother wrote. Years later, A. M. Rosenthal would call Mr. Sulzberger “the most insufficiently estimated publisher in modern American journalism.”
It would have been hard to dispute that The Times was America’s premier paper in 1963. Its influence on national politics was even greater than it is in today’s information-saturated age of round-the-clock cable news, social media and the Internet.
But the paper in 1963 was also a struggling operation with an uncertain future. It was reeling from low revenues and high labor costs. It had just emerged from a 114-day printers’ strike, which had put a tremendous strain on Mr. Dryfoos.
The long walkout, over wages, automation and other issues, was also devastating to the once-boisterous world of New York newspapers. When the strike began in 1962, there were seven major dailies in the city, all of them already threatened by competition from television and burdened with high costs and low income. Within five years the field had shrunk to three: The Times, The Daily News and The New York Post.
The Times, a morning paper, toyed with the idea of creating an afternoon daily to go head-to-head with the afternoon Post. In 1967 it went so far as to create prototypes for two possible papers. But in the end it dropped the project, deciding it would be too much of a drain on the company’s resources.
For The Times, 1963 was the first money-losing year since 1898. But even in good years, profits were precariously small. It hardly helped that the paper was based in a city that was in economic decline and unable to stop the flight of many affluent New Yorkers — newspaper readers and advertisers included — to the suburbs.
“I personally never had the concern that the paper was going to go out of business,” Mr. Sulzberger later recalled. “But it was obvious to me that we had a problem on the business side, the problem being structural: the way we worked together or, more accurately, the way we didn’t work together.”
Business as Usual Won’t Do
Mr. Sulzberger moved swiftly to bring financial order, and to show skeptics that he was in charge. Some of his early actions were the sort that any struggling businessman might take. But for The Times of that era, they bordered on revolutionary. Mr. Sulzberger insisted that the news department operate on a budget — unheard-of at the time. He strove to reduce the company’s payroll, then about 5,400 strong, a work force appreciably larger than today’s. He tried to get the advertising and circulation departments, each jealous of the other’s powers and privileges, to coordinate their efforts.
By nature, he was fastidiously neat. The habitual clutter on reporters’ desks drove him to distraction. And he felt no better about untidy organizational charts.
It made no sense to Mr. Sulzberger that the daily Times and the Sunday paper were separate and sometimes competing fiefs. In 1964 he ordered that they both report to a single executive editor, Turner Catledge, who had become something of a father figure to the new publisher.
The plan was not executed so easily. The Sunday department had been run for 40 years by the brilliant but imperious Lester Markel, an editor who brooked no interference from anyone, including a young publisher who still called him “Mr. Markel.” Mr. Sulzberger chose to spring the daily-Sunday merger without telling Mr. Markel. “Young men can be cruel, sometimes without knowing it,” Mr. Catledge concluded in a 1971 memoir.
The two branches of the paper did not truly merge for 12 more years, but the episode was considered an early sign that one underestimated Punch Sulzberger at one’s peril.
“Punch soon proved to be a more aggressive publisher than either Sulzberger or Dryfoos had been,” Mr. Catledge wrote. “Those two, having married into the newspaper, saw themselves as trustees, but Punch had a sense of proprietorship that they had lacked.”
He continued: “Adolph Ochs used to say, ‘I always question the obvious,’ and Punch was like his grandfather in that regard. He liked to ask questions and challenge assumptions.”
One assumption he challenged was that the paper would return to business as usual after the strike. “Our obvious trouble” at the end of the walkout “was that we had no muscle whatsoever,” Mr. Sulzberger recalled, “and that all our financial eggs were in one basket.”
In January 1964 he summarily closed The Times’s Western edition, a slender version of the newspaper that had been started only 15 months earlier under Mr. Dryfoos. Printed in Los Angeles and distributed in 13 Western states, the edition suffered from anemic advertising and plummeting circulation.
Mr. Sulzberger’s decision left scars in the newsroom, for it involved rare layoffs of staff members. He later acknowledged the trauma. “It was the first time that we kind of threw in the sponge and said, ‘We can’t do it,’ ” he said. “That wasn’t the usual way we did things.”
But The Times did so again in 1967, when it folded its money-losing international edition, which was founded in 1949 and based in Paris. In a merger with the Paris edition of the recently closed New York Herald Tribune, a new entity was formed, The International Herald Tribune. It is now owned solely by The Times after a long partnership with The Washington Post Company.
The Times Company in 1963 consisted of the newspaper, the New York radio stations WQXR-AM and FM, and a share of the Spruce Falls Power and Paper Company in Ontario. Mr. Sulzberger concluded that The Times needed new ventures to generate profits that could insulate it from future strikes, crises in the New York economy or declines in retail advertising as department stores failed.
But expansion required cash, lots of it, far more than could be raised by a small, fusty company with marginal profits at best. “We were privately held,” Mr. Sulzberger said. “Once a year, at the annual meeting of the board, the salaries were passed around on a single sheet of paper. But it wasn’t proper to look.”
All that changed in 1969, when millions of Times shares began to be traded on the American Stock Exchange. (Since 1997 they have been traded on the New York Stock Exchange.) That opened the newspaper to outside investors, as well as to outside scrutiny.
The Family Trust
Significantly, the shares put into play in 1969, Class A shares, were kept separate from a Class B category of stock that controlled the company. Holders of those strategic B shares elected 70 percent of the company’s directors. And nearly all the B stock — more than 87 percent when Mr. Sulzberger stepped down in 1997, and 91 percent today — was held by a family trust established by Adolph Ochs and passed on to his heirs. (Family members also own Class A shares, bringing their total holdings to 15 percent.)
The terms of the trust were revised from time to time, most recently in 1997. But its purpose was unvarying: to ensure that control of The Times would remain with Ochs descendants deep into the 21st century unless they agreed that the only way to save the paper was to sell it.
The importance of keeping the family in charge cannot be overstated, said Walter E. Mattson, a former Times Company president. “The commitment to the quality of the paper and the correctness of the paper — the willingness to suffer the criticism of Wall Street, for example — is best dealt with by a member of the family,” he said.
With money in hand from the public sale of Class A stock, The Times began looking for properties to buy.
It made its first major purchases in 1971, from Cowles Communications. For $52 million in Times stock and the assumption of $15 million in Cowles debt, The Times acquired Family Circle magazine, a group of medical magazines, three Florida newspapers, a textbook publishing house and a Memphis television station. Those businesses are no longer in The Times’s hands, but they were the foundation for a future diversified company.
In October 1997, when Mr. Sulzberger passed on the chairmanship to his son and assumed the title of chairman emeritus, the company included 21 regional newspapers, 9 magazines focused on golf and other outdoor pastimes, 8 television stations, 2 radio stations, a news service, a features syndicate and The Boston Globe, which was bought in 1993 for $1.1 billion, a record price for a newspaper, paid largely in Times Company stock.
The Times Company’s holdings have varied since then; today they are principally confined to The Globe, The International Herald Tribune and its flagship paper.
Not every acquisition under Mr. Sulzberger was a success. Nor did investors applaud every move. Times stock was often an underachiever on Wall Street, as it was in the first years after The Globe was bought. Many thought the company had paid too much for the paper. Mr. Sulzberger disagreed. “It was a business that we understood,” he said. “We know how to run newspapers.”
That philosophy — focus on what you do well — guided his business decisions. Ignoring it would only land the company in trouble, he believed. The Times got into the book business in the 1970s, then got out when it realized it could not apply to books the same rigorous standards of accuracy that it insisted on for its own newspaper. (The Times Books imprint was licensed to Random House for many years. It returned to The Times in 2000 in partnership with Henry Holt & Company.)
A big mistake, Mr. Sulzberger said, was Us magazine, a weekly The Times started in the 1970s, patterned on People magazine but without its verve or circulation.
“That’s when I came up with a cardinal rule,” he said. “Never acquire a company that you’re embarrassed to tell your mother about. Or start a publication, in this case. It really isn’t our line of work, and we weren’t very good at it.”
Us was sold within three years; it is now published, as Us Weekly, by Wenner Media.
Unions and Women’s Issues
The Times Company Mr. Sulzberger passed along in 1997 was light years from the one he had inherited. Revenue in 1963 was $101 million, with The Times newspaper accounting for almost all of it. By 1997, the total was $2.6 billion, with the newspaper accounting for about half. In 2011, The New York Times Media Group, made up of The Times, The International Herald Tribune and their Web sites, accounted for 66 percent of the Times Company’s total revenues of $2.4 billion.
The Times eventually had less reason to worry about the sort of labor strife that plagued New York papers through the 1960s and ’70s. Its newfound strength was shown when it bounced back quickly from an 88-day press workers’ strike in 1978, the last time it endured a significant walkout.
In Mr. Sulzberger’s early years as publisher, his attitude toward the dozen or so newspaper unions in New York ranged from annoyance to hostility. More than a few workers considered him less than generous when it came to salaries and health benefits.
He was also challenged on other grounds. In the 1970s, the company was hit with a class-action lawsuit accusing the paper of practicing systematic discrimination against women in hiring, promotions and pay. The Times reluctantly settled the suit in 1978 by agreeing to pay $350,000 in back wages and lawyers’ fees and committing itself to an affirmative-action program for women in both the news and business departments.
In later years, labor relations at the paper became far more conciliatory. Long-term no-strike contracts were signed, offering productivity incentives for some employees, like press workers. But the 1960s and ’70s were a confrontational era.
The Times, like newspapers across the country, had embraced automated typesetting and high-speed printing operations as a way to shrink the work force and cut costs. Mr. Sulzberger believed that New York’s powerful craft unions, seeking to protect jobs and maintain their memberships, stood in the way of the new money-saving technologies.
The problem went beyond the strikes called every few years. His paper, he felt, was bleeding from a thousand cuts in the form of repeated slowdowns, whether by typesetters, press workers or truck drivers.
Still, Mr. Sulzberger did not confront the unions head-on until the 1970s. A turning point came in 1974, when The Times and The Daily News reached an agreement with Local 6 of the International Typographical Union, the same printers’ union that had struck in 1962.
In return for substantial bonuses and guarantees of lifetime employment, “Big Six” agreed, finally, to let the newspapers bring computers and electronic typesetting into composing rooms, which were still being run much as they had been for 90 years, with clanking machines that cast type in hot lead, one line at a time. The old composing room — big, burly, noisy and, yes, romantic — would disappear in a few years, followed over the next 20 years, through attrition, by some 800 jobs.
While it was a breakthrough, the agreement did not improve the paper’s finances right away. Circulation and advertising were stagnant, and so were profits. A national recession in 1975 hit New York especially hard, and the city teetered toward bankruptcy. For The Times, the solution was to make striking changes in both its content and appearance.
Profitable Innovations
One move, in 1976, was to adopt a six-column format, replacing the denser eight-column configuration that the paper had used since 1913. This put The Times in line with other newspapers that were standardizing their formats as a convenience to many advertisers.
The same year, the daily paper expanded to four sections from two, adding stand-alone business and metropolitan news reports and a daily mini-magazine devoted to consumer-oriented subjects. The paper also introduced four Sunday regional sections in 1976, aiming them at New York’s affluent suburbs.
These innovations were followed in 1980 by a new national edition, which Mr. Sulzberger began with some trepidation, remembering the Western edition’s failure two decades earlier. But it had become less costly to print a version of the newspaper at satellite plants around the country than to distribute copies of the New York edition nationally by airplane. The national edition gained circulation steadily and eventually drew its own advertising. It now accounts for more than half of the print newspaper’s daily circulation of 779,731.
Photos and graphics began appearing in color in Sunday sections in 1993. A Times Web site was introduced in 1996. The next year, special editions for New England and the Washington area were started. And despite some head-shaking on Wall Street, the paper invested nearly $1 billion in new printing plants in Edison, N.J., and College Point, Queens. (The Edison plant was later sold.) In September 1997, the daily Times began to be printed in color, its news report now spread across as many as six sections.
When these changes started in the mid-1970s, there was no mystery to them. Mr. Mattson, who was general manager at the time, saw the consumer-oriented sections as a way to lure new advertisers and readers. It was up to the newsroom, under Mr. Rosenthal, to figure out how to create the sections with taste and intelligence so that they offered more than tips on how to get chocolate stains out of the carpet.
The Weekend section on Friday came first, in April 1976. It was followed by Living on Wednesday, Home on Thursday and SportsMonday.
What to do with Tuesday turned into a tug of war. The business staff wanted a fashion section, seeing it as a sure way to bring in advertising. The newsroom preferred a science section, seeing it as more in tune with The Times’s traditional mandate.
As arbiter, Mr. Sulzberger sided with the news department.
His decision paid an unforeseen dividend when the new Tuesday section, Science Times, began generating advertising, thanks to the advent of the personal computer.
Science Times was the publisher’s way of reinforcing a conviction that the editors, not the business staff, ultimately determined what went into The New York Times.
There would be skirmishes over containing costs, as when advertising declined after Wall Street’s Black Monday in October 1987 and belts were tightened.
But The Times continued to pour money into news gathering when many American papers were pulling back.
“Punch was a very staunch advocate for the editor; it wasn’t even a close call,” Arthur Gelb said in 1998. Mr. Mattson said, “There was never an issue as to the business people involving themselves in the news side of the newspaper.”
Living, Home and the other feature sections did not draw unanimous raves, however. The main complaint was that the sections were unbefitting the serious-minded newspaper that had published the Pentagon Papers and pioneered formats like the Op-Ed page, created in 1970 as a forum for outside voices.
“Did you hear about the latest Times section?” one joke went. “They’re calling it News.”
But the skeptics were soon silenced. Not only were the sections imitated; they also brought The Times tens of thousands of new readers.
Mr. Sulzberger was not inclined to apologize for turning a profit.
“A financially sound Times is good for our readers, our advertisers, our shareholders and our employees,” he wrote to The Wall Street Journal, responding to criticisms there, in 1978. “A financially sound Times also is good newspapering. A newspaper that is broke is not going to be able to spend the money necessary to support thorough and aggressive reporting.”
Passing the Torch
As the years passed, Mr. Sulzberger turned his attention toward handing over leadership to the next generation. In 1992, his son, Arthur Sulzberger Jr., took over as Times publisher. The question then became whether the younger Sulzberger would also inherit his father’s corporate titles. Some board members wanted to see someone from outside the family assume those posts.
In 1997, Mr. Sulzberger rallied family members around his son as chairman. But his sisters, he later said, wanted roles for their own branches of the family. “So we came up with compromises and things that worked wonderfully well for the organization,” he said. Arthur Sulzberger Jr. was made chairman. A cousin, Michael Golden, was named senior vice president and vice chairman. The chief executive was from outside the family, Russell T. Lewis.
Once again, the Times family had avoided the kind of internal strife that had battered newspaper clans like the Binghams of Kentucky and the Chandlers of Los Angeles.
The family sense of duty was strong. It was captured in a eulogy that Susan W. Dryfoos, Orvil Dryfoos’s daughter and the director of the Times History Project, delivered in February 1990 for her grandmother, Iphigene Ochs Sulzberger.
“We, your family, understand our birthright,” Ms. Dryfoos said.
Siblings With a Mountain
Arthur Ochs Sulzberger’s share of that birthright took shape in a town house in Manhattan on 80th Street, off Fifth Avenue, where Arthur Hays and Iphigene Ochs Sulzberger had settled a few years before Arthur’s birth.
Family members remember a boy with a sunny disposition, doted on by relatives, including his three sisters, although, he said, “Judy and I fought like cats and dogs when we were little.”
He liked games requiring manual dexterity. He was fascinated by gadgets, a passion that became lifelong, though it did not extend to computers. Even after a desktop computer was installed in his office, he was not known to have used it. He preferred his old Underwood manual typewriter.
Growing up in The Times’s guiding family meant a life spent in the company of the famous and powerful. Young Punch played Chinese checkers with Wendell L. Willkie, the 1940 Republican presidential nominee. Adm. Richard E. Byrd, whose Antarctic expeditions were financed by The Times, named a peak in Antarctica for him and the other Sulzberger children. The name, Marujupu Peak, is a combination of Marian, Ruth, Judith and Punch. Mount Iphigene is nearby.
But it was also a family that shunned ostentation and put a premium on social obligation. When Arthur was 8, his mother sent this note to his father:
“Punch the other day said he would like to be king of the world. ‘What would you do then?’ I asked. ‘Well,’ said he, ‘I would first make all the countries stop fighting and be friends and then I would fix up Germany.’ ‘How would you do that?’ ‘I would make Hitler into a plain man and make them get a good man as President (he thinks Hitler is President of Germany) and then I would go to Africa and kill all the bugs that make people sick.’ ”
What struck Mrs. Sulzberger was that her son had talked about what he would do for others, not for himself.
Punch Sulzberger’s qualities did not include a talent for studies. His boyhood was a succession of expensive private schools.
“Nearly every school in the vicinity of New York was graced with Punch’s presence at one time or another,” his sister Ruth wrote in 1963 for Times Talk, the newspaper’s staff publication at the time. “They were all delighted to have him, but wanted him as something other than a spectator.”
At one school, St. Bernard’s, on the Upper East Side, there was a yearly Latin play, with roles assigned according to one’s proficiency in the language. Punch played a mute slave.
It amused the self-effacing Mr. Sulzberger later in life to tell stories like that about himself. But back then, his academic failures were a source of pain. His mother thought he had dyslexia.
Whatever the problem, it drove him at age 17 to drop out of the Loomis School in Windsor, Conn., and join the Marines. It was 1944, and World War II was raging. His parents did not like the idea but finally gave written permission. “My family didn’t worry about me for a minute,” he later said. “They knew that if I got shot in the head, it wouldn’t do me any harm.”
Buckling Down
The Marines turned him around.
“Before I entered the Marines, I was a lazy good-for-nothing,” he once told his mother. “The Marines woke me up.”
He would never forget the corps. Afterward, its commandants were invited to The Times for lunch, and tours by Marine information officers were a regular newsroom feature.
Trained as a radioman, Mr. Sulzberger went through the Leyte and Luzon campaigns in the Philippines, then landed in Japan as a jeep driver at Gen. Douglas MacArthur’s headquarters. He was discharged in 1946 as a corporal.
Five years later, with the Korean War on, he was called back to active duty. This time he received an officer’s commission and served as a public information officer in Korea before being transferred to Washington. He was a captain when he returned to civilian life in December 1952.
Between the military tours, Mr. Sulzberger got serious about academic work. Armed with a high school equivalency degree and, he said, “armed with the fact that my old man was on the board,” he entered Columbia University and received a bachelor of arts degree in English and history in 1951. In 1967, he became a life trustee.
While in college, in 1948, Mr. Sulzberger married Barbara Winslow Grant, who lived near the Sulzbergers’ estate in Stamford, Conn. They had two children, Arthur Jr., born in 1951, and Karen, born in 1952. Karen Sulzberger is a former arts administrator and board member of Facing History and Ourselves, an educational organization.
The Sulzberger marriage ended in divorce. Months later, in December 1956, Mr. Sulzberger married Carol Fox Fuhrman; they had met at a New York dinner party given by Orvil Dryfoos’s brother, Hugh. The couple had a daughter, Cynthia, in 1964. She is an elementary school reading specialist.
By a previous marriage, Mrs. Sulzberger had a daughter, Cathy, who was born in 1949 and legally adopted by Mr. Sulzberger. Cathy Sulzberger is a real estate developer who serves on many philanthropic boards.
Carol Sulzberger died in 1995. In March 1996, Mr. Sulzberger married Allison S. Cowles, widow of William H. Cowles III, who had been president and publisher of newspapers in Spokane, Wash. The two families had a newspaper connection going back to the early 1900s, though Ms. Cowles was not from the branch of the family that directed Cowles Communications. Ms. Cowles died in 2010 in Spokane.
Besides his children, Mr. Sulzberger is survived by two of his three sisters, Marian S. Heiskell of New York and Ruth S. Holmberg of Chattanooga, Tenn., and nine grandchildren. The third sister, Dr. Judith P. Sulzberger — the Judy to his Punch — died in 2011.
A Publisher With Patience
Having decided to go into the family business, Mr. Sulzberger went off in 1954 for a year’s apprenticeship at The Milwaukee Journal. Then it was back to The Times: first the foreign copy desk, then stints in the London, Paris and Rome bureaus. He was not to make his mark as a foreign correspondent.
One day in 1955, he went to a car race in Le Mans, France — as a spectator, not as a reporter. A driver lost control and jumped the track, his car spinning through the air and churning through the stands. The driver and 82 spectators were killed. It was a horrifying scene. But it never occurred to Mr. Sulzberger to notify The Times.
Some in journalism had the impression that Mr. Sulzberger was not cut out to be publisher, and his reputation was hardly bolstered by the relatively minor assignments given to him when he returned to The Times’s main office. But after being named publisher, he quickly took charge.
One of Mr. Sulzberger’s qualities as head of a high-tension operation was that “he had a soothing personality,” said James C. Goodale, a former Times senior executive and the newspaper’s general counsel in the battle over the Pentagon Papers. John D. Pomfret, a former Times executive vice president who worked alongside Mr. Sulzberger at The Milwaukee Journal, said the publisher had remarkable patience.
“He could outwait a problem,” Mr. Pomfret said. “He was like an oyster almost. If something was irritating him, it’s almost like he would cover it up.”
Not always, though. In the mid-1980s, his irritation turned to anger when Sydney H. Schanberg, who had won a Pulitzer Prize for his reporting in Cambodia, attacked some of Mr. Sulzberger’s business friends and even The Times itself in a series of columns about proposals for a highway along the Hudson River called Westway. The publisher stripped Mr. Schanberg of his column.
In 1973, Mr. Sulzberger provoked howls of outrage, both inside and outside the Times building, when he hired William Safire, a speechwriter in the Nixon White House with no newspaper background, to be a columnist on the Op-Ed page. Mr. Sulzberger wanted a conservative voice to balance the paper’s mostly liberal columnists. Mr. Safire, who died in 2009, won the Pulitzer Prize for commentary in 1978.
On Mr. Sulzberger’s watch, The Times won the Pulitzer Prize, American journalism’s highest award, 31 times.
His commitment to defending the First Amendment was unflagging. He was publisher in 1964 when the United States Supreme Court issued a landmark ruling, New York Times Co. v. Sullivan, strengthening newspapers’ protections against libel suits brought by public figures. (The case began before he took charge of the paper.)
Less successfully, Mr. Sulzberger supported Times reporters in federal and state cases in the 1970s in which they asserted that the First Amendment gave them the right to keep the names of confidential sources secret from grand juries and to refuse to surrender their notes in a murder trial.
In a 1972 speech, Mr. Sulzberger warned that there was a “troubling lack of understanding of the importance of a free press to a free society and an absence of sensitivity to its fragility.”
No case during his career loomed larger than that of the Pentagon Papers.
Mr. Goodale said he had felt that Mr. Sulzberger, given his military background and centrist politics, was at first “semi-committed not to publish” the secret government history of the Vietnam War. Among the factors weighing on the publisher was pressure from The Times’s outside law firm, Lord, Day & Lord. Its senior partner, Louis M. Loeb, warned that he would not defend the paper in court if it printed the documents.
“Lord, Day & Lord was a well-established firm numbering among its clients the Cunard Line,” Mr. Sulzberger said at a 1996 dinner of the Committee to Protect Journalists. “Whether they were traumatized by the loss of the Titanic, I really can’t say. But they certainly were cautious.”
Lighthearted as that comment was — and inaccurate; the Titanic was owned by the White Star Line, not Cunard — Mr. Sulzberger acknowledged in interviews that he was “scared to death” that the Pentagon Papers could blow up in his face, that “all sorts of terrible things would happen,” from fines to ruined reputations. But he stood firm, published the documents, found another law firm and won.
Calls From the White House
Mr. Sulzberger was not unfamiliar with outside pressures on what he should or should not print. In 1963, President John F. Kennedy tried to get him to replace The Times’s correspondent in South Vietnam, David Halberstam. Mr. Halberstam stayed put.
In 1967, with the war in Vietnam going strong, Secretary of State Dean Rusk called Mr. Sulzberger to express concern about the work of another Times reporter, Harrison E. Salisbury, who was writing from Hanoi.
But there were fewer calls like those than many might think, Mr. Sulzberger said after he retired. In general, he said, government leaders “seemed loath to come directly to me.”
During the Reagan administration, he was invited to lunch in the Oval Office. “When I got through,” he recalled, “I made a telephone call to my mother. I said, ‘Mom, guess who I had lunch with?’ She said, ‘Who, darling?’ I said, ‘The president, the vice president and the secretary of state.’ She said: ‘Oh, that’s wonderful. What did they want?’ ”
Mr. Sulzberger added with a laugh, “I never did find out what they wanted.”
More typically, complaints about news coverage came from social acquaintances, not that it did them much good. The publishing magnate Walter H. Annenberg was so enraged by an article on a seamy side of his family’s past that he kept TV Guide’s advertising out of The Times for months.
As Mr. Sulzberger neared retirement, outside activities, like his involvement with the Metropolitan Museum of Art, took up an increasing share of his time. He was the museum’s chairman from 1987 to 1998 and had been a trustee since 1968, as his father had been years earlier.
His time as chairman was one of expansion: museum attendance and membership rose significantly, and the budget nearly doubled, to $116.6 million from $64 million. But fund-raising was a chore he preferred to leave to others. “I’m expected to be prepared to go and rattle the cup, but luckily they don’t call upon me too often,” he said.
When he left as Times chairman in 1997, he remained convinced that newspapers — at least good newspapers — had a bright future.
“I think that paper and ink are here to stay for the kind of newspapers we print,” he said in a post-retirement interview. “There’s no shortage of news in this world. If you want news, you can go to cyberspace and grab out all this junk. But I don’t think most people are competent to become editors, or have the time or the interest.”
“You’re not buying news when you buy The New York Times,” Mr. Sulzberger said. “You’re buying judgment.”