The Metropolitan Museum of Art, founded in 1870, is enjoying a golden age. And if a single person can be said to have shaped this extraordinary era, it's Philippe de Montebello, the director of the museum since 1977. The announcement last winter that he would step down by the end of the year has provoked stock taking and soul searching inside the museum and beyond. This month an exhibition opens, "The Philippe de Montebello Years: Curators Celebrate Three Decades of Acquisitions," that draws on work from the museum's 17 curatorial departments. Among the treasures assembled: a Kongo power figure, its massive torso bristling with nails; a guitar that once belonged to Segovia; a sweetly sensuous yet austere Buddha from fifth-century India; Rubens's sumptuously romantic self-portrait with his young wife; and, most extraordinary of all, Duccio's intimate, gentle, tiny Madonna and Child, a work that almost immediately found its way into the hearts of many New Yorkers when it was purchased in 2004. These wonders join the 2 million objects in what may well be the greatest encyclopedic art museum in the world. But acquisitions are only the beginning of the glories of the de Montebello years.
This has in many respects been an unlikely golden age, unfolding as corporate and government support shrinks and the conviction grows among cultural arbiters that the public will invariably gravitate toward the latest pop sensation rather than the art of the past. De Montebello, however, isn't among the pessimists. Rejecting the idea that a museum lives or dies on the basis of a few heavily marketed blockbuster events, de Montebello operates, as he told me in May, under the assumption that "the public is a lot smarter than anybody gives it credit for. The public as a whole has intellectual curiosity. These are people who know the difference between a serious show and pure sham." This theory and its corollary -- "If you take the high road, the public comes to expect it" -- animate nearly everything the museum does. What some may regard as the Metropolitan's overly rarefied adventures include exhibitions devoted to Byzantine art and Renaissance and Baroque tapestries, the immaculate reinstallation of the Greek and Roman collection in new galleries, and the recent "Poussin and Nature: Arcadian Visions," a show that explored the poetic reveries of the most rigorously intellectual of all painters. The pundits expect the majority of museumgoers to reject Byzantine icons, Renaissance tapestries, Greek vases, and Poussin's landscapes. But at the Metropolitan, the pundits are proven wrong -- time and again.
In a period when museums seem increasingly schizophrenic -- bouncing between a curator's desire to present the best work in the most illuminating way and a marketing executive's obsession with the next blockbuster show or high-profile building scheme -- the Metropolitan offers nuance and coherence. Rather than herding visitors into the big show or the galleries with the greatest hits, the Metropolitan encourages them to take in a variety of unexpected treasures, often less well-known but equally worthy of attention. Every day, visitors linger over works on paper -- a Rembrandt drawing, for instance, or a 19th-century children's book, or a 20th-century photograph -- placed front and center, just off the great staircase on the second floor. In most museums, prints and drawings and vintage photographs are relegated to a remote location, filed away as specialized items for specialized tastes. Here they occupy prime real estate, where they can't be missed. The assumption at the Metropolitan -- the most visited attraction in the most museum-conscious city in the country -- is that the public can respond to anything beautiful or curious or rare, if only given a chance. De Montebello -- 72, tall and elegant, with a deep, French-accented voice that visitors have come to know from the audio tours -- was born in Paris, and after World War II emigrated with his family first to Canada and then to the United States, where he attended the Lycée Français in New York and then Harvard. Most of his professional life has been spent at the Metropolitan. He began as a curatorial assistant in the department of European painting in 1963 and, after being named director of the Houston Museum of Fine Arts in 1969, returned to the Met five years later, where he served as an assistant to then-Director Thomas Hoving.
Hoving was a controversial figure, a showman's showman who brought the Metropolitan into the blockbuster era, some would say kicking and screaming (he was the mastermind behind the exhibition of Egyptian antiquities that came to be known as "King Tut"). By the time the museum turned to de Montebello, many on the board of trustees were weary of the man the art critic Hilton Kramer had described as a "master of the revels." Whatever the initial perception of de Montebello -- and many saw him as relatively weak, perhaps as somebody the board could control -- this will surely go down as one of the greatest hires in the history of American museums. Although he may have been brought in to be the anti-showman, de Montebello has reinvented showmanship, demonstrating that it can be intellectually grounded, civic-minded, and fiscally sound.
In the United States, the Museum of Modern Art, founded in 1929, was the model for a new, multilayered vision of the museum's place in society. The Modern published books and catalogs that largely established the study of modern art as a scholarly discipline; the schedule of temporary exhibitions, both in New York and traveling across the country, shaped our understanding of the major and minor achievements of the 20th century; and the museum became a dynamic social force in the city, through the popularity of its sculpture garden and restaurants as meeting places, and of the programs of classes, lectures, and films that acted as a magnet for a growing modern-art-minded public. Quickly, other institutions embraced the approach pioneered at the Modern under its legendary founding director, Alfred H. Barr Jr. For American museums, the half century from the 1930s through the 1970s turned out to be the confident years, a period of seemingly endless growth and prosperity, when the ambitions of the museums were apparently in sync with the ambitions of the country as a whole.
Since then, much has changed. A decline in arts-and-humanities education at all levels and the move of middle-class audiences from the urban centers where museums are generally located have threatened to marginalize museum-going. And museums have found themselves increasingly hemmed in by sharp cuts in corporate giving on one side and a white-hot art market on the other. They have continued to grow in the face of these problems, in part by spotlighting their potential as tourist destinations. Their growth has tended to be strategic, with marquee-style events -- that new building plan, that blockbuster show -- often masking a shrinking focus on the long-term development of the permanent collection and on the scholarly activities that add weight to an institution. Marketing executives have been given previously unheard-of power, and some organizations are afraid to make a move without consulting focus groups; museum audiences are being treated like nothing so much as guinea pigs.
The Boston Museum of Fine Arts, one of the greatest in the country, has drastically cut curatorial departments and positions. The Albright-Knox Art Gallery in Buffalo, New York, in an effort to shore up an endowment to buy contemporary art, sacrificed to the auction block (and almost inevitably to the private sector) great rarities in its holdings of Graeco-Roman and Asian art. (Ironically, one of the Albright-Knox deaccessions, a magnificent classical bronze of Artemis, is currently on display at the Metropolitan, on loan from the European collector who bought it for $25 million.) But certainly the most extreme case of a museum done in by its own supposed marketing savvy is the Guggenheim, which under its former director, Thomas Krens, inaugurated an aggressive expansion plan. This began with a great success -- the satellite museum, designed by Frank Gehry, which opened in Bilbao in 1997 -- but since then Krens's operation has deteriorated, with the Frank Lloyd Wright building in New York now basically a rent-a-rotunda operation for trendy art events, while the museum's permanent collections and scholarly mission are almost entirely sidelined.
Many who have written about the contemporary museum agree that there's too much focus on the business model. It's no wonder that several books critical of the recent drift of events have an ironic Inc. in their titles. Paul Werner's Museum, Inc.: Inside the Global Art World is a rollicking little screed that takes aim at Krens and the Guggenheim. Of Krens's boasts about increased attendance, Werner writes:
It wasn't clear where attendance had risen, and for what shows, and what the investment had been for each. Besides, the actual space available for exhibitions ballooned in those years, so it's possible the ratio of visitors to square footage actually shrank.
Werner has a good eye for the smoke and mirrors of the marketing people and what it often serves to hide, which is a synergy between art museums and corporate ambition that has little to do with art itself. Writing after the Enron scandal, Werner points out that several of the lead players, among them Kenneth Lay and Andrew Fastow, were "avid sponsors of contemporary art," experts in what he terms "Cutting-edge Cronyism."
More than any other museum in this country, the Metropolitan Museum of Art has insisted on a coherent sense of purpose -- which boils down to the coherence of the museum-going experience. De Montebello's epiphany occurred in 1988, a decade into his directorship, when a great Degas retrospective had opened at the museum. At the time, the Metropolitan was charging a separate admission fee for certain special exhibitions, a practice that remains widespread in the museum world today and that encourages people to visit a temporary show while bypassing the permanent collection, which often contains works far more important than those they've come specially to see. After the Degas show, the Metropolitan abolished this Balkanizing approach, henceforth offering a single admission that covers both the permanent collection and all temporary shows.
"My view," de Montebello told me, "was that you have to be able to encourage people to come many times to a show as great as the Degas retrospective....If you have only one ticket, which you've paid a lot of money for, you're only going to see the show once. And if you can't come back three or four times, you're not really seeing the show. From the point of view of reaching people -- and of civility -- tickets and charging was not a good idea. I also felt, from the purely fiscal point of view, that in fact by encouraging return visits, the money you would lose from tickets, you would gain from more visits." True, the Metropolitan under de Montebello hasn't exactly spurned the sort of bread-and-circus events that PR people love. Last spring's "Superheroes: Fashion and Fantasy," a swankily engineered romp through comic books and designer clothing, is nothing to be proud of, and the museum has produced plenty of this stuff. But at the Metropolitan, such shows are never allowed to become the main event, at least not for more than a few weeks. An air of proportion and authority permeates the Metropolitan, which ultimately derives from de Montebello's sense of the authority inherent in great works of art. Although sometimes criticized as elitist, he believes in the populist appeal of elite forms of art, which can speak to many different people in many different ways.
He wants to move museums beyond the boom-or-bust mentality that blockbuster events tend to create. The Metropolitan declines to keep separate profit-and-loss statements for each show; that way, the financial people can't cherry-pick potential moneymakers or dismiss unprofitable exhibitions. "The budget," he says, "is there to support the program -- not the other way around." By offering a truly heterogeneous experience, de Montebello has taken the Metropolitan out of the game of "guess which shows will fill the till," an approach that in museums across the country results in a glut of predictable Impressionist and Picasso splashes. What de Montebello understands is that the public is actually hungering for something else.
The curators who work with de Montebello are used to seeing their dreams realized. Although curators have a largely behind-the-scenes role in museums, they're supremely important figures in the cultural world, and the team at the Metropolitan is the best there is. They oversee the permanent collection, which involves caring for and presenting objects already in the museum, while keeping an eye out for objects on the market or in private collections that might enhance the holdings. They organize temporary shows, often building on the strengths of the permanent collection. And through their writings for catalogs and other publications, they bridge the gap between scholars and the general public, offering a unique synthesis of evolving knowledge. Unlike most museums, the Metropolitan has no exhibition committee. Curators go straight to de Montebello, whose adventurous spirit (he has been known to add last-minute shows to already overcrowded schedules) makes for an environment in which creative people flourish. People like Carlos Picón, who installed the new Greek and Roman galleries, and Helen Evans, who organized "The Glory of Byzantium," not only are formidable scholars, they also have a sixth sense for the most effective way to present challenging material.
Nobody more thoroughly embodies the headlong creative spirit of the Metropolitan than Keith Christiansen, the Jayne Wrightsman Curator of European Painting -- immensely tall, whippet thin, a nonstop talker with an endearing mixture of scholarly authority, aesthetic avidity, and unlimited curiosity. A few years ago, I ran into him at intermission during a performance of Lohengrin; under his arm was a paperback of one of Trollope's novels, which he was reading, and he immediately launched into a lyric salute to the wonders of 19th-century fiction. I doubt there's a subject in the arts on which Christiansen wouldn't be able to speak knowledgeably. It was Christiansen who brought Duccio's Madonna and Child to de Montebello's attention, thereby initiating what turned out to be a $45 million purchase. Even though his field is Italian Renaissance painting, he delights in leaping centuries and borders. In an important 1996 show, he argued for the dramatic depth of the altarpieces of the 18th-century Venetian painter Giambattista Tiepolo, which many had hitherto thought of as merely magnificent decorations. And a couple of years ago, when he learned that Pierre Rosenberg, the former director of the Louvre, was organizing a show of Poussin landscapes for Spain, he arranged for the exhibition to come to New York. In Christiansen's essay for the catalog of "Poussin and Nature: Arcadian Visions," you can see a curatorial imagination working at the very highest level. In some 25 beautifully reasoned pages, he moves beyond the traditional idea of Poussin as a master of the strictly plotted narrative painting, presenting us with an infinitely more instinctive, intuitive, and maybe even romantic artist. Rosenberg has written the entries for the individual works, and he is a master of the form, creating what amount to miniature essays that unite scholarly exactitude with an easygoing feeling for the underlying humanity of Poussin's work.
This catalog, which also contains five essays by as many art historians, becomes a conversation about the many ways to approach a great artist and his relationship with the natural world -- a relationship that for Poussin was by turns terrifying and inspiriting, humbling and ennobling. "Poussin and Nature" was a quietly revolutionary show.
The Metropolitan is a rarity in that younger curators are given a chance to work on a truly ambitious scale. Thomas Campbell was still a relatively new kid on the block when in 2002 he organized and mounted "Tapestry in the Renaissance," the immense and audacious exhibition that irreversibly established the centrality of tapestry in 16th-century art. I asked de Montebello about "Tapestry in the Renaissance" and about Campbell. He recalled that his first thought on hearing Campbell pitch the show was that it was going to cost $2 million. "When has it ever been done?" de Montebello remembers asking. "Never," Campbell replied. Which meant, so far as de Montebello was concerned -- even though most people thought 16th-century tapestry was a bore -- that it had to be done. And then 9/11 occurred, six months before the opening day -- just when all the fund-raising needed to be put in place. As Campbell recalls, "What was already a challenging project became an almost impossible one." Again, de Montebello stood firm. "We'll find the money for it. We have to do it." So the museum rallied. When it turned out that some of the tapestries were so big that they would have to come in through the museum's front doors, Harold Holzer, then the head of the press office (he's now the senior vice president for external affairs), got The Times to send a photographer -- and New Yorkers were put on notice that a spectacle was about to be unfurled. The exhibition turned out to be a scholarly sensation and a hit with the public: "The Greatest Show in Town," as The New York Review of Books announced on its cover.
Last year, Campbell mounted an almost equally glorious sequel, "Tapestry in the Baroque," which took the story into the 18th century. And although both shows have now closed, their extraordinary catalogs remain. Recalling the initial conversations about "Tapestry in the Renaissance," de Montebello remembers thinking, "If it bombs, it bombs, but we'll have a great catalog and the only book of its kind on the subject." De Montebello's office is a trove of such catalogs, and he delights in displaying them, jumping up to pull them off the shelves and expressing pride in their heft, their dense endnotes and bibliographies, and their generous indexes. These books -- the permanent record of exquisitely plotted, dazzling but perforce fleeting exhibits -- are perhaps de Montebello's most lasting achievement. Seeing "Tapestry in the Renaissance" convinced me that the vast hunt scenes designed by Bernaert van Orley around 1530 and realized by the weavers of Brussels are among the masterworks of European art, panoramas of a richness and intricacy and poetic loveliness rivaling those of Brueghel. In his catalogs, Campbell, assisted by contributions from many of the best scholars in the field, helps us understand why such astonishing works could be so undervalued. The tradition of art writing that began with Vasari, he explains, tended to downplay the collaborative activity of the Belgian tapestry artists. Also, those who later celebrated tapestry, like William Morris, the prophet of the Arts and Crafts movement in 19th-century England, wanted to see tapestry as more purely decorative than it is in the work of a great storyteller like van Orley.
An encyclopedic museum in the heart of a world-class city has advantages that other institutions do not. When I asked de Montebello what he would do if he were put in charge of a smaller institution -- a museum in a midsize city, say, with one or two masterworks and a fine but second-string collection -- he was unfazed. He said he would focus on smaller exhibitions that showcased the best things in the collection; such shows aren't expensive, and they give a museum a chance to bring in scholars from other institutions as it's drawing local attention to its permanent collection. This is precisely what some of the savviest museums in the country are doing right now.
There are of course no simple answers, certainly not in the high-cost world of museums. It's not even always clear to whom, exactly, our museums belong. When we speak of a great museum as a public institution, we aren't necessarily speaking with much precision. Technically, the board of trustees of the Metropolitan owns the museum's collections -- and this is not an unusual situation. While everybody on museum boards wants to serve the public good, each may have different ideas about the nature of that good. A great many board members come from the business world, where it's believed that if you aren't growing, you're dying; this helps explain the endless building plans and blockbuster schemes, and even the deaccessioning of works, as if any kind of activity were better than none at all. Repeatedly in recent years, some big ego has thwarted a museum board, the most recent example being Eli Broad at the Los Angeles County Museum of Art -- who, after paying for a museum-within-the-museum that bears his own name, announced that he would not in fact donate his collection. Many around the Metropolitan cite its board, relatively large and famously stable, as a source of strength. But in the years to come, the Metropolitan may conceivably be unable to withstand the pressures from a new generation of billionaires who tend to take an interest in contemporary art and who are used to getting their way. De Montebello departs at a time when Steven Cohen, a hedge-fund wizard, has already lent a Damien Hirst and two works by Jeff Koons to this museum, which is only beginning to develop a clear direction in its contemporary-art collection, and if we have learned anything in recent years, it is that few arts institutions are immune to the pressures that can be exerted by a certain kind of gonzo personality.
The trustees at the Metropolitan who are leading the search for a new director, a group headed by Annette de la Renta, have to be acutely aware that the museum's future health depends on the board's dynamic interaction with that new director. This is a complex relationship, for although the director serves at the board's pleasure, a great director should in fact be shaping the board's expectations. And if the trustees at the Metropolitan have any doubt as to what kind of a director they do not want, they need only look 30 blocks south, to the Museum of Modern Art, where Glenn Lowry, who has been director since 1995, has been reshaping the Modern as a business-model museum, a place where huge amounts of money are raised and large crowds are attracted while more and more of the essential matter of the museum experience is neglected. One of the ironies of the past 20 or so years has been that these two premier New York museums have in some sense switched places, with the Modern, once the more adventuresome, increasingly timid and mainstream in its thinking, while the Metropolitan becomes bolder and more innovative. When de Montebello says, "If you take the high road, you pull up your public," he's speaking the language of the great figures who once dominated the Modern, people like Alfred Barr and William Rubin, a language that is more or less dead on West 53rd Street. New York's artists long owed their deepest allegiance to the Modern. But now they and the city's most avid museumgoers really feel at home at the Metropolitan. Even in areas where the Modern once reigned supreme, such as the history of photography, the excitement has more and more been at the Metropolitan, which has mounted a definitive Walker Evans retrospective and unforgettable shows of French daguerreotypes and of English photographs from paper negatives.
Now well over a century old, the Metropolitan, like any institution with a long history, must simultaneously embrace what is best from its past while reimagining itself for the present and the future. No event in recent years has demonstrated the museum's capacity to realize this Janus-faced vision better than the reopening of the now-dazzling Greek and Roman galleries last year, for here everything old is, quite literally, new again. You have only to take a look at the catalog published on this occasion -- Art of the Classical World in the Metropolitan Museum of Art -- to see how conscious a process was at work. Carlos Picón opens this volume with "A History of the Department of Greek and Roman Art." Don't be put off by that dry title; in the historical photographs that Picón has gathered and in his account of 125 years of collecting is the story of the Metropolitan's heroic efforts to establish in the New World a remarkable record of the achievements of the Old -- a record now housed in one of the most delightful, coolly elegant spaces in Manhattan.
Although many of de Montebello's triumphs, like the new Greek and Roman galleries, have involved the art of the past, he recognizes the pressure of the present. Under his watch, not only has the museum's involvement with photography grown enormously, but the Metropolitan has acquired highly significant 20th-century art, including one of Balthus's greatest paintings, The Mountain -- an evocation of a Biedermeier afternoon in the Alps, but with the pleasures of the hiking party somehow becoming atomized, troubled, surreal. Museumgoers will see connections between this painting, completed in 1937, and Courbet's Young Ladies of the Village, from 1851. And it is precisely that kind of connection that de Montebello wants to encourage visitors to recognize. Nothing makes him happier than when an important object in the museum inspires a curator to organize an exhibition -- and when that show in turn gives birth to another show. In this and other ways, the museum maintains an ongoing conversation with the hundreds of thousands of museumgoers for whom a trip to the Metropolitan isn't an occasional event but a regular and vital part of their lives. In the catalog for "Byzantium: Faith and Power," de Montebello describes the show, mounted in 2004, as
the third wing of a "triptych" of exhibitions dedicated to a fuller understanding of the art of the Byzantine Empire, whose cultural and political influence spanned more than a millennium.

That show -- together with "The Age of Spirituality" and "The Glory of Byzantium" -- exemplified the director's vision of the museum as a kind of crossroads, where art of many times and places comes alive in the present. Helen Evans's immense catalog for "Byzantium: Faith and Power" is as gloriously overstuffed as was the exhibition it accompanied. The sections devoted to stone sculpture, metalwork, icon painting, mosaics, and textiles could stand as substantial books in themselves. And Maryan Ainsworth, a curator of painting at the Metropolitan, contributes a brilliant essay on the impact of Byzantine icons on Gerard David, Jan van Eyck, and other European painters during the Northern Renaissance. A catalog such as this is an intellectual and artistic binge -- but a binge without excess. And that, come to think of it, is a fair description of the Metropolitan Museum of Art today.
How long the Metropolitan's golden age will last, nobody can know. Continuing it, even for a time, is going to require a director who can harness the best traditions of American museum-going, with its dazzling merger of populism and elitism. This is a juggling act, no question about it, infinitely more difficult than de Montebello has ever been willing to let us see.
A little more than a century ago, Henry James, on his last trip to America, visited the Metropolitan, which was by then established on Fifth Avenue, and he saw a museum involved in a major housecleaning, eliminating third- and fourth-rate stuff. In The American Scene, his resplendent account of the United States at the dawn of the 20th century, James wrote:
The thought of the acres of canvas and the tons of marble to be turned out into the cold world as the penalty of old error and the warrant for a clean slate ought to have drawn tears from the eyes. But these impending incidents affected me, in fact, on the spot, as quite radiant demonstrations. The Museum, in short, was going to be great.
And so it would be. The first half of the 20th century was the age of collecting, when the Metropolitan acquired the lion's share of its Old Master paintings, classical statues, Byzantine ivories, and Impressionist works. But in the past 20 years, more than ever before, the museum has demonstrated the power the individual object can still exert in a society where mass marketing rules. The climactic purchase of the de Montebello years is the Duccio Madonna and Child, less than 12 inches high -- a modest yet monumental gold-leafed panel to remind us of the golden age that has flourished on Fifth Avenue in the very midst of all the follies of America's new Gilded Age.
Jed Perl is the art critic for The New Republic. His new book, Antoine's Alphabet: Watteau and His World, has just been published by Knopf.
30.9.08
Rorty
By the last years of the 20th century, Richard Rorty was probably the best-known university-based philosopher in the United States. In recent years he has been surpassed in notoriety by the utilitarian ethicist Peter Singer, known for his advocacy of animal rights and the acceptability of euthanizing severely disabled newborns. Rorty, in his time, was accused of murdering truth. He argued that there was no standpoint outside of human descriptions of the world from which to decide that any one view was false and another true. There were only descriptions in more or less convincing language, with more or less convincing uses, by which people might persuade one another how to live in the world.
Rorty called his position pragmatism, following in the grand tradition of John Dewey and William James. Critics called it relativism, or a claim that no view or behavior is better or worse than another, except as it appears to its possessor or practitioner. The unshakeable consistency with which Rorty invited people to downgrade their pretensions about themselves—including philosophers’ giving up a special, privileged access to the right, good, and true—infuriated readers in many different fields, not least his own, for 30 years.
At lectures on the many topics Rorty took up once he had become a significant public figure -- human rights, labor unions, a revival of the political left -- hostile audience members would often revisit the claims he had laid out in Philosophy and the Mirror of Nature in 1979, as if they might get him to recant his basic stance. I witnessed this spectacle a few times, as could anyone who attended a university in the 1980s and 1990s, when Rorty had become a ubiquitous commentator, ceaselessly touring campuses worldwide. Rather than exhort or inspire by force of personality, Rorty in the flesh always acted slightly embarrassed to be the center of attention. His perpetual chagrin seemed the effect of a lightning-quick and catholic mind combined with a temperament so deeply hostile to pretension and so insistent upon the folly of intellectual grandiosity that he must constantly chasten himself.
The myth of Rorty grew up around the belief that he was a sudden apostate. A tenured professor of philosophy at Princeton, author of a series of significant papers in the philosophy of mind, he had made a turn to history and wider perspectives that impelled his profession to reject him. The notion that the turn had come out of nowhere—that an analytic philosopher had woken up one morning and denounced his colleagues as absurd, meanwhile rejecting a tradition of epistemology that went back to Descartes or, in later writings, to Plato—added to his authority for many enemies of the analytic style.
This misunderstanding of Rorty's path was corrected unexpectedly when, in later years (he died in 2007, active until the end), he began speaking publicly about his roots, in the lectures published as Achieving Our Country (1998) and in a notable autobiographical essay, "Trotsky and the Wild Orchids," dating to 1993 but republished as the first selection in his more popularly oriented volume Philosophy and Social Hope in 2000. Rorty's formal education had been in a style of intellectual history at the University of Chicago that instilled a command of the past and its pluralism of traditions, as well as a blinkered confidence in "eternal truths" that would trouble Rorty for decades. His father was James Rorty, the socialist and anti-Communist poet, journalist, and polemicist who had belonged to the circles of the New York Intellectuals. His mother, Winifred Rauschenbusch, also an intellectual, was the daughter of the Social Gospel minister Walter Rauschenbusch. Not to turn to large matters of general interest, not to seek a more democratic, secular, and usable tradition of truth-seeking and debate, working through an extensive repertoire of authors from Plato to Dewey to Proust, would have been the real betrayal of his past.
The sociologist Neil Gross' new book on Rorty clarifies the exact details of the thinker's educational and professional activities up to the point, in 1982, when he ascended that larger stage and gained an audience beyond the philosophy department wall. Gross fills in information about Rorty's choices when applying to graduate programs, the contents of his master's and Ph.D. dissertations, the professors and departments regnant in academic philosophy during his early career, Rorty's tenure offers, and things like his correspondence with colleagues and deans. The book adds two sensitive chapters on his mother and father, embracing such topics as James Rorty's suffering while serving in World War I and both parents' personalities and states of mind. Richard is simply given a different kind of attention. His military service, the mood of his childhood, his personal style as an adult, his friendships and interests, are all left out despite Gross’ access to Rorty’s personal papers and correspondence.
The publishers may have done Gross a terrible disservice by marketing his book as a biography. It is explicitly a case study, treating only professional details for purposes of sociology. Specifically, this is the sociology of the dynamics of American university careers in the second half of the 20th century. Its investment in empirical research (Rorty makes a sample of one) is oriented to theory creation and improvement. The monograph's early and late portions include literature reviews and excellent summaries of competing theoretical stances intended for the use of colleagues in the field. A truly superb section of the last chapter argues there was a "shift in the nature of intellectual authority in American academic life" from the postwar years to the 1970s and 1980s, "caused largely by structural transformations" in universities' finances and labor markets, a shift which parallels Rorty's turn from expert practice in his discipline to a critique of the conceit of rigorous expert knowledge (this would help to explain the magnitude of the reception of his Philosophy and the Mirror of Nature).
But this is still not quite biography, and it's bound to be disappointing to those who expect one. True, readers drawn by the announcement of a first biography of Richard Rorty are unlikely to be looking just for personal interest: the facts of Rorty's divorce from the philosopher Amélie Rorty (briefly discussed) or a portrait of the thinker in his free time (not discussed). If they share any of the catholicity of attention of Rorty himself, they may be acquainted with the sociology of knowledge descended from Karl Mannheim, or the history of ideas inherited from Arthur Lovejoy and revised by Quentin Skinner, or the theorization of intellectuals as a distinct class that has preoccupied commentators both academic and popular for a century.
But then Gross' book will disappoint a second time. Gross proposes himself as theorist and practitioner of a "New Sociology of Ideas," which has, as its other active figure, his dissertation adviser. His case study does not cover intellectuals in the ordinary language sense; in this work, Gross is forced to explain, "I use the terms 'intellectual' and 'thinker' as shorthand for 'faculty members in modern American academic settings.'" "Tenure" is almost a holy word in the book, as the grail for which "intellectuals" quest. With such a straitened notion of intellectual practice, Rorty is of interest primarily because his career path started outside the mainstream of his discipline, took him near its center, and then moved him to its periphery again at a higher level of success.
The theoretical innovation that emboldens Gross to declare a new school in sociology is his addition of an idea of "self-concept" -- basically, how professors conceive of themselves as thinkers -- as a determinant of their career behavior. Their un- or pre-intellectual career behavior in turn can influence their choice of subject matter and thought (for example, if they choose topics that help them gain attention and praise). This is supposed to compare favorably with the theories of Pierre Bourdieu and Randall Collins, two previous sociologists whom Gross aims to correct. Bourdieu can indeed be accused of slighting intellectuals' views of themselves and their "identity." A Rorty-like figure, he sought counterintuitive and uncommon forms of explanation for intellectuals' otherwise mystified achievements, looking to unacknowledged struggle, tangible and symbolic capital, and habitus (whole perceptual apparatuses conditioned by group socialization). But Bourdieu was deliberately neglecting the commonplace; it's only a trivially "superior" theory that stakes its claim by restoring the commonplace and universally acknowledged. In Gross' section on Thomas Kuhn's influence on Rorty, he reprises Kuhn's famous distinction between "normal" science -- which takes the existing paradigm of explanation and seeks to make tiny adjustments to come to grips with anomalies -- and "revolutionary" science, which introduces a new paradigm. Gross' contribution is a perfectly reasonable one when acknowledged as a piece of normal science.
Indeed, Gross' novelty of "self-concept" brings him more in line with normal practice in the discipline of intellectual history, a subfield in Gross' neighboring department that has increasingly fallen into desuetude. The classics of 20th-century intellectual biography as practiced by the last generation of intellectual historians -- like Robert Westbrook's life of John Dewey, or Richard Wightman Fox's life of Reinhold Niebuhr -- took into account their subjects' self-conceptions as they connected to the substance of ideas, the details of career moves, and something else, too: the dimension of personality and character. In intellectual biography, this can't be left aside as mere gossip or psychology.
With Rorty, for example, it seems impossible to contemplate his career decisions or his later ideas without acknowledging the odd intellectual temperament by which he became an ironist, and the degree to which he would be conscious of his own desires as products of institutions and circumstance. Treating this position abstractly, in the 1989 masterpiece Contingency, Irony, and Solidarity, he defined the "ironist" as "the sort of person who faces up to the contingency of his or her most central beliefs and desires -- someone sufficiently historicist and nominalist to have abandoned the idea that those central beliefs and desires refer back to something beyond the reach of time and chance." But one can feel the lucid yet tortuous personal side of his ironist's approach to life quite clearly in a tantalizing passage that Gross introduces from Rorty's correspondence from 1971, in a discussion of his thoughts on the left student movement:
I honestly think that we -- the parasitic priestly class which confers sacraments like BAs and PhDs -- are the best agency for social change on the scene. ... This ... requires the continuation of the same claptrap about contemplation we've always handed out, because without this mystique the society won't let us get away with corrupting the youth anymore.
From the perspective of the objectivity of knowledge and the neutrality of teaching, this is damning: Rorty is a leftist who hides his colors in order to push his ideology. From Rorty's perspective, it is the consequence of knowing that your liberal beliefs are conditioned by outside forces (your parents, your educational circumstances, and how you gain access to social and economic power, even as a professor) yet still holding passionately to these beliefs and wanting to try to convince others.
In 1971, he says troublingly that to be allowed to speak, you must play along with what others believe ("claptrap about contemplation," that is, a pretense of the value-neutral teaching countenanced by those outside the university). By 1978, Rorty was trying to tell what he considered the pragmatic truth about that "claptrap" -- that he should teach things he believed in without claiming superior access to truth by contemplation, and that this was an appropriate task not just for political science or journalism but for the hallowed halls of philosophy. How he lived with his double perspective, day to day -- how he could believe, and yet accept the contingency of his belief, and bear up under objections to his position as illogical, insulting, or corrupting -- is the matter that biography still has to illuminate.
29.9.08
P.J. O'Rourke & Cancer
I looked death in the face. All right, I didn't. I glimpsed him in a crowd. I've been diagnosed with cancer, of a very treatable kind. I'm told I have a 95% chance of survival. Come to think of it -- as a drinking, smoking, saturated-fat hound -- my chance of survival has been improved by cancer.
I still cursed God, as we all do when we get bad news and pain. Not even the most faith-impaired among us shouts: "Damn quantum mechanics!" "Damn organic chemistry!" "Damn chaos and coincidence!"
I believe in God. God created the world. Obviously pain had to be included in God's plan. Otherwise we'd never learn that our actions have consequences. Our cave-person ancestors, finding fire warm, would conclude that curling up to sleep in the middle of the flames would be even warmer. Cave bears would dine on roast ancestor, and we'd never get any bad news and pain because we wouldn't be here.
But God, Sir, in Your manner of teaching us about life's consequential nature, isn't death a bit ... um ... extreme, pedagogically speaking? I know the lesson that we're studying is difficult. But dying is more homework than I was counting on. Also, it kind of messes up my vacation planning. Can we talk after class? Maybe if I did something for extra credit?
Why can't death -- if we must have it -- be always glorious, as in "The Iliad"? Of course death continues to be so, sometimes, with heroes in Fallouja and Kandahar. But nowadays, death more often comes drooling on the toilet seat in the nursing home, or bleeding under the crushed roof of a teen-driven SUV, or breathless in a deluxe hotel suite filled with empty drug bottles and a minor public figure whose celebrity expiration date has passed.
I have, of all the inglorious things, a malignant hemorrhoid. What color bracelet does one wear for that? And where does one wear it? And what slogan is apropos? Perhaps that slogan can be sewn in needlepoint around the ruffle on a cover for my embarrassing little doughnut buttocks pillow.
Furthermore, I am a logical, sensible, pragmatic Republican, and my diagnosis came just weeks after Teddy Kennedy's. That he should have cancer of the brain, and I should have cancer of the ass ... well, I'll say a rosary for him and hope he has a laugh at me. After all, what would I do, ask God for a more dignified cancer? Pancreatic? Liver? Lung?
Which brings me to the nature of my prayers. They are, like most prayers from most people, abject self-pleadings. However, I can't be the only person who feels like a jerk saying, "Please cure me, God. I'm underinsured. I have three little children. And I have three dogs, two of which will miss me. And my wife will cry and mourn and be inconsolable and have to get a job. P.S. Our mortgage is subprime."
God knows this stuff. He's God. He's all-knowing. What am I telling him, really? "Gosh, you sure are a good God. Good -- you own it. Plus you're infinitely wise, infinitely merciful, but ... look, everybody makes mistakes. A little cancer of the behind, it's not a big mistake. Not something that's going on your personal record. There's no reason it can't be, well ... reversed, is there?"
No doubt death is one of those mysterious ways in which God famously works. Except, on consideration, death isn't mysterious. Do we really want everyone to be around forever? I'm thinking about my own family, specifically a certain stepfather I had as a kid. Sayonara, you s.o.b.
Napoleon was doubtless a great man in his time -- at least the French think so. But do we want even Napoleon extant in perpetuity? Do we want him always escaping from island exiles, raising fanatically loyal troops of soldiers, invading Russia and burning Moscow?
Well, at the moment, considering Putin et al, maybe we do want that. But, century after century, it would get old. And what with Genghis Khan coming from the other direction all the time and Alexander the Great clashing with a Persia that is developing nuclear weapons and Roman legions destabilizing already precarious Israeli-Palestinian relations -- things would be a mess.
Then there's the matter of our debt to death for life as we know it. I believe in God. I also believe in evolution. If death weren't around to "finalize" the Darwinian process, we'd all still be amoebas. We'd eat by surrounding pizzas with our belly flab and have sex by lying on railroad tracks waiting for a train to split us into significant others.
I consider evolution to be more than a scientific theory. I think it's a call to God. God created a free universe. He could have created any kind of universe he wanted. But a universe without freedom would have been static and meaningless -- the taxpayer-funded-art-in-public-places universe.
Rather, God created a universe full of cosmic whatchamajiggers and subatomic whosits free to interact. And interact they did, becoming matter and organic matter and organic matter that replicated itself and life. And that life was completely free, as amoral as my cancer cells.
Life forms could exercise freedom to an idiotic extent, growing uncontrolled, thoughtless and greedy to the point that they killed the source of their own fool existence. But, with the help of death, matter began to learn right from wrong -- how to save itself and its ilk, how to nurture, how to love (or, anyway, how to build a Facebook page) and how to know God and his rules.
Death is so important that God visited death upon his own son, thereby helping us learn right from wrong well enough that we may escape death forever and live eternally in God's grace. (Although this option is not usually open to reporters.)
I'm not promising that the pope will back me up about all of the above. But it's the best I can do by my poor lights about the subject of mortality and free will.
Thus, the next time I glimpse death ... well, I'm not going over and introducing myself. I'm not giving the grim reaper fist daps. But I'll remind myself to try, at least, to thank God for death. And then I'll thank God, with all my heart, for whiskey.
P.J. O'Rourke is a correspondent for the Weekly Standard and the Atlantic. A longer version of this article will appear in Search magazine. searchmagazine.org.
Greeks (bearing gifts)
Hidden histories
'The Odyssey' and 'The Iliad' are giving up new secrets about the ancient world
By Jonathan Gottschall
NEARLY 3,000 YEARS after the death of the Greek poet Homer, his epic tales of the war for Troy and its aftermath remain deeply woven into the fabric of our culture. These stories of pride and rage, massacre and homecoming have been translated and republished over millennia. Even people who have never read a word of "The Iliad" or "The Odyssey" know the phrases they have bequeathed to us - the Trojan horse, the Achilles heel, the face that launched a thousand ships.
Today we still turn to Homer's epics not only as sources of ancient wisdom and wrenchingly powerful poetry, but also as genuinely popular entertainments. Recent translations of "The Iliad" and "Odyssey" have shared the best-seller lists with Grisham and King. "The Odyssey" has inspired works from James Joyce's "Ulysses" to a George Clooney movie, and an adaptation of "The Iliad" recently earned more than $100 million in the form of Wolfgang Petersen's "Troy" - a summer blockbuster starring Brad Pitt as an improbable Achilles.
The ancient Greeks, however, believed that Homer's epics were something more than fiction: They thought the poems chronicled a real war, and reflected the authentic struggles of their ancestors. But modern scholars have generally been more skeptical. The poems describe a culture that thrived hundreds of years before Homer was born, and which would have seemed legendary even to him. Scholars have allowed that a kernel of historical truth might be tucked beneath the layers of heroic hyperbole and poetic embroidery, but only a small kernel. In the last 50 years, most scholars have sided with the great classicist Moses Finley, who argued that the epics were "a collection of fictions from beginning to end" and that - for all their majesty and drama - they were "no guide at all" to the civilization that may have fought the Trojan War.
But thanks to evidence from a range of disciplines, we are in the middle of a massive reappraisal of these foundational works of Western literature. Recent advances in archeology and linguistics offer the strongest support yet that the Trojan War did take place, with evidence coming from the large excavation at the likely site of Troy, as well as new analysis of cuneiform tablets from the dominant empire of the region. Insights from comparative anthropology have transformed studies of the society that created the poems and allowed us to analyze the epics in a new way, suggesting that their particular patterns of violence contain a hidden key to ancient Greek history - though not necessarily the key that Homer's readers once thought they were being given.
"The Iliad" and "The Odyssey" are our most precious artifacts of early Greek culture. Aside from the dry and voiceless remains of archeological sites, the poems are the last surviving impressions of the society that created them - what the people hoped for, what they despaired of, and how they managed their social and political lives. The poems are time machines - imperfect, surely - that show us people who were so like us, and so different, too. And they are still revealing new truths about the prehistoric civilization that has exerted such a strong formative influence over the art, the history, and even the psychology of the West.
. . .
The desire to find truth in Homer has a long and checkered history, and no figure looms larger than the German businessman and self-taught archeologist Heinrich Schliemann. In 1870 he landed on the western coast of Asia Minor (modern-day Turkey) with a copy of "The Iliad" in his hand. On the plain before him, an unimpressive mound of grass and stone and bushes swelled 100 feet into the air. Tradition had long identified this mound, called Hisarlik, as a possible site of the historical Troy.
Schliemann soon reported to the world, breathlessly, that he and his diggers had found the charred remains of a grand citadel destroyed in prehistory by hostile men - that he had found Troy just where Homer said it would be. The news was a worldwide sensation, and Schliemann's view that the Homeric epics were fairly accurate chronicles of Late Bronze Age history - that is, the Greek world of around 1200 BC - dominated scholarship for more than 50 years.
But, in fact, Schliemann hadn't found Homer's Troy. Hisarlik was occupied from 3000 BC until 500 AD, and subsequent archeological excavations showed that the civilization Schliemann chipped from the mound actually ended more than 1,000 years before the Trojan War could realistically have been fought. When the German archeologist Carl Blegen examined the proper layer of the Hisarlik mound, the settlement he found seemed like a wretched and insignificant place. Schliemann's amateurism, wishful thinking, and instinct for self-glorification had led him into serious error, and ended up discrediting his claim that Homer's poems were historically based.
But the newest digging at Troy is tipping the consensus again, perhaps this time for good. Schliemann and Blegen, it now appears, had only discovered the tip of the iceberg. The mound at Hisarlik thrusts up from the plain, but most of its ruins are concealed beneath the surface. In a project that has now been underway for 20 years, the German archeologist Manfred Korfmann and hundreds of collaborators have discovered a large lower city that surrounded the citadel. Using new tools, such as computer modeling and imaging technology that allows them to "see" into the earth before digging, Korfmann and his colleagues determined that this city's borders were 10 to 15 times larger than previously thought, and that it supported a population of 5,000 to 10,000 - a big city for its time and place, with impressive defenses and an underground water system for surviving sieges. And, critically, the city bore signs of being pillaged and burned around 1200 BC, precisely the time when the Trojan War would have been fought.
In his influential book, "Troy and Homer," German classicist Joachim Latacz argues that the identification of Hisarlik as the site of Homer's Troy is all but proven. Latacz's case is based not only on archeology, but also on fascinating reassessments of cuneiform tablets from the Hittite imperial archives. The tablets, which are dated to the period when the Late Bronze Age city at Hisarlik was destroyed, tell a story of a western people harassing a Hittite client state on the coast of Asia Minor. The Hittite name for the invading foreigners is very close to Homer's name for his Greeks - Achaians - and the Hittite names for their harassed ally are very close to "Troy" and "Ilios," Homer's names for the city.
"At the very core of the tale," Latacz argues, "Homer's 'Iliad' has shed the mantle of fiction commonly attributed to it."
But if the Trojan War is looking more and more like a historical reality, there is still the question of whether the poems tell us anything about the motives and thinking of the people who actually fought it. Do the epic time machines actually take us back to the Greek culture of the Late Bronze Age?
It is almost certain that they do not. Homer's epics are a culmination of a centuries-long tradition of oral storytelling, and extensive cross-cultural studies of oral literature have established that such tales are unreliable as history. Homeric scholars believe that the epics were finally written down sometime in the 8th century BC, which means that the stories of Achilles and Odysseus would have been passed by word of mouth for half a millennium before they were finally recorded in what was, by then, a vastly changed Greek culture. Facts about the war and the people who fought it would have been lost or grossly distorted, as in a centuries-long game of "telephone." Scholars agree that the relatively simple and poor culture Homer describes in his epics is quite sharply at odds with the complex and comparatively rich Greek kingdoms of the Late Bronze Age, when the war would have taken place.
But even if the epics make a bad history of Greece in 1200 BC - in the sense of transmitting names, dates, and accurate political details - scholars increasingly agree that they provide a precious window on Greek culture at about the time the poems were finally written down. Moses Finley, who believed that the epics were "no guide at all" to the history of the Trojan War, did believe they were guides to Homer's own culture. And by turning an anthropological eye to the conflicts Homer writes about, we are now learning far more about what that culture was really like.
. . .
Reconstructing a prehistoric world from literary sources is rife with complications. But there are aspects of life in the Homeric era upon which most scholars agree. Homer paints a coherent picture of Greek attitudes, ideology, customs, manners, and mores that is consistent with the 8th century archeological record, and holds together based on anthropological knowledge about societies at similar levels of cultural development. For instance, we can trust that the Greeks' political organization was loose but not chaotic - probably organized at the level of chiefdoms, not kingdoms or city-states. In the epics we can see the workings of an agrarian economy; we can see what animals they raised and what crops, how they mixed their wine, worshipped their gods, and treated their slaves and women. We can tell that theirs was a warlike world, with high rates of conflict within and between communities.
This violence, in fact, opens an important window onto that world. Patterns of violence in Homer are intriguingly consistent with societies on the anthropological record known to have suffered from acute shortages of women. While Homeric men did not take multiple wives, they hoarded and guarded slave women whom they treated as their sexual property. These women were mainly captured in raids of neighboring towns, and they appear frequently in Homer. In the poems, Odysseus is mentioned as having 50 slave women, and it is slave women who bear most of King Priam's 62 children. For every slave woman working a rich man's loom and sharing his bed, some less fortunate or formidable man lacks a wife.
In pre-state societies around the world - from the Yanomamo of the Amazon basin to the tribes of highland New Guinea to the Inuit of the Arctic - a scarcity of women almost invariably triggers pitched competition among men, not only directly over women, but also over the wealth and social status needed to win them. This is exactly what we find in Homer. Homeric men fight over many different things, but virtually all of the major disputes center on rights to women - not only the famous conflict over Helen, but also over the slave girls Briseis and Chryseis, Odysseus's wife Penelope, and all the nameless women of common Trojan men. As the old counselor Nestor shouts to the Greek hosts, "Don't anyone hurry to return homeward until after he has lain down alongside a wife of some Trojan!"
The war between Greeks and Trojans ends in the Rape of Troy: the massacre of men, and the rape and abduction of women. These events are not the rare savageries of a particularly long and bitter war - they are one of the major points of the war. Homeric raiders always hoped to return home with new slave-concubines. Achilles conveys this in his soul-searching assessment of his life as warrior: "I have spent many sleepless nights and bloody days in battle, fighting men for their women."
Historical studies of literature are sometimes criticized for ignoring, or even diminishing, the artistic qualities that draw people to literature in the first place. But understanding how real history underlies the epics makes us appreciate Homer's art more, not less. We can see Homer pioneering the artistic technique of taking a backbone of historical fact and fleshing it over with contemporary values and concerns - the same technique used later by Virgil in "The Aeneid," by Shakespeare in his history plays, and by Renaissance painters depicting the Bible and classical antiquity.
And understanding Homer's own society gives us a new perspective on the oppressive miasma of fatalism and pessimism that pervades "The Iliad" and, to a lesser but still palpable extent, "The Odyssey." While even the fiercest fighters understand that peace is desirable, they feel doomed to endless conflict. As Odysseus says, "Zeus has given us [the Greeks] the fate of winding down our lives in hateful war, from youth until we perish, each of us." A shortage of women helps to explain more about Homeric society than its relentless violence. It may also shed light on the origins of a tragic and pessimistic worldview, a pantheon of gods deranged by petty vanities, and a people's resignation to the inevitability of "hateful war."
Jonathan Gottschall teaches English at Washington & Jefferson College. He is the author of "The Rape of Troy: Evolution, Violence, and the World of Homer," and he is currently at work on a novel of the Homeric age called "Odysseus, A True Story."
With New York's top investment banks either converting into commercial banks or bankrupt, the city is poised to lose revenue and talent, industry analysts said.
The prediction is that many of the highest-paid traders, analysts, and portfolio managers, frustrated with lower earnings potential at commercial banks — the banks are permitted to use only about one-third the leverage of investment banks and so often post lower returns — will leave to start their own hedge funds and investment vehicles, where possibilities for profit are greater. And unlike the Wall Street banks, Manhattan's borders will not bind these new companies.
"You can start up a hedge fund anywhere you have an Internet connection — you don't even need a phone — as long as you have capital and a good reputation," the chief market strategist at research firm Fusion IQ, Barry Ritholtz, said. "Given the collapse of Wall Street, it is hard to know how many people will decide they don't really need to be here."
That is not to say that hedge funds are not facing their own difficulties. One index of hedge fund performance was down 5.8% for the year going into September, and the funds are now contending with "redemption week," the period when investors say whether they want to redeem their money at year's end. For the bulk of the industry, the deadline for requesting redemptions is tomorrow, and its outcome could have a severe impact on the industry's performance going forward.
Meanwhile, Wall Street is contending with a new landscape, with all five major investment banks bankrupt, acquired, or converted into commercial banks. Last week, Goldman Sachs Group Inc. and Morgan Stanley announced they would become commercial bank holding companies, following on the heels of Lehman Brothers' bankruptcy and Bank of America Corp.'s acquisition of Merrill Lynch & Co. Earlier this year, JPMorgan Chase acquired investment bank Bear Stearns.
As commercial banks, Goldman Sachs and Morgan Stanley will join JPMorgan Chase, Citigroup, and others in submitting to greater regulation by the Federal Reserve. In return, the banks will gain access to the Fed's borrowing facilities. As part of this additional Fed oversight, the banks will be limited to borrowing about $10 for every $1 in equity they hold, compared with $33 for every $1 Bear Stearns held, for example.
By using more leverage, the investment banks were able to increase the size of the bets they were making with their portfolios, allowing for far greater returns. Their compensation for employees also reflected this, with some top earners taking home tens of millions of dollars in bonuses. As commercial banks, their ability to borrow as much, and generate such high returns and salaries, will be more limited. In light of this diminished earning potential, some banking employees may choose to launch their own platforms, where they can continue investing using greater leverage.
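The arithmetic behind that shift in earning power can be sketched in a few lines. The 33:1 and 10:1 ratios come from the article; the 1% portfolio gain is a hypothetical figure chosen for illustration.

```python
# Illustrative sketch of how leverage multiplies return on equity.
# The 33:1 (Bear Stearns) and ~10:1 (commercial bank) ratios are from
# the article; the 1% asset return is a hypothetical example.

def return_on_equity(equity, leverage, asset_return):
    """Return on equity when `equity` is levered `leverage`-to-1
    and the total portfolio gains `asset_return` (0.01 = 1%)."""
    assets = equity * leverage      # total position controlled
    profit = assets * asset_return  # gain on the whole portfolio
    return profit / equity          # gain relative to the firm's own capital

# A 1% portfolio gain at 33:1 leverage yields a 33% return on equity...
print(return_on_equity(1.0, 33, 0.01))

# ...while the same gain at the ~10:1 cap yields only 10%.
print(return_on_equity(1.0, 10, 0.01))
```

The same multiplier works in reverse, of course: a 3% loss at 33:1 leverage wipes out nearly all of the firm's equity, which is why the Fed caps the ratio in exchange for access to its borrowing facilities.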
"Certain top portfolio managers, traders, and analysts will feel motivation to leave and start their own enterprise," a lawyer at Sadis & Goldberg who specializes in hedge funds, Ron Geffner, said, adding he is advising several portfolio managers "who are using their own capital to incubate a track record, so after the market stabilizes they can go out on their own."
He added, "These people are driven by two things: economic reward, but also the knowledge that the major banks no longer offer job stability."
The drive to locate new funds outside New York stems in part from the lower taxes available in places such as Greenwich, Conn., and the Cayman Islands, hedge fund executives said. Also, "the regulatory landscape may change a fair amount, and if the regulation becomes onerous, perhaps you will see hedge funds begin setting up shop in other domiciles," the general partner at hedge fund Aurelian Partners, Brian Horey, said.
But there are several factors that could put a crimp in plans to launch new investment funds. Perhaps the largest stumbling block is the lack of capital. "This is a really tough environment to raise capital, and investors are focusing more and more on someone's track record," Mr. Ritholtz said. "Especially if you are looking to set up your first fund, even if you've had a strong record at an investment bank, it may not be enough now."
Many hedge funds and other firms may also decide to remain in New York because of the availability here of talent and infrastructure. "While some will probably look to places like Greenwich because the local tax treatment is so much more favorable than New York, others may decide the benefit of a broader talent pool and the convenience is a bitter fit," Mr. Horey said.
And as hedge funds face turmoil following redemption week, it may become even tougher. If these funds are forced to unwind their positions so they can return some investments to clients, it could put pressure on the market and cause further difficulties for the financial industry.
Despite these challenges, employees who do disperse and create their own funds elsewhere will accelerate the pain with which Manhattan was already bracing. Investment banks have typically been some of the biggest employers in the city, as well as the biggest commercial tenants and revenue generators. Earlier this month, Governor Paterson said as many as 30,000 New York jobs could be lost as a result of the Wall Street collapse, while Mayor Bloomberg has said it could have a drastic impact on revenue.
"It's generally believed that one Wall Street job helps create two or three other jobs," Mayor Bloomberg said in a press conference earlier this month. "This multiplier effect will have a serious impact not only on the financial-sector employees who lose jobs, but also on many New York families who are indirectly affected by those job losses."
Mark Lennihan
The prediction is that many of the highest-paid traders, analysts, and portfolio managers, frustrated with lower earnings potential at commercial banks — the banks are permitted to use only about one-third the leverage of investment banks and so often post lower returns — will leave to start their own hedge funds and investment vehicles, where possibilities for profit are greater. And unlike the Wall Street banks, Manhattan's borders will not bind these new companies.
"You can start up a hedge fund anywhere you have an Internet connection — you don't even need a phone — as long as you have capital and a good reputation," the chief market strategist at research firm Fusion IQ, Barry Ritholtz, said. "Given the collapse of Wall Street, it is hard to know how many people will decide they don't really need to be here."
That is not to say that hedge funds are not facing their own difficulties. One index of hedge fund performance was down 5.8% for the year going into September, and the funds are now contending with "redemption week," the period when investors say whether they want to redeem their money at year's end. For the bulk of the industry, the deadline for requesting redemptions is tomorrow, and its outcome could have a severe impact on the industry's performance going forward.
Meanwhile, Wall Street is contending with a new landscape, with all five major investment banks bankrupt, acquired, or converted into commercial banks. Last week, Goldman Sachs Group Inc. and Morgan Stanley announced they would become commercial bank holding companies, following on the heels of Lehman Brothers' bankruptcy and Bank of America Corp.'s acquisition of Merrill Lynch & Co. Earlier this year, JPMorgan Chase acquired investment bank Bear Stearns.
As commercial banks, Goldman Sachs and Morgan Stanley will join JPMorgan Chase, Citigroup, and others in submitting to greater regulation by the Federal Reserve. In return, the banks will gain access to the Fed's borrowing facilities. As part of this additional Fed oversight, the banks will be limited to borrowing about $10 for every $1 in equity they hold, compared with $33 for every $1 Bear Stearns held, for example.
By using more leverage, the investment banks were able to increase the size of the bets they made with their portfolios, allowing for far greater returns. Employee compensation reflected this, with some top earners taking home tens of millions of dollars in bonuses. As commercial banks, their ability to borrow as much, and to generate such high returns and salaries, will be more limited. In light of this diminished earning potential, some banking employees may choose to launch their own platforms, where they can continue investing with greater leverage.
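The arithmetic behind those leverage caps is easy to sketch. The following is a hypothetical illustration only (not from the article, and ignoring borrowing costs), treating the quoted borrowing ratios loosely as leverage multipliers: the same small move in asset values produces very different returns on equity at roughly 10:1 versus 33:1.

```python
def return_on_equity(leverage, asset_return):
    """Return on equity for a firm levered `leverage`-to-1,
    ignoring borrowing costs (illustrative sketch only)."""
    return leverage * asset_return

# The two ratios cited in the article, applied to a 2% asset gain:
commercial = return_on_equity(10, 0.02)   # roughly the commercial-bank cap
investment = return_on_equity(33, 0.02)   # roughly Bear Stearns's ratio

print(f"commercial bank ROE:  {commercial:.0%}")   # 20%
print(f"investment bank ROE:  {investment:.0%}")   # 66%

# The same leverage magnifies losses: a 3% asset decline at 33:1
# wipes out nearly all equity.
print(f"33:1 with a -3% move: {return_on_equity(33, -0.03):.0%}")  # -99%
```

This is the trade-off the conversion forces: capping leverage at around 10:1 caps both the upside that funded those bonuses and the downside that sank Bear Stearns.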
"Certain top portfolio managers, traders, and analysts will feel motivated to leave and start their own enterprises," a lawyer at Sadis & Goldberg who specializes in hedge funds, Ron Geffner, said, adding that he is advising several portfolio managers "who are using their own capital to incubate a track record, so after the market stabilizes they can go out on their own."
He added, "These people are driven by two things: economic reward, but also the knowledge that the major banks no longer offer job stability."
The drive to locate new funds outside New York is partly a matter of the lower taxes available elsewhere, in places such as Greenwich, Conn., and the Cayman Islands, hedge fund executives said. Also, "the regulatory landscape may change a fair amount, and if the regulation becomes onerous, perhaps you will see hedge funds begin setting up shop in other domiciles," the general partner at hedge fund Aurelian Partners, Brian Horey, said.
But there are several factors that could put a crimp in plans to launch new investment funds. Perhaps the largest stumbling block is the lack of capital. "This is a really tough environment to raise capital, and investors are focusing more and more on someone's track record," Mr. Ritholtz said. "Especially if you are looking to set up your first fund, even if you've had a strong record at an investment bank, it may not be enough now."
Many hedge funds and other firms may also decide to remain in New York because of the talent and infrastructure available here. "While some will probably look to places like Greenwich because the local tax treatment is so much more favorable than New York, others may decide the benefit of a broader talent pool and the convenience is a better fit," Mr. Horey said.
And as hedge funds face turmoil following redemption week, raising capital may become even tougher. If these funds are forced to unwind their positions so they can return some investments to clients, it could put pressure on the market and cause further difficulties for the financial industry.
Despite these challenges, employees who do disperse and create their own funds elsewhere will deepen the pain for which Manhattan was already bracing. Investment banks have typically been some of the biggest employers in the city, as well as the biggest commercial tenants and revenue generators. Earlier this month, Governor Paterson said as many as 30,000 New York jobs could be lost as a result of the Wall Street collapse, while Mayor Bloomberg has said it could have a drastic impact on revenue.
"It's generally believed that one Wall Street job helps create two or three other jobs," Mayor Bloomberg said in a press conference earlier this month. "This multiplier effect will have a serious impact not only on the financial-sector employees who lose jobs, but also on many New York families who are indirectly affected by those job losses."
27.9.08
BOOKSLUT on GOD
God's Words
The (unnecessary) rise of the spiritual memoir.
By Jessa Crispin
We live in an age of autobiography, one in which young writers cannot even be bothered to change people’s names to create a novel, in which a story’s being true is a greater virtue than its being well written, or insightful, or interesting.
I have a few unyielding standards for a memoir: Either your book must be exceptionally written (a trait hard to find in memoirs these days) or you must have done something exceptional. You must have traveled to the underground or the heavens and come back with fire or golden apples or at least a little wisdom. It can’t just be, “Daddy hit me, mommy got cancer” — everyone has a sad story, and it is possible to go through a trauma or experience something significant without gaining any insight.
You would think that the spiritual memoir would be a standout genre — after all, if the writer has seen the face of God, he or she should probably get a good story out of that. For centuries, people have been telling stories about spiritual experiences, listing out their sins, telling tales of redemption and light at the end of a very dark tunnel. These past few years, however, have seen a crazy rush on the subject matter, with everyone who has ever thought about religion feeling the need to write about it. Approximately half the United States population will convert or adapt their religious beliefs at some point in their lifetime, which equals a lot of potential memoirists.
I suppose the thought process behind publishing these books is that since it’s in the air of our culture, those who are seeking will want to hear other people’s stories. But the same rules from other memoirs apply — just because you lived through something, that doesn’t mean you have anything interesting to say about it. Perhaps the bar is set too high by the original spiritual memoirist, St. Augustine. In his book, he had a great hook — "Give me chastity and continence, but not yet" — and managed to invent the concept of original sin. It’s not like a recent convert to, say, Judaism is going to top that.
Danya Ruttenberg never really fell from grace; she just sort of shrugged it off. She declared herself an atheist at 13, but not because a trauma shook the foundations of her life. She just found temple boring. “The bearded man at the front of the enormous room put his hands out, palms up, and raised them. We all stood on cue, like well-trained dogs… It was very irritating, having to keep standing up and sitting down like that. By the end of service, I could barely contain my contempt.” Ah, the teenage years. Rather than being embarrassed, Ruttenberg lays it all out for us — the social isolation, the Ayn Rand books, the revelations she experienced dancing at Goth clubs.
She builds herself up intellectually, feasting on philosophy, becoming involved in feminist matters, taking college level courses in high school. But when her mother dies, she finds solace in tradition and ritual. She begins to pray, and the concerns she had with religion begin to fall away. Suddenly the patriarchal tone isn’t stultifying; it instead evokes “the feeling of a small child looking to a parent for comfort and safety.” The question of who wrote the Torah — and the meanings behind the rules laid down — suddenly matter less and less. “God was in this book.” That is enough for her.
Perhaps when she named her memoir Surprised by God she was thinking, God really is the narrow man-made construct I thought our society was in the process of rejecting. Surprise! When people find a window to the divine, they have a tendency to think theirs is the only one. Ruttenberg is no different. She falls into the trap of sneering at “idolatry” and thinks practicing yoga is “cheating” her Jewish tradition. Yahweh is a jealous God, indeed.
And as far as other seekers go: If you’re not practicing a religion to the letter of its law, your experience is not as valid as hers. “In order for this to work, one must enter a single practice fully and let go of the attempt to control the terms on which this happens… A novice practitioner picking and choosing from various disciplines will inevitably miss vital parts of the curriculum — most likely the parts that she most sorely needs, that will require the most work for her.” The problem is that Ruttenberg is not a nuanced enough thinker or writer to make proclamations about religion as a whole. She can quote contemporary writers like Annie Dillard or theologians like Merton, and she does liberally, but whenever she stretches beyond describing her own personal experience, her ideas seem half-formed and unsupported.
Her book fits right into the template of Conversion Memoir. You could rewrite a few sections and make this a memoir about a woman finding her place in Catholicism. Confession can be healing, communion brings about a sense of community, sacrificing certain behaviors brings discipline. It could also be changed around for Islam, or Hinduism, or Buddhism. These religions have survived because they are good for the soul. Their practices and rituals feed the needs of their followers. Ruttenberg is telling us absolutely nothing new. Had she been able to integrate her younger self — the feminist activist and the student who questioned the origins and implications of every aspect of religious ritual — all while keeping her faith, now that would have been a memoir worth reading. But Ruttenberg tells us over and over that these questions just stopped mattering as she continued her religious studies. They’re never answered — they just fall away.
Ruttenberg’s unquestioning nature can occasionally become infuriating. After a trip to Jerusalem, she writes, “I may personally find the fact that women aren’t allowed to study Talmud in some communities to be morally problematic, but as long as the women in the community freely consent to this worldview, I don’t have the right to impose my perspective.” I had the same reaction as when I heard Wesley Clark say on Real Time with Bill Maher not so long ago that women in Afghanistan like the burqas. As if it never occurred to Clark or Ruttenberg that if women were “freely consenting” to restricted behavior, there were possibly a thousand other restrictions they had grown up with, been told were their duty, and been made to believe were their burdens because they were born women and therefore inferior. Her response to the matter is to walk around Jerusalem with a yarmulke and some red lipstick.
Maybe Ruttenberg should be introduced to Robert N. Levine. While his latest book What God Can Do for You Now seems to have been written in an effort to find the middle ground between fundamentalist works like The Purpose-Driven Life and the atheist bestsellers, he directly addresses the questions that Ruttenberg deems unimportant. I knew Levine was a kindred spirit when I read that his faith stumbled when he thought about the plague of the death of the first born in Exodus. He doesn’t shrug it off by saying that it ultimately doesn’t matter; he states, “Perhaps our sources are guilty of faulty intelligence.” The Torah, the Bible, and the Quran are not the direct word of God. They were written by men and they are full of holes and contradictions. Acknowledging that does not destroy the power of faith — it instead gives you some flexibility.
To Levine, what matters is action. He doesn’t mean keeping kosher or resting on the Sabbath. “When you do what God would do if standing in our midst, when you act godly, you will find God. Really.” The title is misleading, then. Forgive me, but it’s not what God can do for you, but what you can do for God. That is, protect his creation, take care of your fellow man, and respect yourself. The spiritual healing comes out of these actions, not from just not turning on the lights one day a week.
While I don’t agree with most of the arguments laid out in What God Can Do — particularly his idea that God is not omnipotent, and his chapter on why he thinks evolution was guided by an intelligent force — I still enjoyed reading his thoughts. It was more of a discussion or a debate than a lecture. I did not feel like he would judge me if I suddenly wanted to take up yoga (unlikely — I hate all of that breathing), or if the way I make sense out of the world is to believe in reincarnation. Those other parts of religion, the dietary restrictions, the discipline, the rules of how to live are there if you need them, but you can have a relationship with God even if you reject the rest. And Levine won’t call you shallow if you are thoughtful about your beliefs and actions.
I’ve read more than a half dozen spiritual memoirs in the past year — and attempted to read even more than that but could not get past 50 pages — and it’s disappointing how unfocused the memoirists become when they look beyond their own lives and needs. If I’m looking for a new concept of what God is or a new way to think about the divine, I am obviously looking in the wrong place. I should put the memoirs down and head back to the theologians and philosophers. • 24 September 2008
--------------------------------------------------------------------------------
Jessa Crispin is editor and founder of Bookslut.com. She currently resides in Chicago.
--------------------------------------------------------------------------------
Coffee with the FT: His Highness the Aga Khan
By Rachel Morarjee
In a deeply undignified start to my interview with one of the world’s most famous spiritual leaders, I am pressing my face against the glass of the Ismaili Centre in South Kensington, gesticulating wildly as I try to catch the eye of the dark-suited security man. It seems to me he is, perhaps reasonably, deliberately ignoring the madwoman outside.
As I’ve already tried pushing the locked door, I eventually stand on the street corner and rummage inelegantly in my bag to find my phone. After a long wait, I manage to get hold of a friend who works for His Highness the Aga Khan, who lets me in.
My requests for a lunch or breakfast meeting had been deflected by the Aga Khan’s aides, who gave me the unusual excuse that the leader of 20m Ismaili Muslims guards his privacy so zealously that he would be reluctant to reveal what he eats at mealtimes. So we settle on a coffee.
Tall, in a grey suit and a burgundy tie, the Aga Khan, 71, would blend seamlessly into a crowd of London commuters. He welcomes me with a smile and says, acknowledging our tricky discussions about this interview: “Not breakfast, not lunch, not dinner, but coffee. What would you like to drink?”
The room is impersonal but, as I sit down on a plush chair, I look out and see a lush flower-filled internal roof garden, a courtyard where water flows into a fountain.
I met the Aga Khan twice during my three-year stint as a reporter for the FT in Afghanistan so I am used to the atmosphere of stiff formality that surrounds him. After 51 years, he is presumably used to it too. In July 1957, at the age of 20, he took over from his grandfather as leader of the Ismailis, who are followers of the Shia Muslim tradition.
A woman brings the Aga Khan a white coffee while I opt for a cup without milk or sugar, which I try to balance on the arm of the chair and drink. I am dismayed to see no sign of anything edible.
The Aga Khan’s thoroughbred passion
As the most successful racehorse owner-breeder in France, the Aga Khan has won just about everything, several times over, writes Rachel Pagones. And while racing is a fast and furious sport – the verdict delivered in around two and a half minutes for races such as the Epsom Derby – breeding the horses for these contests can be an agonisingly slow process. Patience is the Aga Khan’s hallmark.
He inherited the business from his father, Prince Aly, and grandfather, the Aga Khan III, who bought his first thoroughbreds in Deauville in 1921 and went on to win the Epsom Derby five times. For his part, Prince Aly became the first owner in Britain to win £100,000 in a season in 1959, the year before he was killed in a car accident outside Paris.
The present Aga Khan’s “families” of broodmares often produce a top-class winner after three or four generations on the backburner. He is the least commercial of the large, independently wealthy owner-breeders, including Sheikh Mohammed, ruler of Dubai, Prince Khalid Abdullah of the Saudi royal family, and John Magnier of Coolmore Stud in Ireland, all of whom promote many of their own stallions for use by other breeders. He has only four stallions on his six properties in France and Ireland.
Money also helps. The Aga Khan’s operation breeds from its own stock, but makes a big purchase when a rare opportunity arises. The most recent was in 2005 when the Aga Khan bought the late Jean-Luc Lagardère’s bloodstock holdings, including two studs and close to 200 horses, for an industry estimate of between €40m and €50m.
The Aga Khan has had four winners each of the Epsom and Irish Derbys, including Shergar, the most famous horse in Britain during his lifetime; he won the Epsom Derby by a record ten lengths in 1981. That achievement, though, has been largely replaced in the public mind by the memory of the horse’s bizarre kidnapping in 1983, a year after he was retired to stud in Ireland with a valuation of £10m. The most thorough reports conclude it was an IRA plot, and that the horse was killed not long afterwards, probably because the kidnappers had trouble handling him.
The Aga Khan’s current star is an unbeaten filly named Zarkava. The favourite for next weekend’s Prix de l’Arc de Triomphe, she descends from Petite Etoile, a grey filly who starred on British tracks just before and after Prince Aly’s death. Petite Etoile’s great-great-grandmother was Mumtaz Mahal, one of the first and most important horses purchased by the Aga Khan III for 9,100 guineas in 1922.
I feel slightly on show now, as there are a lot of people crowded into the room with us. There is a Paris-based PR man, an older Ismaili man and, most disconcertingly of all, a young woman with a notepad, poised to write down everything I say.
The Aga Khan wears a suit even when he’s travelling and working in Islamic countries. It’s not a look that we are used to seeing on Muslim spiritual leaders, so I decide to start by asking whether his clothing attracts criticism in the Muslim world. The woman with the notepad starts scribbling furiously. Uh-oh, I think, and I get the question thrown back to me: “You have lived in a Muslim country. Are you aware of any requirement for an Imam to wear a particular type of clothing? There are traditions but are you aware of any theological requirement?”
I ask again, and this time the Aga Khan replies, “I have never sensed that as a problem. Imams in sub-Saharan Africa dress differently from Imams in the Middle East, who dress differently from Imams in central Asia.” He adds that for ceremonial occasions, he wears a traditional robe and Astrakhan hat – a look favoured by Afghan President Hamid Karzai.
This question of clothing goes to the heart of the paradox of the Aga Khan. While he’s a spiritual leader to millions of Muslims, he is best-known in the west as the highest-profile racehorse owner in France, where he lives.
The other unusual thing about this spiritual leader is how staggeringly rich he is. The Aga Khan’s personal wealth is estimated at $1bn but the Ismaili community is tight-lipped about how much of the Aga Khan’s money is his own and how much is ring-fenced for religious and development work.
I ask him how he reconciles such great wealth with having so many impoverished followers in many parts of the developing world. “Well, I think first of all you have to reposition the statement about having great wealth. I would say, frankly, that’s nonsense,” he says, smiling emphatically.
What is in no doubt is that the Aga Khan comes from a privileged background. He was born Karim al-Hussayni in Geneva in 1936 and was known as Prince Karim. After school in Switzerland, he went to the US and graduated from Harvard in 1959 with a BA honours degree in Islamic history.
His parents divorced in 1949 and his father later married Hollywood actress Rita Hayworth. The couple were a favourite of the gossip columns, although the marriage did not survive long. The unwelcome spotlight at that time might be part of why the Aga Khan now guards his privacy so carefully.
The Aga Khan title was granted to the family by the Shah of Persia in the 1830s after he had married his daughter to the Aga Khan’s great-great-grandfather. The man sitting opposite me is only the fourth to hold the title. As I sip my rapidly cooling coffee, I settle back and hear how the myth of fabulous family wealth was created when the third Aga Khan, grandfather of Aga Khan IV, was given his weight in gold during his golden jubilee celebrations in 1936.
Although Aga Khan III was only 5ft 5in, he tipped the scales at 220lb and the donations added up to $125,000 – a vast fortune in 1936. The ceremony of sitting on the scales with the gold made a great impression on the British public at the time. “In the west, this was seen as some sort of fantastic ceremonial, and this was because India at the time was ceremonial.” The current Aga Khan did not have to endure anything like this during his own golden jubilee celebrations during 2007 and 2008, not least because the 1930s gold made a solid bedrock for investments.
Ismailis have also traditionally paid a tithe to their Imams. The Aga Khan tells me that money raised by Ismaili followers does not end up in his pocket. “There is a great difference between wealth which comes from the faith and is used for the faith and personal wealth used for the individual. The Imam has responsibility for significant resources but they in no way cover the needs we have, and never will,” he says.
The Aga Khan inherited shares in corporations, banks, trusts and oil from his grandfather in 1957 and, over the past five decades, he has built a vast business development network by investing in poor and conflict-torn parts of the globe. He is the key shareholder in many of the projects but his profits are reinvested in the businesses, which are often run by members of the Ismaili community.
He began with newspaper investments in east Africa in the 1960s and now runs investment ventures tightly linked to development work that funds schools, hospitals and architectural projects.
In Afghanistan, I saw how the success of the Aga Khan’s projects stood in contrast to the bumbling efforts of many western governments. He owns stakes in the country’s largest telephone network, and a five-star hotel but has also renovated ancient mosques, gardens and citadels as well as running educational and agricultural projects.
The Aga Khan says he sees his role as a venture capitalist who specialises in difficult environments, laying the foundations of projects to entice other investors. The Geneva-based Aga Khan Fund for Economic Development (Akfed) runs more than 90 for-profit businesses and employs 36,000 people.
“There is no point going into economies that are wealthy and have their own resources, so we go into the really poor ones. If you try to put social development ahead of economic support, it doesn’t work. You have to do both together.
“A community whose economics don’t change is not one that can support community structures, education, healthcare, it doesn’t have the wherewithal,” he says.
The Aga Khan uses a lot of the same jargon used by development workers, mentioning “human resources” and “capacity building”. I am familiar with this way of talking from my time in Kabul but have always felt it a shame that it means that speakers often convey nothing of the real excitement involved in seeing a project take off and become an independent success.
His profits are reinvested in the Akfed businesses and the rest is paid in dividends to the other joint venture partners. These include private equity firm Blackstone, which has co-invested in a hydroelectric damn in Uganda, and Swedish telecoms group TeliaSonera, which holds a stake in Afghanistan’s largest telephone network,
Roshan has gone from strength to strength, its mobile business bolstered by the fact that it is impossible to lay landlines in a country so laced with landmines. But his five-star Serena Hotel in Kabul has attracted criticism for its opulence in a city where most people don’t have electricity and running water.
“The nature of what we do is high-risk,” the Aga Khan says, with characteristic understatement. I ask whether he thinks this long-term view is key to his success and he says that many projects can take 25 years to come to fruition. He cites a hospital in Pakistan that now produces world class doctors a quarter of a century after it opened. It would be hard to find western donors who would remain with a project for that long.
During his 51 years as Imam, he has watched the collapse of the Soviet Union, which brought Ismaili communities in central Asia back into contact with the outside world, as well as the rise of militant Islam. “Communities like the Ismailis don’t live in a vacuum,” he notes, saying that his job as Imam is to think carefully about how to address the problems in the societies his followers call home. The Ismaili diaspora is almost as widespread as the Jewish one.
I wonder whether he sees the clash between Islam and the west as the most serious global problem. “I’m unwilling to say that in these major issues today faith has been the prime driver. In my view it’s political issues that have been the prime driver,” he says. I ask whether that means they need political solutions. “Bang on,” he replies.
He believes ignorance about Islam in the west is a huge problem. “The Islamic world as an important part of our globe has really been absent from Judeo-Christian education in a strange way,” he says, asking how anyone can be considered properly educated in the west when they know nothing about Islam.
We have to finish, so I ask what he thinks his legacy will be, which provokes laughter and the response that he doesn’t have the faintest idea.
As I switch off the tape recorder and prepare to leave, he visibly relaxes and begins talking about Afghanistan in a far more open way, reminiscing about the Mujahideen leaders he knew during the country’s civil war. We step out into the roof garden, where running water blocks out the roar of traffic. The peace lasts only a moment – the Aga Khan always has more meetings – and I have to go in search of lunch.
.............................................................
The Ismaili Centre
South Kensington, London
By Rachel Morarjee
In a deeply undignified start to my interview with one of the world’s most famous spiritual leaders, I am pressing my face against the glass of the Ismaili Centre in South Kensington, gesticulating wildly as I try to catch the eye of the dark-suited security man. It seems to me he is, perhaps reasonably, deliberately ignoring the madwoman outside.
As I’ve already tried pushing the locked door, I eventually stand on the street corner and rummage inelegantly in my bag to find my phone. After a long wait, I manage to get hold of a friend who works for His Highness the Aga Khan, who lets me in.
My requests for a lunch or breakfast meeting had been deflected by the Aga Khan’s aides, who gave me the unusual excuse that the leader of 20m Ismaili Muslims guards his privacy so zealously that he would be reluctant to reveal what he eats at mealtimes. So we settle on a coffee.
Tall, in a grey suit and a burgundy tie, the Aga Khan, 71, would blend seamlessly into a crowd of London commuters. He welcomes me with a smile and says, acknowledging our tricky discussions about this interview: “Not breakfast, not lunch, not dinner, but coffee. What would you like to drink?”
The room is impersonal but, as I sit down on a plush chair, I look out and see a lush flower-filled internal roof garden, a courtyard where water flows into a fountain.
I met the Aga Khan twice during my three-year stint as a reporter for the FT in Afghanistan so I am used to the atmosphere of stiff formality that surrounds him. After 51 years, he is presumably used to it too. In July 1957, at the age of 20, he took over from his grandfather as leader of the Ismailis, who are followers of the Shia Muslim tradition.
A woman brings the Aga Khan a white coffee while I opt for a cup without milk or sugar, which I try to balance on the arm of the chair and drink. I am dismayed to see no sign of anything edible.
The Aga Khan’s thoroughbred passion
As the most successful racehorse owner-breeder in France, the Aga Khan has won just about everything, several times over, writes Rachel Pagones. And while racing is a fast and furious sport – the verdict delivered in around two and a half minutes for races such as the Epsom Derby – breeding the horses for these contests can be an agonisingly slow process. Patience is the Aga Khan’s hallmark.
He inherited the business from his father, Prince Aly, and grandfather, the Aga Khan III, who bought his first thoroughbreds in Deauville in 1921 and went on to win the Epsom Derby five times. For his part, Prince Aly became the first owner in Britain to win £100,000 in a season in 1959, the year before he was killed in a car accident outside Paris.
The present Aga Khan’s “families” of broodmares often produce a top-class winner after three or four generations on the back burner. He is the least commercial of the large, independently wealthy owner-breeders, a group that includes Sheikh Mohammed, ruler of Dubai, Prince Khalid Abdullah of the Saudi royal family, and John Magnier of Coolmore Stud in Ireland, all of whom promote many of their own stallions for use by other breeders. The Aga Khan has only four stallions on his six properties in France and Ireland.
Money also helps. The Aga Khan’s operation breeds from its own stock but makes a big purchase when a rare opportunity arises. The most recent came in 2005, when the Aga Khan bought the late Jean-Luc Lagardère’s bloodstock holdings, including two studs and close to 200 horses, for a sum industry estimates put at between €40m and €50m.
The Aga Khan has had four winners of both the Epsom and Irish Derby, including Shergar, the most famous horse in Britain during his lifetime – he won the Epsom Derby by a record ten lengths in 1981 – but this achievement has been largely replaced in the public mind by the memory of the horse’s bizarre kidnapping in 1983, a year after he was retired to stud in Ireland with a valuation of £10m. The most thorough reports conclude it was an IRA plot, and the horse was killed not long afterwards, probably because the kidnappers had trouble handling him.
The Aga Khan’s current star is an unbeaten filly named Zarkava. The favourite for next weekend’s Prix de l’Arc de Triomphe, she descends from Petite Etoile, a grey filly who starred on British tracks just before and after Prince Aly’s death. Petite Etoile’s great-great-grandmother was Mumtaz Mahal, one of the first and most important horses purchased by the Aga Khan III, for 9,100 guineas in 1922.
I feel slightly on show now, as there are a lot of people crowded into the room with us. There is a Paris-based PR man, an older Ismaili man and, most disconcertingly of all, a young woman with a notepad, poised to write down everything I say.
The Aga Khan wears a suit even when he’s travelling and working in Islamic countries. It’s not a look that we are used to seeing on Muslim spiritual leaders, so I decide to start by asking whether his clothing attracts criticism in the Muslim world. The woman with the notepad starts scribbling furiously. Uh-oh, I think, and I get the question thrown back to me: “You have lived in a Muslim country. Are you aware of any requirement for an Imam to wear a particular type of clothing? There are traditions but are you aware of any theological requirement?”
I ask again, and this time the Aga Khan replies, “I have never sensed that as a problem. Imams in sub-Saharan Africa dress differently from Imams in the Middle East, who dress differently from Imams in central Asia.” He adds that for ceremonial occasions he wears a traditional robe and Astrakhan hat – a look favoured by Afghan President Hamid Karzai.
This question of clothing goes to the heart of the paradox of the Aga Khan. While he’s a spiritual leader to millions of Muslims, he is best-known in the west as the highest-profile racehorse owner in France, where he lives.
The other unusual thing about this spiritual leader is how staggeringly rich he is. The Aga Khan’s personal wealth is estimated at $1bn but the Ismaili community is tight-lipped about how much of the Aga Khan’s money is his own and how much is ring-fenced for religious and development work.
I ask him how he reconciles such great wealth with having so many impoverished followers in many parts of the developing world. “Well, I think first of all you have to reposition the statement about having great wealth. I would say, frankly, that’s nonsense,” he says, smiling emphatically.
What is in no doubt is that the Aga Khan comes from a privileged background. He was born Karim al-Hussayni in Geneva in 1936 and was known as Prince Karim. After school in Switzerland, he went to the US and graduated from Harvard in 1959 with a BA honours degree in Islamic history.
His parents divorced in 1949 and his father later married Hollywood actress Rita Hayworth. The couple were a favourite of the gossip columns, although the marriage did not survive long. The unwelcome spotlight at that time might be part of why the Aga Khan now guards his privacy so carefully.
The Aga Khan title was granted to the family by the Shah of Persia in the 1830s, after the Shah married his daughter to the Aga Khan’s great-great-grandfather. The man sitting opposite me is only the fourth to hold the title. As I sip my rapidly cooling coffee, I settle back and hear how the myth of fabulous family wealth was created when the third Aga Khan, grandfather of Aga Khan IV, was given his weight in gold during his golden jubilee celebrations in 1936.
Although Aga Khan III was only 5ft 5in, he tipped the scales at 220lb and the donations added up to $125,000 – a vast fortune in 1936. The ceremony of sitting on the scales with the gold made a great impression on the British public at the time. “In the west, this was seen as some sort of fantastic ceremonial, and this was because India at the time was ceremonial.” The current Aga Khan did not have to endure anything like this during his own golden jubilee celebrations in 2007 and 2008, not least because the 1930s gold made a solid bedrock for investments.
Ismailis have also traditionally paid a tithe to their Imams. The Aga Khan tells me that money raised by Ismaili followers does not end up in his pocket. “There is a great difference between wealth which comes from the faith and is used for the faith and personal wealth used for the individual. The Imam has responsibility for significant resources but they in no way cover the needs we have, and never will,” he says.
The Aga Khan inherited shares in corporations, banks, trusts and oil from his grandfather in 1957 and, over the past five decades, he has built a vast business development network by investing in poor and conflict-torn parts of the globe. He is the key shareholder in many of the projects but his profits are reinvested in the businesses, which are often run by members of the Ismaili community.
He began with newspaper investments in east Africa in the 1960s and now runs investment ventures tightly linked to development work that funds schools, hospitals and architectural projects.
In Afghanistan, I saw how the success of the Aga Khan’s projects stood in contrast to the bumbling efforts of many western governments. He owns stakes in the country’s largest telephone network and a five-star hotel, but has also renovated ancient mosques, gardens and citadels, as well as running educational and agricultural projects.
The Aga Khan says he sees his role as a venture capitalist who specialises in difficult environments, laying the foundations of projects to entice other investors. The Geneva-based Aga Khan Fund for Economic Development (Akfed) runs more than 90 for-profit businesses and employs 36,000 people.
“There is no point going into economies that are wealthy and have their own resources, so we go into the really poor ones. If you try to put social development ahead of economic support, it doesn’t work. You have to do both together.
“A community whose economics don’t change is not one that can support community structures, education, healthcare; it doesn’t have the wherewithal,” he says.
The Aga Khan uses a lot of the same jargon as development workers, mentioning “human resources” and “capacity building”. I know this way of talking from my time in Kabul, and have always thought it a shame that it so often conveys nothing of the real excitement of seeing a project take off and become an independent success.
Part of the profits is reinvested in the Akfed businesses and the rest is paid in dividends to the other joint-venture partners. These include private equity firm Blackstone, which has co-invested in a hydroelectric dam in Uganda, and Swedish telecoms group TeliaSonera, which holds a stake in Roshan, Afghanistan’s largest telephone network.
Roshan has gone from strength to strength, its mobile business bolstered by the fact that it is impossible to lay landlines in a country so laced with landmines. But the Aga Khan’s five-star Serena Hotel in Kabul has attracted criticism for its opulence in a city where most people don’t have electricity or running water.
“The nature of what we do is high-risk,” the Aga Khan says, with characteristic understatement. I ask whether he thinks this long-term view is key to his success and he says that many projects can take 25 years to come to fruition. He cites a hospital in Pakistan that now produces world-class doctors a quarter of a century after it opened. It would be hard to find western donors who would stay with a project for that long.
During his 51 years as Imam, he has watched the collapse of the Soviet Union, which brought Ismaili communities in central Asia back into contact with the outside world, as well as the rise of militant Islam. “Communities like the Ismailis don’t live in a vacuum,” he notes, saying that his job as Imam is to think carefully about how to address the problems in the societies his followers call home. The Ismaili diaspora is almost as widespread as the Jewish one.
I wonder whether he sees the clash between Islam and the west as the most serious global problem. “I’m unwilling to say that in these major issues today faith has been the prime driver. In my view it’s political issues that have been the prime driver,” he says. I ask whether that means they need political solutions. “Bang on,” he replies.
He believes ignorance about Islam in the west is a huge problem. “The Islamic world as an important part of our globe has really been absent from Judeo-Christian education in a strange way,” he says, asking how anyone can be considered properly educated in the west when they know nothing about Islam.
We have to finish, so I ask what he thinks his legacy will be, which provokes laughter and the response that he doesn’t have the faintest idea.
As I switch off the tape recorder and prepare to leave, he visibly relaxes and begins talking about Afghanistan in a far more open way, reminiscing about the Mujahideen leaders he knew during the country’s civil war. We step out into the roof garden, where running water blocks out the roar of traffic. The peace lasts only a moment – the Aga Khan always has more meetings – and I have to go in search of lunch.
Rothko and Bacon at Tate
In a stroke of daring, high-fever gloom, Tate has dedicated this autumn’s leading exhibitions to the two most pessimistic painters of the 20th century. Francis Bacon, born in Ireland in 1909, and Mark Rothko, born in Latvia, then part of the Russian empire, in 1903, have much in common.
Both were outsiders – one a homosexual, the other a Jew – who grew up on western Europe’s fringes; both came to occupy essential but controversial positions on the international stage, one in European figurative art, the other in American abstraction.
Both are painters of existential anguish, creating in windowless, hemmed-in spaces a Huis Clos world of calculated claustrophobia. Bolshie atheists, both understood instinctively that postwar painting risked bankruptcy if it did not aspire to spiritual authority – thus the triptych, echoes of the Crucifixion and apocalyptic visions that recur in their work. And, as Tate dramatises effectively at Millbank’s Bacon show and at the new exploration of Rothko’s late paintings at Tate Modern, both stood apart from their times in reaching for tragic greatness, gambling absurdity, repetition, embarrassment and poverty for the chance of Old Master gravitas.
At Bankside, Tate has stacked the odds by focusing exclusively on the difficult late period, from 1958 to 1970, which ended one February evening when Rothko slit his veins, overdosed on antidepressants, and was found next morning in a pool of blood in his New York studio. Does the legend of his suicide – so graphic and so visually evocative of the deep brooding reds shading to purple and black that dominated his palette in these years – shape the way we respond to the intransigent, barely penetrable canvases here as hallowed and sublime?
Or was suicide the only way out for a painter who had played painting to its endgame, finishing up, in the black and pale grey acrylic paintings of 1969 in Tate’s final room, with a sparse, crepuscular intensity which reduced art to fundamental, biblical chords of darkness versus light?
You have to work hard at this show. Tate’s permanent Rothko Room, of course, is among its most popular; the coup here is that for the first time since they left Rothko’s studio, the Seagram murals which the artist donated to Tate are reunited with others in the series from Washington and Japan’s Kawamura Museum, and put in context with subsequent work. X-ray studies reveal the complex veils of pigment and subtle colour relationships in the late pieces; one, the glowing 1958 “Black on Maroon”, is installed so that viewers can walk around the back of the canvas as if to divine its secrets. Supported by sketches and models, and an excellent catalogue, this is a serious, scholarly endeavour which sets out to demystify Rothko and reclaim him as the pure painterly painter he passionately was. Yet I couldn’t help feeling that we would all be having a better – and more comprehensible – time, had this first Rothko exhibition in 20 years been a broader chronological retrospective.
Rothko trained with American colour field painter Milton Avery, but did not hit his stride until 1949 – significantly at a time of emotional devastation following his mother’s death – when he developed the large-scale, cinema-screen format of floating oblong blurs in contrasting yet complementary soft-edged, furrowing colours. A perfect example is the voluptuous “White Centre (Yellow, Pink and Lavender on Rose)”, which sold for $72.8m in 2007, becoming – until a Bacon recently beat it – the most expensive postwar painting to be bought at auction.
With astonishing elasticity, and lyrical reworkings of a bright-red-yellow-orange tonality, Rothko made variations on this template for most of the 1950s. Then suddenly the warmth changed to midnight blues, heavy greens, greys, black.
“I can only say that the dark pictures began in 1957 and have persisted almost compulsively to this day,” was the artist’s comment. Rothko was adept at self-sabotage. Maybe the floating-form pictures had become too easy, or too popular – Rothko had a horror of success, believing that “as an artist you have to be a thief and steal a place for yourself on a rich man’s wall”.
The 1958 commission to paint murals for the Four Seasons restaurant in Mies van der Rohe’s Seagram Building offered a new dimension. Rothko liked his paintings to envelop the viewer; now he could make a modern immersive installation. Rarely seen preparatory studies here, lovely gouaches executed at domestic scale, indicate the careful deliberation as he set to work with malicious intent: “I hope to paint something that will ruin the appetite of every son of a bitch who ever eats in that room.”
An important inspiration had long been Fra Angelico’s bright tempera frescoes in a serene Florence monastery; now – the addition of the visiting works at Tate enhances the sense of architectonic structure – Rothko made his own fugue-like arrangements of dark reds on mauves on blacks, the colours radiating against each other in dense overlapping layers, at once closed in and luminous, repressed and ecstatic. Are the strong verticals classical columns, sealed tombs, bricked-up walls, portals to the unconscious? Are there, as is sometimes claimed, allusions to Jewish graves dug after the pogroms of Rothko’s childhood, or to the infinite American landscape of his adopted country?
The former is too precise, the latter way off. What the long rhythmic Seagram installation here, suggestive of cathedral architecture, confirms – and why Rothko returned the cash and withdrew the murals from the profane commission – is the religious impulse at the core of an art that looks back not just to an orthodox Jewish upbringing but to the exalted calling of pre-revolutionary Russian painting.
The 1964 series of black paintings here, exquisite surfaces of shiny and matt paint, at once austere and sensual, emanating a solemn light, recall Malevich’s “Black Square”, hung like an icon, and beyond that the phosphorescent 19th-century nightscapes of Arkhip Kuindzhi, as well as the contemporary minimalism of Americans such as Ad Reinhardt.
Unlike Pollock or Mondrian – whose hermeticism he recalls – Rothko did not define himself as an abstract painter. “I belong to a generation that was preoccupied with the human figure,” he said. “It was with the utmost reluctance that I found it did not meet my needs. Whoever used it mutilated it. No one could paint the figure as it was and feel that he could produce something that could express the world.” In his lush 1950s paintings, Rothko fulfilled Van Gogh’s prophecy about the expressive possibility of colour. But he is so emotionally charged a painter that when colour drained finally away, in the last black and grey works here, framed in harsh white to declare themselves as minimal objects, the loss is felt just as extravagantly as the earlier rapture.
Rothko, Tate Modern, London SE1, to February 1; tel: +44 (0)20 7887 8000. Then Kawamura Memorial Museum, Sakura, Japan, February 21-June 14 2009
Stephen Fry in America
I was so nearly an American. It was that close. In the mid-1950s my father was offered a job at Princeton University – something to do with the emerging science of semiconductors. One of the reasons he turned it down was that he didn't think he liked the idea of his children growing up as Americans. I was born, therefore, not in NJ but in NW3. I was 10 when my mother made me a present of this momentous information. The very second she did so, Steve was born.
Steve looked exactly like me, same height, weight and hair colour. In fact, until we opened our mouths, it was almost impossible to distinguish one from the other. Steve's voice had the clear, penetrating, high-up-in-the-head twang of American. He called Mummy "Mom", he used words like "swell", "cute" and "darn".
There were detectable differences in behaviour too. He spread jam (which he called jelly) on his (smooth, not crunchy) peanut butter sandwiches, he wore jeans, T-shirts and basketball sneakers rather than grey shorts, Airtex shirts and black plimsolls. He had far more money for sweets, which he called candy, than Stephen ever did. Steve was confident almost to the point of rudeness, unlike Stephen who veered unconvincingly between shyness and showing off. If I am honest I have to confess that Stephen was slightly afraid of Steve.
As they grew up, the pair continued to live their separate, unconnected lives. Stephen developed a mania for listening to records of old music hall and radio comedy stars, watching cricket, reading poetry and novels, becoming hooked on Keats and Dickens, Sherlock Holmes and P G Wodehouse and riding around the countryside on a moped. Steve listened to blues and rock and roll, had all of Bob Dylan's albums, collected baseball cards, went to movie theatres three times a week and drove his own car.
Stephen still thinks about Steve and wonders how he is getting along these days. After all, the two of them are genetically identical. It is only natural to speculate on the fate of a long-lost identical twin. Has he grown even plumper than Stephen or does he work out in the gym? Is he in the TV and movie business too? Does he write? Is he "quintessentially American" the way Stephen is often charged with being "quintessentially English"?
All these questions are intriguing but impossible to settle. If you are British, dear reader, then I dare say you too might have been born American had your ancestral circumstances veered a little in their course. What is your long-lost non-existent identical twin up to?
Most people who are obsessed by America are fascinated by the physical – the cars, the music, the movies, the clothes, the gadgets, the sport, the cities, the landscape and the landmarks. I am interested in all of those, of course I am, but I (perhaps because of my father's decision) am interested in something more. I have always wanted to get right under the skin of American life. To know what it really is to be American, to have grown up and been schooled as an American; to work and play as an American; to romance, labour, succeed, fail, feud, fight, vote, shop, drift, dream and drop out as an American; to grow ill and grow old as an American.
For years then, I have harboured deep within me the desire to make a series of documentary films about "the real" America. Not the usual road movies in a Mustang and certainly not the kind of films where minority maniacs are trapped into making exhibitions of themselves. It is easy enough to find Americans to sneer at if you look hard enough, just as it is easy to find ludicrous and lunatic Britons to sneer at. Without the intention of fawning and flattering, I did want to make an honest film about America, an unashamed love letter to its physical beauty and a film that allowed Americans to reveal themselves in all their variety.
I have often felt a hot flare of shame inside me when I listen to my fellow Britons casually jeering at the perceived depth of American ignorance, American crassness, American isolationism, American materialism, American lack of irony and American vulgarity. Aside from the sheer rudeness of such open and unapologetic mockery, it seems to me to reveal very little about America and a great deal about the rather feeble need of some Britons to feel superior. All right, they seem to be saying, we no longer have an empire, power, prestige or respect in the world, but we do have "taste" and "subtlety" and "broad general knowledge", unlike those poor Yanks.
What silly, self-deluding rubbish! What dreadfully small-minded stupidity! Such Britons hug themselves with the thought that they are more cosmopolitan and sophisticated than Americans because they think they know more about geography and world culture, as if firstly being cosmopolitan and sophisticated can be scored in a quiz and as if secondly (and much more importantly) being cosmopolitan and sophisticated is in any way desirable or admirable to begin with!