About Me

My Photo

A good day begins with the NYTimes, NPR, Arts & Letters Daily, Sacred Space & good coffee; it ends with a Grand Marnier. A brilliant day would be spent in London, New York or San Francisco -- although Sydney would be right up there. Unwinding in Carmel or Antibes. Daytime spent in library (the Morgan, LOC or Widener) or museum (the Frick, the Louvre, British) with a healthy walk (around Lake Annecy); evening -- theatre (West End), or music (Carnegie Hall). A nice last meal: Perhaps the French Laundry or Frédy Girardet or Quenelles de Brochet from Taillevent, Cassoulet from Cafe des Artistes, Peking Duck from le Tsé-Fung, Lobster Savannah from Locke-Ober, Sacher Torte from Demel and Café Brûlot from Antoine. Sazerac as an apéritif, Le Montrachet in the beginning, Stag's Leap Cabernet in the middle, Veuve Clicquot to conclude. Desert Island: iMac, iPod (I know, generator and dish necessary), Johnnie Walker Blue Label, wife & Adler's Great Books.

4.3.15

Words

A kingdom in splinters
Traditional philology today is a shadow of what it once was. Can it survive?
Papyrus of Callimachus's Aetia
What language did Adam and Eve speak in the Garden of Eden? Today the question might seem not only quaint, but daft. Yet the philologist Andreas Kempe could speculate, in his “Die Sprache des Paradises” (“The Language of Paradise”) of 1688, that in the Garden God spoke Swedish to Adam and Adam replied in Danish while the serpent—wouldn’t you know it?—seduced Eve in French. Others suggested Flemish, Old Norse, Tuscan dialect, and, of course, Hebrew. But as James Turner makes clear in his magisterial and witty history, which ranges from the ludicrous to the sublime, philologists regarded the question not just as one addressing the origins of language, but rather as seeking out the origins of what makes us human; it was a question at once urgent and essential.1 After all, animals do express themselves: they chitter and squeak, they bay and roar and whinny. But none of them, so far as we know, wields grammar and syntax; none of them is capable of articulate and reasoned discourse. We have long prided ourselves, perhaps excessively, on this distinction. But on the evidence Turner so amply provides, we might also wonder whether the true distinction lies not simply in our ability to utter rational speech, but in the sheer obsessive love of language itself; that is, in philology, the “love of words.”
This abiding passion for words, cultivated fervently from antiquity into modern times—or at least until around 1800, in Turner’s view—encompassed a huge range of subjects as it developed: not only grammar and syntax, but rhetoric, textual editing and commentary, etymology and lexicography, as well as, eventually, anthropology, archeology, biblical exegesis, linguistics, literary criticism, and even law. It comprised three large areas: textual philology, theories about the origins of language, and, much later, comparative studies of different related languages. Two texts predominated: Homer, considered sacred by the ancient Greeks, and the Bible, a contested area of interpretation for both Jews and Christians. As for theories of language origins, these go back to the pre-Socratics and Plato; the controversy was over whether language was divinely given, with words corresponding to the things they named, or arrived at by convention (the nomos versus physis debate). As for comparative studies, these arose in the eighteenth century, largely as a result of Sir William Jones’s discovery of the common Indo-European matrix of most European languages. Encounters with “exotic,” that is, non-European, peoples in the course of the Renaissance voyages of discovery were another important source; here American Indian languages in their variety and complexity offered an especially rich, if perplexing, new field of inquiry.
To follow Turner’s account of all this is akin to witnessing the gradual construction of a vast and intricate palace-complex of the mind, carried out over centuries, with all its towers and battlements, crenellations and cupolas, as well as its shadier and sometimes disreputable alleyways and culs-de-sac, only to witness it disintegrate, by almost imperceptible stages, into fragmented ruins, a kingdom in splinters. The remnants of that grand complex, its shards and tottering columns, as it were, are our discrete academic disciplines today with their strict perimeters and narrow confines. To illustrate the difference, take Charles Eliot Norton (1827–1908), one of Turner’s heroes (and the subject of his Liberal Education of Charles Eliot Norton of 2002): Norton was the first professor of art history at Harvard and, indeed, one of the founders of the discipline, but he was also, among many other things, an expert on Dante who “taught and wrote art history as a philologist, an interpreter of texts.” Nowadays a polymath like Norton would not be hired, let alone get tenure, at any American university; he would be viewed as a dubious interloper on others’ turf.
In fact, traditional philology nowadays is less a ruin than the shadow of a ruin; no, even less than that, the vestige of a shadow. Turner acknowledges, and laments, this from the outset; he notes that “many college-educated Americans no longer recognize the word.” He adds that, “for most of the twentieth century, philology was put down, kicked around, abused, and snickered at, as the archetype of crabbed, dry-as-dust, barren, and by and large pointless academic knowledge. Did I mention mind-numbingly boring?” Worse, “it totters along with arthritic creakiness.” With friends like these, we might ask, can philology sink any further into oblivion than it already has? But the unspoken question here—“shall these bones live?”—is one that Turner poses and resolves triumphantly. He breathes life back into philology. There is not a dull page in this long book (and I include here its sixty-five pages of meticulous and sometimes mischievous endnotes). He accomplishes this by setting his account firmly in a detailed if inevitably brisk historical narrative interspersed with vivid cameos of individual scholars, the renowned as well as the notorious, the plainly deranged alongside the truly radiant.
Here I should disclose a distant interest. I once flirted with the idea of devoting myself to philology. I was soon dissuaded by my encounters with philologists in the flesh. The problem was not that they were dry; in fact, their cool, faintly cadaverous aplomb was a distinct relief amid the relentlessly “relevant” atmosphere of the 1960s. Dry but often outrageously eccentric, they were far from being George Eliot’s Casaubon toiling, and making others toil, to leave some small but significant trace in the annals of desiccation. No, it was rather their sheer single-mindedness coupled with a hidden ferocity that gave me pause. When I first met the late Albert Jamme, the renowned epigrapher of Old South Arabian, this Belgian Jesuit startled me by exclaiming at the top of his voice, “I hate my bed!” When I politely suggested that he get a new mattress, he shot back with “No, no! I hate my bed because it keeps me from my texts!” And the undiluted vitriol of Jamme’s opinions of his colleagues (all three of them!), both in conversation and in print, was scary; from him and others I learned that nothing distills venom more quickly than disagreement over a textual reading. At times there was something decidedly otherworldly about other philologists I met. In the 1970s, when I studied at the University of Tübingen and had the good fortune to work with Manfred Ullmann, the great lexicographer of Classical Arabic, he startled me one day by excitedly brandishing a file card on which was written the Arabic word for “clitoris” (bazr) and exclaiming, “Kli-tO-ris! What do ordinary folk know about Kli-tO-ris?” (More than you imagine, I thought.) Needless to say, it was the word—its etymology, its cognates, its morphology—that captivated him.
As for philological single-mindedness, when a celebrated German Assyriologist of my acquaintance (who shall remain nameless) got married, he rose abruptly from the wedding banquet to announce “Jetzt zur Arbeit!” (“Now to work!”) and headed for the door, a volume of cuneiform texts tucked under one arm; only the outraged intervention of his new mother-in-law kept him from leaving. Such anecdotes about philologists—their pugnacity, their obsessiveness, their downright daffiness—could fill a thick volume. They taught me not only that I wasn’t learned enough to become a philologist, but that I wasn’t unhinged enough either.
Happily, there is venom aplenty in Turner’s account. As he remarks, “the Republic of Letters could make bare-knuckle boxing look civilized.” His adversaries lampooned Erasmus as “Errans-mus” or “roving rat.” The seventeenth-century cleric the Reverend William Wotton was described as “a most excellent preacher, but a drunken whoring soul.” And A. E. Housman could write of Benjamin Jowett’s monumental translation of Plato—a translation that helped to dislodge Aristotle from his pre-eminence in the Oxford curriculum—that it was “the best translation of a Greek philosopher which has ever been executed by a person who understood neither philosophy nor Greek.” Turner too bares his knuckles when he describes the celebrated eighteenth-century English classicist Richard Porson as “an alcoholic and a slob on a Herculean scale.” While such sallies enliven Turner’s account throughout, they don’t detract from the genuine magnificence of the philological tradition as he describes it. The venom was the unavoidable by-product of that all-consuming passion for words.
Turner is predictably excellent on such prodigies as the truly amazing Richard Bentley (1662–1742), a kind of Mozart of philology, or Sir William Jones, whose precocious research led to the discovery of what is now known as “proto-Indo-European,” as well as on a host of other luminaries, who range from Petrarch, Scaliger, Grotius, and Gibbon to both Alexander and Wilhelm von Humboldt, and well beyond. But he is also superb on the many obscure or forgotten figures who teem throughout his account. He gives a vivid portrait, for example, of Elizabeth Elstob (1683–1756), the so-called “Saxon Nymph,” who had eight languages at her command and whose pioneering studies of Anglo-Saxon established it firmly in the grand philological tradition. And then there is Alexander Bryan Johnson, a banker from Utica, whose Treatise on Language went through three editions, from 1828 to 1854. Even in the nineteenth century, philology was still capacious enough to encourage “amateurs” like Johnson to make substantial contributions.
In several of his most intriguing asides, Turner discusses Thomas Jefferson’s abiding fascination with American Indian languages. On one occasion, in June of 1791, Jefferson and James Madison “squatted in a tiny Unquachog village on Long Island” to compile a wordlist of the now-extinct Quiripi language from the three old women who still spoke it. Jefferson sought to discover the origins of American Indians; he wondered whether they were ultimately of Asian origin or, less plausibly, whether they had originated in Wales, his own ancestral homeland. In any case, the image of two future American presidents hunkering down in a freezing wigwam on Long Island, driven purely by intellectual curiosity, seems to come from some alternative universe now forever lost to us.
While Turner excels at pithy profiles, he is also quite superb at illuminating certain recurrent debates in the history of philology, such as the “Transcendentalist Controversy” in 1830s Massachusetts in which Ralph Waldo Emerson disputed Andrews Norton, his former teacher of divinity. The disagreement was over the age-old question of whether language developed by convention, Norton’s view, or was inherent in the things it denoted, as Emerson argued. It isn’t really so surprising that such a dispute would crop up in nineteenth-century New England: the very nature of discourse, let alone consensus on the interpretation of Scripture, depended upon its resolution. The most compelling such clash, which Turner describes at length, occurred between Friedrich Max Müller, the German-born scholar of Sanskrit who became Oxford’s first professor of comparative philology, and William Dwight Whitney, the first Yale professor of Sanskrit—both eminent authorities even if Müller had become a kind of academic superstar through his public lectures. (But Whitney’s Sanskrit grammar of 1879 is not only still in use, but still in print.) The dispute, yet again, centered on the origins of language. Müller had made fun of Darwin’s idea that language developed when humans first began imitating animal cries, calling it “the bow-wow theory.” Müller believed instead that language exhibited “natural significancy” and that its origins could be uncovered by a search of Sanskrit roots. Whitney argued for its basis in convention: “the fact that an American says ‘chicken’ and a Frenchwoman ‘poulet’ to refer to the same fowl is purely arbitrary: ‘gibbelblatt’ and ‘cronk’ would work just as well.” Though this was a trans-Atlantic debate, it might as well have been between Plato and the Sophists; such questions were perennial just because they were unanswerable.
The roll-call of major figures who contributed to philology, or were deeply influenced by its methods and insights, includes Plato and Isocrates, the Alexandrian poeta doctus Callimachus (whom Turner somewhat harshly calls “a heroic grind”), Dionysius Thrax, who composed the first grammar book, and his pupil Tyrannion, who settled in Rome around 67 B.C. and “made a bundle as a chic teacher”—to mention only a few of the Greeks discussed here. There follows a consideration of the Jewish community of Alexandria who translated the Torah into Greek sometime in the third century B.C., the version known as the Septuagint (or “LXX”) because its seventy-two translators supposedly completed their work in seventy-two days. This inaugurates a theme that will sound, rather distressingly, throughout the book, for Christian exegetes turned to rabbinic authorities for help in elucidating the Biblical text even while contemning them. As Turner nicely puts it with reference to the eighteenth-century English exegete Benjamin Kennicott, “Without Jews, Kennicott was helpless. With them, he was high-handed.” Some Christian scholars even went so far as to strip the Masoretic text of the Hebrew Bible of its ingenious vowel-signs (Hebrew, like Arabic, originally used a consonantal script: vowels were later indicated by special diacritics above or below the letters); this “unpointed” text, liberated from rabbinic exactitude, simply offered more scope for free-ranging interpretation.
In successive chapters, Turner moves from the Church Fathers through the Middle Ages, on to the Renaissance and Reformation. But though he handles these periods expertly, his “true subject,” as he states it, is “the modern humanities in the English-speaking world.” The nine chapters he devotes to this are almost impossible to summarize in a review; they are packed with detail and are simply enthralling. His careful account of the implantation of philology in nineteenth-century America, its steady rise and subsequent decline—or rather, its disintegration from a comprehensive realm of learning to an array of scattered and loosely connected disciplines under the aegis of “the humanities”—is at once sobering and compelling. The grand aspiration, the colossal energy, that propelled philology for almost two millennia passed to the sciences; the scientific model of research and learning ousted the philological.
The erudition of early philologists was staggering. Sir William Jones had already mastered eleven languages, with a good smattering of fifteen others, before he even arrived in Calcutta in 1783 and took up Sanskrit. The formidable German scholar Franz Bopp, whose studies in comparative grammar “revolutionized Indo-European philology,” considered not only Greek, Latin, and German in his investigations but Sanskrit, Avestan, Old Slavonic, Lithuanian, and Gothic as well. Another nineteenth-century German scholar, Wilhelm von Humboldt, took on Basque, American Indian languages, and Malayo-Polynesian dialects. In a way, though, their polyglot accomplishments are only part of the picture. The philologists were also passionately interested in the realia of antiquity and the Biblical world. Richard Bentley made expert use of numismatics in his research. Others were antiquarians, a term now associated with the fussy sniffing out of ephemera, a sort of Pickwickian pastime; but it was antiquarianism in its loftiest aspects—the study of inscriptions, of classical architecture, of topography, all the shards of a lost past—that gave rise not only to such fields as scientific archeology, but also to such ancillary disciplines as Assyriology and Egyptology. The point is not simply that philologists were both single-minded and obsessive but that they were virtually omnivorous in their pursuits. No vestige of the past was inconsequential in their eyes.
Turner devotes some of his best pages to Edward Gibbon, to whom his own book owes much. Though he writes not in eighteenth-century cadences but in a lively, elegantly colloquial prose without a smidgeon of barbaric academic jargon, his approach seems modeled on Gibbon’s. As Turner points out, Gibbon was both a philosophical historian and a philological historian, one of the first; he drew on documents, on texts meticulously recovered and edited, to support his narrative. This is Turner’s approach as well.
Though Turner deplores “the monoglot, narrowly focused scholarship increasingly common in the humanities during the past half-century,” he seldom indulges in tirades. He doesn’t need to; his subject speaks for itself and tells us unmistakably what we have lost. Earlier I compared the philological tradition to a palace only the fragments of which remain. But I wonder whether it might be better described as a kind of invisible banquet at which we still unknowingly feast. It may be a feast with ghosts, but given the ghosts at the table, it’s all the more sumptuous for that.
1 Philology: The Forgotten Origins of the Modern Humanities, by James Turner; Princeton University Press, 574 pages, $35.

AU

War and Gold by Kwasi Kwarteng. As the Europeans conquered the New World, it was silver, much more than gold, that they brought back with them to flood European markets. Between 1503 and 1660, they brought 16,000 metric tonnes of silver as compared to 185 metric tonnes of gold. All this precious metal came through the Spanish city of Seville:

"Peru was the most significant source of the gold and silver that flooded into Europe. The term 'Peru' applied in the sixteenth century 'to the whole of South America, not just to the territories which bear this name today'. The empire of Arahualpa, the last Sapa Inca, or emperor, had abounded in gold and silver. It was in Peru that the largest mining discovery in the New World was made. In April 1545, a minor Indian nobleman, called Diego Gualpa by the Spanish, climbed a hill in search of a shrine and was thrown to the ground by a strong gust of wind. He found himself gripping silver ore with his hands and reported the discovery to some sceptical Spanish adventurers near by. On further investigation, five rich veins of silver were discovered at Potosí, where a silver rush quickly followed. It was silver and not gold which in the eyes of the world 'became the symbol of quickly made fortunes'. To the French of the sixteenth century the term 'Peru' simply became a synonym for fabulous wealth.

"The Potosí deposits in Peru, situated in modern Bolivia, were described as 'a "silver mountain" of six miles around its base on a remote and desolate plateau, 12,000 feet above sea level'. The city built after this discovery rapidly grew in size, peopled entirely by immigrants. In 1555, only ten years after that discovery, the population of Potosí had risen from nothing to 45,000. It climbed to a peak of 160,000 just fifty-five years later in 1610. The voluntary labour of many immigrants was supplemented by conscripted teams of miners, employed under the mitasystem, which prescribed that a number of Indian villages within a certain radius of Potosí should provide a quota of their population to work in the mines. This system itself was abolished only in 1812. Each year the Indians of the prescribed region had to send 'roughly a seventh of all adult males to work in the mines and refining mills of Potosí -- about 13,500 men, it has been estimated. The furthest village or town that the mitayos were expected to come from was Cuzco, the historic capital of the Inca empire, from which recruits would travel for two months across the Andes to complete the 600-mile journey. Once in Potosí, the mitayos were obliged to work only one week in three, and their chiefs accordingly divided the indentured men into three shifts when they arrived. The working conditions in the mines themselves became a byword for hardship and cruelty. The indentured mitayo had to work twelve hours a day, with only an hour's break at noon, in a mine up to 500 feet deep, where the air was 'thick and evil-smelling, trapped in the bowels of the earth', and where tough physical work would be undertaken in near-darkness, with only candlelight as a source of illumination.

Indian Miners at Potosí, 1590, by Theodor de Bry

"The Spanish Crown did not own, or manage, mines directly. The Habsburgs' share of the profits from the mineral wealth of the New World took the form of a tax, the quinto real, or royal fifth. As its name implied, this was a 20 per cent royalty fee imposed on every ounce of gold and silver extracted from the New World which came into Seville, the only permissible entry port into Europe for the Spanish treasures from further west. In a period just short of 160 years, from 1503 to 1660, it has been estimated that 16,000 metric tonnes of silver arrived at Seville, enough to triple the existing silver resources of Europe, while only 185 metric tonnes of gold entered Seville from the New World during the same period, which increased Europe's gold supplies by only a fifth. Silver was the key element of the imaginary wealth of Spain. The wealth was imaginary because its effects on the Spanish monarchy were unexpected and would lead, in the long run, to bankruptcy and a diminution in the power of Spain in world affairs. ...

"Within Spain itself, it was the port of Seville which gained from the trade in gold and silver from the New World. It was in Seville that the House of Trade, a government bureau for the regulation and development of New World commerce, was established. Seville was also the location for the Council of the Indies, and it is in Seville that the extraordinary archives, documenting five centuries of the Spanish American empire, are housed in the Archivo General de Indias. It is no accident that four of the most widely performed operas in the modern era -- Mozart's The Marriage of Figaro and Don Giovanni, Rossini's Barber of Seville and Bizet's Carmen -- are set in this city, which in 2011 had a population of only 700,000. Seville in the sixteenth and seventeenth centuries was one of the greatest financial centres of the world. As if to entrench its position, in 1572 Philip II declared the town the only legal terminus of the American trade, a status which was kept in law until 1717.  In Seville, the flow of bullion offered an unrivalled opportunity to 'acquire great wealth in trading' which 'lured both the local nobility and businessmen from abroad and from other regions of Spain'."

War and Gold: A Five-Hundred-Year History of Empires, Adventures, and Debt
Author: Kwasi Kwarteng
Publisher: PublicAffairs
Copyright 2014 Kwasi Kwarteng

These Surreal 'Time-Slice' Photos Show Famous Landmarks from Sunrise to Sunset - CityLab




Image: Richard Silver

What scares the new atheists | John Gray

What scares the new atheists | John Gray | The Guardian



The ruins of an ancient civilization that neighbored the Maya

3.3.15

City Talk

City Talk | Lapham’s Quarterly

Keeping it Organized

The Organized Mind by Daniel J. Levitin. We live in a world with 300 exabytes (300 billion billion pieces) of information, an amount still expanding rapidly from that already brobdingnagian level. And yet the processing capacity of the conscious mind is a mere 120 bits per second. This presents a challenge not only to our processing capacity but also to our decision-making ability:

"Neuroscientists have discovered that unproductively and loss of drive can result from decision overload. Although most of us have no problem ranking the importance of decisions if asked to do so, our brains don't automatically do this. ... The mere situation of facing ... many [small] decisions in daily life creates neural fatigue, leaving no energy for the important decisions. Recent research shows that people who were asked to make a series of meaningless decisions ... showed poorer impulse control and lack of judgment about subsequent decisions. It's as though our brains are configured to make a certain number of decisions per day and once we reach that limit, we can't make any more, regardless of how important they are. One of the most useful findings in recent neuroscience could be summed up as: The decision-making network in our brain doesn't prioritize.

"Today, we are confronted with an unprecedented amount of information, and each of us generates more information than ever before in human history. ... Information scientists have quantified all this: In 2011, Americans took in five times as much information every day as they did in 1986 -- the equivalent of 175 newspapers. During our leisure time, not counting work, each of us processes 34 gigabytes or 100,000 words every day. The world's 21,274 television stations produce 85,000 hours of original programming every day as we watch an average of 5 hours of television each day, the equivalent of 20 gigabytes of audio-video images. That's not counting YouTube, which uploads 6,000 hours of video every hour. And computer gaming? It consumes more bytes than all other media put together, including DVDs, TV, books, magazines, and the Internet.

"Just trying to keep our own media and electronic files organized can be overwhelming. Each of us has the equivalent of over half a million books stored on our computers, not to mention all the information stored in our cell phones or in the magnetic stripe on the back of our credit cards. We have created a world with 300 exabytes (300,000,000,000,000,000,000 pieces) of human-made information. If each of those pieces of information were written on a 3 x 5 index card and then spread out side by side, just one person's share -- your share of this information -- would cover every square inch of Massachusetts and Connecticut combined.

"Our brains do have the ability to process the information we take in, but at a cost: We can have trouble separating the trivial from the important, and all this information processing makes us tired. Neurons are living cells with a metabolism; they need oxygen and glucose to survive and when they've been working hard, we experience fatigue. Every status update you read on Facebook, every tweet or text message you get from a friend, is competing for resources in your brain with important things like whether to put your savings in stocks or bonds, where you left your passport, or how best to reconcile with a close friend you just had an argument with.

"The processing capacity of the conscious mind has been estimated at 120 bits per second. That bandwidth, or window, is the speed limit for the traffic of information we can pay conscious attention to at anyone time. While a great deal occurs below the threshold of our awareness, and this has an impact on how we feel and what our life is going to be like, in order for something to become encoded as part of your experience, you need to have paid conscious attention to it.

"What does this bandwidth restriction -- this information speed limit mean in terms of our interactions with others? In order to understand one person speaking to us, we need to process 60 bits of information per second. With a processing limit of 120 bits per second, this means you can barely understand two people talking to you at the same time. Under most circumstances, you will not be able to understand three people talking at the same time. We're surrounded on this planet by billions of other humans, but we can understand only two at a time at the most! It's no wonder that the world is filled with so much misunderstanding. With such attentional restrictions, it's clear why many of us feel overwhelmed by managing some of the most basic aspects of life."


The Organized Mind: Thinking Straight in the Age of Information Overload
Author: Daniel J. Levitin 
Publisher: Penguin Group
Copyright 2014 by Daniel J. Levitin

Read the First-Ever Issue of TIME

Read the First-Ever Issue of TIME | TIME



First cover of TIME

2.3.15

Canines


Blame the Dog

The Invaders: How Humans and Their Dogs Drove Neanderthals to Extinction
By Pat Shipman
The Belknap Press of Harvard University Press, 2015
The humans who lived in Europe during the Pleistocene 45,000 years ago faced a fierce world. Their populations had endured sweeping climate changes over the centuries, and they shared their landscape with animals as monstrous as anything the world had seen since the dinosaurs: massive cave bears, saber-toothed tigers, lions bigger than any in Africa today, cave hyenas, huge woolly mammoths, woolly rhinoceroses, wolves, leopards, roving packs of dholes – a world as fearsome and strange as something out of a science fiction novel.
And one of the strangest things about that world was those humans themselves, because they weren’t us. They were squatter, heavier, hairier, much stronger. They had very large brains, very skillful hands, a strongly family-oriented culture, and considerable technological capabilities. They were a species of human we know as Homo neanderthalensis, the Neanderthals, and they’d lived and laughed and hunted and died for millennia in Eurasia, weathering radical climate shifts and game migrations, raising their young, caring for their infirm, and burying their dead.
Things had been like that in Eurasia for hundreds of thousands of years. Then, roughly 40,000 years ago, everything began to change in what, anthropologically speaking, amounts to an eye-blink of time. Cave bears, saber-toothed tigers, mammoths, rhinos, lions, leopards, dholes … fierce as they were, they all vanished from the forests and steppes of Eurasia. And Neanderthal populations first drastically dwindled and then vanished as well, and now, in our time, for two hundred years, ever since the discovery of the first Neanderthal fossils, debate has raged as to what caused this catastrophic die-off. Did the climate shift too suddenly for adaptation or migration to keep pace? Did reproductively vulnerable species ‘bottleneck’ and enter irretrievable decline? Unlike with the extinction of the dinosaurs 66 million years ago, which was precipitated by a cataclysmic asteroid strike, the near-total extinction of the Pleistocene Eurasian species spectrum has no one prevailing theory to explain things.
There is an overwhelmingly likely culprit, however. Scientists, much like Sherlock Holmes, try never to theorize in advance of the facts, but nevertheless, something happened to the continent of Eurasia right before all its megafauna – including its resident species of human being – disappeared.
Modern humans happened. Homo sapiens arrived in Eurasia roughly 45,000 years ago, and very shortly thereafter, virtually every large species of prey animal and competing predator was gone. The patently obvious deduction is that Homo sapiens intentionally and methodically wiped out all those other species.
But science teaches the invaluable habit of distrusting the patently obvious, and so retired anthropology professor Pat Shipman takes very little for granted in her endlessly fascinating new book The Invaders: How Humans and Their Dogs Drove Neanderthals to Extinction. In fact, she has her own hypothesis to put forward as to a key factor that hastened the Neanderthals’ demise. Perhaps you spotted it in her book’s subtitle.
Shipman characterizes modern humans as the ultimate invasive species, flooding into a new ecosystem and radically destabilizing every aspect of it. In her clear-cut and pervasively (but perhaps not entirely intentionally?) ironic chapter “What Does an Invasion Look Like?”, Shipman illustrates the whole concept of invasive species with a modern example: the Greater Yellowstone Ecosystem and the much-smaller Yellowstone National Park inside it. The Park was officially designated in 1872, long before which the native inhabitants of the region, the Shoshone, Nez Perce, Crow, Cheyenne, and others, had been driven off or exterminated – at which point, Shipman writes, “The incoming white human settlers functioned as an invasive predator and promptly eliminated their chief remaining rivals, the wolves.” In 1915 the Federal Bureau of Biological Survey and its Division of Predator and Rodent Control sought to wipe out all large predators from federal lands, which in the case of wolves was largely accomplished by the 1930s.
And as Shipman succinctly points out, “The removal of wolves, the indigenous apex predator, made a huge difference to the ecosystem.” Everything from elk population to foliage density was altered, and it all began to alter again when wolves were re-introduced to Yellowstone National Park in 1995-96:
Almost as soon as wolves were released in 1995, they began killing and driving away coyotes. It was as if the first item on their agenda was “get rid of coyotes,” as the ranchers’ and settlers’ agenda had been first “get rid of Indians” and then “get rid of wolves.” Quite simply, the wolves would not tolerate the presence of coyote rivals in their territories – and they were equally merciless with members of other wolf packs that strayed onto their turf.
And if modern humans and wolves sound formidable individually, imagine how much more formidable they’d be if they worked together. That’s the heart of Shipman’s hypothesis: that Homo sapiens, roughly ten thousand years after arriving in Eurasia, stumbled onto a crucial adaptation that had eluded all other apex predators before them: making an alliance with another apex predator. Somehow, modern humans forged a symbiotic relationship with wolves that quickly led to a kind of wolf-dog that was no longer entirely wild. Homo sapiens had domesticated the competition.
Shipman theorizes that it was a feat Neanderthals couldn’t match. “Whatever abilities modern humans used to capture and apparently domesticate wolves into wolf-dogs,” she writes, “were either unknown to Neanderthals or beyond their capabilities.” She even offers an intriguing possibility for what Homo sapiens‘ x-factor might have been: the whites of their eyes! The idea being that the white sclera surrounding the modern human iris greatly facilitates gaze-directed silent hunting – an obvious advantage in tracking prey – and that it’s something certain kinds of canids share to a greater degree than others (hence the centrality of the bright-eyed wolf to the story rather than, say, the omnipresent but black-eyed bush dog). Thus the tendency of modern dog-owners to stare meaningfully into their dogs’ upturned faces might be vital in explaining how either the dog-owner or the dog is here at all.
Modern humans, Shipman contends, in forming this kind of “unprecedented alliance with another species … created for ourselves an ability to borrow the traits of other species and use them to enhance our own survival in almost every habitat on the planet.” Wolves instinctively understood exactly the kind of hierarchical social structure humans already had, which made it that much easier for Homo sapiens to begin domesticating wolf-dogs and using them in ways present-day hunters will recognize: a pack of canines can detect prey long before humans can, and they can chase that prey farther and longer than humans can, and, crucially, they can keep that prey at bay and stationary until humans can arrive with their superior numbers and projectile weapons. The wolf-dogs would have realized in short order that, in exchange for setting aside their instinctive distrust of hominins, the arrangement would garner them more reliable kills. And the humans would have seen that the wolf-dogs were helping to secure more meat than they’d provide if they themselves were simply slaughtered. And so the 35,000-year-old partnership between humans and dogs began – in multiple genocides.
It’s a word Shipman never uses, although she holds no illusions about the ominous timing of the whole thing. Modern humans appeared, spread through the Neanderthals’ range, eventually domesticated wolf-dogs, and shortly thereafter, the Neanderthals were gone. Good cautious scientific thinker that she is, Shipman makes no declarations, and she’s careful to stipulate that drastic climate change also played a key role: it was the “synergy” between the advent of modern humans and a sudden shift in climate that tipped the balance against the Neanderthals. She even occasionally goes so far as to allow for the absurd possibility – mentioned also elsewhere in the literature of Neanderthal extinction – that there were no hard feelings:
People today are often frightened of strangers; how much more threatening would meeting another hominin species be? Limited resources and food competition would only heighten the fear. Possibly there was no conscious awareness of competition between Neanderthals and modern humans, but equally possibly, there was. At any rate, Neanderthals went extinct after the arrival of modern humans and possibly not long after.
The ideas are unfailingly thought-provoking, but there’s a vein of contradiction running through a good deal of The Invaders, and it’s hard not to suspect it’s because the author is a member in good standing of Homo sapiens. The central contradiction is actually embodied in that Yellowstone example: the point isn’t the white settlers and the wolves – it’s the white settlers and the Indians. Where newly-introduced wolves are content merely to kill or drive off direct rivals for prey, Homo sapiens tends to be much more industrious when it comes to extirpation. There are wolves and coyotes in Yellowstone today. But all of Eurasia wasn’t big enough for Homo sapiens and Neanderthals.
The modern human propensity for genocide is aptly illustrated in the case of mammoths. Remains of the animals have been found in connection with many Neanderthal sites, but it’s only with the advent of modern humans that the mammoth population as a whole begins to be affected. Paleontologists have discovered mammoth ‘megasites’ involving the remains of dozens of animals all in the same place – a disproportion, as Shipman notes, for which there was no precedent:
Modern humans disturbed the long-standing ecosystem with their arrival about 45,000 years BP, but starting about 32,000 years ago there was a second extraordinary change. From then until about 15,000 years ago, modern humans were killing and using mammoths in extraordinary numbers not seen in any Neanderthal sites.
Anyone even passingly familiar with the behavior of humans in the modern world – denuding entire ecosystems, lavishing waste and slaughter in all directions, driving one of the greatest mass extinctions in the history of life on Earth – will have no trouble at all understanding the ugly truths behind mammoth ‘megasites.’ But science, as noted, is inherently cautious, even when, as in this case, that caution ends up looking a little silly:
The use of mammoth resources does not prove mass killings, but sites with large numbers of mammoths need to be explained. Why would mammoths start dying in large numbers only after modern humans arrived? S. V. Leshchinsky and colleagues have suggested that woolly mammoths were stressed by nutritional deficiencies because of climate change, but why such stresses might become acute after the arrival of modern humans is unclear.
Fierce wolf-dogs may have given Homo sapiens an evolutionary advantage
Shipman warns against a too-easy, overly-dramatic reading of the evidence; indeed, one of the scientists she interviews laughs at the “Ernest Hemingway” notion that Homo sapiens was responsible for wiping out the mammoths (and the cave bears, and the saber-toothed tigers, and the lions, and the rhinos, etc.). She reminds her readers that virtually no evidence has turned up in the fossil record to indicate direct modern human-Neanderthal violence, and she sticks to her own hypothesis: that the amazing, logic-defying partnership of modern humans and wolves, combined with a sudden climate change even more severe than usual, drove the already-strained Neanderthal population in Eurasia over the brink and into extinction.
Many factors, an unprecedented inter-species cooperation, a quickly-changing environment, a robust new species out-competing a struggling old species. A very intriguing new take on an old question, this one featuring the basset hound currently snoring on your couch. A broad-based new hypothesis that most definitely doesn’t go in for any Ernest Hemingway notions about one miswired, genocidal species wiping out anything bigger than a jackrabbit within a thousand miles in all directions.
Perfectly reasonable, and it might even end up being true. Just don’t mention it to the passenger pigeon.
____
Steve Donoghue is a writer and reader living in Boston with his dogs. He’s recently reviewed books for The Washington Post, The National, The Wall Street Journal, The Boston Globe, Historical Novel Review Online, and The Quarterly Conversation. He is the Managing Editor of Open Letters Monthly, and hosts one of its blogs, Stevereads.

Music


Church music: Seen here in the medieval treatise "Tacuinum Sanitatis", it forged the basis of classical music

It is a mystery to many people why so few contemporary classical composers seem capable of writing "a good tune". Surely, given the number of students who pursue composition in our universities and conservatoires, and the hugely increased access which technologies such as music-notation software give to prospective composers, we should expect to find at least one or two capable of making a popular impact? Why is it that, with more people than ever engaged in the activity of composing, our culture still seems incapable of fostering a contemporary Verdi or Stravinsky, with the celebrity and popular recognition that such great figures once garnered?

It is certainly true, as Simon Heffer has amusingly put it in Standpoint ("A Raspberry for Emetic Music", November 2014), that the musical establishment is "in hock to the crap merchants" and in thrall to the state, creating a tyrannical orthodoxy of ugliness, admission to which can only be gained by imitating the style of "orchestrated raspberries" currently in vogue. However, the underlying cause—though closely related to the over-reaching influence of the modern state—ultimately goes far deeper than this. To understand the deficit of successful contemporary classical music, what we need to uncover are the feelings which motivated the artistic instincts of the great composers of the past, but which are now absent in the minds of modern composers, thus accounting for their "emetic" output.  

In the year 1900, the following composers were alive, and the majority of them active: Saint-Saëns, Debussy, Ravel, Stravinsky, Rimsky-Korsakov, Rachmaninoff, Prokofiev, Bartók, Elgar, Vaughan Williams, Holst, Mahler, Strauss, Sibelius, Grieg, Puccini, Dvořák and Janáček. This list of exalted and well-known figures is far from exhaustive, and should give us pause. We cannot possibly pretend that the world today can boast a similar number or calibre of composers; indeed, any one of these figures is of far more interest to most of us than any of today's most famous composers. Moreover, if one expands this categorisation to include any composer active between the years 1850 and 1950, one possesses pretty much a complete list of the works in the standard orchestral repertoire (save the old German masters), and hence those pieces which one would find overwhelmingly on offer in any events guide produced by today's professional orchestras.

On closer inspection, it is not hard to see the idée fixe that unites this vast array of varied talent: nationalism. To varying degrees of explicitness, whether through the deliberate inclusion of folk elements, or simply a general over-arching style suggestive of national sentiment, all of these figures would quite happily have thought of themselves, not just as composers, but as French, Russian, Hungarian, English, German, Finnish, Norwegian, Italian or Czech composers. It is in fact a statement of the obvious to point out that the feelings that underpin a good deal of what these composers set out to accomplish were driven by a passion for the language, history, customs, traditions, institutions and, perhaps most prominently, the countryside of their native lands.

This surge of nationalist output, produced during the long 19th century, was an obvious accompaniment to the growth of the nation state itself. However, there is another deeper set of convictions which the classical composers held in common, and upon which the nation states of Europe themselves were predicated: Christianity.

Even in opera, a seemingly secular arena, Christianity commonly frames the moral dilemmas of the characters on stage. Mozart's Don Giovanni is dragged off to Hell, Verdi's Leonora takes refuge in a monastery, and Janáček's Jenůfa is just one of the many characters from the operatic repertoire who offers up a Christian prayer in a moment of great despair and need. This isn't merely because the Church held the purse-strings, as some have argued, but because there is a profound and inseparable relationship between music and Christianity; in fact, I would go so far as to argue that there is a sense in which Western music is Christian. The very scales (originally church modes) and harmonies which musicians of any ilk take as a given were forged in the cathedrals and churches of the medieval world. Through a gradual process of setting liturgical texts to music, sonorities such as the dominant-seventh chord were discovered, which then became the basic material of all classical and popular music. Something of the wisdom of the Gospels and the Psalms shines out of the harmonies of Western music—which is that crucial balance between judgment and compassion—and this is why, even on the operatic stage, a Christian moral logic so naturally and fittingly flows forth from the voices of the characters and the machinations of their plots.

Two operas in particular strongly support this line of reasoning, both of which place the suffering of Christ on the cross as a central image around which their respective stories revolve: The Rape of Lucretia by Britten, in which a narrative chorus "view these human passions, and these years/through eyes which once have wept with Christ's own tears", and Wagner's last opera, Parsifal, with its profound insights into the relationship between religious communities and sexual desire. Both operas acknowledge the debt which music owes to Christianity by bringing it back into the realm of secular music-making, and the consequence in the instrumentation of both scores is a remarkable glowing luminosity.

To gain a proper and complete understanding of what we call "classical" music is to appreciate that it was all written within the context of societies which were predominantly Christian in nature, and where celebrations of traditional national attributes were not seen as old-fashioned or backward-looking as they often are today. This all changed, however, in the 1960s, with the old moral authority of Christianity and nationalism brought into question by two World Wars which had slain "half the seed of Europe one by one", and the dawning of the sexual revolution. Liberated from the traditional restraints of Christian society, not least because of the oral contraceptive pill which spread rapidly throughout the world during the early 1960s, there was a sudden seismic shift in young people's behaviour and attitude towards sex, and one of its many consequences was the beginning of an era of "popular" music which gave expression to the new feelings which they could now experience and communicate publicly without shame or censure.

Let's be honest with ourselves: except for a few tangents here and there, the 1960s, 70s, 80s and 90s were overwhelmingly the decades of popular music. If you ask anyone their choice favourites from the 60s and 70s, only a tiny fraction will say Boulez and Stockhausen—and even they are just kidding themselves. Classical music did not enter a fantastic new period of experimentation and innovation in the 1960s. It died. What really took place was a repositioning of the psychological focus of music from the mature feelings of reflective adults to the more impatient and direct feelings of the young. With its "oohs" and its "aahs", its "come-ons" and its "get-downs", its "rock me" this and its "baby" that, the three-minute pop song homes in on the cheap thrills of recreational sex. Popular music is primarily about the highs and lows of the casual relationship. Different popular songs capture the feelings of different stages along its rise and fall: the yearning for it to begin ("Love me do"), the exuberance and satisfaction of being in the relationship ("I feel fine"), the little jealousies involved within the relationship ("Tell me why") and the angst of the breakup ("I'll cry instead")—to name but a few early Beatles songs.

None of these remarks are intended to condemn popular music (I would far prefer to listen to a favourite track by Michael Jackson than suffer through another BBC Proms commission). What these observations do illuminate, however, is the connection between the profound changes which affected the regulation of our sexual conduct during the 1960s, and, at the very same time, the decline of enduring new classical works, and the explosion of popular music onto the cultural scene as a new expressive force. In a sense, popular music stole classical music's mojo. Of course, my analysis is a broad gesture that does not take jazz or minimalism into account—which provide, so to speak, a bridge between the world of classical and popular music—nor does it explain the many other popular styles which existed before the 1960s (although these themselves bear witness to a growing liberalism), but it nevertheless represents a key moment, and helps to demonstrate the gradual passing of the baton that took place in music as progressive societies entered modernity.

Musical modernism is what was left behind after the feelings which motivated the great classical composers had dissipated. What you are hearing in the dysfunctional harmony and unattractive groans of Harrison Birtwistle and his many imitators is a massive God-shaped hole, where once natural authority and faith resided. This is what "atonal" music really is: a loss of faith, and this is why anyone who counteracts its dominance is quickly condemned as "naive", in just the same manner as those who continue to hold religious convictions in a scientific age. It is what has led composers such as Robin Holloway to confess that "all we like sheep have dumbly concurred in the rightness of [Schoenberg's] stance; against the evidence of our senses and our instincts".

I would be the first to acknowledge the dramatic talents of Alban Berg, the brilliant textural instrumentation of György Ligeti or the accomplished musicianship of Thomas Adès, but what all these composers have in common—despite the stylistic differences and time which separate their work—is that lack of inspiration within the musical material itself which began with Schoenberg and persists to this day. They all suffer from that excruciatingly dreary, lifeless sound which turns audiences off for want of "a good tune" (even if this phrase doesn't quite capture what they mean), and which is why ultimately none of their music has entered the standard repertoire, or enjoys anything near the popular recognition of the composers I listed earlier. It is why modern orchestras and opera houses suffer an endlessly commissioned conveyor-belt of "world premieres", forgotten the moment they see the light of day, and it is why the money is now finally starting to run out, with the state less willing to pay for it all and private patronage (for the obvious reason that it is unlovable) unwilling to fill the gap.

All the phoney "outreach projects", pseudo-pop fusions or desperate appeals to political correctness cannot halt this inevitable financial decline, and, with the copyright on composers like Rachmaninoff and Vaughan Williams due to expire soon, an already ailing publishing industry—which has colluded for far too long in maintaining the illusion that musical modernism was ever worth much—is going to have its coffers hit hard. A list of the most popular rental titles offered by the major music publisher Boosey & Hawkes as of 2012 bears this contention out, since none of the works in question was written after 1960, nor could any of them be remotely considered atonal:

1. Bernstein: Symphonic Dances From "West Side Story"
2. Bernstein: Overture to "Candide"
3. Mussorgsky/Ravel: Pictures at an Exhibition
4. Britten: The Young Person's Guide to the Orchestra
5. Rachmaninoff: Piano Concerto No. 2
6. Britten: Four Sea Interludes
7. Copland: Appalachian Spring Suite
8. Copland: Clarinet Concerto

With all this in mind, therefore, we can start to comprehend those rare instances since the 1960s where some classical music worthy of our attention has been produced, and we should not be surprised to see that they have sprung most prominently from a Christian setting—in particular, the great tradition of choral music which continues in the Oxbridge colleges and cathedrals across England. The best examples include John Tavener's outstanding setting of Blake's "The Lamb", early insightful glances into what a composer like George Benjamin might have been in his magnificent "Twas in the year that King Uzziah died", the admirable liturgical output of Judith Bingham and Judith Weir, and the success of those two wonderful choral works, "Sleep" and "Lux Aurumque" by Eric Whitacre, suffused with his distinctive brand of American televangelism. In addition, another often-forgotten backwater is the world of wind and brass music which, given its ties to the Royal Family, the armed forces and (particularly in the case of brass bands) its commitment to the great Christian hymn tune, has allowed composers like Edward Gregson and Kenneth Hesketh to sneak past a few nationalist contributions which contrast starkly with their usual "squeaky-gate" output. With its tuba trills and macho melodies, Gregson's "The Plantagenets" for brass band masterfully evokes the passions and chivalry of the old English kings, whereas Hesketh's youthful "Masque" and "Whirlegigg"—which enjoy international renown—are straight out of the military banding traditions of Vaughan Williams and Holst. What all of the above examples go to prove is that modern composers do still have it in them, when they are brave enough (or innocent enough) to try; however, these examples still exist on the periphery of the musical establishment, which, as Glare—a new opera presented by the Royal Opera House last November—amply demonstrates, remains stuck in a self-hating modernist rut.

Things might be about to change, however, and I think I can suggest a few reasons why this might be: popular music has run out of steam. The young know this (several students of mine have testified to its truth); they admit that even the best that is on offer these days—the chilly sounds of Coldplay or the Arctic Monkeys—cannot compete with the energetic exuberance of, say, Abba, and that so much that is pumped out of the radio is now empty commercialism.

This decline, I suspect, relates back to the ongoing liberalisation of societies which began in the 1960s. The overthrowing of Christian chastity and discrediting of nationalism went hand in hand with the rights revolutions, which improved the freedoms of non-white races, homosexuals and women, and these causes were also reflected in popular music: hence, "[It doesn't matter if you're] Black or White" by Michael Jackson, "I want to break free" by Queen, or "Eleanor Rigby" by the Beatles. During this period, the young had a lot to rebel against, and many just causes to champion. Now, however, it is fair to say that, in the West, social norms have been established which condemn any form of discrimination based on race, gender or sexuality, and so the young have very little real to rebel against anymore, and the motivations and feelings which inspired so much great popular music, and which pushed the old authority of classical music to one side, have now run dry.

Instead, what has crept into our institutions of late—particularly in education—is a systemic lack of leadership and authority. So, in conservatoires and music departments, nobody teaches harmony and counterpoint any more, although this, as explained above, is fundamental to all Western music. What has happened here is that the baby has been thrown out with the bath-water, and an overshooting liberal agenda has jettisoned all that was of value from the past, as well as those things which needed changing—as Steven Pinker has aptly put it, the rights revolutions have now entered their "decadent phase".

We have now reached a point, however, where the rot has gone so deep that we can no longer afford to maintain the lie that modernism was ever worth much—and not just because the money is running out. With the many subversive and insidious forces of globalisation beginning seriously to undermine the legitimacy of the nation state, and with Christianity under attack from a new liberal bigotry which has made expressing Christian sentiments all but taboo in much of public life, what we need now are forms of culture that will help us to shore up these foundations. This is only possible, however, if we allow leadership and authority back into our artistic institutions, if we take a suitably compassionate pride in our national identity, and if, without any awkwardness or shame, we believe in the value and virtue of our Judaeo-Christian roots.

GAGA


The Sound of Gaga: An Oscars Reinvention

On October 28, 2013, Lady Gaga exited a hotel in London sporting eerily chalk-white skin, an eccentric pale dress made of flyaway ribbons, and a mess of a teased platinum mane. Just a day later, she emerged from a taping of The Graham Norton Show as an entirely different person. With her skin tone back to normal and gigantic black feathers protruding from her head, she trod the streets of London as if they were a runway, strutting her stuff in dangerously tall shoes, billowing business pants, and a see-through shirt.
Lady Gaga’s shtick has always relied on reinvention.
The queen of the outrageous has never worn the same outfit twice. She has coated herself in ginormous, glittery sticks to resemble a star. She has hidden her whole body in piles of white fur. She has worn a blown-up version of the Mona Lisa (while Leonardo da Vinci rolled over in his grave). She has rocked the red carpet in a dress, hat, and shoes made from slabs of actual meat. She has slipped into a frock made only of bubbles. The list goes on and on.
Her hair has changed as rapidly as her outfits have. Her ever-evolving locks have gone through a kaleidoscope of hues and styles—from spiky strands like a sea urchin’s tentacles, to a faded mix of white and gray and turquoise, to a pile of ringlets atop her head, to neon yellow tresses, to impossibly long locks grazing her thighs, to gray ropes in place of hair, to choppy and jet-black hair, to blue waves to match her painted skin.
The music that has made Mother Monster famous mirrors the fashion decisions that have made her unforgettable. She rose to fame with jarring tunes, tunes that stick with you, tunes that have driving beats, tunes that rely heavily on techno sounds. In 2008 she released The Fame, which offered hits like “Just Dance” and “Poker Face.” The next year, she followed up with The Fame Monster, which gave the public the memorable pop ballad “Bad Romance.” Born This Way presented explosive numbers like “The Edge of Glory” and the titular track in 2011. And in 2013, Artpop asked for the “Applause” that her songs always receive.
But then something unanticipated happened. Lady Gaga reinvented herself, again. But like never before.
After years and years of a career founded upon the concept of reinvention, Lady Gaga cooked up a new persona so bizarrely unsuited to her, so wildly un-Gaga that she managed to surprise even her most attentive fans. She stripped off the meat, simplified her style, and left the techno musical embellishments at the door. She gave up the glare and glamour of pop for the soothing, nostalgic melodies of the oldies. She returned to a simpler time, a time that makes perfect sense as the next stop on Mother Monster’s road to global domination because she so totally does not belong there.
Her unforeseen evolution began at the doorstep of Tony Bennett, a renowned jazz artist and 18-time Grammy Award winner who became famous in the 1950s for ditties like “Rags to Riches” and “Because of You.” Tony Bennett and Lady Gaga are the definition of an unlikely duo—an 88-year-old jazz great and a 28-year-old pop rule-breaker—but together, they are unstoppable. In 2014, the pair put out an album entitled Cheek to Cheek, filled with old-timey standards including “Anything Goes,” “It Don’t Mean a Thing,” and “I Can’t Give You Anything But Love.” Audiences approved heartily of Lady Gaga’s newest reincarnation of herself: Cheek to Cheek debuted in the first spot on the Billboard 200 and went on to win Best Traditional Pop Vocal Album at the Grammy Awards.
A playful music video for “The Lady is a Tramp” shows Lady Gaga adapting spectacularly well to the new role that she has given herself: She flirts charmingly with Tony Bennett, even dancing with him when the music moves her. She sings with an exquisitely pure voice that represents a departure from the electric tunes she recorded in years past. Though her cyan-blue hair reminds us that we are indeed looking at Lady Gaga, she dons a lacy black gown that is far classier than most of the outlandish duds that the public is used to seeing her wear.
Lady Gaga delivered the greatest proof of her remarkable reinvention at the 87th Annual Academy Awards on February 22, when she took to the stage to perform a medley of songs from The Sound of Music in honor of the beloved film’s 50th anniversary. When Scarlett Johansson leaned into the microphone and announced the act, the group that I was watching the show with audibly balked. Lady Gaga singing Julie Andrews’s songs? the crowd seemed to murmur. Who does she think she is?
Lady Gaga was singing for a highly skeptical audience. For one of the first times since she was a newcomer in the music industry, she had to prove herself. She had her reputation to lose (remember Carrie Underwood’s widely panned turn as Maria von Trapp?) if she did not approach the majesty of Julie Andrews in the eyes of millions of fiercely loyal Sound of Music fans across the globe. She did not have to risk this.
But she did.
Lady Gaga sang some of the most illustrious tunes ever composed—“The Hills Are Alive,” “My Favorite Things,” “Edelweiss,” and “Climb Every Mountain”—with a powerful, soaring voice that rivaled the originals. Her blonde hair and lovely gown reflected the gorgeous simplicity of her voice. Her raw talent stunned my incredulous group. As the performance progressed, the previously chatty crowd fell silent in awe, and by the time Julie Andrews walked onstage at the conclusion of the medley, there was not an un-dropped jaw in the room. When the screen legend gave her successor a hug, Lady Gaga’s brilliant transformation seemed to be complete.
But Lady Gaga’s reinvention represents more than just an inspired career move. By performing duets with Tony Bennett and honoring The Sound of Music, Lady Gaga has built an important creative bridge from the present to the past, reassuring us that the musical accomplishments of yesteryear will live on thanks to the millennial artists who breathe new life into old material. There is something beautiful and utterly delightful about watching musical legacies continue to thrive. Lady Gaga gave all of us a gift as she reinvigorated artistic masterpieces from the past.
So when Julie Andrews hugged Lady Gaga at the Dolby Theater, everyone’s favorite onscreen nun was not only patting the performer on the back for a job well done and a song well sung. Rather, she was passing the torch from one generation to the next, a sign to viewers young and old that the hills will always be alive with the sound of music that we all love so dearly.

National Pastime


Tom Wolfe

It's the birthday of journalist and novelist Tom Wolfe, born in Richmond, Virginia (1931), the author of the books The Electric Kool-Aid Acid Test (1968), Radical Chic & Mau-Mauing the Flak Catchers (1970), and The Right Stuff (1979). He helped spark the "New Journalism" movement, which began in the 1960s.
He went to graduate school at Yale in the 1950s, and in the midst of the Red Scare wrote a thesis entitled The League of American Writers: Communist Organizational Activity Among American Writers, 1929-1942. He had a Ph.D., but rather than go into academia he decided to be a newspaper reporter. Then, in the early 1960s, there was a newspaper strike in New York City, and the paper he worked for was affected. He was out of a job for a while, and he decided to pitch an idea to Esquire magazine for a story about the hot-rod car culture around southern California.
The editor agreed, and Wolfe went out to L.A., hung around car shows, drag races, and demolition derbies, and ran up a $750 bill at a Beverly Hills hotel. He'd taken lots and lots of notes, but he couldn't figure out what the story should be or how to write it up, not even by the night before his magazine deadline. The editor told him to type up his notes, send them, and he'd go ahead and put together the story. Wolfe sat at his typewriter and banged out a letter to his editor with his ideas and observations. His editor liked it so much that he just removed the salutation ("Dear Byron") at the top and published Wolfe's notes as a feature article. The story was a huge hit and became the title piece in Wolfe's first published book, The Kandy-Kolored Tangerine-Flake Streamline Baby (1965).
A few years later, he published The Electric Kool-Aid Acid Test (1968), a nonfiction novel about Ken Kesey and the Merry Pranksters. It became a cult classic almost right away. In chapter six, he writes about the bus the Merry Pranksters refurbished to drive around the country and convert people to their new religion:
I couldn't tell you for sure which of the Merry Pranksters got the idea for the bus, but it had the Babbs touch. It was a super prank, in any case. ... They started painting it and wiring it for sound and cutting a hole in the roof and fixing up the top of the bus so you could sit up there in the open air and play music, even a set of drums and electric guitars and electric bass and so forth, or you could just ride. Sandy went to work on the wiring and rigged up a system with which you could broadcast from inside the bus, with tapes over microphones, and it would blast outside over powerful speakers on top of the bus. There were also microphones outside that would pick up sounds along the road and broadcast them inside the bus. There was also a sound system inside the bus so you could broadcast to one another over the roar of the engine and the road. You could also broadcast over a tape mechanism so you said something, then you heard your own voice a second later in variable lag, and could rap off of that if you wanted to. The painting job, meanwhile, with everybody pitching in a frenzy of primary colors, yellows, oranges, blues, reds, was sloppy as hell, except for the parts Roy Seburn did, which were nice manic mandalas. Well, it was sloppy, but one thing you had to say for it, it was freaking lurid.
In an essay published in 2007, Tom Wolfe argued that the newspaper industry would stand a much better chance of survival if newspaper editors encouraged reporters to "provide the emotional reality of the news, for it is the emotions, not the facts, that most engage and excite readers and in the end are the heart of most stories." He said there are exactly four technical devices needed to get to "the emotional core of the story." They are the specific devices, he said, "that give fiction its absorbing or gripping quality, that make the reader feel present in the scene described and even inside the skin of a particular character."
The four: 1) constructing scenes; 2) dialogue - lots of it; 3) carefully noting social status details - "everything from dress and furniture to the infinite status clues of speech, how one talks to superiors or inferiors ... and with what sort of accent and vocabulary"; and 4) point of view, "in the Henry Jamesian sense of putting the reader inside the mind of someone other than the writer."
In a couple of paragraphs in The Electric Kool-Aid Acid Test, written more than 40 years ago, he showcased his four chosen techniques in a description of the Merry Pranksters bus test run:
They took a test run up into northern California and right away this wild-looking thing with wild-looking people was great for stirring up consternation and vague, befuddling resentment among the citizens. The Pranksters were now out among them, and it was exhilarating - "Look at the mothers staring at us!" - and it was going to be holy terror in the land. But there would also be people who would look up from out of their poor work-a-daddy lives in some town, some old guy, somebody's stenographer, and see this bus and register delight, or just pure open-invitation wonder. Either way, the Intrepid Travelers figured, there was hope for these people. They weren't totally turned off.
The bus also had great possibilities for altering the usual order of things. For example, there were the cops. One afternoon the Pranksters were on a test run in the bus going through the woods up north and a forest fire had started. There was smoke beginning to pour out of the woods and everything. Everybody on the bus had taken acid and they were zonked. The acid was in some orange juice in the refrigerator and you drank a paper cup full of it and you were zonked. Cassady was driving and barreling through the burning woods, wrenching the steering wheel this way and that way to his inner-wired beat, with a siren wailing and sailing through the rhythm.
A siren? It's a highway patrolman, which immediately seems like the funniest thing in the history of the world. Smoke is pouring out of the woods and they are all sailing through leaf explosions in the sky, but the cop is bugged about this freaking bus.
He's the author of the novels The Bonfire of the Vanities (1987), A Man in Full (1998), and I Am Charlotte Simmons (2004). He said, "The reason a writer writes a book is to forget a book and the reason a reader reads one is to remember it."