


Kafa, the birthplace of coffee, was a kingdom straight out of Rider Haggard

Revered as gods, its rulers would never touch food with their own hands, while their subjects prostrated themselves in the dust — and ate it

17 February 2018
9:00 AM
Where the Wild Coffee Grows: The Untold Story of Coffee, from the Cloud Forests of Ethiopia to Your Cup, by Jeff Koehler
Bloomsbury, pp.254, £18.99
The Philosophy of Coffee, by Brian Williams
The British Library, pp.80, £18.99
For many of us, coffee is the lift that eases the load of our working day. Yet the sharpened mental focus it offers is rarely directed towards its origins. Coffee’s birthplace is Ethiopia, and its beans remain high on caffeine aficionados’ hit lists. They produce smooth brews that carry an extraordinary range of tastes — variously, chocolate, wine, floral, spice and fruit. They have a remarkable history too.
Jeff Koehler travelled extensively in Ethiopia and other coffee-producing countries to research Where the Wild Coffee Grows. The arabica species of coffee tree, which yields the finest coffee, first appeared in Ethiopia’s south-western mountain rainforests. Brilliant red coffee cherries are still gathered by hand there from spindly trees scattered through the dank forests.
Most likely the cherries were first eaten, before the secret of roasting and crushing their beans was discovered. The rainforests were once part of the long-lost kingdom of Kafa, where coffee drinking became prominent in local culture. And what a culture it was. Koehler’s account of Kafa’s history is a yarn to rival anything from H. Rider Haggard. For 400 years, right up to the late 19th century, Kafa’s rulers were revered as gods. The king never ate or drank with his own hands and, as an additional safeguard for his health, a young boy was selected annually for sacrifice. When the king passed by, his subjects prostrated themselves in fealty, while the lowest caste, the Manjo, ate the soil where they lay. Whether such practices assisted the kingdom’s legendary prowess in trade is unclear — there are limited pointers here for Brexit strategies — but there is no denying that Kafa grew rich off the back of slaving and coffee.



Extreme Medicine by Kevin Fong, M.D.

The human body constantly amazes us with its intricate, delicate and resilient organs. Here's a brief description of how oxygen comes in contact with the three million alveoli in our lighter-than-sponge lungs:

"The desire to breathe is among our most primitive urges. We're designed to draw air into our lungs, to exchange fresh oxygen for the waste gas of carbon dioxide. Our lives depend upon this perpetual to and fro of gases ...

"When we describe the path that oxygen takes from the outside world to its final destination in our mitochondria, we do so as though it has agency of its own. We talk of molecules of oxygen moving into our bodies, diffusing across membranes, arriving at mitochondria, almost as though they know where they want to go. But of course oxygen has no free will of its own. In the act of living, your body must solve the problem of how to grab molecules of this gas from the atmosphere and bundle them into cells in sufficient concentration that they can do the stuff of life.

"The first part of that performance is the act of breathing. Your ribs are attached to your breastbone at the front and the bony column that is your spine at the rear. At the end of each exhalation, they slope steeply downward toward the ground. Contracting the muscles in the chest wall that do the work of breathing lifts the ribs up, to a nearly horizontal position, increasing the volume of the chest. At the same time your diaphragm, the large dome-shaped muscle that separates the chest from the contents of your abdomen, contracts and drops down, further increasing the volume of the cavity inside your chest.

"Your lungs sit inside the cage formed by your ribs, adherent to the chest wall. As the chest moves, your lungs move with them. As the volume in your chest cavity increases, so too does that inside your lungs. The increase in volume leads to a decrease in pressure in your chest. That in turn produces suction, in exactly the same way as separating the handles on a bellows does, and air begins to flow.

"That air passes through your upper airways, the larynx, and the trachea, and then down into your bronchial tree. I always thought of that branching network of airways as inverted sprigs of broccoli rather than trees. In terms of morphology, that's not far off. There's a hollow central trunk that sprouts branches of ever decreasing caliber, at the very end of which are saclike structures called alveoli: the buds, if you like, at the end of that sprig of broccoli. The cadaveric lung, formalin-soaked in the medical school's dissecting rooms, is solid and heavy; its airspaces are occupied by pungent preservative fluid. But in life, air-filled lungs are lighter than sponge, light enough to float on water. ...

"That fine structure exists to provide a massive surface area over which air can be brought into contact with blood. The alveoli, those tiny air sacs at the end of the bronchial tree, are each no more than a fraction of a millimeter in diameter, but each lung holds one and a half million. If you were to unfurl them and lay them out flat, they would form a mat of tissue half the size of a tennis court at Wimbledon. That vast area is required to bring enough air into contact with enough blood to keep you alive.

"Over the surfaces of those alveoli runs a spiderlike network of capillaries, vessels with walls a single cell thick, providing just enough structure to confine the blood cells squeezing through them, while offering the minimum obstruction to the molecules of oxygen diffusing through their walls.

"This is the most delicate interface in your body. Nowhere else is the point of contact between your body and the material from the outside world more insubstantial or delicate. That is why it is buried deep in your chest and protected with a formidable cage of ribs."
Extreme Medicine: How Exploration Transformed Medicine in the Twentieth Century
Author: Kevin Fong, M.D.
Publisher: The Penguin Press
Copyright 2012 by Kevin Fong, M.D.
Pages: 155-159

Doesn't get any better


When Books Read You, a Defense of Bibliomancy


by Ed Simon

Now, here’s something you might do if you see fit. Bring me the works of Virgil, and, opening them with your fingernail three times running, we’ll explore, by the verses whose numbers we agree on, the future lot of your marriage. For, as by Homeric lots a man has often come upon his destiny.
—Francois Rabelais, Gargantua and Pantagruel (1532)
It’s the fault of those physicists and that synchronicity theory, every particle being connected with every other; you can’t fart without changing the balance in the universe. It makes living a funny joke with nobody around to laugh. I open a book and get a report on future events that even God would like to file and forget. And who am I? The wrong person; I can tell you that.
—Philip K. Dick, The Man in the High Castle (1962)
Towards the end of 1642, or possibly the beginning of 1643, but either way in the midst of a miserable winter of civil war, King Charles I found himself an often uncomfortable refugee in Oxford’s Bodleian Library. At that archetypal university, the king and his confidant Lucius Cary, the Second Viscount Falkland, held a monotonous vigil. Unlike Cambridge, which lent its name to that other college town named in its honor by schismatic coreligionists across the ocean, and which would come to be known for its steadfast Puritanism and positivism, Oxford was all dreaming spires and medieval dragon chasing, a perfect shelter for a king making his last stand against the Roundheads who were closing in. A map of England was brushed with Parliamentarian red, with Oxford a dab of royalist purple, and there King Charles consulted his leather-bound volumes, reading by candlelight in cold rooms, hoping to find some perspective on his predicament. But the King did not consult Herodotus, or Thucydides, or Cicero, but rather Virgil. And while the founding of the Roman Republic may seem apt enough reading in such a situation, Charles and Falkland did not confer with Virgil’s ghost as political strategists, but rather as eager and credulous customers having their Tarot read. For Charles, the wisdom of Virgil was not literary, but magical.
Charles was many things, not least of which was a flip man, every bit as superficial as the dog breed which bears his name. And, as a stupid ruler, perhaps the stupidest thing he believed in was the literal divine right of kings. Despite his flippancy he was, however, a lover of books, and for that he deserves some modicum of our esteem; even if Milton was disgusted by Charles’ ignoring scripture in his parting speech upon the scaffold, opting instead for Pamela’s prayer from the Philip Sidney romance The Countess of Pembroke’s Old Arcadia. Even if he was an often juvenile and impetuous man, dissolving Parliament over perceived slights, or failing to account for the deep dissatisfaction among the growing middle class and the religious non-conformists of both London and wider England, he was still in some ways a brave man. Loyally standing by deeply unpopular figures like his Archbishop of Canterbury William Laud, Charles was the man who, while quoting a bit of pop culture upon the executioner’s stage, also requested an extra-heavy wool shirt, lest his shivering in the cold English morning be interpreted by his enemies as trepidation.
But that was still in the future, and here, in this library towards the end of 1642 (or again, possibly at the beginning of 1643), the king, whom we have already established was prone to literary flights and the palliative of diversionary trifles, asked Falkland if he knew of any games adequate to passing the time in the gloomy, dark, cold Bodleian. One Dr. Welwood, in his account of the event, reports that Falkland “show’d among other Books, a Virgil nobly printed and exquisitely bound. The Lord Falkland, to divert the King, would have his Majesty make a trial of his fortunes.” Both would use the Roman epic as an oracle; they would present their questions to Virgil and let the book function as a divination tool that would answer their inquiries, a trusted means of prophesying that, according to Welwood, “everybody knows was an usual kind of augury some ages past.” With the type of ardor which only a bibliomaniac can muster, Charles turned to a book during what he assumed was the height of his political, military, and personal misfortunes; and with an affinity for superstition matched only by his belief that his very divine touch could cure sickness, Charles asked Virgil what his fortune would be. Letting the cracked leather spine hit the dark wooden surface of some Oxford desk, and with his eyes closed, the King pointed to some random line of Latin on some random page of Virgil. He did not like the fortune which had been cast. So he asked Lord Falkland to try casting his own lot, hoping for a better result. Falkland opened the volume at book 11, lines 150-157, which recount the death of Evander’s son Pallas. The viscount would be dead by September of the following year (or, of course, September of that year if it was 1643), felled by a Roundhead’s bullet at the Battle of Newbury.
Fortuna’s wheel seems to have crushed Charles with an extra heaviness. John Aubrey records that a year before the King’s decapitation, Charles’ son, then living in exile in Paris, asked the metaphysical poet Abraham Cowley to divert his own sorrows, writing that his friend offered “if his Highnesse pleased they would use ‘Sortes Virgilianae,’” as the poet, of course, “alwaies had a Virgil in his pocket.” This time, instead of letting the book fall open, Cowley took a pin and pushed it into the soft pages of the Aeneid, the prick arriving at the proper prediction for the royal estate. Both father and son, as it turned out, arrived at the exact same line regarding the Stuart family fortunes. What of Charles’ lot, and that of the prince, which so distressed both of them? Book 4 of the Aeneid, line 615, which is Dido’s prayer against her former lover, reading: “Nor let him then enjoy supreme command; / But fall, untimely, by some hostile hand.” In 1649 Charles would stand upon the scaffold at Westminster, wearing his extra-heavy shirt and quoting his Sidney, awaiting the regicide’s blade on his neck. Virgil may guide everyone to the truth, but that doesn’t mean that the truth will always set one free.
What Charles and the prince were so inadequately diverting their troubled minds with was a variety of divination known as bibliomancy, the telling of the future with the aid of literature. Dr. Welwood was correct that the practice had an august history, across both east and west. It is one of many means of dubious divination, which range from the relatively well known, such as tasseography (reading tea leaves), to the more obscure and thankfully extinct, like haruspicy (interpreting the organs, often livers, of sacrificed animals – popular in ancient Rome). All such methodologies share a conception of meaning which is fundamentally different from the current dominant definition. While there is no shortage of people for whom astrologers, tarot, and palm readers offer some succor, it should be uncontroversial to note that the mainstream accepted definitions of “meaning” depart from the magical in favor of the observable, the empirical, and the measurable. Science, it should be said, is not just a different form of magic where the terminology has been altered, palm reading replaced with MRI machines and tea leaves with spectroscopy. No, the very model of what constitutes meaning is fundamentally different as well, because Charles I and the prince, Falkland and Abraham Cowley, understood meaning very differently from our current accepted norms. For them, the very world was pregnant with an enchanted, glowing significance. Weather indicated not just what storms might await in the future, but what the course of one’s life might be; anomalies and strange phenomena were to be read as harbingers of future events, as one might identify foreshadowing in a novel written with planning and intention. Nothing was divorced from a wider, if nebulous, meaning.
What the Stuart king was distracting himself with was a form of rhapsodomancy, which is the use of poetry to ascertain the future, and even more specifically the previously mentioned Sortes Virgilianae, an example of what Virgil himself might have called “this dark technology of magic.” Other popular varieties of rhapsodomancy included the Sortes Homericae, which utilized the two epic poems of Homer, and the Sortes Sanctorum which, contrary to a prohibition against magic in Deuteronomy, used the Bible for fortune-telling. As bibliomancy, and rhapsodomancy generally, connotes the use of literature for divination, the methodology of said reading could be accomplished in different ways. One could write an assortment of poetic lines on scraps of parchment or wood and draw them as lots, as was common with the use of those weird sisters and their Sibylline prophecies in ancient Rome. Or one could use some sort of “randomness generator,” such as a die, coins, or yarrow sticks, as are used with the ancient Taoist I Ching. Readers of Philip K. Dick’s science fiction classic The Man in the High Castle (or viewers of its Amazon television adaptation) will be familiar with how the character Tagomi consults the almost 3,000-year-old text for advice. Dick himself used the I Ching to generate the narrative of the novel, harnessing randomness in his writing, a method referred to as “aleatory composition.” This use of coins, and sticks, and dice can be contrasted with those methods in which the book is allowed to speak for itself (as it were), where a volume simply falls open so as to answer the questions posed to it, as with Charles at Oxford. This method relies on an innovation in information technology so ubiquitous that its radicalism may not be apparent: the codex.
Simply the technical term for what we call a “book,” the codex was first used as a means of Roman record keeping, and arguably only began to supplant the scroll as the main method of literary transmission with the writing of the gospels, and then shortly thereafter with the roughly simultaneous, if divergent, Jewish and Christian canonization of certain texts as officially scriptural. A codex, by its very architecture (circumscribed by cover and back), enshrines certain presuppositions about writing: it makes a text discrete and separate, it turns the book into an individual. And an individual, of course, can have a question posed to it, with an expected answer. Yet the book’s prehistory as a scroll endures in the malleability and interconnectedness which bibliomancy presumes, where literature endures as a type of electromagnetic force field which the augur can master. Perhaps modern technology will generate further innovations in the field of bibliomancy studies (as with the Twitter handle @BiblioOracle).
But regardless of the procedure, what makes bibliomancy fascinating is that unlike other forms of divination, it trades in something which already has an interpretable meaning – words. Perhaps a butcher can figure out the narrative that a sheep’s liver conveys, but that The Aeneid, as indeed all texts, has a meaning requires no suspension of disbelief, even if the meanings which are being derived seem far from authorial intention. What I find so interesting about bibliomancy is that it takes the written word, which again we all assent to as composing the very atoms of meaning, and it interprets those lines and sentences slant. Furthermore, it wrenches the very interpretive center of a given text away from the authority of the author who created it toward the service of the reader who consults the text as a pilgrim at Delphi. Bibliomancy is thus a radical form of reading, one in which readers themselves become figures in the text’s narrative, for to presume that The Aeneid (or Horace, or the Bible, or the I Ching) can predict our individual future, even obliquely, is to assume that we’re somehow encoded as characters in the text itself, like Moses reading of his own death in the Torah. It is a type of fantastic metafiction which breaks the cosmic fourth wall, finding our own fortunes hidden between the very letters of poems written millennia before we were born. Bibliomancy, in short, is one of the few opportunities in which the book is allowed to read you.
One could argue that such an approach cheapens literature, but I’d claim that far from being a reduction, such an approach rather widens interpretive possibilities, by making us partners in the creation of meaning. Note that in discussing an “approach” to bibliomancy, one need not embrace a literal belief that The Aeneid (or anything else) was actually predicting Charles’ moment upon the scaffold. Rather, an openness to bibliomancy is simply to reaffirm literature as a vast, interconnected, endlessly mercurial field of potential where meaning is created by readers across centuries. While the New Critics in the early part of the twentieth century may have preferred to view poems as dead butterflies mounted by pin and preserved on cardboard, the last three generations of literary theorists have been more amenable to the endlessly recursive play of literature. And yet for such a once profoundly popular practice, bibliomancy is under-analyzed and under-theorized. Under-used as well. I suggest that it’s time we take bibliomancy seriously, not because it’s literally an accurate means of ascertaining the future (though perhaps it’s a diverting trifle as you wait for Parliamentary troops to arrive), but because it resurrects an archaic yet valuable manner of approaching language which has been in eclipse since the birth of modernity. I’m not really concerned with whether bibliomancy literally works in casting our future (spoiler alert: I have my doubts); rather, I am interested in its potential both as totem, demonstrating the flexible interconnectedness of literature, and as exercise, circumnavigating deficiencies in our own thought by leaving some aspects of interpretation up to Fortuna.
As book critic Jessa Crispin writes in defending the creative possibility of Tarot, “the meaning of the cards comes from us.” That is fundamentally the case with any type of literary theurgy, whether it’s Tarot, automatic writing, or bibliomancy – they present a method of circumnavigating the conscious mind, drawing connections which might not be apparent to either logic or literalism. It is a venerable way of understanding how meaning is connected by that unseen, golden web of connotation that literary theorists, with their jargony talk of “intertextuality” and “heteroglossia,” only dimly approach. There is an infinite richness in how those in centuries past approached words (and by proxy the Word). The great scholar of Jewish mysticism Gershom Scholem described the process of connotative reading practiced by the thirteenth-century kabbalist Abraham Abulafia (who also, it should be noted, tried to convert the Pope to Judaism – see the dictionary entry for “chutzpah”). Scholem explains that Abulafia read through “‘jumping’ or ‘skipping.’” He elaborates that this is the “remarkable method of using associations as a way of meditation… Every ‘jump’ opens a new sphere… Within this sphere the mind may free associate. The ‘jumping’ unites, therefore, elements of free and guided association and is said to assure quite extraordinary results as far as the ‘widening of the consciousness.’” Bibliomancy can be used to generate new ideas, or, as Cicero explains in his work on divination, the truths that are generated “from mental excitement of some sort when the mind moves free and uncontrolled.” The brain is thus liberated by chance, fate, and fortune. Thus fortuna becomes a means of elucidating novel connections, while simple surface literalism remains largely mute. This is a type of reading where all interpretation is as bibliomancy, something which the early twentieth-century avant-gardes from the Surrealists to the Dadaists understood well.
For in recasting all books as spell-book, all literature as a potential grimoire, we accomplish not just clarification, but wonder as well.
Consider that the Christian gospel of John famously begins with a declaration that “In the beginning there was the Word,” and that Jewish kabbalah presumes the preexistence of the Hebrew alphabet before all else emanated forward in creation. Take the example of the Catalan Franciscan Ramon Llull, a thirteenth-century contemporary of Abulafia. Llull, a beatified not-quite-saint, is primarily remembered for his Ars Magna, or “Great Art,” an intricate, baroque, labyrinthine method of calculation supposedly based on pure deductive axioms and in large part inspired by the work of Islamic logicians, and which Llull believed was an engine capable of answering any philosophical question posed to it. A type of tony, engineered bibliomancy. Llull’s Ars Magna was manifested in an exceedingly odd book, whereby particular ideas had numeric assignations that could be combined in various permutations to arrive at novel conclusions (his work on combinatorics marks him as a potential “patron saint of computers”). Particularly unique in his manuscript were actual movable parts in the form of spinning paper wheels and gears, which could be manipulated in calculation. Central to Llull’s understanding was an orientation, similar to Abulafia’s, that the augur must “put the alphabet in this art in order to be able to make figures with it… for seeking the truth.” This mystical Franciscan saw the alphabet as the very ground of being upon which reality was constructed, and thus approaching literature was a means of ascertaining reality. That exact same view would be promulgated by an equivalently enigmatic figure some centuries later, when in the late sixteenth century Elizabeth’s court astrologer John Dee made a name for himself on the continent as an initiate in occult and mystical arts. As with Abulafia and Llull, the alphabet (whether Hebrew, Latin, or the astral “Enochian” one of his own discovery) was that with which divinity generated existence.
Dee explains his views concerning the hieroglyphic arts in a letter to that most exalted of esoteric monarchs, Maximilian II of the Holy Roman Empire, whose court featured all manner of wonders, from dancing dwarves and prophets to the astronomer Johannes Kepler and the golem’s father Rabbi Judah Loew. Dee explained to the king that “writings on the alphabet contain great mysteries… since He who is the only author of all mysteries has compared himself to the first letter and the last.” For Abulafia, Llull, and Dee the universe was born not from a primordial atom, but from a primordial “A”; not from the breath of God, but from his command, from the first Hebrew letter “Aleph.” This is not as ridiculous as it first sounds – that the alphabet both underlies and constructs reality. Consider that the answer to any question you would ever have on any subject, the composition of any narrative you could ever conceive, from the most debased to the most beautifully exalted, the correct description of any event that did or could have happened, and the accurate prophecy for anything that will or might happen, already exists in print, albeit in disorganized form. All is answered in the very words of whatever dictionary is closest to you. The task of the writer, prophet, and bibliomancer is simply to put those words in the right order.
A pre-modern idea whose time may yet come again? Historian Stephan Gerson observes that “Like formulas and incantations, words had obtained therapeutic or magical powers… They embodied hidden verities and divine ideas and the essence of things and people. By the Renaissance, they brimmed with meaning and could modify the natural world and sometimes transcend the symbolic realm to become analogies of the cosmos.” Bibliomancy returns us to an almost mystical embrace of the transcendent ways in which interpretations permeate a reality pregnant with meaning. This view of how language operates – admittedly theological – is one that we must embrace if we’re to resuscitate what is instrumental about literature. And it’s a perspective which is the historical norm, across cultures and religions, only exorcised from our normal discourse upon the disenchantments of the world. Prosaic models of comprehending meaning understand it as a static kernel, but bibliomantic reading sees the text itself as sentient, as somehow conscious. Books are thus beings unto themselves, capable of defining their own role in the world; minds with agency, capable of answering questions posed to them, of responding to queries beyond what’s literally printed on their pages. Scholar Michael Wood says as much in his cultural study of oracles when he writes that “The point is to ask us to think of the world as haunted by the divine, and to see how the divinity can talk to us through the world,” or as Walt Whitman put it in Leaves of Grass, “I find letters from God dropped in the street, and every one is signed by God’s name.” When we think of meaning as diffuse and capable of actually conversing back with us, what results is an asynchronous theory of influence across time, evoking an observation I once heard a scholar of medieval literature make: that “One of the best interpreters of T.S. Eliot was Dante.” Bibliomancy abolishes the tyranny of time and the authoritarianism of age; books are liberated from being stationary and are transformed into full partners with whom we can converse. Whether or not we use novels and poetry to cast lots, bibliomancy is a reminder that literature itself is a breathing thing, and an ever mercurial one at that.
Both individual works of literature and all of collective Literature compose the primordial alphabet, in which all answers to all questions lie dormant, occasionally becoming loud enough that we can hear them – and bibliomancy reminds us of that majesty. The French poet Max Jacob, reflecting on the writings of Nostradamus, once claimed that “the Prophecies contain the universe,” but this is only partially correct – all language contains the universe, all dictionaries encapsulate reality, all literature holds the cosmos. Bibliomancy teaches that none of us are passive readers, for we are all characters within literature itself, and that no book is an island, for all that has been written is like a giant continent crossed with highways, and every book, every story, every poem, every line is not just permeated with meaning, but is conscious of it as well. Bibliomancy demonstrates that we’ve always been alive, hidden between the very words on the page. Even if it is superstition, it is also an exercise in literature’s imaginative potential, where British kings can be hidden in Virgil, and where we all may live in Whitman, or Emily Dickinson, or whatever dog-eared volume we treasure, whose cracked spine points fate towards those particular passages which illuminate our own fortunes, where we reside before we’ve even breathed, where we live before we were even born. Whatever we call it, whether the alphabet of Abulafia and Llull, or what Leibniz called the Monad, or what Jorge Luis Borges termed the Aleph, it is that generative letter which shows us everything, from “the coupling of love and the modification of death,” where one can see “the earth and in the earth the Aleph and in the Aleph the earth” and where the Argentinean writer could see “your face; and I felt dizzy and wept, for my eyes had seen that secret and conjectured object whose name is common to all men but which no man has looked upon – the unimaginable universe.”
Literature is the unimaginable and the imagined universe, and books are the oracles which allow us to look upon that kingdom; bibliomancy the liturgy that reminds us of that sacred truth. Bibliomancy, as Scholem might say, “liberates us from the prison of the natural sphere and leads us to the boundaries of the divine sphere.” And what then is life itself, except a type of performative aleatory literature? A related observation: literary critics are the last theists. For those who spend any time amidst the vagaries of plot, whether critic or novelist, often read the story of our lives as if it followed the dictates of narratology (or at least I find myself doing that), as if some cosmic Author penned that tale. That’s not to imply that what the royals holed up in the Bodleian were doing was anything like literary criticism, and yet any reading of literature must defer to the strange glow of meaning which accrues on poetic language. We’re meaning making creatures, and more than that we’re story-telling animals as the critic Jonathan Gottschall calls us, with brains evolved to find patterns, seeing faces in the pareidolia of mountains and clouds. We just as often interpret the texture of narrative in sequences of unrelated events. What is fascinating about bibliomancy, or any of the related methods of divination from literature, is that rather than looking at the green of a sheep’s liver or the random distribution of some chamomile grounds in a saucer, the initiate is interpreting something which unequivocally already has intentional meaning – but it births it anew. Bibliomancy returns us to the charged potential and innate weirdness of literature, of fiction and poetry, of the sacred origins of language itself. And so I ask myself, what is the critical potential of such a practice? And I pose that question to a dog-eared copy of Whitman’s Leaves of Grass and the book answers me “They are but parts…. 
any thing is but a part,” so as to confirm that all poetry is but a side on that infinite Monad, but a single letter in that eternal alphabet. And I ask myself, what hope can bibliomancy offer us in our own epoch of uncertainty, our anxious age, and I in turn deliver that query to The Complete Poems of Emily Dickinson and she, cryptically as any at Delphi or among the Sibyllines, answers with another interrogative “Who has Paradise” with no question mark, but only her eternally available em dash – signaling that all of literature is but a conversation that can never end, paradise always visible and yet never reached.

About the Author:
Ed Simon is the Editor-at-Large for The Marginalia Review of Books, a channel of The Los Angeles Review of Books. A regular contributor at several different sites, his collection America and Other Fictions: On Radical Faith and Post-Religion will be released by Zero Books this year. He can be followed on Facebook, at his author website, and on Twitter @WithEdSimon.


Steve and Cokie Roberts

Can Donald Trump be stopped short of the Republican nomination? Probably not. But the rational wing of the party has to try — quickly and forcefully — to make that happen.
The stakes are far too high for the rationalists to stay on the sidelines, and their first motive should be political self-interest.
“I think Trump would be a disaster,” says strategist Stuart Stevens, speaking for many pragmatic Republicans. With The Donald heading the ticket, pragmatists fear, the GOP could lose not only the White House but also the Senate.
But there is a deeper reason, beyond partisanship, to stop Trump: He is one of the least qualified candidates ever to make a serious run for the presidency. If he is nominated by a major party — let alone elected — the reputation of the United States would suffer a devastating blow around the world.
He wouldn’t “make America great again.” He would make America weak again. He wouldn’t increase our power and influence; he would degrade it. That’s why the national interest requires a maximum effort to thwart Trump now. And while the task will be difficult, it isn’t impossible.
Trump’s performance has been remarkably consistent. He received 32.5 percent of the vote in the South Carolina primary (but 45 percent in Nevada). According to Real Clear Politics, his average in national polls is 34.2 percent.
In a fragmented field, that’s been enough to win. But what happens when the field narrows? Sure, Trump will attract some voters who supported other contenders, but he has been a prominent national figure for many years. People have had plenty of time to consider his credentials, and two-thirds of Republicans have consistently rejected him.
Look at South Carolina. Late deciders went heavily for other candidates; most Trump backers made up their minds months ago. That indicates he has both a strong following and a significant ceiling. So how can his opponents solidify the anti-Trump vote?
Clearly they don’t have many weapons. They can’t cut off campaign funds, because he is mainly spending his own money. And through his skillful use of TV and social media, Trump has effectively created his own platform, The Trump Network, to reach his followers directly and establish a campaign organization in cyberspace.
Marco Rubio is already attracting many donors and party leaders who had backed Jeb Bush, but those Bushies have painfully demonstrated that their checks and endorsements have limited value this year.
That leaves only one real way to block Trump: Convince enough voters that he would be a dangerous choice, both for the party and the country. One-third of the GOP seems immune to those arguments, but that leaves plenty of other targets, and here are three lines of argument that might work.
One: Trump can’t win in November. Only 15 percent of South Carolina Republicans cited electability as their main motive, but of those voters, 4 out of 5 chose someone other than Trump.
In a recent AP poll, 60 percent of all registered voters expressed an unfavorable view of Trump and 54 percent said they definitely wouldn’t vote for him. That is hardly a great way to start a campaign, with a majority dead-set against you.
The death of Justice Antonin Scalia gives the anti-Trumpists added ammunition. Elections have consequences. Almost certainly, the first job of the new president will be to pick Scalia’s replacement, which will affect the court’s balance for a generation.
Two: Trump’s policies are deeply flawed. Some are totally unworkable and profoundly cynical, such as throwing out undocumented immigrants.
Others are truly damaging, such as igniting a trade war with China or barring Muslims from entering the country. That would hand ISIS a public relations coup and alienate the very allies we need to fight the jihadists.
Three: Trump lacks the character and temperament to be president. Princeton political scientist Fred I. Greenstein studied the last 12 presidents, and concluded that “emotional intelligence” was the most important quality in determining their success.
“Beware the presidential contender who lacks emotional intelligence,” he wrote. “In its absence, all else may turn to ashes.”
Americans want and need a president who will keep them safe and secure, who will meet a crisis with calm judgment and clear vision. Nothing in Trump’s background — absolutely nothing — remotely qualifies him to be that kind of leader. And his late-night tweets, often unfair and unhinged, only aggravate concerns about his stability.
That’s why Republicans of good will and good sense must try to stop him.

Two for One

Ash Wednesday | St Valentine’s Day

by Digitalnun on February 14, 2018
I was taken to task this morning for not mentioning St Valentine in my first tweet of the day, which is always a prayer tweet. I daresay many will be celebrating him, or rather the popular romantic parody of him we have in the West, but 1.2 billion Catholics and millions of Reformed and Protestant Christians will be keeping today as a holy fast in honour of the Lord. We shall be doing our best to look cheerful, and many will be wearing a smudge of ashes on their foreheads as a reminder that we were created from dust and to dust we shall return.
With Ash Wednesday comes a wonderful freedom. Whatever we have decided to ‘do’ for Lent, we do with the joy of the Holy Spirit (RB 49.6). We are indeed ‘looking forward to holy Easter with joy and spiritual longing,’ as St Benedict says (RB 49.7). The particularities of our penances melt into insignificance beside the fact that the Lord has invited us to make a Lenten journey with him and to him. He has spoken to us the words of the prophet Hosea, ‘I will lead her into the wilderness, and there I will speak to her heart.’ All he desires is our love.
I was thinking about those words of Hosea and realised that, without being soppy or sentimental, the gift of Lent can be seen as a kind of Valentine from the Lord in which he reaffirms his infinite love for us, and we try to respond as fully as we can. We know that parts of Lent will be hard, that the penances the Lord sends us will be much more demanding than anything we have taken on ourselves, but we have faith and hope that the journey will lead us closer to him. So, be of good cheer. Ash Wednesday gives us a fresh start and the assurance that the Lord will never abandon us. Let us set out boldly in his footsteps.
If you wish to know more about Lent and some of its practices, you may find this link useful: http://www.benedictinenuns.org.uk/Additions/Additions/lent.html 

The Two Cultures Redux

The Intellectual War on Science

It’s wreaking havoc in universities and jeopardizing the progress of research
The waging of a "war on science" by right-wing know-nothings has become part of the conventional wisdom of the intelligentsia. Even some Republican stalwarts have come to disparage the GOP as "the party of stupid." Republican legislators have engaged in spectacles of inanity, such as when Sen. James Inhofe, chair of the Committee on Environment and Public Works, brought a snowball to the Senate floor in 2015 to dispute the fact of global warming, and when Rep. Lamar Smith, chair of the House Committee on Science, Space, and Technology, pulled quotes out of context from peer-reviewed grants of the National Science Foundation so he could mock them (for example, "How does the federal government justify spending over $220,000 to study animal photos in National Geographic?").
Yet a contempt for science is neither new, lowbrow, nor confined to the political right. In his famous 1959 lecture "The Two Cultures and the Scientific Revolution," C.P. Snow commented on the disdain for science among educated Britons and called for a greater integration of science into intellectual life. In response to this overture, the literary critic F.R. Leavis wrote a rebuttal in 1962 that was so vituperative The Spectator had to ask Snow to promise not to sue for libel if they published the work.
The highbrow war on science continues to this day, with flak not just from fossil-fuel-funded politicians and religious fundamentalists but also from our most adored intellectuals and in our most august institutions of higher learning. Magazines that are ostensibly dedicated to ideas confine themselves to those arising in politics and the arts, with scant attention to new ideas emerging from science, with the exception of politicized issues like climate change (and regular attacks on a sin called "scientism"). Just as pernicious is the treatment of science in the liberal-arts curricula of many universities. Students can graduate with only a trifling exposure to science, and what they do learn is often designed to poison them against it.
The most frequently assigned book on science in universities (aside from a popular biology textbook) is Thomas Kuhn’s The Structure of Scientific Revolutions. That 1962 classic is commonly interpreted as showing that science does not converge on the truth but merely busies itself with solving puzzles before lurching to some new paradigm that renders its previous theories obsolete; indeed, unintelligible. Though Kuhn himself disavowed that nihilist interpretation, it has become the conventional wisdom among many intellectuals. A critic from a major magazine once explained to me that the art world no longer considers whether works of art are "beautiful" for the same reason that scientists no longer consider whether theories are "true." He seemed genuinely surprised when I corrected him.
The historian of science David Wootton has remarked on the mores of his own field: "In the years since Snow’s lecture the two-cultures problem has deepened; history of science, far from serving as a bridge between the arts and sciences, nowadays offers the scientists a picture of themselves that most of them cannot recognize." That is because many historians of science consider it naïve to treat science as the pursuit of true explanations of the world. The result is like a report of a basketball game by a dance critic who is not allowed to say that the players are trying to throw the ball through the hoop.
Many scholars in "science studies" devote their careers to recondite analyses of how the whole institution is just a pretext for oppression. An example is a 2016 article on the world’s most pressing challenge, titled "Glaciers, Gender, and Science: A Feminist Glaciology Framework for Global Environmental Change Research," which sought to generate a "robust analysis of gender, power, and epistemologies in dynamic social-ecological systems, thereby leading to more just and equitable science and human-ice interactions."
More insidious than the ferreting out of ever more cryptic forms of racism and sexism is a demonization campaign that impugns science (together with the rest of the Enlightenment) for crimes that are as old as civilization, including racism, slavery, conquest, and genocide.
This was a major theme of the Critical Theory of the Frankfurt School, the quasi-Marxist movement originated by Theodor Adorno and Max Horkheimer, who proclaimed that "the fully enlightened earth radiates disaster triumphant." It also figures in the works of postmodernist theorists such as Michel Foucault, who argued that the Holocaust was the inevitable culmination of a "bio-politics" that began with the Enlightenment, when science and rational governance exerted increasing power over people’s lives. In a similar vein, the sociologist Zygmunt Bauman blamed the Holocaust on the Enlightenment ideal to "remake the society, force it to conform to an overall, scientifically conceived plan."
In this twisted narrative, the Nazis themselves are somehow let off the hook ("It’s modernity’s fault!"). Though Critical Theory and postmodernism avoid "scientistic" methods such as quantification and systematic chronology, the facts suggest that they have the history backwards. Genocide and autocracy were ubiquitous in premodern times, and they decreased, not increased, as science and liberal Enlightenment values became increasingly influential after World War II.
To be sure, science has often been pressed into the support of deplorable political movements. It is essential, of course, to understand that history, and legitimate to pass judgment on scientists, just like any historical figures, for their roles in it. Yet the qualities that we prize in humanities scholars — context, nuance, historical depth — often leave them when the opportunity arises to prosecute a campaign against their academic rivals. Science is commonly blamed for intellectual movements that had a pseudoscientific patina, though the historical roots of those movements ran deep and wide.
"Scientific racism," the theory that races fall into a hierarchy of mental sophistication with Northern Europeans at the top, is a prime example. It was popular in the decades flanking the turn of the 20th century, apparently supported by craniometry and mental testing, before being discredited in the middle of the 20th century by better science and by the horrors of Nazism. Yet to pin ideological racism on science, in particular on the theory of evolution, is bad intellectual history. Racist beliefs have been omnipresent across history and regions of the world. Slavery has been practiced by every major civilization and was commonly rationalized by the belief that enslaved peoples were inherently suited to servitude, often by God’s design. Statements from ancient Greek and medieval Arab writers about the biological inferiority of Africans would curdle your blood, and Cicero’s opinion of Britons was not much more charitable.

More to the point, the intellectualized racism that infected the West in the 19th century was the brainchild not of science but of the humanities: history, philology, classics, and mythology. In 1853, Arthur de Gobineau, a fiction writer and amateur historian, published his cockamamie theory that a race of virile white men, the Aryans, spilled out of an ancient homeland and spread a heroic warrior civilization across Eurasia, diverging into the Persians, Hittites, Homeric Greeks, and Vedic Hindus, and later into the Vikings, Goths, and other Germanic tribes. (The speck of reality in this story is that these tribes spoke languages that fell into a single family, Indo-European.) Everything went downhill when the Aryans interbred with inferior conquered peoples, diluting their greatness and causing them to degenerate into the effete, decadent, soulless, bourgeois, commercial cultures that the Romantics were always whingeing about. It was a small step to fuse this fairy tale with German Romantic nationalism and anti-Semitism: The Teutonic Volk were the heirs of the Aryans, the Jews a mongrel race of Asiatics. Gobineau’s ideas were eaten up by Richard Wagner (whose operas were held to be re-creations of the original Aryan myths) and by Wagner’s son-in-law Houston Stewart Chamberlain (a philosopher who wrote that Jews polluted Teutonic civilization with capitalism, liberal humanism, and sterile science). From them the ideas reached Hitler, who called Chamberlain his "spiritual father."
Science played little role in this chain of influence. Pointedly, Gobineau, Chamberlain, and Hitler rejected Darwin’s theory of evolution, particularly the idea that all humans had gradually evolved from apes, which was incompatible with their Romantic theory of race and with the older folk and religious notions from which it had emerged. According to these widespread beliefs, races were separate species; they were fitted to civilizations with different levels of sophistication; and they would degenerate if they mixed. Darwin argued that humans are closely related members of a single species with a common ancestry, that all peoples have "savage" origins, that the mental capacities of all races are virtually the same, and that the races blend into one another with no harm from interbreeding. The University of Chicago historian Robert Richards, who traced Hitler’s influences, ended his book titled Was Hitler a Darwinian? (a common claim among creationists) with "The only reasonable answer to the question ... is a very loud and unequivocal No."
I mention the limited role of science in so-called scientific racism not to absolve the scientists (many of whom were indeed active or complicit) but because the movement deserves a deeper and more contextualized understanding than its current role as anti-science propaganda. Misunderstandings of Darwin gave scientific racism a boost, but it sprang from the religious, artistic, intellectual, and political beliefs of its era. If we think scientific racism is not just unfashionable but mistaken, it is because of the better historical and scientific understanding we enjoy today.
Recriminations over the nature of science are by no means a relic of the "science wars" of the 1980s and 1990s — when scientists and humanities scholars clashed over the nature of scientific truth — but continue to shape the role of science in universities. When Harvard reformed its general-education requirement in 2006-7, the preliminary report of the task force introduced the teaching of science without any mention of its place in human knowledge: "Science and technology directly affect our students in many ways, both positive and negative: they have led to life-saving medicines, the internet, more efficient energy storage, and digital entertainment; they also have shepherded nuclear weapons, biological warfare agents, electronic eavesdropping, and damage to the environment."
Well, yes, and I suppose one could say that architecture has produced both museums and gas chambers, and that classical music both stimulates economic activity and inspired the Nazis. But this strange equivocation between the utilitarian and the nefarious was not applied to other disciplines, and the statement gave no indication that we might have good reasons to prefer understanding and know-how to ignorance and superstition.
Does the demonization of science in the liberal arts matter? It does, for a number of reasons. Though many talented students hurtle along pre-med or engineering tracks from the day they set foot on campus, many others are unsure of what they want to do with their lives and take their cues from professors and advisers. What happens to those who are taught that science is just another narrative like religion and myth, that it lurches from revolution to revolution without making progress, and that it is a rationalization of racism, sexism, and genocide? I’ve seen the answer: Some of them figure, "If that’s what science is, I might as well make money!" Four years later, their brainpower is applied to thinking up algorithms that allow hedge funds to act on financial information a few milliseconds faster, rather than to finding new treatments for Alzheimer’s disease or technologies for carbon capture and storage.
The stigmatization of science is also jeopardizing the progress of science itself. Today anyone who wants to do research on human beings, even an interview on political opinions or a questionnaire about irregular verbs, must prove to a committee that he or she is not Josef Mengele. Though research subjects obviously must be protected from exploitation and harm, the institutional-review bureaucracy has swollen far beyond this mission. Its critics have pointed out that it has become a menace to free speech, a weapon that fanatics can use to shut up people whose opinions they don’t like, and a red-tape dispenser that bogs down research while failing to protect, and sometimes harming, patients and research subjects. Jonathan Moss, a medical researcher who had developed a new class of drugs and was drafted into chairing the research-review board at the University of Chicago, said in a convocation address, "I ask you to consider three medical miracles we take for granted: X-rays, cardiac catheterization, and general anesthesia. I contend all three would be stillborn if we tried to deliver them in 2005." The same observation has been made about insulin, burn treatments, and other lifesavers.
The hobbling of research is not just a symptom of bureaucratic mission creep. It is actually rationalized by many bioethicists. These theoreticians think up reasons that informed and consenting adults should be forbidden to take part in treatments that help them and others while harming no one. They use nebulous rubrics like "dignity," "sacredness," and "social justice." They try to sow panic about advances in biomedical research with far-fetched analogies to nuclear weapons and Nazi atrocities, science-fiction dystopias like Brave New World and Gattaca, and freak-show scenarios like armies of cloned Hitlers, people selling their eyeballs on eBay, and warehouses of zombies to supply people with spare organs. The University of Oxford philosopher Julian Savulescu has exposed the low standards of reasoning behind these arguments and has pointed out why "bioethical" obstructionism can be unethical: "To delay by 1 year the development of a treatment that cures a lethal disease that kills 100,000 people per year is to be responsible for the deaths of those 100,000 people, even if you never see them."
Ultimately the greatest payoff of instilling an appreciation of science is for everyone to think more scientifically. Cognitive psychologists have shown that humans are vulnerable to crippling biases and fallacies. Movements that aim to work around those biases and to spread scientific sophistication — data journalism, Bayesian forecasting, evidence-based medicine and policy, real-time violence monitoring, effective altruism — have a vast potential to enhance human welfare. But an appreciation of their value has been slow to penetrate the culture.
I asked my doctor whether the nutritional supplement he had recommended for my knee pain would really be effective. He replied, "Some of my patients say it works for them." A business-school colleague shared this assessment of the corporate world: "I have observed many smart people who have little idea of how to logically think through a problem, who infer causation from a correlation, and who use anecdotes as evidence far beyond the predictability warranted." A colleague who uses quantitative tools to study war, peace, and human security describes the United Nations as an "evidence-free zone":
The higher reaches of the UN are not unlike anti-science humanities programs. Most people at the top are lawyers and liberal-arts graduates. The only parts of the Secretariat that have anything resembling a research culture have little prestige or influence. Few of the top officials in the UN understood qualifying statements as basic as "on average" and "other things being equal." So if we were talking about risk probabilities for conflict onsets, you could be sure that Sir Archibald Prendergast III or some other luminary would offer a dismissive, "It’s not like that in Burkina Faso."
Resisters to scientific thinking often object that some things just can’t be quantified. Yet unless they are willing to speak only of issues that are black or white and to forswear using the words more, less, better, and worse (and, for that matter, the suffix -er), they are making claims that are inherently quantitative. If they veto the possibility of putting numbers to those claims, they are saying, "Trust my intuition." But if there’s one thing we know about cognition, it’s that people (including experts) are arrogantly overconfident about their intuition.
In 1954, Paul Meehl stunned his fellow psychologists by showing that simple actuarial formulas outperform expert judgment in predicting psychiatric classifications, suicide attempts, school and job performance, lies, crime, medical diagnoses, and pretty much any other outcome in which accuracy can be judged at all. His conclusion about the superiority of statistical to intuitive judgment is now recognized as one of the most robust findings in the history of psychology.
Data, of course, cannot solve problems by themselves. All the money in the world could not pay for randomized controlled trials to settle every question that occurs to us. Human beings will always be in the loop to decide which data to gather and how to analyze and interpret them. The first attempts to quantify a concept are always crude, and even the best ones allow probabilistic rather than perfect understanding. Nonetheless, social scientists have laid out criteria for evaluating and improving measurements, and the critical comparison is not whether a measure is perfect but whether it is better than the judgment of an expert, critic, interviewer, clinician, judge, or maven. That turns out to be a low bar.
Because the cultures of politics and journalism are largely innocent of the scientific mind-set, questions with major consequences for life and death are answered by methods that we know lead to error, such as anecdotes, headlines, rhetoric, and what engineers call HiPPO (highest-paid person’s opinion). Many dangerous misconceptions arise from this statistical obtuseness. People think that crime and war are spinning out of control, though homicides and battle deaths are going down, not up. They think that Islamist terrorism is a major risk to life and limb, though the danger is less than that from wasps and bees. They think that ISIS threatens the existence or survival of the United States, though terrorist movements rarely achieve any of their strategic aims.
The dataphobic mind-set ("It’s not like that in Burkina Faso") can lead to real tragedy. Many political commentators can recall a failure of peacekeeping forces (such as in Bosnia in 1995) and conclude that they are a waste of money and manpower. But when a peacekeeping force is successful, nothing photogenic happens, and it fails to make the news. In her book Does Peacekeeping Work? (Princeton University Press, 2008), the Columbia University political scientist Virginia Page Fortna addressed the question in her title with the methods of science rather than headlines, and found that the answer is "a clear and resounding yes." Knowing the results of these analyses could make the difference between an international organization’s helping to bring peace to a country and letting it fester in civil war.
Take another life-or-death political question. Do campaigns of nonviolent resistance work? Many people believe that Gandhi and King just got lucky: Their movements tugged at the heartstrings of enlightened democracies at opportune moments, but everywhere else, oppressed people need violence to get out from under a dictator’s boot. The political scientists Erica Chenoweth and Maria J. Stephan assembled a data set of political-resistance movements across the world between 1900 and 2006 and discovered that three-quarters of the nonviolent resistance movements succeeded, compared with only a third of the violent ones. Gandhi and King were right, but without data, you would never know it.
Though the urge to join a violent insurgent or terrorist group may owe more to male bonding than to just-war theory, most of the combatants probably believe that if they want to bring about a better world, they have no choice but to kill people. Would anything change if everyone knew that violent strategies were not just immoral but ineffectual? It’s not that I think we should airdrop crates of Chenoweth and Stephan’s book into conflict zones. But leaders of radical groups are often highly educated, and even the cannon fodder often have had some college and absorb the conventional wisdom about the need for revolutionary violence. What would happen over the long run if a standard college curriculum devoted less attention to the writings of Karl Marx and Frantz Fanon and more to quantitative analyses of political violence?
One of the greatest potential contributions of modern science may be a deeper integration with the humanities. By all accounts, the humanities are in trouble. University programs are downsizing; the next generation of scholars is un- or underemployed; morale is sinking; students are staying away.
No thinking person should be indifferent to our society’s disinvestment in the humanities. A society without historical scholarship is like a person without memory: deluded, confused, easily exploited. Philosophy grows out of the recognition that clarity and logic don’t come easily to us, and that we’re better off when our thinking is refined and deepened. The arts are one of the things that make life worth living, enriching human experience with beauty and insight. Criticism is itself an art that magnifies the appreciation and enjoyment of great works. Knowledge in these domains is hard won and needs constant enriching and updating as the times change.
Diagnoses of the malaise of the humanities rightly point to anti-intellectual trends in our culture and to the commercialization of universities. But an honest appraisal would have to acknowledge that some of the damage is self-inflicted. The humanities have yet to recover from the disaster of postmodernism, with its defiant obscurantism, self-refuting relativism, and suffocating political correctness. Many of its luminaries — Nietzsche, Heidegger, Foucault, Lacan, Derrida, the Critical Theorists — are morose cultural pessimists who declare that modernity is odious, all statements are paradoxical, works of art are tools of oppression, liberal democracy is the same as fascism, and Western civilization is circling the drain.
With such a cheery view of the world, it’s not surprising that the humanities often have trouble defining a progressive agenda for their own enterprise. Several college presidents and provosts have lamented to me that when a scientist comes into their office, it’s to announce some exciting new research opportunity and demand the resources to pursue it. When a humanities scholar drops by, it’s to plead for respect for the way things have always been done.
To be sure, there is no replacement for the close reading, thick description, and deep immersion that erudite scholars can apply to individual works. But must these be the only paths to understanding? A consilience with science offers the humanities many possibilities for new insight. Art, culture, and society are products of human brains. They originate in our faculties of perception, thought, and emotion, and they accumulate and spread through the epidemiological dynamics by which one person affects others. Shouldn’t we be curious to understand these connections by tearing down academic silos and mining the sciences for insights about human nature that could illuminate culture and society? Both sides would win. The humanities would enjoy more of the explanatory depth of the sciences, as well as a forward-looking agenda that could attract ambitious young talent (not to mention appeal to deans and donors). The sciences could challenge their theories with the natural experiments and ecologically valid phenomena that have been so richly characterized by humanities scholars.
In some fields, this consilience is a fait accompli. Archaeology has grown from a branch of art history to a high-tech science. The philosophy of mind shades into mathematical logic, computer science, cognitive science, and neuroscience. Linguistics combines philological scholarship on the history of words and grammatical constructions with laboratory studies of speech, mathematical models of grammar, and the computerized analysis of large corpora of writing and conversation.
Comparable opportunities beckon in political theory, the visual arts, musicology, and literature, deepening John Dryden’s insight that a work of fiction is "a just and lively image of human nature, representing its passions and humours, and the changes of fortune to which it is subject, for the delight and instruction of mankind." And though many concerns in the humanities are best appreciated with traditional narrative criticism, some raise empirical questions that can be informed by data. The advent of data science applied to books, periodicals, correspondence, and musical scores has inaugurated the digital humanities, whose potential is limited only by the imagination.
The promise of a unification of knowledge can be fulfilled only if knowledge flows in all directions. Some of the scholars who have recoiled from scientists’ forays into explaining art are correct that these explanations have been, by their standards, shallow and simplistic. All the more reason for them to reach out and combine their erudition about individual works and genres with scientific insight into human emotions and aesthetic responses. Better still, universities could train a new generation of scholars who are fluent in each of the two cultures.
Although in my experience many artists and humanities scholars are receptive to insights from science, the policemen of highbrow culture proclaim that they may not indulge such curiosity. In a dismissive review in The New Yorker of a book by the literary scholar Jonathan Gottschall on the evolution of the narrative instinct, Adam Gopnik writes, "The interesting questions about stories ... are not about what makes a taste for them ‘universal,’ but what makes the good ones so different from the dull ones. ... This is a case, as with women’s fashion, where the subtle, ‘surface’ differences are actually the whole of the subject." But in appreciating literature, must connoisseurship really be the whole of the subject? An inquisitive spirit might also be curious about the recurring ways in which minds separated by culture and era deal with the timeless conundrums of human existence.
In 1782, Thomas Paine extolled the cosmopolitan virtues of science:
Science, the partisan of no country, but the beneficent patroness of all, has liberally opened a temple where all may meet. Her influence on the mind, like the sun on the chilled earth, has long been preparing it for higher cultivation and further improvement. The philosopher of one country sees not an enemy in the philosopher of another: he takes his seat in the temple of science, and asks not who sits beside him.
What he wrote about the physical landscape applies as well to the landscape of knowledge. In this and other ways, the spirit of science is the spirit of the Enlightenment.
Steven Pinker is a professor of psychology at Harvard University, and author, most recently, of Enlightenment Now: The Case for Reason, Science, Humanism, and Progress (Viking), from which this essay is adapted.