About Me

New Orleans, Louisiana, United States
Admire John McPhee, Bill Bryson, David Remnick, Thomas Merton, Richard Rohr and James Martin (and most open and curious minds)

31.12.08

BRITON of the YEAR

Can you imagine a U.S. newspaper's ever naming a museum director as its American of the Year?



The London Times has just designated the British Museum's director, Neil MacGregor, as its "Briton of the Year." Barack Obama was the Times' Person of the Year. This is the first time that the Times has bestowed such honors.

Rachel Campbell-Johnston, chief art critic of the Times, lauded MacGregor as "a man who had managed---by what often felt like charm and enthusiasm alone---to turn a financial basket case back into a cultural jewel....By emphasising the importance of an international community of inquiry..., he has helped to create a global society that is quite separate from others that constantly get caught up in the squabblings of government and politics."

All true. But the Greeks, Egyptians, and Nigerians, "squabbling" (respectively) over the Elgin Marbles, Rosetta Stone and Benin bronzes, might beg to differ.

The Times profile provides some interesting personal details:


He appears to have little interest in the trappings of his position. He is the first director not to live in a grand apartment on the premises of the British Museum. He owns few paintings---he was spoilt by his time at the National Gallery, he jokes. He turned down a knighthood, though he does not discuss it. And he appears to find his satisfaction and reward in the simple fulfilment of a civic duty.

As for Obama, the Times declared him to be "unusual among modern presidents in coming from the Northern urban Left." I guess if you string together enough modifiers, anyone can be unusual. The key thing about Obama, though, from the American perspective, is that he "comes from" a wide variety of places.

2008 in Great Pictures

Check these photographs from the GUARDIAN for 2008.

2009

May the New Year bring blessings, prosperity, good health, peace and contentment to all.

30.12.08

2008 (It's All Here)

from REFDESK

AP: Top Ten Stories 2008
Amazon: Best Books of 2008
American Library Association: Best Books for Young Adults 2008
Boston.com: Year in Photographs 2008
Business Week: Best of the Web 2008
CNN: The Year in Review 2008
Chicago Tribune: Olympics Opening Spectacle 2008
Discover Magazine: Top 100 Stories of 2008
Ebert, Roger: Best Films 2008
Google: Year-end Zeitgeist for 2008
Infoplease: Key News Events of 2008
Lifehacker: Most Popular Free Windows Downloads of 2008
Lifehacker: Most Popular How-To Features of 2008
MSN: Year in Review 2008
MSNBC: Top Baby Names for 2008
Media Research Center: The Best Notable Quotables of 2008
Merriam Webster: Words of the Year 2008
NASA: Year in Review 2008
National Geographic: Top Ten Photos 2008
Popular Science: Best of What's New 2008
Smoking Gun: Mug Shots of the Year 2008
Technorati: Best Tags of 2008
Time Magazine: 50 Best Websites of 2008
Time Magazine: Best Inventions of 2008
Time Magazine: The Top Ten Everything of 2008
Travel & Leisure: America's Best and Worst Airlines 2008
USNews: Best Careers 2008
USNews: Best Colleges 2008
USNews: Best Hospitals 2008
Yahoo: 2008 Year in Review
Yahoo: Top Ten Movie Trailers of 2008
Yahoo: Top Ten Products of 2008
Yahoo: Top Ten Searches of 2008
Yale Book of Quotations: Most Notable Quotes of 2008
More >>


2008 in SPORT

The year in sports: Believe the hype

2008 was a series of did-you-see-thats that are destined to become do-you-remembers.
By King Kaufman



Years from now, 2008 will probably be remembered as the year of an economic collapse so severe that even the usually recession-proof world of North American sports felt it. The NFL laid people off. That doesn't happen most years.



But for most of 2008, living through it, even as housing prices fell and the recession gathered, the sports year didn't feel like the Year of the Crash. Most of 2008 seemed to be about big sports stories actually living up to their hype.



It started with the New England Patriots chasing an undefeated season. They'd ended 2007 by winning an epic regular-season finale over the New York Giants, and a month later lined up as heavy favorites in the Super Bowl against the same team. Giants quarterback Eli Manning engineered a late touchdown drive that gave New York a stunning victory.



The highlight -- Manning spinning away from the grasp of the Patriots pass rush, sprinting to the sideline and heaving the ball downfield, where David Tyree trapped the ball against his helmet and hung on while the great safety Rodney Harrison wrestled with him -- was the signature football moment of the year and, so far, of the century. It might have been the single greatest play in Super Bowl history.



Like that is how 2008 was: a series of breathtaking did-you-see-thats destined to become do-you-remembers.



Swimmer Michael Phelps set out to win eight gold medals in the Beijing Olympics and succeeded. His seventh gold, tying Mark Spitz's record for one Games, was in the 100-meter butterfly. Trailing badly at the turn and still behind Serbian Milorad Cavic one body length from the wall, he somehow made up the distance on the last stroke, touching one-hundredth of a second before Cavic.



And that wasn't even the most electrifying moment of the Games. That honor belonged, pun and all, to Jamaican sprinter Usain Bolt, who didn't just win the men's 100- and 200-meter gold medals, the first to do so since Carl Lewis in 1984, he did it in cartoonish, world-record-setting fashion and with a sparkling personality -- which drew fire from the International Olympic Committee's idiotic chieftain, Jacques Rogge.



Bolt was so rattled by Rogge's inanity that he went out and helped Jamaica win the four-by-100 gold, also in world-record time.



American Dara Torres became the first woman over 40 to swim in the Olympics and the first to swim in five of them, all the more remarkable because the five, dating to 1984, weren't consecutive. She won two silver medals in relays and another in the 50-meter freestyle, losing to gold medal-winner Britta Steffen by a Phelpsian hundredth of a second. "I'm thinking," she said afterward, "I shouldn't have filed my nails last night."



Overall the Olympics failed to live up to the hype in one good way. After a few American athletes were censured for arriving with masks on to filter out the pollution, fears of athletes being overcome by Beijing's horrible air quality were not realized. American television viewers, however, were nearly suffocated by NBC's ceaseless broadcasting of synchronized diving and beach volleyball.



In between, glimpses were caught of the U.S. men and women winning basketball gold, and the usual drama in the gymnastics arena. China dominated the men's competition and won the women's team all-around, but Americans Nastia Liukin and Shawn Johnson went gold-silver in the individual all-around, and Johnson, a darling of the pre-Games buildup, won gold on the balance beam.



The Euro 2008 soccer tournament lived up to its billing thanks in large part to a thrilling underdog run by Turkey, which staged dramatic comebacks against Switzerland, the Czech Republic and Croatia before falling to Germany in the semifinals. Spain, an exciting, attacking team -- two concepts often lacking in international soccer -- beat Germany for the cup.



Kansas guard Mario Chalmers hit a 3-pointer in the final seconds to cap a Jayhawks comeback against Memphis in the NCAA men's basketball Tournament Championship Game, forcing an overtime, which Kansas dominated for the title.



That finished off a Tournament that was outstanding even by its own high standards, with just enough upsets to make it interesting and a deep run by an exciting -- and underseeded -- No. 10, Davidson, but the best teams were left standing at the end. The Final Four was the first ever to feature all four top seeds.



In the women's Tournament, the sport's colossus, Tennessee, won yet another title, led by Candace Parker, the game's best player. Parker was taken first in the WNBA draft by the Los Angeles Sparks, scored 34 points in her first game, and went on to be named Rookie of the Year. Now that's living up to the hype.



The Boston Celtics and Los Angeles Lakers spent the 2007-08 season threatening to renew their old NBA Finals rivalry, and sure enough, this being the year that anticipation paid off, they did. The offseason acquisition of Kevin Garnett was the centerpiece of the Celtics' return to prominence. Boston started strong and never let up.



The Lakers had been fringe contenders for a while but became championship caliber when they made a one-sided trade for Memphis Grizzlies star Pau Gasol. Even without talented young center Andrew Bynum, who was injured during the season, the Lakers won the Western Conference behind Gasol and Kobe Bryant. But they proved too soft to be a match for the Celtics, who took the Finals in six games and won the title for the first time since 1986.



The Gasol trade sparked two answer trades in the West, longtime contenders trying to reload for another run by bringing in aging superstars. Shaquille O'Neal went to the Phoenix Suns, and Jason Kidd to the Dallas Mavericks. It didn't work out in either place. Not everything in 2008 lived up to the hype.



It just felt like it. Baseball's trading deadline, July 31, is usually a time of a million blockbuster rumors and a handful of minor deals. Oh, but this was 2008. In early July, C.C. Sabathia, the defending Cy Young Award winner, was dealt from the Cleveland Indians to the Milwaukee Brewers, and Rich Harden, who might have the best stuff in baseball when he's healthy, which isn't often, was sent from the Oakland A's to the Chicago Cubs.



And then, supposedly within seconds of the deadline, colorful slugger Manny Ramirez went from the Boston Red Sox to the Los Angeles Dodgers.



Harden pitched brilliantly as the Cubs, already in first place when the deal was made, reached the postseason. But it was Sabathia and Ramirez who led their new teams to the playoffs by playing so spectacularly well that they got serious consideration for postseason awards in the National League for two months' work.



A few weeks before those trades, Rafael Nadal and Roger Federer met in the Wimbledon men's final for the third year in a row. That one lived up to its billing and how. In an epic, see-saw, rain-delayed match that people who are paid to know about such things have called the greatest ever played, Nadal ended Federer's five-year run as Wimbledon champ.



The women's final took a far backseat, but it was as glamorous a matchup as women's tennis is capable of serving up: Venus Williams beat her younger sister Serena for her second straight Wimbledon title, her fifth overall.



The National Hockey League even lived up to its hype, kicking off the year with an outdoor game that easily overshadowed the now-meaningless New Year's Day college football bowl games.



The game, between the Pittsburgh Penguins and Buffalo Sabres in front of 71,000 people in Buffalo, was the first outdoor contest in the league in four years, and it was a humdinger, the Penguins winning 2-1 in a shootout, though that result, and the sloppy, snowy hockey that led to it, was secondary to the spectacle, which was magnificent.



In a show of brainpower that's often missing in the NHL, including five years ago, when the league failed to follow up on a similarly successful outdoor game in Edmonton, plans were made to do it again on New Year's Day 2009. The Chicago Blackhawks were to host the Detroit Red Wings at Wrigley Field, though unseasonably warm weather was threatening to delay the game.



The Red Wings won their fourth Stanley Cup in 11 years in 2008, beating the Penguins and the league's transcendent rising star, Sidney Crosby, in the Finals.



And then there was Tiger Woods. His legend would have been secure even if he'd bowed out of this year's U.S. Open with what he later revealed was a torn ligament in his knee and a broken tibia. Instead, he played on that bum leg and beat game journeyman Rocco Mediate in a 19-hole playoff, a win so grand it would have made him a legend all by itself. As it is, it was just one more case of Tiger being Tiger, just one more in a series of the hype coming true in 2008.



Of course it wasn't all greatness and wonder. It never is. While 2008 wasn't weighed down with scandal, tragedy and misbehavior like most recent years, it didn't escape those things either.



The year began in the shadow of the Mitchell Report, baseball's December 2007 accounting of the steroid era, which by January had become the story of the fall of Roger Clemens. The great pitcher decided against the "disappear and hope it blows over" strategy of Mark McGwire and Rafael Palmeiro and fought back hard against accusations that he'd used steroids.



Clemens traded barbs and lawsuits with his former personal trainer, Brian McNamee, and eagerly faced a congressional subcommittee. But the more he spoke, the less believable he sounded. Then a small-time country singer went public with her story that she'd carried on a long affair with the married Clemens. His pal and teammate Andy Pettitte admitted that the parts of McNamee's story that concerned Pettitte were true, which badly damaged Clemens' credibility. When the dust settled, Clemens' reputation was in ruins.



A lot of the year's most depressing stories were like that, holdovers from previous years.



The New England Patriots "Spygate" game-taping scandal oozed into the new year before fizzling out in the spring when former Patriots video assistant Matt Walsh admitted to NFL commissioner Roger Goodell that an alleged tape of the St. Louis Rams' pre-Super Bowl walkthrough from January 2002 did not exist.



Marion Jones was released from prison and went on Oprah Winfrey's TV show to apologize for her "mistake," the slight boo-boo of lying about her illegal activities to an admiring world for years. Indiana basketball coach Kelvin Sampson took a $750,000 buyout -- a year and a half's salary -- as punishment for his illegal calling of recruits.



The year's biggest controversy in sports wasn't directly about sports. It was the worldwide protests over human rights violations in China in the run-up to the Beijing Olympics. The traditional celebratory globe-trot of the torch became a tense security gantlet as grim-faced Chinese military security forces squared off against protesters.



At times it became comical as officials head-faked demonstrators and sneaked the torch through streets filled not with cheering fans but with bemused commuters. Is a parade still a parade if no one knows it's going on?



Once the Games began, a controversy broke out about Chinese officials lying about the age of some allegedly underaged female gymnasts. If it can be called a controversy when almost no one believed what China was saying.



As is usual with malfeasance and misbehavior committed not by relatively powerless individuals but by formidable entities, nothing came of it. That's the way to bet on the shenanigans surrounding the new Yankee Stadium in New York.



The Yankees shut down their historic 85-year-old ballpark in the Bronx this season amid much hullabaloo and prepared to move next door to a new park, one with a reported price tag of nearly $2 billion, including more than a half-billion in taxpayer subsidies, according to various outlets, including the Village Voice.



The Yankees allegedly pulled a fast one on the real estate assessment, telling the IRS the parkland under the new stadium was worth more than $200 million in order to qualify for a massive tax break, and telling the state of New York the land was worth only $21 million in order to keep from having to replace it with more parkland.



The press and public are not nearly as outraged about this as they are about the Yankees working within the rules of baseball to improve the team on the field. So far, all of the big free-agent prizes of the offseason -- Sabathia, pitcher A.J. Burnett and first baseman Mark Teixeira -- have signed with the Yanks, who have literally outspent the other 29 major league teams combined. The remaining marquee name, Ramirez, who grew up in New York City, has been the subject of a few rumors involving the Yankees, which have been denied by all sides.



The New York Mets are also moving into new digs in 2009, and while they made the big splash of the 2007-08 offseason, trading for pitcher Johan Santana, their results were the same: They collapsed down the stretch and missed the playoffs.



The Yankees are hoping to spend their way back into the postseason, which they missed for the first time since 1993. This turned out to be a season of the underdog as the Tampa Bay Rays, a doormat of a franchise for a decade, went from their habitual last-place finish in 2007 to the World Series in '08, where they lost to the Philadelphia Phillies, a doormat of much longer standing, having now won two championships in the last 29 seasons, and also two in the last 126.



The World Series ended with the completion of a suspended game, the first ever in Series history. Following two days of rain, the teams finally got together to play the last three innings of Game 5. It wasn't a classic of a Series but it ended up as a heck of a way to decide a championship.



Almost any way of deciding a championship would beat college football's Bowl Championship Series, which lunged through another year. LSU won the 2007 title in January by destroying Ohio State, and in the year between that game and the upcoming Florida-Oklahoma tilt for the '08 championship, the president of the University of Georgia and the president-elect of the United States both joined the chorus calling for a playoff or tournament to decide the victor.



Fat chance. ESPN signed a deal to televise the BCS bowl games through 2015, making any revisions before then unlikely. And with the juggernaut of the sports industry having a vested interest in the status quo, expect calls for change to become fewer and farther between in the media.



2008 had its share of departures. Some major figures died, among them Sammy Baugh, Pete Newell, Gene Upshaw and Buzzie Bavasi. Sports lost several great chroniclers this year, most prominently W.C. Heinz, Jim McKay, Skip Caray and a pair of good players who became much-loved broadcasters, Herb Score and Bobby Murcer.



One of the year's most poignant moments came at the end of the Kentucky Derby, when the filly Eight Belles collapsed with two broken ankles just after finishing second to Big Brown. The horse had to be euthanized. It was the second year in a row sports fans had to watch a popular American racehorse die. Barbaro, the 2006 Derby winner who was injured at the Preakness, died in January 2007.



There were calls for reform in the breeding and training of thoroughbreds in the wake of the Eight Belles incident, with industry critics saying the inbreeding of horses has led to equine physiology like that of Eight Belles, who, Sally Jenkins wrote in the Washington Post, "ran with the heart of a locomotive, on champagne-glass ankles for the pleasure of the crowd."



There were other kinds of departures as well, many caused by the economic upheaval that climaxed late in the year. The Arena Football League, hailed just a few years ago as the next big thing on the North American sporting scene, shut down.



So did EliteXC, a mixed martial arts circuit that was second tier but notable because an EliteXC event was the first MMA card broadcast on U.S. prime time network television. CBS showed a lackluster EliteXC card in May, lying to viewers that what they were seeing was the sport's big leagues.



Instead they were watching YouTube sensation Kimbo Slice, a Florida bouncer and street fighter, who beat a tomato can on cuts in the third round. A few months later, Slice was TKO'd in 14 seconds by a last-minute replacement fighter named Seth Petruzelli. EliteXC folded shortly after.



Not exactly dead but far more lamented are the Seattle SuperSonics, an NBA team that abandoned its home of 40 years for the -- greener? -- pastures of Oklahoma City, where the team now plays as the Thunder. Clay Bennett, an Oklahoma City businessman, and his partners bought the team in 2006 and immediately began lying about their intentions to move the team south.



Bennett and Co. dropped that charade fairly quickly, and it took them two years to escape their lease. The loss of the Sonics was the lowlight of a lousy year in the Emerald City. The baseball Mariners had their worst season in 25 years, finishing last in the American League. The football Seahawks had their worst season in 16 years, going 4-12 after a five-year run as NFC West champions.



And the Washington Huskies football team managed to do something that no other team in the NCAA's Bowl Subdivision -- formerly Division I-A -- did: They went 0-12.



Women's basketball fans in Houston lost their team and couldn't even hate on the city that took it. The Houston Comets, the team that won the first four WNBA championships and was home for Sheryl Swoopes and Cynthia Cooper, among other stars, folded. The Comets were disbanded by the league when owner Hilton Koch couldn't find a buyer.



That was the last in a series of blows for women's sports this year, the most notable of which was the twin retirements, one day apart, of golfer Annika Sorenstam and tennis player Justine Henin, arguably the best in the world at their respective sports.



Sorenstam, 37 when she announced that she would quit the LPGA tour at the end of the year, had slipped some from her peak, when she was the best female golfer of her generation, the rare woman who transcended her sport, famous for owning the women's tour and sometimes playing with the men. Back and neck problems had slowed her down, and she admitted to having lost some of the burning desire that had helped make her great. She has various business interests and has talked of starting a family.



In her last tournament, two weeks ago in Dubai, she led after two rounds before fading. She ended her career with a birdie, though she has said that might not be the end. She might return to competitive golf someday.



Henin's withdrawal, a day later, was far more shocking. She was two weeks shy of her 26th birthday and a few days from opening her defense of three straight French Open titles when she announced the immediate end of her career. She asked the WTA to remove her name from the rankings, making her the first woman ever to quit while ranked No. 1 in the world.



Though she'd reportedly spoken excitedly about the French Open and other upcoming major events mere weeks before, she said she had had enough of tennis. This month she was named a goodwill ambassador for the joint bid of the Netherlands and her native Belgium to host the 2018 soccer World Cup.



Henin acknowledged that it's hard to believe a 25-year-old superstar athlete at the top of her game would simply walk away, and she dropped the name of a 38-year-old superstar quarterback who then, and for months before, and for months after, was waffling about his own retirement.



Brett Favre, who had for years made an annoying habit of playing out the offseason will-he-or-won't-he-retire drama to the hilt, made a veritable career of it in 2008.



After his usual couple of months of indecision, he announced in early March that he would retire from the Green Bay Packers. There were rumors throughout the spring that he might return, and in early July he asked the Packers to release him so he could sign with another team. The Packers refused, leading to a standoff of sorts.



There was talk of a trade to the rival Minnesota Vikings, and then charges by the Packers that the Vikings had tampered with Favre. Eventually, Favre petitioned for and was granted reinstatement to the league and actually reported to the Packers training camp, though he never suited up and the team sent him home.



A weary nation begged for release from the long nightmare of wall-to-wall Favre coverage -- one of the key moments in the summertime melodrama was an interview Favre gave to, of all people, Greta Van Susteren of Fox News -- and at long last, in early August, he was traded to the New York Jets for a bag of kicking tees and two tickets to "South Pacific."



It started well, with Favre showing flashes of his old self as the Jets won five straight at midseason to improve to 8-3. But since that eighth win, a thrashing of the then-undefeated Tennessee Titans, things have gone sour. The Jets lost four of their last five and missed the playoffs. Favre, complaining of a bum throwing shoulder, threw two touchdowns and nine interceptions down the stretch.



Thus as the NFL playoffs and a new calendar year begin, so begins another edition of will he or won't he, starring Brett Favre. Favre was to undergo an MRI on Monday, and if the news was bad from that, the drama this time might be a short one. If not, stay tuned.



One small consolation: Anything that looks like 2008 can't be all bad. It was a year that really lived up to the billing.

ART

ARTS & LETTERS' DENIS DUTTON

Charlie Allnut, the gin-swilling Canadian boat operator played by Humphrey Bogart in The African Queen, explains his drinking habits by saying, "It's only human nature." That doesn't satisfy the puritanical Rose Sayer, played by Katharine Hepburn. She answers: "Nature, Mr. Allnut, is what we are put in the world to rise above."

Denis Dutton, in his exhilarating new book The Art Instinct: Beauty, Pleasure and Human Evolution (Bloomsbury Press), comes down firmly on Rose's side. But while Rose sees humankind battling to escape its innate imperfections, Dutton outlines something grander and more complicated, the struggle of artists "to transcend even our animal selves" through their work. Evolution makes art possible by endowing humans with imagination and intellect. Art, in response, lifts us above the very instincts installed in our brains by evolution.

As 2009 approaches, let us set aside the great puzzle of 2008 ("Where did the money go?") and deal with a more pleasant question: "Why are we so crazy about the arts?" Why, for instance, did Toronto build, in the last four years, an opera house, two major museums, a conservatory and a ballet school, each of them risky and expensive? Speaking as a Torontonian, I appreciate the effort, but realize it wasn't done just to please me. This flurry of construction, like many such civic phenomena around the world, reflects an urgent need for the arts -- a need that became part of our personalities over many thousands of years.

We do all this, Dutton explains, because it's built into us. We have no choice.

Originally a Californian, Dutton is now professor of the philosophy of art at the University of Canterbury in Christchurch, New Zealand. He edits a learned journal, Philosophy and Literature, where he conducted a furious and much-publicized campaign against academics whose bad prose beats readers into submission just to prove "they are in the presence of a great and deep mind." More important, Dutton edits Arts & Letters Daily, a website that collates articles from everywhere on the planet and has become much more than its founders expected.

By shrewdly choosing the best material available, A&LD has emerged as the most useful intellectual magazine in the English-speaking world.

Dutton's interest in cultural evolution began in the 1960s when he was a Peace Corps volunteer in India. As a student he had absorbed (and partially accepted) the academic belief that cultures are so sealed off from each other that cross-cultural understanding is all but impossible; art is "socially constructed," the product of a certain time and place, nothing else. That suggests to many scholars that attempting to see connections between cultures amounts to a form of colonialism.

But in rural India, Dutton changed his mind. He discovered that the hopes, fears and vices of the Indians were altogether intelligible to a twentysomething graduate of the University of California Santa Barbara. And much of the cultural life of India was equally graspable. In Hyderabad he learned the sitar from a student of Ravi Shankar and found Indian music no more remote from Western music than 17th-century Italian madrigals are from the harmonies of Duke Ellington: "The lure of rhythmic drive, harmonic anticipation, lucid structure and divinely sweet melody cuts across cultures with ease."

How could this be? Were these cultures somehow connected at their roots?

In 1993 two Russian artists, Vitaly Komar and Alexander Melamid, organized a statistically impeccable survey of taste in 10 countries. They concluded that people from Iceland to China hold similar opinions about art: All express affection for landscapes, particularly landscapes dominated by blue, with water somehow involved. Melamid suggested that this implies that a blue landscape is genetically imprinted on humanity. It may be a paradise we all carry within us, he speculated. Perhaps "we came from the blue landscape and we want it."

Well, yes, says Dutton. In the Pleistocene era, the nomads who developed into people like us were (it's widely believed) living under blue African skies in savannas and woodlands. These protein-rich regions were good hunting grounds. Those who chose to inhabit that landscape had a "survival advantage." They prospered, had children, passed on their genes.

That process continued for a length of time that we find almost impossible to imagine -- about 1.6 million years, or 80,000 generations. In the extreme slo-mo theatre of evolution, the architecture of the mind developed. Countless minor choices, when rewarded by success, created impulses that live within us now.

Take, for instance, the universal obsession with storytelling. In all cultures (including the few remaining clusters of hunter-gatherers) narrative is an essential element. It's both a source of pleasure and a way to convey information. Those who had this inclination and talent in the Pleistocene era had a special "survival advantage." A nomad with a storyteller's imagination could weigh a group's travel plans, outlining a new territory's opportunities against its potential dangers. Storytelling, perhaps, began as a question of life and death. In detailing the complications that followed, Dutton demonstrates both his own poised scholarship and the infinite richness of the subject he's opening up.

And music? There's no obvious reason for it to exist, since the ability to perceive pitched sound provides in itself no contribution to survival. Dutton notes Charles Darwin's suggestion that musical tones and rhythm were part of courtship for our ancestors. And perhaps musical sounds were a way of inventing language. Dutton finds that plausible and suggests that music and dance also build "empathy, co-operation and social solidarity." He speculates that music, dancing, storytelling and other art forms "evolved specifically to strengthen the social health of hunter-gatherer bands."

The Art Instinct offers fresh and liberating ideas while demonstrating Dutton's profound sense of curiosity and his willingness to take risks in dealing with puzzling and largely fragmentary prehistory. He bluntly argues with fashionable theorists, and the reviews of his book will not be uniformly favourable. Some will be offended and angry.

Whatever the critical response, the discussion of his book deserves to reach far beyond academics and people directly involved in the arts. His subject is the mysterious beginning of the cultural life that all of us, on whatever level of complexity, live. As he says, we resemble our distant ancestors in the way we share communion with other humans through art. "Our art instinct is theirs."

Cosmic Recall

ABHAY ASHTEKAR remembers his reaction the first time he saw the universe bounce. "I was taken aback," he says. He was watching a simulation of the universe rewind towards the big bang. Mostly the universe behaved as expected, becoming smaller and denser as the galaxies converged. But then, instead of reaching the big bang "singularity", the universe bounced and started expanding again. What on earth was happening?

Ashtekar wanted to be sure of what he was seeing, so he asked his colleagues to sit on the result for six months before publishing it in 2006. And no wonder. The theory that the recycled universe was based on, called loop quantum cosmology (LQC), had managed to illuminate the very birth of the universe - something even Einstein's general theory of relativity fails to do.

LQC has been tantalising physicists since 2003 with the idea that our universe could conceivably have emerged from the collapse of a previous universe. Now the theory is poised to make predictions we can actually test. If they are verified, the big bang will give way to a big bounce and we will finally know the quantum structure of space-time. Instead of a universe that emerged from a point of infinite density, we will have one that recycles, possibly through an eternal series of expansions and contractions, with no beginning and no end.

LQC is in fact the first tangible application of another theory called loop quantum gravity, which cunningly combines Einstein's theory of gravity with quantum mechanics. We need theories like this to work out what happens when microscopic volumes experience an extreme gravitational force, as happened near the big bang, for example. In the mid-1980s, Ashtekar rewrote the equations of general relativity in a quantum-mechanical framework. Together with theoretical physicists Lee Smolin and Carlo Rovelli, Ashtekar later used this framework to show that the fabric of space-time is woven from loops of gravitational field lines. Zoom out far enough and space appears smooth and unbroken, but a closer look reveals that space comes in indivisible chunks, or quanta, about 10^-35 metres across.
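As a rough check on that number (my back-of-the-envelope sketch, not something from the article): the scale in question is the Planck length, built from the same constants the theory combines. Using standard values for Planck's constant, Newton's constant and the speed of light,

\[
% Planck length: the scale at which gravity and quantum mechanics both matter
\ell_P = \sqrt{\frac{\hbar G}{c^3}}
       = \sqrt{\frac{(1.05\times10^{-34}\,\mathrm{J\,s})\,(6.67\times10^{-11}\,\mathrm{m^3\,kg^{-1}\,s^{-2}})}{(3.0\times10^{8}\,\mathrm{m/s})^{3}}}
       \approx 1.6\times10^{-35}\,\mathrm{m}.
\]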

In 2000, Martin Bojowald, then a postdoc with Ashtekar at the Pennsylvania State University in University Park, used loop quantum gravity to create a simple model of the universe. LQC was born.

Bojowald's major realisation was that unlike general relativity, the physics of LQC did not break down at the big bang. Cosmologists dread the singularity because at this point gravity becomes infinite, along with the temperature and density of the universe. As its equations cannot cope with such infinities, general relativity fails to describe what happens at the big bang. Bojowald's work showed how to avoid the hated singularity, albeit mathematically. "I was very impressed by it," says Ashtekar, "and still am."

Jerzy Lewandowski of the University of Warsaw in Poland, along with Bojowald, Ashtekar and two more of his postdocs, Parampreet Singh and Tomasz Pawlowski, went on to improve on the idea. Singh and Pawlowski developed computer simulations of the universe according to LQC, and that's when they saw the universe bounce. When they ran time backwards, instead of becoming infinitely dense at the big bang, the universe stopped collapsing and reversed direction. The big bang singularity had truly disappeared (Physical Review Letters, vol 96, p 141301).

But the celebration was short-lived. When the team used LQC to look at the behaviour of our universe long after expansion began, they were in for a shock - it started to collapse, challenging everything we know about the cosmos. "This was a complete departure from general relativity," says Singh, who is now at the Perimeter Institute for Theoretical Physics in Waterloo, Canada. "It was blatantly wrong."

Ashtekar took it hard. "I was pretty depressed," he says. "It didn't bode well for LQC." However, after more feverish mathematics, Ashtekar, Singh and Pawlowski solved the problem. Early versions of the theory described the evolution of the universe in terms of quanta of area, but a closer look revealed a subtle error. Ashtekar, Singh and Pawlowski corrected this and found that the calculations now involved tiny volumes of space.

It made a crucial difference. Now the universe according to LQC agreed brilliantly with general relativity when expansion was well advanced, while still eliminating the singularity at the big bang. Rovelli, based at the University of the Mediterranean in Marseille, France, was impressed. "This was a very big deal," he says. "Everyone had hoped that once we learned to treat the quantum universe correctly, the big bang singularity would disappear. But it had never happened before."

Physicist Claus Kiefer at the University of Cologne in Germany, who has written extensively about the subject, agrees. "It is really a new perspective on how we can view the early universe," he says. "Now, you have a theory that can give you a natural explanation for a singularity-free universe." He adds that while competing theories of quantum gravity, such as string theory, have their own insights to offer cosmology, none of these theories has fully embraced quantum mechanics.

If LQC turns out to be right, our universe emerged from a pre-existing universe that had been expanding before contracting due to gravity. As all the matter squeezed into a microscopic volume, this universe approached the so-called Planck density, 5.1 × 10^96 kilograms per cubic metre. At this stage, it stopped contracting and rebounded, giving us our universe.
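That figure can be reproduced with a quick calculation (again mine, not the article's): the Planck density is the Planck mass divided by the cube of the Planck length, which reduces to

\[
% Planck density: rounded constants give ~5x10^96; more precise values give 5.1x10^96
\rho_P = \frac{c^{5}}{\hbar G^{2}}
       = \frac{(3.0\times10^{8})^{5}}{(1.05\times10^{-34})\,(6.67\times10^{-11})^{2}}\ \mathrm{kg/m^3}
       \approx 5.2\times10^{96}\ \mathrm{kg/m^3},
\]

in agreement with the 5.1 × 10^96 quoted above once more precise constants are used.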

"You cannot reach the Planck density. It is forbidden by theory," says Singh. According to Bojowald, that is because an extraordinary repulsive force develops in the fabric of space-time at densities equivalent to compressing a trillion solar masses down to the size of a proton. At this point, the quanta of space-time cannot be squeezed any further. The compressed space-time reacts by exerting an outward force strong enough to repulse gravity. This momentary act of repulsion causes the universe to rebound. From then on, the universe keeps expanding because of the inertia of the big bounce. Nothing can slow it down - except gravity.

LQC also illuminates another mysterious phase of our universe. In classical cosmology, a phenomenon called inflation caused the universe to expand at incredible speed in the first fractions of a second after the big bang. This inflationary phase is needed to explain why the temperature of faraway regions of the universe is almost identical, even though heat should not have had time to spread that far - the so-called horizon problem. It also explains why the universe is so finely balanced between expanding forever and contracting eventually under gravity - the flatness problem. Cosmologists invoke a particle called the inflaton to make inflation happen, but precious little is known about it.

Cosmic recall
More importantly, even less is known about the pre-inflationary universe. Cosmologists have always assumed that they could ignore quantum effects and regard space-time as smooth at the onset of inflation, as general relativity requires. This had always been an educated guess - until now. LQC shows that at the time inflation begins, space-time can be treated as smooth. "This is not an assumption any more," says Singh. "It's actually a prediction from loop quantum cosmology."

The models developed by Ashtekar, Singh, Bojowald and Pawlowski represent an enormous step forward. This is the first time that a theory is able to make predictions about what was happening prior to inflation, while correctly predicting what happens post-inflation. "To do both of these things at the same time has been difficult," says Ashtekar.

If the universe we inhabit emerged from a previous cosmos, can we know something about the universe that preceded ours? LQC simulations show that it too would have had stars and galaxies. But opinions differ when it comes to the quantum phase just before and after the big bounce, when it is impossible to pin down the volume of the universe due to quantum fluctuations. Bojowald's calculations show that some of the information about the earlier universe is wiped out as it goes through the big bounce. In other words, there is no "cosmic recall" (Nature Physics, vol 3, p 523). In contrast, another detailed analysis done by Singh and Alejandro Corichi, of the Autonomous National University of Mexico in Michoacán, suggests otherwise (Physical Review Letters, vol 100, p 161302).

Ashtekar likens the spirited spat among his former postdocs and students to watching his children squabble. "It's much ado about nothing," he says. Though arguments about the universe possibly having a cosmic recall may be of philosophical interest, they are premature. "We should be worrying about making contact with experiments today."

That day may be near. The researchers' first target is the cosmic microwave background (CMB), radiation released long after the universe's quantum phase. Even though the CMB originated 370,000 years after the big bang, its seeds were laid out much earlier, says Bojowald. "That could be a period when quantum gravity effects might play a role."

Bojowald has discovered that such effects would have dominated when, according to LQC, the universe went through a short phase of accelerated expansion before the onset of inflation. Dubbed superinflation, it occurred due to the immense repulsive forces of the high-density quantum universe rather than the presence of inflatons. Exactly how this phase might affect the CMB is unclear, but already there are hints that LQC might predict something different from classical cosmology. "This is what we are going to work on in the next two years. We are going to find robust predictions," says Singh.

Meanwhile, Ed Copeland of the University of Nottingham, UK, and his colleagues have shown that superinflation can produce the kind of quantum fluctuations in the fabric of space-time that eventually became seeds for the formation of galaxies and clusters of galaxies. This suggests that superinflation might make inflation unnecessary, thus removing what has essentially always been an add-on to standard cosmological theory. It is early days for superinflation, though, because it cannot yet solve the horizon and flatness problems that inflation so elegantly resolves.

Copeland says that future experiments might reveal whether our universe underwent inflation or superinflation by looking for a pattern of gravitational waves that only inflation could have created. These ripples in the fabric of space-time would have polarised the CMB, though the effect is too faint for today's instruments to detect. Things might change next year, however, when the European Space Agency launches the Planck satellite, promising the most detailed view of the microwave background to date. Copeland's work suggests that superinflation would suppress the production of gravitational waves at cosmological scales, and that there would be no such imprint in the CMB. "If you do detect them, it would probably count against LQC," he says.

Kiefer cautions that all the predictions of LQC are subject to one big caveat. The predictions of classical cosmology come from solving the equations of general relativity, albeit with certain simplifying assumptions about the universe. Ideally, LQC should be put on the same footing - all its equations should be derived from loop quantum gravity. Instead, Bojowald and others obtained LQC by starting with an idealised universe derived from general relativity and then using techniques from loop quantum gravity to quantise gravity in the model. "From a physicist's point of view, it is fully justified," says Kiefer. "Mathematicians perhaps would not be amused."

Rovelli agrees. To put LQC on a firmer foundation, he and his colleague Francesca Vidotto have been working to reconcile it with loop quantum gravity (www.arxiv.org/abs/0805.4585v1). "The conclusion is very positive," says Rovelli. "We are able to recover the equations of LQC, starting with something much closer to loop quantum gravity."

No wonder Rovelli is looking forward to upcoming experiments that could vindicate the theory. "I hope before dying to know whether loop quantum gravity is correct or not," he says. For a man who turned 50 only recently, he is being unduly pessimistic. A raft of experiments, of which Planck is only the first, will soon be measuring the CMB and looking for gravitational waves. A revolution in our notions of how our universe began may be closer than he thinks.

Will our universe bounce?
According to the big bounce picture formulated by theoretical physicist Abhay Ashtekar and others, the cosmos grew from the collapse of a pre-existing universe. Will the same fate await us?

It depends. We used to think that the universe was dominated by the gravity of its stars and other matter: either the universe is dense enough for gravity to halt the expansion from the big bang and pull everything back, or else it isn't, in which case the expansion would carry on forever. However, observations of distant supernovae in the past 10 years have challenged that view. They show not just that the universe is expanding, but also that the expansion is speeding up due to a mysterious repulsive force that cosmologists call "dark energy". So if the universe fails to contract, has it already bounced its last bounce?

Perhaps not. Cosmologists are still very much in the dark about dark energy. Some theoretical models speculate that the nature of dark energy could change over time, switching from a repulsive to an attractive force that behaves much like gravity. If that happens, the universe will stop expanding and the galaxies will begin to rush together. A question mark also hangs over the universe's matter and energy density, which we have not measured with sufficient accuracy to be sure that the universe will not eventually stop expanding. If it turns out to be a smidgen greater than current observations suggest, that is a recipe for cosmic collapse.

According to the big bounce, in both scenarios the universe will eventually collapse until it reaches the highest density allowed by the theory. At this point, the universe will rebound and begin expanding again - the ultimate in cosmic recycling.
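To put a number on "dense enough" (an illustration using assumed standard values, not figures from the article): in textbook cosmology the dividing line, ignoring dark energy, is the critical density from the Friedmann equation. For today's expansion rate of roughly 70 km/s per megaparsec (H0 ≈ 2.3 × 10^-18 s^-1),

\[
% Critical density: above it (absent dark energy) the universe recollapses; below it, it expands forever
\rho_c = \frac{3 H_0^{2}}{8\pi G}
       \approx \frac{3\,(2.3\times10^{-18}\,\mathrm{s^{-1}})^{2}}{8\pi\,(6.67\times10^{-11}\,\mathrm{m^3\,kg^{-1}\,s^{-2}})}
       \approx 9\times10^{-27}\ \mathrm{kg/m^3},
\]

equivalent to a few hydrogen atoms per cubic metre. The measured density sits close enough to this line that, as the passage above says, a smidgen either way changes the universe's ultimate fate.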

29.12.08

Wounded Knee

Today is the anniversary of the massacre at Wounded Knee, which took place in South Dakota in 1890. Twenty-two years earlier, the local tribes had signed a treaty with the United States government that guaranteed them the rights to the land around the Black Hills, land they held sacred.

But in the 1870s, gold was discovered in the Black Hills, and the treaty was broken. People from the Sioux tribe were forced onto a reservation, with a promise of more food and supplies, which never came. Then in 1889, a prophet named Wovoka, from the Paiute tribe in Nevada, had a vision of a ceremony that would renew the earth, return the buffalo, and cause the white men to disappear. This ceremony was called the Ghost Dance. The Ghost Dance scared the white Indian agents, and they moved in to arrest Chief Sitting Bull, who was killed in the attempt.

The next leader they focused on was Sitting Bull's half-brother, Chief Big Foot. He was leading his people to the Pine Ridge reservation, seeking safety there. But it was winter, 40 degrees below zero, and he contracted pneumonia.

Big Foot was sick, he was flying a white flag, and he was one of the leaders who had actually renounced the Ghost Dance. But the Army didn't make distinctions. They intercepted Big Foot's band and ordered them into the camp on the banks of Wounded Knee Creek.

The next morning, federal soldiers began confiscating their weapons, and a scuffle broke out between a soldier and an Indian. The federal soldiers opened fire, killing almost 300 men, women, and children, including Big Foot.

One of the survivors was the famous medicine man Black Elk, who told his story to John Neihardt in Black Elk Speaks (1932).

The Coming Collapse of China

http://www.forbes.com/opinions/2008/12/16/china-economic-reform-oped-cx_gc_1216chang.html

28.12.08

L.A. Style 2008

The 2008 Image Index: winners and losers

What worked and what didn't? Think 'Mad Men' and mustaches, Britney Spears and Rachel Maddow on the bright side, and celebrity clothing lines and plastic surgery on the downside.

December 28, 2008

Image is a slippery creature, and one month's style sensation is the next month's tasteless trend. For those keeping score, we offer the 2008 Image Index, a sliding scale of the people, places, ideas and trends that moved up -- and down -- the pop culture barometer this year.

UP

MARC JACOBS

That last stint in rehab in March 2007 really must have stuck. Not only is Jacobs in top physical shape (see the January issue of Harper's Bazaar, where a graffitied Jacobs poses nude with a Stephen Sprouse for Louis Vuitton Neverfull bag), but his multi-culti mash-up collections for Spring '09 were also among the season's best. And those runway shows, which used to start as much as two hours late, have been right on time. At last, he's earned his title as the most influential American fashion designer.

'MAD MEN'

Smart, psychologically compelling plots that revolve around pencil skirts, double martinis and afternoon trysts made us wish for a bigger flat screen. The addictive show also elevated the aesthetics of workplace attire to include sweater sets for women and ties for guys. Meet you at the water cooler to discuss last night's episode?

THE 'STACHE

Not since Burt Reynolds popularized thick upper-lip fringe has the mustache been so hot, thanks to Brad Pitt's latest look. But really, he's no facial hair visionary. Shia LaBeouf showed some fuzz in "Eagle Eye" earlier this year, and a coterie of 'stache-ophiles meets downtown monthly for Mustache Mondays.

RACHEL MADDOW

How refreshing to see a TV pundit -- and Rhodes scholar -- who's loyal to her signature style. Rachel Maddow, 35, waxes poetic about politics and takes on the GOP on her daily eponymous show on MSNBC. This month she appears in Vogue, and the self-described butch lesbian didn't succumb to a glitzy glam makeover; she's wearing a Jil Sander suit and black Converse high-tops.

UNISEX-Y

East Side guys in skinny jeans. Shrunken blazers with suspenders on gals. Gender-bending fashion was an equal-opportunity trend, and Fall Out Boy pop star Pete Wentz -- who reportedly prefers DKNY women's denim -- even launched a unisex line in April. Hey, we're all for any fad that opens up new closets.

BRITNEY SPEARS

At the beginning of the year she was a mess, having just lost custody of her sons and been hospitalized for mental-health issues. But all that melted away, thanks to a few well-chosen TV appearances, an MTV documentary, her new hit album "Circus" and a European comeback tour. Pop music's favorite fashion train wreck is blond again, and looking better than she has in years, thanks in part to über-stylist Marjan Malakpour, who has been dressing her onstage and in photo shoots. We're even digging Brit's new ringmaster persona.

STONERS

This year the chemical bromance "Pineapple Express" (named for a potent strain of herb) took in $89 million at the multiplex, we got hooked on Showtime's "Weeds" and its dope-dealing soccer mom, and Cheech and Chong reunited for a sold-out tour. When Jonathan Adler is dealing $68 pot-leaf motif Hashish candles and the cast of the 2009 weed-themed flick "Leaves of Grass" includes Susan Sarandon, Richard Dreyfuss and Edward Norton, you know the stoner's stock is high.

OLDER MODELS

There were so many '90s supermodels gracing the pages of fashion magazines this fall, you'd suspect a George Michael and RuPaul comeback tour was in the works. Linda Evangelista wore Prada's lace head to toe, Naomi Campbell looked as lithe as ever in the YSL ads, Christy Turlington gave her yoga togs a rest and posed for Escada, and Claudia Schiffer was the season's most ubiquitous face as the main mannequin for Chanel and Ferragamo. All are in their late 30s or early 40s, and in an industry where 24 is over the hill, it's refreshing to see that getting older is actually in fashion.

GADGET MAKEUP

Vibrating mascara. Super lip-plumping balm. Wrinkle-filling foundation. High-tech makeup scored with customers looking for a fresh variety of magic in a tube or bottle -- and an alternative to plastic surgery's needles and knives.

CHICAGO

The Second City stepped up to No. 1 when Sen. Barack Obama took the stage in Grant Park for his election night victory speech. Thanks to Michelle Obama, local designer Maria Pinto's star has risen too, along with that of upscale Chicago retailer Ikram Goldman, who's been unofficially helping the future first lady in the fashion department. How good is the city's standing? When huge-haired Illinois Gov. Rod Blagojevich became embroiled in scandal, the City of Big Shoulders shrugged it off like so much dandruff.

REDHEADS

Blonds, take a back seat. Christina Hendricks, Amy Adams, Kate Walsh and Isla Fisher grabbed center stage -- and even sixtysomething Susan Sarandon posed as a pinup. Not since we all loved Lucy have flame tresses been the tint of Tinseltown.

DOWN

CHRISTIAN AUDIGIER

There's no doubt Audigier's brand-building wizardry has made his stable of labels (Ed Hardy, Christian Audigier and Smet among them) a hit on the street, but this year he came close to flogging his lifestyle brands to death by expanding into watches, children's clothes, animal accessories and a Las Vegas nightclub, not to mention those billboards with his likeness all over town. It can't do much to burnish the brand when Ed Hardy pillow shams turn up in the window of the local Linen Outlet.

CELEBRITY CLOTHING LINES

If TV style icon Sarah Jessica Parker couldn't save retailer Steve & Barry's from going under, it's time to call a halt to the craze of celebrity clothing lines. Celebs as unlikely as Rod Stewart and Charlie Sheen have put their monikers on labels, and frankly, the trend is insulting to fashion designers. Bring back the age-old actor aspiration: "I want to direct."

THE iPHONE

Skinny, expensive, easy on the eyes and known to play hard to get -- the iPhone is the gadget world's version of the runway model. With the continual rollout of new versions, and dropping prices, its cachet is teetering like a model on 8-inch heels. And that recent blast of cold air got us thinking: Does that touch screen really seem as cool when you're suddenly wearing mittens?

PAYING FULL PRICE

With Saks at 70% off a month before Christmas, Neiman Marcus literally paying you in gift cards to spend, Barneys New York offering to put items aside until further discounts can be taken and sales announcements flooding in, you have to be a rube to pay full price these days. The days of triple markups are over, and the consumer is king.

PLASTIC SURGERY

A wrinkle in time, indeed. The American Society of Plastic Surgeons reports that cosmetic procedures are down 62% for the first half of this year. Will those Botoxed Beverly Hills matrons finally be able to frown again?

PREMIUM DENIM

You could pay $300 for elaborate pocket embroidery or 24-karat gold side stitching. Or you could buy simple, flattering, reasonably priced jeans. Suddenly, the choice was easy. And the news wasn't good for the boutique brands.

'IT' BAGS

Bags are still a priority, but not the ones that scream: "I rushed to put my name on a waiting list so I could pay $1,800 for this." Understatement and frugality, or at least the appearance of them, are the new "It."

MADONNA

Her recent tour may have sold out and her arms still look like chiseled alabaster, but the 50-year-old Material Girl is trying too hard to fight Mother Nature. Sure, she can still sport hot pants better than any 20-year-old, but the taut, gaunt look of her face and that A-Rod business make us wish she'd stop trying to fit into the pop princess mold at middle age.

Huntington Dies

Samuel Huntington, 81, political scientist, scholar
'One of the most influential political scientists of the last 50 years'

By Corydon Ireland


Samuel P. Huntington - a longtime Harvard University professor, an influential political scientist, and mentor to a generation of scholars in widely divergent fields - died Dec. 24 on Martha's Vineyard. He was 81.

Huntington had retired from active teaching in 2007, following 58 years of scholarly service at Harvard. In a retirement letter to the President of Harvard, he wrote, in part, "It is difficult for me to imagine a more rewarding or enjoyable career than teaching here, particularly teaching undergraduates. I have valued every one of the years since 1949."

Huntington, the father of two grown sons, lived in Boston and on Martha's Vineyard. He was the author, co-author, or editor of 17 books and over 90 scholarly articles. His principal areas of research and teaching were American government; democratization; military politics, strategy, and civil-military relations; comparative politics; and political development.

"Sam was the kind of scholar that made Harvard a great university," said Huntington's friend of nearly six decades, economist Henry Rosovsky, who is Harvard's Lewis P. and Linda L. Geyser University Professor, Emeritus. "People all over the world studied and debated his ideas. I believe that he was clearly one of the most influential political scientists of the last 50 years."

"Every one of his books had an impact," said Rosovsky. "These have all become part of our vocabulary."

Jorge Dominguez, Harvard's vice provost for International Affairs, described Huntington as "one of the giants of political science worldwide during the past half century. He had a knack for asking the crucially important but often inconvenient question. He had the talent and skill to formulate analyses that stood the test of time."

Huntington's friend and colleague Robert Putnam, the Peter and Isabel Malkin Professor of Public Policy at the Harvard Kennedy School, called him "one of the giants of American intellectual life of the last half century."

To Harvard College Professor Stephen P. Rosen, Beton Michael Kaneb Professor of National Security and Military Affairs, "Samuel Huntington's brilliance was recognized by the academics and statesmen around the world who read his books. But he was loved by those who knew him well because he combined a fierce loyalty to his principles and friends with a happy eagerness to be confronted with sharp opposition to his own views."

Huntington, who graduated from Yale College at age 18 and who was teaching at Harvard by age 23, was best known for his views on the clash of civilizations. He argued that in a post-Cold War world, violent conflict would come not from ideological friction between nation states, but from cultural and religious differences among the world's major civilizations.

Huntington, who was the Albert J. Weatherhead III University Professor at Harvard, identified these major civilizations as Western (including the United States and Europe), Latin American, Islamic, African, Orthodox (with Russia as a core state), Hindu, Japanese, and "Sinic" (including China, Korea, and Vietnam).

"My argument remains," he said in a 2007 interview with Islamica Magazine, "that cultural identities, antagonisms and affiliations will not only play a role, but play a major role in relations between states."

Huntington first advanced his argument in an oft-cited 1993 article in the journal Foreign Affairs. He expanded the thesis into a book, "The Clash of Civilizations and the Remaking of World Order," which appeared in 1996, and has since been translated into 39 languages.

To the end of his life, the potential for conflict inherent in culture was prominent in Huntington's scholarly pursuits. In 2000, he was co-editor of "Culture Matters: How Values Shape Human Progress." And just before his health declined, in the fall of 2005, he was beginning to explore religion and national identity.

"His contributions ranged across the whole field of political science, from the deeply theoretical to the intensely applied," said Putnam, author of a lengthy appreciation of Huntington in a 1986 issue of the journal PS: Political Science and Politics. "Over the years, he mentored a large share of America's leading strategic thinkers, and he built enduring institutions of intellectual excellence."

And Putnam added a personal note. "What was most rare about Sam, however, was his ability to combine intensely held, vigorously argued views with an engaging openness to contrary evidence and argument. Harvard has lost a towering figure, and his colleagues have lost a very good friend."

Timothy Colton, the Morris and Anna Feldberg Professor of Government and Russian Studies at Harvard, remarked on his old friend's breadth of intellectual interests. Huntington used the American political experience as a pivot point (his doctoral dissertation was on the Interstate Commerce Commission), but soon studied a globe-spanning range of topics in depth.

"He was anchored in American life and his American identity, but he ended up addressing so many broad questions," said Colton, who had Huntington as a Ph.D. adviser at Harvard in the early 1970s. "His degree of openness to new topics and following questions where they take him is not as often found today as when he was making his way."

Huntington's first book, "The Soldier and the State: The Theory and Politics of Civil-Military Relations," published to great controversy in 1957 and now in its 15th printing, is today still considered a standard title on the topic of how military affairs intersect with the political realm. It was the subject of a West Point symposium last year, on the 50th anniversary of its publication.

In part, "Soldier and the State" was inspired by President Harry Truman's firing of Gen. Douglas MacArthur - and at the same time praised corps of officers that in history remained stable, professional, and politically neutral.

In 1964, he co-authored, with Zbigniew Brzezinski, "Political Power: USA-USSR," which was a major study of Cold War dynamics - and how the world could be shaped by two political philosophies locked in opposition to one another.

Brzezinski, a doctoral student at Harvard in the early 1950s who was befriended by both Huntington and Rosovsky, was U.S. National Security Adviser in the Carter White House from 1977 to 1981. In those days, said Rosovsky, the youthful Huntington, though an assistant professor, was often mistaken for an undergraduate.

According to his wife Nancy, Huntington was a life-long Democrat, and served as foreign policy adviser to Vice President Hubert Humphrey in his 1968 presidential campaign. In the wake of that "bitter" campaign, she said, Huntington and Warren Manshel - "political opponents in the campaign but close friends" - co-founded the quarterly journal Foreign Policy (now a bimonthly magazine). He was co-editor until 1977.

His 1969 book, "Political Order in Changing Societies," is widely regarded as a landmark analysis of political and economic development in the Third World. It was among Huntington's most influential books, and a frequently assigned text for graduate students investigating comparative politics, said Dominguez, who is also Antonio Madero Professor of Mexican and Latin American Politics and Economics. The book "challenged the orthodoxies of the 1960s in the field of development," he said. "Huntington showed that the lack of political order and authority were among the most serious debilities the world over. The degree of order, rather than the form of the political regime, mattered most."

His 1991 book, "The Third Wave: Democratization in the Late Twentieth Century" - another highly influential work - won the Grawemeyer Award for Ideas Improving World Order, and "looked at similar questions from a different perspective, namely, that the form of the political regime - democracy or dictatorship - did matter," said Dominguez. "The metaphor in his title referred to the cascade of dictator-toppling democracy-creating episodes that peopled the world from the mid 1970s to the early 1990s, and he gave persuasive reasons for this turn of events well before the fall of the Berlin Wall."

As early as the 1970s, Huntington warned against the risk of new governments becoming politically liberalized too rapidly. He proposed instead that governments prolong a transition to full democracy - a strand of ideas that began with an influential 1973 paper, "Approaches to Political Decompression."

Huntington's most recent book was "Who Are We? The Challenges of America's National Identity" (2004), a scholarly reflection on America's cultural sense of itself.

Samuel Phillips Huntington was born on April 18, 1927, in New York City. He was the son of Richard Thomas Huntington, an editor and publisher, and Dorothy Sanborn Phillips, a writer.

Huntington graduated from Stuyvesant High School, received his B.A. from Yale in 1946, served in the U.S. Army, earned an M.A. from the University of Chicago in 1948, and took a Ph.D. from Harvard in 1951; he taught at Harvard nearly without a break from 1950 on.

From 1959 to 1962, he was associate director of the Institute of War and Peace Studies at Columbia University. At Harvard, he served two terms as chair of the Government Department - from 1967 to 1969 and from 1970 to 1971.

Huntington served as president of the American Political Science Association from 1986 to 1987.

Huntington was director of Harvard's Center for International Affairs from 1978 to 1989. He founded the John M. Olin Institute for Strategic Studies, and was director there from 1989 to 1999. He was chairman of the Harvard Academy for International and Area Studies from 1996 to 2004, and was succeeded by Jorge Dominguez.

Huntington applied his theoretical skills to the Washington, D.C., arena too. In 1977 and 1978, he served in the Carter White House as coordinator of security planning for the National Security Council. In the 1980s, he was a member of the Presidential Commission on Long-Term Integrated Strategy.

27.12.08

The FT looks at 2008

Journalism, so the adage goes, is the first draft of history. In 2008, the Financial Times had a once-in-a-lifetime opportunity to report, analyse and comment on the most serious financial crisis since the Great Crash of 1929. Here was a global story whose tentacles spread from the US sub-prime mortgage market to the City of London, Iceland, Russian oligarchs, Dubai property barons and numerous other actors. It was a story tailor-made for the FT.

The following articles are a selection of the best of the FT’s coverage over the past year. Inevitably, there are gaps. There is no space, for example, for our groundbreaking reporting on the credit ratings agencies and the failure of computer modelling. The aim of this special report is to offer readers an unfolding narrative, as well as a broader perspective on a crisis which shook the western model of market capitalism to its foundations.

The opening commentary by George Soros, the financier and philanthropist, sets the scene. Writing in January, Soros argued that the financial crisis is very different from other crises which have erupted at intervals since the end of the second world war. “[It] marks the end of an era of credit expansion based on the dollar as the international reserve currency.” As such, it signals “the culmination of a super-boom that has lasted more than 60 years”.

Soros’s article provides a useful antidote to those commentators who too easily laid the blame for the crisis on lax regulation of sophisticated financial products. Its origins may indeed go back to sub-prime mortgage lending in the US. But sub-prime mortgages – loans to high-risk borrowers seeking a toehold on the residential property ladder – were merely symptoms of a larger problem: global imbalances, in particular a highly indebted US which sucked up the savings of the rest of the world and consumed more than it produced.

Martin Wolf, the FT’s chief economics commentator, has long warned of the threat posed by global imbalances to the world economy. These imbalances include not just the US, but also China’s huge current account surplus conveniently invested in US Treasuries. In late February, Wolf showed he was alert to the scale of the crisis, warning that the US faced “the Mother of all Meltdowns”. He was clear, too, that the banking system would require a bail-out. And he was spot-on with his strictures to policymakers: the crisis could be managed provided the US acted quickly and others followed suit to sustain domestic demand. This holds even truer today.

In hindsight, the late spring and summer were the calm before the September storm. Equity markets recovered ground and oil continued its dizzy rise. Ironically, in the light of the burst of rate cutting which was soon to follow, central bankers were still worried about inflationary pressures. Niall Ferguson, the author, historian and FT contributing editor, struck a more cautionary note in August. He warned that the local squall in the US could easily turn into a global tempest with profound consequences for economic growth.

In a devastating commentary in September, David Blake, an asset manager and former Goldman Sachs analyst, pointed a finger at Alan Greenspan, long feted as the doyen of central bankers and architect of global prosperity during his 18 years at the Federal Reserve. The Greenspan Fed’s policy of low interest rates was not to blame, says Blake, because the US needed low interest rates to avoid a severe recession. “Where Mr Greenspan bears responsibility is his role in ensuring that the era of cheap interest rates created a speculative bubble.”

Blake identifies two fatal lapses in the late 1990s: the failure to prick the dotcom equity bubble and the Fed chairman’s opposition to regulation of over-the-counter derivatives which formed the bulk of counterparty risk in the ensuing explosion of credit. He says: “To create one bubble may be seen as a misfortune; to create two looks like carelessness. Yet that is exactly what the Greenspan Fed did.”

Another warning voice was that of Gillian Tett, the FT’s award-winning capital markets editor. For more than two years Tett, who has a PhD in social anthropology, had pointed to the risks in the sophisticated debt instruments known as credit derivatives. In September 2007, well before the collapse of Lehman Brothers, Tett identified the heart of the problem. “Although it has been taken as self-evident in recent years that the financial system grows stronger if banks spread their credit risks, some are starting to refine that view ... Bankers had become adept at removing credit risk from banks’ balance sheets, either by selling the loan directly to outside investors or, more usually, by turning loans into new securities such as bonds and then selling them on the capital markets.”

In October 2008, Tett assessed the British authorities’ record in the face of the subsequent banking meltdown. After much Bertie Woosterish bumbling, the government announced a £400bn ($538bn) rescue package which recapitalised the banks, drawing lessons from earlier crises in Japan and Sweden. “Finally – albeit belatedly – they have got something right.”

Samuel Brittan, the renowned FT economics commentator, takes a philosophical approach in a discourse on what he describes as competitive capitalism. His conclusion at the end of an elegant discussion of asset markets and their destabilising role in the financial system is typically succinct: “We need to be reminded of the dictum of Keynes that ‘money will not manage itself’. That goes for credit too.”

For a switch in pace, readers should turn to the account of the downfall of Lehman Brothers, the 158-year-old Wall Street investment bank. Written by the FT’s banking teams in New York and London, the narrative examines the last months of Dick “the gorilla” Fuld, the former bond trader who ran the bank for 15 years. It is a story of management hubris and excessive risk-taking and it raises one of the most tantalising questions of the year: were the US authorities, notably Hank Paulson, US Treasury secretary, right to let Lehman go under?

In November, Queen Elizabeth II asked another pressing question about the global financial crisis: “Why did no one see this coming?” Chris Giles, the FT’s economics editor, examines the record of the world’s pre-eminent economists. His conclusion: many saw a piece of the jigsaw but very few practitioners of the dismal science covered themselves in glory.

Two commentaries offer a broader political perspective. Martin Wolf looks ahead to post-crisis construction and the need for a regulatory overhaul as substantial as the 1944 Bretton Woods agreement. Philip Stephens concludes that the crisis has indeed produced the outlines of a new geopolitical order, not necessarily to the advantage of the US.

These are necessarily preliminary conclusions. In the New Year, a new US president, Barack Obama, will have his say. The world – and the FT – will be watching.

An Imaginary 2009

An imaginary retrospective of 2009

By Niall Ferguson


It was the year when people finally gave up trying to predict the year ahead. It was the year when every forecast had to be revised – usually downwards – at least three times. It was the year when the paradox of globalisation was laid bare for all to see, if their eyes weren’t tightly shut.

On the one hand, the increasing integration of markets for commodities, manufactures, labour and capital had led to great gains. As Adam Smith had foreseen in The Wealth of Nations, economic liberalisation had allowed the division of labour and comparative advantage to operate on a global scale. From the 1980s until 2007, the world economy had enjoyed higher, more widespread growth and fewer, less severe crises – hence Federal Reserve governor (and future chairman) Ben Bernanke’s hubristic celebration of a “great moderation” in 2004.

On the other hand, the more the world came to resemble an intricate, multi-nodal network operating at maximum efficiency – with minimal inventories and just-in-time delivery – the more vulnerable it became to a massive systemic crash.

That was the true significance of the Great Repression which began in August 2007 and reached its nadir in 2009. It was clearly not a Great Depression on the scale of the 1930s, when output in the US declined by as much as a third and unemployment reached 25 per cent. Nor was it merely a Big Recession. As output in the developed world continued to decline throughout 2009 – despite the best efforts of central banks and finance ministries – the tag “Great Repression” seemed more and more apt: although this was the worst economic crisis in 70 years, many people remained in deep denial about it.

“We assumed that we economists had learned how to combat this kind of crisis,” admitted one of President Barack Obama’s “dream team” of economic advisers, shortly after his return to academic life in September 2009. “We thought that if the Fed injected enough liquidity into the financial system, we could avoid deflation. We thought if the government ran a big enough deficit, we could end a recession. It turned out we were wrong. So much for [John Maynard] Keynes. So much for [Milton] Friedman.”

The root of the problem remained the US’s property bubble, which continued to deflate throughout the year. Many people had assumed that by the end of 2008 the worst must be over. It was not. Economist Robert Shiller’s real home price index in 2006 had stood at just under 206, nearly double its level just six years earlier. To return to its pre-bubble level, it therefore had to fall by 50 per cent. Barely half that decline had taken place by the end of 2008. So house prices continued to slide in the US. As they did, more and more families found themselves in negative equity, with debts exceeding the value of their homes. In turn, rising foreclosures translated into bigger losses on mortgage-backed securities and yet more red ink on banks’ balance sheets.
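
The arithmetic behind that 50 per cent figure is worth spelling out. A doubled index must halve to get back to where it started; taking a pre-bubble level of roughly 103 (an illustrative figure, since the text says only that the index had nearly doubled):

\[ \text{required fall} = 1 - \frac{103}{206} \approx 50\% \]

And since barely half of that decline had occurred by the end of 2008 (an index of about 154), a further drop of roughly a third from end-2008 levels still lay ahead.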

With total debt above 350 per cent of US gross domestic product, the excesses of the age of leverage proved difficult to purge. Households reined in their consumption. Banks sought to restrict new lending. The recession deepened. Unemployment rose towards 10 per cent, and then higher. The economic downward spiral seemed unstoppable. No matter how hard they saved, Americans simply could not stabilise the ratio of their debts to their disposable incomes. The paradox of thrift meant that rising savings translated into falling consumer demand, which led to rising unemployment, falling incomes and so on, ever downwards.

“Necessity will be the mother of invention,” Obama declared in his inaugural address on January 20. “By investing in innovation, we can restore our faith in American creativity. We need to build new schools, not new shopping malls. We need to produce clean energy, not dirty derivatives.” Commentators agreed that the speech was on a par with Franklin Roosevelt’s on his inauguration in 1933. Yet Roosevelt had spoken after the worst of the Depression was over, Obama in mid-tailspin. The rhetoric flew high. But the markets sank lower. The contagion spread inexorably from subprime to prime mortgages, to commercial real estate, to corporate bonds and back to the financial sector. By the end of June, Standard & Poor’s 500 Index had sunk to 624, its lowest monthly close since January 1996, and about 60 per cent below its October 2007 peak.

The crux of the problem was the fundamental insolvency of the major banks, another reality that policymakers sought to repress. In 2008, the Bank of England had estimated total losses on toxic assets at about $2.8 trillion. Yet total bank writedowns by the end of 2008 were little more than $583bn, while total capital raised was just $435bn. Losses, in other words, were either being massively understated, or they had been incurred outside the banking system. Either way, the system of credit creation had broken down. The banks could not contract their balance sheets because of a host of pre-arranged credit lines, which their clients were now desperately drawing on, while their only source of new capital was the US Treasury, which had to contend with an increasingly sceptical Congress. The other credit-creating institutions – especially the markets for asset-backed securities – were all but paralysed.
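
A rough tally, taking the figures above at face value, shows the scale of the denial:

\[ \$2.8\,\text{trillion estimated losses} - \$0.583\,\text{trillion written down} \approx \$2.2\,\text{trillion unacknowledged} \]

That gap was roughly five times the $435bn in new capital the banks had managed to raise.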

There was uproar when Timothy Geithner, US Treasury secretary, requested an additional $300bn to provide further equity injections for Citigroup, Bank of America and the seven other big banks, just a week after imposing an agonising “mega-merger” on the automobile industry. In Detroit, the Big Three had become just a Big One, on the formation of CGF (Chrysler-General Motors-Ford; inevitably, the press soon re-christened it “Can’t Get Funding”). The banks, by contrast, seemed to enjoy an infinite claim on public funds. Yet no amount of money seemed enough to persuade them to make new loans at lower rates. As one indignant Michigan law-maker put it: “Nobody wants to face the fact that these institutions [the banks] are bust. Not only have they lost all of their capital. If we genuinely marked their assets to market, they would have lost it twice over. The Big Three were never so badly managed as these bankrupt banks.”

In the first quarter, the Fed continued to do everything in its power to avert the slide into deflation. The effective federal funds rate had already hit zero by the end of 2008. In all but name, quantitative easing had begun in November 2008, with large-scale purchases of the debt and mortgage-backed securities of government-sponsored agencies (the renationalised mortgage giants Fannie Mae and Freddie Mac) and the promise of future purchases of government bonds. Yet the expansion of the monetary base was negated by the contraction of broader monetary measures such as M2 (the measurement of money and its “close substitutes”, such as savings deposits, that is a key indicator of inflation). The ailing banks were eating liquidity almost as fast as the Fed could create it. The Fed increasingly resembled a government-owned hedge fund, leveraged at more than 75 to 1, its balance sheet composed of assets everyone else wanted to be rid of.
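
The mechanics Ferguson is describing amount to the textbook money-multiplier identity. A stylised sketch, with illustrative numbers rather than actual Fed data:

\[ M2 = m \times B \]

where \(B\) is the monetary base and \(m\) the multiplier generated by bank lending. If the Fed doubles \(B\) but ailing banks hoard reserves so that \(m\) falls from, say, 9 to 4, then M2 shrinks from \(9B\) to \(8B\) – a contraction of about 11 per cent even as the base explodes, which is why the liquidity injections could not halt the deflationary pressure.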

. . .

The position of the US federal government was scarcely better. By the end of 2008, the total value of loans, investments and guarantees given by the Fed and the Treasury since the beginning of the financial crisis had already reached $7.8 trillion. In the year to November 30 2008, the total federal debt had increased by more than $1.5 trillion. Morgan Stanley estimated that the total federal deficit for the fiscal year 2009 could equal 12.5 per cent of GDP. The figure would have been even higher had President Obama not been persuaded by his chief economic adviser, Lawrence Summers, to postpone his planned healthcare reform and promised spending increases in education, research and foreign aid.

Obama had set out to construct an administration in which his rivals and allies were equally represented. But his rivals were a good deal more experienced than his allies. The result was an administration that talked like Barack Obama but thought like Bill Clinton. The Clinton-era veterans, not least Secretary of State Hillary Clinton, had vivid memories of the bond-market volatility that had plagued them in 1993 (prompting strategist James Carville to say that, if there was such a thing as reincarnation, he wanted to come back as the bond market). Terrified at the swelling size of the deficit, they urged Obama to defer any expenditure that was not specifically targeted on ending the financial crisis.

Yet the world had changed since the early 1990s. Despite the fears of the still-influential former Treasury secretary Robert Rubin, investors around the world were more than happy to buy new issues of US Treasuries, no matter how voluminous. Contrary to conventional wisdom, the quadrupling of the deficit did not lead to falling bond prices and rising yields. Instead, the flight to quality and the deflationary pressures unleashed by the crisis around the world drove long-term yields downwards. They remained at close to 3 per cent all year.

Nor was there a dollar rout, as many had feared. The foreign appetite for the US currency withstood the Fed’s money-printing antics, and the trade-weighted exchange rate actually appreciated during 2009.

Here was the irony at the heart of the crisis. In all kinds of ways, the Great Repression had “Made in America” stamped all over it. Yet its effects were more severe in the rest of the world than in the US. And, as a consequence, the US managed to retain its “safe haven” status. The worse things got in Europe, in Japan and in emerging markets, the more readily investors bought Treasuries and held dollars.

. . .

For the rest of the world, 2009 proved to be an annus horribilis. Japan was plunged back into the deflationary nightmare of the 1990s by yen appreciation and a collapse of consumer confidence. Things were little better in Europe. There had been much anti-American finger-pointing by European leaders in 2008. The French president Nicolas Sarkozy had talked at the G-20 summit in Washington as if he alone could save the world economy. The British prime minister Gordon Brown had sought to give a similar impression, claiming authorship of the policy of bank recapitalisation. The German chancellor Angela Merkel, meanwhile, voiced stern disapproval of the excessively large American deficit.

By the first quarter of 2009, however, the mood in Europe had darkened. It became apparent that the problems of the European banks were just as serious as those of their American counterparts. Indeed, the short-term liabilities of the Belgian, Swiss, British and Italian banks were far larger in relation to those countries’ economies, while the German, French and Danish banks were much more dangerously leveraged. Moreover, in the absence of a Europe-wide finance ministry, all talk of a European stimulus package was just that – mere talk. In practice, fiscal policy became a matter of sauve qui peut, with each European country improvising its own bailout and its own stimulus package. The result was a mess. Currencies outside the euro area were afflicted by severe volatility. Inside the euro area, the volatility was in the bond market, with spreads on Greek and Italian bonds exploding relative to German bunds.

The picture was even worse in most emerging markets. Especially hard hit in eastern Europe were Bulgaria, Romania, Ukraine and Hungary. Of the Brics (Brazil, Russia, India and China), Brazil had the best year, Russia the worst. It was a terrible year for oil and gas exporters, as prices plunged, taking currencies such as the rouble down with them. The Indian stock market, meanwhile, was battered by escalating tensions between New Delhi and Islamabad in the wake of the Mumbai terrorist attacks.

Political instability also struck China, where riots by newly redundant workers in Shenzhen and other export centres provoked a heavy-handed clampdown by the government, but also a renewed effort by the People’s Bank of China to prevent the appreciation of the yuan by buying up yet more hundreds of billions of dollars of US Treasuries. “Chimerica” – the symbiotic relationship between China and America – not only survived the crisis, but gained from it. Although Obama’s decision to attend the first G-2 summit in Beijing in April dismayed some liberals, most recognised that trade trumped Tibet at such a time of economic crisis.

This asymmetric character of the global crisis – the fact that the shocks were even bigger on the periphery than at the epicentre – had its disadvantages for the US, to be sure. Any hope that America could depreciate its way out from under its external debt burden faded as 10-year yields and the dollar held firm. Nor did American manufacturers get a second wind from reviving exports, as they would have done had the dollar sagged. The Fed’s achievement was to keep inflation in positive territory – just. Those who had feared galloping inflation and the end of the dollar as a reserve currency were confounded.

On the other hand, the troubles of the rest of the world meant that in relative terms the US gained, politically as well as economically. Many commentators had warned in 2008 that the financial crisis would be the final nail in the coffin of American credibility around the world. First, neo-conservatism had been discredited in Iraq. Now the “Washington consensus” on free markets had collapsed. Yet this was to overlook two things. The first was that most other economic systems fared even worse than America’s when the crisis struck: the country’s fiercest critics – Russia, Venezuela – fell flattest. The second was the enormous boost to America’s international reputation that followed Obama’s inauguration.

. . .

If proof were needed that the US constitution still worked, here it was. If proof were needed that America had expunged its original sin of racial discrimination, here it was. And if proof were needed that Americans were pragmatists, not ideologues, here it was. It was not that Obama’s New New Deal – announced after the Labor Day purge of the Clintonites – produced an economic miracle. Nobody had expected it to do so. It was more that the federal takeover of the big banks and the conversion of all private mortgage debt into new 50-year Obamabonds signalled an impressive boldness on the part of the new president.

The same was true of Obama’s decision to fly to Tehran in June – a decision that did more than anything else to sour relations with Hillary Clinton, whose supporters never quite recovered from the sight of the former presidential candidate shrouded in a veil. Not that the so-called “opening to Iran” produced a dramatic improvement in the Middle East region. Nobody had expected that either. It was more that, like Richard Nixon’s visit to China in 1972, it symbolised a readiness on Obama’s part to rethink the very fundamentals of American grand strategy. And the downfall of the Iranian president Mahmoud Ahmadinejad – followed soon after by the abandonment of the country’s nuclear weapons programme – was a significant prize in its own right. With their economy prostrate, the pragmatists in Tehran were finally ready to make their peace with “the Great Satan”, in return for desperately needed investment.

Meanwhile, al-Qaeda’s bungled attempt to assassinate Obama – on the eve of Thanksgiving – only served to discredit radical Islamism and to reinforce Obama’s public image as “The One”. Another of the many ironies of 2009 was that the mood of religious reawakening triggered by the economic crisis benefited the Democrats rather than the deeply divided Republicans.

By year end, it was possible for the first time to detect – rather than just to hope for – the beginning of the end of the Great Repression. The downward spiral in America’s real estate market and the banking system had finally been halted by radical steps that the administration had initially hesitated to take. At the same time, the far larger economic problems in the rest of the world had given Obama a unique opportunity to reassert American leadership, particularly in Asia and the Middle East.

The “unipolar moment” was over, no question. But power is a relative concept, as the president pointed out in his last press conference of the year: “They warned us that America was doomed to decline. And we certainly all got poorer this year. But they forgot that if everyone else declined even further, then America would still be out in front. After all, in the land of the blind, the one-eyed man is king.”

And, with a wink, President Barack Obama wished the world a happy new year.

Niall Ferguson is a contributing editor of the FT and the author of ‘The Ascent of Money: A Financial History of the World’ (Penguin)

The (Once So) Great Books

Like many cultural conventions, the canon of great books is one part myth, another part wishful thinking. At once self-limiting and ever expanding, the western literary and philosophical tradition has grown by means both organic and artificial. Classics, after all, were once new; but only posterity decides which works survive to be handed down from generation to generation, and which vanish into obscurity.

Few would deny that the likes of Aristotle, Cervantes and Shakespeare are central figures in the western canon. But what, exactly, do we mean when we speak of literary greatness? The very notion is enshrouded in a kind of hoary mysticism. The Victorian critic Matthew Arnold wrote of “the best that has been thought and known in the world,” but that only takes us so far. There is a cloudy, if universal, agreement – a convenient fiction, really – that such an elevated category exists, but there are not, and never will be, fixed criteria for determining those books that are entitled to the sobriquet “great”.

Greatness may be bestowed by a kind of collective acclaim, in the accretion of hundreds of years of opinion from critics, academics, writers and thinkers. And it is ultimately the authority of cultural elites that forms the boundaries of what we keep in the canon – by reading it, teaching it, writing about it – and what falls by the wayside. Taking this measure – the wisdom of crowds, if you will – one could define the canon of great books in an expansive sense: it includes those works that have, over time, been esteemed as great. This was the approach taken in 2006 by the New York Times, which polled “a couple of hundred prominent writers, critics, editors and other literary sages” in an attempt to crown the best American novel of the last quarter-century. (Toni Morrison’s Beloved was the winner.) In this conception, the canon is a fluid, living thing: its boundaries ebb and flow as new works emerge and older books fall out of favour.

But this descriptive approach strikes a certain kind of mandarin as far too permissive – and there remains always a temptation to prescribe instead: to pin down, once and for all, a definitive and precise list of imperishable works that speak to all ages and eras, monuments of aesthetic accomplishment; not just those books we do still read, but those we should read.

This was precisely the mandate of one of the most famous attempts at canon construction – the making and selling of The Great Books of the Western World, a 54-volume set assembled by a team of scholars determined to fix the canon in stone and deliver it to the masses. Led by Robert Maynard Hutchins, the wunderkind president of the University of Chicago, the project culminated in an encyclopaedic gathering of classic works of philosophy, science, literature and political, aesthetic and social thought, collecting some 443 works by 74 representative figures – Plato and Homer; Milton and Chaucer; John Locke and David Hume among many others.

The 54 volumes of the Great Books landed with a heavy thud in 1952 (a second edition was published in 1990, and is still available from Amazon.com – it’s yours for only $995). Heavily advertised and flogged by an army of door-to-door encyclopaedia salesmen, the sets miraculously achieved sales in excess of one million by the end of the Sixties. Many books are bought but never read, but the Great Books phenomenon remains head-scratchingly quaint, if not downright puzzling. Hutchins and his partners wanted to package high culture for a mass audience, but everything about the series militated against not only commercial success, but the very possibility of reading with any sort of pleasure. Enjoyment, it turns out, was irrelevant: these great books were meant to serve as a dose of cultural medicine.

As Alex Beam writes in his entertaining if superficial history of the project, A Great Idea at the Time, “The Great Books of the Western World were in fact icons of unreadability – 32,000 pages of tiny, double-column, eye-straining type... the translations of the great works were not particularly modern. There were no footnotes to mitigate the reader’s ignorance, or gratify his curiosity.” In contrast to these forbiddingly solemn volumes, Beam’s book is playful and flippant, a light treatment of a quixotic endeavour, but it dodges the larger cultural questions: he gently twits the project, but in the end he offers only the slightest criticism.

From the start, The Great Books of the Western World project was a paradoxical undertaking. Hutchins and his evangelising colleague Mortimer Adler were elitists on a democratic mission, popularisers who made few concessions to popular taste. The texts came unadorned with any historical or biographical materials to situate the reader, but such orientation was beside the point: Hutchins wanted the reader to confront the texts of Aristotle, Dante and Aeschylus naked, as it were, without any guidance or forethought. As he intoned in his preface, “Great books contain their own aids to reading; that is one reason they are great. Since we hold that these works are intelligible to the ordinary man, we see no reason to interpose ourselves or anybody else between the author and the reader.”

Hutchins’s perverse reasoning perfectly encapsulates his particular brand of highbrow populism. Hutchins wanted to reform the modern university curriculum, which he believed to be an anarchic free-for-all – since students picked their own courses – and ground it in a solid foundation of classical learning. In the early 1930s Hutchins, along with Adler, began teaching a select group of University of Chicago undergraduates “General Honors Course 110 – Readings in the Classics of Western European Literature.” They put the Socratic method into action; the class was a rigorous and demanding back-and-forth between teacher and students, who were often cowed into silence.

If there is something slightly authoritarian about Hutchins’s methods, he was driven by his belief that the great books – and their study – were vital to the survival of democratic culture. Great Books courses and discussion groups popped up all over Chicago in the Thirties and Forties – and beyond. (“‘Plato-for-the-Masses’ Drive to Bring Classics to the Public,” ran one newspaper headline.) For Hutchins, here were the seeds of civic renewal in a century wracked by war and totalitarianism. Hutchins’s beliefs were almost touching in their grandiosity: “An interminable liberal education” was necessary for “effective citizenship in a democracy,” he mused. Just as Matthew Arnold felt that culture – “the best that has been thought and known” – “seeks to do away with classes... to make all men live in an atmosphere of sweetness and light”, Hutchins believed the great books could help “revive the great tradition of liberal human thought” and bring about “a world republic of law and justice.”

As with any canon, Hutchins’s and Adler’s had many quirks and idiosyncrasies. According to Adler, who laid out the criterion by which the committee should make its selections, a great book should, by definition, “be important in itself and without reference to any other; that is, it must be seminal and radical in its treatment of basic ideas or problems.” Homer, Plato, Thucydides and other major Greeks were in, as were Montaigne, Rabelais, Shakespeare and Spinoza. But as Hutchins’s team of academics approached the 18th and 19th centuries, there was hardly any unanimity about the selections. There was neither Dickens nor Jane Austen, both of whom are practically bywords for the canon of English literature today. A heated debate erupted over Molière. The poet and critic Mark Van Doren declared “Molière will go out over my bruised body. He is the perfect comedian, the classic comedian, and he is also universally delightful”; Hutchins thought the French playwright was “trash... and nothing else.”

There was an undue representation of dated scientific work, technically daunting treatises that had once been critical milestones in understanding the natural world but were baffling to the modern lay reader. Beam breezes through all this with a slight smirk. He laughs at Hutchins and Adler, but he gives them a pass. The unreadable format of the texts themselves, and Adler’s own risible contribution, the “Synopticon,” an absurd index to 102 “great ideas” that filled 2,248 pages and two entire volumes of the set, are treated as the eccentric pursuits of a pair of wayward but loveable uncles.

Beam is a columnist for the Boston Globe, not a cultural theorist. He correctly locates the Great Books among other middle-class diversions like the Saturday Review of Literature and the Book-of-the-Month Club. Newly affluent Americans wanted the trappings of learning – and the faux-leather volumes of the Great Books of the Western World fit the bill. But beyond this, Beam doesn’t give much consideration to Hutchins’s brand of cultural uplift.

Does establishing a canon of cultural greatness aid the preservation or defence of democracy? It’s not an easy question to answer, and Hutchins – the self-styled guardian of culture – can seem misguided, unaware that there was some irony in a defence of democratic virtues mounted by a tiny committee of elitists. (“I am not saying,” he stated, “that reading and discussing the Great Books will save humanity from itself, but I don’t know anything else that will.”)

For Hutchins’s most trenchant detractor, there was something more sinister at work. In an infamous 1952 New Yorker polemic called “The Book-of-the-Millennium Club,” Dwight Macdonald accused Hutchins of outright betrayal: the books were “a typical expression of the religion of culture that appeals to the American academic mentality. And the claims its creators make are a typical expression of the American advertising psyche.”

For Macdonald, the Great Books of the Western World represented the apotheosis of middlebrow gimmickry: “The way to put over a two-million-dollar cultural project is, it seems, to make it appear as pompous as possible” – a giant put-on that was not only pretentious, but a sinister ploy. “Its aim,” Macdonald thundered, “is hieratic rather than practical – not to make the books accessible to the public... but to fix the canon of the Sacred Texts by printing them in a special edition.” He had even less use for the Synopticon, which he ridiculed as the “Handy Key to Kulture”.

To Macdonald, Adler and Hutchins were not intellectuals; they were hucksters who practised a kind of cultural voodoo. They treated the great books like any consumer product, to be foisted on the masses with false promises of moral improvement: they were not scholars, they were salesmen.

Macdonald was a contrary sort of leftist, but he was no less an elitist, committed in his own way to protecting the integrity of high culture from the contaminating effects of efforts at popularisation. But his objections could not stop the spread of high culture: in the years after the Second World War, programmes like the GI Bill spurred a surge in college enrolment; Americans hankered after the trappings of culture and education – or at least the illusion of them. Hutchins and Adler, for a time, were travelling with the prevailing current. Great Books clubs were not foisted on a supine public, and the eager joiners who attended responded deeply to Hutchins’s message – and didn’t give a darn about Dwight Macdonald.

Today you can still find old sets of The Great Books collecting dust in used bookstores. But the moment Adler and Hutchins embodied has long passed, and there’s little evidence that we’re better off for it.