The human brain gets a lot of press these days, but not all the publicity has been good. Its reviews are reminiscent of Barack Obama’s during the 2008 presidential campaign, when one side said he was a socialist Muslim foreigner and the other thought he was a savior from on high. To its detractors, the brain is a kludge, a hacked-up device beset with bugs, biases and self-deceptions that undermine our decision making and well-being at every turn. To its admirers, it contains vast potential we can all unlock to improve our lives, thanks to “neural plasticity” that enables the adult nervous system to change in more dramatic ways than previously thought. Lately, a growing army of Chicken Littles retorts that this very plasticity has been hijacked by the Internet and other forms of technological crack that are rewiring our brains into a state of continual distraction and intellectual torpor.
The “your brain, warts and more warts” genre is well represented by the new book “Brain Bugs: How the Brain’s Flaws Shape Our Lives,” by Dean Buonomano, a neuroscientist at U.C.L.A. He takes readers on a lively tour of systematic biases and errors in human thinking, citing examples that are staples of psychology courses and other popular books. What is new, however, is Buonomano’s focus on the mechanisms of memory, especially its “associative architecture,” as the main causes of the brain’s bugs. “The human brain stores factual knowledge about the world in a relational manner,” he explains. “That is, an item is stored in relation to other items, and its meaning is derived from the items to which it is associated.”
This is an old idea that is well illustrated by word-association experiments, in which “river” leads to “bank,” which activates “money,” and so on. But a much newer body of research suggests that this “priming” can spread not just from word to word but from one kind of information, a puzzle, say, to an entirely different domain, such as a social interaction, as long as the same concept is invoked in both. Buonomano describes a famous experiment by the psychologist John Bargh and colleagues in which subjects who unscrambled lists of words that yielded sentences about politeness, like “They usually encourage her,” were later more polite toward a lab assistant than were subjects who generated sentences related to rudeness. Other researchers have reported that subjects with full bladders exercised more self-control in a completely unrelated realm (financial decisions) than subjects who had been permitted to relieve themselves first — a finding that earned them this year’s Ig Nobel Prize in medicine, awarded annually to unusual or ridiculous-seeming scientific research.
Buonomano engagingly uses associative memory to explain our susceptibility to advertising, our difficulty connecting events that are separated in time, and even our tendency toward supernatural beliefs. But he should have asked tougher, more critical questions about the studies he presents, rather than accepting them at face value. He also gives little attention to the fact that the human brain is capable of escaping the grip of its associations. With effort we can save money, buy unadvertised products and unlearn bad habits. Surely there is something interesting to say about how we are occasionally able to rise above our “bugs.”
At the other extreme is Cathy N. Davidson, a professor of English and interdisciplinary studies at Duke, where, as a vice provost, she helped found programs on information science and cognitive neuroscience. Her book “Now You See It” celebrates the brain as a lean, mean, adaptive multitasking machine that — with proper care and feeding — can do much more than our hidebound institutions demand of it. The first step is transforming schools, which are out of touch with the radical new realities of the Internet era. “We currently have a national education policy based on a style of learning — the standardized, machine-readable multiple-choice test — that reinforces a type of thinking and form of attention well suited to the industrial worker — a role that increasingly fewer of our kids will ever fill,” she writes. Thanks mainly to the Internet, “their world is different from the one into which we were born, therefore they start shearing and shaping different neural pathways from the outset. We may not even be able to see their unique gifts and efficiencies.”
Davidson’s book is subtitled “How the Brain Science of Attention Will Transform the Way We Live, Work, and Learn,” but there is almost no brain science in the book at all, and attention is invoked mainly as a metaphor. Davidson begins by describing an experiment the psychologist Daniel Simons and I conducted that illustrated what she calls “attention blindness,” the surprising failure to notice salient events when our attention is directed to other events — even ones happening in the same place. In our study, when subjects were told to count the passes made by basketball players in a video, they often completely missed a person in a gorilla suit who walked through the middle of the action and thumped its chest. Davidson is so taken with the phenomenon that she proclaims it “the fundamental structuring principle of the brain.” Inattentional blindness (as it is properly called) is an important and counterintuitive fact about how perception works, but even I don’t think it can carry half as much weight as Davidson loads upon it. And she provides little but anecdotal support for a central argument of the book: that since every individual is bound to miss something, by working together people can cover one another’s blind spots and collectively see the big picture.
Like many authors who embrace new ideas rather than build on what has come before, Davidson sets out to destroy the old beliefs, as if burning down a forest in order to plant new crops. Take intelligence testing: Davidson starts with the mistaken assertion that I.Q. refers to a purely innate cognitive ability, and then says that the “inherited component to I.Q.” is not genetic but “inherited cultural privilege.” Both claims are contradicted by virtually every relevant study ever conducted.
A general aversion to standards runs through Davidson’s thinking. “ ‘Better’ and ‘worse’ don’t make a lot of sense to me,” is how she frames a comparison of old-fashioned “receptive” and modern “interactive” models of learning. By extension, anything that comes from the Internet must ipso facto be worth incorporating into education — hence her proposal that grading be “crowdsourced” to the very students under evaluation. Davidson seems to interpret the harsh criticism she received when she first floated this idea a few years ago as a sign that she must be on to something.
It’s a shame Davidson decided to wrap her ideas in neurobabble, since she has some interesting first-person reports to make on schools and businesses that have adopted innovative practices like complex simulation and strategy games. In one classroom she visits, the students organize themselves into teams and compete to design and build bridges out of Popsicle sticks, developing management plans, doing experiments on structural strength and making financial decisions along the way. Indeed, Davidson is such a good storyteller, and her characters are so well drawn, it’s easy to overlook the lack of hard evidence in favor of the intriguing ideas she advocates.
Davidson correctly notes that there is no data showing that the Internet hurts our brains, though she errs in treating this absence of evidence as evidence of absence. She also thinks multitasking has gotten a bad rap. Switching rapidly from one task to another actually helps us see connections between ideas and be more creative than we would if we held ourselves to a regimen of completing one task before we start another, she suggests. “Mind-wandering,” she writes, “might turn out to be exactly what we need to encourage more of in order to accomplish the best work in a global, multimedia digital age.” But this speculation is up against facts Davidson omits: the results of experiments showing that for all but perhaps an elite 2 to 3 percent of subjects, doing things in sequence leads to better performance than trying to do them simultaneously.
Fortunately, there is an area of neuroscience that does tell us something useful about the seductions of multitasking and the Internet age. The first project I worked on as a new postdoctoral researcher at Massachusetts General Hospital involved recording activity in the brains of college students while they looked at photographs of male and female faces. The faces were of two types: “beautiful” and “average,” belonging to professional models or ordinary people. The group I had joined was responsible for an earlier study in which cocaine was given intravenously to addicts while their brains were being scanned. The results of these two experiments were very similar: activity in a specific network of brain regions increased when men looked at female models and when cocaine addicts received a dose.
In his book “The Compass of Pleasure,” the Johns Hopkins neurobiologist David J. Linden explicates the workings of these regions, known collectively as the reward system, elegantly drawing on sources ranging from personal experience to studies of brain activity to experiments with molecules and genes. Linden builds a powerful case that every kind of substance, activity or stimulus that motivates human choice does so because it acts on this particular network, whose neurons use the chemical dopamine to communicate with one another. Cocaine produces a high by preventing dopamine from being recycled, thereby prolonging its action and keeping these neurons firing. Beauty, money and a photo of a juicy cheeseburger are compelling because they also increase the activity of these neurons. This may not seem surprising, but before the relevant experiments were done, it was far from clear that an “abstract” reward like money, or a mere reminder of one like a picture, operated via the same mechanism as “primary reinforcers” like food and water.
But the biggest surprise, and the one most relevant to current debates, is a “revolutionary” experiment Linden discusses near the end of his book. Researchers at the National Institutes of Health gave thirsty monkeys the option of looking at either of two visual symbols. No matter which they moved their eyes to, a few seconds later the monkeys would receive a random amount of water. But looking at one of the symbols caused the animals to receive an extra cue that indicated how big the reward would be. The monkeys learned to prefer that symbol, which differed from the other only by providing a tiny amount of information they did not already have. And the same dopamine neurons that initially fired only in anticipation of water quickly learned to fire as soon as the information-providing symbol became visible. “The monkeys (and presumably humans as well) are getting a pleasure buzz from the information itself,” Linden writes.
If this discovery proves reliable, it implies that the Internet doesn’t change our brains at all, for good or for ill. It doesn’t damage brain areas, destroy links between parts of our brains, or grow new areas or connections. What the Internet does is stimulate our reward systems over and over with tiny bursts of information (tweets, status updates, e-mails) that act like primary rewards but can be delivered in more varied and less predictable sequences. These are experiences our brains did not evolve to prefer, but like drugs of abuse, they happen to be even better suited than the primary reinforcers to activating the reward system. So if you find yourself stopping every 30 seconds to check your Twitter feed, your brain has no more been rewired than if you find yourself taking a break for ice cream rather than celery. Picking the more rewarding stimulus is something our brains can do perfectly well with the wiring they start out with.
So what’s the right way to think about the brain? Like a piece of software stuck in permanent beta, it has its share of bugs, but its plasticity allows for frequent updates. And it somehow enables cognitive feats so remarkable they often go unnoticed. Beginning to understand its own limitations is only one of them.