The following text is drawn from the January 16, 2006, issue of Time magazine.
Mind & Body
How to Tune Up Your Brain
In a special report, TIME explores the latest research on how to stay mentally sharp. In a complex world, it's news we all need
By CLAUDIA WALLIS
We
live in an age that's gushing with information and dizzying possibilities. You
can almost feel your brain cells crackling to keep up with the choices—trivial
and profound—that confront us at every turn: picking a cell-phone plan or an
on-demand movie, selecting the best mix of investments in a 401(k) or the right
health plan or just knowing which eggs to buy at the supermarket. (Cage free?
Organic? Omega-3 enriched?) Surely there has never been a greater need to stay
alert and informed, to act shrewdly and remain focused.
Luckily,
we also live in an era in which research is showing us how to nurture and
maintain our mental faculties—from infancy through the golden years—and how to
deter what was once seen as an inevitable decline. Some new findings confirm
what we have always suspected: Grandma was right—fish really is brain food; a
steaming cup of joe actually does turbocharge our mental acuity; getting less
than eight hours of sleep seriously compromises our ability to concentrate and
solve problems.
But
some findings are unexpected, even counterintuitive. Creativity, for instance,
rarely strikes in a flash but more typically results from steady cogitation.
Multitasking, for all its seeming efficiency, can exact a heavy toll on the
quality of our output. Daily meditation physically transforms the cerebral
cortex. Physical exercise may be as important as mental gymnastics in keeping
Alzheimer's disease at bay. Baby Einstein-type videos make a poor substitute for
human interaction in stimulating a tender young mind. And perhaps most
unexpected and comforting of all, recent research confirms that the human brain retains
an astonishing degree of plasticity and capacity for learning throughout life.
In some respects, our mental performance, despite a few glitches with short-term
memory, doesn't peak until midlife, when the white matter in the loftiest parts
of the brain is thickest.
In
the following pages, TIME offers a wealth of new discoveries and practical
information to guide you in the care and maintenance of your mental faculties.
Pay careful attention. Don't get distracted. But don't stress too much either;
relaxation is a balm for the overtaxed brain.
In pursuit of truth and a tidy desk, a TIME senior editor spends a week on a mind-altering stimulant
By BELINDA LUSCOMBE
It
wasn't until I was in the limo being whisked to the studio that it occurred to
me that it was probably a bad idea to go on live TV under the influence of
mind-altering drugs. When I got the call asking me to talk about a TIME story on
that night's late news, I was playing host to a 5-year-old's birthday party,
with attendees who included Ella, Ella, Stella, Ale (pronounced Ellay), Elee
(same) and Belle, the princess I had rented. I don't know if my suggestibility
was caused by a surfeit of medication—in this case, Ritalin—or of liquid
consonants, but I agreed to do it.
The
Ritalin was supposed to make me sharper and prevent this kind of distraction. My
pharmacological experience is rather shallow—I had a nasty SweeTarts addiction a
while back that scared me off the harder stuff—but I knew that millions of kids
with attention-deficit problems were on methylphenidates, as Ritalin and its
cousins are known. I too have attention problems. I too am still maturing. Why
shouldn't it work for me?
Lots
of adults have started taking Ritalin, hoping it will give them mastery over
their oversubscribed lives. Could I get my work done more efficiently? Could I
make decisions more quickly? Could I just maybe tidy my desk? I wanted to see
what it would be like to have focus, clarity, direction. So I found a friendly
psychiatrist, whom I'm going to call Mark although that's only half his name.
After giving me a long lecture on the risks of taking it, most of which I was
too busy answering e-mail to hear, he sent me a
prescription.
The
TV appearance was at the end of my very first day on the big R, as we users call
it. I have no clear memory of what I said, probably because I woke up three
times that night, with a start, as if someone had hit me with a wet sock. At
3:10 a.m., I remembered I owed my mother money. At 4:12 a.m., I felt guilty
about something I said to my son. At 5:14 a.m., I deeply regretted a headline I
had written. Dr. Mark hadn't warned me about this: Ritalin is basically a drug
that wakes you up to remind you of what a loser you are.
At
work the next day, however, the TV bookers showered me with praise. The word
adorable was used several times. The word funny was used. The words better than
Diane Sawyer were not used, but I got the picture. Ritalin plus Ernie Anastos (a
local-TV newscaster). It's a winning combo.
But
it wasn't all sweetness and klieg lights. I was always thirsty. I was often
hungry. When I walked down the street I would involuntarily clench and unclench
my fists, as if I were the Thing. I woke in the wee hours so often I no longer
bothered to wake my husband to tell him about it. And Ritalin made my toes hurt.
O.K., technically it was the wall I was kicking that made my toes hurt. I was
trying to get my kids downstairs to school, and they were moving with the speed
of treacle on asphalt. This is their standard speed, but I don't usually take my
frustrations out on the wall. My children, being New York City kids, simply
shrugged. They've seen worse.
In
fact, being on Ritalin was like landing in Manhattan and assimilating in
fast-forward. First you feel confusion, then a little exhilaration and then,
after a few days and a few more milligrams than is recommended, all-out
aggression. As I walked down the streets, I didn't even see the tourists. I just
saw the line I had to pick through them to get where I had to go. I stepped out
in front of cars that were shooting through the lights, threw myself onto
subways and cursed gratuitously. I had to apologize to one poor lunch companion,
a journalist from out of town who wanted advice on working in the city and whose
chances of success I outlined a little too graphically. I told him I had just
started taking Ritalin. He told me he took it instead of a disco nap to go
clubbing. Wait. We're putting what percentage of the nation's kids on this
drug?
But
if I was becoming a New Yorker squared on Ritalin, I was doing it without any
big-city jadedness or ennui. Nothing seemed too hard. All my deadlines were
invigorating, and all the work I had to get done to meet them lay like a
playground before me. It was going to be a hoot. I didn't get the work done any
faster, but I never felt intimidated or overwhelmed by it.
On
the other hand, I didn't get it done any better either. I think I might have
done it worse. There was an engine driving me and no moment of rest. Watching TV
was almost impossible. I couldn't sit still, could not even derive pleasure from
our household's favorite pastime, mocking David Caruso's cadences on CSI: Miami:
"Where's [long pause] your vault?" I kept wishing Deadwood were on. Now that's a
Ritalin-friendly show.
I
definitely got more done, but it was at the cost of those moments when, while
doing nothing, you have a great idea or find a solution or arrive at the perfect
headline. I had no great ideas on Ritalin. I had some really bad ones, like
chasing the pills with two vodka gimlets—my teeth felt itchy for hours—but it
was all movement, no color. My life became like a bad soccer game in which there
were lots of goals but no thrilling play on the field.
By
the way, that's the kind of incisive sports analysis that lands you on TV. And
I'm keeping a few extra pills handy, just in case.
Can You Prevent Alzheimer's Disease?
The latest research suggests that exercising your brain—and your heart—may help
By CHRISTINE GORMAN
Laura
Pizzuto, 78, of Seattle admits she loses her words every now and then. An avid
gardener, she will sometimes forget the name of a familiar plant. "But I know
how to look things up," she says. "Or I can go to the library or call a friend."
Occasional memory lapses are not going to slow down this professional artist. "I
want to keep myself going so I can work and enjoy my grandchildren," she says.
To
that end, Pizzuto is doing everything she can to keep her brain, as well as the
rest of her body, in top shape. The odds are decidedly in her favor. For one
thing, she's blessed with good genes. But she also finds fulfillment in her
painting, is active in her community, eats lots of vegetables and exercises
regularly. According to the latest research on aging, those are exactly the
sorts of things we all should be doing to help maintain our ability to remember,
reason, make decisions and learn.
There
are even tantalizing hints that those healthful habits may also prevent or delay
Alzheimer's disease and other forms of dementia—although that conclusion is
controversial. "I would phrase it differently," says Marilyn Albert, director of
the division of cognitive neuroscience at Johns Hopkins University. "What the
studies have done is to take people who are middle-aged and elderly and look at
what maintains good brain health."
No
one is suggesting that a crossword puzzle a day will keep senility at bay or
that somehow it's your fault if your mental capacity fails. But given how
quickly the average age of Americans is rising and how much the risk of dementia
leaps with advancing years, finding anything that delays cognitive decline even
a little would be of enormous value.
No
wonder research looking for links between lifestyle and a healthier brain has
been booming in recent years. Later this month the journal Alzheimer's &
Dementia will publish a long-awaited report prepared for the National Institutes
of Health that summarizes what scientists know and don't know about improving
cognitive and emotional health in the elderly. And the fourth major study on the
role of exercise will be published in the Annals of Internal Medicine by the
Center for Health Studies in Seattle (Pizzuto is one of the 1,740
participants).
Along
the way, neurologists have discovered that the brain is much more adaptable as
it ages than they realized. They have determined that the so-called plasticity
of the brain, which allows the formation of new neurons as well as new
connections between those neurons, can last a lifetime. "As far as our brains
are concerned, learning something new or even retrieving something from memory
is a plasticity response," says Molly Wagster of the National Institute on
Aging. It may get harder as you age, but if you can teach an old brain new
tricks, you might, just might, also be able to keep it functioning well into the
90s.
One
of the top ways to take care of your mind, it turns out, is to make sure your
heart is performing at its best. And there's nothing like physical activity to
promote cardiac fitness. For some people, that will mean participating in an
aerobics class three or more times a week. For others, walking as fast as they
can half an hour a day most days of the week will do the trick. In fact, all
other things being equal, people who engage in a wide variety of physical
activities—like walking and biking and dancing and swimming—seem to be better
protected against cognitive decline than those who don't.
The
research linking heart and brain health is so strong that as you continue
reading this article, you may get the feeling that you've stumbled into a story
about how to prevent cardiac disease. But if fear of a heart attack isn't enough
to get you to pamper your ticker, fear of senility just might. So think about
doing your heart and your head a favor. If you smoke, quit. Get your cholesterol
levels and blood pressure checked, and if they are high, get them treated. If
you have diabetes, do everything you can to keep it under control. Eat at least
five servings of fruits and vegetables a day, consume fish once or twice a week
and cut down on the amount of trans and saturated fat in your diet. The effects
appear to be cumulative. A study published in August found that folks with three
or more major cardiovascular risk factors—for example, hypertension, diabetes
and current smoking—were more likely to develop Alzheimer's disease as
well.
Why
is cardiovascular fitness so important to cognitive health? Researchers used to
think it was all about making sure that plenty of oxygen-rich blood made it to
the brain. Now they are starting to suspect there may be more to it. In
laboratory animals, at least, exercise also seems to stimulate the body's
production of certain molecules called growth factors, which help nerves stay
healthy and keep functioning. "We don't understand a lot about why this
happens," says Arthur Kramer, a researcher at the University of Illinois who
uses brain scans to study the effects of exercise. "But we're learning more
about that."
A
healthy cardiovascular system may even, to some extent, compensate for tiny
defects in the brain. Doctors have long known that suffering one or more
strokes, which interrupt the flow of blood to the brain, increases the
likelihood of dementia. They assumed that Alzheimer's disease was a completely
unrelated problem. In fact, a long-running study of a group of nuns who agreed
to donate their brains when they died has found that isn't necessarily the case.
About a third of the nuns whose brains at autopsy showed clear signs of the
plaques and tangles associated with Alzheimer's disease had exhibited normal
memory and cognitive function until the day they died. The difference: the blood
vessels in their brains were in great shape.
That
doesn't mean those women wouldn't eventually have developed dementia had they
lived long enough. But the study suggested to a lot of physicians that good
vascular health may make it easier for a brain with incipient Alzheimer's to
work around the plaques and tangles in its midst.
Now
that you've got your body running along smoothly, are there any mental
gymnastics you can do to keep dementia at bay? The evidence is provocative but
not terribly compelling. There's no question that you can improve your ability
to remember names or other bits of information by practicing memory tasks, just
as practice will help you learn a new instrument or another language. A number
of researchers have proposed that a lifetime of such efforts could allow you to
build up a healthy cognitive reserve to offset the declines of old age, though
the idea remains theoretical.
Several
studies have found that folks who regularly engage in mentally challenging
activities—like reading, doing crossword puzzles or playing chess—seem less
likely to develop dementia later in life. The difficulty comes in figuring out
whether their good fortune is a direct result of their leisure activities or
whether their continuing pursuit of those pleasures merely reflects good genes
for cognitive function.
A
20-year survey of 469 elderly people living in the Bronx, N.Y., tried to get to
the bottom of this chicken-or-egg question by following subjects who had no
signs of dementia in the first seven years of the study. The results, which were
published in 2003, showed that reading and playing board games or a musical
instrument was associated with a decreased risk of Alzheimer's disease or other
forms of dementia. Intriguingly, those with the strongest habits demonstrated
the greatest benefits. Participants who solved crossword puzzles four days a
week, for instance, had a 47% lower risk of dementia than those who did the
puzzles once a week.
By
the same token, several studies have suggested that older folks who are socially
active—who, for example, do volunteer work or attend religious services—have a
reduced risk of dementia. There are, of course, plenty of caveats that go along
with those observations, including the same old chicken-or-egg problem that
haunts all observational studies: In this case, is withdrawal from society a
cause or result of Alzheimer's disease?
So
where does this leave us? "I use a thermostat analogy with my patients," says
Dr. Laurel Coleman, a geriatrician who sits on the board of the Alzheimer's
Association. "Let's say you're dialed in to get Alzheimer's disease at 82. You
may be able to push that back until maybe you're 92." Depending on where their
personal thermostat is set, some people will do everything right and still
develop dementia in their 50s. Others will do everything wrong and be perfectly
lucid at 101. Most of the rest of us will fall somewhere between those two
extremes. For now, at least, preventing dementia is still a numbers game, but
one in which we're starting to grasp the variables.
Achieving peak performance depends on controlling the mind that controls your body
By ALICE PARK
Elite
athletes talk a lot about being in the zone, that magical place where mind and
body work in perfect synch and movements seem to flow without conscious effort.
Major-league pitchers, NBA stars, pro golfers and Olympic hopefuls dedicate
their careers to the search for this elusive feeling, devoting hours of training
to "listening" to their body and "reading" their muscles—trying to construct a
bridge between mind and body sturdy enough to lead them straight to athletic
nirvana.
But
the truly great athletes, those with long careers and performances that fans
talk about for generations, know that maintaining a competitive edge is less
about keeping it honed to perfection at all times than realizing they can lose
the edge every once in a while and still get it back.
Few
athletes know that better than Olympic sprinter Michael Johnson, one of the
fastest men on earth. Johnson holds two individual world records in track and
five Olympic gold medals. He was the first sprinter to win both the 200-m and
400-m events in a single Olympic Games. He has also had his share of
disappointments. He contracted food poisoning a month before the 1992 Games and
didn't make it past the early heats in the event he was favored to win. And just
before the 2000 Olympics, he injured his quadriceps and failed to qualify for
the 200-m race.
Setbacks
like those would be enough to put most athletes off their game. But Johnson
found a way to push them behind him. "If you have a disappointment," he says,
"you need to ask yourself 'Why did I not perform well today?'" Was it the
preparation? A mistake in execution? "Then you need to get yourself at peace
with that situation," he says.
According
to Johnson, achieving that peace is the key to avoiding a full-fledged slump. A
slump—that downward spiral that only gets worse the harder you try—is familiar
to even amateur athletes. For golfers, it can start with the yips, an
uncontrollable twitch of the arm or an involuntary snap of the wrist at just the
wrong moment. For a pitcher, it's the strike zone over home plate that suddenly
begins to jump around. For the basketball player, it's the hoop that has
inexplicably shrunk.
Athletes
in the throes of a slump will swear that it came all of a sudden, out of
nowhere. But psychologists say the episodes are less mysterious than they seem.
They usually stem from a failure to prepare mentally for the pressure of
athletic competition. "Training is about strengthening the mind-body
connection," says Kirsten Peterson, sports psychologist for the U.S. Olympic
Committee. "Athletes need to train their mind with the same discipline that they
train their bodies."
The
mind-body connection in sports is not some New Age construct. Thoughts have
direct and powerful connections to all sorts of physiological functions. Think
hard enough about jumping out of an airplane, and your heart will start to race
and your palms to sweat. Other thought-induced changes may be more subtle, and
for athletes who rely on fine motor skills, those imperceptible adjustments can
mean the difference between a strikeout and a home run.
At
the root of most slumps is a perceived decline in performance. Athletes tend to
define themselves by their results, and any dip in their stats can make them
start to think they are not as good as they used to be or as good as they
thought they were. In some cases, they may not be slipping at all; their
opponents may just be getting better. Or the decline may be a matter of
perspective; after all, no one can perform at peak levels 100% of the time.
Over-training and bringing the muscles to the brink of fatigue can lead to a
physical plateau, after which the body just can't run any faster or swing any
harder.
What
elevates any of those scenarios from an ordinary off day to a prolonged slump is
the way the athlete interprets the dip. "It has less to do with what is
contributing to the decrease in performance and more to do with how you react
and adjust to the decline," says Jonathan Katz, a psychologist at Columbia
University's College of Physicians and Surgeons.
Much
of the action takes place without the athlete's even being aware that it's
occurring. After years of practice, hitting a baseball or shooting a basket
becomes almost second nature to a professional athlete. So it's easy to think
the skill resides in muscle memory. But even those rote actions involve a
tremendous amount of mental processing; they are just happening too fast for the
athlete to realize they are going on. "It's not the conscious kind of
processing, the kind where you're thinking about how to control your body," says
Jeff Simons, a sports psychologist at California State University, East Bay.
"Our conscious brain cannot keep up with the speed of information processing
necessary to perform a high-level skill."
Any
learned sports skill begins in the thinking part of the brain, with nerves in
the prefrontal cortex. As those neurons get excited, they activate nerve cells
connected to the limbic system just under the cerebrum of the brain, the area
associated with emotions such as fear, anxiety, elation and satisfaction. That
area is tied in turn to the motor cortex, which controls the
muscles.
If
the feedback loop is dominated by fear—fear of failure, fear of disappointing
teammates, fear of being unworthy—the circuit starts to resemble the classic
fight-or-flight response. In the perform-or-perish version, anxious thoughts
trigger the release of adrenaline, the hormone that sets the heart racing,
primes the muscles to run and puts all the senses on alert. The eyes slip into
tunnel vision—the last thing a quarterback needs when he's relying on
peripheral perception to spot a waiting receiver.
One
way experts help athletes control the jitters is by teaching them to take
command of the interior monologue that psychologists call self-talk. This is the
endless conversation that we all have with ourselves, processing events as they
pass before our eyes. The average person speaks to himself at a rate of 300 to
1,000 words a minute. According to Trevor Moawad, director of mental
conditioning for IMG Academies, a leading sports-training facility, that means
that for a tennis player competing in a typical 2-hr. match, only about 40 min.
are spent on the court contesting points, leaving an hour and 20 min. between
points with little to do but talk to oneself. Positive chatter can help the
athlete stay focused, but if the conversation strays into fears of failing, then
the self-talk can become counterproductive.
"You
can't stop those negative thoughts from coming," says Michael Johnson,
"especially when you enter an arena or when you see your competitors walk by.
The only way to stop those thoughts is to replace them with something else." For
Johnson, the substitute images and words were all about the race ahead. "If
you're going to replace them, you might as well replace them with something
that's going to help you," he says. He liked to visualize the upcoming race,
concentrating on the start, the weakest part of his race, and thinking about
himself shooting off the blocks like a bullet.
Aynsley
Smith, director of the sports-medicine research center at the Mayo Clinic, gives
her athletes a more tangible system of thought swapping. "I tell them that
self-talk exists on three channels: positive, negative and escape. You try to be
on the positive channel as much as you can while you're training or competing,
but when the negative thoughts start coming, it's the speed of the transition
that counts. I give them a clicker pen and tell them to just click over from the
negative to the positive channel." If the anxiety doesn't go away, says Smith,
then it's time to switch to the escape channel. That's for thoughts about how
the athlete's role model would react. How would Joe DiMaggio get over the
disappointment? What would the Babe do?
Smith,
who works with ice-hockey players, finds that biofeedback techniques are
particularly effective for controlling jitters. Most athletes are skilled at
visual imagery, and when shown monitors that display their anxiety levels as a
graph or chart, they quickly learn to corral their nervousness and keep it from
interfering with the smooth flow of their practiced skills. "I tell people they
need to try to get back to doing rather than thinking," says
Simons.
Relaxation
techniques like deep breathing are also good for helping athletes quiet the
mental chatter long enough for their bodies to perform. "You have to help them
realize that 'I have to get out of my own way,'" says Simons. "Relaxing can help
them imagine competing, getting in their own groove, feeling it, tasting it,
reminding them of that feeling of flow."
For
Michael Johnson, who competed in three Olympic Games over a span of a dozen
years, avoiding a slump was mostly a matter of staying in control. "The first
thing an athlete has to realize is that you are always in control," he says.
"And you need to maintain that control." Control, that is, of both the body and
the mind.
Innovation requires no special thought processes, says an expert. Creative people just work harder at it
What
is creativity? Where does it come from? The workings of the creative mind have
been subjected to intense scrutiny over the past 25 years by an army of
researchers in psychology, sociology, anthropology and neuroscience. But no one
has a better overview of this mysterious mental process than Washington
University psychologist R. Keith Sawyer, author of the new book Explaining
Creativity: The Science of Human Innovation (Oxford; 336 pages). He's working on
a version for the lay reader, due out in 2007 from Basic Books. In an interview
with Francine Russo, Sawyer shares some of his findings and suggests ways in
which we can enhance our creativity not just in art, science or business but in
everyday life.
Q.
Has the new wave of research upended any of our popular notions about
creativity?
A.
Virtually all of them. Many people believe creativity comes in a sudden moment
of insight and that this "magical" burst of an idea is a different mental
process from our everyday thinking. But extensive research has shown that when
you're creative, your brain is using the same mental building blocks you use
every day—like when you figure out a way around a traffic
jam.
Q.
Then how do you explain the "aha!" moment we've all had in the shower or the
gym—or anywhere but at work?
A.
In creativity research, we refer to the three Bs—for the bathtub, the bed and
the bus—places where ideas have famously and suddenly emerged. When we take time
off from working on a problem, we change what we're doing and our context, and
that can activate different areas of our brain. If the answer wasn't in the part
of the brain we were using, it might be in another. If we're lucky, in the next
context we may hear or see something that relates—distantly—to the problem that
we had temporarily put aside.
Q.
Can you give us an example of that?
A.
In 1990 a team of NASA scientists was trying to fix the distorted lenses in the
Hubble telescope, which was already in orbit. An expert in optics suggested that
tiny inversely distorted mirrors could correct the images, but nobody could
figure out how to fit them into the hard-to-reach space inside. Then engineer
Jim Crocker, taking a shower in a German hotel, noticed the European-style
showerhead mounted on adjustable rods. He realized the Hubble's little mirrors
could be extended into the telescope by mounting them on similar folding arms.
And this flash was the key to fixing the problem.
Q.
How have researchers studied this creative flash?
A.
By using many cleverly designed experiments. Some psychologists set up video
cameras to watch creative people work, asking them to describe their thought
processes out loud or interrupting them frequently to ask how close they were to
a solution. Invariably, they were closer than they realized. In other
experiments, subjects worked on problems that, when solved, tend to result in
the sensation of sudden insight. In one experiment, they were asked to look at
words that came up one at a time on a computer screen and to think of the one
word that was associated with all of them. After each word—red, nut, bowl, loom,
cup, basket, jelly, fresh, cocktail, candy, pie, baking, salad, tree, fly,
etc.—they had to give their best guess. Although many swore they had no idea
until a sudden burst of insight at about the 12th word, their guesses got
progressively closer to the solution: fruit. Even when an idea seems sudden, our
minds have actually been working on it all along.
Q.
Has brain imaging illuminated the creative process?
A.
The first such study was done this year but was inconclusive. In the next five
to 10 years, cognitive neuroscience will be able to tell us
more.
Q.
What has been learned from historical research?
A.
Studying notebooks, manuscripts and historical records, we've dissected the
creative process of people like the Wright brothers, Charles Darwin, T.S. Eliot,
Jackson Pollock, even business innovators like Citigroup's John Reed. We find
that creativity happens not with one brilliant flash but in a chain reaction of
many tiny sparks while executing an idea.
Q.
But isn't it the original creative flash that's critical?
A.
Not at all. Take the first airplane. On Dec. 8, 1903, Samuel Pierpont Langley, a
leading government-funded scientist, launched his flying machine with much fanfare
on the Potomac. It plummeted into the river. Nine days later, Orville
and Wilbur Wright got the first plane off the ground. Why did these bicycle
mechanics succeed when a famous scientist failed? Because Langley hired other
people to execute his concept. Studying the Wrights' diaries, you see that
insight and execution are inextricably woven together. Over years, as they
solved problems like wing shape and wing warping, each adjustment involved a
small spark of insight that led to others.
Q.
Are there other generalizations you can make about creative
people?
A.
Yes. They have tons of ideas, many of them bad. The trick is to evaluate them
and mercilessly purge the bad ones. But even bad ideas can be useful. Darwin's
notebooks, for example, show us that he went down many dead ends—like his theory
of monads. These were tiny hypothetical life forms that sprang spontaneously
from inanimate matter. If they died, they took with them all the species into
which they had evolved. Darwin spent years refining this bizarre theory before
ultimately rejecting it. But it was a critical link in the chain that led to his
branching model of evolution. Sometimes you don't know which sparks are
important until later, but the more ideas you have, the
better.
Q.
So how can the average person get more ideas?
A.
Ah, here's where we come up against another of our cultural myths about
creativity—that of the lone genius. Ideas don't magically appear in a genius'
head from nowhere. They always build on what came before. And collaboration is
key. Look at what others in your field are doing. Brainstorm with people in
different fields. Research and anecdotal evidence suggest that distant analogies
lead to new ideas—like when a heart surgeon bounces things off an architect or a
graphic designer.
Q.
Can we become more creative by studying more than one
field?
A.
No one can be creative at everything. You have to work hard in your area, let's
say music, and learn everything that's already been done. But multitasking on
several music projects at once might foster unexpected connections and new
ideas.
Q.
Are great artists different from inventors and scientists?
A.
All the research shows that the creative process is basically the same:
generating ideas, evaluating them and executing them, with many creative sparks
over time. The role of collaboration may be more obvious in business than in
writing, but even apparently solitary creators like writers read constantly and
talk to one another. In the 1920s and 1930s, for example, J.R.R. Tolkien and
C.S. Lewis batted around religious and literary ideas with the Inklings, a group
of unfashionably Christian professors who met weekly at an Oxford
pub.
Q.
What advice can you give us nongeniuses to help us be more
creative?
A.
Take risks, and expect to make lots of mistakes, because creativity is a numbers
game. Work hard, and take frequent breaks, but stay with it over time. Do what
you love, because creative breakthroughs take years of hard work. Develop a
network of colleagues, and schedule time for freewheeling, unstructured
discussions. Most of all, forget those romantic myths that creativity is all
about being artsy and gifted and not about hard work. They discourage us because
we're waiting for that one full-blown moment of inspiration. And while we're
waiting, we may never start working on what we might someday
create.
Scientists find that meditation not only reduces stress but also reshapes the brain
By LISA TAKEUCHI CULLEN
At
4:30, when most of Wall Street is winding down, Walter Zimmermann begins a
high-stakes, high-wire act conducted live before a paying audience. About 200
institutional investors—including airlines and oil companies—shell out up to
$3,000 a month to catch his daily webcast on the volatile energy markets, a
performance that can move hundreds of millions of dollars. "I'm not paid to be
wrong—I can tell you that," Zimmermann says. But as he clicks through dozens of
screens and graphics on three computers, he's the picture of focused calm.
Zimmermann, 54, watched most of his peers in energy futures burn out long ago.
He attributes his brain's enduring sharpness not to an intravenous espresso drip
but to 40 minutes of meditation each morning and evening. The practice, he says,
helps him maintain the clarity he needs for quick, insightful analysis—even
approaching happy hour. "Meditation," he says, "is my secret weapon."
Everyone
around the water cooler knows that meditation reduces stress. But with the aid
of advanced brain-scanning technology, researchers are beginning to show that
meditation directly affects the function and structure of the brain, changing it
in ways that appear to increase attention span, sharpen focus and improve
memory.
One
recent study found evidence that the daily practice of meditation thickened the
parts of the brain's cerebral cortex responsible for decision making, attention
and memory. Sara Lazar, a research scientist at Massachusetts General Hospital,
presented preliminary results last November that showed that the gray matter of
20 men and women who meditated for just 40 minutes a day was thicker than that
of people who did not. Unlike in previous studies focusing on Buddhist monks,
the subjects were Boston-area workers practicing a Western style of meditation
called mindfulness or insight meditation. "We showed for the first time that you
don't have to do it all day for similar results," says Lazar. What's more, her
research suggests that meditation may slow the natural thinning of that section
of the cortex that occurs with age.
The
forms of meditation Lazar and other scientists are studying involve focusing on
an image or sound or on one's breathing. Though deceptively simple, the practice
seems to exercise the parts of the brain that help us pay attention. "Attention
is the key to learning, and meditation helps you voluntarily regulate it," says
Richard Davidson, director of the Laboratory for Affective Neuroscience at the
University of Wisconsin. Since 1992, he has collaborated with the Dalai Lama to
study the brains of Tibetan monks, whom he calls "the Olympic athletes of
meditation." Using caps with electrical sensors placed on the monks' heads,
Davidson has picked up unusually powerful gamma waves that are better
synchronized in the Tibetans than they are in novice meditators. Studies have
linked this gamma-wave synchrony to increased awareness.
Many
people who meditate claim the practice restores their energy, allowing them to
perform better at tasks that require attention and concentration. If so,
wouldn't a midday nap work just as well? No, says Bruce O'Hara, associate
professor of biology at the University of Kentucky. In a study to be published
this year, he had college students either meditate, sleep or watch TV. Then he
tested them for what psychologists call psychomotor vigilance, asking them to
hit a button when a light flashed on a screen. Those who had been taught to
meditate performed 10% better—"a huge jump, statistically speaking," says
O'Hara. Those who snoozed did significantly worse. "What it means," O'Hara
theorizes, "is that meditation may restore synapses, much like sleep but without
the initial grogginess."
Not
surprisingly, given those results, a growing number of corporations—including
Deutsche Bank, Google and Hughes Aircraft—offer meditation classes to their
workers. Jeffrey Abramson, CEO of Tower Co., a Washington-based development
firm, says 75% of his staff attend free classes in transcendental meditation.
Making employees sharper is only one benefit; studies say meditation also
improves productivity, in large part by preventing stress-related illness and
reducing absenteeism.
Another
benefit for employers: meditation seems to help regulate emotions, which in turn
helps people get along. "One of the most important domains meditation acts upon
is emotional intelligence—a set of skills far more consequential for life
success than cognitive intelligence," says Davidson. So, for a New Year's
resolution that can pay big dividends at home and at the office, try this: just
breathe.
Does it feel as if caffeine makes you more clever, upbeat and alert? Maybe that's because it does
By MICHAEL D. LEMONICK
I'm sitting
in Small World Coffee, the place in Princeton, N.J., where locals go when they
want to avoid the sterile trendiness of Starbucks, just around the corner. The
place is packed with students and professors. Nobel prizewinners drop in
frequently (John Nash, the mathematician hero of A Beautiful Mind, is a
regular). But I'm not here for intellectual-celebrity watching. I'm here because
my editor has ordered up a story on the question of whether caffeine makes you
smarter. And without a latte—with three shots of espresso today instead of the
regular two—I wouldn't feel equal to the task. Experience tells me that a strong
dose of caffeine inevitably makes me more alert, focused, quick-witted, clever.
As far as I'm concerned, the case is already closed.
That's
a purely subjective assessment, but placebo-controlled laboratory experiments
say exactly the same thing. Just last month Austrian scientists reported on a
study showing that the equivalent of two cups of coffee boosts short-term memory
significantly. And that's just the latest in a long line of tests proving that
caffeine can enhance mental performance.
Indeed,
there has been lots of surprisingly good news in general about caffeine and
coffee. You would naturally assume that an addictive drug like caffeine—the most
widely consumed psychoactive drug on the planet—must surely be bad for you, and
initial studies suggested it might lead to bladder cancer, high blood pressure
and other ills. More recent research has not only refuted most of those claims
but also come up with some significant benefits. Caffeine appears to have some
protective effect against liver damage, Parkinson's disease, diabetes,
Alzheimer's, gallstones, depression and maybe even some forms of cancer. The
only proven medical downside appears to be a temporary elevation in blood
pressure, which is a problem only if you already suffer from hypertension. Some
studies have also suggested a higher risk of miscarriage in pregnant women and
of benign breast cysts, but those results are highly
controversial.
While
most of the findings about the effects of caffeine remain open to further
testing, caffeine's ability to boost your brainpower has been proved beyond any
reasonable doubt. "As a research psychologist," says Harris Lieberman, who works
in the Military Nutrition Division of the U.S. Army Research Institute of
Environmental Medicine in Natick, Mass., "I use the word intelligence as an
inherent trait, something permanently part of your makeup." Caffeine can't
change that, Lieberman says. But what it can do, he says, is heighten your
mental performance. If you're well rested, it tends to improve rudimentary brain
functions, like keeping your attention focused on boring, repetitive tasks for
long periods. "It also tends to improve mood," he says, "and makes people feel
more energetic, generally better overall." Observes Dr. Peter Martin, professor
of psychiatry and pharmacology and director of the Addiction Center at
Vanderbilt University: "Attention and mood are both elements of how we focus our
intellectual resources on a problem at hand."
Caffeine's
real power kicks in, though, when you're tired. That's of obvious interest to
the military, which counts on servicemen and -women to make life-and-death
decisions even when they have been in the field without rest for days. "When
you're sleep deprived and you take caffeine," says Lieberman, who has carried
out extensive tests on Navy SEALs, among others, "pretty much anything you
measure will improve: reaction time, vigilance, attention, logical
reasoning—most of the complex functions you associate with intelligence. And
most Americans are sleep deprived most of the time." Again, caffeine doesn't
make you inherently smarter; it just lets you call more effectively on the
intelligence you already have.
Precisely
how it all works is still being figured out by neuroscientists. What they know
is that caffeine binds to receptors that normally accept adenosine, a
neurotransmitter that signals brain cells to quiet down their activity. Blocking
adenosine staves off sleepiness. The resulting higher level of brain activity
puts the nervous system on alert, triggering the release of adrenaline—the
probable cause of caffeine's tendency to focus the mind.
Caffeine
also triggers the release of dopamine, mostly in the frontal areas of the brain
and the anterior cingulate cortex, in which the so-called executive functions
like attention, task management and concentration are located. This is
consistent with what the Austrian scientists reported last month at the
Radiological Society of North America's annual conference in Chicago. Dr. Florian
Koppelstaetter and his colleagues at the Medical University in Innsbruck gave 15
male volunteers 100 mg each of caffeine—about the same amount as in two cups of
coffee—and then tested their short-term memory. Not only did the caffeine
drinkers perform significantly better than those on placebos (all the subjects
were in both the caffeine and the control groups in different rounds of
testing), but when the scientists scanned their brains with functional MRIs, the
anterior cingulate cortex and the frontal lobes lit up with increased
activity.
Caffeine
is just a single chemical, of course, whereas coffee contains scores of
substances. Some of them are antioxidants, which could explain part of its
protective effect against disease. Some are psychoactive. "Our research," says
Martin, "has focused on some of those other elements, such as chlorogenic acids,
which keep adenosine in circulation in the brain longer than normal. That might
augment coffee's ability to increase concentration without increasing
irritability."
And
then there's tea and chocolate, both of which also have caffeine, along with
their own mélanges of antioxidants and other chemicals. Teasing out the specific
actions of each one and separating them from caffeine's could take years. For
the patrons crowding Small World Coffee, all of that is beyond the immediate
point, which seems to be nothing more than getting a morning fix of one
caffeinated drink or another before setting off to conquer the intellectual
challenges waiting at the university just up the street. "A mathematician," the
legendary number theorist Paul Erdos used to say, "is a machine for turning
coffee into theorems." Organic chemistry, neuroscience, psychology and pretty
much universal experience suggest that he probably was on to
something.
Staying up late to get ahead? It might be more productive to get a good night's rest
By SORA SONG
Americans
are not renowned for their powers of self-deprivation; doing without is not
something we do particularly well. But experts say there is one necessity of
life most of us consistently fail to get: a good night's sleep.
The
recommended daily requirements should sound familiar: eight hours of sleep a
night for adults and at least an hour more for adolescents. Yet 71% of American
adults and 85% of teens do not get the suggested amount, to the detriment of
body and mind. "Sleep is sort of like food," says Robert Stickgold, a cognitive
neuroscientist at Harvard Medical School. But, he adds, there's one important
difference: "You can be quite starved and still alive, and I think we appreciate
how horrible that must be. But many of us live on the edge of sleep starvation
and just accept it."
Part
of the problem is we are so used to being chronically sleep deprived--and have
become so adept at coping with that condition--that we no longer notice how
exhausted we really are. In 2003, sleep expert David Dinges and colleagues at
the University of Pennsylvania School of Medicine tested the effects of
restricting slumber to eight, six or four hours a night for two weeks. During
the first few days, subjects sleeping less than eight hours admitted to being
fatigued and lacking alertness. But by Day 4, most people had adapted to their
new baseline drowsiness and reported feeling fine--even as their cognitive
performance continued to plummet.
Over
time, the experiment's sleep-restricted subjects became so impaired that they
had difficulty concentrating on even the simplest tasks, like pushing a button
in response to a light. "The human brain is only capable of about 16 hours of
wakefulness [a day]," says Dinges. "When you get beyond that, it can't function
as efficiently, as accurately or as well."
In
the real world, people overcome their somnolence--at least temporarily--by
drinking coffee, taking a walk around the block or chatting with office mates.
But then they find themselves nodding off in meetings or, worse, behind the
wheel. Those short snatches of unconsciousness are what researchers call
microsleep, a sure sign of sleep deprivation. "If people are falling asleep
because 'the room was hot' or 'the meeting was boring,' that's not coping with
sleep loss. I would argue that they're eroding their productive capability,"
says Dinges.
What
most people don't realize is that the purpose of sleep may be more to rest the
mind than to rest the body. Indeed, most of the benefits of eight hours' sleep
seem to accrue to the brain: sleep helps consolidate memory, improve judgment,
promote learning and concentration, boost mood, speed reaction time and sharpen
problem solving and accuracy. According to Sonia Ancoli-Israel, a psychologist
at the University of California at San Diego who has done extensive studies in
the aging population, lack of sleep may even mimic the symptoms of dementia. In
recent preliminary findings, she was able to improve cognitive function in
patients with mild to moderate Alzheimer's simply by treating their underlying
sleep disorder. "The need for sleep does not change a lot with age," says
Ancoli-Israel, but often because of disruptive illnesses and the medications
used to treat them, "the ability to sleep does."
If
you're one of the otherwise healthy yet perpetually underrested, there's plenty
you can do to pay back your sleep debt. For starters, you can catch up on lost
time. Take your mom's advice, and get to bed early. Turn off the TV half an hour
sooner than usual. If you can't manage to snooze longer at night, try to squeeze
in a midday nap. The best time for a siesta is between noon and 3 p.m., for
about 30 to 60 minutes, according to Timothy Roehrs, director of research at the
Sleep Disorders and Research Center at Henry Ford Hospital in Detroit. He
advises against oversleeping on weekend mornings to make up for a workweek of
deprivation; late rising can disrupt your circadian rhythm, making it even
harder later to get a full night's rest.
According
to Dinges' analysis of data from the 2003 American Time Use Survey, the most
common reason we shortchange ourselves on sleep is work. (The second biggest
reason, surprisingly, is that we spend too much time driving around in our
cars.) But consider that in giving up two hours of bedtime to do more work,
you're losing a quarter of your recommended nightly dose and gaining just 12%
more waking time (two extra hours added to a 16-hour waking day). What if you could be 12% more productive instead? "You
have to realize that if you get a good night's sleep, you will actually be more
efficient and get more done the next day. The more you give up on sleep, the
harder it is to be productive," says Ancoli-Israel. "What is it going to
be?"
If
mental sharpness is your goal, the answer is clear: stop depriving yourself, and
get a good night's sleep.
Scientists used to think intellectual power peaked at age 40. Now they know better
By JEFFERY KLUGER
It
took Barbara Hustedt Crook an awfully long time to get around to writing her
first musical. She started last year, shortly before her 60th birthday. Her
friend and collaborator, Robert Strozier, waited even longer; he's 65. It's not
that they didn't have the creative chops for the job. The two have spent their
careers writing and editing in New York City, and Crook has a background in
performing, singing and piano. But creating a musical always felt just out of
reach--until now.
"Somehow
I have a confidence I didn't have before," says Crook. "I find that my brain
makes leaps it didn't make so easily. I can hear my inner voice and trust
instincts and hunches in ways I didn't used to."
And,
says Strozier, they're both a lot more willing to take chances than in the past.
"At a certain age," he says, "you either get older or you get younger. If you
get younger, you venture out and take risks."
Risk-taking
seniors making daring mental leaps? That's not the stereotype. Indeed, until
quite recently most researchers believed the human brain followed a fairly
predictable developmental arc. It started out protean, gained shape and
intellectual muscle as it matured, and reached its peak of power and nimbleness
by age 40. After that, the brain began a slow decline, clouding up little by
little until, by age 60 or 70, it had lost much of its ability to retain new
information and was fumbling with what it had. But that was all right because
late-life crankiness had by then made us largely resistant to new ideas
anyway.
That,
as it turns out, is hooey. More and more, neurologists and psychologists are
coming to the conclusion that the brain at midlife--a period increasingly
defined as the years from 35 to 65 and even beyond--is a much more elastic, much
more supple thing than anyone ever realized.
Far
from slowly powering down, the brain as it ages begins bringing new cognitive
systems on line and cross-indexing existing ones in ways it never did before.
You may not pack so much raw data into memory as you could when you were
cramming for college finals, and your short-term memory may not be what it was,
but you manage information and parse meanings that were entirely beyond you when
you were younger. What's more, your temperament changes to suit those new
skills, growing more comfortable with ambiguity and less susceptible to
frustration or irritation. Although inflexibility, confusion and even later-life
dementia are very real problems, for many people the aging process not only does
not batter the brain, it actually makes it better.
"In
midlife," says UCLA neurologist George Bartzokis, "you're beginning to maximize
the ability to use the entirety of the information in your brain on an everyday,
ongoing, second-to-second basis. Biologically, that's what wisdom
is."
If
your mind does indeed grow more agile as you age, one of the things that may
help it do so is the amount of glue you carry around in your brain--glia (Greek
for glue) being what the 19th century German anatomists called it. Only about
half the mass of the brain is composed of gray matter, or nerve cells; the rest
is white matter, the connecting tissue that, in a sense, glues it all together.
Much of that white matter is made of conductive nerve strands, and covering each
fine wire is a fatty sheath of myelin that keeps nerve signals from sputtering
out or cross firing during transmission. "Myelin is what makes us human," says
Bartzokis. "We have 20% to 30% more than other primates
do."
Throughout
our lives, fresh layers of myelin sheathing are laid down in the brain. In
infants and children, who grow increasingly coordinated as they mature, the bulk
of that takes place in the motor and sensory lobes. If we acquire better
reasoning skills in middle age, Bartzokis long suspected, it would follow that
most of the myelin added in those years would appear around the
signal-transmitting axons in the higher brain regions that are the seat of
sophisticated thought. Essentially, the brain spends decades upgrading itself
from a dial-up Internet connection to a high-speed one, not fully completing the job
until age 45 or so.
To
test that idea, Bartzokis used magnetic resonance imaging to study the volume
and distribution of white matter in 300 healthy subjects from 18 to 75 years old
as well as in hundreds of older people suffering from such brain-related ills as
Alzheimer's and Parkinson's diseases. As he suspected, the healthy adults had
the most myelin in the frontal and temporal lobes--where big thoughts live. The
quantity of sheathing reached its peak around 45 or 50, exceeding the amount in
unhealthy older subjects and healthy younger ones.
"This
last little bit of myelination essentially puts us online," Bartzokis says. "You
may not have the same amount of information you had when you were 20, but you
can use it better in everyday life."
It's
not just the wiring that charges up the brain as we age, it's the way different
regions start pulling together to make the whole organ work better than the sum
of its parts. For all its plasticity, the brain is a specialized machine, with
specific regions handling specific operations. The greatest divergence comes
between the left and the right hemispheres, which often work almost
independently of each other. That is not such a bad thing because one hemisphere
can be busy writing a grocery list or solving an equation while the other scans
the environment and tends to other basic chores. As we age, however, the walls
between the hemispheres seem to fall, with the two halves working increasingly
in tandem. Neuroscientist Roberto Cabeza of Duke University dubs that the HAROLD
(hemispheric asymmetry reduction in older adults) model, and judging by his
work, the phenomenon is a powerful one.
Cabeza
recruited a sample group of adults 65 to 95 years old who had scored high on a
memory test, along with a group of lower-performing adults of the same age and a
group of younger, college-age adults. He then asked them all to perform a series
of tasks that called on numerous skills, including language, memory, perception
and motor functions. Throughout the tasks, he conducted functional magnetic
resonance imaging scans of their brains. Again and again, he found that the
high-functioning older adults were using either a hemisphere different from the
one the other subjects were using or both hemispheres at the same
time.
Why
that is so is still unclear, but Cabeza doesn't believe the brain is programmed
to get stronger as it ages. Rather, he acknowledges, in many ways it gets
weaker, with neurons processing information less efficiently. The
bilateralization may be a trick the brain uses to compensate for the decline,
sometimes integrating the hemispheres so efficiently that our thought and
reasoning processes are actually better than they were
before.
"It's
similar to the way you need both hands to lift a weight that you could lift with
one hand when you were younger," Cabeza says. "In the brain, there's a nice,
natural distribution of resources. You get more neural tissue to support the
task."
As
the brain's flexibility improves, so too may the temperament we bring to our
work. There's no question that personalities can calcify with age, causing us to
become less receptive to new experiences and flat-out crabby when faced with
them. But that's not the case with everyone. In fact, in many people the
opposite happens.
In
1958 psychologist Ravenna Helson, now an emeritus professor at the University of
California, Berkeley, began a long-term study of 142 women, all of them 21 years
old, at Mills College in Oakland, Calif. She interviewed the subjects and took
measures of their personalities, drives, relationship skills and the like. Then
she reinterviewed them at ages 27, 43, 52 and 61 to determine how those traits
changed over time. Just last year she and a graduate student, psychologist
Christopher Soto, collated the data from the 123 women who stuck with the study.
The results were surprising.
On
the whole, they found, the women's highest scores in inductive reasoning
occurred from their 40s to their early 60s. Similarly, their so-called affect
optimization (the ability to highlight the better aspects of one's personality
and restrain the less attractive ones) and their affect complexity (the ability
to evaluate various contradictory ideas and remain objective) did not peak until
their 50s or 60s. There was also an increased tolerance for ambiguity and an
improved ability to manage relationships.
The
Mills sample group was hardly random, consisting principally of white women of
the same age who attended the same college. Still, they were 123 different
individuals, and the results were nonetheless uniform. "People generally
describe personality change in middle age as a midlife crisis, with all its
negative connotations," says Soto. "In the Mills women, the change was
positive--a reorienting, not a crisis."
If
such a change occurs, says psychologist Robert Levenson, also at U.C. Berkeley,
it may be shaped in part by evolutionary forces, offering advantages for the
whole species. Human beings' comparatively long life spans and extended families
are very good things, but keeping big broods healthy and well behaved over the
decades takes more than the energy of young parents. It takes the cool heads and
wise counsel of the family graybeards too. "Evolution isn't just about
reproduction," Levenson says. "When you get into your 40s and 50s, you're
caretaking, looking after your children, grandchildren, even the people who work
for you. There's an advantage to having a more relativistic
mind."
It's
that talent for reflective thinking that explains the role older adults have
always played in human culture. It's not for nothing that history's
firebrands and ideologues are typically young, while its judges and peacemakers
and great theologians tend to be older. Not everyone achieves the sharp thought
and serene mien that can come with age. But for those who do, the later years
can be the best years they have ever had.
By
WALTER KIRN
As
a child, I measured my mental development (and I was the sort of child, I
confess, who found his own mental development fascinating) by the complexity of
the jigsaw puzzles I was able to complete. As I learned to do puzzles with
smaller, more numerous pieces, graduating from simple farmyard scenes to
detailed panoramas of city skylines, I felt better and better about myself. The
adults in my life seemed to feel better about me too. But then something
unexpected happened. One afternoon when I was 10 or so, I finished a 1,000-piece
puzzle of the Milky Way and came to the realization that, puzzle-wise, I'd done
all that I could do—meaning all that a normal child should ever wish to do. I
realized that to master more difficult puzzles would be a sign not of desirable
growth but of troubling compulsion.
I
think back to that fiendishly complicated puzzle of stars and planets and
whirling gas clouds whenever I think about the promise of human-intelligence
enhancement. How much quicker and more acute do people really want to be? How
many more bits per cubic inch of gray matter do people wish they could store?
People whose minds are generally healthy, that is. People who, for their age and
condition, are already smart enough.
The
devilish problem, of course, is defining "smart enough." Enough to accomplish
what, precisely? To make a living or to make a killing? And smart enough to
satisfy whom? An employer who wants you to do your work by quitting time or one
who wishes you had finished it yesterday? Being able to do what must be done is
liberating, but being able to do whatever might be done (or whatever your driven
ego or pushy boss might conceivably demand) can be
enslaving.
And
does anyone really want to be brilliant all the time? Though heightened
intelligence would seem to be a universally desirable goal, not all tasks and
stages of life demand the amped-up cognitive speed and processing power the new
regimens and medications may make possible. Becoming a parent, for example. I
read somewhere once that many mothers and fathers suffer a rapid, appreciable
drop in IQ after their babies are born. This, if true, is a huge gift from
nature. Diapering, feeding and comforting little ones demands dumb endurance, in
my experience, not penetrating cleverness. Thinking too clearly while cleaning
up diarrhea on two hours' sleep in a house that you've just realized is one room
too small and two times too expensive can make you
suicidal.
And
yet people dream of aping their computers, which grow measurably more agile
every six months. Not wiser or saner or more truthful, those immeasurable human
qualities that are extolled by priests and poets, but just better at handling
elaborate graphics, say, or performing multimillion-variable calculations.
Assuming that we can keep up with these machines, where will it take us as a
society? When the shared ideal is to be like Mr. Spock instead of Dr. Spock, and
to emulate Dr. Jonas Salk rather than Marcus Welby, M.D., who will stroke
humanity's fevered forehead? No one, I fear, unless we use our brainpower to
develop an altruism pill.
Genius
goes only so far—at least in the current, cybernetic sense. In terms of sheer
neurological acuity, how would Jesus or the Buddha have ranked? And how would
your dear old grandfather have scored—that guy who could whittle a cottonwood
twig all day and invent new bedtime stories every night? How often, now that the
fellow isn't around, do you catch yourself wishing he'd been sharper, swifter?
Quite often, perhaps, if Grandpa suffered from Alzheimer's, but what if he was
just a wee bit... plodding?
It's
unrealistic to expect that people will forgo easy intelligence enhancement out
of some fear that it may turn them into sociopaths obsessed with the goings-on
inside their skulls and negligent about the outside world. The rat race keeps
accelerating, and the labyrinths in which it is run are growing more complicated
by the hour, it seems—as are the technological devices that are meant to help us
through their tricky passages. If many more features are added in the next year
to the average cell phone, for example, I may have to retire to a cave and
survive on campfire-roasted venison. My synapses are on overload as it
is.
Still,
it seems important to remember that intelligence—human intelligence—involves a
lot more than problem-solving skills or memory capacity. Sometimes the challenge
of being a person is to recognize that the task at hand should be performed
later, considered from a new angle or, if it's a waste of time, ignored. That's
why, at age 43, I'm not at work on a 600,000-piece jigsaw puzzle depicting
Australia's Great Barrier Reef. I was smart enough to know at 10 that it's not
what one can do that matters but what's worth doing.
One
of America's leading proponents of natural healing offers a guide to foods that
go straight to your head
By
ANDREW WEIL, M.D.
We
know that what you eat, and don't eat, can affect your health. But is it
possible, as the White Rabbit advised Alice, to "feed your head"? Is there such
a thing as brain food? I'm convinced there is. The evidence for some foods, such
as fish, is stronger than for others, like turmeric and brightly colored
vegetables. But none of those foods is bad for you, and they certainly won't
make you any less smart.
The
reason fish is so good for the brain is the so-called omega-3 fatty acids it
contains. Oily fish, like salmon, sardines, mackerel, herring, bluefish and
black cod, are the best sources of those special fats. One of the
omega-3s—DHA—is the main constituent of cell membranes in the brain, and a
deficiency of it can weaken the brain's architecture and leave it vulnerable to
disease.
Diets
associated with longevity and good health, like the Mediterranean and
traditional Japanese diets, are high in omega-3 fatty acids from fish. The North
American diet is not. I have long recommended that people in the U.S. eat more
fish—at least two servings a week—but I have been concerned lately about reports
of increasing levels of mercury, PCBs and other contaminants in certain fish
species. In my diet I stick to sardines, herring, Alaskan black cod and Alaskan
sockeye salmon. All sockeye (red) salmon are wild—fish farmers haven't yet been
able to domesticate them—and since those fish are less carnivorous than other
types of salmon, they have lower levels of the environmental contaminants that
accumulate as you work your way up the food chain. Canned sockeye, available in
most supermarkets, is a perfectly good source of omega-3s.
But
for some people it may be easier and safer to rely on fish-oil supplements. The
best are distilled and certified to be free of mercury and other toxins. Some
are flavored, and some even taste good—or at least a lot better than the
cod-liver oil I was forced to take as a kid. One product I recommend is
Antarctic krill oil, made from the tiny crustaceans that abound in southern seas
and are consumed in great quantities by whales and other marine mammals. Krill
oil is red from carotenoid pigments, which have high antioxidant activity, and
it doesn't cause those fishy burps. A good starting dose of fish oil of any kind
is 1g a day. Higher doses, up to 10g a day, have been used, with varying
results, to treat such diverse conditions as depression, attention deficit
disorder, bipolar disorder and even autism.
Vegetarian
sources of omega-3 fatty acids, such as walnuts, flax and hemp, are good
additions to the diet but not so reliable as fish. They supply a short-chain
compound (ALA) that the body must convert to long-chain DHA, and the efficiency
of that conversion can vary. Some people don't do it well, and those eating
mainstream diets top-heavy in the omega-6 fatty acids found in processed food
and prepared meals are at a disadvantage because omega-6s interfere with the
conversion of ALA to DHA. For vegetarians and vegans, there is one nonfish
source of long-chain omega-3s: supplements made from algae. (Algae is the source
of the omega-3s that fish store in their fat.)
I'm
not aware of any brain foods that have as much scientific evidence behind them
as fish and fish oil. But I would keep an eye on turmeric, the yellow spice that
is a major ingredient in American mustard and Indian curries. A relative of
ginger, turmeric comes from the underground stem of a tropical plant and is
being carefully studied for its medicinal effects. It is a powerful
anti-inflammatory agent that has anticancer properties and may offer significant
protection against Alzheimer's disease. Alzheimer's begins as an inflammatory
process in the brain. Anti-inflammatory agents like ibuprofen reduce the risk of
Alzheimer's, and so do turmeric and its most studied component, curcumin. India
has the world's lowest rate of Alzheimer's, and some experts think that daily
consumption of turmeric is a contributing factor.
Finally,
in addition to all the other reasons to eat fruits and vegetables, there are
some that relate to the brain. The pigments that account for the varied colors
of vegetables and fruits have antioxidant properties that offer significant
protection against cancer and other chronic diseases, as well as protection from
a range of environmental toxins, including pesticides. Toxic injury to the brain
is almost certainly the cause of Parkinson's disease, and probably amyotrophic
lateral sclerosis (Lou Gehrig's disease). For that reason alone, it's a good
idea to eat every day from as many parts of the color spectrum as you can. It's
also a good idea to take a daily multivitamin-multimineral supplement that
provides the right doses and forms of the key antioxidants: vitamins C and E,
mixed carotenoids and selenium.
A
good diet is certainly not the only way to protect and enhance brain health.
Regularly exercising the mind and not smoking are also important. But food
choices do count. So eat your vegetables, think about your daily dose of
omega-3s, and consider flavoring more of your food with
turmeric.
Andrew
Weil is clinical professor of medicine at the University of Arizona, where he
founded the program in integrative medicine.
Want
a Brainier Baby?
Loading
up on tapes, games and videos may not be a smart move. There are better ways to
nurture a young mind
By PAMELA
PAUL
Thomas
Bausman, 2, and his brother Jake, 10 months, are typical American babies. Every
day, Thomas settles down to watch two hours of television, while Jake sits in
front of the set for an hour, the national average for their respective ages.
Their favorite thing to watch, by far? Baby Einstein. Anita Bausman could not be
more pleased with her children's preference. Jake, she reports, learned colors,
numbers and his love of robots from the popular videos, which are filled with
puppets, animals and moving objects, often set to classical music. "It's not
just turning on Nickelodeon," Bausman says. "It's educational and beneficial. I
know he's happy watching, and I can pop in and point out something onscreen,
then go deal with the laundry."
Bausman's
attitude is typical of U.S. parents. In a 2004 Kaiser Family Foundation study,
more than half of the parents surveyed said that educational videos and toys are
"very important to children's intellectual development." Efforts to get kids on
the Ivy League track now begin at infancy, and in the past few years, the
so-called edutainment market for babies and toddlers has exploded. According to
Vicky Rideout, vice president of the Kaiser foundation, in 2003 there were 140
videos or DVDs for kids age 2 and younger for sale on Amazon. Today, there are
750.
Many of
those products bear enticing messages on their packages: "stimulate baby's
cognitive development" or "increase baby's brain capacity." But according to a
new study, "A Teacher in the Living Room?," by the Kaiser Family Foundation, the
companies do essentially no research to back up their claims. Nor can they cite
research by others that relates specifically to their products. "We're not
neurolinguistic scientists," admits Marcia Grimsley, a senior producer for
Brainy Baby, purveyor of such DVDs as Right Brain and Left Brain, which claim to
develop the creative and logical components of a baby's mind. "We went out and
researched other people's work—scientists, neurologists, psychologists—and
applied that knowledge to our products so they could be fun and beneficial to
parents and children."
The
unspoken assumption behind most of those products is that stimulation is good
and that more stimulation is even better. But that's not necessarily so, says
Meredith Small, an anthropologist at Cornell University and author of Our
Babies, Ourselves: How Biology and Culture Shape the Way We Parent. In fact, she
says, "there's a growing thought that maybe Americans are overstimulating their
babies, or stimulating them in the wrong ways."
There's a
basic misunderstanding that stems from studies of children and laboratory
animals that were starved of attention and stimulation, says Pat Levitt,
director of the Vanderbilt Kennedy Center for Research on Human Development.
"Everyone heard about the orphans in Romania who were deprived of stimulation as
babies, then had learning and emotional problems later," says Levitt. But just
because a normal environment is better than a deprived one, that doesn't
necessarily mean that a hyperenriched environment is better still. As Levitt
puts it: "There is no evidence that says you can drive the baby's system to ever
greater heights."
In fact,
there is evidence to the contrary. According to Dimitri Christakis, codirector
of the Child Health Institute at the University of Washington, "The more TV
babies watch, the more likely they are to have attentional problems later in
life." Christakis cites a long-term study that tracked children from age 1
through age 7. It found that for each additional hour of daily TV viewing before
age 3, a child's chances of later developing problems paying attention increased
10%.
Christakis
explains that the human mind—especially the mind of a baby—is driven by what
Ivan Pavlov (of the famous dog) called the orienting reflex. When a baby is
confronted with a novel sight or sound, he or she can't help focusing on it. By
rapidly changing colors, sounds and motions, videos for children effectively
force a baby's brain to stay at attention. If his or her gaze wanders, the
action quickly rivets it back to the screen.
"Parents
say, 'My child can't stop looking at it! She loves it!'" Christakis says. "Well,
true, she can't stop looking at it, but that doesn't mean she loves it." Not
only might Baby not be enjoying the program, Christakis says, "but based on the
research I've done, there's reason to believe these products have deleterious
effects on the developing mind." Christakis is not alone in this thinking. The
American Academy of Pediatrics recommends no TV viewing of any kind before age
2.
CDs and
DVDs designed to teach a baby Spanish or Chinese are also problematic. Patricia
Kuhl, who studies language acquisition at the University of Washington,
conducted an experiment comparing the effects of Chinese audio recordings for
children and a Chinese-speaking human. She had a native Mandarin speaker play
with a group of babies while speaking Chinese for 12 sessions of 25 minutes each
over a four-week period. Later she tested the babies and was able to demonstrate
that they recognized Mandarin sounds. But when she repeated the experiment with
three control groups—one set of babies that saw the Chinese speaker play with
babies on video, another that listened to an audio recording of the Chinese
woman playing and a third that had no exposure to the Chinese speaker—none seemed
to perceive Mandarin sounds. Apparently, the presence of a living, breathing
human was essential.
There's a
lesson there for any parent who wants to encourage early learning. Most experts
agree that what matters most is not what toy the baby plays with but the ways in
which you interact with your child. "There's no question that the experiences a
child has in its first year are crucial for cognitive, emotional and physical
development," says Lise Eliot, a neuroscientist at Chicago Medical School and
author of What's Going On in There? How the Brain and Mind Develop in the First
Five Years of Life. "But the good news is none of this costs any money. Babies
prefer humans over anything inanimate."
One key
difference between human interaction and even the most sophisticated educational
toy is that interpersonal exchanges engage all the senses—sight, sound, smell,
taste and, very important, touch. "People tend to forget that children are very
tactile and their most sensitive part is their mouth," says David Perlmutter, a
neurologist and author of the forthcoming book, Raise a Smarter Child by
Kindergarten. "Babies need to mouth things and to smell, to have rich sensory
experiences."
This is
borne out by a new study of 96 babies conducted by Andrew Meltzoff and Rechele
Brooks at the University of Washington. Meltzoff and Brooks knew that long
before babies learn to talk, they form emotional connections with parents and
caregivers by looking into their eyes. But there's a big cognitive leap between
looking at someone's eyes and following that person's gaze to see what he or she
is looking at. By tracking at what age babies learn to follow an adult's gaze,
Meltzoff and Brooks have been able to establish an early indicator of language
ability. It turns out that the earlier a baby follows the gaze of an adult
(generally between 9 months and 11 months), the more advanced his or her
language skills are at age 2.
"Babies
read their mother's faces," explains Meltzoff, co-author of The Scientist in the
Crib: What Early Learning Tells Us About the Mind. "Being able to read other
people and their intentions and to know what they're thinking about is key to
language development."
Babies
can also read signs. Psychologists Linda Acredolo and Susan Goodwyn, co-founders
of the Baby Signs Institute, conducted a long-term study with 140 families
funded by the National Institutes of Health to see whether teaching sign
language to babies before they can talk helps or impedes language development.
The results were surprising. Babies taught to sign at 11 months tested 11 months
ahead of other babies in terms of vocabulary and linguistic ability by age 3. At
age 8, signing babies scored higher on IQ tests than the control group. While
many psychologists agree that teaching sign language probably does babies no
harm, others have questioned the methodology of the research that shows
signing's benefits. Moreover, the research that's been done has focused on
signing as taught by trained parents. Today there are a slew of new videos and
DVDs purporting to teach babies to sign, and no one has studied their
effectiveness.
Of
course, parents don't have to learn sign language to be active participants in
their babies' development. For the past 20 years, New York University
developmental psychologist Catherine Tamis-LeMonda has been observing babies as
they interact with parents in "naturalistic" environments—at home, running
errands, going about their everyday lives—to see how adult involvement affects
language acquisition. Through longitudinal studies, she's documented that the
more parents respond to babies' cries, expressions and articulations, the
earlier the children will talk and the more advanced their language skills will
be at age 5. Parents who respond to babies' cues—reacting to grimaces and
giggles, mimicking their sounds, extrapolating from "bababa" to "bottle,"
labeling things they touch—help their children acquire language. This
responsiveness, however, should not be forced. "If you're not enjoying yourself
while playing with that baby, it's not going to do any good," Tamis-LeMonda
cautions.
That's
because babies are remarkably attuned to emotions. The best—and easiest—gift a
parent can give his or her child is relaxed time when the parent is focused on
the baby and follows the baby's lead. If the baby grabs at waxed paper, the
adult can repeat the word paper and show him or her how it makes noise or how it
can be crumpled. "The infant brain craves novel stimulation, but that can be
found in ordinary nonstructured, nonmarketed things around the house," says Ross
Thompson, a psychologist at the University of California at Davis and one of the
founders of the National Scientific Council on the Developing Child, a research
organization of scientists and experts on early-childhood
development.
Babies
need to learn how to master new situations, but they also learn through
repetition and thrive on predictability. "Having rituals, like bedtime and
mealtime routines, brings order to babies' lives, which helps them organize
their thinking," explains Tamis-LeMonda. Being able to anticipate future events
as well as remember and create memories of past patterns fosters cognitive
development. "Babies are very good at tracking statistical information in their
environment," says Laura Schulz, a professor of brain and cognitive sciences at
M.I.T. "They're incredibly sensitive to human action and to intentional acts in
the world. They watch what people are doing to learn causal connections." Babies
will grab the same object over and over, replicating experiences, testing them
out, conducting their own experiments. If I smile, will Mommy smile back?
Providing babies with consistent actions and reactions helps them make sense of
their world and the people in it.
"When a
9-month-old raises his arms to be picked up by Daddy, that demonstrates an
incredibly complex chain of learning," says Claire Lerner, director of parent
education at Zero to Three, a national nonprofit focused on early-childhood
development. "First the child has to have an emotional connection to his father.
Then he has to form an idea: I want to be picked up. Then he has to know to
raise his arms. In that tiny vignette, you can see how complicated a baby's
development is."
And how
simple it is to reinforce that learning. Just pick up the baby, and start
cuddling.
Do These Toys Work?
BABY'S FIRST STEPS ITALIAN: Parents and caretakers, not CDs, are best for teaching
languages
BABY EINSTEIN: These programs grab attention but don't create geniuses
BIG FROG: They may be cute, but don't expect interactive stuffed animals to teach a
baby numbers, colors or shapes. A teddy bear without batteries is just as good for
cuddling and imaginative play
PICTURE CARDS: Flash cards may help students cram for the SAT, but experts agree
that the cards are inappropriate for babies younger than 2
YOUR BABY CAN READ: Cognitive scientists say that babies forced to watch a DVD
daily are memorizing responses, not reading
BRAINY BABY: Doctors recommend no TV or videos before age 2
Help!
I've Lost My Focus
E-mail
and cellphones help us multitask, but they also drive us to distraction. How to
take control and get more done
By
CLAUDIA WALLIS, SONJA STEPTOE
Jan. 16, 2006
Spend a
few hours with Hollywood producer Jennifer Klein, and you might want to pop a
Valium. Or slip her one. From the moment she rises at 7 a.m. in the Sunset
Boulevard home she shares with her husband, she's a fidgety, demanding,
chattering whirling dervish of a task juggler. Right now Klein, 41, whose
credits include Pearl Harbor and Armageddon, has 15 film and TV projects in
development--all of them requiring constant nudging and nurture. Her strategy
for managing that and several overflowing In boxes: never do just two things at
once if you can possibly do four or five.
"I'm an
obsessive and addicted multitasker and gadget user," Klein cheerily concedes. A
typical moment at her office finds Klein reviewing a screenplay by phone with
its writers and jotting notes while glancing at an incoming e-mail on her
BlackBerry, motioning signals to her assistant and firing off an instant message
to a studio exec. "Here's how bad it is," she confesses. "When I'm flying, right
before the plane lands, before the seat-belt sign goes on, I get the BlackBerry
out and put it in front of me in the seat-back compartment. That way I can turn
it on as soon as I land and see that little light flashing."
Actually,
it gets worse than that for a woman known to do her daily sit-ups during a
conference call. "While I'm driving, I've got the cell phone out. I'm drinking a
cup of coffee, checking the Palm Pilot for the number and then calling," boasts
Klein. Yup, got that all done while stuck in traffic.
Like many
other modern workers, Klein takes pride in being a master multitasker, zipping
through her daily to-do list: "I see the red lights go on or hear the beep, and
I love it." But she has noticed some drawbacks and even some side effects:
impatience, irritability and (gasp) some inefficiency. "Sometimes when e-mail
goes down, I'm actually more productive, because I can concentrate on
something," she says. She finds herself angry and snappish when callers make
poor use of her endless availability. Although she feels anxious when her In box
is empty, she feels no better when it's full: "When I wake up in the morning and
have 15 e-mails, I get a nervous stomach."
Klein's
action- and anxiety-packed work style may be extreme, but she's really only a
couple of juggling pins ahead of most of us. By now every modern
officeworker--from the mail-room clerk to the CEO--knows that the gadgets
designed to lighten our loads also ensnare us. And the dinging digital devices
that allow us to connect and communicate so readily also disrupt our work, our
thoughts and what little is left of our private lives.
What sort
of toll is all this disruption and mental channel switching taking on our
ability to think clearly, work effectively and function as healthy human beings?
Do the devices that make it possible to do so many things at once truly raise
our productivity or merely help us spin our wheels faster? Over the past five
years, psychologists, efficiency experts and information-technology researchers
have begun to explore those questions in detail. They have begun to calculate
the pluses, the minuses and the economic costs of the interrupted life--in
dollars, productivity and dysfunction. More important, they're exploring what
can be done about it--how we can work smarter, live smarter and put our beloved
gadgets back in their proper place, with us running them, not the other way
around.
AN EPIDEMIC OF ATTENTION DEFICIT
DR.
EDWARD HALLOWELL, A PSYCHIATRIST in Sudbury, Mass., has seen the fallout of
multitasking mania: it walks through his door five days a week. Over the past
decade, he says, he has seen a tenfold rise in the number of patients showing up
with symptoms that closely resemble those of attention-deficit disorder (ADD),
but of a work-induced variety. "They complained that they were more irritable
than they wanted to be," he says. "Their productivity was declining. They
couldn't get organized. They were making decisions in black-and-white,
shoot-from-the-hip ways rather than giving things adequate thought, all because
they felt pressured to get things done quickly." But Hallowell, an ADD expert
and co-author of several best-selling books on the subject, including 1994's
Driven to Distraction, noticed something different about his new cases. Unlike
patients with typical ADD, which persists no matter the setting, the new
patients felt frantic only in certain situations--mainly in the workplace or,
for at-home moms, while managing the home front.
In a
Harvard Business Review article last January, Hallowell gave the condition a
name: attention-deficit trait, or ADT. He explains that ADT takes hold when we
get so overloaded with incoming messages and competing tasks that we are unable
to prioritize. The result is not only distractibility, impulsiveness and haste
but also feelings of guilt and inadequacy. "People think it's their fault that
they're falling behind," he says. "They think they have to sleep less and work
harder and stay later at the office, which only makes it worse because they're
not taking care of their brain by getting enough sleep." How common is this
phenomenon? "It's rampant," says Hallowell, who believes that corporate
downsizing and job insecurity contribute to the problem. "When I give lectures
around the country, there's always instant identification with what I'm saying.
People in the audience immediately say, 'Oh, yes, that's me,' or, 'My whole
office is like that.'"
THE HIGH
COST OF INTERRUPTIONS
IT'S NO
WONDER SO MANY OF US SUCCUMB to the panicky feeling that we can't keep pace with
workplace demands. A series of new studies that examined the modern,
multitasking worker show that the constant splintering and diversion of our
attention wastes time and money. In a study of 1,000 officeworkers from top
managers on down, Basex, an information-technology research firm in New York
City, found that interruptions now consume an average of 2.1 hours a day, or 28%
of the workday. The two hours of lost productivity included not only unimportant
interruptions and distractions but also the recovery time associated with
getting back on task, according to a Basex report titled "The Cost of Not Paying
Attention," released in September. Estimating an average salary of $21 an hour
for "knowledge workers"--those who perform tasks involving information--Basex
calculated that workplace interruptions cost the U.S. economy $588 billion a
year.
In a
revealing set of studies, a team led by Gloria Mark and Victor Gonzalez of the
University of California at Irvine tracked 36 officeworkers--in this case
information-technology workers at an investment firm--and recorded how they
spent their time, minute by minute. The researchers found that the employees
devoted an average of just 11 minutes to a project before the ping of an e-mail,
the ring of the phone or a knock on the cubicle pulled them in another
direction. Once they were interrupted, it took, on average, a stunning 25
minutes to return to the original task--if they managed to do so at all that
day. The workers in the study were juggling an average of 12 projects apiece--a
situation one subject described as "constant, multitasking craziness." The five
biggest causes of interruption in descending order, according to Mark: a
colleague stopping by, the worker being called away from the desk (or leaving
voluntarily), the arrival of new e-mail, the worker switching to another task on
the computer and a phone call.
Of
course, not all interruptions are created equal. Some are related to the job at
hand and may be helpful--if not to the individual, then maybe to the team. Some
are unrelated but nonetheless welcome: the Basex report found that 62% of
workers at all levels said being interrupted by a friend with a
nonbusiness-related question was "acceptable" (though the boss might take a
different view). Several studies, including one by Mary Czerwinski, a senior
researcher at Microsoft, show that interruptions at the beginning and the end of
a task are the most detrimental to performance. An interruption when work has
just got under way "blows away the goals you've established," says Czerwinski,
while a ping or a knock at the end of the process "breaks the train of thought
as people are reflecting and preparing for what they'll do next."
While the
researchers did not look specifically at the quality of the work, a long history
of psychological research has proved what one might expect: performance
declines--and stress rises--with the number of tasks juggled. Similarly, there's
a long-held principle in psychology that maintains that a little stimulation or
arousal improves performance but too much causes it to decline. "If you apply
that law to multitasking," says Mark, "you would expect that a certain amount of
multitasking would increase arousal, perhaps leading to greater efficiency. But
too much will produce declining performance."
Jonathan
Spira, CEO and chief analyst at Basex, suspects that so-called NetGen'ers--
those who grew up IMing, Googling and texting--are less stressed by
gadget-abetted multitasking than are older workers. "Younger people may actually
be wired a little differently," he says. But, he adds, there's no getting away
from the fact that to do your best work on difficult tasks, "sometimes you need
to shut everything else out and focus."
Some of
the world's most creative and productive individuals simply refuse to subject
their brains to excess data streams. When a New York Times reporter interviewed
several recent winners of MacArthur "genius" grants, a striking number said they
kept cell phones and iPods off or away when in transit so that they could use
the downtime for thinking. Personal-finance guru Suze Orman, despite an
exhausting array of media and entrepreneurial commitments, utterly refuses to
check messages, answer her phone or allow anything else to come between her and
whatever she's working on. "I do one thing at a time," she says. "I do it well,
and then I move on" (see box).
IS IT AN
ADDICTION?
WHAT'S
STRIKING TO RESEARCHERS IS HOW few people take even the most basic steps to
reduce workplace interruption. In the Basex study, 55% of workers surveyed said
they open e-mail immediately or shortly after it arrives, no matter how busy
they are. "Most people don't even think about turning off the dinger," says
Spira, who turned off the alert sound on his e-mail nine years ago with no
regrets. "We can't control ourselves when it comes to limiting technological
intrusions."
Indeed,
there's a compulsive quality to our relationships with digital devices.
Hallowell has noticed that when a plane lands nowadays, BlackBerrys light up the
way cigarettes once did. "A patient asked me," he says, "whether I thought it
was abnormal that her husband brings the BlackBerry to bed and lays it next to
them while they make love." Hallowell and his frequent collaborator, Harvard
psychiatrist John Ratey, believe that the neurochemistry of addiction may
underlie our compulsive use of cell phones, computers and "CrackBerrys." They
say that dopamine, a neurotransmitter involved in seeking rewards and
stimulation, is doubtless at work. "If we could measure it as we're shifting
[attention] from one thing to another," says Ratey, "we would probably find that
the brain is pumping out little shots of dopamine to give us a buzz."
Psychologists call the increasingly common addiction to Web-based activity
"online compulsive disorder." Hallowell has a more descriptive term: screen
sucking. "These screens have a magnetism we haven't quite figured
out."
TAKING
CONTROL
CAN THE
TECHNOLOGY THAT'S overloading our circuits help address the problems it has
created? Czerwinski and her bosses at Microsoft think so. She's helping design
an intelligent office-communication system that calculates whether an
interrupting e-mail or IM should be transmitted immediately or delayed on the
basis of, among other factors, the worker's appointments and projects that day,
his past preferences and habits and the organizational-chart relationship
between sender and receiver. "Something like this has got to happen sooner or
later," says Czerwinski, though she acknowledges that it raises privacy issues.
The alternative is to turn off the IMs, phones and e-mail--if management allows
it. "I've observed some people who did that, and they were highly productive,"
says Czerwinski, "but they also missed some very important e-mails. I don't
think most people will be willing to do that."
Czerwinski
has also been helping Microsoft design alternatives to current software products
to allow workers to stay on task for longer periods, even as onscreen
interruptions arrive. In next-generation systems, which Microsoft's competitors
are pursuing as well, interruptions are designed to be less intrusive--nothing
flashes, pops up or makes a noise--and the alerts appear on the periphery of a
screen that's larger than today's standards so that workers stay centered on
their main task. The key, she says, is for an incoming message to provide just
enough information for the worker to judge whether to grab it or ignore it until
later. "We found that it's more calming to give them subtle alerts that aren't
intrusive and which, should you glance at them, let you know whether you need to
worry," she says.
U.C.
Irvine's Mark also thinks improved technology will help, but she points to
low-tech solutions as well. Some companies, she notes, give employees DO NOT
INTERRUPT screens to put over their cubicles or establish quiet times when it's
not permissible to bother a colleague. In some offices, she says, "workers wear
colored hats to signify when they do and do not want to be interrupted." Another
simple trick, suggests Spira, is to leave more explicit instructions on e-mail
"away messages" and answering machines about how and when you prefer to be
interrupted.
But to
truly take control of our productivity, we also have to stop fooling ourselves
about our capacities to juggle. We have to resist the "it will only take a
second" impulse to read an e-mail, check a stock price or chat with a colleague
in the middle of a demanding assignment. At the same time, we have to stop
pretending that we are machines that can endlessly process tasks without a
break. There's a reason that research shows the No. 1 work interruption is not
an electronic signal but rather a human being stopping by. It's the same reason
a personal call feels welcome even when you are superbusy. We are social
creatures, and to do our best work, we need to set aside time in the workday to
connect with others--and also to break free from our checklist and just
think.
Psychiatrist
Hallowell offers some basic solutions to multitasking mania in a book to be
published in April, titled CrazyBusy: Overstretched, Overbooked and About to
Snap--Strategies for Coping in a World Gone ADD. Among his suggestions:
prioritize ruthlessly ("Cultivate the lilies, or the things that fulfill you,"
he says, "and cut the leeches, those that deplete you"), allot 30 minutes a day
for thinking, relaxing or meditating, and get significant doses of what he calls
vitamin C--the live connection to other people. "As much as we are connected
electronically, we have disconnected interpersonally," he says. Compulsive
screen sucking, he suggests, may actually be a symptom of vitamin-C deficiency.
To perform your best, maintain your individual creativity and avoid the pitfalls
of ADT, he insists, "you want to have some face-to-face moments of closeness."
And when you do, turn off that blinking BlackBerry.
—With reporting by Wendy Cole/Chicago