Empathy by the Book: How Fiction Affects Behavior

Not all genres have the same effect, research shows

By SUSAN PINKER


When I want to escape, I pick up a good novel. But does this habit provide more than a quick getaway?

We’ve long known about the collateral benefits of habitual reading—a richer vocabulary, for example. But that’s only part of the picture. Mounting evidence over the past decade suggests that the mental calisthenics required to live inside a fictional character’s skin foster empathy for the people you meet day-to-day.

In 2006, a study led by University of Toronto psychologists Keith Oatley and Raymond Mar connected fiction-reading with increased sensitivity to others. To measure how much text participants had been exposed to over their lifetimes, the researchers gave them an author-recognition test—a standard measure in this kind of study. “The more fiction people read, the better they empathized,” was how Dr. Oatley summarized the findings. The effect didn’t hold for nonfiction.

Still, no one knew whether reading fiction fostered empathy or empathy fostered an interest in fiction. Other factors could have been at play too, like personality.

So, in 2009, members of the Oatley-Mar team replicated the 2006 study with a sample of 252 adults—this time controlling for age, gender, IQ, English fluency, stress, loneliness and personality type. The researchers also assessed participants’ “tendency to be transported by a narrative”—the sense that you’re experiencing a story from within, not watching it as an outsider.

Finally, participants took an objective test of empathy, called the Reading the Mind in the Eyes Test. The aim of all of this was to see how long-term exposure to fiction influenced their ability to intuit the emotions and intentions of people in the real world.

The results? Once competing variables were statistically stripped away, fiction reading predicted higher levels of empathy. Such readers also lived large in the flesh-and-blood social sphere, with richer networks of people to provide entertainment and support than people who read less fiction. This finding put to rest the stereotype of bookworms as social misfits who use fictional characters as avatars for real friends and romantic partners.

Later studies confirmed that reading fiction does cause a spike in the ability to detect and understand other people’s emotions—at least in the short term. In a series of experiments published in 2013 in Science, social psychologist Emanuele Castano and David Comer Kidd of the New School for Social Research tried to figure out whether the type of fiction mattered.

The researchers handed subjects—in groups ranging in size from 69 to 356—different types of genre fiction, literary fiction or nonfiction, or nothing to read at all. They then assessed participants on several measures of empathy. Nonfiction—along with horror, sci-fi or romance novels—had little effect on the capacity to detect others’ feelings and thoughts. Only literary fiction, which requires readers to work at guessing characters’ motivations from subtle cues, fostered empathy.

In these studies, the reading of nonfiction not only failed to spur empathy but also predicted loneliness and social isolation, especially among men. Of course, nonfiction reading has its virtues. Other research suggests that various kinds of nonfiction can prompt empathetic feelings—as long as the narrative is moving and transformative.

In recent studies, neuroscientist Paul Zak at Claremont Graduate University and colleagues showed participants heartfelt stories, such as a video narrated by a father of a toddler with brain cancer. The video induced a spike in observers’ levels of oxytocin—a hormone that promotes trust, nurturing and empathy—and larger donations to charity. Watching a straightforward travelogue-type video of the same father and son visiting the zoo didn’t have that effect.

Apparently, what matters is not whether a story is true. Instead, as Dr. Oatley says, “If you’re enclosed in the bubble of your own life, can you imagine the lives of others?”

 

 

When We Display Our Piety, Our Social Stock Rises

People perceive signs of religious observance in others as a measure of dependability, new research shows

By SUSAN PINKER


One of the many unusual aspects of this presidential campaign has been how little the candidates have discussed religion. Compare this with two previous presidential contenders, among many others who publicly affirmed their faith. When asked in 2000 to name his favorite political thinker, George W. Bush replied, “Christ, because He changed my heart,” while a 1980 ad for Jimmy Carter’s unsuccessful re-election campaign intoned, “He takes the time to pray privately and with Rosalynn each day.”

Perhaps one reason for the change is that “none” is the fastest-growing major religious affiliation in America, as a Pew Research Center survey showed last year. Given this shifting terrain, does being visibly devout still signal that you can be trusted?

Surprisingly, the answer is yes. People perceive signs of religious observance in others as a measure of dependability, new research shows. Whether one fasts on Yom Kippur, wears a cross of ash for Lent or places a red dot in the middle of one’s forehead, such religious “badges” do more than just signal that you belong to a particular group. Other people see these displays as a shorthand for reliability.

In four experiments published last year in the journal Psychology of Religion and Spirituality, the anthropologist Richard Sosis of the University of Connecticut and his colleagues assembled a stack of facial photographs, altering one fifth of the images so that the people pictured appeared to be wearing a cross around their necks or a cross of ash on their foreheads. (The experiments were conducted between Ash Wednesday and Easter.) The researchers interspersed these altered images with photos of people without any religious adornments.

Several hundred university students of varying backgrounds then examined the stack of photos, rating each of the faces for trustworthiness. The students also played an economic game during which they entrusted money to players whom they deemed honorable.

The researchers were surprised to discover that a person wearing Christian religious symbols prompted powerful feelings of trust, not only among fellow Christians but also among secular students and members of other religions. The presence of a cross doubled the money that non-Christians were willing to offer someone in the trust game, while the Ash Wednesday cross increased their investment by 38.5%.

Other recent studies show that the effect is the same for any religious practice that imposes a cost on the appearance, comfort or finances of believers, or that restricts their diet or sexual behavior. Whether they are Muslims, Jews or Hindus, such displays of devotion burnish the reputations of the observant.

In a fascinating study of Hindu and Christian villagers in South India, published this year in the journal Evolution and Human Behavior, the anthropologist Eleanor Power found that local religious rituals greatly enhanced a participant’s standing in the community. For both Catholics and Hindus, these included onerous pilgrimages; Hindu rituals extended to walking across burning coals and being suspended by hooks through the skin. Participation in these widely accepted demonstrations of devotion also predicted which individuals had pivotal roles in local social networks.

“People are more likely to go to you for support if you undertake such religious acts,” said Dr. Power of the nonprofit Santa Fe Institute. She had access to temple records showing who paid religious fees and joined pilgrimages and processions, along with villagers’ evaluations of their peers and their status in the community. “People will rate you as having a good work ethic, giving good advice and being more generous if you worship regularly and do firewalking or other costly acts,” Dr. Power told me.

Such religious displays make us more likely to turn to these people for leadership. Today’s presidential contenders would perhaps benefit from a greater show of reverence. The harder they work to convey that they believe in something greater than themselves, the more credible they will be to voters.

 

 

Marijuana Makes for Slackers? Now There’s Evidence

By SUSAN PINKER


In cities like Seattle and Vancouver, the marijuana icon has become almost as common on storefronts as the Starbucks mermaid. But there’s one big difference between the products on offer: A venti latte tastes the same everywhere and provides an identical caffeine rush, while marijuana stores offer the drug’s active ingredients in varying combinations, potencies and formats. There is no consistency in testing, standards or labeling.

This matters because marijuana’s two main active ingredients, tetrahydrocannabinol (THC) and cannabidiol (CBD), have contrasting effects on the brain. “THC makes you feel high,” said Catharine Winstanley, a psychology professor at the University of British Columbia who does research on marijuana, while CBD “is responsible for its analgesic, antiseizure and purported anticancer effects.”

In street marijuana, the THC-to-CBD ratio now tends to be 10 to 1, and it is increasing, a trend occurring even at some marijuana clinics, Dr. Winstanley said. And few people know what effect that has on their brains. A new study by Dr. Winstanley’s group in the Journal of Psychiatry and Neuroscience examines how these two chemicals shape our willingness to face a challenge. Does marijuana make us lazy?

To answer this question, the Winstanley team first tested rats to determine which were hard workers and which were slackers. After being injected with THC, CBD or both, the rats had to choose between a simple task with a measly reward and a demanding one that reaped a bigger payoff.

In the easy version, the animals had a minute—an eternity for a rat—to see that a light was on in a chamber and poke their noses inside; success earned them a single sugar pellet. In the hard version, they had only a fifth of a second to notice the light—something a rat, even a stoned one, should have no problem perceiving—and respond. Vigilance in the hard task earned them two pellets instead of one. Under normal circumstances, the vast majority of rats prefer to work harder for the bigger payoff.

The results? “Whether they were workers or slackers to begin with,” Dr. Winstanley reported, “even small amounts of THC made them all slackers.”

THC didn’t impair the rats’ ability to perform, only their willingness to try. That downshift in motivation didn’t happen in rats injected with CBD only.

Later analysis of the rats’ brains showed that those with the greatest reaction to THC also had a greater density of a particular receptor in their anterior cingulate cortex, or ACC. “That area of the brain is very important for people to gear up to face a challenge and stay the course,” Dr. Winstanley said.

A small study shows something similar in humans. Published this month in the journal Psychopharmacology by a University College London team, the study of 17 adults showed that inhaling cannabis with THC alone (versus pot with CBD plus THC, or a placebo) induced people to choose an easy task more often, eschewing the harder one that offered four times the payoff. Neither the researchers nor the subjects knew who had gotten the drug and who the placebo. The effects were short-term: the subjects’ apathy didn’t persist after the high wore off.

Policy makers who want to act on results like these face a complication: the lack of regulatory consistency. The U.S. Drug Enforcement Administration classifies marijuana, like heroin, as an illegal Schedule I drug, while 25 states and the District of Columbia have legalized pot for various purposes. So no national standards exist.

“Thinking that it’s harmless, that you can smoke cannabis and you’ll be fine, is a false assumption,” said Michael Bloomfield, a University College London professor in psychiatry and one of the UCL study’s authors. “THC alters how willing you are to try things that are more difficult.” So next time you go to a clinic—or dealer—you might want to ask about the product’s chemical breakdown.

Medicating Children With ADHD Keeps Them Safer

New research suggests that medication can reduce risky behavior in teenagers with attention deficit hyperactivity disorder, or ADHD

 

By SUSAN PINKER

Updated Aug. 17, 2016 10:23 a.m. ET


If a pill could prevent teenagers from taking dangerous risks, would you consider it for your children?

I’d be tempted. My skateboard- and bicycle-riding son was hit by a car—twice—when he was a teenager. I would have welcomed anything that could have averted those dreadful phone calls from the emergency room.

While some bumps and scares are inevitable for active guys like him, serious misadventures with long-lasting repercussions are often par for the course for a subset of them—those with attention deficit hyperactivity disorder, or ADHD. But a new article suggests that early medication can significantly cut the odds of bad things happening later.

Affecting nearly 9% of all Americans between 4 and 18 years of age, ADHD is one of the most common childhood disorders and also one of the most misunderstood. Its symptoms color almost every aspect of a child’s life—from being able to focus in school to making and keeping friends, reining in fleeting impulses and assessing risk and danger.

Indeed, accidents are the most common cause of death in individuals with ADHD, with one 2015 study of over 710,000 Danish children finding that 10- to 12-year-olds with ADHD were far more likely to be injured than other children their age. Drug treatment made a big difference, however, nearly halving the number of emergency room visits by children with ADHD.

Medicating children to address problems with attention and self-control remains controversial. ADHD isn’t visible, like chickenpox, nor immediately life-threatening, like asthma. Its distortion of a child’s ability to meet adults’ expectations creates an atmosphere of frustration and blame. So it’s not often taken for what it really is: a neurodevelopmental disorder with genetic roots.

An enduring myth about ADHD is that children grow out of it in adolescence. We now know that a 5-year-old with a bona fide attentional disorder may well become a dreamy, restless and impulsive teenager and adult. Adolescents with ADHD think even less about consequences than the average teenager and are especially thrilled by novelty. They’re more likely than their friends to drink too much, drive like maniacs, abuse drugs and have unprotected sex.

It’s a sobering list. But an article published last month by Princeton researchers Anna Chorniy and Leah Kitashima in the journal Labour Economics shows that treating ADHD with medication during childhood can head off later problems. “We have 11 years of data for every child enrolled in South Carolina Medicaid who was diagnosed with ADHD,” Dr. Chorniy told me. The researchers followed each doctor visit and every prescription for a sample of more than 58,000 children, tracking their health into adulthood.

This long view let the economists compare the behaviors of teens treated with the most common ADHD medications, such as Ritalin, Concerta and Adderall, to the types of risks taken by other children with ADHD who were not treated. The researchers found fewer and less severe injuries and health problems among the treated children: a 3.6% reduction in sexually transmitted infections; 5.8% fewer children who sought screening for sexually transmitted infections (suggesting they had had an unprotected sexual tryst); and 2% fewer teen pregnancies.

That adds up to a lot fewer teenagers in trouble.

The economists did their study based on existing data, but randomized, controlled studies—experiments carefully designed to establish cause-and-effect relationships—have reached the same conclusion: that medication to control ADHD can reduce the disorder’s high price in psychic pain, lost educational opportunity and riven relationships. A child whose disorder is diagnosed and treated early by a trained clinician stands a better chance of growing into a healthy and thoughtful adult.

 

For Better Performance, Give Yourself a Pep Talk

‘Self-talk,’ stories we tell ourselves to change unwanted thoughts, can help us manage our feelings as well as boost our performance—even beyond sports


Most of us yearn to bump up our game. Indeed, telling ourselves we’re better than the competition is so central to the American psyche that self-doubt can seem almost unpatriotic. But does egging ourselves on really help us to get better at anything?

Psychologists have long known that “self-talk” or “self-instruction”—that is, the stories we tell ourselves to change unwanted thoughts and behaviors—can also transform moods. As one feature of cognitive-behavioral therapy, self-talk—such as saying, “I am an interesting person who can make new friends” or “I can focus on one task at a time”—helps depressed people to revamp their way of thinking and thus their ability to cope.

Now a massive online study suggests that such talk can help us not only to manage our feelings but also to boost our performance—and relatively quickly, too.

The recent study takes its cue from sports psychology, which shows that self-instruction can push athletes to persist on quick tests of endurance or on highly technical bursts of effort, such as volleyball serves.

British sports psychologist Andy Lane at the University of Wolverhampton led the experiment in conjunction with the BBC Lab, a (now closed) arm of the broadcasting service, which invited volunteers to be citizen scientists. In an interesting twist on attracting research subjects, actor Ricky Gervais and Olympic sprinter Michael Johnson promoted the study on BBC TV before the 2012 Summer Olympics in London. Nearly 45,000 people participated, an enormous sample for a study in psychology.

The volunteers filled out questionnaires on home computers about their emotions and played a series of online number-finding games, with instructions and feedback narrated by Mr. Johnson. The yearlong study came out in March in the journal Frontiers in Psychology.

After completing practice and baseline tests, participants were randomly assigned to one of four groups: self-talk, imagery (e.g., imagining oneself reacting more quickly), “if-then planning” (e.g., planning a reaction to what might happen while competing) and a control group, which encouraged reflection on performance but didn’t give any instructions or motivational hints. The subjects had to beat their previous performance as well as an “opponent”—actually a computer algorithm matched to their skill level. The scientists wanted to know which of the interventions would help people to manage their emotions when under pressure.

The results showed that simple self-talk—like saying “I want to be the best” or “I’m going to try as hard as possible”—was the most effective technique, especially if the script was about increasing motivation. Self-talk focused on specific goals, such as “I’m going to get a score of 90,” didn’t work as well.

One caveat: There’s a world of difference between effort and skill—as anyone who has ever tried to swim faster or master the violin knows well. Roy Baumeister, a psychology professor at Florida State University, said that a number-finding challenge (like that in Dr. Lane’s study) “is based on effort; there’s not much skill involved. In that context, self-talk can help with effort. I’m not so sure about skill.”

Dr. Baumeister, who has researched how emotion shapes behavior, added that when it comes to skill and effort, “what works with one will not work with the other.” Choking under pressure decreases our ability to show what we can do—it inhibits our skills—whereas pressure usually increases effort, he explained.

Dr. Lane agrees that his results should be generalized only to brief tasks that require tremendous exertion—say, weight training or sprinting. “The language you tell yourself in these situations is usually negative, and you get some unpleasant emotions. But you can train your emotions to say, ‘You can endure another five or 10 seconds.’ So instead of being demoralized, you teach yourself to push just a little bit harder and a little bit longer.”

A Pair of Witnesses Can Be Better Than One

New research questions the assumption that police should interview witnesses to a crime separately


If you witness a crime, what’s the best way to recall what happened? Minutes to months later, police might ask you the color of the perp’s eyes, the design of his tattoo or how long it took him to pull out a gun after he entered the room. Would it be better to recount your story on your own or alongside the person you were with at the time?

Anyone who has ever watched a police procedural can answer that question. Witnesses are always interviewed alone, in a dismal, windowless holding cell—that is, if the interview takes place in Hollywood.

But the isolation of witnesses is not just for dramatic effect. Psychologists have long warned police that one witness can contaminate another’s testimony. Social pressures can make someone change his tune, or errors might be introduced into the testimony.

Contagion and the power of suggestion might also convince witnesses that events that never happened actually occurred. That was the case during the 1980s epidemic of “repressed” memory syndrome, which focused on false claims of childhood sexual abuse. In 2013, researchers at the Massachusetts Institute of Technology were even able to plant false memories in lab mice, leaving neurochemical traces indistinguishable from the neural footprints left by real experiences.

Still, even if memory is highly malleable, two new studies show that there are big benefits in bringing witnesses together to collaborate on testimony. The findings could shake up decades of practice in legal circles.

One study, published in the journal Memory this past May, shows that witnesses who are interviewed together do, in fact, influence each other. But they also correct and amplify each other’s accounts of the same event, increasing accuracy in the process. The opportunity to edit each other’s memories allows pairs of witnesses to make fewer errors than witnesses who are questioned on their own.

The researchers, led by the Dutch legal psychology professors Annelies Vredeveldt and Peter van Koppen of VU University Amsterdam, asked people who had recently seen a play to describe a violent, emotional scene. Of 53 adults who saw the same play on three separate nights, 36 came to the theater as couples and were interviewed together afterward; they ranged from spouses to one pair that had just met for the first time. The 17 others were interviewed individually. The researchers then compared the two groups: Who would produce a more accurate chronicle of a rape-and-murder scene acted onstage the week before?

As expected, the people who came together corrected each other’s errors. But a second, more intriguing finding surfaced, too: A couple’s communication skills influenced how much detail they remembered. “People who repeated, rephrased or elaborated on what their partner just said remembered more,” Prof. Vredeveldt told me.

It wasn’t how long they had been married or how well the couple knew each other that mattered. It was their skill in creating a joint narrative. If the husband described the crime victim as wearing “some type of dress,” the wife might add “one that opens at the front.” When the man agreed and added that the dress was white, the wife concurred, then specified that it was “a dirty white.” Because the researchers didn’t examine the features of a pair’s relationship, they plan to find out what happens if the couples are strangers or are asked to remember an event in groups.

A second study, published in Legal and Criminological Psychology in June with some of the same authors, used a larger sample and more controlled conditions—and had the same findings: People interviewed together made fewer errors.

Prof. Vredeveldt still believes that people should be interviewed individually. “But instead of sending them home after that, you might generate more leads and fewer errors if you put witnesses together.” When couples bounce ideas off each other, she said, “the whole is greater than the sum of its parts.”

To Beat the Blues, Visits Must Be Real, Not Virtual

Loneliness keeps increasing, but new research suggests that electronic ways of keeping in touch do little compared with in-person contact


Imagine being stranded on a desert island with a roof over your head and sufficient provisions—but no human contact other than what you can get from your smartphone. Would you get depressed? Or would your networked device provide enough connection to stave off dark thoughts?

This metaphor applies to a great many Americans. Their basic material needs are covered, and 85% have internet access. Yet at least 26% say that they feel deeply lonely. Psychologists know this from population surveys, not because people talk about it. The distress of feeling rejected or neglected by friends or family is a key predictor of depression, chronic illness and premature death. It’s also a public-health time bomb. The rate of loneliness has increased from about 14% in the 1970s to over 40% among middle-aged and older adults today, and the aging of America’s population is likely to make things worse in the years ahead.

Few public health initiatives aim at combating loneliness, despite the fact that it’s riskier to health and survival than cigarette smoking or obesity. It’s also a taboo topic. Doctors don’t often ask about it, and we might not fess up, even if they did. There’s a fine line between loneliness and exclusion, and who wants to admit to that?

Many of us expect our smartphones and tablets to be the perfect antidote to social malaise. But do virtual experiences provide that visceral sense of belonging so important to being human?

A recent study pokes a hole in that assumption. Alan Teo, an assistant professor at Oregon Health & Science University, followed 11,000 adults over age 50 who participated in a national study of aging at some point between 2004 and 2010. He and his colleagues wanted to know what type of social contact or lack of it might predict clinical depression two years later.

Major depression, the disease of dark thoughts, hits 16% of all Americans, who are twice as likely to be diagnosed with it during their lifetimes as they are to be diagnosed with cancer. Yet there’s not much talk of prevention.

The research team, which published its findings last October in the Journal of the American Geriatrics Society, controlled for demographic factors like age and sex—as well as for any medical, family or psychological history that might boost one’s depression risk. They found that only face-to-face interaction forestalled depression in older adults. Phone calls made a difference to people with a history of mood disorders but not to anyone else. Email and texts had no impact at all.

How often people got together with friends and family—or didn’t—turned out to be key. What’s more, the researchers discovered that the more in-person contact there was in the present, the less likely the specter of depression in the future.

People who had face-to-face contact with children, friends and family as infrequently as every few months had the highest rates of the disease. Those who connected with people in person at least three times a week had the lowest.

“That’s the beauty of it,” Dr. Teo told me. “The more often they got together in person, the better off they were.”

Winston Churchill called his own bouts of depression “my black dog,” and we know that it can be a tenacious foe. This study tells us that a cheap and easy way to foil it is in-person interaction, and that how you connect and with whom is important: People between the ages of 50 and 70 were best protected by face-to-face contact with their friends. Over the age of 70, it was in-person contact with family that mattered most.

Of course, as Dr. Teo said, phone and email are still great for making social plans. But to keep dark and dangerous thoughts at bay, you have to leave your desert island now and then and be there, in the flesh.

 

How Babies Quickly Learn to Judge Adults

Even if toddlers can’t tell us, they are making hard-and-fast judgments about adults


Adults often make snap judgments about babies. First impressions lead us to assign them personalities, such as fearful, active or easy to please, and with good reason. Fifty years of evidence shows that babies begin life with traits that set the stage for how they interact with the world—and how the world reacts to them.

That might be one reason why siblings can have such wildly different takes on their own families. Once a mother has assessed her child as shy or fussy, she tends to tailor her behavior to that baby’s personality.

But what if babies make hard-and-fast judgments about us, too? Just because they can’t say much doesn’t mean they don’t have strong opinions. New research shows that babies are astute observers of the emotional tenor of adult interactions and censor their own behavior accordingly. Published in the March issue of Developmental Psychology, the study shows that infants who get a glimpse of a stranger involved in an angry exchange with another stranger will then act more tentatively during play.

The study’s lead authors, Betty Repacholi and Andrew Meltzoff, both of the University of Washington, explained that infants who witness an emotional outburst then expect that person to lose his cool again in a new situation. “Babies are registering how we respond emotionally,” Dr. Meltzoff said, “taking notes on how we typically react.”

The experiment included 270 15-month-old toddlers who watched two adults unfamiliar to them demonstrating how to play with an intriguing new toy. One adult, called “the emoter,” reacted either neutrally or angrily to the other adult’s attempts to play with the toy, showing her emotional cards by commenting “that’s entertaining” in a dispassionate tone or “that’s aggravating” in an angry rebuke.

The babies who witnessed the adult’s harsh reaction were then more likely to hang back before touching the intriguing toy. Even if the anger-prone adult had turned her back, and even when a different plaything was offered, the child’s hesitation was palpable. Some toddlers avoided the toy altogether.

Taking an adult’s emotional temperature happened quickly. Each baby was tested three times, but it usually took just one instance of verbal aggression for the baby to pigeonhole an adult as a hothead. The babies had “formed an impression about [the adult’s] psychological makeup, that this is an angry person,” said Dr. Repacholi.

What’s more, a secondhand brush with a riled-up adult will prompt toddlers to mollify that person. Other studies by Drs. Repacholi and Meltzoff and colleagues, using the same “eavesdropping on two strangers” design and published in February in the journal Infancy, showed that toddlers who witness an adult’s anger are more likely than other toddlers to hand over a prized toy. “Because they’ve learned that the adult is anger-prone, they try to appease her,” Dr. Repacholi said.

Well before they attribute thoughts and motivations to other people, young toddlers suss out any volatility in the adults around them, these studies show. But the findings also prompt some deeper questions. If brief shows of anger put toddlers on high alert, what might this mean for the inevitable conflicts that occur in family life?

As their studies involved babies observing the interactions between two people they had never met before, Drs. Repacholi and Meltzoff explained, their findings don’t really reproduce family life, during which parents and siblings show all kinds of feelings in various situations. “They have a history of interacting with their babies that the strangers in our study did not have,” Dr. Meltzoff wrote in an email.

Getting angry occasionally is not going to override the positive expectations that babies have built up about you over months of loving encounters, he told me. Still, “we are catching a glimpse of how babies pigeonhole us, and how they would describe our personalities, if they could only talk.”

The Perilous Aftermath of a Simple Concussion

Susan Pinker on how a concussion was both a personal struggle for her and a catalyst to study a phenomenon still only partly understood


Eighteen months ago, a pickup truck hit me while I was walking on the sidewalk. The last thing I remember from that sunny Tuesday morning was reaching for my car keys. Then the lights went out.

I regained consciousness while being slid out of the scanner in the trauma unit. I had two fractures, two torn tendons, some wicked road rash and a concussion. The accident shook up my relationships, my memory and my personal drive. Still, everyone told me I was lucky to be alive, and I agree. Life has never seemed as tenuous—or as precious.

Not everyone with a mild brain injury is as lucky. A 20-year study published in February in the Canadian Medical Association Journal shows that people who have had a mild concussion are twice as likely to commit suicide as military personnel and more than three times as likely as members of the general population. Most of us agree with the bromide that time heals all wounds, but this study showed the opposite: After the concussion the risk of suicide rose steadily over time.

Led by Donald Redelmeier, a professor of medicine at the University of Toronto, the study included 235,110 previously healthy people who had seen a doctor after a concussion. Most of the patients had no prior psychiatric diagnosis, hospital admission or suicide attempt.

“In the aftermath of a crash there is tremendous agony. But the broken ribs and leg will heal,” Dr. Redelmeier told me. “I’m not as sanguine about a concussion. Even when the CT scan doesn’t show major trauma, a minor injury can damage thousands and thousands of neurons. There are all sorts of problems that can last a long time, and we don’t know how to treat them.”

That clinical gap was clear. As I was leaving the emergency room, a staffer handed me a tip sheet written for teenage hockey players. (I live in Canada, after all.) There was no information for adults, nor anything on women and girls—who are known to be at greater risk of long-term problems after a concussion. When I asked the surgeon about cognitive symptoms during a follow-up visit, he exclaimed, “One concussion! The risk comes with more than one knock.” He added, “You’ll be fine.”

But I was far from fine. I spent mornings doing media interviews by phone about my new book, trying to sound upbeat. I spent afternoons sleeping, sobbing or staring at the clock, willing the time to pass so I could take another dose of oxycodone.

Meanwhile, well-meaning friends and colleagues were suggesting that the accident was some sort of warning. “This is God’s way of telling you to slow down,” said one. “Were you texting? Wearing headphones?” asked another. The refrain was that I should be thankful I’d dodged a karmic bullet and just get on with things.

But life was very different. A year after the accident I invited a friend to a concert—then blithely went on my own, forgetting all about her. I napped like a toddler and, because of fatigue and shoulder pain, couldn’t work a full day. I was never suicidal, thank goodness. But I was racked by an unanswered question: Why did this happen?

That lack of closure leads to one of the most astounding of Dr. Redelmeier’s findings. Compared to weekday accidents, weekend concussions magnify one’s suicide risk. One explanation could be psychological, he said. “If you get hurt at work, you can blame the circumstances. But if you get hurt horseback riding, that might affect how much support and sympathy you get, whether there’s companionship in fighting the good fight, or whether people feel you’re the architect of your own misfortune.”

Understanding why something happened is as important to brain health as a cast is to bone strength, it seems. It turns out that I am lucky, after all. I may never know why the driver didn’t see me. But I do know one thing: I was working that day, and it was a Tuesday.

The Peril of Ignoring Vaccines—and a Solution

Once considered eliminated in the U.S., measles is back. A look at the dangers of shunning vaccines and what can be done.


In 2000, the U.S. declared measles eliminated within its borders, but the picture has changed alarmingly since then. In 2014, 667 unvaccinated people contracted measles. Last year an outbreak that began in California’s Disneyland infected more than 100 people.

Many Americans have been refusing to protect themselves and their children with the measles vaccine. According to a recent study in the American Journal of Public Health, as many as 5.5% of children go unvaccinated in some U.S. communities, and the parents most likely to refuse vaccines tend to be affluent, well-educated and white. Their resistance can largely be traced to a 1998 article in a British medical journal that falsely linked childhood vaccines to autism. That study was debunked and eventually retracted, but the damage had been done.

Now doctors must figure out how to persuade these parents to change their minds. Late last year they got some help from a team of psychologists from the University of Illinois at Urbana-Champaign and the University of California, Los Angeles, who were interested in what might sway anti-vaxxers’ opinions.

Measles can be devastating. A highly contagious and virulent disease, it can lead to convulsions, hearing loss, brain damage and even death. Vaccination efforts have been so successful up to now, however, that almost half of the nation’s pediatricians have never seen a real case. The question is how to make people understand that the threat is real.

Would correcting misconceptions about the childhood vaccine-autism myth do the trick? Or would testimonials and graphic photos of sick children be more effective?

The study, led by Dr. Zachary Horne of the University of Illinois and published last August in the Proceedings of the National Academy of Sciences, asked 315 participants to complete questionnaires about their attitudes to vaccines and their plans to vaccinate their children. The subjects were chosen at random and not prescreened, although some dropped out or were later disqualified for not paying attention to the testing.

The researchers randomly divided subjects into three groups. They showed the “disease risk” group photos of young, infected children with florid rashes and a paragraph written by a mother of a child with measles, as well as three short warnings about the disease. The “autism correction” group read research summaries showing that childhood vaccines do not cause autism. And a control group read unrelated scientific vignettes.

Once again, all the participants completed the questionnaire about attitudes to vaccines. Which intervention was most likely to alter their views?

Surprisingly, the “autism correction” approach was no more influential in changing anti-vaxxers’ minds than the control condition. Telling people that their beliefs aren’t true just didn’t work. But showing people images of sick children with ugly rashes did, as did reading a parent’s account of how it feels to have a baby with measles who is spiking a fever of 106 degrees. “We spent three days in the hospital fearing we might lose our baby boy,” the mother wrote. “He couldn’t drink or eat, so he was on an IV, and for a while he seemed to be wasting away.”

Why would frightening people change their minds more than giving them the facts? The human brain evolved to give priority to appalling, negative events over positive ones, according to a seminal paper published in the Review of General Psychology in 2001. Lead author Roy Baumeister, a psychology professor at Florida State University, documented hundreds of ways that “bad is stronger than good,” as he and his colleagues titled the paper. It’s a position that has been confirmed by the PNAS study on vaccines and by a 2015 analysis in Psychological Bulletin of the impact of fear-based appeals on changing people’s behavior.

So, public officials, go ahead—scare parents silly.

 
