Watching Terror and Other Traumas Can Deeply Hurt Teenagers

The focus: web exposure to the Boston Marathon bombing

By Susan Pinker

 

Several weeks into the Gulf War, in early 1991, when I was working as a clinical psychologist in Montreal, pediatricians started referring children with mysterious behavior problems. A few 8- to 10-year-olds had started to wet their beds for the first time since they were toddlers. Others were refusing to go to school. Their first interviews revealed a common thread: Though strangers to each other, the children and their families were all visitors or immigrants from Israel, and they had been watching Scud missile attacks against Israel on the news every day.

These children were living more than 5,000 miles from the bombing, but observing the devastation from afar—and their parents’ reactions to it—had elicited an emotional response that skewed their habits. New research is now helping us to understand how second-hand exposure to disasters can change our brains and behavior, and who is most at risk.

“We used to think of it as a bull’s-eye. The closer you were to the trauma, the more likely you were to show symptoms,” said Jonathan Comer, a professor of psychology at Florida International University who investigates the impact of terrorist attacks on children and families. “But a summary of post-9/11 reactions really challenged that idea.” Those closest to the event didn’t necessarily show the most trauma, said Prof. Comer.

Prof. Comer’s own research upends our assumptions about who is most prone to feel psychological fallout after a terrorist attack. In a study of the 2013 Boston Marathon bombing, published last summer in Evidence-Based Practice in Child and Adolescent Mental Health, the research team surveyed nearly 500 parents of 4- to 19-year-olds about their children’s internet exposure during the bombing and the manhunt that followed. The surveyed families all lived within 25 miles of either the bombing site itself or Watertown, Mass., where the police had ordered the local population to shelter in place during a search for suspects.

The findings? As one would expect, exposure to graphic online content rose with age. Over 23% of children saw internet images of the bombing, and 68% viewed footage of police with military-grade weapons searching their neighborhoods. The average was two to three hours of daily online activity per child. While those under age 8 were less exposed, three-quarters of children over 12 spent up to six hours daily viewing online news and social media coverage of the crisis.

Teenagers were the most prone of all age groups to experience psychological trauma after the bombing. The researchers found that the more internet news and social media contact they had about the bombing and manhunt, the more severe were their PTSD symptoms, which ranged from intrusive flashbacks to emotional numbing. What’s more, even though 87% of parents believed that online exposure to the crisis could be damaging, very few restricted their children’s access to it. And what parents thought was private—their own fear—turned out to be contagious, too, as shown in a 2014 study of parents’ reactions to the marathon bombing by Prof. Comer’s team.

To understand how observational distress works, a team led by Alexei Morozov at the Virginia Tech Carilion Research Institute looked at what happens when mice watch a sibling experience fear. Dr. Morozov first learned that vicarious fear leaves a neural trace that predisposes animals to experience trauma later on. A study from his team, published last month in Neuropsychopharmacology, found that vicarious trauma changes the connections between the amygdala (roughly, our emotion and survival center) and the prefrontal cortex (the planning and decision-making area). “After the animal has the experience of the other’s pain, it allows excitation to be more robust…it reduces the ability of the prefrontal cortex to act rationally,” he told me.

That’s one reason why “watching around-the-clock breaking news is not in our best interest—either for adults or children,” said Prof. Comer. After all, it’s not how much danger we’re in that matters; it’s how much threat we perceive in others.

What You Just Forgot May Be ‘Sleeping’

Can’t remember what you were just thinking about? A new study amends our understanding of how memory works

By Susan Pinker

 

If you’ve ever forgotten why you just entered a room, you know how fickle memory can be. One moment it’s obvious why you walked down the hall, and the next moment you’re standing there befuddled.

Here today, gone in a millisecond. At least that’s how we used to think about short-term, or working, memory. But a study just published in the journal Science tells a different story. A recent idea or word that you’re trying to recall has not, in fact, gone AWOL, as we previously thought. According to new brain-decoding techniques, it’s just sleeping.

“Earlier experiments show that a neural representation of a word disappeared,” said the study’s lead author, Brad Postle, a professor of psychology and psychiatry at the University of Wisconsin-Madison. But by using a trio of cutting-edge techniques, Dr. Postle and his team have revealed just where the neural trace of that word is held until it can be cued up again.

Their study amends the long-standing view of how memory works. Until now, psychologists thought that short-term memory evaporates when you stop thinking about something, while long-term memory permanently rewires neural connections. The new research reveals a neural signature for a third type of memory: behind-the-scenes thoughts that are warehoused in the brain.

In the study’s four experiments, a total of 65 students viewed a pair of images—some combination of a word, a face or a cloud of moving dots—on a screen. During a 10-second period, the students were prompted to think about one of the two images they had seen. After a brief delay, they had to confirm whether a picture they saw matched one of the first two images.

Throughout the experiment, the Postle team monitored the students’ pattern of neural activity. When they prompted the students to remember a particular image, a unique 3-D display of neural activity corresponding to that idea popped up. What really interested the experimenters, though, was what the brain was doing with the image it had effectively set aside. Where did that memory go while it was waiting in the wings?

To find out, the team used a new technique to see what happened when the participants were warned that they would be tested on the set-aside image. This novel approach created a dynamic 3-D display of electroencephalogram (brain wave) and brain-imaging data that let the researchers see beyond what part of the brain “lights up” and zoom in on the pattern of activity within a region. That’s how the team learned that the students’ brain activity had indeed shifted to the “on hold” image’s distinctive pattern—which until then had been invisible.

To confirm that the memory still existed even while a person was not thinking about it, the scientists used another recent technique, transcranial magnetic stimulation, or TMS. They positioned a wand over a participant’s scalp and delivered a harmless magnetic pulse to the brain areas that held the images. The pulse made the distinctive neural signature of those fleeting memories visible to the scientists and triggered their recall in the students.

Dr. Postle compared working memory to paper inscribed with invisible ink. Words written in lemon juice are initially imperceptible, but by passing a hot cup of coffee over the paper, “you can see the part of the message that was heated up…. Our TMS is like the coffee cup.” In this way the team activated a memory that was not only temporary but below the student’s level of consciousness.

Dr. Postle’s new trifecta of brain-imaging and brain-stimulation techniques, which can reactivate forgotten memories, has enticing—though still remote—therapeutic possibilities. It also offers neuroscience’s most faithful reading yet of the real-time content of our thoughts—about as close as we have ever come to mind-reading.

“Our study suggests that there’s information in the penumbra of our awareness. We are not aware that it’s there, but it’s potentially accessible,” said Dr. Postle.

Why Lessons From Chimp Mothers Last a Lifetime

A grooming study suggests the powerful influence of moms (human ones, too)

By Susan Pinker

 

My mother taught me how to cook, learn my times tables and read with a critical eye—and, in the social realm, how to mind my manners and reach out to others.

But maternal mentors are hardly exclusive to humans. Vervet-monkey moms show their infants their own way to clean off fruit, while chimpanzee mothers teach their toddlers just which stick is best for termite-fishing and how to use rocks to crack nuts. Some bottlenose-dolphin mothers show their young how to find sponges to protect their sensitive snouts while scouring the sea floor for treats—a safety measure that resonated with the mother in me.

New evidence shows that some mammal mothers—specifically, chimps—also transmit to offspring their unique style of socializing. (In contrast, nonhuman primate fathers rarely get involved in teaching children.) The new research focuses on grooming, a primary feature of an ape’s social life.

Chimp grooming does double duty: Picking through another animal’s fur controls parasites and also establishes a trusting social bond. Grooming can be a relaxing pastime, a come-on or a way to forge alliances. As with human social behavior, chimps do it in different ways.

In what primatologists call “high-arm grooming,” two chimps groom each other face-to-face, each with one arm raised in the air. Using their opposite hands to comb through each other’s fur, the pair might be clasping their raised hands aloft or leaning their forearms together overhead. Either way, chimps of both sexes practice this unique grooming style well into adulthood.

While male chimps stay where they are born all their lives, most females migrate to new groups at adolescence. Wherever they end up, the ones from the high-arm, hand-holding community then teach this endearing posture to their own progeny. “It’s a social custom inherited from mother to offspring and not known in any primate before,” said Richard Wrangham, a professor of biological anthropology at Harvard. He and his team observed this unusual behavior among wild chimps living in the Kanyawara community of Uganda’s Kibale National Park.

As described last month in the journal Current Biology, the researchers analyzed 932 photos of high-arm grooming among 36 wild chimps, half of them female. The team had expected that when the adolescent female chimps immigrated to a new community, they would conform to its customs. Teenagers like to fit in, after all.

But a close look at which chimps held hands showed that what mattered was how mom did it. “We’ve got individuals up to 40 years old who are following the maternal pattern,” Dr. Wrangham told me, adding that even after the mother dies, her offspring keep doing things her way.

Intriguingly, hanging out with other family members or peers made no difference in grooming style—regardless of how close the relationship or how much time the animals spent together. That makes sense, given that primate mothers are crucial to the education and survival of their offspring. In a 2006 study of wild chimps in Tanzania’s Gombe National Park, the amount of time spent watching mom fish for termites correlated with how skilled 6-year-old chimps became at that task. Female children copied their mothers more faithfully and so became more efficient learners.

Human mothers also have a uniquely powerful effect on their children’s behavior. As mammals and primates, they take time to coach their young ones, who then copy what they do. I’m not discounting the importance of fathers, but it looks like we belong to a large evolutionary family that learns enduring lessons at our mothers’ feet.

Empathy by the Book: How Fiction Affects Behavior

Not all genres have the same effect, research shows

By Susan Pinker

 

When I want to escape I pick up a good novel. But does this habit provide more than a quick getaway?

We’ve long known about the collateral benefits of habitual reading—a richer vocabulary, for example. But that’s only part of the picture. Mounting evidence over the past decade suggests that the mental calisthenics required to live inside a fictional character’s skin foster empathy for the people you meet day-to-day.

In 2006, a study led by University of Toronto psychologists Keith Oatley and Raymond Mar connected fiction-reading with increased sensitivity to others. To measure how much text participants had encountered over their lifetimes, the researchers gave them an author-recognition test—a standard measure in studies of this kind. “The more fiction people read, the better they empathized,” was how Dr. Oatley summarized the findings. The effect didn’t hold for nonfiction.

Still, no one knew whether reading fiction fostered empathy or empathy fostered an interest in fiction. Other factors could have been at play too, like personality.

So, in 2009, members of the same team reproduced the 2006 study with a sample of 252 adults—this time controlling for age, gender, IQ, English fluency, stress, loneliness and personality type. The researchers also assessed participants’ “tendency to be transported by a narrative”—the sense that you’re experiencing a story from within, not watching it as an outsider.

Finally, participants took an objective test of empathy, called the Reading the Mind in the Eyes Test. The aim of all of this was to see how long-term exposure to fiction influenced their ability to intuit the emotions and intentions of people in the real world.

The results? Once competing variables were statistically stripped away, fiction reading predicted higher levels of empathy. Such readers also lived large in the flesh-and-blood social sphere, with richer networks of people to provide entertainment and support than people who read less fiction. This finding put to rest the stereotype of bookworms as social misfits who use fictional characters as avatars for real friends and romantic partners.

Later studies confirmed that reading fiction does cause a spike in the ability to detect and understand other people’s emotions—at least in the short term. In a series of experiments published in 2013 in Science, social psychologist Emanuele Castano and David Comer Kidd of the New School for Social Research tried to figure out whether the type of fiction mattered.

The researchers handed subjects—in groups ranging in size from 69 to 356—different types of genre fiction, literary fiction or nonfiction, or nothing to read at all. They then assessed participants on several measures of empathy. Nonfiction—along with horror, sci-fi or romance novels—had little effect on the capacity to detect others’ feelings and thoughts. Only literary fiction, which requires readers to work at guessing characters’ motivations from subtle cues, fostered empathy.

In these studies, the reading of nonfiction not only failed to spur empathy but also predicted loneliness and social isolation, especially among men. Of course, nonfiction reading has its virtues. Other research suggests that various kinds of nonfiction can prompt empathetic feelings—as long as the narrative is moving and transformative.

In recent studies, neuroscientist Paul Zak at Claremont Graduate University and colleagues showed participants heartfelt stories, such as a video narrated by a father of a toddler with brain cancer. The video induced a spike in observers’ levels of oxytocin—a hormone that promotes trust, nurturing and empathy—and larger donations to charity. Watching a straightforward travelogue-type video of the same father and son visiting the zoo didn’t have that effect.

Apparently, what matters is not whether a story is true. Instead, as Dr. Oatley says, “If you’re enclosed in the bubble of your own life, can you imagine the lives of others?”

 

 

When We Display Our Piety, Our Social Stock Rises

People perceive signs of religious observance in others as a measure of dependability, new research shows

By Susan Pinker

 

One of the many unusual aspects of this presidential campaign has been how little the candidates have discussed religion. Compare this with two previous presidential contenders, among many others who publicly affirmed their faith. When asked in 2000 to name his favorite political thinker, George W. Bush replied, “Christ, because He changed my heart,” while a 1980 ad for Jimmy Carter’s unsuccessful re-election campaign intoned, “He takes the time to pray privately and with Rosalynn each day.”

Perhaps one reason for the change is that “none” is the fastest-growing major religious affiliation in America, as a Pew Research Center survey showed last year. Given this shifting terrain, does being visibly devout still signal that you can be trusted?

Surprisingly, the answer is yes. People perceive signs of religious observance in others as a measure of dependability, new research shows. Whether one fasts on Yom Kippur, wears a cross of ash for Lent or places a red dot in the middle of one’s forehead, such religious “badges” do more than just signal that you belong to a particular group. Other people see these displays as a shorthand for reliability.

In four experiments published last year in the journal Psychology of Religion and Spirituality, the anthropologist Richard Sosis of the University of Connecticut and his colleagues assembled photographs of faces, altering one-fifth of the images so that the people in them appeared to be wearing a cross around their necks or a cross of ash on their foreheads. The experiments were conducted between Ash Wednesday and Easter. The researchers interspersed the altered images with photos of people wearing no religious adornments.

Several hundred university students of varying backgrounds then examined the stack of photos, rating each of the faces for trustworthiness. The students also played an economic game during which they entrusted money to players whom they deemed honorable.

The researchers were surprised to discover that a person wearing Christian religious symbols prompted powerful feelings of trust, not only among fellow Christians but also among secular students and members of other religions. The presence of a cross doubled the money that non-Christians were willing to offer someone in the trust game, while the Ash Wednesday cross increased their investment by 38.5%.

Other recent studies show that the effect is the same for any religious practice that imposes a cost on the appearance, comfort or finances of believers, or that restricts their diet or sexual behavior. Whether they are Muslims, Jews or Hindus, such displays of devotion burnish the reputations of the observant.

In a fascinating study of Hindu and Christian villagers in South India, published this year in the journal Evolution and Human Behavior, the anthropologist Eleanor Power found that local religious rituals greatly enhanced a participant’s standing in the community. For both Catholics and Hindus, these included onerous pilgrimages; Hindu rituals included walking across burning coals and being suspended by hooks in one’s skin. Participation in these widely accepted demonstrations of devotion also predicted which individuals had pivotal roles in local social networks.

“People are more likely to go to you for support if you undertake such religious acts,” said Dr. Power of the nonprofit Santa Fe Institute. She had access to temple records showing who paid religious fees and joined pilgrimages and processions, along with villagers’ evaluations of their peers and their status in the community. “People will rate you as having a good work ethic, giving good advice and being more generous if you worship regularly and do firewalking or other costly acts,” Dr. Power told me.

Such religious displays make us more likely to turn to these people for leadership. Today’s presidential contenders would perhaps benefit from a greater show of reverence. The harder they work to convey that they believe in something greater than themselves, the more credible they will be to voters.

 

 

Marijuana Makes for Slackers? Now There’s Evidence

By Susan Pinker

 

In cities like Seattle and Vancouver, the marijuana icon has become almost as common on storefronts as the Starbucks mermaid. But there’s one big difference between the products on offer: A venti latte tastes the same everywhere and provides an identical caffeine rush, while marijuana stores offer the drug’s active ingredients in varying combinations, potencies and formats. There is no consistency in testing, standards or labeling.

This matters because marijuana’s two main active ingredients, tetrahydrocannabinol (THC) and cannabidiol (CBD), have contrasting effects on the brain. “THC makes you feel high,” said Catharine Winstanley, a psychology professor at the University of British Columbia who does research on marijuana, while CBD “is responsible for its analgesic, antiseizure and purported anticancer effects.”

In street marijuana, the THC-to-CBD ratio now tends to be 10 to 1, and it is increasing, a trend occurring even at some marijuana clinics, Dr. Winstanley said. And few people know what effect that has on their brains. A new study by Dr. Winstanley’s group in the Journal of Psychiatry and Neuroscience examines how these two chemicals shape our willingness to face a challenge. Does marijuana make us lazy?

To answer this question, the Winstanley team first tested rats to determine which were hard workers and which were slackers. After being injected with THC, CBD or both, the rats had to choose between a simple task with a measly reward and a demanding one that reaped a bigger payoff.

In the easy version, the animals had a minute—an eternity for a rat—to notice that a light was on in a chamber and poke their noses inside; doing so earned them a single sugar pellet. In the hard version, they had only a fifth of a second to notice the light—something a rat, even a stoned one, should have no problem perceiving—and respond. Their vigilance in the hard version would earn them two sugar pellets instead of one. Under normal circumstances, the vast majority of rats prefer to work harder for a bigger payoff.

The results? “Whether they were workers or slackers to begin with,” Dr. Winstanley reported, “even small amounts of THC made them all slackers.”

THC didn’t impair the rats’ ability to perform, only their willingness to try. That downshift in motivation didn’t happen in rats injected with CBD only.

Later analysis of the rats’ brains showed that those with the greatest reaction to THC also had a greater density of a particular receptor in their anterior cingulate cortex, or ACC. “That area of the brain is very important for people to gear up to face a challenge and stay the course,” Dr. Winstanley said.

A small study shows something similar in humans. Published this month in the journal Psychopharmacology by a University College London team, the study of 17 adults showed that inhaling cannabis with THC alone (versus pot with CBD plus THC, or a placebo) induced people to choose an easy task more often, eschewing a harder one that offered four times the payoff. Neither the researchers nor the subjects knew who had gotten the drug and who the placebo. The effects were short-term, meaning that the subjects’ apathy didn’t persist after the high wore off.

Policy makers trying to act on the results of tests like these face an added complication: a lack of regulatory consistency. The U.S. Drug Enforcement Administration considers marijuana as illegal as heroin, while 25 states and the District of Columbia have legalized pot for various purposes. So no national standards exist.

“Thinking that it’s harmless, that you can smoke cannabis and you’ll be fine, is a false assumption,” said Michael Bloomfield, a University College London professor in psychiatry and one of the UCL study’s authors. “THC alters how willing you are to try things that are more difficult.” So next time you go to a clinic—or dealer—you might want to ask about the product’s chemical breakdown.

Medicating Children With ADHD Keeps Them Safer

New research suggests that medication can reduce risky behavior in teenagers with attention deficit hyperactivity disorder, or ADHD

 

By Susan Pinker

Updated Aug. 17, 2016 10:23 a.m. ET

 

If a pill could prevent teenagers from taking dangerous risks, would you consider it for your children?

I’d be tempted. My skateboard- and bicycle-riding son was hit by a car—twice—when he was a teenager. I would have welcomed anything that could have averted those dreadful phone calls from the emergency room.

While some bumps and scares are inevitable for active guys like him, serious misadventures with long-lasting repercussions are often par for the course for a subset of them—those with attention deficit hyperactivity disorder, or ADHD. But a new article suggests that early medication can significantly cut the odds of bad things happening later.

Affecting nearly 9% of all Americans between 4 and 18 years of age, ADHD is one of the most common childhood disorders and also one of the most misunderstood. Its symptoms color almost every aspect of a child’s life—from being able to focus in school to making and keeping friends, reining in fleeting impulses and assessing risk and danger.

Indeed, accidents are the most common cause of death in individuals with ADHD, with one 2015 study of over 710,000 Danish children finding that 10- to 12-year-olds with ADHD were far more likely to be injured than other children their age. Drug treatment made a big difference, however, nearly halving the number of emergency room visits by children with ADHD.

Medicating children to address problems with attention and self-control remains controversial. ADHD isn’t visible, like chickenpox, nor immediately life-threatening, like asthma. Its distortion of a child’s ability to meet adults’ expectations creates an atmosphere of frustration and blame. So it’s not often taken for what it really is: a neurodevelopmental disorder with genetic roots.

An enduring myth about ADHD is that children grow out of it in adolescence. We now know that a 5-year-old with a bona fide attentional disorder may well become a dreamy, restless and impulsive teenager and adult. Adolescents with ADHD think even less about consequences than the average teenager and are especially thrilled by novelty. They’re more likely than their friends to drink too much, drive like maniacs, abuse drugs and have unprotected sex.

It’s a sobering list. But an article published last month by Princeton researchers Anna Chorniy and Leah Kitashima in the journal Labour Economics shows that treating ADHD with medication during childhood can head off later problems. “We have 11 years of data for every child enrolled in South Carolina Medicaid who was diagnosed with ADHD,” Dr. Chorniy told me. The researchers followed every doctor visit and every prescription for a sample of over 58,000 children, tracking their health into adulthood.

This long view let the economists compare the behaviors of teens treated with the most common ADHD medications, such as Ritalin, Concerta and Adderall, to the types of risks taken by other children with ADHD who were not treated. The researchers found fewer and less severe injuries and health problems among the treated children: a 3.6% reduction in sexually transmitted infections; 5.8% fewer children who sought screening for sexually transmitted infections (suggesting they had had an unprotected sexual tryst); and 2% fewer teen pregnancies.

That adds up to a lot fewer teenagers in trouble.

The economists did their study based on existing data, but randomized, controlled studies—experiments carefully designed to establish cause-and-effect relationships—have reached the same conclusion: that medication to control ADHD can reduce the disorder’s high price in psychic pain, lost educational opportunity and riven relationships. A child whose disorder is diagnosed and treated early by a trained clinician stands a better chance of growing into a healthy and thoughtful adult.

 

To Beat the Blues, Visits Must Be Real, Not Virtual

Loneliness keeps increasing, but new research suggests that electronic ways of keeping in touch do little compared with in-person contact


 

Imagine being stranded on a desert island with a roof over your head and sufficient provisions—but no human contact other than what you can get from your smartphone. Would you get depressed? Or would your networked device provide enough connection to stave off dark thoughts?

This metaphor applies to a great many Americans. Their basic material needs are covered, and 85% have internet access. Yet at least 26% say that they feel deeply lonely. Psychologists know this from population surveys, not because people talk about it. The distress of feeling rejected or neglected by friends or family is a key predictor of depression, chronic illness and premature death. It’s also a public-health time bomb. The rate of loneliness has increased from about 14% in the 1970s to over 40% among middle-aged and older adults today, and the aging of America’s population is likely to make things worse in the years ahead.

Few public health initiatives aim at combating loneliness, despite the fact that it’s riskier to health and survival than cigarette smoking or obesity. It’s also a taboo topic. Doctors don’t often ask about it, and we might not fess up, even if they did. There’s a fine line between loneliness and exclusion, and who wants to admit to that?

Many of us expect our smartphones and tablets to be the perfect antidote to social malaise. But do virtual experiences provide that visceral sense of belonging so important to being human?

A recent study pokes a hole in that assumption. Alan Teo, an assistant professor at Oregon Health & Science University, followed 11,000 adults over age 50 who participated in a national study of aging at some point between 2004 and 2010. He and his colleagues wanted to know what type of social contact or lack of it might predict clinical depression two years later.

Major depression, the disease of dark thoughts, hits 16% of all Americans, who are twice as likely to be diagnosed with it during their lifetimes as they are to be diagnosed with cancer. Yet there’s not much talk of prevention.

The research team, which published its findings last October in the Journal of the American Geriatrics Society, controlled for demographic factors like age and sex—as well as for any medical, family or psychological history that might boost one’s depression risk. They found that only face-to-face interaction forestalled depression in older adults. Phone calls made a difference to people with a history of mood disorders but not to anyone else. Email and texts had no impact at all.

How often people got together with friends and family—or didn’t—turned out to be key. What’s more, the researchers discovered that the more in-person contact there was in the present, the less likely the specter of depression in the future.

People who had face-to-face contact with children, friends and family as infrequently as every few months had the highest rates of the disease. Those who connected with people in person at least three times a week had the lowest.

“That’s the beauty of it,” Dr. Teo told me. “The more often they got together in person, the better off they were.”

Winston Churchill called his own bouts of depression “my black dog,” and we know that it can be a tenacious foe. This study tells us that a cheap and easy way to foil it is in-person interaction, and that how you connect and with whom is important: People between the ages of 50 and 70 were best protected by face-to-face contact with their friends. Over the age of 70, it was in-person contact with family that mattered most.

Of course, as Dr. Teo said, phone and email are still great for making social plans. But to keep dark and dangerous thoughts at bay, you have to leave your desert island now and then and be there, in the flesh.

 

How Babies Quickly Learn to Judge Adults

Even if toddlers can’t tell us, they are making hard and fast judgments about adults


 

Adults often make snap judgments about babies. First impressions lead us to assign them personalities, such as fearful, active or easy to please, and with good reason. Fifty years of evidence shows that babies begin life with traits that set the stage for how they interact with the world—and how the world reacts to them.

That might be one reason why siblings can have such wildly different takes on their own families. Once a mother has assessed her child as shy or fussy, she tends to tailor her behavior to that baby’s personality.

But what if babies make hard and fast judgments about us, too? Just because they can’t say much doesn’t mean they don’t have strong opinions. New research shows that babies are astute observers of the emotional tenor of adult interactions and censor their own behavior accordingly. Published in the March issue of Developmental Psychology, the study shows that infants who get a glimpse of a stranger involved in an angry exchange with another stranger will then act more tentatively during play.

The study’s lead authors, Betty Repacholi and Andrew Meltzoff, both of the University of Washington, explained that infants who witness an emotional outburst then expect that person to lose his cool again in a new situation. “Babies are registering how we respond emotionally,” Dr. Meltzoff said, “taking notes on how we typically react.”

The experiment included 270 15-month-old toddlers who watched two adults unfamiliar to them demonstrating how to play with an intriguing new toy. One adult, called “the emoter,” reacted either neutrally or angrily to the other adult’s attempts to play with the toy, showing her emotional cards by commenting “that’s entertaining” in a dispassionate tone or “that’s aggravating” in an angry rebuke.

The babies who witnessed the adult’s harsh reaction were then more likely to hang back before touching the intriguing toy. Even if the anger-prone adult had turned her back, and even when a different plaything was offered, the child’s hesitation was palpable. Some toddlers avoided the toy altogether.

Taking an adult’s emotional temperature happened quickly. Each baby was tested three times, but it usually took just one instance of verbal aggression for the baby to pigeonhole an adult as a hothead. The babies had “formed an impression about [the adult’s] psychological makeup, that this is an angry person,” said Dr. Repacholi.

What’s more, a secondhand brush with a riled-up adult will prompt toddlers to mollify that person. Other studies by Drs. Repacholi and Meltzoff and colleagues, using the same “eavesdropping on two strangers” design and published in February in the journal Infancy, showed that toddlers who witness an adult’s anger are more likely than other toddlers to hand over a prized toy. “Because they’ve learned that the adult is anger-prone, they try to appease her,” Dr. Repacholi said.

Well before they attribute thoughts and motivations to other people, young toddlers suss out any volatility in the adults around them, these studies show. But the findings also prompt some deeper questions. If brief shows of anger put toddlers on high alert, what might this mean for the inevitable conflicts that occur in family life?

As their studies involved babies observing interactions between two people they had never met before, Drs. Repacholi and Meltzoff explained, their findings don’t really reproduce family life, in which parents and siblings show all kinds of feelings in various situations. “They have a history of interacting with their babies that the strangers in our study did not have,” Dr. Meltzoff wrote in an email.

Getting angry occasionally is not going to override the positive expectations that babies have built up about you over months of loving encounters, he told me. Still, “we are catching a glimpse of how babies pigeonhole us, and how they would describe our personalities, if they could only talk.”

The Perilous Aftermath of a Simple Concussion

Susan Pinker on how a concussion was both a personal struggle for her and a catalyst to study a phenomenon still only partly understood


 

Eighteen months ago, a pickup truck hit me while I was walking on the sidewalk. The last thing I remember from that sunny Tuesday morning was reaching for my car keys. Then the lights went out.

I regained consciousness while being slid out of the scanner in the trauma unit. I had two fractures, two torn tendons, some wicked road rash and a concussion. The accident shook up my relationships, my memory and my personal drive. Still, everyone told me I was lucky to be alive, and I agree. Life has never seemed as tenuous—or as precious.

Not everyone with a mild brain injury is as lucky. A 20-year study published in February in the Canadian Medical Association Journal shows that people who have had a mild concussion are twice as likely to commit suicide as military personnel and more than three times as likely as members of the general population. Most of us agree with the bromide that time heals all wounds, but this study showed the opposite: After the concussion the risk of suicide rose steadily over time.

Led by Donald Redelmeier, a professor of medicine at the University of Toronto, the study included 235,110 healthy people who had seen a doctor after their accident. Most of the patients had no prior psychiatric diagnosis, hospital admission or suicide attempt.

“In the aftermath of a crash there is tremendous agony. But the broken ribs and leg will heal,” Dr. Redelmeier told me. “I’m not as sanguine about a concussion. Even when the CT scan doesn’t show major trauma, a minor injury can damage thousands and thousands of neurons. There are all sorts of problems that can last a long time, and we don’t know how to treat them.”

That clinical gap was clear. As I was leaving the emergency room, a staffer handed me a tip sheet written for teenage hockey players. (I live in Canada, after all.) There was no information for adults, nor anything on women and girls—who are known to be at greater risk of long-term problems after a concussion. When I asked the surgeon about cognitive symptoms during a follow-up visit, he exclaimed, “One concussion! The risk comes with more than one knock.” He added, “You’ll be fine.”

But I was far from fine. I spent mornings doing media interviews by phone about my new book, trying to sound upbeat. I spent afternoons sleeping, sobbing or staring at the clock, willing the time to pass so I could take another dose of oxycodone.

Meanwhile, well-meaning friends and colleagues were suggesting that the accident was some sort of warning. “This is God’s way of telling you to slow down,” said one. “Were you texting? Wearing headphones?” asked another. The refrain was that I should be thankful I’d dodged a karmic bullet and just get on with things.

But life was very different. A year after the accident I invited a friend to a concert—then blithely went on my own, forgetting all about her. I napped like a toddler and, because of fatigue and shoulder pain, couldn’t work a full day. I was never suicidal, thank goodness. But I was racked by an unanswered question: Why did this happen?

That lack of closure leads to one of the most astounding of Dr. Redelmeier’s findings: Concussions suffered on weekends carry a higher suicide risk than those suffered on weekdays. One explanation could be psychological, he said. “If you get hurt at work, you can blame the circumstances. But if you get hurt horseback riding, that might affect how much support and sympathy you get, whether there’s companionship in fighting the good fight, or whether people feel you’re the architect of your own misfortune.”

Understanding why something happened is as important to brain health as a cast is to bone strength, it seems. It turns out that I am lucky, after all. I may never know why the driver didn’t see me. But I do know one thing: I was working that day, and it was a Tuesday.