Personhood Week: Do Kids Count?

In 1891, the U.S. Supreme Court heard a case about negligence that was really about personhood.

The Union Pacific Railway Company was asking the court to force a woman, Clara Botsford, to submit to a surgical examination. Why? Botsford was suing the company for negligence related to a top bunk in one of its trains’ sleeping cars. The bunk fell while she was under it, “rupturing the membranes of the brain and spinal cord” and causing “permanent and increasing injuries.” The company wanted its own doctors to examine Botsford and confirm the diagnoses, but she did not consent to the examination.

The Supreme Court ruled against the company. As Justice Horace Gray wrote in his opinion (emphasis mine):

“No right is held more sacred or is more carefully guarded by the common law than the right of every individual to the possession and control of his own person, free from all restraint or interference of others unless by clear and unquestionable authority of law.”

This is just one of many examples from U.S. case law illustrating that a big part of personhood is autonomy. In our society, people are supposed to have control over their own bodies and make independent decisions about their lives. This idea drives the modern medical concept of informed consent, in which an individual is supposed to give permission before receiving medical therapies or participating in a research study.

That autonomy principle, though, gets sticky when applied to a subset of humans that most of us would surely call persons: minors.

Read more at…

Only Human, November 2014.

When Grief Is Traumatic

As Vicki looked at her son in his hospital bed, she didn’t believe he was close to death. He was still young, at 33. It had been a bad motorcycle accident, yes, but he was still strong. To an outsider, the patient must have looked tragic — unconscious and breathing through a ventilator. But to Vicki, he was only sleeping. She was certain, in fact, that he had squeezed her hand.

Later that day, doctors pronounced Vicki’s son brain-dead. And for the next two years, she couldn’t stop thinking about him. She felt terribly guilty about the circumstances of his death: He and a friend had been drinking before the crash. She knew he was a recovering alcoholic, and that he had recently relapsed. She couldn’t shake the thought that she should have pushed him harder to go back to rehab. Every day Vicki flipped through a scrapbook of his photos and articles about his death. She turned his motorcycle helmet into a flowerpot. She let housework pile up and stopped seeing her friends. “She seemed to be intent on holding onto him,” one of her therapists wrote about her case, “at the cost of reconnecting with her own life.”

Vicki is part of the 10 percent of grievers who have prolonged grief, also known as complicated grief or traumatic grief. Grieving is an intense, painful, and yet altogether healthy experience. What’s unhealthy is when the symptoms of grief — such as yearning for the dead, feeling anger about the loss, or a sense of being stuck — last for six months or more.

Read more at…

Only Human, November 2014.

Chantix, Suicide, and the Point of Prescription Drug Warnings

Quick poll: Think back to the last time you bought a prescription medication. Did you read any of the information about the drug printed on the papers inside the box? And if you did read it, did that stop you from taking the drug?

I can’t recall a time when I read any of that fine print, despite the fact that I’m fascinated by medicine and often write about it. I got thinking about the potency (or impotence) of these warnings this week while reading about a controversy surrounding Chantix, a drug that helps people quit smoking.

Chantix (Pfizer’s branded name for varenicline) works by stimulating nicotine receptors in the brain, thus curbing cravings for cigs. The Food and Drug Administration (FDA) approved the drug in 2006. Since then, a small percentage of people who take Chantix have reported neurological side effects, and serious ones: depression, psychosis, erratic behavior, even “feeling like a zombie.” The drug has been linked to more than 500 suicides, 1,800 attempted suicides, and the bizarre death of an American musician. Here are a few anecdotal reports about the drug from a Reddit thread:

  • Chantix was the most miserable drug I have ever taken…severe gi distress, depression, paranoia, crazy and vivid dreams, etc. BUT, it got me off cigarettes after everything else I tried had failed…As I knew that it really fucked with you I prepped by temporarily getting rid of the guns and having my brother check up on me daily…What keeps me from going back to smoking is knowing that one day I’ll want to quit again, and I NEVER want to experience Chantix again!!!
  • I’m convinced Chantix played a part in my divorce. My ex gave up smoking, her Pepsi habit, as well as marriage.
  • My mother was on it (and successfully quit smoking using it) and she had some outrageous paranoia. She would accuse us of conspiring against her, making her sick, not loving her, lying to her, stealing things (that she misplaced), turning the dog against her (da fuq??), trying to poison her and sabotaging her car…she smoked for 40 years and failed at quitting hundreds of times. Chantix did the trick somehow but made her nuts.

Yikes! Reading stories like that might scare me enough to think twice about the drug. But would the information in the package insert?

Read more at…

Only Human, October 2014.

Are Video Games the Future of Brain Medicine?

On October 6th, I find myself in the gleaming office of a Boston biotech. I’ve been seated in a clear plastic chair, where I am about to try an experimental medicine for a brain disorder I don’t have.

The space is home to PureTech Ventures, the parent company of Akili Interactive Labs, which makes the new medicine. Since December, children in Florida and North Carolina have also tried the treatment as part of a formal clinical trial for attention-deficit hyperactivity disorder (ADHD). The medicine is unusual because of its delivery system: an iPad or iPhone. That’s because the medication is a video game called Project: EVO.

Until now, I haven’t touched a video game since about 1991. What if I fumble the mechanics, or worse — what if the game deems me cognitively deficient? One of Akili’s founders, 32-year-old Eddie Martucci, hands me an iPad and I see my avatar: a yellow humanoid, floating on a jet-fueled raft down a crooked, icy river. My task seems simple: I tap on blue fish that zoom overhead, but avoid the red and green fish, as well as blue birds. Of course, I’m also steering the raft to avoid frozen spikes along the riverbank.

It’s hard. It feels like I’m constantly missing my targets and smashing into the sides. Most frustrating — and addictive — of all: as I get better, the game instantly gets harder.

If Akili’s clinical studies are successful, doctors will one day prescribe EVO for ADHD as well as a variety of other disorders affecting so-called executive function — the ability to plan, inhibit actions, and quickly switch between tasks. The game has been part of a dozen clinical trials to date, involving people with ADHD, Alzheimer’s, autism, and depression.

The plan is realistic enough that Big Pharma wants in: Akili has already struck deals with two traditional drug companies, Pfizer and Shire. Within the industry, Martucci says, “there’s definitely a growing receptivity to digital technology.”

Read more at…

The Verge, October 2014.

Mental Leaps Cued by Memory’s Ripples

Over the past few decades, researchers have worked to uncover the details of how the brain organizes memories. Much remains a mystery, but scientists have identified a key event: the formation of an intense brain wave called a “sharp-wave ripple” (SWR). An SWR is the brain’s version of an instant replay — a sped-up rerun of the neural activity that occurred during a recent experience. These ripples are a strikingly synchronous neural symphony, the product of tens of thousands of cells firing over just 100 milliseconds. Any more activity than that could trigger a seizure.

Now researchers have begun to realize that SWRs may be involved in much more than memory formation. A slew of recent high-profile rodent studies suggest that the brain uses SWRs to anticipate future events. One experiment, for example, found that SWRs are linked to activity in the prefrontal cortex, a region at the front of the brain that is involved in planning for the future.

Studies such as this one have begun to illuminate the complex relationship between memory and the decision-making process. Until a few years ago, most studies on SWRs focused only on their role in creating and consolidating memories, said Loren Frank, a neuroscientist at the University of California, San Francisco. “None of them really dealt with this issue of: How does the animal actually pull [the memory] back up again? How does it actually use this to figure out what to do?”

The new results are also prompting a broad shift in our understanding of the hippocampus, a C-shaped nub of brain tissue behind each ear. Since the late 1950s, when the region was famously tied to memory loss in the patient known as H.M., researchers have focused on its role in creating and storing memories. But newer studies have shown that the hippocampus is active when people imagine performing a task in the future. Similarly, people who have damaged hippocampi cannot imagine new experiences. The hippocampus doesn’t just allow instant replays — a kind of mental time travel into the past — it also helps us mentally leap forward.

In fact, complex planning may be the true benefit of the hippocampus. “That’s the point of having a memory, right?” Frank said. “To go back to the experiences you’ve had, extract general principles from them, and then use those principles to figure out what to do next.”

Read more at…

Quanta, October 2014 (and syndicated by Scientific American).

The Dog Mom’s Brain

When people ask me if I have kids, my standard answer is, “I have a dog.” My husband and I are the first to admit that we tend to treat our pup like a “real” child. He eats organic food. Our apartment is littered with ripped plush toys. We talk to him in stupid high-pitched voices. He spends almost all of his time with us, including sleeping and vacations. When he’s not with us he’s at a daycare center down the street — and I spend much of that time worrying about whether he’s OK. It’s probably not a full-blown separation anxiety disorder, but when we’re separated, I’m anxious.

On an intellectual level I understand that having a dog is not the same as having a human child. Still, what I feel for him has got to be something like maternal attachment. And a new brain-imaging study backs me up on this.

Researchers from the Massachusetts General Hospital scanned the brains of 14 women while they looked passively at photos of their young children, photos of their dogs, and photos of unfamiliar children and dogs.

Read more at…

Only Human, October 2014.

Why Police Lineups Will Never Be Perfect

For three decades psychology researchers have been searching for ways to make eyewitness identifications more reliable. Many studies have shown, for example, the value of “double-blind” lineups, meaning that neither the cop administering the lineup nor the witness knows which of the photos, if any, is the suspect.
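To make “double-blind” concrete, here is a minimal sketch of how a lineup could be assembled so that the administrator never learns which photo shows the suspect. It’s written in Python purely for illustration; the file names, lineup size, and helper function are my own invention, not anything prescribed by the research or the report discussed below.

```python
import random

def build_blind_lineup(suspect_photo, filler_photos, size=6):
    """Mix the suspect's photo with randomly chosen fillers and shuffle,
    so whoever shows the array can't tell which position holds the suspect."""
    photos = [suspect_photo] + random.sample(filler_photos, size - 1)
    random.shuffle(photos)
    return photos

# Hypothetical example: the detective who built the case assembles the array,
# but a colleague who never sees these inputs actually administers it.
lineup = build_blind_lineup("suspect.jpg", [f"filler_{i}.jpg" for i in range(1, 11)])
print(lineup)
```

The design point is that the shuffled array is all the administrator ever receives; the record of which position holds the suspect stays with someone who never interacts with the witness.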

But injecting science into the justice system is tricky. For one thing, most criminal investigations happen at a local level. The U.S. has roughly 16,000 law enforcement agencies and few nationally mandated standards. The other big problem is the nature of science itself: Evidence for a given idea builds gradually, as scientists try to replicate others’ work. It can take years or even decades for a clear picture to emerge, and in the meantime scientists may vigorously disagree. While they argue, cases are opened and closed, and people, sometimes the wrong people, go to prison.

Some helpful guidance came today from the National Academy of Sciences. Last year the Academy asked a panel of top scientists to review technical reports and expert testimony about eyewitness identifications and make some solid recommendations. The resulting 160-page report offers many concrete suggestions for carrying out eyewitness identifications. For example, the Academy recommends using double-blind lineups and standardized witness instructions, and training law enforcement officials on the fallibility of eyewitness memory.

On one question, though, the Academy offers no clear answer: What’s the best way to present a photo lineup to a witness? This fuzziness reflects a hot debate bubbling in the scientific literature.

Read more at…

The Atlantic, October 2014.

The Other Polygraph

You’ve no doubt heard about the polygraph and its use as a lie detector. The homely box records physiological changes — such as heart rate and electrical skin conductance — that are indirect signatures of emotion. Because these biomarkers tend to change when people tell lies, criminal investigators have long used the polygraph as a crude tool for detecting deception.

But there’s a huge problem with the polygraph: it’s all too frequently wrong. Truth-tellers may show a strong physiological response to being questioned if they’re nervous or fearful, which they often are — particularly if they’re the target of a hostile interrogation.

“You end up with a lot of false positives,” says John Meixner, a law clerk to Chief Judge Gerald Rosen of the United States District Court for the Eastern District of Michigan. The traditional polygraph suffers not only from false positives but also from false negatives: With a bit of training, liars can pass the test by intentionally turning down their emotions.

Because of these considerable flaws, polygraph evidence is almost never allowed in court. But it’s still used routinely by federal law enforcement agencies, not only for questioning criminal suspects but also for screening potential new employees.

It turns out there’s a much more accurate way to root out deception: a 55-year-old method called the ‘concealed information test’. The CIT doesn’t try to compare biological responses to truth versus lies. Instead, it shows whether a person simply recognizes information that only the culprit (or the police) could know.
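For a sense of the logic, here is a minimal sketch of how a CIT-style comparison might be scored. It’s in Python purely for illustration: the item names, response values, and cutoff are hypothetical, not drawn from the article or any validated protocol. The question it asks is whether the response to the crime-relevant “probe” item stands out from responses to equally plausible irrelevant items.

```python
from statistics import mean, stdev

def probe_recognition_score(responses, probe):
    """Express the response to the crime-relevant probe item as a z-score
    relative to the responses to the irrelevant items."""
    irrelevant = [v for item, v in responses.items() if item != probe]
    return (responses[probe] - mean(irrelevant)) / stdev(irrelevant)

# Hypothetical skin-conductance responses (arbitrary units) to five possible
# murder weapons; only someone who knows the crime should react to the real one.
trial = {"knife": 0.21, "rope": 0.19, "pistol": 0.64, "hammer": 0.25, "bat": 0.18}

if probe_recognition_score(trial, probe="pistol") > 1.5:  # illustrative cutoff
    print("Response pattern suggests recognition of the concealed detail.")
else:
    print("No sign of recognition on this trial.")
```

Real protocols repeat this comparison across many question sets and use validated statistics, but the underlying contrast is the same: what gets measured is recognition, not lying.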

Read more at…

Only Human, September 2014.

Emotion Is Not the Enemy of Reason

This is a post about emotion, so — fair warning — I’m going to begin with an emotional story.

On April 9, 1994, in the middle of the night, 19-year-old Jennifer Collins went into labor. She was in her bedroom in an apartment shared with several roommates. She moved into her bathroom and stayed there until morning. At some point she sat down on the toilet, and at some point, she delivered. Around 9 a.m. she started screaming in pain, waking up her roommates. She asked them for a pair of scissors, which they passed her through a crack in the door. Some minutes later, Collins opened the door and collapsed. The roommates—who had no idea Collins had been pregnant, let alone what happened in that bloody bathroom—called 911. Paramedics came, and after some questioning, Collins told them about the pregnancy. They lifted the toilet lid, expecting to see the tiny remains of a miscarried fetus. Instead they saw a 7-pound baby girl, floating face down.

The State of Tennessee charged Collins with second-degree murder (which means the killing was intentional but not premeditated). At trial, the defense claimed that Collins had passed out on the toilet during labor and had not realized that the baby had drowned.

The prosecutors wanted to show the jury photos of the victim — bruised and bloody, with part of her umbilical cord still attached — that had been taken at the morgue. With the jury out of the courtroom, the judge heard arguments from both sides about the admissibility of the photos. At issue was Rule 403 of the Federal Rules of Evidence, which says that evidence may be excluded if it is unfairly prejudicial. Unfair prejudice, the rule states, means “an undue tendency to suggest decision on an improper basis, commonly, though not necessarily, an emotional one.” In other words, evidence is not supposed to turn up the jury’s emotional thermostat. The rule takes as a given that emotions interfere with rational decision-making.

This neat-and-tidy distinction between reason and emotion comes up all the time. (I even used it on this blog last week, in my post about juries and stress.) But it’s a false dichotomy. A large body of research in neuroscience and psychology has shown that emotions are not the enemy of reason, but rather are a crucial part of it. This more nuanced understanding of reason and emotion is underscored in a riveting (no, really) legal study that was published earlier this year in the Arizona State Law Journal.

Read more at…

Only Human, September 2014.

Why Jurors and Policemen Need Stress Relief

I’ll be sitting on a jury tomorrow for the first time. The logistics are annoying. I have to take an indefinite amount of time off work, wait in long security lines at the courthouse, and deal with a constant stream of bureaucratic nonsense. But all that is dwarfed by excitement. And, OK, yes, some pride. My judgments will affect several lives in an immediate and concrete way. There’s a heaviness to that, a responsibility, that can’t be brushed aside.

My focus on jury duty may be why a new study on social judgments caught my eye. Whether part of a jury or not, we judge other people’s behaviors every day. If you’re walking down a city sidewalk and someone slams into you, you’re probably going to make a judgment about that behavior. If you’re driving down the highway and get stuck behind a slow car, you’re probably going to make a judgment about that driver’s behavior. If somebody leaves a meandering and inappropriate comment on your blog…

Since the 1960s psychology researchers have known that people tend to make social judgments with a consistent bias: We’re more likely to attribute someone’s behavior to inherent personality traits than to the particulars of the situation. The guy who bumps into me on the sidewalk did so because he’s a dumb jerk, not because he’s rushing to the hospital to see his sick child. The driver is slow because she’s a feeble old lady, not because her engine is stalling.

Read more at…

Only Human, September 2014.