Researchers To Journalists: Stop Blaming Mothers

Being pregnant these days can be more stressful than ever, with growing lists of dos and don’ts. Don’t eat sushi, don’t drink alcohol. Exercise, sure, but not too much. Then there’s our society-wide unease about the rise of developmental disorders.

If a pregnant woman were to search for stories about prenatal risk factors for autism, she would find no shortage of things she is supposed to avoid: the flu, antidepressants, sugar and even being too old. And if this torrent of risk factors fueled her anxiety, she’d have to worry about that, too.

There’s nothing wrong with studies investigating links between prenatal events and autism or other developmental disorders. A lot of autism research focuses on mothers, and with good reason, given that the disorder takes root early in development. But as Sarah Richardson and her colleagues argue in a new commentary in Nature, journalists often cover this research irresponsibly.

Read more at…, August 2014.

To Infinity…and Beyond!

I don’t remember exactly when I learned the concept of infinity, but at some point in childhood I was using it as a trump card in arguments with my sister. We’d volley along the lines of:

“I have seven beads!”
“Well, I have twelve beads!”
“I have infinity beads!”
(Drops mic.)

Infinity seems like an abstract mathematical concept, and I suppose it is. But one application of infinity is integral to many aspects of everyday human cognition, including language, music, and problem solving. It’s called recursion, and children begin to grasp it around age 9, according to a fascinating study published in the October issue of Cognition. What’s more, a child’s understanding of recursion in pictures is tightly linked to her understanding of grammar.
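Recursion is the idea of a structure that contains a smaller copy of itself. A generic sketch (my own illustration, not an example from the study) shows how a few lines of code can describe a picture of squares nested inside squares, ad infinitum:

```python
def nested_squares(depth):
    """Describe a picture of squares inside squares: each square
    contains a smaller copy of the same pattern -- recursion."""
    if depth == 0:
        return "square"  # base case: the innermost, smallest square
    # recursive case: a square containing the whole pattern again
    return f"square({nested_squares(depth - 1)})"

print(nested_squares(3))  # square(square(square(square)))
```

The same trick, a rule that refers to itself, is what lets grammar embed clauses inside clauses ("the cat the dog chased ran away") without any limit in principle.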

Read more at…

Only Human, August 2014.

Active or at Rest, Brain Conducts Similar Symphonies

Timing is everything, even when it comes to brain activity. Individual neurons fire all the time, but it’s the synchronous firing of many cells in many regions that seems to drive cognitive skills, such as language, decision-making and even consciousness.

This so-called ‘functional connectivity’ reveals which brain regions are most harmonious — ramping up or down at the same time — implying that they’re part of the same functional network. It’s a hot topic in autism research, with some studies suggesting that brain regions in people with the disorder are out of sync with each other.

Researchers generally measure functional connectivity by scanning volunteers’ brains either while they’re resting passively or while they’re engaged in a specific task. According to some studies, these two approaches activate distinct networks, because the brain uses one mode of connections for active tasks and switches into another mode for quiet reflection.
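In practice, functional connectivity usually boils down to correlation: two regions whose activity time courses rise and fall together get a high connectivity score. A minimal sketch with simulated data (made-up numbers, not the study's):

```python
import numpy as np

rng = np.random.default_rng(0)
n_timepoints, n_regions = 200, 4

# Simulated activity: regions 0 and 1 share a common driving signal,
# so their time courses ramp up and down together; regions 2 and 3
# fluctuate independently.
shared = rng.standard_normal(n_timepoints)
activity = rng.standard_normal((n_timepoints, n_regions))
activity[:, 0] += 2 * shared
activity[:, 1] += 2 * shared

# Functional connectivity matrix: pairwise correlation of time courses.
connectivity = np.corrcoef(activity.T)
print(f"coupled pair:     {connectivity[0, 1]:.2f}")  # close to 1
print(f"independent pair: {connectivity[2, 3]:.2f}")  # close to 0
```

Regions with high pairwise correlation are grouped into the same functional network; the debate in the paper is whether those groupings really differ between rest and task.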

A study published 2 July in Neuron questions this premise. It argues that these two networks are more similar than previously thought.

Read more at…, August 2014.

From Ancient Genomes to Ancient Epigenomes

Late last year, scientists unveiled the complete genome of a female Neanderthal whose 130,000-year-old toe bone had been found in a cave in Siberia. As it turned out, her sequence of some 3 billion DNA letters was not all that much different from mine or yours. The researchers identified only about 35,000 places in the genome where all modern humans differ from our ancient hominid cousins. And only 3,000 of those were changes that could impact how genes are turned on and off.

But if our DNA is so similar to Neanderthals’, why were they so…different? They were brawnier than our ancestors, with short but muscular limbs, big noses and heavy brows. They didn’t carry certain genetic variants that put modern humans at risk of autoimmune disease and celiac disease. And although they lived alongside our ancestors as the latter migrated into Europe, for some reason the Neanderthals didn’t survive.

Part of the answer undoubtedly lies in the way the Neanderthal genome actually worked — a complex process that depends not only on the underlying DNA code, but on the way genes get turned on and off.

Read more at…

Only Human, August 2014.

The Chatty Hippocampus

The hippocampus, a skinny log of brain tissue tucked in deep above your ear, is the star of memory research. People with damaged hippocampi — such as the famous patient Henry Molaison, known as H.M. — can’t make new memories. And studies in rodents have shown that creating new memories drives robust connections between neurons in the hippocampus.

Despite getting all the attention, the hippocampus doesn’t act alone. Our memories are as good as they are only because of the way the hippocampus talks to the rest of the brain. A study published this week in the Proceedings of the National Academy of Sciences underscores the importance of hippocampal interactions with the prefrontal cortex, the outer layers of brain located behind your forehead.

“As a field we focus so much interest on the role of the hippocampus, and rightfully so. But it means we haven’t investigated thoroughly the role of the prefrontal cortex and other areas,” says Adam Bero, a postdoctoral fellow in Li-Huei Tsai’s lab at the Massachusetts Institute of Technology, who led the study.

Read more at…

Only Human, July 2014.

Expanding Guts in Pythons and People

Regular readers of this blog might remember a post I wrote a few months ago about weight-loss surgery. A mouse study suggested that surgery works — triggering weight loss and, often, diabetes remission — not because it makes the stomach smaller, but because it drastically changes the gut’s biochemistry.

I took a close look at that study and a slew of others in a feature published in today’s issue of Nature. Rodent models have shown that after surgery, the gut goes through many dramatic changes. Bacterial compositions shift, for example, bile acids flow more freely, and the intestines swell.

That last bit maybe isn’t so surprising — after all, once the stomach shrinks to the size of an egg, suddenly a whole lot more undigested food is going to hit the intestines than before. But what is surprising, as Nicholas Stylopoulos’s group published last year in Science, is that this abrupt growth seems to trigger a host of permanent metabolic changes in the gut.

As I explain in the feature:

“The rapid growth requires a lot of energy, which comes from glucose. Glucose uptake by the changing organ increases, and the change is maintained over time, Stylopoulos says. ‘Essentially, the intestine becomes a bigger and a more hungry organ that needs more glucose than before.’

Stylopoulos believes that this tissue growth in the gut is the main driver of the surgery’s remarkable metabolic benefits — not a reduction in calorie intake.”

A couple of weeks ago, while I was doing the final fact-checks for the feature, Stylopoulos told me a fun tidbit about how the same sort of change has been reported in…wait for it…Burmese pythons. “They have some amazing similarities,” he said.

Read more at…

Only Human, July 2014.

The Sexual Politics of Autism

Imagine you walked down the street and asked random people what autism is. What would they say? My guess: They’d talk about social skills, and the rising prevalence, and probably the vaccine nonsense. And they’d almost certainly mention that it happens to boys.

The idea that autism is a mostly male disorder is pervasive in the news, pop culture, and scientific circles. And it’s not just an academic curiosity. Last year a popular fertility clinic in Sydney, Australia, reported that about five percent of couples went through in vitro fertilization just so they could select a female embryo and thus lower the kid’s risk of developing autism.

The sex skew in autism is real: A diagnosis of autism is almost five times more common in 8-year-old boys than in 8-year-old girls, according to the latest statistics from the CDC.

But it’s not that simple. Most people don’t realize, for example, that autism’s sex bias changes dramatically depending on the severity of the disorder, with so-called high-functioning autism (a problematic term that usually means having an IQ above 70 or 80) showing a ratio more skewed towards boys. The ratio also varies wildly depending on who’s calculating it.

Read more at…

Only Human, July 2014.

Why Do Some Teens Become Binge Drinkers? Algorithms Answer.

The first time I got drunk I was 15. It was in a hotel room in Paris, on a trip with my high school French Club, drinking vodka and Orangina from a plastic bottle. I remember looking at my blurry reflection in the bathroom mirror and thinking, So this is what being drunk is. I didn’t hate it. I drank a few more times that year, and then pretty steadily for the next two. I had one blackout night in a friend’s basement. Then came college, where everything escalated. It honestly makes me queasy right now to think about what I put my body through.

But it was fun. And it didn’t lead to anything horrible. I did well academically, went to grad school, found (mostly) gainful employment. I’m 30 now and, knock on wood, don’t have any health problems.

My story is typical. “We tend not to want to say this out loud to teenagers, but most people who tried drugs don’t get addicted,” says Hugh Garavan, a cognitive neuroscientist at the University of Vermont. “Most kids have tried alcohol by age 14, and most kids don’t develop a problem. Same with cigarettes and same with cocaine. But there’s a certain subset who do, and we don’t have a clue what it is about them.”

Scientists have pinpointed lots of factors that increase the risk of alcohol misuse — a bit. Adolescents who are anxious or impulsive, for example, tend to be at higher risk. Same for those who carry certain genetic variants (dubbed ‘SNPs’) in their genome, and for kids who are abused or neglected. But most studies haven’t looked at enough factors, or at enough kids, to make predictions with much oomph. “It’s hard to look at all of it, but we have this luxury,” Garavan says.

In today’s issue of Nature, Garavan and his colleagues present a new predictive model based on an enormous amount of data—brain scans, genetic screens, personality trait tests, and family and medical histories—from 2,400 teenagers in Europe. The model isn’t by any means a crystal ball, but it can guess which 14-year-olds will become binge drinkers by age 16 with odds far better than chance.
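The general idea behind such a model can be sketched in a few lines. This is a toy illustration of the approach (synthetic random data and a simple nearest-centroid classifier, nothing like the study's actual features or algorithm): combine many measurements per teenager, learn from one group, then test predictions on teens the model has never seen.

```python
import numpy as np

rng = np.random.default_rng(42)
n_teens = 400

# Toy stand-ins for the study's feature types (brain scans, genetics,
# personality, history) -- random numbers here, not real data.
features = rng.standard_normal((n_teens, 10))

# Make the outcome depend weakly on two features plus noise, so the
# model can beat chance but is far from a crystal ball.
risk = features[:, 0] + 0.5 * features[:, 1] + rng.standard_normal(n_teens)
binge_by_16 = risk > np.median(risk)

# Train on the first 300 teens, hold out the last 100 for testing.
train, test = slice(0, 300), slice(300, None)
centroid_yes = features[train][binge_by_16[train]].mean(axis=0)
centroid_no = features[train][~binge_by_16[train]].mean(axis=0)

# Classify each held-out teen by which group's average profile is closer.
dist_yes = np.linalg.norm(features[test] - centroid_yes, axis=1)
dist_no = np.linalg.norm(features[test] - centroid_no, axis=1)
predictions = dist_yes < dist_no

accuracy = (predictions == binge_by_16[test]).mean()
print(f"accuracy on held-out teens: {accuracy:.2f}")  # above the 0.50 chance level
```

The point of the held-out test set is the same as in the study: a model only counts as predictive if it works on kids it wasn't trained on.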

Read more at…

Only Human, July 2014.

My Walk in the Woods

Last weekend I went for a walk in the woods. It’s not something I do often, but I was with friends who wanted to do it, so alright. We hiked for about four miles. It was not strenuous. It was warm, but not too warm, with a light breeze. The trees protected us from the sun. The canopy was intensely green, and the sky, peeking through, was cornflower blue.

While my friends walked along merrily, ostensibly enjoying their surroundings, I spent much of the hike wondering why anybody would bother with this kind of activity. I find hikes repetitious and boring. My mind goes idle (other than its constant scanning for beetles, ticks, rocks, and face-slapping branches). I look at my feet more than the trees, and when I do look up, everything looks the same, a wash of brown and green. I try to think of it as good exercise, but the inefficiency! The same 90 minutes at the gym would be far more helpful.

It’s odd, when you think about it: My friends and I were all absorbing the same sights, sounds, smells and touches. We’re all about the same age and live in big cities. But they seemed to like it very much, and I didn’t like it much at all. Why?

Wilhelm Wundt, known by some as the father of psychology, pondered this very conundrum more than a century ago. In his 1896 book, Outlines of Psychology, Wundt wrote that our experiences can be broken into two elements: objective sensations and subjective affect. For instance, if you put your hand under hot water, you’ll feel a sensation of the heat. That’s the objective part. It’s hot, not cold. But then there’s the affect, the way the sensation affects you. For some people, the hot water will be excruciating, for others it will be rapturous. The sensation and the affect are inseparable, and both are crucial to perception, Wundt wrote: “The actual contents of psychical experience always consist of various combinations of sensational and affective elements.” Our experience, he added, “depends for the most part not on the nature of these elements so much as on their union.”

Wundt’s ideas have been difficult to prove empirically, but new evidence comes from a paper published Sunday in Nature Neuroscience. Adam Anderson, a cognitive psychologist at Cornell University, and his colleagues used brain scanners to show that our brains use different codes to represent these objective and subjective aspects of an experience.

Read more at…

Only Human, June 2014.

Where Do New Ideas Come From?

In 1887, after achieving great fame and fortune with his inventions (the phonograph, which recorded sound; the acoustic telegraph, which transmitted more than one message at a time; and commercially viable light bulbs), Thomas Edison built a laboratory in West Orange, New Jersey. He recruited a team of talented scientists and engineers to help him further develop his famous inventions and, of course, to come up with new ones.

Edison’s team there invented a cotton picker, a snow compactor, and a way of using magnetized iron to generate electricity. But probably the most famous device to emerge from that lab was the kinetoscope, a machine for viewing motion pictures. How, exactly, the engineers designed this machine, and how much of the credit should be given to one of Edison’s assistants, William Kennedy Laurie Dickson, are fascinating stories. But as a writer who constantly struggles to come up with something new, I’m more interested in the machine’s origin story. How did Edison generate the idea in the first place?

We tend to think of inventors as another species—geniuses—who have sudden flashes of insight. I can’t think of a single instance when a light bulb went off in my head, leading to some killer new idea. Is that because I’m an uncreative dud? Perhaps. Alternatively, it might be because Eureka moments are the stuff of legend. According to historians who specialize in the development of inventions and the thought processes of inventors, innovation is often a slow and iterative process.

Read more at…

Only Human, June 2014.