The Personality Trait That Holds Back Potential Leaders
(Bloomberg Opinion) -- So much news, so little time! Science sometimes makes headlines and even more often gives us a different way of thinking about the stories of the week. I’d like to delve into a few timely topics in a sort of lightning round:
Why Leaders Emerge, and What Happens If They Don’t
In an attempt to use science to understand leadership, a group of researchers has identified what it says is a key trait, which it calls responsibility aversion. In a study published this week in Science, the researchers show that this trait is measurable in the lab and inversely correlated with the amount of leadership people assume in real life.
The experiments used a game offering various risky choices – gambling 25 points for the possibility of winning 50, for example – played by small groups of subjects. The subjects had the option of deferring to a group vote or taking charge. (Points in these games are redeemable for money.)
Subjects were tested under two conditions – one in which their gambles affected only them, and another in which they affected the whole group. People tended to defer more when other people’s money was at stake, and responsibility aversion was defined as the amount their deferral increased under this condition.
The researchers found an inverse correlation between responsibility aversion and the amount of leadership subjects took on in work, sports, politics or the military. But responsibility aversion was not a sign of irresponsibility.
In the game, the groups benefited the most when people deferred more and took charge less. That is, some degree of responsibility aversion helped everyone prosper.
There’s a reason that Henry IV’s utterance, “Uneasy lies the head that wears a crown,” became one of Shakespeare’s immortal lines. The king is expressing responsibility aversion, as the researchers would call it, but it comes from a mature, responsible recognition of the life-or-death gravity of his power.
Contrast that to modern American caricatures of leaders. We have bumbling boss Michael Scott from "The Office," a walking example of the Dunning-Kruger effect, who lacks the self-awareness to show responsibility aversion, and we have the sociopathic politician Frank Underwood from "House of Cards," who isn’t capable of responsibility aversion because he lacks any trace of empathy. What does he care if other people lose?
Lead researcher Micah Edelson of the University of Zurich said that the team is exploring what makes some leaders more autocratic and some more democratic. The more autocratic are likely to be those who avoid deferring under any condition, even when other people have information they lack.
In a related commentary piece in Science, a pair of neuroscientists expressed the hope that future research will give us insights into what makes a good leader.
The Parasitology of Entrepreneurship
In a much weirder paper, another group of researchers found a connection between business leadership and infection with a common brain-attacking parasite. These are strange times, when it’s not unreasonable to consider such things.
The parasite is called Toxoplasma gondii, and it’s estimated to infect a big swath of the world’s population — about 2 billion of us. While most people don’t know they’re infected, scientists have found hints of various effects on behavior. A new study published in the Proceedings of the Royal Society B claims that the infected among us are more likely to major in business, take an interest in entrepreneurship and start their own companies.
This probably should fall into the category of results that are “interesting if confirmed.” For one thing, it’s a little too appealing as revenge of the nerds, complete with the chance to write headlines juxtaposing business majors with cat poop (a common vector for infection). Many people acquire the parasite from contaminated water or undercooked meat.
The cat poop route is interesting, though, because scientists have pretty good evidence that this parasite gets into the cats by first infecting the brains of rodents and making these prey animals less risk-averse. The reckless rodents lack appropriate fear around cats, so they are more likely to get eaten, allowing the parasite to complete its life cycle, going from the cat to cat poop to soil and back into rodents. There’s also evidence that infected male rodents become more sexually attractive to female rodents, who can acquire the infection from sex.
And so it’s not completely nuts to consider that once the parasite gets into human brains — and it does invade the brain for life — it could change our behavior in some way. A long story in the Atlantic outlined a raft of studies suggesting all kinds of links — between recklessness, outgoingness, and risks for mental illness, suicide and car accidents. Not surprisingly, nobody until now thought to look for a correlation between infection and majoring in business.
In this latest study, out of the University of Colorado, a collaboration of evolutionary biologists and business professors took a sample of more than a thousand students, 22 percent of whom tested positive for toxoplasmosis. The researchers found the infected students were 1.4 times as likely to major in business as their uninfected peers, and of business majors, the infected were 1.7 times as likely to specialize in areas of management and entrepreneurship. A smaller sample of adults hinted at a correlation between infection and starting businesses.
This might be wrong. Other studies have turned up no behavioral effects in humans. There’s also the possibility that more risk-taking individuals are more likely to be infected. Risk is not all about being a cat person. Americans are more at risk if they enjoy travel to countries where there might be contaminated water, or where local delicacies might include undercooked meat.
Part of what makes the result seem far-fetched is the notion that entrepreneurship could be a symptom of a disease. People don’t normally associate infection with anything beneficial, but it’s good to remember that what counts as beneficial in biology changes as environments change.
In some circumstances, more cautious individuals will survive and prosper, and in others, the more daring might have an advantage. However, it may be too soon, as some jokingly suggest, to add contaminated cat poop to the water supply in U.S. universities as a way of spurring innovation.
You Can Eat Butter and Still Believe in Science
For decades, nutrition science was hampered by the fact that many studies were based on what subjects said they ate, not based on what they actually ate. Now, however, scientists can use biomarkers to tell who is really eating butter, full-fat cheeses, rich Greek yogurts and the like, and who is being “good,” following the U.S. nutrition guidelines that tell us to stick with low-fat or fat-free dairy products.
The results are embarrassing for the nutrition industry and the U.S. government, as they bolster a growing body of evidence that our longstanding prohibitions against fats were all wrong. There was no French paradox. There was just an American goof-up.
A new study, published in the American Journal of Clinical Nutrition, used three biomarkers that indicated how much dairy fat people consumed. Researchers followed more than 3,000 people, ages 65 and up, for 13 years, and found that two of the biomarkers had no connection to death rates, while the third appeared to have a protective effect against death from stroke.
This followed an earlier study showing that full-fat dairy was associated with lower rates of diabetes. The press called these results “counterintuitive” and a “paradox,” but that’s not quite right. There’s nothing intuitive about the notion of fat being bad. With the exception of chemically altered trans fat, it’s been an essential part of human nutrition since the dawn of our species.
There is some indirect evidence of harm from animal fats. Some studies show they’re correlated with higher measures of LDL — or “bad” — cholesterol, which is in turn correlated with heart disease risk. But the newer studies look directly at connections between animal fat and death (what we really care about), and they show either no ill effect or a small protective one. That doesn’t mean some people with certain conditions won’t benefit from low-fat eating, but the evidence is piling up against recommending it for everyone.
This is no reason to distrust or dismiss science. It takes some fields time to get the hang of being scientific. As one of the authors of this new study pointed out in a recent historical review, nutrition is still a new science. People didn’t even know about vitamins until the 20th century. As the science developed, there were mistakes, and some of the worst perpetrators of those mistakes lacked responsibility aversion (see above) and became big leaders in the field.
Even the great Nobel-winning chemist Linus Pauling was wrong when he ventured into nutrition and advocated megadoses of certain vitamins. The behaviors of atoms and molecules and stars and planets are more predictable than human bodies or minds. (Who would have expected a connection between choice of a business major and infection with a brain parasite?) When studying humans, scientists find all sorts of strange things.
This column does not necessarily reflect the opinion of the editorial board or Bloomberg LP and its owners.
Faye Flam is a Bloomberg Opinion columnist. She has written for the Economist, the New York Times, the Washington Post, Psychology Today, Science and other publications. She has a degree in geophysics from the California Institute of Technology.
©2018 Bloomberg L.P.