The Virtue of Uncertainty ― Reading #4


 

You’re Not Going to Change Your Mind

Ben Tappin, Leslie van der Leer, and Ryan McKay

New York Times | May 27, 2017

 

[1] A troubling feature of political disagreement in the United States today is that many issues on which liberals and conservatives hold divergent views are questions not of value but of fact. Is human activity responsible for global warming? Do guns make society safer? Is immigration harmful to the economy? Though undoubtedly complicated, these questions turn on empirical evidence. As new information emerges, we ought to move, however fitfully, toward consensus.

 

[2] But we don’t. Unfortunately, people do not always revise their beliefs in light of new information. On the contrary, they often stubbornly maintain their views. Certain disagreements stay entrenched and polarized. Why? A common explanation is confirmation bias. This is the psychological tendency to favor information that confirms our beliefs and to disfavor information that counters them — a tendency manifested in the echo chambers and “filter bubbles” of the online world.

 

[3] If this explanation is right, then there is a relatively straightforward solution to political polarization: We need to consciously expose ourselves to evidence that challenges our beliefs to compensate for our inclination to discount it. But what if confirmation bias isn’t the only culprit? It recently struck us that confirmation bias is often conflated with “telling people what they want to hear,” which is actually a distinct phenomenon known as desirability bias, or the tendency to credit information you want to believe. Though there is a clear difference between what you believe and what you want to believe — a pessimist may expect the worst but hope for the best — when it comes to political beliefs, they are frequently aligned.

 

[4] For example, gun-control advocates who believe stricter firearms laws will reduce gun-related homicides usually also want to believe that such laws will reduce gun-related homicides. If those advocates decline to revise their beliefs in the face of evidence to the contrary, it can be hard to tell which bias is at work. So we decided to conduct an experiment that would isolate these biases. This way, we could see whether a reluctance to revise political beliefs was a result of confirmation bias or desirability bias (or both). Our experiment capitalized on the fact that one month before the 2016 presidential election there was a profusion of close polling results concerning Donald Trump and Hillary Clinton.

 

[5] We asked 900 United States residents which candidate they wanted to win the election, and which candidate they believed was most likely to win. Respondents fell into two groups. In one group were those who believed the candidate they wanted to win was also most likely to win (for example, the Clinton supporter who believed Mrs. Clinton would win). In the other group were those who believed the candidate they wanted to win was not the candidate most likely to win (for example, the Trump supporter who believed Mrs. Clinton would win). Each person in the study then read about recent polling results emphasizing that either Mrs. Clinton or Mr. Trump was more likely to win.

 

[6] Roughly half of our participants believed their preferred candidate was the one less likely to win the election. For those people, the desirability of the polling evidence was decoupled from its value in confirming their beliefs. After reading about the recent polling numbers, all the participants once again indicated which candidate they believed was most likely to win. The results, which we report in a forthcoming paper in the Journal of Experimental Psychology: General, were clear and robust. Those people who received desirable evidence — polls suggesting that their preferred candidate was going to win — took note and incorporated the information into their subsequent belief about which candidate was most likely to win the election. In contrast, those people who received undesirable evidence barely changed their belief about which candidate was most likely to win. Importantly, this bias in favor of the desirable evidence emerged irrespective of whether the polls confirmed or disconfirmed people’s prior belief about which candidate would win. In other words, we observed a general bias toward the desirable evidence.

 

[7] What about confirmation bias? To our surprise, those people who received confirming evidence — polls supporting their prior belief about which candidate was most likely to win — showed no bias in favor of this information. They tended to incorporate this evidence into their subsequent belief to the same extent as those people who had their prior belief disconfirmed. In other words, we observed little to no bias toward the confirming evidence.

 

[8] We also explored which supporters showed the greatest bias in favor of the desirable evidence. The results were bipartisan: Supporters of Mr. Trump and supporters of Mrs. Clinton showed a similar-size bias in favor of the desirable evidence. Our study suggests that political belief polarization may emerge because of people’s conflicting desires, not their conflicting beliefs per se. This is rather troubling, as it implies that even if we were to escape from our political echo chambers, it wouldn’t help much. Short of changing what people want to believe, we must find other ways to unify our perceptions of reality.


 

Power Causes Brain Damage

Jerry Useem

The Atlantic | July/August 2017

 

If power were a prescription drug, it would come with a long list of known side effects. It can intoxicate. It can corrupt. It can even make Henry Kissinger believe that he’s sexually magnetic. But can it cause brain damage?

 

[1] When various lawmakers lit into John Stumpf at a congressional hearing last fall, each seemed to find a fresh way to flay the now-former CEO of Wells Fargo for failing to stop some 5,000 employees from setting up phony accounts for customers. But it was Stumpf’s performance that stood out. Here was a man who had risen to the top of the world’s most valuable bank, yet he seemed utterly unable to read a room. Although he apologized, he didn’t appear chastened or remorseful. Nor did he seem defiant or smug or even insincere. He looked disoriented, like a jet-lagged space traveler just arrived from Planet Stumpf, where deference to him is a natural law and 5,000 a commendably small number. Even the most direct barbs— “You have got to be kidding me” (Sean Duffy of Wisconsin); “I can’t believe some of what I’m hearing here” (Gregory Meeks of New York)— failed to shake him awake. What was going through Stumpf’s head? New research suggests that the better question may be: What wasn’t going through it?

 

[2] The historian Henry Adams was being metaphorical, not medical, when he described power as “a sort of tumor that ends by killing the victim’s sympathies.” But that’s not far from where Dacher Keltner, a psychology professor at UC Berkeley, ended up after years of lab and field experiments. Subjects under the influence of power, he found in studies spanning two decades, acted as if they had suffered a traumatic brain injury— becoming more impulsive, less risk-aware, and, crucially, less adept at seeing things from other people’s point of view.

 

[3] Sukhvinder Obhi, a neuroscientist at McMaster University, in Ontario, recently described something similar. Unlike Keltner, who studies behaviors, Obhi studies brains. And when he put the heads of the powerful and the not-so-powerful under a transcranial-magnetic-stimulation machine, he found that power, in fact, impairs a specific neural process, “mirroring,” that may be a cornerstone of empathy. Which gives a neurological basis to what Keltner has termed the “power paradox”: Once we have power, we lose some of the capacities we needed to gain it in the first place. That loss in capacity has been demonstrated in various creative ways. A 2006 study asked participants to draw the letter E on their forehead for others to view— a task that requires seeing yourself from an observer’s vantage point. Those feeling powerful were three times more likely to draw the E the right way to themselves— and backwards to everyone else (which calls to mind George W. Bush, who held up the American flag backwards at the 2008 Olympics). Other experiments have shown that powerful people do worse at identifying what someone in a picture is feeling, or guessing how a colleague might interpret a remark.

 

[4] The fact that people tend to mimic the expressions and body language of their superiors can aggravate this problem: Subordinates provide few reliable cues to the powerful. But more important, Keltner says, is the fact that the powerful stop mimicking others. Laughing when others laugh or tensing when others tense does more than ingratiate. It helps trigger the same feelings those others are experiencing and provides a window into where they are coming from. Powerful people “stop simulating the experience of others,” Keltner says, which leads to what he calls an “empathy deficit.”

 

[5] Mirroring is a subtler kind of mimicry that goes on entirely within our heads, and without our awareness. When we watch someone perform an action, the part of the brain we would use to do that same thing lights up in sympathetic response. It might be best understood as vicarious experience. It’s what Obhi and his team were trying to activate when they had their subjects watch a video of someone’s hand squeezing a rubber ball.

 

[6] For non-powerful participants, mirroring worked fine: The neural pathways they would use to squeeze the ball themselves fired strongly. But the powerful group’s? Less so. Was the mirroring response broken? More like anesthetized. None of the participants possessed permanent power. They were college students who had been “primed” to feel potent by recounting an experience in which they had been in charge. The anesthetic would presumably wear off when the feeling did— their brains weren’t structurally damaged after an afternoon in the lab. But if the effect had been long-lasting— say, by dint of having Wall Street analysts whispering their greatness quarter after quarter, and board members offering them extra helpings of pay— they might have developed what in medicine are known as “functional” changes to the brain.

 

[7] I wondered whether the powerful might simply stop trying to put themselves in others’ shoes, without losing the ability to do so. As it happened, Obhi ran a subsequent study that may help answer that question. This time, subjects were told what mirroring was and asked to make a conscious effort to increase or decrease their response. “Our results,” he and his co-author, Katherine Naish, wrote, “showed no difference.” Effort didn’t help. This is a depressing finding. Knowledge is supposed to be power. But what good is knowing that power deprives you of knowledge?

 

[8] The sunniest possible spin, it seems, is that these changes are only sometimes harmful. Power primes our brain to screen out peripheral information. In most situations, this provides a helpful efficiency boost. In social ones, it has the unfortunate side effect of making us more obtuse. Even that is not necessarily bad for the prospects of the powerful, or the groups they lead. As Susan Fiske, a Princeton psychology professor, has persuasively argued, power lessens the need for a nuanced read of people, since it gives us command of resources we once had to cajole from others. But of course, in a modern organization, the maintenance of that command relies on some level of organizational support. And the sheer number of examples of executive hubris that bristle from the headlines suggests that many leaders cross the line into counterproductive folly.

 

[9] Less able to make out people’s individuating traits, they rely more heavily on stereotype. And the less they’re able to see, other research suggests, the more they rely on a personal “vision” for navigation. John Stumpf saw a Wells Fargo where every customer had eight separate accounts. (As he’d often noted to employees, eight rhymes with great.) “Cross-selling,” he told Congress, “is shorthand for deepening relationships.”

 

[10] Is there nothing to be done? No and yes. It’s difficult to stop power’s tendency to affect your brain. What’s easier— from time to time, at least— is to stop feeling powerful. Insofar as it affects the way we think, power, Keltner reminded me, is not a post or a position but a mental state. Recount a time you did not feel powerful, his experiments suggest, and your brain can commune with reality.

 

[11] Recalling an early experience of powerlessness seems to work for some people— and experiences that were searing enough may provide a sort of permanent protection. A remarkable study published in The Journal of Finance last February found that CEOs who as children had lived through a natural disaster that produced significant fatalities were much less risk-seeking than CEOs who hadn’t. (The one problem, says Raghavendra Rau, a co-author of the study and a Cambridge University professor, is that CEOs who had lived through disasters without significant fatalities were more risk-seeking.)

 

[12] But tornadoes, volcanoes, and tsunamis aren’t the only hubris-restraining forces out there. PepsiCo CEO and Chairman Indra Nooyi sometimes tells the story of the day she got the news of her appointment to the company’s board, in 2001. She arrived home percolating in her own sense of importance and vitality, when her mother asked whether, before she delivered her “great news,” she would go out and get some milk. Fuming, Nooyi went out and got it. “Leave that damn crown in the garage” was her mother’s advice when she returned.

 

[13] The point of the story, really, is that Nooyi tells it. It serves as a useful reminder about ordinary obligation and the need to stay grounded. Nooyi’s mother, in the story, serves as a “toe holder,” a term once used by the political adviser Louis Howe to describe his relationship with the four-term President Franklin D. Roosevelt, whom Howe never stopped calling Franklin.

 

[14] For Winston Churchill, the person who filled that role was his wife, Clementine, who had the courage to write, “My Darling Winston. I must confess that I have noticed a deterioration in your manner; & you are not as kind as you used to be.” Written on the day Hitler entered Paris, torn up, then sent anyway, the letter was not a complaint but an alert: Someone had confided to her, she wrote, that Churchill had been acting “so contemptuous” toward subordinates in meetings that “no ideas, good or bad, will be forthcoming”— with the attendant danger that “you won’t get the best results.”

 

[15] Lord David Owen— a British neurologist turned parliamentarian who served as foreign secretary before becoming a baron— recounts both Howe’s story and Clementine Churchill’s in his 2008 book, In Sickness and in Power, an inquiry into the various maladies that had affected the performance of British prime ministers and American presidents since 1900. While some suffered from strokes (Woodrow Wilson), substance abuse (Anthony Eden), or possibly bipolar disorder (Lyndon B. Johnson, Theodore Roosevelt), at least four others acquired a disorder that the medical literature doesn’t recognize but, Owen argues, should.

 

[16] “Hubris syndrome,” as he and a co-author, Jonathan Davidson, defined it in a 2009 article published in Brain, “is a disorder of the possession of power, particularly power which has been associated with overwhelming success, held for a period of years and with minimal constraint on the leader.” Its 14 clinical features include: manifest contempt for others, loss of contact with reality, restless or reckless actions, and displays of incompetence. In May, the Royal Society of Medicine co-hosted a conference of the Daedalus Trust—an organization that Owen founded for the study and prevention of hubris.

 

[17] I asked Owen, who admits to a healthy predisposition to hubris himself, whether anything helps keep him tethered to reality, something that other truly powerful figures might emulate. He shared a few strategies: thinking back on hubris-dispelling episodes from his past; watching documentaries about ordinary people; making a habit of reading constituents’ letters.

 

[18] But I surmised that the greatest check on Owen’s hubris today might stem from his recent research endeavors. Businesses, he complained to me, had shown next to no appetite for research on hubris. Business schools were not much better. The undercurrent of frustration in his voice attested to a certain powerlessness. Whatever the salutary effect on Owen, it suggests that a malady seen too commonly in boardrooms and executive suites is unlikely to soon find a cure.


 

The Psychological Quirk That Explains Why You Love Donald Trump

David Dunning

Politico Magazine | May 25, 2016

 

[1] Many commentators have argued that Donald Trump’s dominance in the GOP presidential race can be largely explained by ignorance; his candidacy, after all, is most popular among Republican voters without college degrees. Their expertise about current affairs is too fractured and full of holes to spot that only 9 percent of Trump’s statements are “true” or “mostly true,” according to PolitiFact, whereas 57 percent are “false” or “mostly false”—the remainder being “pants on fire” untruths. Trump himself has memorably declared: “I love the poorly educated.”

 

[2] But as a psychologist who has studied human behavior— including voter behavior—I think there is something deeper going on. The problem isn’t that voters are too uninformed. It is that they don’t know just how uninformed they are. Psychological research suggests that people, in general, suffer from what has become known as the Dunning-Kruger Effect. They have little insight about the cracks and holes in their expertise. In studies in my research lab, people with severe gaps in knowledge and expertise typically fail to recognize how little they know and how badly they perform. To sum it up, the knowledge and intelligence that are required to be good at a task are often the same qualities needed to recognize that one is not good at that task—and if one lacks such knowledge and intelligence, one remains ignorant that one is not good at that task. This includes political judgment.

 

[3] We have found this pattern in logical reasoning, grammar, emotional intelligence, financial literacy, numeracy, firearm care and safety, debate skill, and college coursework. Others have found a similar lack of insight among poor chess players, unskilled medical lab technicians, medical students unsuccessfully completing an obstetrics/gynecology rotation, and people failing a test on performing CPR. This syndrome may well be the key to the Trump voter—and perhaps even to the man himself. Trump has served up numerous illustrative examples of the effect as he continues his confident audition to be leader of the free world even as he seems to lack crucial information about the job. In a December debate he appeared ignorant of what the nuclear triad is. Elsewhere, he has mused that Japan and South Korea should develop their own nuclear weapons—casually reversing decades of U.S. foreign policy.

 

[4] Many commentators have pointed to these confident missteps as products of Trump’s alleged narcissism and egotism. My take would be that it’s the other way around. Not seeing the mistakes for what they are allows any potential narcissism and egotism to expand unchecked. In voters, lack of expertise would be lamentable but perhaps not so worrisome if people had some sense of how imperfect their civic knowledge is. If they did, they could repair it. But the Dunning-Kruger Effect suggests something different. It suggests that some voters, especially those facing significant distress in their lives, might like some of what they hear from Trump, but they do not know enough to hold him accountable for the serious gaffes he makes. They fail to recognize those gaffes as missteps.

 

[5] Here is more evidence. In a telling series of experiments, Philip Fernbach and colleagues asked political partisans to rate their understanding of various social policies, such as imposing sanctions on Iran, instituting a flat tax, or establishing a single-payer health system. Survey takers expressed a good deal of confidence about their expertise. Or rather, they did until researchers put that understanding to the test by asking them to describe in detail the mechanics of two of the policies under question. This challenge led survey takers to realize that their understanding was mostly an illusion. It also led them to moderate their stances about those policies and to donate less money, earned in the experiment, to like-minded political advocacy groups. Again, the key to the Dunning-Kruger Effect is not that unknowledgeable voters are uninformed; it is that they are often misinformed—their heads filled with false data, facts and theories that can lead to misguided conclusions held with tenacious confidence and extreme partisanship, perhaps some that make them nod in agreement with Trump at his rallies.

 

[6] Trump himself also exemplifies this exact pattern, showing how the Dunning-Kruger Effect can lead to what seems an indomitable sense of certainty. All it takes is not knowing the point at which the proper application of a sensible idea turns into malpractice. For example, in a CNBC interview, Trump suggested that the U.S. government debt could easily be reduced by asking federal bondholders to “take a haircut,” agreeing to receive a little less than the bond’s full face value if the U.S. economy ran into trouble. In a sense, this is a sensible idea commonly applied—at least in business, where companies routinely renegotiate the terms of their debt.

 

[7] But stretching it to governmental finance strains reason beyond acceptability. And in his suggestion, Trump illustrated not knowing the horror show of consequences his seemingly modest proposal would produce. For the U.S. government, his suggestion would produce no less than an unprecedented earthquake in world finance. It would represent the de facto default of the U.S. on its debt—and the U.S. government has paid its debt in full since the time of Alexander Hamilton. The certainty and safety of U.S. Treasury bonds are the bedrock upon which much of world finance rests. Even suggesting that these bonds pay back less than 100 percent would be cause for future buyers to demand higher interest rates, thus costing the U.S. government, and the taxpayer, untold millions of dollars, and risking the health of the American economy.

 

[8] This misinformation problem can live in voters, too, as shown in a 2015 survey about the proposed Common Core standards for education. A full 41 percent claimed the new standards would prompt more frequent testing within California schools. That was untrue. Only 18 percent accurately stated that the level of testing would stay the same. Further, 35 percent mistakenly asserted that the standards went beyond math and English instruction. Only 28 percent correctly reported that the standards were constrained to those two topics. And 34 percent falsely claimed that the federal government would require California to adopt the Common Core. Only 21 percent accurately understood this was not so.

 

[9] But more interesting—and troubling—were the responses of survey takers who claimed they knew “a lot” about the new standards. What these “informed” citizens “knew” trended toward the false rather than the true. For example, 52 percent thought the standards applied beyond math and English (versus 32 percent who got it right). And 57 percent believed the standards mandated more testing (versus 31 percent who correctly understood that they did not). These misconceptions mattered: To the extent that survey takers endorsed these misconceptions, they opposed the Common Core.

 

[10] My research colleagues and I have found similar evidence that voters who think they are informed may be carrying a good deal of misinformation in their heads. In an unpublished study, we surveyed people the day after the 2014 midterm elections, asking them whether they had voted. Our key question was who was most likely to have voted: informed, uninformed, or misinformed citizens. We found that voting was strongly tied to one thing—whether those who took the survey thought of themselves as “well-informed” citizens. But perceiving oneself as informed was not necessarily tied to, um, being well-informed.

 

[11] To be sure, well-informed voters accurately endorsed true statements about economic and social conditions in the U.S.—just as long as those statements agreed with their politics. Conservatives truthfully claimed that the U.S. poverty rate had gone up during the Obama administration; liberals rightfully asserted that the unemployment rate had dropped. But both groups also endorsed falsehoods agreeable to their politics. Thus, all told, it was the political lean of the fact that mattered much more than its truth-value in determining whether respondents believed it. And endorsing partisan facts both true and false led to perceptions that one was an informed citizen, and then to a greater likelihood of voting.

 

[12] Given all this misinformation, confidently held, it is no wonder that Trump causes no outrage or scandal among those voters who find his views congenial. But why now? If voters can be so misinformed that they don’t know that they are misinformed, why only now has a candidate like Trump arisen? My take is that the conditions for the Trump phenomenon have been in place for a long time. At least as long as quantitative survey data have been collected, citizens have shown themselves to be relatively ill-informed and incoherent on political and historical matters. As far back as 1943, a survey revealed that only 25 percent of college freshmen knew that Abraham Lincoln was president during the Civil War.

 

[13] All it took was a candidate to come along too inexperienced to avoid making policy gaffes, at least gaffes that violate received wisdom, with voters too uninformed to see the violations. Usually, those candidates make their mistakes off in some youthful election to their state legislature, or in a small-town mayoral race or a contest for class president. It’s not a surprise that someone trying out a brand-new career at the presidential level would make gaffes that voters, in a rebellious mood, would forgive but more likely not even see.

 

[14] But the Dunning-Kruger perspective also suggests a cautionary tale that extends well beyond the Trump voter. The Trump phenomenon may provide only an extravagant and visible example in which voters fail to spot a political figure who seems to be making it up as he goes along. But the key lesson of the Dunning-Kruger framework is that it applies to all of us, sooner or later. Each of us at some point reaches the limits of our expertise and knowledge, and those limits make the misjudgments that lie beyond them undetectable to us. As such, if we find ourselves worried about the apparent gullibility of the Trump voter, which may be flamboyant and obvious, we should surely worry about our own naive political opinions, which are likely to be more nuanced, subtle, and invisible—but perhaps no less consequential. We all run the risk of being too ill-informed to notice when our own favored candidates or national leaders make catastrophic misjudgments.

 

[15] To be sure, I don’t wish to leave the reader with a fatal hesitation about supporting any candidate. All I am saying is: trust, but verify. Thomas Jefferson once observed that “if a nation expects to be ignorant and free in a state of civilization, it expects what never was and never will be.” The Trump phenomenon makes visible something that has been true for quite some time now. As a citizenry, we can be massively ill-informed. Yet, our society remains relatively free.

 

[16] How have we managed so far to maintain what Jefferson suggested could never be? And how do we ensure this miracle of democracy continues? This is the real issue. And it will be with us long after the Trumpian political revolution, or reality TV spectacle, depending on how you see it, has flickered off the electronic screens of our cultural theater.