Sunday, July 23, 2017

Reasoning vs. rationalizing.

We live in rancorous times. Here in the United States, we're split by political views -- conservative on the right, liberal on the left -- and each half is further fractured. Republicans have majorities in both houses of Congress, but they have been unable to get much done; major pieces of legislation have been shot down by moderates who think they're too harsh, and by members of the Freedom Caucus who think they don't go far enough.

Democrats are okay with that. But they can't get themselves together, either. Neoliberals, who have been in charge of the party for the last several decades, are still trying to figure out how they lost the presidential election last year -- while progressives are frustrated that the party is not embracing their farther-left economic stances fast enough.

Each side keeps trying to convince the others of the rightness of their position, using poll results replete with charts and graphs. "Proof!" they cry. "Why won't you listen to us? We're headed for disaster! Why won't you change your minds?"

It turns out we're not hardwired that way.

A whole host of studies have been done on decision-making behavior and how preconceived notions affect it. One of the most striking was done by a Yale law professor in 2013. He set up a fairly complex math problem and had the study participants come to a conclusion from the data they were given. I won't bore you with the details (you can see the questions at the link). But the upshot was that when the question was about a skin cream and whether it made a rash better or worse, people who were better at math were more likely to get the answer right.

However, if the question was about concealed-carry laws and whether a ban on them made crime better or worse, knowing more math didn't help. In fact, people did worse if the data were presented in a way that went against their stance on gun control. In other words, conservatives did well if the right answer showed that the ban didn't work, but poorly if the right answer was that the ban did work. The same was true in reverse for liberals. And the people who knew more math were the worst at picking the right answer when it didn't support their stance.
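For what it's worth, the arithmetic behind questions like these boils down to comparing rates rather than raw counts. Here's a minimal sketch of that comparison in Python, using made-up numbers for illustration -- these are not the figures from the actual study:

```python
# Toy version of the kind of 2x2 comparison the study used.
# The numbers below are invented for illustration only.

def improvement_rate(better, worse):
    """Fraction of people in a group whose outcome improved."""
    return better / (better + worse)

# Hypothetical outcomes with and without the skin cream.
used_cream = {"better": 220, "worse": 80}   # about 73% improved
no_cream   = {"better": 105, "worse": 20}   # about 84% improved

with_rate    = improvement_rate(**used_cream)
without_rate = improvement_rate(**no_cream)

# The intuitive (wrong) move is to compare the big raw counts:
# 220 "better" with the cream looks impressive. The correct move
# is to compare the rates, which show the cream group did worse.
print(f"Improved with cream:    {with_rate:.0%}")
print(f"Improved without cream: {without_rate:.0%}")
print("Cream helped" if with_rate > without_rate else "Cream made things worse")
```

The gun-law version of the question had the same structure; only the labels on the table changed.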

This goes back to confirmation bias: humans' tendency to form an opinion first, and then seek out facts to back it up. Moreover, when confronted with facts that don't back up our opinions, we tend to reject them -- or figure out some convoluted way that they actually fit our opinion. Knowing that, we should all be searching out opposing viewpoints to challenge our opinions, but of course we don't. And the harder we're pressed to change, the more likely we are to stick our fingers in our ears and go, "LA LA LA LA LA!" until those annoying nonconforming facts go away.

So if we won't challenge ourselves, and we won't listen to the other side, how do we bring everybody together again?

In the past, major historical events have been catalysts. Pearl Harbor and 9/11 both caused Americans to rally 'round the flag. Examples of positive events are harder to come by, although the moon landing might fit the bill. In each of these cases, opinions became divided some time after the event: many people now are second-guessing our going to war against Iraq as a result of 9/11, and some folks are questioning the money we spend on space exploration. But in the first flush of excitement -- or horror -- we all pretty much reacted the same way.

We must find a way to come together again soon. We cannot continue to function as a democracy (or, to be more precise, a democratic republic) without some degree of common ground. Let's hope that this time, it's a positive event that brings us together.

***
These moments of reasonable blogginess have been brought to you, as a public service, by Lynne Cantwell.
