Let’s Be Rational

posted in: Cognition, Personal

I had a friend not so long ago who espoused a libertarian point of view. I enjoyed talking with him about politics and debating issues. His outlook was new to me. I learned a lot. Many of his ideas appealed to me. His insistence on freedom as the preeminent human value, though, struck me as untenable. Yet I was never able to drive this point home. He was unshakeable in his beliefs–more than I was in mine. So my ears pricked up when I heard him say to a group of us that, whenever someone engaged him in debate, he “almost always” won.

Such confidence. Such hubris. I’ve never forgotten his words, though in truth I don’t know whether he said “always,” “almost always,” “usually,” or something else.

The memory came back to me (again) while reading Joshua Rothman’s recent article on rationality in The New Yorker, “Thinking It Through.” Rationality, it seems, is having a moment, as evidence for which Rothman cites a spate of books and a prominent website. It makes a certain sense, Rothman explains: “In a polemical era, rationality can be a kind of opinion hygiene–a way of washing off misjudged views.”

But Rothman reminds us that rationality has its own vulnerabilities to infection by unreason. One can be rational but also self-deceptive: “telling yourself you are rational can itself become a source of bias.” One might be “rational” primarily as a means to impress others, or in certain domains more than in others. One can irrationally insist on solving problems oneself when deferring to someone else might be more effective. The list could be extended.

My friend’s brag bugged me for a long time. Only years later did I realize that he wasn’t so much right as smart, which is not the same thing. He had a sharp mind, able to remember facts, marshal statistics, and think logically. But there are many ways to be smart. For all his cleverness, I sensed he was not especially metacognitively self-aware.

This is the quality on which Rothman focuses his lens most extensively, citing, among other newly published works, Stephen Fleming’s Know Thyself: The Science of Self-Awareness. Rationalists, Rothman says, must always take in new information “self-consciously, with an eye to redrawing their mental maps.” Later he asserts that “a rationalist must also be willing to be ‘metarational,’ willing to hand over the thinking keys when someone else is better informed or better trained.”

Another recent title, The Scout Mindset: Why Some People See Things Clearly and Others Don’t by Julia Galef, likewise stresses the importance of self-awareness and flexibility. The Scout Mindset allows you “to recognize when you are wrong, to seek out your blind spots, to test your assumptions and change course.” By contrast, the Soldier Mindset predisposes you to hunker down and defend your position.

We see and hear a lot from the soldier mindset these days–as did, no doubt, our ancestors before us. It is perhaps humanity’s default position, honed over millennia through natural selection. It would have promoted group cohesion and decisive action. The soldier fights for his country’s cause as much as for his own survival. The soldierly thinker consistently affirms his identity as a member of a group by supporting its ideology or point of view. Like many of the traits we acquired on the savannah, though, this one may not serve us as well in our modern, complex social systems with instant global communications. We all belong to many different groups and subgroups. Many of our problems have become planetary in scope.

Furthermore, the soldier mindset requires assertions of knowledge and understanding that cannot be borne out by reason. I can think I know that Antony Blinken and Lloyd Austin got it all wrong in Afghanistan. And I may be right. (It’s hard to see how I could be wrong!) But my knowledge of the actual facts is extremely limited, even with my daily newspaper and radio broadcasts. So I could hardly say how or why they got it wrong as well as the Secretaries of State and Defense could themselves. (It is possible, too, that they might “rationalize” their mistakes, a word that, Rothman points out, reveals our ambivalence toward rationality. And it is also true that sometimes those closest to a situation, the “experts,” miss a key insight that an outside observer might better identify.) Even on issues closer to home, it makes no sense that I could be right every time, in every way. Logically, I must be wrong some of the time, in certain ways. It’s only rational. Thus, it makes sense to be on the lookout for ways to redraw my mental maps.

My libertarian friend did not have a soldier mindset. He was nimble rather than bullish. He was, if anything, clever to a fault. I often wish I’d had the chance to point out to him that “clever” is not the same thing as “right.” (I’m sure I’m right about that!) Rothman has his own hyper-rational friend, whom he refers to throughout the article. The difference is that Rothman sees this friend as a positive exemplar. And he makes a striking realization about him near the end of the piece: “I think it’s not rationality, as such, that makes him curious, truthful, honest, careful, perceptive, and fair, but the reverse.”

Most of us have seen enough Star Trek to know that Spock is not our best model for human rationality. Rothman’s descriptors, on the other hand, may be a good working definition for this charged historical moment.

 

Rothman, Joshua. “Thinking It Through,” The New Yorker, August 23, 2021, 24-29.
