How to find truth

    In his book The Righteous Mind, Jonathan Haidt discusses how we think about our beliefs:

    ...the mind is divided, like a rider on an elephant, and the rider’s job is to serve the elephant. The rider is our conscious reasoning—the stream of words and images of which we are fully aware. The elephant is the other 99 percent of mental processes—the ones that occur outside of awareness but that actually govern most of our behavior.
    A dog’s tail wags to communicate. You can’t make a dog happy by forcibly wagging its tail. And you can’t change people’s minds by utterly refuting their arguments.
    If you want to change people’s minds, you’ve got to talk to their elephants.
    The persuader’s goal should be to convey respect, warmth, and an openness to dialogue before stating one’s own case.
    “If there is any one secret of success it lies in the ability to get the other person’s point of view and see things from their angle as well as your own.” It’s such an obvious point, yet few of us apply it in moral and political arguments because our righteous minds so readily shift into combat mode. The rider and the elephant work together smoothly to fend off attacks and lob rhetorical grenades of our own. The performance may impress our friends and show allies that we are committed members of the team, but no matter how good our logic, it’s not going to change the minds of our opponents if they are in combat mode too.
    ...if you want to change someone’s mind about a moral or political issue, talk to the elephant first. If you ask people to believe something that violates their intuitions, they will devote their efforts to finding an escape hatch—a reason to doubt your argument or conclusion. They will almost always succeed.
    The elephant is far more powerful than the rider, but it is not an absolute dictator. When does the elephant listen to reason? The main way that we change our minds on moral issues is by interacting with other people. We are terrible at seeking evidence that challenges our own beliefs, but other people do us this favor, just as we are quite good at finding errors in other people’s beliefs. When discussions are hostile, the odds of change are slight. The elephant leans away from the opponent, and the rider works frantically to rebut the opponent’s charges. But if there is affection, admiration, or a desire to please the other person, then the elephant leans toward that person and the rider tries to find the truth in the other person’s arguments. The elephant may not often change its direction in response to objections from its own rider, but it is easily steered by the mere presence of friendly elephants (that’s the social persuasion link in the social intuitionist model) or by good arguments given to it by the riders of those friendly elephants (that’s the reasoned persuasion link).
    [Image: Necker cube, an optical illusion that can be interpreted to have either the lower-left or the upper-right square as its front side.]
    There are even times when we change our minds on our own, with no help from other people. Sometimes we have conflicting intuitions about something, as many people do about abortion and other controversial issues. Depending on which victim, which argument, or which friend you are thinking about at a given moment, your judgment may flip back and forth as if you were looking at a Necker cube.
    Exploratory thought is an “evenhanded consideration of alternative points of view.” Confirmatory thought is “a one-sided attempt to rationalize a particular point of view.” Accountability increases exploratory thought only when three conditions apply: (1) decision makers learn before forming any opinion that they will be accountable to an audience, (2) the audience’s views are unknown, and (3) they believe the audience is well informed and interested in accuracy. When all three conditions apply, people do their darnedest to figure out the truth, because that’s what the audience wants to hear. But the rest of the time—which is almost all of the time—accountability pressures simply increase confirmatory thought. People are trying harder to look right than to be right.
    The social psychologist Tom Gilovich studies the cognitive mechanisms of strange beliefs. His simple formulation is that when we want to believe something, we ask ourselves, “Can I believe it?” Then (as Kuhn and Perkins found), we search for supporting evidence, and if we find even a single piece of pseudo-evidence, we can stop thinking. We now have permission to believe. We have a justification, in case anyone asks. In contrast, when we don’t want to believe something, we ask ourselves, “Must I believe it?” Then we search for contrary evidence, and if we find a single reason to doubt the claim, we can dismiss it. You only need one key to unlock the handcuffs of must.
    As Mercier and Sperber put it, “skilled arguers … are not after the truth but after arguments supporting their views.” This explains why the confirmation bias is so powerful, and so ineradicable. How hard could it be to teach students to look on the other side, to look for evidence against their favored view? Yet, in fact, it’s very hard, and nobody has yet found a way to do it. It’s hard because the confirmation bias is a built-in feature (of an argumentative mind), not a bug that can be removed (from a platonic mind).
    Reasoning can take us to almost any conclusion we want to reach, because we ask “Can I believe it?” when we want to believe something, but “Must I believe it?” when we don’t want to believe. The answer is almost always yes to the first question and no to the second.[1]


    Footnotes

    1. Haidt, Jonathan. The Righteous Mind: Why Good People Are Divided by Politics and Religion. Knopf Doubleday Publishing Group, 2012, pp. 1–107.

