… You can just switch off the radio, change channels, only like the Facebook pages that give you the kind of news you prefer. You can construct a pillow fort of the information that’s comfortable.
Most people aren’t totally ensconced in a cushiony cave, though. They build windows in the fort, they peek out from time to time, they go for long strolls out in the world. And so, they will occasionally encounter information that suggests something they believe is wrong. A lot of these instances are no big deal, and people change their minds if the evidence shows they should—you thought it was supposed to be nice out today, you step out the door and it’s raining, you grab an umbrella. Simple as that. But if the thing you might be wrong about is a belief that’s deeply tied to your identity or worldview—the guru you’ve dedicated your life to is accused of some terrible things, the cigarettes you’re addicted to can kill you—well, then people become logical Simone Bileses, doing all the mental gymnastics it takes to remain convinced that they’re right.
People see evidence that disagrees with them as weaker, because ultimately, they’re asking themselves fundamentally different questions when evaluating that evidence, depending on whether they want to believe what it suggests or not, according to psychologist Tom Gilovich. “For desired conclusions,” he writes, “it is as if we ask ourselves ‘Can I believe this?’, but for unpalatable conclusions we ask, ‘Must I believe this?’” People come to some information seeking permission to believe, and to other information looking for escape routes.
Julie Beck, The Atlantic