Ready to believe anything

Why are we so ready to believe what we hear or are told? Wilson and Brekke had a go at it (via Bryan Caplan):

As noted by Gilbert (1991, 1993), there is a long tradition in philosophy and psychology, originating with Descartes, that assumes that belief formation is a two-step process: First people comprehend a proposition (e.g., “Jason is dishonest”) and then freely decide whether to accept it as true (e.g., whether it fits with other information they know about Jason). Thus, there is no danger to encountering potentially false information because people can always weed out the truth from the fiction, discarding those propositions that do not hold up under scrutiny. Gilbert (1991, 1993) argued persuasively, however, that human belief formation operates much more like a system advocated by Spinoza. According to this view, people initially accept as true every proposition they comprehend and then decide whether to “unbelieve” it or not. Thus, in the example just provided, people assume that Jason is dishonest as soon as they hear this proposition, reversing this opinion if it is inconsistent with the facts.

Under many circumstances, the Cartesian and Spinozan systems end up at the same state of belief (e.g., that Jason is honest because, on reflection, people know that there is no evidence that he is dishonest). Because the second, verification stage requires mental effort, however, there are conditions under which the two systems result in very different states of belief.  If people are tired or otherwise occupied, they may never move beyond the first stage of the process. In the Cartesian system, the person would remain in a state of suspended belief (e.g., “Is Jason dishonest? I will reserve judgment until I have time to analyze the facts”). In the Spinozan system, the person remains in the initial stage of acceptance, believing the initial proposition. Gilbert has provided evidence, in several intriguing experiments, for the Spinozan view: When people’s cognitive capacity is taxed, they have difficulty rejecting false propositions (see Gilbert, 1991, 1993).

Read that last line again: when people’s cognitive capacity is taxed, they have difficulty rejecting false propositions.

This, in essence, is what happens when we are bombarded with information and misinformation on "news" channels and in advertisements: we do not have the time to decide whether what we hear is true, yet we are inclined to accept it initially. This works well for politicians, ad-makers, and purveyors of fake news.

This also reminds me of the quote often attributed to Lenin: "a lie told often enough becomes the truth."

Should I be reading Spinoza? Jeeves used to read his works! The psychology of the individual and all that.
