Fallacies and Cognitive Biases

*

Fallacies

*

Toupée Fallacy
“assuming that we can spot a toupée wearer, when in reality the convincing toupées go by undetected” [1]

*

Nirvana Fallacy
“rejecting a solution to a problem simply because it’s not perfect” [1]

  • [Pannenkoek] “A common mistake is to think we should strive for what’s perfect, but really we should strive for what’s best.”

*

Socratic Fallacy
“assuming that if someone can’t provide an explicit definition of a term, then they don’t know what it means, or that they don’t know what they mean when using it” [1]

  • This comes up sometimes during arguments, when person A asks person B to define a certain (usually common) term, and B can’t think up a good definition, so then A implies that B doesn’t know what they’re talking about. In reality, it’s just hard to come up with a good definition off the top of your head.
  • As another (silly) example, imagine that, out of the blue, you ask your calculus professor, “What’s the definition of ‘math’?” He’s then at a loss for what to say, and so you respond, “Wow, I guess they give Ph.D.’s out to anyone these days.”

*

Sunk-Cost Fallacy
“reasoning that further investment is warranted because the resources already invested will be lost (‘in vain’) otherwise, despite new evidence suggesting that the cost of continuing outweighs the expected benefit” [1]

  • “For example, a person may purchase a ticket to a baseball game and find shortly after arriving that they aren’t enjoying the game. They have two options at this point. The first is accepting the waste of money on the ticket price and watching the remainder of the game, most likely without enjoying it. The second is accepting the waste of money on the ticket price and leaving to do something else. Since the second option involves suffering in only one way (wasted money), while the first involves suffering in two (wasted money plus wasted time), option two is preferable. In either case, the ticket-buyer has already paid the price of the ticket, so that part of the decision no longer affects the future. Many people, however, would feel obliged to stay for the rest of the game despite not really wanting to ― perhaps because they feel that doing otherwise would be wasting the money they spent on the ticket.” [2]

*

Map-Territory Confusion
“mistaking a representation of something for the thing itself” [3]

  • Use-Mention Error
    “mistaking a word or concept for what it refers to” [4]

*

***

*

Cognitive Biases

*

Dunning-Kruger Effect
“When we’re incompetent at something, we don’t realize that we’re incompetent, because we lack the skill to distinguish between competence and incompetence; the unskilled suffer from illusory superiority.” [5]

*

Halo Effect
“When perceiving others, we allow their positive or negative traits to ‘spill over’ from one area of their personality into another.” [6]

*

Mere-Exposure Effect
“We express undue liking for things merely due to familiarity with them.” [6]

*

Negativity Bias
“We pay more attention and give more weight to negative experiences or information than positive.” [6]

  • For example, it’s a lot easier to remember the mean things people have said to us compared to the nice ones. And our most embarrassing moments stay with us in a way our most triumphant ones don’t.
  • The negativity bias strikes me as ultimately more detrimental to happiness than anything else mentioned in this entire section.

*

Peak-End Rule
“We judge an experience largely based on how we felt at its peak (its most intense point) and at its end, rather than based on the total sum or average of every moment of the experience.” [6]

  • This has some overlap with “the serial position effect (also known as the recency and primacy effects), which is our tendency to recall the first and last items in a series best, and the middle items worst.” [7]

*

Hyperbolic Discounting
“We have the tendency to choose a smaller-sooner reward over a larger-later reward.” [8]

  • At its simplest, we’re willing to pay more to have the same desirable thing sooner rather than later, even though in theory there shouldn’t be a difference. [Pannenkoek] “I often feel like my consciousness today is one person and my consciousness tomorrow is a different person, and today I only care about today’s consciousness.”
  • The following is a more nuanced description. “The standard experiment used to explore hyperbolic discounting involves asking participants questions such as: ‘Would you prefer a dollar today or three dollars tomorrow?’ or ‘Would you prefer a dollar in one year or three dollars in one year and one day?’ Many people will take the lesser amount today, but will gladly wait one extra day in a year in order to receive the higher amount instead. More generally, our valuations fall relatively rapidly for earlier delay periods but then fall more slowly for longer delay periods. Consequently, hyperbolic discounting often leads us to make choices today that our future selves would prefer not to have made, despite knowing the same information.” [8]
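The preference reversal described above follows from the shape of the discount curve. Here’s a minimal sketch using the standard one-parameter hyperbolic model, V = A / (1 + kD); the discount rate k = 3 per day is an arbitrary illustrative value, not something from the experiments cited.

```python
def hyperbolic_value(amount, delay_days, k=3.0):
    """Present value under hyperbolic discounting: V = A / (1 + k*D)."""
    return amount / (1 + k * delay_days)

# Near-term choice: $1 today vs. $3 tomorrow
today = hyperbolic_value(1, 0)      # 1.0
tomorrow = hyperbolic_value(3, 1)   # 0.75 -> the $1 today wins

# Same one-day gap, pushed a year out: $1 in 365 days vs. $3 in 366
far = hyperbolic_value(1, 365)       # ~0.0009
far_plus_one = hyperbolic_value(3, 366)  # ~0.0027 -> now the $3 wins
```

An exponential discounter (V = A·δ^D) would never show this reversal, since the ratio between the two options’ values stays constant at every delay; the crossover is exactly what makes hyperbolic discounting “time-inconsistent.”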

*

Planning Fallacy
“We habitually underestimate task-completion times.” [6]

  • Hofstadter’s Law
    “It always takes longer than you expect, even when you take into account Hofstadter’s Law.” [9]

*

***

*

The Pervasiveness of the Self-Serving Bias

*

Something I’ve come to think of as a ‘meta-theme’ in psychology is the pervasiveness of the self-serving bias.

Self-Serving Bias
“any process that’s distorted by the need to maintain and enhance self-esteem, or the tendency to perceive oneself in an overly favorable manner” [11]

In essence, one very important, general goal we each have is to see the world accurately; and another very important, general goal we each have is to see ourselves in a favorable light. But when these two goals conflict, it’s the latter that wins out. Put another way, we constantly, unconsciously seek to have a positive self-image ― to feel good about ourselves ― even at the expense of the truth. This could be thought of as a subcategory of the more general tendency to believe that the world is as one would like it to be, even at the expense of seeing it for how it really is.

This theme of tension between objectivity and positive self-regard appears again and again, in many different forms. The following is a list of all the related phenomena I could think of. For whatever reason, I’ve never seen them all collected together before.

Arguably, everything else mentioned can be thought of as a subcategory of the self-serving bias as it’s defined above.

*

Directly Related

*

Self-Serving Bias (alternate, less fundamental definitions)
(1) “the tendency to attribute positive events to one’s own character but attribute negative events to external factors” [12]
(2) “the tendency to evaluate ambiguous information in a way that’s beneficial to one’s interests” [6]
(3) “the tendency to claim more responsibility for successes than failures” [6]

*

Egocentric Bias
(1) “the tendency to have a higher opinion of oneself than is warranted by reality” [13]
(2) “the tendency to claim more (less) responsibility for oneself for the results of a favorable (unfavorable) joint action than an outside observer would credit them with” [6]

*

Choice-Supportive Bias
(1) “the tendency to remember one’s choices as better than they actually were” [6]
(2) “the tendency to retroactively ascribe positive attributes to an option one has selected and/or to demote the forgone options” [14]

  • Post-Purchase Rationalization
    “the tendency to overlook any faults or defects in order to justify a purchase one has made” [15]

*

Illusory Superiority (Above-Average Effect)
“the tendency to overestimate one’s desirable qualities and underestimate one’s undesirable qualities, relative to other people” [6]

*

False-Uniqueness Effect
“the tendency to see oneself and one’s projects as more singular or special than they actually are” [6]

*

Social Desirability Bias
“the tendency to over-report socially desirable characteristics or behaviors in oneself and under-report socially undesirable ones” [6]

  • This often comes into play when we describe ourselves to others or recount to them experiences we’ve had. Consciously or unconsciously, we leave out many of the details that make us look bad.

*

Projection
“a defense mechanism in which one attributes their own unacknowledged, unacceptable, or unwanted thoughts and emotions onto another” [16]

*

Self-Handicapping
(1) “arranging for an obvious and non-threatening obstacle to one’s own performance, such that any failure can be attributed to the obstacle and not one’s own limitations” [Gleitman, Reisberg, & Gross]
(2) “avoiding effort in the hopes of keeping potential failure from hurting self-esteem” [17]

*

Indirectly or Sometimes Related

*

Actor-Observer Bias (Fundamental Attribution Error)
“When explaining other individuals’ behaviors, we overemphasize the influence of their personality and underemphasize the influence of their situation. On the other hand, when explaining our own behaviors, we overemphasize the influence of our situation and underemphasize the influence of our own personality.” [6]

  • [Pannenkoek] “So the actor-observer bias can be summarized as: (others..character) and (oneself..situation). And that one definition of the self-serving bias can be summarized as: regarding oneself, (good..character) and (bad..situation). So they’re slightly at odds with one another, but I understand and agree with the general sentiment of both.”

*

Rationalization
“a defense mechanism in which one makes excuses or convinces oneself that no wrong has been done through faulty reasoning” [16]

*

Distortion
“a defense mechanism in which one grossly reshapes reality to meet internal needs” [16]

*

Denial
“a defense mechanism in which one refuses to accept reality, because it’s too threatening” [16]

*

Confirmation Bias
“the tendency to search for, interpret, focus on, and remember information in a way that confirms one’s preconceptions” [18]

  • “If we’re offered a fact which goes against our beliefs or instincts, we’ll scrutinize it closely, and unless the evidence is overwhelming, we’ll refuse to believe it. On the other hand, if we’re offered a fact which is in accordance with our beliefs or instincts, we’ll accept it even on the slightest evidence.” [Bertrand Russell]
  • Experimenter’s Bias
    “the tendency for experimenters to believe and publish data that agree with their expectations, and to disbelieve and discard data that conflict with those expectations” [6]

*

Selective Memory (Selective Forgetting)
“the tendency to remember only what one wants to remember” [19]

*

Hindsight Bias
“the tendency to see past events as being predictable at the time those events happened” [6]

  • “All truths are easy to understand once they’re discovered; the point is to discover them.” [Galileo]
  • “Like so many ideas of major importance, it appears obvious once stated, but had never been obvious before.” [20]

*

In-Group Favoritism
“the tendency to favor members of one’s in-group (people with whom you share an identity) over out-group members (people with whom you don’t share an identity)” [21]

  • “It’s theorized that one of the key determinants of group biases is the need to improve self-esteem ― the desire to view oneself positively is transferred onto the group, creating a tendency to view one’s own group in a positive light, and by comparison, outside groups in a negative light.” [21]

*

Bias Blind Spot
“the tendency to see oneself as less biased than other people, or to be able to identify more cognitive biases in others than in oneself” [6]

  • This is kind of like the meta one.

*

Cognitive Dissonance

*

Intertwined with the self-serving bias is cognitive dissonance.

Cognitive dissonance is the mental discomfort experienced by a person who notices that they exhibit contradictory beliefs and/or behaviors. A person experiencing cognitive dissonance will try to find a way to resolve the contradiction to reduce their discomfort.

In essence, cognitive dissonance occurs when you realize that you’re being a hypocrite.

In practice, people usually reduce cognitive dissonance in one of the following ways. (For this example, imagine that the conflict is between your belief that you’re on a strict diet and your behavior of eating cake right now.)

  • change the behavior
    • ‘I won’t eat any more of this cake.’
  • change the belief
    • ‘It’s okay if I cheat on my diet every once in a while.’
  • add a new belief and/or behavior
    • ‘I’ll spend an extra hour at the gym tomorrow to balance it out.’
  • ignore or deny information that conflicts with the existing belief
    • ‘This cake isn’t even that unhealthy.’ [22]

[23]

*

“Cognitive dissonance is bothersome in any circumstance, but it’s especially painful when an important element of one’s self-image is threatened.

Self-justification describes how, when a person encounters a situation in which their behavior is inconsistent with their beliefs, they’ll tend to justify the behavior and deny any negative feedback associated with it. Self-justification often comes into play when discussing why individuals make ‘immoral’ decisions ― they may use rationalization to keep viewing themselves in a positive light.

There are two self-justification strategies:

  • External self-justification refers to the use of external excuses to justify one’s actions ― a displacement of personal responsibility, lack of self-control, or social pressures. External self-justification aims to diminish one’s responsibility for a behavior.
    • ‘I was really tired.’
    • ‘This is your fault!’
    • ‘I’m just having an off day.’
    • ‘They pressured me to do it!’
    • ‘I wasn’t sober when I did that.’
  • Internal self-justification refers to a change in the way one perceives their actions ― an attitude change, trivialization, or denial of the negative consequences. Internal self-justification helps make the negative outcomes more tolerable.
    • ‘They deserved it.’
    • ‘It’s a victimless crime.’
    • ‘The ends justify the means.’
    • ‘It’s not a big deal! Everybody does it!’
    • ‘What they don’t know won’t hurt them.’

Generally, when people can’t find external justification for their behavior, they attempt to find internal justification.” [24]

*

Sour Grapes: “Sometimes people pretend (or even successfully convince themselves) not to care about something they can’t have by disparaging it.” [Daniel Dennett]

*

“In one study, participants were asked to perform several extremely boring tasks. When they were finished, the participants were instructed to tell the next participant that the tasks were really interesting. Half the participants were paid $20 for lying, while the other half were paid just $1. Later on, all the participants were asked how enjoyable they really found the tasks, and the well-paid participants said that the tasks were in fact boring. In contrast, the poorly-paid participants claimed that the tasks were fairly interesting. What produces this odd pattern?

According to one theory, the well-paid liars knew why they’d mouthed sentiments they didn’t endorse. The poorly-paid liars, however, had experienced cognitive dissonance, thanks to the fact that they’d misled other people without good reason for this misdeed. In other words, they received insufficient justification for their action, making it underjustified behavior. Taken at face value, this made them look like casual and unprincipled liars, a view that conflicted with how they wanted to see themselves. How, therefore, could they reconcile their behavior with their self-image? One plausible solution was to reevaluate the boring task and convince themselves it was fun.

Closely related findings emerge in studies of people who make considerable sacrifices to achieve a goal. These people typically end up placing a very high value on the achievement and cognitive dissonance tells us why ― most of us would find it difficult to tolerate the idea that we worked hard for many years to achieve something trivial. To make our efforts seem sensible, therefore, our only choice is to value what we attained. Thus, goals will be valued more if they were harder won. This result helps to explain why many organizations have difficult or aversive entrance requirements. The hazing rituals prevalent in military units, sports teams, and fraternities/sororities often include demanding and/or humiliating tasks which lead the new member to increase the subjective value of the group.” [Gleitman, Reisberg, & Gross]

*

Retroactive Justification

*

[Pannenkoek] “Imagine you ask someone to help you do a chore, but they say no with a very weak excuse. For example, maybe you asked them to help you move furniture at your house, but they say that they can’t because their arm was feeling a little funny earlier. In this case, you’d rightly suspect that this isn’t the real reason they’re declining.

What the person probably did was start with their final decision already made, i.e. ‘I’m not going to move furniture.’ Then they searched their mind for all possible excuses they have, none of which were very good, and then gave you the best one they found. Obviously, this isn’t the correct way to make a decision. Instead, the correct way would be to first decide on a threshold for how good of an excuse they would need to have in order to justify not moving furniture. Then they should search their mind for all possible excuses they have, and determine whether any of them exceeds that threshold. Only then could they decline the offer in good faith.”
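The two procedures being contrasted can be sketched as a toy comparison. (The Excuse class and the numeric “strengths” here are invented purely for illustration; they aren’t from the quote.)

```python
from dataclasses import dataclass

@dataclass
class Excuse:
    reason: str
    strength: float  # 0 = flimsy, 1 = airtight

def best_excuse_first(excuses):
    # Backwards procedure: the 'no' is already decided; just surface
    # the strongest excuse on hand, however weak it is.
    return max(excuses, key=lambda e: e.strength)

def may_decline(excuses, threshold):
    # Good-faith procedure: fix the bar in advance, and decline only
    # if some excuse actually clears it.
    return any(e.strength >= threshold for e in excuses)

excuses = [Excuse("my arm felt a little funny earlier", 0.2),
           Excuse("I'm a bit tired", 0.3)]

best_excuse_first(excuses)   # always produces *something*, however weak
may_decline(excuses, 0.7)    # False: no excuse clears the bar
```

The backwards procedure always returns an excuse, which is why a weak one should make you suspicious: it signals that the decision came first and the justification second.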

*

“Moral judgment is like aesthetic judgment. When you see a painting, you usually know instantly and automatically whether you like it. If someone asks you to explain your judgment, you confabulate. You don’t really know why you think something is beautiful, but your interpreter module (your rational left-brain) is skilled at making up reasons. You search for a plausible reason for liking the painting, and you latch on to the first reason that makes sense ― maybe something vague about color or light.

Moral arguments are much the same: two people feel strongly about an issue, their feelings come first, and their reasons are invented on the fly, to throw at each other. When you refute a person’s argument, do they generally change their mind and agree with you? Of course not, because the argument you defeated wasn’t the cause of their position ― it was made up after the judgment was already made.” [10]

*

***

*

“Our psychological mechanisms have to deal simultaneously with the internal need for self-esteem and the constant flow of evidence from the outside affecting our self-image. The result is that information flows in a complex swirl between different levels of the personality in an attempt to reconcile what is, with what we wish were. The upshot is that the total picture of ‘who I am’ contains in each one of us a large number of unresolved, possibly unresolvable, inconsistencies. These undoubtedly provide much of the dynamic tension which is so much a part of being human.” [Hofstadter]

“Nothing is so difficult as not deceiving oneself.” [Wittgenstein]

The hardest thing to do is to see yourself objectively.

*

*

*

Citations

  1. Wikipedia’s List of Fallacies article
  2. Wikipedia’s Sunk Cost article
  3. Wikipedia’s Map-Territory Relation article
  4. Wikipedia’s Use-Mention Distinction article
  5. David Dunning, Justin Kruger, & Wikipedia’s Dunning-Kruger Effect article
  6. Wikipedia’s List of Cognitive Biases article
  7. Wikipedia’s Serial-Position Effect article
  8. Wikipedia’s Hyperbolic Discounting article
  9. Douglas Hofstadter & Wikipedia’s List of Eponymous Laws article
  10. Jonathan Haidt
  11. Wikipedia’s Self-Serving Bias article
  12. Alice Boyes & psychologytoday.com
  13. Wikipedia’s Egocentric Bias article
  14. Wikipedia’s Choice-Supportive Bias article
  15. (various sources)
  16. Wikipedia’s Defense Mechanism article
  17. Wikipedia’s Self-Handicapping article
  18. Wikipedia’s Confirmation Bias article
  19. (various sources)
  20. (source lost)
  21. Wikipedia’s In-Group Favoritism article
  22. Wikipedia’s Cognitive Dissonance article
  23. Elizabeth Turnbull
  24. Wikipedia’s Self-Justification article
