
Thread: The 12 cognitive biases that prevent you from being rational

  1. #1
    yeves's Avatar
    Join Date
    May 2014
    TIM
    Si 6 spsx
    Posts
    1,359
    Mentioned
    40 Post(s)
    Tagged
    0 Thread(s)

    Default The 12 cognitive biases that prevent you from being rational

    Which of these biases are you most subject to?
    Which do you think aren't in your nature? Which have you been able to overcome?
    Which biases do you take note of from your discussions with others?

    http://io9.com/5974468/the-most-comm...being-rational

    The human brain is capable of 10^16 processes per second, which makes it far more powerful than any computer currently in existence. But that doesn't mean our brains don't have major limitations. The lowly calculator can do math thousands of times better than we can, and our memories are often less than useless — plus, we're subject to cognitive biases, those annoying glitches in our thinking that cause us to make questionable decisions and reach erroneous conclusions. Here are a dozen of the most common and pernicious cognitive biases that you need to know about.

    Before we start, it's important to distinguish between cognitive biases and logical fallacies. A logical fallacy is an error in logical argumentation (e.g. ad hominem attacks, slippery slopes, circular arguments, appeal to force, etc.). A cognitive bias, on the other hand, is a genuine deficiency or limitation in our thinking — a flaw in judgment that arises from errors of memory, social attribution, and miscalculations (such as statistical errors or a false sense of probability).

    Some social psychologists believe our cognitive biases help us process information more efficiently, especially in dangerous situations. Still, they lead us to make grave mistakes. We may be prone to such errors in judgment, but at least we can be aware of them. Here are some important ones to keep in mind.

    Confirmation Bias

    We love to agree with people who agree with us. It's why we only visit websites that express our political opinions, and why we mostly hang around people who hold similar views and tastes. We tend to be put off by individuals, groups, and news sources that make us feel uncomfortable or insecure about our views — the discomfort the social psychologist Leon Festinger called cognitive dissonance. It's this preferential mode of behavior that leads to the confirmation bias — the often unconscious act of referencing only those perspectives that fuel our pre-existing views, while at the same time ignoring or dismissing opinions — no matter how valid — that threaten our world view. And paradoxically, the internet has only made this tendency even worse.

    In-group Bias

    Somewhat similar to the confirmation bias is the in-group bias, a manifestation of our innate tribalistic tendencies. And strangely, much of this effect may have to do with oxytocin — the so-called "love molecule." This neurotransmitter, while helping us to forge tighter bonds with people in our in-group, performs the exact opposite function for those on the outside — it makes us suspicious, fearful, and even disdainful of others. Ultimately, the in-group bias causes us to overestimate the abilities and value of our immediate group at the expense of people we don't really know.

    Gambler's Fallacy

    It's called a fallacy, but it's more of a glitch in our thinking. We tend to put a tremendous amount of weight on previous events, believing that they'll somehow influence future outcomes. The classic example is coin-tossing. After flipping heads, say, five consecutive times, our inclination is to predict an increase in likelihood that the next coin toss will be tails — that the odds must surely now favor tails. But in reality, the odds are still 50/50. As statisticians say, the outcomes of different tosses are statistically independent, and the probability of either outcome is still 50%.
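
    The independence claim is easy to test for yourself. Here is a minimal simulation sketch (plain Python, standard library only; the function name and streak length are illustrative choices, not from the article) that estimates the probability of heads on the toss immediately following five heads in a row:

    import random

    def next_after_streak(trials=1_000_000, streak=5):
        # Estimate P(heads on a toss | the previous `streak` tosses were all heads).
        run = 0          # current run of consecutive heads
        follow_ups = 0   # tosses that immediately followed a full streak
        heads_after = 0  # how many of those follow-up tosses came up heads
        for _ in range(trials):
            toss = random.random() < 0.5   # True = heads, fair coin
            if run >= streak:
                follow_ups += 1
                heads_after += toss
            run = run + 1 if toss else 0
        return heads_after / follow_ups

    print(next_after_streak())   # hovers around 0.5 on every run

    However long the preceding streak, the estimate stays near 0.5, which is exactly what statistical independence means.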

    Relatedly, there's also the positive expectation bias — which often fuels gambling addictions. It's the sense that our luck has to change eventually and that good fortune is on the way. It also contributes to the "hot hand" misconception. And it's the same feeling we get when we start a new relationship that leads us to believe it will be better than the last one.

    Post-Purchase Rationalization

    Remember that time you bought something totally unnecessary, faulty, or overly expensive, and then rationalized the purchase to such an extent that you convinced yourself it was a great idea all along? Yeah, that's post-purchase rationalization in action — a kind of built-in mechanism that makes us feel better after we make crappy decisions, especially at the cash register. Also known as Buyer's Stockholm Syndrome, it's a way of subconsciously justifying our purchases — especially expensive ones. Social psychologists say it stems from the principle of commitment, our psychological desire to stay consistent and avoid a state of cognitive dissonance.

    Neglecting Probability

    Very few of us have a problem getting into a car and going for a drive, but many of us experience great trepidation about stepping inside an airplane and flying at 35,000 feet. Flying, quite obviously, is a wholly unnatural and seemingly hazardous activity. Yet virtually all of us know and acknowledge the fact that the probability of dying in an auto accident is significantly greater than getting killed in a plane crash — but our brains won't release us from this crystal-clear logic (statistically, we have a 1 in 84 chance of dying in a vehicular accident, as compared to a 1 in 5,000 chance of dying in a plane crash [other sources indicate odds as high as 1 in 20,000]). It's the same phenomenon that makes us worry about getting killed in an act of terrorism rather than about something far more probable, like falling down the stairs or accidental poisoning.

    This is what the social psychologist Cass Sunstein calls probability neglect — our inability to properly grasp peril and risk — which often leads us to overstate the risks of relatively harmless activities while understating the risks of genuinely dangerous ones.
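
    Taking the article's own numbers at face value, the size of the mismatch is a one-line calculation (the odds are the ones quoted above, and they vary considerably by source):

    car_odds = 1 / 84       # quoted odds of dying in a vehicle accident
    plane_odds = 1 / 5_000  # quoted odds of dying in a plane crash

    print(car_odds / plane_odds)  # ~59.5: driving is roughly 60x riskier by these figures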

    Observational Selection Bias

    This is the effect of suddenly noticing things we didn't notice much before — and wrongly assuming that their frequency has increased. A perfect example is what happens after we buy a new car: we inexplicably start to see the same car virtually everywhere. A similar effect happens to pregnant women, who suddenly notice a lot of other pregnant women around them. Or it could be a unique number or song. It's not that these things are appearing more frequently, it's that we've (for whatever reason) selected the item in our mind and, in turn, are noticing it more often. Trouble is, most people don't recognize this as a selection bias, and actually believe these items or events are happening with increased frequency — which can be a very disconcerting feeling. It's also a cognitive bias that contributes to the feeling that the appearance of certain things or events couldn't possibly be a coincidence (even though it is).

    Status-Quo Bias

    We humans tend to be apprehensive of change, which often leads us to make choices that guarantee that things remain the same, or change as little as possible. Needless to say, this has ramifications in everything from politics to economics. We like to stick to our routines, political parties, and our favorite meals at restaurants. Part of the perniciousness of this bias is the unwarranted assumption that another choice will be inferior or make things worse. The status-quo bias can be summed up in the saying, "If it ain't broke, don't fix it" — an adage that fuels our conservative tendencies. And in fact, some commentators say this is why the U.S. hasn't been able to enact universal health care, despite the fact that most individuals support the idea of reform.

    Negativity Bias

    People tend to pay more attention to bad news — and it's not just because we're morbid. Social scientists theorize that it's on account of our selective attention and that, given the choice, we perceive negative news as being more important or profound. We also tend to give more credibility to bad news, perhaps because we're suspicious of (or bored by) proclamations to the contrary. In evolutionary terms, heeding bad news may be more adaptive than ignoring good news (e.g. "saber-toothed tigers suck" vs. "this berry tastes good"). Today, we run the risk of dwelling on negativity at the expense of genuinely good news. Steven Pinker, in his book The Better Angels of Our Nature: Why Violence Has Declined, argues that crime, violence, war, and other injustices are steadily declining, yet most people would argue that things are getting worse — a perfect example of the negativity bias at work.

    Bandwagon Effect

    Though we're often unconscious of it, we love to go with the flow of the crowd. When the masses start to pick a winner or a favorite, that's when our individualized brains start to shut down and enter into a kind of "groupthink" or hivemind mentality. But it doesn't have to be a large crowd or the whims of an entire nation; it can include small groups, like a family or even a small group of office co-workers. The bandwagon effect is what often causes behaviors, social norms, and memes to propagate among groups of individuals — regardless of the evidence or motives in support. This is why opinion polls are often maligned, as they can steer the perspectives of individuals accordingly. Much of this bias has to do with our built-in desire to fit in and conform, as famously demonstrated by the Asch Conformity Experiments.

    Projection Bias

    As individuals trapped inside our own minds 24/7, it's often difficult for us to project outside the bounds of our own consciousness and preferences. We tend to assume that most people think just like us — though there may be no justification for it. This cognitive shortcoming often leads to a related effect known as the false consensus bias, where we tend to believe that people not only think like us, but also agree with us. It's a bias where we overestimate how typical and normal we are, and assume that a consensus exists on matters where there may be none. It can also create the effect where members of a radical or fringe group assume that more people on the outside agree with them than is actually the case. And it's behind the exaggerated confidence we have when predicting the winner of an election or a sports match.

    The Current Moment Bias

    We humans have a really hard time imagining ourselves in the future and altering our behaviors and expectations accordingly. Most of us would rather experience pleasure in the current moment, while leaving the pain for later. This is a bias that is of particular concern to economists (i.e. our tendency to overspend rather than save) and health practitioners. Indeed, a 1998 study showed that, when making food choices for the coming week, 74% of participants chose fruit. But when the food choice was for the current day, 70% chose chocolate.

    Anchoring Effect

    Also known as the relativity trap, this is the tendency we have to compare and contrast only a limited set of items. It's called the anchoring effect because we tend to fixate on a value or number that in turn gets compared to everything else. The classic example is an item at the store that's on sale; we tend to see (and value) the difference in price, but not the overall price itself. This is why some restaurant menus feature very expensive entrees, while also including more (apparently) reasonably priced ones. It's also why, when given a choice, we tend to pick the middle option — not too expensive, and not too cheap.

  2. #2
    Haikus
    Join Date
    Jan 2014
    Location
    Berlin
    TIM
    LSI 5w6 sx/so
    Posts
    5,402
    Mentioned
    144 Post(s)
    Tagged
    1 Thread(s)

    Default

    Succinct and superficial answer: personally I'm most prone to Post-Purchase Rationalization, Current Moment Bias, and Observational Selection Bias. I'm usually immune to groupthink/bandwagon stuff, uninterested in the status quo, and quite alert when it comes to projection risk.

    I appreciate your list ... but you may be asking too much of people.
    Last edited by Amber; 12-01-2014 at 11:50 PM.

  3. #3
    Radio's Avatar
    Join Date
    Aug 2011
    Posts
    2,571
    Mentioned
    154 Post(s)
    Tagged
    0 Thread(s)

    Default

    You would be lying if you said you don't fall victim to all of these pitfalls at least some of the time.

  4. #4
    Feeling fucking fantastic golden's Avatar
    Join Date
    Sep 2010
    Location
    Second story
    TIM
    EIE
    Posts
    3,724
    Mentioned
    250 Post(s)
    Tagged
    0 Thread(s)

    Default

    Here's the ultimate, most amusing cognitive bias, which I call to mind whenever I consider this subject:

    The bias blind spot is the cognitive bias of recognizing the impact of biases on the judgement of others, while failing to see the impact of biases on one's own judgement. The term was created by Emily Pronin, a social psychologist from Princeton University's Department of Psychology, with colleagues Daniel Lin and Lee Ross. The bias blind spot is named after the visual blind spot.

    CAUSES OF BIAS BLINDNESS
    The bias blind spot may be caused by a variety of other biases and self-deceptions.

    Self-enhancement biases may play a role, in that people are motivated to view themselves in a positive light. Biases are generally seen as undesirable, so people tend to think of their own perceptions and judgments as being rational, accurate, and free of bias. The self-enhancement bias also applies when analyzing our own decisions, in that people are likely to think of themselves as better decision makers than others.

    People also tend to believe they are aware of ‘how’ and ‘why’ they make their decisions, and therefore conclude that bias did not play a role. Many of our decisions are formed from biases and cognitive shortcuts, which are unconscious processes. By definition, people are unaware of unconscious processes, and therefore cannot see their influence in the decision making process.

    Research has shown that even when we are made aware of various biases acting on our perceptions, decisions, or judgments, we are still unable to control them. This contributes to the bias blind spot: even when people are told that they are biased, they are unable to alter their biased perception.

    ROLE OF INTROSPECTION
    Emily Pronin and Matthew Kugler have argued that this phenomenon is due to the introspection illusion. In their experiments, subjects had to make judgments about themselves and about other subjects. They displayed standard biases, for example rating themselves above the others on desirable qualities (demonstrating illusory superiority). The experimenters explained cognitive bias, and asked the subjects how it might have affected their judgment. The subjects rated themselves as less susceptible to bias than others in the experiment (confirming the bias blind spot). When they had to explain their judgments, they used different strategies for assessing their own and others' bias.

    Pronin and Kugler's interpretation is that, when people decide whether someone else is biased, they use overt behaviour. On the other hand, when assessing whether or not they themselves are biased, people look inward, searching their own thoughts and feelings for biased motives. Since biases operate unconsciously, these introspections are not informative, but people wrongly treat them as a reliable indication that they themselves, unlike other people, are immune to bias.

    Pronin and Kugler tried to give their subjects access to others' introspections. To do this, they made audio recordings of subjects who had been told to say whatever came into their heads as they decided whether their answer to a previous question might have been affected by bias. Although subjects persuaded themselves they were unlikely to be biased, their introspective reports did not sway the assessments of observers.

    DIFFERENCES OF PERCEPTIONS
    People tend to attribute bias in an uneven way. When people reach different perceptions from each other, they each tend to label the other person as biased, and themselves as being accurate and un-biased. Pronin hypothesizes that this bias misattribution may be a source of conflict and misunderstanding between people. For example, in labeling another person as biased, one may also label their intentions cynically. But when examining one’s own cognitions, people judge themselves based on their good intentions. It is likely that in this case, one may attribute another’s bias to “intentional malice” rather than an unconscious process.

    Pronin also hypothesizes ways to use awareness of the bias blind spot to reduce conflict, and to think in a more "scientifically informed" way. Although we are unable to control the biases acting on our own cognition, we can keep in mind that biases are acting on everyone. Pronin suggests that people might use this knowledge to separate others' intentions from their actions.

  5. #5
    Glorious Member mu4's Avatar
    Join Date
    Aug 2007
    Location
    Mind
    Posts
    8,174
    Mentioned
    760 Post(s)
    Tagged
    3 Thread(s)

    Default

    Most of these biases come from a sense of agency, or a sense of control over the result. Negative/positive/whatever.

    Rationality is often undesired; it's only when rationality serves some desire or agency that people choose it, though some people do have a desire for rationality itself. A lot of what supposedly rational people do is make a host of rational choices so that they can make select irrational ones.

    Reason is a tool, often used in service of some passion, which we act upon with our agency; that's about as close to freedom as I see existing in the world. Pure reason and pure submission to all passions and desires are both forms of bondage. It is only when reason is directed truly toward some more noble passion and desire, and when we act upon these with our capabilities and might, that there can be a thing one might call freedom. So the search for the highest good, the most noble knowledge, and perfect understanding, and many such passions, are reasonable and free, and, when acted upon with our agency, even commendable to those who observe such curious irrationality.

  6. #6
    Pookie's Avatar
    Join Date
    May 2010
    TIM
    IEI-Ni 6w5-9-2 So/Sx
    Posts
    2,372
    Mentioned
    112 Post(s)
    Tagged
    0 Thread(s)

    Default

    "It's called a fallacy, but it's more a glitch in our thinking. We tend to put a tremendous amount of weight on previous events, believing that they'll somehow influence future outcomes. The classic example is coin-tossing. After flipping heads, say, five consecutive times, our inclination is to predict an increase in likelihood that the next coin toss will be tails — that the odds must certainly be in the favor of heads. But in reality, the odds are still 50/50. As statisticians say, the outcomes in different tosses are statistically independent and the probability of any outcome is still 50%."

    Well, over the course of a thousand flips, it is more likely to have tails, as 50/50 chances always average out over a big enough field. So, yeah, odds-wise I disagree with this conclusion. You'll streak tails to a degree that it evens out eventually.
    Projection is ordinary. Person A projects at person B, hoping to validate something about person A by the response of person B. However, person B, not wanting to be an object of someone else's ego, and guarding against existential terror, constructs a personality which protects his ego and maintains a certain sense of a robust and real self that is different and separate from person A. Sadly, this robust and real self, cut off by defenses of character from the rest of the world, is quite vulnerable and fragile, given that it is imaginary and propped up through external feedback. Person B is dimly aware of this and defends against it all the more, even desperately projecting his anxieties back onto person A, with the hope of shoring up his ego with salubrious validation. All of this happens without A or B acknowledging it, of course. Because to face up to it consciously is shocking, in that this is all anybody is doing or can do, and it seems absurd when you realize how pathetic it is.

  7. #7
    Feeling fucking fantastic golden's Avatar
    Join Date
    Sep 2010
    Location
    Second story
    TIM
    EIE
    Posts
    3,724
    Mentioned
    250 Post(s)
    Tagged
    0 Thread(s)

    Default

    Quote Originally Posted by Pookie View Post
    "It's called a fallacy, but it's more a glitch in our thinking. We tend to put a tremendous amount of weight on previous events, believing that they'll somehow influence future outcomes. The classic example is coin-tossing. After flipping heads, say, five consecutive times, our inclination is to predict an increase in likelihood that the next coin toss will be tails — that the odds must certainly be in the favor of heads. But in reality, the odds are still 50/50. As statisticians say, the outcomes in different tosses are statistically independent and the probability of any outcome is still 50%."

    Well, over the course of a thousand flips, it is more likely to have tails, as 50/50 chances always average out over a big enough field. So, yeah, odds-wise I disagree with this conclusion. You'll streak tails to a degree that it evens out eventually.
    Did you ever watch / read "Rosencrantz and Guildenstern Are Dead"?

    (From about 45 seconds in):


  8. #8
    escaping anndelise's Avatar
    Join Date
    Jan 2006
    Location
    WA
    TIM
    IEE 649 sx/sp cp
    Posts
    6,359
    Mentioned
    215 Post(s)
    Tagged
    0 Thread(s)

    Default


    From a thread I started in 2007 titled
    Top 12 Cognitive Biases regularly seen on this Forum

    (it was actually kind of difficult narrowing it down to the top 12)


    1. not compensating for one's own cognitive biases

    2. seeing patterns where actually none exist

    3. viewing one's self as relatively variable in terms of personality, behavior and mood
    while viewing others as much more predictable

    4. describing "aspects of a type" which are in fact vague and general enough to apply to other types

    5. giving high accuracy ratings to descriptions of one's personality that supposedly are tailored specifically for them,
    but are in fact vague and general enough to apply to a wide range of people

    6. over-emphasizing personality influences on behavior
    while under-emphasizing situational influences on the same behavior

    7. relying too heavily, or "anchoring," on one trait or piece of information when typing/describing

    8. failing to incorporate prior known probabilities which are pertinent to the decision at hand (a worked sketch follows below)

    9. searching for or interpreting information in a way that confirms one's preconceptions and/or expectations

    10. extending critical scrutiny to information which contradicts one's prior beliefs
    and accepting uncritically information that is congruent with one's prior beliefs

    11. overestimating one's ability to know others
    while over- or under-estimating others' ability to know oneself

    12. weighing recent events more than earlier events

    Note: as you can see I took out the names as I wanted the focus to be on the actions rather than the terms.
    If you wish to look up the terms, or to seek out what other cognitive biases are available: http://en.wikipedia.org/wiki/List_of_cognitive_biases

    Note 2: the majority of these are not in my own words but come from the linked site.
    Some slight alterations may have been made for context and readability.
    In other words, I claim nothing for myself other than that it seems to me these are the most obvious and significant 12.
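
    Number 8 is the most quantifiable of the twelve, so here is a minimal worked sketch of what incorporating a prior probability actually looks like (Bayes' rule in plain Python, with made-up illustrative numbers, not taken from the linked list):

    # A test that is 90% accurate for a condition only 1% of people have.
    prior = 0.01           # P(condition): the base rate
    sensitivity = 0.90     # P(positive | condition)
    false_positive = 0.10  # P(positive | no condition)

    p_positive = sensitivity * prior + false_positive * (1 - prior)
    posterior = sensitivity * prior / p_positive  # Bayes' rule

    print(round(posterior, 3))  # 0.083: even after a positive result, odds are under 10%

    Ignoring the 1% base rate is exactly what item 8 describes: the intuitive answer is 90%, the correct one is about 8%.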
    IEE 649 sx/sp cp

  9. #9
    escaping anndelise's Avatar
    Join Date
    Jan 2006
    Location
    WA
    TIM
    IEE 649 sx/sp cp
    Posts
    6,359
    Mentioned
    215 Post(s)
    Tagged
    0 Thread(s)

    Default

    Quote Originally Posted by GOLDEN View Post
    Did you ever watch / read "Rosencrantz and Guildenstern Are Dead"?
    I watched that clip. I kept expecting them to be facing death at some point where the flip of the coin would spell their doom, the answer being the opposite of whatever they chose.

    This also reminds me of that game-show probability thing where some people argued that it was theorized and maybe "proven" that a contestant had a higher chance of winning if they switched their choice. Example: choices A, B, and C, and they choose A. C is opened up and shown not to be the winner. So the contestant now has to choose between A and B, and supposedly the contestant has a higher chance (66.6%?) of winning if they switch their choice to B. Other people (incl. me) argued that regardless of the previous choices, the last choice was a 50/50 chance of being right.

    Edited to add: found a link to the Monty Hall problem
    http://en.m.wikipedia.org/wiki/Monty_Hall_problem

    And the discussion on this forum about it: http://www.the16types.info/vbulletin...y-Hall-Problem
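
    For anyone who would rather test it than argue it, here is a minimal simulation sketch of the setup (plain Python, standard library only; the function and variable names are just illustrative):

    import random

    def monty_hall(trials=100_000):
        # Prize behind one of three doors; host always opens a non-pick, non-prize door.
        stay_wins = switch_wins = 0
        for _ in range(trials):
            prize = random.randrange(3)
            pick = random.randrange(3)
            opened = next(d for d in range(3) if d != pick and d != prize)
            switched = next(d for d in range(3) if d != pick and d != opened)
            stay_wins += (pick == prize)
            switch_wins += (switched == prize)
        return stay_wins / trials, switch_wins / trials

    print(monty_hall())  # roughly (0.333, 0.667): switching wins about twice as often

    The host's forced reveal is what breaks the 50/50 intuition: he can never open the prize door, so his choice leaks information about where it is.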
    Obviously I suck at accepting the theory.
    Last edited by anndelise; 12-02-2014 at 09:07 PM.
    IEE 649 sx/sp cp

  10. #10
    Ti centric krieger's Avatar
    Join Date
    Sep 2006
    Posts
    5,937
    Mentioned
    80 Post(s)
    Tagged
    0 Thread(s)

    Default

    i gained immunity from these at an early age. it isn't that hard for people above IQ 120. i read these as guides to how stupid people think.

  11. #11
    Banned
    Join Date
    Aug 2010
    TIM
    SLE/LSE sx/sp
    Posts
    2,470
    Mentioned
    76 Post(s)
    Tagged
    1 Thread(s)

    Default

    All of them
