
Self Perception Biases

Self-perception biases are tendencies to let one's own dispositions distort how one interprets information; they are distortions of one's view of oneself.

  1. Bias Blind Spot - the tendency to recognize biases in others while remaining unaware of one's own. This is a case of the blind not knowing, or ignoring, that they are blind. (Pronin and Kugler, 2007)
  2. Illusion of Control - the belief that one has at least some control over events and outcomes that one actually has no effect on. The devoted fan who gets out his lucky hat that "always brings the game back whenever the Giants are down" is a good example of this bias. (Langer, 1975)
  3. Restraint Bias - overconfidence in one's own ability to resist temptation. This is a common bias because people like to believe they can handle whatever faces them and do not want to see themselves as having weak willpower. A Yorkie might fully believe they can become a vegetarian and even spend four days without eating any meat, but when they attend a Carnivores’ Club meeting and smell the mouthwatering aroma of bacon, they give in to the temptation they were so confident they would overcome.
  4. Self-Serving Bias - the tendency to claim credit for successes while deflecting responsibility for failures. This is mostly due to people attributing their successes to their own brilliance, but their errors to causes outside their control ||Cognitive Dissonance||. In Mr. Fink’s titration lab, a student is less likely to claim personal responsibility for the error that ends up skewing some of the results than for the quick thinking that enabled his group to salvage some meaningful data from the experiment.
  5. Overconfidence Effect - inappropriately high confidence in one’s own answers, opinions, or beliefs. These overestimations may be driven by a strong desire to succeed, or may simply reflect a general optimism. A famous example is a 1981 study in which 93% of the American drivers surveyed rated themselves in the top 50% for driving skill. (Svenson, 1981; Pohl, 2006)
  6. Egocentric Bias - the tendency to claim more responsibility for a group project than one actually holds. Egocentric bias could be observed if, for instance, any one person claimed to run Fall Fair when in reality, anyone who has taken part in Fall Fair knows it is an enormous team effort. (Ross and Sicoly, 1979)

Perception Biases

Perception biases are inaccurate views or conclusions drawn from how information is noticed and framed. They help explain differences in behavior, as well as why collective debates can produce so many different opinions.
  1. Attentional Bias - the tendency for one’s emotions to determine or affect one's focus. Emotional propaganda plays on this; for instance, certain charity commercials will show pictures of starving kids in Africa to draw attention away from the fact that only a fraction of the money donated actually goes to charitable causes.
  2. Availability Heuristic - basing judgements or estimations on what most easily comes to memory. Because we remember cases or events that stand out as unusual or unexpected, this often results in false assumptions or estimations. (Tversky and Kahneman, 1973) The availability heuristic is hypothesized to be to blame for the misconception that couples are more likely to conceive after they have adopted a child: people tend to remember all of the couples who conceive after adoption and to forget about all of the cases in which couples did not. A more York-oriented example is the common belief students seem to have that if their teacher doesn’t show up to class within the first 15 minutes, then they have a free period. This fits the availability heuristic because they most easily remember hearing of cases where other students did get away with this and enjoyed an unexpected free, rather than the more plentiful instances where the teacher showed up just in the nick of time and was angry at their attempt to desert class.
  3. Hindsight Bias - the “I-knew-it-all-along” bias: the tendency to believe you knew something all along when you truly did not. This also includes viewing completed events as more predictable than they actually were. (Pohl, 2006) Hindsight bias can easily be observed outside the science building, as Yorkies walking out of a math test ask one another what they got on Option A and frustratedly proclaim they knew that was what they were supposed to do, but for some reason didn't apply it at the time.
  4. Observer Expectancy Effect / Selective Perception - also known as the “observer effect”, this bias can very easily skew results in scientific experimentation, especially qualitative work. It is the tendency to unconsciously manipulate or misinterpret data so that it supports (or disproves) a hypothesis. Essentially, it is the tendency to see what you want or expect to see.
  5. Framing Effect - the tendency to interpret information differently based on changes in context or description. A Yorkie might exhibit this in the stress they put on studying for a chemistry quiz compared to a chemistry test. Even though Ms. Trachsel will explain that test and quiz scores are valued equally, and that this quiz will be the same length as an average test, you might still hear one Yorkie telling another, “It’s just a quiz,” implying that being a quiz somehow makes it less important, regardless of how many points it’s worth.
  6. Choice Supportive Bias - the propensity to remember your choices as better than they actually were. This tends to happen when an individual remembers only the positive aspects of the chosen option and only the negative aspects of the rejected options. For example, a second-semester senior who hasn’t taken any AP classes might justify his choice by concentrating on how much stress he would have now had he taken any AP classes, while not thinking about the benefits of passing the AP test and potentially earning college credit.

Logic and Decision Biases

Cognitive biases in logic and decisions are shown mostly through how people go about solving problems in different ways, make various choices, and judge different situations.
  1. Base Rate Fallacy - the inclination to base judgements on specific information while ignoring general statistical information, the "base rate" or big picture. An example could be a York senior who chooses a college for its strong chemistry program while ignoring bigger-picture factors such as its location in the middle of a desert.
  2. Zero-Risk Bias - the tendency to eliminate a small risk entirely rather than reduce the likelihood of a greater risk. An example could be a Yorkie who decides against joining the cross country team because the team runs on trails adjacent to areas that could contain unexploded ordnance. Rather than always choosing public transportation over driving a car, which would greatly reduce the risk of death in a transportation accident, the Yorkie eliminates a tiny chance of being blown to bits. This bias stems from a desire to reduce risk by proportion rather than by absolute amount. In other words, this Yorkie values a 100% risk decrease, from 0.1% to 0%, over a 66% risk decrease from, say, 3% to 1%, even though the latter removes twenty times as much absolute risk (2 percentage points versus 0.1).
  3. Anchoring - the inclination to allow one piece of information to outweigh all others when making a decision. An example might be a couple who consider the fact that their babysitter goes to Stanford more important than the side facts that she skips half her classes, rides a motorcycle, and brings her boyfriend along to babysitting jobs.
  4. Belief Bias - the tendency to overlook logical errors in an argument when its conclusion seems believable. For instance, people often buy into weight-loss commercials that promise you could lose 20 pounds, despite the illogical claim that you don’t have to diet and only have to use their method for 10 minutes every day for two weeks.
  5. Semmelweis Reflex - the reflex-like tendency to ignore or reject any new information that contradicts what one already believes. An example might be someone who continues to believe that high-fructose corn syrup is fine for their children despite solid research and facts disputing that misconception.
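The base rate fallacy above lends itself to a short worked calculation. The sketch below (a hypothetical illustration with made-up numbers, not drawn from the text) uses Bayes' theorem to show how badly intuition goes wrong when the base rate is ignored:

```python
# Hypothetical illustration of the base rate fallacy via Bayes' theorem.
# Suppose a disease affects 1% of people (the base rate), a test detects
# 90% of real cases, but it also wrongly flags 9% of healthy people.
# Ignoring the base rate, most people guess a positive test means a
# ~90% chance of disease.

def p_disease_given_positive(base_rate, sensitivity, false_positive_rate):
    """Probability of actually having the disease given a positive test."""
    true_positives = sensitivity * base_rate
    false_positives = false_positive_rate * (1 - base_rate)
    return true_positives / (true_positives + false_positives)

prob = p_disease_given_positive(0.01, 0.90, 0.09)
print(f"P(disease | positive test) = {prob:.1%}")  # about 9.2%
```

Because healthy people vastly outnumber sick ones, false positives swamp true positives, and the real probability is only around 9%, not 90%.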

Probability Biases

A probability bias arises when someone misinterprets precedents or past information and acts on this inaccuracy.
  1. Normalcy Bias - the assumption that because something has never happened before, it never will. This bias is best represented in the freshman class, as Yorkies who are used to coasting through classes believe that since they have never received a B before, it simply cannot or will not happen. This logical error based on previous experience will most usually throw the freshmen into shock. (Hsee and Zhang, 2004)
  2. Gambler’s Fallacy - the propensity to believe that independent happenings of the past determine what will happen in the future. Just as its name suggests, this is most commonly exemplified by gamblers, who mistakenly tend to think along the lines that since they lost their game the last 6 times, they have a much greater chance of winning this time, or the next time, or the time after that. (Hsee and Zhang, 2004)
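The gambler's fallacy can be checked with a quick simulation (a hypothetical sketch; the game and numbers are made up for illustration): in a fair game where each round is independent, the win rate immediately after a six-game losing streak is still about 50%.

```python
# Hypothetical sketch: after a losing streak in an independent fair game,
# the chance of winning the next round is unchanged.
import random

random.seed(42)  # fixed seed so the run is reproducible
flips = [random.random() < 0.5 for _ in range(200_000)]  # True = win

# Collect the outcome that follows every streak of 6 straight losses.
after_streak = []
streak = 0
for outcome in flips:
    if streak >= 6:
        after_streak.append(outcome)
    streak = 0 if outcome else streak + 1

win_rate = sum(after_streak) / len(after_streak)
print(f"win rate right after 6 losses: {win_rate:.2%}")  # hovers near 50%
```

Despite thousands of six-loss streaks in the sample, the follow-up win rate stays near 50%: past losses do not "store up" any extra chance of winning.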

Predictive Biases

Predictive biases usually involve the inaccurate belief that one already knows how events or people will turn out, based on broad generalities rather than specifics.

  1. Optimism Bias - the tendency to expect positive rather than negative outcomes from planned actions. People known as optimists tend to be the reassuring, confidence-boosting, Mrs. Sherry-type people who always encourage you to hope for the best.
  2. Pessimism Bias - the opposite of the optimism bias, this is the habit of anticipating negative outcomes rather than positive ones. Pessimists sometimes suffer from depression, and typically have less hope for the success of planned actions.
  3. Planning Fallacy - possibly due to deficiencies in the Prefrontal Cortex (||Cerebral Cortex||), this is the tendency to inaccurately predict the time necessary to complete a task. This can be observed in some York seniors taking AP Psych who underestimate how much time will be needed to complete their textbook-wiki assignment and therefore are up until 2am the night before an installment is due.
  4. Stereotyping - a bias in judgement, stereotyping is setting expectations for, or drawing conclusions about, an individual based on the group they are tied to. Racial, religious, and political stereotyping are most common, as one assumes that because someone looks, believes, or votes a certain way, she must be like the majority of others who share that affiliation.

Conformity Biases

Conformity biases are the most socially based cognitive biases that are exemplified by people young and old in instances varying from politics to surfing. For more on conformity, follow this link: Conformity
  1. Availability Cascade - a self-reinforcing process in which a belief seems ever more true the more widely it is repeated ("repeat something long enough and it becomes the truth"). This plays out differently for each individual; religious upbringing, for instance, leaves different people with concrete belief in opposing concepts.
  2. Ingroup Bias - the tendency to be more comfortable or friendly with people whom one perceives as similar to oneself, or as belonging to the same group. This most basically explains the “cliques” of a typical high school, as people with common interests gravitate to each other. (Garcia, Song and Tesser, 2010)
  3. Out-group Homogeneity Bias - also called homogeneity blindness, this is the tendency for people within a group to see their own group members as more varied and individualistic than the members of other groups. (Garcia, Song and Tesser, 2010)
  4. System Justification - the “go with the flow” tendency to adhere to precedents rather than establish something new or different. People tend, for example, to join an existing political party that roughly fits their beliefs and interests rather than found a new, more self-specific party. Yorkies are less subject to system justification than most people: anyone with a unique interest, such as surfing, who finds there is no pre-existing group to facilitate that interest will readily start a surf club. (Edwards, 1968)