Public and semi-public catalogues of cognitive biases frequently claim there are hundreds of them. What is less widely appreciated is that these catalogues often contain a mix of 1) cognitive phenomena with well-studied effects, 2) broad labels that different researchers measure in different ways, and 3) vague or overlapping terms that can mean different things and are hard to verify. Whether this is a problem depends on what you want a catalogue for. Here’s a look at how cognitive bias lists are built and how to pick one that suits your purpose.
Let’s Take a Look at Two Cognitive Bias Catalogues
One way to understand how cognitive bias lists are built is to look at what their builders tell you about their methodology. For instance, Soprano et al. (2024) conducted a review of the relevant literature, from which they manually derived a list of 221 cognitive biases that might affect humans when analyzing the accuracy of information. More specifically, they systematically searched for cognitive biases using sources such as Wikipedia, the applied-research firm The Decision Lab, and prior academic work.
They then searched for supporting studies on each bias in six academic databases (Google Scholar, Scopus, PubMed, Wiley, ACL Anthology, and DBLP), using each bias name as the search query. From there, two researchers independently went through all 221 biases and judged which ones could realistically show up during fact-checking, backing each judgment with an example; they then compared notes, resolved disagreements, and had a third researcher review the final agreed-upon list. The result was a narrowed-down set of 39 biases most relevant to fact-checking, which were then categorized and paired with 11 practical countermeasures to help reduce their influence.
So the final list came to 39 because they were looking specifically for biases likely to come up during fact-checking. Is the list they came up with a good one? Well…only sort of. The problem is that deciding which of the 221 biases apply to fact-checking was subjective, because it relied on the researchers imagining a plausible scenario for each one. For 16 of the 39 final biases, they couldn’t find any direct fact-checking literature to back them up, so those inclusions were essentially educated guesses. The study was also entirely theoretical: no experiments were run, no fact-checkers were observed, and no data was collected, so there is no empirical evidence that these biases actually manifest in practice.
If 221 sounds like too much, perhaps you’ll be more comfortable with the Cognitive Bias Codex (the Buster Benson dataset), which contains 188 bias names. The story of this catalogue goes as follows: Buster Benson spent years referencing Wikipedia’s list of cognitive biases and found it a useful but tangled reference. His goal was essentially to reorganize and make sense of what Wikipedia had already compiled, not to conduct original research. The Codex, visualized with John Manoogian III, arranges the biases in a radial diagram divided into four quadrants, grouping them by the type of cognitive problem they address.
Is the Cognitive Bias Codex useful? Yes, often. It works well as a navigational and educational tool: its four-category framework (too much information, not enough meaning, need to act fast, what to remember) gives you an intuitive mental model for why biases exist at all. Many of the biases it lists are also confirmed by reproducible research.
Is the Cognitive Bias Codex a fully reliable list of biases? No. It was built from a Wikipedia list, which, while relatively reliable, is not an academic source. Moreover, some entries (e.g., confirmation bias and anchoring) are backed by decades of research, while others are ‘validated’ by a single study or within a limited cultural context. Yet visually, the Codex presents every item with equal authority.
Wondering what happens if we combine the two lists?
List of 251 Cognitive Biases and Their Definitions
Here’s a table listing the cognitive biases (and their definitions) included in one or both of the lists (i.e., the one from Soprano et al. and the one from the Cognitive Bias Codex). Note that the two lists sometimes name or define the same bias differently, which may affect the accuracy of the overlap comparison.
| Item Number | Bias Name | Definition | Appears in Both Catalogues? |
| --- | --- | --- | --- |
| #1 | Absent-mindedness | Forgetfulness that happens because your attention was elsewhere at the moment you needed the information. | No |
| #2 | Action Bias | A preference for doing something (any action) over doing nothing, even when inaction would be wiser. | No |
| #3 | Actor-observer Bias | A tendency to explain other people’s actions as personality-driven while explaining our own actions as situation-driven. | Yes |
| #4 | Additive Bias | A tendency to solve problems by adding features, steps, or resources instead of considering what to remove. | No |
| #5 | Affect Heuristic | Using your immediate feelings (like/dislike, fear/comfort) as a shortcut for judging risks and benefits. | No |
| #6 | Agent Detection Bias | A tendency to see intentional agents or purposeful causes behind events that may be random or natural. | No |
| #7 | Ambiguity Effect | A tendency to avoid choices when the odds are unclear, even if the expected payoff could be good. | Yes |
| #8 | Anchoring Effect | Letting the first number or piece of information you hear overly shape your later estimates or decisions. | Yes |
| #9 | Anecdotal Fallacy | Treating a vivid personal story as decisive evidence, while downplaying broader data or statistics. | No |
| #10 | Anthropocentric Thinking | A habit of interpreting the world as centered on humans, overgeneralizing human needs or traits as the default. | No |
| #11 | Anthropomorphism | Attributing human emotions, intentions, or personalities to animals, objects, or systems that don’t have them. | Yes |
| #12 | Apophenia | Seeing meaningful patterns or connections in random or unrelated information. | No |
| #13 | Appeal to novelty | A fallacy that assumes something is better or true simply because it is new. | No |
| #14 | Appeal to probability fallacy | A fallacy that treats something as true just because it might be true or seems likely to be true. | No |
| #15 | Argument from fallacy | A fallacy that concludes a claim must be false (or true) just because a particular argument for (or against) it is flawed. | No |
| #16 | Association Fallacy | Judging a person, idea, or claim based on what it is associated with rather than on its own merits. | No |
| #17 | Assumed Similarity Bias | Assuming other people think, feel, or behave more like you than they actually do. | No |
| #18 | Attentional Bias | A tendency for attention to get pulled toward certain cues (e.g., threat, novelty), shaping what you notice and remember. | Yes |
| #19 | Attribute Substitution | Answering a hard question by unconsciously swapping in an easier, related question. | No |
| #20 | Attribution Bias | Recurring mistakes in how we assign causes to events or behavior (e.g., over-crediting traits or under-crediting situations). | No |
| #21 | Authority Bias | Giving undue weight to an authority figure’s opinion, regardless of the evidence. | Yes |
| #22 | Automation Bias | Over-trusting automated recommendations and overlooking signs that the system may be wrong. | Yes |
| #23 | Availability Bias | Overestimating how common or likely something is because examples come to mind easily. | No |
| #24 | Availability Cascade | A belief seems truer as it becomes more talked about and repeated in public conversation. | No |
| #25 | Availability Heuristic | Estimating frequency or probability by how easily you can recall examples, rather than by the real base rate. | Yes |
| #26 | Backfire Effect | Corrections or counterevidence can sometimes make a mistaken belief feel even more certain. | Yes |
| #27 | Bandwagon Effect | Adopting beliefs or behaviors mainly because many other people appear to hold them. | Yes |
| #28 | Barnum Effect (or Forer Effect) | Accepting a vague, general personality description as uniquely accurate for you. | Yes |
| #29 | Base Rate Fallacy | Ignoring how common something is in the population (the base rate) when judging how likely it is in a specific case. | Yes |
| #30 | Belief Bias | Judging an argument by whether you like its conclusion rather than by whether the reasoning is valid. | Yes |
| #31 | Ben Franklin Effect | Tending to like someone more after you’ve done them a favor, because you infer they must be worth helping. | No |
| #32 | Berkson’s Paradox | A selection effect where choosing participants based on certain traits creates a misleading relationship between those traits. | No |
| #33 | Bias Blind Spot | Spotting biases in other people more readily than in yourself. | Yes |
| #34 | Bizarreness Effect | Unusual or bizarre details are more memorable than ordinary ones. | Yes |
| #35 | Boundary Extension | Remembering a scene as extending beyond what you actually saw, as if your mind ‘zoomed out’ in memory. | No |
| #36 | Cheerleader Effect | People can seem more attractive when seen as part of a group than when seen alone. | Yes |
| #37 | Childhood Amnesia | The common inability to recall events from the earliest years of childhood. | No |
| #38 | Choice-supportive Bias | Remembering your past choices as smarter than they were, and recalling rejected options as worse than they were. | Yes |
| #39 | Clustering Illusion | Seeing ‘streaks’ or clusters in random data and assuming they must be meaningful. | No |
| #40 | Cognitive Dissonance | The discomfort of holding conflicting beliefs or acting against your values, often followed by rationalizing to reduce tension. | No |
| #41 | Commission Bias | A preference for errors of action over errors of inaction, often driven by fear of regret for not acting. | No |
| #42 | Compassion Fade | Feeling less empathy as the number of people in need grows, even when the total suffering is larger. | No |
| #43 | Confabulation | Filling gaps in memory with a plausible story, without realizing the details are invented. | No |
| #44 | Confirmation Bias | Favoring information that supports what you already believe and discounting what challenges it. | Yes |
| #45 | Conformity | Adjusting your views or behavior to match a group’s norms or expectations. | No |
| #46 | Congruence Bias | Testing ideas mainly by looking for confirming evidence instead of trying to disprove them. | Yes |
| #47 | Conjunction Fallacy (or Linda Problem) | Mistakenly believing a specific combined scenario is more likely than a broader, more general scenario. | Yes |
| #48 | Conservatism Bias (or Regressive Bias) | Updating beliefs too slowly when given new evidence, so initial beliefs remain overly influential. | Yes |
| #49 | Consistency Bias | Misremembering your past attitudes and choices as more stable and consistent than they really were. | Yes |
| #50 | Context Effect | Your preference changes depending on the surrounding options, comparisons, or how a choice set is constructed. | Yes |
| #51 | Continued Influence Effect | Even after a claim is corrected, it can keep influencing judgments because the original misinformation sticks. | Yes |
| #52 | Contrast Effect | Judging something based on how it compares to what you just experienced, rather than on its absolute value. | Yes |
| #53 | Courtesy Bias | Giving answers you think are polite or expected, rather than what you truly think or experienced. | No |
| #54 | Cross-race Effect | Being better at recognizing and distinguishing faces from your own racial group than from other groups. | Yes |
| #55 | Cryptomnesia | A memory error where you remember an idea but forget where it came from, so it feels like your own original thought. | Yes |
| #56 | Cue-dependent Forgetting | Recalling information better when the cues at retrieval match the cues present when you learned it. | No |
| #57 | Curse of Knowledge | Struggling to imagine what it’s like not to know something you know, which leads to explanations that are too advanced. | Yes |
| #58 | Declinism | A tendency to believe society or the world is in decline compared with an idealized past. | Yes |
| #59 | Decoy Effect | Adding a clearly inferior ‘decoy’ option can shift preference between two main options. | Yes |
| #60 | Default Effect | A tendency to stick with the pre-selected option, even when an alternative might be better. | No |
| #61 | Defensive Attribution Hypothesis | Assigning blame in a way that protects you emotionally—often blaming more when you could picture yourself as the victim. | Yes |
| #62 | Delmore Effect | Spending more time and detail on low-priority plans than high-priority ones, even when the stakes are reversed. | No |
| #63 | Denomination Effect | Spending money more freely when it is broken into smaller denominations than when it is one larger bill. | Yes |
| #64 | Disposition Effect | In investing, selling assets that went up too early and holding assets that went down too long. | Yes |
| #65 | Distinction Bias | Perceiving options as more different when you compare them side-by-side than when you evaluate them one at a time. | Yes |
| #66 | Dread Aversion | A tendency to avoid or overreact to options that evoke dread, because anticipated pain looms larger than anticipated pleasure. | No |
| #67 | Dunning-Kruger Effect | People with low skill often overrate their ability, while people with high skill may underrate theirs. | Yes |
| #68 | Duration Neglect | Judging an experience by its most intense and final moments while largely ignoring how long it lasted. | Yes |
| #69 | Effort Justification | Valuing an outcome more because you worked hard for it, which helps justify the effort you spent. | Yes |
| #70 | Egocentric Bias | Over-focusing on yourself—such as overestimating your role in outcomes or how much others notice you. | Yes |
| #71 | End-of-history Illusion | Believing you will change very little in the future, even though your past self has changed a lot over time. | No |
| #72 | Endowment Effect | Valuing something more simply because you own it. | Yes |
| #73 | Escalation of Commitment (or Irrational Escalation, or Sunk Cost Fallacy) | Continuing to invest time, money, or effort in a failing course of action because you’ve already invested so much. | Yes |
| #74 | Essentialism | Assuming categories (like ‘genius’ or ‘criminal’) have an underlying essence that makes members what they are. | No |
| #75 | Euphoric Recall | Remembering a past experience as more positive than it actually was at the time. | No |
| #76 | Exaggerated Expectation | Expecting outcomes to be more extreme than they usually are, so reality feels milder than your prediction. | No |
| #77 | Experimenter’s Bias (or Expectation Bias) | Research results can be skewed because the experimenter’s expectations subtly influence methods, participants, or interpretation. | Yes |
| #78 | Extension Neglect | Failing to adequately consider the size or extent of something (how many, how much) when evaluating its importance or value. | No |
| #79 | Extrinsic Incentives Bias | Over- or misjudging how external rewards and incentives affect motivation, especially when interpreting other people’s behavior. | Yes |
| #80 | Fading Affect Bias | In memory, negative feelings fade faster than positive ones, making the past seem nicer over time. | Yes |
| #81 | Fallacy of Composition | A fallacy that assumes what is true of the parts must be true of the whole. | No |
| #82 | Fallacy of Division | A fallacy that assumes what is true of the whole must be true of each part. | No |
| #83 | False Consensus Effect | Overestimating how many other people share your opinions, habits, or preferences. | Yes |
| #84 | False Memory | Remembering events that didn’t happen, or recalling real events in a distorted way. | Yes |
| #85 | False Uniqueness Bias | Underestimating how common your positive traits or good behaviors are, so you feel unusually unique. | No |
| #86 | Focusing Effect | Overweighting one standout factor when predicting outcomes like happiness, satisfaction, or success. | No |
| #87 | Form Function Attribution Bias | Assuming an object’s form implies its intended function or meaning beyond the evidence you actually have. | No |
| #88 | Frequency Illusion (or Baader-Meinhof Phenomenon) | After you notice something once, you start seeing it everywhere and mistakenly think it has suddenly become much more common. | Yes |
| #89 | Functional Fixedness | Difficulty seeing alternative uses for an object beyond its typical or ‘intended’ purpose. | No |
| #90 | Fundamental Attribution Error | Explaining others’ behavior as personality-driven while underestimating situational forces affecting them. | Yes |
| #91 | Gambler’s Fallacy | Believing a random process should ‘even out’ soon, so a streak makes the opposite outcome feel due. | Yes |
| #92 | Gender Bias | Letting gender stereotypes shape expectations, judgments, or decisions about people’s abilities or roles. | No |
| #93 | Generation Effect (or Self-generation Effect) | Remembering information better when you generate or produce it yourself rather than just reading it. | Yes |
| #94 | Google Effect | Forgetting information more readily when you expect you can easily look it up later (e.g., online). | Yes |
| #95 | Group Attribution Error | Attributing a person’s behavior to their group (or a group’s behavior to its members) in an overly simplistic way. | Yes |
| #96 | Groupshift | Groups tend to make decisions that are more extreme than the average of members’ initial positions. | No |
| #97 | Groupthink | A group’s desire for harmony can suppress dissent and critical thinking, leading to poor decisions. | No |
| #98 | Halo Effect | Letting an overall impression of someone (good or bad) color judgments about their specific traits. | Yes |
| #99 | Hard-easy Effect | Being underconfident on easy tasks and overconfident on hard tasks. | Yes |
| #100 | Hindsight Bias | After you know the outcome, it feels like you ‘knew it all along’ and the result seems more predictable than it was. | Yes |
| #101 | Hostile Attribution Bias | Interpreting ambiguous actions by others as intentionally hostile. | No |
| #102 | Hot-cold Empathy Gap | Underestimating how much ‘hot’ states (hunger, pain, anger, arousal) change choices compared with ‘cool’ states. | Yes |
| #103 | Hot-hand Fallacy | Believing a streak of successes means a person has a ‘hot hand’ and is more likely to keep succeeding, beyond chance. | Yes |
| #104 | Humor Effect | Humorous material is more memorable than similar non-humorous material. | Yes |
| #105 | Hyperbolic Discounting | Strongly preferring smaller-sooner rewards over larger-later rewards, more than a consistent discount rate would predict. | Yes |
| #106 | Identifiable Victim Effect | Feeling more compelled to help a single named or pictured person than an anonymous group with the same need. | No |
| #107 | IKEA Effect | Valuing things more because you helped create or assemble them. | Yes |
| #108 | Illicit Transference | A logic error where you treat properties of a word or label as if they were properties of the thing it refers to. | No |
| #109 | Illusion of Asymmetric Insight | Believing you understand other people better than they understand you. | Yes |
| #110 | Illusion of Control | Overestimating how much control you have over outcomes that are largely influenced by chance or external factors. | Yes |
| #111 | Illusion of Explanatory Depth | Thinking you understand how something works in detail until you try to explain it and discover the gaps. | No |
| #112 | Illusion of External Agency | Attributing your own preferences or choices to an outside force or agent rather than to your own mind. | No |
| #113 | Illusion of Transparency | Overestimating how much your feelings or intentions ‘leak out’ and are obvious to other people. | Yes |
| #114 | Illusion of Validity | Feeling overly confident in a judgment because the story seems coherent, even if the evidence is weak. | Yes |
| #115 | Illusory Correlation | Seeing a relationship between two things when the pattern is coincidental or driven by a third factor. | Yes |
| #116 | Illusory Superiority | Rating yourself as better than average on positive traits or abilities. | Yes |
| #117 | Illusory Truth Effect | Repeated statements feel truer, even when the content is false. | Yes |
| #118 | Impact Bias | Overestimating how strongly and how long future events will affect your emotions. | Yes |
| #119 | Implicit Associations | Unconscious mental links between concepts that shape perception and judgment without deliberate intent. | No |
| #120 | Implicit Bias | Automatic, unconscious attitudes or stereotypes that can influence behavior and decisions. | No |
| #121 | Implicit Stereotypes | Unconscious stereotypes that shape what you expect from people in different groups. | No |
| #122 | Information Bias | Seeking additional information even when it won’t change the decision you can make. | Yes |
| #123 | Ingroup Bias | Favoring people seen as part of your own group over outsiders. | Yes |
| #124 | Insensitivity To Sample Size | Drawing strong conclusions from small samples, as if they were as informative as large samples. | Yes |
| #125 | Intentionality Bias | Assuming outcomes or behaviors are intentional when they may be accidental or random. | No |
| #126 | Interoceptive Bias (or Hungry Judge Effect) | Your judgments are swayed by internal bodily states like hunger, fatigue, or stress. | No |
| #127 | Just-world Hypothesis | Believing the world is fair, so good things happen to good people and bad outcomes must be deserved. | Yes |
| #128 | Lag Effect | Memory improves when repetitions are spaced out over time rather than packed together without gaps. | No |
| #129 | Less-is-better Effect | Preferring a smaller quantity when judged alone, even if you’d prefer the larger quantity when compared directly. | Yes |
| #130 | Leveling And Sharpening | Memory tends to drop some details (‘leveling’) while exaggerating or highlighting others (‘sharpening’). | Yes |
| #131 | Levels-of-processing Effect | Information processed more deeply (by meaning) is remembered better than information processed superficially (by appearance). | Yes |
| #132 | List-length Effect | As lists get longer, people tend to recall a smaller proportion and struggle more to retrieve specific items. | Yes |
| #133 | Logical Fallacy | A general term for a common error in reasoning that makes an argument unreliable or invalid. | No |
| #134 | Loss Aversion | Losses typically feel more painful than equally sized gains feel pleasurable. | Yes |
| #135 | Magic Number 7±2 | A rough limit on how many items we can hold in working memory at once (often around 7, give or take). | No |
| #136 | Masked Man Fallacy | A mistake about identity and belief: knowing something under one description doesn’t imply knowing it under a different description, even when both refer to the same thing. | No |
| #137 | Memory Inhibition | Difficulty recalling certain memories because other thoughts or retrieval attempts actively block them. | Yes |
| #138 | Mental Accounting | Mentally separating money into ‘buckets’ so identical dollars are treated differently depending on their label or source. | No |
| #139 | Mere Exposure Effect (or Familiarity Principle) | Repeated exposure to something tends to increase familiarity and liking for it. | Yes |
| #140 | Misattribution | Assigning a memory, feeling, or idea to the wrong source—such as thinking you learned it somewhere else. | Yes |
| #141 | Misinformation Effect | Later misleading information can alter your memory of what actually happened. | No |
| #142 | Modality Effect | Items presented by sound are often recalled better than visually presented items, especially near the end of a list. | Yes |
| #143 | Money Illusion | Focusing on nominal amounts of money and neglecting purchasing power changes like inflation. | Yes |
| #144 | Mood-congruent Memory Bias | Being more likely to remember information that matches your current mood. | Yes |
| #145 | Moral Credential Effect | Past good deeds can make people feel licensed to behave less ethically afterward. | Yes |
| #146 | Moral Luck | Judging the morality of someone’s choice by how it turned out, even when the outcome depended on luck. | Yes |
| #147 | Murphy’s Law | A tendency to expect that things will go wrong, especially when you’re primed to look for problems. | No |
| #148 | Naïve Cynicism | Assuming other people are more selfish or self-interested than they really are. | Yes |
| #149 | Naïve Realism | Believing you see the world objectively, and if others disagree they must be uninformed or biased. | Yes |
| #150 | Negativity Bias | Negative experiences and information tend to have a stronger impact than positive ones. | Yes |
| #151 | Neglect of Probability | Ignoring probability and focusing on the possibility or vividness of outcomes when judging risk. | Yes |
| #152 | Next-in-line Effect | Forgetting your turn when it’s time to speak because you were focused on preparing what you would say. | Yes |
| #153 | Non-adaptive Choice Switching | Switching choices after a negative outcome even when the underlying odds haven’t changed, which can hurt performance. | No |
| #154 | Normalcy Bias | Assuming things will continue as normal, so you underestimate the likelihood or impact of a disaster. | Yes |
| #155 | Not Invented Here Syndrome | Rejecting ideas or products primarily because they come from outside your group or organization. | Yes |
| #156 | Objectivity Illusion | Believing your judgments are purely objective while others’ judgments are biased by perspective. | No |
| #157 | Observer Effect | Changes in behavior that occur because people know they’re being watched, measured, or studied. | No |
| #158 | Observer-expectancy Effect | An experimenter’s expectations influence what participants do or what observers record, shaping the results. | Yes |
| #159 | Occam’s Razor | A rule of thumb that, all else equal, simpler explanations are preferable to more complicated ones. | No |
| #160 | Omission Bias | Judging harmful actions as worse than equally harmful inaction, often because omissions feel less blameworthy. | Yes |
| #161 | Optimism Bias | Believing good outcomes are more likely for you than for other people in similar situations. | Yes |
| #162 | Ostrich Effect (or Ostrich Problem) | A tendency to avoid or ignore negative information so you don’t have to face it. | Yes |
| #163 | Outcome Bias | Judging a decision mainly by how it ended up, not by how reasonable it was given what was known at the time. | Yes |
| #164 | Outgroup Homogeneity Bias | Perceiving members of an outgroup as all basically similar, while seeing more diversity within your own group. | Yes |
| #165 | Overconfidence Effect | Being more confident in your answers or predictions than your accuracy justifies. | Yes |
| #166 | Pareidolia | Seeing faces or meaningful images in random patterns (like clouds or noise). | No |
| #167 | Parkinson’s Law of Triviality | Spending disproportionate time and attention on trivial issues while neglecting bigger, more important ones. | Yes |
| #168 | Part-list Cueing Effect | Providing some items from a list as ‘help’ can actually make it harder to recall the remaining unseen items. | Yes |
| #169 | Peak-end Rule | Remembering an experience mostly by its peak intensity and how it ended, rather than by the overall average. | Yes |
| #170 | Perky Effect | Vivid mental imagery can spill into perception so that imagined details feel like they were actually seen. | No |
| #171 | Pessimism Bias | Expecting negative outcomes to happen more often than they do, and underestimating positive possibilities. | Yes |
| #172 | Picture Superiority Effect | Pictures tend to be remembered more easily than the same information presented as words. | Yes |
| #173 | Placebo Effect | Experiencing real changes (or perceived improvement) because you believe a treatment will help. | No |
| #174 | Placement Bias | Preferences can shift simply because an option appears earlier, later, higher, or more prominently in a list or layout. | No |
| #175 | Plan Continuation Bias | Sticking with a plan even when new information suggests it’s no longer appropriate or safe. | No |
| #176 | Planning Fallacy | Underestimating how long tasks will take and overestimating how smoothly they will go. | Yes |
| #177 | Plant Blindness | Overlooking plants in your environment and underestimating their importance compared with animals or objects. | No |
| #178 | Positivity Effect (or Socioemotional Selectivity Theory) | As people age, they often remember and attend to positive information more than negative information. | Yes |
| #179 | Prejudice | Holding negative judgments about groups or individuals in advance, often based on stereotypes rather than evidence. | No |
| #180 | Present Bias | Giving much more weight to immediate rewards or costs than to future ones, even when the future matters more. | No |
| #181 | Prevention Bias | Preferring to prevent harms rather than provide help after harms occur, even when outcomes would be equivalent. | No |
| #182 | Primacy Effect | Better recall for items that appear at the beginning of a sequence. | Yes |
| #183 | Pro-innovation Bias | Assuming innovations are inherently beneficial and discounting risks, trade-offs, or unintended consequences. | Yes |
| #184 | Probability Matching | Choosing options in proportion to their success rates instead of consistently choosing the best-probability option. | No |
| #185 | Processing Difficulty Effect | When something is harder to read or process, it can prompt deeper thinking (or sometimes stronger memory) than fluent information. | Yes |
| #186 | Projection Bias | Assuming your future preferences and feelings will match your current ones, even when circumstances will differ. | Yes |
| #187 | Proportionality Bias | Believing big events must have big, intentional causes, rather than accepting small or chance causes. | No |
| #188 | Prospect Theory | A framework describing how people evaluate gains and losses relative to a reference point, with losses typically weighing more than gains. | No |
| #189 | Pseudocertainty Effect | Preferring ‘sure’ outcomes in a sub-stage of a decision, even when the overall probabilities don’t justify the preference. | Yes |
| #190 | Puritanical Bias | A bias toward valuing self-control and judging indulgence harshly, which can skew moral and practical judgments. | No |
| #191 | Pygmalion Effect | Higher expectations from others can lead to better performance, partly because those expectations change treatment and effort. | No |
| #192 | Reactance Theory | A tendency to resist perceived threats to freedom by doing the opposite of what is being pressured or demanded. | Yes |
| #193 | Reactive Devaluation | Discounting a proposal or concession because it comes from an opponent, even if it would otherwise seem reasonable. | Yes |
| #194 | Recency Effect | Better recall for the most recent items in a sequence. | Yes |
| #195 | Recency Illusion | After learning something, you mistakenly think it’s new or that people are suddenly talking about it more than before. | Yes |
| #196 | Reminiscence Bump | Adults tend to have a cluster of vivid memories from adolescence and early adulthood compared with other life periods. | No |
| #197 | Repetition Blindness | Failing to notice repetitions when the same word or object appears twice close together in time. | No |
| #198 | Restraint Bias | Overestimating your ability to resist temptation, so you take on situations that require more self-control than you have. | Yes |
| #199 | Reverse Psychology | Trying to influence someone by advocating the opposite of what you actually want them to do. | No |
| #200 | Rhyme As Reason Effect | Rhyming phrases often feel more convincing or ‘true’ than equivalent non-rhyming phrases. | Yes |
| #201 | Risk Compensation (or Peltzman Effect) | Taking bigger risks when safety measures are present because you feel more protected. | Yes |
| #202 | Rosy Retrospection | Remembering the past as better than it was, especially compared with the present. | Yes |
| #203 | Salience Bias | Paying disproportionate attention to what is vivid, prominent, or emotionally striking, while neglecting less noticeable information. | No |
| #204 | Saying Is Believing Effect | After you state or explain something, you become more likely to believe it and remember it in a way consistent with what you said. | No |
| #205 | Scope Neglect | Failing to adjust your judgments for the scale of a problem, so large and small harms can feel oddly similar in value. | No |
| #206 | Selection Bias | Drawing conclusions from a non-representative sample because the way you selected observations was skewed. | No |
| #207 | Selective Perception | Noticing and interpreting information that fits your expectations while filtering out what doesn’t. | No |
| #208 | Self-relevance Effect | Information linked to your self-concept is remembered better than similar information about others. | Yes |
| #209 | Self-serving Bias | Taking credit for successes and blaming failures on circumstances, luck, or other people. | Yes |
| #210 | Semmelweis Reflex | Rejecting new evidence or ideas mainly because they conflict with an established belief or practice. | Yes |
| #211 | Serial Position Effect | Memory is best for items at the beginning and end of a list, with the middle remembered worst. | Yes |
| #212 | Serial Recall Effect | The tendency for recall to follow the original order of information, especially in short-term memory tasks. | No |
| #213 | Sexual Overperception Bias | Overinterpreting friendliness or ambiguity as sexual interest, a pattern most often reported in men judging women. | No |
| #214 | Shared Information Bias | In group discussions, people tend to focus on information everyone already knows and ignore unique information held by individuals. | No |
| #215 | Social Comparison Bias | Favoring options or candidates who don’t threaten your own strengths, status, or self-image. | Yes |
| #216 | Social Cryptomnesia | Misremembering who originally contributed an idea in a group, so the true source gets blurred or reassigned. | No |
| #217 | Social Desirability Bias | Answering in a way that looks socially acceptable, leading to overreporting ‘good’ behavior and underreporting ‘bad’ behavior. | Yes |
| #218 | Source Confusion | Mixing up where you learned something, so you attribute a memory to the wrong source. | Yes |
| #219 | Spacing Effect | Spacing study or practice over time improves long-term learning compared with cramming. | Yes |
| #220 | Spotlight Effect | Overestimating how much other people notice your appearance, actions, or mistakes. | Yes |
| #221 | Status Quo Bias | Preferring things to stay the same and resisting change, even when change could be beneficial. | Yes |
| #222 | Stereotypical Bias (or Stereotype Bias) | Letting stereotypes drive judgments about individuals, often by overgeneralizing group traits. | Yes |
| #223 | Stereotyping | Expecting a member of a group to have certain characteristics without having actual information about that individual. | No |
| #224 | Subadditivity Effect | Assigning higher combined probability to detailed parts than to the overall category, violating basic probability logic. | No |
| #225 | Subjective Validation | Accepting a claim because it feels personally fitting or emotionally resonant, not because it is well supported. | Yes |
| #226 | Suffix Effect | Recall of the last item worsens when another irrelevant item immediately follows it. | Yes |
| #227 | Suggestibility | Having memories and beliefs shaped by leading questions, hints, or social pressure. | No |
| #228 | Surrogation | Mistaking a representation or model of something for the real thing, and drawing conclusions as if the model were reality. | No |
| #229 | Survivorship Bias | Focusing on visible successes and missing the invisible failures that dropped out along the way. | Yes |
| #230 | System Justification | Defending and rationalizing existing social or political arrangements, even when they disadvantage you or your group. | Yes |
| #231 | Systematic Bias | A consistent, directional error in judgment or measurement that reliably pushes results away from the truth. | No |
| #232 | Tachypsychia | A warped sense of time (often feeling slowed down or sped up) during trauma, drugs, or extreme stress. | No |
| #233 | Telescoping Effect | Misplacing events in time—remembering recent events as longer ago and distant events as more recent than they were. | Yes |
| #234 | Testing Effect | Retrieving information through testing strengthens memory more than simply reviewing the same material. | Yes |
| #235 | Third-person Effect | Believing media or messages influence other people more than they influence you. | Yes |
| #236 | Time-saving Bias | Miscalculating time saved by changing speed—overvaluing speed changes at high speeds and undervaluing them at low speeds. | Yes |
| #237 | Tip-of-the-Tongue Phenomenon | Knowing a word but being unable to retrieve it in the moment, often while feeling it’s ‘right there’. | Yes |
| #238 | Trait Ascription Bias | Seeing yourself as flexible and changeable but viewing other people as having more fixed traits. | Yes |
| #239 | Travis Syndrome | Overestimating the significance or urgency of the present moment compared with the past or future. | No |
| #240 | Truth Bias | A tendency to default to believing what you’re told, which makes lies and errors harder to detect. | No |
| #241 | Ultimate Attribution Error | Explaining outgroup behavior in a biased way—crediting their good actions to luck or situation while blaming their bad actions on character (and doing the opposite for your ingroup). | Yes |
| #242 | Unconscious Bias (or Implicit Bias) | Automatic, unintentional bias that influences judgment and behavior without conscious awareness. | No |
| #243 | Unit Bias | Treating one ‘unit’ (one serving, one item, one package) as the proper amount and consuming it regardless of actual need. | Yes |
| #244 | Verbatim Effect | Remembering the gist of what was said better than the exact words, which can change details over time. | No |
| #245 | Von Restorff Effect | A distinct or standout item is remembered better than similar items around it. | Yes |
| #246 | Weber-Fechner Law | A principle that perceived intensity grows with relative change, so equal physical changes don’t feel equally large. | Yes |
| #247 | Well Traveled Road Effect | Preferring familiar options and routes simply because they are familiar, even without evidence they are better. | Yes |
| #248 | Women Are Wonderful Effect | A stereotype that attributes more positive traits to women than to men in general. | No |
| #249 | Worse-than-average Effect | Believing you are worse than average at difficult tasks because you assume others find them easier than you do. | No |
| #250 | Zero-risk Bias | Preferring to eliminate a small risk completely rather than reduce a larger risk by a bigger amount that still leaves some risk. | Yes |
| #251 | Zero-sum Bias | Assuming situations are zero-sum, so one person’s gain must come at another person’s loss even when resources aren’t fixed. | Yes |
Why Don’t We Have A Single, Universally Accepted List of Cognitive Biases?
There are several reasons why different bias catalogues tell us different things. For one, bias names sometimes work as umbrella terms rather than well-delimited constructs: researchers might study slightly different things but call them by the same name, or study the same pattern under different labels, making it hard to know what truly counts as a replication. For example, “confirmation bias” can refer to selectively searching for supportive evidence, interpreting neutral information as supportive, or remembering confirming facts better; these behaviors are related but not identical. So when one study claims to replicate “confirmation bias,” it may not be testing exactly the same thing another study did, which creates confusion in bias catalogues.
In some cases, catalogues may contain several entries that represent the same thing under different names. Sometimes two labels are true synonyms, and other times they’re near-synonyms with only tiny, subtle differences. For example, one catalogue might split “overconfidence” into several named sub-biases (overestimation, overprecision, better-than-average), while another treats them as one family.
Even when everyone agrees on the general concept, studies may measure it in different ways, and that can change the size of the effect or when it shows up. In framing research, for instance, “the same bias” can look stronger or weaker depending on how the choices are written, what the stakes are, or whether it’s a between-person vs within-person design. For example, when studying something known as the illusion of control bias, one study might measure it as “how much control do you feel you have?” while another measures “do you bet more when you think you have control?” The results obtained from these two approaches may be different.
Bias literatures can also get distorted by publication bias and researcher degrees of freedom; studies with exciting results get published more often, and small choices in analysis (which outcomes to report, when to stop collecting data, which variables to include, etc.) can accidentally make weak effects look real.
Finally, replication and generalizability are real constraints: when large replication projects rerun classic effects, the replicated effects are often smaller, and sometimes they don’t hit the usual “statistically significant” threshold, especially if the original finding was overestimated. And even if an effect replicates, it might be very context-sensitive: a bias seen in US undergraduates in a lab might look different in professionals making real decisions under time pressure.
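The inflation mechanism described in the last two paragraphs can be made concrete with a small simulation (a rough sketch in Python; the function name and parameter values are illustrative, not taken from any study). Many underpowered studies of a weak true effect are simulated, and only the nominally “significant” ones pass the publication filter; the published average then overstates the true effect:

```python
import random
import statistics

def simulate_publication_filter(true_effect=0.1, n_per_study=30, n_studies=5000, seed=1):
    """Run many simulated small studies of a weak true effect, then compare
    the average effect across ALL studies with the average across only the
    nominally 'significant' studies that would likely get published."""
    random.seed(seed)
    all_effects = []
    published_effects = []
    for _ in range(n_studies):
        # Each study: n participants, outcomes drawn from Normal(true_effect, 1).
        sample = [random.gauss(true_effect, 1.0) for _ in range(n_per_study)]
        mean = statistics.mean(sample)
        se = statistics.stdev(sample) / (n_per_study ** 0.5)
        all_effects.append(mean)
        # Crude publication filter: |t| > 2 roughly corresponds to p < .05.
        if abs(mean / se) > 2:
            published_effects.append(mean)
    return statistics.mean(all_effects), statistics.mean(published_effects)

overall, published = simulate_publication_filter()
print(f"all studies: {overall:.2f}; published only: {published:.2f}")
```

With a weak true effect and small samples, the studies that clear the significance filter are precisely the ones that happened to overestimate the effect, so the published average lands noticeably above the truth; this is also why careful replications of published findings tend to come in smaller.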
Bottom Line: A Good List Contains the Cognitive Biases You Care About
We don’t currently have a single catalogue of cognitive biases that can be relied on in every circumstance. Existing lists are nevertheless quite useful when the goal is to identify biases that might affect us or that we want to investigate further.
To be more specific, you may proceed as follows. If you suspect biases are significantly affecting your personal life, your organization, or your research, take a good look at some bias constructs to get a sense of which ones might be responsible for your problems. As a next step, do a bit of research to assess how well those biases are understood in the scientific literature (e.g., what cognitions and behaviors are thought to underlie them, and what consequences tend to accompany those cognitions and behaviors). Then, if you are confident the biases in question should be taken seriously in your situation, look into ways of mitigating their unwanted effects, both in general and for each specific bias.
Sources
Benson, B. (2016). Cognitive bias cheat sheet dataset [Data set]. GitHub. Link.
Soprano, M., Roitero, K., La Barbera, D., Ceolin, D., Spina, D., Demartini, G., & Mizzaro, S. (2024). Cognitive biases in fact-checking and their countermeasures: A review. Information Processing & Management, 61(3), 103672. Link.


