

58 Logical Fallacies and Cognitive Biases: How Dr. Spin Turns Your Mind Into a Battlefield for Others' Interests

Human thinking is far from perfect — Kahneman and Tversky's research revealed that our minds are riddled with systematic errors that are easily exploited. From ignoring base rates to framing effects, these cognitive traps turn rational people into predictable puppets. We break down the mechanisms behind 58 documented biases, show how "Dr. Spin" weaponizes them for manipulation, and provide a self-check protocol that works in 30 seconds.

Updated: February 23, 2026 · Published: February 21, 2026 · Reading time: 13 min

Neural Analysis
  • Topic: Systematic thinking errors (logical fallacies) and cognitive biases as tools for mind manipulation
  • Epistemic status: High confidence — phenomena experimentally confirmed in the works of Kahneman, Tversky, and hundreds of subsequent studies
  • Evidence level: Meta-analyses, systematic reviews, reproducible experimental data (base-rate neglect, availability bias, conjunction fallacy, decoy effect, framing effect, Allais paradox)
  • Verdict: Cognitive biases are not random glitches but predictable patterns that can be systematized and exploited for manipulation. Understanding the mechanisms of these errors is a necessary condition for cognitive hygiene in the age of information warfare.
  • Key anomaly: Rationalists such as Descartes and Spinoza considered human thinking a flawless instrument for seeking truth, but experimental data shows the opposite: our minds systematically err in predictable directions
  • 30-second check: When offered a solution, ask: "What alternatives am I not considering?" — this exposes framing and anchoring effects

Your mind is not a fortress. It's a battlefield where an invisible war unfolds daily for your decisions, beliefs, and actions. The work of Daniel Kahneman and Amos Tversky shattered the myth of the rational human, revealing that our thinking is riddled with systematic errors that turn us into predictable puppets (S003). "Doctor Spin" is the archetypal manipulator who knows these 58 documented cognitive traps and wields them with surgical precision. Today we'll dissect the mechanisms of these biases and give you a self-defense protocol that works in 30 seconds.

What are cognitive biases and logical fallacies — and why 17th-century rationalists were catastrophically wrong

17th-century rationalists — Descartes, Spinoza — believed human thinking was flawless if given the right tools of logic (S003). Several centuries later, Kahneman and Tversky demolished this illusion: thinking is filled with systematic errors, prejudices, and cognitive traps (S003).

Cognitive biases are systematic deviations from rationality, hardwired into the brain's architecture. Logical fallacies are violations of formal logic rules in argumentation. Both categories make us predictably vulnerable to manipulation. More details in the Thinking Tools section.

The rationalists weren't wrong about logic; they were wrong in thinking that logic is the only mechanism of thought. The brain operates on heuristics: fast, economical, quick-and-dirty rules that often make mistakes.

The boundary between bias and fallacy

A cognitive bias is a mental bug in perception. The availability heuristic makes us overestimate the probability of events that are easy to recall: plane crashes seem more dangerous than car accidents, even though statistics say otherwise (S003).

A logical fallacy is a defect in the chain of reasoning. "After this, therefore because of this" (post hoc ergo propter hoc) confuses correlation with causation (S008).

Key distinction
Cognitive biases generate logical fallacies. Base-rate neglect is a bias that leads to errors in probabilistic judgments. People ignore statistics and focus on vivid details (S003). This connection makes us especially vulnerable: we don't err randomly, but according to predictable patterns.

Scale of the problem: documented traps

Research has identified dozens of cognitive biases that reproduce in experiments with high reliability (S001). Among the most studied:

  • Base-rate neglect
  • Availability bias
  • Conjunction fallacy
  • Decoy effect
  • Framing effect
  • Allais paradox

Each of these biases has been confirmed repeatedly under controlled conditions. They're not random — they're part of our cognitive architecture, built into how the brain processes information under time pressure and uncertainty.

[Figure: taxonomy of 58 cognitive biases and logical fallacies as an interactive map, from perception biases to errors in probabilistic judgment. Each category represents a separate attack vector on rationality.]

Five Most Powerful Arguments for the Existence of Systematic Cognitive Traps

Cognitive biases are not a theoretical abstraction, but a real phenomenon with a solid evidence base. Here are the strongest arguments for taking them seriously. Learn more in the Critical Thinking section.

Argument 1: Reproducibility in Controlled Experiments

Cognitive biases demonstrate high reproducibility in laboratory settings. Kahneman and Tversky's experiments on the framing effect showed that the same information, presented in terms of losses or gains, triggers opposite decisions in the same people (S003). This is not random fluctuation—it's a systematic pattern that repeats across different cultures and time periods.

Characteristic | Random Error | Systematic Bias
Reproducibility | Unpredictable | Repeats in 80–95% of cases
Direction | Random | Always in one direction
Magnitude | Varies | Stable across groups

Argument 2: Predictive Power of Models

Models based on cognitive biases successfully predict human behavior in real situations. The decoy effect is used in pricing: adding a deliberately unfavorable third option makes the target option more attractive (S003). Companies apply this knowledge to increase sales of premium products—and it works with mathematical precision.

If biases were random noise, prediction would be impossible. Instead, we see reproducible effects ranging from 15–40% of the baseline metric.

Argument 3: Neurobiological Correlates

Modern neuroimaging methods show that cognitive biases are linked to activity in specific brain regions. Research in computational cognition demonstrates how brain architecture predisposes us to certain types of errors (S007). This is not just a psychological phenomenon—it's physiology.

When the brain processes information under time pressure or uncertainty, it activates fast decision-making systems (limbic system, basal ganglia) rather than slow analytical processes (prefrontal cortex). This is an architectural constraint, not a failure of willpower.

⚙️ Argument 4: Evolutionary Justification

Many cognitive biases have evolutionary explanations. The availability heuristic was useful in environments where easily recalled threats were indeed more frequent and dangerous (S003). Our brains are optimized for survival on the savanna, not for statistical analysis in the information age.

These "bugs" are side effects of adaptations that once saved lives. They haven't disappeared because evolution hasn't had time to edit them out over the last 10,000 years.

Argument 5: Cross-Cultural Universality

Core cognitive biases are found across different cultures, indicating their universal nature. Base rate neglect manifests in subjects regardless of education, cultural context, or language (S003). This is not an artifact of Western psychology—it's a property of human cognition itself.

  1. Framing effect: reproduced in the USA, Israel, Japan, India
  2. Anchoring: works identically in cultures with different languages and number systems
  3. Confirmation bias: universal regardless of literacy level
  4. Decoy effect: operates across different economic systems and markets

Evidence Base: Six Experimentally Confirmed Cognitive Traps and Their Exploitation Mechanisms

Let's move to specific biases that have been repeatedly confirmed in research and actively exploited by manipulators. Each one is an open door into your consciousness. More details in the Psychology of Belief section.

⚠️ Base-Rate Neglect

This bias causes people to ignore the statistical prevalence of a phenomenon and focus on specific details (S003). If a test for a rare disease (occurring in 1% of the population) has 95% accuracy, a positive result is still more likely to be a false positive. People systematically overestimate the probability of disease, forgetting the base rate.
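To see why, run the numbers with Bayes' theorem. A minimal sketch in Python, assuming the quoted 95% accuracy applies equally to the test's sensitivity and specificity (the article doesn't separate the two):

```python
def posterior(prevalence: float, sensitivity: float, specificity: float) -> float:
    """P(disease | positive test), by Bayes' theorem."""
    true_positives = sensitivity * prevalence
    false_positives = (1 - specificity) * (1 - prevalence)
    return true_positives / (true_positives + false_positives)

# 1% prevalence, 95% sensitivity and specificity (assumed equal):
print(posterior(0.01, 0.95, 0.95))  # ~0.16: a positive result is still ~84% likely false
```

The base rate does almost all the work here: for a 1-in-1,000 disease, the same test yields a posterior of roughly 2%, the figure quoted in the FAQ below.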

"Doctor Spin" exploits this by presenting a vivid, emotionally charged case (a rare vaccine side effect) and making you forget the base rate (millions of safe vaccinations). Your brain latches onto the dramatic story and ignores the statistics.

One tragic story outweighs a million favorable outcomes—not because it's logical, but because that's how our memory works.

Availability Bias

We overestimate the probability of events that are easy to recall—usually because they're recent, vivid, or emotionally charged (S003). After a series of news reports about plane crashes, people start considering flights more dangerous, even though the statistics haven't changed.

This bias is exploited through control of the information agenda: if media constantly talk about rare but frightening events, you begin to consider them typical. More on the mechanism in availability heuristic and risk perception.

Conjunction Fallacy

People systematically rate the probability of a conjunction of two events (A and B) as higher than the probability of one of them (A), which is mathematically impossible (S003). The famous Linda problem: most people choose "bank teller and feminist activist," even though this is logically less probable than simply "bank teller."

Manipulators exploit this by adding plausible details to a statement, making it more "vivid" and convincing. Each additional detail reduces the probability of the entire conjunction but increases its psychological persuasiveness.
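The arithmetic behind this is one line. A sketch with purely illustrative probabilities (they are not estimates from the Linda study):

```python
# Illustrative probabilities only; not data from the Linda experiment.
p_teller = 0.05                  # P(A): Linda is a bank teller
p_feminist_given_teller = 0.50   # P(B|A): feminist activist, given bank teller

# P(A and B) = P(A) * P(B|A), so the conjunction can never exceed P(A).
p_both = p_teller * p_feminist_given_teller
print(p_both)               # 0.025
assert p_both <= p_teller   # holds for any choice of probabilities
```

Every added detail multiplies in another factor of at most 1, so the richer the story, the lower its probability, even as its plausibility climbs.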

[Figure: the decoy effect in decision-making. Adding an asymmetrically dominated option shifts preferences toward the target product.]

⚙️ Decoy Effect

Adding a third, deliberately disadvantageous option (decoy) changes the relative attractiveness of the other two options (S003). Example: magazine subscription for $59 (online only) and $125 (print + online). Most will choose the cheaper option.

But add a third option, $125 (print only), and suddenly the "print + online" combo at the same $125 seems like an incredible deal. Sales of this option skyrocket, even though objectively nothing has changed.

Scenario | Majority Choice | Mechanism
Two options: $59 (online) vs $125 (print + online) | $59 (online) | Price is the main criterion
Three options: $59 (online) vs $125 (print + online) vs $125 (print only) | $125 (print + online) | Decoy makes the combo the "best deal"
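A decoy is mechanically detectable: it is an option that some other option matches or beats on every attribute. A minimal sketch, assuming each option is scored on two "higher is better" axes (the scores are hypothetical, chosen only to encode the subscription example):

```python
from itertools import permutations

# (label, price_score, feature_score): higher is better on both axes.
# Hypothetical scores encoding the magazine example; cheaper = higher price_score.
options = [
    ("online only, $59",     66, 40),
    ("print + online, $125",  0, 100),
    ("print only, $125",      0, 60),   # the decoy
]

def dominated(a, b):
    """True if option b is at least as good as a everywhere and strictly better somewhere."""
    return (all(y >= x for x, y in zip(a[1:], b[1:]))
            and any(y > x for x, y in zip(a[1:], b[1:])))

for a, b in permutations(options, 2):
    if dominated(a, b):
        print(f"{a[0]!r} is dominated by {b[0]!r}: a likely decoy")
```

When such an option appears in a menu, ask who benefits from the comparison it invites, then evaluate the remaining options as if it weren't there.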

Framing Effect

The same information presented in different formulations triggers different decisions (S003). Classic experiment: "Program will save 200 out of 600 lives" is perceived positively, while "Program will lead to the death of 400 out of 600 people" is perceived negatively, even though it's the same thing.

Politicians, marketers, and propagandists masterfully use framing to direct your perception in the desired direction. The same fact can be presented as a triumph or catastrophe depending on the chosen angle.

Allais Paradox

People systematically violate the axioms of expected utility theory, making choices that contradict their own preferences in a different context (S003). This demonstrates that our decisions don't follow a rational model of utility maximization.

We're sensitive to context and problem formulation in ways that make us predictably irrational. This predictability is the manipulator's main tool.
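The standard textbook formulation of the paradox makes the inconsistency concrete. A sketch using the classic payoffs (these specific gambles are the usual presentation of Allais, not taken from the cited sources):

```python
def expected_value(lottery):
    """Expected value of a lottery given as (probability, payoff) pairs."""
    return sum(p * x for p, x in lottery)

# Pair 1: most people pick A (certainty) over B, despite B's higher EV.
ev_a = expected_value([(1.00, 1_000_000)])
ev_b = expected_value([(0.10, 5_000_000), (0.89, 1_000_000), (0.01, 0)])

# Pair 2: the same people pick D over C.
ev_c = expected_value([(0.11, 1_000_000), (0.89, 0)])
ev_d = expected_value([(0.10, 5_000_000), (0.90, 0)])

print(ev_a, ev_b)  # ~1,000,000 vs ~1,390,000
print(ev_c, ev_d)  # ~110,000 vs ~500,000
```

The two pairs differ only by a common consequence (an 89% chance of $1,000,000), so under expected utility theory, preferring A over B commits you to preferring C over D. Most people choose A and D, a pattern no consistent utility function can produce.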

Why these traps are universal
They're built into the architecture of human thinking; they don't stem from a lack of education or intelligence. Even experts and scientists fall into them when working outside their specialty.
How to recognize them
First sign—when you're asked to make a decision based on one vivid story, without statistics. Second—when information is presented in a form that triggers emotion before you've had time to analyze the facts.
Defense
Not complete immunization (it doesn't exist), but awareness of the mechanism. When you see framing, you can reframe the information yourself. When you notice a decoy, you can ignore it. When you hear one story—you can demand the base rate.

Mechanisms of Cognitive Biases: Why Our Brain Systematically Errs and How This Is Used Against Us

Understanding the mechanisms is key to protection. Cognitive biases aren't random; they arise from fundamental features of how our brain works. Learn more in the Cognitive Biases section.

System 1 vs System 2: The Architecture of Vulnerability

Kahneman described two modes of thinking: System 1 (fast, automatic, intuitive) and System 2 (slow, analytical, effortful). Most cognitive biases occur when System 1 provides a quick but inaccurate answer, and System 2 fails to activate for verification (S003).

Manipulators exploit this by creating conditions where System 2 doesn't have time to engage: time pressure, cognitive load, emotional arousal.

Heuristics: Useful Tools Turned Into Weapons

Heuristics are mental "shortcuts" that allow us to make quick decisions under uncertainty. The availability heuristic, representativeness heuristic, affect heuristic—all were adaptive in our evolutionary past (S003).

In today's information landscape, where content is carefully filtered and packaged, these heuristics become vulnerabilities. "Dr. Spin" knows which triggers activate which heuristics and uses this knowledge with surgical precision.

Availability Heuristic
Events that are easier to recall seem more probable. The manipulator repeats rare but vivid examples until they become "available" in memory.
Representativeness Heuristic
We judge probability by similarity to a typical example. One vivid case outweighs statistics.
Affect Heuristic
The emotional coloring of an event determines our assessment of its risk and benefit. Fear distorts calculations.

Emotions as Bias Amplifiers

Research demonstrates a tight connection between cognitive processes and emotions (S005). Emotional arousal amplifies cognitive biases: fear strengthens the availability effect (you overestimate threats), anger reduces critical thinking, euphoria blinds you to risks.

Manipulators deliberately use emotional triggers to shut down your System 2 and make you maximally vulnerable. This isn't accidental—it's engineering.

Emotion | Effect on Thinking | How It's Used
Fear | Overestimation of threats, seeking protection | Catastrophic scenarios, appeals to safety
Anger | Reduced criticism, seeking an enemy | Blame, polarization, black-and-white thinking
Euphoria | Ignoring risks, overestimating benefits | Promises of miracle solutions, hiding side effects
Guilt | Willingness to compensate, submission | Moral reproaches, demands for "atonement"

⚙️ Causality vs Correlation: The Fundamental Trap

Causal errors are a class of logical fallacies related to incorrectly establishing cause-and-effect relationships (S008). Our brain is evolutionarily wired to seek causes—this was critical for survival.

But this tendency leads to systematic errors: we see causality where there's only correlation, ignore confounders (third variables), confuse the direction of causality. "Dr. Spin" exploits this by presenting correlations as proof of causation and hiding alternative explanations.

  1. Two events occur together → brain seeks connection
  2. Connection found (often false) → confidence grows
  3. Alternative explanations ignored → causality "proven"
  4. Decision made based on false cause → outcome unpredictable
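A hidden third variable reproduces this illusion on demand. A minimal simulation, assuming a confounder drives two otherwise unrelated quantities (the ice-cream-and-drownings pairing is the standard textbook example, not a claim from the sources):

```python
import random

random.seed(42)
n = 10_000

# Confounder z (say, summer heat) drives both x and y; x never causes y.
z = [random.gauss(0, 1) for _ in range(n)]
x = [zi + random.gauss(0, 0.5) for zi in z]   # e.g., ice cream sales
y = [zi + random.gauss(0, 0.5) for zi in z]   # e.g., drowning incidents

def corr(a, b):
    """Pearson correlation, from scratch to keep the sketch dependency-free."""
    ma, mb = sum(a) / len(a), sum(b) / len(b)
    cov = sum((ai - ma) * (bi - mb) for ai, bi in zip(a, b))
    var_a = sum((ai - ma) ** 2 for ai in a)
    var_b = sum((bi - mb) ** 2 for bi in b)
    return cov / (var_a * var_b) ** 0.5

print(corr(x, y))  # ~0.8: strong correlation, zero causation
```

Anyone shown only x and y will "discover" a compelling causal story; only asking about third variables breaks the spell.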

Conflicts and Uncertainties: Where Sources Diverge and What This Means for Understanding Cognitive Biases

Scientific integrity requires acknowledging that not all aspects of cognitive biases are unambiguous. There are areas where researchers disagree.

Debates About Rationality

Some researchers argue that many "cognitive biases" are actually rational adaptations to real-world decision-making conditions.

Ignoring base rates may be justified if you have specific information that makes the base rate irrelevant—but in controlled experiments, people systematically deviate from normative standards of rationality (S003).

These debates are important, but they don't negate the fundamental fact: when we face uncertainty without additional context, our brains choose predictable errors. For more on how this works, see the article on base rate neglect.

The Problem of Ecological Validity

Critics point out that many experiments are conducted in artificial laboratory conditions and may not reflect real-world behavior. Yet the same mechanisms are deployed, profitably, outside the lab:

  1. Marketing uses this knowledge with predictable results
  2. Political campaigns build strategies based on cognitive traps
  3. Native advertising exploits the same mechanisms (S001)

If cognitive biases were merely laboratory artifacts, they wouldn't work in the real world with such predictability. Practical effectiveness is the best test of validity.

The gap between theory and practice disappears here: what works in experiments works on the street. This isn't coincidence—it's a sign that we've identified a real mechanism.

⚠️ Cognitive Anatomy of Manipulation: Which Biases "Doctor Spin" Exploits and How to Recognize an Attack

Manipulators rarely use a single bias. They combine several, amplifying the effect and making defense more difficult. Here's how it works in practice.

Technique 1: Emotional Framing + Availability Heuristic

The manipulator presents information in an emotionally charged frame while simultaneously making certain examples easily accessible for recall. Result: you overestimate the probability of an event and make decisions based on emotions rather than facts.

A series of news stories about rare but frightening crimes by immigrants creates an impression of a massive threat, even though statistics show the opposite. The availability heuristic makes you remember vivid examples rather than numbers.

Technique 2: Base-Rate Neglect + Conjunction Fallacy

A vivid, detailed story is presented that makes you forget about the statistical prevalence of the phenomenon. You assess the probability of a complex scenario as high because it "sounds plausible."

This is the foundation of conspiracy theories: the more details, the more "convincing" the story, even though each detail mathematically reduces its probability. Ignoring base rates allows the manipulator to replace statistics with narrative.

⚙️ Technique 3: Decoy Effect in Political Choice

In politics, the decoy effect is used to manipulate voter preferences (S003). Introducing a third candidate who "draws away" votes from one of the main competitors can change the election outcome.

Scenario | Mechanism | Result
Two candidates are equal | Voter chooses based on principles | Predictable outcome
Third added (similar to one) | Decoy effect distorts comparison | Votes redistribute illogically

Technique 4: Causal Errors in Native Advertising

Native advertising actively exploits cognitive biases by presenting promotional content as editorial material. Causal errors create the illusion of a cause-and-effect relationship between the product and the desired outcome.

"Successful people use this product"
The manipulator shows correlation but hides the fact that successful people use many products. Your brain automatically searches for causal connections and fills in the gaps.
"If I use this product, I'll become successful"
This is a classic error, but it works because we overestimate the role of a single factor and underestimate the complexity of reality. Mental errors of this type are built into our cognitive architecture.

You can recognize an attack by stopping and asking three questions: what emotions am I experiencing right now, which examples do I remember best, and what statistical data am I ignoring (S001).

30-Second Verification Protocol: Seven Questions That Will Destroy Any Cognitive Bias-Based Manipulation

Theory is useless without a practical tool. Here's a protocol you can apply to any claim, decision, or call to action.

  1. What's the base rate? Before reacting to a vivid example, ask: how statistically common is this phenomenon? If you're shown a dramatic case but given no frequency data, that's a red flag. Ignoring base rates is one of the most exploited biases (S003).
  2. Why is this information easily accessible? If something is "common knowledge" or "constantly discussed," ask: who is deliberately making it visible? The availability heuristic only works if information is easily recalled—manipulators know this.
  3. Do details add probability or just plausibility? Each additional detail reduces the mathematical probability of the entire conjunction (S003). Plausibility ≠ probability. This protects against the conjunction fallacy.
  4. Is there a "decoy" here? If you're offered a choice, check: has a third option been added specifically to make one of the main options more attractive? The decoy effect works invisibly, but it can be detected if you know what to look for.
  5. How would my decision change if reframed in terms of losses/gains? The framing effect means your decision depends on wording. Reframe the statement in opposite terms: if your decision changes, you've fallen victim to framing.
  6. Correlation or causation? If a causal relationship is claimed, ask: is this proven causality or just correlation? Are there alternative explanations? Causal errors (S008) are the foundation of most pseudoscientific claims.
  7. What emotion does this trigger? If a message evokes anger, fear, or urgency, stop. Emotional arousal shuts down the prefrontal cortex. Ask: why this specific emotion? Who benefits from my reaction?
Manipulation doesn't work because you're stupid. It works because your brain uses heuristics to conserve energy. The verification protocol is simply switching to slow thinking mode.

These seven questions don't require expertise. They require a pause. Manipulators count on the speed of your reaction—on you responding emotionally rather than analytically.

Each question targets a specific mechanism: base rate blocks base rate neglect, availability exposes the availability heuristic, details protect against the conjunction fallacy, decoy reveals choice manipulation, framing shows dependence on wording, causality destroys pseudoscientific claims, emotion reveals the manipulator's intent.

Question | Protects Against | Danger Signal
Base rate | Ignoring statistics | Vivid example without numbers
Information availability | Availability effect | "Everyone's talking about it"
Details vs probability | Conjunction fallacy | Excess details in story
Decoy | Choice manipulation | Third option that's "worse"
Framing | Wording dependence | Decision changes when reframed
Causality | Causal errors | Correlation presented as cause
Emotion | Emotional hijacking | Urgency, anger, fear

Apply this protocol not as dogma, but as a calibration tool. If you can't answer three of the seven questions—the information is insufficient for a decision. If answers point to manipulation—this doesn't mean the claim is false, but it does mean it requires independent verification.
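One way to make the protocol stick is to treat it as a literal checklist. A toy sketch of the decision rule from the paragraph above; the question keys and wording are hypothetical, not an API from any source:

```python
# Hypothetical encoding of the seven-question protocol.
QUESTIONS = {
    "base_rate":    "Is there frequency data, or only a vivid example?",
    "availability": "Who is making this information so easy to recall?",
    "conjunction":  "Do the details add probability, or only plausibility?",
    "decoy":        "Is one option present just to flatter another?",
    "framing":      "Does my decision flip when reframed as losses or gains?",
    "causality":    "Is this demonstrated causation, or mere correlation?",
    "emotion":      "What emotion is this triggering, and who benefits?",
}

def screen(claim: str, answered: set[str]) -> str:
    """Apply the rule above: three or more unanswerable questions = insufficient info."""
    unanswered = [q for q in QUESTIONS if q not in answered]
    if len(unanswered) >= 3:
        return f"{claim!r}: insufficient information, suspend judgment ({unanswered})"
    return f"{claim!r}: passes the 30-second screen; verify independently anyway"

print(screen("Miracle supplement cures fatigue",
             answered={"availability", "conjunction", "decoy", "framing"}))
```

The threshold mirrors the rule stated above: if three of the seven questions can't be answered, the information is insufficient for a decision.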

Cognitive immunology isn't paranoia. It's thinking hygiene. Just as you brush your teeth to avoid cavities, you verify information to avoid cognitive contamination.

⚖️ Critical Counterpoint

The article relies on classical research in cognitive psychology, but this foundation has methodological limitations. Below is an honest analysis of where its conclusions may be overstated or contextually narrow.

Overestimation of Bias Universality

Most experiments on cognitive biases were conducted on WEIRD populations (Western, Educated, Industrialized, Rich, Democratic). There is evidence that some biases—for example, fundamental attribution error—manifest more weakly or differently in non-Western cultures. The article may underestimate the cultural variability of cognitive processes.

Adaptiveness of "Errors"

What we call cognitive biases were often adaptive heuristics in an evolutionary context. Availability bias made sense in an environment where available information correlated with the actual frequency of threats. The article may create the impression that these mechanisms are purely deficient, ignoring their historical functionality.

Ecological Validity Problem

Many classical experiments (Linda problem, Asian disease problem) use artificial, abstract scenarios. In real-world conditions, where people have context, motivation, and time, some biases manifest more weakly. The article may overestimate the influence of biases in natural decision-making conditions.

Replication Crisis

Some classical effects in cognitive psychology do not replicate in modern studies with large samples and preregistration. While basic effects (framing, anchoring) are robust, effect sizes may be smaller than in original works. The article relies on classical sources that may overstate the reliability of certain phenomena.

Limitations of Debiasing Techniques

Meta-analyses show that most debiasing interventions have small effect sizes and transfer poorly from laboratory to real life. Knowledge about bias does not guarantee protection from it, especially under conditions of stress, time pressure, or emotional arousal. The article may create inflated expectations about cognitive hygiene.

Frequently Asked Questions

What are cognitive biases and logical fallacies?
These are systematic, predictable deviations from rational thinking. Cognitive biases are patterns of errors in information processing that arise from how the brain works. Logical fallacies are violations of logical inference rules in argumentation. The work of Kahneman and Tversky showed that these errors aren't random—they reproduce in experiments with high reliability and can be catalogued (S003). Unlike rationalist views from thinkers like Descartes, who considered human reasoning a flawless instrument for finding truth, modern evidence demonstrates that our minds are full of systematic glitches that can be manipulated.

How many cognitive biases have been documented?
The exact number depends on classification, but researchers have documented dozens of main types. Scientific literature describes over 180 different cognitive biases, though many overlap or are variations of basic patterns. The study 'Bringing Order to the Cognitive Fallacy Zoo' (S003) proposes systematization through the concept of information reduction (IR), which allows reducing multiple biases to fundamental mechanisms. The most experimentally confirmed include: base-rate neglect, availability bias, conjunction fallacy, decoy effect, framing effect, and Allais paradox (S003).

Who is "Doctor Spin"?
This is a metaphor for a manipulator who uses knowledge of cognitive biases to control perception. 'Doctor Spin' (from spin doctor—a specialist in manipulating public opinion) is any actor (politician, marketer, propagandist) who exploits predictable thinking errors to achieve their goals. They don't deceive directly—they construct a context (frame) in which your brain arrives at the conclusion they want on its own. For example, using the framing effect, the same statistics can be presented as '90% survival rate' or '10% mortality rate'—and people will make different decisions (S003).

What is base-rate neglect?
This is ignoring the base frequency of an event when assessing probability. Base-rate neglect is a cognitive bias where people ignore the statistical prevalence of a phenomenon and focus on specific information about a particular case (S003). Classic example: a test for a rare disease with 95% accuracy gives a positive result. Most people think the probability of having the disease is 95%, but if the disease occurs in 1 out of 1,000 people, the actual probability after a positive test is about 2%. This is dangerous because it allows manipulation of fears: by showing vivid isolated cases (terrorist attacks, plane crashes), 'Doctor Spin' makes people overestimate risks while ignoring actual statistics.

What is availability bias?
We overestimate the probability of events that are easy to recall. Availability bias is a heuristic where the assessment of frequency or probability of an event depends on how easily examples come to mind (S003). If media constantly shows plane crashes, people start thinking flights are more dangerous than they actually are, even though statistically cars kill thousands of times more people. The mechanism: vivid, emotional, recent events are encoded in memory more strongly and retrieved faster. 'Doctor Spin' uses this by creating information noise around events they need and suppressing inconvenient statistics.

What is the conjunction fallacy?
This is an error where we consider the combination of two events more probable than one of them separately. Conjunction fallacy is a violation of a basic rule of probability theory: P(A and B) cannot be greater than P(A) (S003). Famous experiment by Tversky and Kahneman: 'Linda is 31 years old, single, outspoken, and very bright. She majored in philosophy. As a student, she was deeply concerned with issues of discrimination and social justice, and participated in anti-nuclear demonstrations.' Which is more probable: (A) Linda is a bank teller, or (B) Linda is a bank teller and active in the feminist movement? Most choose B, though B cannot be more probable than A. Manipulators use this by adding plausible details to false statements, making them 'more convincing.'

What is the framing effect?
The same information, presented differently, triggers opposite decisions. Framing effect is a cognitive bias where the way information is presented influences choice more strongly than the information itself (S003). Experiment: doctors are offered a choice between two treatment methods. Option A: '90% survival rate.' Option B: '10% mortality rate.' These are identical, but in the first case doctors more often choose treatment, in the second they refuse it. 'Doctor Spin' masterfully wields framing: 'tax breaks for business' vs 'subsidies for the wealthy,' 'fighting terrorism' vs 'military intervention'—words change the perception of facts.

What is the decoy effect?
This is manipulation of choice through adding a deliberately unfavorable option. Decoy effect is a cognitive bias where adding a third, asymmetrically dominated option changes preferences between two original choices (S003). Classic example: a movie theater sells popcorn. Small—$3, large—$7. Sales are 50/50. They add medium for $6.50. Now large seems like a bargain (only 50 cents more than medium!), and its sales jump to 80%. Nobody buys medium—it exists only to make large attractive. This works in pricing, political elections (spoiler candidates), and product design.

Can cognitive biases be completely eliminated?
Completely—no, but you can significantly reduce their influence through self-checking protocols. Cognitive biases are built into the architecture of thinking—they're not bugs but features of evolutionarily developed heuristics that in most situations work quickly and well enough. However, awareness of bias mechanisms and use of structured verification protocols (checklists, statistical thinking, searching for alternative explanations, checking base rates) demonstrably reduces error frequency (S002, S003). Key principle: slow down and switch from automatic System 1 (fast intuitive thinking) to analytical System 2 (slow reflective thinking) in critical situations.

Why were Descartes and Spinoza so wrong about thinking?
They built philosophy on introspection, not experiments. Descartes and Spinoza considered human thinking an ideal instrument for knowing truth, capable of arriving at reliable conclusions through pure logic (S003). The problem: they analyzed their own thinking from within, lacking tools to detect systematic blind spots. Experimental psychology of the 20th century, especially the work of Kahneman and Tversky, showed: when we place people in controlled conditions and measure their decisions, we discover stable error patterns that aren't consciously perceived from within. Rationalists were right that logic exists, but wrong in thinking the human mind naturally follows it.

What is information reduction (IR)?
It's a theoretical framework for systematizing cognitive errors through the lens of information simplification. Information Reduction (IR) is a concept proposed in the study 'Bringing Order to the Cognitive Fallacy Zoo' (S003), which explains many cognitive biases as resulting from the need to compress complex information for rapid processing. Similar to the concept of reduction in computational complexity theory, IR shows how the brain sacrifices accuracy for speed. For example, base-rate neglect is a reduction of a complex Bayesian problem to simple matching of a description with a prototype. Availability bias is a reduction of frequency estimation to the accessibility of examples in memory. This framework allows us not just to catalog biases, but to understand their common nature.

How does native advertising exploit cognitive biases?
Native advertising exploits biases by disguising itself as organic content. Research shows that native advertising systematically uses cognitive biases to bypass critical thinking (S010). Key mechanisms: (1) framing effect—advertising messages are presented as editorial material, news, or expert opinion; (2) availability bias—creating an illusion of popularity through repetition and presence in trusted sources; (3) authority bias—using the format of authoritative media to transfer trust to the advertised product. The key danger: blurring the boundary between information and manipulation, which reduces cognitive vigilance.

Are cognitive biases connected to emotions?
Yes, emotions often amplify cognitive biases and vice versa. Research shows that cognitive processes and emotional reactions are not separate—they interact and modulate each other (S005). For example, fear amplifies availability bias (we overestimate the probability of threats that evoke strong emotions), while anger reduces critical thinking and amplifies confirmation bias (the tendency to seek confirmation of one's beliefs). Doctor Spin exploits this by creating emotionally charged narratives: fear (terrorism, disease), anger (injustice, betrayal), hope (salvation, breakthrough). Emotional arousal switches the brain into fast heuristics mode (System 1), where cognitive biases manifest more strongly.

Which cognitive biases are most dangerous in high-stakes decisions?
The most dangerous are biases that are invisible and affect critical decisions. In high-stakes decision contexts (medicine, investments, politics, hiring), particularly dangerous are: (1) confirmation bias—seeking only confirming information; (2) anchoring bias—excessive reliance on the first piece of information received; (3) sunk cost fallacy—continuing ineffective actions due to already invested resources; (4) overconfidence bias—overestimating the accuracy of one's judgments (S006). These biases lead to systematic errors in situations where the cost of error is maximal, and they are difficult to detect from within—a person is confident they are acting rationally.

Does knowing about cognitive biases protect you from them?
It helps, but not as much as one might hope—knowing about bias doesn't make you immune. Research shows that even experts in logic and statistics are susceptible to cognitive biases in real situations (S002, S003). The problem: biases operate at the level of automatic processes (System 1), while training affects conscious thinking (System 2). However, training provides two advantages: (1) the ability to recognize high-risk situations for biases and consciously activate verification protocols; (2) knowledge of specific debiasing techniques (e.g., considering alternative hypotheses, checking base rates, pre-committing to decision criteria). The key is not just knowing about biases, but having checklists and procedures built into the workflow.
Deymond Laplasa
Cognitive Security Researcher

Author of the Cognitive Immunology Hub project. Researches mechanisms of disinformation, pseudoscience, and cognitive biases. All materials are based on peer-reviewed sources.
Sources

[01] Cognitive biases, heuristics, and logical fallacies in clinical practice: A brief field guide for practicing clinicians and supervisors.
[02] Decision-making during gambling: an integration of cognitive and psychobiological approaches.
[03] Politics of Nostalgia, Logical Fallacies, and Cognitive Biases: the Importance of Epistemology in the Age of Cognitive Historiography.
[04] A multidisciplinary approach to insanity assessment as a way to reduce cognitive biases.
[05] Differences in Cognitive Distortions Between Pathological and Non-Pathological Gamblers with Preferences for Chance or Skill Games.
[06] The Relationship Between Cognitive Bias and Logical Fallacies in Egyptian Society.
[07] The psychological drivers of misinformation belief and its resistance to correction.
[08] The Psychology of Change: Self-Affirmation and Social Psychological Intervention.
