© 2026 Deymond Laplasa. All rights reserved.

Cognitive immunology. Critical thinking. Defense against disinformation.

📁 Cognitive Biases
🔬Scientific Consensus

Cognitive Biases: Why Your Brain Lies to You Every Day — and How It's Used Against You

Cognitive distortions are systematic thinking errors that cause us to perceive reality inaccurately. They are universal, unconscious, and influence all decisions—from choosing a partner to making investments. High intelligence offers no protection: smart people simply rationalize their biases more effectively. This article reveals the mechanism behind cognitive traps, dismantles myths about "rationality," and provides a self-audit protocol for daily use.

🔄 Updated: February 5, 2026
📅 Published: February 4, 2026
⏱️ Reading time: 11 min

Neural Analysis
  • Topic: Cognitive biases as systematic information processing errors, their mechanisms, clinical significance, and management methods
  • Epistemic status: High confidence — phenomenon confirmed by multiple studies in cognitive psychology, neuroscience, and clinical practice
  • Level of evidence: Meta-analyses and systematic reviews (cognitive-behavioral therapy), experimental studies (behavioral economics), clinical observations (link to depression and anxiety)
  • Verdict: Cognitive biases are a real, measurable phenomenon with a neurobiological basis. Complete elimination is impossible, but awareness and structured protocols significantly reduce their impact on critical decisions. Intelligence offers no protection — active metacognitive work is required.
  • Key anomaly: Knowing about cognitive biases doesn't automatically prevent them — the "bias blind spot" causes people to see biases in others but not in themselves
  • Test in 30 sec: Recall your last important decision. Did you write down arguments AGAINST your position BEFORE making the decision? If not — confirmation bias was at work

🖤 You consider yourself a rational person. You weigh arguments, analyze data, make informed decisions. But every day your brain systematically deceives you—and you don't notice it. What's more: the smarter you think you are, the more sophisticated your intellect becomes at disguising these deceptions as logic. Cognitive distortions aren't a bug in your consciousness—they're its basic operating system, and it's working against you right now, as you read these lines.

📌What Cognitive Biases Actually Are — And Why the Textbook Definition Won't Save You

Cognitive distortions (cognitive biases) are systematic patterns of deviation from rational thinking and objective information assessment. They're not random errors, not the result of lack of education, and not a sign of low intelligence. Learn more in the Critical Thinking section.

They're mechanisms built into the architecture of human thinking that force us to perceive, remember, and interpret information in predictably distorted ways.

🧩 Three Critical Properties That Make Cognitive Biases Dangerous

Automaticity
Cognitive biases occur without conscious intention, at the level of automatic thought processes (S005). You don't decide to distort reality — your brain does it before the information reaches the level of conscious analysis. By the time you start "thinking" about a problem, the data has already passed through several layers of distortion.
Systematicity
Biases follow predictable patterns that reproduce across different people in similar situations (S002). These aren't chaotic errors — they're structured failures that can be catalogued and exploited. Marketers, political strategists, and manipulators know these patterns and use them professionally.
Universality
Cognitive biases affect all people regardless of intelligence level, education, or cultural context. Nobel laureates are subject to the same basic biases as people without college degrees. The only difference is that high intelligence allows for creating more sophisticated rationalizations to justify biased conclusions.

⚠️ Why Your Brain Evolved to Lie to You

Cognitive biases aren't an evolutionary defect — they're a feature. Under conditions of limited computational resources and the need to make quick decisions in a dangerous environment, our ancestors survived not through accuracy, but through speed (S001).

Better to mistake a rustling in the bushes for a predator ten times than to miss a real threat once. Heuristics — mental shortcuts — made it possible to conserve cognitive energy and react instantly.

The problem is that the modern environment radically differs from the Pleistocene savanna. Decisions about loans, investments, choosing a partner, medical treatment, or political preferences require accuracy, not speed. But the brain continues using ancient algorithms optimized for survival in a world that no longer exists.

🔎 Boundaries of the Concept: What Is NOT a Cognitive Bias

Phenomenon | Why It's Not a Bias | Where the Trap Is
Conscious lying | Deliberate distortion of information, a conscious choice | Easy to confuse with the rationalization that follows a bias
Lack of information | A gap in knowledge, not a distortion of perception | The brain fills gaps with assumptions — that is already a bias
Emotional reactions | Normal feelings (fear, anger, joy) | An emotion becomes a bias when it systematically deforms the interpretation of reality

For example, anxiety is an emotion, but catastrophizing (automatic assumption of the worst outcome) is a cognitive bias. Fear of flying is a normal reaction; believing that planes crash more often than cars is a distortion of risk perception.

[Figure: map of cognitive biases in brain structure with neural pathways]
Cognitive biases are built into the basic architecture of the brain's information processing — they trigger before data reaches the level of conscious analysis.

🧱The Steel Version of the Argument: Seven Reasons Why Cognitive Biases Are Inevitable and Even Useful

Before examining the problems of cognitive biases, it's necessary to present the strongest version of the opposing argument. This is called the "steel man" approach—as opposed to a "strawman," where you represent your opponent in the weakest possible light. For more details, see the section on Psychology of Belief.

Here are seven serious arguments in defense of cognitive biases:

🧠 Argument One: Computational Efficiency Under Resource Constraints

The human brain consumes about 20% of the body's energy while representing only 2% of body mass. Fully rational processing of every bit of information would require astronomical energy expenditure. Cognitive biases represent a tradeoff between accuracy and efficiency.

The availability heuristic (estimating probability by ease of recalling examples) works quickly and produces acceptable results in most cases. Yes, it sometimes errs, but the alternative is paralyzing slowness with every decision.

⚡ Argument Two: Speed of Response in Critical Situations

In situations of real danger, cognitive biases save lives. If you see a snake-like object on a path, it's better to jump back first and analyze later—even if 99% of the time it turns out to be a stick.

Type I error (false alarm) is less critical than Type II error (missing a real threat). Evolution optimized us for survival, not academic accuracy.
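This tradeoff can be made concrete with a toy expected-cost calculation. The prior probability, the cost of a false alarm, and the cost of a miss below are invented for illustration, not measured values:

```python
# Toy signal-detection model: compare two policies for reacting to a
# rustle in the bushes. All numbers are illustrative assumptions.

p_predator = 0.01        # prior probability the rustle is a real threat
cost_false_alarm = 1     # energy wasted jumping back from a stick
cost_miss = 1000         # expected cost of ignoring a real predator

# Policy 1: always jump back (pay the false-alarm cost whenever it's a stick)
cost_always_jump = (1 - p_predator) * cost_false_alarm

# Policy 2: never react (pay the miss cost whenever it's a predator)
cost_never_react = p_predator * cost_miss

print(f"always jump: {cost_always_jump:.2f}")  # 0.99
print(f"never react: {cost_never_react:.2f}")  # 10.00
```

Under these assumed numbers the jumpy policy is an order of magnitude cheaper on average — which is the precise sense in which the bias was "optimal" for the environment it evolved in.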

🎯 Argument Three: Social Cohesion and Group Survival

Many cognitive biases promote social cohesion. In-group favoritism (preferring members of one's own group) creates trust and cooperation within communities (S008).

Conformity (tendency to agree with the majority) reduces conflict and accelerates collective decision-making. Yes, these mechanisms can lead to discrimination and groupthink, but they also make stable social structures possible.

🔮 Argument Four: Adaptive Value of Optimism

Optimistic biases (overestimating the probability of positive outcomes) correlate with better mental health, greater persistence, and higher achievement (S007).

People with "depressive realism"—more accurate assessment of their capabilities and risks—are often less successful because realistic evaluation of odds can be demotivating. The illusion of control motivates people to take actions that sometimes genuinely improve situations.

💡 Argument Five: Creativity Through Illogical Associations

Apophenia (tendency to see patterns in random data) can lead to false conclusions, but it also underlies scientific discoveries and artistic insights.

Many breakthrough ideas began with intuitive hunches that were formally cognitive biases but proved productive.

🛡️ Argument Six: Protective Function of Self-Esteem

Self-serving biases protect the psyche from the destructive impact of constant self-criticism (S007). Attributing successes to oneself and failures to external circumstances maintains self-esteem at a level necessary to continue efforts.

Completely "objective" self-perception may be psychologically unbearable.

📊 Argument Seven: Statistical Adequacy in Natural Environments

Many cognitive biases that appear irrational in laboratory conditions are statistically justified in natural environments. The representativeness heuristic (estimating probability by similarity to a typical example) works well when base rates in the population match our intuitive expectations.

Problems arise in artificial situations with counterintuitive probability distributions.
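The base-rate point can be illustrated with Bayes' rule on a hypothetical screening test. The 1% prevalence and the 90%/9% rates are assumptions chosen for the example:

```python
# Bayes' rule on a hypothetical screening test, showing why the intuition
# "the test is ~90% accurate, so a positive means ~90% ill" fails when the
# base rate is low. All rates are assumptions chosen for the example.

prevalence = 0.01       # base rate: 1% of the population has the condition
sensitivity = 0.90      # P(positive | condition)
false_positive = 0.09   # P(positive | no condition)

p_positive = sensitivity * prevalence + false_positive * (1 - prevalence)
p_condition_given_positive = sensitivity * prevalence / p_positive

print(f"P(condition | positive) = {p_condition_given_positive:.1%}")  # 9.2%
```

Intuition driven by representativeness says "about 90%"; the correct posterior is under 10%, because false positives from the huge healthy majority swamp the rare true cases.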

  1. Computational efficiency: fast decisions with limited brain energy
  2. Response speed: survival in critical situations matters more than accuracy
  3. Social cohesion: trust and cooperation within groups
  4. Psychological resilience: optimism and self-esteem as resources for action
  5. Creative potential: intuitive insights often precede logic
  6. Psychic protection: self-perception adapted for survival, not objectivity
  7. Ecological validity: heuristics work in natural environments, break down in laboratories

These arguments are serious and deserve respect. Cognitive biases aren't simply "errors" that need to be "fixed." They represent a complex set of adaptations that made evolutionary sense.

The problem is that the modern environment creates contexts where these adaptations systematically malfunction—and these malfunctions have serious consequences.

🔬Evidence Base: What Science Actually Knows About Cognitive Biases — With Numbers and Without Illusions

Cognitive biases have been studied within cognitive psychology, behavioral economics, and neuroscience for over half a century. There exists an extensive empirical foundation demonstrating their existence, mechanisms, and consequences. For more details, see the Scientific Method section.

🧪 Memory Biases: Why Your Memories Are Fanfiction Written by Your Brain

Memory doesn't work like video recording — it's a reconstructive process subject to systematic distortions (S009). Recency effect overvalues the importance of the latest information. Primacy effect gives disproportionate weight to first impressions. Selective memory retains information consistent with current beliefs and "forgets" contradictory evidence.

Hindsight bias is a particularly insidious distortion: after an event, people systematically overestimate how predictable it was beforehand (S009). This creates the illusion of "I knew it all along" and prevents learning from mistakes. Even experts are susceptible: physicians evaluating medical cases after knowing the outcome "remember" considering that outcome more likely than they actually did.

Hindsight bias turns failures into inevitabilities and successes into predictabilities — both variants block learning.

👥 Social Biases: How the Brain Turns People Into Stereotypes

Fundamental attribution error is one of the most persistent social biases: we explain others' behavior through personal characteristics while ignoring situation, but explain our own behavior through circumstances (S005). A colleague is late — they're irresponsible; you're late — there was traffic. This underlies interpersonal conflicts and unfair evaluations.

Halo effect colors the perception of all a person's qualities based on one positive characteristic (S009). Physically attractive people are systematically rated as more intelligent and competent — with no objective basis for the link. This effect even influences judicial decisions: attractive defendants receive lighter sentences.

In-group favoritism and out-group homogenization work in tandem: we prefer members of "our" group and perceive the "other" as homogeneous (S005). These biases activate even with arbitrary division of people in laboratory conditions — a random t-shirt color is sufficient.

  1. Fundamental attribution error: internal causes for others, external for ourselves
  2. Halo effect: one trait colors entire perception
  3. In-group favoritism: preference for "our own" and homogenization of "others"

💰 Decision-Making Biases: Why You Systematically Choose Suboptimally

Anchoring effect — the first number disproportionately influences decisions, even when chosen arbitrarily (S009). Real estate appraisers, knowing about this effect, cannot fully protect themselves from it. Initial price in negotiations, first offer in bargaining, starting bid at auction — all are anchors distorting valuations.

Sunk cost fallacy compels continued resource investment in failing projects simply because much has already been invested (S010). Rational decisions should be based only on future costs and benefits, but it's psychologically difficult to "write off" past investments. This destroys businesses, marriages, and careers.
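The correct accounting can be shown in a few lines of arithmetic. The project figures below are invented for illustration:

```python
# Toy illustration of the sunk cost fallacy: a rational continue/abandon
# decision depends only on FUTURE costs and benefits. Figures are invented.

sunk = 900_000             # already spent, irrecoverable either way
future_cost = 300_000      # additional spend required to finish
payoff_if_finished = 200_000

# Correct comparison: future payoff vs. future cost; `sunk` drops out.
net_future_value = payoff_if_finished - future_cost
decision = "continue" if net_future_value > 0 else "abandon"

print(decision)  # abandon: finishing loses another 100,000
# The fallacy folds the sunk 900,000 into the reasoning ("we can't let it
# go to waste"), even though that money is gone under either choice.
```

Note that `sunk` never appears in the comparison — that is the whole point: it is identical under both choices, so it cannot rationally discriminate between them.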

Availability heuristic assesses event probability by ease of recall (S009). After a plane crash, people overestimate aviation risk and underestimate automobile risk, though statistically cars are orders of magnitude more dangerous. Media coverage creates availability, availability creates illusion of frequency, illusion of frequency distorts risk assessment. See availability heuristic and risk perception for more details.

🪞 Self-Perception Biases: Why You Don't Know Yourself as Well as You Think

Dunning-Kruger effect — people with low competence overestimate their abilities, while highly competent individuals underestimate their uniqueness (S010). This isn't simply "stupid people don't know they're stupid" — lack of competence prevents assessing one's own incompetence, because evaluating quality requires the same skills as the work itself.
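One simple way to see how this pattern can arise is a toy model in which self-assessment blends a person's true percentile with a flat "about average or better" prior. The `insight` parameter and the prior of 65 are pure assumptions for illustration, not empirical estimates:

```python
# Toy model of the Dunning-Kruger pattern (a deliberate simplification):
# self-assessment blends one's true skill percentile with a flat
# "about average or better" prior, because judging the quality of work
# requires the same skills as the work itself.

def self_estimate(true_pct, insight=0.3, prior=65.0):
    # Low insight means the optimistic prior dominates the estimate.
    return insight * true_pct + (1 - insight) * prior

bottom = [self_estimate(p) for p in range(0, 25)]   # bottom quartile
top = [self_estimate(p) for p in range(75, 100)]    # top quartile

print(f"bottom quartile: true ~12, estimated ~{sum(bottom) / len(bottom):.0f}")
print(f"top quartile:    true ~87, estimated ~{sum(top) / len(top):.0f}")
```

With these assumed parameters the bottom quartile overestimates itself by dozens of percentile points while the top quartile underestimates itself — the signature shape of the effect.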

Self-serving bias attributes successes to internal factors and failures to external circumstances (S007). This protects self-esteem but prevents learning. In depression, this bias weakens — "depressive realism" means more accurate but psychologically painful attribution.

Illusion of control overestimates influence over events, especially in situations involving chance (S009). People throw dice harder for high numbers, softer for low ones. Investors believe they "feel the market." Gamblers develop "systems" for roulette. The illusion can motivate but leads to unjustified risks.

📰 Information Processing Biases: How Media and Algorithms Exploit Brain Vulnerabilities

Confirmation bias — possibly the most dangerous: the tendency to seek, interpret, and remember information to confirm existing beliefs (S009). This isn't merely preference for agreeable information — it's active distortion of contradictory data. People reading a single article with opposing views find confirmation of their own positions in it.

Modern algorithmic social media feeds turn confirmation bias into a weapon of mass destruction (S004). Algorithms are optimized for engagement, and engagement is maximal when content confirms beliefs and triggers emotion. The result — "filter bubbles" where people see only reinforcing information and never encounter alternatives. The mechanism also operates in groupthink.

Algorithms didn't create confirmation bias — they simply scaled it to a level where it becomes an instrument of social fragmentation.
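A minimal simulation makes the scaling mechanism visible. This is a toy model under stated assumptions, not any platform's actual algorithm; the 0.02 nudge per exposure is an arbitrary choice:

```python
# Toy filter-bubble model (not any platform's actual algorithm): the feed
# always shows the item the user is most likely to engage with, and
# engagement is highest for belief-confirming content.

belief = 0.55  # user's credence in some claim, starting near neutral

for step in range(50):
    # Two candidate items: one supporting the claim, one opposing it.
    engage_pro, engage_con = belief, 1 - belief
    shown = "pro" if engage_pro >= engage_con else "con"
    # Each confirming exposure nudges the belief a little further.
    belief += 0.02 if shown == "pro" else -0.02
    belief = min(max(belief, 0.0), 1.0)

print(f"belief after 50 steps: {belief:.2f}")  # 1.00
```

Nothing in the loop "wants" to radicalize the user; selecting for engagement plus a small per-exposure nudge is enough to push an initially mild 55% credence to the extreme.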

Framing effect demonstrates that the manner of presenting information radically changes decisions, even when content is identical (S004). "90% survival rate" sounds better than "10% mortality rate," though they're the same. Media systematically use framing to manipulate perception — choice of headline, order of facts, emotionally charged words create a distorted picture.

Availability cascade — a self-reinforcing cycle where media coverage of an event increases its perceived importance, leading to even more coverage (S004). This explains moral panics and media hysteria. Terrorist attacks receive disproportionate coverage compared to car accidents, though the latter kill orders of magnitude more people — because terrorism is more dramatic and generates more clicks.

Bias | Mechanism | Consequence
Anchoring effect | First number disproportionately influences valuation | Negotiations, bargaining, auctions distorted by initial price
Sunk cost fallacy | Past investments influence future decisions | Continuation of failing projects, marriages, careers
Availability heuristic | Ease of recall determines probability | Overestimation of rare but media-covered risks
Confirmation bias | Seeking information confirming beliefs | Filter bubbles, social fragmentation
Framing effect | Presentation method changes decision | Perception manipulation through language and structure
[Figure: diagram of information flow through a cascade of cognitive biases during decision-making]
Every decision passes through a cascade of cognitive filters — information is distorted at the stages of perception, memory, interpretation, and option evaluation.

🧬Mechanisms and Causality: What Happens in Your Brain When It Lies to You — And Why It's Not Your Fault (But It Is Your Responsibility)

Cognitive biases are not moral failings or signs of weak character. They are the result of how evolution assembled the human brain from available components to solve survival problems in an environment radically different from today's. More details in the Statistics and Probability Theory section.

Understanding the neurobiological mechanisms is critically important for separating causality from correlation.

🧠 Dual-System Architecture: Why You Have Two Brains, and They Don't Get Along

Daniel Kahneman popularized the model of two thinking systems. System 1 is fast, automatic, intuitive, emotional. It operates effortlessly and cannot be controlled by willpower.

System 2 is slow, analytical, rational, and requires effort. It can check and correct System 1's conclusions, but this requires energy and motivation.

Parameter | System 1 | System 2
Speed | Instant | Slow
Effort | Minimal | Maximum
Source of errors | Heuristics, patterns, emotions | Lack of information, fatigue
Control | Automatic | Volitional

Most cognitive biases are products of System 1. It makes quick judgments based on heuristics, patterns, and emotional reactions (S001). System 2 can correct them, but only if you have the time, motivation, and cognitive resources.

The Predictive Brain: Why Error Is Better Than Uncertainty

The brain is not a recorder of reality, but a generator of predictions (S001). It constantly builds models of what will happen next and compares them with incoming information.

When data is insufficient, the brain fills in the gaps. Error in this case is not a bug, but a feature: an incorrect prediction is better than paralysis from uncertainty. In survival environments, speed is often more important than accuracy.

The brain prefers a fast error to a slow truth. This strategy saved our ancestors, but in today's world of information overload, it becomes a trap.

Emotions are not obstacles to rationality, but its foundation (S004). They encode past experience and direct attention to relevant signals. Without them, System 2 would be paralyzed by choice.

Energy Budget: Why Your Brain Is Lazy

The brain consumes 20% of the body's energy while comprising only 2% of its mass. Cognitive biases are a way to conserve this energy.

Representativeness Heuristic
The brain judges probability by similarity to a prototype, not by statistics. It saves computation but ignores base rates.
Anchoring
The first number you hear becomes the reference point for all subsequent estimates. The brain doesn't recalculate from scratch — it adjusts the anchor.
Confirmation
The brain seeks information that confirms an already-formed hypothesis. This reduces cognitive load but blinds you to contradictions.

These mechanisms are not design flaws — they are optimal for a world with limited information and high cost of error. The problem is that the modern world works differently.

From Causality to Responsibility

Knowing that biases are the result of neurobiology, not moral choice, frees you from shame. But it doesn't free you from responsibility.

You're not to blame for how your brain works. You are to blame if you know about it and do nothing. Responsibility begins with understanding the mechanism — and with choosing to slow down when the stakes are high.

Causality explains why you make mistakes. Responsibility requires that you make mistakes more slowly and more consciously.
⚔️ Counter-Position Analysis

⚖️ Critical Counterpoint

The article offers tools for combating cognitive biases, but fails to account for the limitations of these methods in real-world conditions, the adaptive value of certain biases, and cultural differences in their manifestation. Here's where the article's logic shows cracks.

Overestimating the Controllability of Biases

The article claims that systematic practice and self-checking protocols reduce the influence of cognitive biases. However, research shows that even trained professionals (doctors, judges, analysts) make the same mistakes in real-world conditions when stress, time constraints, and emotional involvement are high. The effectiveness of protocols may be overestimated, working only in laboratory or low-stress conditions.

Underestimating the Adaptive Value of Biases

The article focuses on negative consequences but insufficiently covers the adaptive function of biases. For example, optimistic bias correlates with better mental health and motivation. Complete "cognitive hygiene" can lead to depressive realism—a more accurate but psychologically destructive perception of reality, so some biases are worth preserving.

Cultural Bias in Sources

Most cognitive bias research is conducted on Western, Educated, Industrialized, Rich, and Democratic (WEIRD) populations. The universality of many biases may be an artifact of the cultural homogeneity of samples. The article is insufficiently critical of generalizations to all humanity.

The Problem of Measurement in Real-World Conditions

The article references experimental data, but most studies are conducted in artificial conditions with hypothetical scenarios. The ecological validity of these experiments is questionable—it's unclear how well the results transfer to real decisions with real stakes.

Risk of Metacognitive Paranoia

Excessive focus on biases can lead to decision paralysis and constant doubt in one's own judgments. This is dysfunctional in situations requiring quick decisions or intuitive expert judgment. The article doesn't discuss when to trust intuition versus when to apply analytical protocols.

❓Frequently Asked Questions

What are cognitive biases, in simple terms?
Cognitive biases are systematic thinking errors that cause the brain to perceive reality inaccurately. They're automatic "shortcuts" for processing information that work quickly but often incorrectly. For example, confirmation bias makes you notice only information that confirms your beliefs while ignoring contradictory facts. These biases are universal—everyone has them regardless of intelligence or education (S005, S009).

Can cognitive biases be completely eliminated?
No, completely eliminating cognitive biases is impossible. They're built into the brain's information processing architecture and have evolutionary roots—many were adaptive mechanisms for quick decision-making under limited information. The goal isn't elimination but awareness and management: learning to recognize biases in critical situations and applying structured verification protocols. Cognitive-behavioral therapy (CBT) shows that you can significantly reduce biases' influence on behavior, but not remove them entirely (S005, S012).

Does high intelligence protect against cognitive biases?
No, high intelligence doesn't protect against cognitive biases. Moreover, smart people may be more susceptible to certain biases because they're better at constructing complex rationalizations for their prejudices. This phenomenon is called "motivated reasoning"—the ability of intelligence to work on defending an already-held position rather than seeking truth. Research shows that educated people are just as susceptible to confirmation bias and other distortions as everyone else (S009, S010).

Which cognitive biases most often affect everyday decisions?
The most frequent: confirmation bias (seeking confirmation of our beliefs), availability heuristic (overestimating the probability of what's easily recalled), anchoring effect (first information excessively influences judgment), fundamental attribution error (explaining others' mistakes by character, our own by circumstances), Dunning-Kruger effect (incompetent people overestimate their abilities). These biases affect partner selection, financial decisions, risk assessment, interpersonal conflicts, and professional judgments (S009, S010, S011).

How are cognitive biases connected to depression and anxiety?
Cognitive biases play a central role in depression and anxiety disorders. In depression, typical patterns include: catastrophizing (exaggerating negative consequences), overgeneralization (one failed experience = "it'll always be this way"), mental filtering (focusing only on negatives), personalization (everything bad is my fault). Research on middle-aged people with depression showed persistent patterns of these distortions (S007). CBT works precisely through identifying and restructuring these automatic thoughts—it's first-line therapy for depression and anxiety with proven effectiveness (S012).

Why doesn't knowing about biases stop them from working?
Because cognitive biases operate automatically, at a level below conscious control. Knowledge about them is declarative memory (facts), while biases are procedural memory (automatic processes). Moreover, there's a "bias blind spot"—we easily see biases in others but don't notice them in ourselves. Real change requires not just information but active practice: structured decision-checking protocols, external feedback, slowing automatic reactions. This takes effort and time, like any skill (S009, S010).

How do manipulators and marketers exploit cognitive biases?
Manipulators exploit: scarcity effect (scarcity increases value), social proof (we do what others do), authority bias (trusting experts without verification), sunk cost fallacy (continuing to invest because we already have), anchoring (first price sets perception of all subsequent ones), framing effect (same information, different presentation changes decision). Marketing is built on these mechanisms: "only 2 spots left", "9 out of 10 doctors recommend", "special price today only". Understanding these techniques is the first step toward protection (S009, S010, S011).

How can I check an important decision for cognitive biases?
Use a pre-check protocol: (1) Write down your decision and arguments "for". (2) Actively seek arguments "against"—at least 3 strong objections. (3) Formulate alternative options—at least 3 different solutions. (4) Check for anchoring: are you too dependent on the first information received? (5) Delay the decision 24 hours if possible. (6) Ask someone with an opposing position and try to "steel-man" their arguments (present them in strongest form). (7) Ask yourself: "If I were advising a friend, what would I say?"—this reduces emotional involvement (S010, S012).

Do cognitive biases differ across cultures?
Yes, cultural context influences the expression and form of cognitive biases. For example, fundamental attribution error (explaining behavior by character rather than situation) is more pronounced in individualistic cultures (USA, Western Europe) than in collectivist ones (East Asia), where more attention is paid to context. In-group bias (preferring one's own group) is universal, but criteria for "us" differ: family, clan, nation, religion. However, basic mechanisms—confirmation bias, availability heuristic—work across all cultures; what changes is the content, not the structure of the bias (S009).

How does social media amplify cognitive biases?
Social media creates an ideal environment for amplifying cognitive biases. Algorithms show content you already approve of (confirmation bias), creating "filter bubbles". Availability heuristic makes you overestimate the frequency of heavily reported events (terrorism, plane crashes). Emotional content spreads faster (negativity bias), distorting worldview toward catastrophism. Headlines use framing effect, changing perception of the same facts. Research shows that information distortions in media interact with cognitive prejudices, creating an amplification effect (S004, S009, S011).

Can resistance to cognitive biases be trained?
Yes, but it requires systematic practice, not one-time efforts. Effective methods: (1) Regular "decision journal" keeping with error analysis. (2) Practicing "premortem" analysis — before making a decision, imagine it failed and describe the reasons. (3) Learning basic statistics and probability theory — reduces errors in risk assessment. (4) Exposure to opposing views — actively seeking quality counterarguments. (5) Mindfulness meditation — improves metacognitive control. (6) Structured checklists for recurring decisions. Research shows these practices work, but the effect is cumulative and requires months of regular work (S010, S012).

When should I seek professional help with cognitive distortions?
Consult a psychotherapist if cognitive biases cause significant distress or impair functioning. Red flags: (1) Negative automatic thoughts are constant and uncontrollable. (2) Biases lead to avoidant behavior (social isolation, refusing opportunities). (3) Symptoms of depression or anxiety are present (sleep disturbances, appetite changes, concentration problems). (4) Biases are linked to addictive behavior (alcohol, gambling, overeating). (5) Suicidal thoughts. Cognitive Behavioral Therapy (CBT) is a proven method for working with cognitive biases, especially effective for depression and anxiety disorders (S006, S007, S012).

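The pre-check protocol described in the FAQ above can be sketched as a small checklist helper. The class and field names below are hypothetical illustrations, not an established tool:

```python
from dataclasses import dataclass, field

# A hypothetical decision-audit helper implementing the pre-check protocol
# from the FAQ. Names and structure are illustrative assumptions.

@dataclass
class DecisionAudit:
    decision: str
    arguments_for: list = field(default_factory=list)
    arguments_against: list = field(default_factory=list)
    alternatives: list = field(default_factory=list)
    anchor_checked: bool = False           # step 4: dependence on first info?
    delayed_24h: bool = False              # step 5
    steelmanned_opposition: bool = False   # step 6
    outside_view_taken: bool = False       # step 7: "what would I tell a friend?"

    def gaps(self):
        """Return the protocol steps that are still unsatisfied."""
        issues = []
        if len(self.arguments_against) < 3:
            issues.append("need at least 3 strong arguments against")
        if len(self.alternatives) < 3:
            issues.append("need at least 3 alternative options")
        if not self.anchor_checked:
            issues.append("check dependence on the first information received")
        if not self.delayed_24h:
            issues.append("delay the decision 24 hours if possible")
        if not self.steelmanned_opposition:
            issues.append("steel-man an opposing position")
        if not self.outside_view_taken:
            issues.append("ask: what would I advise a friend to do?")
        return issues

audit = DecisionAudit("accept the job offer", arguments_for=["higher pay"])
print(len(audit.gaps()))  # 6 unsatisfied steps remain
```

The point of externalizing the checklist is precisely that the biases are automatic: a written structure forces System 2 to run steps that System 1 would happily skip.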
Deymond Laplasa
Cognitive Security Researcher

Author of the Cognitive Immunology Hub project. Researches mechanisms of disinformation, pseudoscience, and cognitive biases. All materials are based on peer-reviewed sources.
// SOURCES
[01] Whatever next? Predictive brains, situated agents, and the future of cognitive science
[02] The myth of language universals: Language diversity and its importance for cognitive science
[03] Parkinson disease-associated cognitive impairment
[04] The brain basis of emotion: A meta-analytic review
[05] Using social and behavioural science to support COVID-19 pandemic response
[06] "Economic man" in cross-cultural perspective: Behavioral experiments in 15 small-scale societies
[07] Homo Heuristicus: Why Biased Minds Make Better Inferences
[08] Social cognition and the brain: A meta-analysis
