
© 2026 Deymond Laplasa. All rights reserved.

Cognitive immunology. Critical thinking. Defense against disinformation.

📁 Media Literacy
⚠️Ambiguous / Hypothesis

🖤 When Experts Unite Against the Knowledge Crisis: Anatomy of Medical Misinformation and Why Universities Are Failing

Medical misinformation has evolved from a local problem into a global knowledge crisis, demanding a systematic response from the scientific community. Despite attempts by experts to unite against the spread of false health information, cognitive bias mechanisms and structural problems in academic communication continue to undermine trust in evidence-based medicine. This article examines why traditional approaches by universities and research centers prove ineffective against the viral nature of medical myths, and proposes a cognitive self-defense protocol.

🔄 Updated: February 21, 2026
📅 Published: February 20, 2026
⏱️ Reading time: 11 min

Neural Analysis
  • Topic: The medical misinformation crisis as an epistemological threat and attempts by the expert community to counter the spread of false health knowledge
  • Epistemic status: Moderate confidence — the problem is documented in academic literature, but the effectiveness of countermeasures remains contested
  • Evidence level: Conceptual analysis based on philosophical works and observational studies of communication strategies during the COVID-19 pandemic
  • Verdict: Medical misinformation poses a real threat to public health, but expert attempts to "unite" often fail due to misunderstanding of the cognitive mechanisms behind myth propagation. Universities and research institutions use outdated communication models that are ineffective against the viral nature of false information.
  • Key anomaly: The expertise paradox — the more scientists try to "debunk" myths using traditional methods, the stronger the boomerang effect among audiences with preexisting beliefs
  • Check in 30 sec: Find the source of a medical claim and verify whether the author has conflicts of interest or financial motivation
Medical misinformation is no longer a marginal problem confined to isolated forums and social media groups—it has evolved into a systemic knowledge crisis that undermines the fundamental mechanisms of public health. When experts attempt to unite against this threat, they face a paradox: traditional academic communication tools prove powerless against the viral nature of medical myths. Universities, research centers, and professional medical associations are losing the information war not because they lack evidence, but because they fundamentally misunderstand the anatomy of the modern trust crisis.

📌What exactly do we mean by the knowledge crisis in medicine — and why it isn't just "people believing nonsense"

The knowledge crisis in medicine is not a deficit of information. It's an excess of competing narratives in the absence of reliable credibility filters, where false information gains structural advantages over verified facts (S007).

The systemic disruption affects three levels simultaneously: data production, validation, and dissemination. Each level operates by different rules, and it's precisely this asymmetry that creates the crisis. More details in the Logic and Probability section.

🧩 Three dimensions of the crisis

Production
Academic science generates data through multi-stage peer review and replication processes. Misinformation producers work in real-time, adapting narratives to current audience anxieties.
Validation
The scientific community uses complex methodological criteria incomprehensible to the general public. Pseudoscientific claims rely on intuitively appealing but false criteria: "it's natural," "they're hiding this," "simple explanation."
Dissemination
Social platform algorithms are optimized for engagement, not accuracy. This creates a structural advantage for emotionally charged but false content.

⚠️ Asymmetry of effort: why debunking loses to myth

Debunking requires an order of magnitude more cognitive resources than creating a myth. The claim "vaccines contain microchips" takes three seconds to say.

Its scientific refutation requires explaining vaccine manufacturing technology, principles of radio-frequency identification, biological incompatibility of silicon structures with immune response, and demonstrating the absence of corresponding components in the formulation.

This asymmetry creates an insurmountable advantage for misinformation in conditions of limited attention. This isn't a question of audience education — it's a question of information environment architecture.

🔎 Boundaries of the problem: what's included in the crisis

Included in the crisis:
  • Systematic dissemination of verifiably false claims about disease mechanisms, treatment effectiveness, and intervention safety
  • Exploitation of cognitive vulnerabilities to promote ineffective or dangerous practices
  • Undermining trust in public health institutions through conspiracy narratives

Remains outside the scope:
  • Legitimate scientific discussions about uncertain questions
  • Criticism of specific healthcare system shortcomings
  • Cultural differences in health and illness perception that don't contradict basic biological facts

The distinction is critical: without it, any criticism of science becomes "misinformation," and any doubt becomes "ignorance." The knowledge crisis is not an absence of critical thinking. It's the structural advantage of falsehood over truth in an information environment where algorithms, cognitive traps, and economic incentives work against accuracy.

Figure: Visualization of the three critical dimensions of the medical knowledge crisis. Content production speed, credibility validation mechanisms, and information dissemination channels together create a systemic advantage for misinformation over scientific data.

🧱Five Most Compelling Arguments That Experts Can Actually Unite Against Misinformation

Before analyzing the failures of academic communication, it's necessary to honestly examine the strongest arguments for the effectiveness of expert collaboration. Steelmanning — presenting the opposing position in its most convincing form — is critically important for avoiding straw man fallacies and understanding the real mechanisms of the problem. More details in the Thinking Tools section.

🔬 First Argument: The Cumulative Power of Expert Consensus

When multiple independent experts from different institutions reach the same conclusions, it creates a powerful epistemological signal. Historical examples show that scientific consensus ultimately prevails: germ theory displaced miasma theory, evolutionary biology became the foundation of modern medicine, the link between smoking and lung cancer was eventually acknowledged even by tobacco companies.

Expert collaboration amplifies this signal, making it discernible even through the noise of misinformation. Joint statements from professional associations, inter-university research consortia, and international scientific collaborations should theoretically create insurmountable authority.

📊 Second Argument: Resource Advantage of Academic Institutions

Universities and research centers possess enormous resources: access to data, computational power, expertise in research design, networks for rapid hypothesis testing. Expert attempts to unite against misinformation, as in initiatives combating COVID-19 myths, demonstrate the ability to quickly mobilize these resources (S001).

Coordinated efforts enable the creation of debunking databases, training of fact-checkers, development of educational materials, and execution of large-scale information campaigns. Theoretically, this resource advantage should be decisive against scattered misinformation producers.

🧪 Third Argument: Reproducibility as an Insurmountable Advantage

Scientific claims can be independently verified in different laboratories, by different researchers, using various methodologies. Pseudoscientific claims cannot withstand such scrutiny.

When experts unite for systematic verification of popular medical myths, they can demonstrate the non-reproducibility of false claims. Numerous independent studies have failed to reproduce the claimed effects of homeopathy, leading to changes in reimbursement policies in several countries. The cumulative force of negative replication results should demolish myths.

🛡️ Fourth Argument: Institutional Legitimacy and Regulatory Power

Academic experts don't just produce knowledge — they occupy positions in regulatory bodies, advise governments, and determine clinical protocols. When experts unite, their recommendations become official healthcare policy.

  1. Physicians follow clinical guidelines based on expert consensus
  2. Regulators approve drugs based on expert review
  3. Insurance companies reimburse procedures recognized as effective by the expert community
  4. Institutional power translates into control over medical practice

🧬 Fifth Argument: Long-Term Victory Through Next-Generation Education

Universities control the education of future physicians, researchers, journalists, and policymakers. Even if the current generation is susceptible to misinformation, systematic training in critical thinking, scientific methodology, and media literacy should create a cohort resistant to medical myths.

Expert collaboration in educational initiatives, as demonstrated by attempts to integrate digital literacy into medical curricula (S003), can transform the basic cognitive skills of the population. This is a long-term strategy, but potentially unbeatable. For more on information verification methods, see lateral reading and fundamentals of epistemology.

🔬What the Data Shows: Systematic Analysis of Expert Coalition Effectiveness Against Medical Misinformation

Moving from theoretical arguments to empirical data, we need to analyze what actually happens when experts attempt to unite against the medical knowledge crisis. More details in the Statistics and Probability Theory section.

📊 The COVID-19 Case: The Largest Natural Experiment in Expert Mobilization

The COVID-19 pandemic became an unprecedented test of the expert community's ability to counter misinformation. Research on New Zealand's discursive approach shows that even with strong political leadership and attempts to create a unified expert front ("Unite against COVID-19"), the communication strategy faced serious challenges (S001).

The key problem wasn't the absence of expert consensus on basic facts, but rather the inability of this consensus to compete with emotionally resonant alternative narratives in the digital space.

Analysis of pandemic policy revealed that the rapid shift to digital technologies created new vectors for misinformation spread that traditional academic institutions weren't prepared to address (S003). Universities possessed expertise in virology and epidemiology but lacked effective mechanisms for translating this expertise into the digital public sphere where mass beliefs were being formed.

🧪 Data from Other Areas: Dementia, Mental Health, Chronic Diseases

Expert attempts to unite against misinformation aren't limited to infectious diseases. The initiative to unite experts to strengthen dementia research demonstrates a different pattern: successful coordination in knowledge production doesn't automatically translate into effective public communication (S004).

  1. Experts collaborate effectively in research consortia
  2. Their findings don't reach broad audiences susceptible to myths about "natural" methods
  3. Conspiracy theories about the pharmaceutical industry spread in parallel

Research on the pandemic and global mental health shows that even when experts formulate clear recommendations, their implementation encounters barriers at the level of public perception and trust (S006). The problem isn't the quality of expertise, but the gap between expert knowledge and public understanding.

🧾 Meta-Analysis of the Knowledge Crisis: Structural Factors of Ineffectiveness

Comprehensive research on anthropogenic challenges provides a broader framework for understanding the problem (S007). The medical information crisis is part of a more general crisis of epistemological institutions in the digital age.

Traditional truth-establishment mechanisms:
  • Peer review, institutional certification
  • Academic reputation, methodological rigor
  • Acknowledgment of uncertainty, cautious formulations

Digital "credibility" establishment mechanisms:
  • Social validation, emotional resonance
  • Tribal identity, viral spread
  • Narrative simplicity, confirmation of existing beliefs

Expert coalitions, even when successful in producing quality knowledge, prove ineffective in the public sphere. Experts optimize communication for academic credibility criteria, while misinformation is optimized for viral spread criteria.

This isn't competition on equal terms—it's competition by different rules in different games. Misinformation operates in the ecosystem of social networks and emotional resonance; expert knowledge operates in the ecosystem of methodological rigor and institutional validation.

The gap between expert knowledge and public understanding, which misinformation exploits more effectively than scientific communication bridges it, requires rethinking the very mechanisms of knowledge translation. This isn't a problem of insufficient experts or their inability to unite—it's a problem of structural mismatch between how the modern information ecosystem works and how traditional institutions of knowledge production and dissemination operate.

Figure: The expert coalition paradox. High effectiveness in research coordination and production of consensus recommendations contrasts with low effectiveness in changing public beliefs and countering misinformation.

🧠Mechanisms of Failure: Why Academic Communication Structurally Loses to Disinformation

Understanding why experts cannot effectively unite against medical disinformation requires analyzing deep mechanisms, not surface symptoms. The problem isn't a lack of effort or competence—it's a fundamental mismatch between the nature of scientific knowledge and the demands of the modern information ecosystem. More details in the Mental Errors section.

🧬 Asymmetry of Cognitive Load: Why Lies Are Easier Than Truth

Scientific truth about medical questions is almost always more complex and less intuitive than attractive lies. Explaining why vaccines are safe requires understanding immunology, statistics of rare events, the difference between correlation and causation, the concept of herd immunity. The claim "vaccines are dangerous because they contain toxins" appeals to simple intuition: "poison in the body—bad."

This asymmetry of cognitive load creates an insurmountable advantage for disinformation in conditions of limited attention and cognitive resources of the audience (S001). When experts try to unite to debunk myths, they inevitably increase cognitive load: they add nuances, caveats, acknowledge areas of uncertainty.

Each caveat is perceived not as a sign of intellectual honesty, but as a sign of uncertainty or hidden motives. Disinformation has no such constraints—it can be absolutely categorical because it's not bound by the obligation to correspond to reality.

🔁 The Problem of Time Scales: Slow Science Versus Fast Memes

Scientific knowledge is produced slowly. Research goes through design, data collection, analysis, peer review, publication, replication—a process taking months or years. A medical myth can be created, spread, and embedded in mass consciousness within hours (S002).

When experts unite to respond to a new wave of disinformation, their response comes too late—the myth has already become part of certain groups' identity, and debunking is perceived as an attack on group belonging, not as providing information.

  1. Scientific understanding evolves as data accumulates (epistemologically correct)
  2. Publicly this looks like "experts constantly change their minds" (undermines trust)
  3. Disinformation remains consistent, not updating (paradoxically increases perceived reliability)

The COVID-19 pandemic dramatically demonstrated this problem (S005). Scientific understanding of the virus evolved correctly, but publicly it looked like inconsistency. Disinformation, not bound by the need to update according to new data, could remain consistent.

⚙️ Institutional Inertia: Why Universities Cannot Adapt Fast Enough

Academic institutions are optimized for producing reliable knowledge, not for rapid public communication. Incentive systems in universities reward publications in peer-reviewed journals, not viral social media posts. Career advancement depends on citations in academic literature, not on reach in the public sphere.

When experts try to unite to fight disinformation, they do so in their spare time, without institutional support, often risking their reputation in the eyes of colleagues who may perceive public communication as low-status "popularization."

  • Academic culture: caution, nuance, acknowledging limitations. Digital public sphere demands: categoricalness, simplicity, confidence. Result: the expert loses in public communication.
  • Academic culture: "The data suggest, with moderate confidence." Digital public sphere: "I know for sure." Result: the latter sounds more convincing.
  • Academic culture: long verification processes. Digital public sphere: instant dissemination. Result: truth arrives too late.

Universities cannot quickly adapt to the demands of the digital public sphere without losing their epistemological identity. This isn't a question of will or competence—it's a structural contradiction between the logic of knowledge production and the logic of its dissemination under conditions of information overload (S008).

Related materials: how social media turns attention into addiction, disinformation category, epistemology.

⚠️Cognitive Anatomy of Successful Medical Misinformation: Which Vulnerabilities It Exploits

To understand why uniting experts is insufficient, it's necessary to analyze which specific cognitive mechanisms make medical misinformation so effective. This is not accidental success—it's systematic exploitation of predictable features of human cognition. More details in the Epistemology section.

🧩 Availability Heuristic: Why Vivid Stories Beat Statistics

The human brain assesses the probability of events based on how easily examples can be recalled. One vivid story about a child who allegedly developed autism after vaccination psychologically weighs more than statistical data on millions of safe vaccinations.

Medical misinformation systematically exploits this heuristic by creating emotionally charged narratives with specific "victims" (S001). When experts respond with statistics and abstract data, they lose at the level of basic cognitive processing.

🕳️ Motivated Reasoning: Why People Defend Beliefs Rather Than Seek Truth

Cognitive science shows that people are not neutral information processors. When information threatens existing beliefs or group identity, defensive mechanisms activate.

Refuting a medical myth is perceived not as providing useful information, but as an attack on identity. If someone has publicly stated that vaccines are harmful, admitting error means a social cost—loss of status within their peer group.

Motivated reasoning drives people to find ways to discredit experts ("they're paid off by pharmaceutical companies") instead of revising beliefs (S002).

🧠 Backfire Effect: When Refutation Strengthens the Myth

Paradoxically, attempts to refute medical misinformation can strengthen it. Research shows that repeating a myth even in the context of refutation increases its familiarity, and familiarity is perceived as credibility.

  1. Large-scale refutation campaigns inadvertently increase the visibility of the myths themselves
  2. Aggressive refutation activates reactance—the psychological tendency to defend freedom of belief when it's perceived as being threatened
  3. The effect is amplified in information bubbles where people see only confirming content

🔁 Availability Cascades and Algorithmic Filtering

In the digital environment, information spreads through social networks, creating availability cascades—processes by which a belief becomes increasingly widespread simply because people observe others adopting it (S005).

  • Algorithmic curation: content is selected based on the user's previous behavior. Why experts lose: refutations aren't shown to those who believe the myths, because algorithms optimize for engagement, not truth.
  • Information bubbles: people see predominantly information that confirms their beliefs. Why experts lose: expert messages don't reach the target audience that needs them most.
  • Social proof: visibility of a belief in the network is perceived as its validity. Why experts lose: misinformation spreads faster due to emotional charge, not accuracy.

When experts attempt to unite to disseminate accurate information, they confront the architecture of the platform itself, which works against them. This is not a question of insufficient coordination—it's a question of incompatibility between the logic of expert knowledge and the logic of digital information distribution systems.
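The cascade dynamics described above can be made concrete in a toy simulation. This is an illustrative model, not one drawn from the article's sources: the agent count, the uniform "private doubt" draw, and the `emotional_boost` multiplier are all assumptions chosen only to show how a transmission advantage, not accuracy, decides the outcome.

```python
import random

def simulate_cascade(n_agents=1000, initial_shares=5, emotional_boost=2.0, seed=42):
    """Toy availability-cascade model: each agent weighs the fraction of
    earlier agents who shared a claim (social proof) against a private
    skepticism draw. `emotional_boost` multiplies the social signal,
    standing in for the engagement advantage of emotionally charged content."""
    random.seed(seed)
    shares = initial_shares
    for i in range(initial_shares + 1, n_agents + 1):
        private_doubt = random.random()   # uniform in [0, 1): the agent's skepticism
        social_proof = shares / i         # fraction of prior agents who shared
        if social_proof * emotional_boost > private_doubt:
            shares += 1                   # agent adopts and re-shares the claim
    return shares / n_agents              # final adoption rate

# Same agents, same starting shares: only the emotional boost differs.
neutral = simulate_cascade(emotional_boost=1.0)
charged = simulate_cascade(emotional_boost=2.0)
print(f"neutral content: {neutral:.2f}, emotionally charged: {charged:.2f}")
```

In this sketch the boosted run saturates at full adoption, because doubled social proof outweighs any possible doubt once a handful of initial sharers exist; the neutral run's outcome depends on the random draws. The structural point matches the section: the platform multiplier, not the claim's truth value, determines whether the cascade completes.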

🛡️Cognitive Self-Defense Protocol: Seven Questions That Dismantle Medical Misinformation in 90 Seconds

Recognizing the structural limitations of expert consensus, we can develop an individual verification tool. It requires no advanced degrees, only consistency. More details in the Esoterica and Occultism section.

Misinformation operates on three levels: emotional capture, social proof, and cognitive overload (S001). The protocol below breaks each one.

  1. Who benefits? Identify the beneficiary — financial, reputational, political. If the gain is vague or hidden, that's a red flag.
  2. Where's the evidence? Demand the primary source, not a retelling. If the link leads to another retelling — the chain is broken.
  3. Why now? Check whether publication coincides with a crisis, fear, or information vacuum. Misinformation is a parasite on uncertainty.
  4. Who's saying this? An expert in this field or a popularizer? Is there a conflict of interest? Source verification methods are a standard tool.
  5. What's omitted? Which counterarguments, limitations, alternative explanations aren't mentioned? Completeness is a marker of honesty.
  6. How can this be tested? Can the claim be falsified? If not — it's not science, it's dogma.
  7. What am I feeling? Anger, fear, urgency? Emotion is a signal that cognitive defenses are down. A 24-hour pause is minimum.
Misinformation doesn't require perfection — it only requires that doubt outlive truth. The protocol inverts this: truth is verified in 90 seconds, doubt is resolved by fact.

Lateral reading is a professional method that accelerates this process. Open three tabs: the original text, a search for the author, a search for the opposing view. Parallel comparison reveals manipulation in minutes.

Information overload is not a system bug, but its mechanism (S008). The protocol works precisely because it reduces cognitive load to seven questions. Everything else is noise.
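The seven questions translate naturally into a checklist. The sketch below is a minimal illustration under stated assumptions: the flag names, the example claim, and the idea of counting raised flags are invented for the demo; the count is not a validated score, only a prompt to slow down before sharing.

```python
# The seven protocol questions as boolean red flags (names are illustrative).
PROTOCOL = [
    ("beneficiary_hidden", "1. Who benefits? The gain is vague or hidden"),
    ("no_primary_source",  "2. Where's the evidence? No primary source, only retellings"),
    ("crisis_timing",      "3. Why now? Published amid fear or an information vacuum"),
    ("conflicted_speaker", "4. Who's saying this? Outside their field, or conflicted"),
    ("omissions",          "5. What's omitted? No limitations or counterarguments"),
    ("unfalsifiable",      "6. How can this be tested? The claim cannot be falsified"),
    ("emotional_trigger",  "7. What am I feeling? Anger, fear, or urgency"),
]

def red_flags(features: dict) -> list:
    """Return the protocol questions a claim fails, given boolean features."""
    return [label for key, label in PROTOCOL if features.get(key, False)]

# Hypothetical claim: an urgent post with no primary source and an unfalsifiable core.
claim = {"no_primary_source": True, "unfalsifiable": True, "emotional_trigger": True}
flags = red_flags(claim)
print(f"{len(flags)} of 7 red flags raised:")
for f in flags:
    print(" -", f)
```

Running the seven questions as an explicit list is the whole trick: it externalizes the check so that emotion (question 7) cannot silently skip the other six.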

⚔️ Counter-Position Analysis

⚖️ Critical Counterpoint

The article's position relies on the assumption that disinformation is primarily a problem of communication and cognitive errors. However, this logic has blind spots that are worth examining honestly.

Overestimating Psychology, Ignoring Structure

The article may create the impression that the problem of disinformation is primarily a psychological problem of individuals, ignoring structural factors: underfunding of science, commercialization of medicine, real failures of medical institutions (the opioid crisis, pharmaceutical company scandals). These facts create rational grounds for distrust, not just cognitive biases.

Defending the Status Quo Under the Guise of Expertise

Criticism of "alternative experts" may be perceived as defending the status quo and ignoring the fact that official medicine does indeed sometimes make mistakes, is slow to acknowledge errors, and has conflicts of interest (ties to the pharmaceutical industry). Some "alternative" approaches—for example, criticism of excessive medicalization—have a rational core.

The Illusion of Certainty in Conditions of Uncertainty

The article may create the impression that "proper" scientific communication will solve the problem, but in reality many medical questions are objectively complex, data is contradictory, and expert consensus changes (masks, COVID-19 origins, effectiveness of interventions). Honest communication should reflect this uncertainty, not hide it.

Technocratic Optimism Without Accounting for Social Roots

The proposed solutions (fact-checking, media literacy, communication reform) may be insufficient against the deep social causes of distrust—inequality, political polarization, crisis of institutions. Without addressing these factors, informational interventions will remain superficial.

The Risk of Censorship Under the Mask of Protection

Strengthening control over medical information may lead to suppression of legitimate criticism and discussion, especially if decisions about what constitutes "disinformation" are made opaquely or under the influence of commercial and political interests. History shows that such mechanisms are often used against inconvenient questions.

Frequently Asked Questions

What is the knowledge crisis in medicine?

It's a systemic inability of society to distinguish verified medical data from misinformation. The knowledge crisis occurs when the volume of false health information grows faster than institutions' ability to refute it, leading to erosion of trust in evidence-based medicine and a rise in dangerous self-treatment practices (S007). The problem is compounded by the fact that traditional academic communication channels are too slow and formal to counter the viral spread of myths on social media.

Why do universities fail to counter medical misinformation?
Because they use outdated communication models that don't account for audience cognitive biases. Research shows that direct myth "debunking" often triggers a backfire effect—people with preexisting beliefs start believing the myth even more strongly (S007). Universities and scientific institutions rely on publishing articles and press releases that don't reach target audiences and lose to emotionally charged content on social media. Additionally, academic culture avoids simplification, making expert explanations inaccessible to the general public.

Which cognitive mechanisms make medical myths so effective?
The primary mechanisms are confirmation bias, availability heuristic, and emotional contagion. People tend to seek and remember information that confirms their existing beliefs, especially if it triggers strong emotions—fear, hope, anger (S007). Medical myths often exploit evolutionarily ancient survival triggers: "natural = safe," "elite conspiracy," "miraculous healing." Academic information, by contrast, is emotionally neutral and requires cognitive effort to understand, making it less "sticky" in memory.

How did the COVID-19 pandemic change the misinformation problem?
It transformed misinformation into an infodemic—a parallel epidemic of false information spreading faster than the virus itself. During the pandemic, the volume of medical content grew exponentially, but much of it was unverified, contradictory, or outright false (S003, S006). Research shows that even government communication strategies, such as Jacinda Ardern's "Unite against COVID-19" approach, faced problems with polarization and distrust of official sources (S001). The pandemic exposed the structural unpreparedness of the scientific community for rapid response to information threats.

What is the backfire effect?
It's the paradoxical strengthening of belief in a myth after an attempt to refute it. When experts try to "debunk" a medical myth, they often repeat its formulation, which reinforces its memorability (S007). Moreover, if the refutation is perceived as an attack on a person's identity or values, psychological defense mechanisms activate, and the person begins seeking counterarguments that reinforce their original position. The effect is especially strong in polarized communities, where accepting the "official" version means betraying one's group.

Why are academic institutions losing the information war?
Because they're optimized for knowledge production, not for its defense and popularization. The academic system rewards publications in peer-reviewed journals, not effective communication with the general public (S007). Universities respond slowly to information threats due to bureaucracy and a culture of caution. Additionally, many scientists lack science communication skills and avoid public debates, fearing reputational risks. As a result, the information space fills with voices of pseudo-experts who speak more simply, emotionally, and persuasively.

Which sources of medical misinformation are the most dangerous?
The most dangerous are sources masquerading as scientific—pseudo-journals, "alternative experts" with real academic degrees, and commercial platforms using scientific rhetoric. They exploit trust in formal attributes of expertise (titles, publications, lab coats) while distorting data or presenting correlations as causal relationships (S007). Particularly dangerous are sources with conflicts of interest—supplement manufacturers, "wellness gurus" with commercial programs, media personalities monetizing audience fears. They create an illusion of scientific validity, making their misinformation more convincing than outright conspiracy theories.

Can medical misinformation be completely eradicated?
Complete eradication—no, but cognitive immunity can be built at individual and systemic levels. Misinformation isn't a virus that can be eliminated with a vaccine, but a constant ecological threat requiring adaptive defense (S007). At the individual level, this means developing critical thinking skills, source verification, and recognizing cognitive biases. At the systemic level—reforming scientific communication, creating rapid fact-checking mechanisms, integrating media literacy into education, and changing social media algorithms that currently amplify emotional content regardless of its accuracy.

How can you quickly check a medical claim for signs of misinformation?
Use the three-question protocol: 1) Who is the source and do they have a conflict of interest? 2) Is the claim supported by independent studies in peer-reviewed journals? 3) Does the source use emotional triggers (fear, miracle, conspiracy) instead of data? If the source is selling a product related to the claim, that's a red flag (S007). If the claim contradicts the consensus of major medical organizations (WHO, CDC, national academies of science), extraordinarily strong evidence is required. If the text appeals to "hidden truth" or "what doctors don't want you to know," that's a classic marker of conspiratorial misinformation.

What role do social media play in spreading medical misinformation?
They act as accelerators and amplifiers of misinformation through algorithms optimized for engagement, not accuracy. Social media algorithms promote content that triggers strong emotions and prolonged interaction, automatically giving advantage to shocking, provocative, and simplified claims over nuanced scientific explanations (S003, S007). Additionally, social media create echo chambers—closed information bubbles where people see only content confirming their beliefs. This transforms medical misinformation from scattered myths into coherent alternative knowledge systems with their own "experts," "research," and communities.

What is an infodemic?
An infodemic is an excessive amount of information (both accurate and false) that makes it impossible to find reliable sources and navigate a topic. The term was introduced by the WHO during the COVID-19 pandemic to describe situations where the volume of contradictory data about the virus, treatment, and prevention exceeded people's ability to process it (S006). Unlike classic disinformation (deliberate lies), an infodemic also includes misinformation (unintentional errors), outdated data, premature conclusions from preprints, and contradictions between experts. The result is decision-making paralysis and growing distrust of all sources, including reliable ones.

Why do people trust alternative experts more than official medicine?
Because alternative experts offer simple answers, emotional support, and a sense of control in situations of uncertainty. Official medicine often says 'we don't know,' 'it's complicated,' 'more research is needed,' which triggers anxiety and frustration (S007). Alternative experts, by contrast, make categorical statements, promise quick solutions, and create an illusion of understanding. Additionally, they use a 'little guy versus the system' narrative that resonates with distrust of institutions. For many people, emotional credibility (the source seems sincere, caring, 'like me') matters more than epistemic credibility (the source relies on verified data).
Deymond Laplasa
Cognitive Security Researcher

Author of the Cognitive Immunology Hub project. Researches mechanisms of disinformation, pseudoscience, and cognitive biases. All materials are based on peer-reviewed sources.
// SOURCES
[01] Using social and behavioural science to support COVID-19 pandemic response
[02] Information Pollution as Social Harm: Investigating the Digital Drift of Medical Misinformation in a Time of Crisis
[03] Psychosocial and Socio-Economic Crisis in Bangladesh Due to COVID-19 Pandemic: A Perception-Based Assessment
[04] Opioid Epidemic in the United States
[05] Misinformation of COVID-19 on the Internet: Infodemiology Study
[06] Student Attitudes Towards Online Education during the COVID-19 Viral Outbreak of 2020: Distance Learning in a Time of Social Distance
[07] Psychological health during the coronavirus disease 2019 pandemic outbreak
[08] Information overload and fake news sharing: A transactional stress perspective exploring the mitigating role of consumers’ resilience during COVID-19
