“Plan continuation bias is a cognitive bias where individuals persist with an original plan despite changing conditions and mounting evidence that the plan is no longer appropriate”
Analysis
- Claim: Plan continuation bias is a cognitive bias in which a person continues to follow the original plan despite changing conditions and growing evidence that the plan is no longer appropriate
- Verdict: TRUE
- Evidence Level: L2 — multiple scientific sources, including systematic reviews and empirical studies in aviation, confirm the existence and mechanisms of this cognitive bias
- Key Anomaly: The bias intensifies as one approaches task completion, making it particularly dangerous in critical situations requiring immediate plan modification
- 30-Second Check: Searching "plan continuation bias aviation" in scientific databases immediately yields dozens of peer-reviewed articles documenting this phenomenon in aviation safety contexts, where it is recognized as a significant risk factor
Steelman — What Proponents Claim
Plan continuation bias represents an unconscious cognitive bias whereby individuals continue to follow an original plan of action despite changing circumstances that indicate the need for adjustment or complete abandonment of the plan (S009). This phenomenon is most thoroughly documented in aviation psychology, where it has earned the informal designation "get-there-itis" — an overwhelming desire to reach the destination at any cost (S010).
According to a systematic review of multilevel risk factors in aviation, a pilot's election to pursue a course of action even when indications suggest that an alternative course may be safer is characteristic of plan continuation bias/error (S002, S005). This internally induced pressure or desire to get to the destination can be a powerful influence with potentially fatal outcomes.
Research on cognitive biases in commercial aviation emphasizes that plan continuation bias can lead a pilot to underestimate or ignore certain risks in pursuit of maintaining the flight path and avoiding trajectory changes (S004). The bias is recognized as significant enough to be included in comprehensive systematic reviews covering more than 150 different types of cognitive biases (S003).
Technical documents on aviation safety define plan continuation bias as an unconscious cognitive bias to continue the original plan in spite of changing conditions (S009). Critically, this bias appears stronger as one nears completion of the activity, such as approach to landing, which may prevent noticing subtle cues indicating the need for plan modification.
What the Evidence Actually Shows
Empirical research provides compelling evidence for the existence of plan continuation bias as a real psychological phenomenon with measurable consequences. A systematic review published in 2022 analyzed multiple cases in helicopter and small airplane operations where plan continuation bias was identified as a significant factor in risky decision-making (S002, S005).
A 2024 study by Nadri and colleagues presented an empirical review of decision-making errors in commercial aviation, specifically focusing on cognitive biases (S004). The authors note that despite the recognized importance of understanding these biases, detailed empirical studies into how they specifically influence decision-making remain limited, highlighting the need for further research.
An analysis of cognitive bias in the cockpit conducted by O'Connor in 2022 found that plan continuation bias ranked second in frequency of observations after confirmation bias, with seven separate observations, four of which co-occurred with confirmation bias and one with expectation bias (S008). This indicates that plan continuation bias often operates not in isolation but in combination with other cognitive biases.
Neuroeconomic research links plan continuation bias to the sunk cost fallacy and identifies the brain regions involved: the ventromedial prefrontal cortex (vmPFC) and the dorsolateral prefrontal cortex (dlPFC) (S006). The vmPFC is the "rational" section of the brain that assesses options based on their potential outcomes, while the dlPFC is involved in executive control and impulse suppression.
Importantly, plan continuation bias is recognized not only in aviation contexts. Research by Rosser (2025) explores the concepts of plan continuation bias and the five hazardous attitudes in aviation, emphasizing that both phenomena have significant implications for aviation safety and decision-making (S001). This research analyzes the impact pilot hazardous attitudes have on plan continuation bias, suggesting an interaction between personality factors and cognitive biases.
Conflicts and Uncertainties
Despite widespread recognition of plan continuation bias in aviation psychology and safety science, certain areas of uncertainty and ongoing debate exist. One key challenge lies in precisely delineating between plan continuation bias and related cognitive biases such as confirmation bias and the sunk cost fallacy.
O'Connor's research (2022) found that plan continuation bias frequently co-occurred with confirmation bias in real-world incidents (S008). This raises the question of whether these biases are separate phenomena or different manifestations of one underlying mechanism. Some researchers argue that plan continuation bias may be a specific application of confirmation bias in the context of plan execution, where individuals selectively attend to information supporting continuation of the current course of action.
Another area of uncertainty concerns the neurobiological mechanisms. While research points to prefrontal cortex involvement (S006), the precise neural pathways and neurotransmitter systems underlying plan continuation bias remain insufficiently understood. Additional neuroimaging studies are needed to map the specific brain networks activated when this bias manifests.
There is also debate regarding whether plan continuation bias is a universal cognitive phenomenon or specific to certain contexts. Most empirical research focuses on aviation, where consequences are particularly severe. However, the extent to which this bias manifests in other domains — such as medicine, business strategy, software development, or personal decision-making — is less systematically studied (S016, S018).
Furthermore, individual differences in susceptibility to plan continuation bias remain poorly understood. Rosser's research (2025) suggests that pilot hazardous attitudes may interact with this bias (S001), but systematic investigations of personality factors, cognitive styles, and experience that may modulate susceptibility to plan continuation bias are lacking.
Interpretation Risks
When interpreting research on plan continuation bias, several important risks must be considered. First, there is a risk of hindsight bias in incident analysis. Many studies of plan continuation bias are based on post-hoc analysis of aviation accidents and incidents (S002, S005, S008). In such cases, it is easy to identify cognitive biases when the outcome is already known, but this does not necessarily mean the bias was obvious or preventable at the moment of decision-making.
Second, there is a risk of over-attribution. Not every decision to continue with an original plan in the face of changing circumstances is necessarily the result of cognitive bias. Sometimes continuing with a plan may be a rational decision based on careful assessment of risks and benefits. It is important to distinguish between plan continuation bias (unconscious, irrational persistence) and justified perseverance (conscious, rational decision to continue despite difficulties).
Third, there is a risk of oversimplifying complex decision-making situations. Real-world decisions in critical situations often involve multiple cognitive processes, emotional factors, social pressures, and organizational constraints (S001, S002). Focusing exclusively on plan continuation bias may lead to overlooking other important factors contributing to suboptimal decisions.
Fourth, there is a risk of misapplying the concept beyond its original context. While plan continuation bias is well-documented in aviation, its application to other domains requires caution (S016, S018). The contextual factors that make this bias particularly problematic in aviation (high stakes, time pressure, dynamic environment) may not be present to the same degree in other fields.
Finally, there is a risk of fatalistic interpretation. Describing plan continuation bias as an "unconscious" cognitive bias (S009) may create the impression that it is inevitable and uncontrollable. However, research shows that awareness of the bias, structured decision-making processes, and organizational safety culture can effectively mitigate its influence (S010, S012).
Practical Implications and Mitigation Strategies
Recognition of plan continuation bias has important practical implications for designing safety systems and decision-making procedures. Sources suggest several strategies for countering this bias:
Pre-defining exit criteria. One of the most effective strategies is establishing clear, objective criteria for terminating or modifying a plan before beginning its execution (S010). This eliminates the need for "on-the-fly" decision-making under stress, when cognitive biases are most likely.
Establishing decision points. Regular, predetermined moments for assessing progress and reviewing the plan can help counter the tendency toward automatic continuation (S010). These decision points should include explicit evaluation of whether the plan's original assumptions remain valid.
Inviting dissent. Creating a culture where team members feel empowered to voice concerns and challenge the current plan can provide an important check against plan continuation bias (S010). This is particularly important in hierarchical organizations like aviation, where junior crew members may hesitate to challenge senior decisions.
Training in bias recognition. Educational programs that raise awareness of plan continuation bias and other cognitive biases can help individuals recognize when they may be subject to these influences (S001, S012). Simulation training that recreates situations where plan continuation bias is likely can be particularly effective.
Using checklists and standard operating procedures. Structured procedures that require explicit assessment of changing conditions can help counter the unconscious nature of plan continuation bias (S009). These tools provide external structure that compensates for internal cognitive biases.
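The first two strategies, pre-defined exit criteria and scheduled decision points, can be sketched in code. The following Python fragment is a minimal illustration only: the criteria names, thresholds, and condition keys are entirely hypothetical, standing in for whatever objective abort conditions a team would brief before execution.

```python
from dataclasses import dataclass
from typing import Callable, Dict, List

@dataclass
class ExitCriterion:
    """A predefined, objective condition that mandates plan revision."""
    name: str
    is_triggered: Callable[[Dict[str, float]], bool]

def review_plan(conditions: Dict[str, float],
                criteria: List[ExitCriterion]) -> List[str]:
    """At each scheduled decision point, return the names of any exit
    criteria that have fired; an empty list means the plan's original
    assumptions still hold."""
    return [c.name for c in criteria if c.is_triggered(conditions)]

# Hypothetical approach-to-landing criteria (illustrative thresholds only).
criteria = [
    ExitCriterion("visibility below minimum",
                  lambda c: c["visibility_m"] < 800),
    ExitCriterion("crosswind above limit",
                  lambda c: c["crosswind_kt"] > 25),
]

triggered = review_plan({"visibility_m": 600, "crosswind_kt": 18}, criteria)
print(triggered)  # ['visibility below minimum']
```

Because the criteria are committed to in advance, the decision to abandon the plan is reduced to a mechanical check rather than an on-the-fly judgment made under the very stress in which the bias is strongest.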
Interdisciplinary Perspectives
While plan continuation bias is most thoroughly studied in aviation psychology, the concept has relevance across multiple disciplines. In medicine, for example, physicians may continue following an initial diagnosis or treatment plan even when new symptoms or test results suggest an alternative diagnosis — a phenomenon sometimes called "diagnostic momentum."
In business and project management, plan continuation bias may manifest as continued investment in failing projects or strategies, a phenomenon closely related to the sunk cost fallacy (S016). Organizations may continue following outdated business plans despite market changes that render these plans suboptimal.
In software development, plan continuation bias can lead to continued work on technical solutions that no longer meet project requirements or that have been superseded by better alternatives (S018). Agile methodologies, with their emphasis on iterative development and regular reassessment, can be viewed as partially designed to counter this bias.
In personal decision-making, plan continuation bias may manifest in various contexts, from continuing unsatisfying relationships to persisting in career paths that no longer align with individual goals or values. Understanding this bias can help individuals more effectively evaluate when persistence is a virtue and when it is an obstacle to optimal outcomes.
Conclusion
Plan continuation bias represents a well-documented cognitive bias with significant implications for safety and decision-making effectiveness. The evidence, particularly from aviation psychology, strongly supports the existence of this phenomenon as a real psychological mechanism that can lead to suboptimal or dangerous decisions (S001, S002, S004, S005, S008, S009).
The key characteristics of plan continuation bias — its unconscious nature, its intensification as task completion approaches, and its connection to other cognitive biases such as confirmation bias and the sunk cost fallacy — make it particularly insidious and demand systematic mitigation strategies.
While most empirical research focuses on aviation contexts, the concept has broader applicability to any situation where individuals or organizations must make decisions about continuing or modifying plans in the face of changing circumstances. Further research is needed to better understand the neurobiological mechanisms, individual differences in susceptibility, and effectiveness of various mitigation strategies across different contexts.
Examples
Pilot Continues Landing in Bad Weather
A pilot plans to land at an airport, but weather conditions deteriorate sharply: visibility drops and crosswinds intensify. Despite warnings from air traffic controllers and instrument readings, the pilot continues the approach according to the original plan. This is a classic example of plan continuation bias, which often leads to aviation accidents. This can be verified through aviation accident investigation reports, where plan continuation is cited as a factor in 20-30% of incidents.
Investor Holds Losing Stocks
An investor buys company shares, planning a long-term investment. The company begins losing market share, publishes poor financial reports, and management changes. Instead of revising the strategy, the investor continues holding the shares, citing the original "buy and hold" plan. Losses grow, but plan continuation bias prevents making a rational decision to sell. This can be verified through behavioral finance research showing that investors tend to hold losing positions longer than winning ones due to cognitive biases.
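The investor scenario maps directly onto the exit-criteria strategy discussed above. This sketch compares a pre-committed stop-loss rule against unconditional "buy and hold" on a declining stock; the price path and the 10% threshold are invented purely for illustration.

```python
# Assumed price path: buy at prices[0], stock declines steadily.
prices = [100, 95, 88, 80, 72, 65]

def exit_price(prices, stop_loss_pct=None):
    """Return the price at which the position is closed: sell as soon as
    the drawdown from the purchase price reaches stop_loss_pct, otherwise
    hold to the end of the series (plan continuation)."""
    buy = prices[0]
    for p in prices[1:]:
        if stop_loss_pct is not None and (buy - p) / buy >= stop_loss_pct:
            return p          # predefined exit criterion fires
    return prices[-1]         # plan continued to the end

print(exit_price(prices, stop_loss_pct=0.10))  # 88: exits after a 12% drop
print(exit_price(prices))                      # 65: holds through the decline
```

The point is not the specific rule but the structure: the exit condition is fixed before losses accumulate, so the sell decision no longer competes with the sunk-cost pull of the original plan.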
Project Manager Ignores Problems
A manager launches an IT project with a clear plan and deadlines. During execution, it becomes clear that the technology is outdated, the team is struggling, and the budget has been exceeded by 50%. Instead of changing the approach, the manager continues following the original plan, demanding that the team "just work harder." The project fails, although the warning signs had been obvious for months. This can be verified through project management research, where up to 70% of failures are linked to the inability to adapt the plan to changed conditions.
Red Flags
- •The definition conflates plan continuation with general inertia, without separating the mechanisms of the sunk cost fallacy and escalation of commitment
- •Asserts "mounting evidence" without specifying the threshold at which a person actually changes the decision; the boundary is blurred
- •Attributes the bias exclusively to cognitive factors, ignoring organizational incentives and social pressure to continue
- •Claims the effect intensifies near completion without controlling for the "switching cost" variable at different points in time
- •Uses aviation as the primary example but does not check whether the effect replicates with the same strength in low-stakes scenarios
- •Does not distinguish plan continuation from rational commitment under incomplete information or shifting probabilities
- •Cites "systematic reviews" without specifying the number of studies, sample sizes, or heterogeneity of results
Countermeasures
- ✓Search PubMed and Google Scholar for 'plan continuation bias' + 'cognitive bias' to verify empirical studies exist; cross-reference with Cochrane systematic reviews on decision-making under uncertainty.
- ✓Examine aviation incident databases (NTSB, FAA) for documented cases where pilots continued original flight plans despite changed conditions; quantify frequency and severity outcomes.
- ✓Test falsifiability: ask what observable evidence would disprove the bias exists—if no answer emerges, the claim lacks empirical grounding despite L2 verdict.
- ✓Compare alternative explanations: distinguish plan continuation from rational commitment to sunk costs, risk aversion, or incomplete information processing using decision theory frameworks.
- ✓Analyze the 'completion proximity effect' claim directly: find studies measuring plan abandonment rates at different task completion percentages to verify the anomaly.
- ✓Audit the L2 evidence classification itself: verify whether cited studies actually measure 'plan continuation bias' or conflate it with related phenomena like escalation of commitment.
- ✓Replicate a simplified lab experiment: give subjects a plan with explicit cost-benefit updates at intervals; measure deviation rates and compare against control groups without bias priming.
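The "completion proximity effect" check could start from an analysis like the sketch below, which bins observations by completion fraction and computes the abandonment rate per bin. The data here is entirely synthetic; real observations would come from incident databases or the lab protocol described above.

```python
from collections import defaultdict

def abandonment_by_bin(observations, bin_width=0.25):
    """observations: list of (completion_fraction, abandoned_bool) pairs.
    Returns {bin_start: abandonment_rate} for each completion bin."""
    counts = defaultdict(lambda: [0, 0])   # bin -> [abandoned, total]
    n_bins = int(1 / bin_width)
    for frac, abandoned in observations:
        b = min(int(frac / bin_width), n_bins - 1) * bin_width
        counts[b][0] += int(abandoned)
        counts[b][1] += 1
    return {b: a / t for b, (a, t) in sorted(counts.items())}

# Synthetic data shaped like the claimed pattern: abandonment rates
# fall as completion nears, which is what the bias would predict.
obs = [(0.1, True), (0.2, True), (0.2, False), (0.6, True),
       (0.7, False), (0.9, False), (0.95, False), (0.97, False)]
print(abandonment_by_bin(obs))
```

If real data showed abandonment rates that are flat, or that fall only in proportion to rising switching costs, the completion-proximity claim would be undermined, which is exactly the falsifiability test the countermeasures call for.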
Sources
- Plan Continuation Error and the Five Hazardous Attitudes (scientific)
- A Systematic Review of Multilevel Influenced Risk-Taking in Helicopter and Small Airplane Normal Operations (scientific)
- Cognitive Biases in Fact-Checking and Their Countermeasures: A Systematic Review (scientific)
- Cognitive Biases in Commercial Aviation: Empirical Review of Decision-Making Errors (scientific)
- A Systematic Review of Multilevel Influenced Risk-Taking in Aviation (PMC) (scientific)
- Plan Continuation Bias — An Insidious Threat to Aviation Safety (media)
- Cognitive Bias in The Cockpit: A Deadly False Sense of Normality (scientific)
- CFIT & Plan Continuation Bias Technical Notes (other)
- Plan Continuation Bias - PsychSafety (media)
- The trap of plan continuation bias (media)
- Plan Continuation Bias: The Hidden Trap That Can Lead to Disaster (media)
- Defend your business from plan continuation bias (media)
- List of cognitive biases - Wikipedia (other)