The preparedness paradox is the proposition that if a society or individual acts effectively to mitigate a potential disaster such as a pandemic, natural disaster, or other catastrophe so that it causes less harm, the avoided danger will be perceived as having been much less serious because of the limited damage actually caused. The paradox is the incorrect perception that there had been no need for careful preparation, as there was little harm, although in reality the limitation of the harm was due to preparation. Several cognitive biases can consequently hamper proper preparation for future risks.


Background

The term "preparedness paradox" has been used occasionally since at least 1949 in different contexts, usually in the military and financial system. The term regained traction in reference to the
Covid-19 pandemic The COVID-19 pandemic, also known as the coronavirus pandemic, is an ongoing global pandemic of coronavirus disease 2019 (COVID-19) caused by severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2). The novel virus was first identif ...
and to the overall government response worldwide. Another notable citation of the term was in 2017 by
Roland Berger Roland Berger (born 22 November 1937) is a German entrepreneur, consultant and philanthropist. Life Roland Berger was born in Berlin in 1937 as Robert Altmann; his family name changed later, after his father, Georg L. Berger, married his mot ...
regarding executives in the aerospace and defense industry: almost two thirds of those surveyed reported that they were well-prepared for geopolitical changes, about which they could do nothing, while feeling unprepared in areas such as changes in technology and innovation, to which they should be much more able to respond. In contrast, other surveys found that boards and financial professionals were increasingly concerned about geopolitical risk. Berger concluded that there was an urgent need for more and better business strategies throughout industry to close this gap in preparedness.


Cognitive biases

Organisms with faster life histories and shorter lives are disproportionately affected by chaotic or hostile environments. These types of organisms innately have a greater fear of environmental disasters or emergencies. However, organisms with slower life histories, such as humans, may have less urgency in dealing with these types of events. Instead, they have more time and ability to prepare for such emergencies.
Cognitive biases play a large role in this lack of urgency, hampering efforts to prevent disasters. They include over-optimism, in which the degree of disaster is underestimated, compounded by the fact that many disasters do not reach their breaking point until it is too late to take action. Under over-optimism and normalcy bias, people believe that disasters will happen elsewhere, and that even if they do happen locally, only their neighbors will be affected. Another obstacle to preparedness is the interval between disasters: when there is a long time between disasters, there is less urgency to prepare, because fewer people remember the last disaster and its emotional impact on the group fades. This effect is heightened when some measure of action is taken to prevent the disaster, which further reduces the memory of the original danger and its consequences. Financial concerns can also contribute to the preparedness paradox. There is a tendency to over-value known short-term costs and to under-value unknown long-term rewards. Because preparing for disasters is expensive in the short term and its long-term value cannot be determined in advance, choosing not to prepare can lead to catastrophic consequences.
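This trade-off can be sketched as a simple expected-value comparison; the figures below are purely hypothetical and chosen only to illustrate the asymmetry between a certain, visible cost and an uncertain, largely invisible benefit. If preparation has a known immediate cost C, while the disaster it would mitigate has probability p and an uncertain loss D, then preparing pays off in expectation whenever

\[ C < p \cdot D . \]

For example, with assumed values C = 1, p = 0.01 and D = 500 (in arbitrary units), the expected loss p \cdot D = 5 exceeds the cost of preparing; yet the cost is certain and immediately visible while the averted loss is probabilistic and unseen, which is the asymmetry that tips decisions against preparation.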


Examples

Preparing for a pandemic is a particularly clear example of the preparedness paradox. Because adequate preparation means that mass deaths and other visible consequences do not occur, there is little apparent evidence that the preparation was necessary. Historical perspective can also contribute to the preparedness paradox: viewed after the fact, the preventative action taken for the Year 2000 problem has been described as an "overreaction" rather than as a successful effort to prepare for an upcoming problem.


See also

* Cascade effect
* Prevention paradox
* Survivorship bias
* False positive paradox
* Tragedy of the commons


References


External links


The Ostrich Paradox: Why We Underprepare for Disasters