An availability cascade is a self-reinforcing cycle that explains the development of certain kinds of collective beliefs. A novel idea or insight, usually one that seems to explain a complex process in a simple or straightforward manner, gains rapid currency in popular discourse by its very simplicity and by its apparent insightfulness. Its rising popularity triggers a chain reaction within the social network: individuals adopt the new insight because other people within the network have adopted it and because, on its face, it seems plausible. This spread is driven both by the availability of the previously obscure term or idea and by individuals' need to appear current with the stated beliefs and ideas of others, regardless of whether they in fact fully believe the idea they are expressing. Their need for social acceptance, and the apparent sophistication of the new insight, overwhelm their critical thinking.
The idea of the availability cascade was first developed by Timur Kuran and Cass Sunstein as a variation of information cascades mediated by the availability heuristic, with the addition of reputational cascades.[1] The availability cascade concept has been highly influential in finance theory and regulatory research, particularly with respect to assessing and regulating risk.
Availability cascades occur in a society via public discourse (e.g. the public sphere and the news media) or over social networks—sets of linked actors in one or more of several roles. These actors process incoming information to form their private beliefs according to various rules, both rational and semi-rational. The semi-rational rules include the heuristics, in particular the availability heuristic. The actors then behave and express their public beliefs according to self-interest, which might cause their publicly expressed beliefs to deviate from their privately held beliefs.
Kuran and Sunstein emphasize the role of availability entrepreneurs, agents willing to invest resources into promoting a belief in order to derive some personal benefit. Other availability entrepreneurs with opposing interests may wage availability counter-campaigns. Other key roles include journalists and politicians, both of whom are subject to economic and reputational pressures, the former in competition in the media, the latter for political status. As resources (e.g. attention and money) are limited, beliefs compete with one another in the "availability market". A given incident and subsequent availability campaign may succeed in raising the availability of one issue at the expense of other issues.[1]
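The social dynamic described above resembles a threshold model of collective behavior. The following toy sketch is an illustrative assumption, not part of Kuran and Sunstein's formal analysis: each actor publicly adopts a belief once the visible share of adopters exceeds a private threshold, with availability entrepreneurs seeded as near-unconditional early adopters.

```python
import random

def simulate_cascade(n=100, seed=1):
    """Toy threshold model of an availability cascade (illustrative only).

    Each actor has a private threshold: the fraction of the population
    that must publicly endorse an idea before the reputational pull to
    conform outweighs the actor's private doubt. Actors with near-zero
    thresholds play the role of availability entrepreneurs.
    """
    rng = random.Random(seed)
    thresholds = [rng.random() for _ in range(n)]
    adopted = [t < 0.05 for t in thresholds]  # seed the entrepreneurs
    while True:
        share = sum(adopted) / n  # publicly visible level of support
        newly = [share >= t and not a for t, a in zip(thresholds, adopted)]
        if not any(newly):
            break
        adopted = [a or w for a, w in zip(adopted, newly)]
    return sum(adopted) / n  # final fraction of public adopters
```

With heterogeneous thresholds, each round of public adoption raises the visible share, which can clear the next band of thresholds; whether the cascade tips the whole population depends on how the thresholds happen to be distributed.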
Dual process theory posits that human reasoning is divided into two systems, often called System 1 and System 2. System 1 is automatic and unconscious; other terms used for it include the implicit system, the experiential system, the associative system, and the heuristic system. System 2 is evolutionarily recent and specific to humans, performing slower, sequential thinking. It is also known as the explicit system, the rule-based system, the rational system, or the analytic system. In The Happiness Hypothesis, Jonathan Haidt refers to System 1 and System 2 as the elephant and the rider: while human beings incorporate reason into their beliefs, whether via direct use of facts and logic or their application as a test of hypotheses formed by other means, it is the elephant that is really in charge.
Heuristics are simple, efficient rules which people often use to form judgments and make decisions. They are mental shortcuts that replace a complex problem with a simpler one. These rules work well under most circumstances, but they can lead to systematic deviations from logic, probability or rational choice theory. The resulting errors are called "cognitive biases" and many different types have been documented. These have been shown to affect people's choices in situations like valuing a house or deciding the outcome of a legal case. Heuristics usually govern automatic, intuitive judgments but can also be used as deliberate mental strategies when working from limited information. While seemingly irrational, the cognitive biases may be interpreted as the result of bounded rationality, with human beings making decisions while economizing time and effort.
Kuran and Sunstein describe the availability heuristic as more fundamental than the other heuristics: besides being important in its own right, it enables and amplifies the others, including framing, representativeness, anchoring, and reference points.[1]
Even educated human beings are notoriously poor at thinking statistically.[2] The availability heuristic, first identified by Daniel Kahneman and Amos Tversky, is a mental shortcut that occurs when people judge the probability of events by how easy it is to think of examples. The availability heuristic operates on the notion that, "if you can think of it, it must be important." Availability can be influenced by the emotional power of examples and by their perceived frequency; while personal, first-hand incidents are more available than those that happened to others, availability can be skewed by the media. In his book Thinking, Fast and Slow, Kahneman cites the examples of celebrity divorces and airplane crashes; both are more often reported by the media, and thus tend to be exaggerated in perceived frequency.[3]
An important class of judgments is those concerning risk: the expectation of harm to result from a given threat, a function of the threat's likelihood and impact. Changes in perceived risk result in risk compensation—correspondingly more or less mitigation, including precautionary measures and support for regulation. Kuran and Sunstein offer three examples of availability cascades—Love Canal, the Alar scare, and TWA Flight 800—in which a spreading public panic led to growing calls for increasingly expensive government action to deal with risks that turned out later to be grossly exaggerated.[1] Others have used the term "culture of fear" to refer to the habitual achieving of goals via such fear appeals, notably in the case of the threat of terrorism.
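Risk as defined here, a function of likelihood and impact, is commonly operationalized as expected harm. A minimal sketch follows; the function name and all figures are invented for illustration:

```python
def expected_harm(probability_per_year, impact_cost):
    """Risk as expected harm: annual probability of the threat
    multiplied by the cost of its impact."""
    return probability_per_year * impact_cost

# Hypothetical numbers, for illustration only: a frequent low-impact
# threat can carry more expected harm than a rare dramatic one.
frequent_minor = expected_harm(0.10, 5_000)        # 10% chance, $5k damage
rare_dramatic = expected_harm(0.0001, 1_000_000)   # 0.01% chance, $1M damage
assert frequent_minor > rare_dramatic
```

On this objective measure, a frequent minor threat can outweigh a rare dramatic one—precisely the comparison that availability-driven panics tend to invert, since dramatic incidents are the more mentally available.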
In the early years of the HIV/AIDS epidemic, many believed that the disease received less attention than warranted, in part due to the stigma attached to its sufferers. Since that time advocates—availability entrepreneurs that include LGBT activists and conservative Surgeon General of the United States C. Everett Koop—have succeeded in raising awareness to achieve significant funding. Similarly, awareness and funding for breast cancer and prostate cancer are high, thanks in part to the availability of these diseases. Other prevalent diseases competing for funding but lacking the availability of HIV/AIDS or cancer include lupus, sickle-cell anemia, and tuberculosis.[4]
The MMR vaccine controversy was an example of an unwarranted health scare. It was triggered by the publication in 1998 of a paper in the medical journal The Lancet which presented apparent evidence that autism spectrum disorders could be caused by the MMR vaccine, an immunization against measles, mumps and rubella.[5] In 2004, investigations by Sunday Times journalist Brian Deer revealed that the lead author of the article, Andrew Wakefield, had multiple undeclared conflicts of interest,[6] had manipulated evidence,[7] and had broken other ethical codes. The Lancet paper was partially retracted in 2004 and fully retracted in 2010, and Wakefield was found guilty of professional misconduct. The scientific consensus is that no evidence links the vaccine to the development of autism, and that the vaccine's benefits greatly outweigh its risks.

The claims in Wakefield's 1998 The Lancet article were widely reported;[8] vaccination rates in the UK and Ireland dropped sharply,[9] which was followed by significantly increased incidence of measles and mumps, resulting in deaths and severe and permanent injuries.[10] Reaction to vaccine controversies has contributed to a significant increase in preventable diseases including measles[11] and pertussis (whooping cough), which in 2011 experienced its worst outbreak in 70 years as a result of reduced vaccination rates.[12] Concerns about immunization safety often follow a pattern: some investigators suggest that a medical condition is an adverse effect of vaccination; a premature announcement is made of the alleged adverse effect; the initial study is not reproduced by other groups; and finally, it takes several years to regain public confidence in the vaccine.[13]
Extreme weather events provide opportunities to raise the availability of global warming. In the United States, the mass media devoted little coverage to global warming until the drought of 1988, and the testimony of James E. Hansen to the United States Senate, which explicitly attributed "the abnormally hot weather plaguing our nation" to global warming.[14] The global warming controversy has attracted availability entrepreneurs on both sides, e.g. the book Merchants of Doubt claiming that scientific consensus had long ago been reached, and climatologist Patrick Michaels providing the denialist viewpoint.
The media inclination to sensationalism results in a tendency to devote disproportionate coverage to sympathetic victims (e.g. missing white woman syndrome), terrifying assailants (e.g. media coverage of the Virginia Tech massacre), and incidents with multiple victims. Although half the victims of gun violence in the United States are black, generally young urban black males,[15] media coverage and public awareness spike after suburban school shootings, as do calls for stricter gun control laws.
International adoption scandals receive disproportionate attention in the countries of adoptees' origins. As the incidents involve abuse of children, they easily spark media attention, and availability entrepreneurs (e.g. populist politicians) fan the flames of xenophobia, without making statistical comparisons of adoptee abuse in the source and target nations, or of the likelihood of abuse vs. other risks.[16]
Poisoned candy myths are urban legends that malevolent individuals could hide poison or drugs, or sharp objects such as razor blades, needles, or broken glass in candy and distribute the candy in order to harm random children, especially during Halloween trick-or-treating. Several events fostered the candy tampering myth. The first took place in 1964, when an annoyed Long Island, New York housewife started giving out packages of inedible objects to children who she believed were too old to be trick-or-treating. The packages contained items such as steel wool, dog biscuits, and ant buttons (which were clearly labeled with the word "poison"). Although nobody was injured, she was prosecuted and pleaded guilty to endangering children. The same year saw reports of lye-filled bubble gum being handed out in Detroit and rat poison being given in Philadelphia.[17]
The second milestone in the spread of the candy-tampering myths was an article published in The New York Times in 1970. It claimed that "Those Halloween goodies that children collect this weekend on their rounds of 'trick or treating' may bring them more horror than happiness", and provided specific examples of potential tampering.[18]
In 2008, candy was found with metal shavings and metal blades embedded in it. The candy was Pokémon Valentine's Day lollipops purchased from a Dollar General store in Polk County, Florida. The candy was determined to have been manufactured in China and not tampered with within the United States. The lollipops were pulled from the shelves after a mother reported a blade in her child's lollipop and after several more lollipops with metal shavings in them were confiscated from a local elementary school.[19] Also in 2008, some cold medicine was discovered in cases of Smarties that were handed out to children in Ontario.[20]
Over the years, various experts have tried to debunk the various candy tampering stories. Among this group is Joel Best, a University of Delaware sociologist who specializes in investigating candy tampering legends. In his studies, and the book Threatened Children: Rhetoric and Concern about Child-Victims, he researched newspapers from 1958 on in search of candy tampering.[21] He found fewer than 90 instances that might have qualified as actual candy tampering. Best has found five child deaths that were initially thought by local authorities to be caused by homicidal strangers, but none of those were sustained by investigation.[22]
Despite the falsity of these claims, the news media promoted the story continuously throughout the 1980s, with local news stations featuring frequent coverage. During this time, cases of poisoning were repeatedly reported based on unsubstantiated claims or before a full investigation could be completed, and were often never followed up on. This one-sided coverage contributed to the overall panic and caused rival media outlets to issue reports of candy tampering as well. By 1985, the media had driven the hysteria about candy poisonings to such a point that an ABC News/The Washington Post poll found that 60% of parents feared their children would be injured or killed because of Halloween candy sabotage.
The phenomenon of media feeding frenzies is driven by a combination of the psychology described by the availability cascade model and the financial imperatives of media organizations to retain their funding.
There are two schools of thought on how to cope with risks raised by availability cascades: technocratic and democratic. The technocratic approach, championed by Kuran and Sunstein, emphasizes assessing, prioritizing, and mitigating risks according to objective risk measures (e.g. expected costs, expected disability-adjusted life years (DALYs)). The technocratic approach considers availability cascades to be phenomena of mass irrationality that can distort or hijack public policy, misallocating resources or imposing regulatory burdens whose costs exceed the expected costs of the risks they mitigate.
The democratic approach, championed by Paul Slovic, respects risk preferences as revealed by the availability market. For example, though lightning strikes kill far more people each year than shark attacks, if people genuinely consider death by shark worse than death by lightning, a disproportionate share of resources should be devoted to averting shark attacks.
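The contrast between the two approaches can be made concrete by attaching a preference weight to the objective measure. In the following hypothetical sketch, the function weighted_risk, the death counts, and the dread weights are all invented for illustration and are not actual statistics:

```python
def weighted_risk(annual_deaths, dread_weight):
    """Democratic approach: scale an objective death count by a public
    'dread' weight reflecting how bad that kind of death is perceived
    to be. A weight of 1.0 recovers the purely technocratic measure."""
    return annual_deaths * dread_weight

# Illustrative figures only: lightning kills far more people than
# sharks, but a large enough dread weight for shark attacks can
# reverse the ranking used to allocate mitigation resources.
lightning = weighted_risk(annual_deaths=50, dread_weight=1.0)
shark = weighted_risk(annual_deaths=1, dread_weight=100.0)
assert shark > lightning
```

Under the technocratic approach every dread weight is fixed at 1, so lightning dominates the ranking; the democratic approach lets a sufficiently strong public preference reverse the allocation.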
Kuran and Sunstein recommend that availability cascades be recognized, and institutional safeguards be implemented in all branches of government. They recommend expanded product defamation laws, analogous to personal libel laws, to discourage availability entrepreneurs from knowingly spreading false and damaging reports about a product. They recommend that the legislative branch create a Risk Regulation Committee to assess risks in a broader context and perform cost-benefit analyses of risks and regulations, avoiding hasty responses pandering to public opinion. They recommend that the executive branch use peer review to open agency proposals to scrutiny by informed outsiders. They also recommend the creation of a Risk Information Center with a Risk Information Web Site to provide the public with objective risk measures.[1] In the United States, the Centers for Disease Control and Prevention[23] and the Federal Bureau of Investigation[24] maintain web sites that provide objective statistics on the causes of death and violent crime.