In public policy, the outrage factor is public opposition to a policy that is not grounded in knowledge of its technical details. The term "outrage factor" originates from Peter Sandman's 1993 book, Responding to Community Outrage: Strategies for Effective Risk Communication.[1][2][3]

Causes


"Outrage factors" are the emotional factors that shape the perception of risk. Risks perceived as involuntary, industrial, or unfair are typically given more weight than risks perceived as voluntary, natural, or fair.

Sandman gives the formula:[4]

Risk = Hazard + Outrage

The following factors are listed in Covello and Sandman's 2001 article, Risk Communication: Evolution and Revolution:

Factor | Risks considered to… | Are less acceptable than…
Voluntariness[5] | Be involuntary or imposed | Risks from voluntary activities
Controllability[5] | Be under the control of others | Risks under individual control
Familiarity[5] | Be unfamiliar | Risks associated with familiar activities
Fairness[5] | Be unfair or involve unfair processes | Risks from fair activities
Benefits[5] | Have unclear, questionable, or diffused personal or economic benefits | Risks from activities with clear benefits
Catastrophic potential[5] | Have the potential to cause a significant number of deaths and injuries at once | Risks from activities that cause deaths and injuries at random or over a long period of time
Understanding[5] | Be poorly understood | Risks that are well understood or self-explanatory
Uncertainty[5] | Be relatively unknown or highly uncertain | Risks from activities that appear to be relatively well known to science
Delayed effects[5] | Have delayed effects | Risks from activities that have immediate effects
Effects on children[5] | Put children specifically at risk | Risks that appear to primarily affect adults
Effects on future generations[5] | Pose a threat to future generations | Risks from activities that do not
Victim identity[5] | Produce identifiable victims | Risks that produce statistical victims
Dread[5] | Evoke fear, terror, or anxiety | Risks from activities that do not arouse such feelings and emotions
Trust[5] | Be associated with individuals, institutions, or organizations lacking in trust and credibility | Risks from activities associated with those that are trustworthy and credible
Media attention[5] | Receive considerable media coverage | Risks from activities that receive little coverage
Accident history[5] | Have a history of major accidents or frequent minor accidents | Risks from activities with little to no such history
Reversibility[5] | Have potentially irreversible adverse effects | Risks from activities considered to have reversible adverse effects
Personal stake[5] | Place people or their families personally and directly at risk | Risks from activities that pose no direct or personal threat
Ethical/moral nature[5] | Be ethically objectionable or morally wrong | Risks from ethically neutral activities
Human vs. natural origin[5] | Be generated by human action, failure, or incompetence | Risks believed to be caused by nature or "Acts of God"
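Sandman's formula and the factors above can be illustrated with a toy scoring sketch. The equal weighting, the 0–1 scores, and the averaging are purely hypothetical simplifications for illustration; Sandman's formula is conceptual, not a quantitative model.

```python
# Toy illustration of Sandman's "Risk = Hazard + Outrage" formula.
# The factor names follow Covello and Sandman (2001); the equal
# weighting and 0-1 scoring are hypothetical assumptions, not part
# of the original framework.

OUTRAGE_FACTORS = [
    "voluntariness", "controllability", "familiarity", "fairness",
    "benefits", "catastrophic_potential", "understanding", "uncertainty",
    "delayed_effects", "effects_on_children",
    "effects_on_future_generations", "victim_identity", "dread", "trust",
    "media_attention", "accident_history", "reversibility",
    "personal_stake", "ethical_moral_nature", "human_vs_natural_origin",
]

def perceived_risk(hazard: float, outrage_scores: dict) -> float:
    """Perceived risk = technical hazard + mean outrage score.

    Each outrage score is in [0, 1]; unlisted factors default to 0.
    """
    scores = [outrage_scores.get(f, 0.0) for f in OUTRAGE_FACTORS]
    outrage = sum(scores) / len(OUTRAGE_FACTORS)
    return hazard + outrage

# A low-hazard but involuntary, dreaded, heavily covered risk can be
# perceived as riskier than a higher-hazard voluntary one.
industrial = perceived_risk(0.2, {"voluntariness": 1.0,
                                  "dread": 1.0,
                                  "media_attention": 1.0})
voluntary = perceived_risk(0.3, {})
```

Under these assumed weights, the industrial risk scores 0.35 against 0.30 for the voluntary one, despite its lower technical hazard, which is the qualitative point of the formula.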

Risk communications


While policy analysis by institutional stakeholders typically focuses on risk-benefit and cost-benefit analysis, popular risk perception is not informed by the same concerns. Successfully implementing a policy that relies on public support and cooperation requires addressing the outrage factor when informing the public about the policy.[6]

In an interview with New York Times journalist and Freakonomics author Stephen J. Dubner, Sandman emphasized that "the most important truth in risk communication is the exceedingly low correlation between whether a risk is dangerous, and whether it's upsetting".[4]

The relevance of public outrage has been acknowledged in discussions of various policy debates, including nuclear safety,[7] terrorism response,[8] and public health.[9][10]


References

  1. Nebel, Bernard J.; Wright, Richard T. (1993). Environmental science: the way the world works (4th ed.). Prentice Hall PTR. pp. 392–393. ISBN 0-13-285446-5.
  2. Hird, John A. (1994). Superfund: the political economy of environmental risk. JHU Press. p. 70. ISBN 0-8018-4807-5.
  3. You, Myoungsoon; Ju, Youngkee (2015-08-10). "The Influence of Outrage Factors on Journalists' Gatekeeping of Health Risks". Journalism & Mass Communication Quarterly. 92 (4): 959–969. doi:10.1177/1077699015596339. S2CID 147352231.
  4. Dubner, Stephen J. (2011-11-29). "Risk = Hazard + Outrage: A Conversation with Risk Consultant Peter Sandman".
  5. Covello, Vincent; Sandman, Peter (2001). "Risk Communication: Evolution and Revolution". Solutions to an Environment in Peril.
  6. Sandman, Peter M. (2016-11-07). "Risk Communication: Facing Public Outrage". Management Communication Quarterly. doi:10.1177/0893318988002002006. S2CID 144400652.
  7. Williams, David R. (1998). What is safe?: the risks of living in a nuclear age. Royal Society of Chemistry. p. 39. ISBN 0-85404-569-4.
  8. Kayyem, Juliette; Pangi, Robyn L. (2003). First to arrive: state and local responses to terrorism. BCSIA studies in international security. MIT Press. p. 68. ISBN 0-262-61195-3.
  9. Milloy, Steven J. (1995). Science without sense: the risky business of public health research. Cato Institute. p. 8. ISBN 1-882577-34-5.
  10. Pencheon, David; Melzer, David; Guest, Charles; Gray, Muir (2006). Oxford handbook of public health practice. Oxford handbooks (2nd ed.). Oxford University Press. p. 221. ISBN 0-19-856655-7.
