Ted Schettler, MD, MPH, Katherine Barrett, PhD, Carolyn Raffensperger, MA, JD
Science and Environmental Health Network
Adapted from an essay by Schettler et al. in: McCally 2002.
The precautionary principle is a guide to public policy decision making (Raffensperger and Tickner 1999, Schettler et al. 2002). It responds to the realization that humans often cause serious and widespread harm to people, wildlife, and the general environment. According to the precautionary principle, precautionary action should be undertaken when there are credible threats of harm, despite residual scientific uncertainty about cause and effect relationships.
History of the Precautionary Principle:
The term "precautionary principle" comes from the German "Vorsorgeprinzip," literally "forecaring principle." Its origins can be traced to German clean-air policies of the 1970s that called for Vorsorge, or prior care, foresight, and forward planning to prevent the harmful effects of pollution (Boehmer-Christiansen 1994). The precautionary principle has since been invoked in numerous international declarations, treaties, and conventions, and has been incorporated into the national environmental policies of several countries. It has been applied to specific decisions on food safety, protection of freshwater systems, land development proposals, fisheries management, and the release of genetically modified organisms, among others.
Formulations of the precautionary principle
One formulation of the precautionary principle is found in the 1992 Rio Declaration of nations participating in the United Nations Environment Program treaty negotiations:
Where there are threats of serious or irreversible damage, lack of full scientific certainty shall not be used as a reason for postponing cost-effective measures to prevent environmental degradation (Rio Declaration 1992; Shabecoff 1996).
In 1998 a group of scientists, environmentalists, government researchers, and labor representatives from the United States, Canada, and Europe convened at the Wingspread Conference in Wisconsin to discuss ways to formalize and implement the precautionary principle. They formulated the precautionary principle as:
"When an activity raises threats of harm to human health or the environment, precautionary measures should be taken even if some cause and effect relationships are not fully established scientifically." (Wingspread Statement, 1998)
The precautionary principle says we should attempt to anticipate harm and avoid it before it occurs, or at least detect it early. However it is formulated, each version of the precautionary principle is based on underlying values and three core elements:
- potential harm—predicting and avoiding harm, or identifying it early, should be a primary concern when contemplating an action;
- scientific uncertainty—the kind and degree of scientific uncertainty surrounding a proposed activity should be explicitly addressed; and
- precautionary action—preventive or anticipatory measures should be taken in response to credible threats of harm, even when residual scientific uncertainty remains.
Why the Precautionary Principle?
Humans have transformed land, sea, and air, dominating the earth’s ecosystems in unprecedented ways (McCally 2002). Although many of these impacts were or could have been predicted, often they were surprises. Degradation of life support services, loss of biodiversity, and direct impacts on human health are a result (Lubchenco 1998, Johnson et al. 2001). Patterns of human disease are changing throughout the world. To remain focused on life expectancy and decreases in childhood mortality is to miss these changing patterns.
Newly emerging infectious diseases and new geographical distributions of older infectious diseases illustrate the capacity of microorganisms to evolve and adapt to changing circumstances. Antibiotic resistance is increasingly common. Chronic diseases like hypertension, heart disease, diabetes, and asthma are increasing throughout much of the world. Depression and other mental health disorders are becoming new public health threats in many parts of the world, with profound consequences for individuals, families, and communities. Developmental disabilities, including learning disorders, attention deficit hyperactivity disorder, and autism, are increasingly common (Schettler et al. 2000). The age-adjusted incidence of a number of different kinds of cancer in the US has increased over the past 25 years (SEER). The incidence of some birth defects is increasing (Paulozzi 1999, Pew 2001). Sperm density is declining in some parts of the world (Swan et al. 1997). Asthma prevalence and severity are increasing sharply throughout the world, often reaching epidemic proportions (Pew 2001).
Recognizing the limits of science, the precautionary principle is intended to enable and encourage precautionary actions that serve underlying values, based on what we know as well as what we do not know. It encourages close scrutiny of all aspects of science, from the research agenda to the funding, design, interpretation, and limits of studies, for potential impacts on the earth and its inhabitants.
Elements of the precautionary principle:
The precautionary principle contains a specific directive to take precautionary action and, as with all guiding principles, carries its own values. The principle is based on recognizing that some activities may cause serious, irreparable, or widespread harm and that people have a responsibility to prevent harm and to preserve the natural foundations of life, now and into the future. The needs of future generations of people and other species and the integrity of ecosystems are worthy of Vorsorge, of forecaring, and of respect.
A precautionary approach asks how much harm can be avoided rather than asking how much is acceptable. The precautionary principle acknowledges that the world is comprised of complex, interrelated systems, vulnerable to harm from human activities, and resistant to full understanding. Precaution gives priority to protection of these vulnerable systems.
The potential for harm:
Precautionary action is appropriate when there is credible evidence that a particular technology or activity might be harmful, even if the nature of that harm is not fully understood. This means that decision makers must consider potential hazards that have been identified or that are plausible, based on experience, what is known, and/or predicted. Threats of serious, irreversible, cumulative, or widespread harm are of more concern than trivial threats and demand precautionary action commensurate with their nature.
Harm can occur at the level of the cell, organism, population, or ecosystem. Impacts may be biological, ecological, social, economic, or cultural, and they may be distributed equally or disproportionately among individuals, populations, or geographically, now or in the future. Because systems are complex and outcomes are not always predictable, it becomes extremely important for decision makers to specifically identify the parameters that are used to assess the potential impacts of a proposed activity. Moreover, the standard against which an impact is measured must also be defined. Asking whether a proposed agricultural pesticide is safer for use than another, for example, is very different from asking whether either is necessary at all.
Recognition of scientific uncertainty is central to the precautionary principle. We are often unable to predict or even identify in advance the consequences of a proposed action in complex biological or social systems (Perrow 1984). When inputs are modified, the behavior of complex systems is often surprising. By the time impacts are documented, considerable harm may have occurred. Despite early warnings, the use of lead in gasoline and paint, for example, damaged the brain function of generations of children (Markowitz and Rosner 2000). Sometimes a system crosses a threshold and settles into a new state of relative equilibrium from which there is no turning back. Exotic species introduced into ecosystems where they did not previously exist may, for example, become established and cause irreversible harm.
Understanding cause and effect relationships in complex systems is limited by different kinds of uncertainties. Uncertainty sometimes results from more than a simple lack of data or inadequate models and is not easily reduced, because of the nature of the problem being studied. In those circumstances, a requirement of absolute "proof" of harm before action can be taken is either ideologically motivated or betrays a fundamental misunderstanding of the limits of science.
Most complex problems have a mixture of three general kinds of uncertainty—statistical, model, and fundamental—each of which should be explicitly considered before deciding how to act.
Statistical uncertainty is the easiest to reduce or to quantify with some precision. It results from not knowing the value of a particular variable at a point in time or space but knowing, or being able to determine, the probability distribution of the variable. An example is IQ distribution in a population of individuals. In this case, valid decisions can be based on knowing the likelihood of a variable having a particular value.
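To illustrate how statistical uncertainty can be quantified and reduced, the sketch below estimates the standard error of a sample mean for samples of increasing size. The assumed IQ distribution (normal, mean 100, standard deviation 15) is a conventional parameterization chosen for illustration, not something specified in this essay.

```python
import math
import random
import statistics

# Illustrative only: IQ is assumed to follow Normal(100, 15).
random.seed(42)  # fixed seed so the sketch is reproducible

def standard_error(n: int) -> float:
    """Draw n simulated IQ scores and return the standard error of the sample mean."""
    sample = [random.gauss(100, 15) for _ in range(n)]
    return statistics.stdev(sample) / math.sqrt(n)

# Statistical uncertainty about the population mean shrinks as samples grow.
for n in (10, 100, 1000):
    print(f"n={n:5d}  standard error of the mean ~ {standard_error(n):.2f}")
```

Because the underlying probability distribution is knowable, collecting more data reliably narrows the uncertainty; the same is not true of model or fundamental uncertainty, discussed below.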
Typically, however, real-world decisions are made in the context of multiple, interactive variables. For example, the incidence of cancer attributable to exposure to a carcinogen in genetically and geographically diverse people is inherently more difficult to determine than the incidence of cancer in a group of genetically similar rodents exposed to the same carcinogen living in controlled laboratory conditions. When more than one variable is involved, a model is typically constructed with certain assumptions and simplifications, introducing a new kind of uncertainty.
Model uncertainty is inherent in systems with multiple variables interacting in complex ways. Even if the statistical uncertainty surrounding the value of a single variable can be defined or reduced, the nature of relationships among system variables may remain difficult to understand. This is particularly problematic for any model of complex systems. We may decide that there will be a tendency for the system to behave in a certain way, but the likelihood of that behavior is difficult to estimate.
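One way to picture model uncertainty is with two dose-response models that fit the same observations equally well yet disagree sharply where data are absent. All numbers below are invented for illustration and carry no toxicological meaning.

```python
# Hypothetical sketch of model uncertainty: a linear no-threshold model and
# a threshold model that agree at an observed high dose but diverge at low
# doses, where no data exist to choose between them.

def linear_model(dose: float) -> float:
    """Linear no-threshold assumption: risk proportional to dose."""
    return 0.02 * dose

def threshold_model(dose: float) -> float:
    """Assumes no risk below a hypothetical threshold dose of 5 units."""
    return 0.0 if dose < 5 else 0.02 * (dose - 5) + 0.1

# Both models match an observed risk of about 0.2 at dose 10 ...
print(linear_model(10), threshold_model(10))  # both ~ 0.2
# ... yet they disagree completely about risk at a low dose of 1 unit.
print(linear_model(1), threshold_model(1))    # 0.02 versus 0.0
```

Choosing between such models is a judgment about system structure, not a matter of collecting more of the same data, which is why model uncertainty is harder to reduce than statistical uncertainty.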
Moreover, complex models can include only a finite number of variables and interactions. The real world, however, is a confluence of biological, geochemical, ecological, social, cultural, economic, and political systems. No experimental model can fully account for each of these and their interrelationships. Ongoing research, monitoring, and model refinement may help to reduce uncertainties, but imprecision is inevitable. Indeterminacy, which increases when moving from statistical to model uncertainty, is, at some point, more correctly called ignorance.
Fundamental uncertainty encompasses this extension of indeterminacy into ignorance. Ignorance that results from the complexity or uniqueness of a system is of particular concern. This kind of uncertainty is inherent in novel or complex systems where existing models do not apply. Fundamental uncertainty can result from having no valid knowledge of the likelihood of a particular outcome. It can also result from having no knowledge of what some of the outcomes may be. Here we do not even know what we do not know. Regulators, for example, did not know that chlorofluorocarbons would damage the stratospheric ozone layer when they allowed the compounds to be marketed as safe for commercial use. Fundamental uncertainty is extremely difficult to reduce or otherwise manage and demands respect and humility.
Scientific uncertainty and scientific proof:
It is imperative to keep these kinds of uncertainty in mind when considering the notion of scientific proof. Proof is a value-laden concept that integrates statistics, empirical observation, inference, research design, and the research agenda into a political and social context. Strict criteria may be useful for establishing "facts," but by the time a fact or causal relationship has been established by rigorous standards of proof, considerable avoidable harm may already have occurred. The impacts of lead exposure on children's brain development, or of asbestos exposure on lung cancer risk, are examples. Guided by the precautionary principle, therefore, we are as concerned with the weight of available evidence as we are with the establishment of fact by rigorous standards of proof.
By convention, a considerable amount of consistent evidence is necessary to establish factual "proof" of a cause and effect relationship. Traditionally, in a study of the relationship between two variables, a correlation is said to be statistically significant only if the probability that the observed association arose by chance alone, independent of other factors, is less than 5%. But correlation does not establish causation. In epidemiology, a series of additional criteria, for example those of Hill, are usually applied before causation can be claimed (Hill 1965). The Hill criteria require not only a statistically significant correlation between two variables, but also that the putative cause precede the effect, a dose-response relationship, elimination of sources of bias and confounding, coherence with other studies, and a plausible biological mechanism. Tobacco smoking, for example, was known to be associated with lung cancer for more than fifty years before a plausible biological mechanism was finally described. Only at that point did it become impossible to deny that tobacco "causes" cancer.
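The conventional 95% significance threshold can be made concrete with a simple test. The sketch below applies a standard two-proportion z-test to hypothetical case counts (invented for illustration, not drawn from this essay) and checks whether the resulting p-value falls below 0.05.

```python
import math

def two_proportion_z(cases_a: int, n_a: int, cases_b: int, n_b: int) -> float:
    """Two-sided p-value for a two-proportion z-test (normal approximation)."""
    p_a, p_b = cases_a / n_a, cases_b / n_b
    pooled = (cases_a + cases_b) / (n_a + n_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_a - p_b) / se
    # Convert |z| to a two-sided tail probability of the standard normal.
    return math.erfc(abs(z) / math.sqrt(2))

# Hypothetical data: 30 cases among 1000 exposed, 15 among 1000 unexposed.
p = two_proportion_z(30, 1000, 15, 1000)
print(f"p = {p:.4f} -> {'significant' if p < 0.05 else 'not significant'} at the 95% level")
```

Even when such a test crosses the 0.05 threshold, it establishes only a correlation; the additional Hill criteria must still be satisfied before causation can reasonably be claimed.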
When exposure to environmental hazards causes immediate and obvious harm, scientific uncertainty about cause and effect relationships is minimal. However, under other circumstances, scientific uncertainty increases dramatically and is often difficult to resolve.
Conditions with long latency periods between a hazardous exposure and the appearance of an adverse health outcome are difficult to study. Study design is necessarily complex and implementation is expensive. Intervening variables that must be considered in a comprehensive study complicate the analysis. Subjects may also be lost to follow up during a prolonged study.
Investigative challenges are also increased when the health outcome is subtle and detectable only by detailed, complex testing. For example, subtle changes in immune system or brain function may be of significant practical importance but difficult to document easily.
Finally, adverse health outcomes are often non-specific and multifactorial in origin. Many diseases, for example, asthma or developmental disorders like learning disabilities are caused by complex interactions of genetic, environmental, and social factors and are not easily linked to single variables. As a result, determining causation with precision is difficult, if not impossible, and some residual uncertainty will always remain. It then becomes the task of policy makers or health care providers to decide how to act in the face of uncertainty. Under these circumstances, according to the precautionary principle, preventive or anticipatory measures are appropriate.
Meanwhile, lack of "proof" of harm is often used to justify ongoing or proposed activities when the weight of credible evidence suggests that harm is plausible and perhaps even likely. Given the limits of scientific inquiry in the world of complex systems, establishing a high bar of proof as a pre-requisite for taking action is certain, in some instances, to result in unnecessary and often irreversible harm (Beauchamp and Steinbock 1999, Kriebel et al. 2001).
Under the precautionary principle, shifting the burden of proof from one party to another, depending on weight of evidence, lack of evidence, scientific uncertainty, and the nature of the harm of concern is one way to address these complexities. Depending on circumstances, differing standards of evidence for demonstrating harm (or safety) may also help protect public health and the environment. For example, concluding that something is more likely than not to cause harm is a very different standard from concluding that it will cause harm beyond a reasonable doubt.
The precautionary principle reminds us that, when dealing with complex systems, evidence is rarely sufficient to quantify or predict the consequences of human activity beyond doubt. Yet, failure to take action, because of a lack of quantifiable proof of harm, is, in itself, a form of action.
Finally, in order to serve the values that underlie the precautionary principle, action should be anticipatory, in order to prevent harm to public health and the environment, despite underlying scientific uncertainty. The precautionary principle does not specify which actions are appropriate under particular circumstances. It is a guiding principle, not a set of binding rules. The choice(s) among potential anticipatory actions, however, should be informed by:
- full consideration of the weight of evidence for potential harm
- the kind and degree of scientific uncertainty associated with that evidence
- participation of potentially affected parties, and
- an assessment of potential alternative actions.
Implementing the precautionary principle
The precautionary principle requires a systematic look at the potential for various kinds of harm, the associated scientific uncertainty, and the underlying fundamental values, and then counsels precautionary action.
1) Goal setting
Goal setting is particularly important for establishing environmental and health policies. Goal setting requires us to ask, "Where do we want to be at some future time? What are we trying to accomplish?" Starting with agreed-upon goals and then looking at where we are now can help in developing a strategy for getting from here to there. Of course, not all goals are generally agreed upon or represent shared visions. But as goals are made explicit, the values and assumptions underlying decision-making processes will also become more transparent and may result in processes for reconciling differences.
2) Assessing alternatives
A truly precautionary approach includes examination of a range of options for meeting policy goals. Currently, in most settings there are few requirements for comprehensively assessing a range of alternatives to proposed activities. For example, current regulatory policies emphasize a risk assessment/risk management framework. This approach attempts to estimate the probability of harm (risk) from a proposed activity and then asks whether that harm is acceptable. Risk management techniques are intended to minimize the risks of the proposed activity, but not to question whether the activity is necessary for achieving broader goals.
Alternatives assessment instead asks whether the harm is necessary and if there might be other ways to achieve agreed upon goals that would avoid harm altogether. When alternatives assessment is applied earlier rather than later in policy decision making, innovative approaches that reflect societal goals, ecological principles, and the values that underlie the precautionary principle are more likely to emerge. Assessing alternatives can also lead to actions that truly respect the level of uncertainty in given circumstances.
3) Adopting transparent, inclusive, and open processes
A precautionary approach requires open, inclusive, and transparent processes that are initiated early in decision making, beginning with goal setting, where the health and well-being of the public and environment are at stake. A participatory approach is justified by a belief in the fundamental fairness of democratic decision making and by the thought that a broad range of experience leads to better science and decision-making. Transparency also helps to ensure accountability among decision-makers.
4) Analyzing uncertainty
A precautionary approach requires explicit recognition of the scientific uncertainty inherent in understanding the potential for harm from an ongoing or proposed activity.
Statistical uncertainty may be reduced with more data collection. Model and fundamental uncertainty, however, are more difficult to reduce. When model or fundamental uncertainty predominates, a requirement to resolve uncertainty as a prerequisite for decision making shows a fundamental lack of understanding of the limits of science or, alternatively, may be nothing more than a tactic to maintain the status quo. In the US, for example, President Bush used scientific uncertainty about the impacts of greenhouse gases on global climate to justify US rejection of the Kyoto treaty on global climate change and to promote an energy policy weighted heavily in favor of increasing fossil fuel extraction and consumption.
5) Burden of proof and responsibility
Under the precautionary principle, the burden of proof regarding the safety of an activity may shift with the nature of potential harm and scientific uncertainty. Requirements for evaluating the safety of a proposed activity also vary with the political context. The precautionary approach suggests that the burden of proof is better thought of as the burden of persuasion and responsibility. This avoids the fruitless assertion that absolute safety can never be "proven." Rather, it acknowledges that, as the potential for serious, irreversible harm and scientific uncertainty increase, the proponent of an activity has an increasing obligation to account for the safety of the activity and take responsibility for adverse impacts that may result from it. Then, more comprehensive testing, monitoring, and assumption of liability shift the onus onto the proponent.
6) Learning and adaptation
Under the precautionary approach, appropriate research and monitoring are essential. Decisions must be periodically re-examined, based on new information. The research agenda of private and public institutions may be designed to reflect broad social goals that extend well beyond developing marketable products. In this way the precautionary approach is designed with feedback loops that search for and take into account new information and unintended consequences of provisional decisions.
7) Options for precautionary action
Choices among potential precautionary actions are made only after full analysis of potential harms and scientific uncertainty. Precautionary action can take a number of directions. At the level of regulation, when research and development of a product or technology are complete, and only regulatory approval is needed for production and marketing, the options are ordinarily limited to yes, no, with limits, with monitoring, with labeling, or with posting of a performance bond.
At a pre-regulatory level, however, precautionary action might include a closer look at problems that proposed technologies are intended to solve. How was the problem defined and by whom? Was the problem framed in the only or best way? Are there alternatives to the proposed technology?
Evaluating a full range of possible precautionary measures again requires a multidisciplinary, participatory approach in order to elicit relevant knowledge and set priorities. Responses to scientific uncertainty, as well as various kinds of harm, legitimately vary among individuals, societies, and cultures. It is obviously easier to consider alternatives, multiple sources of information, and priorities earlier in the process than when a developed product or technology is presented for regulatory approval.
The Relationship of the Precautionary Principle to Risk Assessment:
A risk assessment approach to public policy decision making dominates in the US and many parts of the world. With few exceptions, risk assessments attempt to estimate the potential risks of proposed products or activities on a case-by-case basis without consideration of the complete context in which the activity will be carried out and rarely with any consideration of alternatives to the proposal (O'Brien 2000).
The relationship between the precautionary principle and risk assessment reflects differing views of a number of factors, including how much we know, how much we can know, how broadly questions should be framed, which questions should be asked, who should frame the questions, the value of non-human life, our responsibility to future generations, and how we plan for the future.
Quantitative risk assessments usually respond to narrowly framed questions and are often flawed by simplifying assumptions. Risk assessments almost always fail to consider a full range of biological, ecological, social, cultural, and economic impacts and how they are distributed. Advocates of a regulatory system dominated by quantitative risk assessments argue that they are inherently precautionary through the use of conservative assumptions and safety factors. Risk assessors, however, often fail to distinguish among various kinds of uncertainty and tend to misclassify some model and fundamental uncertainty as statistical, to which they apply "uncertainty" factors. When model and fundamental uncertainty predominate in a system, this approach may lead to large underestimates of risk, to failure to predict adverse impacts removed in time and space, and, of course, to complete failure to predict surprises or novel impacts.
Risk assessors often claim that the precautionary principle is "anti-science" or a tool to keep certain technologies from the marketplace. In fact, a precautionary approach encourages more science rather than less, acknowledging the need for precautionary action while addressing scientific uncertainty that may be intractable using available tools.
Decision making in the face of uncertainty is, of course, necessary, frequently difficult, and requires assessment of relative risks. Guided by an overarching precautionary principle, however, these assessments are not the exclusive domain of risk analysts; they can be fully participatory and can include full consideration of a range of alternatives.
A precautionary approach is based on the ethical notions of taking care and preventing harm. It arises from recognition of the extent to which scientific uncertainty and inadequate evaluation of the full impacts of human activities have contributed to ecological degradation and harm to human health. It can be used to help address these circumstances, bringing together ethics and science, illuminating their strengths, weaknesses, values, or biases. The precautionary principle encourages research, innovation, and cross-disciplinary problem solving. It serves as a guide for considering the impacts of human activities and provides a framework for protecting children, adults, other species, and life-sustaining ecological systems now and for future generations.
For example: Montreal Protocol on Substances that Deplete the Ozone Layer (1987); Ministerial Declaration of the Second World Climate Conference (1990); Bergen Ministerial Declaration on Sustainable Development (1990); Bamako Convention on Hazardous Wastes within Africa (1991); Framework Convention on Climate Change (1992); United Nations Conference on Environment and Development (1992); Helsinki Convention on the Protection and Use of Transboundary Watercourses and International Lakes (1992); Maastricht Treaty on the European Union (1994); US President’s Council on Sustainable Development (1996); Cartagena Protocol on Biosafety (2000); Stockholm Convention on Persistent Organic Pollutants (2001).
Beauchamp DE and B Steinbock (eds.). 1999. New Ethics for the Public's Health. New York: Oxford University Press.
Boehmer-Christiansen S. 1994. The precautionary principle in Germany—enabling government. In: Interpreting the precautionary principle. Ed: O'Riordan T, Cameron J. Earthscan Publications, Ltd.; London.
Hill AB. 1965. The environment and disease: association or causation? Proc R Soc Med 58:295.
Johnson N, C Revenga and J Echeverria. 2001. Managing water for people and nature. Science 292:1071-1072.
Kriebel D, J Tickner, P Epstein, J Lemons et al. 2001. The Precautionary Principle in Environmental Science. Environ Health Perspect 109(9):871-876.
Lubchenco J. 1998. Entering the century of the environment: a new social contract for science. Science 279:491-497.
McCally M. (ed.). 2002. Life Support: The Environment and Human Health. Cambridge, MA: MIT Press.
Markowitz G and D Rosner. 2000. "Cater to the children": the role of the lead industry in a public health tragedy, 1900-1955. Amer J Pub Health 90(1):36-46.
O'Brien M. 2000. Making Better Environmental Decisions: An alternative to risk assessment. Cambridge, MA, MIT Press.
Paulozzi L. 1999. International trends in rates of hypospadias and cryptorchidism. Environ Health Perspect 107(4):297-302.
Perrow C. 1984. Normal Accidents: Living with High Risk Technologies. New York: Basic Books.
Pew Environmental Health Commission. 2001.
Raffensperger C and J Tickner (eds.). 1999. Protecting Public Health and the Environment: Implementing the Precautionary Principle. Washington, DC: Island Press.
Rio Declaration on Environment and Development. 1992.
SEER Cancer Statistics Review, 1973-1996. Bethesda MD: National Cancer Institute.
Schettler T, K Barrett and C Raffensperger. 2002. The Precautionary Principle. In: Life Support: The Environment and Human Health. Ed: McCally M. Cambridge, MA: MIT Press.
Schettler T, Stein J, Reich F, Valenti M. 2000. In Harm's Way: Toxic Threats to Child Development. Greater Boston Physicians for Social Responsibility.
Shabecoff P. 1996. A New Name for Peace: International Environmentalism, Sustainable Development, and Democracy. Hanover and London: University Press of New England, pp 86, 156, 172.
Swan S, E Elkin and L Fenster. 1997. Have sperm densities declined? A reanalysis of global trend data. Environ Health Perspect 105:1228-1232.