Author: Robert Hoffman, Ottawa.
Many issues of public policy involve the establishment of a balance between long-term societal interests and short-term individual or private interests. This is clearly the case in those issues of public policy concerned with the environment, natural resource management, and public health. Many of these problems involve externalities – situations where the activities undertaken by one individual or group in the pursuit of its objectives have adverse unintended consequences for other individuals, groups, or the society at large. Some of them are characterized as problems of the commons. The issues of climate change and fisheries management are cases in point.
The problem of the ‘management of the commons’ is well known: unfettered use of the commons to serve individual interests may result in overuse and potentially the destruction of the commons. The resolution of these problems has two elements: the establishment of an agreed-upon level of use of the commons and the allocation of that level of use among the participants in such a way that public interest is served and the participants perceive that the outcome is just and fair.
Science constitutes a key component not only in the resolution of environmental and resource management issues but also in the identification of problems for which a public policy response is required. In problems of the commons, science is the source of information concerning the relationships between the level of use of the commons and the evolution of the state of the commons. Quantification of the long-term carrying capacity of the commons, or the ‘sustainable’ yield, is based on scientific understanding. As well, science provides the basis for technologies that might reduce externalities, or increase the productivity of the processes that constitute the commons, thereby increasing the sustainable yield.
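The notion of a sustainable yield can be made concrete with a toy calculation (a sketch for illustration only, not drawn from this paper; the growth rate, carrying capacity, and harvest levels are all hypothetical): a renewable stock growing logistically can sustain any constant harvest below its maximum sustainable yield, while a harvest above that threshold drives the stock to collapse.

```python
# Toy illustration of a commons: logistic growth of a renewable stock
# under a constant annual harvest. All parameters are hypothetical.

def simulate(harvest, r=0.5, K=1000.0, x0=500.0, years=200):
    """Return the stock level after `years` of constant harvest.
    Annual update: x <- x + r*x*(1 - x/K) - harvest."""
    x = x0
    for _ in range(years):
        x = x + r * x * (1 - x / K) - harvest
        if x <= 0:
            return 0.0  # the commons has collapsed
    return x

# The logistic growth term peaks at x = K/2, giving a maximum
# sustainable yield of r*K/4 = 125 for these parameters.
print(simulate(harvest=100))  # below MSY: stock settles at a positive level
print(simulate(harvest=150))  # above MSY: stock collapses to zero
```

The point of the sketch is that the threshold separating sustainable use from collapse is a property of the underlying biology, which is precisely the kind of quantity that science must supply to the policy process.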
However, scientific knowledge is seldom available in a form that is ready for use in policy analysis, nor does it explicitly address the problems confronting decision-makers. Often it originates in more than one discipline; sometimes it is contradictory; it may be incomplete and/or subject to caveats. Its findings are couched in a technical language that may be inaccessible to the layman. When policy analysis is unable to assimilate science-based information, the consequence is either problem denial or policy decisions based less on evidence than on the relative power of the various stakeholders in the problem domain.
This gap between science and public policy analysis is well recognized and is problematic particularly in the fields noted above. [Brewer 1986], [Clark 1986], [Ravetz 1986], [Skodvin 1999], [Wojciechowski 1998] It arises because the culture, institutions, and modes of behaviour of science have little in common with those of public policy analysis. [Skodvin 1999]
Public policy analysis and decision making imply choice: the future is not pre-determined, but can be influenced by what we decide to do; there are alternatives from among which we must choose, and the choice to do nothing is a willful one. For science, the concept of choice has been problematic. Science seeks to generate ‘objective’ knowledge of the laws that govern the universe. These laws, in conjunction with a starting condition, determine the future of the universe, and the concept of choice is precluded. Max Born, the German physicist and Nobel Laureate, observed that:
“Only two possibilities exist: Either one must believe in determinism and regard free will as a subjective illusion, or one must become a mystic and regard the discovery of natural laws as a meaningless intellectual game.” [Bulletin of the Atomic Scientists, 1957]
Another Nobel Laureate, Belgian chemist Ilya Prigogine, described the contradiction in the following terms:
“. . . we owe to the ancient Greeks two ideals that have since shaped human history. The first is the intelligibility of nature, or in Whitehead’s words, “the attempt to frame a coherent, logical, necessary system of general ideas in terms of which every element of our experience can be interpreted.” The second is the idea of democracy based on the assumption of human freedom, creativity and responsibility. As long as science led to the description of nature as an automaton, these two ideals were contradictory.”
He went on to conclude that:
“This (contradiction) requires a new formulation of the laws of nature that is no longer based on certitudes, but rather possibilities. In accepting that the future is not determined, we come to the end of certainty.” [Prigogine 1997]
The culture and institutions of science enshrine the so-called scientific method, by which one arrives at objective scientific knowledge of real-world phenomena that is accepted provisionally as truth until it is contradicted by further application of the method. One pillar of the scientific method is the use of controlled and repeatable experiments to test hypotheses. Here the emphasis is on analysis – reduction to the point where controlled experimentation is possible. Another pillar is prediction: a hypothesis may be conditionally accepted if phenomena predicted from it are observed. The scientific method has proven most effective in areas such as chemistry, physics, and biology, where knowledge is context independent – that is, where the process under study interacts only weakly with the processes that constitute its context. It has been less effective in dealing with complex, context-dependent systems where controlled experimentation is not possible and where emergence is an important phenomenon. [Holland 1998] It has had least success with systems in which human beings are an integral component. In these systems, not only is controlled experimentation impossible, but prediction is also impossible to the extent that the future is influenced by choice. Perhaps the relative lack of progress in the social sciences noted by Edward O. Wilson may be attributed to reliance on scientific methods borrowed from the physical sciences that are inappropriate to the problem domain of the social sciences. [Wilson 1998]
If science may be characterized as knowledge-seeking, objective, reductionist, analytic, deterministic, and technical, public policy analysis and decision making occupies the opposite pole on each of these dimensions. It is useful to consider public policy analysis and decision making as two distinct processes. Policy analysis involves developing an understanding of the problem domain such that the full consequences of proposed interventions can be assessed and the alternative outcomes from which choices must be made can be identified. Decision making is the process of selecting from among those alternatives. It is the job of policy analysis to synthesize all of the knowledge pertinent to the problem domain and to communicate the resulting understanding to those delegated responsibility for making the decision. Decision processes in matters of public policy almost always involve communicating that understanding to a much broader community of stakeholders – often the ‘public’ at large. Unlike science, policy analysis and decision making do not have the luxury of waiting until understanding is complete; decisions must be made in the face of incomplete knowledge, and risks must be subjectively evaluated. The expression of choice is subjective; outcomes may be valued differently by different individuals or groups. Those who benefit from a particular outcome may have to compensate those who lose; choice involves understanding the real trade-offs among interests and negotiating until a choice can be made that is acceptable to all parties. Issues of public policy are almost always historically bound and context dependent, and they involve humans as an integral part of a complex system.
It is clear that methods for synthesizing knowledge originating in the sciences, both physical and social, are critical for public policy analysis and subsequent decision making. However, unlike in science, where methods of analysis are mature, methods of synthesis are not well developed, particularly for systems involving humans, with the consequence that consensus on the validity of synthetic understanding is difficult to achieve. The body of theory from which synthetic methods may be drawn is general system theory and cybernetics. [Wiener 1948], [Ashby 1956], [Bertalanffy 1968], [Maturana et al. 1980], [Simon 1982] But this body of theory has largely been ignored by institutions of higher learning, as it does not fit into the disciplinary framework with its emphasis on analysis.
Synthesis of scientific knowledge in support of public policy analysis has employed several methods: blue ribbon panels, formal models, and policy exercises.
The first and longest established is the ‘blue ribbon’ panel, consisting of experts from the various fields, with a mandate to seek consensus on the problem domain and to make policy recommendations. Blue ribbon panels are convened as subcommittees of national academies of science, presidential commissions, and international working groups; the authority of the panel rests on the credentials of its members. Panels use the techniques of persuasion to support the policy recommendations that emerge from their deliberations. While persuasion may trigger action, it seldom conveys understanding, since it relies on rhetorical technique and selective argument. Argument, according to Northrop Frye, relies on the arrangement of data; arrangement means selecting for emphasis, and selecting for emphasis can never be definitively right or wrong. [Frye 1990] Panels are effective when the scientific knowledge is complete and consensus can be easily reached, but not when knowledge is incomplete and interests conflict. Interest groups advocating different policies can convene their own experts and thereby confuse the decision-making process. Panels communicate to the policy process by means of a written report directed to the sponsoring agency; they seldom engage directly in the decision process. There is also a tendency for panels to make recommendations judged to be ‘politically acceptable’ even when it is known that the recommended policy may not be sufficient to deal with the problem.
The second approach is the use of computer-based modeling. Models clearly have the potential for handling large amounts of technical information in a systematic and reproducible way. Yet they have seldom met this potential when used for policy analysis, and not only because formal modeling projects are costly and time consuming. Brewer makes the point that “the work is one sided; it presents but one perspective on a future rich in potentialities”. Put another way, modeling in the sciences has retained the determinism of the scientific method and thereby preempts the role of the decision-maker. At worst, large-scale policy models are black boxes closed to adaptive possibilities and learning. They produce predictions with little room for policy intervention, or, when they incorporate ‘objective’ measurements of value, they make prescriptions that purport to be optimal. Large-scale models often lack transparency and fail to communicate an understanding of how the system works; they rely for their authority on the mystique of computer technology, arcane mathematics, and the scientific credentials of the modelers. [Brewer 1986]
The third approach is the use of policy exercises or workshops that engage scientists, decision-makers, interest groups and, in some cases, modelers in the decision making process. These policy exercises come in a variety of flavours and go by a number of names. Their origins may be traced to game theory [von Neumann et al 1944] and to the use of war gaming by the military. [Shubik 1982] Scenario analysis [Wack 1985], [Schwartz 1991], Adaptive Environmental Assessment and Management [Holling 1978], [Sonntag 1986] and Integrated Assessment [Bland 1999] are all examples of policy exercises. In these methods the emphasis is on the process for reaching a decision as much as the outcome itself. The policy exercise puts the policy issue in as broad a context as possible and seeks to build a common understanding of the elements of the underlying system. It is recognized that stakeholders may have different values and interests and that the probability of a consensus-based decision is increased if the real trade-offs among the interested parties are understood by all. This process almost always involves the development and analysis of a number of scenarios; sometimes models are constructed during the course of the exercise to support the scenario analysis. Choice and indeterminacy are implicitly recognized. While in an early stage of development, these methods show promise particularly in circumstances where the understanding generated in the process need not be communicated beyond the participants in the process.
An informed public is essential for effective public policy. There must be a broad-based consensus on the diagnosis of the problem before remedial policies can be implemented, particularly when those policies may have adverse impacts upon particular interest groups. Further, there must be broad-based consensus that the policy actions are socially just and fair – that private interests aren’t served to the exclusion of the public interest. In this context, robust methods for the synthesis and communication of understanding of complex systems are critical. This paper suggests that ‘systems simulators’ may be an effective means for communicating the understanding of complex systems needed to inform public debate on issues of public policy.
Simulators are descriptions of complex systems representing the interrelationships among the processes that constitute the system; they combine observations of past states of the system with scientific understanding of processes. As such, simulators are explicit and communicable representations of the mental models that guide our perceptions and actions. Unlike verbal or mathematical descriptions of systems, simulators are active and can be experienced. Learning how the system works arises from the experience of using the simulator. The user will come to appreciate the complex system-as-a-whole behaviour as it emerges out of dynamic interactions among relatively well understood processes.
Unlike the deterministic models of classical science, the simulator approach is open to adaptation or learning. The simulators are designed in such a way that the system of feedback loops necessary to assure consistency among the constituent processes of the system is incomplete: those feedbacks embodying the behavioral responses that are subject to adaptation are excluded from the simulator because they are not knowable.
Consequently, the possibility of inconsistency or disequilibrium arises. Disequilibrium is indicated by tensions that must be resolved by the user of the simulator. In this way the user becomes an integral part of the system as the source of novelty for adaptation, not merely an observer of a closed system.
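This open-loop design can be sketched concretely. The following is a hypothetical illustration, not the author's actual software: the simulator computes the resource dynamics, but the behavioural feedback, namely how much to harvest next period, is deliberately excluded and must be supplied by the user at each step; a ‘tension’ signal flags disequilibrium whenever the chosen harvest exceeds regrowth, leaving its resolution to the user.

```python
# Hypothetical sketch of an 'open' simulator for a commons. The resource
# dynamics are computed internally, but the behavioural feedback (the
# harvest decision) is left out of the model and supplied by the user,
# making the user an integral part of the simulated system.

class CommonsSimulator:
    def __init__(self, stock=500.0, r=0.5, K=1000.0):
        self.stock, self.r, self.K = stock, r, K

    def regrowth(self):
        """Logistic regrowth of the stock at its current level."""
        return self.r * self.stock * (1 - self.stock / self.K)

    def tension(self, harvest):
        """Positive when the proposed harvest exceeds regrowth:
        a disequilibrium signal the user must resolve."""
        return harvest - self.regrowth()

    def step(self, harvest):
        """Advance one year given the user's chosen harvest."""
        self.stock = max(0.0, self.stock + self.regrowth() - harvest)
        return self.stock

sim = CommonsSimulator()
for year in range(5):
    # In an interactive exercise a stakeholder would choose this value
    # after seeing the tension signal; here it is fixed for illustration.
    choice = 100.0
    if sim.tension(choice) > 0:
        print(f"year {year}: tension, harvest exceeds regrowth")
    sim.step(choice)
print(round(sim.stock, 1))
```

Note that nothing in the simulator decides the harvest; the loop that would close the system is intentionally missing, which is what distinguishes this design from a deterministic predictive model.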
These concepts have their origins in modern science. The work of Ilya Prigogine shows the indeterminacy of systems far from equilibrium and the possibilities of adaptation through the emergence of higher levels of order. [Prigogine 1984] Indeterminacy is a property of evolutionary systems. The evolutionary principle is stated by Ervin Laszlo in the following terms:
The evolutionary paradigm challenges concepts of equilibrium and determinacy in scientific theories; and it modifies the classical deterministic conception of scientific laws. The laws conceptualized in the evolutionary context are not deterministic and prescriptive: they do not uniquely determine the course of evolution. Rather, they state ensembles of possibilities within which evolutionary processes can unfold. [Laszlo, 1987]
Thus, simulators are primarily learning devices that extend our powers of perception; they cannot predict what will happen nor can they prescribe what should happen. Just as flight simulators support learning how the aircraft responds to the controls, global systems simulators may be used for exploring the responsiveness of global systems to potential societal actions involving, for example, population growth, life-style and technology.
Ashby, W. Ross. Introduction to Cybernetics. Wiley, New York, 1956.
Bertalanffy, Ludwig von. General System Theory. George Braziller, 1968.
Bland, Bill. Toward Integrated Assessment in Agriculture. University of Wisconsin – Madison. (http://bob.soils.wisc.edu/asig/sivs/iam.html)
Brewer, G. D. “Methods for Synthesis: Policy Exercises” In Sustainable Development of the Biosphere, eds. William C. Clark and R.E. Munn. International Institute for Applied Systems Analysis. Cambridge University Press, Cambridge, 1986.
Burns, Tom R., Thomas Baumgartner, and Phillipe DeVille. Man, Decisions, Society: The Theory of Actor-System Dynamics for Social Scientists. Gordon and Breach Science Publishers, New York, 1985.
Casti, John L. Would-Be Worlds: How Simulation is Changing the Frontiers of Science. John Wiley & Sons Inc. New York, 1997.
Clark, William C. “Sustainable Development of the Biosphere: Themes for Research Program” In Sustainable Development of the Biosphere, eds. William C. Clark and R.E. Munn. International Institute for Applied Systems Analysis. Cambridge University Press, Cambridge, 1986.
Colborn, Theo, Dianne Dumanoski, and John Peterson Myers. Our Stolen Future: Are We Threatening Our Fertility, Intelligence, and Survival? – A Scientific Detective Story. Dutton, New York, 1996.
Frye, Northrop. Words with Power: Being a Second Study of “The Bible and Literature”. Harcourt Brace Jovanovich, New York, 1990.
Holland, John H., Emergence: from Chaos to Order. Helix Books, Addison-Wesley, Reading Mass. 1998
Holling, C. S. (Ed.) Adaptive Environmental Assessment and Management. Wiley, New York, 1978.
Maturana, Humberto R. and Francisco J. Varela. Autopoiesis and Cognition. D. Reidel Publishing Company, Dordrecht, Holland, 1980.
Prigogine, Ilya. The End of Certainty: Time, Chaos, and the New Laws of Nature. The Free Press, New York, 1997.
Ravetz, J. R. “Usable Knowledge, Usable Ignorance: Incomplete Science with Policy Implications” In Sustainable Development of the Biosphere, eds. William C. Clark and R.E. Munn. International Institute for Applied Systems Analysis. Cambridge University Press, Cambridge, 1986.
Schwartz, Peter. The Art of the Long View. Doubleday, New York, 1991.
Shubik, M. Game Theory in the Social Sciences. MIT Press, Cambridge MA, 1982.
Simon, Herbert. The Sciences of the Artificial, Second Edition, The MIT Press, Cambridge, Massachusetts, 1982.
Skodvin, Tora. Science-Policy Interaction in the Global Greenhouse: Institutional Design and Institutional Performance in the Intergovernmental Panel on Climate Change. CICERO Working Paper 1999:3. Center for International Climate and Environmental Research, Oslo, Norway. <www.cicero.uio.no>
Sonntag, N.C. Commentary on Brewer, G. D. “Methods for Synthesis: Policy Exercises” In Sustainable Development of the Biosphere, eds. William C. Clark and R.E. Munn. International Institute for Applied Systems Analysis. Cambridge University Press, Cambridge, 1986.
von Neumann, John and Oskar Morgenstern. Theory of Games and Economic Behavior. Princeton University Press, Princeton, NJ, 1944.
Wack, Pierre. “The Gentle Art of Reperceiving” (two-part article). Harvard Business Review: “Scenarios: Uncharted Waters Ahead”, September-October 1985; “Scenarios: Shooting the Rapids”, November-December 1985.
Wilson, Edward O. Consilience: The Unity of Knowledge. Alfred A. Knopf, New York, 1998.
Wojciechowski, Jerzy A. The Age of Reckoning. Presented at Pacem in Mirabus XXVI, Halifax, November 1998 and published in Proceedings of the Canadian Association for the Club of Rome, <www.cacor.ca >, Series 1, Number 28, Winter 1998-99.
Wiener, Norbert. Cybernetics. MIT Press, Cambridge, Mass., 1948; enlarged edition, J. Wiley, New York, 1961.
Bio: Robert Hoffman is one of the seven Canadian Members of the Club of Rome. He is President of whatIf? Technologies Inc., a company that has modelled energy systems for Canada.