Systemic Risk as a Perspective for Interdisciplinary Risk Research[1]

Introduction to the Thematic Focus

by Christian Büscher, ITAS

1     Observing Systems

Our lifeworld is replete with systems that are familiar to us, that we (have to) trust, and with connections in and among systems of which we often have merely an inkling, but which we by no means comprehend. We routinely use a system, the automobile, so that we can transport ourselves “individually”. The vast majority of us have only a vague idea of how it works. We follow the superficial information displays and operate the instruments. We then move simultaneously with many others, orienting ourselves by a system of rules with all sorts of normative expectations. The road users, together with their technical apparatus, the available infrastructure, and the system of rules, relate to each other and constitute a system of their own: the transportation system, which expands permanently as elements are added to it and linkages are developed further (Urry 2004).

Even the simplest correlations are no longer perceived by the road users: more and more parents drive their children to kindergarten by car because the way on foot or by bicycle has become too dangerous. Traffic has increased too much, they argue; that is, there are too many automobiles on the streets. Or: drivers notice, while in a traffic jam, that traffic is flowing more speedily in the next lane, and change lanes, only to cause a jam there. Such self-reinforcing effects of collective action cause enormous problems on the large scale as well, for urban and traffic planners. New and more spacious streets, for the most part, attract even more traffic and worsen a city’s situation.[2] It would be simple to point out more relationships which elude our lifeworld experience. The eye soon loses its way as soon as one looks into the interconnections of the transportation system with the energy system and its heterogeneous technical and social components, with the economic system and its globally operating industries, with the political system and its global competition for resources, and with science and its innovation systems.
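
The lane-changing example above can be made concrete with a minimal, purely illustrative simulation (a toy model added for this introduction, not from the article; lane capacity and the switching rate are invented assumptions): whenever drivers switch to the currently faster lane, they congest it in turn, and the advantage they chased disappears.

    # Toy model: speed falls linearly with lane occupancy; each step, a
    # fraction of the drivers in the slower lane switches to the faster one.
    def speed(cars, capacity=100):
        return max(0.0, 1.0 - cars / capacity)

    lanes = [70, 30]                      # cars in lane 0 and lane 1 (assumed)
    for step in range(8):
        fast = 0 if speed(lanes[0]) > speed(lanes[1]) else 1
        slow = 1 - fast
        movers = int(0.3 * lanes[slow])   # 30 % of the slow lane changes over
        lanes[slow] -= movers
        lanes[fast] += movers             # ...and slows the fast lane down
        print(f"step {step}: lanes = {lanes}, "
              f"speeds = {speed(lanes[0]):.2f} / {speed(lanes[1]):.2f}")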

It is slowly dawning on us: everything is connected with everything else in a network of technical, social, and ecological systems which we barely understand, although we suspect “systematicity” behind it, and which can no longer be planned, directed, and controlled, even though it was modern society which set it in motion through rational organization (synchronization) and through communications media which are based on trust (Giddens 1990, p. 20ff.). Further, we are increasingly coming to realize that everything which is done by human hands will have consequences – sooner or later, here or there, for some or for many. These insights are at first glance trivial, but they indicate changes in current ways of looking at problems.

(1) We are accustomed to describing and analyzing material, temporal, and social relationships as systemic or as systems, and thus to taking them into account: biological systems, financial systems, software systems, infrastructural systems, and much more. Historically, one can follow how the term “system” has experienced a boom from the 16th century on, albeit with different meanings – at first as a “political entity”, e.g., in Thomas Hobbes (2007 [1651], Chapter XXII): “By systems, I understand any number of men joined in one interest or one business”, or as a scientific possibility for orientation, when the “Systematicity of the World” gains validity as a criterion for the correctness or falseness of systems of thought (Strub 1998, p. 833). Early definitions, from J.H. Lambert in the 18th century on, refer to conditions which have to be met when one speaks of systems: identifiable components which are recognizably connected to one another by a purpose. This connection has to be temporally stable for as long as the purpose requires it (Strub 1998, p. 835f.).

These deliberations were later carried on by biologists, except that the aspect of purpose receded into the background and the reproduction of the system came to the fore. Systems were described as complexes of interacting elements which distinguish themselves from an environment which itself consists of other complexes (Bertalanffy 1950, p. 143). Systems realize an emergent quality (an effect, a result, or a function) which cannot be derived from the qualities of the individual elements alone, but rather from the relationship of the elements to one another. Further, mechanisms were discerned which keep a system in a “dynamic equilibrium” – self-preservation through constant change: “constancy is maintained in a continuous change” (Bertalanffy 1950, p. 157). This requires, in each case, a kind of information processing that can bring about self-regulation, as described in cybernetics. Examples can be found especially in nature (Mitchell 2008, p. 51).
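
Such self-regulation can be sketched in a few lines (an illustration added for this introduction, not from Bertalanffy’s text; setpoint, gain, and disturbances are arbitrary assumptions): the state never stands still, yet negative feedback keeps pulling it back toward a setpoint – constancy maintained in continuous change.

    import random

    setpoint, state, gain = 37.0, 37.0, 0.5   # e.g., a body temperature (assumed)

    for step in range(15):
        state += random.uniform(-1.0, 1.0)    # constant environmental disturbance
        state -= gain * (state - setpoint)    # negative feedback corrects part of it
        print(f"step {step:2d}: state = {state:.2f}")  # hovers near the setpoint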

These pioneering deliberations have led us to accept the paradoxical constitution of systems, which lets them be ultra-stable on the one hand and yet fragile in a few respects on the other: constancy through change, openness through closure, unity through difference, autonomy through selective dependency (without self-sufficiency or autarky, therefore). With regard to social systems, these properties have enormous consequences for modern society: increasing the productivity of the economy, of politics, science, or law through functional differentiation – the simultaneous evolution of autonomous systems qua exclusive fulfillment of functions, the hypostatization of singular functions, and the loss of a control center.

(2) At the same time, means of scientific observation were developed and installed which can analyze and describe possible consequences of systemic differentiation: whether these are consequences of changes in the societal formation or of the introduction of new technologies or methods of organization, as well as consequences for, e.g., the individual, for areas of societal functionality, and, above all, for the natural environment. Basically, this is always some kind of analysis of systems with which we are confronted and which, once they are installed, seem to make themselves autonomous, together with an assessment of these developments as a reflection on some form of “re-affection” or “self-infliction”.

It turns out, however, that attributing causes and effects becomes increasingly difficult, despite (or because of) science’s ability to constantly increase its observational resolution: scaling down what is large (e.g., in models of the universe), enlarging what is small (e.g., in models of matter), and simplifying what is complex (e.g., in models of planetary climate). “Learning more sometimes means discovering hidden complexities which compel us to acknowledge that the confidence in our ability to master the situation was illusory” (Dupuy 2005, p. 91; our translation: CB/RA). The problems crop up on a small scale, for example, when it is a matter of determining the dose-response relationships of substances on living organisms (NRC 2009, p. 97), and they also appear on the large scale, when it is a question of determining the tipping points of various systems upon which life on the planet depends (Lenton, Held et al. 2008).

Despite the hidden complexities, to which we will return later on, there can be no question of separate, parallel evolutionary developments of humankind, together with its inventions of social and technical systems, on the one hand, and of nature, with its immense diversity of living and inanimate manifestations, on the other. Every day, one discovers anew that everything is connected with everything else, and the fact that impacts and counter-impacts are generated through feedback loops, as well as through positive and negative self-reinforcement, lets the notion of “harmony with nature” seem somewhat naive. On the contrary, a new geological epoch is being proclaimed: “Humanity, they [an influential group of scientists] contend, can be considered a geophysical force on par with supervolcanoes, asteroid impacts, or the kinds of tectonic shift that led to the massive glaciation of the Ordovician” (Vince 2011, p. 33). While the concept of an ecological equilibrium which had to be protected or brought about was long upheld, newer approaches in environmental research reckon with systems in permanent change, which are maintained by nonlinear dynamics. Consequently, the reduction of the impacts of interventions, the restoration of damage, and precautionary action are analyzed with regard to anticipated damage (Groß, Heinrichs 2010, pp. 3–6).

(3) All in all, systems convey a sense of order and stability. Organisms, machines, or social systems realize their own respective problem solutions, functions, and performance, and exist as long as they can assert themselves in an environment. On the other hand, we know that much or everything is at stake if they break down: sickness and death, technical failures, or economic crises testify to it. If the manner of organization of systems is interrupted, the result is their dissolution or their malfunction. Organisms can’t live just a little; their autopoiesis either functions or it doesn’t, and this “or” designates the change of condition from life to death without transition (Maturana 2001, p. 62). Technical systems are destroyed if the causal closure intended in their design can’t be maintained (as one of many examples, Vaughan 1996). Economic crises annihilate capital and give rise to poverty for many.

The concept of “systemic risks”, which is receiving some attention at present, is supposed to make this potential explicit. It is supposed to point out the material, temporal, and social unboundedness of chains of events and of potentials for damage which affect entire systems and not merely individual components or isolated occurrences. With the above deliberations, we come nearer to the formulation of problems which concern the core of Technology Assessment (TA) as the study of consequences (risk, danger, and chance) and of Systems Analysis as the demonstration of the “entirety” of the relationships (Paschen et al. 1978, p. 23). The fact that TA also (and in many cases, above all) claims advice as its responsibility (Grunwald 2007, p. 8) should be left aside here for the time being. Suffice it to say: TA and related fields of research have developed a pronounced bias toward education (Büscher 2010). The impression arises that Systems Analysis has been pushed into the background to the advantage of education, and that, in the research process, there is certainty very early on (and maybe much too soon) about what society has to be educated about, mostly about wrong behavior.

2     Terminological and Conceptual Inexactitudes

The term systemic risks is inexact in two respects, namely with regard to each of its two components: system and risk.

 

2.1   System and the Environment

First, questions ensue when arguments from general systems theory are taken into consideration. When systemic risks are analyzed, reference has to be made to systems in some sense or other and, with that, to differences between system and environment. No system exists without an environment, and without systems there is no environment. For that reason, the question quite generally poses itself whether, in view of systemic risks, there is danger for the system or for the environment (or for both). We must further assume that the reproduction of a system depends on the transfer of some (not all) of the causes into the system’s range of control (Luhmann 1995, p. 19f.). It therefore has to be clarified whether it is possible to distinguish between productive and unproductive causes in the system. This lets us think, first, of mechanisms which work in and through the system, such as the metabolism of an organism, which internally converts substances from the environment into energy, and then of the possibility that the same mechanisms which set systems going and keep them going can, under certain circumstances, cause their demise in the sense of self-endangerment (Bunge 2010, p. 375).

Not least, system/environment differences can arise from the drawing of boundaries, and thereby through a form of cutting off causalities. Occurrences in the environment don’t affect the system in a one-to-one correspondence. The system reacts on the basis of its own conditions and possibilities for operation to external shocks (which, in reality, are internal shocks) and stressors. Accordingly, it has to be explained to what extent occurrences in the environment have effects in a system, and how they are passed on within the system.

2.2   Risk and Decision

Second, questions pose themselves when arguments from risk research are taken into account. It is known that “risk” designates an expectation that damage initiated by present events could possibly be incurred in the future. A risk implies a decision with regard to a calculation which includes the contingent course of future events. In this sense, risk is a form of time-binding which permits making future presents relevant for action as present futures (Luhmann 2005, p. 71; Esposito 2007, p. 94). With this, an important difference from the concept of danger is already indicated. In the material respect, the ascertainment of hazards is based on experience gained from other observations of more or less linear causal chains between a source of disturbance and possible deviations from normal conditions. Estimations of risk, on the other hand, fall back on model-based constructions of event and error chains in order to include possible causalities about which no empirical knowledge is available (Ladeur 1993, p. 209f.). This applies not only to the meanwhile well-known cases of the development and use of risky high technology, in which trial-and-error learning is ruled out and assessing the consequences is tremendously difficult. In the temporal respect, the problem lies deeper. The uncertainties about future events don’t result from ignorance of the relationships alone, but from the circumstance that risky decisions themselves bring about a reality that didn’t exist beforehand. Because the openness of the future is used in order to grasp opportunities, decision-making processes generate, as a matter of principle, a lack of transparency. In the attempt to gain control over the future, decisions as present commitments shape future presents. Luhmann (1993, p. 281) called this “open” (before) and “closed” (afterwards) “contingency”. The consequences of decisions will manifest themselves, more or less probably, only after a commitment which is observable as such has been carried out. Before that, any assessment of the consequences remains a more or less uncertain prognosis.

2.3   Deviations from Normal Operation?

If one summarizes these deliberations, the intuitively plausible term systemic risks calls for an explanation, because systems don’t decide and don’t take any risks. They either operate or not. Expectations are addressed to the manner of operation of technical and social systems, with regard to functions and results. Risks and hazards then mean possible deviations from “normal operation”, which can affect everyone who comes into contact with the respective systems. If one doesn’t mean exclusively exogenous causes of deviation (typically: natural hazards), then processes of decision-making come into question as endogenous causes. But since, in the case of systemic risks, it should precisely not be a matter of individual decisions, aggregate effects have to be explained. This problem is solved by distinguishing different levels of social reality. On the one hand, one makes reference to the realization of individual operations, which, on the other hand, realize systemic relationships in a nontransparent network of an infinite multitude of other operations. In the case of systemic risks, this means that a multitude of decisions can endanger a system if they initiate emergent effects which affect the system’s reproduction.

If this is correct, then we have to inquire into the respective rationalities of decision-making which encourage taking risks in individual cases, because opportunities are opened by it. One possibly finds that orientations which are promising on the level of individual operations can have collective effects on the level of a system relationship (as Deutschmann 2008, p. 515 maintains in connection with the present financial market crisis). This addresses potentials for self-endangerment which are only inadequately captured by the term systemic risks, because systems don’t put themselves at risk; rather, aggregate effects of individual operations give rise to hazards. In the economy, such phenomena, like a bank run, are discussed as the summation of individual decisions (Kambhu et al. 2007, p. 5).
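
The bank run shows how a system-level hazard can emerge from the mere summation of individually rational decisions. A minimal threshold sketch may illustrate this (a generic toy model in the spirit of such discussions, not taken from Kambhu et al.; all thresholds and numbers are assumptions): each depositor withdraws once the share of others already withdrawing exceeds a personal threshold, so a handful of nervous depositors can trigger a cascade that engulfs the whole bank.

    import random

    random.seed(1)
    n = 100
    # Personal withdrawal thresholds, skewed toward low values so that
    # many depositors are easily unnerved (an assumption of this sketch).
    thresholds = [random.random() ** 2 for _ in range(n)]
    withdrawn = [t < 0.05 for t in thresholds]   # only the most nervous start

    while True:
        share = sum(withdrawn) / n               # visible fraction withdrawing
        updated = [w or (t <= share) for w, t in zip(withdrawn, thresholds)]
        if updated == withdrawn:                 # no one else changes his mind
            break
        withdrawn = updated

    print(f"{sum(withdrawn)} of {n} depositors withdraw")  # typically nearly all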

3     The Differentiation of Systemic Risks

We first have to ask to what extent systemic risks differ from non-systemic risks. The first obvious mark of distinction comes about through reference to systemic qualities, namely complexity, concatenation, densification, or connectivity. Occasional, isolated situations characterized by uncertainty are precisely not meant. Systemic and non-systemic risk situations are determined by expectations about the courses of future events. Whoever believes himself to be secure does not expect disappointment, although it may occur, and whoever exposes himself to danger takes the possibility of disappointment into account, as risk.[3] Every small investor, when choosing a form of investment, has to calculate the risk of possible loss against the possible profits. At the same time, however, his expectations are directed at the continuation of a “normal operation” of the economy, which expresses itself in economic growth, currency stability, and the liquidity of the banks. This expectation can also be disappointed, as we have recently experienced once again. A loss can therefore be caused by individual actions or by systemic events. In addition, individual actions can lead to systemic effects. Nevertheless, a difference still has to be made between isolated and systemic events, between individual and collective effects, and between limited and unbounded extents of damage.
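
A worked miniature of the small investor’s calculation above (all figures invented for illustration): the individual risk is priced as a probability-weighted balance of loss and profit, while the continued “normal operation” of the economy enters the calculation only as a tacit premise.

    # Hypothetical investment: 20 % chance of losing 3,000 EUR,
    # 80 % chance of gaining 1,000 EUR.
    p_loss, loss = 0.20, -3000.0
    p_gain, gain = 0.80, 1000.0

    expected_value = p_loss * loss + p_gain * gain   # -600 + 800 = +200 EUR
    print(f"expected value: {expected_value:+.0f} EUR")
    # The figure prices the individual risk only; growth, currency stability,
    # and bank liquidity are presupposed rather than calculated.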

3.1   Unboundedness

Current discussions on systemic risks or systemic events refer, above all, to the spatial, temporal, and social unboundedness of the damaging occurrences. In the case of systemic events, regional, national, or global concatenations of damage are assumed, which can also occur with a delay in time and can have an enormous potential for damage (IRGC 2010, p. 9; Renn, Keil 2008, p. 350). The concatenations refer either to correlations among the elements of a system or to dependencies among different systems. The former points to an endangerment of the functional reliability of a system itself: “Systemic risk refers to the risk or probability of breakdowns in an entire system, as opposed to breakdowns in individual parts or components, and is evidenced by co-movements (correlation) among most or all parts” (Kaufman, Scott 2003, p. 371). The latter points to possible external effects as a result of losses of functionality in a system, which cause completely separate problems in other areas. The finance industry, for example, which grants no loans because it has cash problems, or because banks reciprocally assume a lack of liquidity, endangers the functional reliability of the entire economy, which in turn has consequences for governmental action (Kambhu et al. 2007, p. 5). This emphasizes society’s dependence on systems. The OECD study on “Emerging Risks” highlights processes of densification and networking caused by the increase of population in conurbations and by the economic concentration in certain regions.[4] In this case, one has to distinguish between exogenous shocks in densely settled regions, which can cause immense damage due to the enormous exposure of human beings and material goods (Wenzel et al. 2007; Berz 2004), and the interdependencies among critical infrastructures which arise out of densification and concentration (Perrow 2007).
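
Kaufman and Scott’s co-movement criterion quoted above can be operationalized in a very simple form (a minimal sketch with synthetic data; real systemic-risk indicators are far more elaborate): compute the pairwise correlations among the parts of a system and read a high average correlation as a sign that a shock hits the whole rather than a component.

    import random
    import statistics

    def correlation(xs, ys):
        """Pearson correlation of two equally long series."""
        mx, my = statistics.mean(xs), statistics.mean(ys)
        cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
        sx = sum((x - mx) ** 2 for x in xs) ** 0.5
        sy = sum((y - my) ** 2 for y in ys) ** 0.5
        return cov / (sx * sy)

    random.seed(0)
    shared = [random.gauss(0, 1) for _ in range(250)]      # systemic factor
    parts = [[s + random.gauss(0, 0.5) for s in shared] for _ in range(5)]

    pairwise = [correlation(parts[i], parts[j])
                for i in range(5) for j in range(i + 1, 5)]
    print(f"mean pairwise correlation: {statistics.mean(pairwise):.2f}")
    # Values near 1 indicate co-movement: most or all parts rise and fall
    # together, so breakdowns concern the entire system.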

3.2   Transitions from Non-systemic to Systemic

Is it now possible to characterize the transitions from non-systemic occurrences to systemic occurrences? Or can it only be ascertained ex post that a systemic risk existed, which then manifested itself as the catastrophe which had already happened? An unbounded extent of damage is distinguished from a limited extent of damage by a quantitative and qualitative leap. But how can this leap be characterized? Can it be determined quantitatively, in monetary losses or the loss of human lives, or qualitatively, by modelling regime shifts? The former would certainly require definitional work for every event, and the latter could only be determined ex post. The same holds true for other criteria, which one can sort according to material, temporal, or social viewpoints.
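
What such a qualitative leap could look like can be sketched with a generic bistable model (a toy example in the spirit of the tipping elements discussed by Lenton et al., not a model proposed in this issue; the equation and all parameters are assumptions): as a stress parameter creeps upward, nothing much happens until a critical value, and then the state jumps into another regime.

    def step(x, stress, dt=0.01):
        # Overdamped motion in a double-well landscape tilted by "stress":
        # dx/dt = x - x**3 + stress. For small stress two stable regimes
        # coexist; the lower one vanishes near stress ~ 0.38 (tipping point).
        return x + dt * (x - x ** 3 + stress)

    x = -1.0                                        # start in the lower regime
    for stress in [i * 0.01 for i in range(60)]:    # stress creeps upward
        for _ in range(2000):                       # let the system settle
            x = step(x, stress)
        if x > 0:                                   # the qualitative leap
            print(f"regime shift at stress ~ {stress:.2f}, state = {x:.2f}")
            break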

With such exercises in definition, however, the concept of systemic risks would scarcely contain any explanatory content. It would remain a term for dramaturgic purposes.

4     Contributions to this Thematic Focus

In this special issue, possibilities for the “assessment” of the phenomenon are to be ascertained for Technology Assessment. To this end, authors from various disciplines were requested to present extracts of their research.

4.1   Complex Systems in General

Belinda Cleeland presents a selection of the central arguments from two studies by the International Risk Governance Council (IRGC 2009; IRGC 2010). While the first report was only marginally concerned with complex systems, the analyses in the subsequent report focused on the characteristics of systems and of complex processes, as well as on the dangers which result from these characteristics. Belinda Cleeland’s contribution has been placed at the beginning of this special issue because it points out epistemological difficulties: How can we acquire knowledge about systems and their properties, which elude simple schemata of cause and effect? Emergence, nonlinear dynamics, delayed effects, catastrophic thresholds, and path dependencies are some of the keywords. It will not come as a surprise that the IRGC experts come to the conclusion that the endogenous effects of complex systems bring about new dangers. The question remains, however, to what extent the systems we scrutinize endanger themselves or other systems (or both). In this respect, the following contribution by Helmut Willke makes some suggestions.

4.2   Financial System

Helmut Willke has long occupied himself with systemic risks in the financial system. In this special issue, as well as in his study on “Governance of Global Finance” (Willke 2007), he describes the consequences of the autonomy of a social system. Through the present financial and economic crisis, his analyses have received some empirical underpinnings.

Following Niklas Luhmann’s works (1994, 1999), Willke describes the reflexive medium money as a basic structural characteristic of the finance industry. Money is a symbolic communication medium which (1) acquires its value by proving itself and through trust, in other words solely through the generalized attribution that one can buy something with money, and which (2) itself becomes buyable, as a loan and by paying interest, which dramatically increases the accessibility of capital and, with it, the possibilities for investment. Capital makes investments possible before returns can be expected from them. It is needed to tide over momentary shortages. This form of self-reference (or self-validation) and of reflexivity generates, especially in the financial industry, intrinsic values behind which a real asset often scarcely seems tangible (Willke 2007, p. 140).

The elementary operations of society’s economy are payments. The elementary operations of the financial industry are payments including temporally and monetarily determined conditions of repayment. Payments and repayments are temporally separated, and this difference generates risks and opportunities (Willke 2007, p. 141). Risk acceptance is an inherent necessity of investments (financial capital as “securities, monetary and derivative assets”). Willke’s argument is that, given these prerequisites, limited and unlimited networks can be distinguished in the economy. Limited networks constitute themselves in the social dimension between producers and consumers, sellers and buyers, in their respective economic roles, through payments for goods. These networks are limited by the number of actors participating. Networks that are in principle unlimited constitute themselves through primary recourse to the temporal dimension, when the open future permits an endless architecture of options. “As the field of options within the financial system is extended into the depth of structured derivative instruments and into the labyrinths of prolonged chains of conditioned events, the chances and risks of aggregate or even systemic effects of mutual reinforcement, snowballing, leverage and positive feedback loops beyond single firms loom large” (Willke 2007, p. 156).
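
The amplifying effect of leverage that runs through this quotation can be shown in miniature (a standard textbook-style calculation, not Willke’s own; all figures are invented): credit-financed positions multiply gains and losses on one’s own capital alike.

    def equity_return(asset_return, leverage, borrow_rate=0.02):
        """Return on own equity when (leverage - 1) parts are borrowed."""
        return leverage * asset_return - (leverage - 1) * borrow_rate

    for r in (+0.05, -0.05):                 # assets move 5 % up or down
        for lev in (1, 2, 5, 10):
            print(f"asset {r:+.0%}, leverage {lev:2d}: "
                  f"equity return {equity_return(r, lev):+.0%}")
    # With tenfold leverage, a 5 % drop in asset value wipes out 68 % of
    # equity: the same mechanism that multiplies opportunities multiplies risk.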

4.3   Software Systems

What leaps in complexity might technological and organizational systems have made since Charles Perrow began, in the 1980s, to investigate the inherent structural susceptibility of their design? One is tempted to say that his arguments hold not only in the Age of Mechanics, but also in the Age of Electronics. The outstanding argument of “Normal Accidents” (1984) was that uncontrollability originates in the structure of large-scale technologies. Perrow therefore concentrated his analysis not on errors by operators, on design or equipment flaws, or on the disregard of safety regulations. He was more interested in analyzing the nature of high technology, and used two independent dimensions for classification: linear versus complex interactions, and tight versus loose coupling of technical elements. By combining both dimensions, Perrow generated a crosstab as a heuristic to highlight many dangerous technologies: nuclear energy systems, genetics, and the chemical industry, for example (Perrow 1984, p. 97).

Into the 1990s, he upheld his arguments in discussions with other theoreticians, some of which can also be found in the contributions to this special issue. The combination of complex interaction and tight coupling, which characterizes the form of technology discussed above, will produce unexpected interactions of errors that defeat safety devices. This does not mean that prevention, the anticipation of errors, and analytical decision-making are useless, but none of this guarantees safety. Referring to other social theories, Perrow claims that organizations are bounded in their rationality when making decisions, and that their mode of decision-making can be described with the “garbage can” model. That means decision-making under a high degree of uncertainty is characterized by unstable and unclear goals, misunderstanding and mislearning, happenstance and confusion. This leads to a pessimistic view regarding efficiency and safety goals. Furthermore, Perrow insists on a competition among the different goals of an organization; safety is not the only one, and is by no means considered the most important one. Also relevant to a high degree are production pressure, profit pressure, growth, prestige, and departmental power struggles (Perrow 1994).

Perrow transfers these insights to the present situation, in which many critical technologies are run on standardized, non-modular software systems. Through management decisions, this produces a high degree of security problems. It is a situation which could have far-reaching consequences, as the author shows in his contribution.

4.4   Critical Infrastructure

Carsten Orwat concerns himself with the hypothesis that systemic risks arise when technical systems and governance structures don’t develop adequately in relation to one another. He distinguishes three levels of analysis and their interactions: technological developments (1), societal organization or industrial structure as the sector’s internal governance (2), and the regulative level as external governance (3). An example which he delves into is the so-called “Smart Grid”: a socio-technical development in the energy sector with the objective of better integrating renewable energies with their volatile power generation.

New hazardous situations may result from the transformation to be expected in the energy sector if a tight connectivity of formerly loosely coupled components is pushed forward. The realization of an “Energy Internet”, with connections of all Smart Grid components by IP communication, leads to a new connectivity which entails a great number of points of attack, a low level of security for numerous “terminals”, and the weaknesses of the IP standards. Flaws in subsystems can possibly lead to the dysfunction of a large part of a system, or of the entire system.
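
How a local flaw can propagate through tightly coupled components may be sketched with a generic load-redistribution toy model (an illustration added here, not Orwat’s analysis; topology, loads, and capacities are assumptions): when one node fails, its load shifts onto its neighbours, which can overload and fail in turn.

    # Toy grid of five nodes, all running near capacity (assumed values).
    neighbours = {0: [1, 2], 1: [0, 2, 3], 2: [0, 1, 4], 3: [1, 4], 4: [2, 3]}
    load = {n: 0.8 for n in neighbours}
    capacity = 1.0

    failed = {0}            # a single initial component flaw
    frontier = [0]
    while frontier:
        node = frontier.pop()
        alive = [m for m in neighbours[node] if m not in failed]
        for m in alive:     # the failed node's load shifts to live neighbours
            load[m] += load[node] / len(alive)
            if load[m] > capacity:
                failed.add(m)
                frontier.append(m)

    print(f"failed components: {sorted(failed)} out of {len(neighbours)}")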

Orwat gathers arguments from diverse analyses of the energy sector which express the assumption that faulty or inadequate governance structures systematically cause rationally behaving actors to act in ways that bring about hazards. Through the economic pressure for reform in the electricity industry (privatization, competition), redundancies have been reduced, with power failures as a result. Economic considerations lead to less investment in IT security. For reasons of cost, SCADA systems are connected to the Internet. Under cost pressure, the use of standard software (COTS) in energy systems is to be expected, through which weaknesses in the system are multiplied – an assumption which is supported by Perrow’s analyses.

4.5   Natural Hazards and Socio-Technical Systems

Bijan Khazai, James E. Daniell, and Friedemann Wenzel provide us with data and interpretations, from the viewpoint of hazard research, on a recent catastrophe: the March 2011 Japan earthquake at the Tohoku coast (better known under the label “Fukushima disaster”). Above all, they attempt to analyze the different events and their effects as a concatenation, distinguishing causes, direct impacts, indirect impacts, and exacerbating factors. In essence, they come to the conclusion: “The Tohoku earthquake was typical of disasters with cascade-like spreading, and dynamic risk assessment procedures should follow from recent disaster experience that incorporates dynamic interactions between natural hazards, socio-economic factors, and technological vulnerabilities” (Khazai et al. in this issue). They describe how exogenous processes impose stress on technical and social systems and set off a chain of damage which has not yet been completely registered – primarily because the Fukushima ruin will long continue to be a danger to the health of the Japanese population.

Notes

[1]  I would like to thank my colleagues at ITAS for their help and discussions concerning the contents and composition of this thematic focus. Many thanks go to Robert Avila for his corrections and translations, and furthermore for his patience with the non-native speakers contributing to this issue.

[2]  Megacities suffer from these self-reinforcing effects: “If we improve the city’s appearance, furnish it with good streets, train connections, and apartments – if we make life more pleasant – ever more people are drawn to the environs” (a statement by Rahul Mehrotra, urban planner, cited in the German edition of “Maximum City: Bombay Lost and Found”; Mehta 2006, p. 185).

[3]  The beginnings of systematic risk calculation with statistics are assumed to lie in the 17th century. [Insurance] developed to the same extent that modern statistics (from the middle of the 17th century on) became well-founded and fully developed, i.e., to the extent that risk became calculable (Conze 2004, p. 848).

[4]  The much-quoted passage is: “A systemic risk [...] is one that affects the systems on which society depends: health, transport, environment, telecommunications, etc” (OECD 2003, p. 30).

References

Bertalanffy, L. von, 1950: An Outline of General System Theory. In: The British Journal for the Philosophy of Science 1/2 (1950), pp. 134–165

Berz, G. et al. (ed.), 2004: Megacities – Megarisks. Trends and Challenges for Insurance and Risk Management. Munich

Bunge, M., 2010: Soziale Mechanismen und mechanismische Erklärungen. In: Berliner Journal für Soziologie 20 (2010), pp. 371–381

Büscher, Chr., 2010: Formen ökologischer Aufklärung. In: Büscher, Chr.; Japp, K.P. (eds.): Ökologische Aufklärung. 25 Jahre „Ökologische Kommunikation“. Wiesbaden, pp. 19–49

Conze, W., 2004: Sicherheit, Schutz. In: Brunner, O.; Conze, W.; Koselleck, R. (eds.): Geschichtliche Grundbegriffe: historisches Lexikon zur politisch-sozialen Sprache in Deutschland – Band 5. Stuttgart, pp. 831–862

Deutschmann, Chr., 2008: Die Finanzmärkte und die Mittelschichten: der kollektive Buddenbrooks-Effekt. In: Leviathan 4 (2008), pp. 501–517

Dupuy, J.-P., 2005: Aufgeklärte Unheilsprophezeiungen. In: Gamm, G.; Hetzel, A. (eds.): Unbestimmtheitssignaturen der Technik: Eine neue Deutung der technisierten Welt. Bielefeld, pp. 81–102

Esposito, E., 2007: Die Fiktion der wahrscheinlichen Realität. Frankfurt a. M.

Giddens, A., 1990: The Consequences of Modernity. Cambridge

Groß, M.; Heinrichs, H., 2010: New Trends and Interdisciplinary Challenges in Environmental Sociology. In: Groß, M.; Heinrichs, H. (eds.): Environmental Sociology – European Perspectives and Interdisciplinary Challenges. Dordrecht, pp. 1–16

Grunwald, A., 2007: Auf dem Weg zu einer Theorie der Technikfolgenabschätzung: der Einstieg. In: Technikfolgenabschätzung – Theorie und Praxis 16/1 (2007), pp. 4–17

Hobbes, Th., 2007 [1651]: Leviathan; http://ebooks.adelaide.edu.au/h/hobbes/thomas/h68l/index.html (download 29.11.11)

IRGC – International Risk Governance Council, 2009: Risk Governance Deficits. Geneva

IRGC – International Risk Governance Council, 2010: The Emergence of Risks. Geneva

Kambhu, J.; Weidman, S.; Krishnan, N., 2007: New Directions for Understanding Systemic Risk. Washington, DC

Kaufman, G.G.; Scott, K.E., 2003: What is Systemic Risk, and Do Bank Regulators Retard or Contribute to It? In: Independent Review 7/3 (2003), pp. 371–391

Ladeur, K.-H., 1993: Risiko und Recht. Von der Rezeption der Erfahrung zum Prozeß der Modellierung. In: Bechmann, G. (ed.): Risiko und Gesellschaft: Grundlagen und Ergebnisse interdisziplinärer Risikoforschung. Opladen, pp. 209–233

Lenton, T.M.; Held, H.; Kriegler, E. et al., 2008: Tipping Elements in the Earth’s Climate System. In: Proceedings of the National Academy of Sciences 105/6, pp. 1786–1793

Luhmann, N., 1993: Die Paradoxie des Entscheidens. In: Verwaltungsarchiv 84/3 (1993), pp. 287–310

Luhmann, N., 1994: Kapitalismus und Utopie. In: Deutsche Zeitschrift für europäisches Denken 48/3 (1994), pp. 189–198

Luhmann, N., 1995: Social Systems. Stanford, CA

Luhmann, N., 1999: Die Wirtschaft der Gesellschaft. Frankfurt a. M.

Luhmann, N., 2005: Risk: A Sociological Theory. New Brunswick, NJ

Maturana, H.R., 2001: Was ist erkennen? Munich

Mehta, S., 2006: Bombay: Maximum City. Frankfurt a. M.

Mitchell, S., 2008: Komplexitäten. Frankfurt a. M.

NRC – National Research Council, 2009: Science and Decisions. Advancing Risk Assessment. Washington, DC

OECD – Organisation for Economic Co-operation and Development, 2003: Emerging Risks in the 21st Century. An Agenda for Action. Paris

Paschen, H.; Gresser, K.; Conrad, F., 1978: Technology Assessment: Technologiefolgenabschätzung. Frankfurt a. M.

Perrow, C., 1984: Normal Accidents. Living with High-Risk Technologies. New York

Perrow, C., 1994: The Limits of Safety: The Enhancement of a Theory of Accidents. In: Journal of Contingencies and Crisis Management 2/4 (1994), pp. 212–220

Perrow, C., 2007: The Next Catastrophe. Princeton, NJ

Renn, O.; Keil, F., 2008: Systemische Risiken: Versuch einer Charakterisierung. In: GAIA 17/4, pp. 349–354

Strub, Chr., 1998: System; Systematik; systematisch. In: Ritter, J.; Gründer, K.; Gabriel, G. (eds.): Historisches Wörterbuch der Philosophie – Band 10, pp. 825–856

Urry, J., 2004: The “System” of Automobility. In: Theory, Culture & Society 21/4–5, pp. 25–39

Vaughan, D., 1996: The Challenger Launch Decision. Risky Technology, Culture, and Deviance at NASA. Chicago

Vince, G., 2011: An Epoch Debate. In: Science 334 (2011), pp. 32–37

WBGU – Wissenschaftlicher Beirat der Bundesregierung Globale Umweltveränderungen, 1999: Strategien zur Bewältigung globaler Umweltrisiken. Jahresgutachten 1998. Berlin

Wenzel, F.; Bendimerad, F.; Sinha, R., 2007: Megacities – Megarisks. In: Natural Hazards 42 (2007), pp. 481–491

Willke, H., 2007: Smart Governance. Governing the Global Knowledge Society. Frankfurt a. M.

Contact

Dr. phil. Christian Büscher
Institut für Technikfolgenabschätzung und Systemanalyse (ITAS)
Karlsruher Institut für Technologie (KIT)
Karlstr. 11, 76133 Karlsruhe
Tel.: +49 721 608-23181
E-Mail: christian.buescher@kit.edu