Governance between ignorance and evidence: Technology assessment in the context of pandemic crisis management

Gabriel Bartl*, 1

* Corresponding author:

1 Centre Marc Bloch, Humboldt-Universität zu Berlin, Berlin, DE

Abstract  This article examines the relationship between knowledge and ignorance in the context of crises and corresponding technological solutions. It focuses on the case of pandemic simulation models as a specific form of dealing with uncertainty, which marks a transition from classical risk management to algorithmically organized anticipation practices. The thesis of the paper is that technology assessment is affected by this development when it comes to reflecting on the normative premises and social and political implications of digital crisis technologies. This refers in particular to what is considered crisis-relevant knowledge in the first place, according to what logics it circulates, and what attributions and effects can be observed with regard to digital crisis technologies. Against this background, the paper discusses the relevance of social science knowledge as well as the role of deliberative practices in times of crisis.

Governance zwischen Nichtwissen und Evidenz: Technikfolgenabschätzung im Kontext des Pandemie-Krisenmanagements

Zusammenfassung  Dieser Artikel beleuchtet das Verhältnis zwischen Wissen und Nichtwissen im Kontext von Krisen und entsprechenden technologischen Lösungsangeboten. Dabei wird exemplarisch auf den Fall von pandemischen Simulationsmodellen als eine spezifische Form des Umgangs mit Unsicherheit fokussiert, die einen Übergang vom klassischen Risikomanagement zu algorithmisch organisierten Antizipationspraktiken markiert. These des Beitrags ist, dass die Technikfolgenabschätzung von dieser Entwicklung nicht unberührt bleibt, wenn es darum geht, über die normativen Prämissen und die gesellschaftlichen und politischen Implikationen digitaler Krisentechnologien zu reflektieren. Dies bezieht sich insbesondere darauf, was überhaupt als krisenrelevantes Wissen angesehen wird, nach welchen Logiken dieses zirkuliert und welche Zuschreibungen und Effekte sich mit Blick auf digitale Krisentechnologien beobachten lassen. Vor diesem Hintergrund wird die Relevanz sozialwissenschaftlichen Wissens in Krisenzeiten ebenso diskutiert wie die Rolle deliberativer Praktiken.

Keywords  ignorance, technologies of preparedness, pandemic simulation models, co-production of knowledge

This article is part of the Special topic “Potentials of technology assessment in sudden and enduring crises,” edited by T. Sinozic-Martinez, J. Hahn and N. Weinberger.

© 2023 by the authors; licensee oekom. This Open Access article is licensed under a Creative Commons Attribution 4.0 International License (CC BY).

TATuP 32/2 (2023), pp. 30–35

Received: 16. 01. 2023; revised version accepted: 03. 05. 2023; published online: 06. 07. 2023 (peer review)


In times of crisis, policy makers must make decisions under high uncertainty and time pressure. This is where the promises of modern security and crisis technologies (e.g., digital warning apps or various kinds of surveillance devices) come into play (Bartl 2020). In light of this development, it is evident that “digital prediction tools increasingly complement or replace other practices of coping with an uncertain future” (Heimstädt et al. 2021, p. 1). It therefore came as no surprise that simulations and contagion models, as specific methods of crisis response, gained enormous importance during the COVID-19 pandemic (Kaminski et al. 2023). The calculative determination of indicators and thresholds, as well as the prognostic modeling of risks, provides policy makers with seemingly unambiguous knowledge that claims objectivity and hard evidence, in contrast to the supposed relativity and context-dependency of social science knowledge (Boin et al. 2021).

Several follow-up questions arise from this proclaimed surge in the importance of crisis technologies; they will be reflected on here with regard to the COVID-19 pandemic. First, there is the question of what evidence and evidence-based action should mean, particularly in the context of scientific policy advice. It must be decided to what extent scientific expertise should rest primarily on quantitative approaches such as modeling and forecasting, or whether ‘non-numerical’ qualitative knowledge can also provide useful insights for crises like pandemics. If future crises are increasingly characterized by a high degree of ignorance, how can knowledge-intensive, data-based approaches take this into account? The relationship between knowledge and ignorance is thus a crucial component of knowledge-intensive technologies.

The thesis of an increasing technologization in dealing with crises and uncertainties, as seen in the pandemic, raises the question of what role technology assessment (TA) should take with regard to the aforementioned aspects. In addition to considering the influence of technology on society, including unintended side effects, TA must – presumably even more than before – explore and develop options for action and design. The particular difficulty here lies in the volatility and dynamics of crises, which contradict universal, temporally stable, and unambiguous advice.

The forecasting of crises and the shift from risk to anticipation

Anticipatory crisis management in the light of ignorance and uncertainty: COVID-19 as paradigmatic example

The COVID-19 pandemic can be understood not only as a health crisis but also as a political, social, and economic crisis, culminating in a state of exception that called for extraordinary measures. High degrees of uncertainty, threat, and urgency (Boin et al. 2018) affected the quality of decision-making processes and created tensions in the relationship between scientific expertise and policy making.

A precise analysis of the characteristics and dynamics of uncertainty, and of the relationship between knowledge and ignorance, is an essential prerequisite for answering the question of which solutions are to be sought and what role technology should play in shaping the future. Against the background of crises like pandemic emergencies, it is noticeable that the future has become something threatening that has to be controlled. In this context, Staab (2022, p. 77) diagnoses that it is less and less a matter of conquering the future; instead, the focus is increasingly on making the present fit for the future.

This claim leads to the development of specific counter-strategies, especially in response to acute crises, among which anticipation seems to emerge as the winner with regard to political crisis responses. Both in foresight programs and within certain practices of preparedness, the mode of anticipation is usually at the center. When considering anticipation as a concept, it is striking that “anticipatory action” (Anderson 2010, p. 778) is based on practices that go beyond the analytics of risk: the lack of probabilistic knowledge about past events is compensated for by specific approaches such as the simulation models used during the COVID-19 pandemic. This reflects the insight that in highly dynamic pandemics with new and mutating virus variants, probabilistic risk assessments are considered rather unsuitable, because the uncertainties involved are highly complex and often hardly quantifiable.

Instead of relying on a workable management of calculable risks, anticipation transfers unmanageable ignorance and contingency into more manageable practical challenges. However, this transformation rests on strong presuppositions. This is because dealing with contingency is a crucial aspect if one wants to think of the future in the plural and understand it as a process of negotiation, instead of assuming technologically controlled constructions of unambiguity. Compared to theory-guided, risk-based approaches, machine learning procedures, for example, displace causation as the classical scientific principle when data are analyzed solely according to the principle of correlation. This ‘end of theory’ (Anderson 2008) marks a paradigm shift in dealing with knowledge by questioning hitherto fundamental scientific methods and principles, which is also highly consequential for society.

Crisis technologies and the circulation of knowledge and power: the case of COVID-19 simulation models

The shift from causation to correlation also manifests itself in the use of computer simulations for pandemic crisis management, when simulations are viewed as a form of algorithmic decision making (Hälterlein 2023, p. 32). The majority of computer simulations during COVID-19 relied on explanation-based statistical methods such as SIR and SEIR models, Bayesian methods, and network models, while agent-based models were used less frequently. This could change in the future, when faster computing capacities will likely favor the use of machine learning and AI models. In particular, the possibility of modeling individual behavior, for example in agent-based models, offers the promise of more realistic results (Hälterlein 2023, p. 31). This is another reason why these kinds of simulations are likely to become increasingly important in the context of scientific policy advice (Eyert 2020).
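The compartmental models mentioned above can be illustrated with a minimal sketch. The following Python fragment integrates the classic SIR equations with a simple Euler scheme; all parameter values (contact rate, recovery rate, population size) are illustrative assumptions, not calibrated to any real outbreak or to the models cited here.

```python
# Minimal SIR (Susceptible-Infected-Recovered) sketch of an
# explanation-based compartmental model. All parameters are
# illustrative assumptions.

def sir_model(beta, gamma, s0, i0, r0, days, dt=0.1):
    """Integrate the SIR equations with a simple Euler scheme."""
    s, i, r = s0, i0, r0
    n = s0 + i0 + r0  # total population stays constant
    trajectory = [(0.0, s, i, r)]
    steps = int(days / dt)
    for step in range(1, steps + 1):
        ds = -beta * s * i / n          # new infections leave S
        di = beta * s * i / n - gamma * i  # infections enter I, recoveries leave
        dr = gamma * i                  # recoveries enter R
        s += ds * dt
        i += di * dt
        r += dr * dt
        trajectory.append((step * dt, s, i, r))
    return trajectory

# Illustrative run: beta=0.3 (contact rate), gamma=0.1 (recovery rate),
# i.e. an assumed basic reproduction number R0 = beta/gamma = 3.
traj = sir_model(beta=0.3, gamma=0.1, s0=999_000, i0=1_000, r0=0, days=160)
peak_day, _, peak_infected, _ = max(traj, key=lambda row: row[2])
print(f"Peak of ~{peak_infected:,.0f} infections around day {peak_day:.0f}")
```

Even this toy model makes the structural point at issue: the forecast is fully determined by the assumed parameters, so whoever sets beta and gamma effectively sets the predicted course of the epidemic.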

Considering the role of simulations as number-based crisis management approaches during the COVID-19 pandemic, it should be noted that a suggestion of unambiguousness can often be found in certain practices of constructing and modeling uncertainty. On the one hand, quantitative modeling and the resulting number-based forecasts of pandemic incidence certainly provide important guidance for policy making. On the other hand, precisely because of their numerical orientation, number-based recommendations run the risk of obscuring ambiguities, especially when they are presented as prognostic models. In addition, modeling often does not account for its own biasing effects. Although numbers per se make no claim to neutrality or evidence, they are performative in that they influence and change social realities. Consequently, the normative assumptions, biases, and limitations of simulations should be disclosed to policy makers (the limitations, for example, by specifying significance levels) and should also be considered in the design of corresponding decision support systems (Hälterlein 2023). Conversely, this also means that modelers must not advance political claims – as, however, happened in the case of the COVID-19 pandemic.
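The risk of obscured ambiguity can be made concrete with a deliberately simple, hypothetical sketch: the same current case count projected under two equally defensible early-pandemic growth assumptions. All numbers here are invented for illustration only.

```python
# Hypothetical sketch: identical starting data projected under two
# plausible growth-rate assumptions. All figures are invented.

def project_cases(current_cases, daily_growth_rate, days):
    """Naive exponential projection of case numbers."""
    return current_cases * (1 + daily_growth_rate) ** days

cases_today = 10_000
# 5% vs. 10% daily growth: both defensible estimates early in an outbreak
for growth in (0.05, 0.10):
    forecast = project_cases(cases_today, growth, days=14)
    print(f"{growth:.0%} daily growth -> ~{forecast:,.0f} cases in two weeks")
```

The two ‘evidence-based’ forecasts differ by roughly a factor of two after only two weeks. A single published number hides exactly this spread, which is why disclosing assumptions and uncertainty ranges matters.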


Looking at the nexus between science and policy – and thus between knowledge and power – two things stand out. First, especially in the presence of massive epistemic uncertainty, as in the case of COVID-19, there may be competing evidence, which can lead policymakers to select specific evidence in order either to support or to prevent a lockdown. Second, in addition to this selective reception of knowledge by policy makers, selectivity in the production of knowledge must also be reflected on. This implies not only focusing on the production of knowledge but also paying more attention to the production of ignorance, as McGoey (2012), for example, proposes with her notion of “strategic ignorance”. For her, ignorance is not only a negative phenomenon but also an instrument of governance. In terms of pandemic simulation models, this means that attention should also be paid to what is not being modeled. One example is the fact that more family-friendly policy options (such as keeping schools and daycare centers open) were not included in available simulation models, at least until June 2020 (Eyert 2020). If certain policy options are not represented in the simulations, this highlights the power that arises from selectivity in the modeled parameters.

Finally, this effect is reinforced by practices of visualization and their objectifying character. Visualization can function as a kind of evidence-based legitimation for policy making. Yet visualizations are not neutral representations of data; rather, they are the result of normative judgements and interest-led decisions. Hence, public health simulations based on the visualization of numbers have the potential to contribute to the stabilization of power positions. In this respect, “images of epidemics and zoonoses are not mere representations of infectious diseases and their social impact, but rather actants in a broader political economic arena of power and knowledge” (Keck et al. 2019, p. 8). On the one hand, the use of simulation models enables responsible emergency actors to perform within the regime of public health by referring to their capacity to govern rationally. On the other hand, regarding the problem of accountability in algorithmic governance technologies like pandemic simulation models, political decisions no longer appear attributable, which marks a substantial democratic deficit through the diffusion of political responsibility.

Technology assessment in the context of dealing with crises

Based on the situation outlined above, the roles and challenges of TA will be discussed along the following three major points:

1. Technology assessment needs to reflect on the construction principles of crisis technologies

The social construction of technology has various facets. Regarding the collection of data, it is crucial to decide which data appear useful at all. According to Benjamin Bratton’s (2022) interpretation of the COVID-19 pandemic, we are collecting the wrong data to be able to counter health crises successfully. He therefore argues for an adaptation of the infrastructures of knowledge, planning, and intervention, which should not only be more preventive but also more inclusive. In this respect, TA would also have to ask to what extent and in what ways crisis technologies like pandemic simulation models, through their inherent materialization of values, contribute to enacting or stabilizing certain social conditions at the expense of others.

Besides inclusiveness and equity, transparency is also essential for the assessment of crisis technologies, considering that the assumptions inscribed in technologies tend to be opaque and thus hide power relations (Bartl 2023). The concrete social conditions of the production of science-based expertise often remain hidden. The phenomenon of situated knowledge (Haraway 1988) and the fact that scientific knowledge is based on certain premises and is selected, translated, and often simplified in mediation processes are hardly transparent to the public, which, as explicated above, also holds for the use of simulation models during the pandemic.

Certain TA approaches like constructive TA (CTA) (Rip et al. 1995) or Responsible Research and Innovation (RRI) (von Schomberg 2013) already focus on uncovering the hidden premises of technologies. They also address the issue of conflicting values. This is crucial because, when the social construction of technology is understood as the transfer of social complexity into binary code structures within algorithmic governance approaches, the materialization of values inevitably includes the dimensions of conflict and power. CTA and RRI thus make it possible to open up the discourse when developing technological solutions for the anticipation of future crises. This opening can also contribute to more just crisis responses by initiating dialogue at an early stage of the construction of the respective technologies, so that fairness and legitimacy issues are considered within the social negotiation processes of dealing with crises.

It could also be rewarding if TA reflected more thoroughly, from a perspective informed by Science and Technology Studies, on the circulation of power and knowledge, for example within the whole system of pandemic emergency management (encompassing organizational hierarchies and competences as well as legal frameworks and political decision processes), by asking what enables, disables, and shapes these circulations and flows of knowledge and power, especially within and between science and politics in times of crisis. For TA, this implies, for example, drawing attention to the fact that behind visualizations that appear objective, there are often struggles over normative claims. In doing so, it is also indispensable that TA addresses its own normative premises (Kollek 2019).

2. Technology assessment needs to highlight to what extent dealing with crises is a contingent process

Speaking of transparency and the importance of uncovering the implicit and explicit construction principles of new technologies: according to Amoore (2020), it is not sufficient to open black boxes. In her view, algorithms and their social relations cannot be described simply by analyzing their code, nor can ethics be coded into algorithms. Instead, Amoore proposes to address the ethical responsibility of algorithms by uncovering the socio-technical conditions under which they emerge and function. For her, the problems of algorithmic predictions result from a “calculative rationality that reduces the multiplicity of potentials to a single output” (Amoore 2020, p. 51). Consequently, in this sense, it cannot be a matter of making algorithmic procedures transparent in order to gain legitimacy.

Several conclusions can be drawn from Amoore’s position that technologies of anticipation in particular tend to decrease the multiplicity of options and thus narrow the spectrum of contingency. To respond to this constriction of contingency, attention needs to be paid to a diversity of (societal) stakeholders in order to take different forms of knowledge and ideas into account at an early stage. By considering multiple perspectives on contingent and complex problems, the focus is not on one single solution. Instead, it is about revealing and communicating the variety of preferences, values, attitudes, and risk perceptions regarding the effects and implications of technologies. This is based on the assumption that risk conflicts by no means follow epistemically clearly definable contours, but rather extend along social, political, and economic categories. Risk management developed from within society could thus arrive at completely different answers, far removed from the monopoly position of technical solutions (Voss 2022).


At this point, it should be mentioned that TA has long advocated thinking in terms of alternative options, for example by including civil society perspectives in participatory TA processes (Krings and Grunwald 2013). A perspective that looks in a similar direction can be found in the conceptions of ‘post-normal science’. Representatives of post-normal science assume that if four criteria are met (facts uncertain, values in dispute, stakes high, decisions urgent), as was the case in the COVID-19 pandemic, the classic understanding of science is no longer sufficient and should therefore be expanded: “Under post-normal conditions, the knowledge base should be pluralized and diversified to include the widest possible range of high-quality potentially usable knowledges and sources of relevant wisdom, without enforcing the demand for science to speak with one voice.” (Waltner-Toews et al. 2020)

However, with regard to the diversification of knowledge within the framework of participatory processes, it must be noted that many TA-driven participation processes have remained unclear in their objectives and diffuse in their methodology, which has disappointed the original expectations (Grunwald 2022, pp. 85–91). This has not least to do with the fact that participation requires time and resources, which are scarce commodities, especially in times of crisis. In this respect, TA is particularly called upon to develop innovative approaches that take the specifics of crises into account. With reference to the possibilities of deliberative practices, TA should therefore first evaluate how participation, as a pluralistic knowledge resource, can be made more fruitful for producing knowledge about crises. This can also mean considering alternative forms of dealing with uncertainty and crises within deliberative practices and actively making use of experimental approaches, which are, if at all, only conditionally programmable, because ‘machines’ are not equipped with situational awareness to the same extent as humans (Suchman 2015).

3. Technology assessment needs to reflect the importance of social science knowledge for dealing with crises and relate it to natural science knowledge within policy advice panels

The demand for a stronger pluralization of knowledge resources naturally also applies to science itself. In this respect, the importance of social science knowledge for dealing with crises must also be reflected on by TA and related to the features of natural science knowledge. However, it could be observed that social science knowledge was insufficiently mobilized for dealing with COVID-19. This may be related to the fact that the epistemic structure of natural and technical scientific knowledge accommodates the logics of modern forms of governance: the calculative determination of indicators and thresholds and the prognostic modeling of risks provide policymakers with seemingly unambiguous knowledge that suggests objectivity, in contrast to the supposed relativity and context-dependence of social science knowledge. Especially in times of digital solutionism, it is important to critically reflect on the return of technology-deterministic beliefs after they had long been pushed into the background by social constructivist concepts (Grunwald 2022, pp. 93–94).


However, if social science expertise is ignored beyond natural science knowledge production despite apparent crisis-related entanglements, this threatens to run counter to effective, equitable, and legitimate crisis responses (Hulme et al. 2020), because, again, the narrowing of perspectives results in a loss of options for action and limits the range of possibilities for shaping the future. Moreover, focusing on the advice of a few selected scientific disciplines could stabilize path dependencies that counteract rather than promote a creative and innovative approach to crises.

Rather than generating unambiguous recommendations for action, it is much more a matter of repeatedly exploring the dynamics of these relationships anew and making them usable as a driving force for research. The basis for this is a processual understanding of policy advice that puts forward proposals on how political decisions can be cooperatively co-produced without being overwhelmed by a supposedly ‘better’ scientific knowledge whose value settings are not democratically clarified (Blätte 2019). The debate on whether evidence suffices as the sole guide for policy decisions has led to controversies not only since the COVID-19 pandemic; these controversies have rightly brought the importance of value-based trade-offs within political decision-making processes back to the fore (Bogner 2021). A possible orientation could again be provided by the approach of ‘post-normal science’, which argues for a reversal of the hegemony of hard facts over soft values (Funtowicz and Ravetz 2003).


The starting point of this contribution was that ignorance and uncertainty, as characteristic elements of crises, limit the ability to govern successfully. They also erode the plausibility of calculable risk prevention, which paves the way for the promises of preparedness technologies. But technologies alone cannot initiate fundamental change, and they tend to narrow the multiplicity of options for alternative modes of crisis response. TA must actively address this issue and provide answers to the question of why dealing with ignorance and uncertainty is primarily framed technologically. This also implies challenging those narratives of control that are associated with crisis technologies and that suggest the existence of unambiguous epistemic facts. With reference to simulation models, such narratives can induce a false sense of certainty, especially when the underlying assumptions are forgotten. This also touches upon the collection of data: what counts as evidence, and which data are essential to deal with crises? Here, TA should question the objectivity of numbers, which appear like evidence but can harbor severe frictions with regard to normative premises and related value conflicts. Hence, TA should shed light on the expectations placed in technological solutions during crises and ask how these are justified at the interface between science and politics.

This also means that in policy advice, the interpretative claims of, for example, simulation models should not only be questioned; their legitimizing function for political decisions should also be discussed. In this respect, TA should critically reflect on hierarchies in different forms of knowledge flows, for instance with regard to the importance of social science or collaborative knowledge production compared to natural science approaches at the nexus between science and policy. It should thus be illuminated how both the construction and the circulation of crisis-relevant knowledge are influenced, i.e. supported or hindered, by techno-material, institutional, organizational, and also normative factors.

Admittedly, TA has a tough job, since its methods and procedures take time, and time pressure is a typical feature of crises. Nevertheless, it is important that TA provides both society and political decision-makers with reflexive, orientational knowledge.

Funding  This article has received funding by the Federal Ministry of Education and Research (BMBF) as part of the author’s work in the research project ‘Multiple Crises. COVID-19 and the Entanglements of Public Health, Security and Ecology in Europe’.

Competing interests  The author declares no competing interests.


Amoore, Louise (2020): Cloud ethics. Algorithms and the attributes of ourselves and others. Durham, NC: Duke University Press.

Anderson, Ben (2010): Preemption, precaution, preparedness. Anticipatory action and future geographies. In: Progress in Human Geography 34 (6), pp. 777–798.

Anderson, Chris (2008): The end of theory. The data deluge makes the scientific method obsolete. In: Wired 16 (7). Available online at, last accessed on 11. 05. 2023.

Bartl, Gabriel (2020): Implikationen der Technisierung von Sicherheit. Theoretische Perspektiven und forschungsmethodische Herausforderungen. In: Sabrina Ellebrecht, Nicholas Eschenbruch and Peter Zoche (eds.): Sicherheitslagen und Sicherheitstechnologien. Beiträge der ersten Sommerakademie der zivilen Sicherheitsforschung 2018. Münster: LIT Verlag, pp. 125–138.

Bartl, Gabriel (2023): Krise und technologischer Solutionismus. Die politische Dimension des digitalisierten Umgangs mit Unsicherheit. In: Andreas Wagener and Carsten Stark (eds.): Die Digitalisierung des Politischen. Wiesbaden: Springer, pp. 45–62.

Blätte, Andreas (2019): Politikberatung aus sozialwissenschaftlicher Perspektive. In: Svenja Falk, Manuela Glaab, Andrea Römmele, Henrik Schober and Martin Thunert (eds.): Handbuch Politikberatung. Wiesbaden: Springer, pp. 25–38.

Bogner, Alexander (2021): Die Epistemisierung des Politischen. Wie die Macht des Wissens die Demokratie gefährdet. Ditzingen: Reclam.

Boin, Arjen; ’t Hart, Paul; Kuipers, Sanneke (2018): The crisis approach. In: Havidán Rodríguez, William Donner and Joseph Trainor (eds.): Handbook of disaster research. Cham: Springer, pp. 23–38.

Boin, Arjen; McConnell, Allan; ’t Hart, Paul (2021): Governing the pandemic. The politics of navigating a mega-crisis. Cham: Palgrave Macmillan.

Bratton, Benjamin (2022): The revenge of the real. Politics for a post-pandemic world. New York, NY: Verso.

Eyert, Florian (2020): Epidemie und Modellierung. Das Mathematische ist politisch. In: WZB Mitteilungen 168, pp. 82–85. Available online at, last accessed on 11. 05. 2023.

Funtowicz, Silvio; Ravetz, Jerome (2003): Post-normal science. In: International Society for Ecological Economics (ed.): Internet Encyclopaedia of Ecological Economics. Boston, MA: ISEE. Available online at, last accessed on 11. 05. 2023.

Grunwald, Armin (2022): Technikfolgenabschätzung. Einführung. Baden-Baden: Nomos.

Hälterlein, Jens (2023): Agentenbasierte Modellierung und Simulation im Pandemiemanagement. In: TATuP – Journal for Technology Assessment in Theory and Practice 32 (1), pp. 30–35.

Haraway, Donna (1988): Situated knowledges. The science question in feminism and the privilege of partial perspective. In: Feminist Studies 14 (3), pp. 575–599.

Heimstädt, Maximilian; Egbert, Simon; Esposito, Elena (2021): A pandemic of prediction. On the circulation of contagion models between public health and public safety. In: Sociologica 14 (3), pp. 1–24.

Hulme, Mike; Lidskog, Rolf; White, James; Standring, Adam (2020): Social scientific knowledge in times of crisis. What climate change can learn from coronavirus (and vice versa). In: WIREs Climate Change 11 (4), p. e656.

Kaminski, Andreas; Gramelsberger, Gabriele; Scheer, Dirk (2023): Modeling for policy and technology assessment. Challenges from computerbased simulations and artificial intelligence. In: TATuP – Journal for Technology Assessment in Theory and Practice 32 (1), pp. 11–17.

Keck, Frédéric; Kelly, Ann; Lynteris, Christos (2019): Introduction. The anthropology of epidemics. In: Ann Kelly, Frédéric Keck and Christos Lynteris (eds.): The anthropology of epidemics. London: Routledge, pp. 1–24.

Kollek, Regine (2019): Normativität in der Technikfolgenabschätzung. In: TATuP – Journal for Technology Assessment in Theory and Practice 28 (1), pp. 11–64.

Krings, Bettina-Johanna; Grunwald, Armin (2013): Partizipation als konzeptionelles Strukturprinzip von TA. In: TATuP – Journal for Technology Assessment in Theory and Practice 22 (1), pp. 73–75.

McGoey, Linsey (2012): The logic of strategic ignorance. In: The British Journal of Sociology 63 (3), pp. 553–576.

Rip, Arie; Misa, Thomas; Schot, Johan (1995): Managing technology in society. The approach of constructive technology assessment. London: Pinter.

Staab, Philipp (2022): Anpassung. Leitmotiv der nächsten Gesellschaft. Berlin: Suhrkamp.

Suchman, Lucy (2015): Situational awareness. Deadly bioconvergence at the boundaries of bodies and machines. In: MediaTropes eJournal 5 (1), pp. 1–24. Available online at, last accessed on 11. 05. 2023.

von Schomberg, René (2013): A vision of responsible research and innovation. In: Richard Owen, John Bessant and Maggy Heintz (eds.): Responsible innovation. Managing the responsible emergence of science and innovation in society. Chichester: Wiley, pp. 51–70.

Voss, Martin (2022): Institutionelles Risikomanagement. In: Aus Politik und Zeitgeschichte 72 (23–25), pp. 19–25. Available online at, last accessed on 11. 05. 2023.

Waltner-Toews, David et al. (2020): Post-normal pandemics. Why COVID-19 requires a new approach to science. In: Bioleft News, 06. 04. 2020. Available online at, last accessed on 11. 05. 2023.


Dr. Gabriel Bartl

has dealt with the phenomenon of insecurity in several research projects. The central question, time and again, has been which social implications result from the observable trend towards a digitalization of security. He currently leads the research project ‘Multiple Crises’ at the Centre Marc Bloch e. V., Humboldt-Universität zu Berlin.