Editorial

Armin Grunwald*, 1

* Corresponding author: armin.grunwald@kit.edu

1 Institute for Technology Assessment and Systems Analysis, Karlsruhe Institute of Technology, Karlsruhe, DE

© 2025 by the authors; licensee oekom. This Open Access article is published under a Creative Commons Attribution 4.0 International Licence (CC BY).

TATuP 34/3 (2025): p. 3–3, https://doi.org/10.14512/tatup.7265

Published online: 15 December 2025

Technology assessment (TA) owes its existence primarily to a problem that no one wants to have: unintended and often negative consequences of technology. Climate change, microplastics, biodiversity loss, nuclear waste, traffic noise, and anti-democratic communication on the internet – it would be much more pleasant if none of this existed. TA is therefore intended not only to identify opportunities and desirable innovations in ongoing and future technological progress but also to consider potential negative effects at an early stage, so that these can be prevented, minimized, compensated for, or limited.

An obvious question is whether unintended negative consequences of technology cannot be prevented at their root: in the design of the technology itself. If technology could be developed so that its production, use, and disposal were automatically climate-friendly, environmentally compatible, socially just, and in line with democracy and human rights, then the problem of negative consequences of technology would be solved for the future. The consequences that have already occurred, such as climate change, would still have to be dealt with, but after that, the Anthropocene would look different – much brighter than today. And TA would have achieved the greatest success possible for problem-oriented research: The problem would be solved and TA would become superfluous.

As tempting as this sounds, an old experience speaks against it: Technological consequences are not simply consequences of technology. Rather, they arise as combined effects of technical parameters and human behavior and are therefore, in a sense, socio-technical. Climate change, for example, is not simply a consequence of combustion engines and fossil-fueled power plants, but of the fact that millions, indeed billions, of people use the energy they provide, have developed lifestyles with corresponding consumption and mobility habits, and require extensive industrial production to maintain them.

To realize the above vision, this central human factor would have to be excluded from the emergence of technological consequences. Approaches such as ‘ethics by design,’ ‘privacy by design,’ and ‘sustainability by design’ suggest that the consequences of technology can be controlled and steered in desired directions: technical design would enforce certain consequences of technology use and prevent others.

Whether and to what extent this can be achieved remains an open question, especially in view of human creativity, which time and again defies technical constraints and leads to surprises. This Special topic brings together exciting examples and approaches to answer this question.

Armin Grunwald

Armin Grunwald

Institute for Technology Assessment and Systems Analysis, Karlsruhe Institute of Technology, Karlsruhe, DE

armin.grunwald@kit.edu