<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE article
  PUBLIC "-//NLM//DTD JATS (Z39.96) Journal Publishing DTD with MathML3 v1.2 20190208//EN" "JATS-journalpublishing1-mathml3.rng">
<article xmlns:xlink="http://www.w3.org/1999/xlink"
         article-type="research-article"
         dtd-version="1.2"
         xml:lang="en"><?letex RNG_JATS-journalpublishing1-mathml3 ok?>
   <front>
      <journal-meta>
         <journal-id/>
         <journal-title-group>
            <journal-title>TATuP – Journal for Technology Assessment in Theory and Practice</journal-title>
         </journal-title-group>
         <issn pub-type="ppub">2568-020X</issn>
      </journal-meta>
      <article-meta>
         <article-id>7231</article-id>
         <article-id pub-id-type="doi">10.14512/tatup.7231</article-id>
         <article-categories>
            <subj-group>
               <subject>Research article</subject>
            </subj-group>
            <subj-group>
               <subject>Special topic · Technology assessment and future warfare: The Good, the Bad, and the Ugly</subject>
            </subj-group>
         </article-categories>
         <title-group>
            <article-title xml:lang="en">Rethinking participatory technology assessment in security governance</article-title>
            <trans-title-group>
               <trans-title xml:lang="de">Neue Ansätze zur partizipativen Technikfolgenabschätzung in der Sicherheitsgovernance</trans-title>
            </trans-title-group>
         </title-group>
         <contrib-group>
            <contrib contrib-type="author" corresp="yes" id="Au1" xlink:href="#Aff1">
               <contrib-id contrib-id-type="orcid">https://orcid.org/0000-0001-7482-465X</contrib-id>
               <name name-style="western">
                  <surname>Mahr</surname>
                  <given-names>Dana</given-names>
                  <prefix>Dr.</prefix>
               </name>
               <address>
                  <email>dana.mahr@kit.edu</email>
               </address>
               <bio>
                  <boxed-text id="FPar1" specific-use="Style1">
                     <caption>
                        <title>DR. DANA MAHR</title>
                     </caption>
                     <p>is a researcher at the Institute for Technology Assessment and Systems Analysis (ITAS) at the Karlsruhe Institute of Technology (KIT). She earned her PhD from Bielefeld University in 2013 and previously held research positions at the Université de Genève and the Universität zu Lübeck.</p>
                     <fig id="Figa">
                        <label/>
                        <caption>
                           <title/>
                        </caption>
                        <graphic specific-use="Print" xlink:href="910000_2026_7231_Figa_Print.eps"/>
                        <graphic specific-use="HTML" xlink:href="910000_2026_7231_Figa_HTML.png"/>
                     </fig>
                  </boxed-text>
               </bio>
               <aff id="Aff1">
                  <institution>Karlsruhe Institute of Technology</institution>
                  <institution content-type="dept">Institute for Technology Assessment and Systems Analysis</institution>
                  <addr-line>
                     <city>Karlsruhe</city>
                     <country>Germany</country>
                  </addr-line>
               </aff>
            </contrib>
         </contrib-group>
         <pub-date date-type="pub">
            <day>23</day>
            <month>03</month>
            <year>2026</year>
         </pub-date>
         <fpage>41</fpage>
         <lpage>47</lpage>
         <permissions>
            <copyright-year>2026</copyright-year>
            <copyright-holder>by the author(s); licensee oekom</copyright-holder>
            <license>
               <license-p>This Open Access article is published under a Creative Commons Attribution 4.0 International Licence (CC BY).</license-p>
            </license>
         </permissions>
         <abstract abstract-type="summary" id="Abs1" xml:lang="en">
            <title>Abstract</title>
<p>This article explores participatory technology assessment (pTA) in security contexts and asks whether it has a purely symbolic function or whether it can enable real democratic influence. Google’s involvement in the U.S. military’s Project Maven serves as an example. At the time, Google employees publicly protested against their company’s involvement in the military use of its AI, leading Google not to renew its contract with the Pentagon. However, a literature review has shown that secrecy and power asymmetries are typical characteristics of security innovations, so formal pTA rarely goes beyond symbolic politics. Nonetheless, conflicts such as these can open up opportunities for public scrutiny and democratic influence.</p>
         </abstract>
         <abstract abstract-type="summary" id="Abs2" xml:lang="de">
            <title>Zusammenfassung</title>
            <p>Dieser Beitrag geht der Frage nach, ob partizipative Technikfolgenabschätzung (pTA) in der Sicherheitsgovernance eine rein symbolische Funktion hat oder ob sie echte demokratische Einflussnahme ermöglicht. Die Beteiligung Googles am U.S.-Militärprojekt Maven dient als Beispiel. Google-Mitarbeiterinnen und -Mitarbeiter protestierten damals öffentlich gegen die Beteiligung ihres Unternehmens an der militärischen Nutzung seiner KI, woraufhin Google seinen Vertrag mit dem Pentagon nicht verlängerte. Eine Literaturanalyse hat jedoch gezeigt, dass Geheimhaltung und Machtasymmetrien typische Merkmale von sicherheitsrelevanten Innovationen sind und formale pTA somit selten über Symbolpolitik hinausgeht. Dennoch können Konfliktsituationen wie diese neue Möglichkeiten der öffentlichen Kontrolle und demokratischen Einflussnahme eröffnen.</p>
         </abstract>
         <kwd-group>
            <compound-kwd>
               <compound-kwd-part content-type="code">heading</compound-kwd-part>
               <compound-kwd-part content-type="text">Keywords</compound-kwd-part>
            </compound-kwd>
            <compound-kwd>
               <compound-kwd-part content-type="code"/>
               <compound-kwd-part content-type="text">technology assessment</compound-kwd-part>
            </compound-kwd>
            <compound-kwd>
               <compound-kwd-part content-type="code"/>
               <compound-kwd-part content-type="text">military research</compound-kwd-part>
            </compound-kwd>
            <compound-kwd>
               <compound-kwd-part content-type="code"/>
               <compound-kwd-part content-type="text">public participation</compound-kwd-part>
            </compound-kwd>
            <compound-kwd>
               <compound-kwd-part content-type="code"/>
               <compound-kwd-part content-type="text">arms control</compound-kwd-part>
            </compound-kwd>
            <compound-kwd>
               <compound-kwd-part content-type="code"/>
               <compound-kwd-part content-type="text">artificial intelligence</compound-kwd-part>
            </compound-kwd>
         </kwd-group>
      </article-meta>
      <notes>
         <sec sec-type="referencedarticle">
            <title/>
            <p>
               <italic>This article is part of the Special topic</italic> “Technology assessment and future warfare: The Good, the Bad, and the Ugly,” <italic>edited by K. Weber, M. Bresinsky. <ext-link xlink:href="https://doi.org/10.14512/tatup.7286">https://doi.org/10.14512/tatup.7286</ext-link>
               </italic>
            </p>
         </sec>
      </notes>
   </front>
   <body>
      <sec id="Sec1">
         <label>1</label>
         <title>Introduction</title>
         <p>Participatory technology assessment (pTA) refers to processes in which citizens and stakeholders are involved in decisions about new technologies (Hennen <xref ref-type="bibr" rid="CR12">2012</xref>). According to democratic principles, risk and ethical issues should not be determined solely by expert committees; the public affected should also be heard. In areas of security policy (i.e., technologies that affect the military, intelligence services, or issues of war and peace) these democratic demands face particular hurdles: secrecy, a discourse of urgency (emergency and threat rhetoric), and powerful, often hierarchical actors stand in the way of broad participation. This often means that participation processes in this sector remain rather formal or symbolic acts, without enabling actual co-creation (Heide and Villeneuve <xref ref-type="bibr" rid="CR11">2021</xref>).</p>
<p>A prominent example of this tension is the Pentagon’s ‘Project Maven’ (2017–2019), which developed AI technology for object recognition in drone videos. Google was for a time involved in the project as an industry partner, but this only became known after the fact. In April 2018, more than 3,100 Google employees protested in an open letter against their company’s role in Maven and demanded that Google should not participate “in the business of war” (Google Employees <xref ref-type="bibr" rid="CR8">2018</xref>). The letter made headlines around the world and sparked a broad debate that revealed contradictions between the company’s publicly proclaimed values and its involvement in military applications. Under pressure from internal protests and negative publicity, Google announced in June 2018 that it would allow the Maven contract with the Department of Defense to expire. This case illustrates, on the one hand, the desire for democratic influence on military technology developments, but on the other hand, the narrow limits of such influence under security policy conditions: Despite Google’s withdrawal, the actual project development continued unabated, now with other companies in the background.</p>
<p>Against this backdrop, the question arises of what participatory processes look like, and how they work, under conditions of tight secrecy. Specifically, the article asks: Does pTA in the defense and security sector remain mere symbolic politics? Or are there ways in which participation can lead to real change despite power asymmetries and secrecy? How do these findings contribute to the debate on democratic control of security technologies? The aim of this article is to explore this tension and contribute to technology assessment (TA) and science and technology studies (STS) by examining participatory approaches in the hitherto neglected field of security governance.</p>
         <p content-type="eyecatcher" specific-use="Style2">Does participatory technology assessment in the defense and security sector remain mere symbolic politics?</p>
      </sec>
      <sec id="Sec2">
         <label>2</label>
         <title>Literature review</title>
         <p>Early pTA scholarship argued that involving citizens in technology decisions (via citizens’ juries, consensus conferences, etc.) enhances legitimacy (Hennen et al. <xref ref-type="bibr" rid="CR13">2023</xref>; Wesselink et al. <xref ref-type="bibr" rid="CR36">2011</xref>). In practice, however, critics observed that such exercises often reproduce existing hierarchies rather than disrupting them (Felt and Fochler <xref ref-type="bibr" rid="CR6">2008</xref>). This prompted STS scholars to reject the notion of a single ‘public.’ Instead, they emphasize multiple, context-specific publics (Warner <xref ref-type="bibr" rid="CR35">2002</xref>; Marres <xref ref-type="bibr" rid="CR20">2007</xref>). For example, Pesch et al. (<xref ref-type="bibr" rid="CR25">2020</xref>) advocate forming dedicated ‘local publics’ so that acceptability is judged by diverse community stakeholders (including local communities) rather than an abstract general public, reflecting the co-production of knowledge and values (Witjes <xref ref-type="bibr" rid="CR37">2017</xref>). In short, contemporary literature suggests that meaningful pTA must engage various publics rather than treat participation as a one-size-fits-all or token gesture.</p>
         <p>Security conditions create further challenges. Democratic theory holds that participation only matters when processes are transparent, inclusive, and answerable to citizens (Habermas <xref ref-type="bibr" rid="CR10">2001</xref>; Dryzek <xref ref-type="bibr" rid="CR5">2002</xref>). But secrecy in defense policymaking directly conflicts with these ideals. Analysts warn that secret policymaking produces a ‘knowledge deficit’ and prevents citizens from authorizing or contesting hidden actions (Mokrosinska <xref ref-type="bibr" rid="CR22">2023</xref>), while withholding information in military affairs erodes citizen consent (Perthes <xref ref-type="bibr" rid="CR24">2011</xref>). Appeals to ‘necessary secrecy’ may even cloak unethical practices (Born and Leigh <xref ref-type="bibr" rid="CR2">2007</xref>). Under such conditions, participatory exercises risk being purely symbolic. When urgency and confidentiality dominate, public forums tend to rubber-stamp pre-set outcomes (Taylor et al. <xref ref-type="bibr" rid="CR32">2017</xref>). In military contexts these pressures intensify: Secrecy is woven into operations (Mickan <xref ref-type="bibr" rid="CR21">2013</xref>) even as life-and-death technologies arguably demand popular scrutiny. Overall, scholars conclude that confidentiality requirements routinely undermine accountability, often relegating participation to a thin veneer that deepens public exclusion (Guerrero <xref ref-type="bibr" rid="CR9">2018</xref>).</p>
         <p>Research on emerging military technologies shows that narrative framing also plays a performative role. Malmio (<xref ref-type="bibr" rid="CR19">2023</xref>) finds that debates on AI (e.g., Project Maven) are framed by competing ‘enabling’ narratives (AI as life-saving precision tool) versus ‘constraining’ narratives (emphasizing human accountability), and that these frames actively shape technological trajectories by defining which ethical choices seem possible. More broadly, foresight studies note a consensus among experts on the need for civilian engagement in security innovation, but lament a lack of concrete participation models for defense (Vicente-Oliva <xref ref-type="bibr" rid="CR34">2025</xref>). As Hennen et al. (<xref ref-type="bibr" rid="CR13">2023</xref>) and Ladikas et al. (<xref ref-type="bibr" rid="CR17">2023</xref>) argue, transnational security challenges strain national TA designs, calling for new, cross-border engagement frameworks beyond the nation-state.</p>
         <p>Nonetheless, informal or insurgent forms of influence can emerge. Whistleblowers, media exposés, or employee activism sometimes thrust hidden military R&amp;D into public view. Lindelauf and Meerveld (<xref ref-type="bibr" rid="CR18">2025</xref>) propose ‘hybrid’ transparency arrangements (for example, keeping military AI algorithms closed to the general market but open to audit by vetted, trusted partners) to build trust without fully sacrificing confidentiality. The Project Maven case exemplifies the limits of such bottom-up contestation: Google employees’ protests forced the company to withdraw (imposing an internal pause and ethical commitment), but the Pentagon’s AI program continued largely as planned (Xue and Guo <xref ref-type="bibr" rid="CR38">2024</xref>). Thus, while these episodes can expose value conflicts and impose constraints on some actors, entrenched secrecy and power asymmetries often mean that underlying innovation trajectories remain unaffected.</p>
      </sec>
      <sec id="Sec3">
         <label>3</label>
         <title>Methodology</title>
         <p>This paper follows a qualitative case study design using Project Maven as an example. This case study was selected because Project Maven is a prominent current example of the intertwining of high technology and the security apparatus, and it can be partially reconstructed through public documents and debates. The case thus provides empirical insights into otherwise difficult-to-access processes at the interface between the technology industry and the military. At the same time, Maven paradigmatically illustrates the thesis of the tension between (symbolic) participation and democratic conflict potential.</p>
<p>The analysis is divided into two steps: First, a comprehensive literature review was conducted, bringing together theoretical and conceptual foundations from the fields of TA, security research, STS, and democracy theory. Second, primary and secondary sources on the Project Maven case were evaluated. To this end, the available sources were systematically compiled: official government documents (e.g., reports from the U.S. Department of Defense and analyses from the Congressional Research Service), parliamentary hearings and debate contributions, press articles (international and U.S. domestic, 2017–2024), reports from non-governmental organizations and think tanks, and public statements by the companies and actors involved (e.g., blog posts by Google executives, the published corporate mission statement on AI ethics, open letters from employees). The identified documents were examined using qualitative content analysis. Key events and decisions in the course of Maven were chronologically reconstructed, identifying key actors (Pentagon, Google management, Google employees, media, politicians) and their contributions to the discourse. Particular attention was paid to informal forms of participation (such as employee protests) and the institutional framework (e.g., the role of internal company forums vs. the lack of government participation formats). The case analysis is theory-driven: It embeds the empirical findings in discussions about participation and democracy in order to situate them both empirically and normatively.</p>
         <p>Research in security policy contexts poses specific methodological challenges. Much of it takes place in areas that are closed to the public; relevant documents are often subject to high levels of secrecy. The present study therefore relies on publicly available information. This entails limitations: The presentation is based on what has come to light (e.g., through media reports). It is naturally impossible to obtain a complete picture of internal decision-making processes at the Department of Defense or Google. In addition, the statements made by the actors must be viewed in the context of their possible self-interests (e.g., corporate statements may be strategically motivated). To counteract distortions, source triangulation was attempted: Wherever possible, information is corroborated by several independent sources (e.g., press reports are compared with official statements and subsequent analyses). Nevertheless, the results must be interpreted with the caveat that they primarily reflect publicly documented dynamics. These limitations are disclosed in the discussion in order to make the scope of the conclusions transparent.</p>
      </sec>
      <sec id="Sec4">
         <label>4</label>
         <title>Case study: Project Maven</title>
         <sec id="Sec5">
            <label>4.1</label>
            <title>Background of the project</title>
            <p>
<italic>Project Maven</italic>, formally known as the Algorithmic Warfare Cross-Functional Team, was launched in April 2017 by the U.S. Department of Defense. The goal was to accelerate the use of artificial intelligence in the military, in particular through the development of computer vision algorithms for object recognition in video recordings from drones (U.S. Department of Defense <xref ref-type="bibr" rid="CR33">2017</xref>; Pellerin <xref ref-type="bibr" rid="CR23">2017</xref>; Jones <xref ref-type="bibr" rid="CR16">2018</xref>). The Pentagon was responding to a perceived gap: While AI (especially machine learning for image recognition) was making rapid progress in the civilian sector, military agencies were lagging behind. Instead of building AI capabilities exclusively in-house, the Department of Defense pursued a public-private partnership strategy and sought targeted collaboration with the tech industry. By the end of 2017, Google had already joined Project Maven as its most important corporate partner (Malmio <xref ref-type="bibr" rid="CR19">2023</xref>). Google contributed its expertise in the development of machine learning models, cloud infrastructure, and large-scale data processing. Initially, this was done confidentially: The cooperation between Google and the Pentagon was not made public, presumably to avoid internal and external criticism in advance. Within Google itself, however, at least a circle of employees was aware that software was being adapted for military purposes. In early 2018, the first information about Maven reached the media, immediately triggering an ethical controversy (Crofts and van Rijswijk <xref ref-type="bibr" rid="CR4">2020</xref>). The public suddenly questioned whether a company whose products are used by billions of people should be involved in AI for the military.</p>
         </sec>
         <sec id="Sec6">
            <label>4.2</label>
            <title>Protest by Google employees</title>
            <p>The revelation of Google’s contribution to Project Maven acted as a catalyst for internal protest within the company. In the spring of 2018, an unprecedented wave of employee activism formed at Google (Scheiber and Conger <xref ref-type="bibr" rid="CR29">2020</xref>). Over 3,100 employees signed an open letter to the company’s management with the unambiguous demand: “Google should not be in the business of war” (Google Employees <xref ref-type="bibr" rid="CR8">2018</xref>). This letter, which was made public in April 2018, expressed a deep concern shared by many employees: Namely, that the AI tools they had developed could ultimately be used in drone missions to automate lethal decisions (Google Employees <xref ref-type="bibr" rid="CR8">2018</xref>). The signatories argued that their work should not be repurposed for military use without transparent debate and without their consent.</p>
            <p>There were several remarkable aspects to the protest. First, it was one of the most visible cases of ethical dissent from within the tech industry. Google employees publicly took a stand against a lucrative defense project of their own employer – a move that was unprecedented on this scale. Scholars interpreted this event as an early example of ‘bottom-up governance’: It was not regulatory agencies or NGOs, but the company’s own employees who pushed a technology company to rethink its role in military innovation (Crofts and van Rijswijk <xref ref-type="bibr" rid="CR4">2020</xref>). Second, the protest exposed the contradictions in Google’s corporate culture. For years, Google had cultivated a certain moral identity with the slogan ‘Don’t be evil’; in 2018, however, this motto was quietly replaced by the more innocuous ‘Do the right thing’ (Google <xref ref-type="bibr" rid="CR7">n.d.</xref>; Horwitz <xref ref-type="bibr" rid="CR15">2022</xref>). For the protesting employees, the collaboration with the Pentagon represented a clear betrayal of values – both the old and the new company credo. This was a clash between the pursuit of profit, public image, and ethical responsibility. The controversy thus exemplified how a private company can become a venue for social negotiation when government decision-making processes remain closed: The workforce sparked a debate that should have taken place on the socio-political stage, for example in parliaments.</p>
         </sec>
         <sec id="Sec7">
            <label>4.3</label>
            <title>Reactions from the company</title>
            <p>In the face of growing criticism, Google’s management felt compelled to act quickly. The management team, led by CEO Sundar Pichai and the head of the cloud division, Diane Greene, initially sought dialogue with the employees. Internal town hall meetings and discussion groups were organized, during which management promised to rethink its own guidelines for military contracts. In fact, two far-reaching announcements followed at the end of May/beginning of June 2018: First, Diane Greene informed employees that Google would not renew the current Maven contract when it expired in 2019 (Statt <xref ref-type="bibr" rid="CR31">2018</xref>). Second, on June 7, 2018, Google published a set of AI principles that would henceforth serve as ethical guidelines for Google’s development of artificial intelligence (Pichai <xref ref-type="bibr" rid="CR27">2018</xref>). These principles explicitly prohibited the development of AI for weapons systems, but left open the possibility of continuing to operate in other areas of defense, such as cybersecurity and logistics (Shane and Wakabayashi <xref ref-type="bibr" rid="CR30">2018</xref>).</p>
<p>On the one hand, the announcement of these AI principles was a direct concession to employee protests: Google attempted to address ethical concerns and regain lost trust. On the other hand, it clearly served to enhance the company’s external reputation: The aim was to signal that Google was handling AI responsibly. Nevertheless, criticism from various quarters was inevitable. Scholars and observers noted that voluntary commitments such as these ‘ethical principles’ are often non-binding in corporate practice and lack enforcement (Malmio <xref ref-type="bibr" rid="CR19">2023</xref>). Without external control mechanisms, such guidelines could easily be circumvented or adapted should commercial or political interests so require. Furthermore, it became clear that although Google’s direct withdrawal from Maven represented a symbolic victory for the protesters, the Pentagon project itself continued unabated. In other words, the company’s decision primarily affected its public image – military-technological development as such remained unaffected (Hogue <xref ref-type="bibr" rid="CR14">2021</xref>). This already suggests that symbolic successes are possible, but do little to change structural path dependencies.</p>
         </sec>
         <sec id="Sec8">
            <label>4.4</label>
            <title>Publicity and progress of the project</title>
<p>What is striking about the Project Maven conflict is the lack of institutionalized public deliberation beyond Google. Neither the Department of Defense nor the U.S. Congress initiated any public technology assessment processes or citizen dialogues on the ethical and social implications of the use of AI in the military during that period (Hogue <xref ref-type="bibr" rid="CR14">2021</xref>). The debate took place primarily in internal company forums, in expert circles (e.g., posts in technology and legal blogs), and in specialized media. Thus, to put it bluntly, the debate remained elitist and technocratic rather than democratic and participatory. Although mainstream media reported on the internal conflict at Google and there were comments from NGOs, there was no broad public participation or parliamentary hearing on Maven (U.S. Department of Defense <xref ref-type="bibr" rid="CR33">2017</xref>). This underscores the problematic fact that security policy technology decisions are often made without broad public feedback – unless internal whistleblowers or protests happen to make them public.</p>
            <p content-type="eyecatcher" specific-use="Style2">The debate remained elitist and technocratic rather than democratic and participatory.</p>
            <p>After Google’s withdrawal, it soon became clear who would continue Maven’s development. In 2019, media reports indicated that Palantir Technologies had assumed key parts of the project. According to Business Insider, the company internally launched the code-named project ‘Tron’ to deliver AI models for drone video analysis, thereby continuing Google’s earlier work (Peterson <xref ref-type="bibr" rid="CR26">2019</xref>). The report noted that Palantir had taken over Project Maven after Google ended its Pentagon contract in March 2019 following employee protests. Palantir (long linked to the security sector) thus seamlessly replaced Google.</p>
<p>The seamless continuity of the project under different leadership highlights a core element of security-driven innovation: Even if a single company withdraws due to public pressure, the technological path remains intact as long as the need within the security apparatus persists. In the case of Maven, this is exactly what happened. The fundamental military demand for AI-supported analysis of surveillance data remained undiminished – and Palantir proved a willing replacement. In fact, the Department of Defense expanded its cooperation with Palantir in the following years. In May 2024, Reuters reported that the Pentagon had awarded Palantir a $480 million contract for an advanced Maven Smart System (Reuters <xref ref-type="bibr" rid="CR28">2024</xref>). The specialist portal C4ISRNet also reported that a five-year contract would enable the wider use of this system (Albon <xref ref-type="bibr" rid="CR1">2024</xref>). It is clear that structural drivers (in this case, the strategic imperative to improve the evaluation of intelligence material through AI) ensure that a project like Maven continues, regardless of interim reputation crises or personnel changes on the provider side.</p>
<p>Overall, the Maven case demonstrates the limits of corporate-driven protests: While the action taken by Google employees was able to influence the behavior of a single company, it did not change the long-term course of government arms policy. This continues to be determined by geopolitical priorities (e.g., the technological race with rivals such as China) and the interests of security agencies. Thus, the innovation itself persisted; only the constellation of actors shifted slightly. This presents a dilemma for democratic control: Even if interventions are successful in individual cases, they must be institutionally anchored in order to have more than a symbolic effect – otherwise, military technology development will simply continue under a different constellation of actors.</p>
         </sec>
      </sec>
      <sec id="Sec9">
         <label>5</label>
         <title>Discussion</title>
         <p>The Project Maven case highlights institutional, epistemic, and power-political barriers that restrict genuine participation in security innovation. Institutionally, there is a deep asymmetry between the security apparatus and the democratic public sphere. Military and intelligence agencies operate within a closed ‘security zone,’ shielded by secrecy rules, clearances, and black budgets that place decisions beyond public control. As Mickan (<xref ref-type="bibr" rid="CR21">2013</xref>) notes, secrecy is a constitutive element of the military, yet this tension must not come at the expense of informed citizens. In Maven, no formal participation took place – basic information such as data use or algorithmic design remained classified. The only ‘participants’ were Google employees, whose protest mattered mainly because Google’s public brand and dependence on skilled labor gave them leverage. By contrast, traditional defense contractors face no comparable pressure; their activities usually remain invisible and only surface through leaks or media scrutiny.</p>
<p>Epistemically, knowledge asymmetries further block participation. Maven’s AI systems functioned as a black box: Neither internal actors nor the public knew how models operated or what data they used. Without technical transparency, meaningful debate is impossible. Even Google staff demanding insight into the project met resistance, while external observers had no access to relevant facts. As Lindelauf and Meerveld (<xref ref-type="bibr" rid="CR18">2025</xref>) argue, limited openness toward trusted partners could strengthen accountability, but under current secrecy regimes participation remains largely superficial. The epistemic gap between experts with clearance and lay publics thus widens, reinforcing the perception that the latter lack the competence to judge such systems.</p>
         <p content-type="eyecatcher" specific-use="Style2">Without technical transparency, meaningful debate is impossible.</p>
         <p>Power-political barriers arise when urgency and threat narratives justify exceptional measures. In the Maven debate, references to a technological race with China served to sideline democratic procedures – a typical case of securitization. Born and Leigh (<xref ref-type="bibr" rid="CR2">2007</xref>) warn that such appeals to ‘necessary secrecy’ can mask dubious practices. The decision to partner with Google was driven by efficiency, not public deliberation. Civil society’s influence was indirect: Only when reputational risk threatened Google did management react. As Vicente-Oliva (2025) notes, experts call for more citizen involvement in defense foresight, yet no concrete models exist. Maven confirms that participatory impulses remain extra-institutional and symbolically charged.</p>
         <p>To address these tensions, several reforms have been proposed. Advisory boards with security clearance could review classified projects while representing diverse perspectives, though confidentiality risks curbing critique. Confidential parliamentary hearings might allow selected experts and NGOs to advise defense committees behind closed doors, using existing democratic channels but offering limited public transparency. Simulated citizen forums could model deliberation through hypothetical scenarios, though their policy impact would remain uncertain. Each approach faces resistance from security institutions reluctant to share authority. Yet incremental steps (such as independent ombudsmen or periodic public reports on defense AI) could create ‘embedded transparency’ without endangering operational secrecy.</p>
         <p>Overall, the Maven protests exposed systemic tensions rather than resolving them. Participation under secrecy remains ad hoc and largely symbolic: It can illuminate value conflicts but rarely transforms underlying power structures or decision-making in security policy.</p>
      </sec>
      <sec id="Sec10">
         <label>6</label>
         <title>Conclusion</title>
         <p>pTA in security governance often remains largely symbolic because secrecy, urgency narratives, and hierarchical power structures render transparency and inclusion difficult. However, the Project Maven case illustrates that critical contestation can still emerge. The effects of such protests are contingent and fragile – Maven’s AI project continued under other contractors, showing that symbolic resistance rarely changes structural power asymmetries. In sum, pTA in the security sector remains constrained but not futile: Even symbolic participation can become politically productive when moments of contestation are harnessed to uphold democratic oversight and accountability.</p>
      </sec>
   </body>
   <back>
      <ack>
         <p>
            <boxed-text id="FPar2" specific-use="Style1">
               <caption>
                  <title>Funding</title>
               </caption>
               <p>This article received no funding.</p>
            </boxed-text>
         </p>
         <p>
            <boxed-text id="FPar3" specific-use="Style1">
               <caption>
                  <title>Competing interests</title>
               </caption>
               <p>The author declares no competing interests.</p>
            </boxed-text>
         </p>
         <p>
            <boxed-text id="FPar4" specific-use="Style1">
               <caption>
                  <title>Ethical oversight</title>
               </caption>
               <p>The author confirms that all procedures were performed in compliance with relevant laws and institutional guidelines.</p>
            </boxed-text>
         </p>
         <p>
            <boxed-text id="FPar5" specific-use="Style1">
               <caption>
                  <title>Acknowledgements</title>
               </caption>
               <p>I would like to thank Nora Weinberger, whose sharp questions, generous feedback, and intellectual sparring were essential to this paper. Her critical reflections helped me see the text more clearly. This article carries her traces.</p>
            </boxed-text>
         </p>
      </ack>
      <ref-list id="Bib1">
         <title>References</title>
         <ref specific-use="2" id="CR1">
            <mixed-citation>Albon, Courtney (2024): Palantir wins contract to expand access to Project Maven AI tools. In: C4ISRNET, 30.05.2024. Available online at <ext-link xlink:href="https://www.c4isrnet.com/artificial-intelligence/2024/05/30/palantir-wins-contract-to-expand-access-to-project-maven-ai-tools/">https://www.c4isrnet.com/artificial-intelligence/2024/05/30/palantir-wins-contract-to-expand-access-to-project-maven-ai-tools/</ext-link>, last accessed on 04.11.2025.</mixed-citation>
         </ref>
         <ref id="CR2">
            <citation-alternatives>
               <element-citation publication-type="book">
                  <person-group person-group-type="author">
                     <name content-type="author">
                        <surname>Born</surname>
                        <given-names>H</given-names>
                     </name>
                     <name content-type="author">
                        <surname>Leigh</surname>
                        <given-names>I</given-names>
                     </name>
                  </person-group>
                  <date>
                     <year>2007</year>
                  </date>
                  <source content-type="BookTitle">Democratic accountability of intelligence services</source>
                  <publisher-name>Geneva Centre for the Democratic Control of Armed Forces</publisher-name>
                  <publisher-loc>Geneva</publisher-loc>
               </element-citation>
               <mixed-citation>Born, Hans; Leigh, Ian (2007): Democratic accountability of intelligence services. Geneva: Geneva Centre for the Democratic Control of Armed Forces.</mixed-citation>
            </citation-alternatives>
         </ref>
         <ref id="CR4">
            <citation-alternatives>
               <element-citation publication-type="journal">
                  <person-group person-group-type="author">
                     <name content-type="author">
                        <surname>Crofts</surname>
                        <given-names>P</given-names>
                     </name>
                     <name content-type="author">
                        <surname>Rijswijk</surname>
                        <given-names>H</given-names>
                        <suffix>van</suffix>
                     </name>
                  </person-group>
                  <date>
                     <year>2020</year>
                  </date>
                  <article-title>Negotiating ‘evil’. Google, Project Maven and the corporate form</article-title>
                  <issue>1</issue>
                  <page-range>75–90</page-range>
                  <volume-id content-type="bibarticledoi">10.5204/lthj.v2i1.1313</volume-id>
                  <source content-type="journal">Law, Technology and Humans</source>
                  <volume>2</volume>
               </element-citation>
               <mixed-citation>Crofts, Penny; van Rijswijk, Honni (2020): Negotiating ‘evil’. Google, Project Maven and the corporate form. In: Law, Technology and Humans 2 (1), pp. 75–90. <ext-link xlink:href="https://doi.org/10.5204/lthj.v2i1.1313">https://doi.org/10.5204/lthj.v2i1.1313</ext-link>
               </mixed-citation>
            </citation-alternatives>
         </ref>
         <ref id="CR5">
            <citation-alternatives>
               <element-citation publication-type="book">
                  <name content-type="author">
                     <surname>Dryzek</surname>
                     <given-names>J</given-names>
                  </name>
                  <date>
                     <year>2002</year>
                  </date>
                  <source content-type="BookTitle">Deliberative democracy and beyond. Liberals, critics, contestations</source>
                  <publisher-name>Oxford University Press</publisher-name>
                  <publisher-loc>Oxford</publisher-loc>
                  <volume-id content-type="bibbookdoi">10.1093/019925043X.001.0001</volume-id>
               </element-citation>
               <mixed-citation>Dryzek, John (2002): Deliberative democracy and beyond. Liberals, critics, contestations. Oxford: Oxford University Press. <ext-link xlink:href="https://doi.org/10.1093/019925043X.001.0001">https://doi.org/10.1093/019925043X.001.0001</ext-link>
               </mixed-citation>
            </citation-alternatives>
         </ref>
         <ref id="CR6">
            <citation-alternatives>
               <element-citation publication-type="journal">
                  <person-group person-group-type="author">
                     <name content-type="author">
                        <surname>Felt</surname>
                        <given-names>U</given-names>
                     </name>
                     <name content-type="author">
                        <surname>Fochler</surname>
                        <given-names>M</given-names>
                     </name>
                  </person-group>
                  <date>
                     <year>2008</year>
                  </date>
                  <article-title>The bottom-up meanings of the concept of public participation in science and technology</article-title>
                  <issue>7</issue>
                  <page-range>489–499</page-range>
                  <volume-id content-type="bibarticledoi">10.3152/030234208X329086</volume-id>
                  <source content-type="journal">Science and Public Policy</source>
                  <volume>35</volume>
               </element-citation>
               <mixed-citation>Felt, Ulrike; Fochler, Maximilian (2008): The bottom-up meanings of the concept of public participation in science and technology. In: Science and Public Policy 35 (7), pp. 489–499. <ext-link xlink:href="https://doi.org/10.3152/030234208X329086">https://doi.org/10.3152/030234208X329086</ext-link>
               </mixed-citation>
            </citation-alternatives>
         </ref>
         <ref specific-use="2" id="CR7">
            <mixed-citation>Google (n.d.): Our approach to information. How search works. Available online at <ext-link xlink:href="https://www.google.com/intl/en_us/search/howsearchworks/our-approach/">https://www.google.com/intl/en_us/search/howsearchworks/our-approach/</ext-link>, last accessed on 04.11.2025.</mixed-citation>
         </ref>
         <ref specific-use="2" id="CR8">
            <mixed-citation>Google Employees (2018): Open letter to Sundar Pichai. Available online at <ext-link xlink:href="https://static01.nyt.com/files/2018/technology/googleletter.pdf">https://static01.nyt.com/files/2018/technology/googleletter.pdf</ext-link>, last accessed on 04.11.2025.</mixed-citation>
         </ref>
         <ref id="CR9">
            <citation-alternatives>
               <element-citation publication-type="book">
                  <name content-type="author">
                     <surname>Guerrero</surname>
                     <given-names>A</given-names>
                  </name>
                  <date>
                     <year>2018</year>
                  </date>
                  <source content-type="BookTitle">Defense and ignorance. War, secrecy, and the possibility of popular sovereignty</source>
                  <publisher-name>Oxford University Press</publisher-name>
                  <publisher-loc>Oxford</publisher-loc>
                  <volume-id content-type="bibbookdoi">10.1093/oso/9780190922542.003.0016</volume-id>
               </element-citation>
               <mixed-citation>Guerrero, Alexander (2018): Defense and ignorance. War, secrecy, and the possibility of popular sovereignty. Oxford: Oxford University Press. <ext-link xlink:href="https://doi.org/10.1093/oso/9780190922542.003.0016">https://doi.org/10.1093/oso/9780190922542.003.0016</ext-link>
               </mixed-citation>
            </citation-alternatives>
         </ref>
         <ref id="CR10">
            <citation-alternatives>
               <element-citation publication-type="book">
                  <person-group person-group-type="author">
                     <name content-type="author">
                        <surname>Habermas</surname>
                        <given-names>J</given-names>
                     </name>
                     <name content-type="author">
                        <surname>Rehg</surname>
                        <given-names>W</given-names>
                     </name>
                  </person-group>
                  <date>
                     <year>2001</year>
                  </date>
                  <source content-type="BookTitle">Between facts and norms. Contributions to a discourse theory of law and democracy</source>
                  <publisher-name>MIT Press</publisher-name>
                  <publisher-loc>Cambridge</publisher-loc>
               </element-citation>
               <mixed-citation>Habermas, Jürgen; Rehg, William (2001): Between facts and norms. Contributions to a discourse theory of law and democracy. Cambridge, MA: MIT Press.</mixed-citation>
            </citation-alternatives>
         </ref>
         <ref id="CR11">
            <citation-alternatives>
               <element-citation publication-type="journal">
                  <person-group person-group-type="author">
                     <name content-type="author">
                        <surname>Heide</surname>
                        <given-names>M</given-names>
                     </name>
                     <name content-type="author">
                        <surname>Villeneuve</surname>
                        <given-names>J-P</given-names>
                     </name>
                  </person-group>
                  <date>
                     <year>2021</year>
                  </date>
                  <article-title>Framing national security secrecy. A conceptual review</article-title>
                  <issue>2</issue>
                  <page-range>238–256</page-range>
                  <volume-id content-type="bibarticledoi">10.1177/00207020211016475</volume-id>
                  <source content-type="journal">International Journal: Canada’s Journal of Global Policy Analysis</source>
                  <volume>76</volume>
               </element-citation>
               <mixed-citation>Heide, Marlen; Villeneuve, Jean-Patrick (2021): Framing national security secrecy. A conceptual review. In: International Journal: Canada’s Journal of Global Policy Analysis 76 (2), pp. 238–256. <ext-link xlink:href="https://doi.org/10.1177/00207020211016475">https://doi.org/10.1177/00207020211016475</ext-link>
               </mixed-citation>
            </citation-alternatives>
         </ref>
         <ref id="CR12">
            <citation-alternatives>
               <element-citation publication-type="journal">
                  <name content-type="author">
                     <surname>Hennen</surname>
                     <given-names>L</given-names>
                  </name>
                  <date>
                     <year>2012</year>
                  </date>
                  <article-title>Why do we still need participatory technology assessment?</article-title>
                  <issue>1–2</issue>
                  <page-range>27–41</page-range>
                  <volume-id content-type="bibarticledoi">10.1007/s10202-012-0122-5</volume-id>
                  <source content-type="journal">Poiesis &amp; Praxis</source>
                  <volume>9</volume>
               </element-citation>
               <mixed-citation>Hennen, Leonhard (2012): Why do we still need participatory technology assessment? In: Poiesis &amp; Praxis 9 (1–2), pp. 27–41. <ext-link xlink:href="https://doi.org/10.1007/s10202-012-0122-5">https://doi.org/10.1007/s10202-012-0122-5</ext-link>
               </mixed-citation>
            </citation-alternatives>
         </ref>
         <ref id="CR13">
            <citation-alternatives>
               <element-citation publication-type="chapter">
                  <person-group person-group-type="author">
                     <name content-type="author">
                        <surname>Hennen</surname>
                        <given-names>L</given-names>
                     </name>
                     <name content-type="author">
                        <surname>Peissl</surname>
                        <given-names>W</given-names>
                     </name>
                     <name content-type="author">
                        <surname>Hahn</surname>
                        <given-names>J</given-names>
                     </name>
                     <name content-type="author">
                        <surname>Ladikas</surname>
                        <given-names>M</given-names>
                     </name>
                     <name content-type="author">
                        <surname>Est</surname>
                        <given-names>R</given-names>
                        <suffix>van</suffix>
                     </name>
                     <name content-type="author">
                        <surname>Lindner</surname>
                        <given-names>R</given-names>
                     </name>
                  </person-group>
                  <person-group person-group-type="editor">
                     <name content-type="editor">
                        <surname>Hennen</surname>
                        <given-names>L</given-names>
                     </name>
                     <name content-type="editor">
                        <surname>Hahn</surname>
                        <given-names>J</given-names>
                     </name>
                     <name content-type="editor">
                        <surname>Ladikas</surname>
                        <given-names>M</given-names>
                     </name>
                     <name content-type="editor">
                        <surname>Lindner</surname>
                        <given-names>R</given-names>
                     </name>
                     <name content-type="editor">
                        <surname>Peissl</surname>
                        <given-names>W</given-names>
                     </name>
                     <name content-type="editor">
                        <surname>Est</surname>
                        <given-names>R</given-names>
                        <suffix>van</suffix>
                     </name>
                  </person-group>
                  <date>
                     <year>2023</year>
                  </date>
                  <chapter-title>Introduction. Technology assessment beyond national boundaries</chapter-title>
                  <page-range>1–14</page-range>
                  <volume-id content-type="bibchapterdoi">10.1007/978-3-031-10617-0_1</volume-id>
                  <source content-type="BookTitle">Technology assessment in a globalized world</source>
                  <publisher-name>Springer</publisher-name>
                  <publisher-loc>Cham</publisher-loc>
               </element-citation>
               <mixed-citation>Hennen, Leonhard; Peissl, Walter; Hahn, Julia; Ladikas, Miltos; van Est, Rinie; Lindner, Ralf (2023): Introduction. Technology assessment beyond national boundaries. In: Leonhard Hennen, Julia Hahn, Miltos Ladikas, Ralf Lindner, Walter Peissl and Rinie van Est (eds.): Technology assessment in a globalized world. Cham: Springer, pp. 1–14. <ext-link xlink:href="https://doi.org/10.1007/978-3-031-10617-0_1">https://doi.org/10.1007/978-3-031-10617-0_1</ext-link>
               </mixed-citation>
            </citation-alternatives>
         </ref>
         <ref id="CR14">
            <citation-alternatives>
               <element-citation publication-type="chapter">
                  <name content-type="author">
                     <surname>Hogue</surname>
                     <given-names>S</given-names>
                  </name>
                  <person-group person-group-type="editor">
                     <name content-type="editor">
                        <surname>Završnik</surname>
                        <given-names>A</given-names>
                     </name>
                     <name content-type="editor">
                        <surname>Badalič</surname>
                        <given-names>V</given-names>
                     </name>
                  </person-group>
                  <date>
                     <year>2021</year>
                  </date>
                  <chapter-title>Project Maven, big data, and ubiquitous knowledge. The impossible promises and hidden politics of algorithmic security vision</chapter-title>
                  <page-range>203–221</page-range>
                  <volume-id content-type="bibchapterdoi">10.1007/978-3-030-73276-9_10</volume-id>
                  <source content-type="BookTitle">Automating crime prevention, surveillance, and military operations</source>
                  <publisher-name>Springer</publisher-name>
                  <publisher-loc>Cham</publisher-loc>
               </element-citation>
               <mixed-citation>Hogue, Simon (2021): Project Maven, big data, and ubiquitous knowledge. The impossible promises and hidden politics of algorithmic security vision. In: Aleš Završnik and Vasja Badalič (eds.): Automating crime prevention, surveillance, and military operations. Cham: Springer, pp. 203–221. <ext-link xlink:href="https://doi.org/10.1007/978-3-030-73276-9_10">https://doi.org/10.1007/978-3-030-73276-9_10</ext-link>
               </mixed-citation>
            </citation-alternatives>
         </ref>
         <ref specific-use="2" id="CR15">
            <mixed-citation>Horwitz, Josh (2022): Google is losing ‘Don’t be evil’ in its code of conduct, and what’s left is corporate jargon. In: Quartz, 20.07.2022. Available online at <ext-link xlink:href="https://qz.com/1282892/google-is-losing-dont-be-evil-in-its-code-of-conduct-and-whats-left-is-corporate-jargon">https://qz.com/1282892/google-is-losing-dont-be-evil-in-its-code-of-conduct-and-whats-left-is-corporate-jargon</ext-link>, last accessed on 31.10.2025.</mixed-citation>
         </ref>
         <ref specific-use="2" id="CR16">
            <mixed-citation>Jones, Felicity (2018): Project Maven. Machine learning in the military target selection process. In: Technology and Operations Management, 13.11.2018. Available online at <ext-link xlink:href="https://d3.harvard.edu/platform-rctom/submission/project-maven-machine-learning-in-the-military-target-selection-process/">https://d3.harvard.edu/platform-rctom/submission/project-maven-machine-learning-in-the-military-target-selection-process/</ext-link>, last accessed on 04.11.2025.</mixed-citation>
         </ref>
         <ref id="CR17">
            <citation-alternatives>
               <element-citation publication-type="chapter">
                  <person-group person-group-type="author">
                     <name content-type="author">
                        <surname>Ladikas</surname>
                        <given-names>M</given-names>
                     </name>
                     <name content-type="author">
                        <surname>Hahn</surname>
                        <given-names>J</given-names>
                     </name>
                     <name content-type="author">
                        <surname>Hennen</surname>
                        <given-names>L</given-names>
                     </name>
                     <name content-type="author">
                        <surname>Est</surname>
                        <given-names>R</given-names>
                        <suffix>van</suffix>
                     </name>
                     <name content-type="author">
                        <surname>Peissl</surname>
                        <given-names>W</given-names>
                     </name>
                     <name content-type="author">
                        <surname>Lindner</surname>
                        <given-names>R</given-names>
                     </name>
                  </person-group>
                  <person-group person-group-type="editor">
                     <name content-type="editor">
                        <surname>Hennen</surname>
                        <given-names>L</given-names>
                     </name>
                     <name content-type="editor">
                        <surname>Hahn</surname>
                        <given-names>J</given-names>
                     </name>
                      <name content-type="editor">
                         <surname>Ladikas</surname>
                         <given-names>M</given-names>
                      </name>
                     <name content-type="editor">
                        <surname>Lindner</surname>
                        <given-names>R</given-names>
                     </name>
                     <name content-type="editor">
                        <surname>Peissl</surname>
                        <given-names>W</given-names>
                     </name>
                     <name content-type="editor">
                        <surname>Est</surname>
                        <given-names>R</given-names>
                        <suffix>van</suffix>
                     </name>
                  </person-group>
                  <date>
                     <year>2023</year>
                  </date>
                  <chapter-title>The shape of global technology assessment</chapter-title>
                  <page-range>225–235</page-range>
                  <volume-id content-type="bibchapterdoi">10.1007/978-3-031-10617-0_11</volume-id>
                  <source content-type="BookTitle">Technology assessment in a globalized world</source>
                  <publisher-name>Springer</publisher-name>
                  <publisher-loc>Cham</publisher-loc>
               </element-citation>
               <mixed-citation>Ladikas, Miltos; Hahn, Julia; Hennen, Leonhard; van Est, Rinie; Peissl, Walter; Lindner, Ralf (2023): The shape of global technology assessment. In: Leonhard Hennen, Julia Hahn, Miltos Ladikas, Ralf Lindner, Walter Peissl and Rinie van Est (eds.): Technology assessment in a globalized world. Cham: Springer, pp. 225–235. <ext-link xlink:href="https://doi.org/10.1007/978-3-031-10617-0_11">https://doi.org/10.1007/978-3-031-10617-0_11</ext-link>
               </mixed-citation>
            </citation-alternatives>
         </ref>
         <ref specific-use="2" id="CR18">
            <mixed-citation>Lindelauf, Roy; Meerveld, Herwin (2025): Building trust in military AI starts with opening the black box. In: War on the Rocks, 12.08.2025. Available online at <ext-link xlink:href="https://warontherocks.com/2025/08/building-trust-in-military-ai-starts-with-opening-the-black-box/">https://warontherocks.com/2025/08/building-trust-in-military-ai-starts-with-opening-the-black-box/</ext-link>, last accessed on 31.10.2025.</mixed-citation>
         </ref>
         <ref id="CR19">
            <citation-alternatives>
               <element-citation publication-type="journal">
                  <name content-type="author">
                     <surname>Malmio</surname>
                     <given-names>I</given-names>
                  </name>
                  <date>
                     <year>2023</year>
                  </date>
                  <article-title>Ethics as an enabler and a constraint. Narratives on technology development and artificial intelligence in military affairs through the case of Project Maven</article-title>
                  <page-range>102193</page-range>
                  <volume-id content-type="bibarticledoi">10.1016/j.techsoc.2022.102193</volume-id>
                  <source content-type="journal">Technology in Society</source>
                  <volume>72</volume>
               </element-citation>
               <mixed-citation>Malmio, Irja (2023): Ethics as an enabler and a constraint. Narratives on technology development and artificial intelligence in military affairs through the case of Project Maven. In: Technology in Society 72, p. 102193. <ext-link xlink:href="https://doi.org/10.1016/j.techsoc.2022.102193">https://doi.org/10.1016/j.techsoc.2022.102193</ext-link>
               </mixed-citation>
            </citation-alternatives>
         </ref>
         <ref id="CR20">
            <citation-alternatives>
               <element-citation publication-type="journal">
                  <name content-type="author">
                     <surname>Marres</surname>
                     <given-names>N</given-names>
                  </name>
                  <date>
                     <year>2007</year>
                  </date>
                  <article-title>The issues deserve more credit. Pragmatist contributions to the study of public involvement in controversy</article-title>
                  <issue>5</issue>
                  <page-range>759–780</page-range>
                  <volume-id content-type="bibarticledoi">10.1177/0306312706077367</volume-id>
                  <source content-type="journal">Social Studies of Science</source>
                  <volume>37</volume>
               </element-citation>
               <mixed-citation>Marres, Noortje (2007): The issues deserve more credit. Pragmatist contributions to the study of public involvement in controversy. In: Social Studies of Science 37 (5), pp. 759–780. <ext-link xlink:href="https://doi.org/10.1177/0306312706077367">https://doi.org/10.1177/0306312706077367</ext-link>
               </mixed-citation>
            </citation-alternatives>
         </ref>
         <ref specific-use="2" id="CR21">
            <mixed-citation>Mickan, Thomas (2013): Kommentar. Geheimhaltung, Demokratie und Militär. In: IMI-Standpunkt 34, 18.07.2013. Available online at <ext-link xlink:href="https://www.imi-online.de/2013/07/18/kommentar-geheimhaltung-demokratie-und-militar">https://www.imi-online.de/2013/07/18/kommentar-geheimhaltung-demokratie-und-militar</ext-link>, last accessed on 31.10.2025.</mixed-citation>
         </ref>
         <ref id="CR22">
            <citation-alternatives>
               <element-citation publication-type="journal">
                  <name content-type="author">
                     <surname>Mokrosinska</surname>
                     <given-names>D</given-names>
                  </name>
                  <date>
                     <year>2023</year>
                  </date>
                  <article-title>Necessary but illegitimate. On democracy’s secrets</article-title>
                  <issue>1</issue>
                  <page-range>73–97</page-range>
                  <volume-id content-type="bibarticledoi">10.1017/S0034670522000936</volume-id>
                  <source content-type="journal">The Review of Politics</source>
                  <volume>85</volume>
               </element-citation>
               <mixed-citation>Mokrosinska, Dorota (2023): Necessary but illegitimate. On democracy’s secrets. In: The Review of Politics 85 (1), pp. 73–97. <ext-link xlink:href="https://doi.org/10.1017/S0034670522000936">https://doi.org/10.1017/S0034670522000936</ext-link>
               </mixed-citation>
            </citation-alternatives>
         </ref>
         <ref specific-use="2" id="CR23">
            <mixed-citation>Pellerin, Cheryl (2017): Project Maven industry day pursues artificial intelligence for DoD challenges. In: U.S. Department of Defense News, 27.10.2017. Available online at <ext-link xlink:href="https://www.war.gov/News/News-Stories/Article/Article/1356172/project-maven-industry-day-pursues-artificial-intelligence-for-dod-challenges/">https://www.war.gov/News/News-Stories/Article/Article/1356172/project-maven-industry-day-pursues-artificial-intelligence-for-dod-challenges/</ext-link>, last accessed on 31.10.2025.</mixed-citation>
         </ref>
         <ref id="CR24">
            <citation-alternatives>
               <element-citation publication-type="chapter">
                  <name content-type="author">
                     <surname>Perthes</surname>
                     <given-names>V</given-names>
                  </name>
                  <person-group person-group-type="editor">
                     <name content-type="editor">
                        <surname>Gilmer</surname>
                        <given-names>E</given-names>
                     </name>
                     <name content-type="editor">
                        <surname>Geiselberger</surname>
                        <given-names>H</given-names>
                     </name>
                  </person-group>
                  <date>
                     <year>2011</year>
                  </date>
                  <chapter-title>Wikileaks und warum Diskretion in der Außen- und Sicherheitspolitik wichtig ist</chapter-title>
                  <page-range>164–174</page-range>
                  <source content-type="BookTitle">Wikileaks und die Folgen. Netz – Medien – Politik</source>
                  <publisher-name>Suhrkamp</publisher-name>
                  <publisher-loc>Frankfurt a. M.</publisher-loc>
               </element-citation>
               <mixed-citation>Perthes, Volker (2011): Wikileaks und warum Diskretion in der Außen- und Sicherheitspolitik wichtig ist. In: Eva Gilmer and Heinrich Geiselberger (eds.): Wikileaks und die Folgen. Netz – Medien – Politik. Frankfurt a. M.: Suhrkamp, pp. 164–174.</mixed-citation>
            </citation-alternatives>
         </ref>
         <ref id="CR25">
            <citation-alternatives>
               <element-citation publication-type="journal">
                  <person-group person-group-type="author">
                     <name content-type="author">
                        <surname>Pesch</surname>
                        <given-names>U</given-names>
                     </name>
                     <name content-type="author">
                        <surname>Huijts</surname>
                        <given-names>N</given-names>
                     </name>
                     <name content-type="author">
                        <surname>Bombaerts</surname>
                        <given-names>G</given-names>
                     </name>
                     <name content-type="author">
                        <surname>Doorn</surname>
                        <given-names>N</given-names>
                     </name>
                     <name content-type="author">
                        <surname>Hunka</surname>
                        <given-names>A</given-names>
                     </name>
                  </person-group>
                  <date>
                     <year>2020</year>
                  </date>
                  <article-title>Creating ‘local publics’. Responsibility and involvement in decision-making on technologies with local impacts</article-title>
                  <issue>4</issue>
                  <page-range>2215–2234</page-range>
                  <volume-id content-type="bibarticledoi">10.1007/s11948-020-00199-0</volume-id>
                  <source content-type="journal">Science and Engineering Ethics</source>
                  <volume>26</volume>
               </element-citation>
               <mixed-citation>Pesch, Udo; Huijts, Nicole; Bombaerts, Gunter; Doorn, Neelke; Hunka, Agnieszka (2020): Creating ‘local publics’. Responsibility and involvement in decision-making on technologies with local impacts. In: Science and Engineering Ethics 26 (4), pp. 2215–2234. <ext-link xlink:href="https://doi.org/10.1007/s11948-020-00199-0">https://doi.org/10.1007/s11948-020-00199-0</ext-link>
               </mixed-citation>
            </citation-alternatives>
         </ref>
         <ref specific-use="2" id="CR26">
            <mixed-citation>Peterson, Becky (2019): Palantir grabbed Project Maven defense contract after Google left the program. Sources. In: Business Insider, 10.12.2019. Available online at <ext-link xlink:href="https://www.businessinsider.com/palantir-took-over-from-google-on-project-maven-2019-12">https://www.businessinsider.com/palantir-took-over-from-google-on-project-maven-2019-12</ext-link>, last accessed on 04.11.2025.</mixed-citation>
         </ref>
         <ref specific-use="2" id="CR27">
            <mixed-citation>Pichai, Sundar (2018): AI at Google. Our principles. In: The Keyword (Google Blog), 07.06.2018. Available online at <ext-link xlink:href="https://blog.google/technology/ai/ai-principles/">https://blog.google/technology/ai/ai-principles/</ext-link>, last accessed on 31.10.2025.</mixed-citation>
         </ref>
         <ref specific-use="2" id="CR28">
            <mixed-citation>Reuters (2024): Pentagon awards $480 million deal to Palantir for ‘Maven’ prototype. In: Reuters, 30.05.2024. Available online at <ext-link xlink:href="https://www.reuters.com/technology/palantir-wins-480-million-us-army-deal-maven-prototype-2024-05-29/">https://www.reuters.com/technology/palantir-wins-480-million-us-army-deal-maven-prototype-2024-05-29/</ext-link>, last accessed on 31.10.2025.</mixed-citation>
         </ref>
         <ref specific-use="2" id="CR29">
            <mixed-citation>Scheiber, Noam; Conger, Kate (2020): The great Google revolt. In: The New York Times, 18.02.2020. Available online at <ext-link xlink:href="https://www.nytimes.com/interactive/2020/02/18/magazine/google-revolt.html">https://www.nytimes.com/interactive/2020/02/18/magazine/google-revolt.html</ext-link>, last accessed on 31.10.2025.</mixed-citation>
         </ref>
         <ref specific-use="2" id="CR30">
            <mixed-citation>Shane, Scott; Wakabayashi, Daisuke (2018): ‘The business of war’. Google employees protest work for the Pentagon. In: The New York Times, 04.04.2018. Available online at <ext-link xlink:href="https://www.nytimes.com/2018/04/04/technology/google-letter-ceo-pentagon-project.html">https://www.nytimes.com/2018/04/04/technology/google-letter-ceo-pentagon-project.html</ext-link>, last accessed on 04.11.2025.</mixed-citation>
         </ref>
         <ref specific-use="2" id="CR31">
            <mixed-citation>Statt, Nick (2018): Google reportedly leaving Project Maven military AI program after 2019. In: The Verge, 01.06.2018. Available online at <ext-link xlink:href="https://www.theverge.com/2018/6/1/17418406/google-maven-drone-imagery-ai-contract-expire">https://www.theverge.com/2018/6/1/17418406/google-maven-drone-imagery-ai-contract-expire</ext-link>, last accessed on 04.11.2025.</mixed-citation>
         </ref>
         <ref id="CR32">
            <citation-alternatives>
               <element-citation publication-type="journal">
                  <person-group person-group-type="author">
                     <name content-type="author">
                        <surname>Taylor</surname>
                        <given-names>B</given-names>
                     </name>
                     <name content-type="author">
                        <surname>Bean</surname>
                        <given-names>H</given-names>
                     </name>
                     <name content-type="author">
                        <surname>O’Gorman</surname>
                        <given-names>N</given-names>
                     </name>
                     <name content-type="author">
                        <surname>Rice</surname>
                        <given-names>R</given-names>
                     </name>
                  </person-group>
                  <date>
                     <year>2017</year>
                  </date>
                  <article-title>A fearful engine of power. Conceptualizing the communication–security relationship</article-title>
                  <issue>2</issue>
                  <page-range>111–135</page-range>
                  <volume-id content-type="bibarticledoi">10.1080/23808985.2017.1312482</volume-id>
                  <source content-type="journal">Annals of the International Communication Association</source>
                  <volume>41</volume>
               </element-citation>
               <mixed-citation>Taylor, Bryan; Bean, Hamilton; O’Gorman, Ned; Rice, Rebecca (2017): A fearful engine of power. Conceptualizing the communication–security relationship. In: Annals of the International Communication Association 41 (2), pp. 111–135. <ext-link xlink:href="https://doi.org/10.1080/23808985.2017.1312482">https://doi.org/10.1080/23808985.2017.1312482</ext-link>
               </mixed-citation>
            </citation-alternatives>
         </ref>
         <ref id="CR33">
            <citation-alternatives>
               <element-citation publication-type="book">
                  <string-name>U.S. DoD – United States Department of Defense</string-name>
                  <date>
                     <year>2017</year>
                  </date>
                  <comment>https://www.govexec.com/media/gbc/docs/pdfs_edit/establishment_of_the_awcft_project_maven.pdf, last accessed on 04.11.2025</comment>
                  <source content-type="BookTitle">Establishment of an algorithmic warfare cross-functional team (Project Maven)</source>
                  <publisher-name>United States Department of Defense</publisher-name>
                  <publisher-loc>Washington, DC</publisher-loc>
               </element-citation>
               <mixed-citation>U.S. DoD – United States Department of Defense (2017): Establishment of an algorithmic warfare cross-functional team (Project Maven). Washington, DC: United States Department of Defense. Available online at <ext-link xlink:href="https://www.govexec.com/media/gbc/docs/pdfs_edit/establishment_of_the_awcft_project_maven.pdf">https://www.govexec.com/media/gbc/docs/pdfs_edit/establishment_of_the_awcft_project_maven.pdf</ext-link>, last accessed on 04.11.2025.</mixed-citation>
            </citation-alternatives>
         </ref>
         <ref id="CR34">
            <citation-alternatives>
               <element-citation publication-type="journal">
                  <name content-type="author">
                     <surname>Vicente-Oliva</surname>
                     <given-names>S</given-names>
                  </name>
                  <date>
                     <year>2025</year>
                  </date>
                  <article-title>Participation of civil society in security and defense foresight exercises</article-title>
                  <issue>1</issue>
                  <page-range>e206</page-range>
                  <volume-id content-type="bibarticledoi">10.1002/ffo2.206</volume-id>
                  <source content-type="journal">Futures &amp; Foresight Science</source>
                  <volume>7</volume>
               </element-citation>
               <mixed-citation>Vicente-Oliva, Silvia (2025): Participation of civil society in security and defense foresight exercises. In: Futures &amp; Foresight Science 7 (1), p. e206. <ext-link xlink:href="https://doi.org/10.1002/ffo2.206">https://doi.org/10.1002/ffo2.206</ext-link>
               </mixed-citation>
            </citation-alternatives>
         </ref>
         <ref id="CR35">
            <citation-alternatives>
               <element-citation publication-type="journal">
                  <name content-type="author">
                     <surname>Warner</surname>
                     <given-names>M</given-names>
                  </name>
                  <date>
                     <year>2002</year>
                  </date>
                  <article-title>Publics and counterpublics</article-title>
                  <issue>1</issue>
                  <page-range>49–90</page-range>
                  <volume-id content-type="bibarticledoi">10.1215/08992363-14-1-49</volume-id>
                  <source content-type="journal">Public Culture</source>
                  <volume>14</volume>
               </element-citation>
               <mixed-citation>Warner, Michael (2002): Publics and counterpublics. In: Public Culture 14 (1), pp. 49–90. <ext-link xlink:href="https://doi.org/10.1215/08992363-14-1-49">https://doi.org/10.1215/08992363-14-1-49</ext-link>
               </mixed-citation>
            </citation-alternatives>
         </ref>
         <ref id="CR36">
            <citation-alternatives>
               <element-citation publication-type="journal">
                  <person-group person-group-type="author">
                     <name content-type="author">
                        <surname>Wesselink</surname>
                        <given-names>A</given-names>
                     </name>
                     <name content-type="author">
                        <surname>Paavola</surname>
                        <given-names>J</given-names>
                     </name>
                     <name content-type="author">
                        <surname>Fritsch</surname>
                        <given-names>O</given-names>
                     </name>
                     <name content-type="author">
                        <surname>Renn</surname>
                        <given-names>O</given-names>
                     </name>
                  </person-group>
                  <date>
                     <year>2011</year>
                  </date>
                  <article-title>Rationales for public participation in environmental policy and governance. Practitioners’ perspectives</article-title>
                  <issue>11</issue>
                  <page-range>2688–2704</page-range>
                  <volume-id content-type="bibarticledoi">10.1068/a44161</volume-id>
                  <source content-type="journal">Environment and Planning A: Economy and Space</source>
                  <volume>43</volume>
               </element-citation>
               <mixed-citation>Wesselink, Anna; Paavola, Jouni; Fritsch, Oliver; Renn, Ortwin (2011): Rationales for public participation in environmental policy and governance. Practitioners’ perspectives. In: Environment and Planning A: Economy and Space 43 (11), pp. 2688–2704. <ext-link xlink:href="https://doi.org/10.1068/a44161">https://doi.org/10.1068/a44161</ext-link>
               </mixed-citation>
            </citation-alternatives>
         </ref>
         <ref id="CR37">
            <citation-alternatives>
               <element-citation publication-type="book">
                  <name content-type="author">
                     <surname>Witjes</surname>
                     <given-names>N</given-names>
                  </name>
                  <date>
                     <year>2017</year>
                  </date>
                  <comment>https://mediatum.ub.tum.de/doc/1350479/document.pdf, last accessed on 04.11.2025</comment>
                  <source content-type="BookTitle">The co-production of science, technology and global politics. Exploring emergent fields of knowledge and policy</source>
                  <publisher-name>Technische Universität München</publisher-name>
                  <publisher-loc>Munich</publisher-loc>
               </element-citation>
               <mixed-citation>Witjes, Nina (2017): The co-production of science, technology and global politics. Exploring emergent fields of knowledge and policy. Munich: Technische Universität München. Available online at <ext-link xlink:href="https://mediatum.ub.tum.de/doc/1350479/document.pdf">https://mediatum.ub.tum.de/doc/1350479/document.pdf</ext-link>, last accessed on 04.11.2025.</mixed-citation>
            </citation-alternatives>
         </ref>
         <ref id="CR38">
            <citation-alternatives>
               <element-citation publication-type="journal">
                  <person-group person-group-type="author">
                     <name content-type="author">
                        <surname>Xue</surname>
                        <given-names>J</given-names>
                     </name>
                     <name content-type="author">
                        <surname>Lifu</surname>
                        <given-names>G</given-names>
                     </name>
                  </person-group>
                  <date>
                     <year>2024</year>
                  </date>
                  <article-title>AI Cold War with China? The advantage of public conversations about ethics</article-title>
                  <issue>1</issue>
                  <page-range>1–18</page-range>
                  <volume-id content-type="bibarticledoi">10.60690/vdnrw404</volume-id>
                  <source content-type="journal">GRACE: Global Review of AI Community Ethics</source>
                  <volume>2</volume>
               </element-citation>
                <mixed-citation>Xue, Jonathan; Guo, Lifu (2024): AI Cold War with China? The advantage of public conversations about ethics. In: GRACE: Global Review of AI Community Ethics 2 (1), pp. 1–18. <ext-link xlink:href="https://doi.org/10.60690/vdnrw404">https://doi.org/10.60690/vdnrw404</ext-link>
               </mixed-citation>
            </citation-alternatives>
         </ref>
      </ref-list>
   </back>
</article>
