Technological developments such as artificial intelligence (AI) and social media are changing the way people interact with each other and their environment worldwide. In times of global crises such as the COVID-19 pandemic or geopolitical tensions such as the Russian attack on Ukraine, these technologies play an increasingly important role in security policy. They are used for political communication and disinformation as well as for governance in crisis situations.

Stefka Schmid's dissertation analyses how security is being renegotiated in three central areas: AI innovation policy, cooperative work in safety-critical scenarios, and dealing with disinformation on social media. She also analyses how interactions with technological solutions yield dynamics of depoliticization and politicization.

On January 29, 2025, Stefka Schmid successfully defended her doctoral thesis, earning the title of Dr. phil. at the Department of History and Social Sciences of the Technical University of Darmstadt.

The entire PEASEC team extends its heartfelt congratulations to our new *Dr. phil.* Stefka Schmid!

Her dissertation was supervised by Prof. Dr. Dr. Christian Reuter, who also served as the first referee. Prof. Dr. Matthias Leese (ETH Zürich) and Prof. Dr. Jens Steffek (Institute for Political Science, TUDa) served as second referees. The examination committee was chaired by Prof. Dr. Christian Stecker (Institute for Political Science, TUDa) and also included Prof. Dr. Andreas Kaminski (Institute for Philosophy, TUDa).

Governing (In)Security: Socio-technical Interactions in International Relations 

Information technology is used across the globe, permeating different spheres of life. In the midst of geopolitical tensions and multiple crises, technology has become highly important to security governance. Against this backdrop and connecting to critical security studies (CSS), I focus on socio-technical practices that are enacted in different fields: (1) AI innovation policies, (2) safety-critical scenarios, and (3) misinformation on social media. In these fields, security is negotiated and in high demand. Thus, the first part focuses on how Chinese, EU, and US policies refer to design characteristics and multiple contexts of use in the problematization of AI innovation. The second part investigates computer-supported cooperative work (CSCW) of both formal and informal collectives in safety-critical scenarios. Self-governance in crises requires collaborative work and routines, is self-referential and, at times, comprises care practices. The third part addresses (de)politicization in the context of user-centered interventions aimed at mitigating the spread of misinformation on social media. Situated in human-computer interaction (HCI) and as an interdisciplinary endeavor, this work allows for a reflexive approach that both integrates and reflects on problem-solving approaches that are co-constitutive of security governance.

Selected Publications within the PhD thesis:

  1. Stefka Schmid, Daniel Lambach, Carlo Diehl, Christian Reuter (2025)
    Arms Race or Innovation Race? Geopolitical AI Development. Geopolitics. (IF 3.0)
  2. Stefka Schmid, Binh-Chau Pham, Anna-Katharina Ferl (2024)
    Trust in Artificial Intelligence: Producing Ontological Security through Governmental Visions.
    Cooperation and Conflict, 0(0). doi:10.1177/00108367241288073 (IF 1.9)
  3. Stefka Schmid (2022)
    Trustworthy and Explainable: A European Vision of (Weaponised) Artificial Intelligence.
    Die Friedens-Warte / Journal of International Peace and Organization (JIPO), 95(3-4), 290–315.
    doi:10.35998/fw-2022-0013
  4. Stefka Schmid (2023)
    Safe and Secure? Visions of Military Human-Computer Interaction.
    Mensch und Computer – Workshopband.
    doi:10.18420/muc2023-mci-ws01-365
  5. Steffen Haesler, Stefka Schmid, Annemike Sophia Vierneisel, Christian Reuter (2021)
    Stronger Together: How Neighborhood Groups Build up a Virtual Network during the COVID-19 Pandemic.
Proceedings of the ACM on Human-Computer Interaction (PACM HCI): Computer-Supported Cooperative Work and Social Computing, 5(CSCW2).
    doi:10.1145/3476045 (ICORE-A)
  6. Stefka Schmid, Laura Guntrum, Steffen Haesler, Lisa Schultheiß, Christian Reuter (2023)
    Digital Volunteers in the COVID-19 Pandemic: CareWork on Social Media for Socio-technical Resilience.
    Weizenbaum Journal of the Digital Society, 3(1).
    doi:10.34669/WI.WJDS/3.3.6
  7. Katrin Hartwig, Stefka Schmid, Tom Biselli, Helene Pleil, Christian Reuter (2024)
    Misleading Information in Crises: Exploring Content-specific Indicators for Misleading Information on Twitter from a User Perspective.
    Behaviour & Information Technology.
    doi:10.1080/0144929X.2024.2373166 (IF 2.9)
  8. Stefka Schmid, Katrin Hartwig, Robert Cieslinski, Christian Reuter (2024)
    Digital Resilience in Dealing with Misinformation on Social Media during COVID-19: A Web Application to Assist Users in Crises.
    Information Systems Frontiers (ISF).
    doi:10.1007/s10796-022-10347-5 (IF 6.9)