Franziska Herbert, M.Sc.

Former staff member

Technical University of Darmstadt, Department of Computer Science,
Science and Technology for Peace and Security (PEASEC)

Franziska Herbert, M.Sc., was a research associate at the Chair of Science and Technology for Peace and Security (PEASEC) in the Department of Computer Science at Technical University of Darmstadt.

She studied psychology at Technical University of Darmstadt. Alongside her studies, she worked in software usability, contributed to psychological expert reports for courts, and supervised an interdisciplinary course for first-semester psychology students. In her master's thesis, she investigated the influence of different types of explanations on trust in artificial intelligence. In her free time, she was active as a student representative on various university committees.

Her research interests lie in the areas of human-computer interaction, privacy, and trust in technology.

Publications

  • Enno Steinbrink, Tom Biselli, Sebastian Linsner, Franziska Herbert, Christian Reuter (2023)
    Privacy Perception and Behaviour in Safety-Critical Environments
    In: Nina Gerber, Alina Stöver, Karola Marky: Human Factors in Privacy Research. Cham: Springer International Publishing, 237–251.
    [BibTeX] [Abstract] [Download PDF]

    When considering privacy, context, and environmental circumstances can have a strong influence on individual decisions and user behavior. Especially in crises or threatening situations, privacy may conflict with other values, such as personal safety and health. In other cases, personal or public safety can also be dependent on privacy: the context of flight shows how, for those affected, the value of data protection can increase as a result of an increased threat situation. Thus, when individual sovereignty—the autonomous development of one’s own will—or safety is highly dependent on information flows, people tend to be more protective of their privacy in order to maintain their information sovereignty. But also, the context of agriculture, as part of the critical infrastructure, shows how privacy concerns can affect the adoption of digital tools. With these two examples, flight and migration as well as agriculture, this chapter presents some exemplary results that illustrate the importance of the influence of situational factors on perceived information sovereignty and the evaluation of privacy.

    @incollection{steinbrink_privacy_2023,
    address = {Cham},
    title = {Privacy {Perception} and {Behaviour} in {Safety}-{Critical} {Environments}},
    isbn = {978-3-031-28643-8},
    url = {https://doi.org/10.1007/978-3-031-28643-8_12},
    abstract = {When considering privacy, context, and environmental circumstances can have a strong influence on individual decisions and user behavior. Especially in crises or threatening situations, privacy may conflict with other values, such as personal safety and health. In other cases, personal or public safety can also be dependent on privacy: the context of flight shows how, for those affected, the value of data protection can increase as a result of an increased threat situation. Thus, when individual sovereignty—the autonomous development of one’s own will—or safety is highly dependent on information flows, people tend to be more protective of their privacy in order to maintain their information sovereignty. But also, the context of agriculture, as part of the critical infrastructure, shows how privacy concerns can affect the adoption of digital tools. With these two examples, flight and migration as well as agriculture, this chapter presents some exemplary results that illustrate the importance of the influence of situational factors on perceived information sovereignty and the evaluation of privacy.},
    booktitle = {Human {Factors} in {Privacy} {Research}},
    publisher = {Springer International Publishing},
    author = {Steinbrink, Enno and Biselli, Tom and Linsner, Sebastian and Herbert, Franziska and Reuter, Christian},
    editor = {Gerber, Nina and Stöver, Alina and Marky, Karola},
    year = {2023},
    keywords = {HCI, UsableSec, Security, Projekt-CROSSING, Projekt-ATHENE-FANCY, Projekt-GRKPrivacy},
    pages = {237--251},
    }

  • Tom Biselli, Enno Steinbrink, Franziska Herbert, Gina Maria Schmidbauer-Wolf, Christian Reuter (2022)
    On the Challenges of Developing a Concise Questionnaire to Identify Privacy Personas
    Proceedings on Privacy Enhancing Technologies (PoPETs) (4):645–669. doi:10.56553/popets-2022-0126
    [BibTeX] [Abstract] [Download PDF]

    Concise instruments to determine privacy personas – typical privacy-related user groups – are not available at present. Consequently, we aimed to identify them on a privacy knowledge–privacy behavior ratio based on a self-developed instrument. To achieve this, we conducted an item analysis (N = 820) and a confirmatory factor analysis (CFA) (N = 656) of data based on an online study with German participants. Starting with 81 items, we reduced those to an eleven-item questionnaire with the two scales privacy knowledge and privacy behavior. A subsequent cluster analysis (N = 656) revealed three distinct user groups: (1) Fundamentalists scoring high in privacy knowledge and behavior, (2) Pragmatists scoring average in privacy knowledge and behavior and (3) Unconcerned scoring low in privacy knowledge and behavior. In a closer inspection of the questionnaire, the CFAs supported the model with a close global fit based on RMSEA in a training and to a lesser extent in a cross-validation sample. Deficient local fit as well as validity and reliability coefficients well below generally accepted thresholds, however, revealed that the questionnaire in its current form cannot be considered a suitable measurement instrument for determining privacy personas. The results are discussed in terms of related persona conceptualizations, the importance of a methodologically sound investigation of corresponding privacy dimensions and our lessons learned.

    @article{biselli_challenges_2022,
    title = {On the {Challenges} of {Developing} a {Concise} {Questionnaire} to {Identify} {Privacy} {Personas}},
    url = {https://petsymposium.org/2022/files/papers/issue4/popets-2022-0126.pdf},
    doi = {10.56553/popets-2022-0126},
    abstract = {Concise instruments to determine privacy personas – typical privacy-related user groups – are not available at present. Consequently, we aimed to identify them on a privacy knowledge–privacy behavior ratio based on a self-developed instrument. To achieve this, we conducted an item analysis (N = 820) and a confirmatory factor analysis (CFA) (N = 656) of data based on an online study with German participants. Starting with 81 items, we reduced those to an eleven-item questionnaire with the two scales privacy knowledge and privacy behavior. A subsequent cluster analysis (N = 656) revealed three distinct user groups: (1) Fundamentalists scoring high in privacy knowledge and behavior, (2) Pragmatists scoring average in privacy knowledge and behavior and (3) Unconcerned scoring low in privacy knowledge and behavior. In a closer inspection of the questionnaire, the CFAs supported the model with a close global fit based on RMSEA in a training and to a lesser extent in a cross-validation sample. Deficient local fit as well as validity and reliability coefficients well below generally accepted thresholds, however, revealed that the questionnaire in its current form cannot be considered a suitable measurement instrument for determining privacy personas. The results are discussed in terms of related persona conceptualizations, the importance of a methodologically sound investigation of corresponding privacy dimensions and our lessons learned.},
    number = {4},
    journal = {Proceedings on Privacy Enhancing Technologies (PoPETs)},
    author = {Biselli, Tom and Steinbrink, Enno and Herbert, Franziska and Schmidbauer-Wolf, Gina Maria and Reuter, Christian},
    year = {2022},
    keywords = {HCI, Selected, UsableSec, Security, A-Paper, Ranking-CORE-A, Projekt-CROSSING, Projekt-ATHENE-FANCY, AuswahlUsableSec, Projekt-GRKPrivacy},
    pages = {645--669},
    }

  • Franziska Herbert, Gina Maria Schmidbauer-Wolf, Christian Reuter (2021)
    Who Should Get My Private Data in Which Case? Evidence in the Wild
    Mensch und Computer – Tagungsband New York. doi:10.1145/3473856.3473879
    [BibTeX] [Abstract] [Download PDF]

    As a result of the ongoing digitalization of our everyday lives, the amount of data produced by everyone is steadily increasing. This happens through personal decisions and items, such as the use of social media or smartphones, but also through more and more data acquisition in public spaces, such as e.g., Closed Circuit Television. Are people aware of the data they are sharing? What kind of data do people want to share with whom? Are people aware if they have Wi-Fi, GPS, or Bluetooth activated as potential data sharing functionalities on their phone? To answer these questions, we conducted a representative online survey as well as face-to-face interviews with users in Germany. We found that most users wanted to share private data on premise with most entities, indicating that willingness to share data depends on who has access to the data. Almost half of the participants would be more willing to share data with specific entities (state bodies & rescue forces) in the event that an acquaintance is endangered. For Wi-Fi and GPS the frequencies of self-reported and actual activation on the smartphone are almost equal, but 17% of participants were unaware of the Bluetooth status on their smartphone. Our research is therefore in line with other studies suggesting relatively low privacy awareness of users.

    @inproceedings{herbert_who_2021,
    address = {New York},
    title = {Who {Should} {Get} {My} {Private} {Data} in {Which} {Case}? {Evidence} in the {Wild}},
    url = {http://www.peasec.de/paper/2021/2021_Herbert_SchmidbauerWolfReuter_WhoShouldGetMyPrivateDateinWhichCase_MuC.pdf},
    doi = {10.1145/3473856.3473879},
    abstract = {As a result of the ongoing digitalization of our everyday lives, the amount of data produced by everyone is steadily increasing. This happens through personal decisions and items, such as the use of social media or smartphones, but also through more and more data acquisition in public spaces, such as e.g., Closed Circuit Television. Are people aware of the data they are sharing? What kind of data do people want to share with whom? Are people aware if they have Wi-Fi, GPS, or Bluetooth activated as potential data sharing functionalities on their phone? To answer these questions, we conducted a representative online survey as well as face-to-face interviews with users in Germany. We found that most users wanted to share private data on premise with most entities, indicating that willingness to share data depends on who has access to the data. Almost half of the participants would be more willing to share data with specific entities (state bodies \& rescue forces) in the event that an acquaintance is endangered. For Wi-Fi and GPS the frequencies of self-reported and actual activation on the smartphone are almost equal, but 17\% of participants were unaware of the Bluetooth status on their smartphone. Our research is therefore in line with other studies suggesting relatively low privacy awareness of users.},
    booktitle = {Mensch und {Computer} - {Tagungsband}},
    publisher = {ACM},
    author = {Herbert, Franziska and Schmidbauer-Wolf, Gina Maria and Reuter, Christian},
    year = {2021},
    keywords = {UsableSec, Security, Projekt-ATHENE-FANCY},
    }

  • Franziska Herbert, Gina Maria Schmidbauer-Wolf, Christian Reuter (2020)
    Differences in IT Security Behavior and Knowledge of Private Users in Germany
    Proceedings of the International Conference on Wirtschaftsinformatik (WI) Potsdam, Germany. doi:10.30844/wi_2020_v3-herbert
    [BibTeX] [Abstract] [Download PDF]

    The German Federal Office for Information Security (Bundesamt für Sicherheit in der Informationstechnik) offers advice and recommendations for private users on how to behave securely. Based on these recommendations we investigate the IT security knowledge and behavior of private users with a representative study of the German population (N = 1.219). Additionally, we analyze the role of socio-demographic factors (gender, age, education, political orientation) for security knowledge and behavior. Results show that German private users have only moderate IT security knowledge and behavior, with aspects as gender, age, education and political orientation partly having an influence. Men, higher educated and politically moderately oriented participants show higher security knowledge, whereas young people and those less knowledgeable about security behave less security-conscious. Additionally, security knowledge and behavior correlate moderately. Therefore, to increase private users' IT security we suggest to increase education and training especially for users being young, politically right-wing or female.

    @inproceedings{herbert_differences_2020,
    address = {Potsdam, Germany},
    title = {Differences in {IT} {Security} {Behavior} and {Knowledge} of {Private} {Users} in {Germany}},
    url = {https://library.gito.de/wp-content/uploads/2021/08/V3_Herbert-Differences_in_IT_Security_Behavior_and_Knowledge-541_c.pdf},
    doi = {10.30844/wi_2020_v3-herbert},
    abstract = {The German Federal Office for Information Security (Bundesamt für Sicherheit in der Informationstechnik) offers advice and recommendations for private users on how to behave securely. Based on these recommendations we investigate the IT security knowledge and behavior of private users with a representative study of the German population (N = 1.219). Additionally, we analyze the role of socio-demographic factors (gender, age, education, political orientation) for security knowledge and behavior. Results show that German private users have only moderate IT security knowledge and behavior, with aspects as gender, age, education and political orientation partly having an influence. Men, higher educated and politically moderately oriented participants show higher security knowledge, whereas young people and those less knowledgeable about security behave less security-conscious. Additionally, security knowledge and behavior correlate moderately. Therefore, to increase private users' IT security we suggest to increase education and training especially for users being young, politically right-wing or female.},
    booktitle = {Proceedings of the {International} {Conference} on {Wirtschaftsinformatik} ({WI})},
    author = {Herbert, Franziska and Schmidbauer-Wolf, Gina Maria and Reuter, Christian},
    year = {2020},
    keywords = {Ranking-CORE-C, Ranking-VHB-C, UsableSec, Security, Ranking-WKWI-A, Projekt-ATHENE-FANCY},
    pages = {1--16},
    }

  • Gina Maria Schmidbauer-Wolf, Franziska Herbert, Christian Reuter (2019)
    Ein Kampf gegen Windmühlen: qualitative Studie über Informatikabsolvent_innen und ihre Datenprivatheit
    Mensch und Computer – Workshopband Hamburg, Germany. doi:10.18420/muc2019-ws-302-06
    [BibTeX] [Abstract] [Download PDF]

    How do people protect their own private data? To pursue this question, a qualitative study with six computer science graduates asked how they protect the privacy of their data. The aim of the semi-structured interviews was to gain as broad an overview as possible of the techniques and technologies actually used to protect private data. While the assumption was confirmed that all participants were aware of the sensitivity of their private data, their definitions of such private data and the behavior used to protect it differed. It was observed that extensive knowledge in this area does not necessarily lead to more cautious behavior. Strategies mentioned for protecting one's own data include staying informed, data minimisation, avoiding the products of certain corporations, and resignation. Political, philosophical, utilitarian as well as fear-driven reasons were given as motivations for the respective behavior; the latter can be divided into fear of theft and fear of being treated differently.

    @inproceedings{schmidbauer-wolf_kampf_2019,
    address = {Hamburg, Germany},
    title = {Ein {Kampf} gegen {Windmühlen}: qualitative {Studie} über {Informatikabsolvent}\_innen und ihre {Datenprivatheit}},
    url = {https://dl.gi.de/bitstream/handle/20.500.12116/25168/302-06.pdf},
    doi = {10.18420/muc2019-ws-302-06},
    abstract = {Wie werden eigene private Daten geschützt? Um dieser Frage nachzugehen, wurde in einer qualitativen Studie mit sechs Informatikabsolvent\_innen erfragt, wie diese die Privatheit ihrer Daten schützen. Das Ziel der teilstrukturierten Interviews war es einen möglichst breiten Überblick über tatsächlich verwendete Techniken und Technologien zum Schutz der privaten Daten zu gewinnen. Während sich die Vermutung bestätigte, dass alle Teilnehmer\_innen ein Bewusstsein für die Brisanz ihrer privaten Daten hatten, unterschieden sich die Definitionen ebendieser privaten Daten sowie das Verhalten, um diese zu schützen. Es konnte beobachtet werden, dass viel Wissen in diesem Bereich nicht zwangsläufig zu einem vorsichtigeren Handeln führt. Mögliche genannte Strategien zum Schutz der eigenen Daten sind: Informiert bleiben, Datensparsamkeit, Vermeidung der Produkte bestimmter Konzerne sowie Resignation. Als Motivation für das jeweilige Verhalten wurden sowohl politische, philosophische, utilitaristische, als auch angstgetriebene Gründe genannt. Letztere können in Angst vor Diebstahl und Angst vor Andersbehandlung unterschieden werden.},
    booktitle = {Mensch und {Computer} - {Workshopband}},
    publisher = {Gesellschaft für Informatik e.V.},
    author = {Schmidbauer-Wolf, Gina Maria and Herbert, Franziska and Reuter, Christian},
    year = {2019},
    keywords = {Security, Projekt-ATHENE-FANCY},
    pages = {256--264},
    }

  • Gina Maria Schmidbauer-Wolf, Franziska Herbert, Christian Reuter (2019)
    Responsible Data Usage in Smart Cities: Privacy in Everyday Life vs. Reacting to Emergency Situations
    SCIENCE PEACE SECURITY ’19 – Proceedings of the Interdisciplinary Conference on Technical Peace and Security Research Darmstadt, Germany.
    [BibTeX] [Abstract] [Download PDF]

    Smart cities want to provide a better life to their citizens, e.g. regarding health care, infrastructure, better safety and security. This can be achieved by using more and new technology and by interconnecting and analysing new and existent devices. Thus, public spaces and buildings will be equipped with more interconnected input and output modalities. This ongoing technologization of public spaces creates opportunities for making everyone’s life more secure, while at the same time everyone’s personal privacy is endangered. So how is this balancing act tackled and dealt with right now? What fears do citizens have regarding their security as well as their privacy? This paper provides first insights into the topic privacy in smart cities regarding that smart cities need data which can be provided by and of people. The paper raises the question if collecting people’s data, and thus enabling smart cities, is ethical and if not, how it can be assured to be ethical.

    @inproceedings{schmidbauer-wolf_responsible_2019,
    address = {Darmstadt, Germany},
    title = {Responsible {Data} {Usage} in {Smart} {Cities}: {Privacy} in {Everyday} {Life} vs. {Reacting} to {Emergency} {Situations}},
    url = {https://tuprints.ulb.tu-darmstadt.de/id/eprint/9164},
    abstract = {Smart cities want to provide a better life to their citizens, e.g. regarding health care, infrastructure, better safety and security. This can be achieved by using more and new technology and by interconnecting and analysing new and existent devices. Thus, public spaces and buildings will be equipped with more interconnected input and output modalities. This ongoing technologization of public spaces creates opportunities for making everyone's life more secure, while at the same time everyone's personal privacy is endangered. So how is this balancing act tackled and dealt with right now? What fears do citizens have regarding their security as well as their privacy? This paper provides first insights into the topic privacy in smart cities regarding that smart cities need data which can be provided by and of people. The paper raises the question if collecting people's data, and thus enabling smart cities, is ethical and if not, how it can be assured to be ethical.},
    booktitle = {{SCIENCE} {PEACE} {SECURITY} '19 - {Proceedings} of the {Interdisciplinary} {Conference} on {Technical} {Peace} and {Security} {Research}},
    publisher = {TUprints},
    author = {Schmidbauer-Wolf, Gina Maria and Herbert, Franziska and Reuter, Christian},
    editor = {Reuter, Christian and Altmann, Jürgen and Göttsche, Malte and Himmel, Mirko},
    year = {2019},
    keywords = {HCI, UsableSec, Security, Projekt-CRISP, Projekt-ATHENE-FANCY},
    pages = {70--74},
    }

  • Christian Reuter, Katja Häusser, Mona Bien, Franziska Herbert (2019)
    Between Effort and Security: User Assessment of the Adequacy of Security Mechanisms for App Categories
    Mensch und Computer – Tagungsband Hamburg, Germany. doi:10.1145/3340764.3340770
    [BibTeX] [Abstract] [Download PDF]

    With the increasing popularity of the smartphone, the number of people using it for financial transactions such as online shopping, online banking or mobile payment is also growing. Apps used in these contexts store sensitive and valuable data, creating a need for security measures. It has not yet been researched to what extent certain authentication mechanisms, which can be information-, biometric- as well as token-based, are suitable for individual apps and the respective data. The goal of this work is to assess how perceived security and estimated effort of using such mechanisms, as well as the degree to which app data is considered worth protecting, influence users’ choices of appropriate measures to protect app categories. Therefore, we conducted a representative study (n=1024). On the one hand, our results show that a positive correlation between perceived security and effort exists for all investigated non-biometric authentication methods. On the other hand, the study sheds light on the differences between the investigated app categories and the users’ choice of the appropriate security mechanisms for the particular category. In contrast to perceived security having a positive influence on a user’s preference of mechanism, a relation can hardly be identified for effort. Moreover, app data sensitivity does not seem relevant for the users’ choice of security mechanism.

    @inproceedings{reuter_between_2019,
    address = {Hamburg, Germany},
    title = {Between {Effort} and {Security}: {User} {Assessment} of the {Adequacy} of {Security} {Mechanisms} for {App} {Categories}},
    url = {http://www.peasec.de/paper/2019/2019_ReuterHaeusserBienHerbert_EffortSecurity_MuC.pdf},
    doi = {10.1145/3340764.3340770},
    abstract = {With the increasing popularity of the smartphone, the number of people using it for financial transactions such as online shopping, online banking or mobile payment is also growing. Apps used in these contexts store sensitive and valuable data, creating a need for security measures. It has not yet been researched to what extent certain authentication mechanisms, which can be information-, biometric- as well as token-based, are suitable for individual apps and the respective data. The goal of this work is to assess how perceived security and estimated effort of using such mechanisms, as well as the degree to which app data is considered worth protecting, influence users' choices of appropriate measures to protect app categories. Therefore, we conducted a representative study (n=1024). On the one hand, our results show that a positive correlation between perceived security and effort exists for all investigated non-biometric authentication methods. On the other hand, the study sheds light on the differences between the investigated app categories and the users' choice of the appropriate security mechanisms for the particular category. In contrast to perceived security having a positive influence on a user's preference of mechanism, a relation can hardly be identified for effort. Moreover, app data sensitivity does not seem relevant for the users' choice of security mechanism.},
    booktitle = {Mensch und {Computer} - {Tagungsband}},
    publisher = {ACM},
    author = {Reuter, Christian and Häusser, Katja and Bien, Mona and Herbert, Franziska},
    editor = {Alt, Florian and Bulling, Andreas and Döring, Tanja},
    year = {2019},
    keywords = {HCI, Student, UsableSec, Security, Projekt-CRISP, Projekt-CROSSING, Projekt-ATHENE-FANCY},
    pages = {287--297},
    }
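
To make the persona-identification step summarised in Biselli et al. (2022) above more concrete, the following is a minimal, purely illustrative sketch in Python: it standardises two hypothetical scale scores (privacy knowledge and privacy behavior), partitions simulated respondents into three clusters with k-means, and names the clusters Unconcerned, Pragmatists and Fundamentalists by mean score. The simulated data, the parameters and the use of scikit-learn are assumptions for illustration only; this is not the authors' actual analysis code.

# Illustrative sketch (simulated data), loosely mirroring the clustering step
# described in the PoPETs 2022 paper above; not the published analysis.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(seed=0)

# Hypothetical per-respondent mean scores on the two scales
# (privacy knowledge, privacy behavior), e.g. on a 1-5 Likert range.
scores = np.column_stack([
    rng.normal(loc=3.0, scale=0.8, size=656),   # knowledge
    rng.normal(loc=3.0, scale=0.8, size=656),   # behavior
])

# Standardise both scales, then partition respondents into three clusters.
z = StandardScaler().fit_transform(scores)
labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(z)

# Order clusters by their overall mean standardised score and attach persona
# names: lowest -> Unconcerned, middle -> Pragmatists, highest -> Fundamentalists.
order = np.argsort([z[labels == k].mean() for k in range(3)])
names = dict(zip(order, ["Unconcerned", "Pragmatists", "Fundamentalists"]))
for k in range(3):
    knowledge, behavior = scores[labels == k].mean(axis=0)
    print(f"{names[k]:>15}: n={np.sum(labels == k):3d}, "
          f"mean knowledge={knowledge:.2f}, mean behavior={behavior:.2f}")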