Franziska Herbert, M.Sc.

Former staff member

Technische Universität Darmstadt, Department of Computer Science,
Science and Technology for Peace and Security (PEASEC)

Franziska Herbert, M.Sc. was a research associate at the Chair of Science and Technology for Peace and Security (PEASEC) in the Department of Computer Science at Technische Universität Darmstadt.

She studied psychology at Technische Universität Darmstadt. Alongside her studies, she worked in software usability and on psychological expert reports for courts, and supervised an interdisciplinary course for first-semester psychology students. In her master's thesis, she examined the influence of different kinds of explanations on trust in artificial intelligence. In her free time, she served as a student representative on various university committees.

Her research interests lie in human-computer interaction, privacy, and trust in technology.

Publications

  • Tom Biselli, Enno Steinbrink, Franziska Herbert, Gina Maria Schmidbauer-Wolf, Christian Reuter (2022)
    On the Challenges of Developing a Concise Questionnaire to Identify Privacy Personas
    Proceedings on Privacy Enhancing Technologies (PoPETs).
    [BibTeX] [Abstract]

    Concise instruments to determine privacy personas – typical privacy-related user groups – are not available at present. Consequently, we aimed to identify them on a privacy knowledge–privacy behavior ratio based on a self-developed instrument. To achieve this, we conducted an item analysis (N = 820) and a confirmatory factor analysis (CFA) (N = 656) of data based on an online study with German participants. Starting with 81 items, we reduced those to an eleven-item questionnaire with the two scales privacy knowledge and privacy behavior. A subsequent cluster analysis (N = 656) revealed three distinct user groups: (1) Fundamentalists scoring high in privacy knowledge and behavior, (2) Pragmatists scoring average in privacy knowledge and behavior and (3) Unconcerned scoring low in privacy knowledge and behavior. In a closer inspection of the questionnaire, the CFAs supported the model with a close global fit based on RMSEA in a training and to a lesser extent in a cross-validation sample. Deficient local fit as well as validity and reliability coefficients well below generally accepted thresholds, however, revealed that the questionnaire in its current form cannot be considered a suitable measurement instrument for determining privacy personas. The results are discussed in terms of related persona conceptualizations, the importance of a methodologically sound investigation of corresponding privacy dimensions and our lessons learned.
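The analysis pipeline described in the abstract (scale scores → cluster analysis → three user groups) can be sketched with a plain k-means clustering on knowledge/behavior score pairs. This is an illustrative sketch only: the synthetic data, group means, and deterministic initialization are assumptions for the example, not the study's actual data or method.

```python
import random

def kmeans(points, k=3, iters=50):
    """Plain k-means on 2-D (knowledge, behavior) score pairs."""
    pts = sorted(points)
    # Deterministic init: lowest, middle, and highest point by knowledge score
    centroids = [pts[0], pts[len(pts) // 2], pts[-1]][:k]
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in points:
            nearest = min(range(k),
                          key=lambda c: (p[0] - centroids[c][0]) ** 2
                                      + (p[1] - centroids[c][1]) ** 2)
            clusters[nearest].append(p)
        # Move each centroid to the mean of its cluster (keep it if the cluster is empty)
        centroids = [(sum(p[0] for p in cl) / len(cl), sum(p[1] for p in cl) / len(cl))
                     if cl else centroids[i]
                     for i, cl in enumerate(clusters)]
    return centroids, clusters

# Synthetic scale scores in [0, 1] for three hypothetical groups of 40 participants each
rng = random.Random(42)
data = [(rng.gauss(mu, 0.05), rng.gauss(mu, 0.05))
        for mu in (0.15, 0.5, 0.85) for _ in range(40)]

centroids, clusters = kmeans(data)
# Order clusters by mean knowledge score:
# low -> Unconcerned, mid -> Pragmatists, high -> Fundamentalists
for label, (c, cl) in zip(("Unconcerned", "Pragmatists", "Fundamentalists"),
                          sorted(zip(centroids, clusters), key=lambda t: t[0][0])):
    print(f"{label}: centroid=({c[0]:.2f}, {c[1]:.2f}), n={len(cl)}")
```

With well-separated synthetic groups, the three recovered clusters map onto the persona labels from the paper; on real questionnaire data the boundaries are far less clean, which is part of what the paper's validity analysis addresses.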

    @article{biselli_challenges_2022,
    title = {On the {Challenges} of {Developing} a {Concise} {Questionnaire} to {Identify} {Privacy} {Personas}},
    abstract = {Concise instruments to determine privacy personas – typical privacy-related user groups – are not available at present. Consequently, we aimed to identify them on a privacy knowledge–privacy behavior ratio based on a self-developed instrument. To achieve this, we conducted an item analysis (N = 820) and a confirmatory factor analysis (CFA) (N = 656) of data based on an online study with German participants. Starting with 81 items, we reduced those to an eleven-item questionnaire with the two scales privacy knowledge and privacy behavior. A subsequent cluster analysis (N = 656) revealed three distinct user groups: (1) Fundamentalists scoring high in privacy knowledge and behavior, (2) Pragmatists scoring average in privacy knowledge and behavior and (3) Unconcerned scoring low in privacy knowledge and behavior. In a closer inspection of the questionnaire, the CFAs supported the model with a close global fit based on RMSEA in a training and to a lesser extent in a cross-validation sample. Deficient local fit as well as validity and reliability coefficients well below generally accepted thresholds, however, revealed that the questionnaire in its current form cannot be considered a suitable measurement instrument for determining privacy personas. The results are discussed in terms of related persona conceptualizations, the importance of a methodologically sound investigation of corresponding privacy dimensions and our lessons learned.},
    journal = {Proceedings on Privacy Enhancing Technologies (PoPETs)},
    author = {Biselli, Tom and Steinbrink, Enno and Herbert, Franziska and Schmidbauer-Wolf, Gina Maria and Reuter, Christian},
    year = {2022},
    keywords = {A-Paper, AuswahlUsableSec, HCI, Projekt-ATHENE-FANCY, Projekt-GRKPrivacy, Ranking-CORE-A, Security, Selected, UsableSec},
    }

  • Franziska Herbert, Gina Maria Schmidbauer-Wolf, Christian Reuter (2021)
    Who Should Get My Private Data in Which Case? Evidence in the Wild
    Mensch und Computer 2021 – Tagungsband New York. doi:10.1145/3473856.3473879
    [BibTeX] [Download PDF]

    @inproceedings{herbert_who_2021,
    address = {New York},
    title = {Who {Should} {Get} {My} {Private} {Data} in {Which} {Case}? {Evidence} in the {Wild}},
    url = {http://www.peasec.de/paper/2021/2021_Herbert_SchmidbauerWolfReuter_WhoShouldGetMyPrivateDateinWhichCase_MuC.pdf},
    doi = {10.1145/3473856.3473879},
    booktitle = {Mensch und {Computer} 2021 - {Tagungsband}},
    publisher = {ACM},
    author = {Herbert, Franziska and Schmidbauer-Wolf, Gina Maria and Reuter, Christian},
    year = {2021},
    keywords = {Security, UsableSec, Projekt-ATHENE-FANCY},
    }

  • Franziska Herbert, Gina Maria Schmidbauer-Wolf, Christian Reuter (2020)
    Differences in IT Security Behavior and Knowledge of Private Users in Germany
    Proceedings of the International Conference on Wirtschaftsinformatik (WI) Potsdam, Germany. doi:10.30844/wi_2020_v3-herbert
    [BibTeX] [Abstract] [Download PDF]

    The German Federal Office for Information Security (Bundesamt für Sicherheit in der Informationstechnik) offers advice and recommendations for private users on how to behave securely. Based on these recommendations we investigate the IT security knowledge and behavior of private users with a representative study of the German population (N = 1.219). Additionally, we analyze the role of socio-demographic factors (gender, age, education, political orientation) for security knowledge and behavior. Results show that German private users have only moderate IT security knowledge and behavior, with aspects as gender, age, education and political orientation partly having an influence. Men, higher educated and politically moderately oriented participants show higher security knowledge, whereas young people and those less knowledgeable about security behave less security-conscious. Additionally, security knowledge and behavior correlate moderately. Therefore, to increase private users' IT security we suggest to increase education and training especially for users being young, politically right-wing or female.

    @inproceedings{herbert_differences_2020,
    address = {Potsdam, Germany},
    title = {Differences in {IT} {Security} {Behavior} and {Knowledge} of {Private} {Users} in {Germany}},
    url = {https://library.gito.de/open-access-pdf/V3_Herbert-Differences_in_IT_Security_Behavior_and_Knowledge-541_c.pdf},
    doi = {10.30844/wi_2020_v3-herbert},
    abstract = {The German Federal Office for Information Security (Bundesamt für Sicherheit in der Informationstechnik) offers advice and recommendations for private users on how to behave securely. Based on these recommendations we investigate the IT security knowledge and behavior of private users with a representative study of the German population (N = 1.219). Additionally, we analyze the role of socio-demographic factors (gender, age, education, political orientation) for security knowledge and behavior. Results show that German private users have only moderate IT security knowledge and behavior, with aspects as gender, age, education and political orientation partly having an influence. Men, higher educated and politically moderately oriented participants show higher security knowledge, whereas young people and those less knowledgeable about security behave less security-conscious. Additionally, security knowledge and behavior correlate moderately. Therefore, to increase private users' IT security we suggest to increase education and training especially for users being young, politically right-wing or female.},
    booktitle = {Proceedings of the {International} {Conference} on {Wirtschaftsinformatik} ({WI})},
    author = {Herbert, Franziska and Schmidbauer-Wolf, Gina Maria and Reuter, Christian},
    year = {2020},
    keywords = {Security, UsableSec, Ranking-CORE-C, Ranking-VHB-C, Ranking-WKWI-A, Projekt-ATHENE-FANCY},
    pages = {1--16},
    }

  • Christian Reuter, Katja Häusser, Mona Bien, Franziska Herbert (2019)
    Between Effort and Security: User Assessment of the Adequacy of Security Mechanisms for App Categories
    Mensch und Computer 2019 Hamburg, Germany. doi:10.1145/3340764.3340770
    [BibTeX] [Abstract] [Download PDF]

    With the increasing popularity of the smartphone, the number of people using it for financial transactions such as online shopping, online banking or mobile payment is also growing. Apps used in these contexts store sensitive and valuable data, creating a need for security measures. It has not yet been researched to what extent certain authentication mechanisms, which can be information-, biometric- as well as token-based, are suitable for individual apps and the respective data. The goal of this work is to assess how perceived security and estimated effort of using such mechanisms, as well as the degree to which app data is considered worth protecting, influence users' choices of appropriate measures to protect app categories. Therefore, we conducted a representative study (n=1024). On the one hand, our results show that a positive correlation between perceived security and effort exists for all investigated non-biometric authentication methods. On the other hand, the study sheds light on the differences between the investigated app categories and the users' choice of the appropriate security mechanisms for the particular category. In contrast to perceived security having a positive influence on a user's preference of mechanism, a relation can hardly be identified for effort. Moreover, app data sensitivity does not seem relevant for the users' choice of security mechanism.

    @inproceedings{reuter_between_2019,
    address = {Hamburg, Germany},
    title = {Between {Effort} and {Security}: {User} {Assessment} of the {Adequacy} of {Security} {Mechanisms} for {App} {Categories}},
    url = {http://www.peasec.de/paper/2019/2019_ReuterHaeusserBienHerbert_EffortSecurity_MuC.pdf},
    doi = {10.1145/3340764.3340770},
    abstract = {With the increasing popularity of the smartphone, the number of people using it for financial transactions such as online shopping, online banking or mobile payment is also growing. Apps used in these contexts store sensitive and valuable data, creating a need for security measures. It has not yet been researched to what extent certain authentication mechanisms, which can be information-, biometric- as well as token-based, are suitable for individual apps and the respective data. The goal of this work is to assess how perceived security and estimated effort of using such mechanisms, as well as the degree to which app data is considered worth protecting, influence users' choices of appropriate measures to protect app categories. Therefore, we conducted a representative study (n=1024). On the one hand, our results show that a positive correlation between perceived security and effort exists for all investigated non-biometric authentication methods. On the other hand, the study sheds light on the differences between the investigated app categories and the users' choice of the appropriate security mechanisms for the particular category. In contrast to perceived security having a positive influence on a user's preference of mechanism, a relation can hardly be identified for effort. Moreover, app data sensitivity does not seem relevant for the users' choice of security mechanism.},
    booktitle = {Mensch und {Computer} 2019},
    publisher = {ACM},
    author = {Reuter, Christian and Häusser, Katja and Bien, Mona and Herbert, Franziska},
    editor = {Alt, Florian and Bulling, Andreas and Döring, Tanja},
    year = {2019},
    keywords = {HCI, Projekt-CRISP, Projekt-CROSSING, Security, Student, UsableSec, Projekt-ATHENE-FANCY},
    pages = {287--297},
    }

  • Gina Maria Schmidbauer-Wolf, Franziska Herbert, Christian Reuter (2019)
    Ein Kampf gegen Windmühlen: qualitative Studie über Informatikabsolvent_innen und ihre Datenprivatheit
    Mensch und Computer 2019 – Workshopband Hamburg, Germany. doi:10.18420/muc2019-ws-302-06
    [BibTeX] [Abstract] [Download PDF]

    How do people protect their own private data? To investigate this question, a qualitative study asked six computer science graduates how they protect the privacy of their data. The aim of the semi-structured interviews was to gain as broad an overview as possible of the techniques and technologies actually used to protect private data. While the assumption was confirmed that all participants were aware of the sensitivity of their private data, their definitions of these private data as well as their behavior to protect them differed. It was observed that extensive knowledge in this area does not necessarily lead to more cautious behavior. Strategies mentioned for protecting one's own data include: staying informed, data minimization, avoiding the products of certain corporations, as well as resignation. The motivations given for the respective behavior were political, philosophical, and utilitarian, as well as fear-driven. The latter can be divided into fear of theft and fear of being treated differently.

    @inproceedings{schmidbauer-wolf_kampf_2019,
    address = {Hamburg, Germany},
    title = {Ein {Kampf} gegen {Windmühlen}: qualitative {Studie} über {Informatikabsolvent}\_innen und ihre {Datenprivatheit}},
    url = {https://dl.gi.de/bitstream/handle/20.500.12116/25168/302-06.pdf},
    doi = {10.18420/muc2019-ws-302-06},
    abstract = {Wie werden eigene private Daten geschützt? Um dieser Frage nachzugehen, wurde in einer qualitativen Studie mit sechs Informatikabsolvent\_innen erfragt, wie diese die Privatheit ihrer Daten schützen. Das Ziel der teilstrukturierten Interviews war es einen möglichst breiten Überblick über tatsächlich verwendete Techniken und Technologien zum Schutz der privaten Daten zu gewinnen. Während sich die Vermutung bestätigte, dass alle Teilnehmer\_innen ein Bewusstsein für die Brisanz ihrer privaten Daten hatten, unterschieden sich die Definitionen ebendieser privaten Daten sowie das Verhalten, um diese zu schützen. Es konnte beobachtet werden, dass viel Wissen in diesem Bereich nicht zwangsläufig zu einem vorsichtigeren Handeln führt. Mögliche genannte Strategien zum Schutz der eigenen Daten sind: Informiert bleiben, Datensparsamkeit, Vermeidung der Produkte bestimmter Konzerne sowie Resignation. Als Motivation für das jeweilige Verhalten wurden sowohl politische, philosophische, utilitaristische, als auch angstgetriebene Gründe genannt. Letztere können in Angst vor Diebstahl und Angst vor Andersbehandlung unterschieden werden.},
    booktitle = {Mensch und {Computer} 2019 - {Workshopband}},
    publisher = {Gesellschaft für Informatik e.V.},
    author = {Schmidbauer-Wolf, Gina Maria and Herbert, Franziska and Reuter, Christian},
    year = {2019},
    keywords = {Security, Projekt-ATHENE-FANCY},
    pages = {256--264},
    }

  • Gina Maria Schmidbauer-Wolf, Franziska Herbert, Christian Reuter (2019)
    Responsible Data Usage in Smart Cities: Privacy in Everyday Life vs. Reacting to Emergency Situations
    SCIENCE PEACE SECURITY ’19 – Proceedings of the Interdisciplinary Conference on Technical Peace and Security Research Darmstadt, Germany.
    [BibTeX] [Abstract] [Download PDF]

    Smart cities want to provide a better life to their citizens, e.g. regarding health care, infrastructure, better safety and security. This can be achieved by using more and new technology and by interconnecting and analysing new and existent devices. Thus, public spaces and buildings will be equipped with more interconnected input and output modalities. This ongoing technologization of public spaces creates opportunities for making everyone's life more secure, while at the same time everyone's personal privacy is endangered. So how is this balancing act tackled and dealt with right now? What fears do citizens have regarding their security as well as their privacy? This paper provides first insights into the topic privacy in smart cities regarding that smart cities need data which can be provided by and of people. The paper raises the question if collecting people's data, and thus enabling smart cities, is ethical and if not, how it can be assured to be ethical.

    @inproceedings{schmidbauer-wolf_responsible_2019,
    address = {Darmstadt, Germany},
    title = {Responsible {Data} {Usage} in {Smart} {Cities}: {Privacy} in {Everyday} {Life} vs. {Reacting} to {Emergency} {Situations}},
    url = {https://tuprints.ulb.tu-darmstadt.de/id/eprint/9164},
    abstract = {Smart cities want to provide a better life to their citizens, e.g. regarding health care, infrastructure, better safety and security. This can be achieved by using more and new technology and by interconnecting and analysing new and existent devices. Thus, public spaces and buildings will be equipped with more interconnected input and output modalities. This ongoing technologization of public spaces creates opportunities for making everyone's life more secure, while at the same time everyone's personal privacy is endangered. So how is this balancing act tackled and dealt with right now? What fears do citizens have regarding their security as well as their privacy? This paper provides first insights into the topic privacy in smart cities regarding that smart cities need data which can be provided by and of people. The paper raises the question if collecting people's data, and thus enabling smart cities, is ethical and if not, how it can be assured to be ethical.},
    booktitle = {{SCIENCE} {PEACE} {SECURITY} '19 - {Proceedings} of the {Interdisciplinary} {Conference} on {Technical} {Peace} and {Security} {Research}},
    publisher = {TUprints},
    author = {Schmidbauer-Wolf, Gina Maria and Herbert, Franziska and Reuter, Christian},
    editor = {Reuter, Christian and Altmann, Jürgen and Göttsche, Malte and Himmel, Mirko},
    year = {2019},
    keywords = {HCI, Projekt-CRISP, Security, UsableSec, Projekt-ATHENE-FANCY},
    pages = {70--74},
    }