Wissenschaftlicher Mitarbeiter / Doktorand

Kontakt: kuehn(at)peasec.tu-darmstadt.de

Technische Universität Darmstadt, Fachbereich Informatik,
Wissenschaft und Technik für Frieden und Sicherheit (PEASEC)
Pankratiusstraße 2, 64289 Darmstadt, Raum 110

Online-Profiles: Homepage | ORCID | Google Scholar

DE

Philipp Kühn, M.Sc. ist wissenschaftlicher Mitarbeiter und Doktorand am Lehrstuhl Wissenschaft und Technik für Frieden und Sicherheit (PEASEC) im Fachbereich Informatik der Technischen Universität Darmstadt. Er ist aktuell in den Projekten SecUrban (2020-2023, BMBF+HMWK) und CYWARN (2020-2023, BMBF) tätig und erforscht primär die Gewinnung von Informationen aus öffentlichen Datenquellen mit Fokus auf IT-Sicherheit sowie deren Aufbereitung und Weiterverarbeitung. Dabei kommen Methoden aus den Bereichen Natural Language Processing und Deep Learning zum Einsatz. Des Weiteren forscht er zu Themenfeldern der zwischenstaatlichen Kooperation im Bereich der IT-Sicherheit.

Er studierte Informatik (B.Sc.) an der Technischen Universität Darmstadt und vertiefte sich im Fach IT-Sicherheit (M.Sc.). Neben seinem Studium arbeitete er als wissenschaftliche Hilfskraft am Fraunhofer-Institut für Sichere Informationstechnologie in den Bereichen Privacy Enhancing Technologies und Distributed Ledger Technologies. Im Rahmen der Hochschuldidaktischen Arbeitsstelle der Technischen Universität Darmstadt bot er Weiterbildungen mit den Schwerpunkten Kommunikation, Sozialkompetenz, Selbstorganisation und Methodenkompetenz an.

EN

Philipp Kühn, M.Sc. is a research associate and doctoral student at the Chair of Science and Technology for Peace and Security (PEASEC) in the Department of Computer Science at Technical University of Darmstadt. He is currently involved in the SecUrban (2020-2023, BMBF+HMWK) and CYWARN (2020-2023, BMBF) projects. His research primarily concerns the extraction of information from public data sources, with a focus on IT security, as well as its preparation and further processing. For this purpose, he uses methods from the fields of Natural Language Processing and Deep Learning. Furthermore, he conducts research on intergovernmental cooperation in the field of IT security.

He studied Computer Science (B.Sc.) at Technical University of Darmstadt and specialized in IT Security (M.Sc.). Alongside his studies, he worked as a student research assistant at the Fraunhofer-Institut für Sichere Informationstechnologie in the areas of Privacy Enhancing Technologies and Distributed Ledger Technologies. As part of the Center for Educational Development and Technology, he offered training courses focusing on communication, social skills, self-organization, and methodological competence.

Publikationen

2024

  • Markus Bayer, Philipp Kuehn, Ramin Shanehsaz, Christian Reuter (2024)
    CySecBERT: A Domain-Adapted Language Model for the Cybersecurity Domain
    ACM Transactions on Privacy and Security (TOPS), 27(2). doi:10.1145/3652594

    The field of cybersecurity is evolving fast. Security professionals are in need of intelligence on past, current and – ideally – on upcoming threats, because attacks are becoming more advanced and are increasingly targeting larger and more complex systems. Since the processing and analysis of such large amounts of information cannot be addressed manually, cybersecurity experts rely on machine learning techniques. In the textual domain, pre-trained language models like BERT have proven to be helpful as they provide a good baseline for further fine-tuning. However, due to the domain-knowledge and the many technical terms in cybersecurity, general language models might miss the gist of textual information. For this reason, we create a high-quality dataset and present a language model specifically tailored to the cybersecurity domain which can serve as a basic building block for cybersecurity systems. The model is compared on 15 tasks: Domain-dependent extrinsic tasks for measuring the performance on specific problems, intrinsic tasks for measuring the performance of the internal representations of the model as well as general tasks from the SuperGLUE benchmark. The results of the intrinsic tasks show that our model improves the internal representation space of domain words compared to the other models. The extrinsic, domain-dependent tasks, consisting of sequence tagging and classification, show that the model performs best in cybersecurity scenarios. In addition, we pay special attention to the choice of hyperparameters against catastrophic forgetting, as pre-trained models tend to forget the original knowledge during further training.

    @article{bayer_cysecbert_2024,
    title = {{CySecBERT}: {A} {Domain}-{Adapted} {Language} {Model} for the {Cybersecurity} {Domain}},
    volume = {27},
    issn = {2471-2566},
    url = {https://peasec.de/paper/2024/2024_BayerKuehnShanesazReuter_CySecBERT_TOPS.pdf},
    doi = {10.1145/3652594},
    abstract = {The field of cybersecurity is evolving fast. Security professionals are in need of intelligence on past, current and - ideally - on upcoming threats, because attacks are becoming more advanced and are increasingly targeting larger and more complex systems. Since the processing and analysis of such large amounts of information cannot be addressed manually, cybersecurity experts rely on machine learning techniques. In the textual domain, pre-trained language models like BERT have proven to be helpful as they provide a good baseline for further fine-tuning. However, due to the domain-knowledge and the many technical terms in cybersecurity, general language models might miss the gist of textual information. For this reason, we create a high-quality dataset and present a language model specifically tailored to the cybersecurity domain which can serve as a basic building block for cybersecurity systems. The model is compared on 15 tasks: Domain-dependent extrinsic tasks for measuring the performance on specific problems, intrinsic tasks for measuring the performance of the internal representations of the model as well as general tasks from the SuperGLUE benchmark. The results of the intrinsic tasks show that our model improves the internal representation space of domain words compared to the other models. The extrinsic, domain-dependent tasks, consisting of sequence tagging and classification, show that the model performs best in cybersecurity scenarios. In addition, we pay special attention to the choice of hyperparameters against catastrophic forgetting, as pre-trained models tend to forget the original knowledge during further training.},
    number = {2},
    journal = {ACM Transactions on Privacy and Security (TOPS)},
    author = {Bayer, Markus and Kuehn, Philipp and Shanehsaz, Ramin and Reuter, Christian},
    month = apr,
    year = {2024},
    note = {Place: New York, NY, USA
    Publisher: Association for Computing Machinery},
    keywords = {Student, Security, UsableSec, Projekt-CYWARN, Projekt-ATHENE-CyAware, Projekt-CYLENCE, A-Paper, Ranking-CORE-A, Ranking-ImpactFactor},
    }
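
    A minimal sketch of the kind of domain-adaptive pre-training described in the entry above: continuing masked-language-model training of a general BERT checkpoint on a cybersecurity text corpus with the Hugging Face libraries. The corpus file, output directory, and hyperparameters are assumptions for illustration, not the actual CySecBERT setup.

    # Illustrative sketch only, not the authors' training code: continue masked-
    # language-model pre-training of a general BERT model on a cybersecurity
    # corpus. "cysec_corpus.txt" and all hyperparameters below are assumptions.
    from datasets import load_dataset
    from transformers import (AutoModelForMaskedLM, AutoTokenizer,
                              DataCollatorForLanguageModeling, Trainer,
                              TrainingArguments)

    tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
    model = AutoModelForMaskedLM.from_pretrained("bert-base-uncased")

    # Hypothetical corpus: one cybersecurity document per line of plain text.
    dataset = load_dataset("text", data_files={"train": "cysec_corpus.txt"})["train"]
    tokenized = dataset.map(
        lambda batch: tokenizer(batch["text"], truncation=True, max_length=512),
        batched=True, remove_columns=["text"],
    )
    collator = DataCollatorForLanguageModeling(tokenizer=tokenizer, mlm_probability=0.15)

    args = TrainingArguments(
        output_dir="cysec-bert",        # assumed output name
        per_device_train_batch_size=16,
        num_train_epochs=1,
        learning_rate=2e-5,             # conservative rate to limit catastrophic forgetting
    )
    Trainer(model=model, args=args, train_dataset=tokenized,
            data_collator=collator).train()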

  • Philipp Kuehn, Dilara Nadermahmoodi, Moritz Kerk, Christian Reuter (2024)
    ThreatCluster: Threat Clustering for Information Overload Reduction in Computer Emergency Response Teams
    arXiv. doi:10.48550/arXiv.2210.14067

    The ever-increasing number of threats and the existing diversity of information sources pose challenges for Computer Emergency Response Teams (CERTs). To respond to emerging threats, CERTs must gather information in a timely and comprehensive manner. But the volume of sources and information leads to information overload. This paper contributes to the question of how to reduce information overload for CERTs. We propose clustering incoming information as scanning this information is one of the most tiresome, but necessary, manual steps. Based on current studies, we establish conditions for such a framework. Different types of evaluation metrics are used and selected in relation to the framework conditions. Furthermore, different document embeddings and distance measures are evaluated and interpreted in combination with clustering methods. We use three different corpora for the evaluation, a novel ground truth corpus based on threat reports, one security bug report (SBR) corpus, and one with news articles. Our work shows, it is possible to reduce the information overload by up to 84.8\% with homogeneous clusters. A runtime analysis of the clustering methods strengthens the decision of selected clustering methods. The source code and dataset will be made publicly available after acceptance.

    @misc{kuehn_threatcluster_2024,
    title = {{ThreatCluster}: {Threat} {Clustering} for {Information} {Overload} {Reduction} in {Computer} {Emergency} {Response} {Teams}},
    shorttitle = {{ThreatCluster}},
    url = {http://arxiv.org/abs/2210.14067},
    doi = {10.48550/arXiv.2210.14067},
    abstract = {The ever-increasing number of threats and the existing diversity of information sources pose challenges for Computer Emergency Response Teams (CERTs). To respond to emerging threats, CERTs must gather information in a timely and comprehensive manner. But the volume of sources and information leads to information overload. This paper contributes to the question of how to reduce information overload for CERTs. We propose clustering incoming information as scanning this information is one of the most tiresome, but necessary, manual steps. Based on current studies, we establish conditions for such a framework. Different types of evaluation metrics are used and selected in relation to the framework conditions. Furthermore, different document embeddings and distance measures are evaluated and interpreted in combination with clustering methods. We use three different corpora for the evaluation, a novel ground truth corpus based on threat reports, one security bug report (SBR) corpus, and one with news articles. Our work shows, it is possible to reduce the information overload by up to 84.8\% with homogeneous clusters. A runtime analysis of the clustering methods strengthens the decision of selected clustering methods. The source code and dataset will be made publicly available after acceptance.},
    urldate = {2024-03-18},
    publisher = {arXiv},
    author = {Kuehn, Philipp and Nadermahmoodi, Dilara and Kerk, Moritz and Reuter, Christian},
    month = mar,
    year = {2024},
    note = {arXiv:2210.14067 [cs]
    version: 2},
    keywords = {Student, Security, UsableSec, Projekt-CYWARN, Projekt-ATHENE-SecUrban},
    }
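
    A minimal sketch of the clustering idea evaluated in the entry above: embed incoming threat-related documents, compute pairwise distances, and group similar items so that analysts only need to scan one representative per cluster. TF-IDF, cosine distance, and agglomerative clustering are just one illustrative combination; the paper compares several embeddings, distance measures, and clustering methods.

    # Illustrative sketch: group similar threat reports to reduce manual scanning.
    from sklearn.cluster import AgglomerativeClustering
    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.metrics.pairwise import cosine_distances

    documents = [   # hypothetical incoming threat reports
        "New ransomware strain targets hospital networks",
        "Ransomware campaign hits healthcare providers across Europe",
        "Critical remote code execution flaw patched in popular web framework",
    ]

    embeddings = TfidfVectorizer(stop_words="english").fit_transform(documents)
    distances = cosine_distances(embeddings)

    clustering = AgglomerativeClustering(
        n_clusters=None,
        metric="precomputed",      # scikit-learn >= 1.2; older releases call this `affinity`
        linkage="average",
        distance_threshold=0.8,    # assumed cut-off, would need tuning
    )
    labels = clustering.fit_predict(distances)

    for label, document in zip(labels, documents):
        print(label, document)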

  • Philipp Kuehn, Kyra Wittorf, Christian Reuter (2024)
    Navigating the Shadows: Manual and Semi-Automated Evaluation of the Dark Web for Cyber Threat Intelligence
    IEEE Access, 12:118903–118922. doi:10.1109/ACCESS.2024.3448247

    In today’s world, cyber-attacks are becoming more frequent and thus proactive protection against them is becoming more important. Cyber Threat Intelligence (CTI) is a possible solution, as it collects threat information in various information sources and derives stakeholder intelligence to protect one’s infrastructure. The current focus of CTI in research is the clear web, but the dark web may contain further information. To further advance protection, this work analyzes the dark web as Open Source Intelligence (OSINT) data source to complement current CTI information. The underlying assumption is that hackers use the dark web to exchange, develop, and share information and assets. This work aims to understand the structure of the dark web and identify the amount of its openly available CTI related information. We conducted a comprehensive literature review for dark web research and CTI. To follow this up we manually investigated and analyzed 65 dark web forum (DWF), 7 single-vendor shops, and 72 dark web marketplace (DWM). We documented the content and relevance of DWFs and DWMs for CTI, as well as challenges during the extraction and provide mitigations. During our investigation we identified IT security relevant information in both DWFs and DWMs, ranging from malware toolboxes to hacking-as-a-service. One of the most present challenges during our manual analysis were necessary interactions to access information and anti-crawling measures, i.e., CAPTCHAs. This analysis showed 88\% of marketplaces and 53\% of forums contained relevant data. Our complementary semi-automated analysis of 1,186,906 onion addresses indicates, that the necessary interaction makes it difficult to see the dark web as an open, but rather treat it as specialized information source, when clear web information does not suffice.

    @article{kuehn_navigating_2024,
    title = {Navigating the {Shadows}: {Manual} and {Semi}-{Automated} {Evaluation} of the {Dark} {Web} for {Cyber} {Threat} {Intelligence}},
    volume = {12},
    issn = {2169-3536},
    shorttitle = {Navigating the {Shadows}},
    url = {https://ieeexplore.ieee.org/document/10643518},
    doi = {10.1109/ACCESS.2024.3448247},
    abstract = {In today’s world, cyber-attacks are becoming more frequent and thus proactive protection against them is becoming more important. Cyber Threat Intelligence (CTI) is a possible solution, as it collects threat information in various information sources and derives stakeholder intelligence to protect one’s infrastructure. The current focus of CTI in research is the clear web, but the dark web may contain further information. To further advance protection, this work analyzes the dark web as Open Source Intelligence (OSINT) data source to complement current CTI information. The underlying assumption is that hackers use the dark web to exchange, develop, and share information and assets. This work aims to understand the structure of the dark web and identify the amount of its openly available CTI related information. We conducted a comprehensive literature review for dark web research and CTI. To follow this up we manually investigated and analyzed 65 dark web forum (DWF), 7 single-vendor shops, and 72 dark web marketplace (DWM). We documented the content and relevance of DWFs and DWMs for CTI, as well as challenges during the extraction and provide mitigations. During our investigation we identified IT security relevant information in both DWFs and DWMs, ranging from malware toolboxes to hacking-as-a-service. One of the most present challenges during our manual analysis were necessary interactions to access information and anti-crawling measures, i.e., CAPTCHAs. This analysis showed 88\% of marketplaces and 53\% of forums contained relevant data. Our complementary semi-automated analysis of 1,186,906 onion addresses indicates, that the necessary interaction makes it difficult to see the dark web as an open, but rather treat it as specialized information source, when clear web information does not suffice.},
    journal = {IEEE Access},
    author = {Kuehn, Philipp and Wittorf, Kyra and Reuter, Christian},
    year = {2024},
    keywords = {Student, Security, UsableSec, Projekt-CYWARN, Projekt-ATHENE-SecUrban, Ranking-CORE-A, Ranking-ImpactFactor},
    pages = {118903--118922},
    }
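
    A minimal sketch of the semi-automated part of the study above: probing onion addresses through a local Tor SOCKS proxy and keeping pages whose text matches security-related keywords. The proxy port, keyword list, and example address are assumptions, and real pages often sit behind logins or CAPTCHAs that such a simple probe cannot handle.

    # Illustrative sketch; requires a running Tor client (default SOCKS port 9050)
    # and `requests[socks]`.
    import requests

    PROXIES = {
        "http": "socks5h://127.0.0.1:9050",   # socks5h resolves .onion names via Tor
        "https": "socks5h://127.0.0.1:9050",
    }
    KEYWORDS = ("exploit", "malware", "credential")   # assumed relevance filter

    def probe(onion_url: str) -> bool:
        """Return True if the page is reachable and mentions an assumed keyword."""
        try:
            response = requests.get(onion_url, proxies=PROXIES, timeout=30)
        except requests.RequestException:
            return False
        page_text = response.text.lower()
        return response.ok and any(keyword in page_text for keyword in KEYWORDS)

    # Placeholder address; real onion addresses are omitted here.
    for address in ["http://exampleonionaddress.onion"]:
        print(address, probe(address))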

  • Christian Reuter, Jonas Franken, Thomas Reinhold, Philipp Kuehn, Marc-André Kaufhold, Thea Riebe, Katrin Hartwig, Tom Biselli, Stefka Schmid, Laura Guntrum, Steffen Haesler (2024)
    Informatik für den Frieden: Perspektive von PEASEC zu 40 Jahren FIfF
    FIfF-Kommunikation, 2024.

    Fortschritte in Wissenschaft und Technik, besonders der Informatik, spielen im Kontext von Frieden und Sicherheit eine essenzielle Rolle. Der Lehrstuhl Wissenschaft und Technik für Frieden und Sicherheit (PEASEC) an der Technischen Universität Darmstadt verbindet Informatik mit Friedens-, Konflikt- und Sicherheitsforschung.

    @techreport{reuter_informatik_2024,
    address = {FIfF-Kommunikation},
    title = {Informatik für den {Frieden}: {Perspektive} von {PEASEC} zu 40 {Jahren} {FIfF}},
    url = {https://peasec.de/paper/2024/2024_Reuteretal_InformatikFuerFrieden_fiff.pdf},
    abstract = {Fortschritte in Wissenschaft und Technik, besonders der Informatik, spielen im Kontext von Frieden und Sicherheit eine essenzielle Rolle. Der Lehrstuhl Wissenschaft und Technik für Frieden und Sicherheit (PEASEC) an der Technischen Universität Darmstadt verbindet Informatik mit Friedens-, Konflikt- und Sicherheitsforschung.},
    author = {Reuter, Christian and Franken, Jonas and Reinhold, Thomas and Kuehn, Philipp and Kaufhold, Marc-André and Riebe, Thea and Hartwig, Katrin and Biselli, Tom and Schmid, Stefka and Guntrum, Laura and Haesler, Steffen},
    year = {2024},
    keywords = {Peace, Security},
    }

2023

  • Philipp Kuehn, Mike Schmidt, Markus Bayer, Christian Reuter (2023)
    ThreatCrawl: A BERT-based Focused Crawler for the Cybersecurity Domain
    arXiv preprint arXiv:2304.11960, 2023.

    Publicly available information contains valuable information for Cyber Threat Intelligence (CTI). This can be used to prevent attacks that have already taken place on other systems. Ideally, only the initial attack succeeds and all subsequent ones are detected and stopped. But while there are different standards to exchange this information, a lot of it is shared in articles or blog posts in non-standardized ways. Manually scanning through multiple online portals and news pages to discover new threats and extracting them is a time-consuming task. To automize parts of this scanning process, multiple papers propose extractors that use Natural Language Processing (NLP) to extract Indicators of Compromise (IOCs) from documents. However, while this already solves the problem of extracting the information out of documents, the search for these documents is rarely considered. In this paper, a new focused crawler is proposed called ThreatCrawl, which uses Bidirectional Encoder Representations from Transformers (BERT)-based models to classify documents and adapt its crawling path dynamically. While ThreatCrawl has difficulties to classify the specific type of Open Source Intelligence (OSINT) named in texts, e.g., IOC content, it can successfully find relevant documents and modify its path accordingly. It yields harvest rates of up to 52\%, which are, to the best of our knowledge, better than the current state of the art.

    @techreport{kuehn_threatcrawl_2023,
    title = {{ThreatCrawl}: {A} {BERT}-based {Focused} {Crawler} for the {Cybersecurity} {Domain}},
    shorttitle = {{ThreatCrawl}},
    url = {http://arxiv.org/abs/2304.11960},
    abstract = {Publicly available information contains valuable information for Cyber Threat Intelligence (CTI). This can be used to prevent attacks that have already taken place on other systems. Ideally, only the initial attack succeeds and all subsequent ones are detected and stopped. But while there are different standards to exchange this information, a lot of it is shared in articles or blog posts in non-standardized ways. Manually scanning through multiple online portals and news pages to discover new threats and extracting them is a time-consuming task. To automize parts of this scanning process, multiple papers propose extractors that use Natural Language Processing (NLP) to extract Indicators of Compromise (IOCs) from documents. However, while this already solves the problem of extracting the information out of documents, the search for these documents is rarely considered. In this paper, a new focused crawler is proposed called ThreatCrawl, which uses Bidirectional Encoder Representations from Transformers (BERT)-based models to classify documents and adapt its crawling path dynamically. While ThreatCrawl has difficulties to classify the specific type of Open Source Intelligence (OSINT) named in texts, e.g., IOC content, it can successfully find relevant documents and modify its path accordingly. It yields harvest rates of up to 52\%, which are, to the best of our knowledge, better than the current state of the art.},
    number = {arXiv:2304.11960},
    urldate = {2023-04-27},
    institution = {arXiv},
    author = {Kuehn, Philipp and Schmidt, Mike and Bayer, Markus and Reuter, Christian},
    month = apr,
    year = {2023},
    note = {arXiv:2304.11960 [cs]},
    keywords = {Student, Security, Projekt-CYWARN, Projekt-ATHENE-SecUrban},
    }
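
    A minimal sketch of the focused-crawling loop described in the entry above: fetch a page, classify its text with a BERT-based relevance classifier, and only follow out-links of pages judged relevant. The classifier checkpoint and label scheme are placeholders, not the model trained in the paper, and politeness, deduplication, and scope handling are omitted.

    # Illustrative sketch of a BERT-guided focused crawler.
    from collections import deque
    from urllib.parse import urljoin

    import requests
    from bs4 import BeautifulSoup
    from transformers import pipeline

    # Placeholder checkpoint; the paper trains its own BERT-based relevance model.
    classifier = pipeline("text-classification", model="path/to/relevance-bert")

    def crawl(seed_urls, max_pages=50):
        queue, seen, visited = deque(seed_urls), set(seed_urls), 0
        while queue and visited < max_pages:
            url = queue.popleft()
            visited += 1
            try:
                html = requests.get(url, timeout=10).text
            except requests.RequestException:
                continue
            soup = BeautifulSoup(html, "html.parser")
            page_text = soup.get_text(" ", strip=True)[:2000]   # truncate for BERT
            prediction = classifier(page_text, truncation=True)[0]
            if prediction["label"] != "RELEVANT":                # assumed label scheme
                continue                                         # prune this crawl path
            for anchor in soup.find_all("a", href=True):
                link = urljoin(url, anchor["href"])
                if link not in seen:
                    seen.add(link)
                    queue.append(link)

    crawl(["https://example.com/security-blog"])                 # hypothetical seed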

  • Philipp Kuehn, David N. Relke, Christian Reuter (2023)
    Common vulnerability scoring system prediction based on open source intelligence information sources
    Computers & Security. doi:10.1016/j.cose.2023.103286

    The number of newly published vulnerabilities is constantly increasing. Until now, the information available when a new vulnerability is published is manually assessed by experts using a CVSS vector and score. This assessment is time consuming and requires expertise. Various works already try to predict vectors or scores using machine learning based on the textual descriptions of the vulnerability to enable faster assessment. However, for this purpose, previous works only use the texts available in databases such as the NVD. With this work, the publicly available web pages referenced in the NVD are analyzed and made available as sources of texts through web scraping. A deep learning (DL) based method for predicting the CVSS vector is implemented and evaluated. The present work provides a classification of the NVD’s reference texts based on the suitability and crawlability of their texts. While we identified the overall influence of the additional texts is negligible, we outperformed the state-of-the-art with our DL prediction models.

    @article{kuehn_common_2023,
    title = {Common vulnerability scoring system prediction based on open source intelligence information sources},
    url = {https://peasec.de/paper/2023/2023_KuehnRelkeReuter_CommonVulnerabilityScoringSystemOSINT_CompSec.pdf},
    doi = {10.1016/j.cose.2023.103286},
    abstract = {The number of newly published vulnerabilities is constantly increasing. Until now, the information available when a new vulnerability is published is manually assessed by experts using a CVSS vector and score. This assessment is time consuming and requires expertise. Various works already try to predict vectors or scores using machine learning based on the textual descriptions of the vulnerability to enable faster assessment. However, for this purpose, previous works only use the texts available in databases such as the NVD. With this work, the publicly available web pages referenced in the NVD are analyzed and made available as sources of texts through web scraping. A deep learning (DL) based method for predicting the CVSS vector is implemented and evaluated. The present work provides a classification of the NVD’s reference texts based on the suitability and crawlability of their texts. While we identified the overall influence of the additional texts is negligible, we outperformed the state-of-the-art with our DL prediction models.},
    journal = {Computers \& Security},
    author = {Kuehn, Philipp and Relke, David N. and Reuter, Christian},
    year = {2023},
    keywords = {Student, Security, UsableSec, Projekt-CYWARN, Projekt-ATHENE-SecUrban, Ranking-ImpactFactor, Ranking-CORE-B},
    }
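
    A minimal sketch of the underlying prediction task in the entry above: map a vulnerability's textual description to categorical CVSS base-metric values. This toy baseline trains one linear classifier per metric on TF-IDF features and made-up examples; the paper itself uses deep learning models and additional texts crawled from NVD references.

    # Illustrative toy baseline for CVSS base-metric prediction from text.
    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.linear_model import LogisticRegression

    # Made-up training examples: description text -> CVSS base-metric values.
    descriptions = [
        "Remote attackers can execute arbitrary code via a crafted HTTP request.",
        "A local user can escalate privileges through an insecure temporary file.",
        "Unauthenticated remote attackers can read arbitrary files from the server.",
    ]
    metrics = {
        "AttackVector": ["NETWORK", "LOCAL", "NETWORK"],
        "PrivilegesRequired": ["NONE", "LOW", "NONE"],
    }

    vectorizer = TfidfVectorizer(ngram_range=(1, 2))
    features = vectorizer.fit_transform(descriptions)

    # One classifier per CVSS base metric (toy baseline; the paper uses DL models).
    classifiers = {name: LogisticRegression(max_iter=1000).fit(features, values)
                   for name, values in metrics.items()}

    new_description = vectorizer.transform(
        ["Remote code execution via a malformed network packet."])
    for name, clf in classifiers.items():
        print(name, clf.predict(new_description)[0])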

  • Thomas Reinhold, Philipp Kuehn, Daniel Günther, Thomas Schneider, Christian Reuter (2023)
    ExTRUST: Reducing Exploit Stockpiles With a Privacy-Preserving Depletion Systems for Inter-State Relationships
    IEEE Transactions on Technology and Society, 4(2):158–170. doi:10.1109/TTS.2023.3280356

    Cyberspace is a fragile construct threatened by malicious cyber operations of different actors, with vulnerabilities in IT hardware and software forming the basis for such activities, thus also posing a threat to global IT security. Advancements in the field of artificial intelligence accelerate this development, either with artificial intelligence enabled cyber weapons, automated cyber defense measures, or artificial intelligence-based threat and vulnerability detection. Especially state actors, with their long-term strategic security interests, often stockpile such knowledge of vulnerabilities and exploits to enable their military or intelligence service cyberspace operations. While treaties and regulations to limit these developments and to enhance global IT security by disclosing vulnerabilities are currently being discussed on the international level, these efforts are hindered by state concerns about the disclosure of unique knowledge and about giving up tactical advantages. This leads to a situation where multiple states are likely to stockpile at least some identical exploits, with technical measures to enable a depletion process for these stockpiles that preserve state secrecy interests and consider the special constraints of interacting states as well as the requirements within such environments being non-existent. This paper proposes such a privacy-preserving approach that allows multiple state parties to privately compare their stock of vulnerabilities and exploits to check for items that occur in multiple stockpiles without revealing them so that their disclosure can be considered. We call our system ExTRUST and show that it is scalable and can withstand several attack scenarios. Beyond the intergovernmental setting, ExTRUST can also be used for other zero-trust use cases, such as bug-bounty programs.

    @article{reinhold_extrust_2023,
    title = {{ExTRUST}: {Reducing} {Exploit} {Stockpiles} {With} a {Privacy}-{Preserving} {Depletion} {Systems} for {Inter}-{State} {Relationships}},
    volume = {4},
    url = {https://peasec.de/paper/2023/2023_ReinholdKuehnGuentherSchneiderReuter_ExTrust-ehem-BlockED_TTaS.pdf},
    doi = {10.1109/TTS.2023.3280356},
    abstract = {Cyberspace is a fragile construct threatened by malicious cyber operations of different actors, with vulnerabilities in IT hardware and software forming the basis for such activities, thus also posing a threat to global IT security. Advancements in the field of artificial intelligence accelerate this development, either with artificial intelligence enabled cyber weapons, automated cyber defense measures, or artificial intelligence-based threat and vulnerability detection. Especially state actors, with their long-term strategic security interests, often stockpile such knowledge of vulnerabilities and exploits to enable their military or intelligence service cyberspace operations. While treaties and regulations to limit these developments and to enhance global IT security by disclosing vulnerabilities are currently being discussed on the international level, these efforts are hindered by state concerns about the disclosure of unique knowledge and about giving up tactical advantages. This leads to a situation where multiple states are likely to stockpile at least some identical exploits, with technical measures to enable a depletion process for these stockpiles that preserve state secrecy interests and consider the special constraints of interacting states as well as the requirements within such environments being non-existent. This paper proposes such a privacy-preserving approach that allows multiple state parties to privately compare their stock of vulnerabilities and exploits to check for items that occur in multiple stockpiles without revealing them so that their disclosure can be considered. We call our system ExTRUST and show that it is scalable and can withstand several attack scenarios. Beyond the intergovernmental setting, ExTRUST can also be used for other zero-trust use cases, such as bug-bounty programs.},
    number = {2},
    journal = {IEEE Transactions on Technology and Society},
    author = {Reinhold, Thomas and Kuehn, Philipp and Günther, Daniel and Schneider, Thomas and Reuter, Christian},
    year = {2023},
    keywords = {Peace, Student, Projekt-ATHENE-SecUrban, Projekt-CROSSING, A-Paper, Selected, Cyberwar, AuswahlPeace, Projekt-GRKPrivacy},
    pages = {158--170},
    }
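
    A minimal sketch of the functionality targeted by ExTRUST in the entry above: several parties want to learn which items occur in more than one exploit stockpile. The sketch computes this in the clear over hashed identifiers purely for illustration; the paper's contribution is computing the overlap without revealing non-matching items, which requires the cryptographic protocols this toy example deliberately omits.

    # Toy illustration only: the joint functionality the parties want (intersection
    # of exploit stockpiles), computed here in the clear over hashed identifiers.
    # Hashing guessable identifiers is NOT privacy-preserving; ExTRUST achieves
    # this with cryptographic protocols that this sketch deliberately omits.
    import hashlib

    def digest(identifier: str) -> str:
        """Canonicalize and hash a stockpile item (e.g., a CVE ID)."""
        return hashlib.sha256(identifier.strip().lower().encode()).hexdigest()

    # Hypothetical stockpiles of two state parties.
    stockpile_a = {digest(x) for x in ["CVE-2021-0001", "CVE-2022-1234"]}
    stockpile_b = {digest(x) for x in ["CVE-2022-1234", "CVE-2023-9999"]}

    shared = stockpile_a & stockpile_b   # candidates both parties could disclose
    print(f"{len(shared)} item(s) occur in both stockpiles")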

2022

  • Ali Sercan Basyurt, Jennifer Fromm, Philipp Kuehn, Marc-André Kaufhold, Milad Mirbabaie (2022)
    Help Wanted – Challenges in Data Collection, Analysis and Communication of Cyber Threats in Security Operation Centers
    Proceedings of the International Conference on Wirtschaftsinformatik (WI), Nürnberg.

    Security Operation Centers are tasked with collecting and analyzing cyber threat data from multiple sources to communicate warning messages and solutions. These tasks are extensive and resource consuming, which makes supporting approaches valuable to experts. However, to implement such approaches, information about the challenges these experts face while performing these tasks is necessary. We therefore conducted semi-structured expert interviews to identify these challenges. By doing so, valuable insights into these challenges based on expert knowledge is acquired, which in return could be leveraged to develop automated approaches to support experts and address these challenges.

    @inproceedings{basyurt_help_2022,
    address = {Nürnberg},
    title = {Help {Wanted} - {Challenges} in {Data} {Collection}, {Analysis} and {Communication} of {Cyber} {Threats} in {Security} {Operation} {Centers}},
    url = {http://www.peasec.de/paper/2022/2022_BasyourtFrommKuehnKaufholdMirabaie_HelpWantedChallengesDataCollectionAnalysisCommunication_WI.pdf},
    abstract = {Security Operation Centers are tasked with collecting and analyzing cyber threat data from multiple sources to communicate warning messages and solutions. These tasks are extensive and resource consuming, which makes supporting approaches valuable to experts. However, to implement such approaches, information about the challenges these experts face while performing these tasks is necessary. We therefore conducted semi-structured expert interviews to identify these challenges. By doing so, valuable insights into these challenges based on expert knowledge is acquired, which in return could be leveraged to develop automated approaches to support experts and address these challenges.},
    booktitle = {Proceedings of the {International} {Conference} on {Wirtschaftsinformatik} ({WI})},
    author = {Basyurt, Ali Sercan and Fromm, Jennifer and Kuehn, Philipp and Kaufhold, Marc-André and Mirbabaie, Milad},
    year = {2022},
    keywords = {Security, Projekt-CYWARN, Ranking-CORE-C},
    }

  • Philipp Kuehn, Julian Bäumler, Marc-André Kaufhold, Marc Wendelborn, Christian Reuter (2022)
    The Notion of Relevance in Cybersecurity: A Categorization of Security Tools and Deduction of Relevance Notions
    Mensch und Computer 2022 – Workshopband, Darmstadt. doi:10.18420/muc2022-mci-ws01-220

    Proper cybersecurity requires timely information to defend the IT infrastructure. In a dynamic field like cybersecurity, gathering up-to-date information is usually a manual, time-consuming, and exhaustive task. Automatic and usable approaches are supposed to be a solution to this problem, but for this, they require a notion of information relevance to distinguish relevant from irrelevant information. First, on the basis of a literature review, this paper proposes a novel cybersecurity tool categorization based on corresponding tool types with their respective definitions and core features. Second, it elaborates information used in each category and deduces notions of relevance. Third, it outlines how these findings informed the design of a security dashboard to guide computer emergency response team staff in identifying current threats in open source intelligence sources while mitigating information overload.

    @inproceedings{kuehn_notion_2022,
    address = {Darmstadt},
    series = {Mensch und {Computer} 2022 - {Workshopband}},
    title = {The {Notion} of {Relevance} in {Cybersecurity}: {A} {Categorization} of {Security} {Tools} and {Deduction} of {Relevance} {Notions}},
    url = {https://dl.gi.de/handle/20.500.12116/39072},
    doi = {10.18420/muc2022-mci-ws01-220},
    abstract = {Proper cybersecurity requires timely information to defend the IT infrastructure. In a dynamic field like cybersecurity, gathering up-to-date information is usually a manual, time-consuming, and exhaustive task. Automatic and usable approaches are supposed to be a solution to this problem, but for this, they require a notion of information relevance to distinguish relevant from irrelevant information. First, on the basis of a literature review, this paper proposes a novel cybersecurity tool categorization based on corresponding tool types with their respective definitions and core features. Second, it elaborates information used in each category and deduces notions of relevance. Third, it outlines how these findings informed the design of a security dashboard to guide computer emergency response team staff in identifying current threats in open source intelligence sources while mitigating information overload.},
    language = {en},
    booktitle = {Mensch und {Computer} 2022 - {Workshopband}},
    publisher = {Gesellschaft für Informatik},
    author = {Kuehn, Philipp and Bäumler, Julian and Kaufhold, Marc-André and Wendelborn, Marc and Reuter, Christian},
    year = {2022},
    keywords = {Student, Security, UsableSec, Projekt-CYWARN, Projekt-ATHENE-SecUrban},
    }

  • Thea Riebe, Philipp Kuehn, Philipp Imperatori, Christian Reuter (2022)
    U.S. Security Policy: The Dual-Use Regulation of Cryptography and its Effects on Surveillance
    European Journal for Security Research (EJSR). doi:10.1007/s41125-022-00080-0

    Cryptography has become ubiquitous in communication technology and is considered a necessary part of information security. However, both the regulation to restrict access to cryptography, as well as practices to weaken or break encryption, are part of the States’ security policies. The United States (U.S.) regulate cryptography for export in international trade as a dual-use good. However, the regulation has been increasingly loosened and transferred to bilateral agreements with Information and Communication Technology companies. At the same time, the National Security Agency attempted to implement a government encryption standard to guarantee itself easier access to data, thus progressively expanding surveillance on non-U.S. citizens. In this paper, using comparative policy analysis, we examine the evolution of both security policies by tracing the historical development of U.S. regulation of cryptography as a dual-use good, and surveillance technologies, and practices used from the 1990s to today. We conclude that the impact of the dual-use regulation has affected the efficiency of surveillance technology, by loosening regulations only for mass communication services, thereby supporting the proliferation of surveillance intermediaries, while working on strategies to collaborate and exploit their coverage.

    @article{riebe_us_2022,
    title = {U.{S}. {Security} {Policy}: {The} {Dual}-{Use} {Regulation} of {Cryptography} and its {Effects} on {Surveillance}},
    url = {https://link.springer.com/content/pdf/10.1007/s41125-022-00080-0.pdf},
    doi = {10.1007/s41125-022-00080-0},
    abstract = {Cryptography has become ubiquitous in communication technology and is considered a necessary part of information security. However, both the regulation to restrict access to cryptography, as well as practices to weaken or break encryption, are part of the States’ security policies. The United States (U.S.) regulate cryptography for export in international trade as a dual-use good. However, the regulation has been increasingly loosened and transferred to bilateral agreements with Information and Communication Technology companies. At the same time, the National Security Agency attempted to implement a government encryption standard to guarantee itself easier access to data, thus progressively expanding surveillance on non-U.S. citizens. In this paper, using comparative policy analysis, we examine the evolution of both security policies by tracing the historical development of U.S. regulation of cryptography as a dual-use good, and surveillance technologies, and practices used from the 1990s to today. We conclude that the impact of the dual-use regulation has affected the efficiency of surveillance technology, by loosening regulations only for mass communication services, thereby supporting the proliferation of surveillance intermediaries, while working on strategies to collaborate and exploit their coverage.},
    journal = {European Journal for Security Research (EJSR)},
    author = {Riebe, Thea and Kuehn, Philipp and Imperatori, Philipp and Reuter, Christian},
    year = {2022},
    keywords = {Student, Security, Projekt-CYWARN, Projekt-KontiKat},
    }

2021

  • Rolf Egert, Nina Gerber, Jasmin Haunschild, Philipp Kuehn, Verena Zimmermann (2021)
    Towards Resilient Critical Infrastructures – Motivating Users to Contribute to Smart Grid Resilience
    i-com – Journal of Interactive Media, 20(2):161–175. doi:10.1515/icom-2021-0021

    Smart cities aim at improving efficiency while providing safety and security by merging conventional infrastructures with information and communication technology. One strategy for mitigating hazardous situations and improving the overall resilience of the system is to involve citizens. For instance, smart grids involve prosumers – capable of producing and consuming electricity – who can adjust their electricity profile dynamically (i.e., decrease or increase electricity consumption), or use their local production to supply electricity to the grid. This mitigates the impact of peak-consumption periods on the grid and makes it easier for operators to control the grid. This involvement of prosumers is accompanied by numerous socio-technical challenges, including motivating citizens to contribute by adjusting their electricity consumption to the requirements of the energy grid. Towards this end, this work investigates motivational strategies and tools, including nudging, persuasive technologies, and incentives, that can be leveraged to increase the motivation of citizens. We discuss long-term and side effects and ethical and privacy considerations, before portraying bug bounty programs, gamification and apps as technologies and strategies to communicate the motivational strategies to citizens.

    @article{egert_towards_2021,
    series = {i-com},
    title = {Towards {Resilient} {Critical} {Infrastructures} - {Motivating} {Users} to {Contribute} to {Smart} {Grid} {Resilience}},
    volume = {20},
    url = {https://www.degruyter.com/document/doi/10.1515/icom-2021-0021/html},
    doi = {10.1515/icom-2021-0021},
    abstract = {Smart cities aim at improving efficiency while providing safety and security by merging conventional infrastructures with information and communication technology. One strategy for mitigating hazardous situations and improving the overall resilience of the system is to involve citizens. For instance, smart grids involve prosumers - capable of producing and consuming electricity - who can adjust their electricity profile dynamically (i.e., decrease or increase electricity consumption), or use their local production to supply electricity to the grid. This mitigates the impact of peak-consumption periods on the grid and makes it easier for operators to control the grid. This involvement of prosumers is accompanied by numerous socio-technical challenges, including motivating citizens to contribute by adjusting their electricity consumption to the requirements of the energy grid. Towards this end, this work investigates motivational strategies and tools, including nudging, persuasive technologies, and incentives, that can be leveraged to increase the motivation of citizens. We discuss long-term and side effects and ethical and privacy considerations, before portraying bug bounty programs, gamification and apps as technologies and strategies to communicate the motivational strategies to citizens.},
    number = {2},
    journal = {i-com - Journal of Interactive Media},
    author = {Egert, Rolf and Gerber, Nina and Haunschild, Jasmin and Kuehn, Philipp and Zimmermann, Verena},
    year = {2021},
    keywords = {Security, Projekt-CYWARN, Projekt-emergenCITY, Projekt-ATHENE-SecUrban, Infrastructure},
    pages = {161--175},
    }

  • Marc-André Kaufhold, Jennifer Fromm, Thea Riebe, Milad Mirbabaie, Philipp Kuehn, Ali Sercan Basyurt, Markus Bayer, Marc Stöttinger, Kaan Eyilmez, Reinhard Möller, Christoph Fuchß, Stefan Stieglitz, Christian Reuter (2021)
    CYWARN: Strategy and Technology Development for Cross-Platform Cyber Situational Awareness and Actor-Specific Cyber Threat Communication
    Mensch und Computer 2021 – Workshopband, Bonn. doi:10.18420/muc2021-mci-ws08-263

    Despite the merits of digitisation in private and professional spaces, critical infrastructures and societies are increasingly exposed to cyberattacks. Thus, Computer Emergency Response Teams (CERTs) are deployed in many countries and organisations to enhance the preventive and reactive capabilities against cyberattacks. However, their tasks are getting more complex by the increasing amount and varying quality of information disseminated into public channels. Adopting the perspectives of Crisis Informatics and safety-critical Human-Computer Interaction (HCI) and based on both a narrative literature review and group discussions, this paper first outlines the research agenda of the CYWARN project, which seeks to design strategies and technologies for cross-platform cyber situational awareness and actor-specific cyber threat communication. Second, it identifies and elaborates eight research challenges with regard to the monitoring, analysis and communication of cyber threats in CERTs, which serve as a starting point for in-depth research within the project.

    @inproceedings{kaufhold_cywarn_2021,
    address = {Bonn},
    series = {Mensch und {Computer} 2021 - {Workshopband}},
    title = {{CYWARN}: {Strategy} and {Technology} {Development} for {Cross}-{Platform} {Cyber} {Situational} {Awareness} and {Actor}-{Specific} {Cyber} {Threat} {Communication}},
    url = {https://dl.gi.de/server/api/core/bitstreams/8f470f6b-5050-4fb9-b923-d08cf84c17b7/content},
    doi = {10.18420/muc2021-mci-ws08-263},
    abstract = {Despite the merits of digitisation in private and professional spaces, critical infrastructures and societies are increasingly exposed to cyberattacks. Thus, Computer Emergency Response Teams (CERTs) are deployed in many countries and organisations to enhance the preventive and reactive capabilities against cyberattacks. However, their tasks are getting more complex by the increasing amount and varying quality of information disseminated into public channels. Adopting the perspectives of Crisis Informatics and safety-critical Human-Computer Interaction (HCI) and based on both a narrative literature review and group discussions, this paper first outlines the research agenda of the CYWARN project, which seeks to design strategies and technologies for cross-platform cyber situational awareness and actor-specific cyber threat communication. Second, it identifies and elaborates eight research challenges with regard to the monitoring, analysis and communication of cyber threats in CERTs, which serve as a starting point for in-depth research within the project.},
    booktitle = {Mensch und {Computer} 2021 - {Workshopband}},
    publisher = {Gesellschaft für Informatik},
    author = {Kaufhold, Marc-André and Fromm, Jennifer and Riebe, Thea and Mirbabaie, Milad and Kuehn, Philipp and Basyurt, Ali Sercan and Bayer, Markus and Stöttinger, Marc and Eyilmez, Kaan and Möller, Reinhard and Fuchß, Christoph and Stieglitz, Stefan and Reuter, Christian},
    year = {2021},
    keywords = {Security, Projekt-CYWARN},
    }

  • Philipp Kuehn, Markus Bayer, Marc Wendelborn, Christian Reuter (2021)
    OVANA: An Approach to Analyze and Improve the Information Quality of Vulnerability Databases
    Proceedings of the 16th International Conference on Availability, Reliability and Security (ARES 2021). doi:10.1145/3465481.3465744

    Vulnerability databases are one of the main information sources for IT security experts. Hence, the quality of their information is of utmost importance for anyone working in this area. Previous work has shown that machine readable information is either missing, incorrect, or inconsistent with other data sources. In this paper, we introduce a system called Overt Vulnerability source ANAlysis (OVANA), utilizing state-of-the-art machine learning (ML) and natural-language processing (NLP) techniques, which analyzes the information quality (IQ) of vulnerability databases, searches the free-form description for relevant information missing from structured fields, and updates it accordingly. Our paper shows that OVANA is able to improve the IQ of the National Vulnerability Database by 51.23\% based on the indicators of accuracy, completeness, and uniqueness. Moreover, we present information which should be incorporated into the structured fields to increase the uniqueness of vulnerability entries and improve the discriminability of different vulnerability entries. The identified information from OVANA enables a more targeted vulnerability search and provides guidance for IT security experts in finding relevant information in vulnerability descriptions for severity assessment.

    @inproceedings{kuehn_ovana_2021,
    title = {{OVANA}: {An} {Approach} to {Analyze} and {Improve} the {Information} {Quality} of {Vulnerability} {Databases}},
    isbn = {978-1-4503-9051-4},
    url = {https://peasec.de/paper/2021/2021_KuehnBayerWendelbornReuter_OVANAQualityVulnerabilityDatabases_ARES.pdf},
    doi = {10.1145/3465481.3465744},
    abstract = {Vulnerability databases are one of the main information sources for IT security experts. Hence, the quality of their information is of utmost importance for anyone working in this area. Previous work has shown that machine readable information is either missing, incorrect, or inconsistent with other data sources. In this paper, we introduce a system called Overt Vulnerability source ANAlysis (OVANA), utilizing state-of-the-art machine learning (ML) and natural-language processing (NLP) techniques, which analyzes the information quality (IQ) of vulnerability databases, searches the free-form description for relevant information missing from structured fields, and updates it accordingly. Our paper shows that OVANA is able to improve the IQ of the National Vulnerability Database by 51.23\% based on the indicators of accuracy, completeness, and uniqueness. Moreover, we present information which should be incorporated into the structured fields to increase the uniqueness of vulnerability entries and improve the discriminability of different vulnerability entries. The identified information from OVANA enables a more targeted vulnerability search and provides guidance for IT security experts in finding relevant information in vulnerability descriptions for severity assessment.},
    booktitle = {Proceedings of the 16th {International} {Conference} on {Availability}, {Reliability} and {Security} ({ARES} 2021)},
    publisher = {ACM},
    author = {Kuehn, Philipp and Bayer, Markus and Wendelborn, Marc and Reuter, Christian},
    year = {2021},
    keywords = {Peace, Security, Projekt-CYWARN, Projekt-ATHENE-SecUrban, AuswahlPeace, Ranking-CORE-B},
    pages = {1--11},
    }
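
    A minimal, purely rule-based sketch of the extraction step described in the OVANA entry above: search a free-form vulnerability description for information (here affected versions and CWE identifiers) that could fill structured fields. OVANA itself relies on ML and NLP techniques and covers more fields; the patterns and example description below are illustrative assumptions.

    # Illustrative sketch: pull candidate structured facts out of free-form text.
    import re

    DESCRIPTION = (
        "A buffer overflow in ExampleServer before 2.4.1 allows remote attackers "
        "to execute arbitrary code (CWE-120) via a crafted request."
    )

    # Illustrative patterns only; OVANA uses ML/NLP models for this step.
    VERSION_PATTERN = re.compile(r"\b(?:before|prior to|through)\s+(\d+(?:\.\d+)+)")
    CWE_PATTERN = re.compile(r"\bCWE-\d+\b")

    def extract(description: str) -> dict:
        """Collect candidate values for structured vulnerability fields."""
        return {
            "affected_versions": VERSION_PATTERN.findall(description),
            "cwe_ids": CWE_PATTERN.findall(description),
        }

    print(extract(DESCRIPTION))
    # expected: {'affected_versions': ['2.4.1'], 'cwe_ids': ['CWE-120']}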

  • Thea Riebe, Tristan Wirth, Markus Bayer, Philipp Kuehn, Marc-André Kaufhold, Volker Knauthe, Stefan Guthe, Christian Reuter (2021)
    CySecAlert: An Alert Generation System for Cyber Security Events Using Open Source Intelligence Data
    Information and Communications Security (ICICS). doi:10.1007/978-3-030-86890-1_24

    Receiving relevant information on possible cyber threats, attacks, and data breaches in a timely manner is crucial for early response. The social media platform Twitter hosts an active cyber security community. Their activities are often monitored manually by security experts, such as Computer Emergency Response Teams (CERTs). We thus propose a Twitter-based alert generation system that issues alerts to a system operator as soon as new relevant cyber security related topics emerge. Thereby, our system allows us to monitor user accounts with significantly less workload. Our system applies a supervised classifier, based on active learning, that detects tweets containing relevant information. The results indicate that uncertainty sampling can reduce the amount of manual relevance classification effort and enhance the classifier performance substantially compared to random sampling. Our approach reduces the number of accounts and tweets that are needed for the classifier training, thus making the tool easily and rapidly adaptable to the specific context while also supporting data minimization for Open Source Intelligence (OSINT). Relevant tweets are clustered by a greedy stream clustering algorithm in order to identify significant events. The proposed system is able to work near real-time within the required 15-minutes time frame and detects up to 93.8\% of relevant events with a false alert rate of 14.81\%.

    @inproceedings{riebe_cysecalert_2021,
    title = {{CySecAlert}: {An} {Alert} {Generation} {System} for {Cyber} {Security} {Events} {Using} {Open} {Source} {Intelligence} {Data}},
    url = {https://peasec.de/paper/2021/2021_RiebeWirthBayerKuehnKaufholdKnautheGutheReuter_CySecAlertOpenSourceIntelligence_ICICS.pdf},
    doi = {10.1007/978-3-030-86890-1_24},
    abstract = {Receiving relevant information on possible cyber threats, attacks, and data breaches in a timely manner is crucial for early response. The social media platform Twitter hosts an active cyber security community. Their activities are often monitored manually by security experts, such as Computer Emergency Response Teams (CERTs). We thus propose a Twitter-based alert generation system that issues alerts to a system operator as soon as new relevant cyber security related topics emerge. Thereby, our system allows us to monitor user accounts with significantly less workload. Our system applies a supervised classifier, based on active learning, that detects tweets containing relevant information. The results indicate that uncertainty sampling can reduce the amount of manual relevance classification effort and enhance the classifier performance substantially compared to random sampling. Our approach reduces the number of accounts and tweets that are needed for the classifier training, thus making the tool easily and rapidly adaptable to the specific context while also supporting data minimization for Open Source Intelligence (OSINT). Relevant tweets are clustered by a greedy stream clustering algorithm in order to identify significant events. The proposed system is able to work near real-time within the required 15-minutes time frame and detects up to 93.8\% of relevant events with a false alert rate of 14.81\%.},
    booktitle = {Information and {Communications} {Security} ({ICICS})},
    author = {Riebe, Thea and Wirth, Tristan and Bayer, Markus and Kuehn, Philipp and Kaufhold, Marc-André and Knauthe, Volker and Guthe, Stefan and Reuter, Christian},
    year = {2021},
    keywords = {Student, Security, UsableSec, Projekt-CYWARN, Projekt-ATHENE-SecUrban, Ranking-CORE-B},
    pages = {429--446},
    }
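
    A minimal sketch of the uncertainty-sampling step highlighted in the entry above: train a relevance classifier on a small labeled set, then select the unlabeled tweets the model is least certain about for manual labeling. The toy texts and features are assumptions, and the paper's greedy stream clustering stage is omitted.

    # Illustrative sketch of one uncertainty-sampling step for active learning.
    import numpy as np
    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.linear_model import LogisticRegression

    # Toy labeled pool (1 = security-relevant, 0 = irrelevant) and unlabeled stream.
    labeled_texts = ["critical zero-day exploited in the wild", "cat pictures thread"]
    labels = [1, 0]
    unlabeled_texts = [
        "patch released for remote code execution bug",
        "what a great football match last night",
        "strange login attempts reported by several admins",
    ]

    vectorizer = TfidfVectorizer()
    classifier = LogisticRegression().fit(vectorizer.fit_transform(labeled_texts), labels)

    probabilities = classifier.predict_proba(vectorizer.transform(unlabeled_texts))[:, 1]
    uncertainty = 1 - np.abs(probabilities - 0.5) * 2     # 1.0 = maximally uncertain

    # The expert labels the most uncertain item next; retraining then repeats.
    print("label next:", unlabeled_texts[int(np.argmax(uncertainty))])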

2020

  • Philipp Kuehn, Thea Riebe, Lynn Apelt, Max Jansen, Christian Reuter (2020)
    Sharing of Cyber Threat Intelligence between States
    S+F Sicherheit und Frieden / Peace and Security, 38(1):22–28. doi:10.5771/0175-274X-2020-1-22

    Threats in cyberspace have increased in recent years due to the increment of offensive capabilities by states. Approaches to mitigate the security dilemma in cyberspace within the UN are deadlocked, as states have not been able to achieve agreements. However, from the perspective of IT-Security, there are Cyber Threat Intelligence (CTI) platforms to share and analyze cyber threats for a collective crisis management. To investigate, whether or not CTI platforms can be used as a confidence-building measure between states and international organizations, we portray current CTI platforms, showcase political requirements, and answer the question of how CTI communication may contribute to confidence-building in international affairs. Our results suggest the need to further develop analytical capabilities, as well as the implementation of a broad social, political, and legal environment for international CTI sharing.

    @article{kuehn_sharing_2020,
    title = {Sharing of {Cyber} {Threat} {Intelligence} between {States}},
    volume = {38},
    url = {http://www.peasec.de/paper/2020/2020_KuehnRiebeApeltJansenReuter_SharingCyberThreatIntelligence_SF.pdf},
    doi = {10.5771/0175-274X-2020-1-22},
    abstract = {Threats in cyberspace have increased in recent years due to the increment of offensive capabilities by states. Approaches to mitigate the security dilemma in cyberspace within the UN are deadlocked, as states have not been able to achieve agreements. However, from the perspective of IT-Security, there are Cyber Threat Intelligence (CTI) platforms to share and analyze cyber threats for a collective crisis management. To investigate, whether or not CTI platforms can be used as a confidence-building measure between states and international organizations, we portray current CTI platforms, showcase political requirements, and answer the question of how CTI communication may contribute to confidence-building in international affairs. Our results suggest the need to further develop analytical capabilities, as well as the implementation of a broad social, political, and legal environment for international CTI sharing.},
    number = {1},
    journal = {S+F Sicherheit und Frieden / Peace and Security},
    author = {Kuehn, Philipp and Riebe, Thea and Apelt, Lynn and Jansen, Max and Reuter, Christian},
    year = {2020},
    keywords = {Peace, Student, Security, Projekt-CYWARN, Projekt-ATHENE-SecUrban, Cyberwar, Projekt-DualUse},
    pages = {22--28},
    }