Privacy, security, and trust are highly relevant aspects in the context of human-computer interaction. This is underlined not least by the extensive research in this area:
The conference “Mensch und Computer” offers a platform for contributions and discussions on innovative forms of interaction between people and technology, on user-centered development methods, interactive applications, and further topics at the intersection of users, organizations, and communities on the one hand and their information and communication technologies on the other. The conference aims to discuss innovative research results, to foster the exchange of information between academia and practice, to raise awareness in science and the public of the relevance of usage- and task-appropriate technology design, and to stimulate research activities and education in this field. In 2021, the conference takes place virtually, hosted from Ingolstadt. Details on participation can be found here.
The following contributions will be presented in the track “Privacy, Security & Trust”, chaired by Prof. Christian Reuter, on Tuesday from 14:00 to 15:30:
14:00 – 14:15
Who Should Get My Private Data in Which Case? Evidence in the Wild
Ruhr-Universität Bochum, Germany; Technische Universität Darmstadt, Germany
As a result of the ongoing digitalization of our everyday lives, the amount of data produced by everyone is steadily increasing. This happens through personal decisions and devices, such as the use of social media or smartphones, but also through ever more data capturing in public spaces, e.g., through Closed-Circuit Television (CCTV). Are people aware of the data they are sharing? What kind of data do people want to share with whom? Are people aware whether potential data-sharing functionalities such as Wi-Fi, GPS, or Bluetooth are activated on their phone? To answer these questions, we conducted a representative online survey as well as face-to-face interviews with users in Germany. We found that users want to share private data with most entities only conditionally, showing that the willingness to share data depends on who has access to it. Almost half of the participants would be more willing to share data with specific entities (state bodies and rescue forces) when an acquaintance is endangered. For Wi-Fi and GPS, the frequencies of self-reported and actual activation on the smartphone are almost equal, but 17% of participants were unaware of the Bluetooth status of their smartphone. Our research is therefore in line with other studies suggesting low privacy awareness among users.
14:15 – 14:30
A Consumer Perspective on Privacy Risk Awareness of Connected Car Data Use
Universität Siegen, Germany
New cars are increasingly “connected” by default. Since not having a car is not an option for many people, understanding the privacy implications of driving connected cars and using their data-based services is an even more pressing issue than for dispensable consumer products. While risk-based approaches to privacy are well established in law, they have only begun to gain traction in HCI. These approaches are understood not only to increase acceptance but also to help consumers make choices that meet their needs. To the best of our knowledge, perceived risks in the context of connected cars have not been studied before. To address this gap, our study reports on the analysis of a survey with 18 open-ended questions distributed to 1,000 households in a medium-sized German city. Our findings provide qualitative insights into existing attitudes and use cases of connected car features and, most importantly, a list of the perceived risks themselves. Taking the perspective of consumers, we argue that these risks can help inform consumers about data use in connected cars in a user-friendly way. Finally, we show how these risks fit into and extend existing risk taxonomies from other contexts with a stronger social perspective on the risks of data use.
14:30 – 14:45
The Effect of Explanations on Trust in an Assistance System for Public Transport Users and the Role of the Propensity to Trust
University of Kassel, Germany
The present study aimed to investigate whether explanations increase trust in an assistance system. Moreover, we wanted to take into account the role of the individual propensity to trust in technology. We conducted an empirical study in a virtual reality environment in which 40 participants interacted with an assistance system for public transport users. The study used a 2×2 mixed design with the within-subject factor assistance system feature (trip planner and connection request) and the between-subject factor explanation (with or without). We measured trust both as explicit trust via a questionnaire and as implicit trust via an operationalization of the participants’ behavior. The results showed that trust propensity predicted explicit trust and that explanations increased explicit trust significantly. This was not the case for implicit trust, though, suggesting that explicit and implicit trust do not necessarily coincide. In conclusion, our results complement the literature on explainable artificial intelligence and trust in automation and provide topics for future research regarding the effect of explanations on trust in assistance systems and other technologies.
14:45 – 15:00
Design Considerations for Usable Authentication in Smart Homes
Bundeswehr University Munich, Germany; LMU Munich, Germany
Smart home devices are on the rise. To provide their rich variety of features, they collect, store, and process a considerable amount of (potentially) sensitive user data. However, authentication mechanisms on such devices either (a) have limited usability or (b) are non-existent. To close this gap, we investigated, on the one hand, users’ perspectives on potential privacy and security risks as well as how they imagine ideal authentication mechanisms in future smart homes. On the other hand, we considered security experts’ perspectives on authentication for smart homes. In particular, we conducted semi-structured interviews (N=20) with potential smart home users using the story completion method and a focus group with security experts (N=10). We identified which kinds of devices users would choose and why, potential challenges regarding privacy and security, and potential solutions, and we discussed and verified these with the security experts. We derive and reflect on a set of design implications for usable authentication mechanisms in smart homes and suggest directions for future research. Our work can assist designers and practitioners in implementing appropriate security mechanisms for smart homes.
15:00 – 15:15
The U in Crypto Stands for Usable: An Empirical Study of User Experience with Mobile Cryptocurrency Wallets
Freie Universität Berlin, Germany; The University of British Columbia, Canada
In a corpus of 45,821 app reviews of the top five mobile cryptocurrency wallets, we identified and qualitatively analyzed 6,859 reviews pertaining to the user experience (UX) with those wallets. Our analysis suggests that both new and experienced users struggle with general and domain-specific UX issues that, aside from frustration and disengagement, might lead to dangerous errors and irreversible monetary losses. We reveal shortcomings of current wallet UX as well as users’ misconceptions, some of which can be traced back to a reliance on their understanding of conventional payment systems. For example, some users believed that transactions were free, reversible, and could be canceled anytime, which is not the case in reality. Correspondingly, these beliefs often resulted in unmet expectations. Based on our findings, we provide recommendations on how to design cryptocurrency wallets that both alleviate the identified issues and counteract some of the misconceptions in order to better support newcomers.
15:15 – 15:18
Towards Warranted Trust: A Model on the Relation Between Actual and Perceived System Trustworthiness
Universitätsklinikum Gießen und Marburg, Philipps-Universität Marburg, Germany; Universität des Saarlandes, Germany
The public discussion about trustworthy AI is fueling research on new methods to make AI explainable and fair. However, users may incorrectly assess system trustworthiness and could consequently overtrust untrustworthy systems or undertrust trustworthy systems. In order to understand what determines accurate assessments of system trustworthiness, we apply Brunswik’s Lens Model and the Realistic Accuracy Model. The assumption is that the actual trustworthiness of a system cannot be accessed directly and is therefore inferred via cues to form a user’s perceived trustworthiness. The accuracy of the trustworthiness assessment then depends on cue relevance, availability, detection, and utilization. We describe how the model can be used to systematically investigate determinants that increase the match between a system’s actual trustworthiness and a user’s perceived trustworthiness in order to achieve warranted trust.
15:18 – 15:21
Investigating barriers for the adoption of the German contact-tracing app and the influence of a video intervention on user acceptance
Universität Regensburg, Germany
Despite the efforts that were put into data protection and data security, the overall adoption rate of the German contact-tracing app falls behind the estimated threshold of 60 percent that would be needed to suppress the virus effectively. We therefore conducted a questionnaire-based study to analyze barriers to acceptance and to investigate the effect of a video intervention on acceptance. Acceptance was measured using the technology acceptance model (TAM; Venkatesh & Bala, 2008). TAM measures were collected before and after watching the video intervention. Qualitative data on attitudes toward the app was gathered through open questions and attitudinal items. 81 datasets from users with no prior experience were included in the analysis. Results show that the video intervention did not increase behavioral intention (BI) significantly. However, the average scores of two important determinants of BI, Perceived Ease of Use (PEOU) and Perceived Usefulness (PU), increased significantly, from 4.00 to 4.61 for PEOU (p<.001) and from 1.93 to 2.16 for PU (p<.001), with high and medium effect sizes. Qualitative data analysis indicates that the main barriers to adoption are perceived high risks and costs combined with low perceived personal benefits. One third of respondents (27/81) have privacy concerns, and more than 40 percent of participants (36/81) state that they see “no benefits”.
15:21 – 15:24
The Interplay between Personal Relationships & Shoulder Surfing Mitigation
University of Glasgow, United Kingdom
Shoulder surfing refers to observing someone’s device screen without their consent. Conspicuously switching off the screen upon noticing a friend observing private messages may create an embarrassing situation. Initial evidence indicates that users adopt strategies to mitigate shoulder surfing based on their relationship to the observer. However, the social implications of such mitigation strategies remain largely unexplored. To address this, we present findings from an interview study with 12 participants. We analyze experiences with shoulder surfers who have different relationships to the user and collect feedback on eleven state-of-the-art strategies for mitigating shoulder surfing. We show that the user-observer relationship impacts the choice of mitigation methods and that users often do not want observers to know they were caught. Based on our results, we conclude with implications for designing socially acceptable privacy protection mechanisms on mobile devices.
15:24 – 15:27
Exploring Users’ Perceived Control over Technology
Hochschule Ruhr West, Germany
Intelligent systems are becoming more and more a part of our everyday lives and typically act autonomously. Design guidelines and constructs related to the control of traditional systems often do not apply to them. Still, perceived control over these systems is important to users and affects acceptance and the intention to use them. This paper presents an explorative online study. Participants named systems over which they feel much or little control and described features and properties that lead to that perception or that affect their desire for control. We found that (1) perceived control is strongly influenced by design features that are not directly control-related, such as effective or efficient use, (2) poor comprehensibility and malfunctions strongly affect users’ feeling of control, (3) users value customizability and the possibility to personalize systems, (4) people are highly aware of privacy control issues of modern, online-connected technology, and (5) smart systems face the same control-related challenges as non-smart systems but suffer from still being new to users. Our findings help to understand the complex phenomenon of perceived control over systems with different levels of intelligence and autonomy from the users’ perspective and give suggestions for the design of future systems.
15:27 – 15:30
Tracing Covid-19 – Older Adults’ Attitudes Toward Digital Contact Tracing and How to Increase Their Participation
Center for Human-Computer Interaction, University of Salzburg, Austria; Austrian Institute of Technology, Vienna, Austria
The COVID-19 pandemic poses major challenges for health care systems. Contact tracing apps are being used around the world to help track and break the chain of infection. We conducted a qualitative study with eight older adults (65+) to find out how their lives had been affected by the pandemic. One of the topics covered was the “Stopp Corona” app, a contact tracing app by the national Red Cross, and participants’ attitudes towards it. Although most participants did not use the app, they expressed few concerns about the misuse of data, as they have high trust in the Red Cross and the national government. Highlighting the societal benefit of contact tracing seems to be a major factor for uptake. Based on our results, in comparison with recent studies on digital contact tracing, we present four recommendations that may support the adoption of contact tracing apps by older adults.