The peer-reviewed journal Behaviour & Information Technology (BIT) has published a special issue edited by Christian Reuter, Amanda Lee Hughes and Cody Buntain on “Combating Information Warfare: User-Centred Countermeasures Against Fake News and Misinformation”. This special issue explores the increasing challenge of misinformation in the digital age, especially during crises such as the Russian war against Ukraine, the Israel-Hamas conflict, and the COVID-19 pandemic. These events highlight how misinformation can be weaponized to manipulate public opinion and sow discord.
During crises, people often rely on digital platforms for collective sense-making, making them particularly vulnerable to misinformation. Disaster-management personnel, public-health officials, and policymakers have raised concerns about the negative effects of misinformation and sought effective countermeasures. This special issue brings together cutting-edge research on user-centred solutions to mitigate the spread of fake news and enhance information reliability.
After two rounds of rigorous peer review, seven articles were accepted for publication, addressing various aspects of misinformation detection, media literacy, and ethical concerns.

Accepted Articles in the Special Issue

1. Combating Information Warfare: State and Trends in User-Centred Countermeasures Against Fake News and Misinformation

Authors: Christian Reuter, Amanda Lee Hughes, Cody Buntain

This article provides an overview of state-of-the-art research on combating misinformation through user-centred approaches. It discusses recent developments in media literacy interventions, cross-cultural differences in misinformation perception, and the effectiveness of different content formats in fostering critical thinking. The study also examines ethical and security considerations in automated misinformation detection.

2. Striking the Balance Between Fake and Real: The Limited Effect of General Warnings About Misinformation

Authors: Michael Hameleers, Toni van der Meer (University of Amsterdam)

This large-scale experiment with 1,105 U.S. participants challenges the effectiveness of generalized warnings against misinformation. Contrary to common assumptions, the study finds that such warnings have negligible effects on people’s ability to discern truth from falsehood. In some cases, warnings even decrease trust in accurate information. The findings suggest that misinformation interventions should be tailored to specific issues and audience trust levels rather than using generic alerts.

3. Challenging Others When Posting Misinformation: A UK vs. Arab Cross-Cultural Comparison on the Perception of Negative Consequences and Injunctive Norms

Authors: Muaadh Noman, Selin Gurgun, Keith Phalp, Preslav Nakov, Raian Ali

This cross-cultural study examines why individuals in the UK and Arab countries may hesitate to challenge misinformation on social media. Factors such as potential relationship damage, perceived futility, and social norms play a key role in shaping corrective behaviour. While UK participants’ willingness to challenge misinformation was influenced by perceived societal norms, Arab participants were more concerned about relationship costs. These insights highlight the importance of culturally tailored interventions to encourage corrective action against misinformation.

4. The Advantage of Videos Over Text to Boost Adolescents’ Lateral Reading in a Digital Workshop

Authors: Carl-Anton Werner Axelsson (Mälardalen University), Thomas Nygren (Uppsala University)

This study examines how different formats (text vs. video) influence adolescents’ ability to verify information online. In a full-factorial experiment involving 178 secondary school students, the researchers found that video-based instructions significantly improved lateral reading skills—an essential technique for fact-checking. The results suggest that video content could be an effective tool in digital literacy education, helping students become more critical consumers of online information.

5. Misleading Information in Crises: Exploring Content-Specific Indicators on Twitter from a User Perspective

Authors: Katrin Hartwig, Stefka Schmid, Tom Biselli, Helene Pleil, Christian Reuter (Technical University of Darmstadt)

This study analyzes 2,382 German-language tweets to identify specific indicators of misleading content, such as rhetorical questions, absolute claims, and sarcastic remarks. The research also examines which indicators users already use to assess credibility and how these can be integrated into digital tools. The findings highlight the potential of user-centred interventions but also reveal challenges, particularly for individuals with entrenched conspiracy beliefs.

6. A Majority-Based Learning System for Detecting Misinformation

Authors: Hanchun Kao, Yu-ju Tu, Yu-Hsiang Huang, Troy Strader

This paper introduces a machine learning system that aggregates predictions from multiple classifiers to improve misinformation detection. By analyzing data from CoFacts, the study demonstrates promising results in identifying false information. However, it also highlights key challenges, such as data imbalances and the ever-evolving nature of misinformation. The findings emphasize the need for continuous adaptation of detection models.
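The aggregation idea behind this system can be illustrated with a minimal majority-voting sketch. The function name and the labels below are illustrative, not taken from the paper; the actual system combines trained classifiers rather than fixed label lists:

```python
from collections import Counter

def majority_vote(predictions):
    """Return the label chosen by the most classifiers.

    predictions: one predicted label per classifier,
    e.g. ["fake", "real", "fake"]. Ties resolve to the
    label that appears first in the input.
    """
    counts = Counter(predictions)
    label, _ = counts.most_common(1)[0]
    return label

# Three hypothetical classifiers disagree; the majority label wins.
print(majority_vote(["fake", "real", "fake"]))  # fake
```

In a real pipeline, each entry in the list would come from an independently trained model, so a single misled classifier is outvoted by the others.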

7. Ethical and Safety Considerations in Automated Fake News Detection

Authors: Benjamin Horne (University of Tennessee Knoxville), Dorit Nevo, Susan Smith (Rensselaer Polytechnic Institute)

This study explores the ethical risks associated with automated misinformation detection. Using machine learning models trained on 381,000 news articles, the authors examine biases in AI-based fake news detection and the potential consequences of algorithmic errors. The research calls for interdisciplinary approaches that combine technical solutions with media literacy efforts and psychological misinformation inoculation strategies.

About the Journal: Behaviour & Information Technology (BIT)

Behaviour & Information Technology (BIT) is a leading peer-reviewed journal published by Taylor & Francis. It focuses on usability, human-computer interaction (HCI), and user-centred design. BIT has an Impact Factor of 2.9 and is listed as a top publication in the HCI category on Google Scholar: https://scholar.google.com/citations?view_op=top_venues&hl=de&vq=eng_humancomputerinteraction

For more details on the special issue and access to the published articles, visit the journal’s website.
