Altuncu, Enes (2025) Fact-checking ecosystem: from modelling to semi-automated stakeholder-based detection of false information. Doctor of Philosophy (PhD) thesis, University of Kent. (doi:10.22024/UniKent/01.02.108725) (KAR id:108725)
PDF (Language: English; restricted to repository staff only until January 2026)
This work is licensed under a Creative Commons Attribution 4.0 International License.
Official URL: https://doi.org/10.22024/UniKent/01.02.108725
Abstract
Humanity has suffered from false, misleading, and malicious information, in short, problematic information, throughout history. However, advances in digital technologies, including the Internet, social media, and smart devices, have accelerated and facilitated its dissemination. As a complex phenomenon with several aspects to study, problematic information falls within the areas of interest of many disciplines. Yet the lack of a unified understanding of the phenomenon complicates the development of more effective detection solutions that draw on this multidisciplinary literature. Focusing on a common type of problematic information, false information, this thesis aims to help fill this gap by presenting a more comprehensive approach to understanding and combating false information. To this end, a topical analysis of the relevant literature was conducted through topic modelling over research publications. This was complemented by an entity-relationship model of the ecosystem, developed both to provide a better conceptual understanding of the landscape and to serve as a tool for guiding conceptual analysis of false information-related work and scenarios. Building on the proposed model, user attitudes and behaviours towards fact-checking tools were then examined through an online user survey, designed with the help of a new taxonomy of fact-checking tools, to collect empirical evidence for the design of more trustworthy solutions. Most importantly, the survey revealed that semi-automated fact-checking systems, which combine human intelligence with the speed and scalability of automation, can build a higher level of user trust than manual or fully automated solutions.
Therefore, based on this conceptual understanding and supported by the empirical evidence obtained, a semi-automated fact-checking tool based entirely on expert opinion discovery, called aedFaCT, was designed, implemented and tested. The research presented in this thesis indicates the potential of semi-automated approaches for developing more trustworthy and usable fact-checking solutions. Its impact can be further enhanced by addressing its limitations and incorporating new technological advancements, including large language models (LLMs) and explainable artificial intelligence (xAI).
| Item Type: | Thesis (Doctor of Philosophy (PhD)) |
|---|---|
| Thesis advisor: | Li, Shujun |
| Thesis advisor: | Nurse, Jason |
| DOI/Identification number: | 10.22024/UniKent/01.02.108725 |
| Uncontrolled keywords: | fact-checking, false information detection, semi-automated, ecosystem modelling |
| Subjects: | Q Science > QA Mathematics (inc Computing science) > QA 76 Software, computer programming, |
| Institutional Unit: | Schools > School of Computing |
| Former Institutional Unit: | Divisions > Division of Computing, Engineering and Mathematical Sciences > School of Computing |
| Funders: | University of Kent (https://ror.org/00xkeyj56) |
| SWORD Depositor: | System Moodle |
| Depositing User: | System Moodle |
| Date Deposited: | 11 Feb 2025 17:10 UTC |
| Last Modified: | 20 May 2025 10:29 UTC |
| Resource URI: | https://kar.kent.ac.uk/id/eprint/108725 (The current URI for this page, for reference purposes) |