Durham e-Theses

Cost Benefit Analysis of Online Harms

MARKEVICIUTE, KAROLINA (2025) Cost Benefit Analysis of Online Harms. Doctoral thesis, Durham University.

PDF (Accepted Version), 3,890 KB

Abstract

The rapid evolution of digital technologies and algorithmic content curation has transformed the online information ecosystem, enabling unprecedented connectivity and knowledge exchange while intensifying harms such as privacy erosion, algorithmic targeting, and the exploitation of user vulnerabilities. These developments have deepened challenges surrounding information integrity, trust, and individual agency, particularly as online platforms facilitate the circulation of both misinformation (inadvertent inaccuracies) and disinformation (proactive deception). The latter presents acute risks in politically volatile and strategically contested contexts where informational asymmetry may be deliberately weaponised.

This thesis constructs a behavioural game-theoretic model to examine online information consumption under uncertainty and the cognitive burden of costly verification. Drawing on decision theory, behavioural economics, and contract theory, the model elucidates how users allocate cognitive effort to assess content in the presence of unreliable sources and malicious strategic intent designed to influence belief formation. The user decision process is formalised as a dynamic optimisation problem in which verification effort is governed by the trade-off between the perceived value of accurate information and the escalating cognitive and emotional toll of engagement. While the framework provides an analytical distinction between misinformation and disinformation, its empirical validation focuses on the latter owing to its acute strategic significance. Accordingly, the model is calibrated using qualitative data from semi-structured interviews with Ukrainian individuals navigating the volatile information space during the ongoing 2022 Russian invasion of Ukraine, where disinformation is widespread. This calibration substantiates the model's capacity to capture nuanced behavioural adaptation in high-stakes settings characterised by uncertainty and strategic manipulation.
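To make the stated trade-off concrete, a minimal sketch of this kind of problem can be written as a per-period effort choice; the notation below ($e_t$, $\theta_t$, $V$, $p$, $c$, $s_t$, $\beta$) is assumed for illustration and is not the thesis's own specification:

\[
\max_{\{e_t \ge 0\}} \; \mathbb{E}\left[\sum_{t} \beta^{t}\Big( V\,p(e_t,\theta_t) - c(e_t, s_t) \Big)\right]
\]

where $e_t$ is verification effort, $p(e_t,\theta_t)$ the probability of correctly assessing content of underlying reliability $\theta_t$, $V$ the value of acting on accurate information, $c(e_t, s_t)$ an increasing, convex cognitive and emotional cost that also rises with accumulated stress $s_t$, and $\beta$ a discount factor. Under this reading, disengagement corresponds to the corner solution $e_t = 0$ whenever the marginal cost of further verification exceeds its marginal benefit, consistent with the adaptive withdrawal described in the findings below.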

The analytical findings reveal that users adopt diverse strategies in response to rising verification costs and informational risk, including reliance on heuristics, selective scrutiny, and withdrawal from engagement. These behavioural patterns reflect forms of constrained optimisation rather than irrationality, driven by cognitive and emotional limitations under conditions of sustained stress. Crucially, the analysis indicates that increased exposure to disinformation does not consistently prompt greater verification effort; in certain circumstances, disengagement becomes an adaptive means of preserving cognitive equilibrium and stability. These findings complicate prevailing assumptions about information resilience, highlighting that user responses may reflect survival-oriented behaviour rather than active empowerment.

By integrating empirical insight with formal modelling, the study advances understanding of informational vulnerability and user decision-making in strategically manipulated online environments. The findings emphasise the need for interventions that alleviate cognitive overload and support strategic verification behaviours. Such measures are essential for strengthening collective resilience and safeguarding the integrity of information ecosystems in the presence of coordinated digital manipulation and epistemically corrosive content.

Item Type: Thesis (Doctoral)
Award: Doctor of Philosophy
Faculty and Department: Faculty of Business
Thesis Date: 2025
Copyright: Copyright of this thesis is held by the author
Deposited On: 12 Jan 2026 08:43
