File(s) not publicly available
Reason: The data set remains private to protect participants' privacy and their accounts from censorship and harm.
Platform Gaslighting: A User-Centric Insight into Manipulated Realities in Online Content Moderation - interview transcripts
This repository holds the interview transcripts connected to the above study: 12 interviews carried out with censored, marginalised content creators in the UK, Ireland, Italy, Australia, and the USA.
The linked study examines the communication dynamics between social media platforms and users as they negotiate the complexities of governance policies. Using Meta and TikTok as case studies, we reveal that gaslighting - traditionally associated with relationship abuse, where one partner undermines the validity of the other's experience - is a pervasive platform communication strategy, manifesting in numerous instances where automated and human platform communications have directly contradicted users' experiences, evidence, and research. We analyse 36 diverse interview datasets and six public platform responses to governance issues, highlighting the systemic nature of this phenomenon within digital spaces. We thereby broaden the scholarly understanding of platform gaslighting by moving beyond shadowbanning and the isolated platform-to-user dialogue to explore a wider range of communications concerning governance. Our participants' experiences show how gaslighting highlights corporate power imbalances in platform-user interactions, especially in situations of opaque governance following not only shadowbanning but also de-platforming triggered by malicious flagging. Our dataset draws from seemingly disparate groups who share moderation experiences: Jewish creators engaged in combating antisemitism, Palestinian creators advocating for human rights, and sex-positive creators, whose expertise and stories are dismissed and belittled by platforms as a form of damage control in the face of adverse governance. We demonstrate how the dismissal or minimisation of participants' traumatic experiences by platforms' automated processes and human teams is weaponised to inflict epistemic injustice, consolidate power, and evade accountability.
Funding
Centre for Digital Citizens - Next Stage Digital Economy Centre
Engineering and Physical Sciences Research Council