File(s) not publicly available
Reason: The data are not publicly available due to the privacy of research participants. To request access please contact carolina.are@northumbria.ac.uk
"‘Dysfunctional’ appeals and failures of algorithmic justice in Instagram and TikTok content moderation" - interview transcripts
This repository contains the interview transcripts connected to the above study: 12 interviews carried out with censored, marginalised content creators in the UK, Ireland, Italy, Australia and the USA.
The linked study examines users’ perceptions of justice when using appeals on Instagram and TikTok, focusing on the barriers that de-platformed users across fields such as activism, sex work, sex education and LGBTQIA+ self-expression face when using these platforms’ automated appeals to recover their de-platformed content and/or accounts. Examining appeals from a platform governance standpoint and drawing on fairness and due process literature, the study finds concerning loopholes within these platforms’ appeals processes, leaving room for discrimination, fraud and scams and leading to user disempowerment. Through interviews with de-platformed users, the paper reveals significant barriers faced particularly by transgender and sex-working users when attempting to recover their de-platformed accounts through in-platform appeals. Drawing on the metaphor of an ‘algorithmic cop, jury and judge’, the paper concludes that the needs of marginalised users have been designed out of content moderation and of platforms’ processes, leading them to experience the appeals system as opaque, unfair and unjust.
Funding
Centre for Digital Citizens - Next Stage Digital Economy Centre
Engineering and Physical Sciences Research Council