Participant notes on policy recommendations for the paper "Re-designing users into platform governance: Tackling content moderation inequalities with impacted users"
This dataset contains the notes compiled in Google Docs by workshop participants during the policy recommendation co-design sessions we carried out in March and April 2023. The sessions fed into a paper and report aiming to provide user-centred solutions to platform governance issues. More information about the paper and report is below.
Content moderation often fails to take the human experience into account, prioritising speed and platform interests in removing objectionable content while lacking the necessary empathy for users who experience censorship, emotional distress, and loss of livelihood and network. Few solutions focus on tackling the issues faced by users mistakenly caught in the net of platform governance, especially at the intersection of bodies, sex and sex work with technology. We address this gap by working with LGBTQIA+ users, nude and sexual content creators, journalists and activists to draft policy recommendations prompting platforms to improve their transparency, safety, education, fairness, due process and contextual knowledge, re-designing users into a system that excludes them at both the usage and governance stages. These recommendations highlight how platform governance, the tech sector's fast growth, and civil society's and regulators' failure to catch up with it can leave users vulnerable to unfair governance and exclusion.
The notes are anonymised for participant safety. For more information, contact carolina.are@northumbria.ac.uk.
Funding
Cyber-Security across the Life Span (cSaLSA)
Engineering and Physical Sciences Research Council