Disproportionate Removals and Differing Content Moderation Experiences for Conservative, Transgender, and Black Social Media Users: Marginalization and Moderation Gray Areas
dc.contributor.author | Haimson, Oliver L. | |
dc.contributor.author | Delmonaco, Daniel | |
dc.contributor.author | Nie, Peipei | |
dc.contributor.author | Wegner, Andrea | |
dc.date.accessioned | 2021-09-22T14:54:03Z | |
dc.date.available | 2021-09-22T14:54:03Z | |
dc.date.issued | 2021-10 | |
dc.identifier.citation | Proceedings of the ACM on Human-Computer Interaction (PACM HCI) 5, CSCW2, Article 466 (October 2021), 35 pages | en_US |
dc.identifier.issn | 2573-0142 | |
dc.identifier.uri | https://hdl.handle.net/2027.42/169587 | en |
dc.description.abstract | Social media sites use content moderation to attempt to cultivate safe spaces with accurate information for their users. However, content moderation decisions may not be applied equally for all types of users, and may lead to disproportionate censorship related to people’s genders, races, or political orientations. We conducted a mixed methods study involving qualitative and quantitative analysis of survey data to understand which types of social media users have content and accounts removed more frequently than others, what types of content and accounts are removed, and how removed content may differ between groups. We found that three groups of social media users in our dataset experienced content and account removals more often than others: political conservatives, transgender people, and Black people. However, the types of content removed from each group varied substantially. Conservative participants’ removed content included content that was offensive or allegedly so, misinformation, Covid-related, adult, or hate speech. Transgender participants’ content was often removed as adult despite following site guidelines, critical of a dominant group (e.g., men, white people), or specifically related to transgender or queer issues. Black participants’ removed content was frequently related to racial justice or racism. More broadly, conservative participants’ removals often involved harmful content removed according to site guidelines to create safe spaces with accurate information, while transgender and Black participants’ removals often involved content related to expressing their marginalized identities that was removed despite following site policies or that fell into content moderation gray areas. We discuss potential ways forward to make content moderation more equitable for marginalized social media users, such as embracing and designing specifically for content moderation gray areas. | en_US |
dc.description.sponsorship | National Science Foundation grant #1942125 | en_US |
dc.language.iso | en_US | en_US |
dc.publisher | ACM | en_US |
dc.subject | content moderation | en_US |
dc.subject | social media | en_US |
dc.subject | marginalization | en_US |
dc.subject | misinformation | en_US |
dc.subject | hate speech | en_US |
dc.subject | transgender | en_US |
dc.subject | Black | en_US |
dc.subject | race | en_US |
dc.subject | gender | en_US |
dc.subject | LGBTQ | en_US |
dc.subject | conservative | en_US |
dc.subject | political orientation | en_US |
dc.title | Disproportionate Removals and Differing Content Moderation Experiences for Conservative, Transgender, and Black Social Media Users: Marginalization and Moderation Gray Areas | en_US |
dc.type | Article | en_US |
dc.subject.hlbsecondlevel | Information Science | |
dc.subject.hlbtoplevel | Social Sciences | |
dc.description.peerreviewed | Peer Reviewed | en_US |
dc.contributor.affiliationumcampus | Ann Arbor | en_US |
dc.description.bitstreamurl | http://deepblue.lib.umich.edu/bitstream/2027.42/169587/1/content_moderation_CSCW_2021_camera_ready.pdf | |
dc.identifier.doi | https://dx.doi.org/10.7302/2632 | |
dc.identifier.source | Proceedings of the ACM on Human-Computer Interaction | en_US |
dc.identifier.orcid | 0000-0001-6552-4540 | en_US |
dc.description.filedescription | Description of content_moderation_CSCW_2021_camera_ready.pdf : Main article | |
dc.description.depositor | SELF | en_US |
dc.identifier.name-orcid | Haimson, Oliver; 0000-0001-6552-4540 | en_US |
dc.working.doi | 10.7302/2632 | en_US |
dc.owningcollname | Information, School of (SI) |