
Disproportionate Removals and Differing Content Moderation Experiences for Conservative, Transgender, and Black Social Media Users: Marginalization and Moderation Gray Areas

dc.contributor.author: Haimson, Oliver L.
dc.contributor.author: Delmonaco, Daniel
dc.contributor.author: Nie, Peipei
dc.contributor.author: Wegner, Andrea
dc.date.accessioned: 2021-09-22T14:54:03Z
dc.date.available: 2021-09-22T14:54:03Z
dc.date.issued: 2021-10
dc.identifier.citation: Proceedings of the ACM on Human-Computer Interaction (PACM HCI) 5, CSCW2, Article 466 (October 2021), 35 pages
dc.identifier.issn: 2573-0142
dc.identifier.uri: https://hdl.handle.net/2027.42/169587
dc.description.abstract: Social media sites use content moderation to attempt to cultivate safe spaces with accurate information for their users. However, content moderation decisions may not be applied equally for all types of users, and may lead to disproportionate censorship related to people’s genders, races, or political orientations. We conducted a mixed methods study involving qualitative and quantitative analysis of survey data to understand which types of social media users have content and accounts removed more frequently than others, what types of content and accounts are removed, and how content removed may differ between groups. We found that three groups of social media users in our dataset experienced content and account removals more often than others: political conservatives, transgender people, and Black people. However, the types of content removed from each group varied substantially. Conservative participants’ removed content included content that was offensive or allegedly so, misinformation, Covid-related, adult, or hate speech. Transgender participants’ content was often removed as adult despite following site guidelines, critical of a dominant group (e.g., men, white people), or specifically related to transgender or queer issues. Black participants’ removed content was frequently related to racial justice or racism. More broadly, conservative participants’ removals often involved harmful content removed according to site guidelines to create safe spaces with accurate information, while transgender and Black participants’ removals often involved content related to expressing their marginalized identities that was removed despite following site policies or fell into content moderation gray areas. We discuss potential ways forward to make content moderation more equitable for marginalized social media users, such as embracing and designing specifically for content moderation gray areas.
dc.description.sponsorship: National Science Foundation grant #1942125
dc.language.iso: en_US
dc.publisher: ACM
dc.subject: content moderation
dc.subject: social media
dc.subject: marginalization
dc.subject: misinformation
dc.subject: hate speech
dc.subject: transgender
dc.subject: Black
dc.subject: race
dc.subject: gender
dc.subject: LGBTQ
dc.subject: conservative
dc.subject: political orientation
dc.title: Disproportionate Removals and Differing Content Moderation Experiences for Conservative, Transgender, and Black Social Media Users: Marginalization and Moderation Gray Areas
dc.type: Article
dc.subject.hlbsecondlevel: Information Science
dc.subject.hlbtoplevel: Social Sciences
dc.description.peerreviewed: Peer Reviewed
dc.contributor.affiliationumcampus: Ann Arbor
dc.description.bitstreamurl: http://deepblue.lib.umich.edu/bitstream/2027.42/169587/1/content_moderation_CSCW_2021_camera_ready.pdf
dc.identifier.doi: https://dx.doi.org/10.7302/2632
dc.identifier.source: Proceedings of the ACM on Human-Computer Interaction
dc.identifier.orcid: 0000-0001-6552-4540
dc.description.filedescription: Description of content_moderation_CSCW_2021_camera_ready.pdf : Main article
dc.description.depositor: SELF
dc.identifier.name-orcid: Haimson, Oliver; 0000-0001-6552-4540
dc.working.doi: 10.7302/2632
dc.owningcollname: Information, School of (SI)


