"What are you doing, TikTok": How Marginalized Social Media Users Perceive, Theorize, and "Prove" Shadowbanning
dc.contributor.author | Delmonaco, Daniel | |
dc.contributor.author | Mayworm, Samuel | |
dc.contributor.author | Thach, Hibby | |
dc.contributor.author | Guberman, Josh | |
dc.contributor.author | Augusta, Aurelia | |
dc.contributor.author | Haimson, Oliver L. | |
dc.date.accessioned | 2024-03-07T15:50:14Z | |
dc.date.available | 2024-03-07T15:50:14Z | |
dc.date.issued | 2024-04 | |
dc.identifier.uri | https://hdl.handle.net/2027.42/192621 | en |
dc.description.abstract | Shadowbanning is a unique content moderation strategy receiving recent media attention for the ways it impacts marginalized social media users and communities. Social media companies often deny this content moderation practice despite user experiences online. In this paper, we use qualitative surveys and interviews to understand how marginalized social media users make sense of shadowbanning, develop folk theories about shadowbanning, and attempt to prove its occurrence. We find that marginalized social media users collaboratively develop and test algorithmic folk theories to make sense of their unclear experiences with shadowbanning. Participants reported direct consequences of shadowbanning, including frustration, decreased engagement, the inability to post specific content, and potential financial implications. They reported holding negative perceptions of platforms where they experienced shadowbanning, sometimes attributing their shadowbans to platforms’ deliberate suppression of marginalized users’ content. Some marginalized social media users acted on their theories by adapting their social media behavior to avoid potential shadowbans. We contribute collaborative algorithm investigation: a new concept describing social media users’ strategies of collaboratively developing and testing algorithmic folk theories. Finally, we present design and policy recommendations for addressing shadowbanning and its potential harms. | en_US |
dc.description.sponsorship | National Science Foundation award #1942125 | en_US |
dc.language.iso | en_US | en_US |
dc.publisher | ACM | en_US |
dc.subject | content moderation | en_US |
dc.subject | social media | en_US |
dc.subject | marginalization | en_US |
dc.subject | shadowbanning | en_US |
dc.subject | algorithmic folk theories | en_US |
dc.subject | collaborative algorithm investigation | en_US |
dc.title | "What are you doing, TikTok": How Marginalized Social Media Users Perceive, Theorize, and "Prove" Shadowbanning | en_US |
dc.type | Article | en_US |
dc.subject.hlbsecondlevel | Information Science | |
dc.subject.hlbtoplevel | Social Sciences | |
dc.description.peerreviewed | Peer Reviewed | en_US |
dc.contributor.affiliationum | Information, School of | en_US |
dc.contributor.affiliationumcampus | Ann Arbor | en_US |
dc.description.bitstreamurl | http://deepblue.lib.umich.edu/bitstream/2027.42/192621/1/Shadowbanning_CSCW23_MinorRevisions.pdf | |
dc.identifier.doi | 10.1145/3637431 | |
dc.identifier.doi | https://dx.doi.org/10.7302/22437 | |
dc.identifier.source | Proceedings of the ACM on Human-Computer Interaction (PACM HCI) (CSCW 2024) | en_US |
dc.description.mapping | c4321027-eaa6-44f5-a298-a6880ec181d5 | en_US |
dc.identifier.orcid | 0000-0001-6552-4540 | en_US |
dc.description.filedescription | Description of Shadowbanning_CSCW23_MinorRevisions.pdf : Main article | |
dc.description.depositor | SELF | en_US |
dc.identifier.name-orcid | Haimson, Oliver; 0000-0001-6552-4540 | en_US |
dc.working.doi | 10.7302/22437 | en_US |
dc.owningcollname | Information, School of (SI) |