
Contribution Details

Type Conference or Workshop Paper
Scope Discipline-based scholarship
Published in Proceedings Yes
Title Group Fairness for Content Creators: the Role of Human and Algorithmic Biases under Popularity-based Recommendations
Organization Unit
Authors
  • Stefania Gavrila-Ionescu
  • Aniko Hannak
  • Nicolo Pagan
Presentation Type paper
Item Subtype Original Work
Refereed Yes
Status Published in final form
Language
  • English
ISBN 979-8-4007-0241-9
Page Range 863 - 870
Event Title RecSys '23: 17th ACM Conference on Recommender Systems
Event Type conference
Event Location Singapore
Event Start Date September 18, 2023
Event End Date September 22, 2023
Series Name Proceedings of the ACM Conference on Recommender Systems
Publisher ACM Digital Library
Abstract Text The Creator Economy faces concerning levels of unfairness. Content creators (CCs) publicly accuse platforms of purposefully reducing the visibility of their content based on protected attributes, while platforms place the blame on viewer biases. Meanwhile, prior work warns about the “rich-get-richer” effect perpetuated by existing popularity biases in recommender systems: any initial advantage in visibility is likely to be exacerbated over time. What remains unclear is how biases based on protected attributes, from both platforms and viewers, interact and contribute to the observed inequality in the context of popularity-biased recommender systems. The difficulty of the question lies in the complexity and opacity of the system. To overcome this challenge, we design a simple agent-based model (ABM) that unifies the platform systems which allocate the visibility of CCs (e.g., recommender systems, moderation) into a single popularity-based function, which we call the visibility allocation system (VAS). Through simulations, we find that although viewer homophilic biases alone do create inequalities, small levels of additional bias in the VAS are more harmful. From the perspective of interventions, our results suggest that (a) attempts to reduce attribute biases in moderation and recommendations should precede those reducing viewers’ homophilic tendencies, (b) decreasing the popularity bias in the VAS reduces but does not eliminate inequalities, (c) boosting the visibility of protected CCs to overcome viewers’ homophily with respect to one fairness metric is unlikely to produce fair outcomes with respect to all metrics, and (d) the process is also unfair to viewers, and this unfairness could be overcome through the same interventions. More generally, this work demonstrates the potential of using ABMs to better understand the causes and effects of biases and interventions within complex sociotechnical systems.
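For intuition only, here is a minimal Python sketch of the kind of dynamics the abstract describes: a popularity-based visibility allocation system (VAS) with an optional attribute bias, viewers who engage homophilically, and a rich-get-richer feedback loop from engagement back into popularity. This is not the paper's actual model; all parameter names and values (N_CREATORS, HOMOPHILY, VAS_BIAS, POPULARITY_EXP, ...) are hypothetical placeholders chosen for illustration.

```python
import numpy as np

# Illustrative toy ABM, loosely inspired by the abstract's setup.
# Not the paper's model; parameters below are arbitrary placeholders.

rng = np.random.default_rng(0)

N_CREATORS = 100       # content creators (CCs)
N_VIEWERS = 500        # viewers
N_STEPS = 200          # simulation rounds
PROTECTED_SHARE = 0.3  # fraction of agents with the protected attribute
HOMOPHILY = 0.7        # prob. a viewer engages only with same-attribute CCs
VAS_BIAS = 0.9         # visibility multiplier for protected CCs (<1 = bias)
POPULARITY_EXP = 1.0   # strength of the popularity bias in the allocation

# Attributes: 1 = protected group, 0 = majority group.
cc_attr = (rng.random(N_CREATORS) < PROTECTED_SHARE).astype(int)
viewer_attr = (rng.random(N_VIEWERS) < PROTECTED_SHARE).astype(int)

# Popularity starts uniform; engagements accumulate over time.
popularity = np.ones(N_CREATORS)

for _ in range(N_STEPS):
    # Visibility allocation system (VAS): popularity-based exposure scores,
    # optionally down-weighted for protected CCs.
    scores = popularity ** POPULARITY_EXP
    scores = scores * np.where(cc_attr == 1, VAS_BIAS, 1.0)
    exposure_probs = scores / scores.sum()

    # Each viewer is shown one CC, sampled according to the VAS.
    shown = rng.choice(N_CREATORS, size=N_VIEWERS, p=exposure_probs)

    # Homophilic engagement: with prob. HOMOPHILY a viewer only engages
    # when the shown CC shares their attribute; otherwise they engage anyway.
    same_group = cc_attr[shown] == viewer_attr
    engaged = same_group | (rng.random(N_VIEWERS) >= HOMOPHILY)

    # Engagements feed back into popularity ("rich-get-richer").
    np.add.at(popularity, shown[engaged], 1)

# One simple group-level readout: average accumulated popularity per group.
for group, name in [(1, "protected"), (0, "majority")]:
    print(f"{name}: mean popularity = {popularity[cc_attr == group].mean():.1f}")
```

Varying VAS_BIAS against HOMOPHILY in a toy like this is one way to explore, qualitatively, how allocation-side attribute bias and viewer homophily each contribute to group-level inequality.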
Free access at DOI
Digital Object Identifier 10.1145/3604915.3608841
Other Identification Number merlin-id:24163