Mitigating Demographic Biases in Social Media-Based Recommender Systems

Date

2019-08-04

Citation of Original Publication

Rashidul Islam, Kamrun Naher Keya, Shimei Pan, and James Foulds (2019). Mitigating Demographic Biases in Social Media-Based Recommender Systems. In KDD ’19: Social Impact Track, August 4–8, 2019, Anchorage, Alaska. ACM, New York, NY, USA, 3 pages. https://www.kdd.org/kdd2019/docs/Islam_Keya_Pan_Foulds_KDDsocialImpactTrack.pdf

Rights

This item is likely protected under Title 17 of the U.S. Copyright Law. Unless it is under a Creative Commons license, contact the copyright holder or the author for uses protected by copyright law.

Abstract

As a growing proportion of our daily human interactions are digitized and subjected to algorithmic decision-making on social media platforms, it has become increasingly important to ensure that these algorithms behave fairly. In this work, we study fairness in collaborative-filtering recommender systems trained on social media data. We empirically demonstrate the prevalence of demographic bias in these systems on a large Facebook dataset, both in encoding harmful stereotypes and in its impact on consequential decisions such as recommending academic concentrations to users. We then develop a simple technique to mitigate bias in social media-based recommender systems, and show that it results in fairer behavior with only a minor loss in accuracy.
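
The abstract does not spell out either the recommender or the mitigation technique, so the sketch below is only a toy illustration of the general idea, not the authors' method: a matrix-factorization collaborative filter fit to synthetic like data, followed by one common debiasing heuristic that projects a demographic "bias direction" out of the learned user factors. The variable names, the synthetic data, and the projection step are all assumptions made for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy setup: a binary user-item "like" matrix (rows = users, columns = items,
# e.g., liked pages) and a hypothetical binary sensitive attribute per user.
# All data here is synthetic; nothing from the paper's dataset is used.
n_users, n_items, k = 200, 50, 8
R = (rng.random((n_users, n_items)) < 0.1).astype(float)
group = rng.integers(0, 2, size=n_users)

# Plain matrix-factorization collaborative filtering via alternating least squares.
U = rng.normal(scale=0.1, size=(n_users, k))   # user latent factors
V = rng.normal(scale=0.1, size=(n_items, k))   # item latent factors
lam = 0.1
for _ in range(20):
    U = R @ V @ np.linalg.inv(V.T @ V + lam * np.eye(k))
    V = R.T @ U @ np.linalg.inv(U.T @ U + lam * np.eye(k))

# Illustrative debiasing step (an assumption, not necessarily the paper's
# technique): estimate a "bias direction" in the latent space as the difference
# between group centroids, then project it out of every user vector so that
# predicted preferences no longer vary along that demographic axis.
bias_dir = U[group == 1].mean(axis=0) - U[group == 0].mean(axis=0)
bias_dir /= np.linalg.norm(bias_dir)
U_fair = U - np.outer(U @ bias_dir, bias_dir)

scores = U_fair @ V.T                      # debiased predicted preferences
top5 = np.argsort(-scores, axis=1)[:, :5]  # top-5 item recommendations per user
```

In this toy setting, removing the group-difference direction pushes users who differ mainly in the sensitive attribute toward similar predicted scores, mirroring the trade-off the abstract describes: fairer recommendations at a small cost in reconstruction accuracy.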