A User Study on a De-biased Career Recommender System
Citation of Original Publication
Clarice Wang et al., A User Study on a De-biased Career Recommender System,
Rights
This item is likely protected under Title 17 of the U.S. Copyright Law. Unless on a Creative Commons license, for uses protected by Copyright Law, contact the copyright holder or the author.
Abstract
AI is increasingly being used to make consequential decisions, such as determining whether someone is granted parole (Angwin et al., 2016). Unfortunately, a wide range of recent discoveries have revealed biased AI systems that are prejudiced against certain groups of people (Dastin, 2018; Noble, 2018; Angwin et al., 2016). In this research, we focus on developing new techniques that mitigate gender bias in automated career recommendation systems. Since biases are typically inherent in AI systems trained on data influenced by our society, an AI recommender must be "de-biased" to avoid reinforcing harmful stereotypes (e.g., recommending computer programming to boys and nursing to girls) (Bolukbasi et al., 2016; Yao and Huang, 2017). Although it is technically possible to remove biases from an AI system, it is unclear whether intended users prefer such a system. We conduct a user study to investigate this question.
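One common family of mitigation techniques the abstract alludes to is "fairness through unawareness": withholding the sensitive attribute (here, gender) from the recommender so it cannot condition on it directly. The sketch below is purely illustrative, not the paper's actual method; the scoring rule, feature names, and weights are hypothetical inventions to show how a biased scorer differs from its de-biased counterpart.

```python
# Illustrative sketch of "fairness through unawareness" (NOT the paper's method).
# The profile keys and weights are hypothetical, chosen only to demonstrate the idea.
def recommend(profile, debiased=False):
    """Return a career suggestion from a user-profile dict.

    With debiased=True, the sensitive attribute "gender" is dropped
    before scoring, so the recommendation depends only on interests.
    """
    features = dict(profile)
    if debiased:
        features.pop("gender", None)  # withhold the sensitive attribute

    # Interest-based scores (made-up weights for illustration).
    score_programming = 2.0 * features.get("math_interest", 0)
    score_nursing = 2.0 * features.get("care_interest", 0)

    # A biased recommender might also key on gender directly,
    # reinforcing the stereotype the abstract describes.
    if features.get("gender") == "male":
        score_programming += 1.0
    elif features.get("gender") == "female":
        score_nursing += 1.0

    return "computer programming" if score_programming >= score_nursing else "nursing"
```

For a profile with equal math and care interests, the biased path lets gender break the tie, while the de-biased path ignores it. Note that simply dropping the attribute is known to be insufficient when other features act as proxies for it, which is one motivation for the more sophisticated de-biasing techniques cited above (Bolukbasi et al., 2016; Yao and Huang, 2017).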