User Acceptance of Gender Stereotypes in Automated Career Recommendations

dc.contributor.author: Wang, Clarice
dc.contributor.author: Wang, Kathryn
dc.contributor.author: Bian, Andrew
dc.contributor.author: Islam, Rashidul
dc.contributor.author: Keya, Kamrun Naher
dc.contributor.author: Foulds, James
dc.contributor.author: Pan, Shimei
dc.date.accessioned: 2021-06-29T20:19:47Z
dc.date.available: 2021-06-29T20:19:47Z
dc.date.issued: 2021-07-28
dc.description.abstract: Currently, there is a surge of interest in fair Artificial Intelligence (AI) and Machine Learning (ML) research, which aims to mitigate discriminatory bias in AI algorithms, e.g. along lines of gender, age, and race. While most research in this domain focuses on developing fair AI algorithms, in this work we show that a fair AI algorithm on its own may be insufficient to achieve its intended results in the real world. Using career recommendation as a case study, we build a fair AI career recommender by employing gender-debiasing machine learning techniques. Our offline evaluation showed that the debiased recommender makes fairer career recommendations without sacrificing accuracy. Nevertheless, an online user study of more than 200 college students revealed that participants on average prefer the original biased system over the debiased system. Specifically, we found that perceived gender disparity is a determining factor in the acceptance of a recommendation. In other words, our results demonstrate that we cannot fully address the gender bias issue in AI recommendations without addressing the gender bias in humans.
dc.description.uri: https://arxiv.org/abs/2106.07112v2
dc.format.extent: 13 pages
dc.genre: journal articles preprints
dc.identifier: doi:10.13016/m26ssh-41yx
dc.identifier.doi: https://doi.org/10.48550/arXiv.2106.07112
dc.identifier.uri: http://hdl.handle.net/11603/21844
dc.language.iso: en
dc.relation.isAvailableAt: The University of Maryland, Baltimore County (UMBC)
dc.relation.ispartof: UMBC Information Systems Department Collection
dc.relation.ispartof: UMBC Faculty Collection
dc.relation.ispartof: UMBC Student Collection
dc.rights: Attribution 4.0 International (CC BY 4.0)
dc.rights: This item is likely protected under Title 17 of the U.S. Copyright Law. Unless on a Creative Commons license, for uses protected by Copyright Law, contact the copyright holder or the author.
dc.rights.uri: https://creativecommons.org/licenses/by/4.0/
dc.title: User Acceptance of Gender Stereotypes in Automated Career Recommendations
dc.title.alternative: User Acceptance of Gender Stereotypes in Automated Career Recommendations
dc.title.alternative: Bias: Friend or Foe? User Acceptance of Gender Stereotypes in Automated Career Recommendations
dc.type: Text

Files

Original bundle
Name: 2106.07112v2.pdf
Size: 1.08 MB
Format: Adobe Portable Document Format

License bundle
Name: license.txt
Size: 2.56 KB
Format: Item-specific license agreed upon to submission