Do Humans Prefer Debiased AI Algorithms? A Case Study in Career Recommendation

dc.contributor.author: Wang, Clarice
dc.contributor.author: Wang, Kathryn
dc.contributor.author: Bian, Andrew Y.
dc.contributor.author: Islam, Rashidul
dc.contributor.author: Keya, Kamrun Naher
dc.contributor.author: Foulds, James
dc.contributor.author: Pan, Shimei
dc.date.accessioned: 2024-02-27T17:44:54Z
dc.date.available: 2024-02-27T17:44:54Z
dc.date.issued: 2022-03-22
dc.description: IUI '22: 27th International Conference on Intelligent User Interfaces (March 2022)
dc.description.abstract: Currently, there is a surge of interest in fair Artificial Intelligence (AI) and Machine Learning (ML) research which aims to mitigate discriminatory bias in AI algorithms, e.g., along the lines of gender, age, and race. While most research in this domain focuses on developing fair AI algorithms, in this work, we examine the challenges which arise when humans and fair AI interact. Our results show that, due to an apparent conflict between human preferences and fairness, a fair AI algorithm on its own may be insufficient to achieve its intended results in the real world. Using college major recommendation as a case study, we build a fair AI recommender by employing gender-debiasing machine learning techniques. Our offline evaluation showed that the debiased recommender makes fairer and more accurate college major recommendations. Nevertheless, an online user study of more than 200 college students revealed that participants on average prefer the original biased system over the debiased system. Specifically, we found that the perceived gender disparity associated with a college major is a determining factor in the acceptance of a recommendation. In other words, our results demonstrate that we cannot fully address the gender bias issue in AI recommendations without addressing the gender bias in humans. They also highlight the urgent need to extend the current scope of fair AI research from narrowly focusing on debiasing AI algorithms to including new persuasion and bias explanation technologies in order to achieve the intended societal impacts.
dc.description.sponsorship: This work was performed under the following financial assistance award: 60NANB18D227 from the U.S. Department of Commerce, National Institute of Standards and Technology. This material is based upon work supported by the National Science Foundation under Grant Nos. IIS2046381, IIS1850023, and IIS1927486. Any opinions, findings, and conclusions or recommendations expressed in this material are those of the author(s) and do not necessarily reflect the views of the National Science Foundation.
dc.description.uri: https://dl.acm.org/doi/10.1145/3490099.3511108
dc.format.extent: 14 pages
dc.genre: conference papers and proceedings
dc.genre: postprints
dc.identifier: doi:10.13016/m2ap53-p7r7
dc.identifier.citation: Clarice Wang, Kathryn Wang, Andrew Bian, Rashidul Islam, Kamrun Naher Keya, James Foulds, and Shimei Pan. 2022. Do Humans Prefer Debiased AI Algorithms? A Case Study in Career Recommendation. In 27th International Conference on Intelligent User Interfaces (IUI '22). Association for Computing Machinery, New York, NY, USA, 134–147. https://doi.org/10.1145/3490099.3511108
dc.identifier.uri: https://doi.org/10.1145/3490099.3511108
dc.identifier.uri: http://hdl.handle.net/11603/31709
dc.publisher: ACM
dc.relation.isAvailableAt: The University of Maryland, Baltimore County (UMBC)
dc.relation.ispartof: UMBC Information Systems Department Collection
dc.relation.ispartof: UMBC Faculty Collection
dc.relation.ispartof: UMBC Student Collection
dc.rights: This item is likely protected under Title 17 of the U.S. Copyright Law. Unless on a Creative Commons license, for uses protected by Copyright Law, contact the copyright holder or the author.
dc.title: Do Humans Prefer Debiased AI Algorithms? A Case Study in Career Recommendation
dc.type: Text
dcterms.creator: https://orcid.org/0000-0001-5276-5708
dcterms.creator: https://orcid.org/0000-0003-0935-4182
dcterms.creator: https://orcid.org/0000-0002-5989-8543

Files

Original bundle
Name: 10317696.pdf
Size: 920.81 KB
Format: Adobe Portable Document Format

License bundle
Name: license.txt
Size: 2.56 KB
Description: Item-specific license agreed upon to submission