Mitigating Demographic Bias in AI-based Resume Filtering
Author/Creator
Deshpande, Ketki V.; Pan, Shimei; Foulds, James R.
Date
2020-07-13
Citation of Original Publication
Deshpande, Ketki V., Shimei Pan, and James R. Foulds. “Mitigating Demographic Bias in AI-Based Resume Filtering.” In Adjunct Publication of the 28th ACM Conference on User Modeling, Adaptation and Personalization, 268–75. UMAP ’20 Adjunct. New York, NY, USA: Association for Computing Machinery, 2020. https://doi.org/10.1145/3386392.3399569.
Rights
This item is likely protected under Title 17 of the U.S. Copyright Law. Unless covered by a Creative Commons license, contact the copyright holder or the author for uses protected by Copyright Law.
Abstract
With increasing diversity in the labor market as well as the workforce, employers receive resumes from an increasingly diverse population. However, studies and field experiments have confirmed the presence of bias in the labor market based on gender, race, and ethnicity. Many employers use automated resume screening to filter the large number of candidate matches. Depending on how it is trained, an automated screening algorithm can exhibit bias toward a particular population by favoring certain socio-linguistic characteristics. Resume writing style and socio-linguistic features are a potential source of bias because they correlate with protected characteristics such as ethnicity. A biased dataset often translates into biased AI algorithms, and de-biasing algorithms are being developed in response. In this work, we study the effects of socio-linguistic bias on resume-to-job-description matching algorithms. We develop a simple technique, called fair-tf-idf, to match resumes with job descriptions fairly by mitigating this socio-linguistic bias.
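To make the setting concrete, the sketch below shows a standard tf-idf cosine-similarity matcher of the kind the abstract describes, written in Python with scikit-learn. The `fairness_weights` hook for down-weighting bias-correlated terms is a hypothetical stand-in for the spirit of fair-tf-idf; it is not the paper's actual formulation, which is defined in the full text.

```python
# Minimal sketch of tf-idf based resume-to-job-description matching.
# The fairness_weights hook is hypothetical, not the paper's fair-tf-idf method.
import numpy as np
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity


def rank_resumes(job_description, resumes, fairness_weights=None):
    """Rank resumes by tf-idf cosine similarity to a job description.

    fairness_weights: optional dict mapping a term to a multiplier in [0, 1];
    terms correlated with protected characteristics could be down-weighted
    (an illustrative stand-in, not the paper's exact fair-tf-idf weighting).
    """
    vectorizer = TfidfVectorizer(stop_words="english")
    # Fit on resumes plus the job description so all share one vocabulary.
    matrix = vectorizer.fit_transform(resumes + [job_description])

    if fairness_weights:
        # Scale each vocabulary column by its fairness weight.
        weights = np.ones(len(vectorizer.vocabulary_))
        for term, w in fairness_weights.items():
            idx = vectorizer.vocabulary_.get(term)
            if idx is not None:
                weights[idx] = w
        matrix = matrix.multiply(weights).tocsr()

    resume_vecs, job_vec = matrix[:-1], matrix[-1]
    scores = cosine_similarity(resume_vecs, job_vec).ravel()
    return sorted(zip(resumes, scores), key=lambda x: -x[1])
```

For example, `rank_resumes(jd_text, resume_texts, fairness_weights={"sorority": 0.0})` would zero out a term before scoring; the paper's contribution is how such weights are chosen in a principled, fairness-aware way rather than by hand.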