Mitigating Socio-linguistic Bias in Job Recommendation
Citation of Original Publication
Deshpande, Ketki V, Shimei Pan, and James R Foulds. “Mitigating Socio-Lingustic Bias in Job Recommendation,” 2020. https://jfoulds.informationsystems.umbc.edu/papers/2020/Deshpande%20(2020)%20-%20Mitigating%20Socio-lingustic%20Bias%20in%20Job%20Recommendation%20(MASC-SLL).pdf
Rights
This item is likely protected under Title 17 of the U.S. Copyright Law. Unless on a Creative Commons license, for uses protected by Copyright Law, contact the copyright holder or the author.
Abstract
With increasing diversity in the job market and the workforce, employers receive resumes from an increasingly diverse population, and many have started using automated resume screening to filter the large number of possible matches. Depending on how the automated screening algorithm is trained, it may show bias towards a particular population by favoring certain socio-linguistic characteristics. Resume writing style and socio-linguistic features are a potential source of bias because they correlate with protected characteristics. Past studies and field experiments have confirmed the presence of bias in the labor market based on gender and race (Bertrand and Mullainathan, 2004) and ethnicity (Oreopoulos, 2011). A biased dataset often translates into biased AI algorithms (Rudinger et al., 2017), and de-biasing algorithms have been proposed (Bolukbasi et al., 2016). In this work, we aim to identify and mitigate the effects of socio-linguistic bias on resume-to-job-description matching algorithms.
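As a point of reference for the de-biasing line of work cited above, the following is a minimal sketch of the "hard de-biasing" idea from Bolukbasi et al. (2016): removing the component of a word embedding that lies along a learned bias direction. This is not the method of the paper itself, and the toy vectors and names below are assumptions for illustration only.

```python
import numpy as np

def neutralize(v, bias_direction):
    """Remove the projection of vector v onto the (normalized) bias direction.

    After neutralization, v carries no component along the bias axis,
    so it is orthogonal to the bias direction.
    """
    b = bias_direction / np.linalg.norm(bias_direction)
    return v - np.dot(v, b) * b

# Toy example (hypothetical values): a bias direction, e.g. derived from
# the difference of "he" and "she" embeddings, and a word vector for a
# profession term that should be neutral with respect to that axis.
bias_dir = np.array([1.0, 0.0, 0.0])
word_vec = np.array([0.8, 0.3, 0.5])

debiased = neutralize(word_vec, bias_dir)
# The debiased vector has (numerically) zero projection on the bias axis.
print(np.dot(debiased, bias_dir))
```

In practice this projection step is applied only to words that should be neutral (e.g., profession names), while definitionally gendered words are handled separately; the sketch shows just the core linear-algebra operation.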