Foulds, James R.
Deshpande, Ketki Vinod
2021-09-01
2021-09-01
2020-01-20
12193
http://hdl.handle.net/11603/22823

With increasing diversity in the market and in the workforce, employers are increasingly likely to receive resumes from a diverse population. Many employers have started using automated resume screening to filter the many possible matches. Depending on how the automated screening algorithm is trained, it may show bias towards a particular population by favoring certain socio-linguistic characteristics. Past studies and field experiments have confirmed the presence of bias in the labor market based on gender, race, and ethnicity. A biased dataset often translates into biased AI algorithms, and de-biasing algorithms are being contemplated. In this thesis, I analyzed the effects of socio-linguistic bias on resume-to-job-description matching algorithms. I also developed a simple technique to match resumes with job descriptions more fairly by mitigating the socio-linguistic bias.

application/pdf
fair machine learning
job recommendation
term weighting
tf-idf
Raising Both the Ceiling and the Floor: Mitigating Demographic Bias in AI-based Career Counseling
Text
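As a rough illustration of the tf-idf matching named in the keywords, the sketch below scores resumes against a job description with cosine similarity and shrinks the weight of terms whose usage differs sharply across demographic groups. It is a minimal sketch assuming scikit-learn and NumPy are available; the function name rank_resumes, the group_labels input, and the gap-based penalty are hypothetical illustrations, not the thesis's exact de-biasing method.

# Minimal sketch, assuming scikit-learn and NumPy. The gap-based
# down-weighting is a hypothetical illustration, not the thesis's method.
import numpy as np
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

def rank_resumes(job_description, resumes, group_labels, penalty=0.5):
    """Rank resumes against a job description with tf-idf cosine similarity,
    reducing the influence of terms whose usage differs between groups."""
    vectorizer = TfidfVectorizer(stop_words="english")
    resume_vecs = vectorizer.fit_transform(resumes)        # (n_resumes, n_terms)
    job_vec = vectorizer.transform([job_description])      # (1, n_terms)

    # Mean tf-idf weight of each term within each demographic group.
    group_means = np.vstack([
        np.asarray(resume_vecs[[i for i, g in enumerate(group_labels) if g == grp]]
                   .mean(axis=0)).ravel()
        for grp in sorted(set(group_labels))
    ])

    # Terms with a large between-group gap act as socio-linguistic markers;
    # shrink their weight in the match score (penalty=0 recovers plain tf-idf).
    gap = group_means.max(axis=0) - group_means.min(axis=0)
    term_weights = (1.0 - penalty * gap / (gap.max() + 1e-12)).reshape(1, -1)

    scores = cosine_similarity(resume_vecs.multiply(term_weights),
                               job_vec.multiply(term_weights)).ravel()
    return sorted(zip(scores, range(len(resumes))), reverse=True)

In this sketch a penalty of 0 leaves standard tf-idf matching unchanged, while larger penalties push group-specific vocabulary out of the score, so the fairness-accuracy trade-off can be tuned with a single parameter.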