Differential Fairness
Date
2019
Citation of Original Publication
Foulds, James R.; Islam, Rashidul; Keya, Kamrun Naher; Pan, Shimei. "Differential Fairness." NeurIPS 2019 Workshop on Machine Learning with Guarantees, Vancouver, Canada, 2019. https://www.semanticscholar.org/paper/Differential-Fairness-Foulds-Islam/cf3081d5fa83750a89898ae1adcef7925ed8af81
Rights
This item is likely protected under Title 17 of the U.S. Copyright Law. Unless on a Creative Commons license, for uses protected by Copyright Law, contact the copyright holder or the author.
Abstract
We propose differential fairness, a multi-attribute definition of fairness in machine learning that is informed by the framework of intersectionality, a critical lens from the humanities literature, and by connections between differential privacy and legal notions of fairness. We show that our criterion behaves sensibly for any subset of the set of protected attributes, and we prove economic, privacy, and generalization guarantees. We provide a learning algorithm which respects our differential fairness criterion. Experiments on the COMPAS criminal recidivism dataset and census data demonstrate the utility of our methods.
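The differential fairness criterion described in the abstract requires that, for any two settings of the protected attributes, the probability of each classifier outcome differ by at most a multiplicative factor of e^ε, in analogy with differential privacy. A minimal sketch of estimating that ε from binary predictions is below; the function name, the binary-outcome restriction, and the Dirichlet-smoothed probability estimates are illustrative assumptions, not the paper's exact implementation:

```python
import numpy as np
from itertools import combinations


def estimate_epsilon_df(groups, outcomes, concentration=0.5):
    """Estimate epsilon for the differential fairness criterion from data.

    groups: protected-group label per example (an ID for each combination
            of protected attribute values, reflecting the intersectional,
            multi-attribute setting)
    outcomes: binary classifier outputs (0/1) per example
    Probabilities are smoothed with a symmetric Dirichlet-style prior so
    small groups do not produce zero probabilities (and infinite ratios).
    """
    groups = np.asarray(groups)
    outcomes = np.asarray(outcomes)
    probs = {}
    for g in np.unique(groups):
        mask = groups == g
        # smoothed estimate of P(y = 1 | group g)
        probs[g] = (outcomes[mask].sum() + concentration) / (
            mask.sum() + 2 * concentration
        )
    # epsilon is the largest absolute log-ratio of outcome probabilities
    # over all pairs of groups and both outcomes y in {0, 1}
    eps = 0.0
    for gi, gj in combinations(probs, 2):
        for pi, pj in [(probs[gi], probs[gj]), (1 - probs[gi], 1 - probs[gj])]:
            eps = max(eps, abs(np.log(pi) - np.log(pj)))
    return eps
```

Smaller ε means the classifier's outcome distribution is closer to identical across every intersectional group; ε = 0 corresponds to exact parity under this estimator.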