An Intersectional Definition of Fairness
Links to Files
Author/Creator
Foulds, James R.; Islam, Rashidul; Keya, Kamrun Naher; Pan, Shimei
Author/Creator ORCID
Date
2020-05-27
Type of Work
Conference paper
Department
Program
Citation of Original Publication
Foulds, James R., Rashidul Islam, Kamrun Naher Keya, and Shimei Pan. “An Intersectional Definition of Fairness.” In 2020 IEEE 36th International Conference on Data Engineering (ICDE), 1918–21, 2020. https://doi.org/10.1109/ICDE48307.2020.00203.
Rights
© 2020 IEEE. Personal use of this material is permitted. Permission from IEEE must be obtained for all other uses, in any current or future media, including reprinting/republishing this material for advertising or promotional purposes, creating new collective works, for resale or redistribution to servers or lists, or reuse of any copyrighted component of this work in other works.
Abstract
We propose differential fairness, a multi-attribute definition of fairness in machine learning which is informed by intersectionality, a critical lens arising from the humanities literature, leveraging connections between differential privacy and legal notions of fairness. We show that our criterion behaves sensibly for any subset of the set of protected attributes, and we prove economic, privacy, and generalization guarantees. We provide a learning algorithm which respects our differential fairness criterion. Experiments on the COMPAS criminal recidivism dataset and census data demonstrate the utility of our methods.
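The formal criterion and learning algorithm are given in the full paper. As a rough illustration of the flavor of such a multi-attribute check, and not the authors' method, the Python sketch below estimates the largest log-ratio of positive-prediction rates between intersectional groups formed by crossing multiple protected attributes. The function name empirical_epsilon, the smoothing scheme, and the toy data are assumptions made only for this example.

import itertools
import numpy as np

def empirical_epsilon(y_pred, groups, smoothing=1.0):
    """Estimate a differential-fairness-style epsilon: the largest absolute
    log-ratio of positive-prediction rates between any pair of intersectional
    groups. Additive smoothing keeps the estimated rates away from 0 and 1."""
    rates = {}
    for g in np.unique(groups):
        mask = groups == g
        # Smoothed positive-outcome rate P(y_hat = 1 | group = g)
        rates[g] = (y_pred[mask].sum() + smoothing) / (mask.sum() + 2 * smoothing)
    eps = 0.0
    for a, b in itertools.combinations(rates, 2):
        eps = max(eps, abs(np.log(rates[a]) - np.log(rates[b])))
    return eps

# Toy usage: two binary protected attributes yield four intersectional groups.
rng = np.random.default_rng(0)
attr1 = rng.integers(0, 2, 1000)
attr2 = rng.integers(0, 2, 1000)
groups = attr1 * 2 + attr2          # encode the intersection as a single label
y_pred = rng.integers(0, 2, 1000)   # stand-in for a classifier's decisions
print(empirical_epsilon(y_pred, groups))

A smaller estimated epsilon indicates that predicted positive rates are more nearly equal across all intersectional groups, echoing the differential-privacy-style bounds the abstract alludes to.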