Title: An Intersectional Definition of Fairness
Authors: Foulds, James; Pan, Shimei
Dates: 2018-09-05; 2020-05-27
Citation: Foulds, James R., Rashidul Islam, Kamrun Naher Keya, and Shimei Pan. "An Intersectional Definition of Fairness." In 2020 IEEE 36th International Conference on Data Engineering (ICDE), 1918–21, 2020. https://doi.org/10.1109/ICDE48307.2020.00203.
DOI: https://doi.org/10.1109/ICDE48307.2020.00203
Handle: http://hdl.handle.net/11603/11227
Conference: 2020 IEEE 36th International Conference on Data Engineering (ICDE), 20-24 April 2020, Dallas, TX, USA
Abstract: We propose differential fairness, a multi-attribute definition of fairness in machine learning informed by intersectionality, a critical lens arising from the humanities literature, and leveraging connections between differential privacy and legal notions of fairness. We show that our criterion behaves sensibly for any subset of the protected attributes, and we prove economic, privacy, and generalization guarantees. We provide a learning algorithm that respects our differential fairness criterion. Experiments on the COMPAS criminal recidivism dataset and census data demonstrate the utility of our methods.
Extent: 16 pages
Language: en-US
Rights: © 2020 IEEE. Personal use of this material is permitted. Permission from IEEE must be obtained for all other uses, in any current or future media, including reprinting/republishing this material for advertising or promotional purposes, creating new collective works, for resale or redistribution to servers or lists, or reuse of any copyrighted component of this work in other works.
Subjects: Computer Science - Machine Learning; Computer Science - Computers and Society; Statistics - Machine Learning; measure of fairness; algorithms and data; protected attributes; systems of power; systems of oppression; race; gender; sexual orientation; class; disability
Type: Text
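The record itself does not define the differential fairness criterion, so as an illustration only, here is a minimal sketch of how an empirical epsilon could be estimated for binary predictions, assuming the commonly cited ratio-bound form of epsilon-differential fairness (smoothed positive- and negative-prediction rates of every pair of intersectional groups must agree within a factor of e^epsilon). All names here (`empirical_epsilon`, `concentration`, the example data) are hypothetical and not taken from the paper.

```python
import numpy as np
from itertools import combinations

def empirical_epsilon(y_pred, groups, concentration=1.0):
    """Smallest epsilon such that, for every pair of intersectional groups,
    the smoothed probabilities of each binary prediction outcome differ by
    at most a factor of exp(epsilon). Hypothetical illustration only."""
    y_pred = np.asarray(y_pred)
    groups = np.asarray(groups)
    rates = {}
    for g in np.unique(groups):
        mask = groups == g
        # Smoothed P(y_hat = 1 | group = g); smoothing keeps rates in (0, 1)
        rates[g] = (y_pred[mask].sum() + concentration) / (mask.sum() + 2 * concentration)
    eps = 0.0
    for a, b in combinations(rates, 2):
        # Bound the log-ratio for both outcomes (y_hat = 1 and y_hat = 0)
        eps = max(eps,
                  abs(np.log(rates[a]) - np.log(rates[b])),
                  abs(np.log(1 - rates[a]) - np.log(1 - rates[b])))
    return eps

# Example: groups defined by the intersection of two protected attributes
y_hat = np.array([1, 1, 0, 1, 0, 0, 0, 1])
grp   = np.array(["A,F", "A,F", "A,F", "A,M", "A,M", "B,F", "B,F", "B,M"])
print(empirical_epsilon(y_hat, grp))
```

A smaller returned epsilon indicates that prediction rates are more nearly equal across all intersectional groups; this is only a sketch of the general idea, not the authors' estimator or learning algorithm.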