An Intersectional Definition of Fairness
Links to Files: https://arxiv.org/abs/1807.08362
Type of Work: journal article preprint
Extent: 16 pages
Rights: This item may be protected under Title 17 of the U.S. Copyright Law. It is made available by UMBC for non-commercial research and education. For permission to publish or reproduce, please contact the author.
Subjects: Computer Science - Machine Learning
Computer Science - Computers and Society
Statistics - Machine Learning
measure of fairness
algorithms and data
systems of power
systems of oppression
Abstract: We introduce a measure of fairness for algorithms and data with regard to multiple protected attributes. Our proposed definition, differential fairness, is informed by the framework of intersectionality, which analyzes how interlocking systems of power and oppression affect individuals along overlapping dimensions including race, gender, sexual orientation, class, and disability. We show that our criterion behaves sensibly for any subset of the set of protected attributes, and we illustrate links to differential privacy. A case study on census data demonstrates the utility of our approach.
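Based on the abstract's description of differential fairness and its link to differential privacy, a criterion of this kind can be illustrated by bounding the pairwise ratios of per-group outcome probabilities within [e^(-ε), e^ε]. The sketch below is illustrative only: the function name and the empirical, per-group positive-rate formulation are assumptions for exposition, not code or notation from the paper itself.

```python
import itertools
import math

def empirical_epsilon(group_positive_rates):
    """Smallest epsilon such that, for every pair of protected groups,
    the ratios of their outcome probabilities (for both the positive
    and negative outcome) lie within [exp(-eps), exp(eps)].

    group_positive_rates: dict mapping a group (e.g. a tuple over
    several protected attributes) to its positive-outcome probability,
    assumed to be strictly between 0 and 1.
    """
    eps = 0.0
    for p_i, p_j in itertools.combinations(group_positive_rates.values(), 2):
        # Check the ratio for each outcome: positive (p) and negative (1 - p).
        for p, q in ((p_i, p_j), (1.0 - p_i, 1.0 - p_j)):
            eps = max(eps, abs(math.log(p / q)))
    return eps

# Hypothetical per-group rates over intersections of two protected
# attributes; smaller epsilon means the groups are treated more alike.
rates = {
    ("group_a", "x"): 0.40,
    ("group_b", "y"): 0.50,
}
print(empirical_epsilon(rates))
```

Keys here are tuples over several protected attributes, matching the intersectional framing: the bound is checked across intersecting subgroups rather than one attribute at a time.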