Diverse Knowledge Distillation (DKD): A Solution for Improving The Robustness of Ensemble Models Against Adversarial Attacks

dc.contributor.author: Mirzaeian, Ali
dc.contributor.author: Kosecka, Jana
dc.contributor.author: Homayoun, Houman
dc.contributor.author: Mohsenin, Tinoosh
dc.contributor.author: Sasan, Avesta
dc.date.accessioned: 2021-02-16T17:45:06Z
dc.date.available: 2021-02-16T17:45:06Z
dc.description.abstract: This paper proposes an ensemble learning model that is resistant to adversarial attacks. To build resilience, we introduce a training process in which each member learns a radically distinct latent space. Member models are added to the ensemble one at a time; as each new member is trained, the loss function is regularized by reverse knowledge distillation, forcing the new member to learn different features and map to a latent space safely distanced from those of the existing members. We assess the security and performance of the proposed solution on image classification tasks using the CIFAR10 and MNIST datasets and show improvements in both security and performance over state-of-the-art defense methods.
dc.description.sponsorship: This work was supported by Centauri Corp. and the National Science Foundation (NSF) through the Computer Systems Research (CSR) program under NSF award number 1718538.
dc.description.uri: https://arxiv.org/abs/2006.15127
dc.format.extent: 6 pages
dc.genre: journal article preprints
dc.identifier: doi:10.13016/m2hksl-vixr
dc.identifier.citation: Ali Mirzaeian, Jana Kosecka, Houman Homayoun, Tinoosh Mohsenin, and Avesta Sasan, "Diverse Knowledge Distillation (DKD): A Solution for Improving The Robustness of Ensemble Models Against Adversarial Attacks," https://arxiv.org/abs/2006.15127
dc.identifier.uri: http://hdl.handle.net/11603/21039
dc.language.iso: en_US
dc.relation.isAvailableAt: The University of Maryland, Baltimore County (UMBC)
dc.relation.ispartof: UMBC Computer Science and Electrical Engineering Department Collection
dc.relation.ispartof: UMBC Faculty Collection
dc.rights: This item is likely protected under Title 17 of the U.S. Copyright Law. Unless on a Creative Commons license, for uses protected by Copyright Law, contact the copyright holder or the author.
dc.title: Diverse Knowledge Distillation (DKD): A Solution for Improving The Robustness of Ensemble Models Against Adversarial Attacks
dc.type: Text
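
The abstract above outlines the training recipe only at a high level. Below is a minimal, hypothetical PyTorch sketch of how such a diversity regularizer could look: a new member is trained with cross-entropy plus a reverse-distillation term that penalizes similarity between its latent features and those of the frozen, already-trained members. All names (dkd_loss, train_new_member, alpha) and the exact form of the penalty are illustrative assumptions based only on the abstract, not the paper's actual method (see https://arxiv.org/abs/2006.15127 for that).

# Hypothetical sketch of the training loop described in the abstract: each new
# ensemble member minimizes cross-entropy while a "reverse knowledge
# distillation" term pushes its latent space away from those of the members
# trained so far. Names and the cosine-similarity penalty are assumptions
# for illustration, not the paper's implementation.
import torch
import torch.nn.functional as F

def dkd_loss(logits, labels, new_latent, frozen_latents, alpha=0.1):
    """Cross-entropy plus a latent-space repulsion term.

    new_latent:     (batch, d) features of the member being trained.
    frozen_latents: list of (batch, d) features from existing members,
                    computed under torch.no_grad().
    """
    ce = F.cross_entropy(logits, labels)
    if not frozen_latents:  # first member: plain supervised training
        return ce
    # Penalizing similarity (rather than rewarding it, as in standard
    # distillation) is what makes this "reverse": minimizing the loss
    # drives the new latent map away from every existing member's.
    similarity = torch.stack(
        [F.cosine_similarity(new_latent, z, dim=1).mean() for z in frozen_latents]
    ).mean()
    return ce + alpha * similarity

def train_new_member(member, frozen_members, loader, epochs=10, lr=1e-3):
    """Members are added one at a time; earlier members stay frozen.

    Assumes each model's forward() returns (logits, latent).
    """
    opt = torch.optim.Adam(member.parameters(), lr=lr)
    for _ in range(epochs):
        for x, y in loader:
            with torch.no_grad():
                frozen_latents = [m(x)[1] for m in frozen_members]
            logits, latent = member(x)
            loss = dkd_loss(logits, y, latent, frozen_latents)
            opt.zero_grad()
            loss.backward()
            opt.step()

At inference time the ensemble's prediction would presumably be an aggregate of the members' outputs (e.g., averaged logits); the intuition in the abstract is that because the members' latent spaces differ, a perturbation crafted against one member is less likely to transfer to the others.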

Files

Original bundle (1 of 1)
Name: 2006.15127.pdf
Size: 2.28 MB
Format: Adobe Portable Document Format
License bundle (1 of 1)
Name: license.txt
Size: 2.56 KB
Format: Item-specific license agreed upon to submission