Improving Out-of-Distribution Detection via Epistemic Uncertainty Adversarial Training

dc.contributor.author: Everett, Derek
dc.contributor.author: Nguyen, Andre T.
dc.contributor.author: Richards, Luke E.
dc.contributor.author: Raff, Edward
dc.date.accessioned: 2022-10-11T16:41:31Z
dc.date.available: 2022-10-11T16:41:31Z
dc.date.issued: 2022-09-09
dc.description.abstract: The quantification of uncertainty is important for the adoption of machine learning, especially to reject out-of-distribution (OOD) data back to human experts for review. Yet progress has been slow, as a balance must be struck between computational efficiency and the quality of uncertainty estimates. For this reason many use deep ensembles of neural networks or Monte Carlo dropout for reasonable uncertainty estimates at relatively minimal compute and memory. Surprisingly, when we focus on the real-world applicable constraint of ≤1% false positive rate (FPR), prior methods fail to reliably detect OOD samples as such. Notably, even Gaussian random noise fails to trigger these popular OOD techniques. We help to alleviate this problem by devising a simple adversarial training scheme that incorporates an attack of the epistemic uncertainty predicted by the dropout ensemble. We demonstrate this method improves OOD detection performance on standard data (i.e., not adversarially crafted), and improves the standardized partial AUC from near-random guessing performance to ≥0.75.
dc.description.uri: https://arxiv.org/abs/2209.03148
dc.format.extent: 8 pages
dc.genre: journal articles
dc.genre: preprints
dc.identifier: doi:10.13016/m2sntu-ef3v
dc.identifier.uri: https://doi.org/10.48550/arXiv.2209.03148
dc.identifier.uri: http://hdl.handle.net/11603/26144
dc.language.iso: en
dc.relation.isAvailableAt: The University of Maryland, Baltimore County (UMBC)
dc.relation.ispartof: UMBC Computer Science and Electrical Engineering Department Collection
dc.relation.ispartof: UMBC Student Collection
dc.rights: Attribution-NonCommercial-NoDerivatives 4.0 International (CC BY-NC-ND 4.0)
dc.rights: This item is likely protected under Title 17 of the U.S. Copyright Law. Unless on a Creative Commons license, for uses protected by Copyright Law, contact the copyright holder or the author.
dc.rights.uri: https://creativecommons.org/licenses/by-nc-nd/4.0/
dc.title: Improving Out-of-Distribution Detection via Epistemic Uncertainty Adversarial Training
dc.type: Text
dcterms.creator: https://orcid.org/0000-0002-9900-1972
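The abstract describes estimating epistemic uncertainty with Monte Carlo dropout and then adversarially attacking that uncertainty estimate during training. The following is a minimal, self-contained sketch of those two ingredients, not the authors' implementation: the network, its random weights, and the finite-difference gradient (a stand-in for autograd) are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy two-layer network; random weights stand in for a trained model.
W1 = rng.normal(size=(4, 8))
W2 = rng.normal(size=(8, 1))

def forward(x, r, p_drop=0.5):
    """One stochastic forward pass with MC dropout on the hidden layer."""
    h = np.maximum(x @ W1, 0.0)            # ReLU hidden layer
    mask = r.random(h.shape) > p_drop      # random dropout mask
    h = h * mask / (1.0 - p_drop)          # inverted-dropout scaling
    logit = h @ W2
    return 1.0 / (1.0 + np.exp(-logit))    # sigmoid probability

def epistemic_uncertainty(x, k=50, seed=1):
    """Variance of predictions across k dropout samples,
    a common proxy for epistemic uncertainty."""
    r = np.random.default_rng(seed)
    preds = np.stack([forward(x, r) for _ in range(k)])
    return preds.var(axis=0).mean()

def uncertainty_attack(x, eps=0.1, delta=1e-3):
    """FGSM-style step in the direction that *increases* epistemic
    uncertainty, using a finite-difference gradient estimate."""
    base = epistemic_uncertainty(x)
    grad = np.zeros_like(x)
    for i in range(x.shape[-1]):
        xp = x.copy()
        xp[..., i] += delta
        grad[..., i] = (epistemic_uncertainty(xp) - base) / delta
    return x + eps * np.sign(grad)

x = rng.normal(size=(1, 4))
x_adv = uncertainty_attack(x)
# In the training scheme sketched by the abstract, points like x_adv
# would be folded into training so the model learns to report high
# uncertainty on them, improving OOD rejection at low FPR.
```

In a real setting the finite-difference loop would be replaced by backpropagating through the dropout samples, but the structure — sample, estimate variance, ascend its gradient — is the same.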

Files

Original bundle

Name: 2209.03148.pdf
Size: 4.35 MB
Format: Adobe Portable Document Format

License bundle

Name: license.txt
Size: 2.56 KB
Format: Item-specific license agreed upon to submission