CompRess: Self-Supervised Learning by Compressing Representations

dc.contributor.author: Koohpayegani, Soroush Abbasi
dc.contributor.author: Tejankar, Ajinkya
dc.contributor.author: Pirsiavash, Hamed
dc.date.accessioned: 2020-12-09T18:07:22Z
dc.date.available: 2020-12-09T18:07:22Z
dc.date.issued: 2020-10-28
dc.description.abstract: Self-supervised learning aims to learn good representations with unlabeled data. Recent works have shown that larger models benefit more from self-supervised learning than smaller models. As a result, the gap between supervised and self-supervised learning has been greatly reduced for larger models. In this work, instead of designing a new pseudo-task for self-supervised learning, we develop a model compression method to compress an already learned, deep self-supervised model (the teacher) into a smaller one (the student). We train the student model so that it mimics the relative similarity between the data points in the teacher's embedding space. For AlexNet, our method outperforms all previous methods, including the fully supervised model, on ImageNet linear evaluation (59.0% compared to 56.5%) and on nearest neighbor evaluation (50.7% compared to 41.4%). To the best of our knowledge, this is the first time a self-supervised AlexNet has outperformed a supervised one on ImageNet classification.
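The objective described in the abstract, training a student to mimic the relative similarity between data points in the teacher's embedding space, can be sketched as a KL divergence between teacher and student similarity distributions over a shared set of anchor points. This is a minimal illustration under assumptions, not the authors' implementation: the function name, the temperature value, and the use of cosine similarity over a fixed anchor set are choices made here for clarity.

```python
import numpy as np

def similarity_distillation_loss(teacher_emb, student_emb,
                                 teacher_anchors, student_anchors,
                                 temp=0.04):
    """KL(teacher || student) over anchor-similarity distributions.

    A hypothetical sketch of similarity-based distillation: each
    embedding is compared to every anchor point, the similarities are
    turned into a probability distribution with a softmax, and the
    student is penalized for deviating from the teacher's distribution.
    All names and the temperature are assumptions, not the paper's API.
    """
    def l2_normalize(x):
        # Row-wise L2 normalization so dot products are cosine similarities
        return x / np.linalg.norm(x, axis=1, keepdims=True)

    def softmax(x):
        # Numerically stable row-wise softmax
        x = x - x.max(axis=1, keepdims=True)
        e = np.exp(x)
        return e / e.sum(axis=1, keepdims=True)

    # Similarity of each embedding to every anchor, sharpened by temperature
    p_teacher = softmax(l2_normalize(teacher_emb) @ l2_normalize(teacher_anchors).T / temp)
    p_student = softmax(l2_normalize(student_emb) @ l2_normalize(student_anchors).T / temp)

    # KL divergence per example, averaged over the batch
    kl = np.sum(p_teacher * (np.log(p_teacher + 1e-12) - np.log(p_student + 1e-12)), axis=1)
    return float(np.mean(kl))
```

The loss is zero only when the student reproduces the teacher's similarity distribution exactly, which is how the student can inherit the teacher's embedding geometry without sharing its architecture or dimensionality.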
dc.description.sponsorship: This material is based upon work partially supported by the United States Air Force under Contract No. FA8750-19-C-0098, funding from SAP SE, and NSF grant number 1845216. Any opinions, findings, and conclusions or recommendations expressed in this material are those of the authors and do not necessarily reflect the views of the United States Air Force, DARPA, or other funding agencies. We would like to thank Vipin Pillai and Erfan Noury for valuable initial discussions. We also acknowledge the fruitful comments by all reviewers, specifically Reviewer 2 for suggesting the use of the teacher's queue for the student, which improved our results.
dc.description.uri: https://arxiv.org/abs/2010.14713
dc.format.extent: 16 pages
dc.genre: conference papers and proceedings preprints
dc.identifier: doi:10.13016/m2n1nd-9amp
dc.identifier.citation: Soroush Abbasi Koohpayegani, Ajinkya Tejankar, and Hamed Pirsiavash, "CompRess: Self-Supervised Learning by Compressing Representations," https://arxiv.org/abs/2010.14713
dc.identifier.uri: http://hdl.handle.net/11603/20214
dc.language.iso: en_US
dc.relation.isAvailableAt: The University of Maryland, Baltimore County (UMBC)
dc.relation.ispartof: UMBC Computer Science and Electrical Engineering Department Collection
dc.relation.ispartof: UMBC Faculty Collection
dc.relation.ispartof: UMBC Student Collection
dc.rights: This item is likely protected under Title 17 of the U.S. Copyright Law. Unless on a Creative Commons license, for uses protected by Copyright Law, contact the copyright holder or the author.
dc.rights: Attribution 4.0 International
dc.rights.uri: https://creativecommons.org/licenses/by/4.0/
dc.title: CompRess: Self-Supervised Learning by Compressing Representations
dc.type: Text

Files

Original bundle

Name: 2010.14713.pdf
Size: 3.2 MB
Format: Adobe Portable Document Format

License bundle

Name: license.txt
Size: 2.56 KB
Format: Item-specific license agreed to upon submission