PRANC: Pseudo RAndom Networks for Compacting deep models

dc.contributor.author: Nooralinejad, Parsa
dc.contributor.author: Abbasi, Ali
dc.contributor.author: Kolouri, Soheil
dc.contributor.author: Pirsiavash, Hamed
dc.date.accessioned: 2022-11-14T15:43:18Z
dc.date.available: 2022-11-14T15:43:18Z
dc.date.issued: 2022-06-16
dc.description: International Conference on Computer Vision (ICCV); Paris, France; October 2-6, 2023
dc.description.abstract: Communication becomes a bottleneck in various distributed machine learning settings. Here, we propose a novel training framework that leads to highly efficient communication of models between agents. In short, we train our network to be a linear combination of many pseudo-randomly generated frozen models. For communication, the source agent transmits only the ‘seed’ scalar used to generate the pseudo-random ‘basis’ networks, along with the learned linear mixture coefficients. Our method, denoted PRANC, learns almost 100× fewer parameters than a deep model and still performs well on several datasets and architectures. PRANC enables 1) efficient communication of models between agents, 2) efficient model storage, and 3) accelerated inference by generating layer-wise weights on the fly. We test PRANC on CIFAR-10, CIFAR-100, tinyImageNet, and ImageNet-100 with various architectures, including AlexNet, LeNet, ResNet18, ResNet20, and ResNet56, and demonstrate a massive reduction in the number of parameters while providing satisfactory performance on these benchmark datasets. The code is available at https://github.com/UCDvision/PRANC
dc.description.sponsorship: This material is based upon work partially supported by the Defense Advanced Research Projects Agency (DARPA) under Contract Nos. HR00112190135 and HR00112090023, the United States Air Force under Contract No. FA8750-19-C-0098, funding from SAP SE, and NSF grants 1845216 and 1920079. Any opinions, findings, and conclusions or recommendations expressed in this material are those of the authors and do not necessarily reflect the views of the United States Air Force, DARPA, or other funding agencies.
dc.description.uri: https://openaccess.thecvf.com/content/ICCV2023/html/Nooralinejad_PRANC_Pseudo_RAndom_Networks_for_Compacting_Deep_Models_ICCV_2023_paper.html
dc.format.extent: 11 pages
dc.genre: conference papers and proceedings
dc.genre: postprints
dc.identifier: doi:10.13016/m29z1z-h9vo
dc.identifier.uri: http://hdl.handle.net/11603/26316
dc.language.iso: en
dc.relation.isAvailableAt: The University of Maryland, Baltimore County (UMBC)
dc.relation.ispartof: UMBC Computer Science and Electrical Engineering Department Collection
dc.relation.ispartof: UMBC Faculty Collection
dc.rights: This item is likely protected under Title 17 of the U.S. Copyright Law. Unless it is under a Creative Commons license, for uses protected by Copyright Law, contact the copyright holder or the author.
dc.title: PRANC: Pseudo RAndom Networks for Compacting deep models
dc.type: Text
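
The abstract's core idea, reconstructing a model as a linear combination of frozen pseudo-random "basis" networks derived from a shared seed, can be sketched as follows. This is a minimal illustration under assumed conventions (flat weight vector, one derived seed per basis), not the authors' implementation:

```python
import numpy as np

def reconstruct_weights(seed, alphas, n_params):
    """Rebuild a flat weight vector as a linear combination of frozen
    pseudo-random basis vectors, each regenerated deterministically from
    the shared seed. Only `seed` and `alphas` need to be transmitted."""
    w = np.zeros(n_params)
    for i, alpha in enumerate(alphas):
        rng = np.random.default_rng(seed + i)  # deterministic basis i
        w += alpha * rng.standard_normal(n_params)
    return w

# Sender and receiver agree on the seed; only the k mixture coefficients
# (k << n_params) plus the scalar seed are communicated.
seed, k, n_params = 42, 8, 1000
alphas = np.linspace(-1.0, 1.0, k)        # stand-in for learned coefficients
w_sender = reconstruct_weights(seed, alphas, n_params)
w_receiver = reconstruct_weights(seed, alphas, n_params)
assert np.allclose(w_sender, w_receiver)  # identical reconstruction on both ends
```

In PRANC the coefficients are learned by gradient descent while the basis networks stay frozen, which is why transmitting the seed and the small coefficient vector suffices to reproduce the full model.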

Files

Original bundle (2 files)

Name: Nooralinejad_PRANC_Pseudo_RAndom_Networks_for_Compacting_Deep_Models_ICCV_2023_paper.pdf
Size: 8.81 MB
Format: Adobe Portable Document Format

Name: Nooralinejad_PRANC_Pseudo_RAndom_ICCV_2023_supplemental.pdf
Size: 38.5 MB
Format: Adobe Portable Document Format
Description: Supplement

License bundle (1 file)

Name: license.txt
Size: 2.56 KB
Format: Item-specific license agreed upon at submission