Compression of deep neural networks

dc.contributor.advisorPirsiavash, Hamed
dc.contributor.authorDAMADI, SEYYED MOHAMMAD SAEED
dc.contributor.departmentComputer Science and Electrical Engineering
dc.contributor.programEngineering, Electrical
dc.date.accessioned2024-01-10T20:04:10Z
dc.date.available2024-01-10T20:04:10Z
dc.date.issued2021-01-01
dc.description.abstractCompression of deep neural networks aims at finding a sparse network that performs as well as a dense network but with significantly fewer parameters. This compression is called pruning. As a result of pruning, energy consumption is reduced, hardware requirements are relaxed, and responses to queries become faster. The pruning problem yields a constrained, stochastic, nonconvex, and non-differentiable optimization problem of very large size. All these barriers can be bypassed by solving an approximate problem. To do so, we present the "Amenable Sparse Network Investigator" (ASNI) algorithm, which utilizes a novel pruning strategy based on a sigmoid function that induces the sparsity level globally over the course of one single round of training. The ASNI algorithm fulfills both tasks, whereas current state-of-the-art strategies can accomplish only one of them. The algorithm has two subalgorithms: 1) ASNI-I and 2) ASNI-II. The first subalgorithm, ASNI-I, learns an accurate sparse off-the-shelf network in only one single round of training. ASNI-II learns a sparse network and an initialization that is quantized, compressed, and from which the sparse network is trainable.
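The abstract's core mechanism, a sigmoid function that raises the global sparsity level over one round of training combined with magnitude-based pruning, can be illustrated with a minimal NumPy sketch. The schedule shape, the steepness constant `k`, and the helper names below are illustrative assumptions, not the thesis's exact formulation:

```python
import numpy as np

def sigmoid_sparsity_schedule(epoch, total_epochs, final_sparsity, k=10.0):
    """Target sparsity at a given epoch, rising along a sigmoid from
    near 0 to final_sparsity over one round of training.
    (Assumed functional form; the thesis's exact schedule may differ.)"""
    t = epoch / total_epochs  # normalized training progress in [0, 1]
    return final_sparsity / (1.0 + np.exp(-k * (t - 0.5)))

def global_magnitude_mask(weights, sparsity):
    """Binary masks that zero out the smallest-magnitude weights,
    chosen globally across all layers rather than per layer."""
    flat = np.concatenate([w.ravel() for w in weights])
    n_prune = int(sparsity * flat.size)
    if n_prune == 0:
        return [np.ones_like(w, dtype=bool) for w in weights]
    # Global threshold: magnitude of the n_prune-th smallest weight.
    threshold = np.partition(np.abs(flat), n_prune - 1)[n_prune - 1]
    return [np.abs(w) > threshold for w in weights]
```

In a training loop, one would recompute the target sparsity from the schedule each epoch, rebuild the global mask, and apply it to the weights, so the network is pruned gradually within a single training round.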
dc.formatapplication/pdf
dc.genrethesis
dc.identifier.other12480
dc.identifier.urihttp://hdl.handle.net/11603/31258
dc.languageen
dc.relation.isAvailableAtThe University of Maryland, Baltimore County (UMBC)
dc.relation.ispartofUMBC Computer Science and Electrical Engineering Department Collection
dc.relation.ispartofUMBC Theses and Dissertations Collection
dc.relation.ispartofUMBC Graduate School Collection
dc.relation.ispartofUMBC Student Collection
dc.rightsThis item may be protected under Title 17 of the U.S. Copyright Law. It is made available by UMBC for non-commercial research and education. For permission to publish or reproduce, please see http://aok.lib.umbc.edu/specoll/repro.php or contact Special Collections at speccoll(at)umbc.edu
dc.sourceOriginal File Name: DAMADI_umbc_0434M_12480.pdf
dc.subjectASNI
dc.subjectCompressed quantized initialization
dc.subjectCompression of Deep neural network
dc.subjectDeep learning
dc.subjectPruning
dc.titleCompression of deep neural networks
dc.typeText
dcterms.accessRightsDistribution Rights granted to UMBC by the author.

Files

Original bundle
Name: DAMADI_umbc_0434M_12480.pdf
Size: 3.05 MB
Format: Adobe Portable Document Format

License bundle
Name: Damadi_Open.pdf
Size: 322.15 KB
Format: Adobe Portable Document Format