OPTIMIZATION ALGORITHMS FOR TRAINING DEEP NEURAL NETWORKS

dc.contributor.advisor: Potra, Florian
dc.contributor.author: Praniewicz, Jared
dc.contributor.department: Mathematics and Statistics
dc.contributor.program: Mathematics, Applied
dc.date.accessioned: 2022-09-29T15:38:18Z
dc.date.available: 2022-09-29T15:38:18Z
dc.date.issued: 2021-01-01
dc.description.abstract: A formal representation of a deep neural network is constructed, and it is demonstrated that networks satisfying the representation can be trained efficiently via feedforward backpropagation. Analysis of the formal representation proves that optimization algorithms cannot have a computational complexity of less than O(|E|) due to the dependence on backpropagation. To ground the work in practice, a comparison is made of the popular optimization algorithms in use for training deep neural networks. The commonalities of the current algorithms provide a list of features to use and avoid when developing new deep learning optimization algorithms. Finally, two new optimization algorithms are developed. The first is linearized stochastic gradient descent (LSGD), a predictor-corrector method. Testing shows that LSGD achieves comparable or superior quality of fit to SGD, but with quicker and more stable initial training. The second is approximate stabilized Hessian gradient descent (ASHgrad), a quasi-Newton method. ASHgrad finds high-quality critical points and trains rapidly, but is slow to compute due to limitations in current machine learning frameworks.
dc.format: application/pdf
dc.genre: dissertations
dc.identifier: doi:10.13016/m2niyv-ccnx
dc.identifier.other: 12447
dc.identifier.uri: http://hdl.handle.net/11603/26030
dc.language: en
dc.relation.isAvailableAt: The University of Maryland, Baltimore County (UMBC)
dc.relation.ispartof: UMBC Mathematics and Statistics Department Collection
dc.relation.ispartof: UMBC Theses and Dissertations Collection
dc.relation.ispartof: UMBC Graduate School Collection
dc.relation.ispartof: UMBC Student Collection
dc.rights: This item may be protected under Title 17 of the U.S. Copyright Law. It is made available by UMBC for non-commercial research and education. For permission to publish or reproduce, please see http://aok.lib.umbc.edu/specoll/repro.php or contact Special Collections at speccoll(at)umbc.edu
dc.source: Original File Name: Praniewicz_umbc_0434D_12447.pdf
dc.subject: Algorithms
dc.subject: Deep Neural Networks
dc.subject: Optimization
dc.title: OPTIMIZATION ALGORITHMS FOR TRAINING DEEP NEURAL NETWORKS
dc.type: Text
dcterms.accessRights: Distribution Rights granted to UMBC by the author.
dcterms.accessRights: Access limited to the UMBC community. Item may possibly be obtained via Interlibrary Loan through a local library, pending author/copyright holder's permission.

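The abstract above describes LSGD as a predictor-corrector method. Purely as a point of reference, the sketch below shows one generic predictor-corrector (Heun-style) gradient step in plain NumPy. The loss, data, step size, and the two-stage update itself are assumptions for illustration only; this is not the LSGD update defined in the dissertation.

import numpy as np

def grad(theta, X, y):
    # Gradient of mean squared error for a linear model (placeholder loss).
    return 2.0 * X.T @ (X @ theta - y) / len(y)

def predictor_corrector_step(theta, X, y, lr=0.05):
    # Predictor: take a provisional plain gradient step.
    g0 = grad(theta, X, y)
    theta_pred = theta - lr * g0
    # Corrector: average the gradients at the current and provisional points.
    g1 = grad(theta_pred, X, y)
    return theta - lr * 0.5 * (g0 + g1)

# Minimal usage on synthetic data (hypothetical example, not from the dissertation).
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 3))
true_theta = np.array([1.0, -2.0, 0.5])
y = X @ true_theta + 0.1 * rng.normal(size=100)

theta = np.zeros(3)
for _ in range(200):
    theta = predictor_corrector_step(theta, X, y)
print(theta)  # approaches true_theta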
Files

Original bundle
Name: Praniewicz_umbc_0434D_12447.pdf
Size: 677.11 KB
Format: Adobe Portable Document Format

License bundle
Name: Praniewicz-Jared_Open.pdf
Size: 246.69 KB
Format: Adobe Portable Document Format