Real-time Automatic Hyperparameter Tuning for Deep Learning in Serverless Cloud

dc.contributor.advisor: Yesha, Yelena
dc.contributor.author: Kaplunovich, Alexander
dc.contributor.department: Computer Science and Electrical Engineering
dc.contributor.program: Computer Science
dc.date.accessioned: 2021-09-01T13:55:42Z
dc.date.available: 2021-09-01T13:55:42Z
dc.date.issued: 2020-01-01
dc.description.abstract: Deep Neural Networks are used to solve many problems. The importance of the subject is demonstrated by the fact that the 2019 Turing Award was given to Yoshua Bengio, Geoffrey Hinton, and Yann LeCun, the godfathers of AI and Neural Networks. In spite of the numerous advancements in the field, most machine learning models are still tuned manually. Experienced Data Scientists spend their valuable time tuning hyperparameters such as dropout rate, loss function, or number of neurons. To solve this problem, we have implemented a flexible, automatic, real-time hyperparameter tuning methodology for an arbitrary Deep Neural Network (DNN) written in Python and Keras. We have also utilized several state-of-the-art cloud services, such as trigger-based serverless computing (Lambda) and EC2 GPU spot instances, to provide automation, reliability, and scalability. We use state-of-the-art algorithms for random search over the hyperparameter grid and Bayesian optimization, together with static code analysis to find hyperparameter candidates and code refactoring to tune those parameters in the Cloud. In this dissertation we have developed an automatic platform for hyperparameter optimization that works for an arbitrary neural network model written in Python and Keras. We have integrated our methodology with a trigger-based serverless cloud, providing the system with scalability and reliability, and we have designed and developed an innovative cloud architecture that orchestrates model tuning at scale, monitors our experiments, and utilizes numerous state-of-the-art cloud services. Our novel approach detects potential hyperparameters automatically from the source code, updates the original model to tune those parameters, runs the evaluation in the Cloud on spot instances, finds the optimal hyperparameters, and saves the results in a NoSQL database. We hope our platform will help millions of Data Scientists and researchers save valuable time and money by tuning their advanced Machine Learning models faster. (An illustrative sketch of the hyperparameter-detection step follows this record.)
dc.format: application/pdf
dc.genre: dissertations
dc.identifier: doi:10.13016/m2yeuk-xe3x
dc.identifier.other: 12310
dc.identifier.uri: http://hdl.handle.net/11603/22881
dc.language: en
dc.relation.isAvailableAt: The University of Maryland, Baltimore County (UMBC)
dc.relation.ispartof: UMBC Computer Science and Electrical Engineering Department Collection
dc.relation.ispartof: UMBC Theses and Dissertations Collection
dc.relation.ispartof: UMBC Graduate School Collection
dc.relation.ispartof: UMBC Student Collection
dc.source: Original File Name: Kaplunovich_umbc_0434D_12310.pdf
dc.subject: Cloud
dc.subject: Data Science
dc.subject: Hyperparameters
dc.subject: Machine Learning
dc.subject: Neural Networks
dc.subject: Serverless
dc.title: Real-time Automatic Hyperparameter Tuning for Deep Learning in Serverless Cloud
dc.type: Text
dcterms.accessRights: Access limited to the UMBC community. Item may possibly be obtained via Interlibrary Loan through a local library, pending the author/copyright holder's permission.
dcterms.accessRights: This item may be protected under Title 17 of the U.S. Copyright Law. It is made available by UMBC for non-commercial research and education. For permission to publish or reproduce, please see http://aok.lib.umbc.edu/specoll/repro.php or contact Special Collections at speccoll(at)umbc.edu
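
Illustrative sketch (not part of the original record): the abstract describes a pipeline that statically analyzes a Keras model's Python source to surface hyperparameter candidates before refactoring the code and tuning it in the Cloud. The code below is a minimal, hypothetical sketch of that detection step, not the dissertation's actual implementation; the function find_hyperparameters and the sets TUNABLE_CALLS and TUNABLE_KWARGS (which layers and keyword arguments count as tunable) are assumptions made for this example. It uses Python's standard ast module to walk the parsed source and report numeric literals passed to common Keras layer and optimizer constructors.

import ast

# Layer/optimizer constructors and keyword arguments commonly treated as tunable
# (an illustrative list, not taken from the dissertation).
TUNABLE_CALLS = {"Dropout", "Dense", "LSTM", "Conv2D", "Adam", "SGD", "RMSprop"}
TUNABLE_KWARGS = {"rate", "units", "filters", "learning_rate", "lr", "dropout"}

def find_hyperparameters(source):
    """Return (call_name, argument, current_value, line_number) tuples found in the source."""
    candidates = []
    for node in ast.walk(ast.parse(source)):
        if not isinstance(node, ast.Call):
            continue
        func = node.func
        name = func.attr if isinstance(func, ast.Attribute) else getattr(func, "id", None)
        if name not in TUNABLE_CALLS:
            continue
        # Positional numeric literals, e.g. Dense(128) or Dropout(0.5).
        for i, arg in enumerate(node.args):
            if isinstance(arg, ast.Constant) and isinstance(arg.value, (int, float)):
                candidates.append((name, "arg%d" % i, arg.value, node.lineno))
        # Keyword numeric literals, e.g. Adam(learning_rate=0.001).
        for kw in node.keywords:
            if kw.arg in TUNABLE_KWARGS and isinstance(kw.value, ast.Constant):
                candidates.append((name, kw.arg, kw.value.value, node.lineno))
    return candidates

if __name__ == "__main__":
    model_source = '''
from tensorflow.keras import layers, models, optimizers

model = models.Sequential([
    layers.Dense(128, activation="relu"),
    layers.Dropout(0.5),
    layers.Dense(10, activation="softmax"),
])
model.compile(optimizer=optimizers.Adam(learning_rate=0.001),
              loss="categorical_crossentropy")
'''
    for candidate in find_hyperparameters(model_source):
        print(candidate)  # e.g. ('Dropout', 'arg0', 0.5, 6)

Each reported tuple identifies a spot in the original model that such a platform could rewrite into a tunable parameter and then explore with random search or Bayesian optimization, saving the evaluated configurations to a results store.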

Files

Original bundle

Name: Kaplunovich_umbc_0434D_12310.pdf
Size: 2.94 MB
Format: Adobe Portable Document Format