Promising Hyperparameter Configurations for Deep Fully Connected Neural Networks to Improve Image Reconstruction in Proton Radiotherapy
Date
2022-01-13
Citation of Original Publication
S. A. York et al., "Promising Hyperparameter Configurations for Deep Fully Connected Neural Networks to Improve Image Reconstruction in Proton Radiotherapy," 2021 IEEE International Conference on Big Data (Big Data), Orlando, FL, USA, 2021, pp. 5648-5657, doi: 10.1109/BigData52589.2021.9671404.
Rights
© 2022 IEEE. Personal use of this material is permitted. Permission from IEEE must be obtained for all other uses, in any current or future media, including reprinting/republishing this material for advertising or promotional purposes, creating new collective works, for resale or redistribution to servers or lists, or reuse of any copyrighted component of this work in other works.
Abstract
Proton therapy is a form of radiotherapy that uses protons to treat cancer, irradiating cancerous tumors while limiting unnecessary radiation exposure to the surrounding healthy tissue. Real-time imaging of prompt gamma rays can make this form of therapy more effective. Compton cameras are one proposed method for real-time imaging of the prompt gamma rays emitted by the proton beam as it travels through the patient's body. Because the Compton camera has a non-zero time resolution, during which all interactions are recorded as occurring simultaneously, the reconstructed images are noisy and insufficiently detailed to evaluate the proton delivery for the patient. Deep learning is a promising method for removing and correcting the various problems present in the Compton camera data, and previous work has demonstrated the effectiveness of deep fully connected networks for correcting improperly ordered gamma interactions within that data. We perform a moderately large hyperparameter grid search to find a promising configuration that yields competitive performance while containing fewer neurons, making it more compact. The configurations with many neurons, many layers, and a non-zero dropout rate achieve the best testing accuracy, yet these networks still have significantly fewer total neurons than the current neural network implementation. Given considerably more training time, these compact networks could match or even exceed the testing accuracy of the larger networks. Further improvements are still needed for clinical use, and we are currently experimenting with recurrent neural networks to test the viability of that architecture for this application.
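
As an illustration of the kind of search the abstract describes, the sketch below runs a small grid search over deep fully connected networks, varying layer count, neurons per layer, and dropout rate, and reports the configuration with the best validation accuracy. This is not the authors' code: the framework (Keras), the value ranges, the class count, and the synthetic stand-in data are all assumptions made purely for demonstration.

```python
# Minimal sketch (not the paper's implementation): grid search over fully
# connected network hyperparameters -- layers, neurons per layer, dropout.
# Framework, grid values, and synthetic data are illustrative assumptions.
import itertools
import numpy as np
import tensorflow as tf

def build_fcn(input_dim, num_classes, layers, neurons, dropout):
    """Deep fully connected classifier with optional dropout after each hidden layer."""
    model = tf.keras.Sequential([tf.keras.Input(shape=(input_dim,))])
    for _ in range(layers):
        model.add(tf.keras.layers.Dense(neurons, activation="relu"))
        if dropout > 0.0:
            model.add(tf.keras.layers.Dropout(dropout))
    model.add(tf.keras.layers.Dense(num_classes, activation="softmax"))
    model.compile(optimizer="adam",
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
    return model

# Hypothetical search grid; the paper's actual ranges are not reproduced here.
grid = {
    "layers": [2, 4, 8],
    "neurons": [64, 128, 256],
    "dropout": [0.0, 0.2, 0.4],
}

# Placeholder arrays standing in for Compton camera interaction features/labels.
x_train = np.random.rand(1000, 24).astype("float32")
y_train = np.random.randint(0, 6, size=1000)
x_val = np.random.rand(200, 24).astype("float32")
y_val = np.random.randint(0, 6, size=200)

results = []
for layers, neurons, dropout in itertools.product(*grid.values()):
    model = build_fcn(x_train.shape[1], 6, layers, neurons, dropout)
    model.fit(x_train, y_train, epochs=5, batch_size=64, verbose=0)
    _, val_acc = model.evaluate(x_val, y_val, verbose=0)
    results.append(((layers, neurons, dropout), val_acc))

# Report the best configuration found by validation accuracy.
best_config, best_acc = max(results, key=lambda r: r[1])
print(f"best (layers, neurons, dropout): {best_config}, val accuracy: {best_acc:.3f}")
```

In a real study, the per-configuration training budget and the evaluation protocol matter as much as the grid itself; the abstract's observation that compact networks may need considerably more training time to match larger ones is one reason a fixed, small epoch count (as in this sketch) can understate their final accuracy.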