Enhanced Deep Learning Super-Resolution for Bathymetry Data

dc.contributor.author: Li, Xingyan
dc.contributor.author: Li, Jian
dc.contributor.author: Williams, Zachary
dc.contributor.author: Huang, Xin
dc.contributor.author: Carroll, Mark
dc.contributor.author: Wang, Jianwu
dc.date.accessioned: 2023-10-06T19:23:39Z
dc.date.available: 2023-10-06T19:23:39Z
dc.date.issued: 2023-03-13
dc.description: 2022 IEEE/ACM International Conference on Big Data Computing, Applications and Technologies (BDCAT); Vancouver, WA, USA; 06-09 December 2022
dc.description.abstract: Spatial resolution is critical for observing and monitoring environmental phenomena. Acquiring high-resolution bathymetry data directly from satellites is not always feasible due to equipment limitations, so spatial data scientists and researchers turn to single image super-resolution (SISR) methods based on deep learning as an alternative way to increase pixel density. While super-resolution residual networks (e.g., SR-ResNet) are promising for this purpose, several challenges remain: (1) Earth data such as bathymetry is expensive to obtain and relatively limited in quantity; (2) model training must comply with certain domain knowledge; (3) certain areas of interest require more accurate measurements than others. To address these challenges, following the transfer learning principle, we study how to leverage an existing pre-trained super-resolution deep learning model, namely SR-ResNet, for high-resolution bathymetry data generation. We further enhance the SR-ResNet model by adding loss functions based on domain knowledge. To make the model perform better in certain spatial areas, we add loss functions that increase the penalty on areas of interest. Our experiments show our approaches achieve higher accuracy than most baseline models when evaluated with metrics including MSE, PSNR, and SSIM.
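The region-weighted penalty and the MSE/PSNR evaluation described in the abstract can be sketched as below. This is a minimal illustration, not the paper's actual implementation: the names `weighted_mse`, `roi_mask`, and `roi_weight` are assumptions, and the true loss is trained inside an SR-ResNet pipeline rather than computed on raw arrays.

```python
import numpy as np

def weighted_mse(pred, target, roi_mask, roi_weight=4.0):
    """MSE with a heavier penalty inside a region of interest.

    roi_mask: array of 0/1 flags, 1 marking pixels in the area of
    interest; roi_weight: how much more those pixels count (assumed
    value, for illustration only).
    """
    weights = np.ones_like(target) + (roi_weight - 1.0) * roi_mask
    return float(np.mean(weights * (pred - target) ** 2))

def psnr(pred, target, data_range=1.0):
    """Peak signal-to-noise ratio in dB, derived from plain MSE."""
    mse = np.mean((pred - target) ** 2)
    return float(10.0 * np.log10(data_range ** 2 / mse))
```

With `roi_mask` all zeros the loss reduces to ordinary MSE, so the weighting only changes gradients where higher accuracy is demanded.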
dc.description.sponsorship: This work is partially supported by NSF grants: CAREER: Big Data Climate Causality (OAC-1942714) and HDR Institute: HARP - Harnessing Data and Model Revolution in the Polar Regions (OAC-2118285).
dc.description.uri: https://ieeexplore.ieee.org/document/10062090
dc.format.extent: 10 pages
dc.genre: conference papers and proceedings
dc.identifier: doi:10.13016/m2a6lf-yjya
dc.identifier.citation: Li, Xingyan, Jian Li, Zachary Williams, Xin Huang, Mark Carroll, and Jianwu Wang. “Enhanced Deep Learning Super-Resolution for Bathymetry Data.” In 2022 IEEE/ACM International Conference on Big Data Computing, Applications and Technologies (BDCAT), 49–57, 2022. https://doi.org/10.1109/BDCAT56447.2022.00014.
dc.identifier.uri: https://doi.org/10.1109/BDCAT56447.2022.00014
dc.identifier.uri: http://hdl.handle.net/11603/30015
dc.language.iso: en
dc.publisher: IEEE
dc.relation.isAvailableAt: The University of Maryland, Baltimore County (UMBC)
dc.relation.ispartof: UMBC Information Systems Department Collection
dc.relation.ispartof: UMBC Faculty Collection
dc.relation.ispartof: UMBC Student Collection
dc.relation.ispartof: UMBC Computer Science and Electrical Engineering Department
dc.relation.ispartof: UMBC Joint Center for Earth Systems Technology (JCET)
dc.rights: Public Domain Mark 1.0
dc.rights: This work was written as part of one of the author's official duties as an Employee of the United States Government and is therefore a work of the United States Government. In accordance with 17 U.S.C. 105, no copyright protection is available for such works under U.S. Law.
dc.rights.uri: http://creativecommons.org/publicdomain/mark/1.0/
dc.title: Enhanced Deep Learning Super-Resolution for Bathymetry Data
dc.type: Text
dcterms.creator: https://orcid.org/0000-0002-9933-1170

Files

Original bundle

Name: Enhanced_Deep_Learning_Super-Resolution_for_Bathymetry_Data.pdf
Size: 1.49 MB
Format: Adobe Portable Document Format
License bundle

Name: license.txt
Size: 2.56 KB
Description: Item-specific license agreed upon to submission