Machine Learning for Inferring CO2 Fluxes: The New Metaphysics of Neural Nets

dc.contributor.author: Nguyen, Phuong
dc.contributor.author: Halem, Milton
dc.date.accessioned: 2020-01-27T15:16:53Z
dc.date.available: 2020-01-27T15:16:53Z
dc.date.issued: 2019-10-18
dc.description.abstract: The advent of direct high-resolution global surface measurements of CO2 from the recently launched NASA Orbiting Carbon Observatory (OCO-2) satellite offers an opportunity to improve the estimate of Net Ecosystem Exchange (NEE) over land. Long-term measurements of CO2 flux obtained from eddy covariance instruments on flux towers show large annual differences with NEE calculated by inverse methods using land surface photosynthetic models. This suggests consideration of alternative approaches for calculating seasonal to annual global CO2 flux over land. Recent advances in deep machine learning models, including recurrent neural nets, have been successfully applied to many inverse measurement problems in the Earth and space sciences. We present evaluations of two deep machine learning models for estimating CO2 flux or NEE using station tower data acquired from the DOE Atmospheric Radiation Measurement (ARM), AmeriFlux, and Fluxnet2015 station datasets. Our results indicate that deep learning models employing Recurrent Neural Networks (RNN) with Long Short-Term Memory (LSTM) provide significantly more accurate predictions of CO2 flux (~22%-28% improvement) than Feed Forward Neural Nets (FFNN) in terms of root mean square errors, correlation coefficients, and anomaly correlations with observations. We also found that including heat fluxes as input variables produces more accurate CO2 flux or NEE predictions. A non-intuitive, "metaphysical" machine learning result was observed when CO2 concentration was omitted as an input variable: neural net models, in most cases, produce comparably accurate CO2 flux or NEE inferences when trained with and without CO2 for the same station data.
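The abstract evaluates models by root mean square error, correlation coefficient, and anomaly correlation with observations. A minimal sketch of these three metrics is below; the synthetic NEE-like series and the climatology used for the anomaly correlation are purely illustrative, not the paper's data:

```python
import numpy as np

def rmse(pred, obs):
    """Root mean square error between predicted and observed flux."""
    return np.sqrt(np.mean((pred - obs) ** 2))

def anomaly_correlation(pred, obs, clim):
    """Correlation of departures from a reference climatology
    (here a hypothetical seasonal-cycle baseline)."""
    dp, do = pred - clim, obs - clim
    return np.sum(dp * do) / np.sqrt(np.sum(dp ** 2) * np.sum(do ** 2))

# Toy example: a sinusoidal stand-in for a seasonal NEE signal,
# with noise added to mimic observations and model output.
rng = np.random.default_rng(0)
clim = np.sin(np.linspace(0, 4 * np.pi, 200))          # assumed climatology
obs = clim + 0.1 * rng.standard_normal(200)            # "observed" flux
pred = obs + 0.2 * rng.standard_normal(200)            # "model" flux

print("RMSE:", rmse(pred, obs))
print("corr:", np.corrcoef(pred, obs)[0, 1])
print("anomaly corr:", anomaly_correlation(pred, obs, clim))
```

The anomaly correlation differs from the plain correlation in that it first removes a shared seasonal baseline, so it rewards a model for capturing departures from the expected cycle rather than the cycle itself.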
dc.description.sponsorship: This study was funded by NASA grant number NNH16ZDA001N-AIST16-0091. Special thanks are due to IBM for providing two Minsky nodes, each with dual IBM Power8+ CPUs and 4 Nvidia P100 GPUs. We also acknowledge the support of the NSF-supported Center for Hybrid Multicore Productivity Research at UMBC for providing access to this high performance computing resource and supporting staff. The authors acknowledge Dr. Pierre Gentine of the Department of Earth and Environmental Engineering, Columbia University, for providing comments. The data underlying the analyses were downloaded from the DOE ARM station at https://adc.arm.gov, AmeriFlux data at http://ameriflux.lbl.gov/, and http://fluxnet.fluxdata.org/.
dc.description.uri: https://eartharxiv.org/284f5/
dc.format.extent: 21 pages
dc.genre: journal articles preprints
dc.identifier: doi:10.13016/m2zrpo-97yu
dc.identifier.citation: Nguyen, Phuong; Halem, Milton; Machine Learning for Inferring CO2 Fluxes: The New Metaphysics of Neural Nets (2019); https://eartharxiv.org/284f5/
dc.identifier.uri: https://doi.org/10.31223/osf.io/284f5
dc.identifier.uri: http://hdl.handle.net/11603/17056
dc.language.iso: en_US
dc.relation.isAvailableAt: The University of Maryland, Baltimore County (UMBC)
dc.relation.ispartof: UMBC Computer Science and Electrical Engineering Department Collection
dc.relation.ispartof: UMBC Faculty Collection
dc.rights: This item is likely protected under Title 17 of the U.S. Copyright Law. Unless on a Creative Commons license, for uses protected by Copyright Law, contact the copyright holder or the author.
dc.subject: inferring CO2 fluxes
dc.subject: deep learning models
dc.subject: sensitivity tests
dc.title: Machine Learning for Inferring CO2 Fluxes: The New Metaphysics of Neural Nets
dc.type: Text

Files

Original bundle
Name: 2019_SubmitTo_EarthArxivPhuong_Halem_RNN_CO2_Flux (1).pdf
Size: 1.94 MB
Format: Adobe Portable Document Format
License bundle
Name: license.txt
Size: 2.56 KB
Format: Item-specific license agreed to upon submission