Contributors: Halem, Milton; Vashistha, Harsh Vardhan
Dates: 2021-01-29; 2021-01-29; 2018-01-01
Identifiers: 11930; http://hdl.handle.net/11603/20775

Abstract: Data assimilation (DA) is the process of updating model forecasts (priors) with information from observations of complete or incomplete state variables. The goal is to produce an improved model state (posterior) that better represents the underlying system. While previous work on using Artificial Neural Networks (ANN) for data assimilation exists, most of it relies on feedforward networks, and little has been done with Recurrent Neural Networks (RNN) for DA. In this paper, we propose a way to use Long Short-Term Memory (LSTM) networks, a type of RNN, to perform ensemble DA on the Lorenz-63 and Lorenz-96 models. Both models are chaotic in nature; while Lorenz-63 has only three state variables, the Lorenz-96 system of equations can be scaled to many more. We implemented a pre-processing pipeline to feed data to the network and present the results of data assimilation using it. We show that LSTM networks can be used for data assimilation and can produce results as good as the Ensemble Kalman Filter (EnKF) algorithm. Our network was trained on the data assimilation results of an EnKF for both models. LSTM networks were chosen because they avoid the vanishing and exploding gradient problems from which a vanilla RNN suffers.

Format: application/pdf
Keywords: Artificial Neural Network; Chaotic Models; Data Assimilation; Lorenz models; LSTM; Recurrent Neural Network
Title: RNN/LSTM Data Assimilation for the Lorenz Chaotic Models
Type: Text
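
To make the setup described in the abstract concrete, below is a minimal sketch of the kind of pipeline it outlines: integrate the Lorenz-63 model, simulate noisy observations, window them for a recurrent network, and fit a small LSTM that maps an observation window to an analysis state. This is not the thesis code; the standard Lorenz-63 parameters, the RK4 integrator, the window length, the noise level, and the Keras-style architecture are all illustrative assumptions, and the true model states stand in for the EnKF analyses that the thesis uses as training targets.

# Minimal sketch (not the thesis implementation): Lorenz-63 data generation,
# observation windowing, and an LSTM trained to recover the state.
import numpy as np
import tensorflow as tf

def lorenz63_rhs(state, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    """Right-hand side of the Lorenz-63 system with standard chaotic parameters."""
    x, y, z = state
    return np.array([sigma * (y - x), x * (rho - z) - y, x * y - beta * z])

def rk4_step(state, dt=0.01):
    """One fourth-order Runge-Kutta step of the Lorenz-63 model."""
    k1 = lorenz63_rhs(state)
    k2 = lorenz63_rhs(state + 0.5 * dt * k1)
    k3 = lorenz63_rhs(state + 0.5 * dt * k2)
    k4 = lorenz63_rhs(state + dt * k3)
    return state + dt / 6.0 * (k1 + 2 * k2 + 2 * k3 + k4)

# "Truth" trajectory of the model (assumed length and initial condition).
rng = np.random.default_rng(0)
n_steps, window = 5000, 10
truth = np.empty((n_steps, 3))
truth[0] = np.array([1.0, 1.0, 1.0])
for t in range(1, n_steps):
    truth[t] = rk4_step(truth[t - 1])

# Noisy observations of the full state; incomplete observations would mask columns.
obs = truth + rng.normal(scale=1.0, size=truth.shape)

# Sliding windows of past observations -> target state, the shape an LSTM expects.
X = np.stack([obs[t - window:t] for t in range(window, n_steps)])  # (samples, window, 3)
y = truth[window:n_steps]                                          # (samples, 3)
# In the thesis the targets are EnKF analysis states; true states stand in here.

# Small LSTM regressor mapping an observation window to an analysis state.
model = tf.keras.Sequential([
    tf.keras.Input(shape=(window, 3)),
    tf.keras.layers.LSTM(64),
    tf.keras.layers.Dense(3),
])
model.compile(optimizer="adam", loss="mse")
model.fit(X, y, epochs=5, batch_size=64, verbose=0)

The same framing extends to Lorenz-96 by replacing the three-variable right-hand side with the Lorenz-96 tendency over however many state variables are used, and, as in the thesis, by supplying EnKF analyses rather than true states as the regression targets.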