Publication:
Effect of Gradient Descent Optimizers and Dropout Technique on Deep Learning LSTM Performance in Rainfall‑runoff Modeling

Date
2023
Authors
"Duong Tran Anh, Dat Vi Thanh, Hoang Minh Le, Bang Tran Sy, Ahad Hasan Tanim, Quoc Bao Pham, Thanh Duc Dang, Son T. Mai, Nguyen Mai Dang."
Abstract
Machine learning and deep learning (ML-DL) based models are widely used for rainfall-runoff prediction, and they have the potential to substitute process-oriented, physics-based numerical models. However, developing an ML model also carries performance uncertainty because of inaccurate choices of hyperparameters and neural network architectures. Thus, this study aims to identify the best optimization algorithms for ML-DL models, namely the RMSprop, Adagrad, Adadelta, and Adam optimizers, as well as dropout techniques to be integrated into the Long Short-Term Memory (LSTM) model to improve the forecasting accuracy of rainfall-runoff modeling. Deep learning LSTMs were developed using 480 model architectures at two hydro-meteorological stations of the Mekong Delta, Vietnam, namely Chau Doc and Can Tho. Model performance is tested with the most ideally suited LSTM optimizers using combinations of four dropout percentages: 0%, 10%, 20%, and 30%. The Adagrad optimizer shows the best model performance in model testing. Deep learning LSTM models with 10% dropout produced the best prediction results while significantly reducing the overfitting tendency of the forecasted time series. The findings of this study are valuable for setting up ML-based hydrological models by identifying a suitable gradient descent (GD) optimizer and optimal dropout ratio to enhance the performance and forecasting accuracy of the ML model.
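For illustration, the optimizer/dropout comparison described in the abstract can be sketched as below. This is a minimal Keras example, not the authors' code: the window length, layer size, dropout placement, and synthetic training data are assumptions made only so the sketch runs; only the four optimizers and four dropout rates come from the paper.

```python
# Minimal sketch (assumed setup, not the paper's implementation):
# an LSTM regressor in Keras showing where the gradient-descent
# optimizer choice and the dropout rate enter the model.
import numpy as np
from tensorflow import keras
from tensorflow.keras import layers

def build_lstm(optimizer_name: str, dropout_rate: float,
               window: int = 30, n_features: int = 1) -> keras.Model:
    """LSTM with a configurable GD optimizer and dropout rate."""
    model = keras.Sequential([
        layers.Input(shape=(window, n_features)),
        layers.LSTM(64),                # hidden size is an assumption
        layers.Dropout(dropout_rate),   # 0.0, 0.1, 0.2, or 0.3 as in the study
        layers.Dense(1),                # next-step runoff estimate
    ])
    # The four optimizers compared in the paper, with default settings:
    optimizers = {
        "rmsprop":  keras.optimizers.RMSprop(),
        "adagrad":  keras.optimizers.Adagrad(),
        "adadelta": keras.optimizers.Adadelta(),
        "adam":     keras.optimizers.Adam(),
    }
    model.compile(optimizer=optimizers[optimizer_name], loss="mse")
    return model

# Grid over the paper's optimizer/dropout combinations, fit briefly on
# random data purely to keep the sketch self-contained and runnable.
X = np.random.rand(256, 30, 1).astype("float32")
y = np.random.rand(256, 1).astype("float32")
for opt in ("rmsprop", "adagrad", "adadelta", "adam"):
    for p in (0.0, 0.1, 0.2, 0.3):
        model = build_lstm(opt, p)
        model.fit(X, y, epochs=1, batch_size=32, verbose=0)
```

Placing dropout after the LSTM layer is one common choice; recurrent dropout inside the LSTM cell is an alternative, and the paper's exact placement is not specified in the abstract.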
Keywords
Optimizers · Dropout technique · LSTM · Rainfall-runoff · Mekong Delta