River discharge simulation using variable parameter McCarthy–Muskingum and wavelet-support vector machine methods
Abstract
In this study, an extended version of the variable parameter McCarthy–Muskingum (VPMM) method, originally proposed by Perumal and Price (J Hydrol 502:89–102, 2013), was compared with widely used data-based models, namely the support vector machine (SVM) and the hybrid wavelet-support vector machine (WASVM), to simulate hourly discharge in the Neckar River, where a significant lateral flow contribution from intermediate catchment rainfall prevails during flood wave movement. Discharge data from 1999 to 2002 were used in this study. The extended VPMM method was applied to simulate nine flood events from 2002, and the results were compared with those of the SVM and WASVM models. Statistical and graphical analyses suggest that the extended VPMM method predicted flood wave movement better than the SVM and WASVM models. A model complexity analysis also suggests that the two-parameter extended VPMM method is less complex than the three-parameter SVM and WASVM models. Further, the model selection criteria give the highest values for VPMM in 7 out of 9 flood events. The simulated flood events suggest that both approaches capture the underlying physics and reproduce target values close to the observed hydrographs. However, the VPMM method is slightly more efficient and accurate than the SVM and WASVM models, which are based only on antecedent discharge data. The study reflects the current trend in flood forecasting research and shows the importance of both approaches (physically based and data-based modeling). The analysis suggests that these approaches complement each other and can be used for accurate yet less computationally intensive flood forecasting.
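For orientation, the sketch below illustrates the two modeling families compared in the study in a highly simplified form. It is not the Perumal–Price VPMM formulation: the VPMM re-evaluates the routing parameters at every time step from the flow state and accounts for lateral inflow, whereas this sketch uses the classical fixed-parameter Muskingum routing step as a stand-in. The lagged-input SVR baseline is likewise only similar in spirit to the SVM model described in the abstract; all parameter values (K, X, lags, C, epsilon) are hypothetical.

```python
# Simplified illustration (not the Perumal-Price VPMM): classical Muskingum
# routing with fixed K and X, plus a lagged-input SVR baseline trained on
# antecedent discharges. All parameter values are hypothetical.
import numpy as np
from sklearn.svm import SVR


def muskingum_route(inflow, K=3.0, X=0.2, dt=1.0):
    """Route an inflow hydrograph with the classical Muskingum method.

    inflow : inflow discharges at a fixed time step dt (hours)
    K      : storage time constant (hours); fixed here, whereas the VPMM
             method re-evaluates it at every routing step
    X      : weighting factor (0-0.5), also flow-dependent in VPMM
    """
    denom = 2.0 * K * (1.0 - X) + dt
    c0 = (dt - 2.0 * K * X) / denom
    c1 = (dt + 2.0 * K * X) / denom
    c2 = (2.0 * K * (1.0 - X) - dt) / denom

    outflow = np.empty(len(inflow), dtype=float)
    outflow[0] = inflow[0]  # assume an initial steady state
    for t in range(1, len(inflow)):
        outflow[t] = c0 * inflow[t] + c1 * inflow[t - 1] + c2 * outflow[t - 1]
    return outflow


def svr_baseline(discharge, lags=3):
    """Fit an SVR on antecedent discharges to predict the next hourly value."""
    discharge = np.asarray(discharge, dtype=float)
    X_feat = np.column_stack([discharge[i:len(discharge) - lags + i]
                              for i in range(lags)])
    y = discharge[lags:]
    model = SVR(kernel="rbf", C=10.0, epsilon=0.1)
    model.fit(X_feat, y)
    return model, model.predict(X_feat)
```

A wavelet-SVM variant in the spirit of the WASVM model would first decompose the discharge series into approximation and detail components (e.g., with a discrete wavelet transform) and use those components, rather than the raw lags, as SVR inputs; that step is omitted here for brevity.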