Fig. 8 Graphical hydrograph separation method for the Gōno River catchment (N \(\approx\) 4 days).
In the case of flood assessment, information and complexity measures
should be customized so that they describe flood patterns in terms of
occurrence, frequency, etc. Moreover, it should be emphasized that
defining an appropriate value for \(Q_{threshold}\) is essential. In this
work, we adopted as the threshold for each gauging station the discharge
value corresponding to the designated maximum water level introduced by
the MLIT, and we used the maximum daily discharge data. According to the
MLIT, this level serves as a guide for municipal mayors to issue
evacuation warnings and as a reference for evacuation decisions by local
residents.
We therefore believe that a word length of two characters is suitable
for the assessment of different future flood scenarios. In our opinion,
increasing the word length to describe flood patterns is not useful,
because we assume that high floods lasting more than two days already
constitute a major natural and national disaster, bearing in mind that
our analyses consider the maximum daily discharge. The analyses proposed
in this work suggest that characterizing system patterns by means of
information and complexity measures can be customized for a wide range
of applications.
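As an illustration of the threshold-based encoding and two-character words discussed above, the sketch below binarizes a daily maximum discharge record against a threshold and counts the resulting two-character word patterns. The function names, sample values, and the exact encoding rule are our own assumptions for illustration, not details taken from the study.

```python
# Illustrative sketch: encode daily maximum discharge as '1' (flood,
# Q >= threshold) or '0' (no flood), then count overlapping
# two-character words, e.g. '11' = two consecutive flood days.
from collections import Counter

def flood_word_counts(discharge, q_threshold, word_length=2):
    """Binarize discharge against q_threshold and count word patterns."""
    symbols = ''.join('1' if q >= q_threshold else '0' for q in discharge)
    words = (symbols[i:i + word_length]
             for i in range(len(symbols) - word_length + 1))
    return Counter(words)

# Hypothetical record (m^3/s) with one two-day flood event.
series = [120, 150, 480, 510, 200, 90, 70]
print(flood_word_counts(series, q_threshold=400))
# Counter({'00': 3, '01': 1, '11': 1, '10': 1})
```

The relative frequencies of words such as '01' (flood onset) or '11' (a sustained flood) are the ingredients from which the information and complexity measures are then computed.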
In the case of flood assessment, one of the most important applications
of considering different word patterns is to propose new contour
inundation maps and/or hazard maps for the different discharge gauging
stations, supporting policy makers in improving their understanding and
in choosing better decisions and alternatives for related issues and
future projects.
Inferences from low and high frequency analyses
Quantifying streamflow patterns by means of information and complexity
metrics across different aggregation lengths revealed various
interesting behaviors of streamflow at low and high frequencies.
Regarding the low frequency findings, the information metrics (metric
entropy and mean information gain) computed for the streamflow data
recorded at the studied stations clearly exhibit two scaling regimes:
one with a steep slope over the shorter AL ranges, and another over the
longer AL ranges. This finding matches the results of Al Sawaf et al.
(2017), who studied the discharge fluctuations in the Gōno River by
means of Detrended Fluctuation Analysis (DFA) and reported the presence
of two scaling regimes of the river discharge fluctuations separated by
a crossover time of around 3-5 days. For comparison, it can be noticed
in Fig. 5(a & b) that the long AL ranges (i.e., AL greater than 20) have
a mild slope similar to the DFA results, indicating that this range may
reveal the long-memory characteristics of the river flow fluctuations.
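As a rough illustration of how metric entropy can be evaluated across aggregation lengths, the sketch below block-averages a series with a given AL, binarizes it about its median, and computes the per-character Shannon entropy of four-character words. This is a generic reconstruction under our own assumptions (the binarization rule and function names are not taken from the study).

```python
# Sketch: metric entropy of a coarse-grained, binarized series.
import math
from collections import Counter
from statistics import mean, median

def metric_entropy(series, al, word_length=4):
    """Block-average with aggregation length `al`, binarize about the
    median of the blocks, then return the per-character Shannon entropy
    (bits) of overlapping words of length `word_length`."""
    blocks = [mean(series[i:i + al])
              for i in range(0, len(series) - al + 1, al)]
    med = median(blocks)
    symbols = ''.join('1' if b > med else '0' for b in blocks)
    words = [symbols[i:i + word_length]
             for i in range(len(symbols) - word_length + 1)]
    probs = [c / len(words) for c in Counter(words).values()]
    return -sum(p * math.log2(p) for p in probs) / word_length

# A strictly alternating series yields only two words ('0101'/'1010'),
# giving an entropy near log2(2)/4 = 0.25 bits per character.
print(round(metric_entropy([0, 1] * 50, al=1), 2))
```

Scanning `al` over a range of values and plotting the resulting entropies on logarithmic axes is how the two slope regimes described above would be identified.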
Interestingly, both the information and complexity contents evaluated
for the studied stations showed similar crossover times, detected at
roughly AL \(\approx\) 20, equivalent to 80 hours (see Fig. 7(c & d)).
However, one of the challenging tasks in DFA or spectral analyses is to
determine the crossover time accurately. In these approaches, the
crossover time is usually estimated by performing separate linear
regression fits to the suspected regimes, so that the intersection point
of the two fitted lines defines the crossover time. Nevertheless, our
findings revealed that the crossover time may instead be estimated from
the aggregation length at which the fluctuation complexity reaches its
peak in the information-complexity diagram, as can be seen in Fig. 6
(also refer to Table 1 in the supplementary materials). In the case of
the Ozekiyama station, the crossover is observed at AL = 14, equivalent
to 56 hours, i.e., crossover time (56 hours) = aggregation length (14)
\(\times\) word length (4 characters). Therefore, further investigations
are still required to interpret and decipher the nested relationships
between the information metrics and fractal analysis.
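The peak-based rule above reduces to simple arithmetic. A minimal helper, fed with illustrative (not measured) fluctuation complexity values, might look like:

```python
# Crossover time from the fluctuation-complexity peak:
# crossover (hours) = AL at the peak * word length (characters).
def crossover_time(al_values, complexity_values, word_length=4):
    """Return the crossover time implied by the complexity peak."""
    peak_al = max(zip(complexity_values, al_values))[1]
    return peak_al * word_length

# Ozekiyama-style example (complexity values are hypothetical;
# only the peak position at AL = 14 mirrors the text).
als = [2, 6, 10, 14, 18, 22]
fc = [0.3, 0.6, 0.9, 1.1, 0.8, 0.5]
print(crossover_time(als, fc))  # 56 (hours)
```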
Regarding the high frequency analyses, an interesting phenomenon was
observed, namely an extra scaling regime at sub-daily scales captured by
the FAT records at AL \(\leq 4\), equivalent to 16 hours. To verify the
existence of this regime, we estimated the power spectrum of the
discharge records obtained at both the Ozekiyama and FAT stations using
the model proposed by Dolgonosov et al. (2008), presented in Fig. 9. The
spectral analysis shows that both the RC and FAT data have two main
scaling regimes: S1 for the long ranges, which are roughly similar for
the two datasets, and S2 for the mid ranges (Fig. 9), with a crossover
time of around 60 hours, very close to AL = 14 (i.e., 56 hours).
Nevertheless, an additional slope, namely S3, is captured only by the
FAT data and lies near AL = 4 (i.e., 16 hours). This finding seems to
confirm our hypothesis about the existence of an additional scaling
regime that occurs within very short time scales and can be captured by
FAT, as can be seen in Fig. 7a. However, this slight variation should be
confirmed by comparison with another scaling method (fractal analysis)
or by considering a shorter word length for the high frequency analysis
(e.g., 3 characters per word).
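We do not reproduce the Dolgonosov et al. (2008) model here; as a simplified stand-in, a plain FFT periodogram with a log-log slope fit illustrates the kind of scaling check described above. The function names, band edges, and the synthetic series are our own assumptions, not the S1/S2/S3 bands of Fig. 9.

```python
# Sketch: fit a power-law slope to a periodogram over a frequency band.
import numpy as np

def spectral_slope(series, f_lo, f_hi, dt=1.0):
    """Least-squares slope of log10 P(f) vs log10 f within [f_lo, f_hi]."""
    freqs = np.fft.rfftfreq(len(series), d=dt)
    power = np.abs(np.fft.rfft(series - np.mean(series))) ** 2
    band = (freqs >= f_lo) & (freqs <= f_hi)
    slope, _ = np.polyfit(np.log10(freqs[band]), np.log10(power[band]), 1)
    return slope

# Synthetic series with a known f^-2 spectrum: Fourier amplitudes
# proportional to f^-1 with random phases, then inverted to a signal.
n = 4096
freqs = np.fft.rfftfreq(n)
rng = np.random.default_rng(0)
amps = np.zeros(len(freqs))
amps[1:] = freqs[1:] ** -1.0
spectrum = amps * np.exp(1j * rng.uniform(0, 2 * np.pi, len(freqs)))
q = np.fft.irfft(spectrum, n)
print(round(spectral_slope(q, 0.005, 0.05), 2))  # -2.0
```

Fitting separate slopes over adjacent bands and locating where they intersect is the conventional crossover estimate that the complexity-peak approach above offers an alternative to.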
The last remaining question is why an additional regime was captured by
means of FAT. Although further exploration is needed to describe this
phenomenon clearly, it can be said that the FAT system measures the
discharge according to the fundamental discharge equation given in
Eq. (2). Unlike discharge estimated by the RC method, the velocity and
area (depth) terms are embedded directly in the streamflow estimates,
and hence FAT can resolve the discharge at high temporal resolution.
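To make the contrast concrete, a FAT-style estimate applies the fundamental discharge equation Q = vA directly, while an RC-style estimate passes the water stage through a fitted rating curve. The rating-curve coefficients and all numerical values below are purely illustrative.

```python
# FAT-style estimate: velocity and cross-sectional area enter directly.
def discharge_fat(mean_velocity, area):
    """Q = v * A; velocity in m/s, area in m^2, discharge in m^3/s."""
    return mean_velocity * area

# RC-style estimate: stage h mapped through a fitted power-law rating
# curve Q = a * h^b (coefficients a, b are hypothetical here).
def discharge_rating_curve(stage, a=25.0, b=1.6):
    return a * stage ** b

print(discharge_fat(1.5, 120.0))    # 180.0 m^3/s
print(discharge_rating_curve(2.0))  # ~75.8 m^3/s (illustrative)
```

Because the rating curve is calibrated to slowly varying stage-discharge pairs, rapid sub-daily velocity fluctuations are smoothed out of the RC estimate but remain visible in the FAT estimate.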